MATCH
2, 4, 6
8, 10, 12
12, 14, 16
20, 40, 60
NOT MATCH
10, 8, 6
If the answer is "numbers in ascending order", then this is a perfect illustration of synthetic vs. realistic examples. The numbers indeed fit that rule, so in theory, everything is fine. In practice, you'd be an ass to give such examples on a test, because they strongly hint the rule is more complex. Real data from a real process is almost never misleading in this way[0]. In fact, if you sampled such sequences from a real process, you'd be better off assuming the rule is "2k, 2(k+1), 2(k+2)", and treating the last example as some weird outlier.
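The "stricter rule fits better" point can be sketched in a few lines of Python (the rule names and helper functions are mine, just for illustration):

```python
# The MATCH / NOT MATCH examples from above.
match = [(2, 4, 6), (8, 10, 12), (12, 14, 16), (20, 40, 60)]
not_match = [(10, 8, 6)]

def ascending(seq):
    # Candidate rule 1: strictly increasing numbers.
    return all(a < b for a, b in zip(seq, seq[1:]))

def consecutive_even(seq):
    # Candidate rule 2: 2k, 2(k+1), 2(k+2) -- even numbers stepping by 2.
    return seq[0] % 2 == 0 and all(b - a == 2 for a, b in zip(seq, seq[1:]))

print([ascending(s) for s in match])         # every MATCH example fits
print([consecutive_even(s) for s in match])  # all fit except (20, 40, 60)
print([ascending(s) for s in not_match])     # the NOT MATCH example fails both
```

Three of the four MATCH examples satisfy the stricter "consecutive even" rule, and only (20, 40, 60) forces you down to plain "ascending" - which is exactly why, sampled from a real process, you might write that one off as an outlier.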
This might sound like pointless nitpicking, but I think it's worth keeping in mind with regard to generative AI models, because the way they're trained biases them towards real data and away from synthetic examples.
--
[0] - It could be if you have very, very bad luck with sampling. Like winning a lottery, except the prize sucks.
That's the one. Though where I heard it, you can set your own rule, not just use the example.
I'd say that every black swan is an example of a real process that is misleading.
But more than that, I mentioned verified/falsified, as in the difference between the two in science. We got a long way with just the first (Karl Popper only died in 1994), but it does seem to make a difference?