It’s basically the line for all the AI-hype people: “all the problems are going away!”, “soon it’ll all magically make things exponentially good-er-er!”
Alternatively, it’s a restatement of the obvious empirical observation that technology tends to improve on an exponential rather than a linear curve. That seems like a simpler explanation, and one that doesn’t even require insulting people.
The premise would be better supported if it could be shown that a 10x speedup in matrix multiplication conferred a linear or better increase in model quality post-GPT-4. As it stands, that would just seem to give us current results faster, not better results.
I would argue that any given technology tends to improve on an S curve: exponentially at first, then flattening out. See Moore’s law as a great example.
Or, more on topic, see the improvements in LLMs since they were introduced. At first each release was an order of magnitude better than the last (compare GPT-2 vs. GPT-3 vs. GPT-4); now they’re still getting better, but at a much slower rate.
Certainly feels like being at the top of an S curve to me, at least until an entirely new architecture is invented to supersede transformers.
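For what it’s worth, that “exponential at first, then flattening” shape is just the logistic function. Here’s a minimal Python sketch; the ceiling L, growth rate k, and midpoint t0 are arbitrary illustrative parameters, not fitted to any real capability data:

    import math

    def logistic(t, L=1.0, k=1.0, t0=0.0):
        # Logistic (S-curve) function: L / (1 + e^(-k * (t - t0)))
        return L / (1.0 + math.exp(-k * (t - t0)))

    # Well before the midpoint the curve grows roughly exponentially;
    # past it, growth flattens as f(t) approaches the ceiling L.
    for t in range(-6, 7, 2):
        print(f"t={t:+d}  f(t)={logistic(t):.4f}")

Printing the values makes both regimes visible: on the left tail each step multiplies f(t) by a near-constant factor (about e^(2k) here), while near the top it barely moves. From inside the exponential phase, an exponential and a logistic curve look nearly identical, which is exactly why the flattening tends to catch people by surprise.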