
AI winters will keep coming as long as the definition of AI stays relative. We used to call chess programs "AIs", but hardly anyone says that anymore. We call LLMs "AIs" now, but let's be real: a few decades from now, we'll probably be calling them token predictors, while some shiny new "AIs" are already out there kicking ass.

At the end of the day, "AI" really just means throwing expensive algorithms at problems we've labeled as "subjective" and hoping for the best. More compute, faster communication, bigger storage, and we get to run more of those algorithms. Odds are, the real bottleneck is hardware, not software. Better hardware just lets us take bolder swings at problems, basically wasting even more computing power on nothing.

So yeah, we'll get yet another AI boom when a new computing paradigm shows up. And that boom will hit yet another AI winter, because it'll smack into the same old bottleneck. And when that winter hits, we'll do what we've always done: move the goalposts, lower the bar, and start the cycle all over again. Just with new chips this time.

Ah, Jesus. I should quit drinking Turkish coffee.
