
As a person outside the field, my subjective experience is that every day brings headlines with new superlatives about "The Coming AI Revolution." My expectations are being set for radical transformation across most areas of technology in the next decade or two on the basis of current developments in this area.


>My expectations are being set for radical transformation across most areas of technology in the next decade or two

As someone who has lived through most of the last four decades, this seems like a very reasonable expectation. I mean, this maybe wasn't obvious to people who weren't born into the IT industry, but to me? The '90s seemed to have a "holy shit, this changes everything!" moment every few years. (Of course, some things stay the same. UNIX is older than I am, as is C, and both are recognizable and ubiquitous today.) And I think as the aughts rolled around, those "holy shit, this changes everything!" moments became, ah, more evenly distributed. More of this new tech touched the lives of ordinary people in obvious ways.

> on the basis of current developments in this area.

This part... I think you have to understand that the way we talk about technological change is functionally science fiction. Yes, yes, it's very likely that the high rate of change maintained over the last few decades will continue and even accelerate,[1] but I personally treat claims of where tech will be in 10 years as functional science fiction. A lot like how I treat predictions of what sectors of the stock market will be where in ten years: yes, the market as a whole will probably go up... but to say you know the winners and losers ten years out? That's a strong claim.

So yeah, saying "there will be revolutionary changes in the next decade!" is fine, and probably right. There were revolutionary changes last decade. And, hell, you could say that ML and similar have already given us revolutionary changes this decade. Arguing that ML will give us revolutionary changes next decade? Well, maybe... but it looks more like Delphi than C to me, if you know what I mean. (Of course, I'm in IT; I operate, manage, and maintain infrastructure; I'm no programming expert. So take that how you will.)

[1] The counterpoint here is Moore's Law. Your compute per dollar is not getting massively better over time the way it was in the '80s through the '00s. I mean, it's still trending toward better, but not nearly as fast as it was. The gap between the compute power available in 1980 and in 2000 is going to be dramatically greater than the gap between 2000 and 2020. This means that if you want to make predictions, you can't rely as much on computers being more powerful in the future.
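To put rough numbers on that (a back-of-the-envelope sketch; the doubling times below are illustrative assumptions, not measured figures):

    # Growth factor over 20 years for a given doubling time.
    def growth_factor(years, doubling_time_years):
        return 2 ** (years / doubling_time_years)

    # Assumed ~2-year doubling in the Moore's Law era vs. a slower modern pace.
    print(round(growth_factor(20, 2.0)))   # 1980 -> 2000: ~1000x
    print(round(growth_factor(20, 3.5)))   # 2000 -> 2020: ~50x

Under those assumed doubling times, the earlier 20-year window delivers roughly 20x more improvement than the later one, which is the shape of the slowdown being described.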


Re: Moore's Law having slowed a lot. I still like Kurzweil's chart showing that, historically, we can back-extrapolate the exponential improvement to earlier information technologies. So this might extend to some future information technology other than silicon... but that form of reasoning really is science fiction.

Most of these imminent AI predictions are founded on computing capacity soon approaching the complexity of the human brain. Even a fundamental discovery that pushes estimated brain complexity up by many orders of magnitude, like RNA playing an information-processing role within or between neurons, would not push back the point where exponential growth reaches it by all that many years.
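The arithmetic behind that claim, as a rough sketch (the 2-year doubling time is an assumed illustrative figure):

    import math

    # Delay added to an exponential-growth crossover by extra required capacity.
    def delay_years(extra_orders_of_magnitude, doubling_time_years=2.0):
        return extra_orders_of_magnitude * math.log2(10) * doubling_time_years

    for k in (1, 3, 6):
        print(k, "orders of magnitude ->", round(delay_years(k), 1), "years")
    # 1 -> ~6.6 years, 3 -> ~19.9 years, 6 -> ~39.9 years

Each extra order of magnitude of required complexity costs only log2(10) doublings, so even large revisions to the brain-complexity estimate shift the projected crossover by years to decades, not centuries.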

The whole thing hinges on Moore's Law, which is (currently) looking rickety.


This mainly has to do with the incentives of journalists in an attention-based industry, I think.


As well as the incentives of researchers, developers, and executives in the NN industry.




