
I am agnostic to its success or continued development.

I do not think there is any hard evidence to indicate it will suddenly stop getting better. We have only seen consistent progress, and the cost of inference per kWh keeps going down. Additionally, the industry is figuring out how to shrink models. A few years ago GPT-3-scale models filled datacenters; now they fit on an iPhone. We will probably see them sneak onboard drone flight controllers, into toys as ambulation chips, into cheap security cameras, etc.

The idea that no new technology has ever required a scale jump in supply chains and energy consumption is false (see steel/computers/shipping/etc.).

As GPUs keep improving each year, the cost of training a model keeps coming down, and more companies will have the opportunity to train models. Four years ago you needed a hundred thousand dollars to train some of those toy DeepMind RL agents. Now you can rent an H100/A100 for $1-5k a month and go ham. Give it 10 years and nerdy high schoolers will be buying old H100s for their desks, just like we bought PowerPC server blades with half a terabyte of RAM for 10 bucks on Craigslist. (Unless Moore's law stops, but then we have a bigger problem.)
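The "$1-5k for a month" figure is easy to sanity-check with back-of-envelope arithmetic. A quick sketch (the hourly rates below are illustrative assumptions, not actual cloud quotes):

```python
# Back-of-envelope: what a month of round-the-clock GPU rental costs
# at an assumed hourly rate. Rates are illustrative, not real quotes.
HOURS_PER_MONTH = 24 * 30  # 720 hours

def monthly_cost(hourly_rate_usd: float) -> float:
    """Cost in USD of running one GPU 24/7 for a 30-day month."""
    return hourly_rate_usd * HOURS_PER_MONTH

# Assumed spot-market-ish rates (USD/hour), chosen for illustration.
for gpu, rate in [("A100", 1.50), ("H100", 4.00)]:
    print(f"{gpu} at ${rate:.2f}/hr -> ${monthly_cost(rate):,.0f}/month")
```

Under those assumed rates, a month lands between roughly $1,100 and $2,900, squarely inside the $1-5k range claimed above.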

I do not speculate on whether AI will turn into a god or something like that. It just seems more likely to me that we are going to get more AI, not less. I genuinely do not see real evidence of a wall. People just want it to fail.



> I do not think there is any hard evidence to indicate it will suddenly stop getting better. We have only seen consistent progress, and the cost of inference per kWh keeps going down.

You could have said exactly the same thing about Moore's law for decades and decades. Then progress fell off that curve. Eventually, any such trend hits physical limits and stops.

> I am agnostic to its success or continued development.

Then the question is: why the double standard? We don't have any hard evidence that AI will keep getting better, just like we don't have any hard evidence that TSMC or Intel will pull off another major process shrink or two.

Past performance is not a guarantee of future results.

> A few years ago GPT-3-scale models filled datacenters; now they fit on an iPhone. We will probably see them sneak onboard drone flight controllers, into toys as ambulation chips, into cheap security cameras, etc.

I don't think "more GPT-3" is what people mean by AI "getting better."



