Hacker News

The bubble bursts when Apple announces it's doing good enough (private/secure) LLMs on device. At that point the capex on cloud infra starts to come into question and the dominoes start to fall...


Google's been doing this since at least 2022 and... well, nobody really cares.


LLMs, to me, are the least interesting part of AI. Deep learning has proven very useful for signal processing and image segmentation, among other things. Those models are small enough to run on phones. LLMs simply don't seem to be that useful at small scales because the illusion of knowledge falls apart with too few parameters.
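The size gap the comment is pointing at can be made concrete with a back-of-envelope comparison. The parameter counts below are widely cited approximate figures (MobileNetV2 ~3.4M, the classic U-Net ~31M), set against a "small" 7B-parameter LLM; the helper function is just illustrative arithmetic, not any real framework's API:

```python
# Rough weights-only footprint comparison at 16-bit precision.
# Parameter counts are approximate public figures, used for scale only.

def fp16_mb(n_params: float) -> float:
    """Weights-only size in MB at 2 bytes per parameter (fp16)."""
    return n_params * 2 / 1e6

models = {
    "MobileNetV2 (classification)": 3.4e6,  # ~3.4M params
    "U-Net (segmentation)": 31e6,           # ~31M params
    "7B LLM": 7e9,                          # 7B params
}
for name, n in models.items():
    print(f"{name}: {fp16_mb(n):,.0f} MB at fp16")
```

The vision models land in the single-to-double-digit MB range, comfortable for a phone, while even a small LLM is three orders of magnitude larger before you account for KV cache and activations.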


Yeah, it’s only a matter of time till LLMs can easily be run locally by more people, and once the market realizes that, it’s over.


You can't run anything close to ChatGPT levels of parameters locally, though.
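A quick sanity check supports this. OpenAI has not disclosed ChatGPT's actual size, so the sketch below hypothetically uses the public GPT-3 figure of 175B parameters as a stand-in, and asks whether the weights alone would fit in a typical flagship phone's RAM at common quantization levels:

```python
# Back-of-envelope: memory to hold the weights alone, ignoring
# KV cache, activations, and the OS's own memory needs.

def weight_memory_gb(n_params: float, bytes_per_param: float) -> float:
    """Raw weight storage in GB."""
    return n_params * bytes_per_param / 1e9

GPT3_SCALE = 175e9   # parameters; public GPT-3 figure, used as a stand-in
PHONE_RAM_GB = 8     # assumed flagship-phone RAM

for bits, label in [(16, "fp16"), (8, "int8"), (4, "int4")]:
    gb = weight_memory_gb(GPT3_SCALE, bits / 8)
    fits = "yes" if gb <= PHONE_RAM_GB else "no"
    print(f"{label}: {gb:.1f} GB of weights -> fits in {PHONE_RAM_GB} GB? {fits}")
```

Even at aggressive 4-bit quantization a 175B-scale model needs tens of GB for weights alone, which is why local deployments center on much smaller models.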


As laughable as Apple's efforts have been so far, I think they still have an advantage precisely because of the unified memory architecture.



