
You're all entertaining the strange idea of a world in which no open-weight coding models are trained in the future. Even in a world where VC spending vanished completely, coding models are such a valuable utility that I'm sure, at the very least, companies and individuals would crowdsource them on a recurring basis, keeping them up to date.

The value of this technology has been established; it's not going anywhere anytime soon.





SOTA models cost hundreds of millions to train. I doubt anyone is crowdsourcing that.

And that’s assuming you already have a lot of the infrastructure in place.


I think FAANG and the like would probably crowdsource it, given that (according to the hypothesis presented) they would only have to do it every few years, and that they are ostensibly realizing improved developer productivity from these models.

I don’t think the incentive to open source a $200 million LLM exists the same way it does for frameworks like React.

And for closed-source LLMs, I’ve yet to see any verifiable metrics indicating that “productivity” increases are having any external impact: new products released, new games on Steam, new startups founded, etc.

Certainly not enough to justify bearing the full cost of training and infrastructure.



