Hacker News

There are free and offline options, like Llama.cpp, but you will have to pay by giving up your privacy to Meta (or similar large companies).


How does using an offline model give up your privacy?

Also, running models locally requires good hardware to get acceptable performance. It's still a large barrier to entry.
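To give a rough sense of that hardware barrier, here is a back-of-the-envelope sketch (my own illustrative arithmetic, not from the thread) of how much memory just the weights of a local model need at common quantization levels. The 7B parameter count and the bit widths are example figures; real usage adds overhead for the KV cache and activations.

```python
def weight_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate size of the model weights alone, in GB.

    Excludes KV cache, activations, and runtime overhead, which add more
    on top of this.
    """
    return n_params * bits_per_weight / 8 / 1e9

# A hypothetical 7B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"{bits}-bit: ~{weight_size_gb(7e9, bits):.1f} GB")
# 16-bit: ~14.0 GB
# 8-bit:  ~7.0 GB
# 4-bit:  ~3.5 GB
```

So even an aggressively quantized 7B model wants several GB of RAM or VRAM before any inference speed concerns, which is why consumer laptops without a decent GPU or plenty of memory struggle.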


Sure, for now, and maybe in the future. But it's possible that paid models will end up greatly outpacing free ones, and at some point the companies controlling them will stop burning billions of dollars per month and jack up prices.



