Hacker News

Where is AI actually selling and doing well? What's a good resource for these numbers? What are the smaller scale use-cases where AI is selling well?

I am genuinely curious, because advances in LLMs, VLMs, and generative AI are proving useful, but the societal impact, at the scale and rate people expected, is not revealing itself.



Coding - e.g. Claude Code and Cursor both announced $1B revenue run rates.


That would be meaningful if they weren’t losing money to generate that revenue.


The product works and saves enough human effort to justify the real cost. People will eventually pay when it comes down to it.

If that were the case, why not charge more?

Because they're loss-leading like all their competitors for now

I am running a container on an old 7700K with a 1080 Ti that gives me VSCode completions with RAG, with similar latency and enough accuracy to be useful for boilerplate etc…

That is something I would possibly pay for, but since the failures on complex tasks are so expensive, this seems to be the major use case, and it will just become a commodity.

Creating the scaffolding for a JWT or other similar tasks will be a race to the bottom IMHO, although valuable and tractable.

IMHO they are going to have to find ways to build a moat, and what these tools are really bad at is the problem domains that make your code valuable.

Basically anything that can be vibe-coded can be trivially duplicated, and the big companies will just kill off the small guys, who actually need to charge to pay the bills.

Something like surveillance capitalism will need to be found to generate the revenue needed at the scale of Microsoft etc…


Given how every CPU vendor seems to be pushing some kind of NPU, locally run models will probably be far more common in the next 5 years. And convincing everyone to pay a subscription for very minimal improvements in functionality is going to be hard.


The NPUs integrated into CPU SoCs are very small compared to even integrated GPUs, much less discrete or datacenter GPUs.

NPUs seem to be targeted towards running tiny ML models at very low power, not running large AI models.


Have you documented your VSCode setup somewhere? I've been looking to implement something like that. Does your setup provide next edit suggestions too?


I keep idly wondering what would be the market for a plug and play LLM runner. Some toaster sized box with the capability to run exclusively offline/local. Plug it into your network, give your primary machine the IP, and away you go.

Of course, the market segment that would be most interested probably has the expertise and funds to set up something with better horsepower than could be offered in a one-size-fits-all solution.



Ooof, right idea but $4k is definitely more than I would be comfortable paying for a dedicated appliance.

Still, glad to see someone is making the product.


I am working on a larger project around containers, with isolation stronger than current conventions but short of Kata etc…

But if you follow the Podman instructions for CUDA, llama.cpp shows you how to use their VSCode plugin here:

https://github.com/ggml-org/llama.vscode
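For anyone asking about the setup: a minimal sketch of what a local completion server behind that extension can look like, using llama.cpp's published CUDA server container under Podman with CDI GPU passthrough. The image tag, model file, port, and flags here are assumptions drawn from the llama.cpp and llama.vscode docs, not the commenter's actual config:

```shell
# Assumes nvidia-container-toolkit is installed and CDI is configured,
# and a GGUF completion model has been downloaded into ~/models.
mkdir -p ~/models

# Run llama.cpp's CUDA server image, exposing the HTTP API that the
# llama.vscode extension talks to (its docs use port 8012 by default).
# -ngl 99 offloads all layers to the GPU; :Z relabels the volume for SELinux.
podman run --rm \
  --device nvidia.com/gpu=all \
  -v ~/models:/models:Z \
  -p 8012:8012 \
  ghcr.io/ggml-org/llama.cpp:server-cuda \
  -m /models/qwen2.5-coder-1.5b-q8_0.gguf \
  --host 0.0.0.0 --port 8012 -ngl 99
```

Then point the extension's endpoint setting at http://localhost:8012. The model name above is just an example; any completion-tuned GGUF small enough for the card's VRAM should behave similarly.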


Market size for this is in the billions though, not trillions.


it's easily a 200bn ARR business, if coding agent achieved another step jump in abilities ~ 1trn+ marketcap


> if coding agent achieved another step jump in abilities ~ 1trn+ marketcap

Do you want to walk us through that math?


Agreed, coding is one. What else?


Professional legal services seem to be picking up steam. Which sort of makes sense as a natural follow-on to programming, given that 'the law' is basically codified natural language.


I don't know how it is in other countries, but in the UK using LLMs for any form of paid legal services is hugely forbidden, and would also be insanely embarrassing. Like, 'turns out nobody had any qualifications and they were sending all the work to mechanical Turks in third world countries, who they refused to pay' levels of embarrassing.

I say this as someone who once had the bright idea of sending deadline reminders, complete with full names of cases, to my smart watch. It worked great and made me much more organised until my managers had to have a little chat about data protection and confidentiality and 'sorry, what the hell were you thinking?'. I am no stranger to embarrassing attempts to jump the technological gun, or the wonders of automation in time saving.

But absolutely nobody in any professional legal context in the UK, that I can imagine, would use LLMs with any more gusto and pride than an industrial pack of diarrhoea relief pills or something - if you ever saw it in an office, you'd just hope it was for personal use and still feel a bit funny about shaking their hands.


Except that it keeps getting lawyers into trouble when they use it.

https://www.reuters.com/legal/government/judge-disqualifies-...


Yeah, good point. These things never get better.

Newsrooms, translation services.

sales, marketing, customer support, oh my, so many


I don't use it, but I know several people who use ChatGPT to edit emails etc. so they don't come across as nasty. How well it works, I can't say.


Most of my family uses ChatGPT instead of Google to answer questions, despite my warnings that it’ll just make stuff up. I definitely Google much less now than I used to, directing a fair amount of that into ChatGPT instead.


But how much are you paying for these services?


My family? Same as they pay for Google

That's frankly mostly because Google search got so massively worse... I'd still use Google more if not for the fact that queries that gave me useful answers 5 years ago no longer do.


You can check on trustmrr.com (mostly indie/solo businesses) that a large chunk of those smaller companies make money by selling AI video generation and other genAI services.



