
> There's a dichotomy in the software world between real products (which have customers and use cases and make money by giving people things they need) and hype products (which exist to get investors excited, so they'll fork over more money).

Is AI not both of these things? Are there no real AI products with real customers that make money by giving people what they need?

> LLMs are a more substantive technology than blockchain ever was, but like blockchain, their potential has been greatly overstated.

What do you view as the potential that’s been stated?



Not OP, but for starters, LLMs != AI.

LLMs are not an intelligence, and the people who treat them as if they were infallible oracles of wisdom are responsible for a lot of this fatigue with AI.


>Not OP but for starters LLMs != AI

Please don't make up your own definitions like this.

Pretty much anything and everything that uses neural nets is AI. Just because you don't like how the term has been defined since the beginning of the field doesn't mean you get to reframe it.

In addition, since humans are not infallible oracles of wisdom, by your definition they wouldn't count as an intelligence either.


Why, then, is there an AI-powered dishwasher but no AI car?


https://www.tesla.com/fsd ?

I also don't understand the LLM ⊄ AI people. Nobody was whining about pathfinding in video games being called AI lol. And I have to say LLMs are a lot smarter than A*.
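To make the game-pathfinding point concrete: A* has been called "AI" in games textbooks for decades, and it's an entirely classical search algorithm. A minimal sketch on a 2D grid (the grid, start, and goal here are made-up illustration values):

```python
import heapq

def a_star(grid, start, goal):
    """Minimal A* on a 2D grid of 0 (walkable) / 1 (wall).

    Uses the Manhattan-distance heuristic; returns the path as a list
    of (row, col) cells, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    # Heap entries: (estimated total cost, cost so far, cell, path so far)
    open_heap = [(heuristic(start), 0, start, [start])]
    best_cost = {start: 0}
    while open_heap:
        _, cost, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (node[0] + dr, node[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0):
                new_cost = cost + 1
                if new_cost < best_cost.get(nxt, float("inf")):
                    best_cost[nxt] = new_cost
                    heapq.heappush(open_heap,
                                   (new_cost + heuristic(nxt), new_cost,
                                    nxt, path + [nxt]))
    return None

# Tiny example: route around a wall from the top-left to the bottom-left.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = a_star(grid, (0, 0), (2, 0))
```

Whatever one thinks of the terminology, this kind of code has shipped under the "game AI" label since long before neural nets were practical.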


I can't find any mention of AI there.

Also, it's funny how they add "(Supervised)" everywhere. It reads like "Full Self-Driving (not really)".


Yes, one needs some awareness of the technology. Computer vision is unambiguously AI; for motion planning there are classical algorithms, but I believe Tesla and Waymo both use NNs there too.

Look, I don't like the advertising of FSD, or Musk himself, but without a doubt we have cars using significant amounts of AI that work quite well.


None of those things contain actual intelligence; on that basis, any software is "intelligent". AI is the granddaddy of hype terms, going back many decades, and it has failed to deliver; LLMs will fail to deliver too.


It's because nobody was trying to take video game behavior scripts and declare them the future of all things technology.


Ok? I'm not going to change the definition of a 70-year-old field because people are annoyed at ChatGPT wrappers.


A less uncharitable way to state this point: a lot of current LLM applications are just very thin shells around ChatGPT and the like.

In those cases, the actual "new" technology (i.e., not necessarily the underlying AI) is not as substantive and novel (to me at least) as a product whose internals are not just an existing LLM.

(And I do want to clarify that, to me personally, this tendency toward "thin-shell" products is an inherent flaw of the current state of AI. Having a very flexible LLM with broad applications means you can just put ChatGPT into a lot of products and have it more or less work, with the caveat that what you get is rarely a better UX than what you'd get by prompting an LLM yourself.

When someone isn't using LLMs, in my experience you get more bespoke engineering. The results might not be better than an LLM's, but that bespoke code is obviously much more interesting to me as a fellow programmer.)
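The "thin shell" pattern described above can be sketched in a few lines. Everything here is a made-up illustration: `call_llm` is a hypothetical stand-in for a network call to an LLM provider, and `summarize_meeting` is an invented example product; the point is that the entire "product" is a prompt template plus one model call.

```python
def call_llm(prompt: str) -> str:
    # Hypothetical stand-in: a real product would send the prompt to a
    # hosted model over an API and return the model's reply.
    return f"<model response to {len(prompt)} chars of prompt>"

def summarize_meeting(transcript: str) -> str:
    """The whole 'thin shell': a prompt template wrapped around one LLM call."""
    prompt = (
        "You are a meeting assistant. Summarize the transcript below "
        "as three bullet points of action items.\n\n" + transcript
    )
    return call_llm(prompt)
```

The engineering surface of such a product is the template string; swap the template and you have a "different" product with the same internals, which is exactly why these shells rarely beat prompting the model directly.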


Yes, ok, then I definitely agree.


Shells around ChatGPT are fine if they provide value.

Way better than AI jammed into every crevice for no reason.



