Hacker News | new | past | comments | ask | show | jobs | submit | login

Funnily enough I was just reading an article about this and "my boyfriend is AI" is the tamer subreddit devoted to this topic because apparently one of their rules is that they do not allow discussion of the true sentience of AI.

I used to think it was some fringe thing, but I increasingly believe AI psychosis is very real and a bigger problem than people think. A high-level member of the leadership team at my company is absolutely convinced that AI will take over governing human society in the very near future. I keep meeting more and more people who will show me slop barfed up by AI as though it were the same as them actually thinking about a topic (they will often proudly proclaim "ChatGPT wrote this!" as though uncritically accepting slop were a virtue).

People should be generally more aware of the ELIZA effect [0]. I would hope anyone serious about AI has written their own ELIZA implementation at some point. It's not very hard and a pretty classic beginner AI-related software project, almost a party trick. Yet back when ELIZA was first released, people genuinely became obsessed with it and used it as a true companion. If such a stunningly simple linguistic mimic is so effective, what chance do people have against something like ChatGPT?
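To show just how simple the trick is, here's a minimal ELIZA-style responder. The rules and reflections are a tiny, hypothetical subset invented for illustration, not Weizenbaum's actual script; the whole mechanism is just regex matching plus pronoun swapping.

```python
import re

# Swap first/second person so the echo sounds responsive.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "i", "your": "my", "are": "am",
}

# (pattern, response template) pairs, tried in order; the
# catch-all at the end guarantees there is always a reply.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please go on."),
]

def reflect(fragment: str) -> str:
    """Flip pronouns word by word: 'my job' -> 'your job'."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.split())

def respond(sentence: str) -> str:
    text = sentence.lower().strip().rstrip(".!?")
    for pattern, template in RULES:
        m = re.match(pattern, text)
        if m:
            return template.format(*(reflect(g) for g in m.groups()))
    return "Please go on."

print(respond("I need a break"))   # -> Why do you need a break?
print(respond("My boss hates me"))  # -> Tell me more about your boss hates you.
```

A few dozen such rules were enough for ELIZA's users to confide in it; there is no model of meaning anywhere, just pattern substitution.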

LLMs are just text compression engines with the ability to interpolate, but they're much, much more powerful than ELIZA. It's fascinating how much weaker we are against linguistic mimicry than against visual mimicry: let DALL-E or Stable Diffusion make a slightly weird eye and people instantly recoil, but LLM slop escapes scrutiny far more easily.

I increasingly think we're in less of a bubble than it appears, because the delusions around AI run so much deeper than mere bubble-think. So many people I've met need AI to be more than it is, on an almost existential level.

0. https://en.wikipedia.org/wiki/ELIZA_effect



I'm so surprised that only one comment mentions ELIZA. History repeats itself as a farce... or a very conscious scam.



