What about the difference that the human knows what they don't know?

In contrast, the LLM knows nothing, but confidently half-regurgitates correlational text that it has seen before.



As far as the research on this goes, LLMs (internally) mostly know what they know, but incentivizing that information to make it into the output is difficult.



