Hacker News
player1234
3 months ago
on:
Why language models hallucinate
There is no such thing as confidence about the actual facts, only confidence in the probable output given the input. Factual confidence is impossible with the current architecture.
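To make the distinction concrete: a minimal sketch of what a model's "confidence" actually is — the softmax probability assigned to each candidate next token. The logits and the example prompt below are hypothetical, for illustration only; the point is that the probabilities are a function of the input and the training distribution, not of the facts.

```python
import math

def softmax(logits):
    # Standard numerically stable softmax over a dict of token -> logit.
    m = max(logits.values())
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

# Hypothetical logits for the prompt "The capital of Australia is".
# Co-occurrence statistics could easily favor the wrong answer.
logits = {"Sydney": 4.0, "Canberra": 3.0, "Melbourne": 1.0}
probs = softmax(logits)

# The model is "confident" -- but that confidence measures probable
# output given the input, not factual correctness.
print(max(probs, key=probs.get))
```

Here the highest-probability token would be the factually wrong one; nothing in the architecture distinguishes that case from a correct answer with the same probability.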