
Given that LLMs can learn to translate between languages just from having lots of related tokens, without any explanations, I'd bet they could translate those thoughts to words even if the person doesn't think of them as words.

It would probably take more effort to get data from such people, though. For people with an inner monologue, you could just have them read text aloud, record that, and then follow their inner monologues.
