Hacker Newsnew | past | comments | ask | show | jobs | submitlogin

That’s a good point, I never thought of hiding stuff in the images you send. LLMs truly are the most insecure software in history.

I remember testing the precursor to Gemini, and you could just feed it a really long initial message, which would wipe out its system prompt. Then you could get it to do anything.



Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: