
I mean, if we view it as a prediction algorithm and prompt it with "come up with a cool line to justify suicide", then that is a home run.

This does kinda suck, because the same guardrails that prevent any kind of disturbing content can be used to control information. "If we feed your prompt directly to a generalized model, kids will kill themselves! Let us carefully fine-tune the model with our custom parameters and filter the input and output for you."


