I don’t get how this is supposed to be enforced. Say your license says “no hate speech” and I have a bunch of models and code and I make, I dunno, a rewrite of the Old Testament[0].

You see it and somehow connect the dots and you suspect I used your AI to do it. Now what do you do? Sue me? What's the threshold for it being worth your time, given that it's nontrivial (maybe impossible?) to prove that your model was used for the thing you prohibited, and not some other model or combination of models?

I guess you could somehow watermark your model's output, but that radically decreases its utility and can probably be defeated anyway.
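For context on why watermark detection is only ever statistical: the schemes in the research literature typically bias the model toward a pseudo-random "green" subset of the vocabulary keyed on the previous token, and the detector just measures how often text lands on green tokens. Here's a toy sketch of that idea, with a made-up vocabulary and a fake "generator" standing in for a biased sampler (none of this is any actual vendor's scheme):

```python
import hashlib

def green_list(prev_token, vocab, fraction=0.5):
    """Deterministically pick a pseudo-random 'green' subset of the
    vocabulary, keyed on the previous token."""
    greens = set()
    for tok in vocab:
        digest = hashlib.sha256(f"{prev_token}|{tok}".encode()).digest()
        if digest[0] < 256 * fraction:
            greens.add(tok)
    return greens

def watermarked_sample(vocab, length, seed_token="the"):
    """Toy 'generator' that always emits a green token, standing in for
    a real model that merely biases sampling toward the green list."""
    out = [seed_token]
    for _ in range(length):
        greens = sorted(green_list(out[-1], vocab))
        out.append(greens[0] if greens else vocab[0])
    return out

def green_fraction(tokens, vocab):
    """Detector: what fraction of transitions land on a green token?
    Roughly 0.5 for ordinary text, near 1.0 for the toy generator."""
    hits = sum(cur in green_list(prev, vocab)
               for prev, cur in zip(tokens, tokens[1:]))
    return hits / max(1, len(tokens) - 1)

vocab = ["the", "cat", "sat", "on", "a", "mat", "dog", "ran", "fast",
         "home", "tree", "bird", "sky", "blue", "red", "green", "big",
         "small", "old", "new"]
text = watermarked_sample(vocab, 40)
print(green_fraction(text, vocab))  # near 1.0 here; ~0.5 for unrelated text
```

Which illustrates the problem: the detector only gives you a likelihood ratio, a paraphrasing pass pushes the green fraction back toward chance, and none of it identifies *which* model produced the text.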

So I really don’t understand, besides performative and politically fashionable “alignment” signaling, what this even means.

[0]: https://en.wikipedia.org/wiki/The_Bible_and_violence
