Hacker News

We would need regulation to stop models from ingesting data they do not have the rights to, which would mean something like laws governing ML training: having to declare what data you fed the model, and so on. Like some kind of SOC 2 audit for data provenance.

Maybe ML weights are just numbers, but then so are a movie, an MP3, a logo, a brand, and so on.
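To make "declare what data you fed it" concrete, here is a minimal sketch of what a machine-readable provenance manifest could look like: each training input is recorded with a content hash and a claimed license, so an auditor can re-hash the data and compare. All function names and fields here are hypothetical illustrations, not any real SOC 2 control or existing standard.

```python
import hashlib
import json

def provenance_record(name: str, data: bytes, license_id: str, source_url: str) -> dict:
    """Record one dataset's identity and claimed rights (hypothetical schema)."""
    return {
        "dataset": name,
        "sha256": hashlib.sha256(data).hexdigest(),  # content fingerprint an auditor can re-derive
        "license": license_id,
        "source": source_url,
    }

def build_manifest(records: list[dict]) -> str:
    """Serialize the declared training inputs into an audit manifest."""
    return json.dumps({"training_inputs": records}, indent=2, sort_keys=True)

manifest = build_manifest([
    provenance_record("tiny-corpus", b"example text", "CC-BY-4.0",
                      "https://example.org/tiny-corpus"),
])
print(manifest)
```

The point of hashing rather than just naming the dataset is that the declaration becomes falsifiable: if the shipped data does not hash to the declared value, the manifest is provably wrong.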



Regulation would introduce a whole new set of problems and unintended consequences.


Which tradeoffs do we want though? Endlessly degrading AI vomit sounds much worse to me.



