Hacker News

"You WILL fail, so do not try" is indeed a sane take.


Friction matters.

The locks on my doors will fail if somebody tries hard enough. They are still valuable.


> They are still valuable.

Only because of the broader context of the legal environment. If there were no prosecution for breaking and entering, they would be effectively worthless. For the analogy to hold, we need laws that impose coercive measures on those trying to bypass guardrails. Theoretically, this already exists in the Computer Fraud and Abuse Act in the US, but courts haven't adopted that interpretation quite yet.


Goalpost movement alert. The claim was that "AI can be told not to output something". It cannot. It can be told not to output something sometimes, and that might stick, sometimes. This is true. The original statement is not.


If you insist on maximum pedantry, an AI can be told not to output something, as this claim says nothing about how the AI responds to that command.


You are correct and you win. I concede. You outpedanted me. Upvoted.


Define "fail."

Preventing 100% of such images? Fail.

Reducing their number by 10-25%, or even more? I don't think so.

Not to mention the experience you gain learning what you can and what you can't prevent.


After learning that guaranteed delivery was impossible, the once-promising "Transmission Control Protocol" is now only an obscure relic of a bygone era from the 70s, and a future of inter-connected computer systems was abandoned as merely a delusional, impossible fantasy.
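The sarcasm lands because TCP's answer to unreliable delivery was never to guarantee it at the link level, but to retry until acknowledged. A minimal illustrative sketch of that idea (the function, loss simulation, and parameters here are invented for illustration; this is not real TCP):

```python
import random

def send_over_lossy_channel(message, loss_rate=0.5, max_retries=10, seed=42):
    """Stop-and-wait sketch: retransmit until an ack arrives.

    The simulated channel drops each transmission with probability
    `loss_rate`; retrying turns an unreliable link into one that is
    reliable in practice, even though any single attempt may fail.
    """
    rng = random.Random(seed)  # seeded so the sketch is deterministic
    for attempt in range(1, max_retries + 1):
        delivered = rng.random() > loss_rate  # simulate packet loss
        if delivered:
            return attempt  # ack received after this many tries
    raise TimeoutError("gave up after max_retries attempts")
```

With a 50% loss rate, the chance that all ten attempts fail is about 0.1% — the same "imperfect mitigation is still valuable" argument as above.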



