Only because of the broader context of the legal environment. If there were no prosecution for breaking and entering, they would be effectively worthless. For the analogy to hold, we need laws that impose coercive measures on those trying to bypass guardrails. In theory this already exists in the Computer Fraud and Abuse Act in the US, but that interpretation hasn't materialized in the courts quite yet.
Goalpost movement alert. The claim was that "AI can be told not to output something". It cannot. It can be told not to output something, and that might stick, sometimes. That much is true. The original claim is not.
After learning that guaranteed delivery was impossible, the once-promising "Transmission Control Protocol" became nothing but an obscure relic of a bygone 1970s era, and the future of interconnected computer systems was abandoned as merely a delusional, impossible fantasy.