
it is not for generating factual English sentences.

Then the tool should not be doing it --- but it does. And therein lies the legal liability.



The tool did it because the person asked it to. They used the tool the wrong way.

The knives are entering people’s guts. They should not be doing that. The knife companies should be liable for these stabbings.


The tool did it because the person asked it to.

The tool did it because this is what it was designed and trained to do --- at great expense and effort --- but somewhat less than successfully.

You can't have it both ways --- the tool can't be "intelligent" yet too stupid to understand its own limitations.

If people ask the wrong questions, the "intelligent" response would be, "Sorry, I can't do that".

Maybe the problem here is that this "intelligent" tool is really as dumb as a rock. And it's only a matter of time until lawyers argue this point to a jury in court.



