I like the idea, but it is too black-and-white in some places. Never exploiting security vulnerabilities? There are certainly cases where that is justifiable -- attacking the Enigma machine, for example (speaking of Turing...). If World War II were to happen today, there would almost certainly be attacks on computer systems, and we would want our military to attack the enemy military's systems.

There is also the matter of law enforcement. It is better for law enforcement agencies to exploit vulnerabilities (with a warrant) than to require special back doors in software. No reasonable person can take issue with the police exploiting a vulnerability to catch a mafia hit man, a criminal arms dealer, etc. Some hacker needs to be willing to write the software that exploits those vulnerabilities. I would say that writing software with back doors is a much more serious ethical problem than exploiting unintentional vulnerabilities.



There's an issue open on the repo right now about the exact paragraph you're talking about. I was thinking of making it say:

"I swear to not design software for the purpose of MALICIOUSLY exploiting a vulnerability, damaging another computer system or exploiting a user."

Although, yeah, I see what you mean with the Enigma thing. Hard to put into words. Suggestions?


Utilitarianism comes to mind: seek to maximize the benefit to society. Sometimes that means attacking, sometimes it means defending, sometimes it means refusing to create the software you are told to create.



