The Promise to Develop Ethical Software (github.com/maxmackie)
22 points by darxius on May 1, 2013 | 36 comments


There's also the ACM Software Engineering Code of Ethics and Professional Practice: http://www.acm.org/about/se-code


Does anyone really hold software developers to the same standards as licensed professions (lawyers, accountants, doctors, professional engineers, etc.)?

NB I am married to a lawyer and have a brother who is a professionally qualified engineer, so I have to be careful never to refer to myself as a "professional" as I have no professional qualifications. I'm also quite happy not to have the level of personal responsibility and liability that they have!


>> Does anyone really hold software developers to the same standards as licensed professions (lawyers, accountants, doctors, professional engineers, etc.)?

I don't know. But if electrical engineers, mechanical engineers, and software engineers are all working to build a jet, it would seem reasonable to expect the same diligence from them all.

>> I have to be careful never to refer to myself as a "professional"

You're a professional if you make a living at it.

"A professional is a person who is engaged in a certain activity, or occupation, for gain or compensation as means of livelihood; such as a permanent career, not as an amateur or pastime."

UPDATE:

BTW, licensing helps only if there's a reliable way to determine competence. Arguably, different areas of programming could be considered different fields of expertise.

Licensing can also be harmful to a field. See http://www.nytimes.com/2012/06/17/magazine/so-you-think-you-...

If you had to get a software license to build a web site for your local soccer club, that would make it very hard to get started programming.


There is a difference between the lay definition of "professional" and the legal or regulatory definition in some jurisdictions.

In many contexts a "professional" occupation is one that is governed either by the government or a self-regulating body. Accountants, electricians, doctors, lawyers, engineers, etc.


>> In many contexts a "professional" occupation is one that is governed either by the government or a self-regulating body. Accountants, electricians, doctors, lawyers, engineers, etc.

True, but as I edited my comment to say, these bodies can be self-serving as easily as they can serve the public. People already working in a field can inflate their wages by making it harder to enter that field.


I don't know about other fields, but around here there is absolutely no shortage of qualified lawyers.


That's one use of the term - in the UK (especially amongst people who are professionally qualified) the term often has a narrower meaning.

This doesn't give me any problems! And I am NOT suggesting it is appropriate for general development (although it probably is in a very narrow range of domains - e.g. safety critical systems).


licensed professions

I think that's your answer. How are you to hold an unlicensed professional accountable for anything that s/he's not already legally obligated to do? Note: I'm not making a statement in support of or in opposition to licensing software developers. It's interesting to think about, though.

I wouldn't really take pause in throwing around the term "software professional" unless I was around somebody who was a stickler about using the term "professional"... then I'd really start throwing the term around. People take themselves too seriously.


"How are you to hold an unlicensed professional accountable for anything that s/he's not already legally obligated to do?"

That's the sanction facing a lawyer/accountant/doctor/PE - if they mess up badly enough, their licensing body takes their license away and they can't work in that role anymore. Combine that with personal liability for the decisions they make, and I'm quite happy not being that kind of professional.


Programmers would need control over schedules and budgets to be held liable for issues.


I would change the push from ethical software to ethical development.

Software, like many drugs, is not "unethical" on its own; the user (or, in the case of medicine, the doctor) is the one who puts it to an unethical use.

I agree that in our society, where software is pervasive, an ethical framework needs to be in place. As an oath, it will not force everybody to comply, but the choice to break it will be made in full knowledge that it is an unethical choice. And that amounts to intentionality.

Can I advocate that software developers should take an oath to write software that does no harm? Ideally, the law would protect software developers who refuse a particular assignment on ethical grounds.


Create a new (viral) software license, the Ethical Public License (EPL). The EPL is linked to the oath and forbids use of the source code in any project or system deemed to break the oath.
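
A clause along these lines might do it - the wording below is invented here purely as an illustration, not taken from any existing license:

  The Software, and any work derived from it, shall not be used in any
  product, service, or system that the licensee knows, or reasonably
  should know, violates the Oath (for example, by collecting or sharing
  personal data without the informed consent of the person it describes).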

This might result in an economic incentive for companies like Path to curb bad behaviour, due to the cost of rewriting EPL code or using inferior code.

Note: The OSI will probably claim the EPL is not an open source license ("No Discrimination Against Fields of Endeavor") and the FSF will likely say the EPL isn't a free software license because it breaks Freedom 0 ("The freedom to run the program, for any purpose").


Not sure why Turing's name is associated with this. Is there any evidence that he thought about ethical use of computers?


The irony of taking an oath so focused on privacy and naming it after a famous codebreaker is just fantastic.

Of course it's not the same thing, and I'd never argue that he was wrong to do it given the circumstances, but Turing's career was built on reading other people's communications against their will.


When I first made the Oath (yesterday), I just picked someone I looked up to. Seeing as this should be a social endeavor, I'm open to changes.

https://github.com/maxmackie/Turing-Oath/issues/1


I honestly can't find the source of this, but I recall reading that Turing was against mnemonic assemblers (as they would consume scarce resources).

It's understandable given the computers of the time. But I suspect that at some point you'll probably want to talk about quality, which so many programmers have discussed.

David Parnas would be a more suitable candidate, if you really want to name it after someone. But I'd like to point out that you may not want to name the oath after a person unless they are involved in drafting it.

Dijkstra, for example, spoke frequently on the topic of being ethical as a programmer / professor. Out of respect, I would never name something like this after him unless I could back up every word with his various papers.


Good point; the oath could just not be named after someone. There's a github issue open for discussing the name change.


Tesla may be a good candidate. My only reservation is that he may have tried to invent (or did invent) weapons to sell to the military.


"I swear to respect the privacy of the user and secure all personal information in accordance with current standards."

(Except for Hitler's private communications.)


Gotcha: But what about the people communicating with Hitler? What about their privacy? Would you extend the exception to them? If so, fine. What about the information they, in turn, relay to others? Is that subject to the exception as well? What if Hitler only talks privately to one guy, who is within the exception, and then that guy turns around and talks to everybody Hitler intends to talk to, outside of the exception?

Where would the line be drawn? ;-)


Draw the line at the same place you would draw it for engineers. Who do you think designed the tanks, guns, aircraft, and ships we used to defeat the Third Reich? How might that war have ended had no engineers been willing to design those weapons?


Isn't software amoral?

The same routine that helps target a missile from a drone could help target viruses more effectively, but someone has to make the choice to use it for that purpose.


Things are amoral. The use and creation of things, however, is open to moral and ethical considerations.

Making a radiation therapy machine without the inclusion of safety mechanisms could be considered unethical. Making a computer vision system that tracks targets is amoral (what's a good word for something that is neither ethical nor unethical? I'm not fond of using moral/amoral for this), but applying it to a weapon targeting system hits an ethical gray area (as with any other weapon engineering task).

Like most things, the ethical status of an act is largely related to the intent and foreknowledge. Knowingly creating a system that will (not just could) be used for harming others (physically, emotionally, financially, etc) can be considered unethical. Making a system which is most likely going to be used for harm might be on the unethical side of gray, even if it has a number of potentially positive or neutral uses. Making a system that might be used for harm (consider knives) is probably not unethical, but then the manner of sale and marketing would become important for the calculation.


What is ethical software? Is netcat bad because you can do bad things with it?


Tools merely empower us to do better what we were already doing. The tool is never bad.

We should probably take cues from other professions. A scalpel is never bad, and cutting up a person can be ethically correct.

I haven't thought carefully about this, but it seems that intention is the most important factor, followed by actual outcome. (E.g. doing bad things without intent isn't necessarily unethical behavior. But some bad actions are so gruesome that it is hard to forgive the lack of foresight; you might call that recklessness.)


Guns are pretty unambiguously tools for killing. Arguably, not all killing is "bad" though.

Nuclear bombs are pretty unambiguously tools for killing an awful lot of people.

Software is interesting because it is tools for making tools for making tools. If your software has a big green button as the input and a 3D printed nuclear bomb as the output, you may have created a bad tool. If it takes 2 steps to connect your software with one other piece of software to achieve the same result, you still might want to think about what you're doing. Three steps, four steps, five steps, and the ethical boundaries get just as difficult to resolve as most other human problems.


Privacy and security are very important, but if one is going to write a code of ethics for software, it should probably include something about not writing software that kills people (see, for example, Therac-25).


Therac-25 was designed to cure people, not kill them (See http://en.wikipedia.org/wiki/Therac-25). If we are talking about software that is used in drones or missiles, that is a different matter, and one that the oath should address.


It was designed to treat them, but it was designed poorly. Due to a combination of poor risk assessment and inadequate QA, the hardware and software mechanisms needed to ensure safety were not in place. That is an ethical lapse on the part of the individuals and teams that developed it.
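
For what it's worth, here's a minimal sketch of the kind of redundant check that was missing (Python; the names MAX_SAFE_DOSE, hardware_interlock_ok and fire_beam are invented for illustration, and nothing here is Therac-25's actual logic) - refuse to act unless every safety condition is verified at the moment of action:

  # Hypothetical safety interlock sketch, not real medical-device code.
  MAX_SAFE_DOSE = 200  # hypothetical limit, arbitrary units

  def fire_beam(requested_dose, hardware_interlock_ok):
      # Re-check every safety condition immediately before acting,
      # rather than trusting state cached earlier in the session.
      if requested_dose > MAX_SAFE_DOSE:
          raise ValueError("dose exceeds configured safety limit")
      if not hardware_interlock_ok():
          raise RuntimeError("hardware interlock not confirmed; refusing to fire")
      # ... energize the beam only after both checks pass ...
      return requested_dose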


Therac-25 was not well designed, which is a much greater ethical problem than designing a machine to kill people. If the failure of your software can kill people, it is unethical to just hack together a solution.


Agreed. Feel free to fork the repo and add a paragraph/section.


This sort of thing has been tried before:

JSON.org License Literally Says it "shall be used for Good, not Evil": https://news.ycombinator.com/item?id=3693108

But, there is always the problem of defining evil: http://www.youtube.com/watch?v=JRxl02mULws

How long before the oath says "Except for IBM, its minions and customers..."?


I like the idea, but it is too black-and-white in some places. Never exploiting security vulnerabilities? There are certainly cases where that is justifiable -- attacking the Enigma machine, for example (speaking of Turing...). If World War II were to happen today, there would almost certainly be attacks on computer systems, and we would want our military to attack the enemy military's systems.

There is also the matter of law enforcement. It is better for law enforcement agencies to exploit vulnerabilities (with a warrant) than to require special back doors in software. No reasonable person can take issue with the police exploiting a vulnerability to catch a mafia hit man, a criminal arms dealer, etc. Some hacker needs to be willing to write the software that exploits those vulnerabilities. I would say that writing software with back doors is a much more serious ethical problem than exploiting unintentional vulnerabilities.


There's an issue open on the repo right now about the exact paragraph you're talking about. I was thinking of making it say

"I swear to not design software for the purpose of MALICIOUSLY exploiting a vulnerability, damaging another computer system or exploiting a user."

Although, yeah, I see what you mean with the Enigma thing. Hard to put into words. Suggestions?


Utilitarianism comes to mind: seek to maximize the benefit to society. Sometimes that means attacking, sometimes it means defending, sometimes it means refusing to create the software you are told to create.


How about naming it for Aaron Swartz?

As I noted in a comment on github, I would not conflate this issue of privacy and ethics, as it relates to user information collected by applications, with things like war crimes and crimes against humanity.



