In 2018, Stack Overflow's annual survey asked developers if they'd feel ultimately responsible for the unethical use of code they wrote. 80% said no.
This July, we helped run Coed:Ethics, the first conference about tech ethics specifically for practising developers. Our goal was to investigate three questions facing tech today: do we as developers need to think about ethics; what can we do to improve the ethics of our employers and our industry; and what should we do next?
Ethics are sometimes part of CompSci courses, but what’s learned in a classroom can seem very different on the ground. Do we need to think about this again? We asked experts from tech, design, big data, the law, education and psychology to discuss tech-related ethical issues and how they approached them. The audience included journalists, marketeers, and academics as well as us techies.
All the talks have already been written up in detail by several of the attendees (links below), and the whole conference was recorded and will be available for free on InfoQ. I won’t duplicate those excellent write-ups here. Instead, I’d like to step back and look at the larger questions the conference tackled and what we should do next.
Each speaker raised at least one example of tech-related behaviour that they considered an ethical issue.
As several speakers pointed out, most of these concerns are not new. They weren’t introduced by new tech like machine learning or AI - we’ve always had them (even chatbots have been around for 15 years). However, the fact that we didn’t act in the past is no argument that we shouldn’t act in the future. As the scale and reach of tech increase, the issues become more acute. Times change.
Culturally, I suspect we may have become better informed, and less accepting of poor behaviour, than we were. As our conference’s Data Citizenry expert Caitlin McDonald noted, many of us techies are now being directly affected by the code and algorithms that we or our fellow developers write. That gives us a new perspective.
Of course, not all of the examples we heard about from our speakers would be ethically dodgy to everyone. We all care about different issues to different degrees, and some ethical positions are complex, counter-intuitive or even contradictory. As our engineering speaker Yanqing Cheng challenged: is it right to break laws in order to introduce self-driving cars faster if they could ultimately save thousands of lives every week? Similarly, is it better to write good facial recognition software yourself or to leave it to less scrupulous folk who might do a worse job? Human rights lawyer Cori Crider counter-argued that such questions cannot be separated from who uses the software. You might do an amazing job of writing bug-free facial recognition, but it could still be used wrongly or without compassion.
If we’d been hoping to get a bible of unarguably ethical behaviour from the conference, we’d have been disappointed. It turns out doing the right thing is non-trivial!
So if there are no easy, universal rules for ethics, does that mean anything goes? Actually, our psychologist Andrea Dobson answered that question. Unlike philosophers, psychologists are pragmatists. Extensive research was done on ethical behaviour after the Nuremberg trials that followed WW2, with their ubiquitous “I was only following orders” rationales. Psychologists soon realised that unethical behaviour was not committed only by bad people. Most humans would behave unethically if ordered to (“obedience”) or if everyone else was (“conformity”).
However, the good news is that most of us are far happier if we stand up to these pressures and do what we believe is right. According to Dobson, therein lies the best human-scale definition of individual ethics: decide what you feel is right, then have the courage to stick to it in the face of pressure from authority and peers. “Courage is persistence in the face of fear.”
The second question the conference asked was: what can developers do to improve the ethics of our employers and our industry? We heard about several constructive actions.
None of this sounds impossibly difficult, so what stands in the way of developers doing it?
Our psychologist’s description of ethics - deciding what you believe in and standing up for it - makes ethics deeply personal. As an individual, you may or may not believe in drone warfare, privacy, or algorithmic decisions; it’s seldom a simple call and there are always arguments on either side. That makes ethics easy to grasp at a personal level (you know what you believe) but harder to share, generalise or encode. Is that going to be a problem? Does that mean it’s wrong?
One of the questions clever philosophers have been asking for 2,000 years is: is there a definition of “good”? (If you want to read about this, try the excellent “Justice: What’s the Right Thing to Do?” by Michael Sandel.) Those philosophers have failed to agree. I’m going to take a pragmatic approach and say there may or may not be a universal “right thing to do”. What matters is what’s going on around you; the informed, reasoned judgements you and your team make about your effect on the world; and what you care enough about to act on.
So if micro-ethics, though vital, are context-specific and somewhat personal, are there any concrete, high-level ethical aspirations that everyone can get behind, so that we can talk about “what to do”, not just “what not to do”? Actually, the UN have already made a start.
The United Nations’ Sustainable Development Goals, adopted in 2015 as the successors to the Millennium Development Goals of 2000, were created to “produce a set of universal goals that meet the urgent environmental, political and economic challenges facing our world”. They are 17 high-level, clear, bold, and uncontroversial targets for the human race, including “zero hunger”, “clean water” and “clean energy”. Despite their ambition, goals like these have proved highly effective: under the Millennium goals, “more than 1 billion people have been lifted out of extreme poverty and child mortality dropped by more than half”.
So, there is reason to believe that audacious, obvious, uncontentious goals we all agree on can have a significant impact on the world. Ethics isn’t just about what not to do, it’s about what we should be aiming to achieve. We reckon we should set some of these goals for the tech sector.
So what are the next steps for tech ethics?
Setting challenging ethical goals is a job for the whole industry but we’d like to get things started with what we suspect is the most obvious, no-brainer goal. We’ll make an announcement soon.
The conference turned out to be hugely popular. It demonstrated that developers are ready to do way more than endlessly debate the best JavaScript framework. In tech, we can aim higher than amorality, and we intend to do so.
If you want to follow our progress, follow @coedethics or @containersoluti.