In 2018, Stack Overflow's annual developer survey asked developers if they'd feel ultimately responsible for the unethical use of code they wrote. 80% said no.
This July, we helped run Coed:Ethics, the first conference about tech ethics specifically for practising developers. Our goal was to investigate three questions facing tech today:
- Is there an ethical problem?
- Is there anything devs can do about it?
- What might stand in our way?
Ethics are sometimes part of CompSci courses, but what’s learned in a classroom can seem very different on the ground. Do we need to think about this again? We asked experts from tech, design, big data, the law, education and psychology to discuss tech-related ethical issues and how they approached them. The audience included journalists, marketeers and academics as well as us techies.
All the talks have already been written up in detail by several of the attendees (links below), and the whole conference was recorded and will be available for free on InfoQ. I won’t duplicate those excellent write-ups here. Instead, I’d like to step back and talk about the larger questions the conference tackled and what we should do next.
Is there an Ethical Problem?
Each speaker raised at least one example of tech-related behaviour they considered an ethical issue, including:
- Tech use in weaponry with high civilian casualty rates.
- Algorithms that affected people, potentially adversely, without their knowledge.
- Bad conclusions from big data.
- Uninformed consent.
- Rogue or badly trained chatbots.
As several speakers pointed out, most of these concerns are not new. They weren’t introduced by new tech like machine learning or AI - we've always had them (even chatbots have been around for 15 years). However, the fact that we didn’t act in the past is no argument against acting in the future. As the scale and reach of tech increase, the issues become more acute. Times change.
Culturally, I suspect we may have become better informed, and less accepting of poor behaviour, than we were. As our conference’s Data Citizenry expert Caitlin McDonald noted, many of us techies are now being directly affected by the code and algorithms that we or our fellow developers write. That gives us a new perspective.
Of course, not all of the examples we heard about from our speakers would be ethically dodgy to everyone. We all care about different issues to different degrees, and some ethical positions are complex, counter-intuitive or even contradictory. As our engineering speaker Yanqing Cheng challenged us: is it right to break laws in order to introduce self-driving cars faster, if they could ultimately save thousands of lives every week? Similarly, is it better to write good facial recognition software yourself or leave it to less scrupulous folk who might do a worse job? Human rights lawyer Cori Crider counter-argued that such questions cannot be separated from who uses the software. You might do an amazing job of writing bug-free facial recognition, but it could still be used wrongly or without compassion.
If we’d been hoping to get a bible of unarguably ethical behaviour from the conference we’d have been disappointed. It turns out doing the right thing is non-trivial!
Are Rules Required?
So if there are no easy and universal rules for ethics, does that mean anything goes? Actually, our psychologist Andrea Dobson answered that question. Unlike philosophers, psychologists are pragmatists. Extensive research into ethical behaviour followed Germany’s post-WW2 Nuremberg trials, with their ubiquitous “I was only following orders” rationales. Psychologists soon realised that unethical behaviour was not only committed by bad people. Classic studies like Milgram’s obedience experiments and Asch’s conformity experiments showed that most humans would behave unethically if ordered to (“obedience”) or if everyone else was (“conformity”).
However, the good news is most of us are far happier if we stand up to these pressures and do what we believe is right. According to Dobson, therein lies the best human-scale definition of individual ethics. Decide what you feel is right and then have the courage to stick to it in the face of pressure from authority and peers. "Courage is persistence in the face of fear".
What Can Developers Do?
The second question the conference asked was what could developers do to improve the ethics of our employers and our industry? We heard about several constructive actions, including:
- If you are uncomfortable, speak up. Say something to your colleagues, your manager or your HR department, or even write an open letter. It works! Look at Google’s Project Maven: just 12 engineers pushed Google into dropping a whole military contract. You might not get the project cancelled, but you should at least be able to kick off some discussion, because engineers are highly sought after and their opinions matter even to big companies.
- Adam Sandor and Sam Brown talked about using Agile processes and models to nudge teams into regularly stepping back and considering what they are doing and its potential ramifications.
- Get a range of views and opinions, particularly from people who might be affected by your products but are different from you.
- Ask about ethics at interviews. This turns out to matter for both interviewers and interviewees: as an interviewer, asking about ethics signals that they matter to your team; as a candidate, the answers tell you a lot about a prospective employer. Several of our attendees have already started doing this.
- As Cori Crider, our human rights lawyer, told us: “Get informed!” Read about the world beyond tech and consider what role you want yourself and your creations to play in it. Decide what you care about.
What Stands in Your Way?
None of this sounds impossibly difficult, so what stands in the way of developers doing it?
- Imposter syndrome and lack of confidence. Most developers do not realise what a valuable commodity they are, so they doubt the power they have.
- Introversion. Dropping the Maven project took only 12 engineers - but it did need more than one. Lone voices are seldom heard, so you need to speak to your colleagues and build a consensus. That can seem scary, but others may be just as unhappy as you and simply not saying anything either. You can only find out by asking.
- Distraction. It’s very easy to put your head down and never think about what you believe in, or what your company does. Is that what you want?
- Courage. Overcoming imposter syndrome, shyness, inertia or introversion is difficult. Is the result important enough to you?
Macro- vs Micro-Ethics
Our psychologist’s description of ethics - deciding what you believe in and standing up for it - makes ethics deeply personal. As an individual, you may or may not believe in drone warfare, privacy, or algorithmic decisions; it’s seldom a simple call and there are always arguments on either side. That makes ethics easy to grasp at a personal level (you know what you believe) but more difficult to share, generalise or encode. Is that going to be a problem? Does that mean it’s wrong?
One of the questions clever philosophers have been asking for 2,000 years is, “Is there a definition of good?” (if you want to read about this, try the excellent “Justice: What’s the Right Thing to Do?” by Michael Sandel). Those philosophers have failed to agree. I’m going to take a pragmatic approach and say there may or may not be a universal “right thing to do”. What matters is what’s going on around you; the informed and reasoned judgements you and your team make about your effect on the world; and what you care enough about to act on.
Macro-Ethics and Global Goals
So if micro-ethics, though vital, are context-specific and somewhat personal, are there any concrete, high-level ethical aspirations that everyone can get behind - ones that let us talk about “what to do”, not just “what not to do”? Actually, the UN have already started that.
The United Nations’ Sustainable Development Goals were adopted in 2015, building on the Millennium Development Goals of 2000, with the objective to “produce a set of universal goals that meet the urgent environmental, political and economic challenges facing our world”. They are 17 high-level, clear, bold, and uncontroversial targets for the human race, including “zero hunger”, “clean water” and “clean energy”. Despite their ambition, this approach has proved highly successful: under the Millennium Development Goals, “more than 1 billion people have been lifted out of extreme poverty and child mortality dropped by more than half”.
So, there is reason to believe that audacious, obvious, uncontentious goals we all agree on can have a significant impact on the world. Ethics isn’t just about what not to do, it’s about what we should be aiming to achieve. We reckon we should set some of these goals for the tech sector.
What Next?
We think the next steps for tech ethics are to:
- Help devs recognise and stand up for their own beliefs.
- Generate bold, shared ethical goals for the technology industry that everyone can support.
Setting challenging ethical goals is a job for the whole industry but we’d like to get things started with what we suspect is the most obvious, no-brainer goal. We’ll make an announcement soon.
The conference turned out to be hugely popular. It demonstrated that developers are ready to do way more than endlessly debate the best JavaScript framework. In tech, we can aim higher than amorality, and we intend to do so.
If you want to follow our progress, check out @coedethics or @containersoluti.