John S Nolan, 19th September 2018, at @FFSTechConf
JOHN: So, hi, I'm John. I'm the most stupid of the organisers, because I'm the only one actually giving a rant. Have you noticed that? [Laughter]. Idiot!
Anyway. So, it was interesting trying to do this, because I'm very well used to public speaking and doing stuff like this, but with this constraint of ranting in ten minutes it's very hard to do a normal presentation - I can't do it. I've tried several times to write this, and I gave up.
So I have three pieces of card in my hand, and I’m going to improvise - this is how I'm going to do it.
We will talk about moral philosophy - I will be talking a lot about moral philosophy. So I see this session as being about dialectic. The question and answer session here is about dialectic: thesis-antithesis-synthesis and Socratic questions for clarity. And that's what I'm looking for, so that I can learn here as well! I don't have the answers, I have a frustration, and I want to kind of get it out.
So I will just be throwing a stream of shit at you for about five or ten minutes!
[Laughter].
And then I really just want to get into this, you know, the questions, and the, "How dare you say this or that."
Always good to start with a story. I suddenly realised I was going to present this cartoon, and Andrew, our lovely stenographer…. Hi, how are you doing?
CAPTIONER: Oh, now you're making me make mistakes!
JOHN: Sorry, mate! [Laughter].
I'm going to have to read bits of this out so we can caption it. This is from a cartoon in The Guardian by Stephen Collins.
[Laughter].
Why are you laughing?
[Laughter].
Why are you laughing? This is a serious question.
[Slightly confused silence?]
This, in moral philosophy, is known as a "cold joke". We laugh at it because it is uncomfortable. We know there's truth in here, and it's uncomfortable for us to deal with.
I put it to you, by the very fact that you're laughing, that - for fuck's sake - we are amoral, and we are unethical, when we are doing technology development. I'm not talking about you personally right now - I hope you won't start killing me - but I think you know what I mean?
I will be very specific about these terms. Amoral, because we absent our own values from conversations about technology and when we are delivering things. Unethical, because we are absenting ourselves from a larger conversation with society about whether we should be doing these things and how we should do them.
How have I come to this very extreme view? I've been spending time doing sessions on morals and ethics, talking to people in pubs, bars - I do go to other places! - but what I get and hear from developers all the time is "oh, well, that's not my decision", or "that's a business decision", or "well, I just do what I'm told". These are all very morally neutral arguments.
Morally neutral arguments are very, very dangerous. Morally neutral arguments are Nuremberg "I'm just following orders" defences.
I'm being serious, it is that level of problem we have, I think, and that's why it bothers me, and it worries me, because technology is extremely powerful.
Right now, what is tech causing? What is happening?
Communications between us are creating personal, amplified echo chambers which are dividing us; algorithms are being introduced which are segregating us, separating us, setting us up by saying this person is this, and this one is not. Did you see the recent story on a council using algorithms to predict which kids are in gangs, or likely to be in gangs - what factors and biases are in there? Tech's also producing social and economic pressure. The systems we are building are introducing a larger and larger economic disparity - we're enabling the power curve. We are creating all the conditions that break societies through what we do. That bothers me. I care about that.
I also care about it because the potential for technology to do the opposite is so large, yes? We can look at existential crises like climate change, energy management - we could be dealing with those things. What are we doing? Trying to create a better way of sharing 4K cat videos for someone who can afford $1,000 for a phone.
I don't know about you, but this fucks me off. This is why it's Chatham House Rule: that Apple presentation the other day sickened me. Five trillion operations a second on the Bionic chip: what are we using it for? Improving your selfies. What the fuck!?
And it's not just my experiences that have twisted my views….
Stack Overflow did a survey of 30,000 developers - okay, it is Stack Overflow, so there's a bias in this, let's be fair! [CORRECTION: it was 100,000+ developers, and ~70,000 answered the ethics questions]
They asked the question: "who is responsible for a piece of software that gets used unethically?" 58 percent of developers said it was management, the business. You know, somebody else. This is what is known as "distancing". What you're doing is pushing it away, not taking responsibility. I'm pushing it away; it is somebody else. And it's called distancing. That goes hand in hand with cold humour, by the way. Those two things are the markers of potential moral turpitude, so if you go to the States and they ask, "Have you committed an act of moral turpitude?", as a developer the answer is probably "yes".
The other question was really, really concerning, which is: if you're asked to do something in technology which is clearly unethical - and it was that simple - would you do it? 36 percent of developers said "it depends". It depends? It fucking depends? It's clearly unethical. Oh, it depends. NO, IT FUCKING DOESN'T! [Laughter].
The thing about this is, when you look at it, no wonder we should not be allowed to make moral or ethical decisions, if that is our fucking attitude... if we are distancing ourselves... if we are pushing it away and making cold jokes about it. We don't speak up. Why don't we speak up? I don't know. Have there been times I haven't spoken up? Yes, but not recently, I have to admit! Maybe that's because I've got older, and the older you get, the more unreasonable you become, and this kind of stuff starts making you care?
So...
We have to examine the way we operate. We operate in secret and in private: by yourself, in a small group, in a private company, talking amongst yourselves, with whatever your group of biases are. And how do we then engage with the public? We release a piece of software. We release one of the most dangerous pieces of technology man has ever invented. It's got two problems. One, just the mechanics of it: we're messing with economics, with ways of communicating, with how people see themselves and how people see other people. And two, we do this without engaging in a conversation in the public domain about what we are intending to do, or what we are doing, or what the consequences of that stuff might be. We don't engage: we release it.
We have drunk the neoliberal capitalist Kool-Aid and said the only thing that is important is to put it out on the market, and the market will say whether it's good or bad. That doesn't float with me. I would be really interested in having that discussion, because it has become a central belief of technology.
Now, the other thing is we're children of the Enlightenment, and the thing about being children of the Enlightenment is evidence-based thinking, challenge everything - disrupt. That was the Enlightenment: disruption is good. No, it fucking isn't. The ethical philosophy of the Enlightenment, which we stick to as we adopt everything wholesale - utilitarian arguments (doing the greatest good), or the universality of Kant (if it works for you, it works for other people) - was a failure.
You understand that Enlightenment philosophy failed? As a project, morality failed. Moral philosophy in those days was the cool kids' thing, and they were out there, the hip, happening guys. This is when pamphlets and news (a bit like the internet, on paper, for the younger among you) popped up and distributed the ideas being shared. They were the cool kids. What the morality project, let's call it, was trying to do was find a rational basis for morality. It failed. It failed individually for each philosopher, and every one of them wrote of this failure - even Kant: you need irrationality as well. They all made this get-out clause, and it failed, and failed, and failed. This is one of the reasons why philosophy turned into an academic subject: it attacked the scholasticism of the church but could provide no successful alternative, so it failed to serve people's needs. And that led to the Age of Revolutions, where people felt frustrated but emancipated, felt they could do something. If you want to read a book which is really depressing, read Humanity by Jonathan Glover, which explains many of the cruelties of the 20th century as a consequence of a lot of this Enlightenment thinking.
What am I suggesting? I'm going to make some provocative suggestions - and judging by Andy waving his little things at me, I'm about on time.
What can we do about this?
So I think we have to reject Enlightenment thinking: reject looking for a rational, rules-based approach. Controversially - because I know one person here will disagree with me - we have to be very, very careful with moral psychology. It is just another set of rules that we tell ourselves is helping us understand stuff. It tells us how we make decisions, but it does not help us understand what decisions we should be making.
I think there is something about becoming virtuous. We have to move back to an idea of virtue ethics - if you don't know about this, it's Aristotle's idea of asking: how do I live a good life? We can talk about that.
I think we also have to come together - we have to form a union of sorts, to work together, to support each other, and to support others. Not to protect our rights - fuck rights - and I've got a whole rant on that! [Laughter]. It's us being supportive of each other and having solidarity with those our systems affect. You can still have identity, you know? I'm not looking at you, D-, but a Pythonista of the worst kind can absolutely be in solidarity with a C++ bigot, because we have the same principles at heart. We can have that solidarity together.
We need public oversight of what we do, because - this is controversial - when we are working in companies and organisations, we are being private and secret about things that affect society. I think we should expose it and talk about it. Fuck NDAs, fuck the secrecy.
I think we have to refuse to be trivial. Fuck cat videos!
All right! He's waving “time” at me. I've got a whole lot more. The screen has gone off! Thank you. [Applause].
FLOOR: I'm K-, and I'm so grateful for this conference. This was the one I was looking forward to. I will do the tl;dr version, otherwise I will be here all day. I'm surprised the subject of this is what we are writing, rather than the fact that we are writing software at all.
JOHN: I agree that’s often the question.
FLOOR>> Three main things - oil companies, the aviation industry (based on the oil industry), and us (so, everyone in this room) - are the biggest contributors to the carbon footprint that is totally going to fuck the planet. We are all responsible. I recognise someone else in here who is also doing a talk about it. I'm a Java dev, so I use a garbage collector because I'm too lazy to do the garbage collection myself, and I have a massive CPU overhead, which means I waste a lot of energy doing this bullshit. Why are we doing this in the first place?
JOHN: I absolutely agree. That's why I challenge the trivial. Take Facebook, for example - I fucking hate Facebook. One of the reasons is that "let's do this" attitude: let's get this messaging out, because it's obviously a good thing, you know? There was no question of "should we do this?". If you had asked the question - we're going to have so much content that we won't be able to moderate it, and people are likely to abuse it - there was an answer, which is: don't fucking do it.
I have had this energy conversation very, very recently, about why we are running so many servers. "It doesn't cost a lot." In dollars and cents, maybe, but what about all of the energy? Reframed, the answer is often "don't do it". I think it's us who have to say that, not the business. I think WE have to say it, because we've got the power in our hands - if only when typing! This is where that stuff comes up, and it's up to us to speak up and ask. Let's not push it off.
FLOOR>> I'm J-. So, I'm going to agree and disagree.
JOHN: Good. [Laughter].
FLOOR>> I'm going to embarrass A- by saying she ran a thing called Coed Ethics which talks a lot about this, and it was terrifying, and we should unionise, and lots of the other things that you were talking about there. But I'm also going to disagree, in that I worked at GDS, and having good user-centred design can change policy and delivery and make people's lives better as a result.
JOHN: I don't disagree. I think we need to do more of that, and potentially we need to do it before we are writing anything and calling people "users". We have to have that public engagement through tech. I think GDS had some of that as well, purely because it is an obvious policy concern in public systems?
FLOOR>> Hi, I'm J-. I agree - yes, the answer is "don't fucking do it" a lot of the time. I was particularly struck by that comic of the self-driving car, because this is a classic example of a trite technologist's analysis of an ethical problem: do you kill the old lady, or do you kill the woman with the pram with the puppies, or whatever, or the occupant, da-da - but the answer is don't drive so fucking fast -
JOHN: Ethics is not a feature.
FLOOR>> Ethics is not a feature. I have strong opinions on self-driving cars - bollocks! I think this is a wider problem, which is: whenever we analyse a problem as technologists, we tend to apply our technology, our understanding of technology problems, to it. There's a person waving at me from the moderators? No.
JOHN: No, she's just being friendly!
FLOOR>> We tend to do this thing of applying our own frame of reference when we do our own kind of analysis, and that really hampers us.
JOHN: This is part of how we remove our own agency from it as an individual, and how we remove our agency as a member of society as well. And just that thing: this is about being children of the Enlightenment. We're not thinking bigger.
FLOOR>> There is a quote by the writer of Pinboard saying that approaching the world as a software problem is a category error which has led us into some terrible habits of mind - which is true. What evidence do you have that we, as technology developers, are more amoral or unethical than the rest of the population?
JOHN: I don't care! [Applause]. The scary thing is maybe we're more so. That's a worrying idea. Maybe we're in a position where we should see this. We're generally a bright bunch of people.
FLOOR>> No, my experience of the world is that there are a bunch of people in the world who are amoral and unethical, and we bump into them all the time, so to somehow characterise this as if we are uniquely terrible in being amoral and unethical is kind of a category error. It is a general problem of life rather than a problem of technology?
JOHN: I don't disagree in terms of the larger context, but what we are talking about here is what we in technology can do something about. We're not the root of all evil, but we're a nodule on the root of all evil! Because there is the fact that we are empowering bad things to happen. By removing ourselves and distancing ourselves from our responsibility for the very technologies we create, we are failing. It is the Spiderman quote here - well, it was the French National Convention in 1793, but whatever - with great power comes great responsibility. And we have power.
That's why we have to have solidarity with each other. If I say no, and you're all going to say no, maybe we can move something. But that's it: I think it's about us. Solidarity is about finding people you can have influence with. And maybe we're not the moral midgets we appear to be - maybe we're the moral giants of the world, which is even scarier.
FLOOR>> There are moral and ethical people in software like everywhere else. There's a flip side to it.
JOHN: Maybe, but my argument is that if you look at the consequences of what we do and the way we are seen, we are not those people. This is the other thing: what is the role of the technologist? In virtue ethics, you have a role, and you have a responsibility to play that role in society. The role of the technologist is seen as geek, loner, sits alone, somebody who has a bright idea and becomes a billionaire.
FLOOR>> It's a human tool to do whatever somebody else wants.
JOHN: Which is a Kantian sin: treating us as a means to an end. And we treat users as a means to an end - we make the same error in both directions. I'm not saying we are amoral or unethical people, but the consequence of us as an industry appears to be that we collectively are. I refuse to accept that is our role in society - we can be better.
FLOOR>> I'm K-. So, this has been sort of touched on, but at the moment I work for one of these trendy companies quite close to here, where there are Ping-Pong tables and everyone is very individual - I say "individual" with carbon-copy MacBook stickers - and one of the issues is, if you do have an issue and say "I'm not going to do this", 20 people will stand up and say "I will do it anyway", and I will get left out. Before, in my previous company, we would go to the union. That has been touched on. Whenever unions are brought up with this sort of thing, people go, "I don't need a union because I can negotiate my own wages", and that sort of thing.
JOHN: It's not about our rights of pay and coverage; it's different. It is a union in a sense - that's why I try to avoid the word "union" - and would you say we've drunk the neocapitalist Kool-Aid of "a union is bad"? It's interesting that you were part of a union -
FLOOR>> I was thinking of it the whole way through, and you got to it at the end.
JOHN: But how do we do it? Morals is about me; ethics is about everyone. If it is you saying, "I can't do this," that's fine, if you stand up and walk out. An interesting case was the Google one - not the most recent one, but the Project Maven one - where developers said, "We're not going to use our technology for this." I thought there was a lot of dodginess in doing that. I know that's a controversial position to take, but: you live in a democratic society where you cede the use of violence and defence to the army and the nation state. Is it your right as an individual to refuse your technology in that context? There is a school of thought that says no. There's a moral issue which says "I won't do it", but ethically it is very dodgy to say, "This company's not going to contribute." That made the room quiet.
>> [Inaudible].
JOHN: Who is next? A-? Uh-oh!
FLOOR>> It's amazing how we've been in resounding agreement with everything you said. It's really striking how much we want to do something about this, and your point is very good. We're not less ethical in the tech industry than anybody else - we're probably more so - but we are more powerful, as we said. We have an enormous amount of power, education, wealth and safety to do something about it, and I would really like to agree with the first person, the first ranter this evening, who I couldn't see because he was on the other side of the pillar, talking about Java. Energy use in our sector is one of the most interesting examples of us being amoral and unethical through inadvertence. We don't think about it: we use as much energy as the aviation industry, and we just turn on another server and throw more dollars at it. Amazon are already thinking about this stuff: they have very sensibly started to be secretive about where their data centres are.
There was an interesting report in Bloomberg a couple of weeks ago about two new data centres in US states that had significantly raised the cost of energy for everybody in those states. There is only a certain amount of electricity to go round, and if we're taking it out and not putting it in, then that is almost the very definition of that kind of amoral, inadvertent lack of ethics.
JOHN: Absolutely, well said! One of the things that annoyed me about the recent Apple event was that they were saying "we are 100% sustainable energy", and yet they have 2 billion products out there, all being charged, plus the network usage and the servers... is all that sustainable? They also threw in device recycling - "we're going to recover the materials" - and that's because we've screwed so many mid-African countries where the resources are, countries that are war zones now due to unethical practices and poor economics. There is that extended ethical sense, again, that we seem to ignore in technology until it's too late.
FLOOR>> A group of us, including John, are putting together a petition to actually start to say what we think should happen in the tech industry. This relates to Project Maven, which you might have heard about: Google cancelled a major contract because their developers said no, they didn't want to do it. The impression was that 12 people left, and that it was those 12, making a giant move, who changed Google. It wasn't. I've got this from the inside: what changed the decision was the 10,000 people who signed a petition saying they didn't want to do it. 12 people can do whatever they like, but 10,000 people who each do one tiny little thing are more effective. So, petitions actually could be a really useful way for us all to express our solidarity.
JOHN: I think the mechanisms of solidarity are actually the ones that will work, and we need to leverage them - and it's a dirty word now, and we have to - God, I was about to say "reclaim it". For fuck's sake!
I think that's probably time, isn't it? Okay.
FLOOR>> [Inaudible].
[One more point…]
FLOOR>> Can everybody hear me? My name is M-. I see a lot of this, and there are two things I think can help fix it. First, special attention to how we recruit people: over the last five or six years - unfortunately, and I apologise in advance if there are any grads here - I'm seeing a lot of narcissistic behaviour coming into the industry. Secondly, there's a reason why there's so much stress on diversity. If you come from, or were brought up in, a Third World country and have seen the damage of a carbon footprint, you will pay attention to what you're doing and the effects it has on the environment.
JOHN: Thank you. I agree. One of the outcomes - I was at Coed Ethics - was that I realised that in 20 or 30 years of interviewing people, I had never asked an ethical question in an interview. Not one. It's just like: you idiot! We have to start thinking sideways. That's why the inclusion of voices from these different places is so right. We have to change the mix. Is that enough? All right! [Applause].