Part 2: Public-purpose tech accountability is up to us
Part 2 of an interview with Dr. Stephanie Hare, author of Technology Is Not Neutral
Dr. Stephanie Hare is a leading researcher, speaker, and broadcaster in technology ethics. In this exclusive two-part interview, Stephanie discusses key themes of her book Technology Is Not Neutral, and their relationship to public-purpose technology, with StateUp’s Rachel Osnos.
RO: We spoke about the importance of taking the time to review lessons learned in building a technology. Expanding on this theme, what other tech ethics considerations have you seen come up for recent emerging technologies?
SH: If you're a technologist, understand this: if you're building something that could end up being used by the police, the security services, or the government, then you have to build in privacy protections and other civil liberties protections from the start - or it will be too late.
In the UK, I think we have a real complacency because we live in a liberal democracy. Before Russia invaded Ukraine, Russia was using facial recognition technology to identify protesters and later locate and arrest them. Now ClearView AI has volunteered its facial recognition technology products to Ukraine to assist in identifying dead Russian soldiers and contacting their families. How would we feel if Russia were using ClearView AI to identify Ukrainian soldiers and contact their families? Just because many people feel solidarity with Ukraine doesn't mean we absolve ClearView AI of accountability. A key part of technology ethics is this nimbleness: thinking through the hypotheticals in which the situation is flipped.
Not only do we need to hold technology companies accountable; we also need to hold ourselves accountable.
Oftentimes people will say things like "it's a war" or "it's a pandemic", or relax their scrutiny out of feelings of loyalty - and use these as excuses to suspend their critical faculties. I believe that's actually when you have to double down and examine things more deeply, because the costs are higher and the risks are higher.
RO: Considering your background - what do perspectives from history or risk analysis offer in helping us to create and use tools and technologies to maximise benefits and serve major public needs?
SH: My background of training as a historian is at the core of everything I do, probably because I believe that there's nothing new under the sun. Someone has probably already tried something somewhere, and if you just do enough research, you'll find it or you'll find a parallel. Historians are really good philosophers, since they have developed the nimbleness for thought experiments and counterfactual arguments. Philosophers are obsessed with the question of, “what is reality?”, but so are historians - and historians are obsessed with epistemology, or sources of knowledge.
My own personal research questions have largely centred around ethics and political philosophy. So even though I was an historian, I was focusing on those questions. When you're trained to think a certain way, it changes your mindset. Just like if you've been a ballerina for most of your childhood, you're going to have the physique of a dancer - that intellectual training stays with you, no matter where you go. If you pair it in your professional life, as I did, with a career in technology, then you get a very interesting cross section.
As a technology ethicist, I’ve seen a lot of scholarship that reflects a real humility and, to be honest, fear, because no one person can know everything. In the book I say: everybody has opinions, but not everyone has expertise. Even people with expertise do not have expertise in all areas. We must beware of epistemic trespassing, because in this world of hot takes it's very tempting to offer an opinion under no pressure to back it up. I think it’s really okay to say: I don't have a fully formed view on this yet. I haven't done my reading. I haven't talked to people who've spent their whole lives working in this area. Technologists who take the "move fast and break things" approach and just iterate could use a little more caution.
RO: That's really helpful: understanding that you have to consider things from a few different perspectives, and possibly incorporate a little bit more caution, in tech.
SH: Peer review is key. I can't publish anything in my fields without getting absolutely destroyed by any number of colleagues. I kind of welcome it, but it teaches you a certain - again, I use the word - humility. I don't mean "#humble", "#blessed"; I mean genuinely. I think my acknowledgments section is about four pages long - I had people reading this book repeatedly, and it would have been a lot worse if they hadn't. It was essential that these reviewers came from really different backgrounds. Along these lines, technologists could benefit from opening up to a much wider range of perspectives. Many people working in areas that don't seem like traditional "technology subjects" actually have a great deal to contribute to the peer review of tools and tech. Given that we're talking about public-purpose tech, it is for the public. Everybody has a role to play in this, because we're building our world, we're designing our world. We need to hear from everybody, not just technologists.
RO: Where do you see the greatest opportunity for technology and/or new policies and business models in public-purpose tech to help address a major problem or public need?
SH: I think the greatest opportunity is in training our civil servants who come from non-technical backgrounds and supporting them by pairing them with data scientists, software developers, and critics from STS (science and technology studies) and the digital humanities. We have a planet that the IPCC says will become largely uninhabitable, so the way I view it is "all hands on deck". We have been given the warning; we have the information and the analysis.
Now we have to build a totally different way of living that is sustainable, equitable, transparent and accountable, and that works for everyone. This is the ultimate engineering problem.
How are we going to build, frankly, a much better way of living than we've been doing for a long time? We need more technologists to work on the stuff that actually really matters, not the stuff that will only make money.
We need to equip the people who are already trying to build a better world for all of us, through partnering or training, so that they can use all of the knowledge and tools that are at their disposal. I think we've got a big job on our hands with that. But that, to me, is a very solvable problem. We could do that within a matter of years. To me, that's exciting.
Missed part one of the interview with Dr. Stephanie Hare, author of Technology Is Not Neutral? Read it here.
Feedback:
Select feedback from the last post is available at the bottom of the page here. Tell us what you think about part 2 of our interview with Dr. Stephanie Hare here.