Podcast guest: Rebecca Parsons
October 28, 2022 | 27 min 0 sec
Brief overview
When is technology irresponsible and when is it simply unethical? How can we create a culture of responsible tech in our organizations, ensuring that our technology decisions don’t lead to unintended consequences?
In this episode, Thoughtworks CTO Rebecca Parsons talks about responsible and ethical technology, living in a hostile tech era and how technologists can make a positive difference in how they build solutions: for their organizations, for their clients and for society at large.
Transcript
Eduardo Meneses: Welcome, everyone. Welcome to a new episode of our podcast Talking Tech and Social Change. During these minutes of discussion, we'll be exploring how technology can be analyzed from what we call a common good perspective.
We'll be exploring, in each one of our episodes, this complexity of the intersection between technology and social change. We are really excited at Thoughtworks to host this initiative because we are going to be inviting and sharing these spaces with passionate technologists, activists and citizens who really care about this topic, who are thinking and acting right now at this intersection between technology and social change.
I will first introduce myself. I am Eduardo. I am the Global Head of Social Change at Thoughtworks. I will be your host in this journey, asking questions of our guests, trying to go deeper into our thoughts and trying to develop this discussion so we can build together a better understanding of how our world is changing and what challenges we're facing in the digital revolution we are living through right now.
Our guest today is Rebecca Parsons, the Chief Technology Officer at Thoughtworks, who spends a significant part of her time sharing her thoughts on this topic. She's always traveling around the world, giving talks at conferences. She has even been recognized as one of the most influential women in technology. So, first of all, welcome, Rebecca. We're really happy to have you on this episode.
Rebecca Parsons: Thanks, Eduardo. Happy to be here. Always enjoy talking about responsible tech.
Eduardo Meneses: Thanks for being here. So we always do this in our episodes before going directly to the topic: could you tell us a little bit about your history? What has your journey in technology been like? And how did you come to be so deeply involved in this discussion of responsible tech and the intersection between technology and society?
Rebecca Parsons: Well, I actually started programming when I was 13 years old and just fell in love with it. I proudly proclaim on stages across the world I'm a geek. I love technology. I can't imagine doing anything else but technology, which might be how I've survived as long as I have in the technology field, given the fact that I started when I was 13 years old.
I've worked in many, many different industries. Even before coming to a consulting company, I worked for a heavy manufacturer. I worked for a computer company. I worked for a semiconductor company.
I worked for the federal government here in the US in one of our research labs. I was a university professor. And then I came to Thoughtworks, and I kind of rounded out my experiences. So I did work for a startup very briefly.
And when I got to my sabbatical-- Thoughtworks gives sabbaticals at 10 years of service-- I went and worked as the chief architect, if you will, for the technology for development group in Kampala, Uganda. And that really brought home to me many of the issues that still arise when we talk about responsible technology.
And I was, in particular, struck by one story I heard from a member of one of the NGOs. They had gone into a village, and they thought they were doing the right thing. And they built a pipe from the well that was a mile out of town into the center of town so that the women wouldn't have to trudge one mile each way every single day to get water for their families.
And yet, strangely, the pipe kept getting sabotaged. And they couldn't figure out what was going on. And somebody finally figured out, why don't we talk to the women?
And because of the cultural norms in that village, when they went to the well, that was the only time the women got to interact with anybody outside their household. And so what these people had done by trying to be helpful was, in fact, completely disrupt the social life of the village and of the women that they were trying to help. And there were no bad intentions, but they just did not understand the psychology and the impact that this would have.
All they could see was the problem that they were solving. They couldn't see the context around it. And that was really my first experience with thinking, even if we intend to do good things, our technological solutions can bring about these unintended consequences. And I've been exploring that idea ever since. But that was really the genesis of it.
Eduardo Meneses: That's really interesting because that brings me to the first question I wanted to ask you, because I have seen some recent conferences and articles of yours. And you often speak about the fact that we are in an era of hostile tech. And, in fact, that goes beyond the first assumptions we make when we speak about hostile tech: bad actors and these kinds of things. It goes into what you're saying. Could you explain to us a little what you understand by this idea of hostile tech and the fact that we are in this era of hostile tech?
Rebecca Parsons: Well, first there is the obvious. We have the hackers, the ransomware, all of those kinds of things. And our awareness of the fact that there are malicious bad actors, not just individuals, but nation states, our awareness of that is increasing.
But what we often don't realize, again, is that tech that we might intend for a particular purpose doesn't always end up getting used that way. And some people may feel as if it is an attack. One example-- I had a conversation with a marketing person who said, no one minds being tracked online if the information that they get back is usable. And I said, I'm sorry, but that's false.
There are many people who work for Thoughtworks, as an example, who don't care how good a recommendation is. They don't want to be tracked. They are going to do everything in their power to turn off everything so that they're not tracked, because they are consciously making this choice, actively making the choice to say, no, I do not want that benefit, because I am not willing to pay that cost.
And for anybody who is trying to circumvent all of the blockers and such, that's a hostile act to those individuals because they do not want to be tracked. And so when we start to think about hostile tech, what we want to think about is not just the intent because the intent can be good. There's a great quote from Marc Andreessen that technologists are terrible at predicting the consequences of their software, but so is everybody else. And I think that really captures it.
We have to consciously think: how might somebody misuse what I'm writing? Or in the case of the pipe: what is the cultural context into which I am delivering my solution? And how might that change the way what I'm trying to do is perceived? Because as technologists, much like the people in that NGO, we have tunnel vision. We see a problem. We know how we're going to solve that problem, and that's what we're going to do.
And trying to imagine how somebody could take our creation and use it for malicious purposes, that just never occurs to us. We have to be very deliberate in thinking about how something that we build can be misused in a hostile way. And we're not used to thinking about that.
And part of what we're talking about here with hostile tech is that, as a group of technologists, we have to be much more aware of all of the consequences for all of the stakeholders: not just the people we have in our sights but, if we lift our heads and look around, whoever else might be out there being impacted in some way, positively or negatively.
Eduardo Meneses: And something that makes me think about what you were saying is that, very often-- I mean, at a very practical level of how people see these kinds of things-- we speak about individual responsibility, the fact that you can opt out if you want. You have this choice. But, in fact, it's much more complicated than that.
When we do that, we put the focus on the user: if users want, they can opt out or try not to get involved. But, in fact, what you're doing in this discussion and what I have been hearing is that we also need to put the focus on what technologists are doing and how they design the technology from the very beginning, right? And to that point, you have been speaking in some of these articles and conferences about the need for the tech industry to embrace this idea of an ethical technology and also to develop responsible tech.
And it is very interesting because, very often, people mix these things up. And, in fact, there are different approaches when you speak about ethical tech. It's not the same thing as responsible tech. And I wanted to ask you: for you, what are the differences between these approaches to these challenges?
Rebecca Parsons: Well, one of the things I like to do in teasing apart concepts like this is to turn them around. So unethical tech, what is that? That is things like ransomware. That is things like malware. That is deliberate invasions of privacy, those kinds of things. That is unethical tech.
And when you look at what I call unethical tech, there is a clear right and wrong. There is a societal expectation that something's going to happen or not happen. And unethical tech violates that. And if you're not unethical tech, then you're ethical tech.
Now, if we turn responsible into irresponsible, what kinds of things might be there? Well, if you don't consider the impact on groups that might be affected, there might be nothing ethically wrong with the software that you're writing or the technology that you're delivering, but it's still having an impact.
I would say that pipe into the center of the village, there was nothing unethical about that. They were trying to do the right thing, but it was irresponsible. They didn't take the time to understand the users and the context in which it was being delivered. And that is not acting responsibly as a technologist.
And so one aspect of it is, are there clear ethical guidelines? Are there clear societal expectations that you're violating? And if not, your tech is ethical.
Responsibility is have you, in fact, fulfilled all of the activities that should be expected of a professional? Is the software that you delivered of high quality? Do you know that it has the right kind of uptime and availability? Is it usable? Is it easy enough for a user to understand?
I actually had a discussion this morning with someone who said, sure, there may be configuration parameters that give you the kind of visibility over your data that you want, but that isn't sufficient if it's so hard to figure out how to find them and set them correctly. So you could say, OK, well, they were acting ethically because they did give you the choice. But were they acting responsibly by making it so hard to find that you give up in disgust?
But they're very closely related concepts. And I think part of why it's important to distinguish them, even if there's a lot of overlap between the two, is that when we talk about responsible tech, a lot of this has to do with what we as technologists should be doing differently. What is our responsibility as technologists in the way we approach this?
And part of that should be doing what we can to identify and mitigate unintended negative consequences, but also to identify and perhaps amplify positive consequences. I mean, it's possible that if you just looked up a little bit, you'd realize, oh, wait a minute. Not only can I solve this problem, but I can solve this problem over here. And that's going to give me more business and all of that kind of stuff.
And so ethical tech is really about right and wrong. And what is the software or what is the technology doing? And responsible tech is more about how we are approaching creating it. And, hopefully, with the tools of responsible tech, we're in a better position to create ethical tech.
But you could, in fact, do everything right as a technologist. But what you're doing is building exceptional malware. So you can responsibly build unethical tech, and you can have ethical tech that is built irresponsibly. And so the notions are different in my mind.
Eduardo Meneses: And that brings me to something else. We definitely have this responsibility when it comes to the design and development of software. But there may also be a risk of narrowing this responsibility only to the technical part when, in fact, I mean, we are seeing technology impacting millions of lives, changing a lot of social dynamics.
And we could argue that, in fact, part of this debate belongs to the whole of society, not just the tech industry. And so, coming back to what you were mentioning about thinking about technology in a responsible way, maybe there's a discussion about what the role of the tech industry should be in connecting to this wider social debate. How do you think the tech industry and technologists should connect to this idea, when we recognize that this is a discussion the whole of society should be having?
Rebecca Parsons: Well, first off, I think it is our responsibility as an industry to make sure that decision makers, policymakers, the broader society understands the consequences of our technology. I was mortified a few years ago when all of these people were saying, well, of course, Facebook is objective. It's an algorithm.
A person wrote that algorithm. That algorithm had a goal, and that goal is not necessarily what you think it is. I don't know what objective means in that context. But that misunderstanding wasn't those people's fault.
As technologists, I think it's our responsibility to help society understand these are the choices that you have. These are the consequences that you're signing up for. But I definitely agree. I don't want just technologists making some of these decisions.
Do you want a technologist working for some auto manufacturer somewhere to make the decision whether the automated driving software prioritizes the driver or the pedestrian? No. I think that's a discussion civil society needs to be having. And the answer is not actually very clear.
I read about a survey one of the auto manufacturers did. When asked what the auto manufacturers should build, the vast majority of people very, very nicely said, oh, well, of course, they should build something that prioritizes the life of the pedestrian. When asked what kind of car they would buy, they all said they would buy a car that prioritizes the life of the driver.
Now, that is not helpful. [LAUGHS] But we do have to have these kinds of debates. What should this technology try to accomplish? How do we make these decisions?
And the first step in that, much like the first step for technology within a business, is to help the business understand the role that technology can play in making their business more successful. We need to have the same kind of discussion with civil society. These are the considerations. These are the possibilities that we've come up with so far.
But you all need to decide. And we are here to inform about consequences and about possibilities, not to make the decision. But, unfortunately, for the most part, we're still in the driver's seat on how products are designed and what technology gets deployed. And that needs to change.
Unfortunately, at the moment, our avenues for societal participation are policy and legislation, and they both move so slowly. And we don't yet have a critical mass of people working within the government and the policymaking organizations who really understand the way technology is moving and what some of those consequences are. So I don't know how to make it work, but I really want to make it work.
Eduardo Meneses: Yeah, that's really interesting, because in another episode of this podcast we were discussing with some of our colleagues in Australia exactly this problem that you're mentioning: how fast technology moves, and how the social debate we need takes more time than that. And they were speaking about how this is also bringing innovation in legislation: not legislation that sets out exact, case-by-case recipes for how to act, but legislation based on principles, on what principles we want to maintain.
And that brings the focus, for me, to the other part: if we think a little bit about the tech industry, the main driver of the tech industry right now, we could say, is business. I mean, technology is developed in the business sector more than anywhere else right now.
And that leads to a very, very interesting debate, because a lot of people who don't know much about this discussion may think that responsible tech could be expensive, could be incompatible with business, could not be something that businesses could integrate into their perspective. And what is interesting for me is that I know that at Thoughtworks you have been discussing, and a lot of people in the industry have also been discussing, that there are connections, and that there are positive cycles we can create there.
And I wanted to ask a little bit about that: where do you think the business approach and the responsible tech perspective can find the common ground that connects them? And if not, what is the risk of not doing so? How do you see this?
Rebecca Parsons: Well, first, as I said earlier, some unintended consequences are positive. And so there's an immediate business benefit right there. If I make this such that it's easy to use for both men and women, I've just doubled my potential market right there.
But there are other aspects of this, too. Very often, people think I need to trade off quality for speed of development. In software, that's not true over the long term.
And I think the same thing is true. If we take into account these possible unintended consequences, we reduce the potential for reputational risk for the business. As I said before, we possibly broaden the potential market.
But another factor is how easy it is to hire people, because for more and more of the new generation entering the market, it matters to them whether or not their company has principles. And if they feel like you're developing in an irresponsible fashion, they're not going to want to work for you.
And one of the big determining factors right now in the success of organizations is their ability to hire and retain exceptional talent. And you're not going to be able to do that if you say, oh, well, we don't care about the environment, and we don't care about this or that. And so I do think there's definitely a strong case from the perspective of employer brand: how attractive am I in the market? But there's also the reputational risk.
I mean, if you have a risk from a data breach or if people suddenly figure out, oh, well, wait a minute. I'm using this technology, and it actually doesn't recognize Black skin. Well, that's not a good look for a company. And so taking some of these things into account helps manage that reputational risk and protects the brand. And brands are valuable. And so I do think we can make a business case for wanting to deal responsibly with technology.
Eduardo Meneses: And I really like that. I know that we have already run past the time we had for this podcast. This discussion, we could go on for hours, I feel. But you just mentioned something which is really important, which is how technologists are being more and more attracted by this perspective.
And maybe that's how I wanted to end this discussion today. I wanted to ask you, speaking to technologists: what are the concrete things they can do tomorrow? I mean, just after hearing this podcast and saying, OK, this connects with me, I want to do something on this. What would you recommend technologists do to start this journey? What are the directions they can go in to develop this perspective?
Rebecca Parsons: Well, one of the resources that we've put together at Thoughtworks and made available on our website is something called the Responsible Tech Playbook. And these are techniques-- and we haven't developed them all. We have developed some. But we've worked with other organizations that have built things like The Tarot Cards of Tech.
But they're different workshops, different techniques that technologists can use to lift their head up a little bit and try to view the world from a different perspective. We'll never be able to build diverse enough teams that handle all of the different intersectional possibilities within the world.
Obviously, a more diverse team will be better. But we need techniques that will help get us out of our own head and see the world from a different perspective. And so that's what I would recommend.
Go get the playbook. Play with some of those different techniques. There are various kinds of facilitated exercises that basically say, get out of your own head and see the world from somebody else's perspective.
Eduardo Meneses: Thank you very much, Rebecca. Thank you for this discussion. I know we may have a second episode with you because there are so many things that we still need to talk about. But thank you very much for your time.
I hope that everyone enjoyed this discussion with Rebecca Parsons. And we invite you to the next episode, where we will continue this discussion around the intersection between technology and social change. We have tons and tons of topics to cover. So thank you, everyone. Keep connecting. And, again, thank you very much, Rebecca, for your time.
Rebecca Parsons: Thank you, Eduardo.