Podcast guests: Tom Sulston & Kathryn Gledhill-Tucker
December 9, 2022 | 29 min 17 sec
Brief overview
The concept of digital rights is not new, yet we are just beginning to scratch the surface of how they manifest in today’s rapidly-evolving digital world. But when we shift our thinking to see digital rights as human rights in a digital context, what is perhaps a nebulous concept starts to crystallize.
In this episode, Tom Sulston & Kathryn Gledhill-Tucker talk with host Eduardo Meneses about what we mean by digital rights and how technology’s borderless influence often conflicts with regional legislation. They also share resources to learn about digital rights and discuss how technologists can use their influence responsibly to advocate for a fairer internet and digital society.
Transcript
Eduardo Meneses: Welcome. Welcome, everyone, to this new episode of our podcast, Tech and Social Change: Discussion from a Common Good Perspective of Technology. Today, we are making a special episode. We have two guests joining us from Australia: Tom Sulston and Kathryn Gledhill-Tucker. I am Eduardo Meneses, the global head of social change at Thoughtworks, and I am going to be your host today on this journey, this discussion about a very specific topic that we are really passionate about, which is digital rights. Before going into the detail of the concept and the challenges we have as a society concerning these rights, I wanted to ask you a little bit, Tom and Kat, about your backgrounds: where you are coming from and how you landed on this topic. So please, Tom, could you tell us a little bit about yourself?
Tom Sulston: Sure. Thanks, Edu. Yes, as you say, my name is Tom. I'm coming to you from Australia, part of the world where sovereignty has never been ceded. I very much got into digital rights in 2013, which was the year of the Edward Snowden revelations. We learnt that lots of companies were spying on us on our governments' behalf. And that was the turning point that radicalized me from being a very gentle system administrator, DevOps type who was doing a lot of consulting work at Thoughtworks. That was what kicked me off on the digital rights journey. And now I spend quite a bit of my time working with Digital Rights Watch, where I'm the deputy chair. It's a charity that looks after Australians' human rights online.
Eduardo Meneses: Thanks, Tom. Kat, from your side, where do you come from? What's your background?
Kathryn Gledhill-Tucker: Great. Thanks, Edu. I'm Kat. I am the lead of the First Nations delivery center here at Thoughtworks Australia, but I grew up on the South Coast of WA, so I'm also a visitor on these lands today. I came to digital rights via marketing analytics, essentially. I worked as a data analyst for a few years after I finished my master's degree.
And I think it was being embedded inside that machine and seeing the enormous imbalance of power between users and technologies and organizations that really radicalized me. I was able to see firsthand just how much information can be measured and gathered and mined from people without explicit consent. So I very quickly turned from an analyst into an activist. And today, I serve as the vice chair of Electronic Frontiers Australia, a not-for-profit organization championing privacy and digital rights here in Australia.
Eduardo Meneses: Great. Thanks, Kat. So as I was saying at the beginning, this episode, which is part of our first series of episodes introducing the intersection between technology and social change, is going to be centered around this idea of digital rights. But before going into the details, I wanted to ask you, Tom and Kat, for maybe a quick definition. What do we understand by digital rights? Is this something different from traditional rights? Where did this concept come from? Could you tell us a little bit about that?
Kathryn Gledhill-Tucker: I can jump in first if you like. So we typically consider digital rights to be human rights in a digital context, not necessarily a different or discrete group of rights. But when we are talking about digital rights, certain themes and discussions do come up. So closely coupled with digital rights are things like our right to privacy, freedom of expression and assembly, our right to access, create and publish content on the internet, and equal access to hardware, software, and networks across the world. So really harkening back to that Berners-Lee vision of the world wide web, of equal access and net neutrality. So certain conversations do come up around digital rights, but we effectively do consider digital rights to be human rights.
Eduardo Meneses: Is there a history about this concept? Is this a new concept? Have we been speaking for years, for decades about that? Where did that come from?
Tom Sulston: So, digital rights is a term that's been thrown around really only in the last few years as a practice. But for discussion around human rights, you can go back to the Magna Carta as a very foundational document defining the rights of a person. Really, the document that we work to is the 1948 UN Universal Declaration of Human Rights, which was the big international piece of work that came out of the ashes of the Second World War, with an aim to not have that happen again. And it's a spectacularly good document: it's very pithy and descriptive of, as it says, fundamental human rights that are inalienable.
And so a lot of the work that we do is looking at rights that are granted under the UDHR and thinking about, as Kat said, how those human rights manifest in the digital space. Some of them are obvious candidates that pop up a lot, like the right to privacy, the right to free speech, those sorts of things. Some of them are a little less obvious but they still apply, like the right to association. So that's a kind of base document of where we start, and where a lot of the modern thinking around human rights originates.
Eduardo Meneses: And then, for example, this last one. The first ones you mentioned are kind of obvious, mainly for people who are in the tech industry. But when you speak about, for example, this one you were mentioning, the right to association, how does that translate into the digital perspective?
Tom Sulston: So one of the things that we see on the internet, in the digital sphere, is that it's actually a very privatized world. A lot of the services and platforms that exist in the digital world are run by companies. They're not run by civil institutions. And so those companies have the ability to decide who their users are, who their clients are, and can choose to kick people off their platform for whatever reason they happen to choose. We see it quite commonly applied with sex workers who are no longer allowed to use certain platforms because the platforms do not want to be associated with adult content, even if those sex workers are not using the platform in a professional sense. They're using it in a personal capacity. They can still be removed from social media sites, and they lose the right to associate with their family through that.
Eduardo Meneses: You were mentioning something there, Tom, that I think could be really interesting to go a little bit deeper on, which is this idea of the privatization of the digital space, the fact that a lot of these digital territories are privatized, owned by private companies. And I was wondering, when both of you were speaking about digital rights, there's often a debate around human rights in general about what they are about: do we explicitly make a link to economic injustices, for example, or do we keep rights in an abstract form? So this idea you were mentioning about the privatization of this territory is very interesting.
And that leads me to my next question: where are we on these rights? These territories are very often private, not controlled by direct democratic tools; very often there needs to be the intermediation of the state or some other kind of collective control of these spaces. If we needed to characterize the moment we are in concerning digital rights, where do you see our societies right now in terms of recognition of these rights? Where are we? What are the challenges?
Kathryn Gledhill-Tucker: That's a really good question. I think, in a very brief sentence, I'd say this state of technology, or of digital rights, is highly captured and highly privatized, as we've kind of touched on a little bit. And a lot of digital rights issues aren't necessarily new or unique to the digital world, although the advancement of technology means that power can be wielded in a way that exacerbates existing inequalities. So in terms of recognizing digital rights, and how we as a society interpret them, there's a question of literacy around existing digital rights. I think it's much easier to communicate with people at that kind of base level of human rights rather than expecting a higher level of literacy around digital rights. But I think it's also interesting that you mention privatization and freedom of assembly.
I think another way that we see that manifest, to give a very specific example, is people who use social media platforms or digital platforms as places to assemble, and for activists to assemble. I think it's a real, real challenge and a real point of education for a lot of people. A lot of these platforms were not built to support or facilitate democracy, and certainly not built to facilitate activists safely and freely assembling. They're built to mine and surveil and make profit. So I think that's definitely one of the biggest issues that comes up when we are discussing digital rights or human rights with people.
Tom Sulston: Yeah, big agreement with that. And I think the state of digital rights is exacerbated by the borderless nature of the internet, where platforms and providers can effectively jurisdiction-shop and base themselves in physical locations where there is not great enforcement of human rights. And that may or may not line up with an individual country's view on human rights. So take the United States: it's got a bill of rights.
There are all sorts of rights that you have in America. They're written down. It's very plain to see. And there are many legal cases that go through their system that allow citizens of the United States to express those rights and to have them respected. But in terms of digital rights and respecting privacy, it's a lot more Wild West-ish in its approach. And a lot of organizations are getting away with some pretty terrible uses of their users' data, especially those users from outside the borders of whichever region they happen to be operating in.
Eduardo Meneses: This question you were mentioning, Tom, makes me think about a point that is often complicated for people to really understand, which is that we very often speak about digital rights and digital territories as something almost abstract or ethereal. But in fact, there's a real materiality to them. As you were mentioning, for example, where servers are located can define a lot of what we can do in terms of rights. And that makes me think about the challenges of how we legislate around that. You were mentioning that, for example, someone could find countries where traditional rights are not what some of the users would expect them to be.
So in this sense, how could we expect legislation to apply? Is there something we can think of in terms of international regulation that could prevent this kind of shopping for places where rights are less recognized? Are there reflections around the territory where legislation should apply, the international perspective, or how we should legislate and create norms around digital rights?
Tom Sulston: Yeah, international legislation is really hard. That's the short version. Although there are many, many countries that signed up to the Universal Declaration of Human Rights, similarly there are many, many countries who have signed up to the various Geneva Conventions over the years.
Those international agreements then need to be turned into local, national laws within the countries that have signed up to them. And so the realization of that is very dependent on the country itself and how it operates. As a direct example: Australia is a signatory to the Universal Declaration of Human Rights, but it does not have primary legislation respecting Australian civil rights. We do not have a bill of rights like the Americans do.
And while there are little bits of legislation around the place-- there are some not bad privacy principles that are enshrined in law in how our information is meant to be handled-- it means that we lack that backbone and that strong kind of legal framework that primary legislation gives you that lets you use the legal system to protect your rights. So when we have things go wrong in digital rights land in Australia, our legal avenues to seek redress are really limited. They're not zero. It's not a complete disaster. But it's very hard to do that without clear primary legislation that says, here are the rights that you have if you are a resident in Australia.
Eduardo Meneses: And in that sense, I was wondering, what are the debates right now? What are the challenges? You were mentioning some of your experience, both of you, in some of the organizations you have been involved in. What are the main topics being worked on right now? I know that you mentioned some things like privacy. I don't know if that is, for the moment, the main one, the central one: the mismanagement of personal data. Are there other main topics you think are at the center of this debate right now?
Kathryn Gledhill-Tucker: Yeah, if I can touch on something Tom just mentioned: Australia does not have a bill of rights, which means we don't have a federally enforceable right to privacy. I think with the kind of globalization of American culture, a lot of people tend to assume that we do have certain inalienable rights in Australia. And we don't. So something that we are broadly trying to advocate for is a principles-based, federally enforceable rights framework in Australia. And that's something that we've been fighting for and challenging for a very long time.
I think one of the other big debates in this country at the moment, in our space, is that our Privacy Act is currently under review. And it's under review for the first time in, I think, at least a decade. So there are lots of opportunities to consolidate various types of tech policy legislation in a way that is principles-based. And we also have a review of surveillance legislation at the same time. There are other nuances in Australian legislation that are quite unique to Australia, which we might not have a lot of time to get into. So those are, I think, two of the biggest things that are coming up at the moment and entering our political zeitgeist.
One thing that has come up is this idea that tech advancement outpaces legislation, which is necessarily true. We can build technology and we can create new technology a lot faster than we can write and pass legislation. And that's why we advocate for legislation that is principles-based rather than rules-based or very specifically technology-based. The other thing that I was just thinking about, while we were talking, is that while a lot of this legislation does need to be state-based, even when we anchor on international instruments like the UN Universal Declaration of Human Rights, a lot of these platforms and a lot of tech companies have global users and are globally operating.
So what we've noticed over time is that platforms will tend to adhere to the strictest legislation across the jurisdictions they operate in. So whichever state-based piece of legislation, whether it's the GDPR or the California Consumer Privacy Act, has the most difficult or the most punitive rules and regulations, it's easier for platforms to adhere to those rules regardless of the location of the user. So that tends to lead to a lot of over-moderation and overpolicing of content on platforms. Sorry, that was a lot, but those are the sorts of things that come up.
Eduardo Meneses: This is really interesting. There is something you were mentioning about this idea of principles-based regulation. Could you expand a little bit on what that means? Are there some examples? What is the difference with traditional regulation, let's say the rules-based regulation that you mentioned? What is the difference with this principles-based regulation you were mentioning?
Tom Sulston: I am not a lawyer, so my understanding doesn't come from a legal perspective. But the driver for me behind principles is to recognize that human rights are not always absolute. You do have a right to freedom of speech, but that doesn't mean you have the right to incite violence and use your speech to commit crimes and incite hate against other people. And I think that's where the principles-based approach has a strength, because you can say, well, this is the default standard.
And this is what you can expect most of the time. But we also know that there will be some edge cases where this doesn't work out. So, to bring it back to some international issues that have been going on, there's been a lot of fuss recently around Cloudflare providing denial-of-service protection to Kiwi Farms. Kiwi Farms is a message board that was used by a lot of far-right figures to incite hate and violence towards transgender people and gay people and all sorts of other groups of people that they didn't like. They do not have a right to free speech to do that.
And Cloudflare was under no obligation to provide them a platform under which to exercise that mythical right. And so they have finally removed Kiwi Farms from their platform. But the fact that it took so long, and that so many people were on the receiving end of so much abuse during that period of time, shows that there are shortcomings to having private companies act as effective utilities in the internet space. That was a very tangible example of a problematic area where we have entrusted the protection of the commons to a private entity whose understanding of what the right to freedom of speech means is not quite right.
Eduardo Meneses: I know that we are reaching the end of our episode today, but I wanted to end with some practical things that the people who are listening could connect to if they want to go deeper into this understanding. I wanted to ask both of you if you could maybe suggest some readings. Are there some books or documentaries that have been very important to this perspective on digital rights that you would recommend people watch or read? Are there some references that are very important for you on this topic?
Kathryn Gledhill-Tucker: Oh, well, I'd say both of our organizations are quite active in responding to the Australian tech landscape, in the media and on our own websites. So my first recommendation would be looking up Electronic Frontiers Australia and Digital Rights Watch to see what we're doing and what we're saying. In terms of books, my favorites are Silicon Values by Jillian C. York, and then the timeless Shoshana Zuboff book, The Age of Surveillance Capitalism, which is a bit of a tome but lays out the landscape for a lot of theory around our current age of surveillance capitalism.
Eduardo Meneses: And Tom, from your side, which are your favorites, the ones you would recommend for the people who are listening to this?
Tom Sulston: I give a big plus-one to both of those recommendations from Kat; both are excellent. I would add that Zeynep Tufekci writes an excellent Substack, as does Ed Snowden. You can find those on the internet and subscribe to them. And her book about networked protest is really, really insightful as well. So those are my go-to things to get activated and get angry about.
Eduardo Meneses: Thanks. Thanks, Tom. Thanks, Kat. Just a last word before we close. What would you recommend to the technologists who are listening to this? What are the things they can do tomorrow if they want to start acting on this? Do you feel that technologists have a role to play, something that they could do just after they stop listening to this podcast? What are the next steps?
Kathryn Gledhill-Tucker: Oh, I think technologists are in a really unique position, kind of on the ground floor and inside the machine of a lot of these technologies that we're talking about. So while I definitely recommend and advocate that people learn about the kind of legislation that's going to affect our day to day lives and our day to day jobs, I also recommend learning to identify systemic injustices and imbalances of power when we're building technologies.
And this might be a little bit of a Thoughtworks plug, but I'd point to things like the responsible technology playbook that we've developed, and learning to turn that lens onto the work that we do every day, while recognizing that we as individual technologists and individual developers might not be able to change the world or fix those systemic injustices on our own, but being able to recognize the power that we do have in the work that we do.
Eduardo Meneses: Thanks, Kat. Tom, I don't know if there's something that you wanted to mention in that sense too?
Tom Sulston: Yeah, I'm going to take a slightly contrary position to Kat, because I think that individual technologists absolutely can change the world, and a lot of individual technologists have done so. And what I would like to impress upon people who work in the tech industry is that you have these superpowers. You can drive the machine, you can build the internet, you can build software in a way that other people cannot. And so the amount of influence that technologists have within companies, within governments, within big organizations, and even as individuals out in society is really quite a lot.
And so I think it's about recognizing the power that each individual technologist has, and our ability to stand up and say: no, I am not going to make that system that hoovers up someone's private data; I am not going to write this feature that disadvantages people of color because of our clumsy artificial intelligence; I am just not going to do that, and I refuse to take part in this kind of shady surveillance capitalism or these kinds of rights-abusing things that are being built. That is something technologists can do, and do do, and have been doing successfully in some very large companies.
And we've seen a lot of technologists building unions and building other groups of power to exercise that kind of individual power and to amplify it. We have a lot of influence in the world, and as Kat says, we need to exercise it responsibly. So thinking about that is really important, but so is recognizing that the power exists and knowing when to apply it, and when to apply some pressure to push for more rights, for a better internet, and for a better digital society for everyone.
Eduardo Meneses: Thank you. Thank you very much, Tom. Thank you, Kat, for all these very interesting insights. For sure this is not the last time we're going to speak with both of you around these topics. I really enjoyed this discussion a lot, and I hope the people who are listening did too. So thank you very much, Kat. Thank you, Tom. And I will see you very soon for another episode with both of you. Thank you.
Tom Sulston: Thanks, Edu.
Kathryn Gledhill-Tucker: Thanks, Eduardo.