Liberating our minds in a digital world: how do we do it?

The Rights Track

Release Date: 05/11/2022

In Episode 6 of Series 7 of The Rights Track, we're joined by Susie Alegre, an international human rights lawyer and associate at Doughty Street Chambers specialising in digital rights. Susie's work focuses in particular on the impact of technology and AI on the rights to freedom of thought and opinion. Her recently published book, Freedom to Think: The Long Struggle to Liberate Our Minds, explores how the powerful have always sought to influence how we think and what we buy. And today we are asking her: how do we liberate our minds in a modern digital world?


Transcript

Todd Landman  0:01 

Welcome to The Rights Track podcast, which gets the hard facts about the human rights challenges facing us today. In series seven, we're discussing human rights in a digital world. I'm Todd Landman, and in the sixth episode of the series, I'm delighted to be joined by Susie Alegre. Susie is an international human rights lawyer and associate at Doughty Street Chambers specialising in digital rights, in particular the impact of technology and artificial intelligence on the rights to freedom of thought and opinion. Her recently published book, Freedom to Think: The Long Struggle to Liberate Our Minds, explores how the powerful have always sought to influence how we think and what we buy. And today we're asking her: how do we liberate our minds in a modern digital world? So Susie, it's great to have you on this episode of The Rights Track. Welcome.

Susie Alegre  0:47 

Thank you so much for having me. I'm very excited to be here.

Todd Landman  0:49 

So I love the book, Freedom to Think. I've read it cover to cover. In fact, I read it probably in two days, because it's such a compelling read. And I guess my first question for you is: why is the freedom to think, broadly understood (belief, expression, speech, religion, thought), so critical to us as human beings?

Susie Alegre  1:10 

I think the way that I've looked at it in the book is really dividing those elements up a little bit. So what I focused on in the book is freedom of thought and opinion and what goes on inside our heads, as opposed to the more traditional discussions that we have around freedom of speech. And one of the reasons for that is that while freedom of speech has consequences and responsibilities, and freedom of speech can be limited, that freedom in our inner worlds to think whatever we like, to practice our thoughts and opinions and decide whether or not there's something we should share, is what allows us to really develop and be human. And the right to freedom of thought and opinion, along with belief and conscience, insofar as we practice that inside our heads, is something that's protected absolutely in international human rights law, which I think reflects its importance. And when you consider other absolute rights in human rights law, like the prohibition on torture or the prohibition on slavery, the right to freedom of thought inside your head, alongside those other rights, really gets to the heart of human dignity and what it means for us to be human.

Todd Landman  2:24 

Yes, and so in protecting those rights, we are giving people agency, because I was really captured by one thing you just said there about how we choose what we want to share. So a lot of us can have a million thoughts a second, but we don't share all of them. Although in the current era, it seems that people are sharing pretty much everything that they're thinking. But we'll get to that in a minute. I'm just curious about this idea of agency: that, you know, you choose what to share, you also choose what not to share. And that element of choice is fundamental to being human.

Susie Alegre  2:53 

Absolutely. And what the right to freedom of thought, well, certainly a key element of the right to freedom of thought and freedom of opinion, is what's called freedom in the forum internum, that's inside, you know, in our inner lives; it's not what we then choose to do or say in the outer world. And having that inner space is really important for us to be able to develop who we are. You know, I'm sure all of us have had thoughts that we wouldn't particularly like to be recorded. And I don't know if you've seen the recent drama Upload, which...

Todd Landman  3:28 

I have not.

Susie Alegre  3:29 

Well, it's worth a look, because I was watching one of the episodes where it was about people being unable, effectively, to shut off their thoughts, or their thoughts were being live streamed, if you like. And I mean, you can only imagine the horror of that, you know, and that was a comedy. A similar story played out in a short story by Philip K. Dick, The Hood Maker, which was a situation where you had people who were able to read other people's thoughts, and the only way that you could protect yourself from this mind reading was to wear a hood. And so protecting your thoughts from mind reading was really seen as an act of rebellion and effectively made unlawful, and that, I think, shows just how important this space is. It is, if you like, the absolute core of privacy. So privacy becomes like a gateway right to that central core of who we are and how we decide who we're going to be.

Todd Landman  4:27 

I like this idea of a gateway right, that's really cool. Now, in the book, the first part is quite a deep dive into history. I mean, you go right back to Socrates, you work your way through Galileo, you work your way through people who challenged the status quo through freedom of thought, whether it was scientific practice, or religious belief, or any kind of thought. But what are some of the high points of this history and, shall we say, the analogue attempts to control people's thoughts?

Susie Alegre  4:53 

Yeah, as you say, I looked right back, and Socrates is, if you like, a classic example of a martyr for freedom of thought. One of the interesting things as well about Socrates is that we don't have anything written down by Socrates, because Socrates was himself very suspicious of the written word and what that did for humans' ability to debate. But what he did do was absolutely question the status quo. And he delighted in creating arguments that would undermine Greek democracy at the time. But one of the reasons why we all know the name of Socrates and remember Socrates is because Socrates was effectively judged by his peers and forced to take his own life by hemlock, because of his scurrilous ideas, his attempts to twist the minds of young Athenians and to question the gods. So while Socrates might be sort of seen as an example of a champion of freedom of thought and freedom of speech, it was very clear that at that time in history you didn't really have freedom of speech, because it ultimately landed up with a death sentence. Some of the other areas I looked at were people like Galileo, questioning whether the sun and the universe travelled around the Earth or the other way around, and that really landed him in house arrest. So really, again, questioning the status quo of the church. And certainly religions through the centuries have been one of the prime movers in curtailing freedom of thought and freedom of religion, if you like.

Todd Landman  6:32 

Yeah, in my world, the Galileo story is a kind of clash between observational data and belief.

Susie Alegre  6:38 

Yeah, absolutely, absolutely. But again, it sounds like one of those arguments of, you know, well, you can have your own opinion and every opinion is sort of open to question. But that was another century, and in that century you'd end up under house arrest when you challenged the beliefs of the status quo and of the powers that be.

Todd Landman  6:56 

Yes, we see that being played out today in the scepticism around science, whether one takes an extreme view about, for example, being a flat earther, or whether there's doubt about scientific discovery, scientific development, the way in which countries responded to the COVID crisis, the hesitancy around vaccines and mask mandates. That kind of general scepticism around science is also one where, sure, there's freedom of thought, belief and opinion. But then there's also tested, peer-reviewed scientific evidence for the best thing we think we can possibly do in times of great uncertainty.

Susie Alegre  7:31 

Absolutely. And that area is a prime area where you see the difference between freedom of thought and opinion and freedom of speech and expression. So where you have sort of COVID conspiracy theories, if you like, spreading through social media, or the spreading of really proven false information that can harm people, you know, there is then a legitimate reason to restrict that expression and the spread of that expression, to protect public health. It doesn't mean that people can't still think those things. But there really have to be limitations on how those expressions are spread when they are absolutely damaging to public health or to other people's rights.

Todd Landman  8:18 

Yes, exactly. And I don't think you covered this in the book, but I just want to push you a little bit. You mentioned that Socrates' words weren't written down. But with the invention of the printing press, historically, how did that change freedom of expression, thought and belief? What's the role of that technological advance in your understanding of the history of this idea?

Susie Alegre  8:39 

Well, the printing press just really accelerated the way that information could be shared; it effectively accelerated the impact of expression, if you like. And interestingly, actually, I was asked recently to compare regulation of the printing press and of printing around that time, and how long it took to get serious regulation, as compared to trying to regulate the internet today. And I said, rather flippantly, well, people were arrested and books were burned. That was how regulation worked initially in response to the massive impact of the printing press. And while I was being flippant, when I thought about it afterwards, well, actually, that is how they tried to regulate the printing press. And one of the reasons I looked back at the past of freedom of thought, and the ways that we didn't really have freedom of thought historically, is that, to me, that was important because it showed what a sea change having human rights law has been for us as human beings. So you know, people may complain about cancel culture, but certainly in the UK cancel culture very rarely involves actually being put in prison. Certainly it doesn't involve being told to drink hemlock, or certainly not being obliged to drink hemlock. Human rights have really put the brakes on the ability of the powers that be to control us. But they've also put in place an obligation to protect us from each other.

Todd Landman  10:13 

And there's a certain duality then, because if I think about what you just said, the powers that be, let's translate that into the rise of the modern state, as it were. And you draw on, quite regularly through the book, Orwell's 1984. You draw on Arendt's Origins of Totalitarianism, you draw on Huxley's Brave New World. So why did you draw on those sources? It seems to me you're alluding to the power of the state, the power of control, all those sorts of aspects. And yet, in order for human rights to work, we still need the power of the state. So there's a two-sides-of-the-coin problem that we face in this quest for regulation.

Susie Alegre  10:52 

Absolutely. And drawing on those sources, in particular Orwell and Huxley. I mean, perhaps because I'm a bit of a masochist, I spent the start of lockdown reading 1984 and just marvelling at how prescient it was, and how accurately it portrayed the developments of technology in our lives. The speakwrite machine, the way that Winston Smith is employed to rewrite history, if you like, sort of creating disinformation in real time; what was a real surprise to me, having not read it since 1984, was just how accurately prescient it was. And similarly, reading Brave New World, and the consumerism and the use of distraction as a means of social control, rather than the oppressive jackboot that you see in 1984, and seeing the ways that potentially commercial enterprises and a light touch can be used to have an equally corrosive and problematic effect on our societies. The reflections of the images of Huxley and Orwell in particular were so stark that I felt I had to use them, because it seemed that rather than taking those as a warning from the 20th century, we've taken them as a template for the development of technology and consumerism in our lives.

Todd Landman  12:23 

So I suppose that really allows me now to segue nicely into your concerns over the digital world and how this digital world relates to human rights. And I guess my entry point is this famous line you have in the book where you say, you know, I told my daughter she can't have Alexa. And she asked me why. And I said, you can't have an Alexa because it steals your dreams and sells them to other people. Talk me through that. Talk me through your fears and worries around Alexa and what that means for the broader digital problem that we face.

Susie Alegre  12:52 

Yeah, Alexa is certainly a case in point. And as I'm sure anyone else with children has had the experience, your child comes home and their friends have got whatever technology it is, in this case Alexa, and I know several people, several families, where the kids do have Alexa in their bedroom. So you will always get these arguments of, well, so-and-so has it, so it must be great. For me, the idea of Alexa, the idea of actively choosing to bring a listening device into your home that is constantly listening to what is going on in your home and sharing that with you have no idea who, using that information in ways that you have no real idea where that's going to land up, is something so astonishing. You know, having spent years working on human rights and counterterrorism, and also most recently working in oversight of interception of communications, and seeing how sort of allergic people are, if you like, and quite rightly, to state intrusions, to the idea that the state might be bugging your home, to then actually pay money and let a private actor come in and listen to everything that's going on in your home for profit just seems to me really astonishing. And yet somehow it's become so normalised that, as I said, I know lots of people who do have Alexa and are delighted to have Alexa, plenty of people in the lockdowns suddenly sending around videos from their Ring cameras outside their doors. But this idea of constant control, constant monitoring of our lives for someone else's profit, to me seems like a really fundamental shift and something that we should all be really concerned about.

Todd Landman  14:51 

Now, in addition to the Alexa example, you're also very concerned about, shall we say, the unregulated unleashing of, and I'll use the generic term, algorithms in the digital world. So why are these algorithms problematic, from your perspective? What do they do? How do they affect people? Or is it the way that they're affecting people, and people don't even know? And is it that ignorance of the effect that concerns you, or is it just the development of algorithms in the first place that concerns you?

Susie Alegre  15:20 

No, I mean, algorithms are digital tools, if you like. So it's not the algorithm itself. There are two things really, well, there are many, but let's start with two. One is the ability to understand why an algorithm is operating in the way it's operating. So an algorithm is effectively told to take information and translate that information into a conclusion or into an action, but understanding exactly what information is taken, how that information is being weighted, and then how a decision, if you like, is being taken and what impact that decision will have, is often not very clear. And so where an algorithm based on huge amounts of data, for example, is being used to decide whether or not you might be fraudulently requesting benefits in the benefits system, that raises really serious concerns, because the outcome of not getting benefits, or the outcome of being flagged as a fraud risk, has a really, really seriously detrimental impact on an individual life.

Todd Landman  16:29 

Yes. And you also give examples of credit rating. So if, typically, somebody wants to get a mortgage in the UK, the mortgage company will say, well, we're going to run a credit check on you. And they might go to one of the big data providers that gives you a score. And that score is a function of how many credit cards you have, any loans you might have had, any late payments you might have had on a loan or a mortgage in the past. And in the absence of a particular number, the company may reserve the right to say you can't have a mortgage. And I think you give the personal example of your own struggles setting up a bank account after having lived abroad.

Susie Alegre  17:03 

Yeah.

Todd Landman  17:04 

Talk us through some of that.

Susie Alegre  17:05 

Yeah, absolutely. So as you say, I talk a bit in the book about returning from Uganda, where, ironically, I'd been working as a diplomat for the European Union on anti-corruption. And I came back to the UK to work as an ombudsman in the Financial Ombudsman Service. But when I applied for a bank account, I was suddenly told that I couldn't have the bank account, because the computer said no, effectively. The computer had clearly decided that because I was coming from Uganda, or whatever other information had been weighed up against me, I was too much of a risk to take. The fact that I had been fully vetted as an ombudsman, and that the money that would be going through that bank account was going to be salary from the Financial Ombudsman Service, was not enough to outweigh whatever it is the algorithm had decided against me. Eventually, I was able to open an account a few months later. But one of the interesting things then working as an ombudsman was that I did come across cases where people had had their credit score downgraded because the computer said so, and where the business was unable to explain why that had happened. I mean, from an ombudsman's perspective, I was in a position to decide what's fair and reasonable in all the circumstances of a case. In my view, it's very difficult to say that a decision is fair and reasonable if you don't know how that decision has been reached. But those kinds of decisions are being made about all of us all the time, every day, in different contexts. And it's deeply concerning that we're not often able to know exactly why a decision has been taken. And in many cases, we may find it quite difficult even to challenge those decisions or know who to complain to.

Todd Landman  17:14 

Yeah, and this gets back to core legal principles of fairness, of justice, of transparency of process, and of accountability in decision making. And yet all of that is being compromised by, let's say, an algorithm, or, as you say in the book, the computer says no.

Susie Alegre  18:49

Completely. And I think one of the key things to bear in mind is that even the drafters of the right to freedom of thought and opinion in the International Covenant on Civil and Political Rights discussed the fact that inferences about what you're thinking, or what your opinions are, can be a violation of the right even if they're incorrect. So when you find an algorithm making inferences about how risky a person you are, whether or not the algorithm is right, it may still be violating your right to keep your thoughts and opinions to yourself. You know, you should only be judged on what you do and what you say, not on what somebody infers about what's going on in your inner life.

Todd Landman  19:50 

Not on what you might be thinking.

Susie Alegre  19:52 

Exactly. Absolutely. Absolutely.

Todd Landman  19:54 

Right, now, we've had a couple of guests on previous episodes that I would put, broadly speaking, in the 'data for good' camp. And when I read your book, I feel like I'm going to broadly put you in the camp of 'data for bad'. And that might be an unfair judgement. But is there data for good here? I mean, because, you know, you cite the surveillance capitalism literature, you have, you know, endorsements from authors in that tradition. But if I were to push you, is there a data for good story that could be told nevertheless?

Susie Alegre  20:23 

I think there might be in public data. So for example, in the US, and I don't know if they are included in your guests, but there's Data for Black Lives, and they've done really interesting work with public data, you know, flagging where there are issues of racial and systemic injustice. So that kind of work, I think, is very important. And there is a distinction between public data and private data, although how you draw that distinction is a really complicated question. But in terms of our personal data, one of the things that I think is important in looking at how to address these issues is setting the lines for the things that you can never do. And what I hope is that if you set down some barriers, some very, very clear lines of what can never, ever be done with data, then you will find technology, particularly technology related to data, and that includes the use of AI interpreting and working with data, will develop in a different direction. Because at the moment, the money is in extracting as much personal information as you can out of every single one of us and selling it.

Todd Landman  21:40 

And the degree of the extraction of that information is both witting and unwitting. So you also make the point in the book that if somebody signs up for a Facebook account, they just hit agree to the terms and conditions. But actually, the time it takes to read the terms and conditions could be two or three days to get through the fine print. And so people are just saying yes, because they want this particular account, without actually knowing the degree to which they're sharing their personal information. Is that correct?

Susie Alegre  22:06 

Absolutely. And the other problem with the terms and conditions is that if you don't like them, what exactly are you going to do about it? Particularly if you're looking at terms and conditions to be able to access banking or access the National Health Service. If you don't like the terms and conditions, how exactly are you going to push back? But on that point that you've made as well about the consent button, there's also an issue around what are called dark patterns. So the way that technology is designed, and that our online experience is designed, to nudge us in certain directions. So if you're asked to agree to the terms and conditions, the easiest thing is to hit the big green button that says 'I consent'. Again, we see it with cookies, you know, often you've got a simple option where you hit 'I consent', or there's a complicated option where you can manage your cookie settings and go through a couple of different layers in order to decide how much you want to be tracked online. And so that is clearly pushing you, in a time-poor life, to hit the easiest option and just consent.

Todd Landman  23:16 

I feel that every day, you know, I read through Flipboard, which is a way of aggregating news sources from around the world by topic. And I sort of follow politics and law and international events, music and various other things. But with every news story I open up, because of GDPR I get a pop-up screen that says accept cookies or manage cookies. And I always say accept, because I want to read the story. But what I'm actually doing is telling the world I've read this story. Is that right?

Susie Alegre  23:43 

Yeah, absolutely. The cookies question as well is one where, actually, why should we be being tracked in all of our activities, all of our interests? And as you say, you know, telling the world that you've read this article is partly telling the world what you're interested in and what you're thinking about, not just that you've read this article in an abstract sense. You know, it's telling the world about your interests. One of the things that is also disturbing, that people often don't realise, is that it's not just what you read. It's even things that you may hover over and not click on that are equally being tracked. And it's not just on the page where you're reading the article. It's about being tracked all around your online activity, being tracked with your phone, being tracked where you are, not just what you're looking at on the phone. It's so granular, the information that's being taken, that I think very few of us realise it. And even if you do realise it, as individuals we can't really stop it.

Todd Landman  24:52 

And I think for that reason I take a little bit of comfort, because I wasn't targeted by Cambridge Analytica. I probably played some of the games on Facebook, you know, the personality test stuff, but I never got ads, as far as I was concerned, that were being, you know, foisted upon me by the Cambridge Analytica approach. I use that as, let's say, a metaphor. But I know that there was micro-targeting based on certain profiles, because there was an attempt to leverage voters who had never voted before, or voters who were predisposed to vote for certain things. But again, it's that unwitting sort of profile that you build by the things that you hover over, or the things that you like, or the things that you at least read and accept that button on cookies. And of course, we now know that that micro-targeting actually might have had a, you know, a significant impact on the way in which people viewed particular public policy issues.

Susie Alegre  25:41 

Completely. And, I mean, I don't know whether I was or was not targeted by Cambridge Analytica or similar around that time, around 2016/2017. I don't know if you've come across Who Targets Me, which is a plugin that you can put onto your browser to find out, particularly around election times, who is targeting you. And I have to say that when I very briefly joined a political party for a couple of months, I signed off my membership after a couple of months because I discovered that they were targeting me and people in my household, through this Who Targets Me plugin. So even though, theoretically, as a member I was already going to vote for them, that information was being used to pollute my online environment, as far as I'm concerned, which was a bit of an own goal, I imagine, for them.

Todd Landman  26:32 

So that really does bring us to the question of what is to be done. You know, I was waiting in the book for the regulatory answer, and you do give some good practical suggestions on a way forward, because there is this challenge: we need services, you know, we do need mortgages, we need access to health care, we need public information, we need all the benefits that come from the digital world. But at the same time, we need to protect ourselves against the harms that digital world can bring to us. So what are the sort of three or four major things that need to happen to mitigate the worst forms of what you're worried about in the book?

Susie Alegre  27:10 

Well, one of the difficulties in the book was coming up with those things, if you like, what are the key things that we need to stop, particularly in an atmosphere where we are seeing regulation happening rapidly, trying to play catch-up. We've just seen the Digital Services Act in the European Union being agreed, we have the Online Safety Bill on the table in the UK, and in Chile we've seen legislation around neuro rights being introduced in the last year. So it's a very fast-paced environment. So I was trying to come up with suggestions that go to the heart of it, while recognising the complexity and also recognising that it's in a huge state of flux. I wanted to really highlight the things that I think are at the core of how we've got here, and the core, very obvious things that we should not be doing. The first one of those is surveillance advertising. And that is advertising that is based on granular information, like we've been talking about, about our inner lives, including how we're feeling potentially at any single moment, in order to decide what images, what messages we should be delivered. And whether those are political messages, whether that is commercial messages, whether it's just trying to drag us into gambling when we're having a bad moment online, all of those kinds of things are part of this surveillance advertising ecosystem. And while surveillance advertising isn't the whole problem, I think that surveillance advertising is the oil that is driving this machine forward. If you don't have surveillance advertising, there isn't so much money in gathering all of this information about us, because that information is valuable because it can sell us stuff, whether it's selling us a political candidate, or whether it's selling us a particular pair of socks tomorrow. And so surveillance advertising, I think, is the key, and I think banning surveillance advertising would be the single most effective way to start change. Another thing that I think could make a real sea change in the way tech develops is recommender algorithms. And again, the things that are being recommended to us, the way that we receive our information, whether that is on Netflix, whether that is on news services, potentially very personalised recommendations of information are a way of distorting how we think and how we see the world, based on information about our emotional states, information about our psychological vulnerabilities, a whole raft of things that could feed into that. That, I think, is a real vehicle for social control. And so you may want, occasionally or even always, to have somebody suggesting what you should watch, when you're feeling tired and you don't want to make a decision yourself and you're happy to just be given whatever it is. But recommender algorithms and that kind of personalisation of information feeds should never, ever be the default. At the moment, for most of us, that is the situation. When we open up our laptops, when we open up social media, when we look at our phones, we're being given a curated, personalised experience without necessarily realising it. So addressing that, and making sure that personalisation is not the automatic choice, would make a really big difference.

Todd Landman  30:53 

It's just an amazing set of insights. You've taken us from Socrates to socks here today. And it's been an incredible journey listening to you, with so much to think about and so many unresolved issues. And when I listen to you, and I read your book, you know, I feel like I should get off the grid immediately and put my hood on, because I don't want anyone reading my mind and I don't want anyone selling me socks. But for now, Susie, it was just great to have you on this episode of The Rights Track, and thanks ever so much.

Susie Alegre  31:20 

My pleasure. Thank you so much for having me.

Christine Garrington  31:23 

Thanks for listening to this episode of The Rights Track, which was presented by Todd Landman and produced by Chris Garrington of Research Podcasts, with funding from 3DI. You can find detailed show notes on the website at www.RightsTrack.org. And don't forget to subscribe wherever you listen to your podcasts to access future and earlier episodes.