TRIGGERnometry - September 04, 2024


Do We Need to Censor Hate Speech? - Imran Ahmed


Episode Stats

Length

1 hour and 9 minutes

Words per Minute

180.6

Word Count

12,555

Sentence Count

828

Misogynist Sentences

6

Hate Speech Sentences

14


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
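For readers who want to reproduce per-sentence counts like the ones in Episode Stats above, here is a minimal sketch using the Hugging Face transformers library with the two classifier checkpoints named above. It is an illustration only: the positive label strings below are assumptions that should be checked against each model card, and the sentence list is a placeholder rather than the real transcript.

```python
# Minimal sketch: count flagged sentences with the two classifiers named above.
# Assumptions: both checkpoints load as standard text-classification pipelines,
# and the positive label names below match the model cards (verify before use).
from transformers import pipeline

misogyny_clf = pipeline("text-classification",
                        model="MilaNLProc/bert-base-uncased-ear-misogyny")
hate_clf = pipeline("text-classification",
                    model="facebook/roberta-hate-speech-dynabench-r4-target")

def count_flagged(sentences, clf, positive_labels):
    """Return how many sentences receive a top label in positive_labels."""
    results = clf(sentences)
    return sum(r["label"].lower() in positive_labels for r in results)

sentences = ["Placeholder transcript sentence one.",
             "Placeholder transcript sentence two."]
print("Misogynist sentences:",
      count_flagged(sentences, misogyny_clf, {"misogynist", "label_1"}))
print("Hate speech sentences:",
      count_flagged(sentences, hate_clf, {"hate"}))
```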
00:00:00.760 You know, your sort of thing of, well, I'm willing to tolerate the normalization of anti-Semitism
00:00:05.960 because some people were banned for saying that men can't be women.
00:00:09.260 Imran, you can't disagree with something I didn't say.
00:00:13.900 Should you be able to call for an armed revolution on social media?
00:00:19.460 Do you think they should?
00:00:20.760 If we were living in the social media age in 1776, this country never would have been created.
00:00:25.820 In what way is Donald Trump?
00:00:27.240 He was banned from all the social media platforms.
00:00:29.740 Were you worried about his ability to get his message out?
00:00:33.080 Well, his ability to get his message out was definitely reduced significantly,
00:00:36.700 as you said yourself, when you ban people from social media.
00:00:39.380 You said it yourself, they lose reach, right?
00:00:46.080 But it's sad to me that you can't define, that you can't see hate speech normally.
00:00:49.760 Hold on a second, you're misrepresenting what I'm saying.
00:00:51.940 Again, rewind and just listen.
00:00:53.620 Good, I hope people do that because they'll see that you're misrepresenting what I'm saying.
00:00:57.260 For the main part, it's fucking obvious.
00:01:00.800 I tell you what, you are being disrespectful to the people who do receive hate speech.
00:01:07.220 Like me.
00:01:08.220 I warn you that most of your listeners will think themselves,
00:01:11.600 actually, Constantine's being a bit of a dickhead right now.
00:01:15.280 And calling me a dickhead might be considered hate speech.
00:01:17.600 No, no, it's not hate speech, it's just good observation.
00:01:23.560 Imran, welcome to TRIGGERnometry.
00:01:25.020 Great to have you on.
00:01:26.340 You are the founder and CEO of the Center for Countering Digital Hate,
00:01:30.480 which is an organization that has, you know, some people love it,
00:01:33.840 some people, it's like Marmite, some people love it,
00:01:35.420 some people hate it, some people think it's very important.
00:01:37.480 Other people, like Elon Musk, are suing you.
00:01:39.980 So we want to get into all of that.
00:01:42.280 First of all, tell us a little bit about your background.
00:01:44.880 What made you want to do that?
00:01:46.700 How did you even get to starting it?
00:01:48.600 Sure.
00:01:49.440 So the genesis of CCDH was in 2015, 2016.
00:01:55.940 I was a political advisor to the Right Honourable Hilary Benn MP,
00:02:01.540 who at the time was the Shadow Foreign Secretary.
00:02:04.740 And of course, it was that time when Jeremy Corbyn had taken over the Labour Party.
00:02:07.860 So in the winter of 2015, there was a decision taken by Parliament
00:02:15.120 as to whether or not to join the United States and its allies
00:02:19.480 in military action against the Islamic State in Syria.
00:02:23.080 Jeremy opposed that decision.
00:02:26.180 Hilary supported it.
00:02:28.160 I remember being in the office with him, talking about it, the reasons why.
00:02:31.820 What really motivated him, what really motivated me,
00:02:34.460 and I come from a Muslim background myself,
00:02:36.800 was reading and seeing the intelligence reports and the reporting by brave journalists
00:02:42.860 about the murder of Yazidi women who were too old to rape in Mount Sinjar.
00:02:51.260 And Hilary gave what was, by all accounts, a highly acclaimed speech.
00:02:57.040 It led to that standing ovation famously in Parliament,
00:02:59.460 which is quite unusual for those viewers that don't know how Parliament works.
00:03:02.900 You're not allowed to clap, really.
00:03:05.700 And the next day, we received a mountain of abusive, anti-Semitic comments saying,
00:03:12.340 why are you fighting the Jews' war for them?
00:03:14.620 Which was bizarre, because what did it have to do with Jews?
00:03:17.600 This was actually a war of Muslims killing Muslims.
00:03:20.480 And, you know, a ton of abuse and phone calls and everything else.
00:03:27.260 And it seemed coordinated.
00:03:29.060 In fact, we knew it was coordinated and we could find the spaces where people were coordinating
00:03:33.100 the very specific ideas that they were promulgating on Facebook groups and other spaces.
00:03:38.880 A few months later, like a piece of chattel, I was traded to Alan Johnson.
00:03:44.220 When you're an advisor, you are basically, you know, chattel.
00:03:47.460 So I was given to Alan Johnson.
00:03:49.380 He liked the look of me.
00:03:50.380 He asked me to be powdered and brought to his office to be his advisor for the referendum
00:03:54.320 campaign, because, of course, Jeremy didn't want to lead that.
00:03:56.560 And Alan stepped up.
00:03:59.160 And during that campaign, we saw a really high prevalence of hate, of conspiracy theories.
00:04:04.800 There was stuff about election denial.
00:04:06.300 There were people saying that if you voted in pencil to leave, the election
00:04:13.280 agents had been told to rub that out and put Remain in pen.
00:04:16.420 And so when people were walking into the election booths, they were literally waving their pens
00:04:21.620 to people.
00:04:21.920 You can't rub out my vote, mate.
00:04:24.800 But also there were much more pernicious lies being spread on social media platforms, including
00:04:30.280 the lie that the EU was trying to bring in Muslims and black people to rape 14 year old
00:04:34.300 girls and destroy the white race.
00:04:36.320 It's a variant of the Great Replacement Theory, which we know has led to the massive loss of
00:04:41.380 life in Christchurch of Muslims, in Pittsburgh of Jewish people.
00:04:46.580 And then, of course, there was the assassination of Jo.
00:04:49.760 And so that's Jo Cox.
00:04:53.000 Yeah.
00:04:53.340 I mean, Jo Cox was a 35-year-old mother of two.
00:04:56.620 who represented the constituency of Batley and Spen.
00:05:01.680 And she was.
00:05:04.280 I say the words every time and I say them because it's important to remember just how
00:05:08.320 inhuman the loss of her life was.
00:05:12.500 She was shot, stabbed and beaten to death on the streets of her constituency by a man who
00:05:18.580 believed these conspiracy theories and lies and had imbibed for years and years the lies that
00:05:24.860 underpin hate.
00:05:27.160 And he thought it was right to take the life of a young woman and did.
00:05:31.180 And at that point, I just thought something is wrong.
00:05:35.920 Something's profoundly wrong with our politics.
00:05:38.600 Now, when he was killing her, it was reported at the time that he shouted 'Britain first, death
00:05:43.400 to traitors'.
00:05:44.260 Death to traitors was something that these groups like, you know, the sort of the people
00:05:48.560 who are promulgating the Great Replacement Theory, they were saying that these people
00:05:52.260 are traitors to Britain.
00:05:53.300 But Britain first really struck a chord with me because about a year and a half earlier,
00:05:58.480 I'd been told that Britain first was the first political movement to achieve a million likes
00:06:02.780 on Facebook in the UK.
00:06:04.660 When that happened, my response was, who gives a shit?
00:06:08.420 They've got a million clicks.
00:06:09.820 We've got half a million members.
00:06:11.740 They've just got clicktivists.
00:06:13.140 They're nothing.
00:06:14.420 And I was wrong.
00:06:15.340 And what really came to me in that moment was the realization that conventional institutions
00:06:23.200 like politics, like journalism, like the government had forgotten, had not realized that the primary
00:06:31.540 locus of information exchange in 2016, the primary locus of where we set our social mores,
00:06:40.040 our norms of attitude and behavior, where we negotiate our values, even where we negotiate,
00:06:44.520 the corpus of information that we call facts had shifted to digital spaces.
00:06:49.540 And they worked to different mathematics than the normal world.
00:06:53.980 The way that amplification works, the way that 50 people can create 5,000 notifications
00:06:58.940 in 10 seconds and create social proof from nowhere, even though they are fringe and, you
00:07:05.020 know, a disproportionately small number of voices.
00:07:07.720 And I wanted to understand better how bad actors were weaponizing those spaces.
00:07:14.400 So I spent two years working on that.
00:07:17.280 I quit my job.
00:07:18.540 I did it.
00:07:19.480 Didn't get paid.
00:07:20.620 I went and worked with the platforms.
00:07:23.040 I said to them, look, guys, I'm finding all this stuff.
00:07:25.060 And they were saying to me, brilliant.
00:07:26.740 Come on in.
00:07:27.600 Tell us everything you've learned.
00:07:28.720 And it took me two years to work out that the entire policymaking process and stakeholder
00:07:33.380 engagement process at the social media companies was an elaborate gaslighting scheme.
00:07:38.400 And at that point...
00:07:39.740 What does that mean exactly?
00:07:40.820 It just meant that it was never going to go anywhere.
00:07:42.960 Like they were just, they were pretending to care and pretending to listen.
00:07:46.580 The actual moment where it happened, where I realized it was an elaborate gaslighting scheme
00:07:50.460 is when I went into Meta with my friend, Rachel Riley, and we brought in a dossier of extreme,
00:07:58.860 extreme antisemitism, which we'd found in a group that was named, it was called Truthers
00:08:06.640 Against Zionism or something.
00:08:09.680 And we took it in and we showed them and they said, whoa, this is amazing.
00:08:13.140 We've never seen this before.
00:08:14.080 This is very serious.
00:08:15.060 This is criminal level stuff.
00:08:16.600 We're going to have to, you know, spend some time studying it.
00:08:19.300 We can't take it down because otherwise we'll lose all this vital data.
00:08:23.300 The thing is, I knew that they'd seen that group six months earlier, the very same executive
00:08:27.780 because Hope Not Hate had taken it into them.
00:08:30.020 And a year earlier, because the Community Security Trust, the primary body that defends
00:08:35.600 the physical security of Jews in Britain, had taken it into them.
00:08:39.320 So they were lying to me.
00:08:40.940 And at that point, I just thought, well, this is going nowhere.
00:08:43.260 So it's time for us to change tactics.
00:08:45.620 And that's why we launched CCDH.
00:08:47.500 We launched CCDH to create transparency, to create public awareness of the ways in which
00:08:55.440 those platforms were being weaponized by bad actors.
00:08:58.360 But also, crucially, that the platforms were in it too, that they were unwilling to do anything
00:09:03.940 about it because, frankly, it was making them money.
00:09:06.440 And at that time, we were developing our economic analysis and economic theory of why hate and
00:09:11.760 disinformation have become so profitable for both the producers and the disseminators of
00:09:19.300 the hate and lies.
00:09:20.860 And why is it that they've become so profitable?
00:09:23.220 Because they grab attention.
00:09:25.400 And that's just as simple as that.
00:09:27.420 Look, a social media company is just a billboard company.
00:09:31.180 That's all it is.
00:09:32.760 It's nothing more sophisticated than that.
00:09:34.980 98% of the revenues of Meta come from advertising.
00:09:39.180 And, you know, Mark Zuckerberg is younger than me.
00:09:41.480 He may be your age.
00:09:43.400 He's worth $100 billion.
00:09:45.240 That's a depressing thought.
00:09:46.180 Yeah.
00:09:47.400 He's built an amazing business model.
00:09:49.680 And the business model is amazingly clever.
00:09:51.960 It's really parasitic.
00:09:53.560 It doesn't actually produce anything.
00:09:55.480 Everyone else produces the content.
00:09:57.260 They just have an algorithm that reorders it.
00:09:59.560 Because, of course, you can't see what 5.5 billion people are saying, right?
00:10:02.500 Because that would just be nuts to have all those voices in your head.
00:10:06.160 So instead, what they do is they reorder it.
00:10:08.140 And they reorder it based on engagement.
00:10:09.900 And engagement really is driven by, you know, what gets more engagement is more violative content.
00:10:17.400 In fact, Mark Zuckerberg himself once presented a chart, a graph, that showed on the x-axis was
00:10:23.640 how violative the content was, essentially how close to breaking the rules.
00:10:27.700 And the y-axis is engagement.
00:10:29.760 And what you get is, as it's innocuous content, very low engagement, stays flat, stays flat,
00:10:37.820 stays flat.
00:10:38.380 And then you get to violative and it shoots up with an exponential curve.
00:10:42.140 So they know that basically the most contentious content, the most near-violative content,
00:10:47.540 the stuff that's breaking the rules, actually gets the most engagement.
00:10:52.120 Now, if you amplify that, if you amplify based on engagement, you are therefore amplifying
00:10:58.020 the most violative content.
00:10:59.600 So you've got a mechanism that constantly is shifting the norms of our society because
00:11:04.680 of frequency bias, because of how frequently we're seeing this content towards the most
00:11:09.220 violative content.
00:11:10.420 And that has a net effect on societies over time.
00:11:13.360 It changes them.
00:11:14.960 It makes them more fearful, more angry-seeming, more brittle, more polarised, more hateful.
00:11:21.300 And it undermines the values that underpin democracy.
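To make the mechanism described above concrete, here is a toy sketch. The curve shape and all numbers are illustrative assumptions, not data from the episode or from Meta: if engagement stays flat for innocuous content, spikes as content approaches the policy line, and the feed is ranked purely by engagement, then the top of the feed ends up far more "violative" than the average post.

```python
# Toy illustration of the argument above (assumed curve, not real platform data):
# engagement is roughly flat for innocuous content and spikes near the policy
# line, so a purely engagement-ranked feed over-represents near-violative posts.
import math
import random

def engagement(violativeness):
    # violativeness in [0, 1]; 1.0 sits right at the "violative" boundary
    return 1.0 + 50.0 * math.exp(8.0 * (violativeness - 1.0))

posts = [random.random() for _ in range(10_000)]      # uniform violativeness scores
ranked = sorted(posts, key=engagement, reverse=True)  # engagement-ranked feed

top100 = ranked[:100]
print(f"Mean violativeness, all posts:   {sum(posts) / len(posts):.2f}")
print(f"Mean violativeness, top of feed: {sum(top100) / len(top100):.2f}")
```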
00:11:24.520 And for me, CCDH is one of the first of the breed of organisations that's looking at
00:11:29.100 the macro-level impact of these organisations, these companies, and the way they reorder our
00:11:34.840 information ecosystem and saying, well, crumbs, what does that mean for democracy?
00:11:39.900 What does it mean for our ability to just rub along as people?
00:11:42.400 And, you know, have a happy, cohesive society?
00:11:47.800 Because that's a good thing, we think.
00:11:50.400 Yeah.
00:11:50.840 Look, I agree with what you're saying.
00:11:53.500 I guess my point is, who determines what is extremist content?
00:12:00.640 Because let's go back to...
00:12:02.500 I didn't say extremist content.
00:12:03.840 I said violative.
00:12:04.740 Violative.
00:12:05.260 This is their own definition.
00:12:06.440 So, of course, when you join any platform, you sign up to their community standards, right?
00:12:10.820 Right.
00:12:11.040 So, I mean, they have community standards that say you can't promulgate hate, you can't
00:12:15.260 promulgate disinformation, you can't do this, you can't do that, you can't do this, you
00:12:19.100 can't do that.
00:12:20.120 And so they have made the definitions.
00:12:22.440 In fact, with, for example, Facebook, you know, there's been discussions over time about
00:12:26.100 whether or not Holocaust denial should be violative on their platform.
00:12:29.020 And they've decided over time, I remember five years ago speaking to a very senior executive
00:12:35.780 at Facebook, and he was saying, absolutely never will we make this ban on our platform,
00:12:40.180 because unless you ban it in the real world, we can't ban it on our platform.
00:12:44.320 Now, that's not true.
00:12:45.460 They have an absolute First Amendment right in the US to decide what's on their platform
00:12:50.500 and what's not.
00:12:51.280 But they made the choice over time to actually say, well, no, we've changed our minds, we're
00:12:56.280 going to ban it.
00:12:57.120 And so they decide what's violative on their platform.
00:13:00.140 And of course, what defines for them what's violative is what advertisers will tolerate.
00:13:07.320 And that is the crucial point, of course, is that advertisers are the one moral voice
00:13:12.280 and they're going, not really sure if we want to place adverts on a stream of Nazi videos.
00:13:20.000 So it's the advertisers essentially who are in control is what you're saying, because
00:13:24.360 it's the advertisers that hold the purse strings.
00:13:26.680 Well, they have they have a considerable voice in it, but the control is held by it's a negotiated
00:13:31.780 thing, right?
00:13:32.540 That's how civil society works.
00:13:34.220 I have my voice.
00:13:35.360 I say it's disgusting to be giving a megaphone to hate and extremism and disinformation.
00:13:43.020 They have their voice saying, well, people should have the right to speak out.
00:13:46.740 The advertisers have their voice.
00:13:48.340 Governments have their voice.
00:13:49.600 You know, that's the way that civil society works.
00:13:51.960 And generally speaking, when platforms have gone off the rails, you've seen those elements
00:13:57.920 of society work together to try and change things.
00:14:01.660 So, for example, when Elon Musk took over X, now, you know, Elon Musk, brilliant engineer.
00:14:09.260 And certainly a very, very good CEO.
00:14:12.100 You know, I admire the way that he runs his companies, the way that he engineers processes
00:14:16.520 in a really intelligent way and ekes out every bit of efficiency and value he can from them.
00:14:24.320 But when he took over that platform, he said, it's a free speech zone.
00:14:28.700 And he hadn't thought about it.
00:14:29.960 He said, it's a free speech zone.
00:14:31.220 You say whatever you want.
00:14:32.260 He let tens of thousands of people who've been banned from the platform under the previous
00:14:36.460 regime, which is an incredibly lax regime, frankly.
00:14:39.220 I would push back on that.
00:14:40.620 I know plenty of people who got banned off Twitter, as it was then known, for saying things
00:14:44.860 like men can't be women and women can't be men.
00:14:47.340 That, to me, doesn't seem lax.
00:14:48.720 That is a direct infringement of freedom of speech.
00:14:51.060 I'm sure you can find a couple of people.
00:14:52.900 There's a lot of people.
00:14:54.180 There's a lot of people.
00:14:55.080 And Imran, just to make the point as well, there were a lot of people who were shadow
00:14:59.840 banned as well because of...
00:15:01.580 But let Imran also make his point first.
00:15:04.000 I don't know how shadow banning works.
00:15:05.300 And it's one of those things that I'm not quite sure if it's just a made up thing or if it's
00:15:08.240 real thing.
00:15:09.100 But I mean, so with banning the people, there are lots of people who are undoubtedly hate
00:15:15.000 actors spreading hatred against black people, LGBTQ plus people, blah, blah, et cetera, et cetera.
00:15:19.220 And so when you let them back on the platform, we did a study to have a look at what happened
00:15:24.120 on the platform when he came in and signaled that what would be acceptable on his platform
00:15:30.320 is hate content.
00:15:32.220 There was a 202% increase in the use of the N-word on Twitter in the week after he took
00:15:39.440 over compared to the daily rate for the year prior.
00:15:43.720 Now, what that showed was that people had come on there and thought, well, I can get away
00:15:48.700 with this.
00:15:49.140 He'd actually shifted the social mores, the norms of attitude and behavior.
00:15:53.360 Now, we did a piece of research that showed that to be true.
00:15:55.760 We quantified it.
00:15:56.840 It went on the front page of the New York Times and advertisers flooded out of his platform.
00:16:01.360 He says it cost him $100 million in advertising loss.
00:16:04.760 And he sued us for it.
00:16:06.840 He sued us saying that we had the temerity to...
00:16:11.380 He didn't actually sue us for defamation because we were right.
00:16:14.000 But what he sued us for was doing the research itself and then publishing the research.
00:16:20.060 And that case was dismissed by the courts as a violation of our First Amendment rights.
00:16:26.260 And he has to...
00:16:26.880 In fact, in America, the way that civil litigation works is if you sue someone and you lose, they
00:16:33.140 still have to pay their own costs.
00:16:35.080 In this instance, the court actually awarded us costs as well.
00:16:38.620 So it's an extraordinary judgment.
00:16:41.140 Now, he is appealing it, which I'm very much looking forward to because we raised a lot
00:16:45.360 of money by saying that Elon Musk was suing us.
00:16:47.760 And he'll have to pay our costs if he loses the appeal too.
00:16:50.760 But that just goes to show you that these free speech absolutists like Elon Musk, actually
00:16:56.620 are very thin-skinned when it comes to us holding up a mirror to them.
00:17:00.100 We weren't even criticizing.
00:17:01.820 We simply held up a mirror.
00:17:03.380 Advertisers took the action that they did because, of course, they absolutely have the
00:17:07.460 right to say that we don't want our brands appearing next to the N-word or, you know,
00:17:13.140 abuse of women or abuse of gay people or abuse of Jewish people or whoever else.
00:17:19.320 And so that's the way that this system works.
00:17:21.720 Now, the problem is that, of course, what's become more difficult for this system, which
00:17:25.420 is not self-regulating but is systemically regulating, is that the platforms have been shutting
00:17:31.660 down the primary means by which we're able to create checks and balances on them, and
00:17:36.800 that's transparency.
00:17:38.360 So whether it is Facebook, for example, buying the only system that was able to do systemic
00:17:44.340 analysis of their platform and then shutting it down, whether it's Twitter suing its critics
00:17:50.100 or suing people that research it, whether it's TikTok, which in response to a study that we
00:17:55.660 did showing, using their own statistics, that eating disorder content on their site had been viewed
00:18:01.380 13.2 billion times.
00:18:02.700 They shut down the area of their site that provided those statistics.
00:18:06.140 We know that these platforms are incredibly unwilling to actually engage a free and open
00:18:10.960 discourse about how they run their platforms and the impact on society.
00:18:15.640 Imran, I'm really glad we're having this conversation because, as we mentioned before we started, I
00:18:21.600 think we're probably coming at this from different positions initially.
00:18:24.960 But I think the one thing I've been saying for a long time is, first of all, no one who's
00:18:29.340 in charge of a platform can be a free speech absolutist.
00:18:32.500 It's not possible because once you are in charge of a system where there have to be some
00:18:36.900 rules, then absolutism is impossible, number one.
00:18:40.200 Number two, I think my view is I think we will look back on this period of the emergence
00:18:47.900 of social media in a similar way to the way that we looked at tobacco companies telling
00:18:52.020 people that their product was good for them and they should be taking more of it and it doesn't
00:18:56.860 cause any cancer.
00:18:59.540 That was another perspective.
00:19:02.680 But at the same time, and I think this is really important for us to discuss, I have
00:19:09.240 definitely noticed since Elon Musk took over Twitter, X, whatever you want to call it, that
00:19:15.860 there's been more anti-Semitic content in my feed and in replies to me, etc.
00:19:21.680 And generally speaking, I have to say, I'm pretty comfortable with that because what I've
00:19:27.040 also noticed is that a lot of people who otherwise were not able to say things that they should
00:19:32.140 have been able to say, in my opinion, were being censored, were being banned, etc.
00:19:36.580 Those people are now back on the platform and able to say whatever it is, men can't change
00:19:41.740 sex, women can't, whatever it is.
00:19:43.300 Right?
00:19:44.300 So, I think what we're talking about is how do you negotiate the conversation between that
00:19:50.440 tension that's always existed in human communication, between safety and freedom.
00:19:55.720 You're grinning at me in a way that tells me you don't agree with what I'm saying.
00:19:59.580 Oh, I just think it's remarkable that you're willing to tolerate the normalization of anti-Semitism
00:20:04.060 to make a point about trans people.
00:20:05.900 I'm willing to tolerate people being hateful towards me.
00:20:09.900 Towards Jews, you just said.
00:20:11.900 Yes, I'm Jewish.
00:20:12.900 Right.
00:20:14.080 In order to make a point about trans people.
00:20:15.980 Not in order to make a point about it, that's not what I said.
00:20:18.340 In order that people do not get banned off the platform for expressing what is a reasonable
00:20:22.160 opinion about biology or sex or anything of that kind.
00:20:26.340 I'm not personally aware of anyone being banned from any platform.
00:20:29.300 Meghan Murphy.
00:20:30.300 Meghan Murphy.
00:20:31.300 Meghan Murphy is a gender-critical feminist who was banned from Twitter.
00:20:33.840 What was she banned for?
00:20:35.240 People saying men can't change sex or women can't change sex.
00:20:38.240 Okay.
00:20:39.240 I accept that.
00:20:40.240 That was-
00:20:41.240 There are many other people like that.
00:20:42.240 And if that was not in breach of their own rules, and they have the right to set their
00:20:46.400 rules, right?
00:20:47.400 Every platform has the right to set those rules.
00:20:48.400 If it was not in breach of their rules, that's the wrong decision to take.
00:20:51.360 It's one of the reasons why, for example, in our- So just to be really clear, I mean,
00:20:56.740 I think you've come at this from a very weird perspective when it comes to CCDH.
00:21:00.580 We don't want to-
00:21:01.580 I'm not having a go at you.
00:21:02.580 No, it's okay.
00:21:03.580 I'm trying to explore the argument.
00:21:04.580 So the problem is that we understand that people have a- I actually do think that people
00:21:11.580 have a First Amendment right to say whatever the hell they want.
00:21:14.580 I don't have a right to say whatever I want in your program.
00:21:16.580 So if I started chanting swear words right now, presumably you wouldn't broadcast it.
00:21:20.580 If I started chanting-
00:21:21.580 How do you know?
00:21:22.580 Well, we could give it a go if you'd like.
00:21:25.580 Go for it.
00:21:26.580 I'd rather not.
00:21:27.580 Okay.
00:21:28.580 I mean, I have somewhat of a reputation to uphold here.
00:21:31.580 But, you know, the point is that in any space, a platform can take decisions about what they
00:21:40.580 accept and what they don't.
00:21:41.580 The problem is that the rules are badly enforced.
00:21:44.580 Yes.
00:21:45.580 Yes.
00:21:46.580 And so here's our agenda.
00:21:48.580 What we say is when it comes to these platforms that, and I agree with you, I think in 20 years'
00:21:53.580 time we'll look at them like, you know, we looked at the way that tobacco companies lied about how
00:21:57.580 they operated when, you know, when people said that having seatbelts would be an impingement on my
00:22:02.580 freedom as a human being, and of course they are now de rigueur in all cars that come out.
00:22:09.580 That if you look at the damage that these platforms can do, both systemically in terms of the normalization
00:22:15.580 of hate and extremism, but also the most important work that we do, which is on children and the damage
00:22:23.580 that's been done to their mental health.
00:22:24.580 We've done enormously important studies on things like eating disorder, self-harm content, and also the
00:22:28.580 recommendation of steroid-like drugs and unachievable body images to young men.
00:22:34.580 I think that we will look back on that and kind of think how on earth did we expose our children to an
00:22:39.580 unregulated psychological experiment that's led to a global shift in mental illness amongst young people.
00:22:49.580 But I do think that if you come back to the way that these platforms operate, there's no transparency in how they
00:22:57.580 take decisions on how they enforce their rules.
00:23:01.580 So the three components of transparency we want are algorithms.
00:23:05.580 How do you choose which content wins and which content loses?
00:23:09.580 Because that's what the algorithm is.
00:23:11.580 It decides who wins and who loses.
00:23:13.580 And we know that what they want to win is the most contentious content, the most egregiously harmful content,
00:23:19.580 hate, disinformation.
00:23:21.580 Second thing is we want to have transparency on the advertising.
00:23:25.580 So we understand how advertising vitiates the decision of what to present.
00:23:30.580 How the advertising is presented within the system as well.
00:23:34.580 Because, of course, increasingly what you get is a system which tries to mask advertising as organic content.
00:23:40.580 The third thing that we want transparency on, and I think this is the most important one.
00:23:44.580 What are your rules?
00:23:45.580 How do you enforce them?
00:23:47.580 What rule did you apply to take this decision?
00:23:49.580 And why did you come to that conclusion?
00:23:53.580 Because anyone that's banned from a platform or have their post removed should be told you broke this rule.
00:23:58.580 They don't do that.
00:23:59.580 No, they don't.
00:24:00.580 No, they don't.
00:24:01.580 Which is what we were talking about, right?
00:24:02.580 And I think that's dumb as hell.
00:24:04.580 So I think that, you know, your sort of thing of, well, I'm willing to tolerate the normalization of anti-Semitism.
00:24:10.580 That's not what I said.
00:24:11.580 Because some people were banned for saying that men can't be women.
00:24:13.580 That's not what I said.
00:24:14.580 Imran.
00:24:15.580 I disagree with you on that.
00:24:16.580 Hold on.
00:24:17.580 You can't disagree with something I didn't say.
00:24:18.580 I disagree with you on men and women.
00:24:20.580 But that's fine.
00:24:21.580 And we can talk about men and women if you like.
00:24:23.580 I'm happy to.
00:24:24.580 You just said that I said I'm willing to tolerate the normalization of anti-Semitism.
00:24:28.580 That is not what I said.
00:24:29.580 What did he say?
00:24:30.580 What I said is there has been an increase in anti-Semitism.
00:24:32.580 It's not like there wasn't anti-Semites on Twitter before, right?
00:24:35.580 I have seen more of that type of thing on my feed.
00:24:38.580 There's not normalization.
00:24:39.580 I'm not saying it's normal.
00:24:41.580 What did he say after that?
00:24:42.580 What I didn't say is I want the normalization of anti-Semitism because I want people to be able to be bigots against trans people.
00:24:49.580 Which seems to be slightly what you're saying.
00:24:51.580 What I said.
00:24:52.580 I think if people go and rewind right now, they'll find that's exactly what you said.
00:24:55.580 Okay, well, let me finish.
00:24:56.580 Here's the point.
00:24:57.580 You don't have to balance.
00:24:58.580 Imran, you've got to let me finish the point.
00:24:59.580 You don't have to balance hatreds.
00:25:00.580 Imran, I'll let you talk.
00:25:01.580 Let me talk.
00:25:02.580 Hold on.
00:25:03.580 There's no filibustering here.
00:25:04.580 You're going to get plenty of time.
00:25:05.580 You've got an hour more of our conversation.
00:25:06.580 Let me finish my point.
00:25:07.580 Let's actually discuss what you're saying and what I'm saying instead of these straw men that we make up.
00:25:12.580 So I don't want the normalization of anti-Semitism, nor do I want people to be bigots against trans people.
00:25:19.580 Good.
00:25:20.580 The point I'm trying to make to you is there is a relationship which has been acknowledged throughout the centuries, not just online obviously, but in all sorts of communication about the relationship between the amount of freedom that we have on the one hand and the amount of safety that we have on the other hand.
00:25:34.580 When we open up the range of things that people can do or say, we necessarily decrease the amount of safety.
00:25:41.580 That's the relationship that those two things have.
00:25:44.580 So what I'm saying to you is someone like me was not comfortable with the amount of censorship that we saw on Twitter in the past.
00:25:51.580 That does not mean that I'm also not against showing kids eating disorder content.
00:25:57.580 We just had psychologist Jonathan Haidt on the show to talk about the dangers of social media.
00:26:02.580 We've had many people on the show to talk about that.
00:26:04.580 So when it comes to kids, when it comes to genuinely hateful content where it's like people are being incited to commit violence, et cetera.
00:26:11.580 But I don't want somebody to be censored from saying, you know, you're a Jew and therefore you support this cause or whatever to me.
00:26:18.580 And as a result, creating an environment in which people are really, really prevented from having debates about things like, you know, the trans issue, having debates around things like vaccine safety and efficacy and having other conversations that really matter to our society.
00:26:34.580 That's the point I'm putting to it.
00:26:35.580 And I agree with you. I think people should be able to have those debates.
00:26:38.580 The question is whether or not these companies should be able to amplify particular voices to make them win or make them lose based on, you know, based on an algorithm which rewards only one thing, which is engagement.
00:26:51.580 You know, the freedom of speech is not the freedom of reach.
00:26:54.580 It's not it's not anyone's God given right to be given a megaphone, nor is it anyone's God given right to be to be paid for that either.
00:27:03.580 And with the way that platforms work, both the, you know, the monetization of this content and then the revenue sharing of that content, what you actually are incentivizing is things like people who are who are performatively offensive.
00:27:17.580 Agreed. Yeah.
00:27:18.580 For the purposes of generating, generating attention.
00:27:21.580 You know, I think about the effort that you guys have put into putting this together. Right.
00:27:25.580 Both of you, you've, you've, you've, you put in thought and care into selecting your guests.
00:27:31.580 You, you, you're trying to have a debate, an actual extension of human understanding, conversation, extension of human understanding and knowledge, bringing two people together and creating something bigger than just the sum of our two opinions, trying to create insight, those flashes of things that really matter.
00:27:46.580 That's admirable. That's not the game they're in. Don't for a second, let Elon Musk persuade you that he's in the game of trying to extend human consciousness.
00:28:02.580 When it comes to Twitter, he's in the game of trying to make some money from advertising and he's failed catastrophically at it because he just didn't peg that it wasn't just about the number of eyeballs.
00:28:15.580 He thought if I can optimize the number of eyeballs, I can make even more money. If I turn this into an ever moving car crash, he thought, right, it's a billboard company.
00:28:24.580 I know what I'll have car crashes. I'll put billboards on them. Everyone will watch them. Bob's your uncle. Loads of money.
00:28:31.580 It turns out people don't want to put adverts on car crashes. And that's what Twitter's become.
00:28:36.580 It's become the equivalent of a billboard attached to a slowly unrolling, brutal car crash in which the rights of thousands, millions, billions of people, women, gay people, Jews, Muslims, black people are constantly being diminished to amplify the voices of a small fringe of, frankly, lunatics.
00:29:00.580 So the thing that I get really quite uncomfortable with in these conversations is the term disinformation, because we know, for instance, when we look at the pandemic, what was classified as disinformation at one point then becomes either accepted or scientific consensus in a few years time, particularly the issue of masks, for example, where it was proven that if you have a mask that is not an N95, it's a paper mask.
00:29:27.580 It is of no use, of no use whatsoever. So that is my issue, Imran. That's something that I have real problems with going. This is disinformation within certain reason, because you go, how do you know at this point?
00:29:42.580 So let's break this down to three types of disinformation. And I'll tell you what CCDH focused on as an organization. So when I left school, I went to medical school. In 1996, I went to UCLMS.
00:29:55.580 I did preclinical medicine there, the two years of theory, and then I quit before I did clinical because I hated it. While I was there, up the road was Andrew Wakefield writing the paper that said that MMR, the vaccine, the measles, mumps, rubella vaccine causes autism, which we know to be untrue based on fabricated data.
00:30:16.760 And at the time he was being paid, he was being paid by personal injury lawyers, hoping to bring autism cases against vaccine companies on behalf of the parents of autistic children.
00:30:28.200 That paper caused real devastation, real damage, especially in the northwest of England, where I come from. And kids were harmed immeasurably. Measles is still one of the biggest killers of children in the world today, still.
00:30:43.440 And so I've had an interest in how lies are spread about vaccines for some time. When the pandemic kicked off, we saw three types of disinformation flowing. One was the stuff about whether it was China or a lab leak. Don't care. Personally, I don't care. And we did not ever classify that as disinformation or not.
00:31:08.360 It was just idle ramblings, as far as I could see. And mainly sort of like, you know how people do on Twitter, where they start talking about geopolitics and they're not really talking about geopolitics. They're just talking about whether or not they think China's mean or not.
00:31:21.940 All right. The second type was people saying, you know, asking about things like school closures, masks. That stuff all seemed like, well, there's people who are complaining about the decisions taken by government.
00:31:36.840 I think in a moment like this, I know how difficult it is in government. I've worked in politics and I think to myself, it's kind of shitty and mean on those politicians who are having to make decisions based on insufficient information at pace.
00:31:48.140 And I thought that was, I thought some of it was unfair, that criticism, especially the sort of the with hindsight criticism of it.
00:31:55.660 Yeah.
00:31:55.900 It's a little bit unfair. Where we were really concerned, where all of CCDH's effort went in, was people who were fundamentally spreading untruths about vaccines or about the deadliness of COVID or about the intent of medics who were coming up with it.
00:32:16.040 One of the biggest spreaders of disinformation about vaccines, for example, prior to the vaccine launching, were the left in the US because they were all calling it Trump's vaccine.
00:32:30.040 They said Trump's vaccine cannot be trusted. One of my favourite bookmarks on my thing is a press release that came up from the Trump administration decrying the Democrats for being vaccine deniers and anti-vaxxers because they said that they had done everything they could.
00:32:47.140 So the truth is that that vaccine, you know, it was the disinformation, it was things like saying that the vaccine wouldn't work and so therefore you should take nebulised hydrogen peroxide.
00:32:56.440 Now that might, nebulised hydrogen peroxide, hydrogen peroxide is bleach. It's what you used to, I used to have blonde hair when I was 18.
00:33:02.700 That's how I got my hair blonde.
00:33:04.060 And nebulising is, if you're an asthmatic or you've ever had a lung condition, you will know it. It's where you inhale particulates, so droplets.
00:33:13.900 So they were saying, inhale droplets of bleach to cure yourself.
00:33:19.520 That was what the number one anti-vaxxer in America was suggesting, a guy called Joseph Mercola.
00:33:24.040 Joseph Mercola was working with Robert F. Kennedy Jr. very closely.
00:33:27.240 Now when we did a study showing that of that kind of disinformation, 12 people were behind it and quite often they were profiting from that disinformation.
00:33:36.940 So they were saying, don't trust the vaccine, nebulise hydrogen peroxide and here you can buy it from my web shop.
00:33:44.240 That was clear stuff. Those were snake oil salesmen, spivs, who were lying for personal profit.
00:33:50.300 And that I think is quite different to idle, I think it's bollocks, speculation.
00:33:57.280 I think it's meaningless. It's like worthless mental effort.
00:34:00.900 And I know that some people like doing it about whether or not it came from a lab or whether or not it came from a wet market or whether or not, you know, blah, blah, blah, blah, blah.
00:34:08.760 May I interrupt you very briefly just on that point?
00:34:12.360 And Francis, sorry, I'm interrupting your flow as well.
00:34:14.560 I just think on that point we ought to just take a moment to consider what we're saying here.
00:34:19.620 COVID-19 killed more people than anything since World War II in terms of a single event.
00:34:28.920 If it came from a lab in China, and it almost certainly came from a lab in China because they were doing gain-of-function research.
00:34:37.520 You're messing around with viruses, including to make them more lethal, more viral, etc.
00:34:42.660 You don't think we ought to know where this came from?
00:34:46.120 I'm pretty sure that that's going to be discovered through some of the best forensic pathologists, virologists, and intelligence sources in the world.
00:34:57.700 It will not be come up with by Barry from the pub down the road on Twitter.
00:35:03.640 I'm absolutely certain they have no insight.
00:35:06.360 But that's not who was being censored.
00:35:07.480 So that's why I say it's not disinformation. It's just idle speculation on Twitter.
00:35:12.180 You saying you don't care about it is interesting to me because on YouTube there were Harvard scientists who were saying it and being censored.
00:35:18.800 I don't care what people say about it on social media.
00:35:22.080 I think that the way that the process will work to derive the truth on that does not at any point involve a Twitter flame war between one side or another.
00:35:36.220 Do you not think that the silencing of certain discussions on social media has an impact of what is happening in the real world?
00:35:42.840 I don't think they should be silenced.
00:35:44.120 Oh, you don't?
00:35:44.680 You don't.
00:35:44.980 Of course not.
00:35:45.680 Yeah.
00:35:46.000 Why should people silence that discussion?
00:35:48.020 Yeah.
00:35:48.360 People can have...
00:35:49.400 Look, I've sat in a pub and talked about nonsense before.
00:35:52.880 I don't think people should silence you for talking nonsense.
00:35:55.260 Even if I think it's nonsense, they might find it important or interesting and they might find it satisfying to be able to point to a villain on it.
00:36:04.800 That's none of my business.
00:36:06.280 My business comes in when that impacts other people's fundamental human rights.
00:36:12.340 You know, in America, there was...
00:36:14.080 Well, some people make this kind of...
00:36:15.640 Elon Musk always says, like, your most fundamental and first right is your right to free speech.
00:36:19.280 It's not.
00:36:20.540 It's the First Amendment to the Constitution, buddy.
00:36:22.840 Life, liberty, and the pursuit of happiness.
00:36:25.160 Above that in the Constitution.
00:36:27.220 Life.
00:36:27.940 The right to life.
00:36:29.200 If your lies spread with intent, telling people not to take a vaccine that could save their lives in order to sell them some janky pill that you sell on your web store,
00:36:44.400 that, to me, is an act of absolute moral evil.
00:36:50.040 There's just no doubt in my mind about it.
00:36:52.500 And I think that platforms that say that you can't spread health disinformation should enforce their rules and ensure that people, that snake oil salesmen,
00:37:01.580 and there is a tradition of that in the US, should not be able to weaponize that moment, that moment of incredible anxiety for many people when they are vulnerable.
00:37:11.560 But do you know what, Imran?
00:37:13.400 The same...
00:37:13.920 Look, and I'm in agreement with you.
00:37:15.600 I think those people are willfully doing that.
00:37:17.820 And look, that's one thing.
00:37:20.580 But your argument could be used against people, for instance, like the US government, who were pushing out vaccines and saying everybody needs to take the vaccine,
00:37:29.080 when the data was showing that if you were a young man of a certain age, you didn't...
00:37:33.700 Or a young person of a certain age, the risks of COVID were negligible.
00:37:37.780 And actually, it was probably more dangerous to take the vaccine.
00:37:42.040 They were saying to take, you know, the AstraZeneca vaccine was completely safe.
00:37:46.040 It has now been withdrawn because of blood clots.
00:37:49.080 Do you see what I'm saying?
00:37:51.480 What can be seen as disinformation at one point can be changed at another point.
00:37:55.660 Again, I mean, I make the point that I'm talking about the spreading of deliberate health disinformation.
00:38:02.720 And you're talking to me about things which I think that, you know, really, really exercise you.
00:38:07.220 And I get that.
00:38:08.460 I don't see what that's got to do with the work that we've done.
00:38:10.860 I think the very nuanced and careful way that we differentiated between disinformation and just discussion is really, really vital.
00:38:21.260 I think people can take a lesson from the way that CCDH did it.
00:38:25.140 Because actually, we focused on the people who are profiting from the pandemic by spreading disinformation deliberately that would lead to the loss of life for millions of people.
00:38:36.020 You know, 200,000 people in America were calculated by Microsoft AI and Johns Hopkins University, professors that I know there very well, incredible specialists.
00:38:47.260 200,000 people died unnecessarily in the US because they failed to take a vaccine that would have saved their lives, that they were too late.
00:38:54.840 Many of them died in ICUs in America, choking to death, begging for a vaccine that they had once thought would harm them, but in fact would have saved their lives.
00:39:04.380 But it was too late for them to take it because they were already on the cusp of death.
00:39:09.980 And they did go on to die.
00:39:11.840 That to me is an atrocity that simply, you know, if you think about 9-11, so one of the signal moments in my life, it was the day before my 23rd birthday in 9-11.
00:39:22.980 I used to work for Merrill Lynch.
00:39:24.680 I used to work in banking.
00:39:26.720 You know, after I quit med school, I just wanted to go and make some money.
00:39:29.860 And it was that was the first time I ever felt morally motivated to do something with my life beyond for myself.
00:39:40.140 So I quit my job.
00:39:41.740 I went to Cambridge.
00:39:43.260 I studied politics.
00:39:44.160 I worked in politics for 10 years after that, hoping to make sure that we didn't have a world in which these kinds of things happen again.
00:39:50.180 That was off the back of 7,000 people being killed on 9-11.
00:39:55.720 200,000 died after COVID.
00:39:58.240 And our perspective, you know, we have done very, very little to clean up the fact that the way that those lies were spread was intermingled along legitimate debate.
00:40:09.460 A legitimate debate, debate that people should be able to have about where it came from, gain of function research, whether or not masks are a good idea or not, whether or not the evidence supports whether or not kids should be off school, whether or not young people should be encouraged to take the vaccine.
00:40:24.920 I think on balance, probably they should have been because there is the risk of transmission to adults and that can kill the adults or, you know, certainly elderly people.
00:40:33.840 And so like on those, but when it comes to spreading lies deliberately, I think that we showed, we showed exactly how to have that, how to differentiate between the two.
00:40:43.600 Because the other problem with this is, and I'm finding out here for myself, like people go, well, look, they banned Alex Jones.
00:40:52.860 They banned Alex Jones off every single platform.
00:40:55.860 Has that reduced Alex Jones's reach?
00:40:58.280 Alex Jones has a huge audience that he influences.
00:41:01.740 The same with Andrew Tate.
00:41:03.240 Like, okay, Elon brought him back to X.
00:41:06.240 But apart from that, Andrew was banned off every single platform and he has found a way in order to put his content out there.
00:41:14.400 So I guess my question to you is this.
00:41:16.920 Yeah.
00:41:17.340 Does banning people off social media actually reduce their reach?
00:41:20.800 Because the two examples that I've given you would say otherwise.
00:41:23.800 Yeah, well, every empirical study that's ever been done shows that it works.
00:41:27.080 Whether it's, you know, whether it's the US government studying the deplatforming of ISIS or CCDH's own research.
00:41:34.000 We, you know, sometimes we have to do this research to go and check as to whether or not we're having an effect.
00:41:37.500 And so we accidentally, we put out a report on who's the guy from Liverpool, David Icke.
00:41:46.720 Yeah.
00:41:47.340 A couple of, four years ago, just, just sort of laying out what he says.
00:41:52.800 Yeah.
00:41:53.020 And how he was monetizing it.
00:41:55.640 And the next day, Facebook and a few of the platforms banned him.
00:41:59.960 And we looked at his reach and his reach collapsed catastrophically.
00:42:04.980 There's no way of them replicating it.
00:42:07.660 Tate's a very interesting example.
00:42:10.320 So Tate has really innovated a system whereby he, he's built a cadre of people who will go and post this content for him.
00:42:19.400 So he's finding ways around it.
00:42:21.140 It's quite smart.
00:42:22.080 I mean, he's a very organized and intelligent man.
00:42:24.400 Yeah.
00:42:24.720 You know, spreading hate against women and some really, I think it's hateful against men too.
00:42:30.840 Because it's so desiccated his worldview of how men and women should interact.
00:42:36.060 You know, I'm married and I got married later in life.
00:42:40.700 I don't think I've ever understood how joyful it is to be in something that's more than the sum of its parts as much as a marriage is.
00:42:51.600 And he's trying to steal that from every young man out there because he's telling them the only way to be in a relationship is to dominate.
00:42:58.080 It's a zero sum game of domination versus submission.
00:43:01.740 Go fuck yourself.
00:43:03.120 The fuck is wrong with you?
00:43:04.280 Why would you tell young boys that?
00:43:06.140 What a terrible thing to tell them.
00:43:07.820 So, yeah, I think he's a piece of shit.
00:43:11.760 And I think that his voice has been diminished somewhat, but he has found ways around it.
00:43:17.040 And, of course, I find that vexing.
00:43:19.920 I think that if he does break the rules, that he shouldn't be allowed on those platforms.
00:43:23.780 And platforms should do their utmost to make sure that this guy who monetizes hatred is not rewarded for it.
00:43:31.100 You know, I almost wouldn't care if the algorithms didn't amplify his stuff deliberately.
00:43:35.680 We found that, for example, if you set up an account on TikTok as a 14-year-old boy, within 2.4 minutes it was serving up Andrew Tate content.
00:43:43.860 That's a deliberate editorial decision by the algorithm and by the company to actually advantage that discourse, to advantage his speech over other speech.
00:43:54.560 So, yeah, I mean, I'm quite pleased that he was deplatformed.
00:44:00.020 I think he has found ways around it.
00:44:02.280 And I think platforms should be working harder to make sure that he can't profit from his nonsense.
00:44:07.080 Let me just make a point, I think, where there is an element of us talking past each other, which reflects the broader debate on these issues, actually, I think, of us talking.
00:44:15.240 Sorry for swearing.
00:44:16.940 It's all right.
00:44:18.480 You're allowed to swear.
00:44:19.560 We won't censor you on here.
00:44:20.740 I know, but my mum might be watching.
00:44:22.960 I like my mum.
00:44:24.040 She turned off a long time ago.
00:44:25.840 She really does.
00:44:27.500 She sends me little memes of, like, what I've been saying, like, little WhatsApp messages, like, Imran Ahmed says.
00:44:32.020 Tell him to listen better because he keeps interrupting.
00:44:34.340 You didn't raise him well.
00:44:36.020 She knows that.
00:44:37.220 You really do.
00:44:38.000 Sorry, Mum.
00:44:38.320 Imran.
00:44:38.580 So, the bit where I think people are talking past each other in this conversation, this debate more broadly, is this.
00:44:44.960 You're making some very good points that we agree with, which is there are certain types of content that should not be on platforms.
00:44:52.100 They should be banned.
00:44:52.780 And at the same time, what we are bringing to your attention, I think, is that there were other types of content that you don't focus on at the CC.
00:45:01.200 You don't focus on your organization.
00:45:03.100 You don't, right?
00:45:04.220 But they were also being censored.
00:45:07.460 They were being prevented from being talked about.
00:45:09.380 And the concern of people like us is that sometimes when people are keen to make sure that the Internet is, quote, as safe as possible, they go very far and they censor things that should never have been prevented from being discussed.
00:45:24.260 A lot of that happened during the pandemic.
00:45:26.700 A lot of things should have been banned during the pandemic from being spread around.
00:45:30.220 I agree with you.
00:45:31.280 But at the same time, a lot of things were prevented from being discussed.
00:45:34.080 First, on cultural issues, there are people who are being prevented from expressing perfectly reasonable mainstream opinions that we should be able to talk about.
00:45:42.480 And on and on it goes.
00:45:43.480 So I think part of the reason we're having this disagreement is kind of where we are in society.
00:45:47.320 You've got one group of people saying, there's all these terrible things on the Internet.
00:45:50.560 They should be banned.
00:45:51.600 And actually, most people agree.
00:45:52.640 And then there's also the other angle, which is we should be able to discuss most things in a sensible way.
00:45:58.480 And this isn't specifically about the Center for Countering Digital Hate.
00:46:01.780 We're having a more broad philosophical discussion here.
00:46:04.860 I do recognize some of the sort of the hysterical kind of like, you know, performative virtue signaling of those people who want to shut down voices.
00:46:15.080 I see them with the deplatforming at universities and stuff like that.
00:46:18.340 I do, you know, I had the privilege of going to Cambridge and hearing voices from every part of the political spectrum in my education.
00:46:25.500 I read everyone from the chief jurist of the Third Reich, you know, through to Marx when I was studying politics there.
00:46:34.120 And it all contributed to my education and understanding of the world around me.
00:46:38.040 So I'm a great believer in free and open discourse.
00:46:41.620 Let me square the circle for you.
00:46:43.000 Okay.
00:46:43.240 So you and I agree, or we agree, that there are some voices that are being silenced, which we think are unfair, arbitrary, you know, vindictive decision making.
00:46:55.940 We think that there are some voices that should be because they're clearly outwith the rules of those platforms and definitely outwith, you know, 99.9% of the norms of our society.
00:47:04.460 But we want to have visibility of that.
00:47:08.900 We want it to be fair. Actually, what we're talking about is fairness.
00:47:12.380 We're saying we want a fair system.
00:47:15.400 The fair system is not going to be created in this discourse here.
00:47:19.400 I think that we'll get closer to understanding what fairness is.
00:47:22.760 Let me just proffer to you our solution as an organization.
00:47:26.280 So this is what CCDH stands for.
00:47:28.320 So CCDH's primary goals are to ensure that every country legislates what we call the STAR framework.
00:47:37.840 So the STAR framework says that if you have transparency of the algorithms, the enforcement decisions, and the advertising, that allows for meaningful accountability in which you can ask them tough questions based on hard data and get real answers back from them.
00:47:53.320 And you probably need some sort of body to do that accountability work.
00:47:56.140 So whether it's Ofcom in the UK, the Digital Services Act implementation bodies in the European Union, or Elizabeth Warren and Lindsey Graham, two people from very different parts of the political spectrum, have an idea for an FCC for social media in the US.
00:48:11.560 I think it's a grand idea if they can get the data that can actually allow them to meaningfully hold them accountable.
00:48:16.240 And then where they create harms that actually impinge on other people's fundamental rights in a tortious way, for example, whether it's eating disorder content or, you know, hate content that leads to a terrorist attack, they should be subject to the same laws as any other company or publisher, which is negligence law, defamation law.
00:48:39.060 At the moment, under US law, under something called Section 230 of the Communications Decency Act of 1996, they are not subject to those laws.
00:48:47.880 So we're saying that they should be subject to the laws.
00:48:50.120 So transparency, accountability, meaningful accountability based on data with people who can ask them questions and get the answers back.
00:48:58.020 And then where they actually cause harm, they should be subject to the same laws as everyone else, which is negligence law, which is defamation law, etc.
00:49:05.580 In doing those things, what you actually then end up with is a safety by design culture.
00:49:11.480 So a culture that thinks not just about maximizing their revenues, but also about creating healthy spaces in which real discourse can happen.
00:49:22.160 And by having that transparency, you ensure that they don't go too far either in the wrong direction.
00:49:27.380 So you end up with actually good spaces in which real discussion can happen.
00:49:32.140 I read in the newspapers every day still.
00:49:33.820 I read a lot.
00:49:35.360 And it's not like newspapers can't feature bananas opinions that I would find repulsive, you know, politically, personally.
00:49:44.080 That's good.
00:49:45.420 That's healthy.
00:49:46.480 If I hadn't read bonkers opinions, I wouldn't have, I wouldn't feel as empowered as I do to be able to reject them in the first place or to give you a cultured and intelligent critique of them.
00:49:56.920 So I do see the value of that discourse.
00:49:59.740 But we don't want, by the way, governments to give the platforms a list of things you can and can't say on there.
00:50:06.020 That's nowhere in the STAR framework.
00:50:08.140 What the STAR framework says is, let's have more informed speech about how these platforms work so that we can have healthier spaces for that discourse and a healthier society, to fulfill the original promise of social media,
00:50:20.420 which was to create a truly global conversation in which humanity can actually extend its understanding and insight collectively, you know, negotiate a better understanding of who we are and what's good and what isn't.
00:50:35.320 So, Imran, I guess one of the things that I wanted to ask in that, and look, I think where everybody is in here is in favor of greater transparency for these social media companies.
00:50:48.000 Too many of their choices are opaque.
00:50:50.900 You can't get any answers from them.
00:50:53.020 And look, we're all in favor of that.
00:50:54.920 I guess my question to you is, is how do you define harm?
00:50:57.880 How do you measure harm?
00:51:28.980 In the people who are ingesting or being exposed to this concept.
00:51:33.140 So, in the US, responsibility would be defined by negligence law.
00:51:38.140 It's really, really simple.
00:51:39.100 Every single company in America is subject to negligence law.
00:51:41.760 If, for example, the coffee company that, you know, made the cup of coffee you very kindly brought me,
00:51:47.400 if the coffee had been negligently made, it was poisonous, say, right?
00:51:52.960 And it kills 100 people.
00:51:54.960 If it could be shown that they'd been negligent in the way that they'd put together their systems, put together their quality assurance, then they could be held liable for it.
00:52:02.660 If, for example, TikTok knew that within 2.6 minutes of setting up an account as a 13-year-old girl, it was serving them self-harm content, within eight minutes eating disorder content, every 39 seconds on average, and that when you open an account with a name like Susan, it gives you a lot of self-harm content.
00:52:21.240 If you call it Susan Lose Weight, it gives you 12 times the amount of self-harm content because it knows that you're vulnerable, which is actually the results of a research study that CCDH did called Deadly by Design.
00:52:32.400 And if that leads to a child taking their own life, and I have, at this point, met far, far too many parents who've lost their children that way.
00:52:41.180 Ian Russell, the father of Molly Russell, the 14-year-old girl in England who took her own life, is on the board of CCDH.
00:52:46.960 And, you know, if at that point they knew about it, they did nothing about it, they can be held legally liable for it.
00:52:53.920 That is where they could be held liable.
00:52:56.360 Or, for example, if someone could show that a terrorist attack was plotted on their site, this is another real case.
00:53:03.320 This is a real case of Israelis who took one of the platforms to court because of the way that a Hamas attack had been planned on their site, operationalized on their site, fundraised on their site.
00:53:17.700 They'd been able to proselytize their hatred on their site, and it led directly to someone being killed.
00:53:22.020 At the moment, under US law, they can't be held liable.
00:53:24.680 The problem that we have at the moment isn't that these companies aren't being held liable, it's that they've got specific get-out-of-jail-free cards saying they can't be held liable.
00:53:34.280 And I don't understand why social media companies should not be subject to the same pressures that any other company are, which is that you have to abide by American negligence law.
00:53:44.280 So who will decide it?
00:53:45.420 The courts.
00:53:46.020 What will they decide it based on?
00:53:47.680 The entire corpus of decision-making in negligence law, which is a very well-established and extensive set of law that doesn't say that if one cup of coffee is poisonous, I can sue them until they bleed.
00:54:02.360 But if they do this on a systemic basis because they just don't give a shit about the people that buy their stuff, then they can be held liable.
00:54:09.500 That's how negligence law works.
00:54:11.260 Why can't they be subject to that?
00:54:12.860 Or product design law, which says that if you create a product that is designed to be harmful, that you can be held liable for it.
00:54:19.700 Well, I think that's what most sensible people would agree on, particularly those two examples that you gave.
00:54:24.760 The serving up of harmful content to children and the other example.
00:54:31.100 Well, not hate, but the example you gave is terrorism, actually.
00:54:34.380 There's a big difference between hate and terrorism.
00:54:37.040 Let me give you an example which I think is much more difficult to adjudicate.
00:54:40.500 Should you be able to call for an armed revolution on social media?
00:54:45.660 I don't know.
00:54:47.040 Do you think they should?
00:54:47.960 Well, wouldn't that cause harm?
00:54:49.200 Do you think you should?
00:54:50.820 Let me ask you this question.
00:54:52.720 We can play the argument forward logically.
00:54:57.040 Should you be able to call for an armed...
00:54:59.020 It depends on the context.
00:55:00.220 Well, but wouldn't an armed revolution...
00:55:01.420 Should an individual be able to?
00:55:03.100 That's up to the platform.
00:55:05.020 I don't know.
00:55:06.360 I mean, should people be able to call for violence?
00:55:09.060 Probably not.
00:55:09.840 If you can't do it on the street, should you be able to do it on Twitter?
00:55:12.080 Should Twitter enforce the same rules as you would have anywhere else?
00:55:16.120 Probably.
00:55:16.800 The reason I ask the question...
00:55:18.200 But the test that they will have to have is, do we think this is likely to lead to real harm in the real world?
00:55:24.940 Correct.
00:55:25.480 And therefore, are we willing to take the legal risk of allowing that?
00:55:31.620 So here's the reason I'm asking you the question.
00:55:34.540 Undoubtedly, large numbers of people calling for an armed revolution would lead to real world harm.
00:55:39.540 In other words, people will be killed.
00:55:41.060 There will be a civil war, potentially, or a revolution.
00:55:43.900 And yet, we're sitting here in Washington, D.C., in a place that was created by lots of people getting together and publishing pamphlets and speaking in the town square and saying,
00:55:56.400 we want to resist the tyranny of King George and we're going to have our own country.
00:56:01.160 And they fought a war in which many people died in order to get there.
00:56:04.920 So if we were living in the social media age in 1776, this country never would have been created.
00:56:11.820 What do you mean?
00:56:13.060 What do you mean, what do I mean?
00:56:13.820 People would have still said it, right?
00:56:15.280 No, they used the printing press to spread their message through pamphlets and all of this other stuff, which is exactly what social media is now.
00:56:20.820 Sure, and people will still use social media to do that.
00:56:23.120 Look, in the event of a tyranny, you know, of course people use social media to expose the tyrant, to call for action, to call for things to happen.
00:56:33.200 Now, the question is whether or not platforms should enable that, should amplify that, whether or not they should enforce their rules.
00:56:40.400 That's up to the individual platforms themselves.
00:56:42.820 But people will find a way to get the message out.
00:56:45.280 We're talking about social media as though it is perfectly elided with the internet itself.
00:56:50.220 People will be able to set up websites, they'll be able to use SMS, be able to use WhatsApp, be able to use Signal, be able to use social media to proselytize why the tyrant is a tyrant, to be able to give that.
00:57:03.980 Now, the question of whether or not in that moment they'll be able to say, everyone come down here with an AK-47 and start killing people, eh, I don't know.
00:57:13.540 I'm not advocating for that.
00:57:15.020 I'm just saying I think it's not quite as simple as, you know, ban harmful things because many of the things that have created the world we live in today were at the time considered harmful by the powers that be.
00:57:25.500 I also don't think that people saying that we should overthrow the tyrant should be something that you should ever get rid of from your platform.
00:57:31.940 Now, the question of whether or not, you know, so around January the 6th, that became a real question for platforms.
00:57:38.480 So, I mean, I live five blocks from the Capitol, or I used to until two days ago when we moved, but, you know, the question there is whether or not the spreading deliberately of lies, which were amplified by algorithms, normalizing the idea that the election was stolen, should be permitted on a site in the context of potential real violence in an established democracy over an election result.
00:58:06.840 So, people can decide for themselves.
00:58:10.620 Those platforms can decide for themselves.
00:58:12.520 But I think most of them concluded that it was unacceptable, which is why they took action to ban those people.
00:58:18.680 And I disagree with that.
00:58:19.860 Are you comfortable with the sitting outgoing president of the most powerful country in the world being prevented from speaking in public?
00:58:28.080 Yeah.
00:58:28.640 In what way is Donald Trump...
00:58:30.120 Well, he was banned from all the social media platforms.
00:58:33.000 Did you stop hearing from Donald Trump?
00:58:35.080 Were you worried about his ability to get his message out?
00:58:40.320 Well, his ability to get his message out was definitely reduced significantly, as you said yourself, when you ban people from social media.
00:58:46.880 You said it yourself.
00:58:47.700 They lose reach, right?
00:58:48.680 Yeah.
00:58:48.780 So, you can't make the argument both ways in one conversation.
00:58:51.740 And the other thing I would say is the precedent that set, I think, is very dangerous.
00:58:55.380 And if you talk about people being able to criticize a tyrant, well, the tyrant would always say this is misinformation.
00:59:02.700 Look at what Justin Trudeau did in Canada.
00:59:04.440 On Trump...
00:59:05.180 He took away people's bank accounts.
00:59:06.860 On Trump, I have to admit, it was a marginal...
00:59:12.380 I think it was one of the most difficult cases that we've ever come across.
00:59:16.700 He did spread disinformation deliberately that undermined the values that underpinned democracy and deliberately cast doubt on the electoral system despite knowing better.
00:59:26.820 So did Hillary Clinton.
00:59:28.500 And a lot of politicians do that.
00:59:30.680 Did she get banned?
00:59:31.480 No.
00:59:31.820 So this is the double standard that people talk about.
00:59:33.820 If I may finish, I think that the decision that was taken by companies was actually primarily to do with currying favor with the incoming administration who felt that they'd not been treated well by those companies rather than a decision that was taken on principle.
00:59:50.680 Again, again, again, it's why transparency on content enforcement decisions, on whether or not people are...
00:59:57.080 You know, I want to know whether or not shadow banning actually exists.
00:59:59.640 I'm not convinced it does.
01:00:01.300 But maybe it does.
01:00:01.840 What exists?
01:00:02.500 Sorry, I didn't catch that.
01:00:02.880 Shadow banning.
01:00:03.640 Shadow banning.
01:00:03.940 The thing is, again, opaque systems that don't tell you what's going on.
01:00:09.280 One of the few times that we've ever been told what works on these platforms and what doesn't is when Elon took over and it all turned to shit and everyone realized that Elon had actually demanded that his content be amplified over everyone else's because he was pissed off that Joe Biden's tweet about the Super Bowl got more views than his tweet did about the Super Bowl.
01:00:28.500 Do you remember that?
01:00:29.460 No.
01:00:30.060 He deliberately instructed engineers to come in all night and work out how to ensure that his tweets never got less visibility than Joe Biden's because Joe Biden's tweet saying that he was going to be supporting one team or another got more views than his tweet saying go Eagles or something.
01:00:49.220 I mean, that's how bonkers this guy is.
01:00:51.080 So we know that they can put their thumb on the scale.
01:00:53.920 We know that they had an active promotion list.
01:00:56.240 The question of whether or not they have an active shadow banning list.
01:00:58.900 Anyway, with the Trump decision, I actually think it was a marginal case.
01:01:02.540 I'm not quite sure what I would have done.
01:01:04.020 It's a really difficult decision to take.
01:01:06.100 I think that if they felt that there was a specific instance in which he'd broken their rules in such an egregious way that they had to ban him, they should have explained that much more clearly.
01:01:16.040 And they didn't.
01:01:17.200 And again, it's because they are judge, jury and executioner with no oversight, no checks and balances.
01:01:23.740 And I mean, ultimately, I do, you know, I mean, it's funny being someone who very, very strongly believes in something, but not in everything.
01:01:35.920 If you know what I mean, like I very, very, very strongly believe in checks and balances.
01:01:40.440 I very strongly believe in being able to sort of to be able to see what decisions people have taken and be able to criticize them.
01:01:48.200 But I don't have a particularly strong opinion on what decisions they should have taken, especially when it comes to something like Trump.
01:01:54.660 I think when it comes to something like banning anti-Semitism, I'm really pro it.
01:01:57.960 I see the way in which lies have underpinned enormous, enormous human damage, cost over millennia.
01:02:10.340 You know, I'm very interested in how lies have underpinned hatred against Jews.
01:02:15.720 Just at the moment, I'm rereading Elaine Pagels's book, The Origin of Satan, where she explains how the decisions taken on which gospels to include in the New Testament,
01:02:24.880 and the creation of Satan as a concept, as an equal and opposite force to God, were actually there to demonize Jewish people, to create this idea that Jewish people were the perfect enemy of godliness.
01:02:38.920 So I am interested in all that stuff.
01:02:40.220 And I do think you should ban certain types of hate.
01:02:43.220 I think hate is a pernicious virus, underpinned always by lies, that does enormous damage to lives,
01:02:54.880 to livelihoods, to people's ability to enjoy their full rights.
01:02:59.080 And that every time you give free speech to haters, you actually reduce the fundamental human rights of the people who are victims of that hate.
01:03:07.560 But I do think that when it comes to this kind of discourse, you should, you know, I'd rather have more discourse about and more informed discourse about the way they operate and the impact they have on our societies and the individual decisions they take, because these are important decisions.
01:03:25.140 Banning a president, fucking hell, was a huge thing to do.
01:03:29.720 Correct.
01:03:30.780 Enormous decision to take.
01:03:32.600 Why him and not the Ayatollah Khamenei?
01:03:36.040 Exactly.
01:03:36.820 I mean, seriously.
01:03:38.940 Exactly.
01:03:39.520 That guy's spouting nutty bollocks all the time.
01:03:43.060 Hate against Jews.
01:03:44.860 Utter lies about democracy.
01:03:46.720 Utter lies about what he's doing in his own society.
01:03:49.240 Why is he allowed to have a platform on it?
01:03:51.120 And funding terrorist organisations which destabilise the Middle East.
01:03:54.320 Completely.
01:03:55.480 Guys, we're running out of time, so...
01:03:57.240 So, this is a point that I wanted to make, and I'd be interested to hear your thoughts on this.
01:04:02.840 The reason, Imran, I'm so suspicious of governments is because I think, especially post-pandemic, governments have taken a more authoritarian turn when it comes to free speech.
01:04:12.860 I'll give the example of Scotland.
01:04:14.660 Scotland has introduced hate speech laws which criminalise public performance.
01:04:18.880 So, if you do a public performance, and, for instance, the example they use is plays, but it will also cover other types of performance like stand-up comedy, and you say a form of hate, which they define as something that a reasonable person would find hateful, that is now a criminal offence.
01:04:36.620 And I am very worried and very suspicious of governments who introduce hate speech laws which are excessive in this instance.
01:04:45.120 I wouldn't trust, for example, the SNP to run anything, and I certainly wouldn't trust them to dictate or to start implementing what we can and can't say online.
01:04:53.960 Oh, look, I mean, I think I'm an old, I'm a lefty, like, I don't think governments should be telling us what to say or think.
01:05:01.520 I think that, you know, governments have balanced, you know, balanced things in the past.
01:05:06.540 Like, there are, of course, rules on hate speech in the UK.
01:05:09.060 And I think of the case of Alison Chabloz, who was jailed for publishing a song in which she went beyond offensive into grossly offensive.
01:05:22.160 And she published a song which was mocking the victims of the Holocaust, you know, making fun of them and the families of those victims and the survivors.
01:05:30.800 And that was deemed to be illegal, malicious communications, and she was jailed for it.
01:05:37.640 I think about public order offences.
01:05:40.420 And, of course, we've got that balance right over time, I think, generally speaking.
01:05:44.940 And where there's been extensions of those to deal with, you know, some of it's kind of virtue signalling policymaking.
01:05:51.660 And it's like, well, look how liberal we can be.
01:05:54.040 And they're actually being incredibly illiberal.
01:05:55.760 I tend to think the SNP will end up being kicked out of office for that, to be frank.
01:06:02.260 But the law will stay on the books.
01:06:04.200 That's the problem.
01:06:05.440 Well, it depends on who's in power next, right?
01:06:07.800 It's probably going to be Labour.
01:06:09.060 And I can't see, in all honesty, Labour wanting to repeal that.
01:06:13.740 I think, you know, honestly, I'm not familiar with what's happening in Scotland.
01:06:20.020 I really haven't kept up with British politics in the last four years, apart from the Online Safety Act, which I was very, very involved with.
01:06:26.500 But I can see why people would be very, very concerned about those rules.
01:06:32.740 I know as well with Bill C-63, which you're talking about in Canada.
01:06:35.840 Yes.
01:06:36.240 There's concern about that as well.
01:06:37.620 And I'm also going to want to give evidence on that in due course in the next few months.
01:06:44.020 So that you have to be cautious about, and I think that they're almost dealing with the symptom rather than the cause.
01:06:53.280 Correct.
01:06:53.440 The cause is, of course, the hyperacceleration of the most offensive speech, the normalization, which is causing the re-socialization of our societies to make normal the idea that hatred against other people is acceptable.
01:07:05.780 And where people think that hatred is acceptable, they will start to mobilize into other forms of action as well beyond just speech.
01:07:11.620 And I think that when you have a society in which hate is given an advantage over tolerance, bad things will happen in that society over time.
01:07:21.440 And so the question of how platforms work at a systemic level, I think, is absolutely open for going and attack them, have a look at the algorithms, demand transparency, do all that stuff.
01:07:32.680 But banning individual bits of speech, dumb shit, dumb thing to do, ends up in very, very bad places, ends up with people like me being told I can't hold certain opinions about things, or you and other people.
01:07:45.500 And then you're in the tyranny that you're trying to avoid.
01:07:49.680 Correct.
01:07:50.120 Yeah.
01:07:50.560 So we've ended on a note of agreement.
01:07:52.700 We're going to go to locals for our supporters' questions.
01:07:55.280 Where we're going to disagree again.
01:07:56.480 In a second.
01:07:57.260 What's locals?
01:07:57.860 It's a platform where we post bonus content.
01:08:00.780 Go there in a second when we wrap up.
01:08:02.640 But before we do, we always end with the same question, which is, what's the one thing we're not talking about as a society that you think we should be?
01:08:12.820 Just that tonight?
01:08:15.860 I'm not really very opinionated about these things.
01:08:23.160 Don't know.
01:08:24.400 There we go.
01:08:25.140 There we go.
01:08:25.600 Head on over to locals where we ask Imran your questions.
01:08:33.080 But it's sad to me that you can't define, that you can't see hate speech normally.
01:08:36.720 Hold on a second.
01:08:37.440 You're misrepresenting what I'm saying.
01:08:38.920 Again, rewind and just listen.
01:08:40.600 Good.
01:08:40.860 I hope people do that because they'll see that you're misrepresenting what I'm saying.
01:08:44.240 For the main part, it's fucking obvious.
01:08:47.680 I tell you what, you are being disrespectful to the people who do receive hate speech.
01:08:54.200 Like me.
01:08:54.660 I warn you that most of your listeners will think to themselves, actually, Konstantin's being a bit of a dickhead right now.
01:09:02.260 And calling me a dickhead might be considered hate speech.
01:09:04.580 No, no, no, it's not hate speech.
01:09:06.500 It's just good observation.
01:09:08.680 I'm sorry if you're not.
01:09:09.300 I don't want you to
01:09:11.600 be sad.
01:09:14.060 I'm sorry.
01:09:14.820 I was going to say I can't do good voice.
01:09:17.180 I'm sorry.