00:25:20.580The point I'm trying to make to you is there is a relationship which has been acknowledged throughout the centuries, not just online obviously, but in all sorts of communication about the relationship between the amount of freedom that we have on the one hand and the amount of safety that we have on the other hand.
00:25:34.580When we open up the range of things that people can do or say, we necessarily decrease the amount of safety.
00:25:41.580That's the relationship that those two things have.
00:25:44.580So what I'm saying to you is someone like me was not comfortable with the amount of censorship that we saw on Twitter in the past.
00:25:51.580That does not mean that I'm not also against showing kids eating disorder content.
00:25:57.580We just had psychologist Jonathan Haidt on the show to talk about the dangers of social media.
00:26:02.580We've had many people on the show to talk about that.
00:26:04.580So when it comes to kids, or when it comes to genuinely hateful content where people are being incited to commit violence, et cetera, I'm against that.
00:26:11.580But I don't want somebody to be censored for saying to me, you know, you're a Jew and therefore you support this cause, or whatever.
00:26:18.580And as a result, creating an environment in which people are really, really prevented from having debates about things like, you know, the trans issue, having debates around things like vaccine safety and efficacy and having other conversations that really matter to our society.
00:26:35.580And I agree with you. I think people should be able to have those debates.
00:26:38.580The question is whether or not these companies should be able to amplify particular voices to make them win or make them lose based on, you know, based on an algorithm which rewards only one thing, which is engagement.
00:26:51.580You know, the freedom of speech is not the freedom of reach.
00:26:54.580It's not anyone's God-given right to be given a megaphone, nor is it anyone's God-given right to be paid for it either.
00:27:03.580And with the way that platforms work, both the monetization of this content and the revenue sharing of that content, what you actually are incentivizing is people who are performatively offensive
00:27:18.580for the purposes of generating attention.
00:27:21.580You know, I think about the effort that you guys have put into putting this together. Right.
00:27:25.580Both of you, you've put thought and care into selecting your guests.
00:27:31.580You're trying to have a debate, an actual conversation in extension of human understanding and knowledge, bringing two people together and creating something bigger than just the sum of our two opinions, trying to create insight, those flashes of things that really matter.
00:27:46.580That's admirable. That's not the game they're in. Don't for a second, let Elon Musk persuade you that he's in the game of trying to extend human consciousness.
00:28:02.580When it comes to Twitter, he's in the game of trying to make some money from advertising and he's failed catastrophically at it because he just didn't peg that it wasn't just about the number of eyeballs.
00:28:15.580He thought, if I can optimize the number of eyeballs, if I turn this into an ever-moving car crash, I can make even more money. He thought, right, it's a billboard company.
00:28:24.580I know what, I'll have car crashes. I'll put billboards on them. Everyone will watch them. Bob's your uncle. Loads of money.
00:28:31.580It turns out people don't want to put adverts on car crashes. And that's what Twitter's become.
00:28:36.580It's become the equivalent of a billboard attached to a slowly unrolling, brutal car crash in which the rights of thousands, millions, billions of people, women, gay people, Jews, Muslims, black people are constantly being diminished to amplify the voices of a small fringe of, frankly, lunatics.
00:29:00.580So the thing that I get really quite uncomfortable with in these conversations is the term disinformation, because we know, for instance, when we look at the pandemic, that what was classified as disinformation at one point then becomes either accepted or scientific consensus a few years later, particularly the issue of masks, for example, where it was proven that if you have a mask that is not an N95 but a paper mask.
00:29:27.580It is of no use, of no use whatsoever. So that is my issue, Imran. That's something that I have real problems with, going, "This is disinformation," within reason, because you go, how do you know at this point?
00:29:42.580So let's break this down to three types of disinformation. And I'll tell you what CCDH focused on as an organization. So when I left school, I went to medical school. In 1996, I went to UCLMS.
00:29:55.580I did preclinical medicine there, the two years of theory, and then I quit before I did clinical because I hated it. While I was there, up the road was Andrew Wakefield writing the paper that said that MMR, the measles, mumps, rubella vaccine, causes autism, which we know to be untrue and which was based on fabricated data.
00:30:16.760And at the time he was being paid by personal injury lawyers hoping to bring autism cases against vaccine companies on behalf of parents of autistic children.
00:30:28.200That paper caused real devastation, real damage, especially in the northwest of England, where I come from. And kids were harmed immeasurably. Measles is still one of the biggest killers of children in the world today, still.
00:30:43.440And so I've had an interest in how lies are spread about vaccines for some time. When the pandemic kicked off, we saw three types of disinformation flowing. One was the stuff about whether it came from China or a lab leak. Don't care. Personally, I don't care. And we never classified that as disinformation.
00:31:08.360It was just idle ramblings, as far as I could see. And mainly sort of like, you know how people do on Twitter, where they start talking about geopolitics and they're not really talking about geopolitics. They're just talking about whether or not they think China's mean or not.
00:31:21.940All right. The second type was people asking about things like school closures and masks. That stuff all seemed like, well, there are people complaining about the decisions taken by government.
00:31:36.840I think in a moment like this, I know how difficult it is in government. I've worked in politics and I think to myself, it's kind of shitty and mean on those politicians who are having to make decisions based on insufficient information at pace.
00:31:48.140And I thought some of that criticism was unfair, especially the with-hindsight criticism of it.
00:31:55.900It's a little bit unfair. Where we were really concerned, where all of CCDH's effort went in, was people who were fundamentally spreading untruths about vaccines or about the deadliness of COVID or about the intent of medics who were coming up with it.
00:32:16.040One of the biggest spreaders of disinformation about vaccines, for example, prior to the vaccine launching, were the left in the US because they were all calling it Trump's vaccine.
00:32:30.040They said Trump's vaccine cannot be trusted. One of my favourite bookmarks is a press release that came out from the Trump administration decrying the Democrats for being vaccine deniers and anti-vaxxers, because they said that they had done everything they could.
00:32:47.140So the truth is that the disinformation was things like saying that the vaccine wouldn't work, and that therefore you should take nebulised hydrogen peroxide.
00:32:56.440Now, hydrogen peroxide is bleach. It's what you use to bleach hair; I used to have blonde hair when I was 18.
00:33:04.060And nebulising, if you're an asthmatic or you've ever had a lung condition, you will know it. It's where you inhale particulates, so droplets.
00:33:13.900So they were saying, inhale droplets of bleach to cure yourself.
00:33:19.520The number one anti-vaxxer in America was suggesting that: a guy called Joseph Mercola.
00:33:24.040Mercola was working very closely with Robert F. Kennedy Jr.
00:33:27.240Now, we did a study showing that 12 people were behind that kind of disinformation, and quite often they were profiting from it.
00:33:36.940So they were saying, don't trust the vaccine, nebulise hydrogen peroxide and here you can buy it from my web shop.
00:33:44.240That was clear stuff. Those were snake oil salesmen, spivs, who were lying for personal profit.
00:33:50.300And that, I think, is quite different to idle, "I think it's bollocks" speculation.
00:33:57.280I think it's meaningless. It's like worthless mental effort.
00:34:00.900And I know that some people like doing it about whether or not it came from a lab or whether or not it came from a wet market or whether or not, you know, blah, blah, blah, blah, blah.
00:34:08.760May I interrupt you very briefly just on that point?
00:34:12.360And Francis, sorry, I'm interrupting your flow as well.
00:34:14.560I just think on that point we ought to just take a moment to consider what we're saying here.
00:34:19.620COVID-19 killed more people than anything since World War II in terms of a single event.
00:34:28.920If it came from a lab, it almost certainly came from a lab in China, because they were doing gain-of-function research.
00:34:37.520That's messing around with viruses, including to make them more lethal, more transmissible, etc.
00:34:42.660You don't think we ought to know where this came from?
00:34:46.120I'm pretty sure that that's going to be discovered through some of the best forensic pathologists, virologists, and intelligence sources in the world.
00:34:57.700It will not be worked out by Barry from the pub down the road on Twitter.
00:35:03.640I'm absolutely certain they have no insight.
00:35:06.360But that's not who was being censored.
00:35:07.480So that's why I say it's not disinformation. It's just idle speculation on Twitter.
00:35:12.180You saying you don't care about it is interesting to me because on YouTube there were Harvard scientists who were saying it and being censored.
00:35:18.800I don't care what people say about it on social media.
00:35:22.080I think that the way that the process will work to derive the truth on that does not at any point involve a Twitter flame war between one side or another.
00:35:36.220Do you not think that the silencing of certain discussions on social media has an impact on what is happening in the real world?
00:35:42.840I don't think they should be silenced.
00:35:49.400Look, I've sat in a pub and talked about nonsense before.
00:35:52.880I don't think people should silence you for talking nonsense.
00:35:55.260Even if I think it's nonsense, they might find it important or interesting and they might find it satisfying to be able to point to a villain on it.
00:36:29.200If you spread lies with intent, telling people not to take a vaccine that could save their lives in order to sell them some janky pill that you sell on your web store,
00:36:44.400that, to me, is an act of absolute moral evil.
00:36:50.040There's just no doubt in my mind about it.
00:36:52.500And I think that platforms that say you can't spread health disinformation should enforce their rules and ensure that snake oil salesmen,
00:37:01.580and there is a tradition of that in the US, are not able to weaponize that moment, that moment of incredible anxiety for many people, when they are vulnerable.
00:37:20.580But your argument could be used against people, for instance, like the US government, who were pushing out vaccines and saying everybody needs to take the vaccine,
00:37:29.080when the data was showing that if you were a young man of a certain age, you didn't...
00:37:33.700Or a young person of a certain age, the risks of COVID were negligible.
00:37:37.780And actually, it was probably more dangerous to take the vaccine.
00:37:42.040They were saying, you know, that the AstraZeneca vaccine was completely safe.
00:37:46.040It has now been withdrawn because of blood clots.
00:38:08.460I don't see what that's got to do with the work that we've done.
00:38:10.860I think the very nuanced and careful way that we differentiated between disinformation and just discussion is really, really vital.
00:38:21.260I think people can take a lesson from the way that CCDH did it.
00:38:25.140Because actually, we focused on the people who are profiting from the pandemic by spreading disinformation deliberately that would lead to the loss of life for millions of people.
00:38:36.020You know, it was calculated by Microsoft AI and Johns Hopkins University, professors that I know there very well, incredible specialists, that
00:38:47.260200,000 people died unnecessarily in the US because they failed to take a vaccine that would have saved their lives until it was too late.
00:38:54.840Many of them died in ICUs in America, choking to death, begging for a vaccine that they had once thought would harm them, but in fact would have saved their lives.
00:39:04.380But it was too late for them to take it because they were already on the cusp of death.
00:39:11.840That to me is an atrocity. You know, think about 9/11, one of the signal moments in my life; it happened the day before my 23rd birthday.
00:39:58.240And from our perspective, you know, we have done very, very little to clean up the fact that the way those lies were spread was intermingled with legitimate debate.
00:40:09.460A legitimate debate, debate that people should be able to have about where it came from, gain of function research, whether or not masks are a good idea or not, whether or not the evidence supports whether or not kids should be off school, whether or not young people should be encouraged to take the vaccine.
00:40:24.920I think on balance, probably they should have been because there is the risk of transmission to adults and that can kill the adults or, you know, certainly elderly people.
00:40:33.840And so, on those points, when it comes to spreading lies deliberately, I think that we showed exactly how to differentiate between the two.
00:40:43.600Because the other problem with this is, and I'm finding out here for myself, like people go, well, look, they banned Alex Jones.
00:40:52.860They banned Alex Jones off every single platform.
00:42:24.720You know, spreading hate against women, and some of it, I think, is hateful against men too.
00:42:30.840Because it's so desiccated his worldview of how men and women should interact.
00:42:36.060You know, I'm married and I got married later in life.
00:42:40.700I don't think I've ever understood how joyful it is to be in something that's more than the sum of its parts as much as a marriage is.
00:42:51.600And he's trying to steal that from every young man out there because he's telling them the only way to be in a relationship is to dominate.
00:42:58.080It's a zero sum game of domination versus submission.
00:43:19.920I think that if he does break the rules, that he shouldn't be allowed on those platforms.
00:43:23.780And platforms should do their utmost to make sure that this guy who monetizes hatred is not rewarded for it.
00:43:31.100You know, I almost wouldn't care if the algorithms didn't amplify his stuff deliberately.
00:43:35.680We found that, for example, if you set up an account on TikTok as a 14-year-old boy, within 2.4 minutes it was serving up Andrew Tate content.
00:43:43.860That's a deliberate editorial decision by the algorithm and by the company to actually advantage that discourse, to advantage his speech over other speech.
00:43:54.560So, yeah, I mean, I'm quite pleased that he was deplatformed.
00:44:02.280And I think platforms should be working harder to make sure that he can't profit from his nonsense.
00:44:07.080Let me just make a point, I think, where there is an element of us talking past each other, which reflects the broader debate on these issues, actually.
00:44:52.780And at the same time, what we are bringing to your attention, I think, is that there were other types of content that you don't focus on at CCDH.
00:45:07.460They were being prevented from being talked about.
00:45:09.380And the concern of people like us is that sometimes when people are keen to make sure that the Internet is, quote, as safe as possible, they go very far and they censor things that should never have been prevented from being discussed.
00:45:24.260A lot of that happened during the pandemic.
00:45:26.700A lot of things should have been banned during the pandemic from being spread around.
00:45:31.280But at the same time, a lot of things were prevented from being discussed.
00:45:34.080First, on cultural issues, there are people who are being prevented from expressing perfectly reasonable mainstream opinions that we should be able to talk about.
00:45:52.640And then there's also the other angle, which is we should be able to discuss most things in a sensible way.
00:45:58.480And this isn't specifically about the Center for Countering Digital Hate.
00:46:01.780We're having a more broad philosophical discussion here.
00:46:04.860I do recognize some of the sort of the hysterical kind of like, you know, performative virtue signaling of those people who want to shut down voices.
00:46:15.080I see them with the deplatforming at universities and stuff like that.
00:46:18.340I do, you know, I had the privilege of going to Cambridge and hearing voices from every part of the political spectrum in my education.
00:46:25.500I read everyone from the chief jurist of the Third Reich, you know, through to Marx when I was studying politics there.
00:46:34.120And it all contributed to my education and understanding of the world around me.
00:46:38.040So I'm a great believer in free and open discourse.
00:46:43.240So you and I agree, or we agree, that there are some voices being silenced by what we think is unfair, arbitrary, you know, vindictive decision-making.
00:46:55.940We think that there are some voices that should be silenced, because they're clearly outwith the rules of those platforms and definitely outwith, you know, 99.9% of the norms of our society.
00:47:04.460But we want to have visibility of that.
00:47:08.900What we actually want, what we're talking about, is fairness.
00:47:28.320So CCDH's primary goals are to ensure that every country legislates what we call the STAR framework.
00:47:37.840So the STAR framework says that if you have transparency of the algorithms, the enforcement decisions, and the advertising, that allows for meaningful accountability in which you can ask them tough questions based on hard data and get real answers back from them.
00:47:53.320And you probably need some sort of body to do that accountability work.
00:47:56.140So whether it's Ofcom in the UK, the Digital Services Act implementation bodies in the European Union, or, in the US, Elizabeth Warren and Lindsey Graham, two people from very different parts of the political spectrum, who have an idea for an FCC for social media.
00:48:11.560I think it's a grand idea if they can get the data that can actually allow them to meaningfully hold them accountable.
00:48:16.240And then where they create harms that actually do impinge on other people's fundamental rights in a tortious way that actually might harm them, for example, whether it's eating disorder content or it's content about, you know, hate content that leads to a terrorist attack, that they should be subject to the same laws as any other company or publisher, which is negligence law, defamation law.
00:48:39.060At the moment, under US law, under something called Section 230 of the Communications Decency Act of 1996, they are not subject to those laws.
00:48:47.880So we're saying that they should be subject to the laws.
00:48:50.120So transparency, accountability, meaningful accountability based on data with people who can ask them questions and get the answers back.
00:48:58.020And then where they actually cause harm, they should be subject to the same laws as everyone else, which is negligence law, which is defamation law, etc.
00:49:05.580In doing those things, what you actually then end up with is a safety by design culture.
00:49:11.480So a culture that thinks not just about maximizing revenues, but also about creating healthy spaces in which real discourse can happen.
00:49:22.160And by having that transparency, you ensure that they don't go too far either in the wrong direction.
00:49:27.380So you end up with actually good spaces in which real discussion can happen.
00:49:32.140I read in the newspapers every day still.
00:49:46.480If I hadn't read bonkers opinions, I wouldn't feel as empowered as I do to reject them in the first place, or to give you a cultured and intelligent critique of them.
00:49:56.920So I do see the value of that discourse.
00:49:59.740But we don't want, by the way, governments to give the platforms a list of things you can and can't say on them.
00:50:08.140What the STAR framework says is, let's have more informed speech about how these platforms work, so that we can have healthier spaces for that discourse and a healthier society, to fulfill the original promise of social media,
00:50:20.420which was to create a truly global conversation in which humanity can actually extend its understanding and insight collectively, you know, negotiate a better understanding of who we are and what's good and what isn't.
00:50:35.320So, Imran, I guess one of the things that I wanted to ask in that, and look, I think where everybody is in here is in favor of greater transparency for these social media companies.
00:51:54.960If it could be shown that they'd been negligent in the way that they'd put together their systems, put together their quality assurance, then they could be held liable for it.
00:52:02.660If, for example, TikTok knew that within 2.6 minutes of setting up an account as a 13-year-old girl, it was serving them self-harm content, within eight minutes eating disorder content, every 39 seconds on average, and that when you open an account with a name like Susan, it gives you a lot of self-harm content.
00:52:21.240If you call it Susan Lose Weight, it gives you 12 times the amount of self-harm content because it knows that you're vulnerable, which is actually the results of a research study that CCDH did called Deadly by Design.
00:52:32.400And if that leads to a child taking their own life, and I have, at this point, met far, far too many parents who've lost their children that way.
00:52:41.180Ian Russell, the father of Molly Russell, the 14-year-old girl in England who took her own life, is on the board of CCDH.
00:52:46.960And, you know, if at that point they knew about it, they did nothing about it, they can be held legally liable for it.
00:52:53.920That is where they could be held liable.
00:52:56.360Or, for example, if someone could show that a terrorist attack was plotted on their site, this is another real case.
00:53:03.320This is a real case of Israelis who took one of the platforms to court because of the way a Hamas attack had been planned on their site, operationalized on their site, fundraised on their site.
00:53:17.700Hamas had been able to proselytize their hatred on the site, and it led directly to someone being killed.
00:53:22.020At the moment, under US law, they can't be held liable.
00:53:24.680The problem that we have at the moment isn't that these companies aren't being held liable, it's that they've got specific get-out-of-jail-free cards saying they can't be held liable.
00:53:34.280And I don't understand why social media companies should not be subject to the same pressures as any other company, which is that you have to abide by American negligence law.
00:53:47.680The entire corpus of decision-making in negligence law, a very well-established and extensive body of law, doesn't say that if one cup of coffee is poisonous, I can sue them until they bleed.
00:54:02.360But if they do this on a systemic basis because they just don't give a shit about the people that buy their stuff, then they can be held liable.
00:55:25.480And therefore, are we willing to tolerate the legal risk of allowing that?
00:55:31.620So here's the reason I'm asking you the question.
00:55:34.540Undoubtedly, large numbers of people calling for an armed revolution would lead to real world harm.
00:55:39.540In other words, people will be killed.
00:55:41.060There will be a civil war, potentially, or a revolution.
00:55:43.900And yet, we're sitting here in Washington, D.C., in a place that was created by lots of people getting together and publishing pamphlets and speaking in the town square and saying,
00:55:56.400we want to resist the tyranny of King George and we're going to have our own country.
00:56:01.160And they fought a war in which many people died in order to get there.
00:56:04.920So if we were living in the social media age in 1776, this country never would have been created.
00:56:13.820People would have still said it, right?
00:56:15.280No, they used the printing press to spread their message through pamphlets and all of this other stuff, which is exactly what social media is now.
00:56:20.820Sure, and people will still use social media to do that.
00:56:23.120Look, in the event of a tyranny, you know, of course people use social media to expose the tyrant, to call for action, to call for things to happen.
00:56:33.200Now, the question is whether or not platforms should enable that, should amplify that, whether or not they should enforce their rules.
00:56:40.400That's up to the individual platforms themselves.
00:56:42.820But people will find a way to get the message out.
00:56:45.280We're talking about social media as though it is perfectly elided with the internet itself.
00:56:50.220People will be able to set up websites, they'll be able to use SMS, WhatsApp, Signal, and social media to proselytize why the tyrant is a tyrant.
00:57:03.980Now, the question of whether or not in that moment they'll be able to say, everyone come down here with an AK-47 and start killing people, eh, I don't know.
00:57:15.020I'm just saying I think it's not quite as simple as, you know, ban harmful things because many of the things that have created the world we live in today were at the time considered harmful by the powers that be.
00:57:25.500I also don't think that people saying that we should overthrow the tyrant should be something that you should ever get rid of from your platform.
00:57:31.940Now, the question of whether or not, you know, so around January the 6th, that became a real question for platforms.
00:57:38.480So, I mean, I live five blocks from the Capitol, or I used to until two days ago when we moved, but, you know, the question there is whether or not the spreading deliberately of lies, which were amplified by algorithms, normalizing the idea that the election was stolen, should be permitted on a site in the context of potential real violence in an established democracy over an election result.
00:58:30.120Well, he was banned from all the social media platforms.
00:58:33.000Did you stop hearing from Donald Trump?
00:58:35.080Were you worried about his ability to get his message out?
00:58:40.320Well, his ability to get his message out was definitely reduced significantly, as you said yourself, when you ban people from social media.
00:59:06.860On Trump, I have to admit, it was a marginal...
00:59:12.380I think it was one of the most difficult cases that we've ever come across.
00:59:16.700He did spread disinformation deliberately that undermined the values that underpinned democracy and deliberately cast doubt on the electoral system despite knowing better.
00:59:31.820So this is the double standard that people talk about.
00:59:33.820If I may finish, I think that the decision that was taken by companies was actually primarily to do with currying favor with the incoming administration who felt that they'd not been treated well by those companies rather than a decision that was taken on principle.
00:59:50.680Again, again, again, it's why transparency on content enforcement decisions, on whether or not people are...
00:59:57.080You know, I want to know whether or not shadow banning actually exists.
01:00:03.940The thing is, again, these are opaque systems that don't tell you what's going on.
01:00:09.280One of the few times that we've ever been told what works on these platforms and what doesn't is when Elon took over and it all turned to shit and everyone realized that Elon had actually demanded that his content be amplified over everyone else's because he was pissed off that Joe Biden's tweet about the Super Bowl got more views than his tweet did about the Super Bowl.
01:00:30.060He deliberately instructed engineers to come in all night and work out how to ensure that his tweets never got less visibility than Joe Biden's because Joe Biden's tweet saying that he was going to be supporting one team or another got more views than his tweet saying go Eagles or something.
01:00:49.220I mean, that's how bonkers this guy is.
01:00:51.080So we know that they can put their thumb on the scale.
01:00:53.920We know that they had an active promotion list.
01:00:56.240The question is whether or not they also have an active shadow banning list.
01:00:58.900Anyway, with the Trump decision, I actually think it was a marginal case.
01:01:02.540I'm not quite sure what I would have done.
01:01:04.020It's a really difficult decision to take.
01:01:06.100I think that if they felt that there was a specific instance in which he'd broken their rules in such an egregious way that they had to ban him, they should have explained that much more clearly.
01:01:17.200And again, it's because they are judge, jury and executioner with no oversight, no checks and balances.
01:01:23.740And I mean, ultimately, you know, it's funny being someone who believes very, very strongly in some things, but not in everything.
01:01:35.920If you know what I mean, like I very, very, very strongly believe in checks and balances.
01:01:40.440I very strongly believe in being able to see what decisions people have taken and being able to criticize them.
01:01:48.200But I don't have a particularly strong opinion on what decisions they should have taken, especially when it comes to something like Trump.
01:01:54.660I think when it comes to something like banning anti-Semitism, I'm really pro it.
01:01:57.960I see the way in which lies have underpinned enormous, enormous human damage and cost over millennia.
01:02:10.340You know, I'm very interested in how lies have underpinned hatred against Jews.
01:02:15.720Just at the moment, I'm rereading Elaine Pagels's book, The Origin of Satan, where she explains how decisions were taken on which gospels to include in the New Testament,
01:02:24.880and how the creation of Satan as a concept, as an equal and opposite force to God, was actually there to demonize Jewish people, to create this idea that Jewish people were the perfect enemy of godliness.
01:02:40.220And I do think you should ban certain types of hate.
01:02:43.220I think hate is a pernicious virus, underpinned always by lies, that does enormous human damage, enormous damage to lives,
01:02:54.880to livelihoods, to people's ability to enjoy their full rights.
01:02:59.080And that every time you give free speech to haters, you actually reduce the fundamental human rights of the people who are victims of that hate.
01:03:07.560But when it comes to this kind of decision, I'd rather have more discourse, and more informed discourse, about the way they operate, the impact they have on our societies, and the individual decisions they take, because these are important decisions.
01:03:25.140Banning a president, fucking hell, was a huge thing to do.
01:03:55.480Guys, we're running out of time, so...
01:03:57.240So, this is a point that I wanted to make, and I'd be interested to hear your thoughts on this.
01:04:02.840The reason, Imran, I'm so suspicious of governments is because I think, especially post-pandemic, governments have taken a more authoritarian turn when it comes to free speech.
01:04:14.660Scotland has introduced hate speech laws which criminalise public performance.
01:04:18.880So, if you do a public performance, and the example they use is plays, but it will also cover other types of performance like stand-up comedy, and you say a form of hate, which they define as something that a reasonable person would find hateful, that is now a criminal offence.
01:04:36.620And I am very worried and very suspicious of governments who introduce hate speech laws which are excessive in this instance.
01:04:45.120I wouldn't trust, for example, the SNP to run anything, and I certainly wouldn't trust them to dictate or to start implementing what we can and can't say online.
01:04:53.960Oh, look, I mean, I think I'm an old lefty; I don't think governments should be telling us what to say or think.
01:05:01.520I think that, you know, governments have balanced these things in the past.
01:05:06.540Like, there are, of course, rules on hate speech in the UK.
01:05:09.060And I think of the case of Alison Chabloz, who was jailed for publishing a song in which she went beyond offensive into grossly offensive.
01:05:22.160And she published a song which was mocking the victims of the Holocaust, you know, making fun of them and the families of those victims and the survivors.
01:05:30.800And that was deemed to be illegal, malicious communications, and she was jailed for it.
01:06:09.060And I can't see, in all honesty, Labour wanting to repeal that.
01:06:13.740I think, you know, honestly, I'm not familiar with what's happening in Scotland.
01:06:20.020I really haven't kept up with British politics in the last four years, apart from the Online Safety Act, which I was very, very involved with.
01:06:26.500But I can see why people would be very, very concerned about those rules.
01:06:32.740I know as well about Bill C-63, which you're talking about in Canada.
01:06:53.440The cause is, of course, the hyperacceleration of the most offensive speech, the normalization which is causing the resocialization of our societies to make normal the idea that hatred against other people is acceptable.
01:07:05.780And where people think that hatred is acceptable, they will start to mobilize into other forms of action as well beyond just speech.
01:07:11.620And I think that when you have a society in which hate is given an advantage over tolerance, bad things will happen in that society over time.
01:07:21.440And so the question of how platforms work at a systemic level, I think, is absolutely open: go and attack them, have a look at the algorithms, demand transparency, do all that stuff.
01:07:32.680But banning individual bits of speech? Dumb shit, dumb thing to do; it ends up in very, very bad places, ends up with people like me being told I can't hold certain opinions about things, or you and other people.
01:07:45.500And then you're in the tyranny that you're trying to avoid.
01:08:02.640But before we do, we always end with the same question, which is, what's the one thing we're not talking about as a society that you think we should be?