00:00:16.760Uh, you're right, I should have got new investment ideas.
00:00:19.900Well, if you want to take back control of your finances, then Fortune & Freedom is for you.
00:00:24.960It was founded by Nigel Farage, who has over 40 years of experience in finance and politics.
00:00:30.920Fortune and Freedom is published by South Bank Investment Research
00:00:34.620and is for the investor looking to access a wide range of informed opinions
00:00:39.720on lots of different investing opportunities.
00:00:42.600Their brilliant newsletter covers everything from the causes and impact of inflation
00:00:47.380to the rise of cryptocurrencies, gold investing and much more besides.
00:00:51.820Through the daily news commentary and special reports, Fortune and Freedom can give you more confidence in making informed decisions about what to do with your money.
00:01:00.780Simply go to fortuneandfreedom.com. That's fortuneandfreedom.com and sign up for a free newsletter that will help your money work for you.
00:01:35.420And this is a show for you if you want honest conversations with fascinating people.
00:01:40.460Our brilliant and returning guest today is the General Secretary of the Free Speech Union here
00:01:45.320in the UK, Toby Young. Welcome back to Trigonometry. Thank you, Constantine. Thank you, Francis.
00:01:50.120It's been a while since we had you on the show. And look, the Free Speech Union has been doing
00:01:54.460incredible work. Only a few days ago, one of our interns came in who does a bit of reading
00:01:58.820and research for us. And he's at university at the moment. And he was telling us that at his
00:02:03.340university, some people were trying to shut down some kind of event that was going to happen that
00:02:08.960had something to do with free speech or whatever. They were considering shutting it down, and
00:02:13.440then they went, actually, if we do this, Toby Young's going to come along and fuck things up. And
00:02:18.080so they allowed things to carry on, basically, because of that. And you're doing great
00:02:23.980stuff on other things too. How's it going with the Free Speech Union? It's
00:02:27.920going really well. We've now got 9,000-plus members, and we've got something like 17 employees.
00:02:35.080You know, I'm working on it, you know, pretty much full time.
00:02:38.900The demand for our services, you know, is almost limitless.
00:02:43.220We've just launched an FSU in Scotland.
00:02:46.780So we've opened a Scottish office and we've put together a really impressive advisory council in Scotland.
00:02:52.600You know, we've got Murdo Fraser, former deputy leader of the Conservative group up there, still an MSP.
00:02:58.020We've got Jim Sillars, former deputy leader of the SNP.
00:03:01.540We've got some trade unionists, we've got some poets, some writers, some intellectuals, a huge cross-section of people from every political side, all really concerned about the erosion of free speech in Scotland, particularly when the Hate Crime and Public Order (Scotland) Act is activated.
00:03:18.740It's got royal assent, but it hasn't yet been activated. When it's activated, that's going to make, you know, free speech under greater threat, I think, in Scotland than anywhere else in Europe.
00:03:27.340Yeah, and we're going to talk about the Online Safety Bill for the UK more broadly as well, because that's really important. But as outside observers, Francis and I don't have any direct involvement with the FSU, although we obviously support what you're doing.
00:03:40.600We are on the advisory council. We've given no advice whatsoever, which is to the massive benefit of the FSU, I reckon. But it seems to me, watching from the outside, that you keep winning battle after battle.
00:03:51.760You defend people who get cancelled or fired or whatever.
00:03:55.140As I said, you are fostering a culture of genuine respect for viewpoint diversity in academia.
00:04:01.300You're helping people recover their reputation after they've been unfairly smeared and attacked.
00:04:06.340Do you feel that the sum of these battles being won is helping us win the war?
00:10:33.360There are all these big tech social media companies, search engines, which are pretty much all progressive in one form or another.
00:10:42.500So it's great that someone's come in who doesn't share that agenda necessarily and believes more strongly in free speech.
00:10:49.040You know, that's an example of a market correction.
00:10:51.320So, you know, I think it's a good thing. Whether he'll actually kind of turn Twitter into the platform we'd all like it to be remains to be seen. What I hope he'll do is rather than overhaul Twitter's content moderation policies, I hope that he'll outsource the content moderation.
00:11:12.120So, essentially, say to people: look, it's up to you to censor what you want to censor. It's up to you
00:11:18.980if you want to take a risk. If you say, I don't mind being exposed to misinformation or hate speech,
00:11:24.560then what you see on Twitter needn't be filtered. If people want to just
00:11:30.440have civil, grown-up, respectful conversations about important public policy issues, they
00:11:36.580can turn all the filters on so they screen out the trolls. But it should be up to
00:11:41.320the individual as to the risk they're prepared to take. It shouldn't be imposed upon them top-down by
00:11:47.320content moderators in Silicon Valley. So I hope that's the way forward.
00:11:51.940Toby, when you said, if they want civil discussions, I'm like, get off Twitter.
00:11:55.540That's what you need to do. Well, you could create a kind of sub-
00:12:01.140community within Twitter in which people voluntarily screen out
00:12:08.240the loud, brash, vulgar voices. People like us, basically. Yeah, so we wouldn't
00:12:15.260screen us out, and have grown-up conversations. Oh, sorry, no, I was going to say, so let's talk about
00:12:22.620the Online Safety Bill, because this is what we've come to talk about primarily. Yeah, so let's get
00:12:27.140into it. What's the problem, Toby? Well, what is it first? What is it, and what is the problem with it?
00:12:31.180So the Online Harms Bill, it's now called the Online Safety Bill, but it began life as the Online Harms White Paper a few years ago.
00:12:43.720And Sajid Javid, when he unveiled that white paper, said his aim was to make Britain the safest place in the world to go online,
00:12:51.260which, you know, sounds to me like the most regulated place in the world if you want to set up a social media company.
00:12:58.560Anyway, so it began life there, and it was partly in response to a moral panic about how the internet and social media was harming children, causing them to self-harm, in some cases commit suicide, providing them with access to drugs and pornography and so on and so forth.
00:13:16.420So a moral panic about how children were being kind of corrupted, led astray, harmed by, you know, the Wild West.
00:13:25.060And so they brought it in and it's now kind of gone through a kind of evolutionary process.
00:13:30.620And the latest iteration of it is the second draft of the online safety bill.
00:13:35.780And I'll set out the initial case against it.
00:13:39.640Then let's try and steelman the Online Safety Bill,
00:13:43.300and I'll set out the case for it.
00:14:13.120It will impose a new legal duty on big social media companies and search engines, Google, Facebook, Twitter, YouTube, whereby those companies will have to remove harmful content, by which they mean not just content that's harmful to children, but also content that is harmful to adults.
00:14:36.020And if they fail to do so, they can be fined by Ofcom, which will be empowered as the new internet
00:14:43.900regulator. They can be fined 10 per cent of their annual global turnover. So when you're looking at
00:14:49.100companies like Facebook, Google and Twitter, that's billions of dollars. So social media
00:14:55.820companies and search engines will have a powerful financial incentive to remove harmful content.
00:15:01.580And the really sinister thing is that it won't just be content that's unlawful that they'll be obliged to remove.
00:15:10.120They'll also be obliged to remove what has been called legal but harmful content.
00:15:15.360So stuff you can say offline, you won't be able to say online.
00:15:19.180So the fact that it's legal doesn't mean it won't be prohibited.
00:15:22.840And often the stuff that the big social media companies will end up removing, which is perfectly
00:15:30.700lawful, will be stuff that the lobby groups and activists are complaining
00:15:36.940about. They'll say, this is harmful to the LGBTQ+ community, or this is harmful to
00:15:43.740recently arrived migrants, or whatever it is. And the companies, fearful that they might
00:15:48.980be fined these huge amounts by Ofcom, will err on the side of removal, of censorship. So
00:15:56.500that's the reason for concern. It's been described as a kind of censor's charter.
00:16:02.120And it means that stuff you'll be able to say in other countries, you won't be able
00:16:08.400to say here online. It'll be one of the most censorious, heavily regulated
00:16:16.020environments for social media companies and search engines if they want to operate
00:16:19.380in the UK. Okay, so that's the standard case against it.
00:16:25.300What is the case for it from a free speech point of view? Well, I made a couple of notes on the
00:16:29.880train on the way here, trying to think about how to steelman this from a free speech
00:16:34.880point of view. So Nadine Dorries and her ministers make the following arguments in favour
00:16:40.880of the bill from a free speech point of view. They point out that one of the things the bill
00:16:45.060will do will be to repeal various communications offences, such as the Malicious Communications
00:16:51.420Act, which free speech campaigners have been campaigning against for some time. The communications
00:16:58.360offence under which Count Dankula was prosecuted, that'll be repealed. And those communications
00:17:06.200offences will be replaced by a new harm-based communications offence. Instead of
00:17:13.080looking at the subject matter of the communication to determine
00:17:18.580whether it's unlawful or not, they'll look at the effect it has, the psychological effect it has, on
00:17:24.140the recipient. It'll be an unlawfully harmful communication if it
00:17:31.460causes psychological harm amounting to at least serious distress to a likely audience, if the person sending the message
00:17:37.820intended to cause harm to that audience, and the person has no reasonable excuse
00:17:42.940for sending the message. And Nadine Dorries and her ministers argue that this new harm-based
00:17:49.200communications offence, because it's replacing all these other communications offences which free
00:17:53.960speech campaigners like me don't like, will actually create a more permissive environment
00:17:58.680than the one we're currently in. So from a free speech point of view, we should welcome
00:18:02.700that particular cluster of reforms included in the bill. She also says, and
00:18:09.420she said this on Twitter yesterday, I think in a debate with Fraser Nelson, that there's nothing
00:18:13.520in the bill as it stands to force social media companies to remove legal but harmful content.
00:18:19.960All they'll have to do is agree terms and conditions with Ofcom, and in those terms
00:18:30.240and conditions, yes, they'll be obliged to pledge to remove unlawful content. But they won't be
00:18:36.400required by law under this bill to remove legal but harmful content. If they want to do that,
00:18:43.400they can. So if Twitter, probably not Twitter now, if Facebook
00:18:48.720wants to say, we're going to prohibit this content, because even though it's lawful, we think it's
00:18:54.460harmful, they'll be able to do that, but subject to a couple of provisos. One of the provisos is
00:19:00.180that they have to give special protection to content of democratic importance and journalistic
00:19:06.160content. We'll get onto that in a minute. And when removing content from their platforms,
00:19:11.380when removing lawful content, they have to have regard for freedom of speech. And Nadine
00:19:17.320Dorries says that, you know, I'm not going to force these social media companies, at
00:19:22.540least not in this bill, to remove legal but harmful content. The only thing they'll be
00:19:27.400obliged to do will be to remove unlawful content. If they want to go beyond that, they
00:19:32.560can, but they have to do it subject to these various caveats, which will protect free speech.
00:19:39.420And she argues that this is the first time social media companies have had these free speech duties
00:19:45.240imposed upon them, so it's a better place from a free speech point of view than we're in at the
00:19:50.040moment. And then the final argument, the final pro-free-speech argument, is that once a social
00:19:57.240media company has agreed its T's and C's with Ofcom, Ofcom will then hold that company's feet
00:20:03.620to the fire when it comes to enforcing those policies. So it'll have to enforce them consistently
00:20:09.780and non-arbitrarily and not in a politically biased way. And she argues that that'll make
00:20:14.900it impossible for Twitter, for instance, in future to remove Trump from the platform,
00:20:19.400but not the leader of the Taliban or Vladimir Putin. That social media companies won't be able
00:20:25.960to selectively apply their community standards
00:20:29.500and say Trigonometry's interview with Kathleen Stock
00:20:33.260has breached YouTube's community standards,
00:20:35.980but Novara Media's interview with George Galloway hasn't.
00:20:42.580and in that way, that will protect free speech
00:20:47.220and eliminate political bias on social media platforms.
00:20:50.620So that's the case for it from a free speech point of view.
00:20:54.380And I think if you just go back over those arguments, I can tell you why I don't think any of them are particularly compelling.
00:21:01.980So, yes, the repeal of various communications offences and their replacement by a harm-based communications offence.
00:21:13.120Yes, that's in some ways attractive. It's more attractive than where we are now.
00:21:18.240And I think it probably will be a little bit more permissive:
00:21:20.240fewer people will be prosecuted. I mean, not many people are prosecuted now, but even fewer
00:21:25.020in all likelihood will be prosecuted for communications offences in future once this
00:21:31.240bill becomes law. But the shortcoming of this approach of defining unlawful communications
00:21:39.960by looking at the impact they have on the recipient, psychological harm likely to cause
00:21:47.320extreme distress, is that it creates a responsibility for the state to
00:21:53.580protect people from psychological harm, and I think that is a very dangerous precedent, and
00:21:58.760not something the state should be in the business of doing. Well, not least because psychological harm
00:22:03.120is entirely subjective. Well, there's the subjectivity argument: how are we going to measure it?
00:22:07.140Well, Francis and I could discuss both of our hairlines, and one of us could
00:22:11.640be deeply offended while the other could see it as a joke. Yeah, right. And it says
00:22:17.440that if it's likely to cause psychological harm amounting to extreme distress to a likely audience.
00:22:24.720So on Twitter, you say something, you think you're just saying it to your followers,
00:22:29.880but if one of them retweets it, and they then retweet it, it goes viral, and it ends up
00:22:35.000on the screen of an extremely psychologically fragile person, it could actually
00:22:42.000cause them extreme distress. But why should you be held responsible for that? So, yeah,
00:22:49.180I think the idea that the state should be in the business of protecting people from psychological
00:22:56.060harm, and as you say, it's a sort of gold-embossed invitation to activists and lobby
00:23:01.220groups to say, that's psychologically harmful. That made me feel unsafe. Who are you to say
00:23:05.680that my distress isn't serious distress? Is it because I'm a trans person that you're not taking
00:23:13.020my distress seriously? You can just imagine a cascade of demands to remove content on the
00:23:19.620grounds that it's causing people who are quite psychologically fragile extreme psychological
00:23:24.880distress. So that's a really dangerous precedent, I think. But okay,
00:23:30.560Even setting that aside, when me and the chief legal counsel of the free speech union had a conversation with one of the ministers and some senior officials in the Department for Culture, Media and Sport about the bill, they offered this as a kind of quid pro quo.
00:23:47.580Yes, you know, in some respects, free speech will get worse because of this bill, but in others, it'll get better.
00:23:53.360And look, we're repealing the Malicious Communications Act. You must be happy about that.
00:23:57.620Well, yes. But one interesting thing is that these reforms to communications law will only apply in England and Wales, because communications offences are a devolved area of legislation. So they won't apply in Scotland. They won't apply in Northern Ireland.
00:24:15.980In Scotland, you could be in the worst of all possible worlds, in which social media companies could be prosecuted and fined billions of dollars for allowing people to post content that's harmful, and the old communications laws would still apply.
00:24:33.780So, you know, it's kind of like an analogue bill for a digital age in that
00:24:39.520respect, creating a different regulatory environment in England and Wales from the regulatory
00:24:44.060environment in Scotland and Northern Ireland. I mean, it's going to be an incredible headache
00:24:47.800for someone who wants to set up a competitor to Twitter or
00:24:53.280Facebook. They're going to have to get their heads around the unbelievable patchwork-quilt
00:24:57.880complexity of regulations in the UK. There won't even be one standard for the UK; it'll depend
00:25:03.240whether you're in Scotland, Northern Ireland, or England and Wales. I mean, it's ludicrous. Toby,
00:25:08.340before we go on: this is a Conservative government
00:25:14.680bringing this in. If we were in an alternate universe and Corbyn had won and they'd brought this in,
00:25:19.800I'd be like, this makes complete sense. What's going on? Well, that's absolutely true.
00:25:27.420It's baffling that a conservative government that claims to believe in free speech and, you know, a small state should be bringing in this sledgehammer of a bill, which is going to have an unbelievably chilling effect on free speech online.
00:25:45.240And one particularly interesting thing, since you bring up Corbyn, is Nadine's argument that nothing in this bill will force social media companies to remove legal but harmful content.
00:25:55.180Well, that's true, but it's quite a dishonest argument because what the bill does is it creates an opportunity, a mechanism for the Secretary of State at DCMS, currently Nadine Dorries, to bring forward supplementary legislation, probably in the form of a statutory instrument, identifying what are called in the bill priority harms.
00:26:21.000Those are things that social media companies will have to remove as a matter of priority.
00:26:25.180If they don't remove them, they really will get fined. And it's in that
00:26:29.480supplementary legislation, in that statutory instrument, that the legal but harmful stuff
00:26:34.260is going to be included. So it's likely, for instance, that social media companies will be told,
00:26:38.260you have to remove misinformation as a matter of priority, you have to remove hate speech as a
00:26:44.080matter of priority. It doesn't matter if it's lawful; if it's harmful according to this nebulous, open-ended
00:26:49.500definition, you'll have to remove it. So that's where the real mischief is going to occur, in this
00:26:55.000supplementary legislation. But Toby, what Francis is getting at, and we'll get to the rest of your
00:26:58.800point in a second, is: you know Boris Johnson, right? You studied together. We were
00:27:04.460told, and look, neither of us is a big fan of his, by the way, that this is a liberal Tory who wants
00:27:13.720to reduce the size of the state and keep us all free.
00:27:17.900And then we had two years like we've just seen,
00:27:35.100but they can be quite easily overridden.
00:27:37.260I mean, I think he has a libertarian kind of default response,
00:27:42.440And I think that's partly why he didn't want to plunge us into lockdown initially, you know, in February, March 2020.
00:27:50.780But they were quite quickly, you know, cast aside when he came under a lot of pressure from colleagues and officials, scientific advisors to lock us down.
00:28:00.600You know, they proved to be quite fragile principles.
00:28:04.060And I think in this case, you know, I don't think he's really applied himself to the bill.
00:28:09.280I mean, quite often the reason he can be easily kind of bullied by kind of more authoritarian colleagues into abandoning his libertarian instincts is because he hasn't done his homework.
00:28:59.020I think that was what he was sort of getting at.
00:29:01.820That he has quite a short attention span.
00:29:04.940And he's not good at kind of sitting down and going through his boxes.
00:29:09.880He's not really interested in the kind of nitty-gritty,
00:29:12.900the minutiae of kind of policy documents and proposals.
00:29:16.900And that's why, you know, I think Dominic Cummings refers to him as Trolley.
00:29:20.680His nickname behind closed doors in Downing Street is Trolley
00:29:24.200because he's like a wonky supermarket trolley that veers around
00:29:27.480according to whoever's pushing it or whoever he's bumped into last, you know.
00:29:30.660And maybe that's a slight exaggeration.
00:29:32.640But I think that's the explanation for why someone like Boris, who styles himself this great libertarian, Rabelaisian, Falstaffian, freedom-loving yeoman, is bringing in, his government is bringing in, this bill. He's easily bamboozled, easily knocked off course, and his libertarian instincts can easily be submerged under all this pressure.
00:29:59.760I think that's probably the explanation for that. But just back to your Corbyn point, because the bill creates an opportunity for the Secretary of State to come up with these priority harms, and that's where the legal but harmful stuff comes in.
00:30:15.560OK, Nadine Dorries may not go completely over the top when she brings forward her supplementary legislation and it's debated in the House of Commons.
00:30:23.820It may be reasonable. We may think, well, actually, you know, the definition of hate speech in this statutory instrument isn't too bad, you know.
00:30:32.040But what's to stop, you know, a successor?
00:30:35.380You know, let's say Keir Starmer wins the next general election, not out of the question at this point,
00:30:39.640and appoints Chris Bryant or Dawn Butler as Secretary of State at DCMS,
00:30:45.640they would then have the power under this bill to bring forward another statutory instrument
00:35:01.360And, you know, why shouldn't I be able to decide
00:35:04.460what lawful content I'm able to see, you know,
00:35:08.640on my phone, on my social media platforms?
00:35:11.000Well, Toby, for all my strong support for the idea of freedom of speech, and of course you know my position on it, I think the harm side of these arguments is becoming more prominent, because as these networks become bigger and connect more and more people, the concern grows. The example I'm about to give is not one in which I particularly share the analysis, but I can see a similar example in which that would be the case. The example
00:35:41.000is the banning of Donald Trump at the very beginning of 2021.
00:35:45.500And the argument from the social media companies was
00:35:48.280this was an election in the most powerful country in the world
00:35:52.080with tons of nuclear weapons, and it seemed to them,
00:35:55.360I don't agree with this interpretation, but it seemed to them
00:35:58.000that there was essentially an attempt to overthrow
00:36:00.400the lawfully elected candidate by storming the Capitol.
00:36:04.740Again, I don't agree with that interpretation,
00:36:08.000And they felt that given the potential for huge harm that could come from lawful activity, it's not unlawful to say on Twitter that you think the election is illegitimate or whatever.
00:36:25.720But the impact of someone with, I don't know, 70 million followers saying that over and over and over, they argue, caused people to get to that position where they are storming the Capitol and people die in the process.
00:36:39.720Isn't that something that whether you like it or not, whether I like it or not, we do have to reckon with that ideas are powerful things and connecting billions of people together in a place where they can say things of that nature has real world impact.
00:36:54.600And as a society, like it or not, we're going to have to have some kind of reckoning with that.
00:36:58.700Well, I would accept that a limit should be placed on free speech whereby people
00:37:07.520shouldn't be free to say things which are likely to result imminently in violence. I
00:37:16.520think that's a legitimate restriction on free speech, and maybe you could
00:37:21.500argue that that was a good reason, that that was the rationale, for banning Trump from Twitter. What he
00:37:28.260was saying on the eve of the attack on the Capitol was inciting violence. And if it was, then maybe
00:37:37.120that was a good reason. But I think the difficulty with removing stuff, with people saying, we think the
00:37:43.100election was stolen, for instance, after Trump's been removed from the platform, is that
00:37:49.400I think it's hard to make the case that that's likely to lead to imminent violence. Right, user
00:37:55.9807345 saying the election was stolen isn't causing the storming of the Capitol. And I think there was the
00:38:01.720argument that, I think, Louis Brandeis, who was a Supreme Court justice, made in a famous case,
00:38:08.360I think in the 1920s, in which he said the cure for bad information is not to suppress that
00:38:16.060information; the cure is more information. Sunlight is the best disinfectant. And the problem with
00:38:21.120banning stuff that you think is misinformation or is likely to cause insurrection is that
00:38:28.740by banning it, you're not reducing its toxicity. You're forcing it underground; you're
00:38:34.400forcing it into darker corners of the internet. People who are
00:38:39.700inclined to be triggered by that stuff are probably going to find it anyway. Isn't it better
00:38:44.220that it should be on larger, more mainstream platforms, where it can be challenged openly?
00:38:51.540If you think the election was stolen, and you make that argument on Twitter, and Twitter bans it,
00:38:59.820that's going to persuade you that you're right, and that that's why it was banned.
00:39:03.120It's going to persuade other people seeing that ban that there's actually something to it, because otherwise, why would they ban it?
00:39:08.660They're obviously scared; they've got something to hide.
00:39:10.380Whereas if you let them say it, and then rebut it with an overwhelming number of facts, with a mountain of evidence to show it isn't true, then it's likely to lose a lot of its power and a lot of its influence.
00:39:24.080And I think that sunlight is the best disinfectant.
00:39:26.120And that's a really good argument against suppressing even dangerous misinformation.
00:39:33.840And I'm obviously playing devil's advocate and trying to explore the argument here.
00:39:37.500But one of the things we do see on social media is that that isn't what happens. When somebody says, I think this, it's not that there's an overwhelming barrage of counter-information. It's that everyone's in their own silo now. People who think A and people who think B are just in different spaces, talking to themselves.
00:39:58.640So you're not getting that traditional idea of: if we're all in a meeting with 300 people, and
00:40:04.760you stand up and go, I think COVID is caused by 5G, and we need to go and blah, blah, blah, then
00:40:10.720200 other people stand up and go, you're an idiot, and here's why. That's what
00:40:16.380you're talking about. But that's not what happens on social media. What happens on social media is
00:40:20.840one person stands up and says something, 200 other people who already agree with them agree, and they never see
00:40:26.500the people who don't agree with them. I think that's a slight caricature. I mean, we're
00:40:30.980all on Twitter. Sometimes you do see quite grown-up, well-informed, quite
00:40:37.060sophisticated, nuanced debates taking place between people on completely opposite
00:40:41.720sides of big, contentious issues. And you see it on Facebook too; you see it
00:40:47.140on YouTube. Of course there are people who don't want to hear the other
00:40:51.780side, but I think it's hard to blame social media for that. I think we've
00:40:55.820naturally just become a more politically polarised society, both here and in America and
00:41:00.760elsewhere. Social media may have accentuated that, but I don't
00:41:05.180think it's the root of the problem. I think there's another argument too: the argument against
00:41:09.740banning hate speech. So the argument for banning hate speech is that if you don't you allow these
00:41:17.000kind of toxic flowers to bloom and they could result in kind of unpleasant people getting
00:41:24.680elected, populist revolts, and so on and so forth. But in Jacob Matcham-Gamer's recent book on free
00:41:32.880speech, which was a history of free speech, he makes this really good observation about Weimar,
00:41:39.140Germany. And you probably know this, but in Weimar, Germany, various forms of hate speech,
00:41:43.180such as anti-Semitism, were criminalised. And various kind of prominent Nazis actually spent
00:41:50.800time in jail for producing, you know, publishing anti-Semitic material. Hitler was banned for large
00:41:58.680parts of the 1920s in various German states. And none of that did anything to undermine the kind of
00:42:09.780appeal and growth of the Nazi party. On the contrary, it allowed them to cast themselves
00:42:15.260as martyrs at odds with this kind of oppressive state that was kind of suppressing what they had
00:42:21.580to say, because it was true. You know, it had the opposite of its intended effect. I think that's
00:42:25.900generally the case: if you do try and suppress these right-wing toxic points of view, you don't
00:42:32.020stave off another kind of Nazi victory at an election, you make it more likely. And we keep
00:42:39.660using the word harm. How do they define this word? Well, I think, I got
00:42:46.200the definition, I wrote it down here. It's, um, yeah, psychological harm amounting to at least
00:42:52.980serious distress. But see, this is the problem, because what are you going to do? Everybody
00:42:59.860who then makes a complaint, are you going to have a psychiatrist come in and give them some kind of
00:43:04.660psychiatric assessment? How do you define this? No, there's no requirement for a clinical
00:43:10.880assessment. Now, I think it'll be subjectively defined. If someone claims that a post that
00:43:16.760they've read has caused them extreme psychological distress, then you'll have to take that at face
00:43:23.020value. But let's flip it over and look at it another way. So my mother's Venezuelan.
00:43:29.400Socialism has been a disaster in my country. One of my cousins literally drinks rainwater from
00:43:34.880the tank on top of his house, because the running water in his house is so polluted
00:43:40.200that if he drinks it, it could very literally make him incredibly ill, even kill him. So what if someone
00:43:47.040puts a post advocating socialism on Facebook, and I go, I'm distressed, take it down? Well, um, I guess this
00:43:53.180is where the protections for content of democratic importance come in. I mean, this is one of
00:44:00.920the great, Nadine Dorries and the defenders of the bill make great play of this. If you
00:44:05.540want to make a political argument, it doesn't matter if it causes people extreme psychological
00:44:10.160distress, it'll be protected because it's content of democratic importance and we don't
00:44:14.460want to censor those important democratic debates. The question becomes, well, who gets
00:44:19.920to decide what's content of democratic importance and what isn't? You know, there was that
00:44:25.520recent, the first Maya Forstater employment tribunal, in which the judge used this
00:44:31.380very sinister phrase, in which he said that gender-critical views weren't deserving of respect in a
00:44:37.760democratic society. So, you know, according to that judge, the member of the employment
00:44:43.700tribunal, he didn't think that was content of democratic importance. Now, you know,
00:44:48.820That judgment was overturned in part by the Employment Appeals Tribunal, and now the Employment Tribunal is taking place again, and we don't know what the outcome of that will be.
00:44:59.300But, you know, for a figure of considerable authority to declare, no, that isn't content of democratic importance, because gender-critical beliefs are not deserving of respect in a democratic society, you can imagine that kind of argument being made over and over again.
00:45:15.340And, you know, how much do we trust Ofcom to get that kind of thing right?
00:45:19.740Don't forget, Melanie Dawes, the chief executive of Ofcom, said a couple of years ago that she didn't think that it was appropriate to feature anyone from the LGB Alliance on a BBC News or ITV News discussion about the Gender Recognition Act because their views were beyond the pale.
00:45:37.420You know, and okay, she's revised that position now, but how confident can we be that Ofcom isn't
00:45:43.160going to make similar errors of judgment when it comes to deciding what issues are deserving of
00:45:48.940this protection, because they're of democratic importance, and what aren't? You know, I certainly
00:45:53.680don't trust them. I'd personally like to be able to make that decision myself, and not trust a
00:45:57.900regulator to make it on my behalf. I mean, there's another problem as well with protecting journalistic
00:46:02.320content. You know, who gets to decide who's a journalist? Right? Are we journalists? Is this
00:46:06.420journalistic content? We'd have to ask Ofcom, and who knows
00:46:11.560what they might say about that. Well, they could come up with a list of kind of approved journalists. Will
00:46:15.960you have to be regulated by IPSO or IMPRESS? You know, and if you do, that's a way of bringing in
00:46:22.040state regulation of the press. You know, until now, regulation of the press has been voluntary,
00:46:28.380and there was a huge battle that me and others fought to prevent it becoming, you know, regulated
00:46:35.020by a state regulator in the aftermath of the Leveson inquiry.
00:46:38.980And we won that battle, but it looks like that's coming back now