TRIGGERnometry - May 09, 2022


Toby Young: "New Law is the Greatest Blow to Free Speech"


Episode Stats

Length: 56 minutes

Words per Minute: 178.45

Word Count: 10,001

Sentence Count: 287


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
00:00:00.080 Hey Francis, do you need new investment ideas?
00:00:03.180 No thanks, I've got all my cash tied up in Venezuelan crypto.
00:00:07.260 Ah, how is GringoCoin doing?
00:00:09.100 It's pronounced GringoCoin.
00:00:10.660 My portfolio is now worth a billion Venezuelan bolivars.
00:00:15.300 That's about three quid then.
00:00:16.760 Uh, you're right, I should have got new investment ideas.
00:00:19.900 Well, if you want to take back control of your finances, then Fortune and Freedom is for you.
00:00:24.960 It was founded by Nigel Farage, who has over 40 years of experience in finance and politics.
00:00:30.920 Fortune and Freedom is published by South Bank Investment Research
00:00:34.620 and is for the investor looking to access a wide range of informed opinions
00:00:39.720 on lots of different investing opportunities.
00:00:42.600 Their brilliant newsletter covers everything from causes and the impact of inflation
00:00:47.380 to the rise of cryptocurrencies, gold investing and much more besides.
00:00:51.820 Through the daily news commentary and special reports, Fortune and Freedom can give you more confidence in making informed decisions about what to do with your money.
00:01:00.780 Simply go to fortuneandfreedom.com. That's fortuneandfreedom.com and sign up for a free newsletter that will help your money work for you.
00:01:11.420 The link is in the description.
00:01:12.900 The critical thing is the government, through Ofcom, the state will be deciding
00:01:19.460 what jokes it thinks you should be able to hear and what jokes you shouldn't.
00:01:30.060 Hello and welcome to TRIGGERnometry. I'm Francis Foster.
00:01:34.260 I'm Konstantin Kisin.
00:01:35.420 And this is a show for you if you want honest conversations with fascinating people.
00:01:40.460 Our brilliant and returning guest today is the General Secretary of the Free Speech Union here
00:01:45.320 in the UK, Toby Young. Welcome back to TRIGGERnometry. Thank you, Konstantin. Thank you, Francis.
00:01:50.120 It's been a while since we had you on the show. And look, the Free Speech Union has been doing
00:01:54.460 incredible work. Only a few days ago, one of our interns came in who does a bit of reading
00:01:58.820 and research for us. And he's at university at the moment. And he was telling us that at his
00:02:03.340 university, some people were trying to shut down some kind of event that was going to happen that
00:02:08.960 had something to do with free speech. They were considering shutting it down, and
00:02:13.440 then they went, actually, if we do this, Toby Young's going to come along and fuck things up. And
00:02:18.080 so they allowed things to carry on, basically, because of that. And you're doing great stuff
00:02:23.980 on other things too. How's it going with the Free Speech Union? Yeah, it's
00:02:27.920 going really well. We've now got 9,000-plus members, we've got something like 17 employees.
00:02:35.080 You know, I'm working on it, you know, pretty much full time.
00:02:38.900 The demand for our services, you know, is almost limitless.
00:02:43.220 We've just launched an FSU in Scotland.
00:02:46.780 So we've opened a Scottish office and we've put together a really impressive advisory council in Scotland.
00:02:52.600 You know, we've got Murdo Fraser, former deputy leader of the Conservative group up there, still an MSP.
00:02:58.020 We've got Jim Sillars, former deputy leader of the SNP.
00:03:01.540 We've got some trade unionists, some poets, some writers, some intellectuals, a huge cross-section of people from every political side, all really concerned about the erosion of free speech in Scotland, particularly when the Hate Crime and Public Order (Scotland) Act is activated.
00:03:18.740 It's got royal assent, but it hasn't yet been activated. When it is, that's going to put free speech under greater threat, I think, in Scotland than anywhere else in Europe.
00:03:27.340 Yeah, and we're going to talk about the online harms bill for the UK more broadly as well, because that's really important. But as an outside observer, you know, Francis and I, we don't have any direct involvement with the FSU, although we obviously support what you're doing.
00:03:39.520 I think you're both on the advisory.
00:03:40.600 We are on the advisory. We've given no advice whatsoever, which is for the massive benefit of the FSU, I reckon. But it seems to me from the outside watching that you keep winning battle after battle.
00:03:51.760 You defend people who get cancelled or fired or whatever.
00:03:55.140 As I said, you are fostering a culture of genuine respect for viewpoint diversity in academia.
00:04:01.300 You're helping people recover their reputation after they've been unfairly smeared and attacked.
00:04:06.340 Do you feel that the sum of these battles being won is helping us win the war?
00:04:13.340 Or are you more hesitant about that?
00:04:15.060 I'm a little more hesitant.
00:04:16.180 I mean, I don't want to sound too pessimistic.
00:04:18.220 And I think the war is winnable.
00:04:20.000 but it's not going to be winnable particularly quickly. It feels like a generational
00:04:25.340 battle rather than something that can be won overnight. If you think about the long
00:04:30.460 march through the institutions, that's taken 70 years. So it's not that surprising, if people
00:04:36.420 have dedicated themselves to capturing the commanding heights of the cultural economy,
00:04:40.740 that those heights are now captured, and to recapture them, or at least to decouple them from
00:04:47.820 these woke cultists, is not going to be something that
00:04:53.000 can be accomplished overnight. I don't think it'll take 70 years, and I think the presence
00:04:57.120 of social media accelerates these cultural shifts, so it's possible things may
00:05:02.300 improve a lot over the next five to ten years. And what made you start the Free Speech Union, Toby?
00:05:07.540 It was partly my own experience of being cancelled at the beginning of 2018. I was
00:05:13.260 appointed to the Office for Students, which is the universities regulator in England, by Theresa May,
00:05:20.960 and because I was a political appointee to a public role, all the enemies of the
00:05:29.580 government decided to do some offence archaeology on me, looking at everything
00:05:35.620 I'd said or written or tweeted dating back to the 80s. One person literally dredged
00:05:42.520 up something I'd written in 1987, which was 31 years earlier. And of course, because I've
00:05:48.480 been a fairly provocative, controversial journalist for
00:05:53.920 more than 30 years, it didn't take them long to find unsuitable material. And so I stepped down
00:06:00.360 from that position after about a week, at the urging of various government ministers and
00:06:05.500 officials. And then the mob came for me in all my other positions, and
00:06:11.920 I ended up having to step down from five positions in total, including my full-time job, which paid
00:06:16.100 the mortgage. And that was kind of an unpleasant experience. And one of the really unpleasant
00:06:23.940 things about it is that there was no organization to turn to for kind of good professional guidance.
00:06:30.500 You know, if I step down, is that going to help? Or is that going to make things worse? Should I
00:06:34.320 resign or force them to fire me? Should I apologize? Or will that just be like, you know,
00:06:38.600 blood in the water? Who do I turn to for good psychological counselling? Some people,
00:06:44.620 when they're faced with being targeted by an outrage mob, find it really psychologically
00:06:49.280 traumatic; that's one of the things we've found at the Free Speech Union. So afterwards,
00:06:53.520 after the dust had settled and I'd recovered, I thought, why not set up an organisation
00:06:57.980 that can help people who find themselves in this situation? You think you can rely on your
00:07:02.980 friends, and to a large part you can. Louis C.K., who got
00:07:10.200 cancelled, obviously, came up with a good line. He said, people tell you that when something like this happens you find out
00:07:15.040 who your real friends are, and that's true, but it's the wrong half. In my case it wasn't the wrong half:
00:07:20.440 most of my friends did stick by me. But people are a bit reluctant to stand up and defend you
00:07:25.880 in public for fear that the mob will then target them. So to have a professional
00:07:29.920 organisation there to do that for you and offer you advice and put a kind of protective shield
00:07:34.600 around you, I thought that would be, you know, a really helpful service, something that was really
00:07:38.580 needed. And that was really the inspiration for the Free Speech Union. What people get
00:07:43.660 wrong about the Free Speech Union is they think it's just about helping celebrities,
00:07:49.320 radio presenters and so on. But that's not the vast majority of the work that you do, is it? No,
00:07:53.860 the vast majority of the people we help are ordinary people, you know, from all walks of
00:07:58.300 life. So, for instance, we recently came to the aid of a West Midlands train driver, a guy called
00:08:04.880 Jeremy Sleath, a Corbynista, actually. And on Freedom Day, July 19th last year, he said on his
00:08:13.400 private Facebook account, thank God the pubs are reopening. I didn't want to live in a Muslim
00:08:19.940 alcohol-free caliphate for the rest of my life. And one of his colleagues complained. He was
00:08:26.800 investigated and fired for gross misconduct for something he said, a mildly provocative,
00:08:32.980 not terribly funny gag on his private Facebook account. That was enough to get him fired from
00:08:38.040 his job as a train driver. He came to us for help, we raised some money, we took his case to the
00:08:45.440 employment tribunal, and we successfully got a judgment of unfair dismissal, and now they're
00:08:53.140 agreeing the compensation he should be paid. But yeah, people like that are the people that come
00:08:58.020 to us for help. Ordinary people, not big celebrities like J.K. Rowling, who can look
00:09:02.280 after themselves. Toby, before we get onto the online harms bill, which I think is really,
00:09:06.800 really important for us to talk about, really important, just as a curious aside, we've seen
00:09:12.380 in the last few days Elon Musk's impending takeover of Twitter, if it does in fact happen,
00:09:18.180 because it still hasn't been finalized. As the General Secretary of the Free Speech Union,
00:09:22.720 What do you make of sort of one man taking over one of the crucial platforms in which public debate happens?
00:09:28.960 And, of course, I think he's expressed some principles that the three of us would agree with.
00:09:33.820 But the principle of one person having that much control, are you optimistic?
00:09:39.240 Are you concerned? Are you both?
00:09:40.960 Where do you sit on that issue?
00:09:43.020 I'm not kind of too troubled by the fact that, you know, one guy is exercising this extraordinary influence.
00:09:52.720 You know, you could say the same about Mark Zuckerberg and Facebook.
00:09:56.680 But I am troubled by Mark Zuckerberg.
00:09:58.920 Okay, but it's not.
00:09:59.840 People objecting to it are objecting as though something like this has never happened before.
00:10:04.780 This isn't an unprecedented example of one man exercising this extraordinary influence.
00:10:12.080 When, you know, social media companies, successful internet companies, search engines,
00:10:17.020 they're all owned by, you know, a handful of billionaires.
00:10:19.580 So, you know, okay, you can change the system.
00:10:22.160 But there's nothing kind of particularly unusual or sinister about what's just happened.
00:10:26.260 And actually, in a way, I think it's an example of capitalism working the way it should, not an indictment of capitalism.
00:10:31.700 You know, it's a market correction.
00:10:33.360 There are all these big tech social media companies, search engines, which are pretty much all progressive in one form or another.
00:10:42.500 So it's great that someone's come in who doesn't share that agenda necessarily and believes more strongly in free speech.
00:10:49.040 You know, that's an example of a market correction.
00:10:51.320 So, you know, I think it's a good thing. Whether he'll actually kind of turn Twitter into the platform we'd all like it to be remains to be seen. What I hope he'll do is rather than overhaul Twitter's content moderation policies, I hope that he'll outsource the content moderation.
00:11:12.120 So essentially say to people, look, it's up to you to censor what you want to censor. It's up to you
00:11:18.980 if you want to take a risk. If you say, I don't mind being exposed to misinformation or hate speech,
00:11:24.560 then what you see on Twitter needn't be filtered. If people want to just
00:11:30.440 have civil, grown-up, respectful conversations about important public policy issues, then they
00:11:36.580 can turn all the filters on so they screen out the trolls. But it should be up to
00:11:41.320 the individual as to the risk they're prepared to take; it shouldn't be imposed upon them top-down by
00:11:47.320 content moderators in Silicon Valley. So I hope that's the way forward.
00:11:51.940 Toby, when you said, if they want civil discussions, I'm like, get off Twitter.
00:11:55.540 That's what you need to do. You could create a sub-
00:12:01.140 community within Twitter in which people voluntarily screen out
00:12:08.240 the loud, brash, vulgar voices, people like us, basically. Yeah, so we wouldn't
00:12:15.260 screen us out and have grown-up conversations. Oh sorry, no, I was going to say, so let's talk about
00:12:22.620 the online harms bill, because this is what we've come to talk about primarily. Yeah, so let's get
00:12:27.140 into it. What's the problem, Toby? Well, what is it first? What is it, and what is the problem with it?
00:12:31.180 So the Online Harms Bill, or as it's now called, the Online Safety Bill, began life as the Online Harms White Paper a few years ago.
00:12:43.720 And Sajid Javid, when he unveiled that white paper, said his aim was to make Britain the safest place in the world to go online,
00:12:51.260 which, you know, sounds to me like the most regulated place in the world if you want to set up a social media company.
00:12:58.560 Anyway, so it began life there, and it was partly in response to a moral panic about how the internet and social media was harming children, causing them to self-harm, in some cases commit suicide, providing them with access to drugs and pornography and so on and so forth.
00:13:16.420 So a moral panic about how children were being kind of corrupted, led astray, harmed by, you know, the Wild West.
00:13:25.060 And so they brought it in and it's now kind of gone through a kind of evolutionary process.
00:13:30.620 And the latest iteration of it is the second draft of the online safety bill.
00:13:35.780 And I'll set out the kind of initial case against it.
00:13:39.640 Then let's try and steel man the online safety bill.
00:13:43.300 and I'll then try and set out the case for it
00:13:45.660 from a free speech point of view.
00:13:47.400 And incredibly, there is a free speech argument for it
00:13:49.480 and Nadine Dorries has made that argument.
00:13:51.200 So we'll look at that and then tell you
00:13:53.680 why I don't think the free speech case
00:13:55.560 as she's making it for the bill is very compelling.
00:13:58.920 I hope that doesn't sound too much,
00:14:00.100 like a 40-minute lecture.
00:14:02.140 So the kind of...
00:14:03.080 Keep it to 20.
00:14:04.120 I'll try and keep it to 20.
00:14:05.120 So on the face of it, the reason for concern
00:14:08.160 is that it will create a duty of care
00:14:13.120 It will impose this new legal duty on social media, big social media companies, search engines, Google, Facebook, Twitter, YouTube, whereby those companies will have to remove harmful content, by which they mean not just content that's harmful to children, but also content that is harmful to adults.
00:14:36.020 And if they fail to do so, they can be fined by Ofcom, who will be empowered as the new internet
00:14:43.900 regulator. They can be fined 10 per cent of their annual global turnover, so when you're looking at
00:14:49.100 companies like Facebook, Google, Twitter, that's billions of dollars. So social media
00:14:55.820 companies and search engines will have a powerful financial incentive to remove harmful content.
00:15:01.580 And the really sinister thing is that it won't just be content that's unlawful that they'll be obliged to remove.
00:15:10.120 They'll also be obliged to remove what has been called legal but harmful content.
00:15:15.360 So stuff you can say offline, you won't be able to say online.
00:15:19.180 So the fact that it's legal doesn't mean it won't be prohibited.
00:15:22.840 And often the stuff that the big social media companies will end up removing, which is perfectly
00:15:30.700 lawful, will be stuff that lobby groups and activists are complaining
00:15:36.940 about. They'll say, this is harmful to the LGBTQ+ community, or this is harmful to
00:15:43.740 recently arrived migrants, or whatever it is. And the companies, fearful that they might
00:15:48.980 be fined these huge amounts by Ofcom, will err on the side of removal, of censorship. So
00:15:56.500 that's the reason for concern. It's been described as a kind of censor's charter,
00:16:02.120 and it means that stuff you'll be able to say in other countries, you won't be able
00:16:08.400 to say here online. It'll be one of the most censorious, heavily regulated
00:16:16.020 environments for social media companies and search engines if they want to operate
00:16:19.380 in the UK. Okay, so that's the standard case against it.
00:16:25.300 What is the case for it from a free speech point of view? Well, I made a couple of notes on the
00:16:29.880 train on the way here, trying to kind of think about how to steel man this from a free speech
00:16:34.880 point of view. So Nadine Dorries and her ministers make the following arguments in favour
00:16:40.880 of the bill from a free speech point of view. They point out that one of the things the bill
00:16:45.060 will do will be to repeal various communications offences, such as the Malicious Communications
00:16:51.420 Act, which free speech campaigners have been campaigning against for some time. The communications
00:16:58.360 offence under which Count Dankula was prosecuted, that'll be repealed. And those communications
00:17:06.200 offences will be replaced by a new harm-based communications offence, which, instead of
00:17:13.080 looking at the subject matter of the communication to determine
00:17:18.580 whether it's unlawful or not, looks at the effect it has, the psychological effect it has, on
00:17:24.140 the recipient. It'll be an unlawfully harmful communication if it
00:17:31.460 causes psychological harm amounting to at least serious distress to a likely audience, if the person sending the message
00:17:37.820 intended to cause harm, and the person has no reasonable excuse
00:17:42.940 for sending the message. And Nadine Dorries and her ministers argue that this new harm-based
00:17:49.200 communications offence, because it's replacing all these other communications offences which free
00:17:53.960 speech campaigners like me don't like, will actually create a more permissive environment
00:17:58.680 than the one we're currently in. So from a free speech point of view, we should welcome
00:18:02.700 that particular cluster of reforms included in the bill. She also says, and
00:18:09.420 she said this on Twitter yesterday, I think in a debate with Fraser Nelson, that there's nothing
00:18:13.520 in the bill as it stands to force social media companies to remove legal but harmful content.
00:18:19.960 All they'll have to do is agree terms and conditions with Ofcom, and in those terms
00:18:30.240 and conditions, yes, they'll be obliged to pledge to remove unlawful content, but they won't be
00:18:36.400 required by law under this bill to remove legal but harmful content. If they want to do that,
00:18:43.400 they can. So if they want to say, if Twitter, probably not Twitter now, Facebook, if Facebook
00:18:48.720 wants to say, we're going to prohibit this content, because even though it's lawful, we think it's
00:18:54.460 harmful, they'll be able to do that, but subject to a couple of provisos. One of the provisos is
00:19:00.180 that they have to give special protection to content of democratic importance and journalistic
00:19:06.160 content. We'll get onto that in a minute. And when removing content from their platforms,
00:19:11.380 when removing lawful content, they have to have regard for freedom of speech. And Nadine
00:19:17.320 Dorries says that, you know, I'm not going to force these social media companies, at
00:19:22.540 least not in this bill, to remove legal but harmful content. The only thing they'll be
00:19:27.400 obliged to do will be to remove unlawful content. If they want to go beyond that, they
00:19:32.560 can but they have to do it subject to these various caveats which will protect free speech
00:19:39.420 and she argues that this is the first time social media companies have had these free speech duties
00:19:45.240 imposed upon them so it's a better place from a free speech point of view than we're in at the
00:19:50.040 moment and then the final argument the final pro-free speech argument is that once a social
00:19:57.240 media company has agreed its T's and C's with Ofcom, Ofcom will then hold that company's feet
00:20:03.620 to the fire when it comes to enforcing those policies. So it'll have to enforce them consistently
00:20:09.780 and non-arbitrarily and not in a politically biased way. And she argues that that'll make
00:20:14.900 it impossible for Twitter, for instance, in future to remove Trump from the platform,
00:20:19.400 but not the leader of the Taliban or Vladimir Putin. That social media companies won't be able
00:20:25.960 to selectively apply their community standards
00:20:29.500 and say TRIGGERnometry's interview with Kathleen Stock
00:20:33.260 has breached YouTube's community standards,
00:20:40.540 but Novara Media's interview with George Galloway hasn't.
00:20:40.540 They'll have to be consistent,
00:20:42.580 and in that way, that will protect free speech
00:20:47.220 and eliminate political bias on social media platforms.
00:20:50.620 So that's the case for it from a free speech point of view.
00:20:54.380 And I think if we just go back over those arguments, I can tell you why I don't think any of them are particularly compelling.
00:21:01.980 So, yes, the repeal of various communications offences and their replacement by a harm-based communications offence,
00:21:13.120 yes, that's in some ways attractive. It's more attractive than where we are now.
00:21:18.240 And I think it probably will be a little bit more permissive:
00:21:20.240 fewer people will be prosecuted. I mean, not many people are prosecuted now, but even fewer
00:21:25.020 in all likelihood will be prosecuted for communications offences in future once this
00:21:31.240 bill becomes law. But the shortcoming of this approach of defining unlawful communications
00:21:39.960 by looking at the impact they have on the recipient, psychological harm likely to cause
00:21:47.320 extreme distress, is that it creates this responsibility for the state to
00:21:53.580 protect people from psychological harm, and I think that is a very dangerous precedent, and
00:21:58.760 not something the state should be in the business of doing. Well, not least because psychological harm
00:22:03.120 is entirely subjective. Well, there's the subjectivity argument: how are we going to measure it?
00:22:07.140 Well, Francis and I could discuss both of our hairlines, and one of us could
00:22:11.640 be deeply offended while the other could see it as a joke. Yeah, right. And it says
00:22:17.440 that if it's likely to cause psychological harm amounting to extreme distress to a likely audience...
00:22:24.720 So on Twitter you say something, you think you're just saying it to your followers,
00:22:29.880 but if one of them retweets it, and they then retweet it, it goes viral, and it ends up
00:22:35.000 on the screen of an extremely psychologically fragile person. It could actually
00:22:42.000 cause them extreme distress, but why should you be held responsible for that? So, yeah,
00:22:49.180 I think the idea that the state should be in the business of protecting people from psychological
00:22:56.060 harm, as you say, is a sort of gold-embossed invitation to activists and lobby
00:23:01.220 groups to say, that's psychologically harmful. That made me feel unsafe. Who are you to say
00:23:05.680 that my distress isn't serious distress? Is it because I'm a trans person? You're not taking
00:23:13.020 my distress seriously? You can just imagine a cascade of demands to remove content on the
00:23:19.620 grounds that it's causing people who are quite psychologically fragile extreme psychological
00:23:24.880 distress. So that's extremely dangerous and a really dangerous precedent, I think. But okay,
00:23:30.560 Even setting that aside, when me and the chief legal counsel of the free speech union had a conversation with one of the ministers and some senior officials in the Department for Culture, Media and Sport about the bill, they offered this as a kind of quid pro quo.
00:23:47.580 Yes, you know, in some respects, free speech will get worse because of this bill, but in others, it'll get better.
00:23:53.360 And look, we're repealing the Malicious Communications Act. You must be happy about that.
00:23:57.620 Well, yes. But one interesting thing is that these reforms to communications law will only apply in England and Wales, because communications offences are a devolved area of legislation. So they won't apply in Scotland, and they won't apply in Northern Ireland.
00:24:15.980 In Scotland, you could be in the worst of all possible worlds, in which social media companies could be prosecuted, fined billions of dollars, for posting or allowing people to post content that's harmful under these old communications laws.
00:24:33.780 So it's kind of like an analogue bill for a digital age in that
00:24:39.520 respect, creating a different regulatory environment in England and Wales to the regulatory
00:24:44.060 environment in Scotland and Northern Ireland. I mean, it's going to be an incredible headache
00:24:47.800 for someone who wants to set up a competitor to Twitter or
00:24:53.280 Facebook. They're going to have to get their heads around the unbelievable patchwork-quilt
00:24:57.880 complexity of regulations in the UK. There won't even be one standard for the UK; it'll depend on
00:25:03.240 whether you're in Scotland, Northern Ireland, or England and Wales. I mean, it's ludicrous. Toby,
00:25:08.340 before we go on: this is a Conservative government
00:25:14.680 bringing this in. If we were in an alternate universe and Corbyn won and they brought this in,
00:25:19.800 I'd be like, this makes complete sense. What's going on? Well, that's absolutely true.
00:25:27.420 It's baffling that a conservative government that claims to believe in free speech and, you know, a small state should be bringing in this sledgehammer of a bill, which is going to have an unbelievably chilling effect on free speech online.
00:25:45.240 And one particularly interesting point, since you bring up Corbyn: Nadine's argument is that nothing in this bill will force social media companies to remove legal but harmful content.
00:25:55.180 Well, that's true, but it's quite a dishonest argument because what the bill does is it creates an opportunity, a mechanism for the Secretary of State at DCMS, currently Nadine Dorries, to bring forward supplementary legislation, probably in the form of a statutory instrument, identifying what are called in the bill priority harms.
00:26:21.000 Those are things that social media companies will have to remove as a matter of priority;
00:26:25.180 if they don't remove them, they really will get fined. And it's in that
00:26:29.480 supplementary legislation, in that statutory instrument, that the legal but harmful stuff
00:26:34.260 is going to be included. So it's likely, for instance, that social media companies will be told:
00:26:38.260 you have to remove misinformation as a matter of priority, you have to remove hate speech as a
00:26:44.080 matter of priority. It doesn't matter if it's lawful; if it's harmful according to this nebulous, open-ended
00:26:49.500 definition, you'll have to remove it. So that's where the real mischief is going to occur in this
00:26:55.000 supplementary legislation. But Toby, what Francis is getting at, and we'll get to the rest of your
00:27:04.460 point in a second, is: you know Boris Johnson, right? You studied together. We were
00:27:13.720 told, and look, neither of us is a big fan of his, by the way, that this is a liberal Tory who wants
00:27:13.720 to reduce the size of the state and keep us all free.
00:27:17.900 And then we had two years like we've just seen,
00:27:20.580 and now this.
00:27:21.280 How do you explain that?
00:27:23.800 I don't think he's a man unduly constrained by principle.
00:27:31.500 I mean, I think he has instincts,
00:27:35.100 but they can be quite easily overridden.
00:27:37.260 I mean, I think he has a libertarian kind of default response,
00:27:42.440 And I think that's partly why he didn't want to plunge us into lockdown initially, you know, in February, March 2020.
00:27:50.780 But they were quite quickly, you know, cast aside when he came under a lot of pressure from colleagues and officials, scientific advisors to lock us down.
00:28:00.600 You know, they proved to be quite fragile principles.
00:28:04.060 And I think in this case, you know, I don't think he's really applied himself to the bill.
00:28:09.280 I mean, quite often the reason he can be easily kind of bullied by kind of more authoritarian colleagues into abandoning his libertarian instincts is because he hasn't done his homework.
00:28:21.040 He hasn't mastered the brief.
00:28:22.380 They'll make a number of arguments and he won't know what the counter arguments are.
00:28:25.840 So if it comes up in cabinet, Nadine Dorries has sort of mastered this brief.
00:28:31.200 She'll say, no, Boris, no, Prime Minister.
00:28:33.440 It's going to actually help from a free speech point of view.
00:28:36.440 And she'll make all the arguments I just summarised.
00:28:38.220 And he won't know what the counter-arguments are.
00:28:41.660 But isn't that his job?
00:28:42.680 Well, I think when Michael Gove said back in, what, 2016,
00:28:49.380 that he wasn't temperamentally suited to be prime minister,
00:28:53.460 and that's why Michael Gove decided to throw his own hat into the ring
00:28:56.680 and withdraw his support,
00:28:59.020 I think that was what he was sort of getting at.
00:29:01.820 That he has quite a short attention span.
00:29:04.940 And he's not good at kind of sitting down and going through his boxes.
00:29:09.880 He's not really interested in the kind of nitty-gritty,
00:29:12.900 the minutiae of kind of policy documents and proposals.
00:29:16.900 And that's why, you know, I think Dominic Cummings refers to him as trolley
00:29:20.680 because, you know, his nickname behind closed doors in Downing Street is trolley
00:29:24.200 because he's like a wonky supermarket trolley that kind of veers around
00:29:27.480 according to whoever's pushing it or he's bumped into last, you know.
00:29:30.660 And maybe that's a slight exaggeration.
00:29:32.640 But I think that's the explanation for why someone like Boris, who styles himself this great kind of libertarian, Rabelaisian, Falstaffian, freedom-loving yeoman, why he is bringing in, his government is bringing in, this bill: because he's easily bamboozled, easily knocked off course, and his libertarian instincts can easily be submerged under all this kind of pressure.
00:29:59.760 I think that's probably the explanation for that. But just back to your Corbyn point, because the bill creates an opportunity for the Secretary of State to come up with these priority harms, and that's where the legal but harmful stuff comes in.
00:30:15.560 OK, Nadine Dorries may not go completely over the top when she brings forward her supplementary legislation and it's debated in the House of Commons.
00:30:23.820 It may be reasonable. We may think, well, actually, you know, the definition of hate speech in this statutory instrument isn't too bad, you know.
00:30:32.040 But what's to stop, you know, a successor?
00:30:35.380 You know, let's say Keir Starmer wins the next general election, not out of the question at this point,
00:30:39.640 and appoints Chris Bryant or Dawn Butler as Secretary of State at DCMS,
00:30:45.640 they would then have the power under this bill to bring forward another statutory instrument
00:30:49.960 identifying all these priority harms.
00:30:52.540 That's where we could really get into a kind of unbelievably censorious chilling climate.
00:30:56.700 All right, you have to legislate for the future possibility that your political opponent...
00:31:00.740 Yeah, this bill is not future-proofed against political change.
00:31:04.060 You know, it's bad enough under a supposedly liberty-loving conservative government.
00:31:08.560 What if we have something like a Corbyn-led Labour government who have absolutely no regard for free speech?
00:31:14.980 You know, they'll have the powers at their disposal to impose unbelievable amounts of censorship on what we can say online.
00:31:22.400 You know, it'll be at their fingertips thanks to this bill.
00:31:26.080 Hey, Konstantin, what razor do you use?
00:31:29.220 I used Russian blade, forged in fires of Ural mountains, hardened in the winters of Siberia,
00:31:35.240 purified in the river of the Volga, and sold over the counter at Boots in Moscow.
00:31:40.640 Is it good?
00:31:41.500 No, it sheds more blood than Uncle Vlad cracking down on political protest.
00:31:45.920 In that case, you need to try Harry's razors.
00:31:49.040 I use them myself, and look how clean and smooth my face is.
00:31:53.820 Yes, my face would also be that smooth if I had no testosterone.
00:31:57.060 Harry's is offering an incredible deal where you get a free trial set of Harry's products,
00:32:03.060 which includes one five-blade cartridge, foaming shave gel, travel blade cover and a free travel
00:32:10.900 size face wash. How much? It will only cost you £3.95. That's just £3.95, which
00:32:20.340 is the cost of the packaging and the posting. This is good deal. Yes, and all their products
00:32:26.740 are dermatologically tested and formulated
00:32:29.820 by Harry's experts.
00:32:31.460 Their skincare products are alcohol and cruelty free.
00:32:35.120 Alcohol free, is this supposed to be a good thing?
00:32:38.140 Harry's razors have a weighted handle,
00:32:40.240 not a cheap plastic one that you get
00:32:42.060 in some mainstream and supermarket brands.
00:32:44.760 This gives you greater control when you're shaving
00:32:47.600 and means you're less likely to cut and nick your skin.
00:32:51.040 Just head to harrys.com forward slash trigger
00:32:53.900 to have your set delivered and start your shave plan.
00:32:56.720 That's harrys.com slash trigger.
00:32:58.880 And it's just £3.95 for a trial set.
00:33:03.280 There are certain elements of the bill that I do agree with, Toby.
00:33:06.400 For example, the cyber flashing element of it.
00:33:08.640 I don't understand why people, especially women,
00:33:12.040 should be subjected to people sending pictures of their genitals, for instance.
00:33:16.020 You know, protecting kids from pornography.
00:33:18.760 You know, we've both been involved in education.
00:33:21.060 That is a massive problem that not enough people are talking about.
00:33:24.160 So there are elements of it which I'm actually on board with.
00:33:26.460 What do you think about that? I think I agree. I mean, I think there are some aspects of it
00:33:30.000 which are completely defensible, like, you know, prohibiting cyber flashing. Though
00:33:35.060 you get into a slightly grey area when it's cyber flashing between consenting adults.
00:33:39.240 That'll be prohibited too. So should we prohibit that? You know, why
00:33:45.080 shouldn't you be able to send a dick pic to kind of Mrs Foster, you know? Because I'm socially
00:33:50.940 conservative, that's why. That should only be kept for a darkened bedroom. Yeah, but I think
00:33:56.620 that the clauses of the bill which are intending to protect children, you know, it's hard
00:34:02.320 to argue with them. But it's the stuff which is intended to protect adults from harm, including
00:34:09.340 legal content, that's the kind of really pernicious, dangerous, censorious stuff that's going to have a
00:34:14.600 chilling effect on free speech. Why should the government be able to decide on my behalf what
00:34:21.280 I can and can't see if it's lawful? You know, it's one thing for Parliament to pass laws,
00:34:27.560 you know, in a democratic process, to prohibit certain things. That's one thing. You
00:34:33.860 can argue with that, maybe majorities shouldn't be able to impose their kind of moral values on
00:34:38.380 minorities and so forth, but, you know, that's one thing. But that isn't this. They're saying that
00:34:42.640 even if a law hasn't been passed prohibiting certain content on social media,
00:34:48.320 we will insist that social media companies remove it nevertheless,
00:34:53.220 under threat of being fined 10% of their annual global turnover.
00:34:57.880 And that's like, well, that's completely undemocratic.
00:35:01.360 And, you know, why shouldn't I be able to decide
00:35:04.460 what lawful content I'm able to see, you know,
00:35:08.640 on my phone, on my social media platforms?
00:35:11.000 Well, Toby, the argument, and for all my strong support for the idea of freedom of speech, of course, you know my position on it. At the same time, I think the harm side of these things is becoming more prominent, because as these networks become bigger and they connect more and more people, the concern is growing. And the example I'm about to give is not one in which I particularly share the analysis, but I can see a similar example in which that would be the case. The example
00:35:41.000 is the banning of Donald Trump at the very beginning of 2021.
00:35:45.500 And the argument from the social media companies was
00:35:48.280 this was an election in the most powerful country in the world
00:35:52.080 with tons of nuclear weapons, and it seemed to them,
00:35:55.360 I don't agree with this interpretation, but it seemed to them
00:35:58.000 that there was essentially an attempt to overthrow
00:36:00.400 the lawfully elected candidate by storming the Capitol.
00:36:04.740 Again, I don't agree with that interpretation,
00:36:06.860 but that's how they saw it.
00:36:08.000 And they felt that given the potential for huge harm that could come from lawful activity, it's not unlawful to say on Twitter that you think the election is illegitimate or whatever.
00:36:20.260 It's not against the law.
00:36:21.160 If you and I went down the pub, you could say that, right?
00:36:24.860 That's not against the law.
00:36:25.720 But the impact of someone with, I don't know, 70 million followers saying that over and over and over, they argue, caused people to get to that position where they are storming the Capitol and people die in the process.
00:36:39.720 Isn't that something that whether you like it or not, whether I like it or not, we do have to reckon with that ideas are powerful things and connecting billions of people together in a place where they can say things of that nature has real world impact.
00:36:54.600 And as a society, like it or not, we're going to have to have some kind of reckoning with that.
00:36:58.700 Well, I think I would accept that a limit should be placed on free speech whereby people
00:37:07.520 shouldn't be free to say things which are likely to result imminently in violence. You know, I
00:37:16.520 think that's a legitimate restriction on free speech, and maybe you could
00:37:21.500 argue that that was a good reason. That was the rationale for banning Trump from Twitter. What he
00:37:28.260 was saying on the eve of the attack on the Capitol was inciting violence. And if it was, then maybe
00:37:37.120 that was a good reason. But I think the difficulty with removing stuff, people saying we think the
00:37:43.100 election was stolen, for instance, after Trump's been removed from the platform, is that actually
00:37:49.400 that's, I think it's hard to make the case that that's likely to lead to imminent... Right, user
00:37:55.980 7345 saying the election was stolen isn't causing the Capitol. And I think that there was the
00:38:01.720 argument that, I think, Louis Brandeis, who was a Supreme Court justice, made in a famous case, I
00:38:08.360 think in the 1950s, in which he said the cure for bad information is not to suppress that
00:38:16.060 information; the cure is more information. Sunlight is the best disinfectant. And the problem with
00:38:21.120 banning stuff that you think is misinformation or is likely to cause insurrection is that,
00:38:28.740 by banning it, you're not reducing its toxicity. You know, you're forcing it underground, you're
00:38:34.400 forcing it into kind of darker corners of the internet. You know, people who are kind of
00:38:39.700 inclined to be triggered by that stuff are probably going to find it anyway. Isn't it better
00:38:44.220 that it should be on, you know, larger, more mainstream platforms where it can be challenged openly.
00:38:51.540 You know, if you think the election was stolen, you know, and you make that argument on Twitter and Twitter bans it,
00:38:59.820 that's going to persuade you that you're right and that's why it was banned.
00:39:03.120 It's going to persuade other people seeing that ban that there's actually something to it because otherwise, why would they ban it?
00:39:08.660 They're obviously scared they've got something to hide.
00:39:10.380 Whereas if you let them say it and then rebut it with a kind of overwhelming number of kind of facts with a kind of mountain of evidence to show it isn't true, then it's likely to kind of lose a lot of its power and lose a lot of its influence.
00:39:24.080 And I think that sunlight is the best disinfectant.
00:39:26.120 And that's a really good argument against suppressing even dangerous misinformation.
00:39:30.520 I agree with you in principle.
00:39:32.900 I really, really do.
00:39:33.840 And I'm obviously playing devil's advocate and trying to explore the argument here.
00:39:37.500 But one of the things we do see on social media is that isn't what happens. When somebody says, I think this, it's not that there's an overwhelming barrage of counter information. It's that everyone's in their own silo now. And what happens is people who think A and people who think B are just in different spaces talking to themselves now.
00:39:58.640 And so you're not getting that traditional idea of, if we're all in a meeting with 300 people and
00:40:04.760 you stand up and you go, I think COVID is caused by 5G and we need to go and blah blah blah, and
00:40:10.720 200 other people stand up and go, you're an idiot and here's why, and blah blah blah. That's what
00:40:16.380 you're talking about. But that's not what happens on social media. What happens on social media is
00:40:20.840 one person stands up and says something, 200 other people who already agree agree, and they never see
00:40:26.500 the people who don't agree with them. I think that's a slight caricature. I mean, you know, we're
00:40:30.980 all on Twitter. Sometimes you do see, you know, quite grown-up, kind of well-informed, quite
00:40:37.060 sophisticated, nuanced debates taking place, of course, between people on completely opposite
00:40:41.720 sides of big contentious issues. And, you know, you see it on Facebook too, you see it
00:40:47.140 on YouTube. Of course there are kind of people who don't want to hear the other
00:40:51.780 side, but I think it's hard to blame social media for that. I think, you know, we've
00:40:55.820 naturally just become a more kind of politically polarised society, both here and in America and
00:41:00.760 elsewhere. And, you know, social media may have accentuated that, but I don't
00:41:05.180 think it's the root of the problem. I think there's another argument too, the argument against
00:41:09.740 banning hate speech. So the argument for banning hate speech is that if you don't, you allow these
00:41:17.000 kind of toxic flowers to bloom and they could result in kind of unpleasant people getting
00:41:24.680 elected, populist revolts, and so on and so forth. But in Jacob Mchangama's recent book on free
00:41:32.880 speech, which was a history of free speech, he makes this really good observation about Weimar
00:41:39.140 Germany. And you probably know this, but in Weimar Germany, various forms of hate speech,
00:41:43.180 such as anti-Semitism, were criminalised. And various kind of prominent Nazis actually spent
00:41:50.800 time in jail for producing, you know, publishing anti-Semitic material. Hitler was banned for large
00:41:58.680 parts of the 1920s in various German states. And none of that did anything to undermine the kind of
00:42:09.780 appeal and growth of the Nazi party. On the contrary, it allowed them to cast themselves
00:42:15.260 as martyrs at odds with this kind of oppressive state that was kind of suppressing what they had
00:42:21.580 to say, because it was true. You know, it had the opposite of its intended effect. I think that's
00:42:25.900 generally: if you do try and suppress these right-wing toxic points of view, you don't
00:42:32.020 stave off another kind of Nazi victory at an election, you make it more likely. And we keep
00:42:39.660 using the word harm. How do they define this word? Well, I think I've
00:42:46.200 got the definition, I wrote it down here. It's, yeah, psychological harm amounting to at least
00:42:52.980 serious distress. But unless... See, this is the problem, because what are you going to do? Everybody
00:42:59.860 who then makes a complaint, are you going to have a psychiatrist come in and give them some kind of
00:43:04.660 psychiatric assessment? How do you define this? No, there's no requirement for a clinical
00:43:10.880 assessment. Now, I think it'll be subjectively defined. If someone claims that a post that
00:43:16.760 they've read has caused them extreme psychological distress, then you'll have to take that at face
00:43:23.020 value. But let's flip it over and look at it another way. So my mother's Venezuelan.
00:43:29.400 Socialism has been a disaster in my country. One of my cousins literally drinks rainwater from
00:43:34.880 the tank on top of his house, because the running water in his house is so polluted
00:43:40.200 that if he drinks it, it could very literally make him incredibly ill, even kill him. So what if someone
00:43:47.040 puts a post advocating socialism on Facebook? I go, I'm distressed, take it down. Well, I guess this
00:43:53.180 is where the protections for content of democratic importance come in. I mean, this is one of
00:44:00.920 the great, Nadine Dorries and the defenders of the bill make great play of this. If you
00:44:05.540 want to make a political argument, it doesn't matter if it causes people extreme psychological
00:44:10.160 distress, it'll be protected because it's content of democratic importance and we don't
00:44:14.460 want to censor those important democratic debates. The question becomes, well, who gets
00:44:19.920 to decide what's content of democratic importance and what isn't? You know, there was that recent,
00:44:25.520 the first Maya Forstater employment tribunal, in which the judge used this
00:44:31.380 very sinister phrase, in which he said that gender-critical views weren't deserving of respect in a
00:44:37.760 democratic society. So, you know, according to that judge, the member of the employment
00:44:43.700 tribunal, he didn't think that was content of democratic importance.
00:44:48.820 That judgment was overturned in part by the Employment Appeals Tribunal, and now the Employment Tribunal is taking place again, and we don't know what the outcome of that will be.
00:44:59.300 But, you know, for a figure of considerable authority to declare, no, that isn't content of democratic importance, because gender-critical beliefs are not deserving of respect in a democratic society, you can imagine that kind of argument being made over and over again.
00:45:15.340 And, you know, how much do we trust Ofcom to get that kind of thing right?
00:45:19.740 Don't forget, Melanie Dawes, the chief executive of Ofcom, said a couple of years ago that she didn't think that it was appropriate to feature anyone from the LGB Alliance on a BBC News or ITV News discussion about the Gender Recognition Act because their views were beyond the pale.
00:45:37.420 You know, and okay, she's revised that position now, but how confident can we be that Ofcom isn't
00:45:43.160 going to make similar errors of judgment when it comes to deciding what issues are deserving of
00:45:48.940 this protection because they're of democratic importance and what aren't? You know, I certainly
00:45:53.680 don't trust them. I'd like personally to be able to make that decision myself and not trust a
00:45:57.900 regulator to make it on my behalf. I mean, there's another problem as well with protecting journalistic
00:46:02.320 content: who gets to decide who's a journalist, right? Are we journalists? Is this
00:46:06.420 journalistic content? You know, we'd have to ask Ofcom. Are they going to come up with,
00:46:11.560 what might they say about that? Well, they could come up with a list of kind of approved... Will
00:46:15.960 you have to be regulated by IPSO or IMPRESS? You know, and if you do, that's a way of bringing in
00:46:22.040 state regulation of the press. You know, until now, regulation of the press has been voluntary,
00:46:28.380 and there was a huge battle that me and others fought to prevent it becoming, you know, regulated
00:46:35.020 by a state regulator in the aftermath of the Leveson inquiry.
00:46:38.980 And we won that battle, but this looks like that's coming back now
00:46:42.260 through the back door.
00:46:43.540 If, in order to be entitled to this protection
00:46:47.920 for journalistic content, you have to be registered in some way...
00:46:51.420 You have to satisfy Ofcom.
00:46:52.600 You've got to be on the register.
00:46:55.620 What's your point of making?
00:46:57.240 The register of problematic YouTubers.
00:46:59.140 I think I've been reassured that there won't be kind of a list
00:47:03.900 of approved, you know, journalists and a list of kind of dangerous journalists; that all you'll
00:47:10.140 have to do to satisfy Ofcom that you are a legitimate provider of journalistic content
00:47:16.280 is have a proper complaints policy and an opportunity for people who think they've, you
00:47:21.600 know, been misrepresented to reply. You have to offer people a right of reply, you have to have
00:47:25.740 a proper policy about how you deal with complaints and factual corrections, and you have
00:47:31.960 to correct them and so forth. You don't have to be regulated by, you know, a
00:47:37.480 regulator like that. That's what they say, but, you know, all these things can be changed as
00:47:41.500 soon as the bill becomes law by a small amendment. So that's all quite scary. I'm very aware
00:47:45.980 that you have to run, so we won't hold you too much longer. But the way you're talking
00:47:51.720 about it sort of sounds to me like you've accepted that this bill is going to pass, right? Well, what
00:47:56.840 can people do about this before we let you go? Do they write to their MP? Do they,
00:48:02.540 you know, find out more? Like, what do we do? Well, I think politically it is going
00:48:09.880 to happen. I think it's a huge mountain to climb to actually derail this now. I mean, maybe
00:48:16.420 that's not impossible, but it feels like that's kind of unrealistic. It's going to happen. There's a
00:48:22.120 lot of political momentum, there's a lot of support for the bill in the House of Commons, even though
00:48:26.240 the Conservatives have an 80-seat majority, go figure. So what can we do? Well, we can try and
00:48:30.960 improve it. And what the Free Speech Union is doing is working with a number of lawyers and
00:48:35.100 other lobby groups in this space to try and come up with a raft of amendments to make the bill
00:48:40.520 better. So I referred earlier to this duty that the bill will impose on social media companies to
00:48:46.020 have regard for freedom of speech when making a decision about whether to remove, you know,
00:48:51.000 harmful content. But have regard is the weakest of the legal duties. So, you know, Twitter could
00:48:56.620 say, we've been told that JK Rowling's posts are causing serious psychological distress to trans
00:49:04.040 people. We've considered the free speech implications of banning JK Rowling from the
00:49:09.500 platform and entirely dismissed them without a second thought. That would be satisfying the
00:49:16.060 duty to have regard for freedom of speech. It requires you to do no more than just think for
00:49:20.980 a second about the free speech implications and then entirely dismiss them. So one thing we want
00:49:25.740 to do, the biggest amendment, the one we really hope we'll get through, is to strengthen that
00:49:30.540 duty. So it becomes something just a little bit more onerous, like have particular regard for
00:49:36.580 freedom of speech. We want that duty to be on the same, have the same status, the same legal force
00:49:42.620 as the duty of care. And if we can do that, that would go some distance towards improving the
00:49:47.340 bill. I think another way, you could say, let's amend the new
00:49:51.900 communications offence, so that in order to convict someone under this new harmful communications
00:49:56.960 offence, you have to show that the person in the likely audience that was seriously
00:50:02.500 psychologically distressed by this didn't consent to it. So if you build a consent
00:50:09.360 clause into that, a person can't be prosecuted if all the members of the likely
00:50:13.760 audience, the people they thought would end up receiving the message, had all consented
00:50:18.080 to it, or he had reasonable grounds for assuming they consented to it. And then you can build in
00:50:21.800 a bit of what I was talking about earlier, with people deciding for
00:50:27.120 themselves what they can and can't see. And you could say, look, I just assumed that everyone
00:50:31.900 that was going to see this message had consented to being in the kind of
00:50:37.180 Wild West category, and if they're not in that category, then it shouldn't have been sent to
00:50:41.480 them, it shouldn't have been passed on to them. That's the person you should go after, not me.
00:50:44.200 What does this mean for comedy, Toby? Well, Nadine Dorries actually gave
00:50:53.680 Jimmy Carr's joke about gypsies and the Holocaust as an example of legal but harmful
00:51:00.960 content that this bill would prohibit. So, you know, she actually homed in on a particular joke
00:51:07.200 and said, that's the kind of thing this bill is going to put a stop to. So yes, comedians should
00:51:13.580 be seriously concerned. So hang on a second. So people will be able then to, let's say they watch a
00:51:19.280 Jimmy Carr special, they will then be able to go after Netflix as a result of this? Well, I think
00:51:25.620 Netflix will be covered under a new, similar bill, which will apply a similar regulatory regime to
00:51:31.080 on-demand. Let's translate it into Facebook, Twitter terms. If someone took that clip, yeah, and put it on
00:51:37.680 Twitter, on Facebook, and shared it? Yeah. They could be liable to be prosecuted? Well, the
00:51:44.160 platform which didn't remove it as soon as someone asked for it to be removed, they would
00:51:49.420 be liable for prosecution. So they would never let that be published? Never published. No. Right.
00:51:54.280 So we're getting to the point where the government
00:51:57.200 is going to censor comedians' jokes, basically.
00:51:59.460 Yeah. Well, Ofcom will be in the business
00:52:01.540 of censoring jokes, yes.
00:52:04.100 Yeah, and I've met Melanie Dawes
00:52:05.700 and she doesn't strike me as having a particularly good
00:52:07.560 sense of humour.
00:52:09.620 That's reassuring.
00:52:11.020 But, I mean, that's...
00:52:13.440 That could be unfair, actually.
00:52:15.000 Forget about her... I don't give a shit about her sense of humour.
00:52:17.640 We're talking about a government body
00:52:19.600 or a state body
00:52:21.100 deciding what jokes are allowed to be told. Yeah. Which is ridiculous. Yeah. And the critical thing
00:52:29.660 is the government, through Ofcom, the state, will be deciding what jokes it thinks you should be
00:52:37.080 able to hear and what jokes you shouldn't. It's putting itself in the business, it's like the
00:52:42.860 Lord Chamberlain of old deciding what plays can be put on in the West End and what can't, because
00:52:47.000 some people have to be protected. What if your servants read Lady Chatterley's Lover? What effect
00:52:52.020 would it have on them? The state is resurrecting this responsibility to protect people from their
00:53:00.360 own worst instincts, to stop them from seeing content that is legal but could be harmful. They're not
00:53:06.020 the best judges of whether that's going to harm them, cause them psychological distress or not,
00:53:10.600 what effect it's likely to have on their behaviour and their families. No, we are the best judges of
00:53:14.880 that, and we get to decide. And so where is Netflix in all of this, then? Well, I think there's
00:53:20.520 a separate media bill which will regulate on-demand streaming services like Netflix. I don't
00:53:28.980 think Netflix is within scope of this particular bill. I think YouTube is, but Netflix isn't. But
00:53:33.280 because of that, they've got this other bill ready, which is going to kind of
00:53:37.760 mean that Ofcom can regulate Netflix, you know, Apple TV+, Disney+,
00:53:44.100 etc. So eventually it's going to be the government dictating what you can and can't say online, what
00:53:50.700 you can and can't laugh at. Yeah. And the interesting thing is that you'll be able to
00:53:56.200 make the dangerous joke on stage, provided someone doesn't film it and
00:54:01.380 then put it up online. You just won't be able to make the same joke online. No, stuff which
00:54:06.440 it's lawful to say won't be lawful to type after this bill. But then that has an effect on people's
00:54:12.640 psyche, because we all now spend so much time online, right? So that is going to then filter
00:54:17.280 into real life. Well, there is a quick fix: a VPN. You know, if you can access
00:54:23.780 the internet using a VPN which places you, you know, in Florida rather than London, then you'll
00:54:31.840 be able to see all this unregulated content. I mean, what it will mean effectively is that if you're
00:54:35.520 accessing all this content from the UK, or rather from England, and it depends where they are in
00:54:41.780 the UK, as we said earlier, but if you're here and you're trying to access that content, there's lots
00:54:45.720 of stuff you won't be able to see that you would be able to see if you lived somewhere else, almost
00:54:49.420 anywhere else, but particularly in America. So I think there'll be a massive surge
00:54:54.780 in the use of VPNs, because people just won't want the government to protect them in the way
00:55:00.940 it's proposing to do. Well, that is quite a sobering thought. Toby, listen, thank you so much for coming
00:55:07.560 back. I'm sure this is an issue we'll be talking more and more about, because I find that really,
00:55:11.920 really, genuinely troubling. It is. It's really scary. It's the greatest blow to freedom
00:55:17.640 of speech, I think, of our generation, and it's a really significant battle. And I would just urge
00:55:23.040 people who want to join this battle to join the Free Speech Union and help us wage that battle.
00:55:28.400 Toby, thank you so much. We're going to ask you a couple of questions from our supporters, for our
00:55:33.600 supporters, before we let you go. But thank you for joining us, and thank you for watching and
00:55:37.460 listening. We'll see you very soon with another interview or our show. All of them go out at 7
00:55:42.360 p.m. UK time, while we're still allowed. Exactly. And for those of you who like your TRIGGERnometry on the
00:55:47.600 go, it's also available as a podcast, and that's probably going to be fucked as well. Take care
00:55:53.020 and see you soon, guys. We have to persuade young people, for social justice reasons, as a wokester,
00:56:00.560 you should be defending free speech.