Why Britain Arrests 30 People EVERY DAY For Speech
Episode Stats
Length
1 hour and 8 minutes
Words per Minute
166
Summary
In this episode, we talk to Lord Young, founder of the Free Speech Union, about the impact of the government's new online censorship legislation, the Online Safety Act, and whether it will ever be repealed. We also hear why he thinks the legislation is a bad idea and why it needs to go.
Transcript
00:00:00.000
When I first started the Free Speech Union, I didn't think it could get any worse. Hopelessly
00:00:06.560
naive. And actually, all we've been able to do is kind of hold back the incoming tide
00:00:11.520
a little bit. So that's over 30 people a day being arrested.
00:00:16.520
Are you worried that this is basically a way of censoring people so that they don't challenge
00:00:22.120
It's a kind of willed naivety and a kind of avoidance of confronting their own policy
00:00:28.920
I think Sir Keir Starmer, Peter Kyle do genuinely think that the Online Safety Act doesn't go
00:00:36.080
nearly far enough. So I think in all likelihood, the Online Safety Act is not going to be repealed.
00:00:42.260
It's not going to be improved. It's just going to get worse and worse.
00:01:44.880
Lord Young, Director and Founder of the Free Speech Union. Welcome back to Trigonometry.
00:01:50.920
It's good to have you on. Obviously, we want to talk about the Online Safety Act, which has
00:01:54.940
effectively come into force in recent days, and whose impact has taken most people by
00:02:02.480
surprise. Let's talk about it. What is it? How was it introduced? And why are
00:02:09.360
people now, you know, you see lots of people talking about using VPNs, and there's lots of
00:02:14.940
online censorship effectively happening in this country. Talk to us about the Online Safety Act.
00:02:19.280
So, the Online Safety Act is a piece of legislation that was partly prompted by a moral panic
00:02:28.360
about children self-harming and, in some cases, committing suicide as a consequence of things
00:02:36.540
they'd seen online. And it was first introduced by Sajid Javid when he was Home Secretary, I think
00:02:44.180
under Theresa May. And he said that his ambition was to make the UK the safest place in the world
00:02:53.160
to go online, which, on the face of it, sounds quite anodyne. But when you think about it, it's
00:02:59.220
like, well, what does that mean? Safer than North Korea? Safer than China? And what does safety mean
00:03:05.520
in this context? Does it mean the most heavily censored? And that was the original way to pitch
00:03:10.740
it, but always linked to child safety. And it became more explicitly about protecting children,
00:03:15.400
or at least the way to sell it became more explicitly about that under Boris's government.
00:03:20.200
Protect them from suicide content and pornography, right?
00:03:24.020
Yeah, protect them from pornography, from self-harming websites, from websites promoting
00:03:31.160
anorexia. The idea, or the premise, is that children are at enormous risk from being
00:03:40.980
exposed to unregulated content online, exacerbated by algorithms, and it's leading to a kind of
00:03:50.180
epidemic of self-harm, including suicide. So that was the kind of moral panic which prompted it.
00:03:55.900
And that was very much the way it was sold under Boris's government. This is really a bill about
00:04:03.220
protecting children from the Wild West that is the unregulated social media environment.
00:04:10.460
But Toby, isn't that quite a noble aim, let's be fair? Because you do look at some of the content
00:04:16.540
that kids are accessing. You know, 11-year-olds have got a smartphone, which basically gives
00:04:22.240
unfettered access to however much pornography they want. That's surely not a good thing.
00:04:27.860
Yeah, I think it's hard to object to the ambition of wanting to protect children.
00:04:34.700
When the bill was originally drafted, I mean, it's gone through many iterations, it wasn't just about
00:04:43.480
protecting children from legal but harmful content. It was also about protecting adults
00:04:49.520
from legal but harmful content. And there was this, the words legal but harmful never actually
00:04:53.360
appeared in any of the iterations of the bill. But the form it took was a clause which enabled
00:05:00.560
the then Secretary of State at DCMS to draw up a list of legal but harmful content, which social
00:05:10.040
media companies, search engines, user-to-user providers would be under a particular obligation
00:05:16.580
to remove. At the Free Speech Union, we thought that was very dangerous.
00:05:22.480
Why should people, why should adults have to be protected from legal but harmful content? And in
00:05:28.160
particular, why should a senior politician be tasked with designating what content social
00:05:34.540
media companies in particular are under a particular obligation to remove on pain of being fined up to
00:05:40.340
10% of their annual global turnover or £18 million by Ofcom, whichever is higher. I mean, a pretty
00:05:46.380
savage penalty for non-compliance. So we pushed back hard against that. One reason for pushing back was
00:05:53.720
that seemed to be a breach of a sacrosanct principle of English common law, which is that unless something
00:05:59.820
is explicitly prohibited, then it's permitted. That's a kind of, you know, cornerstone of English
00:06:05.300
liberty and very different from the Napoleonic Code, whereby unless something's explicitly permitted,
00:06:11.880
then it's prohibited, which tends to be the kind of basis for much continental law. So that was one of
00:06:16.640
the fundamental linchpins of English common law, an important foundation of English liberty. And this
00:06:22.360
seemed to be a departure from that, the creation of a new grey area, whereby it's legally permitted,
00:06:27.060
but nevertheless, in some sense prohibited. And the same applies to non-crime hate incidents.
00:06:32.620
So we pushed back hard against that. And so did other kind of free speech advocacy groups. And
00:06:38.700
the Tories, to be fair, dropped that. They dropped that clause. But they kept in the obligation on
00:06:45.900
providers to protect children from legal but harmful content. And what happened recently was that the
00:06:52.560
clause requiring providers to protect children, to put an age gate on their content that they think
00:07:00.940
isn't suitable for children, came into force. And of course, it led to an enormous amount of
00:07:07.500
over-removal. So social media platforms like X ended up age-gating blog posts about Richard the Lionheart,
00:07:17.200
speeches by Katie Lam about the grooming gangs, footage of someone being arrested at an anti-migrant
00:07:22.740
protest in Leeds. And that obviously raised questions. Well, why do you need to be 18 to see this
00:07:29.540
content? And particularly if Labour are going to lower the voting age to 16. I mean, this is kind
00:07:36.200
of democratically important content. How are 16 and 17 year olds going to make informed decisions at the
00:07:40.920
ballot box when they get the vote in 2029, if they're not allowed to access this content until
00:07:45.640
they're 18? It seemed kind of ludicrous. But we always feared that this obligation to protect children
00:07:52.820
from legal but harmful content would lead to over-removal. Companies are very concerned about the
00:07:59.080
penalties they'll face if they don't comply. In some cases, non-compliance can lead to executives
00:08:03.580
going to jail. And that is one of the more kind of authoritarian clauses in the Act. So of course,
00:08:10.940
they're terrified of not complying with all the requirements of the Act. And as you've probably
00:08:17.760
seen, it's almost 300 pages. And it's unbelievably dense and complicated. So for most kind of tech
00:08:25.040
company executives, of course, they're going to kind of err on the side of extreme caution and
00:08:30.260
over-remove content. And that's exactly what's happened.
00:08:32.700
One of the interesting things I found when I was reading up on this is,
00:08:37.600
they talked about free speech guardrails, didn't they? And implementing them in the Act itself.
00:08:45.040
That really hasn't worked out quite the way we expected it to.
00:08:49.660
Yes. And one of the free speech guardrails, one of the original guardrails in an earlier iteration
00:08:56.680
of the bill was that social media companies, when deciding what content to remove to comply
00:09:03.220
with the Act, had to have regard for free speech. And talking that over with kind of various MPs,
00:09:11.600
they kind of laughed, said having to have regard for something is like the lowest possible legal
00:09:18.300
duty, which is always trumped by other legal duties. So it was almost pointless to include that.
00:09:22.120
So the Free Speech Union lobbied to have that changed to 'have particular regard', which is kind
00:09:28.520
This is the most British fight in history, isn't it?
00:09:31.080
It is. One notch up. But that duty currently applies. And we've seen from the over-removal of
00:09:39.880
the last few weeks that clearly that isn't having much impact on deterring platforms like X and
00:09:46.820
Facebook from over-removing. But there's another reason why the safeguards haven't had much
00:09:54.840
impact. So two additional safeguards were that for category one providers, and most big social
00:10:01.600
media platforms were going to be designated as category one providers, not by the Secretary of State
00:10:07.600
at DCMS but, now it's changed, by the Secretary of State at the Department for Science, Innovation and
00:10:12.940
Technology: Peter Kyle, a good friend of Sir Keir Starmer's. So he was tasked, he's supposed to
00:10:19.540
say which providers are designated as category one providers. And once that designation has
00:10:26.240
happened, category one providers then have these additional free speech duties. They're not supposed
00:10:30.780
to remove content of democratic importance or journalistic content. But because I think Wikimedia
00:10:38.980
are concerned about Wikipedia being designated as a category one provider, they have legally challenged
00:10:46.440
the criteria that Peter Kyle is going to be using to designate which platforms are category one,
00:10:54.460
which are category 2A and which category 2B. And so for the time being, Peter Kyle hasn't done that.
00:11:01.960
So that designation is yet to take place. Because that hasn't taken place and social media companies don't
00:11:07.380
know what category they're in. Those two additional safeguards to protect content of democratic importance
00:11:13.580
and journalistic content haven't kicked in yet, which is another reason why we've seen so much
00:11:19.000
over-removal in the past few weeks. We can get into this, but the reason Wikimedia is legally challenging
00:11:25.000
the government's criteria for designating what sites are in category one, the reason it doesn't want
00:11:30.820
to be in category one is because when you're in category one, you have to carry out these pretty
00:11:34.960
onerous annual risk assessments. And in the course of doing that, you have to disclose exactly who the
00:11:40.520
content creators are on your particular website. You have to actually share their identity with
00:11:46.640
Ofcom. And Wikipedia doesn't want to do that because a lot of its editors are anonymous. I mean,
00:11:52.640
they have various ways of, you know, making sure that they're respectable, they're not bad actors and
00:11:58.580
so forth. But nevertheless, they don't want to share their identities with Ofcom. So that's why
00:12:04.740
they're legally challenging this set of criteria. They don't want Wikipedia to be designated a category
00:12:11.080
one provider. And they've effectively said that if it is, we might have to geoblock Wikipedia in the UK,
00:12:18.200
or at least limit how often users are able to access Wikipedia, which is one of the many
00:12:24.380
unintended consequences of the Online Safety Act. Every cloud, Toby. Right. If you see what's
00:12:29.820
happening to my Wikipedia page, I'm totally on board with this. Mine looks like it was written by
00:12:33.180
Owen Jones. Yeah, exactly. We're joking, of course. But actually, Toby, I think you touched on something
00:12:38.540
very important, which is that I think, as Francis said, I think we'd all want children not to be able
00:12:44.120
to access pornography. I think we'd all want children not to be able to access really harmful content.
00:12:49.580
Although the link between suicide and watching things, I think, is very tenuous. And it's probably
00:12:54.180
quite the other way around. People who are already suicidal seek out content like that to verify and
00:12:59.740
confirm what they already feel, as opposed to, you know, a perfectly happy child suddenly discovering
00:13:04.040
something. But anyway, the issue is, it seems to me, that the problem with all of this is that once
00:13:11.960
you start introducing a mechanism by which people have to prove that they're over 18, then you've
00:13:18.780
effectively removed anonymity from the internet, which has very important uses for people, you know,
00:13:26.180
A, to be able to speak freely, who aren't necessarily able to speak freely. I mean, the Free Speech Union,
00:13:30.940
which you founded, represents many people who've been punished for speaking freely. And this will take
00:13:36.060
that away from them. But it effectively ends anonymity in general on the internet if we are constantly
00:13:41.700
having to prove that we're over 18, especially when, as you say, it currently extends to journalistic
00:13:47.780
content or just covering footage of protests or whatever else it might be. So are you hopeful
00:13:53.860
that once this process has taken its course, these overreaches will be rolled back? Or are you worried
00:14:02.580
that the slippery slope is going to remain slippery?
00:14:04.560
No, I think when I first started the Free Speech Union, back in 2020, I naively thought that things had
00:14:13.900
reached an absolute nadir, and could only get better. And one of the jobs of the Free Speech
00:14:21.860
Union would be to try and repeal the fetters, the legal fetters on our free speech. I didn't think it could get
00:14:28.400
any worse. Hopelessly naive. And actually, all we've been able to do is kind of hold back the incoming
00:14:35.540
tide a little bit. So I think in all likelihood, the Online Safety Act is not going to be repealed.
00:14:41.740
It's not going to be improved. It's just going to get worse and worse. I mean, I think you make a good
00:14:47.300
point about the risks of having to verify your age. And I mean, as you say, anonymity is a really
00:14:57.640
important safeguard. It's a really important safeguard for Muslim apostates, for instance,
00:15:02.480
whose lives will be at risk if they're identified, if they're criticizing Islam or talking about having
00:15:09.160
left the Islamic religion, you know, the penalty for which is death. People in oppressive countries
00:15:16.800
like China and North Korea, particularly refugees from those countries, they rely on anonymity to
00:15:23.620
protect themselves from, you know, hostile secret services. And then the age verification
00:15:30.080
sites say, oh, no, don't worry, you know, we'll keep your information confidential. There's no risk that
00:15:35.680
it'll ever escape our servers. And it's like, you know, the Ministry of Defence, which you'd think would have
00:15:40.320
pretty good secure protocols about protecting privacy, recently inadvertently leaked the names of
00:15:45.520
33,000 people who collaborated with the British Army in Afghanistan to the Taliban. So, you know,
00:15:50.920
I'm not sure how much confidence you should have in these age verification companies, particularly as
00:15:56.300
a lot of them have kind of sprung up in, you know, the last few weeks, because everyone's kind of
00:16:00.920
frantically trying to verify their age so they can access, you know, Katie Lam's speeches in
00:16:06.260
the House of Commons or whatever it might be. I mean, I think going back to the fear of children being
00:16:15.500
exposed to harmful content. And, as you say, I think there is this tendency
00:16:23.060
on the part of the paternalistic authorities to believe that when children do engage in self-destructive
00:16:33.700
behaviour, or when they engage in violent behaviour, particularly sexually violent behaviour, you know,
00:16:39.200
the cause, the fundamental cause is exposure to kind of toxic, harmful content on particularly
00:16:47.840
social media. And that seems to me to be kind of psychologically pretty naive. It's a pretty kind
00:16:54.000
of impoverished analysis. And you saw it in Adolescence, you know. I mean, first of all, it was ludicrous
00:17:02.420
to imagine that the greatest threat of sexual violence to young women was posed by white working
00:17:10.980
class boys brought up in kind of stable, two-parent working class households. I mean, we know that's
00:17:16.780
not true. And in addition, the analysis was, well, the reason he's engaging in this behaviour is because
00:17:22.760
he's been exposed to the manosphere, you know, which is completely, I mean, a pretty threadbare
00:17:28.840
analysis of why, you know, women and girls are more and more vulnerable to sexual assaults in
00:17:36.040
contemporary Britain. But there is this kind of naive kind of belief that, you know, the internet,
00:17:46.080
social media, user-to-user services are the main cause of all our problems. And, you know,
00:17:54.160
we didn't have any of these problems 15 years ago, you know, before this
00:17:58.700
horrible apparatus kind of got into our souls. I mean, it's just, I guess
00:18:04.820
it's partly a form of avoidance, not wanting to think more deeply about what
00:18:09.520
the real causes of these long-running societal problems are. Well, knowing what they are and refusing
00:18:14.080
to address them and then censoring the people who are attempting to address them. Yeah.
00:19:58.280
And I guess another reason to push back a bit against this kind of argument that we have a kind of
00:20:10.580
the state has a duty to protect children from exposure to harmful content is that, well, isn't
00:20:17.120
OK, maybe the state needs to do a little bit about that, but isn't that really the responsibility of
00:20:21.580
parents? And why should the state be taking on the role that really parents should be discharging?
00:20:29.020
I think there's a technological dimension to this, Toby, which is a lot of parents don't have the
00:20:32.740
first clue of what the internet is or how to use it. And I think I understand why people are concerned.
00:20:38.520
I've told the story before, I think. Not last time, but one of the times I was on Question Time,
00:20:45.040
you know how they don't record the first question. There's a practice question before the full program.
00:20:50.160
And there was a question about Donald Trump being removed from Facebook. This was several years
00:20:55.860
ago. And the Labour representative on the panel said we must have the safest internet in the world,
00:21:02.660
very much speaking to your point earlier. And I think for a lot of people, particularly when it
00:21:06.800
comes to children, look, the safety of children is very important, of course. And I think we're both
00:21:12.820
parents, we'd certainly share that view. But my sense is that this is kind of being used now.
00:21:20.160
To achieve other ends, to suppress dissent. I mean, you had an article on one of your blogs earlier
00:21:27.160
this week talking about the fact that the government is introducing a policing unit to monitor social
00:21:32.820
media for signs of dissent and protest moods. I mean, I think we'd all recognize that the country is kind
00:21:39.720
of at a fever pitch at the moment with protests outside migrant hotels and so on. Are you concerned that
00:21:46.400
part of this is simply about preventing people from expressing their genuine fears and concerns
00:21:53.500
about what's happening in the country? And it's a way of, you know, they will argue it's preventing
00:21:58.420
real world harm. But a lot of people would argue it's a way of preventing the expression of concern
00:22:04.060
about government policy from happening. Are you worried that this is basically a way of censoring
00:22:10.740
people so that they don't challenge the government? Well, I think the government,
00:22:17.840
I think Sir Keir Starmer, Peter Kyle and other senior members of the government
00:22:23.940
do genuinely think that the Online Safety Act doesn't go nearly far enough, that the legal but
00:22:31.040
harmful to adults clause should never have been removed. And Morgan McSweeney's
00:22:39.900
pro-censorship lobby group, the Center for Countering Digital Hate. I say Morgan McSweeney's,
00:22:45.540
he's Keir Starmer's chief of staff, because he was the founding director. I don't think he's any
00:22:50.040
longer involved, but they held a special summit during the riots that followed the
00:22:55.780
Southport attack last summer. And they concluded the Online Safety Act needs to be toughened up.
00:23:02.000
We need to reinsert the clauses that were removed when it was going through parliament.
00:23:07.300
Ofcom needs more powers to censor legal but harmful content which adults are exposed to.
00:23:13.900
And Peter Kyle in particular needs to be able to stipulate. I think the
00:23:19.180
kind of safeguard was that in an emergency, Peter Kyle needs these special
00:23:24.860
powers to direct Ofcom to remove legal but harmful content from being able to be seen by
00:23:31.360
adults. But of course, you know, in an emergency, I mean, as far as the left is concerned, there's
00:23:36.220
always an emergency of one kind or another. Emergencies are always being invoked to kind of
00:23:43.260
Toby, before you go on, one thing I will just interject is, we interviewed the director,
00:23:49.560
then director, I don't know if he's still there, of the Center for Countering Digital Hate.
00:23:52.980
It was one of the most bizarre interviews that we've ever done. What's his name? I keep forgetting.
00:23:59.000
We interviewed him. Uh, people can go and watch that interview. It is one of the most,
00:24:03.420
like, weird interviews we've ever done because for 50 minutes, we had a very reasonable discussion.
00:24:08.920
But then the moment we started to test some of his arguments, he just became bitterly, personally
00:24:13.780
antagonistic and rude. Uh, and it kind of showed you where some of these people are coming from.
00:24:19.020
Yeah. Well, I think that was, that, that interview was revealing and his reaction to being challenged
00:24:23.600
was revealing and we'll get onto that in a second. But I think the government
00:24:31.200
were probably pleasantly surprised that lots of content which fuels
00:24:39.160
an anti-immigration narrative, such as Katie Lam talking about the grooming gangs,
00:24:45.160
an X account which published transcripts of the grooming gang
00:24:50.640
trials, footage of an anti-immigration protester being arrested in Leeds. The government was probably
00:24:55.860
pleasantly surprised that that stuff was suppressed by X just on the grounds that it needed to verify
00:25:02.860
that you were 18 before you were able to see it. And until you could, you couldn't see it.
00:25:05.960
But I think they do want to increase the powers that Ofcom has to suppress
00:25:13.680
legal but harmful content that supposedly fuels an anti-immigration narrative. Because I think,
00:25:19.920
I think on one level, they actually believe that the Southport disturbances, the social disorder
00:25:28.960
that followed the Southport attack last year, wasn't caused by roiling public anger about mass
00:25:38.480
uncontrolled immigration and unhappiness that successive governments have promised to do
00:25:45.840
something to reduce overall migration and have done the opposite. Boris being the prime example,
00:25:53.260
the Boris wave, you know, take back control of our borders. No, he didn't do that. So I think
00:25:57.460
there's this sense of distrust, betrayal, loss of faith in British institutions, this feeling
00:26:05.420
that we need to take the law into our own hands. We need direct action if we're going to do
00:26:11.140
anything about this because successive governments have ignored us. I think that there is this kind of
00:26:15.840
belief on the part of the authorities, particularly the panjandrums of
00:26:22.320
the Labour Party, that these disturbances, social disorder, the fact that we
00:26:28.360
are seemingly sitting on an unexploded bomb, that various towns and cities
00:26:34.740
have become tinderboxes. I think they think it's got nothing to do with mass uncontrolled immigration,
00:26:40.160
nothing to do with multiculturalism, all the policies we've been advocating for, cheerleading
00:26:46.020
for, for the past decades, you know, rather than admit that they got anything wrong or that they
00:26:52.000
failed the public in some way or betrayed their own manifesto promises. No, no confrontation about
00:26:58.500
that at all. No acceptance of that. No, it's all to do with bad actors on social media spreading
00:27:04.360
myths and disinformation and hate speech about asylum seekers. So all we need to do to solve the
00:27:10.620
problem is suppress that. If we can suppress that, the problem goes away. I mean, it's a
00:27:16.060
kind of willed naivety and a kind of avoidance of confronting their own policy failures over the past
00:27:23.680
decades. And I think there's a degree of self-deception involved. They don't want to
00:27:29.080
have an argument. They don't want to be forced to defend those policies in the public square because
00:27:35.160
they recognize, you know, at some level that they haven't been a success and that they've
00:27:39.840
created this legacy of distrust, this kind of crisis of democratic legitimacy. But rather than
00:27:47.500
defend those policies in the public square, better to just suppress anyone criticizing them, demonize
00:27:54.900
them as far right, xenophobic, et cetera. And I think it's partly avoidance. It's partly an unwillingness
00:28:01.920
to confront their own failures. It's partly a kind of crisis of faith in liberalism itself. And you can see
00:28:07.580
that in the kind of curdling of their faith in multiculturalism. You know, from thinking diversity
00:28:14.220
was our strength, multiculturalism has, you know, improved our cuisine, enriched our cultural life, et cetera,
00:28:20.120
et cetera, to suddenly thinking, crikey, you know, we're sitting on a powder keg, which is going to
00:28:25.280
blow at any moment. But I think one of the reasons they
00:28:30.160
don't want to have an argument, defend their policies in the public square, is because
00:28:35.100
they've lost the ability to argue; those muscles have atrophied. So when they're confronted,
00:28:40.140
like when you confronted the CEO of the Center for
00:28:46.860
Countering Digital Hate, instead of having an argument, instead of defending his
00:28:52.200
position, he just gets angry and just launches into ad hominem attacks because they've lost the
00:28:57.360
ability to argue; that muscle has atrophied. And that's another reason I think the impulse to
00:29:03.380
suppress rather than engage is so powerful. If you kept interrupting me during the course of
00:29:09.500
this conversation, which you have, and I perceive that to be motivated by the fact that you're from a
00:29:14.300
Muslim background, I'm from a Jewish background, would that make that hate speech? No. But I,
00:29:20.060
my self-perception is that. No, it's just that you're really slow at talking and my brain's
00:29:25.040
already finished your sentence. But my perception isn't that you walked out of the room and when
00:29:29.300
you walk out of here, I'm going to turn to these guys and say, well, that was a lot of hate speech.
00:29:33.120
That's fine. You can say that that's hate speech. And then you can be prosecuted. It's absolutely fine
00:29:36.760
for you to say that's hate speech if you want. And then you can be prosecuted. If me interrupting you is
00:29:40.600
hate speech, and that's your definition of it, I tell you what, you are being disrespectful
00:29:45.260
to the people who do receive hate speech. Like me. The enormous number of people who receive hate
00:29:51.600
speech. Real, clear. So do I. Undeniable hate speech. So do I. And your sophistry right now
00:29:58.960
is both transparent and deeply disrespectful. And I warn you that most of your listeners will think
00:30:04.900
to themselves, actually, Constantine's being a bit of a dickhead right now.
00:30:09.560
That's interesting that you say that, because what I'm doing is I'm stress testing your argument.
00:30:13.380
And the point I'm making to you is a person's self-perception can be manipulated. People can lie
00:30:18.980
about how they perceive the things that they experience. And sometimes people will use the law
00:30:24.620
against other people when it's badly written. Don't you think that's true?
00:30:28.240
That's what I'm saying. So I'm not really being a dickhead. I'm stress testing your argument.
00:30:31.700
And calling me a dickhead might be considered hate speech.
00:30:35.020
No, no, it's not hate speech. It's just good observation.
00:30:40.160
Toby, I actually think as well, if we put a good faith spin on this, I think that's kind
00:30:45.060
of how the left look at it, in that they have quite a paternalistic attitude to people.
00:30:50.100
And their instinct is to protect, is to keep people safe. And the people who tend to vote
00:30:57.600
for these kind of policies or governments want to be kept safe. So for these types of people,
00:31:03.300
it is actually quite a mutually beneficial relationship, isn't it?
00:31:08.600
Well, I think, yeah, we saw that kind of dynamic at work during the pandemic with the, to my mind,
00:31:16.900
surprisingly wide public acceptance of the lockdown policy. People were willing to have their liberties
00:31:23.180
temporarily suspended in order to keep themselves safe, keep vulnerable people safe.
00:31:30.320
Very different to the reaction to previous pandemics in the kind of second half of the 20th century.
00:31:36.680
And it does feel as though, as we've become a kind of more comfortable, affluent society,
00:31:42.000
so people have begun to prioritise safety over liberty. And I think people are willing to kind
00:31:49.900
of make that trade often, because they don't think it will affect their liberties, or if it will,
00:31:55.960
it will only affect their liberties temporarily. I mean, a lot of the restrictions that were imposed
00:32:01.020
during the lockdown weren't lifted, you know, when the lockdown was lifted. So it turned out not to be
00:32:06.080
temporary suspension of liberties at all. But I think one of the impacts of the Online Safety Act in
00:32:12.760
the past few weeks is that people are beginning to realise that, you know, when they make this
00:32:18.760
Faustian pact, and agree to sacrifice some liberty for safety, they don't think of their own liberties
00:32:26.000
as being kind of sacrificed. But now they're beginning to discover that actually, it does involve their
00:32:31.840
own liberties. I think people are waking up to the fact that actually, this is, this is a kind of harder trade
00:32:37.500
than they imagine, because, you know, it isn't just affecting the liberty of people on the kind of
00:32:42.820
far right to rabble rouse and disseminate myths and disinformation and hate speech on social media
00:32:48.640
platforms. It's actually affecting their ability to access content that they want to see, like people
00:32:54.860
being arrested at anti-immigration protests. Well, I have to take issue with both of you making this
00:32:59.340
about the left. Let's just go back to where we started this. This bill was brought in by the last
00:33:05.180
conservative government. Let's just be honest about it, right? So this isn't about the left. This is
00:33:10.360
about, as you say, the country is becoming more pro-restrictions. The country is, and you mentioned
00:33:17.900
the public supported it during COVID. I don't know if they support this. I imagine at the surface level,
00:33:23.000
without delving into it, a lot of people will support this as well. And so I don't think it's about
00:33:27.340
the left at all. I think we just live in a country that's pro-censorship.
00:33:30.240
Well, if you look at surveys, people's attitude towards censorship obviously depends on how the
00:33:41.700
questions are phrased. Yeah, there's a massive part of it.
00:33:43.960
A massive part of it. But when you ask people whether, you know, people are too sensitive,
00:33:49.920
people are too easily offended, people should be able to kind of speak more freely without being
00:33:55.260
worried about jeopardizing their livelihoods. Generally, there seems to be a lot of public sympathy for that view.
00:34:00.700
So have we become more pro-censorship? I mean, yes, I think we have.
00:34:06.160
And our elites certainly are more pro-censorship.
00:34:08.120
Our elites, certainly. I mean, one curious change in the attitude of the elites is that
00:34:13.860
until about 15 years ago, the kind of prevailing orthodoxy when it came to censorship was what's
00:34:21.100
known as the counterspeech doctrine. So the Supreme Court Justice, Louis Brandeis, in a landmark
00:34:26.160
First Amendment case in the early part of the 20th century, said, and I'm paraphrasing,
00:34:31.040
the best way of countering misleading and harmful speech is not enforced silence, but more and better
00:34:37.860
speech. Sunlight is the best disinfectant. Suppression doesn't work. You know, we need, if you're concerned
00:34:45.760
about, you know, various narratives kind of getting abroad and having a kind of toxic effect on public
00:34:53.160
life, on the functioning of democracy, the best way to respond to that is not enforced
00:34:59.000
silence, but to actually engage with the people making these arguments. And that was the kind of
00:35:05.800
prevailing orthodoxy amongst our elites till about 15 years ago. But suddenly they seem to have come
00:35:11.160
back around to censorship as the best way of countering kind of harmful and
00:35:18.120
misleading speech. And I think that's a huge mistake, not least because it's always
00:35:22.580
counterproductive. I mean, the kind of paradigmatic example is the Weimar Republic in the 1920s, the early
00:35:29.260
1930s, banning Nazis, banning Nazi propaganda in various German states. Hitler was banned from speaking
00:35:36.960
in various German states. Antisemitism was a criminal offence in various German states. And this was thought
00:35:43.280
to be the best way to kind of suppress the rise of the Nazis. Didn't work. And you can see that
00:35:49.300
these various efforts to censor social media in particular, not just here, but in Australia,
00:35:55.180
in New Zealand, across the continent, as a way of kind of countering kind of populist insurgence,
00:36:01.740
has been just completely ineffective. The rise of populism continues unabated. In this country,
00:36:09.520
I don't think that suppressing anti-immigration content on social media is going to in any way
00:36:16.200
impede the progress of Reform UK. I mean, they're topping the polls and the efforts of the authorities
00:36:22.900
to kind of suppress, you know, public anger about mass uncontrolled immigration, about the small boats,
00:36:31.660
et cetera, et cetera. It's having the opposite of its intended effect. You think kind of,
00:36:34.860
when are they going to learn that suppression, censorship doesn't work? Sunlight is the best
00:36:40.820
disinfectant. If they want to defend their policies, if they want to cling on to power,
00:36:44.880
then, you know, get out there in the public square, make as good a fist as you can of defending
00:36:50.700
these policies. Don't just try and smear your opponents as kind of far-right kind of lunatics,
00:38:47.840
We're talking about this. And then a couple of days ago, you had Peter Kyle, who, as we said, is
00:38:54.560
the science minister for Labour. He made the argument on national television that
00:39:00.640
Nigel Farage, who criticised this bill, is on the side of paedophiles and, most pertinently of all,
00:39:09.020
Jimmy Savile. And you're just watching him make this argument. I mean, have you lost your mind?
00:39:15.940
As I said, if you want to overturn the Online Safety Act, you are on the side of predators
00:39:23.760
and paedophiles. Nigel Farage wants to overturn the Online Safety Act that puts in it the duty
00:39:30.020
to prevent children from being groomed by paedophiles, that prevents adults from contacting
00:39:36.700
children they don't know, strangers contacting children.
00:39:40.680
This is an act that is the biggest step forward in the experience that children have online since
00:39:46.720
the creation of the internet, the biggest step forward in the safety of children across Britain
00:39:50.960
since the internet was created. And Nigel Farage wants to overturn all of that,
00:39:55.620
to send it back to the days when predators and paedophiles could contact children.
00:40:00.200
So I am not going to apologise for pointing out the obvious. He wants to overturn those measures.
00:40:05.520
I will stay steadfast to make sure that children are kept safe in the digital age, just as we
00:40:10.760
strive to keep children safe in the offline world as well.
00:40:15.140
I think, Peter Kyle, what Nigel Farage is asking you to apologise for is saying earlier this morning
00:40:21.500
that if Jimmy Savile were still alive, he would be perpetrating his terrible crimes online.
00:40:27.080
And that you're saying Nigel Farage is on the side of Jimmy Savile and people like him.
00:40:35.460
There are people like Jimmy Savile who are trying to get access to children online that will be
00:40:41.160
prevented from doing so by the Online Safety Act. Nigel Farage wants to overturn the very laws that
00:40:48.440
are preventing that from happening. He is on the side of people like Jimmy Savile and I'm not going to
00:40:53.740
apologise for saying so. It is about time that Nigel Farage was held to account for the words
00:40:59.060
that he says, the actions that he takes, because this is politics. We have taken the action to
00:41:04.660
prevent those things happening to children in our country. And Nigel Farage wants to throw all of
00:41:10.040
that away, to turn the clock back. I will not apologise for pointing it out. Nigel Farage needs to be
00:41:15.500
held to account for the dangerous policies that he is trying to impose on children across our country.
00:41:20.440
It was, I mean, quite a bold argument to make, given, you know, that Keir Starmer's track record
00:41:28.460
as director of public prosecutions when it came to prosecuting Jimmy Savile isn't, you know,
00:41:33.380
completely unblemished, but we could get into that. But yeah, you initially thought that he's
00:41:40.020
overreaching here. This is a political blunder. But actually, various other senior Labour ministers
00:41:48.640
have doubled down on it. I think, is it Angela Rayner, the most recent one? Yeah, it's such a tenuous
00:41:56.200
argument. I mean, the idea being that, I suppose that modern day groomers, the kind of contemporary
00:42:06.380
equivalents of Jimmy Savile, are going to use unregulated kind of, I don't know, user to user
00:42:12.120
services on the internet to kind of groom children and find their victims. So if you're opposed to
00:42:19.020
age-gating content and protecting under 18-year-olds, then you're basically on the same
00:42:24.840
side as Jimmy Savile. I mean, a real stretch. And I just can't imagine that argument is landing with
00:42:29.680
the public and making them think, oh, well, crikey, yeah, Nigel Farage, if he repeals the
00:42:35.620
Online Safety Act, we're going to have a kind of tsunami of Jimmy Saviles stalking our
00:42:40.240
children on the web. What it shows to me, I find it very interesting, actually, because I was watching
00:42:45.980
an episode of the News Agents where they were talking about this, and even they were saying
00:42:51.760
that this is completely ludicrous. Wow. If even they're saying it, then, yeah.
00:42:56.560
And you just look at it, and to me, it looks like a government that is completely rattled,
00:43:03.160
doesn't have any answers, and like you said before, is just resorting to ad hominems. And I'm thinking,
00:43:09.400
guys, you haven't even been in power for, what, barely a year? And you already look like you're
00:43:13.820
losing the plot massively. I mean, I think one of the reasons various governments, not just
00:43:21.020
our government, are so quick to press the censorship button, is because they are becoming more and more
00:43:29.880
brittle, less and less confident. You know, you see, I think it's a historical fact that as regimes
00:43:38.800
become more fragile, as they begin to lose authority, so they become more repressive. And that seems to be
00:43:46.320
what's happening here. And they're becoming further from the people as well. I mean, your point about
00:43:50.080
I mean, anyone who thinks that you're going to suppress people's concerns about illegal
00:43:56.200
immigration in Britain in August of 2025 is a moron. I mean, the entire country is concerned
00:44:03.580
about illegal immigration. It's not because of online content. It's because they experience it
00:44:09.220
on a daily basis. There's no suppressing that. And that will backfire, don't you think?
00:44:14.340
Absolutely. I mean, even in the past few days, there have been a couple of cases
00:44:19.980
one, I think, brought to light by the Mail on Sunday, another by The Sun of asylum seekers
00:44:26.520
sexually assaulting, kidnapping minors, efforts by the police to suppress the fact that these
00:44:34.300
crimes were committed by asylum seekers, only for it to then come out. I mean, talk about
00:44:40.000
fueling anti-immigration sentiment. I mean, I think people are incredibly suspicious.
00:44:46.440
Whenever there is a terrorist incident, whenever there's an attack or a sexual assault on children,
00:44:55.820
people just automatically assume now it must have been carried out by an asylum seeker,
00:45:01.540
probably from a Muslim-majority country. Not because that's more likely to be the case
00:45:07.140
than not, but because we know there have been just so many occasions on which the government
00:45:11.920
have tried to cover up the fact that one of these attacks, one of these episodes has been carried
00:45:16.600
out by an asylum seeker from a Muslim-majority country. Suppressing this information just has
00:45:22.580
the opposite of its intended effect. It's crazy.
00:45:24.680
And there's one other thing I wanted to touch on with you as well. One of the things I've read is
00:45:28.420
that WhatsApp and Signal, I think, are two of the companies, but I'm sure others as well,
00:45:32.980
who are saying it hasn't happened yet, but the government is effectively attempting to force
00:45:37.620
these companies to give the government access to encrypted private messages. And this may be
00:45:43.720
another consequence of the introduction of the online safety act. Can you talk a little bit about that?
00:45:47.740
One threat to encrypted messaging apps is linked to a crime and terrorism act
00:46:01.220
which granted the British police and security services access, if the government issues a special
00:46:09.600
notice, to encrypted data on encrypted messaging apps. And that's currently being contested by Apple
00:46:18.540
amongst other companies. And it meant that Apple said they could no longer offer the highest level
00:46:23.240
of encryption to users of Apple services in the UK. But the Online Safety Act requires category one
00:46:33.900
providers, I think, once Peter Kyle finally gets around to designating them,
00:46:37.800
to do these annual risk assessments that they need to provide Ofcom with.
00:46:43.020
And that will mean disclosing some of the, or being in a position to disclose some of the messaging
00:46:52.860
on their apps. And the argument is that groomers, the Jimmy Saviles of today, are using encrypted
00:47:02.660
messaging apps to kind of groom children undetected. So Ofcom has to have access to this content if it's
00:47:09.760
going to protect children. But what it means is that unless Ofcom backs down, it could well
00:47:17.380
mean that apps like Signal, WhatsApp, Telegram are just geo-blocked in the UK.
00:47:23.040
But, you know, the more I read this act, because I sat down, and God bless my little
00:47:31.500
brain, I sat down and I tried to read this act. Now, I'm not a lawyer. I'm not an expert, but I was
00:47:36.880
trying to go through it. And the thing that kept jumping out to me as I was reading through it, I
00:47:42.320
was just trying to make sense of it, was A, it's incredibly difficult to read. It's incredibly opaque.
00:47:48.260
Now, maybe that's because I'm not legally trained, I'm not bright enough. But the thing that
00:47:52.060
also kept jumping out to me was the use of nebulous terms. Safety. I have a very particular
00:47:58.980
issue with safety. Because the past couple of years, when everybody said the words,
00:48:03.820
I need to feel safe, or you've compromised my safety, they're not talking about safety. What
00:48:08.740
they've said is, I don't want my ideas to be challenged. You make me feel uncomfortable
00:48:13.040
because you've challenged my ideas. Significant psychological harm. Well, who judges that?
00:48:19.100
What does that actually mean? Does that mean that you get a form of PTSD? And that makes me feel
00:48:26.480
Yeah. I mean, I think, to your first point, it isn't just you. I mean, the Online Safety Act
00:48:31.940
is an incredibly dense, voluminous piece of legislation, really hard to understand. I mean,
00:48:37.120
we've got a team of lawyers at the Free Speech Union going through it, trying to summarise it,
00:48:42.520
trying to see what parts of it are legally challengeable. And, you know, our Chief Legal
00:48:46.640
Counsel has a PhD in classics from Oxford, and he's struggling with it. I mean, it is an unbelievably
00:48:52.860
opaque and dense document. On the kind of frequency with which these kind of nebulous terms occur,
00:49:00.980
you're quite right. And that seems to be a kind of recurring feature of kind of censorious legislation.
00:49:08.420
So Section 127 of the Communications Act makes it a criminal offence to communicate messages which
00:49:17.000
are grossly offensive. Well, how to define grossly offensive? Section 179 of the Online Safety Act
00:49:23.340
criminalises disinformation. So if you knowingly disseminate information which you know to be
00:49:32.800
false, and it's likely to cause psychological harm to a likely audience, then you can go to prison for
00:49:39.420
two years. And, you know, how do you define psychological harm? I mean, could you be prosecuted
00:49:45.100
for causing psychological harm to someone who's particularly psychologically vulnerable because
00:49:49.320
they're suffering from mental illness? Do we have to take their possible reaction into account? Could
00:49:54.680
you be prosecuted for that? And as you say, the concept of safety, the concept of harm have been
00:50:00.800
just stretched to breaking point, particularly in the last decade or so. It's no longer just about
00:50:06.480
physical harm, about protecting people's physical safety. It's about protecting people from
00:50:11.480
psychological harm. It's about protecting their psychological well-being. And as you say, that
00:50:15.700
can include not challenging people. I mean, we see it again and again in university students claiming
00:50:21.600
they need safe spaces. They need trigger warnings and so on and so forth, because if their ideas are
00:50:27.580
challenged, they find that psychologically traumatising. They find that undermines their
00:50:32.780
psychological well-being. I mean, that's taking the concept of safety and harm to ludicrous lengths.
00:50:38.540
And it's also as well, they go, oh, they were talking about truth and something that may not
00:50:43.420
be true. And I was thinking, and I think Nigel Farage made this point, what about humour? Humour and
00:50:48.740
satire is taking something that we know to be true and then exaggerating it. And you're going, what?
00:50:54.360
So are we now going to get rid of humour because you made a satirical joke and somebody would say,
00:50:59.380
well, actually, if you look at it, maybe David Cameron didn't have intimate relations with the pig.
00:51:04.380
And the reality is that David Cameron would find that particularly upsetting.
00:51:08.380
Yeah. One way of looking at this is to distinguish between tolerance and respect.
00:51:16.980
So when it comes to different religious faiths, being able to practice in the UK,
00:51:27.840
we pride ourselves on religious tolerance. It's why the UK is such an attractive country
00:51:32.340
for migrants from other countries, particularly Muslim majority countries, because we have this
00:51:37.560
tradition dating back, you know, more than 100 years of tolerating people of different religious
00:51:44.060
faiths, of tolerating the practice of different religious faiths in our midst. And that applies
00:51:50.220
to all kinds of beliefs, all kinds of traditions, customs, communities. We have this great tradition
00:51:56.740
of which we're rightly proud of tolerance. But that in the past few decades has somehow
00:52:02.460
segued into respect. So now it's not enough to tolerate people with different beliefs to the
00:52:09.680
mainstream, various minority groups practicing their own customs and religions. Now they demand
00:52:14.620
to be respected. And I'll give you just two examples. One obvious example is Muslims wanting
00:52:21.740
their blasphemy codes to be respected by non-believers. And one of the Free Speech Union's
00:52:26.940
big cases recently was defending Hamit Coskun, a Turkish political refugee who was prosecuted for
00:52:32.700
burning a copy of the Quran outside the Turkish consulate in Knightsbridge. And unfortunately,
00:52:37.660
that prosecution was successful. We're now going to appeal it. But we don't want to see the
00:52:42.200
reintroduction of blasphemy codes via the back door, particularly blasphemy codes that only protect
00:52:46.760
people of one particular faith. Islamophobia is another example: the government has currently
00:52:51.800
tasked a group with coming up with a definition of Islamophobia slash anti-Muslim hatred, which seems
00:52:57.260
to be a way of embedding Islamic blasphemy codes and forcing non-believers to respect them in the public
00:53:03.580
square. But the other example I'm thinking of is trans rights activists and trans people. It's not enough
00:53:10.940
that they should be tolerated. They also want to be respected. And that means they want to compel
00:53:16.180
people to use their preferred gender pronouns. I mean, that's one of the forms it takes.
00:53:20.700
So I think we've gone from a society which was very tolerant of all these different communities,
00:53:26.320
these different religions, and we prided ourselves on being tolerant. But that didn't mean that
00:53:31.340
non-believers had to respect them. We might tolerate people of different faiths. That doesn't mean,
00:53:37.420
particularly if you're an atheist, that you have to respect people of different faiths or respect their
00:53:41.360
religious beliefs. But we seem to have segued into a society in which everyone is required to
00:53:46.280
respect everyone else. And that means tiptoeing around. It means biting our tongues, not saying
00:53:51.900
what we think, for fear of offending someone by disrespecting them. And that's a recipe for disaster.
00:53:58.820
That's not a recipe for making a multicultural, multi-faith, multi-ethnic society work. The way to make
00:54:04.520
a society like ours work is to promote the value of tolerance, not to demand that every different
00:54:09.560
identity group respect what are in effect the blasphemy codes, the sacred values of every
00:54:15.680
other identity group. That just means a society in which no one talks to each other. I mean, if you
00:54:19.760
look at these communities like Leicester, where there are these kind of sectarian religious conflicts
00:54:26.580
which currently break out into the open, not just Leicester. I mean, virtually every town and city in
00:54:30.480
the UK. But the way to resolve this, the way to bring about a kind of lasting kind of peace and
00:54:40.180
tranquility in these communities to avoid these kind of sectarian conflicts breaking out is not to force
00:54:46.080
everyone to kind of tiptoe around, tread on eggshells, respect each other's blasphemy codes, not trespass on
00:54:53.060
each other's sacred values. The way to make sure these communities get along, to forge some sense of
00:54:59.600
common identity and common purpose, some new sense of kind of Britishness rooted in our multicultural,
00:55:05.540
multi-ethnic, multi-faith society, is for these people to talk to each other, to argue, to negotiate,
00:55:11.260
you know, to promote free speech, not to suppress it.
And it's a great point because you think, let's take the example
00:56:31.140
of blasphemy laws. If Labour implements blasphemy laws, which some Labour MPs want to do, like Tahir
00:56:38.280
Ali, is it that really far of a stretch to go, well, you can't make a joke about this religion
00:56:44.280
because I find this psychologically damaging? And then, so what you have is just a censorship
00:56:49.980
industrial complex, don't you? You do, yeah. And another argument for why we shouldn't be forced
00:56:58.740
to respect the blasphemy codes of different religions is that, what about our tradition in
00:57:05.860
this country, of humour, of taking the mickey, you know, of deliberately disrespecting, you know,
00:57:14.200
the sacred values of different identity groups, which creates a kind of good-humoured atmosphere?
00:57:20.980
What about that? Why not? Don't our traditions deserve to be tolerated as well? I mean, you probably
00:57:29.020
know about this, but one of the big campaigning issues that the Free Speech Union and I, in the
00:57:33.600
House of Lords, have been kind of engaged in over the past six months or so is the banter ban. So
00:57:39.040
clause 20 of the Employment Rights Bill extends employer liability for the harassment of their
00:57:47.140
employees to third parties. And I'm not talking about third-party sexual harassment of employees;
00:57:53.120
employers are already liable for that. I'm talking about
00:57:57.420
the non-sexual harassment of employees by third parties. Employers are now liable for that,
00:58:02.660
meaning they could be sued by their employees for not taking all reasonable steps to protect them
00:58:08.220
from non-sexual harassment by customers, by members of the public. And what do we mean by
00:58:13.560
harassment? Well, it's one of these nebulous terms that just seems to be becoming more and more
00:58:18.320
capacious. So in the Employment Tribunal, harassment has evolved to now include overheard conversations.
00:58:24.920
So if a blue-haired barmaid in a pub overhears a couple of gender-critical feminists talking about
00:58:32.220
how unfair they think it is that biological men who identify as women should be able to compete
00:58:38.380
against women in women's sports like boxing and football, she could claim,
00:58:44.540
I feel harassed. And I feel harassed in virtue of one of my protected characteristics. You as my employer,
00:58:50.200
Mr. Publican, should have done more to protect me from that. And I'm going to sue you because you didn't.
00:58:54.160
How will publicans react to this? They'll employ banter bouncers to eavesdrop on people's
00:58:59.120
conversations to make sure they're not saying anything likely to offend or upset the kind of
00:59:03.180
pink-haired barmaid. There'll be notices on the wall saying, don't assume my pronouns. You know,
00:59:07.680
before you can buy a pint of beer, you'll have to show you've had kind of the full suite
00:59:11.940
of diversity training. I mean, it's ludicrous, but the kind of rationale for this, I mean, I've been
00:59:16.740
arguing with Labour ministers in the House of Lords. How can you possibly justify the banter ban?
00:59:22.300
You're going to kind of put every pub in the country out of business. It's not just pubs. It applies to
00:59:26.060
hotels, restaurants, bars, football stadiums. You know, if a fan says to a linesman, are you blind?
00:59:33.380
And there's a partially sighted steward who overhears it. The partially sighted steward will be able to sue
00:59:37.560
the club for not doing enough to protect him from being harassed in that way by virtue of his protected
00:59:42.240
characteristic, his partial sightedness. So, you know, if Keir Starmer has his way, every stadium in the country
00:59:46.820
will become a library, not just his fucking, sorry, his beloved Arsenal. And it's ludicrous.
00:59:52.960
But the rationale for it is, well, we have to protect vulnerable people in the workplace. They
00:59:58.140
shouldn't have to overhear offensive or upsetting conversations. We need to make the workplace safe
01:00:04.180
for everybody. That's the argument. But it's going to destroy a hugely
01:00:09.740
important aspect of our public life. But they think banter's toxic, that it's an
01:00:18.060
excuse for being racist and misogynistic and homophobic. Of course, people shouldn't be able
01:00:23.020
to tell jokes in pubs. What about comedy? It's really going to affect, you know, the comedy
01:00:28.280
industry. I mean, already, you know, people who work in comedy clubs essentially have a heckler's veto
01:00:34.200
over what comedians can say. But this will be setting it in stone. Henceforth, you know, if an
01:00:39.840
employee of a comedy club overhears a comedian make a joke that they find offensive or upsetting by
01:00:44.920
virtue of their protected characteristic, they'll be able to sue the owner of the comedy club for not
01:00:49.020
taking all reasonable steps to protect them from overhearing this joke. So it means that, you know,
01:00:54.460
it won't just be you that's required to sign a kind of contract before being hired. It'll be every
01:01:01.020
comedian in the country. And if you have any risky material, if you're straddling that line
01:01:06.680
between what's acceptable and what isn't, which makes comedy so exciting, you know, you won't be
01:01:10.840
able to perform. It'll wipe out vast swathes of our cultural life at a stroke,
01:01:17.760
the banter ban. And of course, we've had no success in holding it back.
01:01:23.420
Toby, before we wrap up, speaking of protecting people, one of the things you do at the Free Speech
01:01:27.720
Union is help people who have been punished in some way for things they've said. What is
01:01:34.280
the bulk of your caseload at the moment? Who are the people that are coming to you for
01:01:38.220
help? Well, since setting it up about five years ago, we've fought over 4,000 cases, and where they've
01:01:45.100
come to a conclusion, we've been successful about 80% of the time. And in our case database on Salesforce,
01:01:51.040
by far the largest category is sex and gender. About 40% of those 4,000-plus cases we've fought over
01:01:59.000
the past five years have been defending gender critical women who've complained about having to
01:02:07.000
share bathrooms and changing rooms with biological men who think they're women. I mean, when I set up
01:02:14.440
the Free Speech Union, I thought, you know, I'd end up defending people like me for the most part,
01:02:20.500
you know, male, pale and stale Tories. But actually the front line, the people who find
01:02:28.260
themselves cancelled more often, the people who find it more difficult than anyone else to kind of,
01:02:33.060
express how they feel about something, even if it's perfectly lawful, are women of a
01:02:39.540
certain age. You're going to cancel me for that one. Not always of a certain
01:02:45.920
age. But they mainly identify as being on the left.
01:02:50.680
So, you know, that will change. That is changing. But increasingly we find
01:02:57.980
ourselves involved in criminal work, defending people who are actually being prosecuted and face
01:03:04.980
jail time. So one big success that I was really proud of is a former Royal Marine
01:03:11.260
called Jamie Michael, who in the immediate aftermath of the Southport attack posted a 12-minute
01:03:16.940
Facebook video in which he wrongly blamed the attack on an undocumented
01:03:23.960
migrant, partly because there was a vacuum, as you know, at that time; the government
01:03:29.280
didn't say who'd actually committed the attack. And he urged his followers to protest
01:03:34.660
peacefully about illegal immigration into this country, the small boats in particular,
01:03:41.320
and a local Labour politician complained to the police about this Facebook post, and he
01:03:46.760
was prosecuted for intending to stir up racial hatred. The maximum penalty for which is seven
01:03:52.840
years in jail. And the Crown Prosecution Service went through with it. We urged them to
01:03:58.300
drop it. We said, you know, there's no way this meets the threshold for intending to
01:04:03.440
stir up racial hatred. He was urging his followers to protest peacefully about
01:04:09.260
illegal migration. We paid for his defense, and there was
01:04:14.900
a proper trial, a jury trial, and it took the jury all of 17 minutes to
01:04:19.520
unanimously acquit him. And we're doing more and more work like that as the authorities become
01:04:26.160
more and more likely to prosecute people for challenging the prevailing
01:04:34.460
government narrative about things like immigration, asylum seekers, the small boats. So
01:04:41.160
that's been a relatively new departure for us. But one thing to say is that since
01:04:45.700
Labour came to power just over a year ago, we've doubled in size. So we had about 14,000 members
01:04:53.240
on July 4th, 2024. We're now up to over 31,000 members. So Sir Keir Starmer, bless him, has been
01:05:00.500
a fantastic recruiting sergeant for the Free Speech Union. And we're now averaging 37 new cases
01:05:07.660
every week. And it's often people who are, I think, perfectly reasonably worried
01:05:15.580
that they're going to have their collars felt because of something they've said on social media.
01:05:19.020
And, yeah, I mean, we do our best to help every one of them.
01:05:24.060
Well, we've been big supporters of the free speech union from the moment it was founded. It's a really
01:05:27.780
important organization. Toby, you're a very busy guy precisely because
01:05:31.780
there's so much work for you to do. So we won't have time for questions from our audience, although
01:05:35.580
we have incorporated many of them into this conversation. So we'll wrap up with this in the
01:05:40.420
context of speech in Britain, the Online Safety Act and all of that. What is the one thing that no one's
01:05:45.100
talking about that they really should be? The one thing people aren't talking about, well,
01:05:52.440
it's the extent to which the police are arresting people for supposedly saying
01:06:05.340
something unlawful online. I'm not sure the extent of it has fully sunk in. I mean, we do read from time
01:06:11.360
to time about, you know, people who are arrested often by, you know, six or seven police officers
01:06:18.060
who then ransack their houses, haul them down to the station, imprison them in cells for eight or
01:06:22.600
nine hours only to then release them. But I don't think people realize quite the extent
01:06:28.000
of it. So there was this piece in The Times a few months ago, which discovered,
01:06:34.940
quite astonishingly, that, I think, over 12,000 people in the past year
01:06:43.760
had been arrested under suspicion of having committed just two speech offenses: an offense
01:06:49.560
under, I think, Section 1 of the Malicious Communications Act and Section 127
01:06:53.180
of the Communications Act. That's just two of the many speech offenses you can be prosecuted for.
01:06:58.600
I mean, there are at least a dozen. That was just two. Over 12,000 people a year
01:07:03.600
being arrested under suspicion of committing just those two speech offenses, which is over 30 a
01:07:10.480
day. So that's over 30 people a day being arrested under suspicion of having broken just two of
01:07:18.640
these laws criminalizing what you can say. And invariably it's stuff they've said online.
01:07:24.360
And even when they conclude, as more often than not they do, that they don't have sufficient
01:07:30.080
evidence to prosecute this person, it'll be recorded as a non-crime hate incident. So,
01:07:35.440
you know, I don't think people realize quite the scale of non-crime hate incidents being
01:07:40.520
recorded either. So through various Freedom of Information requests, the FSU has established
01:07:45.420
that over the past 11 years, over a quarter of a million NCHIs have been recorded in England and
01:07:52.120
Wales alone. That's an average of over 62 a day. I mean, you wonder why, if your
01:07:59.180
house is burgled or your car is stolen or your child has his phone stolen on the bus, the police
01:08:04.420
will take very little interest. They'll at best send you a crime report number so you can claim on
01:08:09.480
insurance. The reason they won't come around to your house is because they're far too busy
01:08:13.300
policing our tweets to police our streets. If you get burgled and you want the
01:08:17.800
police to come to your house, get a can of spray paint before you do anything else and
01:08:22.280
spray-paint on the wall, "Trans women aren't women." Report it as a hate crime and you'll literally have a
01:08:27.240
helicopter overhead and six police officers outside. I mean, then they'll investigate.
01:08:31.300
Otherwise, no. Toby Young, thanks for coming on. Thank you.