ManoWhisper
Juno News
October 29, 2021
Justin Trudeau is promising to restrict online speech
Episode Stats
Length: 41 minutes
Words per Minute: 171.08
Word Count: 7,131
Sentence Count: 315
Misogynist Sentences: 5
Hate Speech Sentences: 4
Summary
Summaries are generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.
Transcript
Transcript is generated with Whisper (turbo). Misogyny classification is done with MilaNLProc/bert-base-uncased-ear-misogyny. Hate speech classification is done with facebook/roberta-hate-speech-dynabench-r4-target.
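The per-sentence counts above are presumably produced by running each transcript sentence through the named Hugging Face classifiers and tallying positive labels. A minimal sketch of that counting step, with the classifier injected as a callable so the logic stands alone (the function name, label strings, and threshold are illustrative assumptions, not this site's actual code):

```python
# Sketch of per-sentence classification counting. In practice the
# classifier would likely be a Hugging Face pipeline, e.g.:
#   from transformers import pipeline
#   clf = pipeline("text-classification",
#                  model="facebook/roberta-hate-speech-dynabench-r4-target")
# Here `classify` is passed in, so the counting logic is testable on its own.

def count_flagged(sentences, classify, positive_label, threshold=0.5):
    """Count sentences whose predicted label equals `positive_label`
    with a confidence score of at least `threshold`."""
    hits = 0
    for sentence in sentences:
        result = classify(sentence)  # expected shape: {"label": ..., "score": ...}
        if result["label"] == positive_label and result["score"] >= threshold:
            hits += 1
    return hits

if __name__ == "__main__":
    # Stub classifier standing in for the real model.
    def stub(text):
        flagged = "hate" in text.lower()
        return {"label": "hate" if flagged else "nothate", "score": 0.9}

    sents = ["Let's talk about it.", "That is hate speech.", "Hello."]
    print(count_flagged(sents, stub, positive_label="hate"))  # 1
```

The same helper would be run once per model (misogyny, hate speech) over the transcript's 315 sentences to produce the two counts shown.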
00:00:00.000
Welcome to Canada's Most Irreverent Talk Show.
00:00:06.660
This is the Andrew Lawton Show, brought to you by True North.
00:00:13.100
Coming up, Parliament is returning with another Liberal government,
00:00:16.800
which means it won't be long until we have another bill going after online speech.
00:00:21.960
Let's talk about it.
00:00:24.500
The Andrew Lawton Show starts right now.
00:00:30.500
Hello and welcome to Canada's Most Irreverent Talk Show.
00:00:34.520
This is the Andrew Lawton Show here on True North.
00:00:37.280
Every Friday we do things a bit differently, take an issue and do a deep dive into it,
00:00:41.940
talk about why it matters with some of the best and brightest we can muster up to do it.
00:00:46.500
Today I want to talk about Bill C-36.
00:00:49.840
C-36 is the bill the Liberal government introduced just in the day before
00:00:54.620
the House of Commons rose for the last time for the summer.
00:00:58.440
It's a bill that would allow the government to prosecute people through the Canadian Human Rights Act
00:01:03.520
for so-called hate speech.
00:01:05.720
They would do it in a way that goes after only speech that causes detestation or vilification.
00:01:12.600
They say, no, no, no, it's not about speech that just offends or humiliates.
00:01:16.520
It's not about jokes.
00:01:17.440
It's not about imposing censorship.
00:01:18.780
It's only the worst, only the worst speech.
00:01:21.080
Well, that speech is already illegal.
00:01:23.780
The criminal code is already abundantly clear in Canada that speech that rises to that critical level
00:01:29.580
of threatening or inciting violence, inciting genocide, that's what hate speech is.
00:01:35.640
Anything else necessarily lowers that bar, which means it expands the ability of government
00:01:40.880
to go after online speech, to go after speech in general.
00:01:44.400
This is why Bill C-36, which brings back Section 13 of the Canadian Human Rights Act,
00:01:50.240
is such a dangerous piece of legislation,
00:01:52.520
and one that got virtually no media coverage outside of a couple of individual people
00:01:57.460
that seem to be like me paying attention to it because they remembered
00:02:01.240
when it was around the first time, repealed by the Conservatives in 2013.
00:02:06.780
So where we look at this battle unfolding now is that the Liberals have won re-election.
00:02:11.980
They are likely to bring this back.
00:02:14.540
It might have a different number than C-36, but we know it's coming back,
00:02:18.300
and that's why it's important to talk about the state of free speech in Canada
00:02:21.660
and also what it's likely to be when this bill goes forward.
00:02:25.780
I'm joined by a fantastic panel of experts to do exactly that.
00:02:29.860
Jonathan Kay is the Canadian editor of Quillette and joins me,
00:02:33.480
as does Sarah Miller, a lawyer with JSS Barristers in Calgary,
00:02:37.300
and Lisa Bildy, a lawyer with Libertas Law in my neck of the woods in London, Ontario.
00:02:43.100
John, Sarah, Lisa, thanks so much for joining me.
00:02:47.020
Glad to be here.
00:02:48.060
Thank you.
00:02:48.840
Thank you, Andrew.
00:02:49.940
So I want to start with you on this, Jonathan, to set the stage here,
00:02:53.600
because I remember when, almost a decade ago,
00:02:56.400
the Conservatives were pushing for the repeal of Section 13 of the Canadian Human Rights Act.
00:03:01.780
There were some notable examples of these human rights laws in action.
00:03:06.900
You had Ezra Levant and Mark Steyn and some other more prominent cases,
00:03:10.880
but there did seem to be among a lot of members of the Canadian media
00:03:14.680
this understanding that this section of the law that allows the government to go after speech isn't good,
00:03:20.920
and this tended to cross Liberal-Conservative lines ideologically and also in a partisan way.
00:03:27.140
When the Liberals brought back Section 13 effectively, or tried to, before the House rose for the summer,
00:03:34.860
I didn't hear a lot of that outrage, and I could be just in my own little echo chamber here,
00:03:39.560
but I'm wondering if you could reflect on whether a lot of those principled pro-free-speech journalists
00:03:44.900
have kind of moved on.
00:03:46.600
So it's hard to compare the climate now with what I think was 2013
00:03:52.060
when the Conservatives got rid of that section, because before that,
00:03:56.860
so much of the discussion centered on two people.
00:03:59.580
It centered on Ezra Levant, and it centered on Mark Steyn,
00:04:02.700
because Mark Steyn, I think it was a book excerpt that appeared in Maclean's,
00:04:07.380
which was then run by Ken Whyte, and it was denounced by some progressives as Islamophobic.
00:04:12.720
But the debate, I mean, I think everybody listening probably knows who Ezra is,
00:04:18.880
and in his Ezra way, he turned it into something that was very focused on,
00:04:24.340
I mean, I happen to agree with him, and I have disagreements with Ezra and with Mark Steyn
00:04:29.400
about certain aspects of what they've written, but they were right.
00:04:32.780
And the campaign against that section, I think to some extent,
00:04:40.640
maybe not as much as I think you suggest,
00:04:42.440
but to some extent there were principled liberals who agreed.
00:04:45.220
The big problem with that section, to my mind, was you could have literally just random people come
00:04:50.060
and it's an administrative action, but there was, well, there was one guy in Ottawa,
00:04:57.780
I think he like accounted for something like 40% of all the cases under Section 13.
00:05:01.320
So it wasn't just the substantive aspect of the specter of censorship.
00:05:06.560
It was also all of the kind of fairly loose and undisciplined procedural aspects,
00:05:16.160
which you can expect under what was essentially an administrative provision.
00:05:20.180
You know, you weren't protected by the safeguards you'd get in a criminal court proceeding.
00:05:24.720
Let me turn to you on this, Sarah, because I know there were a number of issues,
00:05:28.480
as John indicates there, in the first go-around of Section 13.
00:05:33.080
And you can tell in the wording of C-36, which again, died with the election,
00:05:38.020
but we know is likely to be reintroduced, at least identically or in some similar form.
00:05:43.200
You can tell that they've tried to prevent some of those criticisms.
00:05:46.860
They have one clause that says, well, you know, mere offense isn't enough.
00:05:51.140
And they're trying to say this is only for really extreme things.
00:05:53.620
Is that enough to fix it, or is it really flawed in a more fundamental way?
00:05:59.560
I don't think that it's quite enough to fix it.
00:06:03.740
You know, they've obviously identified that that's an issue that they want to address and answer,
00:06:08.560
but there's a number of more core issues on how that plays out that have not been addressed.
00:06:17.160
And they leave it still, I mean, at least in what they've proposed way too broad.
00:06:22.880
And let me turn to you next, Lisa, because we know that there have been a number of areas here
00:06:28.520
where we've seen the government try to sort of expand the sphere of how it can regulate online speech and online content.
00:06:36.520
We had also the bill that would extend the internet regulations to online publishers,
00:06:42.500
and we still don't entirely know the definition of that, and that's through Bill C-10,
00:06:47.180
which I think ended up getting a lot more attention than Bill C-36 did.
00:06:51.940
And we can perhaps talk about the reasons why.
00:06:54.380
But fundamentally, are we talking about something here that is flawed because of what it's trying to do
00:07:00.280
or just flawed because of how it's trying to do it?
00:07:03.220
Well, both.
00:07:04.000
I mean, you know, I think anytime you start trying to curtail speech and import this sort of subjectivity
00:07:12.000
that is necessary to do that, you're asking for trouble.
00:07:16.700
And so a couple of things that jump off the page for me.
00:07:19.560
One is that there's actually, in addition to reconstituting Section 13 of the Canadian Human Rights Act,
00:07:27.860
you're also, there's also amendments to the criminal code,
00:07:32.500
and one of them in particular allows you to now go and get a, basically a peace bond
00:07:39.420
or a, you know, something that would protect you from a future crime that might be committed under this section.
00:07:45.320
Now, it's one thing to go and get that kind of a restraining order if you're, say, you know, a battered wife
00:07:50.060
and you're expecting, you're anticipating that a spouse might harm you
00:07:54.820
and, you know, you want to get something that protects you from that.
00:07:58.800
But this is now importing this into the idea of a speech crime, effectively.
00:08:04.700
So basically, you can go to the courthouse and get a private information laid
00:08:12.960
or seek out a peace bond, basically, for what is essentially a thought crime.
00:08:17.780
Somebody might say something that rises to the level of a hate crime or publish something online.
00:08:24.520
Now, I can see perhaps in the context where, you know, you've,
00:08:28.920
and I think one of the things with this act is that it tried to bring in
00:08:32.320
or show that it was necessary because of certain very egregious things.
00:08:37.500
Like, for example, you've got photographs taken of somebody in a compromising position
00:08:43.280
and there's a relationship breakup and that person is going to go and publish them on the internet
00:08:47.080
or something like that.
00:08:48.240
You know, you might want to anticipate that that's going to happen and get an order against it.
00:08:53.560
But it lumps in, of course, child pornography and other things that nobody would have any objection to
00:08:58.880
are all stated objectively in the act.
00:09:03.080
But then it imports the concept of harm into it.
00:09:06.200
And so I'm a little concerned that people might have a vendetta against somebody
00:09:11.160
and go and try and get a peace bond against them for a hate crime that they have not yet committed.
00:09:17.640
We're getting into Minority Report territory with that one.
00:09:21.380
Yeah, I mean, even with Section 13, when it was around,
00:09:24.340
the term that was often used to describe it was thought crime.
00:09:27.820
And what you've just described there, Lisa, is in a literal sense,
00:09:31.040
because we're talking about the criminal code, not just the Human Rights Act,
00:09:34.880
this idea of online harms is a relatively new term.
00:09:39.460
And I do think you raise an important point, Lisa, that I want to get your take on, Jonathan,
00:09:43.480
which is the expansion of it.
00:09:45.460
We're taking things that are very harmful and expanding it to include hate speech.
00:09:50.360
And hate speech is in and of itself a loaded term that lacks often a very clear definition.
00:09:55.680
But do you find that this idea of harm is getting far too broad when we start talking about all these ways to counter harm?
00:10:03.000
Yeah, although it's sort of an inevitable dilution of language,
00:10:07.660
as you get more people who are writing laws and who are acting as judges and running the country
00:10:15.120
who have grown up in the social media era.
00:10:18.200
And I can attest that as a parent, I've seen situations where you have kids and something bad happens online.
00:10:25.680
And it's kind of, even though it's in the virtual world, it's sort of like is the most important event in their life.
00:10:31.000
And even as adults, I mean, there are probably people watching this who, you know,
00:10:35.200
something happened on Twitter or Facebook, and it sort of blew up their life for a week or a month or more like it happens.
00:10:41.820
What I do object to is if you look, actually, I think I was just reading a news report about these somewhat vague measures
00:10:50.380
that are intended to help female journalists protect them from abuse and harassment in Canada.
00:10:56.180
There is a somewhat promiscuous use of the term violence to describe what is essentially harassment.
00:11:02.200
And, you know, harassment can be a terrible thing, but it is now kind of common.
00:11:06.580
And I think I was just reading a flat out news article that just described
00:11:10.040
people going after someone on Twitter as a form of violence.
00:11:13.960
The problem with that sort of thing is, you know, I think most of us appreciate the fact
00:11:19.860
that if somebody comes and smashes the windows on your house or tries to, you know,
00:11:24.160
burn down your garage or something, that's different from them harassing you on Twitter.
00:11:29.580
Unfortunately, that kind of language has become all intermingled, and I don't think that's helpful.
00:11:34.820
Yeah, and just to go into the social media realm a bit further there, John, we have Twitter,
00:11:40.440
which has a policy which, as a private company, I maintain it's allowed to have,
00:11:44.540
that says using the wrong pronouns is something that falls under its hate speech policy, as is deadnaming.
00:11:50.360
If you call a transgender person by their birth name, that is something that Twitter says is hate.
00:11:55.680
And again, Twitter is not the Canadian Human Rights Commission.
00:11:59.160
Twitter is not Canadian law, but it shows that we do have varying degrees of breadth.
00:12:04.940
Then a lot of the activists that are pushing for more strict measures against online harms,
00:12:10.840
I don't think would have an issue extending it to misgendering someone.
00:12:15.620
Yeah, although the interaction with social media is kind of interesting for a few reasons.
00:12:20.860
One reason is I think it's the last few years in particular,
00:12:24.260
it's been very interesting to watch progressives make full-throated defenses of the prerogatives
00:12:31.560
of Silicon Valley billionaires to control what all of us can say and read and watch.
00:12:39.920
These are some of the wealthiest plutocrats in the world, and in some context, not all,
00:12:45.440
their biggest cheerleaders are people who not so long ago would have seen themselves as Marxists
00:12:51.700
or socialists or whatnot. The other weird thing is that if you look at Bill C-36,
00:12:59.460
it's written in this way that sort of presumes the government of Canada can impose these new
00:13:06.780
strictures on social media companies. And as we've seen with Facebook in Australia and other examples,
00:13:13.100
is that a lot of the time when governments tell Twitter or Facebook what to do,
00:13:19.740
often they say, thanks for the suggestion, we're not doing that.
00:13:26.780
And you also, by the way, see this with these manifestos that have been going around to protect
00:13:32.840
female journalists in Canada. There's this sort of like very vaguely defined call to get social media
00:13:38.620
companies to do stuff. Putting aside the substance of whether this is a good idea,
00:13:44.100
Canada, it often seems like very pie in the sky. The idea, you know, Canada is a relatively small
00:13:49.540
country, population wise, and in terms of our user base on social media, that even somebody like
00:13:56.120
Justin Trudeau is going to say, well, we passed this law, and you have to do this, that, and the other
00:13:59.140
to operate in Canada. It seems likely that the result of that would just be Facebook ignoring us.
00:14:04.820
Yeah, I mean, the intersection of this attitude and these laws with social media is a key one.
00:14:12.700
Sarah, I know when Section 13 was around the first time, what happened was people would,
00:14:18.000
if they were unlucky enough to be targeted by this, get hauled into some tribunal.
00:14:22.140
And the question was really about the remedy of it. Now, we've heard social media companies
00:14:27.100
and governments get into dialogues about what the best way to work together to get rid of hate speech is.
00:14:32.500
We know the Canadian government has tried to put in restrictions that would require takedown of
00:14:38.520
content within 24 hours of so-called offending content, not based on social media companies'
00:14:44.660
definitions of offending content, but based on things that government prohibits. And the challenge
00:14:50.000
with that is that you don't really have that ability to appeal and go through an open and transparent
00:14:54.900
tribunal if Facebook or Twitter is interpreting some government hate speech definition and just
00:15:02.500
zapping your content.
00:15:03.520
Yeah, absolutely. So the, I mean, the takedown provision really does limit your ability to
00:15:11.520
express yourself. But also it kind of gives you this, like, takes away your procedural fairness
00:15:19.940
rights, right? So instead of what you say, you know, going to the tribunal and having a full
00:15:24.680
hearing as to whether or not what you've done is hateful, you've got somebody stepping in and
00:15:31.040
deplatforming you to stay compliant with the law or to be risk averse.
00:15:37.000
Yeah, I mean, it really does look like the state is trying to deputize these companies. And obviously,
00:15:42.900
I understand in practice, if we are talking about child pornography, we're talking about some of the
00:15:49.980
materials that are banned because of terrorism provisions and terror propaganda and all of these
00:15:55.080
things. And I guess the question I have, Sarah, is how do we separate those two things? How do we say,
00:16:01.260
okay, these things that are documented harms in society are in this one category, and speech is entirely
00:16:07.720
different? We don't want to be deleting people's speech on them under these very vaguely defined provisions.
00:16:12.940
Well, I mean, what we have currently is criminal law to address terrorism, child pornography, right? And so what
00:16:22.460
they're doing is almost duplicating the provisions to say, okay, it's not only criminal, but here's a
00:16:29.080
regulatory remedy. And maybe that makes sense in the sense that criminal law has a high burden of proof,
00:16:36.920
and they want an alternative remedy, but it captures all these other harms to society and democratic
00:16:44.960
rights and freedoms that we want to have protected. And regulatory provisions, regulatory law doesn't
00:16:52.760
always capture the same charter protection that you would get in a criminal law setting. So it really
00:16:59.320
creates a lot of risk for individuals that they're going to be captured in this regulatory provision
00:17:06.040
that wouldn't have triggered criminal law response.
00:17:11.700
Sarah mentioned the charter. I know you've been at the forefront, as has Sarah, Lisa, of a lot of
00:17:17.020
charter-based litigation in the last year and a half and even beyond. Most people would look at this and
00:17:23.380
say, well, you know, the charter guarantees freedom of expression. So our free speech is protected. This
00:17:27.620
bill is going to be fine. You're probably not all that optimistic on that, are you?
00:17:32.100
Well, not really, no, because I put all of this into the broader political and cultural context,
00:17:38.380
and it gives me great cause for concern. You know, it's interesting, and I'm going to circle
00:17:43.420
back to the charter in a second, but you know, you look at, there's a group that's called the
00:17:48.820
Canadian Anti-Hate. I'm going to see if I can find their, anti-hate.ca. Okay, so they had a little
00:17:55.760
statement they put out back in the summer about this bill, Bill C-36, and I thought it was interesting
00:18:02.960
how they characterized this bill. They said, hate speech is an attack on free speech. Removing hate
00:18:11.320
will make it more possible for women, BIPOC, 2SLGBTQ+ persons, etc., to exercise their charter
00:18:19.280
rights to expression and fully participate in society. Well, of course, I mean, if you are
00:18:24.880
paying attention to the culture war, this is taken right out of Herbert Marcuse's repressive
00:18:30.960
tolerance essay, in a sense. In other words, we have to, in order to be free, in order to be able
00:18:36.840
to have this idealistic, liberated utopia that he wrote about back in the 1960s, and which has kind
00:18:46.120
of become the world in which we live, as it turns out, through, you know, the gradual creep and then
00:18:55.240
steamrolling through many of our institutions. What it means is, we have to clamp down on any speech,
00:19:04.040
in order to be free, any speech that doesn't align with our views. And so, the bottom line is, left-wing
00:19:11.480
speech, no matter how extreme, is okay, it will get the pass. And right-wing speech, or anything to the
00:19:17.960
right of, you know, hard left, which is pretty much everybody else, their speech is going to be curtailed
00:19:25.000
and controlled, because it will be perceived as hate speech in the eyes of the people who are probably
00:19:31.560
going to be making the decisions on what gets captured by all of this. And so, you know, you sort
00:19:38.040
of see this extrapolated more broadly into the charter as well, because we now have charter values,
00:19:43.480
which can override the actual stated provisions in the document. So, you know, we're just in a new
00:19:53.640
constitutional era, I think, where I don't actually have a lot of hope that our constitution will be
00:19:59.160
upheld the way we think it should for the people who are going to find themselves running afoul of
00:20:04.360
this, which is basically anybody who's not saying the correct things in the current cultural zeitgeist.
00:20:11.560
Sorry, I have to go soon, but if I may just comment on C-A-H-N, I think that's the acronym there,
00:20:17.560
the Canadian Anti-Hate Network, which has a very august name, but from what I can tell is basically,
00:20:23.560
mostly it's just one dude with a Twitter account, but also one dude with a Twitter account who I
00:20:28.120
think got more than $200,000 from the federal government to keep up his super sophisticated
00:20:35.640
anti-hate operation going. And if you're one of the few people who isn't blocked on their Twitter account,
00:20:41.800
you can read all about their initiatives. And it's basically a steady stream of, you know,
00:20:49.320
left of center calls for censorship, you know, getting rid of speech they don't like.
00:20:57.000
And I think if you want an indication of how broadly their vision of censorship would go,
00:21:04.520
and again, this is, they don't have a broad constituency, it's basically a grant recipient
00:21:08.600
operating a Twitter account. They have trumpeted a study that claims there are 300 right wing,
00:21:16.920
I think, white supremacist hate groups in Canada. I personally have contacted them and said,
00:21:23.080
or at least the person who performed the study, and I said, could we have the list of 300 white
00:21:27.960
supremacist hate groups? I'm sure there are 10, maybe there's 20, 30, 40. They wouldn't give me
00:21:32.360
the study. I wrote a National Post column about it. They were very happy to keep scaring people with
00:21:37.080
the idea that there are 300 of these right wing hate groups. And of course, that big scary number
00:21:42.760
is great for getting people around to the idea that we need to censor this and that. From the best I
00:21:49.240
could tell, the only way you can get to that number 300 is pretty much if every conservative
00:21:54.040
group in the country is lumped in as a white supremacist right wing hate group. So, you know,
00:21:58.760
once you go down the road of censorship, and this applies to conservatives and liberals too,
00:22:02.520
I'm not a fan of conservative censorship either. You quickly find out that a lot of the people who are
00:22:07.720
the biggest cheerleaders for it don't just want to stop at revenge porn, which everybody
00:22:13.160
agrees is horrible and should be censored, and child porn, same category. But they're kind of just
00:22:19.880
happy to censor anything they disagree with. And I think, you know, CAHN is a great example. And people
00:22:27.560
should consult that example, they should Google it, though not for the reasons that CAHN itself
00:22:33.320
would have us believe. I think it's a cautionary tale. Yeah, I know you've got another commitment,
00:22:38.040
John. I checked and I'm not yet blocked, although after this show, perhaps that'll change. Jonathan
00:22:42.840
Kay, Canadian editor of Quillette, thanks very much for joining us. Thank you. Bye, Sarah. Bye, Lisa.
00:22:48.440
And continuing along in John's absence here, it wasn't intentional. We just had a scheduling conflict.
00:22:54.200
Let me turn to you on this, Sarah. We were just talking about this one particular group that
00:22:59.080
I would suspect, at the risk of predicting the future, but if the government can go after thought
00:23:03.560
crimes, we can try to predict the future as well, that there will be activist groups that try to
00:23:07.880
weaponize whatever this enforcement regime looks like once these are put into place. And that to me
00:23:15.000
strikes me as a big problem, that we could have a system where activist groups that have a pretty
00:23:20.360
demonstrated track record of trying to go after people for expressing opinions they don't like,
00:23:25.240
have all of a sudden a legal mechanism to go after these people and start telling
00:23:29.720
the social media companies, you've got to delete this because it violates this section of law, or
00:23:34.520
going right to the commission and saying, you know, we're filing complaints against all of these.
00:23:38.360
And that seems to be, and again, I could be just overly pessimistic here, but that seems to be an
00:23:43.320
inevitability in something like this.
00:23:45.480
Yeah, I mean, what we have is the Constitution that's not going to really take any steps to
00:23:54.040
protect those who aren't part of the majority, right? And so anything that's not going to fall within
00:24:00.440
the majority mindset, the majority narrative, the parts that we consider politically correct or right,
00:24:09.800
or like right as incorrect, those speech topics are all going to be fine. And anything that falls
00:24:17.400
outside of that is going to be, as you say, attacked, and I think aggressively so. I mean,
00:24:23.960
the government is essentially giving us a green light for that, right? They're saying, yeah, go use this,
00:24:29.880
use these kind of mechanisms to go and pursue what you interpret as hate crime. Go and file a complaint
00:24:37.720
as a self-represented person at the Canadian Human Rights Commission. Like, these are the things that
00:24:43.160
are designed to empower people who fall within that politically correct camp and, you know, left of
00:24:54.200
center camps that are going to lead the narrative and de-platform anybody who doesn't agree with their
00:25:02.360
points of view. Yeah, I remember when the Mark Steyn case, just to go back to the example we talked
00:25:07.720
about at the beginning happened, the complainants in that case actually shopped it around to different
00:25:13.480
commissions. They went to the Ontario Human Rights Commission and said, yeah, we don't like this
00:25:17.640
excerpt that ran in Maclean's and the Canadian Human Rights or the Ontario Human Rights Commission
00:25:21.400
said, yeah, I mean, we don't like it. We're with you. That's offensive, but we can't prosecute it.
00:25:26.120
The Canadian Human Rights Commission wouldn't take it up. And then it ultimately was the BC
00:25:30.440
one that said, yeah, sure, let's do this. You know, the BC Human Rights Tribunal going after
00:25:35.080
a column written by a Canadian living in New England that was published in a Canadian magazine,
00:25:40.920
not based in British Columbia. And I mean, we can talk about the jurisdictional aspects of this,
00:25:46.040
although I would tend to agree with your point a few moments ago, Lisa, that all of these are
00:25:50.280
secondary to the cultural thrust that seems to be behind this. And I guess, let me ask you,
00:25:56.040
is the best way to combat this using legal means or going after that source, going after that
00:26:01.640
political and cultural impetus behind this? Well, I think so. But I don't know if that ship
00:26:06.760
has sailed. I suspect it may have. I mean, given how, for example, the Conservatives were absolutely
00:26:12.360
silent on these bills in the lead up to the last election, you know, they don't seem terribly concerned
00:26:19.080
about this. This is something that should be of concern. I don't think that they quite get the
00:26:23.240
cultural moment that we're in. I think we're already too late. And I'm not, I don't really
00:26:28.600
fit perfectly in the conservative camp myself, but as someone who values free speech and, you know,
00:26:37.560
the marketplace of ideas as being sort of foundational to what our Western civilization is
00:26:42.440
built on, I would think that that would be something that should at least attract, you know,
00:26:47.640
a statement or two. And in fact, it sounds like from some of the clips that I've seen,
00:26:54.120
you know, along the way from the House, that they're actually in favor of this kind of legislation.
00:26:58.680
You know, they want to be able to stomp out hate, however that's defined. Of course,
00:27:03.800
I don't think they realize that they're going to be the primary targets of this kind of legislation,
00:27:09.320
because anything they say will be viewed as hate. It's just inevitable. We see it all the time on
00:27:15.320
social media and other places. And whenever you've got words that can be manipulated,
00:27:21.160
and that's part of the whole game is the manipulation of language. So when you talk about
00:27:26.440
vilification or detestation or other words of art that are in these in this legislation, you know,
00:27:32.360
how is that going to be interpreted and defined? Well, it won't be in a way that provides maximal
00:27:38.840
protection for free expression. So, yeah, so I don't know that we can turn the ship around very
00:27:47.400
easily at this point. The culture is pretty solid. And I think we see this over the last 18 months,
00:27:52.360
dealing with the pandemic and seeing how collectivist our country has become. And, you know,
00:27:58.760
the rally now around sentiment that would vilify and detest people who don't choose to have the
00:28:05.960
vaccination as an example. Are the people who engage in that kind of speech going to be
00:28:12.600
taken to task? No, I don't think they are.
00:28:17.080
That's a very good point. And just to use that language in C36 for a moment, it says,
00:28:22.280
as Lisa mentions, detestation and vilification. But it also says that speech that expresses mere
00:28:29.640
dislike or disdain or speech that discredits, humiliates, hurts or offends doesn't meet that
00:28:35.720
threshold of being prohibited. So there's a spectrum of mere dislike to vilification. The
00:28:41.000
government has drawn the line between humiliates and offends and vilification and detestation.
00:28:47.320
I mean, these are terms that all sound well and good. And we talked about this earlier. People
00:28:51.160
would look at that and say, oh, OK, great. You know, they're only going after the extreme stuff.
00:28:54.760
They say it right there. But at a certain point, someone has to take a word or a phrase or an
00:29:00.520
expression and plot it on that spectrum. And Sarah, how does that process unfold in a way that protects
00:29:07.160
free speech? It's, I mean, so often going to be informed by your own philosophical understanding,
00:29:14.920
right? Like Lisa made a great point. If you're pro-vaccination and you hate those who
00:29:23.320
aren't vaccinated, nobody's going to use these provisions to pursue a pro-vaccination person.
00:29:33.800
Right. And so you really just don't have the right mechanisms in place to protect freedom of speech.
00:29:40.520
You don't see it in recent court decisions on freedom of expression. There's not a really
00:29:47.960
robust desire to protect freedom of expression. We are far more often falling into the
00:29:59.000
suppress fake news, suppress any kind of narrative that doesn't fall within the approved narrative.
00:30:07.960
And any time that those are explicitly expressed or repressed, then what we have is just the government
00:30:16.200
saying section one, this is justifiable, the ends justify the means. And I mean, the Charter doesn't
00:30:24.600
exist to protect the majority. That's not what it's there for. It's supposed to be protecting the
00:30:31.720
fringe expression. And I don't think that the appetite is there to do that. Yeah. And this is,
00:30:38.520
I think, more of a philosophical point than anything else. But I think it still bears putting on the record
00:30:43.320
here that you don't need legal protections for popular speech. You don't need the protections
00:30:49.320
for the speech that's unlikely to be censored for exactly that reason. So when people start trying
00:30:54.440
to draw this line of, well, I support free speech, but, well, it's the stuff in that "but" category,
00:30:59.480
Lisa, that is the most in need of protection against censorship. Right. And there was a time when
00:31:04.680
our Supreme Court of Canada actually believed that, too. And I'm just going to rattle through
00:31:10.040
some papers here for a half second, because I had a little quote from R v Zundel, which is an older
00:31:15.560
case now, if I can lay my hands on it. Yeah. So in that decision, the Supreme Court of Canada
00:31:26.200
said, the guarantee of freedom of expression serves to protect the right of the minority to express its
00:31:32.200
view, however unpopular it may be. Adapted to this context, it serves to preclude the majority's
00:31:37.800
perception of truth or public interest from smothering the minority's perception. The view of the
00:31:43.000
majority has no need of constitutional protection. It's tolerated in any event. Viewed thus, a law which
00:31:49.080
forbids expression of a minority or false view on pain of criminal prosecution and imprisonment on its
00:31:54.680
face offends the purpose of the guarantee of free expression. Now, I, you know, I share Sarah's,
00:32:02.120
I mean, I think we are all on the same page, concerns about section one. It's become really almost sort
00:32:09.000
of a sword against the minority opinions that don't fit, or the minority lifestyles and so on that don't
00:32:18.520
fit with the current. I mean, we've been seeing this again through the last 18 months as well.
00:32:23.640
Everything's been given a pass because the majority wants a certain thing. So I just don't have a lot of
00:32:30.280
confidence anymore. It feels like the case law of late has tended to focus more on the limits around
00:32:37.080
free speech than on giving that sort of, you know, fulsome support for the principle. It's how can
00:32:47.000
we restrict it? How can we make sure that the place where it's being exercised is appropriate? How
00:32:53.000
can, you know, they focus on all of that stuff more now than they do on giving voice to the idea?
00:32:58.280
Yeah. And you contrast that line you just shared from an older decision with, I think it was
00:33:04.280
Whatcott, which said, and I'm paraphrasing here, that, you know, just because something's true
00:33:08.600
doesn't mean you can say it, basically. Well, and I think actually this legislation very clearly
00:33:13.800
incorporated Whatcott to charter-proof it, in a way that uses the language from the Whatcott decision.
00:33:20.280
That's where you get the detestation and vilification language from. So that's their way of trying to ensure
00:33:26.200
that the legislation is charter-proof, essentially. So let's turn to the way forward here, because I
00:33:34.440
try not to depress people too much without offering a bit of hope. And it sounds like there might not
00:33:38.520
be reasons to be hopeful, except for perhaps hoping that, you know, because it's a
00:33:43.800
minority parliament, perhaps we might end up at the polls again before something like this
00:33:47.960
goes into effect. But Sarah, I know that you have been challenging a number of fines,
00:33:53.800
many under lockdowns and also speech restrictions. But beyond this, I mean, what would your recommendation
00:33:59.640
be on how best to attack this before it gets to that point? I mean, what do you think is the winning
00:34:04.920
argument to Canadians that are not as tuned in on this, that hear someone say, you know, we're going
00:34:10.520
after hate speech and think, well, yeah, that seems like a good thing. I don't like hateful speech.
00:34:14.360
How do you convince those people that this is about more than just that?
00:34:21.000
Hate is an emotion, right? I mean, that's how we understand hate. And if you start thinking about
00:34:28.120
regulating people's emotions, you're taking away the human aspect of humans and civilization. So
00:34:36.040
I think although people say they don't like hate speech, they need to understand that hate is on
00:34:45.400
the spectrum of emotions that we all experience. And regardless of how uncomfortable that is,
00:34:53.000
human beings are uncomfortable. We are inherently flawed. And people need to, instead of trying to step
00:35:00.200
away from that, accept the fact that there's a human nature aspect and nobody wakes up in the
00:35:06.440
morning wanting to be a hateful, terrible person. And so having compassion for when things that you
00:35:14.280
dislike or disagree with come up, I think is a much better way to think about things than to say,
00:35:21.160
well, we need the government to step in and intervene on an individual's human nature.
00:35:27.080
What would your message be, Lisa? How would you recommend selling this to people that aren't
00:35:32.920
aware of just what the stakes are on this? Well, I think people have to be more concerned about
00:35:39.400
how this will be applied than they are concerned about the existence of hate speech. And I mean,
00:35:44.280
there's no question there's stuff out there that nobody wants to hear that people find offensive,
00:35:49.560
disgusting, horrible. There's no limit to that on the internet now when you can see everybody's
00:35:56.920
opinions and thoughts. But that isn't where this is going to stop. And I think the average person
00:36:01.880
needs to understand that things in our post-truth world, things that we all, or many of us,
00:36:08.040
consider to be truth, consider to be common sense, those are the kinds of things that may end up being
00:36:15.000
targeted. And so I think people, if they could get their heads around the idea that simple concepts and
00:36:22.920
ideas that are not currently popular, well, let me give you an example, I guess, you know,
00:36:28.280
to say that if you wade into the transgender debate at all, in any way, even if you do it
00:36:34.840
compassionately, you will find that your comments are immediately construed as hateful. And so if you
00:36:41.240
were to say something like, you know, men cannot be women, if you said something like that, you might find
00:36:48.920
yourself being accused of saying something harmful, or even hateful. I, you know, I don't know that it's
00:36:53.880
going to rise to that level. But we certainly see a trend to want to shut down anything that doesn't
00:36:59.560
that doesn't comport with our current permissible, very narrow range of thought
00:37:08.440
that's allowed to occur. So I just, I guess what I'm trying to say is, we could have a cultural pushback,
00:37:15.560
but we need a lot of people to wake up to the fact that this is going to impact them,
00:37:19.640
it's going to impact their children, it's going to impact their future, it's going to impact this
00:37:23.800
country. And until enough people recognize that and say, Okay, enough of this stuff. This is all going
00:37:30.120
somewhere very bad that is not at all conducive to the kind of society that Canada is supposed to be,
00:37:35.640
which is fair and just and, you know, compassionate and all those kinds of things. I mean, we don't want
00:37:42.280
to have hate speech here. But this is going to take us down a path that goes far beyond what anybody
00:37:48.360
really imagines. And so I guess I would hope that people start paying more attention and start speaking
00:37:55.960
up about it. It is possible to change the culture, it is possible to have political changes and
00:38:02.920
solutions. But it starts with everybody becoming more conscious of all of this and speaking up and
00:38:09.720
speaking the truth whenever they can. If we don't speak it, we are going to be finding ourselves
00:38:17.160
in hushed whispers around our kitchen table saying things that used to be allowed but aren't anymore.
00:38:22.600
If that's the kind of Canada that you want to have, then carry on. But if you don't, you got to speak
00:38:27.560
up now, speak the truth. Live not by lies: I put that in my Twitter bio because I'm trying to
00:38:32.120
live by that. You know, speak the truth whenever possible. And know that things like this are
00:38:39.320
not conducive to that way of thinking. This bill is not conducive to
00:38:48.200
the kind of Canada that allows us to think and speak freely. Sorry, it's not very eloquent. I'm just
00:38:54.120
sputtering. But, you know, I just think people need to wake up.
00:39:00.680
Well, I think the wake up sums it up perfectly. So very well said, both of you, Sarah and Lisa,
00:39:06.040
thank you so much. Sarah Miller is a lawyer with JSS Barristers in Calgary. And Lisa Bildy is a lawyer
00:39:12.120
with Libertas Law in London, Ontario. But I know you both have a very national reach for your work,
00:39:17.400
and we are all the better off for it as people that value freedom. So Sarah, Lisa, thank you so much.
00:39:22.360
Thanks, Andrew. Thank you, Andrew. When you look around at the state of free speech or lack thereof
00:39:29.720
in Canada, it's very difficult to be optimistic and cheerful. But I hope we can at the very least
00:39:35.480
arm you with some of the points about this, because I think Lisa touched on it earlier,
00:39:39.880
that we can talk about the law, and the law is important because obviously state coercion and state
00:39:44.680
force and state censorship is worse than all the other kinds. But you have to look at what it is that
00:39:49.800
gives governments a mandate to do that. And Justin Trudeau won re-election literally weeks after
00:39:56.360
proposing a law that would allow the government to regulate online speech. He thinks that Canadians
00:40:01.400
like this, and by and large, he may well be right. So yes, we need to push back against the law,
00:40:06.840
there will be legal challenges. I hope to testify on this before Parliament. My colleague,
00:40:11.640
Lindsay Shepherd, testified when the government had its initial consultation on this,
00:40:16.040
and it was a tremendous, tremendous testimony that you should go and look up. She appeared alongside
00:40:21.080
Mark Steyn and John Robson as well, who both gave very necessary contributions to this that
00:40:26.600
clearly fell on deaf ears insofar as the Liberals are concerned. But underscoring that mandate is
00:40:33.720
the fact that Canadians don't actually respect free speech all that much. And it might be because
00:40:39.720
they don't think the censorship will ever go after them. Maybe it's because they think that this
00:40:43.640
peaceful, harmonious existence that Canadians enjoy is one that requires us to go after so-called hate
00:40:49.240
speech. But whatever it is, they need to understand the stakes. Just look earlier this month, Margaret
00:40:53.960
Atwood, this lion of left-wing advocacy and literary excellence in Canada, facing the cancel
00:41:00.440
mob because she tweeted a column that says we should be able to use the word woman. People are ripping
00:41:06.760
Margaret Atwood's books off their bookshelves because she thinks that woman means something. And this is
00:41:12.120
what we're up against. So it's not going to be an easy battle, but as Lisa, Sarah, and Jonathan,
00:41:16.920
I think would all agree, it is very much a necessary one. So thanks to all of you for tuning into this
00:41:22.120
Andrew Lawton Show deep dive into free speech and Bill C-36. Whatever it comes back as, we know it
00:41:28.360
cannot come back. We will talk to you soon. Thank you, God bless, and good day. Thanks for listening to
00:41:34.440
The Andrew Lawton Show. Support the program by donating to True North at www.tnc.news.