Juno News - March 04, 2024


Trudeau's "online harms" bill is an all-out assault on free speech


Episode Stats

Length

45 minutes

Words per Minute

164.7

Word Count

7,486

Sentence Count

359

Misogynist Sentences

7

Hate Speech Sentences

6


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:01:00.000 Welcome to Canada's most irreverent talk show.
00:01:15.500 This is The Andrew Lawton Show, brought to you by True North.
00:01:20.020 Hello and welcome to you all. I was gonna say welcome back, but I guess I'm the one
00:01:29.220 coming back. So I hope you are all telling me welcome back or you're wondering who the heck
00:01:33.280 is this guy. This is The Andrew Lawton Show, Canada's most irreverent talk show here on
00:01:38.200 True North. I am, of course, as the name of the show would suggest, Andrew Lawton. I was
00:01:43.400 away last week. I don't even know if I want to tell you what I was doing because you might just
00:01:47.820 hold me in deep contempt or have some resentment. But I was doing the very heavy, very difficult,
00:01:53.520 very tough assignment. I was speaking on a cruise ship. Yes, I had the great privilege of being
00:01:58.640 aboard the Mark Steyn cruise. Conrad Black was there. Mark Steyn was there. Some English
00:02:04.300 commentators, Leilani Dowding and Samantha Smith, Phelim McAleer and Ann McElhinney, the Irish
00:02:10.060 podcasters and filmmakers that we've had on the show, they were aboard as well. Tal Bachman,
00:02:14.680 the rock singing legend, was there. We had a great time. I know some of those shows that we
00:02:20.300 recorded on board are going to be available via Mark Steyn in the coming weeks, but it was a lot
00:02:25.280 of fun. And as it happened, I met a number of True North fans, both in the Mark Steyn group and this
00:02:31.420 foursome from Sarnia, Ontario, that I kept running into on the ship that were just so hugely
00:02:37.440 supportive. So Paul, if you are watching, it was great meeting you and your wife and friends on
00:02:43.400 the, what was the ship called? The New Amsterdam, I believe. So anyway, I am back in the thick of it.
00:02:48.900 And of course, this is what happens. I don't want to have too much of an inflated sense of
00:02:52.600 self-importance here. But I note that Justin Trudeau waited until I was in international
00:02:56.940 waters and out of the country before he tabled this online harms bill. Again, it could have just
00:03:03.660 been coincidental. I don't know. But I was in international waters. And to be honest, I almost
00:03:08.540 wanted to stay in international waters because at least out there, I could perhaps circumvent the
00:03:13.000 online harms bill and talk about things that I wanted to talk about. This came out on Monday.
00:03:19.000 But this has been an issue I've been covering for well over a decade,
00:03:21.800 so I wanted to hit it with full strength and full force upon my return.
00:03:27.000 We'll be speaking about it a little bit later on in the show
00:03:29.960 with Christine Van Geyn of the Canadian Constitution Foundation.
00:03:33.460 But I want to begin by sharing some of my own thoughts on this.
00:03:37.360 So if you have not followed this from last week, the bill is Bill C-63.
00:03:43.200 It's called the Online Harms Act.
00:03:46.440 Now, this bill does a lot of different things.
00:03:49.540 It amends the Criminal Code.
00:03:50.880 It amends the Canadian Human Rights Act.
00:03:52.940 And it talks about all of these different supposed online harms,
00:03:57.180 some of which are bona fide online harms.
00:04:00.300 But it talks about them as though they're all one particular thing,
00:04:05.520 as though they're one particular aspect.
00:04:08.100 Now, this is why the bill is so dangerous.
00:04:10.720 I'm going to read the bullet point list provided by the federal government
00:04:14.880 on what this bill purports to do.
00:04:18.400 It targets seven types of harmful content.
00:04:21.900 This is the government's language.
00:04:23.460 It would specifically target seven types of harmful content.
00:04:26.960 Number one, content that sexually victimizes a child
00:04:31.980 or re-victimizes a survivor.
00:04:34.100 Number two, intimate content communicated without consent.
00:04:38.060 Number three, content used to bully a child.
00:04:42.280 Number four, content that induces a child to harm themselves.
00:04:47.300 Number five, I'll get back to.
00:04:49.120 Number six, content that incites violence.
00:04:52.820 Number seven, content that incites violent extremism or terrorism.
00:04:57.440 And then we circle back to number five,
00:04:59.640 the one that I'll be spending most of the show talking about,
00:05:02.160 content that foments hatred.
00:05:05.300 So we have one bill that is trying to deal with child
00:05:11.120 pornography, trying to get young people to kill themselves, terrorism content and extremist
00:05:18.520 content, non-consensual sexual images, so naked photos that are shared in confidence with one
00:05:26.300 person and then republished online, all of that under the same banner as so-called hate speech.
00:05:32.940 Now, how does the government define what hate speech is? Because as always, the devil is in
00:05:38.840 the details on this. I will give you the definition. It is the same one they used in Bill C-36, which
00:05:43.860 came out in 2021, just before the election. It is content that is likely to foment detestation
00:05:52.200 or vilification. Detestation or vilification. Now, the bill says there are a couple of things that
00:05:59.980 don't count as hate speech. And I'm going to read the definition of hate speech. Hate speech means
00:06:04.780 the content of a communication that expresses detestation or vilification of an individual
00:06:09.080 or group of individuals on the basis of a prohibited ground of discrimination. So this is
00:06:14.180 basically race, gender, sexual orientation, and so on. Number nine of this law is a clarification.
00:06:22.300 For greater certainty, a communication does not express detestation or vilification solely because
00:06:28.980 it expresses disdain or dislike, or it discredits, humiliates, hurts, or offends. So you've got that
00:06:36.120 if something foments, sorry, if it is likely to foment detestation or vilification, it's hate
00:06:43.320 speech. But if it's just disdain or dislike or discrediting or humiliation or hurt or offense,
00:06:50.040 it is not hate speech. Are you confused? Do you know what's happening? Because I'm sure as heck
00:06:57.500 confused. Where is that line going to be drawn? At what point does a missive, does a communication
00:07:02.980 cross from disdain to detestation? At what point does it go from hurt and humiliation to vilification?
00:07:10.880 Let's take a couple of examples of this. The Ontario Human Rights Commission says that
00:07:16.620 misgendering someone is hate speech. The Ontario Human Rights Commission says that misgendering
00:07:22.200 is, in fact, hate.
00:07:24.000 Now, this is, to me, tremendously dangerous
00:07:25.940 because we already see that beauty, like hate,
00:07:28.440 is in the eye of the beholder.
00:07:31.120 If the commissars at the Canadian Human Rights Tribunal
00:07:33.860 decide that it is hate speech,
00:07:35.300 then your life will be miserable.
00:07:37.300 You don't even have the right to face your accuser.
00:07:40.080 Complaints can be made anonymously.
00:07:41.980 A complainant who makes one of these claims
00:07:43.800 could get $20,000 from the government.
00:07:45.840 So we have the emergence of a new grievance industry
00:07:48.760 where you could just go through Twitter,
00:07:50.680 find a bunch of people that have said things
00:07:52.040 you don't like that may apply to your demographic group. You can file complaints and, oh, what do
00:07:57.220 you know? $20,000, $40,000, $60,000, $100,000, $120,000, $220,000, all of these different amounts that
00:08:03.760 could be paid out under the scheme that the government says is making the internet a safer
00:08:09.120 place. But I go back to where the line is drawn. Now, as it happened this morning, I was scrolling
00:08:15.280 Twitter. I don't usually recommend that as a pastime, but I was away from the world for the
00:08:20.220 last week, so I was trying to get caught up. And I saw this tweet from a gentleman whose CV I will
00:08:27.140 read in part in a few moments. His name is Mark Kersten. Now, he was making a comment about Lisa
00:08:33.720 MacLeod, who's a member of the provincial parliament in the province of Ontario. This is what he said
00:08:38.900 about Lisa MacLeod. Let's put that tweet up there. "This is racist and potential hate speech from a
00:08:45.060 sitting provincial legislator." Now Lisa MacLeod was tweeting about an incident that took place
00:08:51.400 over the weekend. Protests in, again, protests that we've seen that are of an anti-Israel or
00:08:57.140 in many cases anti-Semitic flavor. I think we have a clip from one such protest.
00:09:15.060 "Intifada, we are the Intifada." Now, if you are a little bit unclear on that, intifada translates
00:09:32.200 to uprising. Intifadas in the Middle East have been very violent ordeals. So when a bunch of
00:09:38.360 people are standing outside a speech or a meeting by Giorgia Meloni, the Prime Minister of Italy, and
00:09:43.500 Justin Trudeau, declaring themselves the Intifada, I realized that if you're a Jewish person, you
00:09:47.860 probably are crossing the street to get away from all that. Now, I believe that is protected free
00:09:53.400 speech. I believe that unless you are being very explicit with your call to violence, you should
00:09:58.680 have the right to express your opinions and express your positions. And I have the right to
00:10:03.560 condemn them. I'm consistent on that, whether it's pro-Israel or anti-Israel rhetoric. Now, Lisa
00:10:09.260 MacLeod said this about those protesters in the city of Toronto. She said, though they aren't
00:10:16.000 pro-Palestine, they are pro-terror. Now, you can disagree with that. You can say it's perhaps
00:10:20.960 unbecoming of a legislator. You could agree with it. You can do whatever you want. That is what
00:10:24.980 free speech and debate are about, the ability to share opinions. But Mark Kersten says that is
00:10:33.680 potential hate speech. He doesn't go all the way to indict her, but he says it is potential
00:10:38.300 hate speech that she is guilty of. Now, I want to make very clear, he is not saying we should
00:10:44.100 throw the Human Rights Commission book at her, but his determination is that Lisa MacLeod's comment
00:10:48.540 is potentially hate speech. Now, why am I focusing so much on this guy you'd never heard of and I had
00:10:53.840 never heard of before this morning and something he posted on Twitter? Because he is a human rights
00:10:59.500 lawyer. He is a law professor at the University of the Fraser Valley in British Columbia. It's not
00:11:06.320 the Canadian version of an Ivy League university. Associate professor, Sean says. He's on the
00:11:12.920 faculty. Associate professor notwithstanding. He has a PhD. He's very well
00:11:18.660 educated from the London School of Economics. This is a guy who is supposed to know the law.
00:11:24.420 He's supposed to know the situation. And he is saying that this is potential hate speech. Now,
00:11:29.940 is this just his own academic opinion or is this a legal opinion? I don't particularly care.
00:11:35.440 What I'm saying here is that already we have,
00:11:38.200 within a week of legislation being announced,
00:11:40.960 a law professor arguing that speech
00:11:43.960 from a provincial politician in Ontario
00:11:46.240 is potentially hate speech.
00:11:47.720 Now, what if a guy like this ends up being appointed
00:11:51.180 as an adjudicator or member
00:11:53.500 of the Canadian Human Rights Tribunal,
00:11:55.640 which is going to be the body tasked
00:11:57.180 with adjudicating and determining these matters?
00:12:01.400 Now, all of a sudden,
00:12:02.420 if you say that a group of anti-Israel protesters are pro-terror, you could be held up
00:12:08.120 as guilty of hate speech. You could be maligned and, of course, censored. Now, when I said that the
00:12:14.000 federal government through Bill C-63 had reintroduced something that was repealed back in 2013, they have
00:12:20.060 supercharged it. They've added to this takedown powers, where social media platforms like Facebook
00:12:25.480 and Twitter, or X as it's now called, and Google will have to take down all of this content. So now
00:12:31.320 you have a mechanism by which you can not only say to the Canadian Human Rights Commission or
00:12:36.200 subsequently the tribunal that these bodies should go after that guy for saying that thing I don't
00:12:41.960 like, but you can now go to the social media companies and say you should not allow this to
00:12:46.060 exist. So we will have big tech being deputized by big government to censor your speech. And the
00:12:53.940 guy with his finger on the trigger of all of this right now is Justin Trudeau. Now I want to play
00:13:00.640 a clip of the Justice Minister, Arif Virani, claiming that, oh no, your charter rights are
00:13:07.480 going to be fine and none of this is going to attack or assault your charter rights. Take a look.
00:13:13.080 The other thing that's really critical to understand, David, is that people talk to me
00:13:16.560 about freedom of expression. I'm duty bound to defend that and always will. It is a charter
00:13:20.680 protected right. I swore an oath to uphold the constitution. But what I also explain to people
00:13:24.960 is that right now, a lot of what you put out to the world and also what you receive in terms of
00:13:29.700 your feed is being decided for you not by a government, but by a private company. That private
00:13:34.780 company is making decisions behind closed doors. We don't know what decisions they're taking or
00:13:39.120 why they're taking them. That will change by forcing them to provide a digital safety plan
00:13:43.660 that exposes and provides transparency about the decisions they're taking that affect your and my
00:13:48.720 free speech. So he's saying that government should be the one making that decision on what you say
00:13:56.880 and see online. You can't trust big tech companies to do it, which I, by the way, agree with. But
00:14:01.660 the result of that is not trust government instead. But the justice minister in Canada
00:14:06.820 says, no, no, no. Government is going to be the body that's going to look after you. You can't
00:14:10.580 trust Facebook and Google. We'll be the ones. We'll be the censors. We'll be the authorities.
00:14:14.860 We will manage the algorithms. All your speeches are belong to us. That's what the government
00:14:19.380 is saying to Canadians right now. Do you feel better about it? Because I sure as heck don't.
00:14:25.160 I've said that big tech is a malign alliance, the big tech companies, on your free speech.
00:14:32.860 Government is forming its own malignant force against your free speech.
00:14:37.580 You put those two together, and it is an all-out assault on free speech,
00:14:41.420 especially when you combine this with all of the other internet regulations to date.
00:14:45.980 C-18, that has effectively restricted where you can access news content online.
00:14:52.160 The Online News Act, Bill, what was it, C11 or C10?
00:14:56.220 I get the numbers all mixed up now.
00:14:57.740 I think it was C10 and then it was reintroduced, C11.
00:15:00.820 The bill that's effectively mandating a podcast registry.
00:15:04.160 So government has been setting the groundwork.
00:15:06.240 It's been laying the groundwork all this time for the coup de grace,
00:15:11.660 which is now government saying these places that have now been licensed,
00:15:16.500 these places that have now had authority delegated to them,
00:15:19.040 these places that government has now brought into its regulatory ambit, they now have to take down
00:15:25.440 your content if the government says so. If the government says so, they now have to take off
00:15:31.280 all your content. So Bill C-11, Bill C-18, all of that was just the appetizer. It was the palate
00:15:39.140 cleanser. I realize those are two different dishes, but one was the appetizer, one was the palate
00:15:43.260 cleanser. And here we have the main course of Justin Trudeau's censorship desires. Now, we're
00:15:49.860 going to talk about this with Christine Van Gein in just a couple of moments time here, because
00:15:53.900 one of the points that I think is incredibly necessary to stress is that a lot of what the
00:16:01.560 government has brought into this bill is already captured by other laws. They're regulating things
00:16:06.840 that are already illegal. They're regulating things that have already been prohibited.
00:16:12.440 Hate speech is a great example of this. The criminal code of this country already prohibits
00:16:17.200 hate speech. If something is illegal hate speech in the criminal code in your local deli or on a
00:16:24.120 street corner, it is also illegal on the internet. So when the government has brought this online
00:16:30.500 hate speech component, what they've actually done is created a new threshold. They have lowered the
00:16:35.840 bar. The criminal code threshold, I don't have the exact wording handy, but the criminal code
00:16:40.580 threshold is so insanely high a bar that what it requires is that you must prove to convict someone
00:16:47.300 of hate speech that they have effectively been inciting, usually violence or a call to genocide.
00:16:53.700 That's typically what is being dealt with here. But when you then bring it into the context of
00:16:59.260 here, we've talked about that category that doesn't quite make sense. It has to be likely
00:17:04.580 to foment detestation or vilification, but it can't just be merely causing disdain or humiliation
00:17:11.020 or dislike. Well, I would venture to say that one person's dislike is another person's detestation,
00:17:18.700 that something, for example, misgendering, that's the example I brought up in Ontario,
00:17:23.520 that misgendering will be at some point subject to a complaint. And it is very reasonable. I mean,
00:17:31.760 if you look at the issue that launched Jordan Peterson to fame, it was Bill C-16 that was
00:17:36.760 amending the Canadian Human Rights Act to put gender identity in the list of prohibited grounds
00:17:42.380 of discrimination. So the fight that Jordan Peterson was waging against Bill C-16 has a
00:17:48.100 direct line to what is now human rights law in Canada. But just look at this. It's absolutely
00:17:54.100 insane. If you are found guilty of violating the online hate speech provision, Section 13,
00:18:00.280 you will not only have an order to cease the discriminatory practice
00:18:04.700 and take measures in consultation with the Human Rights Commission
00:18:08.360 to redress the practice or prevent the same or a similar practice from occurring.
00:18:14.280 So your future speech is going to be affected by this.
00:18:17.940 Not just your current speech, but you have to prevent it from the future.
00:18:22.140 So you have to essentially be enjoined to this plan
00:18:25.780 that the government is going to come up with with you
00:18:27.780 to not offend people's sensibilities in the future.
00:18:30.340 You have to pay the victim up to $20,000
00:18:34.720 for any pain and suffering that they experienced
00:18:38.800 as a result of the hate speech
00:18:40.860 to which the complaint relates.
00:18:42.960 And I'm gonna venture a guess to say
00:18:44.540 that there won't actually be a huge rigorous test
00:18:49.540 on whether you had hate,
00:18:53.000 whether your so-called hate genuinely caused any problems.
00:18:56.440 I just don't see that happening.
00:18:58.460 And also, it's not just enough that you pay the person that you may have,
00:19:02.500 I don't know, misgendered or something, $20,000.
00:19:05.440 You also will have to pay a fine of not more than $50,000,
00:19:09.080 to the Receiver General if the Human Rights Tribunal decides this.
00:19:13.160 So that is a stand-in for the government.
00:19:15.420 You will have to pay the government.
00:19:18.120 Pay the government if you are found guilty of this.
00:19:22.300 So you could be out $70,000 if you post a tweet
00:19:25.300 that Justin Trudeau's censorship law decides was just a little bit too offensive. It went a little
00:19:31.000 bit higher than disdain and dislike and all the way to detestation and vilification. Now, let me
00:19:38.240 play a clip. This was of the justice minister, not the one we played earlier, but this was him
00:19:43.020 explaining why this bill is the direction that the government apparently needs to go.
00:19:48.620 This is what the Online Harms Act will do. Under this bill, major online services will have three
00:19:55.280 overarching obligations: a duty to protect children, a duty to act
00:20:00.380 responsibly, and the duty to remove the most egregious content. This bill targets
00:20:06.200 the worst of what we see online: content that sexually victimizes children or
00:20:11.240 revictimizes survivors, intimate content shared without consent, content that
00:20:17.840 incites violent extremism or terrorism, content that incites violence
00:20:22.700 or foments hatred, and content that is used to bully a child or induce a child to self-harm.
00:20:30.460 This bill will establish a new Digital Safety Commission to make sure that online services
00:20:35.420 comply with their new obligations, as well as an ombudsperson to advocate for users
00:20:40.940 and victims of online harm. Again, the way that they're viewing all of these things as the same
00:20:49.740 problem: they take an issue that we could all agree on, which is the sharing of non-consensual
00:20:54.260 images, which is child sexual exploitation. All of that we could agree is a bad thing. No Liberal,
00:20:59.680 no Conservative, no New Democrat is going to stand up and say, you know what, maybe you should be
00:21:05.020 able to share revenge porn online. No, but they put it all together so that if anyone does what
00:21:11.620 I'm doing, which is saying that this bill is a problem, this bill is bad, if anyone does that,
00:21:16.220 they are going to get hit by criticism from the Liberals on,
00:21:20.200 oh, well, the Conservatives are just trying to keep children unsafe.
00:21:23.440 The Conservatives are okay with the internet being a dangerous place and all that.
00:21:27.980 This is exactly what's going to happen.
00:21:29.740 It's what I called months ago and years ago when this issue first came up.
00:21:34.540 We were supposed to have Christine Van Geyn on the program.
00:21:37.240 We've had some technical issues there, so we'll try to get her in a little bit later on.
00:21:41.460 But I want to welcome into the show Kris Sims on this,
00:21:43.820 our regular Monday fabulous contributor.
00:21:46.980 It's fabulous any day,
00:21:47.900 but on our show,
00:21:48.480 she's fabulous on Mondays
00:21:49.580 from the Canadian Taxpayers Federation.
00:21:52.020 Kris, always a pleasure.
00:21:53.160 Thanks for coming on.
00:21:54.920 Likewise.
00:21:55.660 And thank you for holding down the fort
00:21:57.120 on Mr. Steyn's cruise.
00:21:58.620 I hope it went really well.
00:21:59.880 It did.
00:22:00.420 It was a lot of fun.
00:22:01.220 I said we were in international water,
00:22:02.580 so I was free of any of these
00:22:03.760 so-called hate speech complaints.
00:22:05.960 Not that this,
00:22:06.780 I mean, this is an issue
00:22:07.580 from a constitutional perspective,
00:22:09.020 but one of the things
00:22:10.560 that I would point out also
00:22:11.760 is that they're bureaucratizing
00:22:13.520 this process. The idea that we need all of these different roles and bureaucrats that are now
00:22:18.260 adjudicating the online process. What could possibly go wrong, right? Yeah, exactly. So
00:22:23.580 they're going to balloon the size of government as if we don't have a big enough government already.
00:22:27.600 And I hope Christine manages to log in here because she's absolutely brilliant on these issues.
00:22:33.220 And of course, it was partially her team that had that big win at the federal court level very
00:22:38.920 recently on the invocation of the Emergencies Act, so she really knows her stuff. We've known her for
00:22:44.840 years. From the taxpayers' perspective, one of our key elements is accountable government, and I need
00:22:51.480 to stress to folks here that you won't... Oh, I think you've now gotten a complaint against you from the
00:22:56.440 government. Kris, your camera just went off. My goodness, we've been having some really funny
00:23:00.680 technical issues here. This is Bill C-63 in action: once Kris starts criticizing the government,
00:23:06.280 the, uh, the camera goes off. Well, uh, oh, there we go, we got you back. I haven't... I'm not joking, I'm not
00:23:12.120 touching anything. It makes no sense that this happens. Anyway, I'll try to will my camera to stay
00:23:17.720 on. The point here is, and you described it brilliantly, Andrew, is with free expression, that
00:23:24.280 is up to someone else's judgment, right? What you say or what I say may offend other people, may not
00:23:31.160 offend others, and may cause detestation in others, and may not cause detestation in others. The point
00:23:36.920 here is that this is a freedom for exactly this reason. And you do not want, folks, you do not want
00:23:44.760 the state deciding whether or not misinformation can be construed as hate speech or whether or not
00:23:52.520 if I calmly explain to a young person that you know what, the carbon tax, if we crank it up,
00:23:59.240 is not going to stop the planet from ceasing to exist in 18 months. Like here's maybe some other
00:24:05.260 data. I'll walk you through some of the things that I've lived through and look back through
00:24:09.900 a history book. Some folks who are really into this climate stuff might consider that to be
00:24:15.380 hate speech. It may be attacking their identity. Like this is the weird slippery slope that we
00:24:21.740 could go down here and we can't do this. We should not be growing the size of government
00:24:25.760 and we should not be impinging on people's free expression because otherwise we won't be able to
00:24:31.440 hold the government to account. And again, I try to explain this to everybody who might be thinking,
00:24:36.480 oh, well, I don't mind if Justin Trudeau decides what is and is not hate speech and whether or not
00:24:41.300 I'm allowed to express myself. Okay. If you're totally fine with Justin Trudeau deciding that,
00:24:46.880 are you also fine with Pierre Poilievre deciding that for you? If the answer is no, we shouldn't
00:24:52.140 be changing these laws. Yeah. And that was, I don't know if you were on when I played that
00:24:56.980 clip from Arif Virani, the justice minister, but when he was saying, oh, no, no, we can't trust big
00:25:01.020 tech companies to adjudicate what you can say and see online. Government is the answer to that. I
00:25:06.200 mean, I'm like, hang on, how about neither? How about we do, how about we delegate that authority
00:25:10.600 to neither of them? Is that an option? Apparently not. And you are absolutely right about this
00:25:15.280 because it's very difficult in this day and age to do the principled stand that I'm doing and
00:25:20.380 you're doing here and saying, no, I don't believe government should regulate hate speech online,
00:25:24.620 because you're necessarily taking a stand in support of your ability to say things that
00:25:29.500 I would firmly believe people shouldn't say. But the governor of that should not be the government.
00:25:35.740 It should be civil society. It should be your own conscience. It should be the friend that says, hey,
00:25:41.020 you should probably delete that tweet. And we've just completely abandoned that in society.
00:25:45.500 And you're right, there are so many people that would love to put a bill like this in
00:25:48.700 to stick it to conservatives. But the second you get some trigger-happy conservative government
00:25:53.740 that wants to go after, let's say, anti-Israel speech... Now, this brings us to the Canadian, or
00:26:00.060 the Centre for Israel and Jewish Affairs, a group that has done tremendous work on anti-Semitism
00:26:05.020 and I think is a very important voice. They've come out in favor of this, and I understand why. I
00:26:10.300 understand why they look at anti-Semitism as a problem and say, we want to go after this. But
00:26:15.500 that would be used, if they had their way, against a lot of speech that right now people
00:26:20.320 on the left are quite fond of. Exactly. And to your point that you made earlier, a lot of this stuff
00:26:26.120 that you described off the top of the show is already illegal. It's already illegal in real life.
00:26:32.720 It's already illegal on the internet. Now, I did find this interesting. Correct me if I'm wrong,
00:26:38.420 but I think they've shifted this now from culture over to justice, which is really interesting, because
00:26:44.360 the last two times they tried pushing this thing through, they tried doing it through the auspices
00:26:49.240 of culture. So they're getting a little bit smarter, because they're putting it through justice. I'll
00:26:53.960 put it this way. I just want to take my Taxpayers Federation hat off slightly here, um, but it still comes
00:26:59.320 down to accountable government, okay? It's still about accountable government and smaller government.
00:27:03.880 Anybody who's followed the Hill for the last 20 or 30 years or so has not just fallen off the
00:27:08.920 turnip truck. We can see what they're doing here. They're repackaging things like protecting
00:27:15.120 children who wouldn't want to do that, protecting people from hideous images, the revenge p-word
00:27:22.440 as you've said before. All of that is understood. All of that's already illegal but then they're
00:27:28.040 putting this little thing in there about hate speech, which is widely defined, really variable,
00:27:35.480 and we've seen it play out before. They could easily separate this bill. Now, I'm not a lawyer,
00:27:41.000 hopefully Christine will be able to explain some of that if she has time, but if they could split
00:27:45.160 this bill, I'm guessing that this would be a lot easier to manage. Why link it to something that
00:27:50.600 every decent person would want to outlaw anyway? Yeah, I very much agree. I mean, for starters, it
00:28:00.680 overcomplicates the bill, and it also slows down what could be an easy slam dunk. I mean, the Conservatives
00:28:00.680 I think would probably be on the front lines of pushing that forward and saying, let's just get
00:28:05.400 that done in weeks. And I also just to put your taxpayer hat back on here, Chris, I wanted to ask
00:28:10.860 about this Digital Safety Commission of Canada, because the bill creates this body, it creates
00:28:15.980 this bureaucracy to manage online safety. Now, we can look around the world and see what the
00:28:22.180 government is going after there. And one of them, I should have pulled the clip today, that they've
00:28:26.620 looked at when they were crafting this was in Australia, which has a woman by the name of Julie
00:28:30.360 Inman Grant, who is the eSafety Commissioner of Australia, who I have run into on the streets of
00:28:35.980 Davos and who famously said at Davos that we need to recalibrate things like freedom of speech
00:28:41.840 because she believes they are anachronisms in the internet age. So it's pretty clear that the
00:28:47.740 government is creating a bureaucracy that's going to just put censorship into this perpetual motion
00:28:52.300 here. And bodies like that do not like to stay in their lane. Bodies like that are the ones that are
00:28:57.240 going to come out in three years and say it's not enough to go after hate speech we need to go after
00:29:01.640 misinformation as well yes exactly when you grow government that's like purposefully planting
00:29:07.220 invasive species it will choke out the sunlight it'll take up all the soil it'll eat
00:29:12.540 up all the oxygen and it will put a chill on free expression and free speech and they tried
00:29:18.140 that last time i'm curious to see that they've kept this now in this new incarnation i did see
00:29:22.940 that they're going to keep the so-called censorship or free expression czar. So that
00:29:28.800 position is still going to be there. And under that person, they're going to build an entire
00:29:33.560 bureaucracy. Folks, we are more than a trillion dollars in debt. We have no money. We cannot pay
00:29:40.640 for one more person to be hired by the federal government. If you and I started counting to a
00:29:46.040 trillion right now, it would take us 30,000 years. So in no way should the Trudeau government be
00:29:52.340 expanding the size of the government, much less in order to make themselves less accountable to the
00:29:57.640 people. All right. Well, thank you so much, Chris Sims. Always great to talk to you. We will see you
00:30:03.320 next Monday. Likewise. Thank you. All right. And as promised, we have our long-awaited legal eagle
00:30:10.320 herself, Christine Van Geyn, who is the litigation director for the Canadian Constitution Foundation
00:30:15.180 on the program. Christine, always good to talk to you. Thanks for coming on today.
00:30:20.300 Yeah, thanks for having me on, Andrew.
00:30:22.160 So, I mean, we've covered a lot of, I think, the bases of why I'm concerned as a non-lawyer free speech advocate on this.
00:30:29.000 But let me just ask you about that point that Chris had alluded to there and that I spoke to earlier, which is that the government has gone after a lot of things here that most people would assume rightfully so are illegal.
00:30:41.880 I mean, even on the non-consensual images stuff, I understood a lot of that already existed in law.
00:30:46.060 Certainly on hate speech, we already have a criminal prohibition on hate speech, which
00:30:50.080 applies to the internet. So this new definition has a lower threshold. Am I missing anything here?
00:30:57.520 So I think that the definition, the statutory definition they're proposing seems like it's the
00:31:01.800 same definition as exists in the Whatcott decision, which is where the definition
00:31:10.080 previously came from. The problem is that it's by its very nature amorphous, that they are
00:31:16.620 increasing the penalties and they're creating a standalone offense of an offense motivated by
00:31:21.880 hatred and that it can apply to all federal laws. So all federal statutes are included.
00:31:29.980 So anyone who breaks any other federal law motivated by hate can be found guilty of a hate
00:31:36.620 crime and subject to a maximum life sentence um so that's not just limited to the criminal code
00:31:42.220 they're also increasing the statutory maximum for the crime of advocating genocide up to a maximum
00:31:49.420 of currently five years they're increasing that to life imprisonment and look obviously the
00:31:57.020 problem with criticizing this bill is that advocating for genocide is abhorrent and terrible
00:32:02.700 there are a lot of political challenges in criticizing legislation like this a big
00:32:09.180 part of it is that for some reason politically the government has tied completely unrelated
00:32:15.260 things to this bill so they're trying to solve completely unrelated problems of sexual abuse
00:32:20.940 imagery whether that's sexual abuse of children or the non-consensual disclosure of intimate images
00:32:27.820 to things that are already criminalized or tying that to hate speech which is an inherently amorphous
00:32:34.860 and subjective problem and obviously hate online is wrong and bad but the way the government
00:32:44.060 has proposed tackling this is unconstitutional i think that this law is going to be subject
00:32:49.340 to an immediate charter challenge and we don't want the parts of the bill that deal with sexual
00:32:55.900 abuse to be struck down. If this government actually cared about sexual criminals, they would
00:33:01.420 not tie increased penalties or improved reporting for sexual abuse to a constitutionally vulnerable
00:33:11.460 proposal. Explain to me if you're able to, because I know a lot of this would come from regulations
00:33:18.400 that would have to be passed after the bill is passed and enacted, but how this would even work
00:33:23.900 when you're talking about the relationship
00:33:25.180 between these regulations and internet platforms
00:33:28.320 like Facebook or Twitter,
00:33:29.780 because they would have under this an obligation
00:33:31.780 to remove content that would be in violation of this.
00:33:36.600 But I mean, what would that mechanism even look like?
00:33:40.420 My fear is that Facebook is either going to
00:33:43.580 do the non-compliance thing that they did with C-18
00:33:45.960 and say, you know what, it's too much of a hassle.
00:33:47.880 We're just shutting down our platform to Canadians.
00:33:49.840 There's no point in being the adjudicators.
00:33:52.020 or if they want to play ball, the worst thing is, all right, well, we'll come up with these
00:33:57.060 terms of service to encompass what we think the Canadian government is after. And all of a sudden
00:34:02.340 you have Facebook preemptively zapping content because they don't want to deal with the digital
00:34:07.880 safety commissioner of Canada. Yeah, I think that's absolutely right. It's going to create
00:34:12.300 a chilling effect where content will be preemptively or proactively removed when it doesn't
00:34:19.220 actually meet the threshold that's being outlined in this proposed statutory definition of hate
00:34:26.760 speech. And look, it's because the definition is inherently subjective, and I specialize in
00:34:34.420 expression and freedom of expression law. And it's difficult for lawyers like me to explain
00:34:40.880 where that line gets drawn. So I think it's going to be incredibly hard for regular Canadians.
00:34:47.560 now the bill proposes this automatic reporting mechanism so ordinary canadians can
00:34:54.360 report content that they think meets this threshold we're going to see just an inundation
00:34:59.160 of reporting that comes with, if it does meet that threshold, huge financial penalties for
00:35:06.920 the platforms i think it was 10 million dollars in fines or six percent of global
00:35:16.200 revenues, whichever is higher. And for platforms like Facebook or Meta, Google, Instagram, these
00:35:22.700 are, that's a huge amount of money. They obviously are going to be incentivized to take content down
00:35:29.880 proactively. And then the other part of this is that the creation of this civil remedy, where
00:35:35.840 people can complain about speech that they think is hateful to the Canadian Human Rights Commission,
00:35:42.460 people are going to soft-pedal their own speech and not say things that they might
00:35:48.420 otherwise say and I wanted to give you a couple of examples because I think that we are talking
00:35:54.660 about this in a very high-level way without actually putting our teeth into what some
00:36:00.420 of these terms mean so hate speech is defined in the case law it's now going to be put in the
00:36:06.200 statute as words like detestation or vilification. It doesn't include speech that
00:36:13.560 is just offensive; it has to be at a higher level, even though these are kind of
00:36:18.660 synonyms. Yeah, it has to be higher than disdain. It can't just be disdain. Yeah. So to
00:36:22.920 try to put some teeth in that, in the Whatcott decision from the Supreme Court, the court gave a
00:36:28.400 few examples of what they call the hallmarks of hatred, speech that has the hallmarks of hatred.
00:36:34.060 So an example is blaming a whole group for current problems in society, alleging that that group is a powerful menace. Now, I can certainly imagine that type of language being used in a way that blames some racial group.
00:36:51.360 But frankly, Andrew, I've been to some women's meetings where you are the problem.
00:36:57.440 It's men.
00:36:58.460 It's white men.
00:36:59.900 I've been to DEI trainings where those are the groups who are blamed for the problems
00:37:06.700 of society.
00:37:07.640 And look, I don't like DEI training, but I don't think it should be illegal.
00:37:12.340 Another example is describing an entire group as, quote, pure evil.
00:37:19.960 This is another hallmark of hatred. And certainly I can imagine examples where an entire race or religion is called pure evil.
00:37:28.960 But I'm not sure if you're familiar with the group, the Westboro Baptist Church.
00:37:33.800 This is a notoriously awful church based in the United States that famously holds signs at the funerals of war veterans celebrating the death of those veterans.
00:37:49.320 and attacking other religious or sexual minority groups.
00:37:53.720 This is an awful group.
00:37:57.200 And if you Google Westboro Baptist Church and the term pure evil,
00:38:01.400 you get a lot of hits.
00:38:03.020 Is that hate speech?
00:38:04.180 Well, it certainly is calling a religious group pure evil,
00:38:09.260 which is a hallmark of hatred.
00:38:10.900 But under C-63, do we want to subject people who say that that church is,
00:38:18.540 quote, pure evil to criminal penalty, or even to civil penalty. I don't know that that's an
00:38:25.620 outcome that we want. So these are just a couple of examples to put some colour, some flavour to
00:38:31.740 what this might look like, and why it's such a problem when we criminalise speech. We want these
00:38:39.740 ideas to be fleshed out and debated. And frankly, a lot of the things that, you know, you might
00:38:45.800 think are a hallmark of hatred, well, of course, using these terms will always be hateful, but it won't
00:38:51.440 be. It just won't. And we don't want to criminalize that. And I know that I mentioned a little bit
00:38:56.580 before you came on, there's this $20,000 that if you're a successful complainant, you could
00:39:01.220 theoretically get up to, and it's dependent on what you've endured on this. But really, to make
00:39:07.320 a complaint, it doesn't look like you need to have suffered anything from it, or it even needs to
00:39:12.160 have been targeted to you. I mean, if I am a member of some group, and I don't like your post
00:39:18.180 that affects my group, even if you didn't direct it at me, it sounds like I would have standing
00:39:23.080 to bring that to the Canadian human rights czars, would I not? Yeah, I mean, some of these details
00:39:29.080 haven't been fleshed out completely yet. But certainly, that's sort of how the old Canadian
00:39:34.820 Human Rights Act, which had Section 13, where you could make these civil complaints,
00:39:41.080 That's how it used to work. And that provision was famously repealed after it was abused to
00:39:49.160 bring these claims against author Mark Steyn. So it was right for Parliament to remove that
00:39:56.200 10 years ago and to bring it back now just doesn't make any sense. It shouldn't happen.
00:40:03.080 And keep in mind that these civil penalties will be proven at a lower threshold than the criminal
00:40:09.400 penalties. To prove the criminal offenses of hate speech, it's beyond a reasonable doubt. And under
00:40:16.760 the civil penalty, it's on a balance of probabilities. And in many ways, look, there's
00:40:22.740 going to be a lot of frivolous complaints. I mean, I'm willing to take bets. I'm sure there's some
00:40:27.580 Vegas odds on who's going to have a complaint first, you or me, Andrew. I'm getting one on
00:40:34.280 day one. I think you're too nice on that. No, people have already been saying, "I can't wait
00:40:39.120 until Christine's charged." Some tech person's gonna make some script where you just, you know,
00:40:46.120 select your right-wing hater and you just auto-file complaints it'll be like on twitter when
00:40:51.240 you report something it'll like give you 10 sample tweets and it's like check the boxes of the ones
00:40:55.580 you might also want to complain about yeah so in many ways these complaints might
00:41:01.820 not be investigated but as chris sims your previous guest explained they're going to need to
00:41:07.520 have a whole bureaucracy to be investigating these complaints and dismissing them and then
00:41:12.160 some of them that perhaps aren't even close to the threshold are still going to be investigated
00:41:18.560 and once you're investigated the process becomes the punishment because you need to defend yourself
00:41:23.520 in that process and even if it's not a criminal charge even if you have not even risen to the
00:41:30.480 level of civil hate speech you still are going to pay the costs of defending yourself
00:41:36.880 through a lawyer, or socially, the costs of having your name dragged through the mud.
00:41:42.320 Well, we'll certainly be unpacking this from more angles as things progress. The Liberals have been,
00:41:48.260 I mean, the thing is, it's taken them about three years to come up with this. So the idea that there
00:41:53.860 was something they overlooked is very unlikely. They're very aware of what's in this. They've
00:41:58.700 chosen this all deliberately. And I think that should be very concerning. Christine Van Geyn,
00:42:02.960 I know you'll have lots of commentary on this on your podcast over at the CCF.
00:42:07.040 Thank you so much for sharing your thoughts as always with us.
00:42:09.500 Thank you, Andrew.
00:42:10.660 All right.
00:42:11.100 Thank you, Christine Van Geyn.
00:42:12.960 Just before we get things rolling into ending the show,
00:42:17.000 I guess that's stopping the rolling,
00:42:18.600 but before we start gathering moss on the program,
00:42:21.880 I wanted to share a bit of an announcement with you.
00:42:24.600 I've teased this in recent weeks,
00:42:26.640 and anyone who tried to email me over the last few months and didn't get a response,
00:42:30.200 you've probably been aware that I've been working on something here.
00:42:32.960 I can finally tell you what that is.
00:42:35.820 I have my second book coming out in just a couple of months.
00:42:39.480 We're putting the finishing touches on it with my publisher in the next couple of days here.
00:42:44.240 But we can certainly share with you the cover of it.
00:42:47.400 This is Pierre Poilievre, A Political Life.
00:42:50.880 It is the first biography of Conservative leader Pierre Poilievre,
00:42:55.240 a man who may well be Canada's next Prime Minister.
00:42:58.580 This is not a memoir.
00:43:00.140 It's not written by him or even with his cooperation as he did not agree to be interviewed for it.
00:43:05.240 But it is a book that I'm very proud of that goes from the very early years of his life through his childhood, adolescence, his involvement in politics, and all that he's done as a politician.
00:43:17.020 And it was a bit of an interesting process to write about.
00:43:20.700 I followed his career, obviously, but I didn't know a lot of what I ended up learning during the writing process.
00:43:26.360 That book will be coming out in May.
00:43:28.080 if you are interested in pre-ordering it
00:43:30.520 you can do so at Amazon
00:43:31.760 or also through the publisher
00:43:33.500 which is Sutherland House
00:43:35.060 so you can head on over to
00:43:36.940 SutherlandHouseBooks.com
00:43:38.720 and hopefully we'll be able to do some events
00:43:40.580 in your towns across this great nation
00:43:42.580 when it comes out in May
00:43:43.940 but wanted to share those details with you now
00:43:46.220 that does it for us for today
00:43:47.880 we'll be back in just 23 hours and 15 minutes
00:43:50.540 with more of Canada's Most Irreverent Talk Show
00:43:52.880 here on True North
00:43:54.020 thank you, God bless and good day to you all
00:43:56.420 Thanks for listening to The Andrew Lawton Show.
00:43:59.660 Support the program by donating to True North at www.tnc.news.
00:44:26.420 We'll be right back.