Juno News - December 14, 2022


The Liberals are waging war on internet free speech


Episode Stats

Length

31 minutes

Words per Minute

181.78017

Word Count

5,651

Sentence Count

210

Hate Speech Sentences

4
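
The words-per-minute stat above is just the word count divided by the episode length in minutes. A minimal sketch of that arithmetic in Python (the exact duration in seconds is an assumption, since the table rounds the length to 31 minutes):

```python
# Reproduce the words-per-minute episode stat.
# duration_seconds is a hypothetical exact value: the table rounds
# the length to "31 minutes", and the listed WPM implies a duration
# slightly over 31:00.
word_count = 5651
duration_seconds = 1865  # assumed: 31 min 5 s

wpm = word_count / (duration_seconds / 60)
print(f"{wpm:.1f} words per minute")  # → 181.8 words per minute
```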


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of Canada's Most Irreverent Talk Show, host Andrew Lawton takes a victory lap after a tweet he sent to Elon Musk about the liberal government in Canada's attempt to regulate the internet and force social media companies to remove so-called hate speech.

Transcript

Transcript generated with Whisper (turbo).
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.980 Welcome to Canada's Most Irreverent Talk Show. This is the Andrew Lawton Show, brought to you by True North.
00:00:13.360 Hello and welcome to you all. This is Canada's Most Irreverent Talk Show, the Andrew Lawton Show, here on True North on this Wednesday, December 14th, 2022.
00:00:25.340 Down to 11 days until Christmas.
00:00:28.160 I know I am infuriating you with the Christmas countdown.
00:00:31.740 I feel like I'm going to be accused of a hate crime,
00:00:33.900 not because I am supporting the Christian holiday
00:00:37.660 more than any other faith's holiday,
00:00:39.780 but just because everyone is so stressed out
00:00:41.500 that the reminder of how quickly Christmas is approaching
00:00:44.620 will likely be the kind of thing to induce vast levels of hatred in society.
00:00:48.780 But I take no responsibility for that.
00:00:50.460 This is a free speech zone, a safe space for all opinions,
00:00:53.620 and I thank you so very much for tuning in.
00:00:57.840 So this is one of those moments where before I get into the thick of it,
00:01:01.940 I have to do a little bit of a victory lap here
00:01:04.340 because I feel like this is one of those programs
00:01:07.400 where I can say that the Andrew Lawton Show got action.
00:01:11.080 Yesterday, I had an idea which wasn't really prompted by anything specific
00:01:15.880 except for some of the general news that had been emerging
00:01:18.900 with the muskification of Twitter.
00:01:21.240 And I was talking about how I think Elon Musk really needs to take a stand against the Liberal government in Canada and its efforts to regulate the internet: specifically, to force social media companies to pay mainstream media outlets, and to force social media companies to remove content that fits the Liberal government's definition of so-called hate speech.
00:01:42.700 And I did a very impassioned show about this, spoke about it in a number of different contexts.
00:01:49.440 And then I had also tweeted at Elon Musk, kind of calling on him, trying to bring attention to this.
00:01:55.220 Anyway, I don't really think anything of it.
00:01:57.260 A lot of people respond to the tweet, and I don't know if Elon's going to see it or not.
00:02:01.420 I think an hour went by, and I figured, okay, I guess he's busy doing other things like running his many companies.
00:02:07.400 And then I wake up this morning.
00:02:09.900 And I think I got up a little bit later than I usually did, maybe six o'clock or so.
00:02:14.180 And my phone is just like lighting up every few seconds.
00:02:17.940 And I'm like, what on earth is happening?
00:02:19.800 And I look over at it and realize that this tweet to Elon Musk has blown up.
00:02:25.220 And then I realized the reason it's blown up is because Elon Musk has replied to my
00:02:30.540 tweet to him, calling on him to act, in a sense, on what Canada is doing.
00:02:36.220 So my tweet, just to give you a little bit of a refresher, was this.
00:02:41.380 Canada's liberal government wants to regulate internet content and deputize social media companies to enforce hate speech bans with a low and murky threshold for what hate speech is.
00:02:52.400 I hope Elon Musk takes a stand against this.
00:02:55.940 And then Elon replies, sounds like an attempt to muzzle the voice of the people of Canada.
00:03:03.360 And that is quite an astute observation, I think, more astute than most of the members of the legacy
00:03:09.740 media in Canada have been able to come up with about what's happening. And I wanted to expand a
00:03:15.040 little bit on the themes we started talking about yesterday and explain why there is an all-out
00:03:20.220 war on internet freedom taking place right now that is not just about C-11. And if you
00:03:27.920 subscribe to my Substack newsletter, just a few minutes before going on air I had published my latest
00:03:32.660 edition, which is on this topic, and I say that C-11 is the tip of the censorship iceberg. The
00:03:38.420 reason I bring that up is because there's been a very sustained backlash to C-11, and I think
00:03:43.540 justifiably so. It's a terrible bill that does a lot of bad stuff to the internet by manipulating
00:03:49.440 algorithms to force more Canadian content to appear. So you want to go and see the latest clip
00:03:55.700 from, I don't know, Joe Rogan or My Little Pony, or perhaps both, maybe it's My Little Rogan, you never
00:04:02.160 know, and instead you get the highlight clip from Little Mosque on the Prairie, because you haven't
00:04:08.820 hit your Canadian content quota for the day and Justin Trudeau says you need to see more of it.
00:04:13.440 Now, the amusing part of it is that technically the Andrew Lawton Show is Canadian. Not technically, I
00:04:18.940 mean technically, logistically, de facto, de jure, this is Canadian content. But I don't think this
00:04:24.560 is the type of Canadian content that the government wants to serve up on your algorithm. Or, one of my
00:04:30.840 absolute favorite shows of all time is Money Heist on Netflix, a Spanish show that I only have access
00:04:37.580 to because Netflix decided there might have been a global audience for it, but maybe I log in and
00:04:44.080 I don't actually get Netflix to recommend Money Heist to me because instead there's some, oh I
00:04:49.400 don't know, Quebec drama that has to be served up as a measure to comply with Bill C-11. So all of
00:04:57.480 this is an example of why, and again, you can take issue with big tech company algorithms, and
00:05:03.380 I would encourage you to do it, but the only thing worse than tech companies manipulating these
00:05:07.780 algorithms is government manipulating these algorithms. Like, has Justin Trudeau ever done
00:05:14.460 anything to suggest that he should be the one to tell you what you see on YouTube or Netflix, what
00:05:20.660 you listen to on Spotify? You go on and you're like, oh wow, I hear Rihanna has a hot new album,
00:05:25.820 minutes... oh my goodness, this is just every Celine Dion track ever made. Now, nothing against Celine
00:05:30.480 Dion, I quite like Celine Dion, but I don't want Celine Dion, like, forced into my ears, like the
00:05:35.820 aural equivalent of the scene in A Clockwork Orange where the guy's eyes are being pried
00:05:42.140 open, except instead the ears are pried open, and it's just all Celine Dion all the time, because C-11 says
00:05:47.360 more Canadian content on the internet, not realizing that the internet is where we go to escape
00:05:53.180 the regulations that affect TV and radio in several ways.
00:05:58.740 So all of this is to say that C-11 is a big deal,
00:06:02.780 but it's so much more than that.
00:06:06.240 And this is what I was getting at in that tweet to Elon Musk,
00:06:09.440 which he replied to and said it was an attempt to muzzle
00:06:12.440 the voice of the people of Canada, which I wholeheartedly agree with
00:06:16.220 because the Liberals put Bill C-36 forward in the previous parliament.
00:06:21.640 And this is a bill that would reinstate Section 13 of the Canadian Human Rights Act, which allows for the prosecution, effectively, by the Canadian Human Rights Commission of so-called hate speech.
00:06:35.080 And it isn't hate speech that meets the threshold for criminal hate speech.
00:06:40.060 It's hate speech that instead meets a threshold that has been set out by the government that is lower than the criminal threshold.
00:06:48.660 That is, speech that maybe falls within the bounds of political debate, or speech that is contentious, controversial, maybe icky, but certainly not speech that is or should be illegal.
00:07:02.000 And one of the big problems, and I'm going to be talking about this a little bit later on in the program, is the conflation by those predisposed to online censorship, by people like Justin Trudeau and Steven Guilbeault,
00:07:13.600 of social acceptability of speech with legality of speech. And I think that
00:07:22.400 is a big challenge that comes up here. There is lots of speech that is objectionable, perhaps
00:07:28.260 offensive, controversial, but that doesn't mean it's illegal. And freedom of expression, freedom
00:07:33.320 of speech, I use those terms interchangeably. Now, if you talk to some Canadian legal scholars,
00:07:38.380 there are a lot more distinctions than you may agree with, and certainly than I agree with, but
00:07:43.360 nevertheless, they're all part of this same debate. And I think when we talk about the regulation of
00:07:48.800 the internet, we can't look at C-11 in isolation. But I do want to focus in a little bit more on it,
00:07:54.660 because a lot of people have certainly heard about it and they understand some of the issues with it,
00:07:59.660 but I don't know if people realize what the stakes of this are. And I think part of this is because
00:08:04.340 the government has couched C-11 in very benign-sounding terms, like, oh, it's just Canadian content.
00:08:10.360 Oh, it's just modernizing. It's doing all these things. Is there something wrong with that? We're not against modernization, are we?
00:08:16.500 Well, that's not exactly what's happening here.
00:08:20.020 Joining me on the line now from Open Media is the Campaigns Director, Matt Hatfield.
00:08:25.500 Matt, good to talk to you. Thanks for coming on today.
00:08:28.740 Hi there. Thanks for having me.
00:08:30.100 So let's start. Why is this an issue for your organization, first off, C11?
00:08:34.420 So I think C11 was drawn so broadly that it did pose a real expression risk to ordinary Canadians.
00:08:43.120 So there's been a lot of debate since it was first introduced as C-10 last year about our ordinary audiovisual posts,
00:08:50.200 you know, the videos and podcasts and such that we as users put on the Internet.
00:08:54.460 Are they being treated as broadcasting content? And could the CRTC be regulating our individual expression?
00:08:59.580 And in the version of C-11 passed through the House, I think that was still a very real consideration, a very real concern.
00:09:06.800 We're lucky that there's been an amendment during the Senate process that could fix that.
00:09:11.800 But we'll have to see if the House ends up accepting that.
00:09:15.180 Now, what was that amendment specifically?
00:09:18.200 Yeah, so this comes from Senators Simons and Miville-Dechêne.
00:09:21.740 And together, they essentially narrowed the scope of the bill.
00:09:25.720 So previously, under C10, it just very clearly included all user expression,
00:09:30.980 and people were very alarmed about that, understandably.
00:09:33.700 Now, in C11, they tried to exclude some user speech,
00:09:37.780 but they did it on such broad terms that they were saying, like,
00:09:40.320 well, if you're earning any revenue, if anyone is earning revenue alongside this content,
00:09:44.180 then we're going to treat it as broadcasting, and the CRTC can regulate it.
00:09:47.640 Well, as you know, most things on the internet make revenue, right?
00:09:50.500 ...monetization on YouTube and make, like, five bucks a month.
00:09:52.960 you're now a commercial publisher like, you know, Global News is or some giant YouTube streamer.
00:09:58.860 Exactly. And so the House didn't fix that. That was still a huge risk there. This amendment
00:10:03.900 restricts what's treated as broadcasting to professional sound recordings, and just does a
00:10:10.880 much tighter job of making sure most of our posts should no longer be subject to the bill
00:10:14.460 if this amendment is accepted. Now, I know open media has a petition calling on the government
00:10:20.000 to fix this. So I just want to kind of drill down here. Is this just your view that this is
00:10:25.720 inevitable and we have to try to make the best of it? Or do you actually think that at its core,
00:10:31.260 it's trying to achieve a good thing, but going about it the wrong way?
00:10:35.660 Well, I think a part of what it's trying to achieve isn't a bad thing. We don't think
00:10:40.460 treating the internet like broadcasting makes sense. Like, the
00:10:43.480 internet is not broadcasting. All of us are making a million choices a day that we never got to make
00:10:48.260 in the TV broadcasting world. And so some of the things the government has proposed here
00:10:54.900 don't respect our choices, don't respect our individual right to, you know, opt into things
00:10:59.380 and opt out of other things, and that's a huge concern. The idea of finding some financial
00:11:04.500 support for Canadian creators, we're not necessarily against that, but unfortunately there's the question
00:11:09.940 of, is it going to be fair? Is it going to be open to all Canadian creators? Or is it again going
00:11:14.660 to be a situation where some people are picked as favorites and others aren't. And the CanCon
00:11:18.600 system hasn't been updated since the 1980s. It won't be updated yet under this bill. We're still
00:11:23.240 going to see who's included and who isn't, and that's a huge question. Well, that's another thing
00:11:27.820 that I think you raised that's quite important here. I mean, the content, the types of content,
00:11:31.640 the genres of content that are on the internet are a lot more varied than music, where you have,
00:11:37.160 you know, sure, different genres of music, but, you know, within the parameters of what a country
00:11:41.100 station is playing, there's Canadian artists and non-Canadian artists. For the purposes of Canadian
00:11:46.380 content on the internet, we're talking about political content. And I mentioned, sort of jokingly,
00:11:50.560 earlier on that, you know, my show is Canadian content, but I don't think it's the type that, you
00:11:54.740 know, the Canadian government might want to promote. But there's a serious point in that, which is that
00:11:59.540 not all content is equal just because of its country of origin. Yeah, that's right. I mean, of
00:12:06.260 course, you and I are both Canadians, but probably neither of our content would be counted as CanCon
00:12:10.420 under the current system. It's got quite an arcane points-based system where there's all kinds of
00:12:14.280 different benchmarks you need to hit to be qualified. And some things that are very clearly
00:12:18.020 Canadian stories don't end up qualified, whereas some things that really ought not to, you know,
00:12:23.240 things like a documentary about the British Tudors, that will qualify as CanCon. So something
00:12:29.040 that I said in testimony, I don't think that C11 is necessarily intended to be a censorship bill,
00:12:34.180 but it's so broad it could very easily be used as one. That's why we've been calling to tie it down,
00:12:39.260 restrict it, make it very clear what it's actually about, because there's a real risk that a future
00:12:43.160 government in a year or two or three years could pick this up and say, well, now that we have these
00:12:47.720 powers, let's see what we can do with it. And we don't want to see that. One of the issues, too,
00:12:51.940 and I don't know if this has been resolved in the Senate process, but is that so much of how this
00:12:57.460 would actually unfold and how this would manifest really comes down to regulations which aren't in
00:13:01.820 the bill, which would be developed after the fact by the CRTC. That's right. And so one of the things
00:13:07.960 I'd like to encourage listeners is to stay engaged with this process. You know, people pay a lot of
00:13:11.900 attention to bills. They don't pay nearly as much attention to what the CRTC does in CRTC
00:13:16.860 proceedings, but those matter a lot. So we're going to be there throughout in interpreting any
00:13:21.640 part of this bill. We're going to be fighting to have the choices and content of ordinary people
00:13:26.660 respected in that process. And I really hope that lots of people who've been alarmed by C-11 will
00:13:31.320 also participate in that because it will determine the reality of a lot of this.
00:13:36.020 Now, has the government, to your knowledge, given any indication of whether it's willing
00:13:40.340 to accept that Senate amendment?
00:13:42.420 They said that they're studying it, so we don't know which way they'll go.
00:13:46.660 I should actually flag as well, there was a really negative Senate amendment as well.
00:13:50.520 They added an age verification requirement for internet platforms that I think was a
00:13:55.340 huge error to introduce at this late stage, and we're hoping the government will remove
00:13:58.460 that as well as accepting this new one.
00:14:00.820 Yeah, so since you bring this up, this is an interesting one because I understood that
00:14:05.060 what they were trying to do was try to help as far as like accessing online pornography was
00:14:10.120 concerned. Like that was the thrust behind it, wasn't it? Yeah, it's about, in their view,
00:14:14.380 it's about protecting kids. And of course, who wants to be against protecting kids? But the
00:14:18.540 Senate didn't hear from a single witness about this idea about what the consequences could be
00:14:22.980 about how it would work. And I mean, right, rightfully so, I think, some representatives
00:14:28.000 of the government said during the consideration of the amendment, this really doesn't belong here.
00:14:31.860 and we should consider this separately if we're going to do it. It needs a full consideration
00:14:36.160 because like the worst version of this is that you need a government issued ID and you need to
00:14:41.240 provide it to Reddit or to Twitter to log on and to post your content. And again, it might not be
00:14:45.960 intended for political content, but it could sure affect political content if they do it in a
00:14:49.880 detrimental way. Yeah. And then you throw privacy issues into the mix. You throw equity issues into
00:14:56.160 the mix. Not everyone has that, has the means to publish it or to upload that. One other dimension
00:15:03.360 of this that I think is important is that you are right that this can't be done in an omnibus way
00:15:07.860 where just everything that's remotely connected to the internet gets thrown into one thing. I mean,
00:15:12.640 my issue with another proposed set of reforms on online harms is that, you know, that you have
00:15:19.680 people that are wanting to view terrorist content the same way as hate speech, which I don't
00:15:25.060 actually think can be done in the same way because there are so many varied definitions of hate
00:15:30.200 speech. And I think there is this tendency, you're right, to just try to, well, they've opened this
00:15:34.680 box, just put everything they can into it. That's exactly right. So we haven't had harmful content
00:15:40.720 legislation yet. We had a paper that sort of outlined what they were thinking last year.
00:15:45.520 And it was a giant mess. It was amongst the worst ideas we've seen from any democratic government,
00:15:50.280 actually around these issues. They were very, very heavily critiqued. They have pulled back
00:15:55.080 from that initial proposal, but we don't yet have a sign of what their follow-up legislation will be
00:15:59.300 on that. And as you said, like terrorism content or content that's abusive of children is actually
00:16:05.960 quite a different matter than hate speech, which is very subjective, very contextual. There's no
00:16:11.080 way we can apply the same legal considerations to the two situations. And so we're hoping to see-
00:16:16.940 There were also some members of the panel that wanted to add unrealistic body image into the
00:16:21.260 category as well, like further complicating things. Sure. Misinformation comes up a lot,
00:16:25.560 right? And we all hate misinformation and we all disagree about who's putting it out. So
00:16:29.920 there's a lot of things that we might not like on the internet that can't be well treated by the
00:16:33.960 same law. So I don't want to kind of speak, put words in your mouth here. I mean, I've gotten the
00:16:39.240 sense that you've probably had, or your organization have probably had criticisms
00:16:42.300 against conservatives in the past,
00:16:44.280 but this has also kind of become
00:16:46.200 somewhat of a left-right issue, it seems like.
00:16:49.660 I don't think it should be.
00:16:51.320 I mean, we're a nonpartisan organization.
00:16:54.100 We're here for people's fundamental rights.
00:16:56.360 I do think you're right that in the last,
00:16:59.860 say, five to seven years, for various reasons,
00:17:02.460 people on the right side of the spectrum
00:17:03.680 have felt more motivated to speak up
00:17:06.020 for fundamental liberties.
00:17:07.480 But at the end of the day,
00:17:08.560 it all depends upon which end of the stick you're on, right?
00:17:12.300 And one of the points that we try to make when we're talking to the government, or to even the NDP, is that a lot of very marginalized groups can be very, very harmed by these very aggressive regulatory approaches, where it all sounds good on paper, but in reality, when you're an RCMP officer administering some of this, you might victimize, you know, some of your traditional targets, folks like indigenous groups or other marginalized communities.
00:17:38.440 Yeah, I think what you always have to do, and I would hope people do, but I find often they don't, is look at how my political opponents would use this.
00:17:46.660 And if you'd be uncomfortable with giving the power to them, you shouldn't give the power to your political allies.
00:17:52.540 Well, hopefully we can see some progress on this, although I think we're past the point where minor tweaks are going to be the thing that fixes this here.
00:18:01.800 Where can people sign your petition?
00:18:04.360 Over at openmedia.org, we've got a new petition calling on the government to support the user content defending amendments and strike down the age verification one.
00:18:14.020 It's going to get a third vote at the Senate soon, and then it will go back to the House.
00:18:17.420 So we are hopeful that both of those changes can be made.
00:18:20.260 And although it's still going to be a pretty imperfect bill, it'll at least be less of a mess than it started as.
00:18:24.620 And I think your point earlier about following it even beyond that as the regulations are developed is an important one.
00:18:30.980 That is openmedia.org campaign's director, Matt Hatfield.
00:18:34.760 Thanks very much for your time, Matt.
00:18:36.180 Really appreciate it.
00:18:37.060 Thanks so much.
00:18:38.360 Yeah, this shouldn't be a left-right issue.
00:18:40.880 And I think that's actually a good evergreen position to take on most things,
00:18:45.720 which is if you're talking about giving the government a power or an authority in some way,
00:18:50.320 you have to say, how would I feel if someone who hated my very existence had that power?
00:18:57.080 I mean, we've talked about the Emergencies Act a lot on this show.
00:18:59.420 You have to look at the longer-term implications of this. How would I feel if this were used against
00:19:04.000 someone like me? You have to look at censorship provisions through this sense. How would I like it
00:19:08.720 if someone had this power and could use it against me? And I think we're long past the point of being
00:19:15.900 able to say, just trust us, which is what the government says about a lot of its internet
00:19:21.500 regulations. And Steven Guilbeault has probably been the worst messenger imaginable on C-11,
00:19:28.720 because every time he speaks, he gives what is basically a different definition of what C11
00:19:35.620 actually does and who it affects and who it targets. On user-generated content, he's saying,
00:19:40.600 no, it's not going after what you post online. And then, okay, well, maybe it is. And now,
00:19:45.340 if you make any money off it, you're an online publisher, but it doesn't matter how
00:19:49.060 much. So all of a sudden you could have a multimillion-dollar online publishing operation
00:19:53.880 like, oh, I don't know, Netflix, being treated the same way as some guy who just tries to make
00:19:59.180 a hundred bucks a year just by running some ads on his silly YouTube videos. And what the
00:20:05.960 government is claiming is happening here is that they're just trying to promote Canadian content.
00:20:12.060 They're just trying to give the guy from Toronto a little bit of a leg up over the guy from
00:20:16.780 Bucharest. Not to pick on YouTube streamers from Bucharest. I don't know if there are any
00:20:21.180 particularly good YouTube streamers from Bucharest I need to be following, but, to be honest, I'll
00:20:26.120 never know about them if C-11 passes. I'll never find out who the hottest guy, not the hottest
00:20:31.380 guy, I mean, like, the trendiest guy in Bucharest is, but take from that what you will. And all of this
00:20:37.340 is going to be harmful to Canadians, because it takes away Canadians' ability to navigate this
00:20:44.700 stuff for themselves. And I go back to the stuff I started off the show talking about: when we look
00:20:51.080 at the broader package of reforms here, it's not just C-11. It's also, to some extent, and it may
00:20:56.660 not be the biggest deal, but C-18, which is the bill that forces Twitter and Facebook and Google
00:21:04.120 to pay the Toronto Star and the Globe and Mail and perhaps Global News.
00:21:10.380 Why? Because Global News, the Toronto Star and the Globe and Mail post their content voluntarily
00:21:16.500 on Facebook and upload it to YouTube and tweet about it.
00:21:20.180 Like, this is the most absurd thing ever.
00:21:23.120 So what the government is doing is trying to orchestrate
00:21:25.760 yet another bailout of legacy media.
00:21:28.800 But instead of actually using taxpayer money,
00:21:31.900 they're just looking to big tech companies
00:21:33.980 and extorting them, basically.
00:21:36.280 Extorting big tech companies to pay for a bailout
00:21:39.940 so that the liberals get to claim credit for saving journalism
00:21:43.600 without having to do the heavy lifting
00:21:45.560 because they don't have enough money to, I guess, indefinitely keep these things afloat.
00:21:50.560 But then it's the speech regulations.
00:21:53.100 And I've said numerous times that free speech is the hill to die on
00:21:56.660 because if you don't have free speech,
00:21:58.060 you can't actually debate anything of consequence in society
00:22:00.980 and deal with any of the other big challenges facing society.
00:22:04.640 If you look at the COVID era,
00:22:06.600 one of the biggest challenges to being able to push back
00:22:09.180 against the government's COVID restrictions
00:22:11.280 were actually attacks on free speech.
00:22:14.120 You couldn't debate vaccines, you couldn't debate vaccine mandates, without being accused of spreading
00:22:20.740 medical misinformation. And just imagine if all of these YouTube bans and Twitter bans and Facebook
00:22:27.240 bans that were in place on so-called misinformation were actually state bans, were actually coming from
00:22:35.220 government. Because that's what the government wants to do. Now, as Matt said, they have not
00:22:40.120 proposed this in legislation, but they've talked about wanting to put forward takedown notices
00:22:46.140 so that if a YouTube or a Twitter or a Facebook or any of these companies has content that violates
00:22:54.440 the government's determination of what harmful content is, they have to zap it within 24 hours
00:23:01.800 or be fined up to like some ridiculous amount, like $25 million. That's what they're going to
00:23:08.700 have to deal with. So what's happened here is the government has tried to and may well succeed
00:23:15.420 in what I said in my tweet to Elon Musk is deputizing social media companies, deputizing
00:23:21.860 tech companies to do the government's dirty work. And this is not something that we should just
00:23:27.460 brush off and think, oh, well, it doesn't affect me. No, it affects everyone. It affects everyone.
00:23:33.200 And I'm so sick and tired whenever you talk about free speech and the importance of free speech and
00:23:38.040 the issues with governments arbitrarily redefining what hate speech is. You get people that say
00:23:43.000 the same tired criticism, which is, oh, well, I don't need to worry about being censored by hate
00:23:49.060 speech laws. What are you saying? And people that start conflating, as I said earlier, social
00:23:53.900 acceptability with legality. People that have no issue saying, because I don't like this form of
00:24:00.040 speech, I don't really care if it gets censored. And I'm so long past caring about that. You know,
00:24:08.120 it's very difficult to defend free speech in the abstract because you're going to be confronted
00:24:12.640 with individual examples and people are going to say, well, do you support that? Do you support
00:24:17.060 that? And I made a decision long ago that I wouldn't care about the substance of this speech
00:24:24.820 to care about whether it should be free or not. One example of this, and I may have told this
00:24:30.580 story before, years ago on my old radio show, I was talking about the Holocaust. Well, actually,
00:24:35.840 no, I was talking about campus free speech. And I said that universities are places where
00:24:40.580 academic debate, even if contentious ideas should be allowed. And I said, universities should
00:24:45.080 welcome that. They should relish that. And I gave just an obviously extreme example of Holocaust
00:24:51.360 denial. And I said, Holocaust denial is something that you should be legally allowed to debate.
00:24:56.920 I know the Holocaust happened. I know 6 million Jews were killed. I know it is one of the most
00:25:01.200 heinous displays, the most heinous display of anti-Semitism that we've ever seen in the world.
00:25:06.920 The answer to Holocaust denial is not censorship. The answer to Holocaust denial is exposing the
00:25:13.840 truth, exposing the truth, confronting Holocaust deniers with the truth. But I said, I do not
00:25:21.140 support censorship. So I believe that legally people should be allowed to debate the Holocaust.
00:25:26.980 And there was this hackish left-wing outlet called Press Progress that ran a story,
00:25:32.360 Andrew Lawton thinks the Holocaust is debatable, which is a gross misrepresentation because of
00:25:38.880 how people read that word debatable of what I was actually saying. But this is the era we live in.
00:25:45.940 This is the culture we live in.
00:25:47.160 You cannot defend someone's right to say something
00:25:51.100 without censors accusing you of defending
00:25:54.540 or agreeing with the thing itself.
00:25:58.600 And this is actually something that is so despicable.
00:26:04.160 So despicable.
00:26:05.680 And it's an inevitable byproduct, in my view,
00:26:08.220 of a culture in which we feel the need
00:26:10.680 to cancel those we disagree with
00:26:12.900 instead of just disagreeing with people we disagree with.
00:26:16.720 Like, why do we all need to agree?
00:26:18.060 Why does it need to be my position or censorship?
00:26:21.980 Why can we not just accept that there are some people
00:26:24.400 that say things that I want nothing to do with,
00:26:26.660 but they get to occupy their little corner of the world
00:26:28.920 and I get to live in the mainstream of civil discourse?
00:26:33.640 And a lot of these issues would be so much easier to manage
00:26:37.560 if people just had a fundamental human appreciation
00:26:41.340 for free speech, a fundamental human respect for free speech. Forget about law. It's not that the
00:26:47.380 law doesn't matter. It's not that constitutional rights don't matter. It's that that human desire
00:26:52.540 to be free and to have freedom and to respect freedom in others is more fundamental than
00:26:58.860 whatever the law says or whatever Justin Trudeau says. And I don't have an answer to this because
00:27:05.540 right now we have a government that clearly wants censorship that clearly supports online censorship
00:27:11.540 that clearly supports the ability for their government and for future governments to draw
00:27:17.380 this line and say that content on that side of the line is not allowed and must be removed from
00:27:24.100 the internet and content on this side of the line is fine and you can't look at what the government
00:27:30.580 has proposed without understanding that when the liberals and others that are predisposed
00:27:37.300 to online censorship use terms like hate speech, what they're actually talking about is not
00:27:41.940 what most Canadians think of when they hear hate speech, which is, you know, really vile,
00:27:47.520 violent rhetoric.
00:27:48.440 What they mean is content that they hate, speech that they hate.
00:27:54.160 Just remember, I know the clip has gotten a lot of play, but Justin Trudeau has oftentimes,
00:27:59.080 well, let me take a step back here. He denied ever calling the unvaccinated names. But everyone saw that same clip, which we played on the show, from the election campaign, where in French he said that the people protesting him were unvaccinated, they were misogynist, they were racist, they were extremists.
00:28:16.840 If Justin Trudeau views being unvaccinated as being extremist and racist and misogynistic,
00:28:25.980 if he believes that is extremist, what on earth would he support censoring
00:28:32.180 when it comes time to determine what hate speech is in the context of the Canadian Human Rights Act?
00:28:39.500 And the government can write whatever guidelines it wants to in law.
00:28:43.680 And they've tried to do this.
00:28:44.720 They've said, well, it's speech that is likely to foment detestation and vilification. And to try to neutralize the free speech concerns that I and others raise,
00:28:55.340 they say it's not just about offensive speech,
00:28:57.840 it's not just about humiliating speech.
00:28:59.960 And there are a few other examples there.
00:29:01.960 It's just this really extreme stuff.
00:29:04.800 But somehow they've still had to come up with a new definition that has a lower threshold than the hate speech definition that exists in the criminal code.
00:29:13.300 And why would they do that, if not because they plan to shrink the bounds of what is legally
00:29:20.440 possible to debate, what is legally possible to say? And I don't know if you can look at this at all
00:29:26.320 and come away with the conclusion that there is not a war underway by the government
00:29:31.060 against online free speech. And yes, oppose C-11. Talk about C-11. Talk about how you reject this
00:29:39.780 idea that we need to mandate Canadian content. Absolutely have that discussion. But you have to
00:29:44.420 look at the whole suite of things that the government is trying to do right now about
00:29:48.800 the internet. And they're trying to dramatically expand regulation of the internet beyond, I mean,
00:29:54.460 Michael Geist, who's one of the foremost thinkers and scholars on this issue,
00:29:59.380 has said this is the most anti-internet government in the world. And I'd be hard-pressed to come up
00:30:05.240 with an example, certainly among liberal Western democracies. We've got to end things there. We will
00:30:10.620 be back next week with more of Canada's Most Irreverent Talk Show, but back on Friday with
00:30:16.140 Fake News Friday alongside Sue-Ann Levy, who I believe is on deck this week, so you won't want to miss
00:30:20.340 that. We will talk to you soon, folks. Thank you, God bless, and good day to you all listening to
00:30:26.860 the Andrew Lawton Show. Support the program by donating to True North at www.tnc.news.