Rebel News Podcast - February 28, 2024


EZRA LEVANT | Trudeau tables a draconian censorship bill that criminalizes dissenting views


Episode Stats

Length: 1 hour and 32 minutes
Words per Minute: 160.8
Word Count: 14,797
Sentence Count: 1,223
Misogynist Sentences: 7
Hate Speech Sentences: 20


Summary

Justin Trudeau introduces the worst censorship bill in the free world. It's the largest and worst censorship law in Canada, and maybe anywhere in a democracy. It actually has life sentences for hate speech. I've never seen anything so bad.


Transcript

00:00:00.320 Hello, my friends. A special show today. I did almost a three-hour live stream this afternoon
00:00:05.320 going through the new, quote, Online Harms Act that Justin Trudeau has introduced. It's the
00:00:11.240 largest and worst censorship law ever in Canada, maybe anywhere in a democracy. I've never seen
00:00:16.480 anything so bad. It actually has life sentences for hate speech. I'm not even making that up. I
00:00:22.600 wish I were. I'll take you through it. But first, I'd like to invite you to become a subscriber to
00:00:26.760 Rebel News Plus. That's the video version of this podcast. I'm going to take you through section
00:00:31.440 by section. I want you to see with your own eyes. I'm not making this up. Go to rebelnewsplus.com.
00:00:36.420 Click subscribe. It's eight bucks a month. And, you know, that's how we stay strong because we
00:00:41.840 don't get any money from Trudeau and it shows. All right. Here's today's podcast.
00:00:56.760 Tonight, Justin Trudeau announces a new censorship bill, the worst ever seen in the free world. I
00:01:09.980 kid you not, it will end our freedom of speech and much of our democracy. It's February 27th,
00:01:15.360 and this is The Ezra Levant Show.
00:01:20.040 Shame on you, you censorious bug.
00:01:32.040 Hey, maybe you think it's an exaggeration what I'm about to say. I don't think it is. I've just
00:01:37.000 read Bill C-63. That's Justin Trudeau's new proposed online harms act. It's basically the largest and
00:01:44.360 deepest censorship law ever introduced, not just in Canada, but I think anywhere in the free world.
00:01:50.160 It would destroy freedom of speech, freedom of the press, freedom of thought, and it'll destroy our
00:01:54.880 rule of law as well. It makes it a life imprisonment crime to engage in hate speech. Life in jail.
00:02:03.200 It allows judges to put you in house arrest with an ankle bracelet, tell you who you can or can't talk
00:02:11.020 to just on suspicion that you might commit a hate offense. One of the things that terrifies me is
00:02:17.480 there's a provision in there that people can complain to the Human Rights Commission
00:02:20.560 in secret. Keep their identity secret even from you. Keep the provision of their evidence a secret
00:02:27.500 from you, and they can win $20,000 from you, and you have to pay 50 grand to the government too. This
00:02:33.000 will be the starter pistol for a thousand hucksters and hustlers to file frivolous complaints against any
00:02:39.140 conservative to enrich themselves and punish Trudeau's enemies. I don't think I've ever seen anything like
00:02:44.820 this in Canada. I'd like to think it will be stopped, but of course Jagmeet Singh has signed a
00:02:50.280 deal with Justin Trudeau to support whatever he does. That's my preamble. I'd like to share with
00:02:56.320 you the highlights now of the three-hour video, and if you have the stomach for it, if you have the
00:03:01.240 patience for it, I would encourage you to watch the whole two-and-a-half-hour video, which is about
00:03:06.200 almost three hours, where I go through it line by line. And if that's not the most terrifying law
00:03:12.700 you've ever seen, well then let me know what is. I think that this law destroys civil liberties more
00:03:17.580 even than the Emergencies Act did in Canada. By the way, the Emergencies Act didn't come with
00:03:22.540 life-in-jail prison sentences. I think it's a destruction of civil liberties worse than, well,
00:03:29.540 than anything I think that has ever happened in Canada. And I hope that it won't come to pass.
00:03:35.440 I fear that it will, and I promise that if it does, Rebel News will fight it to the death,
00:03:41.040 because that is how this will end. Either we kill the bill, or the bill kills us. By the way,
00:03:46.840 we've got a petition at stopthecensorship.ca. Feel free to sign that as well. Without further ado,
00:03:53.580 here are the excerpts of my extended live stream earlier today.
00:03:57.100 We're going to go through this. When I say it's 100 pages long it is, but remember half of it's in
00:04:06.240 French, so that reduces it to 50 pages. Some of it's technical language we don't need to read,
00:04:12.400 and some of it I've already covered. So I think we're really just going to look at 30 pages together.
00:04:17.100 Can you bear with me? Now I think our team is going to cut this up into little videos so people
00:04:21.740 don't have to watch the whole massive thing. But what I'm trying to do here today is give you a
00:04:27.600 deeper look at this law than you're going to get on the CBC or any pro-Trudeau media.
00:04:36.600 I want to show you what's actually scary here, and so you can see it with your own eyes. So let's
00:04:41.180 get right into it. Bill C-63, an act to enact the Online Harms Act, to amend the Criminal Code,
00:04:46.920 the Canadian Human Rights Act, and an act respecting mandatory reporting of internet child pornography.
00:04:51.960 Okay, again, you can see what they're really doing here. They're bundling their hate speech
00:05:00.960 provisions with a child pornography provision just to distract you. So whenever you criticize,
00:05:06.660 they'll pivot you. Are you against banning child pornography? No, actually, we believe in banning
00:05:11.880 child pornography. It's already in the criminal code. If you want to give police more resources,
00:05:15.900 that's a good idea. And if you want to have special policies for Facebook, I think that's a good
00:05:20.340 idea too. But that is not the center of this law. Let's get cracking. Summary, part one of this
00:05:26.620 enactment enacts the Online Harms Act. The act, among other things, A, establishes the Digital Safety
00:05:34.080 Commission of Canada, whose mandate is to administer and enforce that act to ensure that operators of social
00:05:40.100 media services, in respect of which the act applies, are transparent and accountable.
00:05:43.840 B, creates the position of a digital safety ombudsperson, whose mandate is to provide support to users of
00:05:52.100 social media, and to advocate for the public interest in relation to online safety. That's a clever way of
00:06:01.140 saying a censor. C, establishes the Digital Safety Office. So they got three new offices, three new
00:06:08.960 bureaucracies, millions of dollars, a dozen commissioners, a hundred staff, tens of millions
00:06:17.480 in budgets. That's what it takes to silence a country, and Trudeau's willing to do it.
00:06:23.500 D, imposes on the operators of social media services, in respect of which that act applies,
00:06:29.380 a duty to act responsibly in respect of the services that they operate, including by implementing
00:06:35.120 measures that are adequate to mitigate the risk that users will be exposed to harmful content,
00:06:39.500 two, a duty to protect children, three, a duty to make content that sexually victimizes a child
00:06:44.680 or re-victimizes a survivor inaccessible, four, a duty to keep records.
00:06:51.820 E, authorizes the Digital Safety Commission to accredit certain persons that conduct research or engage in
00:07:00.300 education, advocacy, or awareness activities that are related to that act for the purposes of enabling
00:07:06.140 those persons to have access to inventories of electronic data and to electronic data of the
00:07:10.960 operators of social media services, in respect of which the act applies. I don't know exactly what that
00:07:15.440 means. We'll get into it later. But basically, they want to snoop around inside Facebook, YouTube,
00:07:20.680 Google, Twitter. I wonder what they're looking for. Provides that persons in Canada may make a complaint
00:07:28.000 to the Digital Safety Commission. Authorizes the governor in council to make regulations.
00:07:34.280 All right. Let's skip down a little bit further. Part two, amends the criminal code. This is the
00:07:40.120 scary stuff. A, create a hate crime offense of committing an offense under that act or any other
00:07:47.260 act of parliament that is motivated by hatred based on certain factors. Create a recognizance. That means
00:07:52.580 like a restraining order. To keep the peace relating to hate propaganda and hate crimes offenses. That's what I
00:07:59.060 alluded to earlier. A pre-crime. C, define hatred for the purposes of a new offense. I can tell you what
00:08:05.260 hatred means. It means hard feelings. It means to despise something. Everyone knows what hatred means. It's a
00:08:12.000 feeling. But that's what they're doing. There used to be hate crimes. Now they're just throwing away the crime
00:08:17.900 part. Now hate in itself is a crime. Hate is not a crime. You have it in your heart. And if you are the
00:08:25.380 only person in the world who has banished every trace of hatred from your heart, then you are a saint.
00:08:30.540 But even saints would say that they have hatred in their heart that they wrestle with. This is Trudeau
00:08:35.420 criminalizing feelings. Let's keep on going. Define hatred. Increase the maximum sentences for hate
00:08:43.680 propaganda to life in prison. You don't get life in prison for terrorism in Canada. You get 10.5
00:08:50.820 million dollars and a public apology from Justin Trudeau. Just ask Omar Khadr. Part three amends
00:08:56.680 the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate
00:09:01.220 or cause to be communicated hate speech by means of the internet. Now stop right there. I know what a
00:09:07.020 discriminatory practice is. And so do you. If you go into a restaurant and are told you're Jewish, you can't
00:09:14.160 come in. Or you're black, you have to sit in the back. Or whatever crazy thing you can dream up. I don't
00:09:20.180 even think that happens one in a million to Canadians. I mean, theoretically. I mean, I'm not saying it didn't
00:09:26.780 happen in the past. I live in Toronto, which is a majority minority city. Any landlord that refused to rent a
00:09:34.400 home to a visible minority, I think you'd be out of business pretty quickly in this town.
00:09:39.180 That's just an example of a kind of discrimination that may have been more prominent 60 years ago.
00:09:43.900 It's just not there now. That's discrimination. Discrimination is if you're not hired because of
00:09:53.460 your race or if you're fired because of your race. I say again that it's hard to imagine that happening in
00:09:59.280 our completely integrated, ethnically diverse economy. I'm not saying it never happens, but it's
00:10:05.400 sort of rare. But if it does happen, that's discrimination. Discrimination literally means
00:10:11.680 to choose between things. He's got a discriminating palate. It means he knows what tastes, you know,
00:10:18.120 he knows the difference between fine wine and crummy wine. He's got very discriminating taste. It's
00:10:24.420 actually a compliment when you phrase it that way. When you use that to describe something in real
00:10:31.340 life, discrimination in the workplace and in the marketplace, you're doing the opposite. You're
00:10:35.640 saying he's discriminating based on invalid reasons like race or religion. We do allow some
00:10:41.520 discrimination, of course. For example, in women's changing rooms, only women can go there. There's
00:10:48.240 certain youth prices. Kids pay less at a movie theater than adults do. So that's discrimination that we
00:10:53.760 allow. Although, of course, they're now calling sex discrimination in bathrooms. They're calling
00:10:57.900 that a hate crime. But my point is, they're now saying that merely expressing a point of view on
00:11:04.300 the internet is discrimination. How is that discrimination? You're not banning someone from
00:11:10.600 riding a bus. You're not banning someone from renting an apartment. You're not firing someone from work.
00:11:16.640 All of those things are discriminatory actions against someone based on a prohibited ground.
00:11:20.540 How is an internet comment discrimination?
00:11:26.600 Well, when Justin Trudeau says it is, it is. Because that's the whole purpose of this law.
00:11:34.900 Part four amends the reporting of internet child pornography. I'm fine with that. That is not the
00:11:39.400 purpose of this law. All right. So let's skip down to the table of provisions. Another way of saying a
00:11:44.620 table of contents. We're going to skip through this quickly. Online harms act, interpretation. So you can
00:11:51.080 see it's got a table of contents. We're just going to skip through this because it's just a table of
00:11:58.200 contents. So we're going to scroll all the way down
00:12:01.480 until we get to the beginning of the bill.
00:12:13.560 And you can see, yeah, there it is. And that's the title of the act. I won't read that again.
00:12:20.180 The online harms act is enacted as follows. Definitions. Let's spend a little bit of time on
00:12:25.260 definitions. The following definitions apply in this act. Child means a person who was under 18.
00:12:32.460 So that's relevant when they talk about bullying. And we'll get to that a little bit later.
00:12:37.120 Commission is a digital safety commission. Content that foments hatred. This is very interesting to me.
00:12:43.000 Means content that expresses detestation or vilification of an individual or group of
00:12:48.720 individuals on the basis of a prohibited ground of discrimination within the meaning of the Canadian
00:12:53.080 Human Rights Act. And that given the context in which it is communicated, is likely to foment
00:12:57.820 detestation or vilification of an individual or group of individuals on the basis of such a
00:13:02.380 prohibited ground. Let me translate. So they're saying hatred means if you detest something. That's
00:13:09.060 just sort of a thesaurus. Hatred and I hate something, I detest something. Sort of the same
00:13:16.100 thing. Vilified is to say is a little more than hate. It's to say you're awful. I don't just hate
00:13:22.680 you. I think you're evil. That's vilification and detestation a little bit different. But
00:13:26.480 content that foments hatred, that's such an important concept because like when I was charged
00:13:35.160 with hate speech for publishing the Danish cartoons of Mohammed, I didn't actually do anything.
00:13:39.980 I didn't express hatred towards anyone. They didn't say I did. What they said I did is by
00:13:47.100 publishing something, quote, likely to expose a person to hatred or contempt. So they said that
00:13:53.220 by publishing those Danish cartoons of Mohammed, and why don't we throw them up on the screen while
00:13:57.160 I'm talking. By publishing those Danish cartoons of Mohammed in a news article, I was likely to cause
00:14:06.760 person A to hate person B. Didn't happen. They had no evidence of it happening. No one said it
00:14:13.700 happened. No one was harmed. It didn't happen. It might possibly in the future happen, likely to
00:14:20.120 expose a person to hatred or contempt. But it's not an actual harm. Not even a fake harm of you hurt my
00:14:32.400 feelings. I was prosecuted for 900 days. You can go to Wikipedia and you'll find it there. I think
00:14:39.080 they still have it. They may have taken it down now. Just go to Danish cartoons wiki and you should
00:14:44.360 be able to find it. Wouldn't surprise me if it was taken down. 15 years ago, you were allowed to see
00:14:50.800 that online. Yeah, there it is. Just click that. Yeah, they won't let you see a large copy of it.
00:14:56.320 Just show that on the screen for a second. Yeah, thank you. So this was the Danish cartoons of
00:15:04.420 Mohammed. You can see some of them are very mild. Two of them are a little bit edgy. One looks like a
00:15:10.040 turban is a bomb. That's in the top. And one is two women in a burka and a guy with a sword. But the other
00:15:16.820 ones are pretty friendly. That top left one there, it's the cartoonist drawing Mohammed. The middle left
00:15:26.240 on the top one looks like a stylized Olympic logo or something. The one on the right,
00:15:31.100 Mohammed looks sort of friendly. That one on the left middle there, it says, stop, stop,
00:15:38.660 we've run out of virgins, a commentary on suicide bombers. You can see one that looks like a guy in
00:15:43.360 the desert with a donkey, you know, sort of like a illustrated Bible. The one on the bottom left there,
00:15:49.880 that boy is named Mohammed. And he's basically saying something like, I'm Mohammed. I'm a kid
00:15:54.720 here in Denmark or something. So that's it. These cartoons, they caused huge boycotts,
00:16:03.000 riots around the world, hundreds of people killed in riots about these cartoons, people who had not
00:16:07.660 even seen the cartoons. We did a news story about it. Simply doing a news story about that news event
00:16:13.160 was deemed likely to expose a person to hatred or contempt. That is now criminalized.
00:16:18.460 Content that foments hatred or might foment. It's likely to foment. Show the law again.
00:16:28.260 Is likely to foment just detestation. Do you see that? Likely to. Highlight that on the screen.
00:16:36.080 Likely to. So it didn't happen yet. Might not happen. If it happens, no one was stabbed,
00:16:44.660 no one was punched. They just had hard feelings. That's not a crime. That's what content that
00:16:52.900 foments hatred means. You didn't have to hurt anyone. You didn't have to damage anyone.
00:16:57.720 You didn't. No one had to show that you, that they lost a dollar or lost a friend or had anything bad
00:17:03.440 happen to them. It's just that they could have. It was likely to expose me to what? To hard feelings.
00:17:09.940 That's not a crime. Just keep trucking. Content that incites violence. Well, that's already
00:17:15.200 against the criminal code. This is a distraction. Inciting violence is against the criminal code.
00:17:21.920 It has been ever since we've had a criminal code. Let's keep going. Content that incites violence,
00:17:27.900 extremism, or terrorism. Keep scrolling down. That's against the law right now. That's not new.
00:17:33.960 That's just in there to distract you. Content that induces a child to harm themselves. Well,
00:17:39.060 this is interesting because remember, we just read the definition of a child, anyone under 18.
00:17:44.100 So what does this mean? Content that advocates self-harm. Does that include cutting off,
00:17:50.580 God forbid, your breasts as a girl or your penis if you're a boy? That's what they're talking about
00:17:55.260 with the transgender surgery. Childless defined as anyone under 18. And this says content that
00:18:02.760 induces a child to harm themselves mean content that advocates self-harm, disordered eating,
00:18:07.040 or dying by suicide, or that counsels a person to commit or engage in any of these acts.
00:18:12.400 I would put it to you that it's self-harm
00:18:14.240 to chop up children in the transgender, gender-affirming surgery.
00:18:20.200 Is that a crime? Yeah, I don't think so. I think,
00:18:22.860 I think Trudeau will use this as Lavrentiy Beria said.
00:18:27.120 This will be applied against whoever the government wants to.
00:18:29.360 Content that sexually victimizes a child or re-victimizes a survivor. Obviously,
00:18:36.980 everyone agrees with this. Much of this is already in the law. And if we need to update the law,
00:18:43.040 so be it. But that's not the purpose of this law. That's not the new offices. That's not the new
00:18:49.020 hunter-killer commissioners. That's not the $20,000 bounty for having your feelings hurt.
00:18:56.000 This is the window dressing that they want you to focus on.
00:19:00.080 Content used to bully a child. Again, a child could be 17 and a half years old.
00:19:07.740 Means content or an aggregate of content that, given the context in which it's communicated,
00:19:12.620 could cause serious harm to a child's physical or mental health.
00:19:17.420 What does that mean? What does the word bully mean? I mean, I suppose we know it when we see it.
00:19:22.800 To criminalize bullying for someone who's 17 years and 10 months old. Basically, an adult.
00:19:34.880 You can criminalize communicated content if it could cause serious harm to a person's mental health.
00:19:42.520 This is threatening to criminalize social interactions that might include just harsh banter,
00:19:54.600 hazing in a friendly way, theoretically, even harsh criticism from a family member or from a teacher or anything.
00:20:02.820 The word bully is so vague. In fact, the word bully itself is not actually defined in this law.
00:20:17.020 And I think that's why.
00:20:19.260 Okay, let's keep going here.
00:20:21.340 Harmful content means intimate content communicated without consent.
00:20:25.960 Again, that's revenge porn. We agree that that should be banned.
00:20:28.360 I think it already is, at least in some jurisdictions.
00:20:33.340 We've already gone through this list already.
00:20:35.180 Indigenous people, intimate content communicated without consent.
00:20:38.700 We've already gone through that.
00:20:40.440 The minister, obviously the minister of Canadian heritage here.
00:20:47.020 Hey, I'm going to stop for one second.
00:20:48.900 I'm going to throw to a video.
00:20:50.020 Do you have those Steven Guilbeault videos handy?
00:20:51.660 Like I say, this bill was just released by Arif Virani, the Minister of Justice and Online Harms.
00:21:00.840 But this bill was drafted when Steven Guilbeault was the heritage minister.
00:21:09.900 And he sort of did the rounds talking about it.
00:21:12.360 He talked about things like how can you stop certain websites.
00:21:19.240 He answered a few questions.
00:21:21.340 And I'm just waiting.
00:21:22.340 We're calling up the video here.
00:21:23.840 I've shown these to you before if you've watched my show.
00:21:28.360 What's the real purpose here?
00:21:30.920 Well, do you have the criticized politicians clip?
00:21:33.640 And then the nuclear one is the second clip.
00:21:39.940 So the first one is criticized politicians.
00:21:44.500 We're just calling it up, folks.
00:21:45.740 I want to prove, before I go further, I want to show you one of the worst people in the Canadian cabinet, Steven Guilbeault.
00:21:53.760 When he was drafting this bill and he was talking to a friendly government subsidized journalist at a liberal think tank, I think it was called Canada 2020.
00:22:03.640 He was asked, yeah, that's the clip there.
00:22:06.600 He was asked, what's the purpose of this?
00:22:08.380 Why are you doing this?
00:22:09.760 Here, go ahead and play the clip.
00:22:12.320 We've seen too many examples of public officials retreating from public service due to the hateful online content targeted towards themselves or even their families.
00:22:24.860 That's his rationale.
00:22:27.880 People are mean to politicians.
00:22:29.480 So the hate and the fomenting hate, you heard him.
00:22:33.640 It's because we've had too many people hating politicians, including him.
00:22:40.020 But what happens if a website just won't stop?
00:22:43.560 Like, what do you do?
00:22:45.220 Well, he has an answer.
00:22:47.740 Go full North Korea.
00:22:49.640 He calls that the nuclear option here.
00:22:51.360 Do you have that clip handy?
00:22:52.740 Take a look.
00:22:54.540 Envision having blocking orders.
00:22:56.540 I mean, that's – maybe.
00:23:01.100 It's not – you know, it would be – it would likely be a last-resort nuclear bomb in a toolbox of mechanisms for a regulator.
00:23:16.260 There you have it.
00:23:20.200 Now, I understand we have a freedom-oriented lawyer from the Justice Centre for Constitutional Freedoms who will be joining our call.
00:23:26.980 So, Olivia, as soon as Marty is ready, let me know.
00:23:29.900 But I'm going to keep going until then because we're still in the definitions part here, and it's already 2-11 Eastern.
00:23:35.940 Office means the Digital Safety Office of Canada.
00:23:40.180 Ombudsperson means the Digital Safety Ombudsperson.
00:23:44.040 Operator means a person that operates a regulated service.
00:23:48.560 I'm going to keep going.
00:23:49.780 Social media service means a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication amongst users.
00:24:00.140 Let's just keep going.
00:24:01.080 I'm going to skip some of these definitions because they're – oh, I'm going to focus on the definition that – the next one there.
00:24:11.460 For greater certainty, content that foments hatred.
00:24:15.480 Because this is what Steven Guilbeault was talking about there.
00:24:20.500 Causing another person to have hard feelings.
00:24:22.540 That is not a crime.
00:24:24.460 You're allowed to have hard feelings.
00:24:26.620 In fact, it's in tyrannical regimes where you're not allowed to have hard feelings.
00:24:31.660 Another thing that Stalin did is he would make note of whoever stopped clapping first.
00:24:39.520 When he would give a speech, there would almost be this contest to see how long you could clap because no one wanted to be the first to stop clapping.
00:24:47.040 What's my point of giving that anecdote to you?
00:24:50.360 Is in a free society, you're allowed to hate your politicians.
00:24:53.800 You're allowed to have hard feelings.
00:24:55.600 You're allowed to detest whoever you like.
00:24:59.700 And hopefully you will find a positive outlet, campaign, run for office, write a letter to the editor, get involved and actually fix a problem in life.
00:25:08.060 It's only in authoritarian regimes where you're not allowed to feel those feelings.
00:25:12.280 But of course, you cannot stop someone from feeling some way.
00:25:15.080 You can just force them to fake it, which is what they did in the Soviet Union.
00:25:20.800 So here's the definition for greater certainty.
00:25:23.800 And for the purposes of the definition, content that foments hatred, content does not express detestation or vilification solely because it expresses disdain or dislike or discredits, humiliates, hurts or offends.
00:25:35.400 Well, what does that mean?
00:25:37.420 Because they already defined it as it causes detestation or vilification.
00:25:42.000 They're saying just because you dislike something or hurt or humiliate or offend.
00:25:47.540 You know what that is?
00:25:48.400 That's a contradiction.
00:25:50.880 Which is how they're going to excuse their liberal friends and how they're going to accuse their conservative critics.
00:25:58.360 So I showed you the front page of the Toronto Star.
00:26:00.880 Clearly fomenting hatred.
00:26:02.640 I mean, I think that was pretty obviously its purpose.
00:26:05.520 That was certainly its effect.
00:26:06.580 And by the way, you don't have to have a mens rea.
00:26:10.300 You don't have to have a guilty mind to be convicted of any of this.
00:26:13.140 It's just could it, could it cause someone to have hard feelings?
00:26:17.360 You don't even have to have intent in your heart.
00:26:19.880 So if you are conservative and you foment hatred, you're guilty.
00:26:25.880 If you are liberal and you foment hatred, well, not solely just because you express disdain, dislike, discredit, or humiliate.
00:26:32.640 That contradiction is so essential in the law.
00:26:37.120 That's how they're going to go after conservative critics, but spare the haters on the street.
00:26:43.420 Every day in this country, we see pro-Hamas protests calling for genocide, condemning the Jews, actually engaging in real crimes, not just hate crimes.
00:26:53.480 They will never be prosecuted on this law.
00:26:55.860 You think Arif Virani, who is an Ismaili Muslim himself, so he's fairly liberal.
00:27:00.400 I'm not going to pretend he's an Islamist.
00:27:02.240 He's not.
00:27:03.260 But he's clearly in the pro-Hamas liberal caucus, along with Mélanie Joly, Omar Alghabra, Ahmed Hussen, Justin Trudeau.
00:27:11.260 Do you think Arif Virani is going to prosecute some of these pro-Hamas extremists on the street?
00:27:17.900 Well, he hasn't so far, and he's had some legal tools to do so.
00:27:22.180 So he absolutely will not.
00:27:24.520 Let's keep going.
00:27:26.360 They have some more definitions regarding sexual matters.
00:27:30.720 Again, that is not the main purpose of this law.
00:27:32.520 Well, let's skip ahead down to 6.1, which is exclusion of private messaging feature.
00:27:41.680 The duties imposed under this act do not apply in respect of any private messaging feature of the regulated service.
00:27:48.940 So it doesn't apply to your direct messages.
00:27:51.640 So I guess there's that.
00:27:52.540 So I'm going to scroll down a bit, because we've already covered some of this stuff when we looked at the government flyer.
00:28:05.560 So I'm going to skip ahead to page 9, where it talks about the Digital Safety Commission of Canada, establishment and mandate.
00:28:13.080 The mandate, the commission's mandate, is to promote online safety and contribute to the reduction of harms caused to persons in Canada as a result of harmful online content.
00:28:25.940 And in particular, they talk about child pornography.
00:28:33.240 Obviously, there's no one in the world who could disagree with that.
00:28:38.820 But like I say, that's not the main purpose of this bill.
00:28:41.160 So let's skip ahead to Section 12.
00:28:44.240 The commission consists of three to five full-time members to be appointed by the governor and council.
00:28:49.560 That's a fancy way of saying cabinet.
00:28:52.500 The governor and council may remove a member for cause.
00:28:55.920 Five-year terms, Section 13, five-year terms.
00:28:59.340 Section 14, they'll be paid a certain amount.
00:29:02.320 They'll get benefits.
00:29:03.940 And look at Section 16.
00:29:06.120 They don't have to be a Canadian citizen.
00:29:08.920 You can just be a permanent resident here.
00:29:11.160 So you can censor, you can be Trudeau's officer of censorship, and you don't have to be a Canadian citizen.
00:29:19.860 Section 17, a member must devote the whole of their time to the performance of the functions.
00:29:23.560 And then they have a chairperson and a vice chairperson, and they talk about committees.
00:29:27.320 I'm going to skip over that because that's all administrative matter that's not substantive.
00:29:33.040 Section 25 talks about powers they can delegate.
00:29:35.540 Section 27, let's pick it up there.
00:29:40.560 When making regulations and issuing guidelines, codes of conduct and other documents, the commission must take into account freedom of expression.
00:29:49.080 Really?
00:29:49.560 Every single thing they do is infringing on freedom of expression.
00:29:56.100 You don't add laws and punishments and life sentences and fines and offices and inspections.
00:30:01.740 Literally every single word, every single comment, every single jot and tittle in this is an infringement or freedom of expression.
00:30:11.980 I think you know who might save us in the end is the big tech companies who probably have some claim against Canada for an unfair trade practice.
00:30:19.260 Ironically, I think we'll probably get more help from Facebook, YouTube, Google, and even the Chinese-owned TikTok to stop this than we will from any Canadian source.
00:30:32.020 Equality rights, privacy rights, the needs and perspectives of indigenous people of Canada and any other factor that the commission considers relevant.
00:30:39.200 Really?
00:30:39.860 Any other factor at all, eh?
00:30:42.380 Anything.
00:30:42.960 So they can literally use anything they think relevant when they go about their censorship.
00:30:48.120 All right, good to know.
00:30:49.260 Now, here's the next one.
00:30:50.540 The Digital Safety Ombudsperson of Canada.
00:30:53.800 Go ahead and show this on the screen.
00:30:55.220 Appointment of mandate.
00:30:56.980 To hold office for five years.
00:30:58.860 To support users of regulated services and advocate for the public interest with respect to systemic issues related to online safety.
00:31:06.560 Oh, I hear the word systemic issues and I hear systemic racism immediately.
00:31:10.680 So they're going to be the D.I.E., the Diversity, Inclusion, and Inequity Cultural Marxism.
00:31:17.940 Injected straight into the veins of the social media companies.
00:31:20.780 By the way, they're all completely woke except for Twitter.
00:31:23.420 You could call this the anti-Twitter law.
00:31:27.040 And I'll skim through remuneration, benefits.
00:31:29.640 They just talk about all the money this person is going to get.
00:31:33.000 The powers and duties.
00:31:35.320 Just read that.
00:31:35.800 Section 37.
00:31:36.380 In fulfilling their mandate, the ombudsperson may gather information with respect to issues related to online safety.
00:31:43.060 Including with respect to harmful content.
00:31:45.700 Such as by obtaining the perspective of users of regulated services and victims of harmful content.
00:31:50.860 So it's basically going to be a therapy session where people put their grievances together and complain to a government official.
00:31:56.380 Yeah, what could come from that other than censorship?
00:32:00.700 Highlight issues related to online safety, including by making publicly available any information gathered under paragraph A.
00:32:08.580 So basically, they're going to be an advocate, a stalking horse, an accuser.
00:32:15.820 C. Direct users to resources, including those provided for under this act, that may address their concerns regarding harmful content.
00:32:21.160 That's the get-rich-quick scheme of filing $20,000 jackpots with the Canadian Human Rights Commission.
00:32:28.460 That's really what this is.
00:32:29.840 This person will take any complainant and help them craft a complaint to go after Rebel News, True North, National Post, Toronto Sun.
00:32:37.840 That's what this is about.
00:32:39.720 Digital Safety Office of Canada.
00:32:42.880 This is the third agency.
00:32:46.260 Trudeau does not have three agencies tackling inflation.
00:32:48.620 Trudeau does not have three agencies tackling taxes, cost of living, cost of housing.
00:32:57.480 Doesn't have three agencies dealing with our decrepit military that doesn't have equipment.
00:33:03.560 But censorship is something he'll have not one, not two, but three agencies for.
00:33:07.920 So important to him, he will literally rename the Justice Minister the Minister of Justice and Online Harms.
00:33:14.720 Chief Executive Officer.
00:33:16.120 Again, Chief Executive.
00:33:17.540 That sounds pretty fancy.
00:33:18.420 See, that's going to be a quarter-million-dollar job right there.
00:33:22.840 As with the others, he's got a term of service, and he's got payments and human resources.
00:33:32.940 He may employ any employees that are necessary to conduct the work of the office.
00:33:37.180 I'm on Section 52 here.
00:33:39.480 May delegate powers.
00:33:40.940 They're setting up new agencies here, new commissions, new boards, hiring lawyers, hiring political friends.
00:33:50.220 There's no way this is going to cost less than $100 million a year.
00:33:54.580 Part four.
00:33:55.300 Duties of operators of regulated services.
00:33:57.340 So these are – I went through this in the brochure.
00:34:00.140 These are the duties that they demand Facebook, YouTube, Twitter, and other social media companies have.
00:34:06.000 An operator, that would be like Elon Musk or Mark Zuckerberg, has a duty to act responsibly in respect of a regulated service.
00:34:12.980 Well, of course, you didn't need to put that in a law.
00:34:15.600 That's in the common law.
00:34:17.640 That's – you know, you have a duty to the public.
00:34:21.580 You have a duty to your customers.
00:34:22.500 You have a contractual duty.
00:34:23.700 You have a common law duty.
00:34:24.660 You have statutory duties.
00:34:25.500 You do not need a law to say that.
00:34:29.460 That is already the case.
00:34:32.960 The operator of a regulated service must implement the measures that are adequate to mitigate the risk that users of the service will be exposed to harmful content on the service.
00:34:39.940 They already do.
00:34:41.620 They do so because they have that common law duty and because they want to have a safe place.
00:34:47.240 But their definition of a safe place is not the same one that Justin Trudeau has.
00:34:50.780 That's the purpose of this law: to change the terms of service for the social media companies. I mean, they're already woke.
00:34:57.700 They're just not woke enough for Trudeau.
00:34:59.220 And the crimes are what Trudeau is adding.
00:35:03.020 Section 55.2.
00:35:05.520 In order to determine whether the measures implemented by the operator are adequate to mitigate the risk that users of the regulated service will be exposed to harmful content on the service,
00:35:13.700 the commissioner must take into account the following factors.
00:35:16.980 The effectiveness of the measures in mitigating the risk, the size of the service, including the number of users, the technical and financial capacity of the operator,
00:35:24.560 whether the measures are designed or implemented in a manner that is discriminatory on the basis of a prohibited ground in the Canadian Human Rights Act,
00:35:30.740 and any factor provided for by regulations.
00:35:33.540 So they're now going to tell Facebook, YouTube, Google, Twitter how to run their companies.
00:35:38.000 I don't know.
00:35:40.480 It'll be interesting to see if they comply or if, like Facebook, they say, you know what?
00:35:44.880 We love you, Canada, but you enjoy the great firewall of China that Trudeau's brought over there.
00:35:49.740 We're just going to stay out of this.
00:35:51.140 I wonder if those companies will allow themselves to be used like this.
00:35:55.160 There was actually a clip.
00:35:55.920 Do you have that clip when Virani was asked?
00:35:57.900 I think it was the last clip you showed me, Olivia, when he was asked if he had any feedback from the companies?
00:36:02.840 And I don't think he's actually spoken to them about this.
00:36:05.780 Do you have that clip?
00:36:07.740 Here, let's take a listen to that.
00:36:09.100 He didn't bother to consult with them.
00:36:10.880 What could go wrong?
00:36:13.660 Now that the legislation is out, have you heard anything from social media platforms,
00:36:17.520 especially after we saw Meta pull out of Canadian news with their opposition to Bill C-18?
00:36:22.200 So I've not heard directly in terms of a sort of a formal response from the platforms.
00:36:27.680 What I can say to Canadians is that the idea is not to chase platforms out of Canada.
00:36:31.120 We want platforms operating in Canada, but we want them operating at a baseline standard of safety.
00:36:36.440 I find it troubling that we've got more rules that relate to the Lego in my kid's basement
00:36:41.480 than we do for the most difficult and dangerous toy that is in every Canadian's home right now,
00:36:46.800 which is the smartphone or the screen that their children are in front of.
00:36:49.640 We need to establish baseline standards to protect the safety of Canadian children
00:36:53.280 and to protect vulnerable people from hatred.
00:36:55.600 That is what we're doing with this legislation, which I think is fundamental for the Ministry of Justice.
00:37:01.120 Every word of that is a lie, of course.
00:37:03.720 Frankly, I don't know what laws there would be about Lego in your living room.
00:37:06.980 I can't even think of what one of them would be.
00:37:09.380 I think that's just such a – whoever came up with that talking point,
00:37:12.100 maybe that's that new PR whiz the liberals have hired.
00:37:16.760 There are hundreds of laws that would apply to Facebook, YouTube, etc.,
00:37:20.820 including the entire criminal code.
00:37:24.240 Just because it's the internet doesn't mean it's immune from the same laws that apply to the telephone,
00:37:28.800 to the fax machine, to handwritten notes.
00:37:32.680 You can rob a bank by saying, give me your money, by writing a note, give me your money,
00:37:37.820 by sending an email, give me your money.
00:37:39.240 I suppose you could rob a bank with a podcast, theoretically.
00:37:43.540 Robbing a bank is illegal no matter the format you use.
00:37:47.000 And laws against child pornography are made more complicated by the technology in terms of their policing.
00:37:54.340 But the laws are the same.
00:37:55.400 Now, by the way, I support increased child pornography laws,
00:37:57.880 but you see how that was his emphasis there instead of the hate speech stuff.
00:38:01.640 When he said he didn't talk to these foreign companies at all,
00:38:06.640 I'm just getting deja vu all over again because Facebook said,
00:38:09.280 we're going to pull out if you make us pay $100 million to Trudeau's friends.
00:38:12.680 And they didn't think Facebook was serious and Facebook was.
00:38:16.460 You cannot get a news story about Canada on Facebook,
00:38:19.880 which is a terrible setback to every single Canadian news outlet,
00:38:24.300 except for the CBC, which doesn't need traffic.
00:38:27.140 They just care about one viewer, Justin Trudeau.
00:38:28.800 But every other media company in Canada has suffered a decline in viewership
00:38:32.520 because Facebook was a source of viewers.
00:38:35.340 So they just devastated the news industry.
00:38:38.000 It'll be fascinating to see what happens here.
00:38:42.120 I'm going to take it.
00:38:43.140 I see a couple of rumble rants coming in.
00:38:45.160 I'm going to read a few of them.
00:38:46.740 Rora Bennu says, hey, Ezra, have a wonderful day.
00:38:48.720 What about domestic disputes?
00:38:51.140 I don't think that that applies to this legislation.
00:38:53.900 I haven't seen any of it.
00:38:54.920 I mean, unless you're tweeting about a spouse,
00:38:56.520 but I think that that would be, I don't think it would be covered by this law.
00:39:00.380 Nana Awake says, so murderers walk free in Trudeau's Canada,
00:39:03.220 but you disagree with Trudeau or her feelings are locked up for life?
00:39:05.880 Welcome to enforcement of my idiocy in China.
00:39:09.900 That's what this, I thought when I first heard life in prison,
00:39:12.800 I thought someone was joking.
00:39:14.060 But no, it's right in there.
00:39:16.240 Candy 3, $5.
00:39:17.520 Amanda Todd's mom is on the news stating that this could have saved her daughter.
00:39:20.540 What part of the bill applies to this?
00:39:24.120 There are portions of this bill that have to do with revenge porn and child pornography.
00:39:29.440 I'm not familiar with the case of Amanda Todd.
00:39:32.800 Some of those things have been prosecuted at the provincial level.
00:39:38.540 And by the way, I'm completely open to that.
00:39:40.320 I don't know a lot about that area of law.
00:39:42.920 That's not what this bill is about.
00:39:45.460 This bill is about fomenting feelings of hate.
00:39:49.240 This bill is about life sentences for hate.
00:39:53.980 And hate is a human emotion.
00:39:56.600 They added the terrorism stuff, which is already illegal.
00:39:59.320 They added the child pornography stuff, which is already illegal.
00:40:01.840 I think they could improve on it.
00:40:03.260 They added that to distract.
00:40:05.780 Candy 3, I want to know how they see it applies to her.
00:40:08.240 I don't trust what they say, which is why I ask here.
00:40:10.480 How are they twisting it or outright lying?
00:40:13.020 Well, we've been going through things.
00:40:14.220 I'm just going to, and I'm sorry, I'm not an expert in the case of Amanda Todd.
00:40:17.580 A 15-year-old Canadian student and victim of cyberbullying who hanged herself at home.
00:40:23.060 You know what?
00:40:23.880 I do remember this case now.
00:40:25.620 And there was also a case, if I'm not mistaken, in Atlantic Canada somewhere.
00:40:31.820 I'm sorry the name didn't ring a bell immediately to me.
00:40:33.920 I can't answer that right now.
00:40:37.920 But so far, you know, I believe that revenge porn should be made illegal.
00:40:41.800 You don't need a 100-page bill and setting up three new censorship offices to make revenge porn illegal.
00:40:48.620 You need one line in the criminal code.
00:40:51.100 This is not about that.
00:40:52.440 That's the distraction.
00:40:53.340 I think we should do that.
00:40:55.060 I'm not up to speed with what the criminal code provisions are on revenge porn.
00:40:58.960 I'm sorry, it's not something I've looked into.
00:41:01.460 But that's not what this law is about.
00:41:06.920 I'm going to Google it right now.
00:41:10.520 I'm Googling revenge porn.
00:41:13.960 Canadian criminal code.
00:41:15.040 I hope that this search doesn't yield the results I don't want.
00:41:18.420 Oh, good.
00:41:19.000 The first result was non-consensual distribution of intimate images.
00:41:24.360 That's the first hit I got.
00:41:25.600 I'm glad I didn't get a porn result.
00:41:28.400 Cyber bullying and non-consensual distribution of intimate images.
00:41:33.660 I'm reading a Government of Canada page.
00:41:35.220 I don't want to go deep into it right now.
00:41:38.620 Here.
00:41:39.240 You know what?
00:41:40.540 Yeah, you found it.
00:41:42.820 Can you pump that up?
00:41:43.840 I know we've got Marty on deck.
00:41:45.460 But I just want to answer your question properly.
00:41:47.840 I'm so glad you asked it.
00:41:48.920 And I'm so glad I took the time to look.
00:41:50.280 And Olivia, you found it the same time I did.
00:41:52.120 Can you put that on the screen?
00:41:53.300 So this is the law right now.
00:41:56.240 We are looking at the criminal code of Canada as it stands now.
00:42:00.240 So let's read this.
00:42:01.740 By the way, great question.
00:42:03.520 Thank you very much.
00:42:04.940 And we found the answer.
00:42:06.120 Let me read it.
00:42:08.060 This section is called publication of an intimate image without consent.
00:42:12.400 Let's read it together.
00:42:13.320 Everyone who knowingly publishes, distributes, transmits, sells, makes available or advertises an intimate image of a person knowing that the person depicted in the image did not give their consent to that conduct or being reckless as to whether or not that person gave their consent to the conduct is guilty.
00:42:28.260 Of an indictable offense and liable to imprisonment for a term of not more than five years, or of an offense punishable on summary conviction.
00:42:36.160 And then they define intimate image, including that in it the person is nude, exposing his or her genitals, et cetera.
00:42:44.700 At the time of the recording, there were circumstances that gave rise to a reasonable expectation of privacy.
00:42:51.480 And the person depicted retains a reasonable expectation of privacy at the time the offense is committed.
00:42:55.240 And then there's more details there.
00:43:02.820 But my point is that this law, it looks like it was passed in 2014.
00:43:11.160 If I'm reading the notes here.
00:43:14.680 That's another way of saying I think this law has been on the books for 10 years.
00:43:17.860 And the case you asked me about, Amanda Todd, was in, I think it was 2012.
00:43:30.700 So I think that this amendment to the criminal code probably came about in reaction to the Amanda Todd case.
00:43:36.660 And I think there was another case similarly.
00:43:40.060 Do you see my point?
00:43:42.340 And I'm not very familiar with that area of law.
00:43:45.060 I just read it with you together.
00:43:47.860 But do you see my point?
00:43:49.520 I said that the child pornography and the revenge porn aspects here are a distraction.
00:43:54.700 And because of your excellent rumble comment, I just Googled it with Olivia.
00:43:59.440 And we demonstrated that that has actually been against the law, punishable by five years in prison.
00:44:05.080 And that's been the state of the law since 2014, if I'm reading that provision correctly.
00:44:12.620 It has been, revenge porn has been illegal in Canada for 10 years.
00:44:15.660 And there may have even been ways to prosecute it before then.
00:44:18.580 But at least for 10 years, it's been in the criminal code.
00:44:22.840 And the liberals are claiming this bill is about revenge porn today.
00:44:25.820 It is not.
00:44:27.280 This bill today is about censorship, using revenge porn, using child porn as the excuse.
00:44:32.360 Do you see what I'm saying?
00:44:33.700 It's already against the law to incite violence.
00:44:35.780 Of course, you knew that.
00:44:37.180 It's already against the law to incite terrorism.
00:44:40.040 Of course, you knew that.
00:44:40.760 You knew that.
00:44:41.340 You know that.
00:44:42.460 Are you going to be tricked and deceived by Arif Virani when he says, no, no, no, no.
00:44:47.200 We've got to stop revenge porn.
00:44:48.980 Yeah, I agree with you, buddy.
00:44:50.180 And with a time machine, we can go back to 2014 and vote for it again if you want.
00:44:55.300 But you are just a liar if you're saying that's what this bill is about.
00:44:59.260 All right.
00:44:59.680 I'm so glad that we took that rumble rant and looked into it.
00:45:02.560 I'm just simply not familiar with that section of the criminal code.
00:45:04.920 But now I am.
00:45:06.320 And what was that section?
00:45:07.860 Was that 162?
00:45:13.780 Section 162.1.
00:45:17.400 Just for you to remember, if anyone ever asks you about that, you can say section 162.1 of the criminal code.
00:45:26.180 It's been illegal.
00:45:27.120 Revenge porn has been illegal in Canada, punishable by up to five years in prison.
00:45:31.140 It's been illegal for 10 years.
00:45:33.440 Don't let Trudeau trick you.
00:45:34.920 All right.
00:45:35.420 Without further ado, I want to bring in Marty Moore, one of my friends from the Justice Centre for Constitutional Freedoms.
00:45:38.840 He's been waiting patiently as I did a little bit of Googling about the criminal code.
00:45:43.300 And I'm so glad I did.
00:45:44.140 Marty, great to see you.
00:45:45.020 Welcome to the live stream.
00:45:47.640 Thanks, Ezra.
00:45:48.300 Great to be with you.
00:45:49.600 Likewise.
00:45:50.140 So give our viewers a little bit of a biography.
00:45:52.420 Where are you joining us today from?
00:45:54.120 And you are a staffer with the Justice Centre.
00:45:56.360 Am I right?
00:45:57.520 Well, thank you, Ezra.
00:45:58.320 I'm joining you from Calgary.
00:45:59.540 I've been working with the Justice Centre for about 10 years.
00:46:01.840 And I currently lead a team of lawyers across the country,
00:46:04.920 who are funded by the Justice Centre to defend constitutional freedoms.
00:46:08.640 So it's a privilege to join you.
00:46:10.540 But obviously, our work is not done.
00:46:12.920 In fact, with a bill like this, it's hard to know when it will ever end with the kind of ideas that you see being propagated in this bill, which are so illiberal.
00:46:22.560 Interestingly, being brought forward by the former Liberal Party, if you might say.
00:46:29.020 I'm not sure what to call it now.
00:46:30.440 Well, I went through a 17-page media handout that the government gave yesterday.
00:46:34.960 And I'm sort of marching my way through the bill.
00:46:38.500 I'm only at page 17.
00:46:42.300 So, you know, I'm taking my time.
00:46:45.100 Maybe, I presume that you've read it.
00:46:47.400 Instead of us going through it, you know, section by section, maybe you can tell me what concerns you the most.
00:46:55.060 Sort of skip ahead to the bad parts, so to speak.
00:46:57.860 Like, what do you think is the worst part of this?
00:47:00.200 Or there's probably several of them.
00:47:02.420 You know, Ezra, to pick a worst part, I think one of the worst parts we can find is the amendment to the criminal code.
00:47:10.300 That they are wanting to put forward to allow for people, anyone with the consent of the Attorney General of Canada, to bring an information forward to say, not that you have committed a crime, a speech crime, but that you may commit a crime.
00:47:28.900 That they have reasonable grounds to believe that you will commit a crime.
00:47:31.980 And what kind of crimes are we talking about?
00:47:33.320 We're talking about crimes such as inciting hatred or the most recent amendments to the criminal code that have passed, you know, downplaying the Holocaust.
00:47:44.600 And in order for them to come to court and say, you haven't committed this, but we think that you will commit this crime, a provincial court can then order you, on the basis of someone else's belief, has to be reasonable, but that's in the eyes of a court to find.
00:48:01.020 They can order you to undertake a recognizance.
00:48:06.440 And that is a court order that you promise to do a bunch of different things.
00:48:11.220 And under this new piece of legislation, it's not just a normal kind of undertaking, you know, we're going to keep the peace.
00:48:18.380 It's we can put an ankle monitor on you.
00:48:21.780 We can issue a curfew that keeps you in your house.
00:48:25.940 We can order that you submit your bodily fluids for the registry.
00:48:30.080 We can pause right there.
00:48:31.640 I want to call that up on the screen.
00:48:32.820 Olivia, can you in the PDF document type in the word recognizance?
00:48:39.020 And it's the seventh or the sixth.
00:48:44.120 It's one of the first hits there.
00:48:46.120 Scroll ahead to where it says conditions in recognizance.
00:48:49.520 Because Marty's talking about, you know, you click ahead to the next one.
00:48:53.040 You know how you can go to the next one and find and then the next one.
00:48:56.720 And then, yeah.
00:48:58.420 So the provincial court may commit the defendant to prison for a term of not more than 12 months if the defendant fails or refuses to enter the recognizance.
00:49:07.260 And then, Marty, I'm just going to rattle through the recognizance.
00:49:09.620 I'll go straight back to you.
00:49:10.440 I just want to show on the screen as you mention it.
00:49:13.220 The provincial court judge may add any reasonable conditions to the recognizance that the judge considers desirable to secure the good conduct of the defendant, including wearing an electronic monitoring device, return to and remain at their place of residence.
00:49:28.420 So house arrest, abstain from the consumption of alcohol, provide for the purposes of analysis, a sample of bodily substance, provide a sample on regular intervals, abstain from communicating directly or indirectly with any person or refrain from going to any place.
00:49:52.420 Firearms, ban from firearms, surrender, anything.
00:50:08.420 I've never seen such a draconian.
00:50:11.600 Is this the provision you're talking about?
00:50:13.420 I haven't gone this far ahead, but I did.
00:50:15.760 I mean, I was familiar with this with the earlier version of this bill.
00:50:19.300 Is that what you're talking about, Marty?
00:50:20.460 This is what I'm talking about, Ezra.
00:50:22.380 Like, this is so bizarrely draconian.
00:50:26.420 And again, you do not find these kinds of conditions for even serious crimes.
00:50:31.440 You find these kind of conditions being imposed on someone who has been, there's evidence against them for committing actual terrorism.
00:50:40.000 And what we are talking about here, Ezra, we're not talking about someone actually having committed offense.
00:50:44.340 No, if you go up to section sub three under adjudication, we're talking about a provincial court judge finding that an informant has reasonable grounds for the fear that you will do something that violates the speech provisions of the criminal code.
00:51:05.740 And just to remind you of how broad our criminal code speech provisions are getting, and they're going to get broader under this bill if it passes.
00:51:15.460 We're talking about provisions such as, and it listed there, 319 sub 2 of the criminal code.
00:51:25.780 And what does 319 sub 2.1 of the criminal code say?
00:51:29.560 It says, everyone who, by communicating statements other than in private conversation, willfully promotes anti-Semitism, of course, we're all against that, by condoning, denying, or downplaying the Holocaust.
00:51:40.980 So what we are saying is that if this law passes, if someone is concerned that someone else might in the future downplay the Holocaust, well, then they can take that concern.
00:51:54.320 And if that concern is found to have reasonable grounds, the judge in that case can issue an order that you submit to a recognizance with all of these conditions, or if you don't submit to a recognizance, that you go to jail for a year.
00:52:08.920 That's what we're talking about here, Ezra.
00:52:10.360 You know what, and the section right before that, there's section 319.2, and then 319.2.1.
00:52:23.440 You know, it's very interesting what they're doing here.
00:52:25.260 I think I know what they're doing, Marty.
00:52:29.020 There's a left-wing Jewish, or there's left-wing Jewish organizations in Canada who for years have supported human rights censorship, even though it's greatly backfired against the community.
00:52:40.360 I think this is their way of sort of buying off some of the official Jews who are more censorship-oriented, saying, look, yeah, we're going to smash a lot of things here.
00:52:51.660 We're going to smash a lot of civil liberties.
00:52:53.000 But we're going to let you smash your enemies here.
00:52:56.600 This will be used against Jews, too.
00:52:58.340 I mean, earlier, Marty, I talked about how I published the Danish cartoons of Mohammed.
00:53:03.060 I was prosecuted for three years for that.
00:53:05.320 And the Jewish community actually expressed support for these kind of censorship provisions 50 years ago when they were drafted.
00:53:13.920 But that kind of minority report preemptive censorship you refer to, whether it's against Jews or Muslims or Christians or anybody, is a shocking departure from how we operate in Canada.
00:53:29.320 We don't believe in a pre-crime.
00:53:31.020 We don't believe in prior restraint.
00:53:33.780 Oh, Your Honor, he hasn't said anything yet, but I just know he's going to.
00:53:37.660 He hasn't done anything yet, Your Honor, but boy, I feel it in my bones.
00:53:41.480 And look at him. He's the kind of guy who might say something wrong.
00:53:44.960 And so while I myself, of course, don't like Holocaust denial, I also know one thing, Marty.
00:53:50.340 If someone doesn't believe the Holocaust happened, throwing them in jail is not going to change their mind.
00:53:55.360 It's actually probably going to confirm for them that you're trying to cover up the truth.
00:53:58.220 If someone denies the Holocaust, I think the first thing you've got to say is, why did we bring this person to Canada, hundreds of thousands of people to Canada who deny the Holocaust?
00:54:06.200 The second more practical thing is to try and persuade them otherwise, but simply threatening them with prosecution is not going to change their mind.
00:54:13.980 Absolutely. And of course, this applies to more than simply Holocaust denial or downplaying the Holocaust, as this statement reads.
00:54:23.160 It applies to the broader category of all of the hate offenses, you know, the promotion of hatred.
00:54:29.160 And again, we're not talking about actually adjudicating whether or not, based on a standard of beyond a reasonable doubt, someone has, in fact, advocated for genocide or willfully promoted hatred.
00:54:40.060 We're just talking about whether someone can convince a provincial court judge, any provincial court judge in Canada, that you have a reasonable ground for fear that someone will, for example, advocate genocide or promote genocide.
00:54:53.840 Well, we can certainly see how that could cut both ways in the heated conversation that we have.
00:54:58.580 These terms are commonplace across the country, whether it's Jewish or Palestinian or any issue of debate.
00:55:06.180 Let me pull up Section 319. I just sent you the link in the live stream Slack channel.
00:55:12.620 Section 319, I've always been against this section in the criminal code.
00:55:17.340 Section 319 is called Public Incitement of Hatred.
00:55:22.000 And Section 319 2.1 is about the Holocaust, and we've been talking about that.
00:55:28.880 But that's just one detail. Look at how 319 starts.
00:55:31.640 Section 319.1, everyone who, by communicating statements in any public place, incites hatred against any identifiable group, where incitement is likely to lead to a breach of the peace, is guilty of an indictable offense, liable to imprisonment of a term not exceeding two years.
00:55:48.540 And then Section 319.2, everyone who, by communicating statements, other than in a private conversation, willfully promotes hatred against any identifiable group, is guilty and has two years.
00:55:59.760 So, you were referring to 319.2.1, which specifically talks about the Holocaust.
00:56:05.520 But 319.1 and 319.2 talk about any ground.
00:56:11.680 And let me remind our viewers, that's not just age, race, gender.
00:56:16.680 That's now gender identity.
00:56:18.720 That's now gender expression.
00:56:20.620 That's transgenderism.
00:56:22.020 So, if you have expressed anything that is likely to expose a person to, you know, feelings of hatred, you're guilty of a crime.
00:56:35.140 And now, as you point out, Marty, you don't even have to do it yet.
00:56:38.420 So, for example, Rebel News talks about public controversies, including extreme cases of transgender activism, where big men play sports against women and even girls.
00:56:50.180 And I can see some transgender activists going before a judge and saying, Your Honor, they haven't broken the law yet.
00:56:58.000 But, boy, they're getting close.
00:56:59.440 I need you to put David Menzies under house arrest.
00:57:02.140 I need you to put Ezra Levant under house arrest.
00:57:05.180 I need you to put an ankle monitor on them.
00:57:07.620 I need you to ban them from communicating with certain people.
00:57:11.480 They haven't done anything yet.
00:57:13.000 But we need that ordered, Your Honor.
00:57:15.880 And that's the power that this judge has.
00:57:17.900 Am I correct, Marty?
00:57:19.060 I'm relying on you to correct me because I'm sort of working my way through this legislation in real time.
00:57:23.660 Ezra, we are reeling, I think, our whole legal team that we're looking at this right now.
00:57:28.680 This is some of the worst drafted legislation.
00:57:31.820 And when you look at this legislation, you expect somebody to all of a sudden say, ah, no, no, no.
00:57:37.140 Ezra, we're going too far.
00:57:38.320 We've gone past the line.
00:57:39.040 No, I have not seen any check on this yet.
00:57:43.900 Having read this and reading it again here, we're talking about real imprisonment of people, or agreeing to all these conditions, just because someone else is judged by a provincial court judge to have a reasonable fear that they might cross the line into some level of hatred.
00:58:05.120 And we know that the definition of hatred is a squishy definition to start with.
00:58:09.440 It's one of the most difficult legal concepts to explain because it's also very much in the eye of the beholder, in the eye of the hearer or the ear of the hearer, I guess I should be saying here.
00:58:20.220 And now, if you have a reasonable fear that someone's going to speak hatred in some form, you are going straight to jail, as they say, or you're going to agree to all of these conditions at the discretion, again, respectfully, but of a provincial court judge simply based on whether there is a reasonable grounds for the fear.
00:58:41.820 And so, again, this is just one provision, one absolutely draconian provision, but only one provision of concern in this piece of legislation.
00:58:51.840 So I want to quickly show you some of those criminal code provisions that Marty referred to, and I'm not going to go through in detail, but I just want to show them to you.
00:59:00.760 If you can put those on the screen there, like this is just a small excerpt.
00:59:06.920 It's called Sexual Offenses, Part 5.
00:59:11.960 Offenses, you know, every person commits an offense who makes prints, publishes, distributes, circulates, or has in their possession an obscene thing, and then they define it, sell it, expose it.
00:59:22.840 It goes on and on, there's section after section. In section 163.1, they define child pornography.
00:59:31.980 They talk about making, distributing, possessing, accessing child pornography, and aggravating factors.
00:59:39.240 This has been in the law for decades.
00:59:43.460 In fact, you can see some of these provisions predate the internet.
00:59:47.140 Many of these provisions go back generations.
00:59:52.840 Child pornography, of course, was a scourge even before the internet.
00:59:56.000 The internet has vastly expanded the scope of it, but the law has kept up.
01:00:02.320 The law has been updated and adjusted.
01:00:04.860 I see certain provisions in the version on the screen here were adjusted in 1993.
01:00:10.060 Some were adjusted in 2018.
01:00:11.660 This is constantly updated to keep up with the times, both technological and legal.
01:00:18.800 There's a 2015 update.
01:00:21.660 The criminal code is amended all the time, and I'm not saying it shouldn't be amended more.
01:00:26.420 I'm not saying there aren't more improvements that can be done.
01:00:28.600 But to pretend that, that without the law, the online harms bill, child pornography or revenge porn or inciting terrorism would not be happening is a lie.
01:00:44.380 Understand the reason they're lying to you, because they're trying to make you say, no, there's some good parts to it.
01:00:50.280 The good parts to it are already in law.
01:00:53.740 The banning of revenge porn, I showed it to you.
01:00:58.240 Ten years ago, the law was passed.
01:01:00.160 Five years in prison.
01:01:01.560 Arif Virani didn't do that.
01:01:05.560 It was actually done under the Harper government.
01:01:07.980 Okay, so I'm going to speed through this.
01:01:10.540 Just talking about the power to issue regulations and guidelines.
01:01:14.380 Tools to block users, Section 58.
01:01:16.320 The operator of a regulated service must make available to users who have an account or otherwise registered tools that enable those users to block other users.
01:01:26.880 Does anyone not know that exists?
01:01:29.240 Like I said before, if you don't have the blocking function on your social media app, you're not actually allowed to sell it on the app store.
01:01:35.440 This is an example of a law drafted by people who either know nothing about social media.
01:01:42.820 I have an idea: a block button.
01:01:46.520 The liberals very well know about block buttons because they've been trying to block Rebel News and we've been taking them to court to fight that.
01:01:51.720 But they either don't know anything about social media, which is possible, or more likely they know that this is already the case, but they're trying to distract from the true essence of this bill.
01:02:02.560 Tools and processes to flag harmful content.
01:02:04.800 The operator of a regulated service must implement tools and processes to enable a user to easily flag the operator content that's harmful, et cetera, et cetera.
01:02:15.140 That is ubiquitous in social media.
01:02:17.360 Like I say, it's already in the law.
01:02:19.320 Sorry, I don't know if it's in the law.
01:02:20.640 It's already in practice.
01:02:22.240 So this law is redundant, unnecessary, other than it's to camouflage their true intent.
01:02:31.800 Let's scooch down a little bit.
01:02:33.260 Section 60, multiple instances of automated communication by computer program.
01:02:39.460 It's basically saying ban spam.
01:02:42.260 I can assure you that stopping spam is an enormous priority for social media companies.
01:02:48.240 I think it's one of the main things that Elon Musk has railed against since buying the app is how to stop spam bots.
01:02:55.880 And he's come up with ideas like making accounts pay a few dollars or whatever.
01:03:01.500 Do you think you need to tell Facebook, Twitter, YouTube, Instagram that spam is a problem?
01:03:07.960 They're going full tilt on it.
01:03:09.440 Does the government have better solutions than them?
01:03:11.240 I doubt it.
01:03:11.800 But, again, this isn't what the Online Harms Act is really about.
01:03:18.160 Section 61.
01:03:20.800 The operator of a regulated service must make a resource person available to users of the service to hear users' concerns.
01:03:27.980 Every single app does that.
01:03:30.360 I mean, they don't have someone on a telephone, but they have automated systems.
01:03:34.080 Contact information accessible.
01:03:36.140 The operator must ensure that the resource person is easily identifiable and the resource person's contact info is easily accessible.
01:03:41.980 Obviously not a phone number, but how is that not what they do right now?
01:03:45.900 And even if it was, this is not the essence of the law.
01:03:50.240 Section 73.
01:03:51.880 The commission may, on request and in accordance with the criteria set out in the regulations, accredit a person other than an individual for the purpose of giving that person access to the inventories of electronic data that are included in the digital safety plans.
01:04:04.660 The person's primary purpose is to conduct research or engage in education, advocacy, or awareness activities.
01:04:10.340 And the person conducts research that is or engages in education, advocacy, or awareness activities.
01:04:15.900 Now, I don't know what this means in reality.
01:04:18.360 I don't know what they're trying to pull here.
01:04:19.980 But I think what they're saying is they want the ability to root around in Elon Musk's internal records.
01:04:26.780 I think what they want to do is they want to poke around inside Facebook, YouTube, Instagram, and take a good look around just out of curiosity, maybe out of competitive interest.
01:04:39.060 I don't know.
01:04:40.100 We don't really know what this is.
01:04:41.280 What does it mean when the purpose is education, advocacy, and awareness activities?
01:04:47.100 So you're going to poke around in the inside of a company, in the private stuff, in the name of advocacy, advocating what, on behalf of whom, towards what end?
01:04:56.740 I think it's a disaster.
01:04:58.140 And I'm guessing the companies will rightfully refuse it if it's for those purposes.
01:05:03.520 Every operator commits an offense.
01:05:05.660 So this is – I'm just going to spend a little bit of time here.
01:05:08.340 This is basically the penalties for Facebook, YouTube, Instagram, Twitter, et cetera.
01:05:12.200 If they don't follow the law, here's what could happen to them.
01:05:14.720 Every operator commits an offense that contravenes an order of the commission, that obstructs or hinders the commission, or an inspector, that makes a false or misleading statement.
01:05:24.600 And every operator that commits an offense under subsection one is liable, on conviction on indictment, to a fine of not more than 8% of the operator's gross global revenue, or $25 million, whichever is greater.
01:05:42.320 So if Facebook commits an offense here, the Trudeau government is going to take either 8% of Facebook's revenue, or $25 million, whichever is greater.
01:05:57.800 Facebook revenue, 2023, what do you think it was?
01:06:01.400 $134 billion.
01:06:04.440 $134 billion times 8% is $10 billion.
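That greater-of arithmetic can be checked with a quick sketch. This is an illustration of the formula as described in this transcript, not language from the bill; the function name and the revenue figure used are ours (the latter is the roughly $134 billion quoted above).

```python
# Sketch of the penalty formula described above: the greater of
# 8% of an operator's gross global revenue or $25 million.
# (max_fine is an illustrative name, not a term from the bill.)
def max_fine(gross_global_revenue: float) -> float:
    return max(0.08 * gross_global_revenue, 25_000_000)

# Facebook's reported 2023 revenue, roughly $134 billion USD:
print(max_fine(134_000_000_000))  # about 10.7 billion

# A smaller operator would hit the $25 million floor instead:
print(max_fine(100_000_000))  # 25000000
```

The $25 million floor matters: for any operator with under about $312.5 million in revenue, 8% falls below $25 million, so the flat amount governs.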
01:06:18.900 What do you think Facebook, YouTube, Google, Instagram, Twitter, TikTok are going to say when they read that?
01:06:25.920 They're going to say, well, we won't break the rules.
01:06:30.020 We're not going to break the law.
01:06:32.260 We're lawful companies.
01:06:33.640 I mean, that's what Facebook said.
01:06:34.900 Facebook was faced with C-18, which said, you have to give $100 million to Trudeau's friends in the media if you link to their work.
01:06:42.980 So Facebook said, well, we're not lawbreakers.
01:06:45.560 So we just won't link to their work.
01:06:48.140 So we're in full compliance because the law doesn't apply to us now.
01:06:51.640 We don't link to Canadian news.
01:06:53.340 So good luck with all that.
01:06:55.660 So Facebook is not breaking the law.
01:06:57.680 They're just not participating in a business activity that puts them at $100 million cost.
01:07:02.380 So I just did the math on Facebook's global.
01:07:08.120 And that's, by the way, that's $10 billion U.S. dollars.
01:07:11.440 That's about $13 billion Canadian.
01:07:13.440 Do you think that it's worth the risk for Facebook to operate in Canada at all?
01:07:18.640 Well, if there's a law that says they could be fined $10 billion U.S. for an offense, I don't think Facebook has made $10 billion in Canada in the last decade.
01:07:32.080 Maybe I'm wrong.
01:07:32.620 Maybe they have.
01:07:33.220 I don't know.
01:07:37.820 Maybe this is an opening offer.
01:07:39.340 But I think that you're going to see a USMCA, the United States-Mexico-Canada free trade agreement, complaint out of this.
01:07:45.600 Section 121, persons other than an operator.
01:07:52.720 If you can contravene the law, you could be, you know, there's fines there.
01:07:59.080 If an individual, $50,000 fines, like just enormous fines, but no prison.
01:08:08.600 We're very, very glad.
01:08:09.620 Section 124 says you won't go to prison for lying to the commission, but we can put you in prison for life, as we talked about, for hate speech.
01:08:18.680 So this is the most terrifying part of the law.
01:08:23.500 So what this is, is this law called the Online Harms Act will amend other laws.
01:08:29.440 It'll amend the Human Rights Act.
01:08:30.900 We'll talk about that a little bit later.
01:08:32.000 But it'll also amend the Criminal Code.
01:08:35.400 And it's creating a new hate crime right here.
01:08:39.040 Or gender identity or expression.
01:09:09.040 Is guilty of an indictable offense and liable to imprisonment for life.
01:09:12.580 Let me say that again.
01:09:15.960 If the government thinks you hate someone based on their gender identity or expression.
01:09:25.660 So, for example, if there is a grown man in a changing room with girls.
01:09:33.040 Because his gender identity is, well, no, I'm a trans woman.
01:09:39.040 And if you react to that person in any manner that shows hatred, which is defined as detestation or vilification.
01:09:47.460 Get ready to go to prison for as long as Paul Bernardo.
01:09:58.540 That's shocking.
01:10:00.160 Do you see that?
01:10:00.940 Sexual orientation or gender identity or expression.
01:10:04.620 Get ready to go to jail for life.
01:10:08.840 The next one is even more terrifying.
01:10:11.860 Fear of hate propaganda offense.
01:10:14.700 A person may, with the Attorney General's consent, lay an information before a provincial court judge.
01:10:20.860 This is what we were talking about with Marty.
01:10:22.180 If the person fears on reasonable grounds that another person will commit an offense.
01:10:36.000 So, this is the pre-crime.
01:10:38.620 This is the time machine future crime.
01:10:44.060 This is the Tom Cruise movie Minority Report.
01:10:47.140 I'm just going to read this one more time slowly because it's so shocking.
01:10:49.620 I want you to get it.
01:10:51.480 This section is called Fear of Hate Propaganda.
01:10:54.300 Not hate propaganda itself.
01:10:55.580 That's obviously already outlawed.
01:10:56.760 Go to jail for life or whatever.
01:10:58.540 But if you're just afraid that someone might in the future possibly, well, that's a crime too for them.
01:11:04.880 Fear of hate propaganda offense.
01:11:06.540 A person may, with the Attorney General's consent, lay an information before a provincial court judge.
01:11:12.600 If the person fears on reasonable grounds that another person will commit an offense that we've described about hate speech about transgenderism or minority or whatever.
01:11:25.340 You can get a court order against someone just because you're afraid of them.
01:11:31.640 They haven't done anything to you yet, but you think they might.
01:11:36.540 Subsection 3, if the provincial court judge before whom the parties appear is satisfied by the evidence adduced, that the informant has reasonable grounds for the fear.
01:11:50.040 The judge may order that the defendant enter into recognizance to keep the peace and be of good behavior for a period of not more than 12 months.
01:11:58.440 Next section.
01:11:59.320 If, however, the provincial court judge is also satisfied that the defendant was convicted previously of any offense,
01:12:04.200 The judge may order that the defendant enter into recognizance for a period of not more than two years.
01:12:09.960 The provincial court judge may commit the defendant to prison for a term of not more than 12 months if the defendant fails or refuses to enter into recognizance.
01:12:17.520 So let me summarize.
01:12:20.540 Someone's afraid that you might.
01:12:22.660 You haven't done anything yet.
01:12:24.180 Someone's afraid that you might.
01:12:25.660 The judge says you're under house arrest.
01:12:27.140 You've got to wear an ankle bracelet.
01:12:28.300 No more alcohol for you.
01:12:29.360 You're not allowed to go to these five places.
01:12:30.800 And if you don't do that, you're straight to jail.
01:12:32.900 Can I get the Parks and Rec straight to jail clip?
01:12:38.440 I mean, remember that?
01:12:39.380 Have you ever watched Parks and Rec?
01:12:40.520 I didn't watch a lot of it.
01:12:41.580 But there's this funny scene where Fred Armisen plays some little dictator from some Latin American country.
01:12:47.920 Just such a great job.
01:12:50.360 And this is not a laughing matter, but we're going to take a second to laugh at it.
01:12:53.340 Go ahead and play that clip.
01:12:54.840 They're holding coffee.
01:12:56.380 This is outrageous.
01:12:57.220 Where are the armed men who come in to take the protesters away?
01:13:00.780 Where are they?
01:13:01.620 This kind of behavior is never tolerated in Boraqua.
01:13:04.740 You shout like that, they put you in jail.
01:13:07.300 Right away.
01:13:08.700 No trial, no nothing.
01:13:10.480 Journalists, we have a special jail for journalists.
01:13:13.380 You're stealing, right to jail.
01:13:15.300 You're playing music too loud, right to jail.
01:13:17.140 Right away.
01:13:17.840 You're driving too fast, jail.
01:13:19.540 Slow, jail.
01:13:20.540 You're charging too high prices for sweaters, glasses.
01:13:24.900 You're right to jail.
01:13:25.640 You undercook fish, believe it or not, jail.
01:13:28.220 You overcook chicken, also jail.
01:13:31.180 Undercook, overcook.
01:13:32.380 You make an appointment with a dentist and you don't show up, believe it or not, jail.
01:13:35.500 Right away.
01:13:36.300 We have the best patients in the world because of jail.
01:13:41.620 That's funny because it was a joke when that was filmed probably 10 years ago.
01:13:47.720 But it's not funny now because that's what happens in Canada.
01:13:50.960 Did you say something mean about someone whose gender identity or gender expression you disagree
01:13:58.740 with because maybe they were in a girl's changing room?
01:14:02.680 Straight to jail.
01:14:04.060 Undercooked fish, overcooked chicken.
01:14:06.640 Straight to jail.
01:14:07.340 Undercooked, overcooked.
01:14:08.420 That's very funny because we have to laugh.
01:14:10.660 Now, this is the part I'm really scared about.
01:14:12.420 Orders regarding discriminatory practices.
01:14:17.600 A discriminatory practice, as described in sections 5 to 14.1, may be the subject of a complaint.
01:14:24.520 And anyone found to be engaging or have engaged in a discriminatory practice may be subject to an order.
01:14:28.960 And as I read to you earlier, they include an internet posting as a discriminatory act.
01:14:35.220 It's not.
01:14:36.540 A discriminatory act is when you choose amongst things based on some characteristics.
01:14:42.540 So discrimination is you're allowed to come into this restaurant.
01:14:48.260 You're not.
01:14:49.140 You're allowed to book a ticket on this airline.
01:14:51.480 You're not, based on race or religion or whatever.
01:14:54.140 We do allow certain discriminations.
01:14:56.560 If you're a child of tender years, you're not allowed to buy a ticket because you're too young.
01:15:01.120 We obviously have certain discrimination.
01:15:03.300 But unlawful discrimination is what I've just described.
01:15:08.160 Giving your opinion on the internet is not discrimination.
01:15:10.960 How is that discrimination?
01:15:11.840 You haven't altered someone's life other than through an idea.
01:15:15.340 You haven't taken any action.
01:15:18.040 That is now covered by the Human Rights Act.
01:15:20.160 So look at this next section here.
01:15:25.580 13.1.
01:15:26.640 This is what Stephen Harper's government repealed.
01:15:30.180 They're bringing it back.
01:15:32.160 It is a discriminatory practice to communicate or cause to be communicated hate speech by means of the internet or any other means of communication.
01:15:40.440 In a context in which the hate speech is, again, future crime, likely to foment detestation or vilification of an individual or grouping of individuals on the basis of a prohibited ground of discrimination.
01:15:52.100 So you didn't actually interfere with anyone's life.
01:15:55.460 You didn't ban them from a building.
01:15:56.960 You didn't deny them service at a restaurant.
01:15:59.260 You didn't refuse to bake a cake.
01:16:00.580 You just posted something on the internet.
01:16:04.120 And if that could cause someone to have hard feelings towards the transgender man going into the girls' changing room, you're guilty.
01:16:17.000 Subsection 2.
01:16:17.740 For the purposes of subsection 1, a person communicates or causes to be communicated hate speech so long as the hate speech remains public and the person can remove or block access to it.
01:16:26.620 So everything you've ever written or said historically can now get you convicted.
01:16:39.640 Exception.
01:16:40.280 This section does not apply in respect of private communication.
01:16:47.180 Definition of hate speech.
01:16:49.040 We've already read this earlier.
01:16:51.820 Content expresses detestation or vilification.
01:16:54.860 On a basis of prohibited grounds.
01:16:57.380 And I've already told you that includes gender identity, gender expression.
01:17:04.280 I'm just scrolling through this stuff.
01:17:09.300 Now, one of the most important things in our legal system is, in fact, it's a charter right.
01:17:15.460 It's the ability to face your accuser, to face the charges, to know who's prosecuting you.
01:17:22.140 We don't have secret trials in Canada.
01:17:24.660 We have certain parts that are subject to publication bans to protect the identity of children or if there's some national security reasons.
01:17:32.400 But there are no secret trials in Canada.
01:17:35.040 It's not like the Star Chamber, the secret trials of centuries ago in the UK.
01:17:40.900 But look at this section here.
01:17:43.240 Non-disclosure of identity.
01:17:45.040 This is absolutely insane.
01:17:46.560 The Human Rights Commission may deal with a complaint in relation to a discriminatory practice described in section 13
01:17:55.940 without disclosing to the person against whom the complaint was filed or to any other person
01:18:02.060 the identity of the alleged victim, the identity or group of individuals that has filed the complaint,
01:18:08.860 or any individual who's given evidence or assisted the commission in any way in dealing with the complaint.
01:18:15.220 If the commission considers that there is a real and substantial risk,
01:18:18.440 that any of those individuals will be subjected to threats, intimidation, or discrimination.
01:18:22.200 What does that even mean?
01:18:24.500 Well, I'll tell you what it's going to mean.
01:18:26.360 So you will receive notice one day that you are being prosecuted.
01:18:31.440 You won't know who filed the complaint.
01:18:35.860 You won't know who gave evidence against you.
01:18:39.440 You don't know who has helped the commission prosecute you.
01:18:44.340 You won't have any of that information.
01:18:46.440 You won't know if it's a disgruntled ex-employee.
01:18:49.580 You won't know if it's a for-profit company who's decided to make complaints for cash.
01:18:56.360 You won't know if it's an extremist activist.
01:18:58.700 You won't know if it's the Liberal Party of Canada trying to embarrass their opponents.
01:19:03.200 The commission may order a person who, in the course of the commission's dealing with a complaint
01:19:07.020 in relation to a discriminatory practice described in this section,
01:19:10.480 has learned the identity of the alleged victim,
01:19:13.200 the individual or group of individuals that filed the complaint,
01:19:15.400 or any individual who's given evidence or assisted the commission,
01:19:18.220 not to disclose the identity.
01:19:21.160 So if you find out, or if anyone finds out who's hunting you,
01:19:24.720 you can be ordered to keep that a secret.
01:19:27.340 This is not a Canadian law.
01:19:29.200 There's no law like this anywhere in Canada.
01:19:31.700 We have certain official secrets laws.
01:19:34.000 We have certain laws protecting the identity of children.
01:19:36.300 But this protects hunter-killer race hucksters like the kind that swarm the Trudeau Liberals.
01:19:44.500 Every for-profit complainer of fortune, every victimologist,
01:19:49.020 every race huckster, gender huckster, transgender huckster,
01:19:52.780 they will now be able to file dozens, hundreds, thousands of complaints against their enemies in secret.
01:19:59.020 And if you find out who they are, you'll be subject to a publication ban,
01:20:04.000 and don't you dare say who it is.
01:20:05.840 There is established a tribunal to be known as the Canadian Human Rights Tribunal,
01:20:08.940 consisting, subject to subsection six of a maximum of 20 members, including a chairperson.
01:20:15.940 So Justin Trudeau is going to appoint 20 activists, friends, allies, liberal patronage donors,
01:20:22.500 to this Human Rights Tribunal.
01:20:24.880 They don't have to be judges.
01:20:26.460 In fact, they're not judges.
01:20:27.560 And they will rule on these cases of secret complaints against you in secret courts
01:20:34.440 with secret evidence from secret sources.
01:20:37.060 It's not Canadian.
01:20:40.940 Now, what happens if you're convicted?
01:20:43.700 Scroll ahead to 53.1.
01:20:46.240 If at the conclusion of an inquiry, the member or panel conducting the inquiry
01:20:50.460 finds that a complaint in relation to a discriminatory practice described in Section 13 is substantiated,
01:20:56.500 the member or panel may make one or more of the following orders against the person found to be engaging,
01:21:03.180 or to have engaged in the discriminatory practice.
01:21:04.900 An order to cease the discriminatory practice and take measures.
01:21:09.260 An order to pay compensation of not more than $20,000 to any victim identified in the communication that constitutes the discriminatory practice.
01:21:25.840 An order to pay a penalty of not more than $50,000 to the Receiver General.
01:21:31.180 So a fine.
01:21:31.880 So you have to pay the race huckster $20,000,
01:21:34.840 and you have to pay the government $50,000 based on whatever Trudeau's appointees say.
01:21:42.000 Oh, and of course, Section 53.2,
01:21:44.460 a member or panel conducting inquiry may award costs.
01:21:49.820 So you have to pay $20,000 to the huckster, $50,000 to the government,
01:21:54.800 and you have to pay for the costs of this Star Chamber prosecution.
01:21:58.040 That could be a $100,000 hit.
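The exposure tallied here can be sketched the same way. Only the $20,000 compensation cap and the $50,000 penalty cap come from the provisions quoted above; the $30,000 costs figure is an assumption for illustration, since awarded costs are open-ended.

```python
# Illustrative tally of the exposure described above:
# up to $20,000 compensation, up to $50,000 penalty, plus whatever
# costs are awarded (the $30,000 here is assumed, not from the bill).
def total_exposure(compensation=20_000, penalty=50_000, costs=30_000):
    return compensation + penalty + costs

print(total_exposure())  # 100000, the roughly $100,000 hit mentioned
```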
01:22:01.020 Oh, and you'll never be able to find out who came for you.
01:22:04.620 That's shocking to me.
01:22:07.800 Let's keep going.
01:22:12.140 I'm going to skip ahead the child pornography part,
01:22:14.360 because like I say, that's been on the books for years.
01:22:17.320 And I think, frankly, that Trudeau is abusing the victims of pornography and revenge porn
01:22:24.680 by making them his sort of human shields here,
01:22:28.380 by saying, oh, look at that woman whose daughter killed herself because of revenge porn.
01:22:32.500 Revenge porn has been illegal in Canada for 10 years, with a five-year maximum sentence,
01:22:35.680 but we're going to trot her out today,
01:22:37.720 so that if you dare oppose these censorship provisions,
01:22:41.120 we'll call you a racist or whatever they do.
01:22:43.280 You know what?
01:22:47.580 I'm going to scroll through, and I'm pretty much actually through the heart of it.
01:22:51.880 There's just technical amendments and technical provisions,
01:22:55.560 making sure particular words are in the Human Rights Act,
01:23:00.160 the Youth Criminal Justice Act, and the Criminal Code.
01:23:02.220 So that actually is the end of it.
01:23:04.740 What we have here is a monstrosity of a law.
01:23:08.140 It's over 100 pages, the version I went through.
01:23:10.320 The summary I went through was 17 pages.
01:23:14.740 But I think that probably 90 of those 100 pages were a distraction.
01:23:19.800 They were a placebo.
01:23:20.960 They were a camouflage.
01:23:22.760 They were a, just look over there.
01:23:25.460 It was a misdirection, as they'd say.
01:23:29.860 It was about the things that people actually want fixed.
01:23:33.360 And by that, I mean for Trudeau to enforce the existing laws we have on the books.
01:23:37.040 We have a revenge porn law on the books.
01:23:40.800 Five years in jail.
01:23:42.240 Maybe that should be 10 years in jail.
01:23:43.880 Has there even been a single prosecution under that law where a suspect,
01:23:48.760 a criminal was convicted and got the five years?
01:23:50.760 Like I don't, I'm guessing that that law has not been fully used.
01:23:55.680 I would be very surprised if a single person in Canada had already been prosecuted to the maximum extent
01:24:01.500 and given a five-year jail term under that provision.
01:24:05.100 So do you want to pump up that law?
01:24:06.720 Oh, sure you could.
01:24:07.320 But how about enforce it?
01:24:09.780 The laws against revenge porn, the laws against child porn are on the books.
01:24:13.080 How about we enforce it?
01:24:14.200 The law against fomenting hate, section 319 of the criminal code, which I do not like,
01:24:19.840 it's already on the books.
01:24:20.700 It's not being enforced.
01:24:22.060 It's not being enforced against Trudeau's friends in the pro-Hamas movement.
01:24:25.960 You could say that some of Trudeau's MPs, his own MPs, have engaged in hate speech.
01:24:32.640 Certainly the Toronto Star engaged in hate speech, fomenting hate against the unvaccinated.
01:24:36.960 Um, no, this is not about those things.
01:24:42.620 This is not about child pornography or revenge porn.
01:24:45.660 This is about criminalizing politics.
01:24:49.140 It's very specifically about criminalizing transgenderism.
01:24:52.640 I read to you, it's not just race and sex anymore, religion.
01:24:56.520 It's gender expression and gender identity.
01:24:59.300 And to get you, to ensure that they get you, they're offering a $20,000 bounty to anyone
01:25:07.620 who files a complaint against you, and they can do it in secret.
01:25:10.640 You'll never have a chance to cross-examine them.
01:25:12.780 You'll never have a chance to challenge them.
01:25:15.900 That is not a form of justice that anyone in Canada has ever heard of before.
01:25:20.320 That is not how it is.
01:25:21.380 I, um, I sent a copy of this bill to one of our favorite law firms who's been fighting
01:25:27.840 free speech matters for us, and I said how absolutely critical it is that we challenge
01:25:34.260 this law at the first moment.
01:25:35.420 Now, I have to tell you the obvious answer.
01:25:37.840 It's not a law yet.
01:25:39.680 It's a bill, which is a proposed law.
01:25:43.160 First reading, as they call it, means the bill is entered into Parliament.
01:25:46.140 And then there's debates, and then there's second reading, and bills like this typically
01:25:52.080 have hearings and committees, but with the new deal between Trudeau and the NDP, I think
01:26:00.020 they're going to shorten the debate on this.
01:26:02.800 I think they're going to weed out any critics they don't want to hear from, and I think they're
01:26:06.340 going to ram this through.
01:26:08.920 Then it's going to go to the Senate, which is dominated by the Liberal Party.
01:26:13.520 I think they'll probably push it through, too.
01:26:16.140 And then it's enacted, and then it can be challenged.
01:26:20.980 Do you doubt that Rebel News will probably be the first people charged under the law?
01:26:27.000 I bet we will.
01:26:28.500 Not just charged once, but probably charged every single day.
01:26:31.120 Probably every single news story we publish will be called an act of fomenting hatred.
01:26:37.800 Probably every single one.
01:26:39.340 And even if we were to win in this star chamber in front of Trudeau's hand-picked Human Rights
01:26:45.280 Tribunal members in a system where you can't know the accuser, can't see the facts against
01:26:49.260 you, even if we were to win, the process is the punishment.
01:26:52.980 And it's not just one time.
01:26:54.440 Why wouldn't it be a hundred times?
01:26:56.000 Why wouldn't it be a thousand times?
01:26:57.860 Why wouldn't every single story we publish be prosecuted?
01:27:01.100 I think that Rebel News has to fight this.
01:27:05.740 It'll be interesting to see if others do.
01:27:08.620 Toronto Star does not need to fight this.
01:27:10.620 They will never be charged.
01:27:11.580 So I showed you that quirk where there's a section in it that says just because you spread
01:27:15.100 hate doesn't mean you spread hate.
01:27:16.540 It makes no sense in comparison with the rest of the law, other than it's what they're going
01:27:21.220 to point to when someone complains about hatred in the CBC or hatred in the Liberal Party or
01:27:27.500 hatred in the Toronto Star.
01:27:28.700 They can say, oh, well, you're allowed to hate if you're a good guy like they are.
01:27:32.960 No, I think this will be the kill rebel bill.
01:27:36.840 It'll be interesting to see if there's any journalists in the country who are still dedicated
01:27:41.940 to freedom of speech.
01:27:43.300 And not just that, but a legal process that it's not this insane secret star chamber like
01:27:48.960 that, I've never heard of anything so insane in my life.
01:27:53.560 Even in mob trials, even where the mafia are being prosecuted, they have the right to
01:28:00.100 see witnesses and challenge them and to see the source of evidence against them.
01:28:04.080 Don't you think those witnesses feel intimidated? But even they are not hidden from scrutiny.
01:28:10.120 But if you're some transgender extremist in Canada, you can file a complaint against any
01:28:14.640 critic and have your identity kept secret because you feel intimidated.
01:28:17.800 And you can actually go to your provincial court and say you feel scared and you can maybe
01:28:22.340 get your opponent put under house arrest.
01:28:25.860 This is the worst law I've ever seen.
01:28:28.900 I remember when it was introduced, in part through Bill C-36 and in
01:28:35.780 part through a proposal that Steven Guilbeault put out.
01:28:38.220 And I thought it couldn't be serious.
01:28:40.760 But remember, this is the final step.
01:28:43.160 Step one was C-11, which gave Trudeau control of the Internet.
01:28:46.140 Step two was C-18, which let him loot the Internet companies to pay off his friends in the media.
01:28:51.580 And this is part three.
01:28:53.460 Censorship.
01:28:55.960 Destroying our legal systems, checks and balances.
01:29:00.400 Life sentences for hate.
01:29:03.480 Criminalizing the emotion of hate.
01:29:05.040 I think this is the worst law ever introduced by any government in Canadian history.
01:29:13.740 I'm trying to think.
01:29:14.900 I suppose the War Measures Act was pretty strict.
01:29:21.080 And the Emergencies Act destroyed liberty, too.
01:29:24.260 And I'm guessing that there were some racist laws in the past.
01:29:27.280 I don't know how Japanese internment was done.
01:29:29.540 But that wasn't a life sentence, was it?
01:29:33.340 That wasn't good.
01:29:34.840 But that wasn't prison for life.
01:29:37.260 That wasn't ankle bracelets, urine tests, blood tests, bans on who they could talk to.
01:29:42.900 I mean, that was absolutely an atrocious thing done by a Liberal government, of course.
01:29:46.900 But it's nothing like what's contemplated here.
01:29:50.120 It'll be interesting to see in the days ahead what law professors say, what pundits have to say,
01:29:54.860 what the Canadian Civil Liberties Association has to say.
01:29:57.060 They snoozed; they slept straight through the pandemic and the lockdown.
01:30:01.900 So I wonder if they'll be fine with this.
01:30:03.640 Because they're really a transgender extremist group themselves.
01:30:07.080 I wonder if they'll speak up about this.
01:30:08.800 Or if they say, no, no, this is done just right.
01:30:11.360 It allows us to destroy conservative political enemies and protect ourselves.
01:30:17.440 We're fine with that.
01:30:18.200 It'll be interesting to see.
01:30:19.520 I promise you this.
01:30:21.340 Rebel News will fight this until we're put in jail.
01:30:24.680 That's a promise.
01:30:28.060 Well, that's the show for today.
01:30:29.980 Just monstrous.
01:30:31.940 I don't want to say I'm scared because that implies that I'm emotionally petrified or something.
01:30:38.780 I'm worried.
01:30:39.740 I'm concerned.
01:30:40.640 I'm motivated.
01:30:41.860 I feel like I'm informed.
01:30:44.160 I feel like I know what I have to do.
01:30:48.120 I have to politically and journalistically fight against this, to try to change it in whatever way possible before it becomes law.
01:30:57.620 I don't know if it'll be open to any change, because the Liberals and the NDP are going to ram it through.
01:31:02.200 And then I think Rebel News will instantly be targeted by it.
01:31:06.760 And I think we have to challenge it and its constitutionality with the absolute best lawyers we can muster, and fight to the death.
01:31:13.620 I just think that's how it's going to go.
01:31:16.840 Help me out if you like.
01:31:18.140 Sign our petition at stopthecensorship.ca and I'll keep you posted.
01:31:22.020 This will be the story of the year.
01:31:24.120 Until next time, on behalf of all of us here at Rebel World Headquarters to you at home, good night.
01:31:29.200 And keep fighting for freedom.
01:31:30.240 We'll see you next time.