00:32:02.760And I'm going to hopefully interview Bruce Pardy about this because he kind of mentioned this in a Twitter space when I hopped in with him.
00:32:11.140And he was saying like, yeah, well, he kind of implied, yeah, well, Supreme Court, it's from the Supreme Court.
00:32:17.860So and it's like, OK, but what if the what if the Supreme Court is wrong on this?
00:32:22.980is the supreme court ever wrong i feel like they kind of got this wrong i feel like they really
00:32:29.720kind of weren't really thinking properly to set such a precedent that detestation and vilification
00:32:38.000should be criminal speech because that's super super broad um oh but the supreme court said it
00:32:45.660okay i don't care i i i don't know no offense to the supreme court i don't know if you guys uh
00:32:52.160have the best track record under justin trudeau especially and um i mean you wear these you wear
00:32:59.500these weird santa outfits kind of weird kind of weird santa outfits to be honest anyway let's keep
00:33:05.900it going here because i feel like there's a lot of at least bovets out there that are not able to
00:33:13.660find their voice and i raise that because people talk to me a lot about this legislation and about
00:33:19.780people's voices and what are you doing to protect free expression and freedom of speech
00:33:26.220particularly online and I'm going to address this at length a bit later on in my comments but what
00:33:31.980I'd say just at the outset is that I think what we need to ensure that we all understand is that
00:33:36.680so much of discourse in civic debate in civic society right now is notwithstanding the fact
00:33:43.620I'm here in front of you at a microphone and I'm pleased to be here in person it takes place online
00:33:48.200on our phones on our computers etc and if people are impeded from penetrating that virtual town
00:33:53.780square that virtual public space then their expressive rights are being limited this is so
00:34:01.920ironic man this is so ironic so what he's doing here is he's implying that because hateful people
00:34:08.800exist because there's anonymous accounts saying hateful things online uh to this example of this
00:34:16.460black woman who recoiled from even commenting because that hate exists um people people can't
00:34:23.700participate in free speech because people are getting offended we can't have free speech that's
00:34:30.320what he's saying because people are offended because there's these oversensitive people who
00:34:35.280are afraid of criticism or afraid of pushback or afraid of anonymous twitter trolls that's why we
00:34:42.000can't have free speech it's only when we silence all these twitter trolls and stop all the hate
00:36:47.540that's the only way we can have free speech this is such a like orwellian doublespeak like
00:34:53.720inversion of the truth and reality he's saying no no by silencing hate we are going to uphold
00:35:01.200democracy and free speech when in reality you are stopping free speech you are you are killing the
00:35:07.460free speech by saying these this is hate we need to stop this free speech to allow free speech
00:35:13.980it's evil i would call this villainous i think slickly telling this lie
00:35:21.460this inversion of the truth is an evil thing to do i think that this is like a you know this is
00:35:27.100like a uh gotham gotham city villain this this type of lying of like no you see um we're gonna
00:35:37.180save free speech by silencing the hate because people can only talk once other people have been
00:35:42.900silenced that's pretty well what he's saying people can only have free speech if the hateful
00:35:48.760people are silent and we're going to determine who's hateful and we're going to determine what's
00:35:53.600bad and that's going to help us have more freedom of speech it's totally insane guys
00:35:59.920totally insane i'd also want to point out that there is
00:36:07.300a great deal of information that is happening online and that is affecting young people
00:36:13.760and the statistics really bear this out the statistics are that a quarter of teens experience
00:36:20.140cyber bullying that the number of hate crimes report a quarter of teens experience cyber
00:36:24.860bullying i love i love this fact because you know when they were asked that question in a survey
00:36:30.680it's like was it a bad thing you know what i mean i've been so i've been cyber bullied
00:36:37.520on tiktok i've had children make fun oh look at this old man wow this bro thought he did something
00:36:45.360hundreds of likes making fun of me i'm not mad i think it's funny because it's the internet i
00:36:52.620I understand that because it's more anonymized, it's more like anonymous in terms of the comments.
00:37:00.820This is part of the fun of the internet.
00:37:04.120Sure, it's part of the terrifying part of posting content and posting your face out there.
00:37:09.020You might get roasted in the comments.
00:37:19.000okay and you know you could make the argument that with the me too movement
00:37:26.680that these women like trying to call out sexual predators or harvey weinstein well that's cyber
00:37:34.340bullying arif all these people harassing uh harvey weinstein for something
00:37:40.520he did in the past why do they keep cyber bullying him this is such a horrible thing right we need to
00:37:46.140stop the cyberbullying you see what i mean like sometimes the mob mentality that manifests on
00:37:51.840the internet isn't always a bad thing sometimes it's actually uh like the social policing playing
00:37:58.200out you know what i mean the social policing as in if somebody acts a fool if somebody acts like
00:38:04.380an asshole or or a creep or whatever and then people cyber bully them and kind of uh what's
00:38:11.860what's the word kind of condemn them for their actions or for what they said this is sort of
00:38:15.680like the normal social order playing out where, where human beings react to something as a group
00:38:20.980because they all agree that they did not like that thing. So this whole thing of trying to
00:38:25.720control bullying is just such a mess because bullying is a human. Um, it's like part of
00:38:32.460human nature. Uh, there's this book, Arif, that, uh, maybe you should read. It's called
00:38:38.180Lord of the Flies. Uh, very, very influential book. I read it in English class
00:38:44.540And it speaks to this idea of human nature and how it's this thing that can kind of descend into chaos when it's not sort of, you know, having some sort of natural order to things.
00:38:55.480And this idea that you're going to like stop human nature, you're going to stop people from hatred, you're going to stop people from feeling that human emotion, you're going to stop people from bullying.
00:45:02.320But I think any dad would probably react in the same way.
00:45:04.640But I raise this both partly flippantly, but also partly just to point out that we don't generally tolerate people who would show up at our houses,
00:45:13.200show up at the playground or show up on the phone at 10 o'clock or 11 o'clock or 12 o'clock at night
00:45:19.320particularly if they're wanting to chat about things that maybe might be harmful to our children
00:45:23.460and yet that's exactly what's happening online all of the time and we need bros bros seriously
00:45:31.480comparing irl to the internet you know and i think and this the ridiculous irony of his or
00:45:41.340like not irony but the absurdity of his example is that uh the internet doesn't come to you
00:45:47.720okay the internet does not come to you you need to open the internet you need to go on a device
00:45:53.040and open the internet you need to open it you need to actively go inside and look at it and
00:45:59.800go inside of the internet okay that's how you do it he's talking about people showing up at your
00:46:03.880door calling you on your phone yeah that's not that's not how the internet works the internet
00:46:10.760doesn't call and say like hey look at these hateful comments you know i mean i guess you
00:46:15.180could make the argument with notifications that there maybe maybe he's going to go there let's
00:46:20.120see need to take action to prevent that in terms of protecting young people in particular i mean
00:46:25.580it is hilarious just like the imagery of like i don't know what like a cartoon character of the
00:46:30.300internet would look like but like the internet's knocking on your door i'm here to cause harm to
00:46:36.800your children oh no it's the internet arif virani save us don't worry canadians i have bill
00:46:45.660c63 the internet's not gonna harm you again yay thank you arif totally
00:46:54.060the internet it's gonna come eat your children ah when in reality though in reality though it
00:57:36.760The other component that I think I feel compelled to speak about is this idea that people have mentioned to me, and I'm sort of jumping around my notes here a little bit, but they've said to me that, you know, Arif, you've got a really impressive piece of legislation, and it is impressive, and it's worked on a long time.
00:57:56.800It's not sort of my handiwork on the back of a napkin.
01:08:28.700Because these legislators suck, okay?
01:08:32.200Because this liberal party, these bureaucrats in Canada are terrible at their job.
01:08:36.940They're terrible at writing legislation to the point where professional, massive, international, big tech companies are calling them out on their bluff.
01:08:46.340And they're just like, this is not even tenable for us.
01:08:48.400So we're just leaving the country because you're so bad at writing legislation.
01:08:53.940And I feel Bill C63, based on this absurd dream that Arif Virani says that we're going to protect your kid from anything harmful online.
01:09:01.980And based on that, yeah, I'm thinking that they might just nuke the internet or at least nuke big tech from even existing in Canada if this were to go through.
01:09:10.280And what I would say to you is that we all need to think about those examples because I think in my job is protecting Canadians, protecting those who are victimized online.
01:09:20.100I'm not going to cherry pick which victims.
01:09:23.420I'm not going to focus only on child sex predator victims and hanging out to dry the LGBTQ plus community, hang out to dry indigenous people, hang out to dry black Canadians, Muslim Canadians, Jewish Canadians, you name it.
01:09:34.900My job is to try and protect to the best of my ability all Canadians.
01:09:38.360But I also say to you, and I find this very troubling, that I thought we'd learn more coming out of COVID.
01:09:44.820Once again, the absurdity there is crazy.
01:09:47.440how do you protect jewish and palestinian uh canadians at the same time
01:09:54.440uh like like how do you police the content because sorry you're going to have jewish
01:10:00.660canadians complaining about the palestinian protesters and their posts and you're going
01:10:04.560to have palestinian canadians complaining about the jewish canadians and their posts
01:10:07.920how do you square that circle you can't you can't this bill's a power grab uh you're a clown
01:10:16.400about the mental health impacts of what people are experiencing online you've seen the u.s surgeon
01:10:22.300general issue a warning literally like think about again for those of you old enough the 1960s
01:10:27.740there's a warning tobacco causes cancer this is bad for your health the u.s surgeon general has
01:10:32.260issued a warning about social media and what it does to young people that's significant we know
01:10:39.020that that is significant right but you're still not even encouraging kids to use their phone less
01:10:44.000you're encouraging the government to babysit everybody who ever used the internet in the
01:10:47.780country. But what I find troubling is that I don't think anyone in Canada, regardless of whether they
01:10:52.100vote for me or hate my party's guts, I don't think anyone in Canada would countenance somebody taking
01:10:56.520a firearm into a mosque. No one's going to defend that because it's deplorable conduct. But I find
01:11:02.040it troubling that people will take less umbrage at the pernicious mental health impact and trauma
01:11:09.320that young people are experiencing who face hatred what it does to their confidence levels their
01:11:14.660self-esteem their anxiety their depression their ability to engage that's something that we need
01:11:19.040to be thinking about once again stopping the hate online you're interested in that but not the fact
01:11:25.480that kids are glued to their screens that part i called it at the beginning that aspect doesn't
01:11:32.380get talked about at all uh the cell phone addiction doesn't get talked about at all it's just uh no
01:11:37.160we want the government to police speech that's the only that's the only solution no we're not
01:11:41.380we're not here to kill free speech in canada and give the government an insane amount of power over
01:11:46.620speech and free expression across the country no no no no not at all no this is to protect kids
01:11:51.220yeah i also reflect on you that when i stood at this podium and i mentioned
01:18:35.840And the last response I have to people who tell me,
01:18:37.820you know, why don't we just divide it up?
01:18:39.480You know, it'd be just so much easier,
01:18:41.240such low-hanging fruit just to protect the kids,
01:18:43.920is that what I say to them is that, you know,
01:18:46.580I've looked internationally at examples about this bill.
01:18:49.500And Australia is one example we've looked at really closely.
01:18:52.820And Australia moved in this area in 2015.
01:18:55.040In 2015, nine years ago, they dealt with children.
01:18:59.340Nine years later, Australia has gone way beyond just dealing with children.
01:19:04.240If I'm moving for the first time as a minister of the Crown to deal with this issue, I think it's incumbent upon me to address the issue as we see it now, not as the world saw it nine years ago.
01:19:16.820There's a lot of components about the bill that my staff hate this because I've deviated wildly from the order that they laid things out here in my comments.
01:19:27.000but there's people that have talked to me extensively about this bill and they said
01:19:32.640arif it looks like this is just you manufacturing sort of your own vision of canada
01:19:37.580this is you manufacturing your yeah they said to me quite candidly that
01:18:45.700you know it looks like a bald-faced attack on freedom of expression in this country
01:19:50.360yeah let's clip that that's a good clip that's a good clip this is just you manufacturing sort
01:19:56.220of your own vision this is a bald-faced attack on freedom of speech in this country said to me
01:20:00.200quite candidly that you know it looks like a bald-faced attack on freedom of expression in this
01:20:05.540country totally real real he's right and to that i have several responses let's hear it let's hear
01:20:11.340the first is that freedom of expression is fundamental in a democracy and i say that as
01:21:16.040somebody whose family, when I was at a very tender age, fled an authoritarian regime in east
01:20:21.420Africa, the regime of Idi Amin. So if you want a family that venerates Canada and everything
01:20:25.940it represents, it's probably my family. If you want somebody who's going to defend freedom
01:20:30.860of expression under section 2b of the charter, it's probably me as minister of justice, not
01:20:35.300only because I practice constitutional law, but because my oath as a cabinet minister
01:20:39.360is different from every one of my colleagues. We all pledge allegiance to the king.
01:20:44.660I'm so tired of this disingenuousness. I love free speech, free speech is so important. Hey guys, I
01:20:51.280mean I said free speech I said I care about it okay that's why we're passing hate speech laws
01:20:57.100okay but I talked about I talked about loving free speech it's just I'm so tired of how
01:21:02.280disingenuous this stuff is I'll pledge to keep cabinet confidence I pledge to uphold the
01:21:06.620constitution and I take that very very seriously but the most important thing I will say to you
01:21:11.300about freedom of expression is that hate speech is not constitutionally protected speech okay
01:21:19.300says you what do you mean like says you what is hate speech like this is when i first started
01:21:29.220researching this bill like no one defines hate like hate just like does not get defined um
01:21:34.380the only the only thing yeah and it's like it's ridiculous i'm just gonna repeat that
01:21:40.740in this country hate speech is not constitutionally protected speech he's casting a spell
01:21:46.880That begs the question that if we don't protect it in the physical space, me here on the campus at TMU, why would we protect it online?
01:22:45.460the definitions around paragraphs 42 to 47. Read those five paragraphs. What does it say? It says
01:22:52.880hatred is detestation and vilification. Hatred is not insulting comments, offensive comments,
01:23:01.380expressions of disdain or dislike. Okay, why do I point that out? People love to ask me, you know,
01:23:07.400you've done a lot of stuff in your career. How does it help inform you in terms of what you're
01:23:10.500doing now? I say it helps inform me all the time. It helped that I practice constitutional law in
01:23:14.440the city it helped that i was a u.n war crimes prosecutor on the rwandan genocide you know how
01:23:19.460tutsis were described in rwanda in 1994 as cockroaches that should be stamped out that's
01:23:26.120called detestation and vilification when you dehumanize another well sorry sorry so any version
01:23:32.620of dehumanizing is detestation and you know it's like it it's so annoying it's so disingenuous the
01:23:38.880way in which he's like these words are so important and then he's just kind of lumping
01:23:42.940them all together with the very first example what that definition does is it sets a high bar
01:23:47.860it's not going to affect accent no it doesn't it really does not it really does not set a high bar
01:23:54.520and that's the whole problem on toronto on october 17th it's not going to affect colorful jokes at a
01:24:00.300yuk yuk's comedy cabaret it's not even going to affect people who have legitimate criticism
01:24:05.040a better immigration policy see i'm he says this now but you know it's it's like a trust me bro
01:24:13.160this is like a trust me bro this moment is like a trust trust me bro trust me bro you won't get
01:24:19.700in trouble for that let's just pass the law okay trust me bro you'll be fine all this will be fine
01:24:26.420no hey just just trust me bro i know the legislation is totally broad and open to
01:24:32.000interpretation but just trust me bro you'll be fine including which country people are coming
01:24:36.860in from people express to me all the time as recently as two weeks ago in my riding on
01:24:41.580somebody's porch you have too many people coming in from this particular country that is legitimate
01:24:46.620debate in canadian society that is protected speech but what it is going to draw differentiation
01:24:52.340from is between a casual racial slur what i call awful but lawful language and somebody who calls
01:24:58.200for the extermination of a people that's detestation and vilification
01:25:02.120once once again he's he's kind of muddying the waters he's conflating two things now he's saying
01:25:10.400that he said uh you know calling people cockroaches that's the that's detestation that's
01:25:17.020vilification well it's also dehumanization so the way he's calling people cockroaches
01:25:21.080detestation vilification dehumanization are all three personally i think it's just dehumanization
01:25:28.300does that also imply is dehumanization is that also part of what's hatred is that also one of
01:25:34.940the thing is d is dehumanization always detestation is it always vilification and
01:25:40.700also calling for the death of a group of people is that detestation or is that vilification or
01:25:46.620Or is that something else entirely where it's, you know, inciting violence, which is something that's already illegal, if I'm not mistaken.
01:25:53.320And really annoying that this is guys.
01:25:56.360This is supposed to be the top expert on this bill.
01:25:58.940And he's not caring to actually differentiate any of these things.
01:32:23.720this is not something where we as political officials weighed in on what's going to be in
01:32:28.960the public domain or not this is where other actors the four that i listed we are not public
01:32:34.860officials wading into what is in the public domain or not that's the whole
01:32:38.820point of you making this bill is you saying i need to protect you know that that was the whole
01:32:43.580introduction of this presentation arif of you saying i am going to determine what's good and
01:32:49.380what's not good on the canadian internet and i'm going to decide and i'm going to make this happen
01:32:54.560with bill c63 that's the whole premise of you is to wade into the public domain of what is allowed
01:33:01.880and what is not allowed apply a definition that's entrenched in the statute that's really really
01:33:07.080fundamental for people to understand looks like ai i'm not ai other thing that has come up a great
01:33:15.600deal about this legislation in terms of sort of people criticizing this stuff but well we like the
01:33:20.180points that deal with children and how we prosecute crimes against children because we're beefing that up as well
01:33:27.240where the metas of the world have to transfer that data
01:33:32.040to an rcmp uh child exploitation center they're clearly scoped into the bill etc that sounds super
01:33:39.480creepy hey was your child sexually exploited just send all the photos to the rcmp okay i think i
01:33:46.180think we're trying to understand what this is really about here but when people say to me that
01:33:49.780you know okay you've explained to me a bit about the criminal code stuff but you've got this human
01:33:54.080rights act aspect i'm not sure about this human rights act stuff like why should people be able
01:33:59.080to raise their hand and say this is hate speech that's i love always going over all of like
01:34:04.220people's concerns with the bill and it's like hey what about all this hate speech stuff it's like no
01:34:08.500no no it's fine and now he's like yeah what about all this human rights stuff and he's like no no
01:34:12.040it's fine like there's so many excuses that he has to give to try and rationalize this trash
01:34:17.240piece of legislation it's very funny keep dancing keep singing and dancing buddy keep dancing for
01:34:22.940us monkey boy discriminatory and complain about it in front of the human rights commission
01:34:27.180and you know didn't we remove that as a parliament in 2013 you're right the previous government did
01:34:33.060removed that under Stephen Harper. Yeah, we did. That removal deprived people of a route and a
01:34:39.740recourse vis-a-vis the individual author of the material. So understand that when we talk about
01:34:44.940the Digital Safety Commission, that layout that I just gave you, that was all vis-a-vis meta and
01:34:50.140what it's exposing me to or not online. But here you would have Jess feeling aggrieved by an
01:34:56.500individual, Arif Virani, and what recourse would she have against me personally? People said,
01:35:00.780well, you're going to flood. It's going to be all full of BS complaints and bad faith
01:35:04.780complaints. It's going to just inundate the commission and oh my God, the sky will fall.
01:35:08.900Actually, we know this based on what happened when section 13 of the
01:35:12.880Human Rights Code was already in effect. And it wasn't necessarily a flood.
01:35:16.700It would be a flood now because a lot's changed since then. Everyone's sensitive to microaggressions
01:35:21.020now, Arif. I'm sure you can appreciate that because you opened up your entire
01:35:24.760presentation about how sensitive you are to the color of your skin and all this type of stuff.
01:35:28.940um and you want and you want to what was it you want to protect people from
01:35:34.020def uh defend people from safety but um no we know that the the most of the complaints from
01:35:42.560section 13 were like someone abusing the piece of legislation i can't believe he's trying to
01:35:47.560defend this this is so funny this is really good actually we'll probably clip this into something
01:35:54.220people said well you're going to flood it's going to be all full of bs complaints and bad
01:35:59.580faith complaints and it's going to just inundate the commission and oh my god the sky will fall
01:36:03.840yeah we thought about that because we heard about it so there's four ways specifically how we
01:36:08.680address that oh the first is that we define hatred the same definition i told you applies across the
01:36:14.140act so arif so it's not yeah again very broad definition you say that it's a high bar it's
01:36:20.360really not casual comments do not arise to detestation or vilification unless i'm doing
01:36:26.520things like calling for the extermination of a people secondly again he's muddying the waters
01:36:31.360he's muddying the waters extermination like calling for the extermination of people is already
01:36:36.440something that's illegal it's called incitement in the states whatever it's called here that's
01:36:41.560already something that is in considered criminal speech calling for the the death of people
01:36:45.160and he's saying no no detestation and vilification that's only calling for the death of people
01:36:50.020no it's not again you're supposed to be the expert on this bill and you're just muddying
01:36:55.000together all of these words to justify your bullshit it's very very detestable arif come
01:37:00.700on it's very dishonest we said to the human rights commission that if somebody's going to raise a
01:37:05.340complaint under this new section 13 provision that you have the ability to summarily dismiss it so
01:37:10.920for the budding lawyers in the room a summary dismissal mechanism is sort of a prima facie
01:37:15.060the case has not been made out they just dismiss it out of hand no hearing nothing just gone
01:37:19.660third they've always had the ability to weed out frivolous or vexatious complaints and fourth
01:37:25.640rarely in an administrative context if somebody has used the process and is deemed to have abused
01:37:31.620the process of the Canadian Human Rights Commission costs can be ordered against them
01:37:35.500so it takes time and money to deal with some of these things in the Canadian Human Rights Commission
01:37:39.060so you can have a punitive order of costs so if Jess felt aggrieved and I abused the
01:37:43.940the provision she gets costs ordered against me and I pay for her legal costs
01:37:48.380Like it's, I'm just thinking about, oh man, like we already have so much useless bureaucracy in this country and more and more of it is catering to like DEI stuff, diversity, equity, inclusion.
01:38:01.920And this would really be like the holy grail, bringing in more human rights court stuff, all this taxpayer money spent on a bunch of complaints from people.