Greg Wycliffe - December 17, 2024


'Hate Experts' want to CENSOR Canadians Online🔴REACTION to Cyber Bullying Townhall 🔴#stopbillc63


Episode Stats

Length

4 hours and 11 minutes

Words per Minute

157.79

Word Count

39,619

Sentence Count

984


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
00:00:00.000 Let's get started.
00:00:30.000 [no speech]
00:07:00.000 We'll be right back.
00:07:30.000 Uh, hey guys, uh, listen, um, I know you guys are scared and, you know, you have to understand
00:07:57.340 And things don't happen the way you want them to happen.
00:08:02.380 You're a very special person.
00:08:05.140 Instead of making fun and match, you need to respect them.
00:08:09.040 Comment on a pin, back it up with the link, start,
00:08:13.540 threads, butt heads, push, content, fight that could
00:08:18.920 fight, argue, online.
00:08:22.960 Hit refresh, set set, set, set
00:08:29.960 Notifications, blood pressure
00:08:34.380 I scare my wife, waste my life
00:08:37.920 Arguing online
00:08:40.580 I'm arguing online
00:08:44.900 I'm arguing online
00:08:49.200 I'm arguing online
00:08:53.420 Making a difference
00:08:57.880 And losing all my friends
00:09:00.040 Got a source you can send
00:09:02.080 Let's take this to my DMs
00:09:04.460 That's tribe
00:09:06.140 Tide
00:09:07.800 Argue online
00:09:10.140 And my back's against the wall
00:09:15.120 And all their moms
00:09:17.180 and get the rents to their inbox, pick a side, pick a hell that I can't argue that I don't like me.
00:09:28.780 Realize that is evidence that I am a prophet of God.
00:09:31.860 Some go to war, some occupy, some just a straight-up genocide, some behead, some will end the night.
00:09:44.360 And most of us argue online
00:09:48.440 So stop, whatever, and just be a fucking apostle
00:09:51.160 Pushing out a penny head, dunking on some children
00:09:55.120 When the red dot's on that bill
00:09:57.340 That's what a cow live in
00:09:59.880 Henry fly, watch him cry
00:10:03.040 That's how I lost my job
00:10:05.620 My check, custody of kids
00:10:08.880 Lost my cool, lost the fight
00:10:11.100 Most of all, I lost my blood
00:10:14.980 Argue and I'm glad
00:10:21.700 Argue and I'm glad
00:10:32.140 About Israel and Palestine
00:10:40.100 If I tell you to do something, it's a commandment. It's very important that you do it.
00:10:46.180 So, watch my Facebook video where I read the Quran.
00:10:50.980 Read the Quran. Read the Quran. Read the Quran. Read the Quran. Read the Quran.
00:10:57.660 Hey everybody, welcome to the stream.
00:10:59.640 For those curious, that was a song from Dirty Jirty, a friend of mine.
00:11:04.920 He just dropped an album recently, Songs You Shouldn't Sing in Public.
00:11:08.200 and that was his track arguing online i thought it was pretty appropriate for
00:11:13.220 what we're doing tonight hey hi everyone cyberbullying do you are you ready for the
00:11:18.780 cyberbullying tonight i hope so hi everyone um if the sound is not good let me know because
00:11:26.520 i was tweaking with some stuff earlier today we got a lot to go through tonight
00:11:32.320 because online harms and cyber bullying town hall there was a cyber bullying town hall isn't that
00:11:43.460 incredible um yeah it's about an hour and a half i'm sure there's gonna be lots and lots and lots
00:11:50.460 of stuff to react to but there is a lot of exciting things happening tonight because not
00:11:54.100 only are we going to react to that and i'm sure it's going to be very fun and and illuminating
00:11:58.360 and insightful but we're also going to be showing a couple clips of the documentary of the interviews
00:12:04.680 we've been doing with people who have been persecuted for their opinion because i have a
00:12:08.720 feeling that uh these clips will be especially relevant to what comes up in this cyber bullying
00:12:16.220 town hall uh there's also a cool uh christmas gift thing that we're going to be talking about
at savefreespeech.ca as well what else what else any other news hey hey Greg what do you think
about Chrystia Freeland what do you think about Chrystia Freeland uh resigning and and uh Jagmeet
00:12:37.500 Singh said Justin Trudeau should resign I don't care you know like I just feel so much of the
00:12:46.620 you know political commentary in canada is so irrelevant you know if you're not talking about
00:12:53.100 foreign interference the foreign invasion or the sort of ongoing uh tyrannical managerial
00:13:00.460 managerial class in canada that's continuing to push further left with their various policies
00:13:06.520 it's not just at the federal level we saw a mayor get uh fined five thousand dollars for not
00:13:12.500 recognizing pride month this stuff is ongoing it has nothing to do with justin trudeau it has to
00:13:18.540 do with this increasingly emboldened and corrupt managerial class that's cracking down on people
with the wrong politics oh did you hear chrystia freeland yeah we we know this we know this
no one likes chrystia freeland she's she's you know she's a kindergarten teacher taking care
00:13:38.040 of the finances of the country how's that going do we need to reiterate that do we need to go
00:13:42.460 over this again i mean i get it feels good to dunk on people and say see told you these people
00:13:48.320 suck they're stupid they've ruined the country but really all the only solution we have is vote
for poilievre that's usually the solution the kind of outcome of uh of all of this yammering and
00:14:02.360 talking on but if yeah if you're not talking about the foreign interference the foreign invasion
00:14:07.840 the mass immigration the radical demographic change or the sort of increasing um
00:14:13.840 things like things things aren't really moving to the right things are moving anti-trudeau
00:14:18.720 but that's not moving to the right at all you know like every single day that passes
where pierre poilievre doesn't push back against foreign or foreign interference foreign invasion
00:14:29.300 or dei policies which he never does if anything he embraces them more than anyone else at this point
00:14:36.960 yeah we're not winning the same institution the same establishment in ottawa is winning
00:14:43.480 so if the conversation is not about that i don't really care did you feel it yeah i saw i saw it
00:14:49.780 guys i saw freeland okay but uh let's get into it here's our favorite our number one our number one
pal uh your pal my friend we love evan balgord okay give a give a nice thumbs up put up any
finger you'd like on your on your hand for Evan Balgord in the chat there but this is really
00:15:11.940 interesting I saw this I saw this today and I'm like whoa is this real is this real according to
anonymous Twitter accounts Evan Balgord wears a butt plug 24 7 I mean if the anonymous Twitter
00:15:26.620 account said it I'm thinking this might be geez I mean if you look at the facial expression here
00:15:32.300 it's very interesting it's very interesting that the the executive director of uh the canadian
00:15:39.680 anti-hate network might be wearing a butt plug 24 7 i mean this is just what the anonymous twitter
00:15:45.600 account said so it's not yet confirmed however i've heard some things i've heard some stories
00:15:52.880 i've heard i've been i've been hearing stuff i've been hearing some things but uh we do have a lot
00:15:59.840 to get through a lot to react to so let's just get right into it hope everyone's doing well in chat
00:16:04.300 things are moving to the right in some areas legend of nelda tears of the incel says things
00:16:09.480 are moving to the right in some areas excluding the most crucial thing white replacement it's a
00:16:14.060 pressure pressure release valve uh where what areas are we moving right in though other than
00:16:22.380 like economically and even then i feel like it's not strong enough it's not really that uh substantial
00:16:29.340 enough um but yeah i mean no one's no one's talking about the radical demographic change it's true
00:16:37.260 and unfortunately this isn't something that the the ppc taps into quite often enough so
00:16:44.740 uh let's get into it cyber bullying are you afraid yet are you afraid i had so much fun
00:16:51.620 working on today's thumbnail I want to get some sort of like single lens futuristic headset
00:17:00.060 so I can you know accelerate my cyber bullying let's get into it I'm already afraid this music
00:17:12.260 oh my god
00:17:14.020 i mean wow i mean wow just cyber bullying town hall is such a crazy catchphrase
00:17:26.200 anyway but guys this is important because like look how professional this looks
00:17:33.200 look how many how much resources they're putting into this they have a what ctv or cbc panel guy
00:17:39.760 um they they are really what the efforts to censor the internet are coming from so many
00:17:48.260 different angles and uh they're putting a lot of resources into it which is exactly why
00:17:53.220 uh what i'm doing i think is important savefreespeech.ca uh educating people on bill c63
00:18:00.540 and really kind of getting into the getting into that topic in general because because they are
00:18:06.540 trying to attack it from all sides it's important to understand you know how to how to levy our
00:18:12.900 arguments against it to just completely dismiss it because uh it's a fundamental right if we lose
00:18:18.540 this one it's going to get a lot harder for people like us hello everyone i'm michael serapio cyber
00:18:25.040 bullying and online harassment is on the rise the abuses harassment you heard the man harassment
00:18:31.600 ... happening through social media, direct messaging, gaming platforms, and perpetrators are using this technology to harass, to threaten, to shame, and to scare their targets.
00:18:44.100 Now, in this online harms and cyberbullying town hall special, we are gathering together experts, politicians, representatives of big tech companies,
00:18:52.060 and people who have experienced and delved into the realm of cyberbullying and online harm so that we might identify the problem.
00:19:00.200 And as we go on in our discussion, perhaps come up with some solutions that we could apply in this country.
00:19:06.400 You already know what the solution is.
Maybe we'll come up with solutions. As if. I'll put my house on it:
00:19:13.680 Bill C-63 will come up.
00:19:16.580 Yeah.
00:19:17.160 So they already know the answer.
00:19:19.160 I wonder what we might come up with to censor people.
00:19:22.060 Oh, yeah.
00:19:22.380 There's already a bill in its second reading in the House of Commons that is specifically all about online harms.
00:19:28.680 and uh and so they mentioned bullying and that's another thing like that this is this is very much
00:19:35.540 how the establishment operates right they they make it seem like hey this is this is just an
00:19:41.760 independent this is just an independent broadcast independent town hall and they're going to use
00:19:47.760 all of the language from bill c63 isn't that convenient isn't that amazing um so yeah like
00:19:55.500 none of this is a coincidence. Duh. And to that, let's get to the very big question right off top.
00:20:02.440 What is cyberbullying? How is online harms happening? Who does it target? And to talk
about this right now, we're joined by Evan Balgord, Executive Director of the Canadian
00:20:11.720 Anti-Hate Network. So they're already laying the groundwork saying, oh, like, who does it target?
00:20:17.140 I.E. only certain people? Only certain people are targeted from by online bullying?
00:20:23.260 uh okay so yeah i don't know see that how tight is there a butt plug inside of this man
00:20:32.080 one's in chat if you think there is a butt plug inside of this guy the anonymous twitter sources
00:20:37.380 suggested that that he might have a butt plug in 24 7 uh one's in chat if you think this is the
00:20:43.400 case amanda arella national director of public policy and advocacy at the ywca raissa amini
00:20:50.840 YWCA. We got to look it up. We got to look this up, guys.
00:21:00.360 YWCA.
00:21:06.440 YWCA, National Advocacy Community Action About Us.
00:21:14.320 Let's go to the FAQ.
00:21:15.780 are you a religious organization what what do they do
00:21:22.340 well we'll find out guys we'll find out
00:21:27.560 the youth engagement specialist at children first canada
josie nepinak president of native women's association of canada
00:21:36.840 and callie metler executive director of ottawa capital pride
00:21:40.000 please welcome and thank you for joining us today
00:21:43.000 you know man i'm gonna i'm just gonna move my face so we can see their faces all right
00:21:52.920 yeah no i uh i think this is like a housing thing for a woman that ywca and i i i wonder if what
00:22:02.660 their trans policy is i'm gonna get you to start up for the conversation right now because the ywca
00:22:09.540 recently completed the block hate project and that included research on
00:22:14.100 building resilience on online hate so that being the case I'm wondering if
00:22:18.540 you might start us off by talking about what the project found in terms of
00:22:22.540 findings and who is being targeted online. Thank you Michael and so our
Block Hate project focused specifically on online hate as it affects young women
00:22:31.860 and gender diverse youth aged 16 to 30 and unfortunately the reality that the
00:22:36.700 project saw is that that age group and demographic is universally experiencing online hate. And so
00:22:43.220 we did national polling data to understand exactly what the experiences of young women
00:22:48.100 and gender diverse youth is online in Canada and saw that over 44% talked about experiencing
00:22:54.360 online hate within the last year, often monthly and daily. And 82% of youth said that they
00:23:01.340 have witnessed personally witnessed online hate this is so silly all of this stuff is um
00:23:09.580 fundamentally misunderstands like what the internet is uh you could find hate for anyone
00:23:16.660 on the internet you could find hate for every single group you could find it
00:23:31.340 is it normal to hate whites
00:23:35.700 i mean that's obviously i mean well this is interesting what if we change it to black
00:23:43.420 why does the world hate black men
00:23:48.520 i mean it's the point is it's the internet okay we found online hate yeah i'm sure you found
00:23:58.600 online hate you can find all sorts of stuff so we know that it's happening across a number of
00:24:05.240 different platforms online um there was really every modality of of social media and uh internet
00:24:12.520 was talked about uh from youth that we surveyed and we also saw that there was a really a component
00:24:18.280 we might have to go at 1.25 speed guys because this this is just an onslaught of bs and it's a
00:24:25.480 long video so yeah she's moving kind she's moo she's moo she's moving kind of slowly gendered
00:24:35.320 within that so not only are young women and gender diverse youth more impacted by hate the hate that
00:24:39.480 they are experiencing specifically targets their identity let's bring you into the conversation
00:24:43.640 because as we listen to amanda there i'm wondering why are we seeing such an increase in hate and do
00:24:48.200 we know where this is actually coming from we've observed it as well so we've done our own polling
00:24:52.280 And there's been several other polls done which show similar things that nearly one
00:24:56.000 in two Canadians have seen or experienced hate, especially more so among youth, gender
00:24:58.900 versus individuals.
00:25:00.020 Yeah.
00:25:00.560 See, they're not going to, they're not going to define hate once again.
00:25:03.680 They're not going to define hate.
00:25:05.840 I've been researching this stuff for months now.
00:25:08.420 They never define the term.
00:25:10.240 It's just this nebulous thing called hate.
00:25:12.740 People are hated.
00:25:13.660 It's the end of the world.
00:25:15.260 And I have actually, he's mentioning a study.
00:25:17.020 I did a study.
00:25:17.920 if you look back on my youtube channel i got a i got an ad on facebook that said hey are you
00:25:23.300 probably something along the lines of are you experiencing hate online something along those
00:25:27.320 lines and i tried to do the survey and essentially if you don't think that there's a concern uh of
00:25:35.040 hate online or you don't think the government should control hate online they don't even let
00:25:39.360 you do the survey they just lock you out of the survey so yeah they're the way that they conduct
00:25:46.400 these these research and studies is once again oh my god is totally controlled to direct towards
00:25:51.540 a political agenda shocking isn't that shocking by the way the fact that this guy and the canadian
anti-hate network is recognized as an expert is totally insane um they helped propagate lies
or mistruths or misinformation about Diagolon in regards to the trucker convoy these lies were
not fact-checked by CSIS, RCMP, police in Ottawa, the safety minister, Marco Mendicino at the time.
00:26:24.100 And yeah, they lied. This information was not accurate, yet it was repeated by pretty well
00:26:31.140 all of the major institutions in this country. Yet here he is, still seen as an expert, still seen
as a trustworthy source on this type of thing even though they have uh how did Caryma Sa'd put it
00:26:48.400 they they were responsible with supplying information for like the largest intelligence
00:26:52.880 failure in canadian history but here they are here they are being recognized as hate experts
00:27:00.020 uh yeah this is just there's just it's a gangbang of rotten corrupt institutions
00:27:06.460 and who's fighting against them, who's pushing back against them, who's doing something about
00:27:12.140 this. For those unaware, I'm going to be plugging this a lot. We are making a documentary to expose
00:27:19.740 these corrupt institutions, how they operate, and airing out their dirty laundry. Evan Balgord
comes up in the documentary. And if you want to support it, you can go to givesendgo.com
00:27:30.500 slash save free speech because we are doing something about it. We are going to talk about
00:27:36.420 how this insidious network of far left bureaucrats operate it's connected to the mainstream media
00:27:43.420 it's connected to anonymous twitter accounts and we have all the receipts we're telling the story
00:27:48.480 we're hiring an award-winning documentarian his name is steve hanning you can look him up
00:27:52.700 and uh yeah it's going to be a professional documentary that it tells the story of persecuted
00:28:00.600 canadians for their political opinions and it's gonna we need to identify the problem folks
00:28:05.960 we need to identify the problem. And it needs to be done professionally so we can reach outside of
our echo chamber. Okay, so if you want to fight back against this, go to givesendgo.com slash
00:28:16.500 save free speech. Thank you so much for your donation. Let's keep it going with this
00:28:21.900 with this pipsqueak, so on and so forth. And there's several places where it is coming from
00:28:28.080 especially we're talking about different levels of severity. So we see what I would call it just
00:28:31.920 like the background level of hate on the internet, which is, you know, there's this toxic and poison
00:28:35.740 an environment that has been created uh where people are um finding people from from groups
00:28:40.020 that they like to target in and harassing and abusing them there's this is like it's it's
00:28:48.500 almost it's it's almost like a skit of a bunch of part of my language but a bunch of like baby
00:28:54.500 boomers who who are discovering the internet for the very first time and they're like yo we we found
00:29:01.280 we found this website called 4chan and like there's a lot of racism on there we guys know
00:29:06.600 we there's this thing called memes it's crazy guys guys you got to listen to me there's these
00:29:12.100 things called memes i'm gonna let you in guys i'm gonna reach out of my uh what do i call it
00:29:18.820 my toy box i'm gonna reach out of my toy box not right now because uh this is stuff that i've
00:29:26.220 wanted to uh bring up on stream in the context of free speech in the context of bill c63 and how
00:29:34.320 these bureaucrats just do not understand how the internet works and what is in my toy box in my
00:29:39.720 toy box is um well i'll just say interesting content interesting content this is on uh
00:29:47.800 facebook i found this video right now i'm on my way to the hospital i gotta you know get a couple
00:29:53.520 test done monkey pop shit is real y'all it's fucking it's real it's real that's all i can say
00:30:02.240 yeah shit damn we got monkey pox
00:30:08.300 yo take care of yourselves you hear me this shit is no joke this shit is no joke no joke
00:30:27.520 he's got a banana too i watched this right before the stream i was laughing very hard but i like
00:30:34.280 just saw it and then so this presents a question though this presents a question you know like so
00:30:44.800 evan balgort is this targeting groups who is this targeting is this targeting gay people for making
00:30:54.040 light of monkey pox because monkey pox most often affects gay people is he making fun of black
00:30:59.820 people specifically because he's using a you know monkey filter and he's black oh that's racist
00:31:05.680 right is he just making fun of himself what how would we police this does this get taken down
00:31:13.400 you know and and you could say oh this is clearly comedy greg well what if i like you know leverage
00:31:19.720 this and retweeted it and said like haha like it's it's it'll check this video out oh no that's
00:31:25.020 racist. He's promoting racism. You know, it's ridiculous. Um, yeah. Okay. But yeah, that's
00:31:35.960 just an example of what's in my toy box. Like a lot of videos like that, or that are totally like
00:31:40.460 insane in terms of being funny, touching on something cultural and just presenting the
question to someone like Mr. Evan Balgord, like how on earth or Arif Virani for that matter,
00:31:51.540 how on earth would you police this content because that by all means could be considered
00:31:57.240 racist would you take that down why or why not um because you know here's the other thing maybe
00:32:02.480 you take a video like that and you send it to a black your black friend and say haha is that
00:32:08.640 bullying you know like the the lines are so unclear and the reason why like these people
00:32:14.640 are insane in my opinion is everything they're talking about are the ugly parts of human nature
00:32:21.520 they're talking about the ugly parts of human nature we get angry at each other you know
00:32:27.540 sometimes you love your sister sometimes you hate your sister she's so annoying a huge part of
00:32:32.700 comedy is i believe it's it's called ambivalence which means kind of like you you feel like two
00:32:38.240 ways about something sometimes you love them sometimes you hate them that's just what a human
00:32:42.600 being does same with other individuals same with other groups you know like it's and they want to
00:32:49.740 police that it's so misguided and inhuman um yeah it's obvious it's obvious to any real human being
00:32:59.800 oh did i did i did i thumbs up the video whoops i'm so sorry i'm so sorry i'm so sorry i i i would
00:33:09.300 have left a comment let me tell you is there oh is there a live chat there's probably gonna be
00:33:15.000 nothing in the live chat because there's also organized campaigns where they pick a particular
00:33:19.020 oh censorship is cool they all look like they have something oh you can't see it shoot sorry there
00:33:27.180 so people are already chirping them too bad i didn't see this live but anyway let's keep it
00:33:31.180 going here we've seen a lot of that especially around 2s lgbtq plus persons drag performances
00:33:37.900 things like that and then the harassment sometimes even goes so far yeah once again like this stuff
00:33:42.940 of drag people being targeted or trannies being targeted um you know they're trying they're trying
00:33:50.380 they're going after children at this point like they're coddling their content to children they
00:33:54.860 never talk about that it's so disingenuous like it's it's these people are nuts are as to result
00:34:01.260 in venues getting threats and bomb threats then there's the worst of the worst stuff that we see
00:34:06.620 which are networks that...
00:34:08.620 Hey, Evan, if you're watching, you're a terrible actor.
00:34:12.620 He's trying to...
00:34:14.020 This is the worst of the worst.
00:34:17.540 ...that target youth and vulnerable people
00:34:19.820 and encourage them to harm themselves.
00:34:22.300 And then we are also seeing,
00:34:24.140 in an even more extreme level,
00:34:25.520 groups and networks that produce materials
00:34:27.400 that are intended to inspire terrorist attacks
00:34:30.140 or mass murders.
00:34:31.900 Across the spectrum,
00:34:33.400 there are different kinds of harms.
00:34:36.100 Oh, wait a minute.
what about MAID hold on encouraging people to kill themselves isn't that what the liberal party's
00:34:42.840 doing again the consistency of any of this stuff is is pretty wacko um but and if the person if
00:34:51.640 there's content that's actually promoting like real race um sorry terrorism that stuff usually
00:34:57.580 already gets taken down by by big tech so but then again this guy has such a broad definition
00:35:03.740 of terrorism you know he thought uh he thought the uh diagonalon was a terrorist cell right
00:35:10.160 but you know the consistency and the track record of this person apparently isn't that important
00:35:15.160 even though he's an expert expert but the reality is today that uh being online being part of an
00:35:22.560 identifiable group uh can feel very threatening and very dangerous you know josey i'm wondering
00:35:27.560 if you could build a bit upon that you know we were hearing how is he the only man on the panel
00:35:32.360 well i guess he's not really a man but um yeah there's just the there's the asian bro from cpac
00:35:40.180 evan and then the rest are women interesting noted you know josey i'm wondering if you could
00:35:47.980 build a bit upon that you know we were hearing how gendered the attacks like this oh we were
00:35:53.260 feeling so emotional about the hate online can can you add to this this this emotional uh this
00:35:58.420 emotional crybaby party for example so i'm wondering from from your perspective what are
00:36:02.700 you seeing in indigenous communities in particular how indigenous women are being targeted yes we are
00:36:07.080 certainly seeing an increase in bullying cyberbullying against indigenous women throughout
00:36:11.180 the country and indigenous women are reporting that through the various forms that they are
00:36:15.760 targeted and sometimes for reason of ethnicity i mean how about the real life bullying from
00:36:21.660 indigenous men is that not more pressing like like what about protecting indigenous women in
00:36:29.320 real life because sorry to say i'm pretty sure there's a very strong correlation between crime
00:36:35.240 domestic abuse and indigenous men in the indigenous community can someone fact check for that
00:36:41.720 um but no the the online comments the comment section on the internet we need to take we don't
00:36:48.420 need to prioritize the protection of indigenous women in real life no we need to protect them
00:36:53.140 from mean comments on tiktok like it's just a bunch it's just a clown show this whole thing's
00:37:02.120 just a clown show and it's i don't understand how these people sleep at night and it's it's
00:37:07.040 very interesting right because this is just one collection of bureaucrats and people working for
00:37:12.100 the ngos all sucking on the teat of the canadian government right and most of what they do in a way
00:37:18.860 is fake it's all fake it's all baked it's all based on a sort of false reality that they sell
00:37:26.020 and invent and exchange with one another uh you know the far right is the worst thing ever
00:37:32.000 and it's all based on stuff that comes from left-wing think tanks
00:37:36.760 and it's all nonsense it's all it's all fake it's all a bunch of we're going to talk about
00:37:42.900 gender and the importance of accepting transgender people and the economies in the toilet uh there's
00:37:50.180 rising crime uh no but we want to police internet comments and obfuscate the trans agenda that is
00:37:59.500 you know actually preying on young people and messing with their minds like it's all it's all
00:38:05.400 totally fake but it's a nice presentation they got the money they got the money so they will
00:38:12.000 they will not bite the hand that feeds them and they will just feed into the bullshit narrative
00:38:16.820 because that's where the trough is they there's just a dei trough and they're they just stick
00:38:22.760 their face in there and they just eat it up and uh we also are seeing that the indigenous youth
00:38:31.280 Lil Fringe says, have you ever partied on a reservation with natives?
00:38:35.740 They look at you like a buffet.
00:38:37.900 Oh, my gosh.
00:38:39.380 Oh, my gosh.
00:38:42.000 That's an unsettling way to put it.
00:38:45.440 Darren Turner says, it's a whole industry.
00:38:47.460 Absolutely, it's a whole industry.
We interviewed Caryma Sa'd, and she actually talks about it in those terms.
00:38:57.280 She talks about the hate industry and how it's very profitable.
00:39:00.580 It's a way to get Canadians to hate each other and have fear amongst one another.
00:39:04.840 And it kind of gets used by different political factions for different reasons.
00:39:08.540 So, yeah.
00:39:10.580 Up to 35, 36% of Indigenous youth have been targeted as compared to non-Indigenous youth in this country.
00:39:17.540 Callie, could you build as well a bit of what Gord was talking about, or Evan, rather, was talking about in terms of the fact that members of the 2SLGBTQ plus community are oftentimes the target of this harassment.
00:39:27.280 hate yeah um we're definitely seeing it um and i think one of the the ottawa capital pride let's go
00:39:36.640 let's go speaking of violence i wonder if she commented on uh on this where is it
00:39:42.400 I wonder if she commented
00:39:51.360 I wonder if she commented on this
00:39:54.020 This
00:39:55.680 When
00:39:58.420 Josh Alexander
00:40:00.400 Nick Alexander got his eye bloodied
00:40:03.020 At a protest
00:40:03.900 But they don't really care about real life
00:40:06.780 This isn't about
00:40:07.660 This thing isn't really about real life
00:40:09.900 This is about internet comments guys
00:40:11.420 The things that's most tangible for me is we see it online and we're seeing it migrate into real life.
00:40:17.360 So as you were mentioning, Evan, venues and other stakeholders when you're planning events are now getting threats and those threats are very tangible.
00:40:24.640 So we're seeing this online space become very immediate for us at events.
00:40:29.760 Immediate for you. Raisa, Children First Canada released a report earlier this year, some sobering numbers,
00:40:34.720 because one of the findings was that in the last 10 years, I'm going to read this, online sexual exploitation of children has tripled.
00:40:41.420 Talk to us a bit more about those findings and the abuse children are experiencing online.
00:40:45.420 Yeah, absolutely.
00:40:46.420 You know, these numbers are staggering with what you just said around tripled since 2014
00:40:51.420 and specifically with online sextortion has increased 150% this year, right?
00:40:56.420 And young people are looking for a safe space, the safe third space, which is the online world today.
00:41:01.420 And, you know, they are facing those harms and, you know, facing every single day
00:41:06.420 those sexploitation, this sextortion and the other harms that come into play every day
00:41:10.420 to play every day without even them knowing about it right and they're just kind of taking it in
00:41:14.020 just taking it in as you say and you know as you share all this i'm thinking about a survey which
00:41:18.340 which cpac commissioned abacus data to conduct for us earlier this year and so this i made a
00:41:24.840 tiktok about this but this is arif virani who is presenting the online harms act and saying
00:41:31.300 saying exactly this that we're going to protect kids online here he is their pride parade this
00:41:37.380 is what happens at pride parades this guy is a bug this guy is naked with a bugs bunny mask
00:41:44.640 with his just naked at this event so the liberal rubber stamps um events like this and then of
00:41:55.640 course here are two other uh liberal mps uh it's cut off in the photo but this kid has like a
00:42:01.520 rainbow colored skirt on so this idea that the liberal party this idea that the liberal party
00:42:08.460 wants to protect kids from sexual exploitation i can also who's got the clip who's got the clips
00:42:15.440 of the the clippings of these books that are in canadian public schools
00:42:18.540 with all this like weird trans stuff and pornographic material the stuff in these
00:42:25.160 books are so pornographic that even these school boards are like don't know don't bring that up
00:42:30.400 well just let the kids read that we don't want to see that again it's like this this whole it's
00:42:37.420 it's a clown show it's this whole thing is so disingenuous um there's so much resources behind
00:42:46.040 it it's insane they're really gunning they're really gunning they're not gunning for your kids
00:42:51.380 they're gunning for your tongue they want to rip your tongue out and i mean i'll bring it up in a
00:42:56.320 but the undertone of this is that uh man we got to stop these white people we got to stop these
00:43:02.460 like right-wing conservative people who are complaining about all this nonsense out of that
00:43:07.000 survey they found that 42 percent of those who who responded either had experienced cyberbullying
00:43:11.100 personally or knew someone who was victimized by it and to that take a listen to three people who
00:43:15.700 were generous enough to show i uh i can't wait to see this um because i've been cyberbullied
00:43:24.820 what's what's the threshold for being cyber bullied do you know how guys do you know how
00:43:31.000 badly that I was roasted on my tiktok account by children it hurt my feelings I had my feelings
00:43:40.420 hurt by children on tiktok I'm a victim I'm a victim okay so now you need to ban the internet
00:43:49.180 you need to change the rules the gut you need to give the government the ultimate control to take
00:43:53.740 down comments online because i felt offended by a tiktok comment it's totally insane
00:43:59.900 share with us their stories so these are oh god
00:44:05.160 this is the professional victim the professional victim uh look at just look at the look on his
00:44:16.620 face look at the look on his face i can't wait to tell you about um this is my moment this is
00:44:22.380 my shining moment where i talk about how people hated me on the internet all because i'm trying
00:44:27.480 to groom children all because i'm trying to normalize i'm sexualizing minors and calling
00:44:32.740 them trans kids and gay kids as if kids should have a sexuality at all yeah people are getting
00:44:40.740 mad at you for that no shit i was bullied yeah imagine imagine imagine walking out in public
00:44:51.180 like this and not expecting to get bullied oh man youtube just said on in the in the chat i
00:45:00.540 can't do it greg i'm sorry hey man it's okay that's why i'm here sometimes i wish i could
00:45:04.940 step back because it hurts it hurts okay here we go thanks for watching anyway i appreciate it who
00:45:11.820 was victimized by it and to that take a listen to three people who were generous enough to share
00:45:16.060 with us their stories sometimes i wish i could sorry i love that they were they were generous
00:55:21.340 enough to share their stories as if faye johnstone is not a total attention whore who was
00:45:26.540 chomping at the bit to get a spot on this uh on this panel yeah right i you know what i guess
00:45:32.700 i'll share my story that's like her whole her whole grift her whole shtick is being this trans victim
00:45:38.780 oh man who was victimized by it and to that take the fries in the bag who were generous enough to
00:45:46.820 share with us their fries in the bag fay sometimes i wish i could step back because it hurts it hurts
00:45:52.420 to be turned into a figure for mockery and a symbol of loathing for these canadian for these
00:45:58.920 groups are you kidding me i i mean i'm sorry how could you how could you not be that
00:46:05.900 like respectfully like like i can't believe they turned me they turned they turned you into a
00:46:13.380 figurehead where is it i just tweeted about it the other day come on here we go let's hear that
00:46:19.360 one more time do that take a listen to three people who were generous enough to share with
00:46:26.900 us their stories hold that thought sometimes i wish i could step back because it hurts
00:46:31.160 it hurts to be turned into a figure for mockery and a symbol of loathing for these far right groups
00:46:39.800 this is her this is her on sorry him this is him on twitter we have been hated before we have been
00:46:48.240 targeted before we have been here before we won then we will win now if the far right and their
00:46:53.740 friends in public office want to start a culture war on lgbt we will rise to the challenge and we
00:46:58.860 will win so uh yeah and i said over here they hate you because you're sexualizing minors and
00:47:06.880 mischaracterizing it as a civil rights issue also you're male um so it's she's what do you mean i
00:47:15.020 wish i could step back this is this is you're not gonna you're not even honest about anything you're
00:47:20.320 doing so yeah this is this is funny wait can you guys see this shit anyway
00:47:28.320 trent dab says you turn yourself into a figurehead yeah exactly it's it's uh it's it's the whole cry
00:47:38.880 bullying thing right like i can't believe you would do this to me by the way this is great content
00:47:43.100 you calling me a dumb tranny bitch i'm gonna use that as content as an excuse for me to exist and
00:47:48.760 to to grift um are you going to be honest about how you're sexualizing minors i didn't think so
00:47:54.020 i've struggled with suicidal ideation in my in my past as well and i was at the point where i
00:47:58.760 didn't feel like i wanted to keep living either because all of the people that i cared about
00:48:02.620 were taking their own lives if i knew then what i know now i would have had so many more heart
00:48:09.300 to heart conversations oh wow interesting i think you i think you kind of just uh ants i think you
00:48:21.240 kind of just actually touched upon something that would actually be productive i think that
00:48:27.080 would actually be productive if i knew then what i do now maybe i would have talked to my daughter
00:48:32.680 more often about using the internet, right? If I would have known then what I know now,
00:48:39.780 I would have talked to my daughter more often about what she's doing on the internet.
00:48:45.500 Hey, newsflash, nothing in Bill C63 does that. Nothing in the Online Harms Act does that.
00:48:51.860 None of these politicians are even talking about that. None of these politicians are even talking
00:48:56.880 about interfacing and talking to children about how they use the internet.
00:49:02.680 It's all going to just lead to, hey, we got to pass this bill.
00:49:09.620 All of the people that I cared about were taking their own lives.
00:49:12.540 If I knew then what I know now, I would have had so many more heart to heart conversations with my children about.
00:49:22.960 Yeah, I actually agree.
00:49:24.820 I agree with that.
00:49:26.820 That's probably a good takeaway.
00:49:29.480 Unironically, that's probably a good takeaway with all of this.
00:49:31.860 maybe don't just let your kid use the internet without supervision or talking to them about this
00:49:36.980 stuff um that's that's actually a solution i could get behind giving the government an insane
00:49:44.720 amount of control over all content this government no i don't i think that's completely irrelevant
00:49:50.120 the online world about people about relationship building about um trust in communication
00:49:59.480 Faye Johnstone
00:50:01.100 Maddie Freeman
00:50:02.140 and Carol Todd
00:50:03.260 know first hand
00:50:04.320 the human toll
00:50:05.360 time spent online
00:50:06.620 can have
00:50:07.340 in October 2012
00:50:09.160 15 year old
00:50:10.200 see once again
00:50:12.720 this is so silly
00:50:14.360 they know the toll
00:50:16.800 that spending time online
00:50:18.700 can have
00:50:19.440 you shouldn't spend
00:50:22.580 too much time online
00:50:23.760 in general
00:50:25.100 again this is
00:50:26.220 this is like a valid point
00:50:27.560 that they're not
00:50:28.140 like they just don't
00:50:28.780 want to talk about which is like if you're chronically online all the time probably not
00:50:32.880 good for you in general they don't want to have that conversation though they're just raising all
00:50:37.560 this alarmism and fear porn about hate so the government can take control of it it's disgusting
00:50:44.640 amanda todd took her own life a month after posting a nine minute video on youtube using
00:50:51.400 flashcards to tell her story as a victim of cyberbullying and online sexual exploitation
00:50:57.020 I actually didn't know what was going on until the RCMP showed up at my door, right,
00:51:07.340 telling me that there had been a call and that they were doing a child safety check.
00:51:15.180 And that was the evening that Amanda's offender had posted her image out to the world
00:51:22.700 to get back at her for not sharing more images and not doing videos as he was telling her to,
00:51:31.100 so the threats. Her offender threatened her that if she didn't send more pictures or videos for him,
00:51:37.900 he would put her picture out on the World Wide Web, and so he did. After that, when I became aware
00:51:45.020 of it and attempted to have conversations with my daughter um she being a teenager was embarrassed
00:51:51.980 right and she didn't want to talk about it and then eventually trent says the fact every kid
00:51:58.900 has a cell phone now i don't see how the government can keep people safe
00:52:03.160 yeah once again the the the whole character of this conversation is completely misguided
00:52:10.980 Everything that this woman is talking about is not something that Bill C-63 would even help with whatsoever.
00:52:17.420 You're talking about a child sex predator online who is entrapping your child.
00:52:23.600 That happens through private communications.
00:52:26.620 And it's a horrible, disgusting thing, of course.
00:52:30.820 But, you know, she even said it herself.
00:52:32.900 I wish I would have had more heart-to-hearts with my kid who killed themselves.
00:52:38.860 Yeah.
00:52:39.380 so what why what are we doing talking talking about passing bill c63 legislation
00:52:45.640 well maybe this will come up more we'll see when she was re-threatened again by this person
00:52:53.680 she came to me and then i realized that facebook was one of the platforms that her offender used
00:53:00.260 how difficult it was i couldn't find the report button because there wasn't one i couldn't find
00:53:05.280 a phone number. I couldn't find an email address. I couldn't find anything. In the 12 years since
00:53:09.400 Amanda took her life, the number of social media platforms has exploded, with the digital
00:53:14.100 advisory group Kepios reporting that by July 2024, there were over 5 billion social media users
00:53:20.180 around the world. Billions of users and countless hours being spent online. Maddie Freeman was one
00:53:26.300 of those users. At 12 years old, she struggled with depression and anxiety. So at that time,
00:53:31.920 i didn't realize how much social media was negatively impacting me this is just so disgusting
00:53:35.960 to watch them use these people as like political props until i got to high school and i saw some
00:53:42.220 pretty extreme experiences from my peers i lost many friends to suicide throughout my years of
00:53:46.540 high school into college to see my whole community in shambles because we just kept feeling like
00:53:50.860 these deaths were never gonna end so like just imagine having somebody with one of these like
00:53:55.520 genuine stories where the internet fucked you up and thinking that the mainstream media news
00:54:01.260 and politicians actually want to help you like they actually want to help you this is the same
00:54:07.800 government that promotes medical assistance in dying they promote medical assistance in dying
00:54:15.320 and they think that they're going to help you with your depression they genuinely want that
00:54:19.500 no they're using you as a political prop like it's so it's insane oh um for me the only antidote to
00:54:27.360 being able to continue on and feel okay with myself was to take action. I knew that something
00:54:32.180 needed to be done about this. Maddie founded NoSo November, a global advocacy movement aimed to
00:54:37.940 get teenagers offline in November and provide educational tools to empower teens when they are
00:54:42.660 online. Women, children, vulnerable and Indigenous communities all experience higher rates of online
00:54:48.480 abuse and cyberbullying, according to Statistics Canada. Transgender woman and activist Faye
00:54:53.380 Johnstone was not prepared for the online abuse she began experiencing as her advocacy work
00:54:57.960 increased. The piece of that that was hard was that no one knows how to support or help. And so
00:55:02.060 you're seeing something you didn't expect. It's about you. It's personal and it's targeted. And
00:55:06.120 you don't have anyone to tap in to say, hey, like this is really rough. This is also something that
00:55:11.900 a lot of people don't see because it happens online. And it's like, you know, you mob somebody's
00:55:15.800 Twitter mentions and not everyone around me is seeing that. So it would happen to me almost in
00:55:19.520 this like silent little corner that is very personal to me but that's not necessarily getting
00:55:23.460 seen uh by anyone around me what log off let's put the phone down what the hell this is is this
00:55:36.300 the best they got this is actually pathetic i and i'm just i kept doom scrolling and looking at the
00:55:42.160 mean comments and like no one else sees that only i see that because i'm looking at my twitter
00:55:46.480 notifications and you're spending all day on twitter looking at all the hateful people in
00:55:51.300 the comments that sounds like a you problem you know like what and there's just so many hateful
00:56:01.400 comments uh-huh yeah yeah that's what that's what happens when you're a public figure honey
00:56:06.700 you get hateful comments that's what and especially if you sexualize minors with your ideology
00:56:13.500 yeah you're going to get hateful comments this same thing as if you were in a public square
00:56:20.360 doing the same thing you're going to get people yelling at you and disagreeing with you
00:56:25.220 if you enter a debate and you're like i want to sexualize minors and i think trans kids are real
00:56:30.960 and gay kids are real and we need to we need to like talk about that more often the result is
00:56:38.520 sexualizing kids in the classroom you're gonna get pushback you're gonna get people you're gonna
00:56:43.940 get mother bears who are angry at you and what you're saying that's just this welcome to life
00:56:50.060 hello this shit's just so basic it's heartbreaking uh it's heartbreaking uh and you know reading the
00:56:57.440 twitter notifications heart is heartbreaking by the way i have a hard time like i'm a strong proud
00:57:02.820 happy confident human man but it does uh you internalize all of that hate when you see it
00:57:09.260 all of the time who does you do that's not healthy what and you see also what it does to
00:57:18.280 your community um i have parents of trans kids who are scared to speak out about what's going
00:57:22.360 on for their children because they know the risk is higher for them to be targeted because they see
00:57:26.060 what happens to me trying to make trans kids a real thing is just pure evil there's always
00:57:34.760 concerns about safety about cyber bullying online abuse um i think teens are very aware of these
00:57:39.660 problems but the problem is that social she's right kids are very aware of these problems
00:57:46.440 yet you have all these grown-ups thinking they can they can solve the problem uh by giving the
00:57:53.700 government a whole bunch of orwellian power over what's said online yeah okay media has become so
00:57:59.400 ingrained into our world and into our social circles that we feel trapped on the platforms
00:58:03.680 a lot of teens struggle to find a healthy relationship with social media because everyone
00:58:07.580 else that they're connected to is on it and using it actively i never thought that again i've said
00:58:12.260 this many many times on stream before but like the the direction of the conversation is totally
00:58:16.600 misguided if you want to have a conversation about people spending too much time online then by all
00:58:20.480 means i agree with that a predator from across the waters like in another country would come
00:58:26.100 after my child but the internet has no boundaries it's there's no door like i'm sorry i know this
00:58:32.680 the the the daughter of this woman killed herself but like this is literally like a boomer discovering
00:58:37.540 the internet for the first time you know like it's like these are supposed to be the experts
00:58:41.820 on cyberbullying the front door of your house anymore when it comes to being online so did you
00:58:47.940 know that the internet doesn't have boundaries can come from all over the world and appear in
00:58:54.620 your house through the online world so literally explaining the internet okay thank you three
00:59:02.960 powerful stories from individuals whose lives have been affected by cyber bullying and
00:59:07.720 online harassment and abuse carol todd's daughter amanda todd lost her life to suicide in 2012
00:59:12.100 maddie freeman talked about contemplating suicide and losing her friends to suicide and
00:59:16.540 Faye Johnstone, talking about the abuse and threats she receives online as an advocate.
00:59:22.220 Yeah, I mean, just to juxtapose that, if I'm not mistaken, Canadian Afghan veterans more have
00:59:28.780 killed themselves since coming back from Afghanistan than actually died in Afghanistan.
00:59:36.220 There's a trend. There's a suicide trend. And our government doesn't really support our veterans.
00:59:44.520 they kind of just give them a cold shoulder so yeah oh no the government cares this time though
00:59:51.300 the government cares this time when they want to take control over the internet yeah
00:59:55.240 those are pretty weak stories though that's uh i mean arif virani did you not help them
01:00:01.820 with this because this is pretty bad this is pretty weak arif hello arif this sucks man
01:00:07.600 this prop you you've put up better propaganda than this this is weak sauce you know evan it's
01:00:12.620 hard not to be moved by those stories but i'm wondering if we can go back a little bit when
01:00:16.060 did we start seeing and tracking this type of online abuse this kind of abuse uh has existed
01:00:21.920 as long as the internet has existed ding ding ding ding ding ding ding ding i actually agree
01:00:26.280 with evan balgord this kind of abuse has existed as long as human beings have existed
01:00:32.920 as long as human beings have existed they have hated each other they have harassed each other
01:00:39.960 they've gotten mad at each other then human beings created laws and they made a line they said hey
01:00:46.460 if you um you can say what you want as long as you're not advocating violence against somebody
01:00:52.440 but the second that you actually start to advocate for violence or commit violence against somebody
01:00:56.980 that's when we're going to draw the line and all the human beings said that seems like a pretty
01:01:00.680 good line to have in the sand but then we can say what we want right like we can still hate each
01:01:06.020 other yeah you can still hate each other cool so that's kind of where we've been and it's been
01:01:11.220 it's been working out pretty well actually whenever whenever they go over that and start
01:01:15.240 to censor people for their speech it usually ends up in tyranny and communism and violent
01:01:21.620 regimes that will start eradicating uh sections of their population so but yeah tell me tell me
01:01:28.620 more about the origins of hate spin all around ever since the internet yeah it's like the internet
01:01:35.200 Then God created the internet and then hatred became a thing.
01:01:39.480 Then bullying became a thing.
01:01:41.240 As if bullying hasn't been around and hatred hasn't been around since the beginning of the dawn of time.
01:01:47.700 Please read Lord of the Flies.
01:01:50.540 What has changed is the mechanisms by which it happens.
01:01:54.820 Everybody has become concentrated on platforms.
01:01:56.880 This fake seriousness.
01:01:59.900 This fake seriousness from Evan is like crazy.
01:02:03.180 it's it's it's gotten more bad it's gotten more intense yeah it's gotten more intense you got to
01:02:08.300 go back to acting classes bro come on and uh we have created this culture of harassers earning
01:02:14.920 fame and money for tormenting other people where they can build an audience that eggs them on
01:02:19.840 and they go after target after target usually picking on on somebody's identity uh so what's
01:02:24.460 new about it is that there are isn't that what the anti-hate network does though you work for
01:02:29.920 the Canadian anti-hate network your whole thing is like ragging on people huh rewards and no
01:02:35.800 punishments for this kind of behavior uh social norms seem to be breaking down in that people are
01:02:40.300 less ashamed to be engaging in this kind of harassment of others I recall when we started
01:02:44.200 doing this work if you found out who was behind some kind of far-right awful activity and named
01:02:49.180 them and shamed them that would might make them disappear or at least stop that behavior like 70
01:02:52.560 of the time I'm just pulling numbers out of my experience here I don't have a poll but if well
01:02:56.460 and then here it is cats out the bag this is about the far right this is about right wing
01:03:01.900 people doing awful things there you go you think anyone's going to clarify this for the rest of
01:03:07.000 this clip no it's gonna it's no no it's the far right doing all of this it's the far right that
01:03:12.780 is evil it's the far right that we must do something about that's where the awful stuff
01:03:16.980 is coming from as if there can be no hatred coming from the far left as if there can't be
01:03:24.040 identifiable groups all collectively hating and shitting on white people hashtag cancel canada
01:03:28.760 today it's all part of the same agenda folks feels that that number has gone down that naming
01:03:35.420 and shaming of individuals is less effective and people are being incentivized um for doing these
01:03:40.280 kinds of harassment campaigns arisa i wonder if you could and once again no specifics they're not
01:03:45.580 going to be specific about anything it's just hate it's just talk to us about the the impact
01:03:51.500 this abuse actually has on the mental health of young people yeah absolutely so within our report
01:03:56.460 we found that about one in five canadian young people experienced cyber bullying and online
01:04:01.820 harms and that stats kind of follow with you know they may be experiencing some mental health
01:04:06.220 challenges uh whether that be anxiety depression or even greater such as suicidal ideation attempt
01:04:11.420 like what maddie had said earlier um and you know young people are struggling because there's been
01:04:15.660 an onus on young people to learn more about media literacy and even parents within our report um you
01:04:20.860 You know, we found that 65% of them feel not equipped to deal with the cyberbullying that's been happening within their own young people's lives.
01:04:28.000 So it's a really tricky thing where, you know, we try to empower young people to learn more about the harms that they may be experiencing,
01:04:34.900 but the supports are not there for them to take action.
01:04:38.300 Kelly, what would you say about the impact on mental health when it comes to the 2SLGBTQ plus community?
01:04:42.680 yeah I think um the 2SLGBTQ community has uh historically um you know been asked to open up
01:04:49.660 to come out to be yourself to be authentic and I think um that is very important for us to feel
01:04:55.080 free to do so but I think um you know online hate that we see and and cyberbullying is this authentic
01:05:00.280 that affects us is this is this um is this guy in the bunny suit being authentic this is the this is
01:05:08.340 the gay community this is the gay community coming out and being authentic that's that's
01:05:13.100 authentic lgbt expression why are they hating us yeah i wonder i wonder why by keeping ourselves
01:05:22.780 secret by keeping more things close to the vest and i think that that feeling of you know you
01:05:28.380 can't be your true self uh you can't share um i think that really affects people's people's mental
01:05:33.020 health you know in november the canadian mental health association they released a report on the
01:05:37.580 state of mental health in this country and they found and really what's happening is right wingers
01:05:41.300 are like trying to bring shame back they're trying to make shame a thing of like you should
01:05:45.500 probably be ashamed of yourself uh for doing this with a bugs bunny mask on you know the community
01:05:51.240 the the gay community should probably check themselves because a lot of people just kind
01:05:57.520 of find some of this behavior gross okay uh because it you know you can say anything you
01:06:03.900 want shitting on white people but the second that you criticize anyone in the lgbtq plus you're like
01:06:08.700 this horrible monster um and yeah that the tides are turning because people are tired of that
01:06:14.940 but we got to stop the hate now that mental health is three times worse right now than before the
01:06:20.360 covid pandemic they also found that 38 of indigenous people reported their mental health
01:06:24.720 as being poor or fair you know josie when we add online abuse and cyber bullying man i can we get
01:06:30.600 like a study count here how many studies have been referenced in this like so many studies we got
01:06:37.440 another study of the stats that were totally uh construed a specific way to give us the conclusion
01:06:44.760 that we wanted to justify our political agenda very cool very cool into this can you talk to us
01:06:51.080 about the impact that you're seeing in indigenous communities as a result of this type of harassment
01:06:55.220 and perhaps not just...
01:06:57.040 The harassment is crazy.
01:06:58.780 Saying harassment like that, that's wild.
01:07:01.820 The harassment?
01:07:02.680 The community in general, but also on Indigenous women.
01:07:07.520 Yes, thank you.
01:07:08.700 Certainly that percentage does not surprise me
01:07:11.080 that there is an increase in terms of Indigenous women
01:07:14.580 and the feelings of bullying and the hate speech
01:07:19.600 that's attached to it in terms of Canada's history
01:07:23.840 with Indigenous peoples and in terms of the colonial constructs that we have had to live in.
01:07:29.240 So that further perpetuates the feelings of isolation, the feelings of forced assimilation,
01:07:36.760 and the feelings of not being part of the Canadian fabric, the way in which we want to be,
01:07:41.960 the feelings of rejection.
01:07:43.400 And so that increases the vulnerabilities of Indigenous women and children and youth in this country.
01:07:48.000 And so therefore, we feel the same. We feel the anxiety.
01:07:51.580 like the stuff that i do for you guys the stuff that i do watching this sitting through this
01:07:56.940 the stuff that i do for you guys sitting through these word salads sitting through these insufferable
01:08:05.360 struggle sessions about our feelings and how horrible the colonization
01:08:10.380 i just had a moment there saying i'm like where the why the fuck am i doing this
01:08:17.620 why do why do i have to sit through this this nonsense can you imagine being there can you
01:08:22.660 imagine being in the front row can you imagine being somebody listening to this and actually
01:08:28.740 thinking it was like valuable and actually thinking yeah oh my god they're all such victims
01:08:36.800 i mean what do we got we got um we got hate expert on the left we got you know women's oh by the way
01:08:45.820 I looked it up. Do I have it here? Shit. Yeah, I got it
01:08:48.520 right here. She's part of the YWCA. Oops, one sec.
01:08:58.320 This woman here is part of the YWCA. The YWCA has a
01:09:03.940 commitment to supporting trans women and gender diverse
01:09:06.260 people. They denounce anti-trans discrimination and
01:09:10.960 violence and stands in solidarity with trans women.
01:09:13.320 They welcome trans women as leaders and participants in
01:09:15.340 their programs and recognize their contributions to the feminist movement so yeah they let uh men
01:09:26.420 into women's shelters they they let men into women's shelters that that's what um that's how
01:09:35.040 that's what this woman does this guy works for cpac this is like the the gay representative
01:09:40.900 this is the indigenous representative and uh i'm not sure this one's about uh kids i guess
01:09:46.000 but uh yeah it's just a campaign against just regular regular canadians who are calling out the
01:09:55.580 bs we have the suicide ideation we have all of those issues as everyone else only it's it's more
01:10:04.320 because of the history of canada thank you you know i i'm thinking about this rising trend
01:10:10.580 and services. And so Amanda, I wonder for you, as we talk about an increase in online harms,
01:10:16.480 an increase in cyberbullying, are there the services to actually address the need?
01:10:21.360 Here we go. We're going to get into solutions. What does YWCA stand for?
01:10:32.000 I don't know. The W stands for women, probably.
01:10:40.580 What does YWCA stand for?
01:10:49.580 Young Women's Christian Association
01:10:52.720 Of course
01:10:55.580 Of course it stands for Christian
01:10:58.600 Oh my gosh
01:10:59.940 Oh my gosh
01:11:03.020 Frequently asked questions
01:11:04.180 That's why the question was
01:11:05.660 Are you a religious organization?
01:11:07.940 No, we're a secular organization
01:11:09.980 and gender diverse people of all faiths and backgrounds isn't that great isn't that fantastic
01:11:18.400 more and more organizations are pivoting to be able to address that need and provide services
01:11:24.240 not only for people who are experiencing online harms and particularly for youth but for the
01:11:28.380 trusted communities in their life who might be able trend says but it's an organization that
01:11:31.980 supports men yeah exactly yeah yep to help them respond and help them build resilience to this
01:11:38.960 kind of hate. But unfortunately, there needs to be resources to support that kind of engagement
01:11:42.740 and capacity building. And that's a place where YWCA Canada is really calling for change. I think
01:11:47.260 the other piece that we have to be cognizant of is, as we've heard a number of times, young people
01:11:51.720 and everyone still remain in online spaces and particularly see social media as a place to build
01:11:56.600 community and so need to be able to do so safely. And organizations who are supporting that kind
01:12:01.000 of community building also need to be able to engage safely in online spaces. And I think there's
01:12:06.120 Yeah, see, they're laying the groundwork for Bill C63 here, engaging safely in online spaces.
01:12:11.840 So this is the whole argument that until we censor people, they cannot participate in democracy or the online space.
01:12:21.340 This stuff is, I'm just getting so, it feels like Groundhog Day.
01:12:26.560 There's not quite yet an ability to fill that gap to ensure that service providers and community organizations can be online to support their community in a virtual way and do so.
01:12:36.120 safely for everyone they're all just talking they're talking about bill c63 that's all they're
01:12:41.680 talking about if there was only a thing that could fill the gap there's only a thing if only there's
01:12:48.820 like a piece of legislation spoiler alert they already have one i i can't wait till they first
01:12:56.580 bring it up and they're gonna make it seem like it's this organic thing it's this totally organic
01:13:00.360 thing oh my god oh my gosh you know i i want to get back to that survey that cpac commissioned from
01:13:06.520 abacus data and you know david coleto is the ceo and the founder of abacus and and among the things
01:13:11.160 we asked him to look into was was to just measure what canadians are saying about online harassment
01:13:15.880 about cyber bullying sandy klein makes a great point she says many indigenous women and girls
01:13:20.440 go missing the internet is vital for sharing info do they talk about that if they restrict it it
01:13:26.840 It will harm.
01:13:27.980 Yeah, so thank you for the comments, Sandy Klein.
01:13:30.860 There's actually a lawyer.
01:13:33.180 He's on YouTube, too.
01:13:34.220 He's got a big channel.
01:13:34.940 What's his name?
01:13:36.820 Runkle of the Bailey.
01:13:37.860 He had a chat, I think, with Brian Lilly about Bill C-63
01:13:42.920 and talked about how parts of this legislation could be used
01:13:47.300 to hide harmful content that could be used as evidence
01:13:51.160 to prosecute somebody like a child sex predator,
01:13:55.240 like something else.
01:13:56.520 Because like, like what if the content is violent and sexual in nature and it's actual evidence that exposes someone for being evil or bad?
01:14:07.780 And like, think about when you see a viral video of someone getting beat up of some outrageous thing happening in public.
01:14:16.020 That stuff deserves to be seen so people know it's happening in real life.
01:14:19.940 this legislation would give the canadian government the power to take down uh to take
01:14:28.680 down content that they don't like and they have a whole smorgasbord of reasons to take down the
01:14:35.000 content it's been very it's it's been it's been it's very clear that this canadian government
01:14:45.120 will use its power to silence people and silence things they don't like so how are we supposed to
01:14:50.940 believe that this legislation won't be used to take down the things they don't like that's bad
01:14:55.240 for their reputation that's a bad look there's i mean and all signs all roads all signs point
01:15:02.880 towards that based on the thing based on what we've seen based on that the fact they have
01:15:07.340 on the board here which is part of a uh explicitly partisan organization that
01:15:15.100 It focuses on the far right.
01:15:17.100 So, take a listen to what David Coletto and Abacus Data have found.
01:15:27.100 Well, look, I ask Canadians, you know, hundreds of questions every week about a range of things
01:15:31.500 and to find, you know, close to 90% believing that something is a problem.
01:15:36.220 These days, we don't, it's rare to find us all agreeing on a problem.
01:15:39.820 This is an...
01:15:40.820 Mass immigration.
01:15:42.320 More and more Canadians are agreeing with that.
01:15:44.860 we're not going to talk about that. We're going to talk about censoring the internet. Sorry.
01:15:47.400 ... of how widespread public recognition is of the problem. Unfortunately, it doesn't really
01:15:51.640 surprise me because I think we've heard anecdotally lots of stories of both women and those in racial
01:15:56.460 ized communities saying, I feel more targeted. I'm more likely to be targeted from this. This
01:16:00.480 survey just counts it up and shows just how widespread overall people's experience with
01:16:06.400 cyberbullying is and how those in particular demographics seem to be the targets of it more
01:16:11.920 and it's more frequently happening to them.
01:16:14.220 And so, yeah, it's not so much a surprise,
01:16:16.140 but it should shock us, right?
01:16:17.620 That there are certain people in the country
01:16:19.940 who seem to be getting the brunt of this bullying
01:16:23.000 and it's simply because of who they are,
01:16:25.280 not necessarily because of anything they've done.
01:16:27.660 Wow, dude.
01:16:28.860 Like they got everything so controlled
01:16:31.460 and dialed in here in Canada.
01:16:33.360 Even the abacus data,
01:16:35.120 even the data guy is spouting the exact same rhetoric
01:16:38.320 to support Bill C-63 and the progressive politics,
01:16:41.520 which is they're they're being targeted for who they are was that in the survey was there a
01:16:46.980 question about that in the survey was there a question of like you know being discriminated
01:16:51.940 against for your different uh like was white one of the options for being discriminated against or
01:16:56.760 was it just my you know like if you design the survey a certain way and this is actually
01:17:01.700 specifically how they designed the survey that i took the link is from a way back i think i did
01:17:06.480 like two years ago but um it was very clear that you know the far right and white supremacy was
01:17:12.500 like hate they basically equated like white supremacy hate uh you know heteronormative
01:17:18.360 all of that stuff is like where the hate is coming from like that's how this survey was
01:17:23.460 construed so um yeah i just feel like i'm taking laps in a swimming pool full of shit right now
01:17:32.380 I feel like I'm taking like watching this video. I feel like I'm just doing laps in and just
01:17:38.160 I'm sorry for the graphic example, but that's I'm just letting you know how I feel right now
01:17:43.600 live watching this. So David Coletto of Abacus Data. And again, the numbers here, 90 percent of
01:17:48.340 Canadians believe there is some problem with online abuse. And in that same survey, 80 percent
01:17:52.680 of respondents said cyberbullying was one of the most significant challenges, dangers facing young
01:17:56.840 people. You know, Amanda, as you're listening to David there speak, I'm wondering if this backs up
01:18:01.020 kind of numbers trend lines that you're seeing with the ywca and as you answer that i'm also
01:18:06.780 wondering if you could build on it like where is this actually coming from are we talking about
01:18:09.500 social media are we talking about dms are we talking about gaming can you break down where
01:18:13.180 this is actually originating absolutely and i think uh without question we're seeing the same
01:18:18.300 sort of trends in our own research in our own work at ywca not only in the levels of online
01:18:22.780 hate and harm um that young women and gender diverse people particularly those who are um
01:18:27.580 members of racialized communities. Young women and gender diverse people.
01:18:34.140 These are newcomer communities or 2SLGBTQ plus communities are experiencing, but also that
01:18:39.260 overwhelming consensus that this is an issue that people should be taking action on. And what we
01:18:44.060 know is that, unfortunately, the harms can originate from any source online. And so social media, I
01:18:48.120 think, is one that we have mentioned a number of times. And is it clear? The hate is coming from
01:18:51.980 everywhere. It's coming from all sides. At least where we're seeing this harm happening, but it can
01:18:55.940 happen um on a myriad of online platforms uh in public conversations in private conversations
01:19:00.740 oh my gosh this is news to me people hate each other in conversation oh my goodness
01:19:08.900 and there needs to be a sort of nuanced approach to addressing these individual ways that um online
01:19:13.300 harm can manifest you know you you mentioned okay so she didn't really answer the question
01:19:16.900 she basically said hate can come from anywhere nuance i'm also wondering if you if there's a way
01:19:20.580 of nuancing you know the the the source person or organizations and evan i'll ask you to talk about
01:19:25.620 that because are we talking about uh an individual the source person or organization okay we're
01:19:31.220 getting we're getting warmer here using uh social media or these platforms to abuse are we talking
01:19:35.780 about we're going to talk about elon musk bots are we talking about coordinated efforts and
01:19:40.420 where if at all is a region targeted um so different platforms vary in how much they have
01:19:48.340 created an environment of hate and abuse um and in particular you know these platforms have been
01:19:53.780 driven for a long time by profit motive by maximizing engagement at the cost to their users
01:20:00.660 now we see you know the evolution of that certain platforms have become much much worse than others
01:20:06.660 because some have leaned even further into that and with elon musk's takeover of twitter and x
01:20:11.620 we have now seen not just a profit motive behind this very large company but what seems to be an
01:20:17.540 ideological motive behind musk replying to white nationalists trying to use twitter to influence
01:20:24.560 the politics of the country there so he is okay twitter was in bed with the cia in the previous
01:20:33.700 election now twitter is owned by elon musk elon musk is very close friends with trump so it's like
01:20:40.700 Like, it was this, you could make that argument, but it was the same before.
01:20:45.540 So, and the ideology he's speaking about is freedom, is the Constitution, is free speech.
01:20:53.000 But, you know, just hand it to the establishment, progressive, uber-left in Canada to just clutch their pearls over an Elon Musk meme or an Elon Musk reply.
01:21:04.980 Elon reply!
01:21:06.220 Oh, my God, did you see what Elon replied to?
01:21:09.040 This is the end of the world!
01:21:10.700 gosh it's created a space um that is hostile towards people who would advocate um for human
01:21:18.460 rights and amenable to people who would advocate against human rights so it's it's just so annoying
01:21:24.940 that we have these experts so-called experts on online harm and bullying and they're literally
01:21:33.540 all like far leftists they're literally all leftists talking about online harms and bullying
01:21:40.380 as if that's like the entire scope of conversation that's the entire scope of the human experience
01:21:49.580 that's the entire scope of the world that we live in just these this just these things out here all
01:21:55.880 this all this hateful stuff out here that's what we got to get rid of as if like they're not even
01:22:00.720 human beings they're not even like people as if conservatives don't even got a seat at the table
01:22:06.220 don't even get a chance to defend themselves or have their voices heard or disagree with
01:22:10.380 sexualizing children or radical demographic change through mass immigration you know oh
01:22:17.740 they're just white nationalists they're just these shitty nazi people like it's uh and again guys
01:22:24.680 who was fighting back against this if if this was just if this was just some like stupid press
01:22:30.540 progress little panel i wouldn't really care that much okay if it was just like some leftist
01:22:36.200 little blog doing their YouTube channel. That's fine. That's not what we have here. We have the
01:22:41.900 leaders of these, you know, NGO organizations. This guy's on the news on a regular basis. I'm
01:22:47.520 pretty sure they're going to have politicians. This guy runs Abacus Data. Okay. And is Pierre
01:22:53.940 Poilievre fighting back against this? No, there's no, no people are aggressively identifying this
01:23:02.340 problem and exposing it. But that is why I have started savefreespeech.ca. And that's why
01:23:10.800 this documentary that I'm making is so important because it's going to reach out of our echo
01:23:16.160 chamber and it's going to reach people. I swear, I swear to you, you might think it's crazy. It's
01:23:20.600 going to reach people like this. We're, we're higher. We've hired this award winning documentarian,
01:23:29.400 this award-winning filmmaker who has all of the dirt, all of the dirt on the Canadian Anti-Hate
01:23:36.180 Network. And they connect it to the teachers unions. And we connect it to the mainstream
01:23:41.500 media. We connect it to the Trucker Convoy. We connect it to the Million Man March for Kids
01:23:46.220 and how all these people get smeared. They get their reputation smeared. They get dehumanized.
01:23:50.720 They get persecuted, thrown in jail, some of them. There's a very clear pattern of how this network
01:23:56.740 works and how they all have the same far left ideology and people need to understand what's
01:24:03.000 happening here it's not just trudeau there's a whole network of the managerial class some people
01:24:08.540 call it who prescribe to this progressive far left ideology and they need to be exposed like
01:24:14.700 canadians patriots they need to understand what's going on here and that's what we do
01:24:19.200 in our documentary you can go to givesendgo.com slash savefreespeech to donate
01:24:24.180 we've done over eight we're almost at 10 interviews i think of different people
01:24:30.220 we've interviewed a couple lawyers we've interviewed professors we've interviewed
01:24:34.380 uh teachers who have been canceled the patriots who have been thrown in jail
01:24:38.020 um and we're telling the story looping it all together and exposing these corrupt
01:24:43.360 non-governmental organizations um yeah i think it's a very important piece because it's like i
01:24:49.940 that is going to reach out of our echo chamber.
01:24:52.060 But if you want to donate,
01:24:53.860 go to givesendgo.com
01:24:54.880 slash savefreespeech.
01:24:57.120 And yeah, thank you so much for your support.
01:25:01.380 I think they all should have spent a year
01:25:03.580 with the Amish, yeah.
01:25:07.140 For doing these kinds of harassment campaigns.
01:25:10.840 Arisa, I'm wondering if you could talk to us about this.
01:25:13.020 Actually, I don't know how much longer.
01:25:15.380 Okay, still there.
01:25:16.240 The impact this abuse actually has
01:25:19.180 on the mental health of young people.
01:25:21.240 Yeah, absolutely.
01:25:22.540 So within our report, we found that about one in five
01:25:25.900 Canadian young people experience cyberbullying and online harms.
01:25:31.360 Oh, I rewinded it.
01:25:33.340 Sorry, sorry, sorry.
01:25:36.140 Let's not watch all that again.
01:25:37.980 Jeez.
01:25:39.000 You know, I want to get back to that survey,
01:25:41.280 that CPAC commissioned from Abacus Data.
01:25:44.180 And, you know, David Coletto is this.
01:25:46.440 As you answer that, I'm also wondering if you could build on it.
01:25:48.760 like where is different platforms there against human rights so it varies platform by platform
01:25:54.940 and as I said before in terms of like where it is coming from there is a lot of hate and it gets
01:26:01.180 fomented and whipped up by a fairly small number of individuals who have become content creators
01:26:07.380 or started sort of like fake news platforms and things like that that managed to whip up
01:26:11.680 people into a frenzy to then go after particular groups you know so we so it's funny it's you know
01:26:18.920 it's really funny that he says that it's funny that he says that because uh i'm about to share
01:26:25.760 a clip from the documentary or from one of the interviews we did with um caryma sa'd maybe you've
01:26:34.240 heard of caryma sa'd she is a lawyer criminal lawyer in toronto she you know she interacts
01:26:40.860 with all sorts of people
01:26:41.840 going to these protests.
01:26:42.980 She's all around the protest circuit
01:26:44.580 and she essentially,
01:26:48.260 well, let's just hear
01:26:49.520 one of the clips here.
01:26:50.760 I want to see,
01:26:51.280 how do I make this work?
01:27:02.520 What?
01:27:03.920 What?
01:27:04.540 What?
01:27:04.840 What?
01:27:05.060 How do I get this to work?
01:27:09.180 Oh, maybe.
01:27:10.860 Just a second here.
01:27:15.480 What's going on here?
01:27:24.920 Oh my gosh.
01:27:26.040 Okay, hold on.
01:27:26.900 This is huge.
01:27:29.760 Oh, that's why it was so big.
01:27:32.880 Just a moment.
01:27:34.600 I was trying to do this before it started.
01:27:37.560 This is a clip
01:27:38.700 from the upcoming documentary.
01:27:42.240 I just want to make sure I can hear the sound too
01:27:44.060 so I can listen along right here.
01:27:57.060 Monitor and output.
01:27:59.040 Okay.
01:28:00.340 That should work.
01:28:05.260 All right.
01:28:06.160 Speaking of people being online,
01:28:08.700 and targeting others
01:28:10.940 let's hear Caryma
01:28:12.520 talk a little bit about her experience
01:28:15.060 with the Canadian Anti-Hate
01:28:16.940 Network. The question I ask her
01:28:18.800 is have you ever been targeted
01:28:20.240 have you ever had your reputation smeared
01:28:22.660 by the Canadian Anti-Hate Network
01:28:24.840 the Anti-Hate Network
01:28:27.380 and why
01:28:29.140 can't I hear it
01:28:30.400 Monitor and output, right?
01:28:42.100 Would you say that the Anti-Hate Network has done personal or professional reputational damage to you?
01:28:49.760 They've certainly tried.
01:28:51.380 Yeah, the Anti-Hate Network has really tried, I think, to bury me.
01:28:55.040 again you know through its own media and commentary and also through proxies.
01:29:05.040 What's really striking though is that I'm just one person who's been through
01:29:11.140 this and over the course of my work I've encountered many other Canadians who
01:29:18.600 have received similar treatment whether directly from anti-hate or through
01:29:25.500 more informal channels because of course the idea of a network is almost
01:29:34.320 deliberately like it's amorphous right who's part of a network and I've come to
01:29:41.040 learn about various groups that exist online primarily centering around
01:29:47.280 discord servers where there is a real attempt to stalk and disparage Canadians
01:30:00.780 and you know what's the purpose I think whether it's to drive people to civil
01:30:07.380 ruin or criminal consequences or even suicide this is a real phenomenon that
01:30:15.360 that exists and you know it could be much bigger than sort of the small scope
01:30:22.200 that I have but what I have seen is deeply disturbing and that's what keeps
01:30:27.780 me in this looking to expose it so that hopefully it can be dealt with I don't
01:30:36.120 think that our political leaders or law enforcement at the moment are doing a
01:30:42.900 good job dealing with it would you say that the anti-hate network has done uh well there you go
01:30:49.780 there you go i mean it's something that's going unchecked it's something that people are not
01:30:57.260 paying attention to it is uh it's almost like anti-hate the canadian anti-hate network is
01:31:05.740 guilty of these things the anti-hate network has done isn't that interesting could the could the
01:31:14.260 canadian anti-hate network be bullying people themselves and weaponizing the internet against
01:31:19.980 their political enemies and doing this very same thing that they're saying they're trying to stop
01:31:25.260 well they say they're a politically partisan organization that is proudly anti-fascist
01:31:31.020 and focus on the far right.
01:31:32.840 They also associate with Antifa,
01:31:35.140 who is a violent organization.
01:31:37.560 Huh.
01:31:38.680 Wow.
01:31:39.940 It's almost like everything that this guy is saying
01:31:44.800 is a bunch of crap.
01:31:48.760 Amazing.
01:31:50.000 Wow.
01:31:50.840 Isn't it great to learn?
01:31:53.300 Isn't critical thinking amazing?
01:31:55.360 Isn't exposing liars fun?
01:32:01.020 um you know during covid it was like sorry it was muted you didn't you didn't miss much it was the
01:32:22.160 start of like people talk about media literacy um and i think i do want to bring up that you know
01:32:27.000 many young people are digital natives right so they are very familiar with the concepts of social
01:32:30.700 media and seeing that as a third safe space safe very relatively you know in question um but i think
01:32:38.700 with more awareness and more education and you know even though richard mitchell says great job
01:32:44.920 greg thank you once again if you want to like that was just a raw clip of us interviewing caryma
01:32:50.680 if you want to donate and see more stuff like that we have a whole bunch of interviews in the can
01:32:56.220 The reason that I haven't been posting more content is because I've been interview, I've
01:32:59.760 been, sorry, starting to cut up and take notes on all these interviews, work it into our
01:33:04.080 existing scripts and blah, blah, blah.
01:33:06.140 But if you want to support, you know, if you want to support that project for that clip
01:33:10.180 was from, then please go to givesendgo.com slash savefreespeech.
01:33:13.500 I appreciate it.
01:33:15.340 We have the onus of, you know, making young people to learn more about media literacy,
01:33:20.400 but can we push that into the broader system within our education system or even the policy?
01:33:26.220 yeah let's get a let's get an anti-hate toolkit we could call it put it in the schools
01:33:31.340 right in regards to the legislation within some of these uh social media spaces you know big d says
01:33:37.900 union leaders zoom meeting went public calling members to dress up like antifa to go against
01:33:42.940 freedom that there was a leak there was a leaked zoom call i don't i don't think the union members
01:33:50.700 said dress up like antifa but they did uh they did allude to seasoned activists who can help
01:33:57.980 them and they did allude to intimidating and demoralizing protesters who care about their
01:34:04.300 children's uh freedom and not to be kind of perverted by sexual ideas in schools these were
01:34:10.560 union members supporting the idea of demoralizing uh protesters very militant very antifa minded
01:34:18.060 um there's a lot of other good stuff in that caryma interview actually alluding to this
01:34:23.780 but anyway let's keep it going tally i'm wondering if you are a member of the 2s lgbtq plus community
01:34:28.640 and you find yourself being targeted or victimized by this type of hatred i'm targeting her right
01:34:34.680 now this meant this cyber bullying what is the recourse are there services supports for members
01:34:42.040 of the community how about uh putting your phone down and going for a walk is that a resource
01:34:50.080 does that work um i guess the short answer is no um not that i've found or been able to take
01:34:57.620 see how they all nod they're all nodding they're all nodding yes yes yes yes yes
01:35:02.500 oh look oh look it's oh you can't see that shoot can you see it over here
01:35:08.840 anyway a referani comes later so we know this is all leading to bill c63 it's all very predictable
01:35:14.880 i've seen this one before i know how this ends take advantage of um and i think uh just as we
01:35:21.940 were all all speaking here uh something that kind of struck me is um you know for for me personally
01:35:28.180 and and for many in my position uh it's not it's not just in the personal sphere uh on social media
01:35:34.420 but it's also at work um my inbox has been a very nasty place um and that just has massive
01:35:43.720 effects on community right so when we're talking about um what is that what do you what do you mean
01:35:48.220 that that just has massive effects on community because you got hate mail these people don't
01:35:54.180 these people are not good at selling this you know i'm trying i'm trying to i'm trying to have
01:35:59.100 something to grapple with here that's challenging but it's just the same you guys have just been
01:36:02.540 selling the same shit over and over again your atrocity propaganda is weak um my feelings hurt
01:36:10.440 is that it my feelings hurt your identity being part of the rotten studio says her box is a nasty
01:36:19.120 place ah that's a good one that you do and then um you know being thrust into into that spotlight
01:36:28.040 um and and having uh this kind of online presence um it it's a scary place for sure
01:36:35.100 okay i'm gonna i'm gonna i'm gonna reach into my toy box here i don't know what this is
01:36:42.380 let's see
01:36:44.900 oh god oh god
01:36:50.860 oh god this guy this guy this guy is like just ridiculous
01:36:56.820 the heart of creation and innovation it's just that our aesthetics
01:37:04.260 queer people of color have always been the heart of creation and innovation
01:37:15.080 it's just that our aesthetics make it onto the runway never our bodies
01:37:18.780 we're always the mood boards never the models and so i actually feel like what social media
01:37:25.300 has done is created a cage made out of glass, where constantly people say, I see you online,
01:37:33.920 but then I ask, why don't you see me on the train next to you? Why don't you see me on
01:37:39.100 the runway? Why don't you see me on TV? It's because you still think I'm a freak show.
01:37:45.100 And so what social media has allowed is a digital freak show. People scroll and they
01:37:50.540 follow and they say wow look at these obscure strange people and they mine us for inspiration
01:37:56.060 but it's still another specter of our dehumanization gender non-conforming people of color
01:38:01.900 i can't believe people would call this person a freak show i can't believe people would call
01:38:10.780 this person a freak show can you imagine calling this person a freak show it's it's just it's just
01:38:17.480 like you're an attention whore you know what I mean like so much of it isn't is being an attention
01:38:21.600 whore and there's actually stats backing this up guys this isn't me just being a dick this isn't
01:38:26.400 just me being provocative you can look this up people who are gender non-conforming and they do
01:38:33.220 and they dress up like this they have a personality disorder that's about attention seeking so but no
01:38:41.960 but this woman is just like i can't believe people are bullying us yeah i like it's it's
01:38:47.860 it's mission impossible for these folks they're like we need to stop the hate and you you have
01:38:52.580 you know you have homosexuals addicted to methamphetamines spreading diseases like no
01:38:58.680 one's business even inventing new diseases like monkey pox and then people are going to point
01:39:03.640 that out and make jokes about it and then it's like we have to stop the hate and it's like no
01:39:07.700 you guys just need to be responsible you guys just need to be able to take criticism because
01:39:12.800 there's a lot of valid criticism and if you haven't noticed this progressive agenda is all
01:39:17.080 about empowering freaks like yourself and uh but but of course they never they never get the the
01:39:24.440 especially uh you know detestful or disgusting freak on the panel they always get someone who's
01:39:29.980 very proper and it's like okay are you going are you going to defend the behavior of disgusting
01:39:35.660 people at the pride parades are you going to defend this no you're just going to ignore that
01:39:40.780 your car so you're going to ignore that and then say i care about gay rights and anybody who
01:39:46.560 criticizes or gets mad at gays is this horrible person and we need to stop it yeah it's disingenuous
01:39:51.580 josey i'm wondering if you could talk to us about supports whether there are any for
01:39:56.120 indigenous women and their families who who become targeted or victimized through through online
01:40:01.220 Yes, thank you. Currently at Native Women's Association, we do have a program called Safe Passage, and in the program we have a component on cyberbullying and violence and offer safety advice, where to go, resources, how to protect themselves, those kinds of things.
01:40:24.760 And so, having said that, we're not reaching anywhere near enough of the folks that we need to.
01:40:31.300 One of the research documents that I read just recently said that Indigenous women youth should be researched separately because of our lived experiences, which are different based on the colonial constructs that I previously mentioned.
01:40:45.800 And so the research needs to be targeted towards Indigenous women, youth, due to poverty,
01:40:53.320 sometimes don't have the resources that are required to keep individuals safe.
01:40:59.320 But I do believe that schools have a responsibility.
01:41:02.180 IT companies have certainly a bigger responsibility.
01:41:04.220 Government has a responsibility to create safer spaces within social media.
01:41:09.700 Thank you.
01:41:10.260 Thank you for that.
01:41:11.380 Listen, I'm going to ask you to hold on to those thoughts because we'll continue the conversation.
01:41:14.340 But I also want to bring in another statistic, again, from the survey that was done by Abacus Data after being commissioned by CPAC.
01:41:20.680 And in that poll, they found that 58 percent of respondents said online hate and bullying was a deterrent for qualified people who are considering running for public office.
01:41:29.060 Now, we know that politicians from all.
01:41:30.680 Let's go. Hatred, working, intimidating people, stopping them from running for office.
01:41:36.020 i mean i know someone personally who didn't run for the ppc because they were afraid of the
01:41:41.660 professional consequences of associating with the people's party of canada so once again like
01:41:46.840 is that is that ever factored in that side the other side no probably not um by the way that
01:41:56.720 woman was just talking about uh you know native life and i just thought there's actually a really
01:42:03.720 like crazy section of tiktok that's like um jokes about living on the rez like on a native
01:42:12.880 reserve where's the volume where's the volume where's the volume
01:42:19.400 there we go that is a big yo bro can i have a cup of water uncle
01:42:29.260 I hear you go and I feel real small like your wee nook.
01:42:34.600 I want a bigger one.
01:42:37.440 Better don't be touching my cup.
01:42:40.920 You think you're bigger and more solid than me?
01:42:43.440 This ain't even your house, it's kookum's.
01:42:46.860 I get to have the bigger cup.
01:42:50.680 Only big onks could drunk out of big cups.
01:42:55.640 Oh yeah, fuck around and find out.
01:42:59.260 Can I have a couple? Okay, don't you just love the internet?
01:43:18.160 Yo, bro, check out my new girl
01:43:19.840 Ever sick? That's her cousin
01:43:23.560 Yeah, settle down. I said my girl is in one, not twelve. I ain't like that no more, bro
01:43:28.900 Okay, that's a dozen. I said she's our cousin. So they're making fun of having sex with your cousin
01:43:36.660 Again, like am I allowed to laugh at that or is that me targeting a community or whatever?
01:43:41.140 Can I have a couple you want to take the mattress to the living room or what come with me you trust me?
01:43:47.380 With me on a magic
01:43:52.960 Yeah, I want to see this one
01:43:58.900 it's locked it's locked
01:44:01.940 fuck around i'm gonna whistle
01:44:09.300 that's hilarious
01:44:13.720 these effects are like so crazy
01:44:18.800 fuck around it's locked
01:44:23.000 for those that don't know the joke is that the sheet is the door and apparently that's a thing
01:44:28.060 that happens a lot on the rez um again is that like is that racism is uh they're targeting
01:44:35.400 themselves with these this humor like it's uh i don't know it's because the thing is is if they
01:44:42.420 make that joke it's fine because they're native if i made that joke it'd be totally not okay
01:44:45.840 because i'm white or whatever you know what i mean like it's it's uh all the stuff that they
01:44:50.900 talk about in the online harms and stuff is uh it's they don't distinguish between criticism
01:44:56.460 between humor between satire or hate they're just kind of saying it's all hate it's all horrible
01:45:05.080 because that's another thing that's another great point which is
01:45:09.180 some people will find this and be depressed and they'll think it's hateful and they'll make a
01:45:17.140 comment and they might get bullied in the comment section okay so what that's the problem of the
01:45:23.000 person can like you know browsing the internet but we need an answer we need some sort of answer
01:45:30.060 to this let's see what the answer is all levels federal provincial local are facing more abuse
01:45:34.480 more online harassment uh and we did speak to a few politicians about their experience with this
01:45:39.360 type of online hate take a look let's go the very i hate who h in chat if you hate politicians let's
01:45:47.320 go first times i i received online abuse i sometimes reached out to the person and said
01:45:52.280 why are you talking like that? But then I would be inundated. There would be, there's times when
01:45:57.040 there's sort of like, almost like a concerted campaign to just drive you off Twitter or drive
01:46:02.660 your Facebook into the ground, and you can't keep up. So it's in the morning, I get up, get a cup
01:46:07.580 of coffee, and I block, block, block, block, block, so that the people who are on my site are people
01:46:11.740 who want to talk. Well, it's awful, and it's, it's, it wears on you, quite frankly. This is a...
01:46:17.980 you're a politician. You're a politician. You're a public servant.
01:46:24.340 Interfacing with the public is what you do. Getting criticism, getting feedback from the
01:46:30.260 public, that is a huge part of your job. And if people are mad at you or hating on you,
01:46:38.160 maybe there's a reason for it. Maybe there's a reason for it.
01:46:42.260 vocal minority but they're bullies they like to intimidate and it's um it's difficult especially
01:46:48.940 if it involves family um i think that's that's one of the hardest if somehow um family gets
01:46:55.520 brought into it an mp talked about an email that they received threatening to rape his wife
01:47:00.260 you know and i've had media say to me and and people have written well you this is what you
01:47:04.400 chose to do you need to get thick skin and toughen up the hate started pretty soon after i
01:47:09.280 Yo, there she is.
01:47:10.440 Oh, shit.
01:47:10.940 Sorry, I forgot to switch it back.
01:47:12.160 Sorry about that.
01:47:13.520 There she is.
01:47:15.980 Oh, my God.
01:47:17.020 She does not look good.
01:47:19.260 Oh, my God.
01:47:20.060 This poor woman, just used as a political puppet.
01:47:26.720 Look at this used and abused political puppet by the Liberal Party.
01:47:31.080 They just used her as a puppet, and they just threw her aside, man.
01:47:35.860 You hate to see it.
01:47:37.860 Anyway, how are you doing, Chris?
01:47:38.980 See you?
01:47:39.280 no not Christy
01:47:40.920 sorry Catherine
01:47:41.640 Climate Barbie
01:47:42.780 yeah
01:47:43.640 didn't she lose
01:47:44.920 didn't she misplace
01:47:45.920 like billions of
01:47:46.920 taxpayer money
01:47:47.900 dollars
01:47:48.740 didn't she like
01:47:50.180 misplace
01:47:50.920 thousands of
01:47:52.360 infrastructure projects
01:47:53.600 I got bullied
01:47:56.780 I'm kind of glad
01:47:57.900 you got bullied
01:47:58.660 if that's what you did
01:48:00.160 you kind of
01:48:00.660 you kind of deserve
01:48:01.520 to be bullied
01:48:02.360 I think after doing
01:48:03.340 something like that
01:48:04.200 I mean if anything
01:48:08.020 in terms of like
01:48:09.060 damage done i think you've caused more damage than any amount of people calling you the c word
01:48:14.380 uh on twitter or real life for that matter when i came in as minister of environment and climate change
01:48:20.980 now it's kind of quaint i was called climate barbie um although it wasn't all that quaint
01:48:25.340 if you looked at the tweets they often you know had like violent acts where they would crush a
01:48:29.760 barbie okay so again take taking the funny part of climate barbie and then just associating it
01:48:35.420 with the more inflammatory tweets
01:48:39.080 is if it's the same thing,
01:48:40.880 which it's not,
01:48:42.680 a lot of the criticism,
01:48:44.980 a lot of the hatred is totally valid.
01:48:49.100 And then they always try to muddy the waters
01:48:50.780 with the stuff that might be potentially illegal to say
01:48:55.740 because it's inciting violence.
01:48:58.260 But they always try to mix those two together.
01:49:01.460 Yeah, because they want to pass more bills
01:49:03.340 to take away our freedom.
01:49:04.560 So, yeah.
01:49:05.420 It was pretty brutal from the start, but I ignored it.
01:48:08.680 My team said, like, do not do anything with this.
01:49:11.560 You will look weak.
01:49:13.020 You'll get drawn in.
01:49:14.620 And then one day I was at the United Nations.
01:49:17.740 We were talking with world leaders about climate policy.
01:49:21.660 And I noticed that Gerry Ritz, who was a colleague, a conservative in the House of Commons,
01:49:27.420 had tweeted something about, you know, something stupid, how stupid I was in climate Barbie.
01:49:31.560 and i just said to my team i was like okay folks like i'm saying something back i had multiple
01:49:38.180 death threats i had a woman incarcerated can we just replay on how like silly that is
01:49:43.160 my colleague criticized me and it's like i'm gonna reply and it's like yeah that's twitter
01:49:48.580 this is supposed to be serious is this this is supposed to be a serious thing right here
01:49:54.640 it's crazy not once but three times as a result of this um and in the 2022 election i just couldn't
01:50:01.020 take anything anymore I was being targeted and harassed not just online but in person and I
01:50:06.960 decided to self-harm to stop it because my world became incredibly small I was suffocating and I
01:50:13.460 wanted to die and a lot of this was because of online trolls without any sort of name or
01:50:20.240 identifier around so like this is so counterintuitive because if you show this to online
01:50:27.320 trolls they are going to feel so empowered the fact that they have learned that their bullying
01:50:34.600 has caused you to be so emotionally tilted that you wanted to end your life that like
01:50:42.140 emboldens the trolls and you're basically saying yes these bullies are so powerful i can't manage
01:50:48.700 the internet comment section and how i deal with my own life these trolls these oh my god stop
01:50:56.460 these anonymous accounts i don't think they realize how how uh how this totally fuels the
01:51:04.460 fire and just emboldens online trolls my gosh this is a conservative by the way on them but
01:51:12.740 who felt that they were able to get to me very closely and to personalize things including with
01:51:19.280 my family and uh really wanted to eliminate me i ended up in the hospital eliminate me
01:51:26.280 i see i'm a detail guy i would love to see the details of like any of what they just brought up
01:51:33.380 um because they just went from i wanted to kill myself from climate barbie being called climate
01:51:39.780 barbie and they're seriously like pushing those two things together as if they're like equivalent
01:51:43.820 are you kidding me like ridiculous when i started to get the first death threats online
01:51:51.500 i was really shocked and i was shocked that there seemed to be so little
01:51:58.420 anybody know what charlie angus was doing why do people hate him um think so little i could do
01:52:05.700 about it when people are saying you're being i mean you can hunt it you're never going to see
01:52:10.600 coming tell your family to arrange a funeral well is that just an idiot in
01:52:15.580 their mom's basement or is that someone who actually is going to do it and I
01:52:19.940 think the bigger area of concern that I have is that it's not threat a or person
01:52:30.120 B that is going to actually follow through to the result it's when you
01:52:35.360 create an atmosphere where threat is seen as normal that someone else will
01:52:40.700 act and that when it becomes when you demonize public officials to such a
01:52:45.560 level that it's not just the swears and the hate but the actual threats it will
01:52:53.780 trigger someone who's probably unstable to do something it's not just members of
01:52:58.580 part I mean this is so rich coming from the establishment that regularly
01:53:03.920 demonizes and human uh dehumanizes um donald trump conservatives right wingers they do this all
01:53:12.220 the time they downgrade and ignore um you know any sort of uh hatred or violence that the right
01:53:23.000 wing receives i brought this up already this is something that did not make a headline anywhere
01:53:28.160 in canada it's somebody protesting the gender indoctrination in schools and um
01:53:35.200 yeah so like that violence doesn't really get covered right so antifa violence doesn't really
01:53:42.100 get covered so it's just kind of annoying when they kind of try to use this emotional argument
01:53:49.120 and uh it's clearly a completely lopsided one and they don't give a fuck at all about people
01:53:57.240 who don't have their politics right parliament i'm talking to municipal counselors who are facing
01:54:02.280 this in small towns i'm talking to to provincial mlas that's every day for some politicians
01:54:09.660 including me feeling like there's no allies there's no light there's no support
01:54:14.100 and what about your pension what about what about uh your salary there's lots of support
01:54:20.140 you know you know what you could do lisa you could you could buy an app with all the money you have
01:54:25.700 that can like filter your time online
01:54:28.780 and it can actually tell you,
01:54:30.340 hey, you've been spending too much time on Twitter.
01:54:32.180 Maybe you should log off.
01:54:33.800 Or you could hire somebody with your salary
01:54:36.760 to check your emails for you,
01:54:39.220 to protect you from the mean comments.
01:54:43.860 Or you could hire a therapist
01:54:46.680 because you're trying to kill yourself
01:54:49.660 because of online comments.
01:54:51.920 That could maybe be a good idea too.
01:54:53.980 or if you're seriously like worried about someone might be stalking you or whatever you can you
01:54:59.880 know obviously call the police if you have any sort of leads on who this person might be if it's
01:55:04.700 a real threat or not or you could even try to hire a private investigator there's a lot of things to
01:55:09.080 do but i just really feel like you know looking and reading the mean comments and like commiserating
01:55:14.160 about it and panicking is uh yeah i don't think it's a good idea when you start to internalize
01:55:20.820 that you not only are that's the second time we've heard that term that i remember internalize
01:55:25.860 when we internalize that why are you internalizing it it's a it's a bunch of older folks who just do
01:55:32.920 not know how to navigate the internet and they take everything too seriously nice in the bully
01:55:37.740 space but you start to dehumanize your liquid gal says stay off your phone then she needs some cheese
01:55:42.920 for that whine yeah and the most annoying thing with public officials complaining about this is
01:55:48.640 I would love if they covered what these people did as well, because in the case of Chrystia Freeland, I mentioned it, but like, you know, she is, she is rightfully hated.
01:56:01.720 People, people are very justified in, in, in hating this woman in terms of like the, the, the egregious decisions and fumbles she's
01:56:12.200 made as a public official um and i just feel like that's really really important to the context
01:56:17.760 of what we're talking about yourself too and you have no self-esteem left and i mean especially
01:56:24.620 i'm not gonna i'm not gonna go into detail here but like especially during covid and stuff
01:56:28.660 uh locking people down forcing kids to wear masks forcing people to take a uh untested
01:56:37.900 medical product. Yeah, you kind of deserve to be hated. You deserve to be hated. And
01:56:45.440 because what you're doing was totally wrong and immoral, and did cost lives. If you want to talk
01:56:51.620 about depression and forcing people online, you want to talk about people being depressed and
01:56:55.980 being online. Let's talk about all of these lockdown rules that you imposed upon the entire
01:57:01.300 country, including children. I can't believe they're depressed online. Yeah, you force them
01:57:06.800 to go online during COVID
01:57:08.940 with very misguided policies.
01:57:14.160 Again, part of this conversation
01:57:16.400 is a totally valid one.
01:57:18.460 I can get addicted to my phone sometimes
01:57:20.420 and it's unhealthy.
01:57:21.600 I've got too much screen time.
01:57:24.280 Or maybe I'll get caught up
01:57:25.760 in the comment section
01:57:26.540 and get caught up in my feelings.
01:57:27.780 I'm guilty of that sometimes.
01:57:30.240 But I don't internalize it
01:57:31.960 because I'm not an idiot.
01:57:33.280 And I also go outside
01:57:36.460 and i go for a walk and i realize hey it's just the internet it's just the internet okay if there's
01:57:43.320 a valid threat on your life that's different but if it's a hateful comment it's a stupid it's just
01:57:50.220 the internet ah i just feel like i'm going crazy i'm sorry if i hurt your earlobes with my yelling
01:57:57.120 for those of us with an underlying mental illness it it can be deadly i do report it and i did like
01:58:04.600 these politicians online harassment can be deadly and this is what's spooky folks because
01:58:12.520 she is a conservative this woman is a conservative lisa macleod this is another thing that will
01:58:20.340 probably be fighting back against to save free speech but uh the conservatives also
01:58:26.000 have an online harms bill that they want to pass i'm not going to get into the detail but like that
01:58:30.700 is another potential threat for our free expression on the internet coming from the
01:58:34.480 conservative party the response from the police is quite frankly underwhelming usually the response
01:58:42.140 i get is what would you like us to do or we have three options we can do nothing we can phone the
01:58:49.100 individual or we can look at pressing charges what would you like us to do and my response is this
01:58:54.480 it looks like a woman who
01:58:57.360 calling her the c-word
01:59:00.840 she would be more concerned about
01:59:03.260 you calling her the c-word
01:59:06.200 than like
01:59:06.980 I don't know
01:59:08.560 some new Indian
01:59:10.420 fresh off the boat migrant
01:59:11.840 like raping somebody
01:59:12.980 she would be like
01:59:14.080 she would put more emphasis
01:59:15.260 on like well someone called me the c-word
01:59:17.200 it's like well are you worried about this like
01:59:18.760 crime that happened
01:59:20.740 and it's like someone called me the c-word
01:59:24.480 They called me the C word.
01:59:26.960 Okay.
01:59:27.820 Oh yeah, someone got like raped over there
01:59:29.660 by some undocumented migrant
01:59:33.040 who came over the border.
01:59:34.640 You don't care about that.
01:59:36.380 I got called the C word.
01:59:39.680 These are our public officials.
01:59:41.840 You're supposed to be the experts,
01:59:43.860 but at the very least,
01:59:45.000 I want you to call the individual.
01:59:46.800 Social media companies need to be held accountable.
01:59:48.780 Like the online harms bill.
01:59:49.980 I think sometimes government...
01:59:51.560 It's the Online Harms Act.
01:59:52.440 gets caught up in being worried that it looks like you know they're you know they're controlling
01:59:57.160 speech uh or they are something to do with freedom of speech hate speech is not freedom of speech
02:00:02.740 uh yeah it is uh yeah it is yeah it is this is not a thing i think like what a ditzy valley girl
02:00:14.040 listen to how terrible that that uh clip was social media companies need to be held accountable
02:00:19.500 like the online harms bill. I think sometimes government gets caught up in being worried.
02:00:25.380 Frankly, underwhelming. Usually the response I get is, what would you like us to do?
02:00:31.780 Or, we have three options. We can do nothing, we can phone the individual, or we can look at
02:00:37.120 pressing charges. What would you like us to do? And my response is, you're supposed to be the
02:00:43.320 experts, but at the very least, I want you to call the individual. Social media companies need to be
02:00:49.400 held accountable, like the online harms bill. I think sometimes government gets
02:00:54.740 caught up in being worried that it looks like you know they're you know they're
02:00:58.780 controlling speech or it's something to do with freedom of speech. Hate speech is
02:01:02.360 not freedom of speech. It's just not a thing. I think that there needs to be
02:01:09.640 the tools in place to address these threats of democracy in a better way so
02:01:16.120 that the lone municipal council or the lone member of parliament's not having to face these on their
02:01:21.600 own by dealing with police and trying to deal with the courts. I think there are tools that
02:01:25.880 could be used to to lessen the threat and menace but right now it's everybody's waiting to see
02:01:32.380 what happens next. I really worry... totally disingenuous totally disingenuous criticizing
02:01:39.420 politicians is part of democracy this entire time they've just conflated everything to imply that
02:01:46.100 That us getting criticism and hate is basically we're getting threatened.
02:01:51.920 It's like, okay, were there actual credible threats?
02:01:55.180 Then yeah, call the police and figure that out, sure.
02:01:58.580 But you're conflating everything together.
02:02:01.200 Climate Barbie getting called names as if it's like a crime to call out a shitty politicians
02:02:07.300 who's fucking up for their country and their constituents.
02:02:10.360 I don't think so.
02:02:12.160 Police and trying to deal with the courts.
02:02:13.860 I think there are tools that could be used to lessen the threat and menace.
02:02:19.460 But right now, everybody's waiting to see what happens next.
02:02:23.900 I really worry now.
02:02:26.260 Like, when I look back, it was definitely bad.
02:02:28.980 It sucked sometimes, to be perfectly honest.
02:02:32.820 If we're not careful, someone is going to be hurt,
02:02:35.680 either at the hands of one of these perpetrators or threats,
02:02:41.680 or it's going to be at their own hands.
02:02:43.600 I think that suicide is going to be a very big issue for us as we move forward.
02:02:47.940 And we cannot underestimate the impact of this online hate.
02:02:53.440 Now, let's go W for the bullies, W bullies.
02:02:59.080 All of these politicians are terrified trying to end their own lives.
02:03:03.860 I mean, that's interesting because, again, they don't have any details here.
02:03:10.240 and like details are so important if you if you want to if you wanted to seriously have a
02:03:15.780 conversation about controlling content on the internet details would be of the utmost
02:03:21.760 importance but i've been researching this stuff for months and anytime that people like this
02:03:28.460 who are part of the establishment or part of the bigger ngos media companies politicians
02:03:33.620 they don't go into any detail at all none no specifics they just they amalgamate everything
02:03:41.840 together hate criticism it's the same thing as death threats it's totally disingenuous
02:03:46.740 getting tired of it to be honest now the samara centre in toronto they use a machine known as
02:03:55.560 sambot to analyze millions of comments made during elections it also helps provide some
02:04:01.040 insight into how this type of harassment and abuse is affecting our politics and our democracy
02:04:06.740 here in Canada. And to that, take a listen now to the CEO of that company, Sabreena Delhon.
02:04:13.180 If you want to seek elected office, it will entail a high volume of abuse that will be quite severe.
02:04:20.820 I'm sorry, guys. I'll be back in like 10 seconds.
02:04:23.080 We need to think about affecting our politics and our democracy here in Canada. And to that,
02:04:27.920 Take a listen now to the CEO of that company, Sabreena Delhon.
02:04:32.420 If you want to seek elected office, it will entail a high volume of abuse that will be quite severe.
02:04:40.740 And so we need to think about this as conditions of work so that we can bring this conversation down to earth.
02:04:47.460 These are people who are elected to serve our interests.
02:04:50.040 They're meant to serve the public.
02:04:51.980 What conditions do you want your representative working under so that they can do their best for you?
02:04:56.300 and violence occurring in a digital format is still violence and so we're
02:05:03.080 facing such complex challenges right now with our democracy related to trust
02:05:07.400 risk and information and we need to be responsive to these realities of what
02:05:13.400 it's really like to be an active citizen online and offline because there's a
02:05:17.420 connection and the reality is is people don't feel safe performing these roles
02:05:22.280 and if people don't feel safe performing them then they're not going to step
02:05:25.240 forward or increasingly will be attracting people who also want to facilitate uh this type of
02:05:31.960 violent and uncivil behavior which this this woman is actually a weapon because everything she's
02:05:38.600 saying uh doesn't really make any sense violence online is still violence the fuck you talking
02:05:47.880 about what what do you mean violence is online is still violence like this woman is a master of
02:05:56.240 just smushing everything together you know when i talk about like blurring the lines between actual
02:06:00.640 crime and speech she's doing this like effortlessly and uh and of course of course
02:06:07.680 she's a brown woman every time they want someone to like protect protect democracy and anyone who
02:06:16.360 especially spouting a bunch of
02:06:18.280 bullshit nonsense in Canada
02:06:20.020 they're always either
02:06:22.320 a woman, brown
02:06:24.320 or a brown woman
02:06:26.100 that's like the super duper
02:06:28.600 combo because
02:06:29.960 it's a much better package
02:06:31.960 it's harder to criticize
02:06:33.960 the brown woman because people
02:06:36.180 hesitate and plus she's very polite
02:06:38.460 as well, this woman's very polite
02:06:40.260 she's very poised
02:06:41.180 if she saw this clip of me she would say
02:06:44.020 well he's being violent, he's
02:06:45.780 he's trying to inspire terrorists you know and but she would sound very reasonable and very
02:06:49.880 measured in the way she speaks about it this is uh you know i don't like what she's saying she's
02:06:54.860 totally full of it but she's she's a weapon in terms of her uh this is basically the female
02:06:59.200 arif virani as far as i can tell it's going to have a very corrosive and damaging effect
02:07:06.020 on the practice of our politics in canada so a bit of a snapshot as to how this online abuse
02:07:12.680 is affecting politicians and by extension our democracy in this country you know when you add
02:07:18.460 that again like i know like i know i'm a broken record here but this is clearly leading up to
02:07:23.380 bill c-63 it's like why don't you just say it's about bill c-63 you know what i mean they're trying
02:07:29.380 to make it like it's going to be this surprise at the end surprise we have the legislation already
02:07:34.900 you know but but they're making it this drawn out sort of like mystery novel how are we going
02:07:41.020 to solve all of these hurt feelings.
02:07:44.380 To the discussion that we as a group have already had, I'm going to ask each one of
02:07:48.900 you to just share your thought.
02:07:51.220 What is your biggest concern, your biggest fear, if we don't address the problem and
02:07:55.780 come up with some type of solution to make sure this doesn't spread and get worse?
02:08:00.300 Evan, I'll get you to start.
02:08:01.320 I'd also like to react to something that we saw there, where these are, in some cases,
02:08:05.280 sitting members of parliament, and they have a lot of privilege and a lot of power.
02:08:10.100 and even they feel helpless in some cases going to law enforcement but many other people don't
02:08:16.640 even have close to that privilege they're not having law enforcement ask them hey what can we
02:08:20.100 do and giving them options they're having law enforcement not take their report or not take
02:08:24.040 it seriously or say there's nothing we can do because it's online so I just want to acknowledge
02:08:27.560 that I have a lot of sympathy for that the regular person out there has it unfortunately a lot worse
02:08:33.540 if they are the target for these these campaigns my fear for the yeah but Evan clearly people are
02:08:39.360 going to hate public officials more because they're the ones that kind of are rightfully
02:08:43.620 receiving a disproportionate amount of attention because they're public officials.
02:08:49.020 They're the people near the levers of power.
02:08:51.860 So, yeah.
02:08:53.800 And once again, the lack of detail, the lack of detail on any of the like, oh, it's hate.
02:08:57.840 Oh, it's abuse.
02:08:58.500 Oh, it's like, what are we actually talking about here?
02:09:01.060 Because I bet you if we went through every single one and every single thing, it would
02:09:07.480 be it would be a mixed bag there would only be so many things that were actually you know a a
02:09:14.100 viable sort of threat or potentially criminal uh speech or behavior but of course they don't want
02:09:19.920 to talk about that they want to just all smash it all together the future um is the normalized
02:09:26.960 yeah jennifer francis says they can dish it but can't take it same with trudeau that's a great
02:09:32.440 point it's a great point they completely villainized unvaccinated people they completely
02:09:38.360 villainized and dehumanized right-wingers on a regular basis punch a nazi tear down a sir john
02:09:45.120 a macdonald statue burn down a church it's understandable this is something that gerald
02:09:51.260 butts said this is justifying violence justifying crime and arson against specific identifiable
02:10:01.540 groups in canada but those identifiable groups are white christian conservative men so all of
02:10:09.060 that was totally fine people who are proud of our country proud of sir john a macdonald you can shit
02:10:15.040 on them tear down the statues all you want that's not hate that's just progress but all that gets
02:10:22.440 totally fucking ignored and the second that people start pushing back and criticizing other people
02:10:29.520 or saying something oh they're getting too angry no no that's not allowed anymore that's not allowed
02:10:33.620 anymore that's that's hate gender diverse are you talking about uh you're talking about
02:10:46.240 i don't know i don't know if it's uh i'm not sure what gender he is this harassment has already led
02:10:53.740 to people being murdered and killed, and it will...
02:10:57.120 Can we get a source on that, please?
02:11:00.800 You should be backing something like that up.
02:11:04.200 What did he say?
02:11:08.120 They're having law enforcement not take their report
02:11:10.300 or not take it seriously or say there's nothing we can do
02:11:12.680 because it's online.
02:11:13.720 So I just want to acknowledge that I have a lot of sympathy for that.
02:11:17.460 The regular person out there has it, unfortunately, a lot worse
02:11:20.540 if they are the target for these campaigns.
02:11:22.740 my fear for the future um is the normalization of this harassment has already led to people
02:11:30.340 being murdered and killed fact check okay what's the who's who's been killed over hate and harassment
02:11:38.420 um and it will continue to do so in the future uh and on the horizon right now in this country
02:11:44.180 like it really don't like you're not going to give the example how the fuck can he they've been
02:11:50.180 citing study study study oh we got all the studies all the evidence what about can you get
02:11:54.380 in a fucking example of someone being killed we're not going to get an example we're not going to get
02:11:58.700 the receipt for that we're just going to we're just going to keep moving past that okay um i don't
02:12:04.200 think that we are moving in a direction where that is going to lessen what would you say he
02:12:10.940 thinks people are going to die because of twitter comments someone's died already from twitter
02:12:17.340 comments can you give the example no uh no i'm not going to do that echo evan's fear that i think
02:12:24.040 there's a great risk of online harms translating into offline violence um and offline loss and i
02:12:31.420 also worry that um what we're seeing and what what is being experienced is that young people
02:12:37.320 and young women and gender diverse people in particular are stepping away from public life
02:12:41.960 they are stepping out of online space they're stopping they they no longer share their
02:12:46.320 perspectives and opinions and their passion for perhaps changing the world which is something
02:12:50.360 YWCA so desperately wishes to foster and instead they are disengaged and I think that
02:12:56.240 hold on hold on though like if we follow her logic that these people are inspired to change
02:13:03.000 the world and they're afraid to go online you know what I'm saying like it like you like you're
02:13:13.280 afraid to go out there and share your opinion on the internet, but I really want to change the
02:13:19.220 world. Like you're not going to make it like you're, you're not, you're not going to change
02:13:24.040 the world. Like, how are you going to change the world? If someone hating on you is going to be a
02:13:32.040 deterrent from you, even speaking, if you cannot stand up to somebody hating you and opposing you,
02:13:38.780 you're not going to change the world so you're saying that we need to silence other people's
02:13:44.060 voices in order for what some racialized minority gender diverse person to change the world i think
02:13:52.240 that you're actually using their hurt feelings to try and pass legislation to change the world
02:13:58.900 by taking away free speech and free expression i think that's what's happening here no these little
02:14:04.340 these little scared gender diverse people are going to change the world but they can't because
02:14:08.800 there's hateful bullies on the internet so once we silence the hateful bullies then we can change
02:14:14.000 the world and it's like going to be like ariel from the little mermaid but she's going to be a
02:14:18.320 black trans ariel and then she's and then she's gonna then she's gonna come online once the
02:14:23.940 bullies the the they're gonna part the seas of the hateful trolls and this one ariel black
02:14:31.060 indigenous in a wheelchair woman she's gonna make that tweet i want world peace and then the whole
02:14:41.100 world is saved that's that's the world that these people live in that's like the movie that they
02:14:48.300 have in their heads it's completely delusional is concerning for the health of our democracy now but
02:14:53.540 also very much in the future as we look to think about who may serve as elected officials and um
02:14:59.380 you know, shepherd our country into the future as well.
02:15:03.380 Kelly, what's your biggest fear?
02:15:05.380 That we will continue to be pigeonholed
02:15:08.380 and to deal with these abuses and bullying on an individual basis.
02:15:14.380 I think having conversations like this are a good start
02:15:17.380 to be able to quantify and to understand the quality
02:15:23.380 and the depth of what's going on online,
02:15:27.380 because as we've been saying it can often be a very individual experience so i think just making
02:15:32.740 spaces to to talk about it uh is is both a positive but it's also a very scary thing to
02:15:38.420 to step into so um i i worry that people um won't feel that they they have the space to
02:15:45.300 have conversations like we're having today josie they won't have the space can you hold that can
02:15:50.900 you hold that space for me can you guys hold that space for me please please hold that space
02:15:57.140 for me because we need to be able to hold hold the space hold that space for me i want to i want to
02:16:03.940 i don't know if this is going to be worth it maybe this can this work let's see if this works just
02:16:08.660 one second oh man my computer is not what it used to be guys hold on come on come on
02:16:21.700 yes the numbers are very frightening
02:16:28.180 uh okay let's see if this works this is what
02:16:35.980 this would be very funny i i just have this idea in my head i wanted to try and show it to you
02:16:42.840 guys like it would be so appropriate if at the front of the stage there was just like a massive
02:16:51.700 It's just like a massive Kleenex box.
02:16:55.180 Imagine like a cartoonishly huge Kleenex box.
02:16:58.980 Because they're all just crying.
02:17:00.900 It's all just crying.
02:17:01.900 Oh my gosh, it's horrible.
02:17:05.100 Oh my goodness, they're calling us these names.
02:17:08.680 Oh no.
02:17:09.760 Here, take the Kleenex box.
02:17:11.140 Like a cartoonishly massive Kleenex box.
02:17:17.500 That's how I feel about this panel.
02:17:21.700 just crying crying crying the mean words the mean words is basically the same as
02:17:27.700 like threats and people killing us and it's the violence can you be more
02:17:31.360 specific no you can't you got hate mail and it affected your day and now you
02:17:37.280 want to control the internet you fucking freak I my my biggest fears are going to
02:17:43.780 see more people dying I mean the rate of murder of missing and murdered
02:17:48.660 indigenous women in all of their diversity the numbers are much higher than it is for non-indigenous
02:17:56.420 women and that is the same stat for for uh indigenous youth as well who are dying at the
02:18:03.060 hands of cyber bullying at the hands of other other sources including government as well and so greater
02:18:10.740 um this this this woman is the weakest link she is the weakest link this is the most outrageous
02:18:19.580 statement people are dying they're going to die more it's the government it's cyber bullying it's
02:18:25.140 everything and this is like the most unthoughtful like all over the place comment cyber bullying is
02:18:30.300 killing indigenous women fact check can we get a source no okay it doesn't matter like we're just
02:18:38.580 here to cry okay we're just we're just here to cry with our kleenex box anyway
02:18:43.380 like the facts don't matter at all legislation policy we need to protect the
02:18:51.120 most vulnerable people in this country and unfortunately and sadly it is as as
02:18:58.320 the numbers state and the investigation state it is indigenous women and
02:19:02.740 children and youth and families we have we could do much better i'm not like i want to meet the
02:19:11.060 person who is watching this and holding on to every word you know what i mean like i i don't
02:19:18.120 but i'm also i would be fascinated to like sit down and have an ipa with them or have some sort
02:19:24.220 of like vegan uh weed shot drink with them uh but i don't think that that's going to happen i would
02:19:35.860 just love to meet somebody who actually watches this and hangs on to every word Raisa absolutely
02:19:40.900 and i think my biggest fears um i would share too is that the numbers of you know many young
02:19:46.440 people within canada keep increasing with the online sexploitation sextortion um cyberbullying
02:19:52.580 um you know these numbers will just continue driving up and it will actually impact you know
02:19:56.740 other systems such as the health care system right around the mental health care and having
02:20:01.300 that social support in order for a young person whoever is going through that cyber bullying
02:20:05.780 feel supported and have the adequate access um and you know the many young people that i work
02:20:11.140 with they are empowered to take on this challenge with policy makers so why don't we put them in
02:20:17.300 directly at the seat at the table right here it comes here comes the well there's actually this
02:20:23.140 bill called bill c63 wow oh my god all of our crying could go away i can't wait for that moment
02:20:31.140 and they can provide their lived expertise um and provide innovative solutions rather than putting
02:20:37.060 the onus back on the kids and the parents in order for them to feel safe on social media
02:20:42.500 yeah we're not going to talk about the kids and the parents actually talking to each other more
02:20:46.820 Even though Amanda, what's her name?
02:20:51.240 She killed herself.
02:20:52.080 Anyway, her mom earlier in this clip said, I wish I would have talked to Amanda more.
02:20:56.800 That's an actual real solution.
02:20:59.300 But she's like, no, no, no.
02:21:00.540 The onus shouldn't be on the kids and the parents talking to each other and actually
02:21:04.900 trying to cultivate a healthy relationship that's offline.
02:21:08.840 No, the government should do everything.
02:21:10.800 The government should babysit everybody and solve all of these fucking problems.
02:21:14.280 It's the most misguided slop.
02:21:18.100 It's the most misguided Ottawa bureaucracy slop.
02:21:23.280 Fuck.
02:21:24.620 Raisa, thank you.
02:21:25.560 And really, thank you to all of you for starting our discussion
02:21:28.280 as we look into online harms and cyberbullying in this town hall.
02:21:32.440 Evan Balgord of the Canadian Anti-Hate Network.
02:21:34.980 Thank you.
02:21:35.420 Amanda Arella, YWCA.
02:21:37.340 Thank you.
02:21:37.940 Raisa Amani, Children First Canada.
02:21:40.520 Josie Nepinak of the Native Women's Association.
02:21:42.460 and Callie Mettler of Ottawa Capital Pride.
02:21:45.600 Thank you.
02:21:46.120 Thank you for this discussion.
02:21:51.760 Point fire says the undateables.
02:21:55.200 That's funny.
02:21:56.540 That's funny.
02:21:57.220 o7s to you, sir.
02:21:58.240 That's funny.
02:21:58.740 You got me good there.
02:21:59.960 Watch.
02:22:01.660 The undateables.
02:22:05.400 Cry bullying.
02:22:06.820 Sorry, cyber bullying town hall.
02:22:08.780 So we have just talked about the problem, who is being affected by online harm and cyberbullying.
02:22:15.700 Well, let's switch the discussion now and focus on solutions, what we might be able to do about this.
02:22:21.760 And to take part in this discussion, we're now joined by Matthew Johnson.
02:22:26.180 Oh, my God. What? Oh, whoa.
02:22:30.200 Whoa. This is what we're dealing with, folks.
02:22:33.280 Let's talk about solution. Cut to.
02:22:37.040 What's going on with that?
02:22:38.780 what is going on there chat what am i looking at what the hell is going on here
02:22:48.300 what is that what is that this is the this person has the solutions this person has the solution
02:22:58.760 to online bullying this is the person that's going to solve our problems for online bullying
02:23:03.720 I'm terrified
02:23:05.640 I'm terrified
02:23:07.760 You're in good hands Canada
02:23:09.500 You're in good hands
02:23:10.800 Look how intense the strap
02:23:14.000 On that mask is too
02:23:15.160 Nothing's getting through there
02:23:17.400 Nothing's getting through there
02:23:20.120 And we're going to protect
02:23:22.020 You from online harms don't be afraid
02:23:24.220 I'm afraid
02:23:25.340 Someone says S&M dude
02:23:27.580 Yeah
02:23:28.680 This is only a few of the new panelists
02:23:31.620 On this round I'm kind of afraid to see
02:23:33.700 what else if that's that's the first people we're seeing here director of education with MediaSmarts
02:23:38.520 Anaïs Bussières McNicoll lawyer and director of fundamental freedoms with the Canadian Civil
02:23:43.340 Liberties Association Jaden Braves CEO and founder of Young Politicians of Canada we reacted
02:23:50.500 to this guy once on stream it's cringe he supports bill c63 he doesn't even criticize
02:23:55.160 the fact that kids spend 10 hours a day online he's just like yeah we just spend that much time
02:24:00.960 online and that's it and the government's going to take care of us Anuradha Dugal executive director
02:24:06.120 of Women's Shelters Canada and Cynthia Khoo a technology and human rights lawyer hello to all
02:24:10.760 of you like really a mask on both sides let's begin with the premise if someone is being cyber
02:24:15.300 bullied and abused online what tools are actually in place to that they might access and i'll begin
02:24:20.660 with you matthew there's absolutely nothing there's nothing we need the government to do something
02:24:24.880 about it that's going to be the answer because i'm wondering what resources media speed this up a bit
02:24:28.980 That's available for someone to understand what they can actually do if they're being victimized.
02:24:32.840 We have resources that are based on the research we've been conducting for almost 25 years that help to, first of all, prevent cyberbullying by...
02:24:42.840 Guys, it's 2024. It's 2024 right now.
02:24:47.980 And I'm supposed to listen to this guy and take him seriously?
02:24:53.060 What the...
02:24:54.840 Welcome to Ottawa, folks.
02:24:56.340 like come on come on this is supposed to be an snl sketch what is this this is real life
02:25:05.380 allowing teachers and parents to help kids develop the essential skills they need
02:25:10.980 to manage conflict online to manage their own emotions and to take effective action when they
02:25:16.700 witness cyberbullying and these also provide witness i'm a witness i'm a witness to the
02:25:23.160 cyberbullying. Tools for kids to use when they experience cyberbullying themselves. They provide
02:25:29.680 strategies for dealing with it and options for reporting it and in particular for reaching out
02:25:35.620 to different sources of help and support because we know that really that is what's most important
02:25:40.460 for young people is feeling that they have support, feeling that they're not alone.
02:25:45.000 Feeling that they're not alone. So how important is parental involvement in that fight?
02:25:48.940 It's tremendously important. We found throughout our research that having a connection between parents and kids, having an open conversation and having rules in place in the home, not necessarily.
02:26:00.840 Wow, I'm actually agreeing with this mask hole right now.
02:26:04.300 He's actually making more sense, which is kids should talk to their parents about this and just have an ongoing conversation about how you deal with the Internet.
02:26:15.840 rules that are based on punishment, but rules that are establishing routines, that are establishing
02:26:20.940 values, and in particular, establishing that kids can come and talk to their parents anytime
02:26:26.320 something goes wrong, have a huge impact on how kids behave when they're online and on their
02:26:32.020 experience when bad things happen to them, including cyberbullying. You know, Jaden, I think
02:26:37.100 you are the youngest panelist being involved in this town hall. So as you hear Matthew talk about
02:26:41.800 the types of resources that are available, is that even talked about among young people? Do they
02:26:45.820 know or are they sharing information with each other yeah thank you Michael I
02:26:49.900 I think as a 16 year old and somebody that's advocating heavily for this work
02:26:53.980 on the internet you just have to look at the facts you have to talk to some young
02:26:57.400 people and see what's actually going on on the day-to-day on the internet and at
02:27:01.000 the end of the day there aren't resources you scroll on your phone you
02:27:04.400 listen to what MPs had to say a little bit earlier but you can see there aren't
02:27:08.260 phone numbers to call there aren't emails to type and the normalization of
02:27:12.340 of exchanging nudes or pornographic content
02:27:15.440 or hatred that's totally seen as acceptable
02:27:18.700 amongst people my age, amongst people younger,
02:27:20.740 amongst people that I see in grade five or six
02:27:22.820 getting access to cell phones.
02:27:24.640 There's no front door you can close, as Amanda said.
02:27:26.720 There's no limitation to what you can see.
02:27:29.320 The first thing you see on the morning is your cell phone.
02:27:31.760 That's exactly what you're looking at,
02:27:33.280 what you're checking for, what you're being stimulated by.
02:27:35.520 And to put it on parents and put the owner...
02:27:37.780 This is just such like tyranny porn.
02:27:40.180 This is like tyrant porn right here.
02:27:42.080 I mean, look, you can just go online and you can see anything you want.
02:27:47.560 Yeah, it's like that's the whole freedom of it.
02:27:49.840 Like that's the whole freedom of the Internet.
02:27:52.080 I can go to the URL.
02:27:53.280 I can go to Google.
02:27:54.020 I can type in anything I fucking want.
02:27:57.360 And Google will show me.
02:28:00.160 And sometimes it's going to offend people.
02:28:02.740 And that doesn't matter because it is what I want to see or I want to interact with.
02:28:08.960 Oh, but you see a comment that you don't like?
02:28:10.960 well you can just hit x you can hit the x button look for the x on your screen and you can go away
02:28:19.420 it's magic i know i know it's very powerful technology but i'm telling you you can walk
02:28:25.540 away from your phone listen to me son you can walk away from the phone you can put the phone down
02:28:32.140 i swear to you it works and this guy's just like well you can see anything online we're on the
02:28:39.100 phones all the time what are we going to do about this and the tyrants are like that's right that's
02:28:42.680 right that's right we can't let these people have all these freedoms it's just you know what this is
02:28:49.240 this whole this whole uh town hall is just like horror horrible foreplay it's just been the worst
02:28:57.860 foreplay to set us up to get fucked by Arif Virani and Bill C-63 but it's not good foreplay it's almost
02:29:05.600 like it's some really slick business guy with greasy hair and he's he's giving us feeding us
02:29:11.360 all these lines that don't make any sense you know he's buying us all these drinks and it's like this
02:29:16.040 guy's kind of a fucking creep it doesn't really make any sense like he clearly he's clearly wants
02:29:21.760 to fuck me over uh but he's telling me like he's making it seem like he's not about to fuck me
02:29:27.240 over he's making it seem like it's a good thing and it's like uh no um you can pay for the check
02:29:33.100 I'm getting the fuck out of here.
02:29:37.600 Are you sure this young kid is not related to Trudeau?
02:29:39.960 Yeah, they made him in a lab, actually.
02:29:42.260 They're cloning Trudeau.
02:29:43.500 This is the latest Trudeau model.
02:29:46.420 So, yeah.
02:29:47.500 Great question.
02:29:48.780 Onus on parents is also really difficult
02:29:49.960 because you alienate certain students
02:29:51.140 that don't have parents that maybe are as attentive.
02:29:55.000 And if you can't standardize
02:29:56.580 the way that young people are receiving information,
02:29:58.820 then you really have a lack of fairness
02:30:01.500 throughout young people.
02:30:03.040 what the fuck standardize the way that kids receive information look at look at this
02:30:08.380 what's going on here too what is that guys what is this can we get a physiognomy check on this
02:30:22.140 just like look that's wild is that am i crazy
02:30:24.760 maybe i'm looking into it too much the way the chin is sticking out there
02:30:30.600 okay bro okay bro i i feel bad for that man's that young man's soul he's only 16
02:30:43.980 he's getting corrupted into the world of politics and just spouting off nonsense
02:30:48.900 like he's he's gonna probably be dangerous when he grows up but it's like dude you are
02:30:52.800 you are just corrupting your soul with total nonsense okay perhaps we can get into that a
02:30:58.360 little bit later but I'm wondering in terms of women's shelters the the
02:31:01.960 supports that are available to to women who step forward to say that they've
02:31:05.260 been victimized by online harms are those available so shelters have told us
02:31:09.520 95% of shelter workers have said that the violence that women experience
02:31:13.480 domestic violence gender-based violence intimate partner violence is exacerbated
02:31:17.920 by online harassment threats and online monitoring and surveillance these are
02:31:22.300 the three top forms of violence that they're experiencing and so women's
02:31:26.260 Shelters Canada has a program that is training shelter workers so that they know how to respond
02:31:30.740 when they receive that information so they know how to help women protect themselves so that
02:31:35.860 women think about the location tracking apps on their phone or they think about what kind of
02:31:40.100 information is being shared by their apps that they're maybe not aware of. So hold on this this
02:31:45.060 might be the actual nugget of valid concern that they're mixing in with everything else because
02:31:52.620 um you know jilted ex-boyfriends that are like trying to hunt down their exes
02:31:57.440 that that is something that is fucked up and does and could happen um but once again not nothing
02:32:05.500 really to do with bill c63 and in in my experience researching it so this is very interesting that
02:32:12.020 they have this woman in the mix when they come into the shelter or even the um type of
02:32:16.660 communication that they have to have with somebody who may be an abusive partner around taking care
02:32:21.360 of children or following legal proceedings they it's not someone you can block in these cases
02:32:27.480 so what we're trying to do is say that when you have to have a relationship with somebody who is
02:32:32.360 abusive you know they're abusive there have to be protections in place so that that relationship
02:32:36.940 doesn't extend to you feeling unsafe online because of the behavior of that particular what
02:32:42.340 the fuck what what the fuck what does one have to do with the other you you have an abusive partner
02:32:49.480 and we have to make sure
02:32:51.620 this doesn't go into the online world?
02:32:54.540 What?
02:32:56.920 Chat, does this make any sense?
02:32:59.760 Because I feel like, you know,
02:33:01.240 domestic abuse and like being
02:33:02.880 in an abusive relationship that like that's,
02:33:04.980 yeah, that's a genuine concern for people
02:33:06.820 that should be addressed.
02:33:08.680 What does this have to do with Bill C-63?
02:33:11.300 What does this have to do with like,
02:33:12.680 you know, the physical abuse
02:33:16.040 obviously is the worst.
02:33:19.480 that's a crime what does this have to do with like stopping hate online it's very unclear
02:33:25.060 instructions unclear killer abuser or their family or their friends because people will also
02:33:29.940 pull others into the online abuse so abuse online harms you know i wonder cynthia does this qualify
02:33:36.160 as a human rights issue are there laws that people can actually use to fight back if they are victims
02:33:40.900 of this type of abuse yes it absolutely is a human rights issue so we heard from the panel earlier
02:33:45.960 that they're total see what they're doing is they're mixing fucking everything together like
02:33:50.180 i said they take the one genuine thing that's already criminal and then they mix it together
02:33:55.380 with like other activity on the internet as if it's all the same thing disingenuous bullshit
02:34:01.980 online abuse amounts to chilling of people's freedom of expression so they will self-censor
02:34:06.760 they will withdraw from online spaces and withdraw from public participation which is
02:34:10.800 crucial for having a healthy
02:34:12.980 open and free democracy
02:34:14.160 she is not good at her
02:34:17.200 job she's not good at selling this at all
02:34:19.320 happy robot says
02:34:21.200 why is she wearing a mask
02:34:22.380 because she's
02:34:27.020 Asian I don't know I don't
02:34:29.080 know someone else is wearing a mask
02:34:30.800 see this other guy whoops this other guy
02:34:32.960 two of them
02:34:34.520 John Saul says these masks really have
02:34:39.000 to go how old is she I don't know
02:34:40.700 I don't know this is this is this is a mess though and a lot of people when we talk about
02:34:46.000 regulating the internet regulating online hate immediately see it as a freedom of expression
02:34:49.140 issue but it's just yeah it is it's as much if not more so a right to equality issue because if
02:34:54.760 a right to equality issue I'm going to slow this down because this is one of like the most
02:35:01.120 bullshit arguments where they're like no no no no silencing hate will like empower people to have
02:35:08.060 equal conversation that's the argument so let's sorry we'll listen the last 30 seconds again of
02:35:14.700 this uh this mask this ninja back if they are victims of this type of abuse yes it absolutely
02:35:21.900 is a human rights issue so we heard from the panel earlier that online abuse amounts to chilling of
02:35:28.940 people's freedom of expression so they will self-censor they will withdraw from online
02:35:33.500 spaces and withdraw from public participation which is uh crucial for having a healthy open
02:35:40.000 and free democracy and honestly like this is sad she's she is so bad at making this argument that
02:35:47.980 i can't even react to it properly like she's she's failing she's failing at making the point so much
02:35:54.800 that i don't even have anything to react to i can't i can't even clip this because you're not
02:35:59.380 even presenting the argument properly no no but seriously and and like people like are afraid to
02:36:04.740 like participate online so actually that's like the thing that's like against the online abuse
02:36:08.020 take the mask off take the mask off do yourself do us all a favor charade it's crazy a lot of
02:36:16.580 people when we talk about regulating the internet regulating online hate immediately see it as a
02:36:20.680 freedom of expression issue but it's just as much if not more so a right to equality issue because
02:36:25.940 if it's a right to equality issue just it i i've been eating word salad all night just bring on
02:36:36.660 more of it it's a right to equality issue historically marginalized groups are systematically
02:36:41.700 targeted every time they speak out then they don't have freedom of expression because what
02:36:48.180 does it mean to be able to speak freely if every single time you dare open your mouth you're hit
02:36:52.420 with a wall of hate speech and rape threats and death threats whoa whoa whoa that's correct like
02:36:59.620 they paint such an insane it's actually a hilariously cartoonish like image of the internet
02:37:05.480 where like any minority who dares tweet they're just gonna get a tidal wave of rape threats and
02:37:13.600 death threats and white supremacists in their comments i wish i wish it was like that it is not
02:37:22.220 like that that's a crazy way to paint the internet they're making like every web page
02:37:28.900 every tweet seem like 4chan's poll like the most like toxic anonymous uh offensive place on the
02:37:37.020 internet which is um i mean that's what they need to try and make their argument right in terms of
02:37:43.240 laws to address it there are some for example there are criminal offenses such as stalking
02:37:49.220 cyber or otherwise or for criminal harassment and there are also laws that
02:37:54.580 they all know the same thing fast unlimited fire
02:37:59.140 was an ad sorry i didn't switch it back
02:38:03.060 lol it's hilarious the same people saying words have consequences say they shouldn't have
02:38:07.060 consequences when they speak same people saying words have consequences say they shouldn't have
02:38:15.140 consequences when they speak is that what you meant oh guys i forgot to plug it by the way
02:38:22.660 christmas time is around the corner okay and you got to get your shopping done right
02:38:28.020 well here's an idea you can dedicate a donation to savefreespeech.ca and you can get
02:38:38.820 a gift certificate and then you can give it to a fellow patriot that's right if you go to
02:38:45.400 savefreespeech.ca slash gift you can download this certificate you can print it out fill it in
02:38:55.520 and give it to your patriotic friend if they're you know if they're a fan of mine or whatever
02:39:01.880 you know all you got to do is you go to the page you go to our give send go page you know you make
02:39:06.720 a donation to the documentary to savefreespeech.ca and then boom you go back over to savefreespeech.ca
02:39:14.980 slash gift download the certificate all you need is a printer all you need is a printer
02:39:20.940 you know put in the person's name that you're dedicating it to and then there you go you got
02:39:25.060 a nice crisp a very thoughtful i think christmas gift for uh for your patriotic friend and plus
02:39:31.780 you're helping save free speech in canada you know your gift will help broadcast the stories
02:39:36.380 of persecuted Canadians across the country
02:39:38.960 safeguarding our fundamental right
02:39:41.320 to freedom of expression.
02:39:44.320 I mean, what better gift is there
02:39:47.400 than the gift of free speech?
02:39:49.940 But yeah, you can go to savefreespeech.ca
02:39:52.540 slash gift for all that information.
02:39:54.960 And of course, you can always donate
02:39:56.640 at givesendgo.com slash savefreespeech.
02:39:59.580 Get your Christmas shopping done now.
02:40:02.400 And actually, if you want, if you want,
02:40:04.280 But if you donate over $25 to the GiveSendGo, I will make you a personal cameo video for you or for a friend as a Christmas gift, whatever.
02:40:17.340 Just put cameo in the donation comment.
02:40:23.420 You can just put cameo.
02:40:24.500 Don't worry about it.
02:40:25.920 And I will email you.
02:40:27.500 I'll get in touch with you and I'll help make a little short video for you or your friend or whatever you'd like.
02:40:32.660 Because it's the Christmas season.
02:40:34.120 it's the season of giving and i really appreciate um you know patriots like yourself who are
02:40:39.440 watching and i want to support this mission to tell the story of what's happening in this country
02:40:44.480 to try and turn it turn it turn it around so uh people like this can kind of be put in their place
02:40:53.500 and be humiliated and exposed for their nonsense um yeah let's keep going because what does it
02:41:02.780 mean to be able to speak freely if every single time you dare open your mouth you're hit with a
02:41:07.080 wall of hate speech and rape threats and death threats in terms of like she did it so perfectly
02:41:15.760 there of you're hit with a wall of hate speech and rape threats and death threats as if that's
02:41:22.220 all it's all like all one thing it's all the same uh every every time a minority comments on the
02:41:29.940 internet. They're hit with a wall of rape threats, death threats, crazy. Laws to address it. There are
02:41:36.460 some, for example, there are criminal offenses such as stalking, cyber or otherwise, or for criminal
02:41:42.620 harassment. And there are also laws that will let people sue individuals if they happen to know who
02:41:47.720 they are, if they aren't anonymous for privacy invasion or online harassment, for example.
02:41:52.940 But these laws leave a lot to be desired when it comes to online abuse, because one, they don't
02:41:58.660 cover huge swaths of behavior that would constitute online abuse two a lot of
02:42:04.720 victims research has shown don't actually prefer to engage with the law
02:42:09.460 because they themselves have had bad experiences or over criminalized so they
02:42:13.120 prefer non-legal recourse that's available but the third and one of the
02:42:18.660 most major factors is that these laws don't address the platforms themselves
02:42:23.420 themselves and as we heard earlier their business models are optimized like she's got such a crazy
02:42:29.800 look oh my gosh to help this problem proliferate and we know this because previous reporting and
02:42:37.300 whistleblowers last thing you see before being dragged to a human rights court isn't that kind
02:42:46.260 of terrifying though like am i crazy like if i were to show this image to a child i feel like
02:42:52.220 they'd be kind of terrified you know what i mean like that's kind of terrifying a little bit
02:43:00.280 that's kind of spooky i feel like that's a i feel like that's like the the villain
02:43:09.580 in some sort of movie i've seen before
02:43:11.540 what movie is this from
02:43:15.200 mortal kombat
02:43:19.180 yeah true have explained that even when at meta at facebook at youtube their own employees raise
02:43:28.400 these problems point out that we our systems are facilitating this kind of online abuse
02:43:33.720 and here are some things here are some tweaks we could make to fix that those initiatives get
02:43:38.900 shut down internally because higher level management knows that it's going to hit their
02:43:43.480 bottom line so let me pick up on that and bring you into the conversation and he's you know when
02:43:48.580 We're getting into the big-headedness of the Canadian government,
02:43:52.660 where they think they're going to step into Silicon Valley
02:43:54.920 and change the way that Facebook operates.
02:43:59.100 To think about the laws that are available, limitations otherwise,
02:44:03.240 do they go far enough?
02:44:05.460 It's a very interesting question, and my answer to that is that
02:44:08.860 they could do a much better job, and there's room for improvement.
02:44:12.980 As Cynthia mentioned, legislation and recourses exist,
02:44:17.020 but it's sometimes very difficult for people to access it and one very big flaw of this type of
02:44:23.400 legislation and recourses is that they place a very heavy burden on the shoulders of the alleged
02:44:27.780 victim to carry the lawsuit to establish that there was defamation to make a defamation now
02:44:36.020 we're talking about defamation something that's already something that people already litigate
02:44:42.200 against each other complaining to the police if they believe that a criminal conduct a criminal
02:44:49.440 offense thanks for hanging out sandy thank you and has been committed um that being said i also
02:44:56.280 don't think that we can legislate our way completely out of this issue i think that
02:45:02.100 prevention education community supports those are all very important tools that we will need
02:45:09.460 if we are to effectively fight online abuse.
02:45:14.320 So I'm wondering then, when you consider what's been shared here,
02:45:17.860 and I'll go down this and I'll ask for a very quick answer from all of you.
02:45:22.780 What can we do to make what exists better?
02:45:27.960 I think...
02:45:28.700 Bill C-63. Let's hear it.
02:45:31.180 A really important thing is to educate people about the tools that are available to them.
02:45:39.460 And that education isn't just going to be about using those tools, it's going to be about understanding the purpose of those tools.
02:45:49.420 So we know, for instance, when we look at the law against sharing intimate images, which is a very strong law here in Canada,
02:45:57.460 knowing that that law exists in our research did not mean kids were less likely to share sexts.
02:46:05.360 The kids who were aware that that law existed were not any less likely to share sexts than those that didn't know about it.
02:46:13.140 And that is because...
02:46:14.040 To share sexts? Like, what?
02:46:19.300 Because they didn't feel it was likely to be enforced.
02:46:23.820 They didn't feel that it was likely to be used as a consequence.
02:46:29.860 and also because they felt they, in many cases, were justified in what they were doing.
02:46:36.580 And so in the same way that household rules that parents put in place have to be about communicating values
02:46:44.160 as much as about laying down the law, in the same way our approach to laws has to be about educating people
02:46:52.060 about why these laws exist and helping people who are targets of online harms see the law
02:47:00.840 as something that helps them rather than something that they that is going to further victimize them
02:47:06.580 i think a priority would be to
02:47:09.240 we got still so much more to go through i was just going to say that that guy
02:47:15.200 was makes the most sense but like he still doesn't make any sense you know what i'm saying
02:47:20.300 yeah word salad man exactly um like he like he's the closest to the person who i would actually
02:47:26.320 agree with because he's saying we need to educate people uh parents need to talk to kids about this
02:47:33.000 like that's that's the only valid thing that i'm seeing from this from this uh entire panel
02:47:38.280 this woman who talks about actual domestic abuse that's another thing where it's like okay well
02:47:45.080 that's that's that's a real thing that's definitely worth addressing which i feel like we already have
02:47:49.440 laws about i don't even really understand why she's here but no what victims need uh is it to
02:47:56.160 be compensated for what they're they're suffered uh is it uh for the abuser to be sent away uh to
02:48:04.640 to to a prison or is it uh for the speech to be taken down very quickly so and then we have to
02:48:12.640 ask ourselves who is in the best position and to for instance take down the speech and of course
02:48:18.240 i'm sure we'll have the opportunity to discuss this but now we also have to balance this with
02:48:22.640 freedom of expression i'll ask each one of you to to answer as well just this is really bad this is
02:48:29.520 like not pointed or brief i don't know if you organize this but it's not good not good content
02:48:35.200 right here it's probably not very good not very useful stuff a bit shorter it's like like this
02:48:40.640 that woman who just spoke did not sound very prepared like it sounds like she just woke up
02:48:44.480 and was told to be at this town hall about cyberbullying sure so one thing we can do to alleviate
02:48:51.360 some of the suffering given the current status quo is to make sure that all possible frontline
02:48:56.720 contacts that a victim of online abuse might come into contact with is actually educated on this
02:49:01.840 issue so whether that's teachers police officers social workers research has shown that people can
02:49:08.160 get re-traumatized by not the initial what happened but by when they reach out for help
02:49:15.440 the help being unresponsive or not understanding or itself being re-traumatizing so that's something
02:49:20.720 before we even get to the actual root problem so you hear that guys re-traumatized that's a new word
02:49:26.960 um we need to educate we need to educate people to be more sensitive to
02:49:33.440 online hate we need to educate canadian uh police officers emotional damage yeah that's right
02:49:44.560 emotional damage where is it
02:49:48.480 emotional damage
02:49:53.140 oh god just to offer greater support to those experiencing it is the first step
02:50:01.200 So the violence and abuse through technology that people are being affected by is really stopping them from getting access to basic rights.
02:50:13.840 They can't use, let's say, online banking if they're using their smartphone.
02:50:17.980 That might be a method of multiplying or amplifying abuse, whether it's direct messaging.
02:50:23.520 It might be the way that their smartphone, the phones, the messages that they get, the way that they're interacting online.
02:50:32.140 And that's a basic right because they should be able to use their smartphone to find housing if they need it.
02:50:36.760 They should be able to use it to find employment that they need.
02:50:39.780 So it's not realistic to ask people to stay offline or to minimize their interaction with online activity.
02:50:49.580 So we need to get away from that kind of thinking and we need to move into absolutely believing that this sort of abuse is happening, understanding the danger of it and having recourse that, as we've said, is not always about the law or the police or bringing charges, but having recourse that makes the abuser responsible to stop in a way that doesn't necessarily make the person who's experiencing the abuse,
02:51:18.460 the person has to push for that all the time so i i would say we have to believe people have
02:51:24.740 experienced this a lot more and take action immediately and that includes bringing in
02:51:29.740 tech companies to take material down jaden i mean the spooky thing about what you just said is like
02:51:35.760 we need to believe we need to believe women we need to believe women more often and it's been
02:51:43.500 brought up with bill c63 many different aspects of it it could easily be abused by people easily
02:51:50.680 be abused by people they called me this they're assaulting me they did this and then reporting
02:51:55.460 them anonymously that would be a nightmare so this idea of just believing people uh yeah no
02:52:02.780 and what i really don't like about what she said there is she was kind of like mixing it in
02:52:07.840 with like the emotions that you're feeling and it's again it's
02:52:12.180 like there needs to be the line between criminal behavior and violence and things that are just
02:52:19.720 annoying and making you emotional like if we don't keep that line intact then we're going down a bad
02:52:25.660 road bad bad road let's see what the young guy has to say i think it's about escalation i think
02:52:33.480 it's about looking around at what the original motivators are and figuring out how we could
02:52:37.980 stop them before that happens so for one education can we standardize the fact that online and the
02:52:44.940 internet in general is going to be a really big part of growing up in 2024 you absolutely have
02:52:50.540 to be aware of how to use that responsibly uh starting from elementary school but i think on
02:52:56.220 the legislative side and that's a lot of the work that we've done when sitting down with uh mps
02:53:01.020 working to legislate the internet we have to look at what people have commonalities on what can we
02:53:07.500 all agree on what can everybody have a shared experience on and that experience is frequently
02:53:13.820 there is hate on the internet we do see it people are impacted by it what are the solutions
02:53:18.780 uh the first section of bill c63 is there it is hey ding ding ding ding ding let's go
02:53:27.180 bill c63 mentioned 56 minutes in that's kind of surprising
02:53:31.580 but uh sorry there it is surprise i i i bet the hosts are going to be like wow bill c63 i've
02:53:41.620 never heard of that on the legislative side and that's a lot of the work that we've done
02:53:45.160 when sitting down with uh mps working to legislate the internet we have to look at
02:53:51.020 what people have commonalities on what can we all agree on what can everybody have the shared
02:53:55.520 experience on and that experience is frequently there is hate on the internet we do see it people
02:54:00.860 are impacted by it what are the solutions the first section of bill c63 is a really great
02:54:05.080 example of how we can break down the vulnerabilities of who's out there and who's actually being
02:54:08.680 impacted and what are the what are some of the first steps that we can take to really make sure
02:54:12.660 people are safe on the internet but turn it around right this this is this is so contradictory he's
02:54:18.980 like we have to accept the reality there is hate on the internet yeah there is that's part of the
02:54:24.600 part of the human beings the internet is a reflection of humanity so there's going to be
02:54:29.440 ugly stuff there's going to be beautiful stuff everything in between and now it's straight to
02:54:33.600 well we have to keep people safe i've heard that one before i've heard that one before
02:54:40.460 i remember lockdowns i remember being ostracized from my family i remember not being allowed to go
02:54:48.180 to the gym it was to keep people safe i remember being dehumanized by the prime minister of the
02:54:53.960 country it was to keep people safe so i remember not being allowed to leave the country or get on
02:55:02.240 a train or a plane that was to keep people safe too but this internet legislation it's going to
02:55:07.240 keep people safe but this time this time but this time it's going to be a good thing it's totally
02:55:14.000 not going to be a tyranny thing totally not going to be a no no no no no none of that you mentioned
02:55:20.580 Young people around Canada don't understand that what they're doing is illegal.
02:55:24.440 They don't even understand that there are repercussions.
02:55:26.060 Whoa, whoa, whoa, what's illegal? What?
02:55:29.120 Frequently, there aren't, and there isn't people that are going to turn around
02:55:31.640 and create consequences for actions on the Internet.
02:55:35.280 So it's a really big concern and something we all need to kind of wake up to.
02:55:38.680 Where did illegal come from?
02:55:40.300 The first section of Bill C-63 is a really great example
02:55:43.600 of how we can break down the vulnerabilities of who's out there
02:55:45.840 and who's actually being impacted and what are some of the first steps
02:55:47.440 that we can take to really make sure people are safe on the Internet.
02:55:49.840 But turn it around, right?
02:55:51.800 You mentioned young people around Canada don't understand that what they're doing is illegal.
02:55:57.380 They don't even understand that there are repercussions because quite frequently there aren't.
02:56:00.940 And there isn't people that are going to turn around and create consequences for actions on the Internet.
02:56:07.400 What's illegal?
02:56:07.720 It's a really big concern.
02:56:10.180 What are you talking about?
02:56:12.420 We all need to kind of wake up to, in part, as a society to take action on.
02:56:15.500 okay well we'll pick up on the online harms bill a little bit later in this town hall but
02:56:20.040 uh he's like you gave away the secret bud you gave it away too soon you know we we don't talk
02:56:27.120 about the online harms bill yet pipsqueak i've been doing this for a while i've been doing this
02:56:32.380 word salad game for a while kid don't fuck it up for me okay have heard the criticism that big tech
02:56:39.080 and social media platforms are not doing enough to address online harms and cyber bullying and
02:56:44.540 In this fall, we heard more Ontario school boards stepping forward.
02:56:47.860 They are suing social media platforms like Snapchat, Instagram, Facebook, TikTok for harming, in their point of view, the mental health of children.
02:56:58.160 Now, we did speak to representatives of big tech here in Canada.
02:57:01.540 They talked to us about what they were doing to address online harm and cyberbullying.
02:57:05.660 Take a listen.
02:57:06.220 every piece of content uploaded to tiktok is is moderated and reviewed either by machine
02:57:13.020 or by humans or both and where there's when there is a violation that is removed the user
02:57:19.640 who posted it will receive a notification they have the opportunity to appeal that
02:57:24.400 we publish quarterly community guidelines enforcement reports that outline the speed
02:57:30.700 in which we're able to remove violative videos and we break that down actually by country
02:57:35.480 and by the type of policy violation.
02:57:39.100 Depending on the type of violation and repeated violations,
02:57:42.880 they could have their account or their device banned from TikTok.
02:57:46.440 We work with governments around the world.
02:57:48.260 We really see this as a shared challenge and a shared responsibility.
02:57:53.840 And so, you know, we would do a number of things to work with governments,
02:57:58.780 whether that's on, you know, partnering on the disinformation or misinformation landscape,
02:58:04.700 making commitments around how we're going to safeguard our platforms, having transparency,
02:58:10.800 not only about how our products and tools work, but how they're performing. And making that
02:58:17.840 publicly available. So we're on this kind of continuous journey of improvement and of
02:58:23.520 partnership. People who are indicating to us that they are of a certain age, that are kind of,
02:58:30.980 you're in your teen years, that we actually default the accounts into the most privacy
02:58:36.660 protective settings, meaning that you actually aren't able to or an adult can't interact
02:58:43.840 with you unless you're already connected with them because they're a parent or a family
02:58:48.200 friend. If they're unknown to you, they can't kind of automatically message you. We want
02:58:52.420 to make sure that certain types of content on our platforms are not recommended to you
02:58:58.020 if you're of a certain age and we also ensure that we give you certain tools to ensure that
02:59:04.580 you're able to actually control your experience. So for example you can block out certain keywords
02:59:09.460 in terms of messages and posts that people leave for you so that those things actually don't appear
02:59:14.580 if someone tries to write something that is not nice, that's negative. And we've invested a lot
02:59:21.380 in technology and resources to address child sexual exploitation on the service
02:59:27.060 just to give you a sense of the impact and the data. In 2022, we suspended about 2.3 million
02:59:34.420 accounts on our child sexual exploitation policies. In 2023, we suspended 12.4 million accounts,
02:59:41.060 becoming way more aggressive on people that were seeking this content, trying to traffic in this
02:59:45.220 content. We've forged new and exciting partnerships to share information across platforms to get these
02:59:49.140 bad actors off our services and partnering closely with government on this and making
02:59:52.500 So job safety is certainly top priority for us and just making sure it's a safe service for minors.
03:00:03.520 Interesting. I mean, they were pretty extensive there in terms of different tools they have.
03:00:07.540 Hey, you can block people. Hey, you can restrict certain comments if you don't want to hear them.
03:00:12.080 Hey, we're going after the pedophiles. I mean, I'm critical of big tech, obviously.
03:00:18.620 I've been banned and censored in various ones,
03:00:22.640 had my content taken down off of most of them, I'd say.
03:00:25.940 But this idea that the government's going to do a better job
03:00:30.040 than paid professionals, executives in Silicon Valley,
03:00:34.320 that is just totally insane.
03:00:36.100 That's an insane proposition.
03:00:38.420 But it's going to be interesting to see if anyone on this panel
03:00:41.420 actually addresses what they just heard
03:00:43.540 in terms of the sort of infrastructure that already exists at Big Tech.
03:00:47.820 Okay, so listening to that, I'll begin with you, Matthew. Are big tech companies doing enough
03:00:54.440 to protect users? I think there are definitely positive steps that they've taken. I think there
03:00:59.700 are certainly more things they can be doing. It's been really encouraging for us to see them doing
03:01:05.380 some things that the youth in our research asked for, particularly things like creating a safer
03:01:11.100 experience mo c nine m says how about an iq requirement to access lol that's pretty funny
03:01:18.860 matrix world report is that trans in chat he says my name is trans and i approve of greg's live
03:01:27.820 stream thanks for hanging out buddy go subscribe to trans matrix world report and chat it's a good
03:01:35.580 guy 14 users because the youth in our research said that you're not an adult the moment you turn
03:01:41.400 13 there's a tremendous amount of growth that happens between 13 and 18 and so there are
03:01:46.480 definitely steps that online platforms can do to minimize and mitigate online conflict
03:01:53.520 things to encourage empathy things to discourage people from encourage empathy
03:01:58.840 you want facebook to encourage empathy they're they're trying to set up an algorithm to keep
03:02:06.180 me hooked to their phone to my phone it's just it's like bureaucrats such misguided bureaucrats
03:02:14.240 and facebook should do this and youtube should do this and i wish they would do this just like
03:02:18.840 this huge wish list responding too quickly things that they can do to change their algorithms so
03:02:24.660 are less likely to share or recommend polarizing content and and and this is and this is the
03:02:34.980 insanity of it you know like i can agree that big tech algorithms are very kind of insidious
03:02:43.220 and sneaky in the way in which they try to get you hooked on the algorithm but it's like
03:02:47.960 so you a bureaucrat are going to try and influence the algorithm
03:02:53.400 i don't know it just kind of seems it's an insane proposition to me
03:02:59.920 and generally things that they can do for a couple reasons number one it's like where do
03:03:05.540 you get off being a bureaucrat in ottawa canada thinking that we should control the algorithms
03:03:10.600 i thought they already did that with bill c11 by the way and then number two it's um
03:03:14.860 the idea that a bureaucrat is going to like hey hey Facebook this huge massive corporation let
03:03:21.160 me get in there and tinker with how you how how the core of your business works okay do in that
03:03:26.940 way to change particularly the default settings of their platforms so that they are encouraging
03:03:33.460 positive use so that they're encouraging empathetic use you know what I'd like I'd
03:03:38.180 like to control the algorithms to advocate for patriotic content, content like myself,
03:03:45.220 Bernier, get some Diagolon content, some, some Hillier content. You must see it. You must watch
03:03:52.280 it. It's going to be in your feed. It's mandatory viewing. The rage cast with Jeremy McKenzie is
03:03:59.080 mandatory viewing. That's what I want on my algorithms in my country mandatory. But the
03:04:07.120 point is is like this and this is not going to be biased it's not going to be
03:04:11.500 politically biased and so that they're minimizing conflict and particularly
03:04:20.320 retributive cyber bullying because we know among youth which are common
03:04:24.040 causes the most common motivations for cyber bullying are getting back at
03:04:27.960 someone else and so it's interrupting that cycle of cyber bullying that where
03:04:32.760 I think platforms in their default settings and in their basic design have a tremendous role they could still be playing.
03:04:40.120 Jayden, what do you say?
03:04:41.180 As someone who's in the front lines of this one with young people, what would you say to whether or not big tech is doing enough?
03:04:47.560 Google sold out millions of young Canadians and young people internationally under 18 by allowing Meta to promote downloading Instagram on YouTube,
03:04:57.400 which significantly is getting into people under 18, is being put in front of them.
03:05:02.620 And those are the same companies that you just watched saying there are these measures.
03:05:05.660 So when there isn't the government regulation that's stopping this and people aren't waking
03:05:09.660 up to this is happening, organizations, big tech is taking advantage of millions of young
03:05:15.720 people vulnerable and on the internet, whether they claim something else, it's what's happening.
03:05:19.080 Look at the facts, see what's going on, and then say, are these measures actually doing
03:05:22.800 anything or is this virtue signaling to try to get a response to the public and to make
03:05:25.920 us not aware of what's actually happening?
03:05:28.120 Great.
03:05:28.380 I'm so glad you brought that up.
03:05:29.660 So this kid is smart.
03:05:30.660 this guy this guy does expose the hypocrisy saying hey look at big tech they say that they're going
03:05:36.760 to protect kids online but they're also trying to get them to download instagram on youtube okay
03:05:42.460 young man surely you can appreciate um the liberal party of canada and how they promote
03:05:49.940 hey the guy promoting bill c63 arif virani here's him at the pride parade and he says the online
03:05:56.960 Harms Act will protect kids from sexual content. He's at the Pride Parade. This is what happens at
03:06:02.540 the Pride Parade. Things like this. This is a naked guy with the Bugs Bunny costume on. There's been
03:06:10.120 all ages pride events in Canada where there's nudity, where there's pornographic material.
03:06:18.840 There's Drag Time Story Hour for kids. There's books in public schools with sexual pornographic
03:06:26.580 content. This is all rubber stamped. This is all endorsed and approved, not criticized, not condemned
03:06:32.620 by the Liberal Party of Canada. So smart young man, do you really think Mr. Arif Virani wants
03:06:40.100 to protect kids from sexual exploitation when the Liberal Party rubber stamps all of this same stuff,
03:06:46.220 all of the same sexual material? You just expose the hypocrisy of big tech. Why won't you expose
03:06:52.940 the hypocrisy of the liberal party it's almost like there is a political agenda at play and
03:06:58.600 you're just a pawn saying the things trying to be a good young little politician to push this
03:07:05.600 bullshit legislation why would you not how can he see the hypocrisy in big tech but not see hypocrisy
03:07:10.640 in the liberal party it's very interesting i need something even at this point you know i and you
03:07:16.660 said earlier that the laws can't do everything but should there be higher guardrails i think
03:07:22.040 there's a possibility and there's certainly room to ask social media operators to have to fulfill
03:07:27.920 some statutory duties so for instance having a process in place through which users can flag
03:07:32.640 categories of harmful content they already have that I think would be beneficial then even for
03:07:38.400 categories of extremely harmful content that for instance content that sexualizes a minor I think
03:07:43.400 it's it's acceptable to have even a specific time frame for social media operators to process those
03:07:48.060 flags. I do think, however, that we have to avoid asking either explicitly or implicitly
03:07:54.900 through laws to social media operators to proactively monitor, surveil, and take down
03:07:59.520 speech. Although, you know, Cynthia. Oh, my God, she's actually standing up for speech. Is that
03:08:04.680 really? My goodness. Oh, my God, she's doing it. She's standing up for speech.
03:08:10.320 Where are those images? What? I guess it's on the other one. Hold on.
03:08:14.420 I'll let it keep playing because that raises an interesting question, though, because if we're having a debate as to how to move forward here, do big tech companies have to be more accountable?
03:08:26.240 They absolutely do. And to be completely honest, it's a little bit difficult to take the statements that we've heard seriously by this point in time, because right now we're 5, 10, 15 years, almost two decades into having social media as a daily part of our lives.
03:08:38.580 and these are multi-billion multinational companies that have arguably shaped elections
03:08:43.460 contributed to genocides and but this is the one place where their hands are tied and they can't do
03:08:48.180 anything about it and so when did they say they can't do anything about it they were just talking
03:08:54.580 about all the different methodologies by which that they they are doing things about it this
03:09:00.420 so disingenuous these people even when they try we see deficiencies there too so even though they
03:09:05.700 say oh we are addressing hate speech online they it seems to be very selective application of who
03:09:11.060 gets thrown the book at them and who is treated with leniency for example women are constantly
03:09:15.940 reporting how x formerly twitter has routinely ignored when they report misogynistic abuse um
03:09:22.980 human rights watch recently published man i get like this is like being in hell this video is
03:09:29.860 like being in hell dude there's no details they're never specific with details she equates
03:09:37.360 misogynistic abusive content with rape threats with death threats like is it is this a offensive
03:09:43.580 meme that's like anti-women or like makes fun of women because that's free speech making fun of
03:09:50.480 women is is a thing that you're allowed to do but you just this keeps kind of just grouping it all
03:09:56.760 together is the same thing to report talking about how meta has been systemically censoring
03:10:00.560 and suppressing pro-palestinian voices on their platform and yet they are not addressing um
03:10:06.900 the really overt low-hanging fruit instances of hate speech i think the bigger question like
03:10:12.340 no examples no examples they have no examples it's just so annoying like i'm trying i'm trying
03:10:19.660 to give them the benefit of the doubt like show me show me a strong argument but it's just word
03:10:26.480 salad after word salad and though is that even if troglodyte said the earth is flat and so is her
03:10:32.800 face they were doing everything right we are still then how dare you make a comment like that about
03:10:41.040 cynthia khoo huh there it is and on the goodwill of a handful of ceos in the world to control our
03:10:53.040 online environments and that's just not oh yeah and you want a handful of bureaucrats to do that
03:10:58.560 really do you not see the hypocrisy in your argument there do we really want a handful
03:11:04.640 of tech ceos controlling our online environment no we want a handful of bureaucrats doing that
03:11:10.160 yeah that'd be much better because the bureaucrats in canada have a good track record
03:11:15.520 not a tenable system for a healthy online environment and a healthy free and open
03:11:19.200 democracy because we've seen what happened with X
03:11:21.300 just due to the one ownership change and
03:11:23.200 it already had such a huge impact and so that's why
03:11:25.340 even though regular again what do you mean like
03:11:27.200 this stuff is only
03:11:29.420 for people who are far lapped
03:11:31.280 into the CBC rabbit hole already
03:11:33.240 because it's like you already
03:11:35.240 saw what happened with X
03:11:36.880 should we look it up
03:11:39.120 X
03:11:40.880 Elon Musk
03:11:42.340 hateful content
03:11:44.540 let's
03:11:47.380 see what the let's see what the
03:11:48.660 um, let's see what the normies are reading these days when it
03:11:52.080 comes to X and Elon Musk.
03:11:57.180 Um,
03:11:57.980 Jesus. Data shows X is suspending far fewer users for hate
03:12:10.860 speech.
03:12:14.180 Okay.
03:12:14.700 Elon Musk's X is policing harmful content as
03:12:20.460 scrutiny of the platform grows. That was from September.
03:12:28.900 Interesting. Elon Musk's new X algorithm is harming our free speech.
03:12:34.020 like all privately owned news
03:12:45.560 outlets ethics take a backseat
03:12:48.140 to profit in determining what content is
03:12:50.040 delivered
03:12:50.440 this is so this is it's so
03:12:58.200 funny it's like hey if there's mean
03:13:00.100 people memeing me and making fun of me I can't
03:13:02.220 have free speech anymore and it's like regulation legislation that's a you problem you know
03:13:10.180 is not the only solution i think it definitely has to be part of the solution okay part of the
03:13:15.980 solution is just saying we are going to talk a bit about the regulation and the regulatory sphere
03:13:18.920 in a moment but you know in terms of as things are right now is there a role hey i just thought
03:13:24.900 of something fun i just thought of something fun i'm going to post the link to this video in the
03:13:29.140 chat everybody go give it a down vote right now i want to see those down votes up there's only
03:13:37.200 four down votes right now i have a plug-in where you can see how many down votes there are
03:13:42.040 everyone go down vote this video right now please obviously keep the tab open keep the live stream
03:13:50.180 open go over and um hit the down vote please hit the down vote on this video whoops yes we're at
03:14:03.060 10 down votes already let's go is this one updating no it doesn't we're already at 10
03:14:09.280 down votes i like it thank you we're down voting the video we're changing the future of the country
03:14:15.060 right now. Let's go. I'm going to hit refresh again. It's at
03:14:22.000 10 guys. Come on. Come on. Let's change that ratio. Let's
03:14:26.900 flip it. Go downvote it. Done. Okay. Sorry. Getting
03:14:36.640 distracted here. Back into the word salad. For education,
03:14:42.200 for example, for women, their families, their children who are victims of this type of abuse
03:14:47.040 to know a bit more as to how they might protect themselves?
03:14:49.500 I think there's a role for education, but I would just point out some of the important
03:14:54.280 triggers within the speeches that we just heard from big tech companies.
03:14:58.460 They talked about protecting their own companies, not protecting the people on their websites,
03:15:02.500 and they talked about protecting those who they deem worthy of respect and protection
03:15:06.020 such as minors.
03:15:07.100 They didn't necessarily talk about racialized women or immigrants or people who receive,
03:15:11.820 again have to use their tech to stay in touch with people and be connected to
03:15:15.420 the world the wider world and those and they also didn't take their cues from
03:15:20.220 survivors from people who've said you know what it's not necessarily just the
03:15:23.800 hate speech that I am concerned about it's that just to give you an example
03:15:27.120 somebody might have said to them when I send you a bunch of red roses that's the
03:15:32.160 day I'm gonna kill you and they'll use their harassment product by sending
03:15:36.540 pictures of red roses on their direct messages or online that's not hate
03:15:41.060 speech but that's harassment and that's stalking and it's usually happening privately which is
03:15:45.620 again something that the tech companies will not take on they have not said that they're going to
03:15:49.940 make any changes there the tools that we do offer at women's shelters canada include i mean this
03:15:56.420 woman is talking about a real like interesting angle of all this but again it's not at all
03:16:02.660 addressed in bill c63 private communications aren't touched i don't think they should touch
03:16:07.220 private communications um when i had my conversation with wandsbutter nicholas wandsbutter
03:16:12.820 lawyer in ontario here he was talking about if they actually cared about this issue they would
03:16:18.500 just increase the amount of resources going into law enforcement to uh have consequences for people
03:16:25.380 who are doing this type of thing when it goes when it goes into the realm of criminal behavior that
03:16:30.180 affects real life real life right but this is all really an agenda to try and just censor people
03:16:36.660 um on the internet so uh this is completely different so again this this person's kind of
03:16:43.540 she's talking about valid things but she's muddying the waters to make it seem like bill c63 will
03:16:48.100 address this stuff which is which it does not do things like the digital safe um breakup tool so
03:16:54.180 when you're breaking up from a relationship there's all kinds of ways that your tech might
03:16:58.020 be connected to somebody else's whether it's passwords whether it's access whether it's
03:17:02.020 different ways that you've sort of built a way of combining your tech usage,
03:17:07.960 the tool offers very clear advice on how to separate that
03:17:11.760 and then how to keep yourself safe afterwards.
03:17:14.580 So two-factor authentication, changing all your passwords,
03:17:17.700 turning off your location, apps, all of those things were important steps.
03:17:21.860 And yet, just about women's safety,
03:17:24.000 turning off your location may mean that also you wouldn't be able to be found
03:17:27.780 if you were in a dangerous situation.
03:17:29.440 So women have to choose. Am I safe in this situation or not safe in this situation? Do I have my location on or my location off? Once again, it's back to individuals to make decisions to keep themselves safe and make them responsible, which is fine up until a point. And everybody should be educated on how to use tech safely. But it's not fine when that's all the responsibility we see.
03:17:50.100 Yeah, so this is, again, she's talking about a real valid issue, which is, you know, domestic abuse and how it may get sort of communicated or amplified, like, you know, via technology.
03:18:05.040 But this idea that Facebook is going to or Twitter is going to prevent domestic abuse.
03:18:14.020 You know what I mean?
03:18:15.320 It's going to somehow prevent that.
03:18:17.360 that's the job of law enforcement that's not the job of big tech to prevent domestic abuse
03:18:23.520 from happening like that's a crazy crazy like crazy um sort of misguided to have this as part
03:18:32.300 of the conversation or at least like it's a separate conversation uh from what everyone
03:18:36.640 else is having in terms of centering the internet but thank you for that now you know we've already
03:18:41.020 touched on right the regulatory sphere and in the united kingdom in 2023 the online safety act
03:18:45.740 became law. And part of that law obliges platforms to have systems. Hold on, hold on, hold on.
03:18:49.920 Them responsible when that's all they're talking about? The regulatory sphere. And in the United
03:18:55.000 Kingdom in 2023, the Online Safety Act became law. And part of that law obliges platforms to
03:19:01.800 have systems and processes in place in order to protect against harmful content. Now, the European
03:19:07.660 Commission adopted the Digital Services Act to further protect online users. And take a listen
03:19:13.980 now to two people who are instrumental in getting that legislation passed wow i can't believe they're
03:19:20.000 invoking the online harms bill in the uk where protesters have been thrown in jail
03:19:27.700 for for for protesting for having an anti-immigrant sentiment they've been thrown in jail
03:19:34.980 with this online harms bill legislation i.e the worst case scenario for people like us of hey is
03:19:42.180 this online harms act going to be used to throw people in jail
03:19:44.620 who have the wrong opinion on immigration? Well, we've seen
03:19:47.420 that. Yes, that is going to be the case in the UK. But now
03:19:51.320 they're saying, no, no, no. Let's talk to the legislators
03:19:53.820 who helped pass this bill and why it's so great. So
03:19:57.400 completely obfuscating the problems of all of this, which
03:20:01.820 are the violations of free speech. And they're doing it
03:20:05.440 flagrantly at this point. The first thing that we did was we
03:20:09.340 put in. I'm not even going to say anything. And data protection for children and people don't
03:20:25.280 always understand how making children more private makes them safer. Fundamentally you're taking them
03:20:31.900 out of the business model. You're saying you may not have our children's attention you know on this
03:20:37.080 basis. The second thing we've done is introduced the Online Safety Act, and that gets very specific
03:20:44.580 about what content you can recommend and can't recommend to children under 18. So we are age
03:20:51.980 gating pornography. We are saying that self-harm and suicide ideation is not suitable for under
03:21:00.880 18s and and so we have made a move on that and we've given a very big suite of powers to a
03:21:10.140 regulator to get information and in particular to get information for the coroner if a child
03:21:18.280 has died so we've seen these horrible horrible situations all over the world where children
03:21:24.680 have got into, been pushed into a sort of a state of despair that is so great that they take their
03:21:34.220 own life. And we now in England have the right to look and see what was the company recommending?
03:21:41.280 What was the company suggesting to the child? Because it's not neutral. It's not.
03:21:48.920 Again, this feels so totally misguided.
03:21:54.680 We already saw Amanda Todd's mom say,
03:21:57.200 I wish I would have talked to my daughter more before she killed herself.
03:22:03.000 And now they're saying, hey, once the kid kills himself,
03:22:06.620 we can look at what their browser history was.
03:22:10.400 How does that help?
03:22:12.780 How is that helping?
03:22:15.800 Like they've shown a very,
03:22:19.080 like everyone on this panel has shown a very strong lack of willingness
03:22:22.600 to actually talk about the emotional sort of real,
03:22:28.140 the real solution when it comes to people feeling isolated,
03:22:32.020 people feeling disconnected,
03:22:33.420 people not having real life conversations
03:22:35.720 about how to deal with the internet.
03:22:39.020 No, it's just been like, I've internalized the trauma.
03:22:41.460 I've been re-traumatized.
03:22:43.240 I kept looking at the hateful comments
03:22:45.220 and I panicked and I freaked out
03:22:46.680 and then people killed themselves.
03:22:48.520 And now we have to censor the internet.
03:22:50.320 Now we have to control the internet.
03:22:51.440 now we have to get all these these facilitations and power as she said this woman said it she her
03:22:57.960 eyes lit up she's like now we have the power to see what the kids looked at before they killed
03:23:03.020 themselves completely missing the mark in terms of solving the problem and even the solution misses
03:23:09.100 the mark on solving the problem you want to collect data after the fact after a kid killed
03:23:14.200 themselves what is that solving because you've you've because even if you do collect that data
03:23:19.260 You've shown a lack of willingness to solve the problem at its root, which is a child who is isolated, a child who doesn't have a good relationship with their parents, a family, like a community family problem that happens in real life.
03:23:36.680 You've shown no willingness to address that crucial part of the problem.
03:23:41.140 So why the fuck should anyone believe that you collecting data after the fact is actually going to be applied in a positive direction?
03:23:49.960 And as I said before,
03:23:51.280 they're not even looking at the free speech violations,
03:23:53.800 the egregious ones of this online harms bill in the UK,
03:23:57.140 which is just like a huge red flag to begin with.
03:23:59.980 Whoa,
03:24:00.620 Caesar with a $10 donation.
03:24:02.440 Thank you so much,
03:24:03.340 sir,
03:24:03.520 for the super chat.
03:24:04.780 He says,
03:24:05.260 Santa dress up.
03:24:06.560 When I actually do have some Santa stuff.
03:24:09.360 I should,
03:24:09.980 I should get some Santa stuff and put it in the background.
03:24:12.580 Hey,
03:24:12.740 damn,
03:24:13.320 I'll do that for next stream.
03:24:14.660 There's going to be other streams this week,
03:24:16.220 by the way,
03:24:16.680 we're going to be talking to musical artists
03:24:19.640 we're going to be talking to
03:24:21.460 Wandsbutter, we're going to be talking to
03:24:23.540 Nicholas Lawyer, Nicholas Wandsbutter
03:24:25.700 on Thursday
03:24:26.480 and if all goes well, there's going to be a preview
03:24:29.840 for the documentary
03:24:31.520 released this week, I'm going
03:24:33.600 through a lot of the amazing interviews
03:24:35.740 we've got, so you're going to see some of the people that
03:24:37.580 we're talking to in this documentary
03:24:39.380 and yeah, if you didn't know
03:24:41.320 if you
03:24:43.800 do want to help fight against all this
03:24:45.400 bullshit and stand up for the politically persecuted people in canada and expose the
03:24:51.780 canadian anti-hate network some of the people behind bill c63 then please support our documentary
03:24:57.300 it's at givesendgo.com slash save free speech we have hired an award-winning documentary an
03:25:03.660 award-winning filmmaker he's won a guinness world record okay he's going to help this get
03:25:09.440 distributed internationally it's going to be professional it's going to reach out of our
03:25:13.540 echo chamber and tell our story because that's what we need no one is fighting for us okay but
03:25:20.020 this piece of media this documentary will have a good shelf life it'll be created it'll be there
03:25:25.140 you can send people the link and be like this is what's happening in canada this is what's
03:25:29.900 actually happening this is the corruption this is the lies this is the deception oh look they
03:25:36.320 all happen to be these far left tyrants and they all come back to this ideology and this is how
03:25:42.120 they operate they operate by smearing right-wingers and it's very dishonest they work with the media
03:25:46.680 they work with antifa who are violent who are hateful and they justify hate all that stuff
03:25:53.560 it's going to be summarized in the documentary go to givesendgo.com slash save free speech
03:25:58.180 link is in the description or sorry link is in the description it's also in the chat as well
03:26:02.560 and um whoa we got a big donation from d turner with a hundred dollars keep up the great work
03:26:14.620 greg thank you so so much that is amazing and uh yeah hey another thing i just got reminded
03:26:21.780 it's the holidays it's christmas time you need a gift it's last second it's just around the
03:26:29.580 corner we actually have a solution for you here at savefreespeech.ca all you got to do is donate
03:26:36.360 to the GiveSendGo then go to our website savefreespeech.ca and you can download this
03:26:42.140 certificate this certificate of donation and say hey insert name of patriot friend
03:26:48.420 on behalf of you who donated i donated this many dollars to savefreespeech.ca
03:26:54.740 and you are helping broadcast
03:26:56.920 the stories of persecuted Canadians
03:26:58.980 across the country, safeguarding
03:27:01.060 our fundamental right to freedom of
03:27:02.980 expression, your country and fellow patriots
03:27:04.900 thank you, so you can go to
03:27:07.200 savefreespeech.ca
03:27:08.480 slash gift
03:27:09.760 and you can download that certificate
03:27:12.960 print it out in PDF, all you need is
03:27:14.940 a printer guys, all you need is a printer
03:27:16.580 and then send a donation to the GiveSendGo
03:27:19.060 and yeah, let's make it happen
03:27:21.060 let's make it happen, I think it's a thoughtful gift
03:27:22.920 um it's gonna be you're getting in here in here at the ground floor here you know you can have
03:27:29.140 the certificate to be like hey i stood up for canada i supported that documentary it's gonna
03:27:35.820 be a banger it's gonna be a banger but um yeah so go on and check that out and uh it's the season
03:27:44.180 of giving you know it's the season of giving so it's a good last minute shopping idea all right
03:27:50.500 let's get back into this video. We're almost, we've got another 20 minutes. Okay. Okay. Okay.
03:27:58.700 Okay. Neutral what they're doing in Europe. Now we have the law, which means that the commission
03:28:06.180 can say, uh, to the platform, uh, the big ones, uh, and we have 19 big platforms and systems
03:28:13.920 which have more than 45 million users a month that's about a parameter so now
03:28:20.760 big change is that the Commission who is the enforcer of the digital services act
03:28:24.600 can say look you did not take measures which really mitigate the risks which
03:28:32.880 protect the children and which create the the system where nobody will be
03:28:38.220 endangered it's first and foremost the duty of the parents to be oh so it is on the parents
03:28:46.500 aware of their what their children are doing what they read and where they disappear when they
03:28:51.820 go online and uh no regulation uh will uh how to say deprive the parents of their parental
03:29:01.600 responsibility that's their primary task in the eu we not really a great argument for what these
03:29:08.000 people are talking about simply decided to introduce legally binding rules and
03:29:13.100 it was also because of the appearance of smaller systems you mentioned Google
03:29:18.320 meta and others yes the biggest ones had agreement with us but there were more
03:29:22.520 and more platforms and digital systems which were not part of the voluntary
03:29:28.340 commitments so that's why at the end of the day the big platforms said introduce
03:29:35.060 the rules which will cover all of us so this is what we did now here in canada the online harms
03:29:41.100 bill bill c63 is stalled in parliament i love how they just like hey look at these bureaucrats they were
03:29:45.960 successful in usurping power online let's do that here let's do that here and i i bet they won't
03:29:53.920 even address the fact that that one uh bureaucrat was like hey it's up to the parents you know at
03:29:58.840 the end of the day i thought they won't even mention that parliament and critics have pointed
03:30:03.680 to some concerns that it might limit free speech. Others are pointing to concerns about how much it
03:30:08.340 would cost to set up and monitor hate speech online, online abuse. And recently I did sit
03:30:13.100 down with the federal justice minister, Arif Virani. We talked about the bill and also what
03:30:16.740 his government is willing to do to get it passed. Listen, let's begin with the timing here because
03:30:21.700 it took your government years to introduce a bill to address online abuse. And only after
03:30:25.800 months of delay in the house, you announced that you will be splitting the bill in order to get
03:30:29.460 moving quickly why did it take so long to get to this point well i'd say to you that we consulted
03:30:35.460 extensively on the bill because we wanted to get it right we looked at domestic expertise we looked
03:30:39.860 at international comparators that took time years in fact and we tabled a very comprehensive bill in
03:30:45.140 february in the spring we started debate on this bill and i thought that was a promising start
03:30:49.780 after labor day we returned to the house of commons we also recommenced debate on this bill
03:30:55.060 why did you split the bill arif why did you split the bill answer the question i was encouraged by
03:31:01.300 that as well what i've unfortunately seen for the last three months is a complete blockage
03:31:05.540 of the chamber that is right behind me and that's been frustrating it's been frustrating to me
03:31:09.300 as a parliamentarian it's been frustrating to me as a father and someone who cares about my kids
03:31:13.940 as minister of justice who's put a lot of effort and thought into this very piece of legislation
03:31:18.900 now we're approaching christmas i had a tough decision to make do i want to see this bill
03:31:22.820 die on the order paper or do i want to see kill it kill bill c63 the parts of this bill expedited
03:31:30.260 where i believe that we can reach some consensus i chose the latter i think it's an important
03:31:34.180 decision because ultimately my job is to protect canadians and i'm going to target the most
03:31:38.820 vulnerable canadians which are our kids and i approach that from that vein that protecting
03:31:43.620 and combating hate against hate is important but protecting kids including kids who might face fatal
03:31:49.940 dangers is urgent. And I'm proceeding with urgency on the parts of this bill where I believe I can
03:31:55.820 achieve a consensus. So he got caught lying for those who aren't aware. Arif Virani said this bill
03:32:02.020 is all about protecting kids. How could you not want to protect kids? And now that people saw,
03:32:07.040 no, this is a Trojan horse. There's a whole bunch of free speech implications here. You want to
03:32:11.420 change the criminal code for hate speech. You want to bring back section 13 of the human rights code,
03:32:17.920 which is a total mess and it got kicked it got removed from law for a reason and now that he's
03:32:24.200 been caught red-handed lying and saying that it's all about protecting children he's like uh you know
03:32:29.260 what yeah let's split it up uh you were kind of right part of it is totally just about stopping
03:32:36.580 hate and changing those criminal code stuff and no no but this side this side's actually about
03:32:41.860 online harms this one's actually this side's actually about online harms and this one's
03:32:46.260 actually about protecting children online but uh that's also a lie so he told the lie he got caught
03:32:51.320 and now he's telling the same lie again because if you actually break down the trojan horse part
03:32:55.960 of the bill there's not really much in it uh other than an insane amount of power that unelected
03:33:03.000 bureaucrats would have over the canadian internet but let's let's listen to him lie more okay so you
03:33:08.620 you think you can achieve the consensus and we look at this the bill c63 the online harms act
03:33:13.800 again will now consist of two parts one part will deal with keeping children safe online which
03:33:18.580 includes uh sexploitation pornography the second part relates to the criminal also bullying
03:33:23.880 bullying is on there bullying is an online harm that the canadian government is going to control
03:33:31.300 apparently criminal code and human rights act which includes the controversial section on free
03:33:37.140 speech given that you have split the bill and the priority is the section on child safety do you
03:33:43.180 think you will be able to pass the second part of this, which you have said would give Canadians
03:33:48.520 a recourse on discriminatory hate speech? Well, I'm certainly hopeful, Michael, and I want Canadians
03:33:53.620 to understand that when you bifurcate the bill, we are not leaving one part aside. We will still
03:34:00.280 be pursuing both pieces of legislation. We'll be pursuing the portions that deal with addressing
03:34:04.940 children with prioritization, working very, very quickly on those. But the portions that relate to
03:34:10.580 amending the criminal code and amending the Canadian Human Rights Act remain on the floor of the
03:34:14.700 legislature for debate and further study. I think what is also clear to all Canadians is that we
03:34:19.780 have seen rising hatred in this country. There are a lot of ideas about what we should do in terms of
03:34:24.180 how to address that hatred. We've heard people that support the bill. We've heard people that
03:34:27.940 want the bill refined to provide more clarity. Also, people want to throw it out like myself.
03:34:34.320 I want to throw it out. Don't forget me. Hey, Arif. Hey. Hey, bud. Don't forget me. I started a
03:34:39.320 whole website just dedicated to you buddy all ears and willing to entertain good faith suggestions
03:34:44.420 to improving the hatred components of this bill but also to improving the portions that deal with
03:34:49.760 combating child sex predation and also things like bullying inducing a child to self-harm
03:34:54.780 and the sharing of what's called revenge porn these are critical things that help keep our
03:34:59.280 kids safe i think what's now isn't it great that everything that these people have been talking
03:35:04.880 about the in the panel oh my god arif our hero he's addressing all of them isn't that amazing
03:35:11.600 oh he's he's he's our hero here is that the ball is in the court of my opposition parliamentarian
03:35:17.600 colleagues i've already had good indications from the bloc québécois the ndp the conservative
03:35:22.800 party has unfortunately been blocking everything in the chamber right behind me for three months
03:35:26.960 now but i've been given some reason for optimism that even the conservatives will get behind this
03:35:31.760 bill because they appreciate the urgency and appreciate the need to proceed in a nonpartisan
03:35:36.680 manner okay so so perhaps a political path forward but you know i also wonder about yeah
03:35:41.080 nonpartisan manner in which you will have the power to censor conservative voices
03:35:45.980 on the canadian internet if if the conservatives fall for this or support this oh my god dude
03:35:54.760 oh no but this but this half this half is really about protecting kids online it's total fucking
03:36:00.880 bullshit it's totally going to be weaponized uh to censor people they don't like like the idea of
03:36:06.520 giving the government that much power over the canadian all content on the canadian internet is
03:36:10.900 just insane um with bill c18 the online news act you can't even get news on meta platforms or on
03:36:20.580 google anymore because the government's like hey we don't understand the internet and we want to
03:36:27.200 charge big tech platforms for sharing news, even though they did that for free already.
03:36:32.740 And then big tech said, hey, Canada, you have a small market anyway, so we're not even going
03:36:36.580 to share news anymore. And the same thing might happen with Bill C-63, because they have these
03:36:41.980 massive financial penalties for big tech if big tech doesn't obey Arif Virani and the Liberal
03:36:48.340 Party. And I think big tech might say, hey, this financial risk is not worth it. We're going to
03:36:54.820 actually just block the use of Facebook and Instagram in Canada because your
03:37:02.680 bill sucks that could easily happen you're what your government is doing to
03:37:07.120 ensure big tech social media platforms are putting in place the guardrails they
03:37:11.500 promise to prevent or reduce online abuse well I'll say to you Michael that
03:37:15.660 it's been frustrating because what we've seen is a checkerboard response right
03:37:18.520 we've seen some companies taking pretty aggressive steps others taking very few
03:37:23.380 steps what we're trying to do is eliminate that checkerboard provide
03:37:26.440 Canadians with the stability and understanding that regardless of which
03:37:29.560 platform your child is on which app that they are using for example that this
03:37:33.520 will address the safety of children online across the board and set a new
03:37:37.540 floor what I've been encouraged by Michael has actually been
03:37:41.160 the response of social media companies thus far as opposed to rolling up the
03:37:45.260 sleeves gloves off getting ready for battle it has been quite the opposite
03:37:48.580 social media companies have been reaching out to me in my office
03:37:51.260 considering ways of how they can collaborate with us
03:37:53.680 in order to get this across the finish line.
03:37:55.780 I think it's helped by the fact that we are the sixth or the seventh
03:37:58.400 democracy in the Western world that is moving in this area.
03:38:01.400 They've already seen what other international allies have done in this area.
03:38:04.700 They've realized that more needs to be done to protect children
03:38:06.840 around the world and in Canada,
03:38:08.740 and they're willing to work with us in that regard.
03:38:10.920 Yeah, and the UK is already arresting people
03:38:13.820 who oppose immigration at protests.
03:38:16.160 This bill's already been abused to target conservative individuals
03:38:20.180 for simply just sharing their political views uh it's very funny when they invoke international
03:38:25.600 stuff when it's like i don't think you want to go there bud i don't think you want to go there
03:38:31.120 because my suspicions my concerns about bill c63 are being confirmed and validated in the uk
03:38:41.800 with the online harms bill which is a carbon copy or very similar to your online harms act
03:38:47.640 so great example yeah the uk proves why this bill would be a censorship bill targeting specifically
03:38:55.620 conservative-minded citizens and censoring them and throwing them in jail if the criminal code
03:39:01.960 changes were to take effect now you know i do wonder in the time we have left i wonder
03:39:08.160 what in your opinion will happen if bill c63 does not pass before the house is dissolved and
03:39:15.500 another election takes place we will be celebrating in the street we will be posting our memes we'll
03:39:22.060 be posting our racism on the internet for all to see and celebrating and we will be happy to be
03:39:29.060 unmolested by the government imposing their power across the internet that's what we're going to be
03:39:34.260 doing when it when it gets stopped by this documentary yes well if you take your cues
03:39:42.700 from what the official opposition has presented as their proposed alternative uh there is some
03:39:49.120 consideration about criminal code reform but that's it and i think that's problematic because
03:39:53.500 what i've heard from law enforcement and what i've heard from parents including people like carol
03:39:57.660 todd the mother of amanda todd is that the victimization of her daughter still occurs
03:40:03.020 10 years after her death yeah amanda todd her mother also said that she wishes she would have
03:40:09.100 had a conversation more often with her daughter and that's a part of the conversation that you
03:40:14.080 arif virani do not even want to have you just want to add a whole bunch of rules and gain a bunch of
03:40:21.500 power over the canadian internet so you're very disingenuous reason why is because images of
03:40:27.460 amanda's naked body continue to circulate online 10 years after she passed by suicide what she
03:40:33.920 wants and what the police have told me that they want to deliver is for those images to simply come
03:40:38.340 down that is possible like this bro it's crazy bro it's copying like i don't want to be that guy
03:40:44.960 but like you just copy and paste it and then you upload it again it's just whack-a-mole you know
03:40:51.320 like like people like we know internet anyone who uses the internet knows this but he's like once we
03:40:57.340 pass this bill then no copy and paste copy and paste is going to be disabled on all of your
03:41:02.660 devices you can no longer save a file and then re-upload it somewhere else on the internet like
03:41:07.500 are you stupid with this piece of legislation because it poses it poses duties on social
03:41:14.040 media platforms and requires them to take down child sex abuse material and revenge porn within
03:41:19.360 a 24-hour time period so i'm very confident that that's the right solution it needs to pass and
03:41:24.060 that's why i'm putting a lot of effort and political capital on the line to ensure that
03:41:28.500 by bifurcating we can proceed with speed on the parts where we can achieve a consensus i've
03:41:33.560 confidence in the better principled nature of my colleagues in the chamber from other parties
03:41:38.940 and their willingness to work in a nonpartisan way to keep kids safe because ultimately at the
03:41:43.760 end of the day lives are at stake here and that's important it's funny how he goes from saying lives
03:41:48.500 are at stake to we need to stop homophobia lives are at stake and also we need to stop the bullying
03:41:54.400 lives are at stake and also this is problematic content should we go into the uh let's go into
03:42:03.240 the box of, let's go into the, what do they call it, the toy box? Okay, this is actually
03:42:13.020 perfect. What about content like this, Arif? Child Predator got knocked out with a pumpkin
03:42:22.920 for trying to meet a little boy. Does your wife know you're out here trying to meet a
03:42:27.960 little boy right now James James I'm gonna get real loud your wife's gonna
03:42:40.920 find out or we can stop that sorry trigger warning James you said you was
03:42:50.760 gonna give him a kiss
03:42:53.300 So they like you know, they confront this pedo don't touch me your cars on a predator
03:43:02.800 Wow
03:43:04.800 I
03:43:05.800 Mean it easy. I'm assuming their information is correct. This guy's a disgusting monster
Are we allowed to hate this man for trying to entrap a child? Arif, is that content? Okay?
03:43:20.380 What's going to happen with that?
03:43:22.080 What if that was like a liberal MP?
03:43:24.960 Let's just say.
03:43:26.280 Let's just say the creep in this video is a liberal MP.
03:43:30.940 It's a staffer.
03:43:31.900 Let's say this is one of your staffers, Arif.
03:43:35.760 It's exposed that a liberal staffer is actually trying to meet a little boy.
03:43:42.680 Are you going to take that content down?
03:43:46.140 Exposing this man?
03:43:47.120 you could you could you can make a few excuses you could say hey there's violence it's advocating
03:43:54.580 for violence we'll take it down for that reason or maybe because uh some other reason you could
03:44:00.920 say it's bullying a child or maybe some other reason you could find many different reasons
03:44:05.960 to take the video down thus insulating yourself from criticism of having a pedophile on your staff
03:44:12.360 you see how this bill is very problematic
03:44:15.200 where if you give the government the power
03:44:18.080 to take down any content they want
03:44:19.660 it could be taken down to insulate high-level pedophiles
03:44:23.380 when they're getting caught
03:44:24.340 this guy's not a high-level one of course
03:44:26.160 but that is something that lawyers have considered
03:44:31.520 this is something that Runkle has brought up
Runkle of the Bailey
03:44:34.640 and of course any conversation from somebody like this
03:44:39.360 is completely disingenuous
03:44:40.700 it's clearly part of an agenda
03:44:42.060 just to push the same old censorship bullshit.
03:44:45.780 Just more liberal party nonsense.
03:44:48.340 And I wish the conservatives would just be good people
03:44:51.040 and support my bullshit.
03:44:54.220 Okay, so our thanks there to the Justice Minister.
03:44:56.520 So a lot to unpack.
03:44:57.660 Matthew, you know, as I said, this bill is currently stalled.
03:45:01.720 The government may fall before it's actually passed.
03:45:04.120 If it does not pass, is there enough in the toolbox
03:45:07.260 to address concerns about vulnerable youth,
03:45:09.880 vulnerable individuals without this being passed by parliament i couldn't say if there's enough
03:45:16.200 in the toolbox there are definitely still our tools in the toolbox um and one of the most
important of those is making sure that this conversation stays in the national conversation
that this topic is not forgotten that the idea of improving our online spaces
03:45:37.400 is not abandoned in in many cases the threat or the possibility of regulation can be as effective
03:45:46.040 or even in some cases more effective than regulation itself so i actually agree with them and
03:45:53.160 i agree with them in that what they've suggested with bill c63 in and of itself is totally
03:45:59.320 outrageous it's totally outrageous especially how they it would be the end of free speech if
03:46:05.400 if the whole thing passed as is the fact that they're even suggesting that
is despicable it's a disgrace to the freedoms of this country it would totally undermine the
freedoms that we've always enjoyed in canada and there's so little pushback there's
03:46:24.660 so little pushback there's so little like hey look at this tyranny bill they want to expand
03:46:29.860 the definition of hate speech it would be the return of section 13 which we already threw out
03:46:36.380 during the harper government this has been this is on the table in the second reading and i feel
03:46:43.240 like that is so demonstrative it's it's so representative of the authoritarian character
03:46:49.220 of the liberal government but for some reason the conservative government doesn't want to use that
03:46:55.360 line of attack nearly enough maybe they do a little bit but the reason they don't is because
they want the power they want to be the ones in power with that power to control the internet
03:47:08.120 they want their own they want they you know they want their own amount of power
03:47:12.580 they don't want to get criticized for being the pieces of shit at the trough next
03:47:16.620 who uh you know what when the conservatives are in power they don't want to be subject to
03:47:22.860 harassment subject to being made fun of online or being hated online for all the awful decisions
03:47:30.380 they're probably going to make as a conservative party, ruling conservative party. So they have
their own vested interest in the government getting more power. And that's why I don't think
03:47:38.920 they're talking a lot about how there's a tyrannical angle to this, how this could totally
03:47:43.880 be abused. And let's not forget, one of those speech laws, the Holocaust denial laws, was
03:47:49.360 initially introduced by a conservative party member so it's just full-blown authoritarian
03:47:58.220 in terms of like its character in terms of how it would be enforced and very few people are
03:48:05.620 actually talking about that not even the conservative party and it's disgusting that's
why honestly this is why i started savefreespeech.ca i saw the
03:48:15.320 writing on the wall i'm like the conservative party is not going to talk about this they're
03:48:19.440 not going to talk about how this is you know the tyranny could be the very the end of free speech
in canada who knows how dark things will get after that and that's why i
03:48:31.620 decided to do something about it and if you want to help me then help fund the documentary we're
03:48:37.260 making because we're exposing the authoritarian character of the people behind bill c63 there is
03:48:44.320 already a collection of political prisoners there's a collection of canadians who have
03:48:49.840 been persecuted for their political opinion they've been thrown in jail they've been kicked
out of their jobs they've been you know stalked online they've had their
03:49:01.000 reputations destroyed simply for saying what they believe that's not that's not something that
03:49:04.860 happens in a free country and that creeping authoritarianism is very evident especially
03:49:10.400 since the convoy. But we document all of this. We talk about Bill C-63 because we need to educate
03:49:16.520 more Canadians about this, guys. Yeah, sure. People who watch Rebel News, they might understand this,
03:49:21.820 but we need to reach more people. We need to reach more people than that. Look how professional
03:49:27.420 this demonstration has been. Look at all the resources that they're putting into this.
03:49:33.800 We need to put resources into our side. We need to put resources into our efforts,
03:49:38.820 into our messaging and into our storytelling.
03:49:42.140 And that's exactly what I'm doing
03:49:43.360 by making this documentary.
03:49:45.040 So if you want to support me
03:49:46.100 and want to help us fight back,
03:49:47.680 then go to givesendgo.com
03:49:49.320 slash savefreespeech.
03:49:51.140 Send a donation there.
03:49:53.480 I really appreciate it.
03:49:55.460 Dave Turner or D Turner,
03:49:57.000 thanks again for the $100 donation.
03:49:59.240 And you can also volley this
03:50:02.140 into a Christmas gift as well.
03:50:05.100 If you send a donation here,
03:50:06.780 that you can then go to our website,
03:50:08.120 savefreespeech.ca
03:50:09.660 slash gift
03:50:10.680 you can just click on the gift button
03:50:12.040 in the top corner
03:50:12.780 and you can download a certificate
03:50:14.700 download a certificate
03:50:16.400 dedicated to
03:50:17.380 your mom
03:50:18.360 your dad
03:50:18.880 your brother
03:50:19.280 your sister
03:50:19.840 a friend of yours
03:50:20.880 that maybe is a fan of mine
03:50:22.540 or cares about free speech
03:50:23.600 maybe you want to
03:50:24.080 like let them know
03:50:25.080 about this documentary I'm making
03:50:26.440 and because it's
03:50:29.140 you know it's hard to buy a gift
03:50:30.380 for people sometimes
03:50:31.380 people already have enough crap
03:50:33.260 why not dedicate them
03:50:35.800 and say hey
03:50:36.540 here's the gift of free speech
03:50:38.520 I thought of you I know how much you care about this country
03:50:40.860 I dedicated this
03:50:42.840 donation on your behalf
03:50:44.040 to save free speech in Canada
03:50:46.500 to the documentary and you can you know send them
03:50:48.500 any information they want to know there is going to be a preview
03:50:50.900 coming out about the
03:50:52.860 documentary later this week which I
03:50:54.900 think is going to be very hype for people
03:50:56.320 but yeah
03:50:58.340 yeah
03:51:00.660 we need to fight back guys but we're almost
03:51:02.780 through this video we're almost done here let's
03:51:04.420 let's let's let's keep it going
03:51:06.180 let's get through the rest of this word salad which significantly is getting into people under
03:51:15.900 18 is being put in front of them and those are the same companies that you just watched saying
03:51:21.380 there are these measures so when there isn't the government regulation that's stopping this
03:51:25.800 and people aren't waking up to this is happening organizations big tech is taking advantage of
03:51:33.980 millions of young people vulnerable and on the internet whether they claim something else
03:51:37.880 that's what's happening look at the facts see what's going on and and then say are these measures
03:51:43.260 actually doing anything or is this virtue signaling to try to get a response to the public
and to make us not aware of what's actually happening i heard a whole lot of nothing
from that guy that guy is like a trudeau just a whole lot of yapping a whole
lot of nothing being said oh such a lack of details throughout this entire thing it's just
03:52:03.520 it's just really despicable it's really pathetic it's i'll bring you in at this point you know
03:52:07.720 i i and you said earlier that the laws can't do everything but should there be higher guardrails
feel so bad for the Coutts men says two flips four twists absolutely absolutely i mean at one point
03:52:21.400 um at one point uh rebel news was talking about that kind of championing their cause
03:52:27.440 you know the the dust still has not settled from the trucker convoy and the establishment is
03:52:34.140 basically gaslighting the country the conservative party especially is gaslighting their base in the
03:52:39.880 country to be like no that whole authoritarian thing that never happened what do you mean
03:52:44.620 what do you mean political prisoners what do you mean people in remand for over a year rotting in
03:52:51.780 a jail cell away from their families. Don't even pay, don't pay attention to that. And it's really
03:52:59.780 insidious and subversive what this conservative party is doing. Okay. I know the guy makes good
03:53:05.460 documentaries on YouTube, Aaron Gunn. If you've heard of him, he's made great documentaries
03:53:10.920 documenting the decrepit degeneracy of, you know, the drug and the tent cities happening in Vancouver,
03:53:17.960 for example and he also made a documentary though on the end of free speech in canada
03:53:23.260 i mean it's not going to be the tier of our documentary ours is going to be much more
professional it's going to be you know much higher production quality than that
03:53:31.540 his is like more of a youtube documentary but um in this end of free speech documentary
03:53:37.800 the trucker convoy is not mentioned it's mentioned once okay
as if chris barber tamara lich jeremy mackenzie artur pawlowski all the other people who were
03:53:55.480 arrested in and around the convoy or people arrested for defying covid measures as if all
03:54:01.740 of those court cases all of those charges had nothing to do with free speech no that had nothing
03:54:10.220 what would that have to do with the end of free speech someone being thrown in jail on politically
03:54:15.100 trumped up charges i wonder oh aaron gunn the guy who made that documentary not talking about
03:54:21.540 the trucker convoy in the context of free speech yeah he is now a conservative party candidate
03:54:26.520 i wish i was making this up man okay i i wish i i wish uh i wish that i was making it up
03:54:34.980 I think there's a possibility and there's certainly room to ask social media operators to have to fulfill some statutory duties.
03:54:44.940 So, for instance, having a process in place through which users can flag.
03:54:49.940 Hold on. I just jumped the spot, right? Yeah, I did. OK, thank God.
03:54:53.900 I'm like, I have another 30 minutes. I'm like, what?
03:54:56.640 In spaces, is not abandoned.
03:55:00.060 We saw that.
03:55:01.040 Where we, as consumers, have, in many cases, something get passed.
03:55:05.920 And even if it's not the entire section of the bill and what the entirety...
03:55:11.360 So when you look, again, it's stalled, but when you look at what's in C63, does it go far enough?
03:55:16.620 Yeah, and you're absolutely right.
03:55:18.180 It has to do with when the government or the public has a reaction to a big issue that the private sector is taking on.
03:55:26.640 they do make changes in order to retain their viewing.
03:55:29.720 That's what they need, that's what they need to succeed.
03:55:32.100 And so one of the things that young politicians of Canada
03:55:34.000 put out was the mandate to have young representation
03:55:37.200 on the Digital Safety Commission.
03:55:38.400 So those are people, the most vulnerable,
03:55:41.060 that are using the internet constantly,
03:55:42.300 that are largely affected by C63.
03:55:44.720 But to actually have their voices in,
03:55:46.640 what changes could be made.
03:55:47.980 So this is not a commission where you have one person
03:55:50.460 saying, this is what can go on the internet,
03:55:52.040 this is what can't, this is what freedom of expression is,
03:55:54.220 this is what's not.
03:55:54.940 It's about democracy.
03:55:55.880 about having multiple voices and this has gotten a lot of endorsement even from from you know the
03:56:00.680 ministers and other stakeholders in government so we want to see this this debate but young people
03:56:07.320 also want to see something get passed and even if it's the entire do they do they want to see
03:56:13.800 something get passed like it's it is it does remind me of trudeau when trudeau's like canadians
03:56:21.480 want this i speak for all canadians and now this young guy is like i speak for all young people in
03:56:27.520 canada do you that's crazy talk section of the bill and what the entirety of it means
something needs to happen in this term in order to do right by canadians just as the
03:56:42.580 minister said we need i mean isn't it creepy when you have this young hey i represent young
young political group by the way i'm excited i'm saying the exact same thing as Arif Virani
03:56:55.880 and we're all saying the exact same thing like this this facade that it's like a fair
03:57:02.060 discussion and that they're actually grappling with the different issues it's such a look it's
03:57:08.760 oh man it's funny it's a it's funny that people might might actually buy into that need to have
03:57:14.680 something passed to protect the internet for young people to protect the internet
for young people uh-huh you know and he said as I said as this is stalled there are
03:57:28.920 concerns and one of them is the impact this would have on free speech can you
03:57:32.120 address that a bit because it's that is a major criticism of the current
03:57:35.680 legislation sure and I'll focus my concerns with respect to part one of the
03:57:39.160 bill so the online harms act and because of course this is a very lengthy bill
03:57:42.040 So with respect to the Online Harms Act, I think what the bill gets mostly right is very specific statutory duties on social media operators to act on very serious types of harmful content.
03:57:53.340 For instance, content that sexualizes a child, intimate content shared without consent, things like that.
03:58:00.300 Where I think that the bill needs to be improved and amended, frankly, is when it creates very broad and even vague statutory duties on social media operators.
03:58:08.840 For instance, the duty to mitigate the risks that users will be exposed to seven categories of harmful content.
03:58:16.360 What that really means, well, nobody really knows, because a lot is left to future regulations.
03:58:20.820 So I think it's really important to make sure that social media operators do not use this type of duty,
03:58:28.200 of statutory duty, and try to comply with it by engaging in proactive monitoring of speech and takedown of speech,
03:58:34.080 or by taking down content, flagged content, without even reviewing it,
or reviewing it but with the help of ai artificial intelligence and without any human
03:58:43.640 involvement or and without any necessary transparency as to the decision-making process
03:58:48.520 which can be particularly problematic when you think of categories that are a lot more subjective
03:58:53.960 in nature including hate speech so an easy hey she's our girl she actually came through with
03:58:59.960 making some decent points at the very least some of this stuff is too broad some of the stuff is
03:59:06.120 going to be figured out later, which is a problem. I wish people would bring up the bullying thing
03:59:10.780 though. They actually have in the legislation, bullying a child as content that must be removed
03:59:16.620 or that is harmful. And it's like, how do you determine if it's bullying a child? Like that's
03:59:21.440 such a broad thing. And I give the example of if I criticize something to do with the gender
03:59:28.420 ideology being taught in schools, am I invalidating the existence of trans kids? You could easily make
03:59:34.140 that argument and you could say that i'm bullying trans kids because of my content i basically got
03:59:40.220 banned on tiktok for uh for you know talking about the trans kid issue my first account at 50k
03:59:48.040 subscribers by the way rest in peace first tiktok anyway excuse the word problematic i'm wondering
03:59:54.220 from the the human rights perspective uh there's also the concern that because there is a role here
03:59:59.280 for the human rights that's so funny that that's i feel like that's so emblematic of how um the
04:00:06.000 human rights works right so this woman is like yeah there is free speech issues and it's like well
let's hear from the human rights person who's going to make up a whole bunch of emotional reasons
04:00:15.600 to justify why free speech is actually not important commission that that could
04:00:20.320 bog down the whole process can you can you talk about that cynthia absolutely so
04:00:24.320 So the problem with that particular,
04:00:27.960 I believe you're referring to part three of the legislation
04:00:30.420 is that the Canadian Human Rights Commission
04:00:33.080 is already so backlogged and this would give it
04:00:36.780 a whole additional area of responsibility.
04:00:39.820 And so not only is the concern
04:00:41.500 that it might not end up being as helpful
04:00:43.780 as we would like to see it in addressing the issues,
04:00:46.560 but another issue that people have brought up
04:00:48.500 is the possibility of it being gamed or weaponized
And it backfiring against the very point of having such a commission in addressing.
04:00:58.060 I can't believe I'm agreeing with the ninja right now.
04:01:01.120 I'm actually agreeing with the ninja, masked up ninja right now.
04:01:05.400 Roll to address hate speech.
04:01:06.300 So, for example, the same way that we've seen.
04:01:09.180 Someone says back to China.
04:01:11.400 Content moderation features being weaponized to silence women, racialized people, and historically marginalized groups.
04:01:17.340 people could raise complaints to the Human Rights Commission doing similarly
to tie up people to silence speech that they don't like to silence people who
04:01:27.840 are justifiably speaking out against harms and so we would want to make sure
04:01:32.280 that any entity that's put in place whether it's the Commission and whatever
04:01:37.740 additional staff is hired or this new digital safety Commission that they have
04:01:42.660 an explicit commitment to substantive equality which is recognizing that we
don't exist in a level playing field and so different groups if we're to be
honest about the existing landscape of society are treated differently and so
04:01:57.100 that concept has to be taken into account for there it to be effective on
04:02:00.500 the ground okay so that's hilarious because essentially she is saying listen
04:02:07.720 we have to make sure that these rules are enforced equally we have to make sure
04:02:12.060 that these rules are enforced properly yeah okay but there's no guarantee of that happening
when Arif Virani has been asked about this how do we know it's going to be enforced properly
04:02:25.580 he says don't worry it's not going to be me or justin trudeau doing it yeah it's going to be
04:02:30.640 your friends though so yeah no that is a problem and it's funny that she can recognize the problem
04:02:37.340 But also just like, yeah, can you just promise?
04:02:40.020 Can you just promise us that it's going to be enforced equally?
04:02:44.160 Naive, delusional Ottawa bureaucrats.
04:02:49.420 Liquid Gal says this BS smacks in the face of our forefathers that fought for our freedoms and speech.
04:02:55.720 Absolutely.
04:02:57.240 Absolutely.
04:02:57.880 That's what I'm saying.
04:02:58.540 This stuff is just despicable.
04:03:00.880 This country is in a bad place right now.
04:03:03.740 Nurita, you're nodding your head there to what Cynthia was saying.
04:03:06.580 So what do you think we need to see from the federal government, from regulators and lawmakers in this country?
04:03:12.980 So I agree that the regulations are one of the places where you can actually bring in some of the voices of those who are most harmed by online behavior.
04:03:22.880 And it must be trauma-informed.
04:03:24.820 That's the other thing that I think is often missing when you are looking at any form of legislation.
04:03:29.020 Of course, it's often written in very sweeping ways, but the way it's applied needs to be informed by those who experience that harm, those who are most affected by it.
04:03:38.420 And we have to have spaces to talk about that and to share that information and then apply it, implement it, and then measure it.
04:03:45.480 Where is it making a change?
04:03:46.940 How has it changed our behavior?
04:03:48.800 Who is now seeing improvements in their life because of any regulations or legislation that's brought in?
04:03:55.340 There are other telecoms companies, there are other companies working and acting online, the banks, that have a responsibility also to their own behavior online and how their applications, how their interfaces are spaces that can create harm also.
04:04:11.080 And I would like to see them included in this conversation.
04:04:13.820 This conversation, as you say, and it is just the start, so thank you to everyone involved there.
04:04:18.420 And we will continue to have the conversations in the days, weeks, months and years to come.
04:04:22.040 Now, we know that cyberbullying and online abuse is on the rise.
04:04:25.140 We also know that it is taking a toll on the mental health of Canadians.
04:04:27.900 So if anyone who is watching this, seeing this, is struggling right now as a result of online harm or cyberbullying,
04:04:33.540 a reminder that there is a national suicide crisis line to reach for help.
04:04:36.780 All you need to do is dial 988.
04:04:37.780 I'm fucking struggling.
04:04:39.020 You guys are making me want to kill myself.
04:04:40.540 And with that, I want to thank our panelists for taking part in this portion of the discussion of Matthew Johnson.
You guys are making me demoralized.
04:04:44.000 I hate my country.
04:04:44.940 I hate these communists.
04:04:46.120 Hello.
04:04:47.320 Thank you very much for being a part of this.
04:04:49.340 And in the coming months, we will be posting on CPAC.ca more information, more of these interviews on this very important subject.
04:04:54.500 Ooh.
04:04:55.140 All right, yeah, yeah, I meant to I I meant to shout out guys. We
04:05:03.140 made it through. We made it through. Let's see how we did
04:05:05.140 with dislikes. How do we do with dislikes? I asked you guys
04:05:09.140 to dislike. We're still at 10 dislikes. Okay, not bad. Not
04:05:12.140 bad. I do want to shout out real quick to Mr. Mr. I retweeted
him recently. Where is it? To Mr. Wiretap. He actually
brought my attention to this whole CPAC video I just reacted to. Shout outs to Wiretap Media
04:05:29.980 he clips some of the parts with a
04:05:32.700 Butt plug boy there. I mean Balgord. So
04:05:36.520 Yeah
04:05:38.620 Guys, thanks so much for tuning in. Thank you so much for the donations and once again
04:05:43.260 If you're looking for a last-minute Christmas gift, you can actually
04:05:47.980 Get a gift certificate or a certificate of a donation rather
04:05:51.960 to the documentary so all you have to do is number one go to the give send go page send us
04:05:58.600 a donation to support this documentary that is going to oppose all of this nonsense
and help save free speech in canada and expose these far left ideologues expose these people
04:06:11.300 over here and show them what really is going on in canada and with bill c63 because we need to
04:06:19.400 produce a professional story that people can hear that reaches outside of our echo chamber.
04:06:25.180 And that's exactly what we're doing. I've hired Steve Hanning, an award-winning
04:06:28.420 filmmaker to help us bring this documentary together. We've interviewed a couple of different
04:06:34.260 lawyers, a couple of different professors, people who have been politically persecuted in this
04:06:38.520 country for their political opinion. And we're telling the story with this documentary. You can
go to givesendgo.com slash savefreespeech to donate. And then you can go over to our website,
04:06:47.540 savefreespeech.ca slash gift click on the gift button then you can download this certificate
04:06:53.380 download it print it out and you can dedicate your donation to a loved one you can dedicate
04:06:59.320 your donation to a patriot and say hey i donated on your behalf to help save free speech in canada
04:07:05.020 and then there's another there's another gift to give somebody the gift of free speech
04:07:10.740 isn't that amazing guys thanks so much for tuning in oh man that was a long one
04:07:17.180 But I really do appreciate it and I'm gonna sing I'm gonna play a song from mr. Dirty Jirty. What should I play?
04:07:30.900 I started off with this one. I'm gonna end with this one too because it is arguing online
04:07:35.180 So I think it's appropriate. Thanks again for watching guys. Thanks so much for the super chat
04:07:40.220 Steven with and troglodyte
04:07:42.220 Wes Wayne Sand
04:07:44.020 Rotten Studios
04:07:44.980 Happy Robot
04:07:45.900 Virk Jun
04:07:47.260 Billy Bob
04:07:50.440 Liquid Gal
04:07:51.560 Thanks for hanging out
04:07:52.640 Happy Robot
04:07:53.660 Caesar
04:07:54.160 Thanks for the super chat again my friend
04:07:56.120 We will see you guys soon
04:07:59.500 I'm going to be
04:08:00.160 I'm going to be streaming later this week as well
04:08:03.380 Going to be interviewing Wandsbutter on Thursday
04:08:06.800 I might be interviewing Dirty Journey actually on Wednesday
04:08:10.680 and we're going to be doing more fundraising.
04:08:12.940 There should be hopefully a trailer
04:08:15.520 for the documentary later this week.
04:08:17.080 But guys, thanks so much for watching.
04:08:18.360 We'll talk to you soon.
04:08:19.420 Thanks again.
04:08:19.880 Comment on a pin, back it up
04:08:23.300 with the link, start threads,
butt heads push content like that.
04:08:29.300 Good fight, argue online.
04:08:34.580 Every precious test set
04:08:41.060 Notifications
04:08:43.780 Blood pressure hot
04:08:46.660 Scare my wife
04:08:47.920 Waste my life
04:08:49.400 Arguing online
04:08:51.920 I'm arguing online
04:08:56.400 I'm arguing online
04:09:00.640 I'm arguing online
04:09:04.880 Making a difference
04:09:09.340 And losing all my friends
04:09:11.500 Got a source you can send
04:09:13.540 Let's take this to my DMs
04:09:15.920 That's a tribe
04:09:17.620 Tribe
04:09:19.280 Argue online
04:09:21.620 Pat my back
04:09:25.240 Against the wall
04:09:26.600 And all their moms
04:09:28.640 and get the rents to their
inbox, pick a
04:09:33.060 side, pick a hell
04:09:35.120 to die, can't argue
04:09:37.240 that I don't like
04:09:39.140 you. Realize that is
04:09:41.140 evidence that I am a prophet of God.
04:09:43.540 Some go to war,
04:09:45.240 some occupy,
04:09:47.600 some just a straight
up genocide,
04:09:51.860 some behead,
04:09:53.580 some are willing to die,
04:09:56.080 but most of us
04:09:57.860 Argue on life
04:10:00.020 So stop, whatever, and just be a fuckin' apostle
04:10:02.640 Pushing out a penny head
04:10:04.860 Dunkin' on some children
04:10:06.600 When the red dot's on that bill
04:10:08.800 That's what a cow
04:10:10.500 Livin' head reply
04:10:12.240 And watch them cry
04:10:14.520 That's how I lost my job
04:10:17.080 My check
04:10:18.180 Custody of kids
04:10:20.340 Lost my cool, lost the fight
04:10:22.560 Most of all, I lost my
04:10:24.600 One
Arguing online
Arguing online
04:10:43.640 About Israel and Palestine
04:10:51.640 If I tell you to do something, it's a commandment.
04:10:54.640 It's very important that you do it, so watch my Facebook video where I read the Quran, read the Quran, read the Quran, read the Quran.