00:51:09.900no, it was the vaccine mandates, you fucking idiot. it was the vaccine mandates. people
00:51:18.340could not even go to the restaurant they couldn't leave the country they couldn't see their loved
00:51:24.380ones they lost their jobs because of this medical decision
00:51:32.460how do so many people support this guy it's crazy it's crazy and by imposing unnecessary and
00:51:43.340unscientific rules uh that brought... you almost had it, almost had it, so close, so close. unscientific
00:51:52.720rules uh that brought uh an end to the livelihoods of countless heroes we needed
00:51:59.020for our economy thank you like nothing about freedom nothing about democracy
00:52:07.540nothing about trudeau being a tyrant these are all slam dunks these are total slam dunks
00:52:16.660slam dunk on trudeau make him look like an asshole couldn't fucking do it couldn't do it
00:52:24.040And as we know, the one and only Christine Anderson had to do it, where she said, what did she say?
00:52:32.960She said, appropriate for Mr. Trudeau, Prime Minister of Canada, to address this house according to Article 144, an article which was specifically designed to debate violations of human rights, democracy and the rule of law, which is clearly the case with Mr. Trudeau.
00:52:54.040Then again, a prime minister who openly admires the Chinese basic dictatorship who tramples
00:53:00.560on fundamental rights by persecuting and criminalizing his own citizens as terrorists just because
00:53:06.800they dared to stand up to his perverted concept of democracy should not be allowed to speak
01:05:00.420get censored online if an unelected bureaucrat deems your content harmful this is part of the
01:05:06.740part A, Arif. does that sound fun? does that sound like, uh, it's just protecting kids online? no,
01:05:14.120it sounds like unelected bureaucrats censoring me because they make the argument that my content is
01:05:19.180harmful c63 would create a digital safety commission which would be a group of unelected
01:05:25.800bureaucrats who have the ability to force facebook instagram youtube twitter slash x or any of these
01:05:31.280big platforms to take down your content within 24 hours under the guise of protecting
01:05:37.380kids online these bureaucrats can use an array of reasons to justify censoring your voice online and
01:05:43.040potentially collecting your data as a means to further persecute you damn who wrote this this
01:05:49.160is fire c63 the list of harmful content the digital safety commission will censor this is
01:05:55.580the important part that i wanted to show so initially we're told that bill c63 hold on
01:06:05.400initially we're told bill c63 is going to protect kids online
01:06:12.300we've determined that that is a lie, because Arif Virani said, you know what, half of the bill
01:06:18.780was basically about hate speech, not really anything to do with protecting kids online.
01:06:22.200Okay, great. So half was not that, right? Half of the bill was not protecting kids online.
01:06:30.300But now he's saying, no, no, no, this half actually is protecting kids online. Okay,
01:06:34.980now let's look at that half. Let's look at that half. Because here we have the outline of what
01:06:40.840is harmful content. And we have seven things here. A, B, C, D, E, F, G. That's seven, right?
01:06:49.440So, A, intimate content communicated without consent.
01:06:55.720I believe this is revenge porn, I think.
01:06:59.680B, content that sexually victimizes a child or victimizes a survivor.
01:07:05.580I believe that these two pieces of content might already be illegal.
01:07:10.100I think they're alluding to child pornography here in at least one of these.
01:07:14.760so, already most likely illegal. if it's not already... pretty sure they already are.
01:07:22.560okay so two that's two of seven okay two of seven and i say two of seven because i've read this many
01:07:32.380times already and the other five things have nothing to do with anything sexual the rest of
01:07:40.620them are, let's just read them. C: content that induces a child to harm themselves
01:07:47.760so if i say you're stupid kill yourself is that enough is that like too much kys which is the
01:08:00.140short form that kids use kys hey like what if you're a kid and you're like you say to another
01:08:06.640kid in your class hey you're fat hey you're ugly are they gonna are they gonna kill them
01:08:14.540are they gonna hurt themselves now hey you're skinny hey you're too fat you should be more
01:08:20.820skinny. i uh i don't have it in front of me now, but if you break down the definition of C, content
01:08:27.080that induces a child to harm themselves, they specifically say uh like eating disorders, if
01:08:32.160if it might cause a kid to have an eating disorder and keep in mind,
01:08:37.880this is the most important thing of this section of the bill,
01:08:40.700the part that's "not that bad": unelected bureaucrats will be deciding all of this.
01:08:48.680Unelected bureaucrats will be deciding all of this.
01:08:52.140The problem with so many pieces of legislation that have to do with speech is like,
01:08:58.280Like the recent ones that are usually coming from a progressive standpoint, from a DEI standpoint, from a stopping hate standpoint.
01:09:07.120The problem with all of it is that there's no clear line of what is criminal speech. or sorry, there is a clear line, which is inciting, inciting violence and wanting to kill a group of people, or harm,
01:09:27.880like specifically harm, a group of people. that is the line. that has been the line for what is
01:09:35.060actually speech that's not okay but and the problem with a lot of these speech laws they're
01:09:40.100trying to bring in is they're starting to blur that line and expand that line uh into like
01:09:46.380different words of like oh no if you detest the person you can't do that either no if you uh
01:09:54.980If you villainize the person, that's when it becomes bad.
01:09:59.840And the problem is it's not a clear line.
01:12:25.16013 Reasons Why is an American teen drama television series developed for Netflix and based on the 2007 novel Thirteen Reasons Why.
01:12:33.700The series revolves around high school student Clay Jensen and the aftermath of the suicide of fellow student Hannah Baker.
01:12:39.020Before her death, she leaves behind a box of cassette tapes in which she details the reasons why she chose to kill herself as well as the people she believes are responsible for her death.
01:12:51.440are they going to mention how there was a spike in suicides after this tv show
01:12:57.960i don't know if it's actually in there maybe i should just google something else
01:13:01.360here we go criticism several health professionals educators and advocates
01:13:06.720linked the show to self-harm and suicide threats among young people
01:13:10.320this community also expressed major concerns about the series romanticizing suicide. actually,
01:13:16.800that's a great point. suicide... what about MAID? what about medical assistance in dying?
01:13:23.340isn't that a good thing, Arif? isn't it good to kill yourself if you feel like it?
01:13:29.620so any content promoting MAID, is that going to be now removed off the internet, off the canadian
01:13:38.900internet. How do we, Arif, what do we do with MAID? How do we compute MAID? MAID is medical
01:13:48.320assistance in dying, a good thing for people who want to kill themselves. But also when it comes
01:13:55.860to harmful content, we cannot have content that induces a child to harm themselves, even though
01:14:00.620we are kind of advocating for people to kill themselves as the Canadian government. Go fuck
01:14:07.220yourself. so that's C, content that induces a child to harm themselves. D is content... this, i love this
01:14:16.440one this one is so good this one is the craziest one this is the good side of the bill guys this
01:14:22.920is the... this is the side of the bill that's not that bad and that's actually gonna protect kids:
01:14:26.920content used to bully a child. this is what they're going... this is harmful content on the
01:14:35.020internet that unelected bureaucrats are going to remove off the internet apparently content used
01:14:42.680to bully a child. that's like... do i even need to say anything for this one? how insanely
01:14:53.260broad that is. he bullied me, take his post off the internet. you know, like the idea that you could
01:15:02.720police that is insane, is so insane. and uh yeah, like, i honestly at this point i do not
01:15:14.620envy you, Arif Virani. you're trying to sell this piece of shit legislation. what the fuck are you
01:15:19.780guys thinking? well, we have to uh remove the content uh if it's used to bully a child
01:15:27.020how are you going to determine if content is used to bully a child
01:15:30.360the thing that i feel like a lot of grown-ups don't understand:
01:15:36.080kids younger generations are on a whole nother level when it comes to like making fun of each
01:15:42.600other and like digging at each other like the bullying of today for young people we probably
01:15:50.020wouldn't even recognize it, it's on such another insidious, insane level. but no, Arif Virani and
01:15:59.700friends are going to determine if the content bullies a child hi i'm a bureaucrat who works
01:16:10.060in ottawa canada i work for the digital safety commission it's my job to take content off the
01:16:15.680internet that is used to bully a child oh by the way facebook by the way facebook facebook if you
01:16:24.920don't take this down, we're going to fine you $10 million. We're going to fine you $10 million
01:16:30.100if you don't take this post down. That's bullying a child. Let me bring that part up. See, this is
01:16:37.560another, uh, this is another angle of criticism, like huge criticism that should maybe arguably
01:16:43.760be the bigger one. Uh, where do I have it? Is it here? Boom. Maximum penalty, the maximum penalty