ManoWhisper
Greg Wycliffe
- December 08, 2024
Justin Trudeau's CENSORSHIP bureaucracy for Canadian Internet #stopbillc63
Episode Stats
Length
22 minutes
Words per Minute
151.18
Word Count
3,443
Sentence Count
280
Misogynist Sentences
2
Hate Speech Sentences
1
Summary
Summaries are generated with
gmurro/bart-large-finetuned-filtered-spotify-podcast-summ
.
Transcript
Transcript is generated with
Whisper
(
turbo
).
Misogyny classification is done with
MilaNLProc/bert-base-uncased-ear-misogyny
.
Hate speech classification is done with
facebook/roberta-hate-speech-dynabench-r4-target
.
00:00:00.000
Part A is also horrible. Imagine unelected bureaucrats and they get to decide what they
00:00:06.820
can censor and take offline. Hey, this is harmful content. We're going to delete your tweet. We're
00:00:12.440
going to make sure that it disappears. Hey, isn't that great? You're not going to go to jail. You're
00:00:16.820
just going to get censored on the internet. Amazing. It's so much better. It's protecting
00:00:20.300
kids online. No, it's not. If we have government controlling speech, we're in trouble. More power
00:00:25.380
to intimidate, to lock up, to silence. The most totalitarian bill I've ever seen. SaveFreeSpeech.ca
00:00:32.040
So here he is, Arif Virani. He tabled Bill C-63. And what does he say today? Months of conservative
00:00:38.760
filibustering have stalled progress on Bill C-63. Our kids can't wait any longer. Today, I propose
00:00:45.620
to urgently advance the Online Harms Act, specifically the parts that protect children. It's time
00:00:51.760
that platforms work with us to save lives. Look at that rhetoric. Saving lives, eh? Saving
00:00:58.740
lives. Are you sure this isn't about setting up a large $200 million piece of bureaucracy
00:01:04.260
that's going to let the government control all content on the internet? Are you sure it's
00:01:09.580
not about the government getting control over all content, pretty well all content on the
00:01:16.160
internet? You sure it's not a control thing? You think it's saving lives?
00:01:21.760
Really, Arif? That's crazy. This is great for me because it's my job to educate people on
00:01:29.040
Bill C-63. And now he's kind of just divided it up. Okay, now we can talk about Part A,
00:01:33.600
Part B. Let's make it nice and simple. Part B certainly is the more disturbing stuff in the
00:01:40.720
bill where it expands the definition of hate speech, detestation, and vilification. Detestation
00:01:47.220
means intense dislike. So if you intensely dislike something under this new legislation, that would
00:01:55.080
be enough to go to jail for up to five years. So yeah, it's no wonder that the Part B, the stop
00:02:01.740
hate, was not so popular. However, and this is what I need to emphasize, Part A is also horrible.
00:02:10.180
Part A is also horrible. Imagine unelected bureaucrats, and they get to decide what they can censor and
00:02:21.300
take offline. Hey, this is harmful content. We're going to delete your tweet. We're going to make
00:02:28.960
sure that it disappears. And possibly more. Possibly, can you get the data on this person?
00:02:35.120
Hey, this person posted harmful content. Can you tell us more? We're actually doing an investigation
00:02:40.900
now. It's bad. And the thing is, again, this guy is obviously dishonest. He tells us that this bill,
00:02:49.080
first of all, first strike, Arif, is that you said this bill is about protecting kids online. And now you've
00:02:54.820
proven to us that that is a lie, because there's this whole thing about stopping hate. Now you're
00:02:59.720
actually separating, you're kind of separating it. And you're saying, oh yeah, actually, you know what,
00:03:03.840
you're right. Half of it wasn't even really about protecting kids. So you were lying. So you were
00:03:10.920
lying, Arif. Great. Thank you. Thank you for confirming for us that you were lying. But now,
00:03:17.560
of course, he's lying again, because he's saying, hey, Part A. No, guys, seriously, half of this bill
00:03:24.680
truly is about protecting kids online. No, it's not. No, it's not. And once again,
00:03:33.200
all you got to do is you go to savefreespeech.ca, you click on Bill C63. You click on Bill C63 right
00:03:38.780
up here in the corner. Get censored online if an unelected bureaucrat deems your content harmful.
00:03:44.640
This is part of the Part A. Does that sound fun? Does that sound like it's just protecting kids
00:03:50.300
online? No, it sounds like unelected bureaucrats censoring me because they make the argument that
00:03:55.360
my content is harmful. C63 would create a digital safety commission, which would be a group of
00:04:02.120
unelected bureaucrats who have the ability to force Facebook, Instagram, YouTube, Twitter slash X,
00:04:07.100
or any of these big platforms to take down your content within 24 hours. Under the guise of protecting
00:04:14.220
kids online, these bureaucrats can use an array of reasons to justify censoring your voice online and
00:04:19.880
potentially collecting your data as a means to further persecute you. Damn, who wrote this? This
00:04:26.000
is fire. C63, the list of harmful content the Digital Safety Commission will censor. This is the
00:04:32.540
important part that I wanted to show. Initially, we're told Bill C63 is going to protect kids online.
00:04:42.640
We've determined that that is a lie because Arif Virani has said, you know what? Half of the bill
00:04:47.220
was basically about hate speech, not really anything to do with protecting kids online. Okay,
00:04:51.280
great. So half was not that, right? Half of the bill was not protecting kids online. But now he's
00:04:59.460
saying, no, no, no, this half actually is protecting kids online. Okay, now let's look at that half.
00:05:05.280
Let's look at that half because here we have the outline of what is harmful content. And we have seven
00:05:11.580
things here. A, B, C, D, E, F, G. That's seven, right? So A, intimate content communicated without
00:05:22.940
consent. I believe this is revenge porn, I think. B, content that sexually victimizes a child or
00:05:31.240
revictimizes a survivor. I believe that these two pieces of content might already be illegal. I think
00:05:38.660
they're alluding to child pornography here in at least one of these. So already most likely illegal
00:05:46.020
if it's not already. Pretty sure they already are. Okay. So two, that's two of seven. Okay. Two of
00:05:54.560
seven. And I say two of seven because I've read this many times already. And the other five things have
00:06:00.060
nothing to do with anything sexual. The rest of them are, let's just read them. See, content that
00:06:10.640
induces a child to harm themselves. So if I say, hey, like what if you're a kid and you're like,
00:06:20.060
you say to another kid in your class, hey, you're fat. Hey, you're ugly. Are they going to kill
00:06:28.420
themselves? Are they going to hurt themselves now? Hey, you're skinny. Hey, you're too fat. You
00:06:34.340
should be more skinny. If you break down the definition of C, content that induces a child
00:06:39.360
to harm themselves, they specifically say like eating disorders. If it might cause a kid to have
00:06:45.560
an eating disorder. And keep in mind, this is the most important thing about this section of the bill,
00:06:52.000
the part that's supposedly not that bad: unelected bureaucrats will be deciding all of this. Unelected bureaucrats
00:07:01.100
will be deciding all of this. The problem with so many pieces of legislation that have to do with
00:07:07.420
speech is like, like the recent ones that are usually coming from a progressive standpoint,
00:07:15.380
from a DEI standpoint, from a stopping hate standpoint. The problem with all of it is that
00:07:21.360
there's no clear line of like, what is criminal speech and, or sorry, there is a clear line,
00:07:30.000
which is, like, inciting violence and wanting to kill a group of people, or,
00:07:36.600
like specifically harm a group of people. That is the line that has been the line for what is actually
00:07:44.160
speech that's not okay. But, and the problem with a lot of these speech laws they're trying to bring
00:07:49.200
in is they're starting to blur that line and expand that line of into like different words of like,
00:07:57.240
oh no, if you detest the person, you can't do that either. No, if you, uh, if you villainize the
00:08:05.060
person, that's when it becomes bad. And the problem is it's not a clear line. It's not a clear line at
00:08:11.900
all who decides what detestation is. And it's totally horrible. It's, it's so, it's even way
00:08:18.820
more broad with this content that induces a child to harm themselves. Personally, when I watch Taylor
00:08:28.300
Swift, I want to kill myself. So anytime a kid posts a Taylor Swift promo, is that harmful? It's
00:08:35.960
harmful to me. Seeing all this Taylor Swift stuff everywhere. That's depressing. Makes me want to
00:08:41.600
kill myself. Okay. So if I was an unelected bureaucrat, part of the digital safety commission,
00:08:48.060
I would censor all Taylor Swift posts because I find it depressing. It's got to stop actually. And
00:08:55.540
unironically, actually, anytime I see someone tweet about how they want Trump to annex Canada
00:09:02.920
and Maple MAGA, that makes me want to kill myself. Okay. So we got to stop those posts
00:09:09.880
because I'm, I, cause I, I, I identify, I identify as a child. See how loose these definitions are.
00:09:18.120
If you, if you really wanted to go there, we are the digital safety commission and we will decide what,
00:09:23.880
what content induces a child to harm themselves. Do you hear that out loud? How insane that is?
00:09:31.780
I am an unelected bureaucrat who works in Ottawa. I'm part of the digital safety commission.
00:09:38.120
I am going to determine which content induces a child to harm themselves.
00:09:45.340
What about that Netflix show? 13 reasons why?
00:09:51.020
Are you going to ban that? This is a Netflix show. 13 reasons why is an American teen drama
00:09:59.600
television series developed by Netflix and based on the 2007 novel 13 Reasons Why. The series revolves
00:10:05.640
around a high school student and the aftermath of the suicide of a fellow student. Before her death,
00:10:11.000
she leaves behind a box of cassette tapes in which she details the reasons why she chose to kill
00:10:15.260
herself as well as the people she believes are responsible for her death. Are they going to mention
00:10:20.600
how there was a spike in suicides after this TV show? Here we go. Criticism.
00:10:24.600
Several health professionals, educators, and advocates linked the show to self-harm and suicide
00:10:29.240
threats among young people. This community also expressed major concerns about the series
00:10:33.920
romanticizing suicide. Actually, that's a great point. Suicide? What about MAID? What about medical
00:10:40.540
assistance in dying? Isn't that a good thing, Arif? Isn't it good to kill yourself if you feel like it?
00:10:47.300
So, any content promoting MAID, is that going to be now removed off the internet, off the Canadian
00:10:55.880
internet? How do we, Arif, what do we do with MAID? How do we compute MAID? MAID is medical assistance
00:11:05.780
in dying, a good thing for people who want to kill themselves, but also when it comes to harmful
00:11:13.280
content. We cannot have content that induces a child to harm themselves, even though we are kind
00:11:18.040
of advocating for people to kill themselves as the Canadian government. Go fuck yourself.
00:11:24.340
So, that's C, content that induces a child to harm themselves. D is content. I love this one. This one
00:11:32.820
is so good. This one is the craziest one. This is the good side of the bill, guys. This is the side of
00:11:39.740
the bill that's not that bad. And that's actually going to protect the kids. Content used to bully
00:11:45.740
a child. This is harmful content on the internet that unelected bureaucrats are going to remove
00:11:54.240
off the internet, apparently. Content used to bully a child. Do I even need to say anything for this
00:12:04.160
one? How insanely broad that is? He bullied me. Take his post off the internet. You know, like, the idea
00:12:13.260
that you could police that is insane. Is so insane. Like, I honestly, at this point, I do not envy you,
00:12:22.540
Arif Virani. You're trying to sell this piece of shit legislation. What the fuck are you guys thinking?
00:12:27.520
Well, we have to remove the content of, if it's used to bully a child. How are you going to determine
00:12:34.340
if content is used to bully a child? The thing that, like, grownups, I feel like a lot of grownups
00:12:40.260
don't understand. Kids, younger generations are on a whole nother level when it comes to, like,
00:12:45.940
making fun of each other and, like, digging at each other. Like, the bullying of today for young
00:12:52.620
people, we probably wouldn't even recognize it. It's on such another insidious, insane level.
00:13:00.760
But no, Arif Virani and friends are going to determine if the content bullies a child.
00:13:08.800
Hi, I'm a bureaucrat who works in Ottawa, Canada. I work for the Digital Safety Commission.
00:13:15.260
It's my job to take content off the internet that is used to bully a child.
00:13:20.140
Oh, by the way, Facebook, by the way, Facebook, Facebook, if you don't take this down,
00:13:26.960
we're going to fine you $10 million. We're going to fine you $10 million if you don't
00:13:32.160
take this post down that's bullying a child. Let me bring that part up. See, this is another,
00:13:39.040
this is another angle of, like, huge criticism that should maybe arguably be the bigger one.
00:13:46.340
Maximum penalty. The maximum penalty for a violation is not more than 6% of the gross
00:13:53.160
global revenue of the person that is believed to have committed the violation or $10 million,
00:13:59.600
whichever is greater. So, Facebook, Meta, Google, Elon Musk, you'll have to pay $10 million
00:14:10.260
every time there's any piece of content that is used to bully a child on your platform.
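The "whichever is greater" penalty cap quoted above reduces to a simple maximum of two terms. A minimal sketch of that arithmetic (the revenue figures below are illustrative, not from the bill):

```python
def max_penalty(gross_global_revenue: float) -> float:
    """Maximum penalty per the quoted provision: the greater of
    6% of gross global revenue or a flat $10 million."""
    return max(0.06 * gross_global_revenue, 10_000_000)

# A platform with $100 billion in gross global revenue:
# the 6% term dominates, so the cap is $6 billion.
print(max_penalty(100e9))

# A smaller platform with $50 million in revenue:
# the flat $10 million floor applies instead.
print(max_penalty(50e6))
```

For any large platform, the 6% term is what binds, which is why the speaker frames the exposure in the tens or hundreds of millions rather than the flat floor.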
00:14:20.920
This is what the Canadian government is proposing. And that's why, you know, part of the theory here is
00:14:28.360
that not only is this side of the bill bad because it's not really focused on protecting kids,
00:14:36.220
it's a lot more about just censoring the internet, giving the government the power to censor the
00:14:41.400
internet, but it might actually nuke the internet. It might nuke the Canadian internet from orbit.
00:14:49.880
What am I talking about? If you are Elon Musk, if you are Zuckerberg, if you're an owner of these,
00:14:57.140
one of these big tech platforms, and you learn that this new piece of legislation,
00:15:02.020
you could lose tens of millions, hundreds of millions of dollars a day from the Canadian
00:15:09.320
government because you broke their laws, you're going to look at our market and say,
00:15:13.740
huh, Canada is only 40 million people. The country of Iran is double that. The country of Germany is
00:15:22.380
double that. Brazil is like five times that. Indonesia is like five times that. That's like one
00:15:28.640
California. Canada is one California. Why would we hang around and wait to get fined 10 million
00:15:36.440
dollars by Justin Trudeau? I think we're actually just going to close up shop in Canada. That's what
00:15:43.600
nuking the internet would look like with Bill C-63, the part A that's going to protect kids online.
00:15:49.300
It could actually nuke the internet because I don't think big tech platforms would want to expose
00:15:54.320
themselves to the financial risk of having to put up with this, put up with unelected bureaucrats
00:16:01.260
saying, oh, we found, let's look at the list again. Oh, we found, uh, Elon, uh, we found content that
00:16:08.880
was used to bully a child. We found content that foments hatred. If you don't take it down, we're
00:16:15.140
going to, we're going to fine you $10 million. This is what the Canadian government's saying.
00:16:19.360
There's a very good chance that big tech platforms might just be like, bye, bye. Bye Canada.
00:16:27.700
Au revoir. Why would we even stay here? Your country sucks. And there's very good precedent for
00:16:35.960
this happening because the most recent internet bill that passed was the online news act. And now,
00:16:42.940
oh, now you can't get news content on meta. Now you can't get news content on Facebook and Instagram.
00:16:50.540
And in some cases, Google because of shitty legislation from the Canadian government
00:16:57.580
because they don't understand the internet. They're, they're just looking for money. Basically
00:17:03.680
actually, do I got the other screenshot here? It's pretty funny. He's doing some research on it today.
00:17:10.800
They have, they have written here the purpose of a penalty, 98 purpose of a penalty. The purpose of
00:17:19.260
a penalty is to promote compliance with the act. And it is not meant to punish like the audacity
00:17:27.120
of these Ottawa bureaucrats. No, no, no, no. It's not meant to punish. No, no. It's just to make you
00:17:33.300
guys comply. Are you sure it's not to make $10 million? Are you sure it's not to, uh, rob big
00:17:40.560
tech of money? You should, it's not, oh no, no, no, no, no. It's not meant to punish anybody.
00:17:47.900
The whole point of punishing somebody is so that they comply, right? The very premise of this like
00:17:55.320
little note, the purpose of a penalty is to promote compliance with this act and not to punish people.
00:18:01.020
No, the penalty is meant to punish people so they comply. Like, am I crazy? Or is this,
00:18:07.020
is this legislation absolute shit? My God, bro. My God. My God. I mean, should I even go over the
00:18:16.340
next one? The next one is also very important content that foments hatred. Really? And here's
00:18:24.540
the thing. They also have the definition. It's the same expanded definition that they have in part B of
00:18:30.840
the bill, which is detestation and vilification. So the very low threshold of intensely disliking
00:18:39.400
something. If you post content that intensely dislikes something, well, uh, the, the Canadian
00:18:45.960
government is going to fine Facebook $10 million for them to take it down off the internet. Yeah.
00:18:51.300
See, it, it expands the definition here. Content that foments hatred means content that expresses
00:18:56.440
detestation or vilification of an individual or group of individuals on the basis of a prohibited
00:19:01.800
ground of discrimination within the meaning of the Canadian human rights act. And that given the
00:19:07.100
context in which it's communicated is, is likely to foment detestation or vilification of an individual
00:19:14.080
or group of individuals on the basis of such a prohibited ground. It's the same definition.
00:19:20.780
It's the same expanded hate speech definition that they want in part two of the bill. That's also
00:19:26.820
going to apply in part one of the bill. So instead of going to jail for five years, you're just going
00:19:33.560
to get censored on the internet. Hey, isn't that great? You're not going to go to jail. You're just
00:19:39.600
going to get censored on the internet. All right. Amazing. It's so much better. It's protecting kids
00:19:47.320
online. Yeah. All right, bud. Yeah. All right. I'm so excited for the holiday season. I'm so excited
00:19:57.060
to continue on with this documentary. Oh my goodness. I wish I could tell you who were, who we're
00:20:04.480
interviewing for this documentary, but it would mean the world to me if you could help support it.
00:20:12.860
It is my, my Christmas dream. Okay. It's my, my dream for Christmas is to get this documentary
00:20:21.840
funded. And to give you an idea, I'm going to be driving to Kitchener Waterloo tomorrow to do an
00:20:30.180
interview. I have to pay the cameraman. I have to buy hard drives, put the footage on to then upload.
00:20:38.320
I have to, I'm also booking office space on Friday to do interviews. So all this stuff costs money is
00:20:46.300
the point. So if you even have just a few dollars, then please support the production of this
00:20:52.340
documentary. Go to GiveSendGo.com slash savefreespeech. It's, yeah, it's not just about educating
00:21:00.120
people on Bill C-63, but it's exposing the bigger picture of what Bill C-63 is all about. You know,
00:21:08.100
Bill C-63 doesn't exist in a vacuum. Okay. Bill C-63 exists in this world where there's communists.
00:21:16.100
There's these, there's these Antifa members. There's these ideologically possessed power hungry
00:21:23.920
bureaucrats and also just sort of ideologues who are hell bent on censoring people who they disagree
00:21:33.440
with, silencing people they disagree with, destroying the lives of people they disagree with. And this is
00:21:40.360
a very real phenomenon. It happens in different places, in different ways, and it needs to be
00:21:46.260
documented. The story needs to be told. So people know they're not crazy. People need to know that
00:21:52.500
they're not crazy. And we need to identify the enemy here. Okay. We need to identify the problem
00:21:58.520
and identify the enemy because there is one in Canada. And that's what we're talking about
00:22:04.080
in this documentary. I'm so serious about this that I have hired an award-winning filmmaker to help us
00:22:11.440
with it. He's going to make sure that this story, identifying this villain, reaches as many people
00:22:18.020
as possible. I sent it again in the chat there. If you want to donate,
00:22:21.040
um, hey, I really appreciate everybody watching.
00:22:31.060
Save free speech. Save free speech. Save free speech. That's the place to go.
00:22:38.200
And I think something happens to your brain when you've been in Ottawa for too long. I don't know
00:22:41.620
what it means to be incarcerated for what you believe. Things are not going to change unless we change them.