Greg Wycliffe - August 21, 2024


LIVE🔴CIJA discusses C-63🔴Online Harms Act is in the Budget 🔴 #STOPBillC63 🔴 SAVE FREE SPEECH .ca


Episode Stats

Length

3 hours and 55 minutes

Words per Minute

165.76674

Word Count

39,030

Sentence Count

580

Misogynist Sentences

26

Hate Speech Sentences

93


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ .

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny .
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target .
00:00:00.000 We'll be right back.
00:00:30.000 Thank you.
00:01:00.000 Thank you.
00:01:30.000 We'll be right back.
00:02:00.000 Thank you.
00:02:30.000 I'll see you next time.
00:03:00.000 Thank you.
00:03:30.000 Thank you.
00:04:00.000 Thank you.
00:04:30.000 Thank you.
00:05:00.000 Thank you.
00:05:30.000 Thank you.
00:06:00.000 Thank you.
00:06:30.000 hello oh hello we're up close here
00:06:45.380 good evening ladies and gentlemen um how are you doing how are you feeling
00:06:55.980 uh another stream another stream to do oh man how do these guys do it how do these guys do it it's
00:07:06.640 all the lights everything how are we feeling how are we feeling i saw trent and trent dabs
00:07:11.580 in chat said what about the chocolate milk see the thing i love chocolate milk but
00:07:19.000 chocolate milk is really bad for broadcasting or singing or like anything to do with speaking
00:07:25.600 or using your mouth because it's just so it's just so thick and delicious it's uh it's literally
00:07:32.880 the worst thing though if you're if you're trying to be a broadcaster and trying to uh talk uh and
00:07:38.560 stuff um but yeah we're gonna get into some stuff i'm just gonna try to clean this up a bit this
00:07:46.300 looks like a mess look at look at this so just come on i eventually would like to be um you know
00:07:53.800 streaming on rumble and other platforms as well but you know i think it's most important to just
00:08:00.360 actually be here to actually stream you know gotta stream we gotta make sure that i'm sexually
00:08:06.800 streaming streaming comes first always gotta be there for the people
00:08:15.140 um yeah we're trying to save free speech in canada if you're just tuning in i'm just setting
00:08:21.640 some stuff up for the stream here. What are we going to be talking about tonight? We're going
00:08:24.900 to be going over this CJA video. CJA is the Center for Israel and Jewish Affairs. They actually have
00:08:34.140 a debate about Bill C-63. Spoiler alert, I'm pretty sure they're all for the bill. They have
00:08:41.740 some concerns with some aspects of it, but they are more or less pushing the bill, which is a
00:08:48.720 So it's a good learning opportunity for us.
00:08:51.200 If you've tuned into the stream before, it's all about steel manning the argument, steel manning the argument in favor of, you know, stopping free speech or to pass this bill C63.
00:09:05.560 So I want to look into this.
00:09:06.540 I want to look into that chat that they have going to be reacting to some other content and also going over the fact that, yeah, you didn't know.
00:09:13.420 uh bill c63 is being um like already included in the uh federal budget or being talked about
00:09:21.100 and being budgeted out 300 employees 200 million dollars to police your internet comments
00:09:28.220 the thought police are coming and uh yeah it's kind of okay and kind of fun to make fun of it
00:09:35.820 now well we should make fun of it even if it were to pass but it's best let's just humiliate these
00:09:41.060 tyrants now so the bill gets thrown out and then we don't actually have to deal with this
00:09:45.980 in real life you know let's do this thing called standing up standing on guard for the
00:09:50.940 and defending uh the freedoms of this country which made it so good to begin with
00:09:55.600 uh yeah it feels like we're in the fourth quarter and we really need to like pull off
00:10:00.500 a miracle here but i think we can do it i think we can do it um i don't know what this is this
00:10:07.960 page is going to look like but hopefully let's see this might be a mess one second here oh there's
00:10:12.960 nothing there let's see what this looks like you guys can't hear me it's gonna it's gonna be quiet
00:10:22.660 for a sec just everybody sit tight okay it's gonna be an awkward silence here just sit tight
00:10:38.880 And we're back. Did you miss me? We're back. Did you miss me?
00:10:44.600 All right. Is that looking good? All right. All right. All right.
00:10:52.000 You can't really read that.
00:10:54.840 This is a quote from Sijia, Center for Israel and Jewish Affairs.
00:10:59.620 Today we join Minister Varani Arif, MPs and those who have been directly and painfully affected by online hate and abuse
00:11:07.300 for the tabling of bill c63 the online harms act anti-semitism is flourishing online and across
00:11:13.680 canada we welcome this long-awaited legislation that was tweeted on february 26th 2024
00:11:22.320 um yeah that was right when the bill was tabled so it's interesting that we're going to later
00:11:31.160 watch a clip of them debating it
00:11:33.100 debating it
00:11:34.880 although they are, if I'm not mistaken
00:11:37.080 literally standing behind them
00:11:38.600 with all of their muster to try and
00:11:41.060 pass this
00:11:41.660 bill
00:11:43.640 censorship has never helped us
00:11:49.140 this bill is authoritarian
00:11:50.360 garbage
00:11:51.700 good point
00:11:53.780 this guy actually makes pretty good
00:11:57.120 TikTok content when his stuff is
00:11:59.100 not being totally censored
00:12:00.580 let's see what we're just looking at what are we looking at here recent videos from this guy
00:12:05.800 bill gates has the scoop arrested let's let's listen to this because this guy's content is
00:12:14.540 decent it's nice and short but let's see you can now be arrested for what you post on social media 0.65
00:12:19.080 but mps can betray their country get away with it burn down a church vandalize a synagogue get 0.93
00:12:23.860 away with it you can now be arrested for what you post on social media so true he's pretty good at 0.52
00:12:29.540 the quick like concise messaging uh sorry we'll do that full screen sorry you guys can't see that 0.85
00:12:36.740 there we go media but mps can betray their country get away with it burn down a church 0.60
00:12:42.600 vandalize a synagogue get away with it you can now be arrested for what you post on social media 0.75
00:12:47.140 so true so true where's the like button boom boom uh yeah so true so freaking true so uh yeah we are
00:12:57.920 trying to save free speech in canada started save free speech.ca i'm going to throw throw all my
00:13:02.420 content effort behind uh stopping this bill if you want to help us save free for save free speech in
00:13:08.800 canada you can go to give send go.com slash save free speech how are we doing yes we're almost at
00:13:15.780 the four thousand dollar mark uh very exciting a lot of huge donations coming in you love to see it
00:13:22.560 it's going towards it's going towards content that's going to ring the alarm bells
00:13:27.720 on save free speech
00:13:29.200 on bill c63 we need to
00:13:32.020 stop this bill it's a little
00:13:34.020 loud says edgy dtv
00:13:35.260 my voice is a little loud
00:13:37.000 interesting
00:13:39.480 interesting maybe i'll just move the mic back
00:13:42.080 is that better is that better
00:13:46.080 sir maybe we'll actually i could do this
00:13:48.120 as well
00:13:48.840 let's try that
00:13:56.800 is my voice a little loud or is it the background music anyone insulting quote old stock canadians
00:14:05.140 will continue to have free speech of course absolutely that's absolutely right uh and more
00:14:12.260 on that when we when we react to the siege of stuff because you know i always kind of tuned
00:14:17.700 i watched a little bit of it i want to react to the most of it tonight not all of it because 0.94
00:14:21.640 kind of long but um yeah uh because essentially we have a group of people who are jewish looking
00:14:30.400 out for jewish people and they're deciding this this bill is going to be beneficial for us for
00:14:35.400 our our group our tribe and as lyle crawford said anyone insulting old stock canadians will
00:14:42.460 continue to have free speech of course yeah you know like do we think that bill c60 if we were
00:14:48.020 let's say a tribe of you know white european canadians let's say uh do we think this bill
00:14:55.300 is going to be beneficial for us uh let's let's let's see what happened at the trucker convoy
00:15:00.800 let's see what's happened to all of the churches that have been burnt down let's see all of the
00:15:06.580 uh statues which have been torn down it seems to be a kind of like consistent sort of hatred
00:15:12.380 for for white canadian folk um i don't know if the bill is going to address that something tells
00:15:18.780 me though something tells me that it uh might not that it might not but um yeah let's uh all right
00:15:29.360 let's get into it let's get to let's get through this uh through this article then get into this
00:15:33.380 get into this video um
00:15:36.400 All right, creating online harms regulators expected to cost Canada $200 million.
00:16:01.400 dollars now i don't have the tweets on hand but you'll see um you'll see uh you know conservative
00:16:10.300 mps some conservative mps have tweeted about this basically saying it's gonna cost 200 million
00:16:16.200 dollars can you believe this it's gonna cost 200 million dollars i'm more upset about the fact that
00:16:22.940 we wouldn't have freedom of speech anymore and the liberal biased far left biased canadian
00:16:29.880 government would have free reign to basically persecute potentially in prison and throw in jail
00:16:35.120 your base conservative party your base people who vote for you people who support you that's who's 0.55
00:16:43.840 going to be uh thrown in jail for this bill it's going to cost so much money i do not give a fuck 0.62
00:16:50.660 about how much money it's going to cost sir a member of parliament conservative member of 0.50
00:16:56.380 parliament i'm more concerned about myself and my friends and everyone who's actually standing
00:17:01.600 out to fight for this country getting thrown in jail i'm more worried about uh more political
00:17:09.080 prisoners i'm more worried about uh tyranny being on the rise it's gonna cost 200 million dollars
00:17:15.440 it's so irritating that this seems to be like the only language that the conservative party
00:17:22.320 of canada speaks it's like okay uh new law is gonna passed that basically outlaws anyone with
00:17:29.980 conservative sentiment uh you know it's basically you guys will be second-class citizens you won't
00:17:35.960 have a right to free speech and uh people can make fun of you but if you criticize anyone at
00:17:40.740 all you'll get thrown in jail and the conservatives are like um um um uh well well how much how much
00:17:47.220 does it cost maybe there's an argument we could make of because it costs too much money like
00:17:52.160 There's no way I could just come out and oppose this for being totally morally wrong
00:17:55.960 and a complete, absolute, what's the word?
00:18:00.500 It's worse than a failure.
00:18:01.760 It's worse than a humiliation.
00:18:03.240 It's a total insult.
00:18:04.860 It's a travesty to the country, you know?
00:18:09.880 And it costs too much money.
00:18:13.880 That's my only concern.
00:18:15.140 Well, it costs too much money.
00:18:16.460 That's my only problem with it.
00:18:17.340 Okay, so if it was cheaper, then persecuting conservatives would be fine.
00:18:22.160 persecuting people who are too far right. That would be a-okay if it was just a little bit
00:18:26.740 cheaper, right? Okay. We're going to make sure our censorship brown shirts are more affordable.
00:18:36.500 Great. We're going to make sure your mass migration is more affordable.
00:18:40.820 Saves you money. Still going to be a lot of, you know, there's still going to be a lot of 0.99
00:18:46.640 Pajits everywhere, invading your communities, invading your spaces and replacing you in the 1.00
00:18:51.260 workforce but it's going to be more affordable bring it home building your online business
00:18:58.620 an ad um yeah you guys have heard this all before but still it bears repeating
00:19:06.020 all right let's get back into this article let's get through this creating online harms
00:19:11.800 regulators expected to cost canada 200 million uh parliamentary budget officer
00:19:17.760 Yves Giro awaits to appear before a committee at the Senate in Ottawa.
00:19:23.660 This is back in June.
00:19:25.520 This article, though, yeah, this article is from July, July 4th.
00:19:30.920 Staffing the new regulators in the Liberals' Online Harms Act will cost around $200 million
00:19:36.640 over five years, according to a new analysis released by the Parliamentary Budget Officer
00:19:41.460 on Thursday.
00:19:42.540 The report looks at the federal government's pledge to establish a digital safety commission to regulate social media companies and force them to limit harmful content online.
00:20:12.540 online harms, which if you witnessed during COVID, uh, yeah, it was a whole industry of
00:20:18.900 censorship and silencing people who, uh, ended up having the right opinion on, uh, on things
00:20:25.860 during lockdowns. So, uh, yeah, they already have a censorship problem. Quite frankly,
00:20:31.960 a lot of the big tech platforms probably aside from X, but, um, but yeah, this idea that no,
00:20:39.400 the government actually is going to do a better job at censoring the internet.
00:20:44.080 Yeah.
00:20:44.620 Okay.
00:20:47.960 CBC.
00:20:48.700 This is a comment from Jennifer Francis.
00:20:50.820 CBC 22 minutes made a comedy skit about replacing white people and everyone 0.94
00:20:54.640 will be Brown in the future. 0.99
00:20:56.100 I remember this. 1.00
00:20:57.040 Yeah.
00:20:57.740 It was a whole like kind of like cool,
00:20:59.520 like MTV style skit or like a music video about becoming beige.
00:21:05.600 It was like a rap about everyone becoming beige.
00:21:07.800 it's crazy man
00:21:10.020 it's crazy
00:21:11.740 and that's the thing when you look back in
00:21:13.720 like the history of Canadian media
00:21:15.540 especially
00:21:16.800 late 90s early 2000s
00:21:19.980 it's everywhere like it's written
00:21:21.980 everywhere the writing is on the wall
00:21:23.960 all over the place
00:21:24.960 alright if the legislation passes
00:21:30.060 in parliament that commission would establish
00:21:32.120 a set of regulations and have the power
00:21:34.120 to levy fines against companies that break
00:21:36.080 the rules okay
00:21:37.380 will the ctv news mention the fact of house arrest do you think they'll mention house arrest in this
00:21:44.900 article put a one if you think they're going to mention house arrest in chat and put a two
00:21:50.000 if you think ctv will not mention house arrest and the more egregious things in bill c63 in this
00:21:56.260 article the online harms bill also proposes creating a digital safety ombudsperson that
00:22:02.600 can bring their concerns to as well as a new safety digit as a new digital safety office
00:22:09.500 in a report released thursday the pbo says the heritage department estimates those new entities
00:22:14.660 will employ about 300 people when they're fully up and running imagine 300 people working at
00:22:22.400 anti-hate i'm pretty sure there's only like five or like you know four or five maybe eight that are
00:22:28.960 actually like employed on a salary and then they have probably i'm kind of guessing this is all
00:22:33.800 kind of like guesswork i don't actually know this for sure but if i had to guess probably you know
00:22:38.520 six to ten people on their on salary and then probably a dozen maybe two dozen realistically
00:22:48.100 a lot of antifa people don't actually do stuff so it's probably only like five people who just
00:22:52.980 kind of like try to feed them information and all these silly twitter threads trying to cancel
00:22:56.740 right-wingers but uh three imagine having 300 salaried bloggers and sleuths online to try and
00:23:06.420 misrepresent your facebook comment to call you racist the the point is um well this is really
00:23:14.540 why i'm saying it's like the the it would be the end of free speech it's like literally creating
00:23:18.840 a small army of people to make it their job financially incentivized to police the internet
00:23:28.640 and if you don't think there's going to be a left-wing bias you're just completely stupid
00:23:33.520 no offense uh you know based on everything we've seen from this liberal government
00:23:39.540 based on everything you've seen from this liberal government uh the trucker convoy the political
00:23:46.360 prisoners the traitors in parliament after all of that the far left bias after all of that
00:23:55.360 the you know the faulty legislation that's already been passed with the online news act
00:24:00.080 can't even see news on meta facebook and instagram after all of that you're still
00:24:06.000 going to give this government the benefit of the doubt and say no no i think they'll do a good job
00:24:10.480 here no no actually i no actually this time i think they're going to do a good job on bill c63
00:24:17.320 and the online harms act because it's called the digital safety office it's called it's called the
00:24:23.520 safety board they wouldn't lie to us again would they they wouldn't be totally wrong again oh my
00:24:29.940 god they're pushing a political agenda and they're just being polite and acting polite and nice to be
00:24:36.500 to pass more legislation to act more tyrannical i can't believe that even though there's been
00:24:42.780 example after example after example of that especially since 2022 it's you know it's
00:24:51.120 understandable when the people who are far left will kind of just repeat the same talking points
00:24:55.700 and say yes this bill is good for us it's going to stop hate blah blah blah but it's totally crazy
00:25:01.740 when right-wing people conservative mps give this regime this liberal trudeau regime the 0.79
00:25:09.640 canadian government regime whatever you want to call it the globo homo canadian regime the benefit 0.85
00:25:14.260 of the doubt like they still give it the benefit of the doubt despite everything uh we've been 0.95
00:25:20.460 through and that's another reason why i really kind of resent pierre polyev because he has he
00:25:24.860 has successfully de-radicalized people and lulled them back to sleep in this regard to be like no
00:25:30.480 no no everything's fine guys everything's fine um no it's not everything is not fine
00:25:36.300 um all right where are we at here um
00:25:41.500 so here we go here we go let's let's finish the rest of this article
00:25:50.760 okay oh i didn't check the ones and the twos how did we do here we got a lot of twos a lot
00:26:08.260 of twos that they are not gonna bring up the uh not gonna bring up any of the egregious things
00:26:13.940 in the bill hey turn the volume up you want the volume up now all right let me turn it up a little
00:26:19.780 bit. Just boomer tech. I feel like I have boomer tech issues right now. Come on there. Boom. Let's
00:26:33.000 try that. Is that, is that good enough? Is that good enough for you guys? Is that good enough?
00:26:40.820 All right. All right. Cool. All right. CTV news. The online harms bill also proposes creating a
00:26:49.720 digital safety ombudsperson that Canadians can bring their concerns to as well as a new digital
00:26:54.300 safety office. In a report released Thursday, the PBO says the Heritage Department estimates those
00:26:59.680 new entities will employ about 300 people when they're fully up and running. The PBO estimates
00:27:04.840 that from 2024-25 to 2028-29, the total operating costs will be $201 million minus any possible
00:27:12.500 administrative monetary penalties, fines, or regulatory charges collected by the commission,
00:27:17.760 ombudsperson and office
00:27:19.300 so I'm guessing that's assuming
00:27:25.860 that this is money they will be receiving
00:27:27.740 collected by the
00:27:29.820 commission ombudsperson and office
00:27:31.780 like
00:27:35.420 like aside
00:27:37.940 from the politics of you know
00:27:39.820 the left and right and how this is about tyranny
00:27:41.840 it's like there's just such a gross
00:27:43.760 misunderstanding of how the internet
00:27:46.020 works and it's just a bunch of bureaucrats who are like let's come let's get some of that big
00:27:51.560 tech money like it's it's literally people from an older generation who do not understand new media
00:27:58.940 they're obsessed with trying to have their um you know state approved state funded media like the
00:28:05.320 cbc and they just are so bloodthirsty lusting after more control and influence over new media
00:28:11.820 which is the internet and if they can't get that then they're gonna steal a piece of the pie
00:28:16.120 they're gonna take their money back from big tech it's totally misguided it's like it's like some
00:28:21.320 some old person clutching on to to something and it's like bro just just let it go just let it go
00:28:27.880 okay your old media sucks you can't compete in the new media sphere stop just ruining everything
00:28:34.060 because you cannot accept uh the way the world is anyway the report notes the government may
00:28:39.540 collect revenue by fining companies that don't comply but he estimated costing does not include
00:28:45.860 an analysis of what that could look like they see like they don't even know it's the same thing as
00:28:52.480 bill c is is bill c18 the online news actor like hey let's do this let's try to um you know force
00:28:59.720 big tech to pay us for a free service that they already provide news platforms meta called them
00:29:07.280 out on their bluff and said fuck you uh we're just not going to share the news anymore so there you
00:29:14.600 go so uh but once again here they are they don't know uh there's a high degree of uncertainty in 0.58
00:29:22.040 the revenues they don't know what they're doing they don't know what they're doing with this
00:29:27.000 legislation because they don't know how big tech is going to react they've taken years to develop
00:29:32.780 this bill but they still don't know shit about what's actually going to happen like the more you
00:29:39.280 look into this bill and the actual experts who are presenting it and trying to pass it and budget for
00:29:45.300 it they don't even know they don't know because it's brand new they're they're literally just
00:29:50.800 power and money hungry bureaucrats they're government employees and that's why we really
00:29:56.300 need to win you know we're up against government employees you guys they really don't give a shit
00:30:02.060 about what they're doing they really like all they care about is their pension we should be
00:30:07.180 able to beat these people we should be able to beat these people handily okay costs may also be
00:30:14.060 higher if the new entities decide to use outside consulting services or legal support why would
00:30:20.280 you need legal support oh yeah they don't know what they're doing right uh the watchdog notes
00:30:24.900 that the government's staffing estimates are based on other canadian and international regulators
00:30:31.020 Justice Minister Arif Farhani introduced the online harms bill in February,
00:30:35.100 saying social media giants must take accountability for harmful content.
00:30:39.160 Once again, Justice Arif Farhani says it's here to protect children online,
00:30:43.700 yet says absolutely nothing about MindGeek.
00:30:47.200 They have a different name now, but it's a massive pornography enterprise that's situated in Canada.
00:30:52.880 Are they included in a social media giant?
00:30:55.360 No, they're not.
00:30:56.720 Because this bill is not about protecting kids.
00:31:00.420 from sexual exploitation it's about controlling speech that's why it says social media giants and
00:31:05.660 not we're going to go after the porn websites it's right there in black and white it's obvious
00:31:11.200 but the opposition conservatives have been critical saying it will accomplish nothing more
00:31:15.980 than create a new bureaucracy again i'm not not good not um i'm high not good enough not good
00:31:25.580 enough not good enough we know that they have tyrannical intentions we know that this is a
00:31:31.220 it's going to be such a far-left bias towards old stock white christian canadians oh it's just
00:31:39.240 going to be more bureaucracy it's just gonna be more expensive bureaucracy bureaucracy no it would
00:31:43.440 be the end of free speech it would be the the green light to uh you know hunting down political
00:31:48.700 opposition real political opposition conservative party you guys don't count you guys will be fine
00:31:53.700 of course i'm still pushing mass migration for a fuck's sakes oh part of my language i'm just uh
00:32:01.460 yeah yeah it's trying to be trying to be civil trying to be a civil just hey calm calm just calm
00:32:07.300 down everybody calm down okay calm down it does feel good to to vent a little bit though
00:32:12.820 chris freestone says white guilt sickens me
00:32:21.180 yeah yeah i kind of agree with that yeah it's um
00:32:27.700 sometimes there's different versions of white guilt sometimes it's subtle
00:32:33.760 and it kind of manifests as it'll like subtle white guilt would manifest as something like
00:32:41.440 subtle white yeah subtle white guilt manifests as as i don't know it's it's always it's very
00:32:55.820 subtle it's very subtle but it's it's always kind of there it's always kind of like that
00:32:59.720 oh well i'm good i'm not going to actually share my opinion on that i'm actually i'm no i'm not
00:33:04.540 actually going to oh cool like uh oh wow that there was you know there was a no we have a
00:33:10.420 diverse workforce it's a very sort of uh timid and sort of oh yeah no no i'm i'm totally i'm hip
00:33:16.920 i'm with it but the more explicit white guilt i agree is totally sickening um when when people
00:33:23.760 sort of like just excuse themselves and insist that they are not important because they're white
00:33:32.620 um and of course it's never explicitly said like that but uh like the most one of the most
00:33:41.980 horrible examples would be like a white couple married couple who are like well we're not going 0.55
00:33:47.280 to bring kids into this world you know climate change and there's already people coming in with 0.62
00:33:52.680 mass migration so we're just going to take care of our dog um like that's crazy that's crazy
00:33:59.740 sickening and like they don't even know how insane it is because they're so indoctrinated 0.56
00:34:04.000 it really is such a such a such a spiral of uh of just like oh god then again there's there's
00:34:11.800 there's positives in there folks because at least we're not that far gone you know at least we're
00:34:19.040 actually aware of what's going on i know can be very overwhelming and that sort of thing but
00:34:23.160 it's not
00:34:24.580 yeah at least
00:34:27.580 we're aware of you know we're seeing
00:34:29.560 the world as it is and we're
00:34:31.560 watching ourselves live through history
00:34:33.340 right man I really should have shaved
00:34:35.480 before this my god
00:34:36.960 Dan the Oracle Greg Wycliffe why do you 0.99
00:34:39.600 keep emphasizing Christians you think old stock
00:34:41.640 white Canadians who aren't Christians are 0.99
00:34:43.480 somehow going to get a break
00:34:44.700 I don't think I'm emphasizing Christians that
00:34:47.560 much but
00:34:48.760 you know look at look at the what happened
00:34:51.480 during lockdowns and specifically christian pastors were picked on specifically because 0.74
00:34:57.520 they are representing a community of yeah probably mostly white old stock canadians
00:35:03.040 but you know when a when a christian pastor like archer poplowski for example who got thrown in
00:35:08.900 jail for delivering a sermon at um literally speaking words for 10 minutes got equated to
00:35:15.840 uh destroying infrastructure some crazy charge got levied against him for literally speaking
00:35:20.600 but uh you know they targeted these figureheads for a reason and i think when a christian is
00:35:26.620 actually like doing like practicing properly i.e like they're not waving trans flags around
00:35:33.180 because you can definitely make the argument that religious institutions have definitely been
00:35:37.180 weakened and that some of these uh communities are basically compromised with woke politics
00:35:43.480 progressive politics don't disagree with you there however it depends on the community it depends on
00:35:48.940 the christian depends on the leader the faith leader whatever and the good ones actually do
00:35:53.480 speak out they speak out in favor of of uh you know protecting children from the gender ideology
00:35:59.100 and um all these things that we're fighting against like tyranny and uh yeah i mean it's
00:36:06.600 they get targeted especially when they are you know leading a community because that's a powerful
00:36:12.700 thing right when when when christians get together and they have a strong community and they're tight
00:36:18.920 knit like the government doesn't want tight knit communities it's another reason why they went
00:36:22.360 after uh diagonal so i'm not saying it's just christians or it's just non-christians like you
00:36:27.400 know it's just a kind of another uh point of reference that's that's totally relevant to this
00:36:32.680 and it's also worth mentioning because compare it to the other religions you know we're very
00:36:38.000 sensitive to anything happening to jewish people or mosques in this country um synagogues or
00:36:44.820 Muslims in this country you know it's almost like a war over the airwaves over should we mention
00:36:50.240 anti-semitism more or should we mention Islamophobia more which is the more sense that
00:36:54.860 over a hundred churches have burnt down in the country how many mosques have burnt down how many
00:37:02.500 synagogues have burnt down like it's not even close it's not even close like that disparity
00:37:07.920 is obviously worth mentioning um and you know if we're going to pick a tribe so to speak as old
00:37:15.760 white stock canadians and we're going to associate more closely with one real one religion or another
00:37:21.340 it would obviously be christianity and you know we can save this for another stream um but
00:37:28.960 we like the right wing in canada especially if you're further right i get it if you don't like
00:37:35.620 religious institutions i totally get it totally fair okay but this sort of hostility towards
00:37:42.180 christianity i don't really i don't understand it i don't understand why that's uh like why that's
00:37:48.680 happening there's nothing but sort of mutual benefit between someone who's far right and
00:37:55.300 what a true uh christian man or woman uh wants for the country you know like hey if you have a
00:38:02.780 problem with certain parts of scripture you think certain certain churches have trans flags obviously
00:38:07.260 yeah sure complain about that but it's like this kind of wholesale rejection of you know like like
00:38:13.680 christians who are on the right wing um you know you can criticize them for supporting israel too
00:38:19.400 much like sure yeah do that but like this whole the wholesale rejection of it i i don't understand
00:38:24.500 where that comes from with some people um all right let's get back to this article
00:38:28.960 oh we have a conservative mp being mentioned okay conservative mp michelle rempel garner
00:38:38.780 requested the pbo analysis on the cost that would be involved in setting up the new system she has
00:38:43.580 argued the government could instead modify existing laws and regulators to ensure canadians
00:38:49.560 are better protected online once again terrible frame terrible frame you know guys winning
00:38:56.000 arguments is is all about how you frame the debate okay and this frame is well we could
00:39:01.580 protect kids online in some other way so they're validating the premise of the bill by saying oh
00:39:06.060 yeah this is totally about protecting kids online rejected by saying this is tyrannical
00:39:10.580 she says the liberals controversial legislation have received received significant criticism from
00:39:17.280 concerned canadians and raised alarm amongst legal experts and civil rights advocates she said in the
00:39:23.240 statement. Now we learn Trudeau will spend over 200 millions of taxpayers' money on his useless
00:39:28.600 330-person censorship bureaucracy. That's better. That's an improvement. She used the word censorship
00:39:35.580 instead of using that money to hire police, protect Canadians, and lock up criminals.
00:39:42.040 Yeah, this is definitely an improvement. But once again, I want more. I want more.
00:39:46.120 it's um there should be a wholesale rejection uh of this bill the party also sent an email blast
00:39:53.300 to supporters asking them to help conservative leader peer poly of defeat what it called the
00:39:57.060 three-headed monster or a trio of liberal bills that seek to regulate tech giants
00:40:01.520 i mean not bad as much as i criticize poly of the conservatives they do make uh good strategic
00:40:08.320 moves here and there uh i just really wish they would drill home the fact that we shouldn't you
00:40:13.780 We're still stuck in the liberal framework, unfortunately.
00:40:18.680 In addition to the proposed online harms bill,
00:40:20.880 the liberal government has laws on the books
00:40:23.600 that regulate online streaming services
00:40:25.520 and social media platforms that display Canadian news content.
00:40:28.760 This is referring to Bill C-11 and C-18.
00:40:33.900 C-11 I call the Algorithm Act,
00:40:36.320 and C-18 is the Online News Act.
00:40:38.920 What's the official name for C-11, actually?
00:40:41.720 I'm actually not even...
00:40:43.120 it amends, oh yeah, it amends
00:40:53.140 the Broadcasting Act
00:40:54.420 I like the Algorithm Act though
00:40:59.060 because it basically wants the CRTC
00:41:01.040 to apply to
00:41:03.000 online content
00:41:03.900 anyway
00:41:04.720 Polyev often accuses Prime Minister
00:41:10.920 Justin Trudeau of censorship in reference
00:41:12.840 to those bills while the liberals say the tories are guilty of peddling misinformation about what
00:41:17.800 the legislation actually does bullshit virani's office has not yet responded to a request for
00:41:24.960 comment about thursday's costing analysis the file landed on his desk after being previously
00:41:30.600 assigned to the canadian heritage minister years after trudeau first promised to legislate against
00:41:35.640 online harms during the 2019 election campaign experts widely panned a consultation paper
00:41:45.740 released around the time of the 2021 federal vote which proposed a 24-hour takedown rule
00:41:50.700 for content flagged as harmful an approach they said risked censoring legal content and chilling
00:41:55.980 free speech experts widely panned a consultation paper released around the time of the 2021 federal
00:42:09.800 vote which proposed a 24-hour takedown rule for content flagged as harmful an approach they said
00:42:15.320 risked censoring legal content and chilling free speech that feedback prompted the government to
00:42:20.120 return to the drawing board and assemble a new expert advisory group so they're still pushing
00:42:24.980 for this in the bill though or something close to that and what they're referring to here uh back
00:42:32.440 in 2021 the bill was called bill c36 now it's called bill c63 and it's all confusing but uh
00:42:39.300 it used to be called bill c36 and it was all labeled that the bill was about stopping hate
00:42:44.060 speech and hate speech propaganda blah blah blah um and because that left a bad taste in people's
00:42:51.420 mouth it sounds like hey sounds like you're trying to chill free speech uh then they rephrased it to
00:42:55.360 we're gonna protect kids online because that that's what they did in the uk and they got that
00:42:59.780 bill through and looks look what's happening in the uk look what's happening in the uk now
00:43:05.220 i should i should actually watch it a video on that we'll watch a video on that in a sec
00:43:11.640 because obviously it's relevant and it's it's pretty terrifying it's pretty pretty terrifying
00:43:19.680 they're basically emptying the jail of violent criminals so they can put um people white people
00:43:25.440 who are complaining about uh mass migration and crime and even worse behavior the worst types of
00:43:33.440 crime uh throwing throwing white people in jail for complaining about it on the internet
00:43:37.480 it's just totally shameful it's beyond it's beyond it's beyond evil it's beyond it's beyond
00:43:46.520 being evil and traitorous to your country it's like i don't even know um uh the current bill has
00:43:53.580 prompted criticism by civil society advocates and legal experts over its criminal justice reform
00:44:00.080 which include proposing stiffer sentences for hate related crimes hate related crimes
00:44:06.440 and reinstating a controversial section of the canadian human rights act that would allow
00:44:11.600 canadians to lodge complaints about hate speech so in other words they alluded to the worst
00:44:17.220 worser parts of the bill but they did not at all did not at all go into detail explaining what that
00:44:23.760 is so uh everyone who said two in chat you're correct i don't think anyone even said one in chat
00:44:29.820 to uh thinking that they were going to mention that part of the bill not surprising
00:44:35.340 trent dab says a woman was arrested in bc for mean tweets this sure was sure was
00:44:43.140 they they arrested her and they didn't even know what they were going to charge her with
00:44:46.940 isn't that cool isn't that funny
00:44:48.820 david ryan david ryan said shared to gay facebook hey nothing wrong with facebook come on
00:45:01.100 thomas gordon says why is this bill required is there a lot of complaints crime needs to be
00:45:09.100 reduced question mark uh this bill is being this bill is not required it is a thin it's thinly
00:45:16.000 veiled authoritarianism they're saying that it's going to uh defend kids online but the bill is
00:45:22.160 really intended to give the government more power over its citizens over the speech and thoughts
00:45:28.080 of its citizens. People express themselves online. Now there is a whole bunch of tools. I call it the
00:45:34.080 tyranny toolkit. It's going to give them a whole bunch of tools to persecute political opposition
00:45:39.780 in this country. And if you don't think that there is an agenda to persecute political opposition in
00:45:44.740 this country, I would like to let you know about this little thing called the trucker convoy.
00:45:48.440 It happened in 2022. And essentially, everybody, nurses, doctors, truck drivers, blue-collar
00:45:55.020 workers white collar workers people who worked at banks everyone in between who was opposing the
00:46:01.680 federal vaccine mandate they were all vilified by the government told that they were hateful they
00:46:07.200 had their bank accounts frozen some of them and the people who are more prominent leaders and
00:46:14.260 effective at opposing the government were thrown in jail were levied with all sorts of trumped up
00:46:19.860 political charges. They weren't speech charges. None of them were about specifically speech,
00:46:25.220 but if you boil it down and if you're honest with yourself, this government was like,
00:46:31.660 let's target these people who have these opinions that are opposing our mandates and opposing
00:46:39.960 what we're doing to our subjects. And then this massive protest happened for three weeks
00:46:46.680 and they were beaten up by police usually in a in a democracy a free country if there if there is a
00:46:53.440 massive protest like that usually in a democracy there will be a conversation between elected
00:46:58.180 officials and the upset um citizens but that didn't happen there there was they were just
00:47:04.340 crushed crushed by a police horse um so thomas gordon um if you still think like if you're still
00:47:14.320 giving this sort of corrupt regime, the benefit of the doubt, uh, I would say that you're a little
00:47:19.500 misguided and that this, this bill is actually just more of the same. It's more of tyrannical
00:47:24.620 intentions to control the subjects, uh, in this country. And, uh, yeah, that's what we're going
00:47:31.800 to be. That's kind of, that's kind of a long story short. That's what I need to, that's what
00:47:36.240 we need to do at save free speech to convey to people who don't understand this because the
00:47:40.720 people in this chat probably know what's going on they probably know about all that what's happened
00:47:44.840 during the trucker convoy and how freedom is kind of like a punchline uh to to a lot of canadians
00:47:51.260 who truly see uh what's happened in this country to people who speak out but um not everyone knows
00:47:58.660 that and not everyone has taken the time to explain that to them because sometimes if you
00:48:03.540 send someone a rebel news video you see david menzi's being like oh my god i got pushed over
00:48:08.520 and like trudeau trudeau it's over it's over trudeau oh my god you send that to like your
00:48:14.260 normie friend or your center leftist friend or whomever your apolitical friend and they're like
00:48:18.940 yeah um yeah rebel news is kind of kooky i don't know if uh i don't know if i believe you i don't
00:48:25.940 know if that's actually happening i don't know that guy has a weird a weird hat on he's got a
00:48:30.280 weird fedora on is that really um don't want to associate with that guy but um and yeah i'm only
00:48:37.580 i don't even know if i'm really kidding with that i feel like i was 80 serious with uh with that
00:48:43.620 critique of rebel news like i really don't know how often sending their videos to other people
00:48:48.920 really helps helps convince does do rebel do rebel news videos help convince people outside
00:48:56.720 of our echo chamber that's a good question to reflect on because sure if people agree with you
00:49:03.300 already people already hate trudeau they already see the tyranny send them a rebel news video
00:49:07.920 they'll probably nod along and be like totally agree but if you send it to someone who is
00:49:11.320 apolitical who's a centrist who is whatever not not following this stuff they might be like huh
00:49:17.620 i don't know that seems like a little alarmist and i think i think there's a lot of a potential
00:49:24.240 benefit to just kind of take the time to explain to people hey this is what's happened at the
00:49:30.600 convoy there's political prisoners in this country now yeah uh if we look at history usually we refer
00:49:37.480 to political prisoners in a rise of tyranny and we really see that as a bad thing and oh well i would
00:49:43.380 have heard about this on the news yeah well the news is um paid for by the government now so that's
00:49:49.160 why you haven't heard about it and that's just an example of how that's just the beginning really
00:49:53.580 of how deep the rabbit hole goes or how deep the corruption goes in this country ctv news has been
00:49:59.000 lying to me yeah yeah it has yeah yeah remember when they said everyone who didn't get a vaccine
00:50:06.960 uh was was like like the worst ever and then they could totally kind of backflipped and like no one
00:50:11.900 cared anymore you remember that oh yeah yeah i do remember that yeah yeah try to keep that at the
00:50:16.760 front of your mind please um uh all right what was i gonna do again we're gonna look at oh i'm gonna
00:50:23.980 look at some stuff sent to me, um, relating to, relating to what's happening in, uh, we're gonna
00:50:33.760 do a quick detour to the UK, a foreshadow of how, uh, how awful things can get if we don't do
00:50:40.560 anything. If we don't do anything. Let's see what we got here. I think this is a few days old now.
00:51:02.920 One day ago from the BBC. Wow. Protester jailed after chanting at police.
00:51:10.560 it's not just online either it's not just online hey 67 year old man was jailed after chanting
00:51:19.440 you're not english anymore at police officers during a violent demonstration david notley of
00:51:24.860 buckhurst hill in essex was among supporters of far-right organizations that took part in the
00:51:30.120 protest in london on the 31st of july he was among 120 people who were arrested during violent
00:51:36.160 scenes in white hall two days after the three fatal knife attacks in south port that prompted
00:51:42.620 disorder across the country notley was jailed for 20 months after he admitted
00:51:46.380 violent disorder and causing religiously aggravated distress notley was jailed for
00:51:52.580 20 months after he admitted violent disorder and causing religiously aggravated distress
00:51:59.340 at Inter-London Crown Court.
00:52:01.600 What the fuck?
00:52:05.620 Okay, I'm gonna, I'm gonna
00:52:06.640 quickly write this down as a note.
00:52:21.860 What the hell does that mean?
00:52:25.420 Notley was jailed for 20 months
00:52:26.600 after he admitted violent disorder.
00:52:29.340 and causing religiously aggravated distress like are these hate crimes are these hate crimes in
00:52:35.800 in london or something i don't i don't get it um and it's funny how it's like yes he was arrested
00:52:41.960 with other other violent people but yeah he was just shouting see how they're see how they're
00:52:46.600 doing this where they're they're just kind of they they've been doing this for years or trying
00:52:50.720 to do this for years conflating uh violence and hate conflating violence and hate together it's
00:52:56.840 the same thing it's the same thing and they did this seamlessly in this article saying he shouted
00:53:01.040 oh by the way he was with violent people so like what's the difference right uh the rule of law is
00:53:05.940 the difference uh was a person committing a crime oh they were just shouting throw them in the
00:53:12.700 party wagon boys i gotta work on my accents jeez bottles and cans were thrown to police and flares
00:53:19.680 launched during the trouble last month uh alex agbamu prosecuting said notely made his way to
00:53:28.880 the front of the crowd and confronted police in a fighting pose surging back and forth
00:53:36.480 are you kidding me standing his violence posing his violence mewing mewing his violence
00:53:49.680 I'm going to get my Mew on.
00:53:53.200 The comedy writes itself.
00:53:55.180 Like, you couldn't write this.
00:53:58.120 Notley also joined in the chat,
00:53:59.880 You're Not English Anymore,
00:54:01.020 and sang derogatory remarks about Islam.
00:54:03.900 He helped push another demonstrator into a police officer,
00:54:07.000 which precipitated a physical confrontation
00:54:09.980 involving the police and demonstrators.
00:54:11.480 mr agbamu ooga booga added that notely then remained at the front of the crowd as if that's
00:54:23.120 like a crime this is this is bonkers this is freaking bonkers let me guess they have no
00:54:30.300 comment section bbc i mean at least at least they kind of reported on the facts because even if you
00:54:37.760 add up the facts it's like you can see you can see the the insane bias here so what was he what
00:54:42.840 is he guilty of he's guilty of a fighting pose he's he's guilty of saying you're not english
00:54:48.240 anymore he's guilty of singing derogatory remarks about islam i'm not seeing any laws broken here
00:54:56.020 yet um he helped push another demonstrator into a police officer what i mean if that's a charge
00:55:05.120 let me know is that a charge was someone pushing him was it one of those things where you're in a
00:55:11.100 sea of people and you get pushed dude it's a lot easier to cope with this because it's not
00:55:19.840 happening here and i don't even want to say yet because how about this guys how about we see
00:55:24.960 what's happening and start to organize to stop it and that's going to take ringing the alarm bells
00:55:31.940 creating a professional organization getting people on board and stopping this legislation
00:55:38.240 from passing and that's exactly what i'm trying to do with safefreespeech.ca okay ringing the
00:55:43.680 alarm bells we can see what's already happening in the uk plain as day if we don't do something
00:55:50.400 about it if we don't try to organize and stop this legislation from passing then this is our future
00:55:56.060 this is our future if we don't stop bill c63 if you want to support our mission go to save
00:56:02.320 free speech.ca or if you want to help really help then you can support our give send go give send
00:56:08.860 go.com slash save free speech we're creating a documentary all about the true nature the true
00:56:15.920 tyrannical nature of this bill the far left bias of this bill what it's really about and uh yeah
00:56:21.560 I'm super pumped.
00:56:22.940 Hey, we got a donation here from Drawf Wark for $20.
00:56:27.300 Thank you so much, sir.
00:56:28.680 Really appreciate it.
00:56:29.860 You're going to help us save free speech.
00:56:32.760 Oh, man, I'm pumped.
00:56:34.320 I'm honestly pumped.
00:56:36.020 I'm pumped.
00:56:36.860 It's high stakes.
00:56:38.620 You know, some people will say it's too late, Greg.
00:56:41.160 It's too late.
00:56:43.400 But I don't know.
00:56:46.020 I think there's still time.
00:56:47.500 I believe if we keep going, then good things will come.
00:56:53.160 Good things will come.
00:56:56.840 Please provide for those who are less fortunate.
00:56:59.300 Amen.
00:57:01.000 EdgyDTV in chat says, arrested for mewing while white. 0.61
00:57:04.420 Real. 0.65
00:57:05.220 Real.
00:57:11.860 Sorry, officer.
00:57:12.920 Was I mewing too hard?
00:57:14.600 Was I, was I, sorry, sorry, officer.
00:57:16.140 Was I in a fighting pose?
00:57:17.500 i apologize arrested for having too strong of a jawline that would imagine that was in the
00:57:25.300 article the man had a strong jawline and was clearly going to the gym he was clearly had
00:57:31.400 violent intent he was singing a song he was singing a song you guys and it hurt people's
00:57:38.560 feelings throw him in jail please unbelievable unbelievable all right i think there's another uh
00:57:50.560 video to watch here can we do this
00:57:58.320 this is we will guarantee a prison cell we will make sure that those people who need to be in prison
00:58:03.440 we'll be in prison not necessarily in the area where they live hold on hold on they may be two
00:58:08.320 three hundred miles away from home but we will guarantee people a prison cell with the numbers
00:58:13.700 are so tight that how can you make that guarantee we will guarantee a prison sorry my stuff just
00:58:20.160 froze there for a sec i'm gonna try to i'm gonna try to get this up on screen here one second
00:58:24.000 we will guarantee a prison okay okay okay okay okay one second one second guys one second just
00:58:32.880 Give me a second.
00:58:34.260 I'm trying to get this on screen for you, okay?
00:58:38.060 I'm a total newbie at this, all right?
00:58:40.720 I'm still trying to get better at this.
00:58:43.700 Oh.
00:58:49.440 There we go.
00:58:53.120 Nope, that doesn't look very good.
00:58:55.360 Does that look good to you?
00:58:56.380 That does not look good to me.
00:58:57.360 There we go.
00:58:58.560 All right, let's watch this video here.
00:59:01.360 I'm going to turn the music off.
00:59:02.880 this is from uh we the media or end wokeness on telegram the caption says holy shit the uk is
00:59:15.840 releasing 5k prisoners to make room for anti-immigration protesters some of these will
00:59:20.540 be violent offenders i.e the 5k prisoners being released will be violent offenders let's watch
00:59:25.360 this from the beginning we will guarantee a prison cell we will make sure that those people who need
00:59:31.440 to be in prison, we'll be in prison.
00:59:33.420 Not necessarily in the area where they live.
00:59:36.180 They may be 200, 300 miles away from home,
00:59:38.460 but we will guarantee people a prison cell.
00:59:41.460 The numbers are so tight there.
00:59:43.380 How can you make that guarantee?
00:59:44.800 They are tight, and that's why we've initiated Operation Early Dawn.
00:59:49.000 So, basically, the easiest way to describe it is one in, one out.
00:59:52.980 So, as people get released,
00:59:55.220 we can then pick up people from police cells and take them to court,
00:59:58.600 and we will triage that three times a day. 0.82
01:00:01.440 are you fucking kidding me 0.98
01:00:04.420 what a piece of shit this man is 0.98
01:00:12.040 we will guarantee a prison cell
01:00:14.720 we will make sure that those people 0.99
01:00:16.400 what a fucking piece of shit dude 0.95
01:00:17.900 this needs to be this this like
01:00:20.780 i mean we have people like this in canada too
01:00:23.900 but oh my god
01:00:24.880 operation early dawn
01:00:28.220 operation let violent criminals out into the streets i mean man like where do you even begin
01:00:37.560 where do you even begin with somebody like this with a situation like this what the hell man
01:00:43.860 that's so crazy oh so um what's most infuriating about this of course is and i remember seeing a
01:00:56.460 ferryman post about this uh a couple weeks ago because what is that ginger's name again i'm so
01:01:06.280 glad when i forget her name oh yeah rachel gilmore she made a quick tiktok about this saying like
01:01:10.240 there's all these anti-immigrant protesters happening in the uk and it's basically because
01:01:14.180 like they're just really far-right idiots and and they're like totally bad people and ferryman's
01:01:20.440 commentary was are you going to mention the fact that three young girls were stabbed to death
01:01:27.500 at like a taylor swift little dance thing three young white girls were stabbed to death brutally
01:01:33.480 murdered are you going to talk about that you know and then we have these brit brits on the bbc
01:01:42.520 saying well are we going to be able to throw these white people in jail for complaining about
01:01:46.500 the immigrants it's like so you're not no mention at all like the mention of the murdered kids is
01:01:51.900 just the white murdered kids just brushed aside don't really care about that brushed aside white
01:01:57.820 murdered kids just no not really it's uh i can't believe these white people are angry we need to
01:02:04.680 throw them in jail quicker operation early dawn it's more it's more dystopian than here
01:02:12.380 it's more dystopian than here and again it's exciting it's exciting because we still have
01:02:21.540 a chance we still have a chance to stop this I think I think we do yeah that's
01:02:32.200 it's gonna be a long fight though it's not gonna be it's not gonna be simple it's not gonna be
01:02:38.840 easy look how look how dis oh my god that's disgusting that's just that's just vile that's
01:02:48.200 that's villainous that's detestable gross gross all right we're gonna watch i'm gonna just gonna
01:02:56.660 check the chat for a second and we are going to watch this we're gonna get into this video
01:03:00.440 this cj video in a second um so you guys are saying in the chat
01:03:06.520 this sounds like the soviet union in the early 1900s says edgy dtv
01:03:16.820 is there uh is there a precedent for that uh um letting violent criminals out of jail
01:03:23.820 and then throwing in the dissidents 0.91
01:03:25.580 hope coop boys uh oh well says hope the coots boys sue but can they question mark is that 0.88
01:03:42.640 why they are being forced to plead guilty for garbage question mark 0.86
01:03:46.280 i i don't know i i haven't i haven't had the uh to use a cringy corporate word i haven't 0.71
01:03:54.960 have the bandwidth to keep up with with uh with what's going on with the different coots uh men 0.67
01:04:00.600 but um should we be calling them men so the coots boys i know it's like sounds maybe maybe it makes 0.86
01:04:07.680 them sound more like innocent and we need to save them but they are men who you know did i think 0.76
01:04:14.500 something patriotic um but uh yeah i mean the problem with the legal proceedings is that
01:04:26.380 when it comes to this is kind of my interpretation i'm not a huge law guy
01:04:32.740 okay i'm i'm i'm trying to like become more versed when it comes to um how to
01:04:39.540 understand the law and especially all these different cases for different political prisoners
01:04:44.140 and everything but uh it's my interpretation that if you are going to sue the government
01:04:49.820 or sue some you know some large governmental entity you're kind of up against an enemy
01:04:56.820 that has infinity money in a way like they kind of have so much money and they also might have
01:05:04.260 uh strings to pull to to um make it as painful as possible for anybody who's trying to sue them
01:05:11.580 i.e. if you want to continue to try and sue us we are going to make it very expensive for you
01:05:16.620 we're going to we're going to make it longer and longer and longer and push it off for years and
01:05:21.220 years and years and years and uh that's kind of i'm kind of like a broad brushstrokes guy and
01:05:27.540 that's kind of the my understanding is you can try but it's probably going to be very expensive
01:05:33.540 very painful and will you even get justice in the end when you're facing the government they
01:05:41.180 have so much more tools at their disposal to try and get away with it and not be held accountable
01:05:47.080 and once again another reason why i'm trying this approach of creating videos with savefree
01:05:54.360 speech.ca where just throwing everything at the wall over and over again on this topic on this
01:06:02.640 topic and and making it a part of the conversation forcing the conversation forcing politicians to
01:06:09.040 feel feel discomfort and pain from the pressure that they're receiving why haven't you talked
01:06:14.760 about bill c63 why haven't you talked about the online harms act why aren't you talking more about
01:06:19.080 why aren't you opposing this you know um that's the plan that's the plan and part of that plan
01:06:28.400 is to do our due diligence is to do our research do our due diligence do our research into
01:06:36.480 To see what people are saying about this bill.
01:06:44.520 To steel man the argument in favor of Bill C-63.
01:06:52.480 In favor of this censorship.
01:06:54.000 And I found a tweet today that's going to kind of introduce us into that.
01:07:03.520 Maybe I'll start with the tweet beforehand.
01:07:05.580 Oh, by the way, while we are on Twitter
01:07:08.500 Follow us
01:07:10.520 Follow Save Free Speech on Twitter
01:07:12.060 SFS Canada
01:07:15.100 At SFS Canada
01:07:17.520 Save Free Speech
01:07:18.340 SFS Canada
01:07:20.220 On x.com slash SFS Canada
01:07:23.580 Become a follower
01:07:24.740 Retweet our stuff, like our stuff
01:07:27.140 And
01:07:27.980 Yeah, but
01:07:30.500 Where were we?
01:07:33.740 Alright
01:07:34.140 one second
01:07:35.740 how do i get to my bookmarks there we go
01:07:42.960 here we go here we go so let's begin here so once again this is sieja this is the video we're
01:07:52.760 about to watch the center for israel and jewish affairs is the advocacy agent of jewish federations
01:07:58.240 of canada uia representing jewish federations across canada tell your member of parliament
01:08:03.260 to support Canada's Jews and Bill C-63.
01:08:07.600 So supporting Bill C-63 is synonymous with supporting Jews, apparently,
01:08:13.520 according to CJE.
01:08:14.400 This week, Minister of Justice Arif Varani tabled the long-awaited Bill C-63,
01:08:20.720 the Online Harms Act.
01:08:21.780 Given the dramatic rise in anti-Semitism,
01:08:24.320 this legislation has come at a time when it is needed most by Canada's Jewish community.
01:08:29.600 With this act, we have a chance to diminish Jew hatred online.
01:08:33.260 you know hey they're looking out for their people all right and they're all over it they're all
01:08:39.500 they're all over uh looking out for their people they're on top of it it sounds like they're they're
01:08:44.740 they think this legislation is going to benefit them uh it's going to stop anti-semitism and i
01:08:50.260 think this is an interesting for a couple we're going to watch this video for a couple reasons
01:08:53.920 uh number one i think they have intelligent people on here michael geist is one of the uh
01:09:01.000 people spot speaking and he is a lawyer who has kind of been an expert in uh bill c11 he was
01:09:08.220 really strongly opposing bill c11 sounds like he has a soft spot for uh bill c63 maybe it might be
01:09:15.120 the fact that he's jewish i don't know but um regardless he's a smart guy so i want to hear
01:09:20.000 what he has to say uh and there's somebody else who's also very uh i think they're like a liberal
01:09:26.880 minister or something but anyway they're going to have once again they're going to have kind of
01:09:31.160 like the strongest arguments in favor of bill c63 and we'll see kind of what their counter arguments
01:09:36.120 are and yeah the other the other kind of thing worth mentioning you know this topic gets kind
01:09:43.240 of really uncomfortable really quickly but i've mentioned in the past and past streams that you
01:09:49.040 know someone who's supporting bill c63 is the canadian race relations foundation and they kind 0.85
01:09:53.660 of make this victim porn and they basically say hey this asian woman got death threats on the
01:09:58.900 internet um she's crying therefore we're going to end free speech you know they kind of use these
01:10:07.480 like these these fragile people who can't read internet comments without breaking down and um
01:10:13.340 and they use that as sort of like victim porn to try and justify censoring people online uh trying
01:10:20.080 trying to insulate all canadian citizens from any harm from any harm or wrongdoing uh and the reason
01:10:26.380 why i think it might be interesting to see what cjo is saying about this is uh you know one could 0.91
01:10:31.520 make the argument that uh you know jews are kind of they they are very they play the victim game 0.86
01:10:42.960 you know they are very much like hey look what happened to us blah blah blah and yeah i think 0.99
01:10:51.420 that's relevant because to pass through this bill this tyrannical bill bill c63 they are going to be 0.77
01:10:56.760 using that victim narrative not just this jewish group group but like in general they're going to
01:11:01.680 be like what about these victims what about these victims so uh you know let's let's talk to the
01:11:06.320 experts. Let's talk to Cija. The experts in being victims. You know what I'm saying? So where's the
01:11:14.100 video here? All right. All righty then. Okay. Here we go. So what's fact and what's fiction?
01:11:34.300 or maybe we'll start with this.
01:11:36.040 This is from the CJ website,
01:11:37.680 linking to this video.
01:11:38.980 What's fact and what's fiction?
01:11:40.220 The new online harms bill.
01:11:42.180 This is from April.
01:11:43.340 There have been a lot of questions
01:11:44.220 around the new online harms bill.
01:11:45.420 In fact, it has been one of our most asked
01:11:47.920 about topics since the legislation was tabled.
01:11:51.040 That's why we sought the most knowledgeable experts
01:11:53.020 about the bill that aims to do
01:11:54.420 and not what the bill aims to do and not do.
01:11:57.560 We hosted a special briefing with Dr. Michael Geist
01:12:00.040 and Dr. Emily Laidlaw.
01:12:02.520 They took us through how they see the bill
01:12:04.180 its intentions and the pros and cons of how it will help fight online hate dr michael geist
01:12:10.020 is a you know what we might just watch this in the in the thing let's just let's just get started
01:12:17.060 it's it is long so i'm going to be kind of jumping through
01:12:21.540 card is the general counsel and vice president of cija and together with several of our professional
01:12:26.980 staff has worked extremely hard on this file richard is also a former member of parliament
01:12:33.380 he is a lawyer a published author a human's right activist and a person who is passionate
01:12:39.540 about the jewish story he understands both the pros and the cons of this legislation
01:12:44.900 and he has he has he brings in today two panelists that have been heavily involved
01:12:50.180 in in this legislation and present both sides of this debate so richard i will let you take
01:12:54.580 over from here and thank you all for joining us thank you i thank you gail and thank you everybody
01:12:58.100 for for being here it's a pleasure to be moderating this this panel um as gail said there we know
01:13:02.500 know there's a lot of questions and comments moderating this battle is that what he said
01:13:08.820 everybody for for being here um it's a pleasure to be moderating this this battle um as gail said
01:13:13.540 we know there's a lot of questions and comments within our community um and uh we will tackle
01:13:18.100 them head on um a couple of things that i want to flag at the beginning uh in order to be fully
01:13:22.020 transparent um given the level of anti-semitism online something that affects all of us but
01:13:26.340 especially our children who spend so much time online cija has long been advocating that something
01:13:30.900 had to be done to combat online hate and the level of anti-semitism both online and offline are at
01:13:36.740 an all-time high and the numbers of emails and calls that we get daily to ask for help
01:13:40.820 about anti-semitism is off the charts so not to mince words there is a crisis of anti-semitism
01:13:45.540 in this country so today all right i just kind of want to juxtapose that with the fact that over a
01:13:50.500 hundred churches have burnt down in canada over the past couple years is there is there a christian
01:13:57.540 you know an anti-christ anti-christian hotline where all these complaints are being collected
01:14:02.900 and and you know is is there a christian advocacy group like seja for for christians in canada i'm
01:14:10.840 i'm assuming the answer is no but you know this is sort of this is kind of an issue right or at
01:14:19.560 least like it's it's you kind of have to admire the fact that they're on top of it they're like
01:14:24.100 hey, they're very sensitive. They're like, hey, antisemitism is at an all-time high. We have to
01:14:27.860 do something about this. All these disconnected Christians across the countries whose literal 1.00
01:14:33.720 churches are being burnt down, where do they go? Where do they go when their church gets burnt 0.97
01:14:38.760 down? Who are they complaining to? Who's hearing their complaints? Are they calling anybody?
01:14:43.700 And then what's the fallout of that? I was talking on a podcast earlier with Mark Paralavis
01:14:49.660 earlier today and i brought up tribalism and i brought up the fact like hey um i've heard younger
01:14:57.360 people in politics say that tribalism is inevitable and i gotta say i was in denial of that for some
01:15:03.320 time back in like 2019 when i was still like a naive libertarian but now when you see the
01:15:08.480 calistanis fighting with the indians in the street you see the palestinians fighting with the israelis
01:15:14.660 in the street you see the ukrainians fighting with the the russians in the street you know it's
01:15:20.980 it seems to be like this this is this is kind of everywhere now and when it comes to a you know
01:15:28.900 white canadian christian uh old stock canadian where where is our tribe where is our representation
01:15:36.960 like who is actually advocating for us and the answer is we've been so indoctrinated that we've
01:15:42.320 been convinced that we don't have a tribe and if we do try to tribe together that we are the worst
01:15:46.520 people possible right i'm sure a lot of people watching already understand this but that's
01:15:50.840 that's the fact of the matter you know everyone everyone in their uh every other group that is
01:15:57.740 non-white is totally allowed to create a community create a tribe and like be like be like represented 0.93
01:16:03.720 um but if you're a white canadian we can tear down your statues we can we can shit on your 0.69
01:16:10.000 history we can spit on uh you we can say you're inherently racist in schools with critical race 0.81
01:16:15.260 theory we can do whatever we want and uh you know there is no helpline but uh you know if you're
01:16:22.640 jewish you have sieja you have you have a whole network of support that's going to be advocating
01:16:27.120 for you so um yeah there's a lot of work to do uh in this country as tribalism is on the rise 0.86
01:16:35.920 especially when you're in a multicultural a sort of like borderless boundary boundaryless
01:16:43.420 ruthless multicultural society uh what do you know it's almost like it was inevitable 1.00
01:16:48.400 today we our aim is to cut through the noise so to speak and to give you upfront answers about 1.00
01:16:55.580 the online harms bill what it does and what it does not do and in order to do this we have two
01:17:00.380 great panelists uh michael guise is a law professor at the university of ottawa where he holds the
01:17:04.680 Canada Research Chair in Internet and e-commerce law and is a member of the Center for Law,
01:17:08.460 Technology, and Society. He has obtained a Bachelor of Laws from Osgood Hall Law School in
01:17:12.020 Toronto, a Master's of Laws, degrees from Cambridge University in the UK and Columbia Law School in
01:17:15.600 New York, and he has a PhD from Columbia Law School. Dr. Geist serves on many boards, including
01:17:19.640 the Internet Archive Canada and the Electronic Frontier Foundation Advisory Board. Bringing
01:17:24.040 another perspective is Dr. Emily Leyla. Emily is a Canada Research Chair in Cybersecurity Law and
01:17:28.740 Associate Professor at the Faculty of Law at the University of Calgary. She's a Senior Fellow at
01:17:32.540 center for international governance innovation and network director of the canadian network on
01:17:35.980 information and security emily researchers in the areas of technology regulation and human rights
01:17:39.820 which focuses on platform regulation online harms privacy freedom of expression and corporate
01:17:43.580 governance she actively contributes to law reform and advisory work with recent projects on online
01:17:47.580 harms mis and disinformation defamation law and intimate image abuse she co-chaired the expert
01:17:51.820 group that advised the federal government on development on of the online harm spell okay so
01:17:56.220 So this is somebody who advised in developing the specific legislation.
01:18:01.780 Good.
01:18:02.360 So Laidlaw is her name. 0.53
01:18:04.000 I'm going to look her up.
01:18:05.160 Well, thank you both for being here.
01:18:06.600 As for the format, I will ask each of our panelists to give us their own overview of
01:18:10.240 the bill for about five to seven minutes.
01:18:12.220 Then I will put questions to the panelists.
01:18:14.200 Most of the questions actually.
01:18:16.580 All right.
01:18:17.700 Right away.
01:18:19.860 Right away. 0.97
01:18:21.080 We got pronouns and bio with Emily Laidlaw.
01:18:25.280 canada research chair in cyber security law and associate professor of law university of calgary
01:18:28.880 love all things about tech regulation and human rights yeah okay we'll we'll see about that hey
01:18:36.240 i helped develop uh i helped develop uh the online harms act i care about human rights okay let's
01:18:42.000 we'll keep that in mind let's see come from you uh either what you write you wrote us uh you
01:18:46.400 commented on on social media uh sometimes it was due to interactions i know i had a kiddush at
01:18:50.720 school after services oftentimes i've been approached to talk about this this bill but
01:18:54.000 just know that there's also a possibility for you to answer questions as we as we go through the
01:18:57.920 process there's a q a button at the bottom feel free to write uh questions that you would like
01:19:01.360 me to pose the panelists um and i will do my best to get to all of them it's a vast subject we have
01:19:05.440 an hour uh but we will do our best so um to start with i will pass the baton to michael to give us
01:19:09.600 an overview of the bill and uh where you stand on it michael okay great well thanks richard and
01:19:14.400 thank you to see jeff for the invitation to participate actually emily and i have a problem
01:19:17.760 far more similarities and differences i think on this bill so i think this will be an opportunity
01:19:20.480 really try to educate about what the Online Harms Act or Bill C-63 does. I thought it's good to start
01:19:25.040 to make sure that everybody's operating with the same baseline of knowledge. And the breakdown of
01:19:28.060 this bill would start by recognizing that this is the result of several years of work. You mentioned
01:19:30.920 that the community and many communities have been looking for this. Back in 2021, the government was
01:19:34.400 about to introduce legislation. It held off after there was controversy associated with some other
01:19:38.620 internet-related bills. It launched a consultation that summer. It was the same time as that 2021
01:19:42.940 election. And when many groups, many of the groups actually and experts, many of the groups
01:19:47.740 quite frankly were expected to support this legislation criticized what the government had
01:19:51.320 in line in mind rather they decided to go back to the drawing board and they that included creating
01:19:55.560 a new expert panel that emily was co-chair of there were additional consultations and round
01:19:59.700 tables and i think to the government's credit on much of this bill they took that expert advice
01:20:03.720 seriously oh crap the regular um i was just going to say i'm interested to see michael go more in
01:20:12.760 depth here but my interpretation of what he just said is well the government presented something
01:20:17.840 that was all about online uh hate speech and hate propaganda and that sort of triggered a strong
01:20:24.480 response an easy sort of attack on the bill of this is clearly going to take away people's freedom
01:20:29.100 of speech so they went back to the drawing board and now they're presenting it as the online harms
01:20:33.000 act that's going to protect kids online so they went they didn't go back to the drawing board
01:20:36.820 because they they disagreed with the sort of tyrannical intentions of the bill now they went
01:20:41.220 back to the drawing board because they're like there's no way we can sell this to the canadian
01:20:44.220 public you need a much better trojan horse for this and now they're back with bill c63 all about
01:20:49.820 protecting kids online but uh anyway let's let's continue here creating a new expert panel that
01:20:57.220 emily was co-chair of there were additional consultations and roundtables and i think to
01:21:01.380 the government's credit on much of this bill they took that expert advice seriously and so we have
01:21:05.700 now bill c63 or this online harms act i think it's best understood as at least two bills in one
01:21:11.020 There is an online harms piece that is involved regulating large internet platforms, what
01:21:15.200 their responsibilities and potential liabilities might be for things that take place on their
01:21:18.700 platforms.
01:21:19.440 And then there is an...
01:21:20.300 I'm going to be pausing this a lot probably, but once again, big tech platforms already
01:21:26.560 do a pretty intense job of regulating their content.
01:21:30.800 Michael Geist knows this as someone who is somewhat of an expert on Bill C-11.
01:21:37.040 So I'm sure he knows that.
01:21:38.620 But anyway, let's keep going.
01:21:39.460 In a sense, almost the second bill that includes changes to both the Criminal Code and the Canada Human Rights Act that is focused on hate, and more particularly on individuals, not on the platforms.
01:21:47.980 The online harms piece, the platform piece, is really the main part of the legislation.
01:21:52.960 I'm going to rewind that a tiny bit.
01:21:56.320 The liabilities might be for things that take place on their platforms.
01:21:59.480 And then there is, in a sense, almost the second bill that includes changes to both the Criminal Code and the Canada Human Rights Act that is focused on hate and more particularly on individuals, not on the platforms.
01:22:11.160 Hate on individuals, not on the platforms.
01:22:15.820 So that has entirely nothing to do with protecting kids online and everything to do with targeting dissidents, targeting people who are saying hateful things.
01:22:25.500 the online harms piece the platform piece is really the main part of the legislation
01:22:32.680 it focuses on social media services and it targets seven harms in particular sexually
01:22:38.700 victimizing children bullying inducing a child to harm themselves extremism or terrorism inciting
01:22:45.440 violence fomenting hatred and intimate content that is sent without consent often referred to
01:22:50.920 was revenge porn so i think one of the biggest i'm just realizing now i think one of the biggest
01:22:57.020 vectors of attack of this bill is going to be bullying believe it or not because bullying
01:23:02.720 although it may have a negative connotation for some uh some people who are sensitive to that
01:23:08.680 it's also the the flimsiest sort of so you're gonna stop bullying on the internet really who's
01:23:18.940 going to define what bullying is like that is that's crazy that's it's such a crazy premise
01:23:27.380 that you're going to stop bullying you're going to stop bullying on the internet is that right
01:23:33.960 like that that that just sounds so absurd to stop all bullying can we make a meme about this can we
01:23:45.320 make a meme about that no that's bullying says who says the government says the person complaining
01:23:49.780 online like what a what a mess what a what a mess that these people have the audacity to say that
01:23:56.860 we're going to control the internet to the extent that we are going to stop bullying crazy those are
01:24:03.080 the harms there are also for these platforms that are really subject to this subject to three duties
01:24:08.220 there's a duty to act responsibly that'll i i hate this so much because i've listened i've listened
01:24:14.040 to this stuff in various forms already but like like hey here's our grocery list of things that
01:24:18.760 we're gonna stop we're gonna stop this and this and bullying and this and then there's like these
01:24:23.440 are the three things that big tech needs to do they need to act responsibly they need to act
01:24:30.640 responsibly like is that really in the legislation to act responsibly it's just it's such it's it's
01:24:41.140 it's such like just gobbledygook to try and pass nonsense, obviously. Hey, we're going to pass a
01:24:48.900 really serious legislation. What do we want these big tech platforms to do? Act responsibly. Yeah,
01:24:55.220 we're the Canadian government. We want big tech to act responsibly. What does that even mean?
01:25:00.700 I'll come to in just a moment. There's a duty to protect children, which is largely focused on how
01:25:06.420 they design their platforms to better protect kids like it's correct once again michael geist
01:25:11.260 it's crazy that i will we'll see if he mentions it but like do you not think i'm hoping he'll
01:25:18.280 mention this i'm hoping he'll mention this you know what i'm gonna stop talking because there's
01:25:22.420 a lot more to add into maybe he will mention it this is just the introductory statement but
01:25:25.620 i was just gonna say there's already a lot of things that big tech does to try and protect kids
01:25:31.920 so you know they talk about we need a better way to report things on social media there's literally
01:25:38.020 a report button everywhere on big tech i can do it right here i can do it right here look
01:25:45.260 i could report this video it's everywhere on any big tech platform there's a report button
01:25:52.040 but anyway let's keep going and there is a requirement to make certain content inaccessible
01:25:57.420 essentially to block content really only two kinds of content what revenge porn and child
01:26:02.860 endangerment the duty to act responsibly for these platforms is really the the core element
01:26:08.780 of this legislation with platforms it would include things like ensuring that you've got
01:26:13.260 the ability for users to flag content that may buy you can already do that you can already flag
01:26:20.340 content they all have a flag button violate the law the ability to block other users so if you're
01:26:25.680 a social media the block button canadian government literally is like hey can you guys make a block
01:26:31.520 button oh you already have that yeah you guys are writing legislation for the internet and you're
01:26:36.000 suggesting a block button tell me you know nothing about the internet without telling me you know
01:26:40.800 you know nothing about the internet site you're being harassed you have the ability to block them
01:26:45.920 and a requirement to create a digital safety plan where you identify how you're going to deal with
01:26:51.440 this be far more transparent about how you deal with these kinds of harms can't big tech is going
01:26:59.280 like big tech already probably behind closed doors hates these different western countries coming up
01:27:06.640 with all of this nonsense hey can you make a report about what your digital safety plan is
01:27:11.060 so now either meta and google has to play ball and come up with all these like reports for for
01:27:18.700 the canadian government or more likely they might just say f you we're not even going to operate in
01:27:24.140 canada anymore and you may think that's ridiculous but of course bill c18 the online news act you
01:27:29.340 already can't get news on meta platforms instagram and facebook anyway all of this is overseen by
01:27:35.500 a new digital safety commissioner and there is also the creation of an ombuds person that will
01:27:40.620 assist and emily may get into that excuse me now that the second part of the bill involves several
01:27:47.240 changes to the criminal code, including the prospect of a new offense where another offense
01:27:53.220 is motivated by hatred, and that carries penalties that could go as high as life in prison. It's been
01:27:58.720 pretty controversial, and I'm sure we'll come to it. There's also a new provision involving
01:28:02.180 genocide or promoting genocide, which also carries life in prison, and the creation of a new peace
01:28:08.800 bond, essentially a prior restraint, the ability to have some prior restraint in and around some
01:28:15.040 of these issues there is also within the canada human rights oof oof come on michael geist are
01:28:22.880 you going to be honest about the pre-crime stuff come on bro exact the return of section 13 that
01:28:27.920 that many here may be familiar with it was uh somewhat contra quite controversial it was in the
01:28:33.020 law then it was removed it's back again and it create and it creates a provision dealing with
01:28:37.640 the communication of hate speech and the prospect of complaints to the human rights commission and
01:28:42.420 and then potentially later to the tribunal, which would create actual real liability to the victim
01:28:46.880 and potentially to the government as well. So that's the core of the legislation very quickly.
01:28:52.080 If I might, just for a couple of minutes, provide you with a quick assessment of sort of the good
01:28:56.480 and the bad. On the good side, I think there were many that breathed a sigh of relief that
01:29:00.560 what the government had initially in mind, much of that was tossed to the side. And instead,
01:29:05.820 I don't even know if that's necessarily true, because I feel like a lot of the stuff in the
01:29:11.140 previous legislation is still here it's just coded slightly differently uh so i'll have to
01:29:19.340 fact check uh michael on that because i feel like a lot of the stuff in bill c36 is still in bill c63
01:29:25.780 um so i don't really know what he's talking about there that really did listen to listen to the
01:29:33.340 experts particularly around the issue of online harms and i think some of the approaches that
01:29:38.360 they've taken really are are quite sensible we can we can debate some of us around the edges
01:29:43.480 but it's a pretty good starting point do you really believe that michael you're the guy that
01:29:48.780 opposed bill c11 this guy opposed bill c11 man and bill c11 was you know not so much about speech
01:29:57.500 but about like basically giving the keys to the algorithm on big tech algorithms to the government
01:30:04.020 and he was talking about how silly this is and getting into the minutia of how silly this is
01:30:09.900 and now he's kind of saying hey you know what they built they turned bill c63 around i think
01:30:17.080 there's a lot of really good things in here do you are you sure about that so i have relief that
01:30:22.700 the government what the government had initially in mind look how look how look how erratic this
01:30:28.620 man is right now much of that was tossed to the side and instead really did listen to listen to
01:30:33.940 the experts particularly around the issue of online harms and i think some of the approaches
01:30:38.880 that they've taken really are are quite sensible we can we can debate some of us around the edges
01:30:44.200 but it's a pretty good starting point debate around the edges you're a lawyer should there
01:30:50.740 should there not be zero debate around the edges if it's good legislation right like just as a
01:30:55.940 like as a base point mr geist dr geist i don't know like should there not be a sort of like
01:31:03.220 no debate well anyway we'll see how the debate goes but doesn't seem like he's really opposing
01:31:09.100 the the free speech concerns like at all which is uh which is really concerning i i feel like i kind
01:31:17.380 of saw this coming because you know i was a big fan of michael geist i like his work i still look
01:31:21.640 at his work of what he talks about when it comes to this bill but when bill c63 came out i saw him
01:31:27.560 kind of passively so he's like oh there might be some problems but i guess i support it and it's
01:31:31.940 like oh boy i was hoping he'd be more of a champion to oppose this just on the principle of i don't
01:31:37.000 know the charter of rights and freedoms uh the potential way that this bill could be weaponized
01:31:41.100 for political reasons the fact that there's a lot of evidence that uh laws have already been
01:31:47.160 weaponized for political reasons um against political dissidents anyway where there are
01:31:53.360 concerns i think that frankly the duty to act responsibly is a little bit both a feature and a
01:31:58.320 bug the we know the the broad brush strokes of what that might include for a platform but there's
01:32:03.700 a lot of uncertainty still so it's easy to like it when we don't know the specifics perhaps when
01:32:08.380 we get to the specifics there may be some concerns i think the oversight mechanisms definitely are a
01:32:13.460 cause for some concern there is this new digital safety commissioner i believe there's a lack of
01:32:17.640 oversight a real lack of specifics especially with what are truly quite enormous powers that
01:32:23.620 are vested in in the digital safety commissioner and i believe trust in this relation in this
01:32:28.460 legislation is going to depend in large measure and trust in how it's enforced with the digital
01:32:33.600 safety commissioner the other major concern quite frankly has okay time stamp this is a good clip
01:32:40.300 i'm an expert i'm michael geist and a lot of this really depends on how much we can trust these
01:32:47.600 people because we're giving them a lot of power uh and if and if they're ideologically biased
01:32:53.900 that would probably be a big problem that's that's basically translation that's what he's saying
01:32:57.560 um hold on
01:33:00.300 has been the criminal code provisions i think the notion of life in prison
01:33:11.160 where um where any any violation is motivated by hate by hatred the idea that that could include
01:33:17.840 life has some pretty significant implications and i find it really difficult to justify
01:33:21.780 um so that okay good good shout out to michael geist you know we we got we got to give him
01:33:27.960 credit when he's saying the right stuff uh i have concerns too michael totally agree totally agree
01:33:34.080 i probably would have worded it a bit stronger and said this this has tyrannical implications
01:33:38.340 just look at what happened during the trucker convoy but hey i'll give you credit for saying
01:33:42.220 that uh motivated by hate by hatred the idea that that could include life has some pretty
01:33:49.900 significant implications and i find it really difficult to justify um so that that's some of
01:33:54.660 my my perspectives in a nutshell but i know we'll have the chance to expand on this and many other
01:33:58.700 issues during the q a thank you uh michael that's a great way i think to set the table so just so
01:34:05.040 you know like they're about to have a debate here and he's and michael's our guy he's he's the best
01:34:10.680 sort of to fight against this bill and it's kind of concerning because he it was mostly glowing it
01:34:17.740 was a mostly very sort of glowing review of the bill like he only seemed to have like one concern
01:34:22.520 of, okay, there might
01:34:24.700 be some, I don't know if I can justify
01:34:26.720 the life in prison for speech. Was this
01:34:28.720 one concern? Not a very
01:34:30.600 strong concern. It wasn't very strongly
01:34:32.700 worded. That being said, you
01:34:34.660 are dealing with a Zoom call 1.00
01:34:36.620 full of Jewish people post 0.99
01:34:38.720 October 7th, so I guess 0.96
01:34:40.660 if you don't strongly want to throw
01:34:42.660 everyone in jail who
01:34:44.100 even, you know, but well,
01:34:47.140 I think it's
01:34:48.620 kind of self-explanatory. They
01:34:50.180 They want to call from the river to the sea is genocide.
01:34:52.860 Anybody who opposes the state of Israel is genocide, blah, blah, blah, blah, blah.
01:34:55.760 Very sensitive topic, obviously.
01:34:57.880 But yeah, that's that's so it was I'm concerned about the bias that people might have and the power that people might have on the Digital Safety Commission.
01:35:07.340 And then, yeah, I don't know if throwing people in jail for life imprisonment for speech is necessarily justified.
01:35:13.800 That's that's as strong as we got in the Zoom call.
01:35:17.300 But maybe maybe he'll have some stronger points later on.
01:35:20.000 Let's let's see what the opposition says.
01:35:22.500 And having by having this overview of the bill, Emily, if you could share your perspective for a few minutes.
01:35:29.120 Yeah. Thank you, Richard. And thank you, Michael, for the great overview.
01:35:34.020 And and I think that we do have many shared concerns.
01:35:37.940 And I'm sure that I think we should just sit down and solve this, the tweaks that need to be made to this legislation.
01:35:45.240 So I thought I'd come.
01:35:47.040 um that's actually kind of a gotcha right there uh ma'am respectfully you helped craft this
01:35:55.780 legislation and you're already saying yeah maybe we should be able to just tweak it right now
01:36:01.280 shouldn't this bill be perfect if you helped create it you helped craft this legislation
01:36:07.260 the way it is and you're already like yeah maybe we should tweak it yeah maybe it's uh yeah maybe
01:36:11.780 i'm trying to justify tyranny and maybe it would the way it's framed right now is a little kind of
01:36:16.680 you know
01:36:18.020 kind of makes it obvious
01:36:20.820 hey let's put a little thing in there where we can
01:36:22.900 kind of copy and paste
01:36:24.540 a charge to throw somebody in jail
01:36:26.920 if we call it a hate charge
01:36:28.060 so basically we could just kind of throw anyone in jail
01:36:30.900 for life if we don't like them
01:36:32.260 if the government doesn't like yeah
01:36:34.060 alright she's a little nervous
01:36:36.740 not coming out strong
01:36:38.540 you helped craft this legislation 0.64
01:36:40.960 Emily laid law 1.00
01:36:42.280 she laid the law down 1.00
01:36:43.500 and she's already like hey
01:36:45.380 uh yeah maybe we should tweak it yeah maybe we should throw it out actually uh hi i think we
01:36:51.460 should throw it out because uh you guys are bringing up so many massive loopholes here and
01:36:55.900 you can't even seriously talk about this bill without being both of you kind of fidgety so
01:37:00.820 compliment this too by talking about how this would work from a victim-centered perspective
01:37:06.140 oh what how this would work from a victim-centered perspective so you know how we know the term like
01:37:21.500 climate change and being not just being uh like anti like anti-racist like super and not just
01:37:27.800 being not racist but being anti-racist these terms didn't just fall out of the sky these terms were
01:37:34.100 workshopped by people they were workshopped by liberal ministers and other people in academia
01:37:39.100 to try and push forth uh what's the word um uh total nonsense bullshit that sounds all nice and
01:37:47.080 fluffily and progressive and this victim-centered perspective oh my god oh my god because because
01:37:55.680 you got to understand guys if this bill passes which it will not but if miraculously i get hit
01:38:01.660 by a bus and this bill passes um that's going to be like a term probably used like five years down
01:38:08.580 the line well as in a victim centered perspective uh this is this is called the genocide see how
01:38:15.520 this person said hey um i don't like your politics i'm going to work out at the gym that's actually
01:38:21.300 a called the genocide and if you take the victim centered perspective then we need to throw this
01:38:27.300 person in jail forever because the feelings of the victim really should determine whether or not
01:38:32.660 someone should be locked up and have any freedoms or not let's listen to the victim let's listen to 0.55
01:38:38.080 the uh emotions and thoughts of the mentally ill victim who is super neurotic and uh as i mentioned 0.54
01:38:48.100 mentally ill let's listen to the victim and let's see what that oh you think this person should be 0.67
01:38:53.080 thrown in jail indefinitely okay let's listen to the mentally ill victim here in this situation 0.73
01:38:58.140 the victim yeah let's listen to the victim of bullying who is going to cry every time that 0.82
01:39:03.100 they open twitter that's who we should listen to to see if people should have rights or not
01:39:07.620 so let's say sorry i gotta time stamp that victims victim centered it's like a victim
01:39:17.400 centered perspective uh oh my god oh my god i just thought i just wanted to take a victim
01:39:29.400 centered perspective on this real quick guys that's such a crazy that's so crazy uh you think
01:39:36.360 the do you think the three young girls who got murdered in the uk by a knife attack should we
01:39:43.040 take the victim-centered perspective on that well they're dead actually they got murdered
01:39:47.700 they got killed by the same guy um descendant of i believe migrants from from africa or something
01:39:56.940 uh can we take the so what are the what are the parents have we heard from the parents
01:40:01.940 of the three white kids who have been murdered can we take a victim-centered perspective on that
01:40:07.020 oh no that's right we're gonna throw violent criminals out of jail into the street so we can
01:40:11.860 get the uh the people complaining about it uh into a jail cell where i would actually like to
01:40:18.500 know that where where are the parents of the of the victims in the in the uk uh of the southport
01:40:24.480 southport murders of those young girls i would like to hear what they have to say anyway you
01:40:30.500 know something you know awful is happening online and we'll you know deal with the real world
01:40:35.360 anti-semitism that we've been seeing flowing online what is available to an individual
01:40:41.540 so that's such a oh man they're so good at this i've mentioned this when we went over the arif
01:40:48.000 verani press conference but they do this thing where they go you know let's say something's
01:40:53.000 happening online and there's also like real world things that happen and like really there's no
01:40:57.200 difference between what happens online like a comment online and what's happening in real life
01:41:01.920 Like they totally conflate the real world with comments online as if they're synonymous.
01:41:09.200 How this would work from a victim-centered perspective.
01:41:12.720 So let's say, you know, something, you know, awful is happening online and we'll, you know, deal with the real world anti-Semitism that we've been seeing flowing online.
01:41:23.560 What is available to an individual?
01:41:26.040 um now what i love about that is that it's like has a crime been committed
01:41:34.040 what's the crime this is what's so funny about these people who justify online harms
01:41:41.660 like the line of like what is a violent crime becomes irrelevant to them because they're
01:41:48.020 trying to move the line to be like oh okay no no now speech now speech is violence now now your
01:41:53.780 opinion is violence like that's that's really what this game is here if you if you kind of boil it
01:41:59.080 down what this game is in passing bill c63 is to move the line of what is criminal to include
01:42:05.960 speech to include certain political opinions that you don't like that's what it boils down to and
01:42:12.940 what they're going to do i'm guessing just off the top of my head is they're going to try everything
01:42:17.500 they can to to do the guilt by association game and be like do you see this violence that happened
01:42:21.860 that speech is basically the same thing as that uh well is there any sort of evidence of proving
01:42:27.140 that the person who said this is actually violent or are you just kind of extrapolating an internet
01:42:33.720 comment to make it now a violent crime or now crime all of a sudden like that's that's what
01:42:40.280 their game is and to be to be i mean let's look they're not gonna make it guys like i'm i'm sorry
01:42:46.780 emily you're working way too hard trying to tie these things together i do not think you're not
01:42:51.680 you're not going to make it good luck good luck but uh i don't think you're going to be able to
01:42:57.360 tie these tie these dots together with legislation she's working around the clock and this is the
01:43:02.240 best she's got victim-centered perspective good luck when michael said how there's three different
01:43:08.420 you know there's almost two or three different bills within one um there's good and bad about
01:43:15.380 that that we can debate uh from a victim perspective when something bad happens you
01:43:20.520 could say well what are the responsibilities of the social media company and that would be i love
01:43:24.640 how she's not defining what something bad is she said when something bad happens when there's a mean
01:43:30.600 tweet is that what you mean when someone says a mean tweet when something bad happens
01:43:40.880 mean tweets have you seen the comment section that's another thing i bet the people who make
01:43:49.360 this legislation like don't have a platform where they post content regularly you know it's it's it
01:43:55.940 would be such a joke for any even like micro influencer like myself or especially like a
01:44:01.160 larger influencer to say like yeah hey guys we got to stop hate online any any big platform would be
01:44:06.760 like what do you mean hate is everywhere online it doesn't matter like all sorts of people get
01:44:13.760 hate constantly on the internet it's it's just it's so synonymous with the internet it's maybe
01:44:21.060 that's a really good argument actually like can you even have the internet without hate
01:44:27.080 that's one of the best parts of the internet you hating people people hating you you know it being
01:44:33.900 it being of much lower consequence because it is the internet that's kind of like kind of one of
01:44:39.060 the joys i would argue is that you can kind of vent your anger in a safe place hello you know
01:44:46.840 it's like when something bad happens when someone gets angry on the internet no when someone actually
01:44:53.400 gets angry about a very specific political agenda that is against our agenda that's what i mean
01:44:58.540 you know it's it's so hard to separate this stuff from uh from the political agenda and the you know
01:45:04.380 tyrannical intention that being said you know the people in this meeting i think i think they're
01:45:10.040 genuine genuinely they could be genuinely afraid of anti-semitism and genuinely afraid that like
01:45:16.340 oh my god like we need to do something uh um so it's not necessarily i i just want to kind of give
01:45:23.220 some sympathy because there are the people like a reefer ronnie and the crooks in parliament who
01:45:29.140 are definitely like you know evan balgord and people with the anti-hate network they like they
01:45:33.780 are they are definitely like have a very tyrannical intention of like censoring certain people and
01:45:38.480 that's like that's their goal but along with that is they they bring along the victims they bring
01:45:43.060 along the people and they feed them the story of like no no like we're gonna save the world
01:45:48.080 by censoring people we don't like but we're gonna save the world from the hate and we're actually
01:45:52.480 gonna stop the hate and the bullying and there's a lot of people who go oh okay okay i believe you
01:45:58.460 okay yeah that's what we're gonna do and they're you know they think they're doing the right thing
01:46:03.720 but uh they're not they're being manipulated by psychopaths and tyrants unfortunately anyway let's
01:46:11.260 continue be the the online harms act that they're looking to create with the digital safety
01:46:16.460 commission or has a crime been committed and you want to go to law enforcement or are you looking
01:46:22.540 at something else where you know that isn't at the level of criminal enforcement but some sort
01:46:28.500 of accountability for the individual which is when you can make a complaint to the human rights
01:46:32.880 Commission. But for hate speech or advocating genocide, if the goal is to impose responsibility
01:46:44.160 on social media services, to put into place some sort of mechanisms to mitigate risk,
01:46:49.280 an individual can go and make a submission to this new Digital Safety Commission.
01:46:55.640 The Commission doesn't then investigate that. It's not a complaint to the Commission. Basically,
01:47:01.320 it receives that information and then it becomes part of the information that it might share
01:47:06.600 publicly about social media or it might be the basis then that the the digital safety commission
01:47:12.360 might launch a hearing essentially like an investigation but different where it's investigating
01:47:18.520 Wait a minute. Wait a minute. It's like an investigation.
01:47:32.380 That's something else where, you know, that isn't at the level of criminal enforcement, but some sort of accountability for the individual, which is when you can make a complaint to the Human Rights Commission.
01:47:43.540 um but for for hate speech um or advocating genocide if the goal is to um to impose
01:47:53.100 responsibility on social media services to put in place some sort of mechanisms to mitigate risk
01:47:56.660 an individual can go and make a submission to this new digital safety commission um there is
01:48:00.540 no um the commission doesn't then investigate that it's not a complaint okay so they don't
01:48:05.060 investigate that it receives that information and then it becomes part of the information that it
01:48:14.540 might share publicly about social media or it might be the basis then that the the digital
01:48:18.220 safety commission might launch a hearing essentially like an investigation but different
01:48:21.800 where it's so it's not an investigation but it also is an investigation 0.93
01:48:27.300 oh come on emily you know i don't know if she's gonna make it guys i don't know if she's gonna
01:48:35.840 make it here investigating not your individual piece of content but more broadly say if a social
01:48:40.320 media service is complying with you know what michael talked about is that duty to act responsibly
01:48:44.500 and that duty duty involves taking adequate measures to mitigate the risks of exposure to
01:48:51.100 the harmful content to exposure to say hate speech to content that advocates genocide the
01:48:57.000 individual can go to this ombudsperson i think this is really important it is there to provide
01:49:00.960 victim support to say victim support got victim support yeah i mean it's it's we've already seen
01:49:14.440 the trend with progressive politics to empower victims and to like make people feel special for
01:49:18.960 being a victim and you know they're gonna salivate at this legislation oh my god i can i can arrest
01:49:26.140 people online i can be financially incentivized to to arrest people online because i'm gonna like
01:49:32.740 play up the fact that i'm a victim this would be a nightmare this would be a nightmare look this
01:49:38.620 was happened with these are your options this is how you can navigate it um the odd buns person
01:49:43.040 would be able to pass on information to that commission um also the digital safety commission
01:49:47.520 a major part of their mandate is is education and research supported by a digital safety office
01:49:52.160 so part of this is taking the data that they get from these companies and saying okay
01:49:55.980 where do we go from here what do we know about the landscape of what's going on what are the
01:50:00.020 next steps more broadly about some supporting canadians and in addressing these types of
01:50:04.400 harmful content this is so interesting because it's like okay we want to get a bunch of data
01:50:09.280 from big tech to see like what is going on here it's like so what you want want to data scrape
01:50:16.960 people who who are like saying the wrong things online for what like it's so unspecific other than
01:50:23.840 we just want more control we want to see what's going on we want to you know we want to we want
01:50:28.400 to get our hand in the cookie jar there and just kind of like know what's going on here we want
01:50:32.000 more power we want more power is what we want but for individuals for their individual piece of
01:50:37.840 content in the end when it comes to social media services you're you're looking at still going
01:50:42.720 directly to the social media service but it isn't it's a long-term project with the digital safety
01:50:46.960 commission it's about harm reduction and it's about setting minimum standards for these companies
01:50:51.200 about how they manage these risks of harm again total that these companies don't have a minimum
01:50:58.240 already you're you're looking at still going directly to the social media service but it is
01:51:03.360 an it's a long-term project with the digital safety commission it's about harm reduction and
01:51:07.040 it's about setting minimum standards for these companies about how they manage these risks of
01:51:11.440 harm and so hopefully i mean if this is working at harm reduction the goal would be that you're
01:51:16.560 You're going to see better standardization about how this is managed by social media services.
01:51:21.180 Imagine being like an expert. 0.68
01:51:23.920 What is she, an expert on what again?
01:51:26.780 Is this her?
01:51:29.820 She's an expert.
01:51:33.340 She's like an expert on like cyberbullying and all this. 0.99
01:51:37.100 But she's like, we need big tech needs to have a bare minimum on enforcing this stuff.
01:51:42.120 Are you serious?
01:51:43.760 Are you kidding me?
01:51:45.020 you don't think that big tech has some sort of bare minimum to enforce harmful content like
01:51:52.060 what it's it's so disingenuous it's like these people are just completely ignorant
01:52:00.060 and and and and daft and incredibly stupid in terms of not knowing what big tech does 0.64
01:52:08.860 or it's just straight up malicious it's straight up malicious it's straight up sort of
01:52:14.300 of, uh, you know, I'm in, I'm here to push the tyranny. We're
01:52:18.580 here to get the power. We're here to get the money. Like
01:52:21.440 there's no in between. You're either incredibly stupid and
01:52:24.180 naive and, and, and literally don't know anything about the
01:52:28.880 so-called big tech companies you're trying to regulate or
01:52:32.960 you're literally an evil person who knows exactly what they're
01:52:37.240 doing. And this is all just to show to get more power and
01:52:40.360 control go to law enforcement i i have many concerns about the criminal code provisions
01:52:46.580 like michael that we can dig into you wrote the legislation
01:52:50.880 miss laidlaw mrs laidlaw emily you wrote the legislation you have concerns
01:52:56.640 you have concerns you wrote it you wrote it and you have concerns must be a real good piece of
01:53:04.280 legislature. Good job, Arif Varani. Fantastic. The person who helped contribute to this also
01:53:11.160 has concerns. Or you can go to the Human Rights Commission. I think that reintroducing Section 13
01:53:16.820 was always going to be controversial. I think that some important steps have been taken to
01:53:22.160 narrow the scope of it, a very clear, a much clearer definition of hate, the ability to dismiss
01:53:26.760 complaints, to be able to impose costs on parties that are frivolous and vexatious in the complaints
01:53:32.660 that they make um i take seriously though michael this this is the more clear definition of hate
01:53:38.400 right here by the way this this this chart this chart is the more clear definition of hate
01:53:44.000 is it detestation is it detestation that is not allowed disdain that's okay humiliates
01:53:55.680 that's okay dislikes okay offends okay but no no detests and it's very clear it's very clear what
01:54:02.300 the difference between disdain and detest is. We're a tyrannical government. You can trust us.
01:54:10.160 Liz commented separately on a panel with me about the concerns of weaponization of that process.
01:54:16.260 And what I have been trying to mull are what are ways that, you know, that can be contained,
01:54:21.460 because there's an important access to justice element to being able to go to the Human Rights
01:54:25.180 commission important access to justice element being able to go to the human rights commission
01:54:33.900 why do i feel like what she's talking about is coming from a victim perspective 0.74
01:54:39.940 justice to the victim right justice to the the mentally ill uh person who is terrified of
01:54:48.000 internet comments their their form of justice justice is now taking on a a new definition here
01:54:54.320 uh according to the human rights uh person they hurt my feelings and they're not in jail yet i
01:55:00.400 want justice unbelievable all right this this video is really long so we got to speed this up
01:55:07.260 i'm only 15 minutes in it's an hour long um the last point i'll make is about the regulatory
01:55:13.040 structure of the digital safety commission i i don't have the concerns that michael has about
01:55:18.520 the structure of the commission itself um i think that there is oversight through the ordinary
01:55:23.740 processes um through the federal court um there are we need a commission that has power the power
01:55:29.900 to investigate the power to impose fines um there is an important duty on the digital safety
01:55:34.820 commission to take account of privacy freedom of expression and equality among other things
01:55:38.900 in setting out any of the guidelines it writes or or um any regulations um that's it's so funny
01:55:45.940 like it's it's just doublespeak it's all like everything baked in here is so doublespeak she
01:55:51.260 says we need the power so it's important we have the power also we're concerned about freedom of
01:55:57.740 expression court um there are we need a commission that has power the power to investigate the power
01:56:03.100 to impose fines um there is an important duty on the digital safety commission to take account of
01:56:08.540 privacy freedom of expression and equality among other things yeah that's important too
01:56:13.580 power to find people but we also need to be concerned about equality and freedom of speech
01:56:19.500 I can just see Ari Farani being like, okay, make sure, hey, make sure every time you talk about
01:56:25.080 the power and the fines and throwing people in jail, make sure you always follow that up with
01:56:29.660 a comment about how much you love freedom of speech. All right, honey? Good job. You're doing
01:56:33.820 great, kid. In setting out any of the guidelines it writes or any regulations. And much of this
01:56:41.200 needs to be left to be developed later because of evolving tech and social media. You know,
01:56:45.100 if you're oh oh really that's a great caveat isn't that a great excuse to pass through shit
01:56:53.220 legislation let's hear that again let's hear that again i love that i love beautiful bureaucratic
01:56:59.420 excuses like this this is just privacy freedom of expression and equality among other things
01:57:03.460 in setting out any of the guidelines it writes or or um any regulations and much of this needs
01:57:09.880 to be left to be developed later because of evolving tech and social media you know if
01:57:13.680 Their powers are essentially to write regulations about risk management, digital safety, kids design, like what the obligations are to keep kids safe.
01:57:23.020 No, we just need to rush this legislation through and figure it out later because the Internet's always changing, you guys.
01:57:29.340 Let's just make sure we have this huge bureaucratic body with a lot of power that can find people.
01:57:34.340 And we'll figure it out later because kids safety.
01:57:37.600 You got to give them credit, you know, because it's like these are their best talking points, I think.
01:57:43.680 And I could see how this would make sense to someone who's like, oh, okay, yeah, yeah, internet, it's explained later, yeah, okay, all right, yeah, let's just pass it through, fair enough, all right.
01:57:56.660 And how to manage their complaints.
01:57:58.520 So once I dug deeper, it doesn't seem as broad as it seems, it's very scoped what it is, but certainly...
01:58:04.600 Bullshit.
01:58:07.480 It really doesn't seem as broad as it is.
01:58:10.860 That's crazy.
01:58:12.420 that's such a crazy comment it's about risk management digital safety kids design like
01:58:19.940 how what the obligations are to keep kids safe and how to manage their complaints so once i dug
01:58:25.400 deeper it doesn't seem as broad as it seems it's very scoped to what it is but certainly there is
01:58:29.720 that uncertainty of saying we don't exactly know what the risk management plan will be for those
01:58:34.680 particular companies um i think i'll leave it there and we can dig into this with uh with various
01:58:39.300 uh questions and and kind of we'll we'll solve all this right michael i don't i don't even know
01:58:45.460 like what the takeaway was from from her opening statement she said herself that uh there's issues
01:58:52.020 with it even though she helped contribute to writing this legislation i think if you wrote 0.98
01:58:56.660 the legislation you should be standing behind your work but she's so insecure and unconfident 0.97
01:59:01.940 in her work maybe because she has a guilty conscious conscience uh that she's like yeah 1.00
01:59:06.820 yeah no we do yeah we do need to change things um you know the digital safety board doesn't
01:59:12.100 investigate fast forward 10 seconds well you know it's basically an investigation
01:59:15.860 okay uh there's some double speak there uh it's totally broad
01:59:23.220 sorry it's not broad at the end which is interesting
01:59:26.660 poor emily i don't think she's gonna make it i don't think she's gonna be able to
01:59:32.380 push the line of what being criminal
01:59:34.680 is into the realm of
01:59:36.380 people speaking freely
01:59:37.500 she's not going to make it but
01:59:40.400 yeah such boilerplate
01:59:44.940 someone said yeah exactly
01:59:46.140 exactly
01:59:46.860 she loves the sound of her own voice and whiffs her own farts
01:59:50.740 okay
01:59:51.580 listen listen listen
01:59:54.100 hey okay let's try and be respectful
01:59:56.760 here
01:59:57.220 that's uh is that from something 1.00
02:00:00.520 though that's a really good she loves the sound of her own voice and whips her own farts that's 0.59
02:00:04.440 like a classic like narcissism uh characterization alberta climber says i'm guessing that she's 0.99
02:00:12.600 either a lesbian or single divorced in brackets guys you guys you guys keep them coming um
02:00:21.700 she has a permanent skunt look on her face i don't even know what that is okay guys let's 0.99
02:00:29.840 Stop the cyberbullying, okay?
02:00:32.360 We don't want to be criminals now, do we?
02:00:34.520 No, I'm kidding.
02:00:35.140 I love cyberbullying.
02:00:36.040 It's a lot of fun.
02:00:37.400 It's what makes the internet fun, you guys.
02:00:39.180 Then again, from a victim perspective, we're basically committing violence.
02:00:47.060 That's basically we're terrorizing her emotional world.
02:00:53.020 We're terrorizing her emotional real world vis-a-vis.
02:00:57.100 it's real world violence vis-a-vis human rights court says i need to wear an ankle bracelet for
02:01:02.920 the rest of my life guys there's no way we can let this pass okay if you guys are just tuning in
02:01:09.440 we are saving free speech in canada we're exposing the online harms act aka bill c63
02:01:15.640 if you want to support our work go to givesendgo.com slash save free speech you can also
02:01:23.780 check out savefreespeech.ca we got a working website there lots to come we already have some
02:01:28.360 videos out there on our social media but we are trying to stop bill c63 i'm just doing some of
02:01:33.300 the research right now hearing some of the strongest strongest arguments in favor of bill c63 the online
02:01:38.840 harms act it needs to be challenged it needs to be challenged we need to humiliate the people
02:01:43.100 trying to push this it would be the end of free speech um but yeah check that out at give send
02:01:47.140 go.com slash save free speech. Oh my God. Scarecrow scarecrow with $50. Yes. Thank you so much for
02:01:54.280 donating, sir or ma'am. We don't stop this by giving in. Our only chance is to fight it. I'll
02:01:59.560 take a slim chance over no chance. Yo, that's a bar. That is a bar right there. That is a bar. 1.00
02:02:07.640 I'm definitely stealing that scarecrow. That is so good. I'll take a slim chance over no chance.
02:02:13.240 boom boom heck yeah shout outs to scarecrow 07s to scarecrow in chat that pumped me up that
02:02:25.460 pumped me up and you know maybe it's silly but i feel like little little bumper stickers like that
02:02:32.040 little catchphrases like that they're powerful okay hope is a very very powerful thing and
02:02:37.460 this is a big part of the challenge it's not just exposing bill c63 okay it's not just
02:02:44.660 whoops it's not just exposing bill c63 and the online harms act and about how it's not about
02:02:51.220 protecting children and it's really about tyranny but it's about inspiring canadian it's about
02:02:56.640 fighting demoralization and it's about inspiring canadians to say hey we can stop this there is a
02:03:02.820 slim chance but it's a chance okay we can make enough noise to do this we have to try we have
02:03:08.540 to try we have to this is this is in america it's the foundation of the country of the constitution
02:03:17.820 the first amendment that's then that's the crazy thing you know if this was happening
02:03:24.240 in america there would be way way way way way more pushback on a crazy broad bill like this
02:03:31.600 and of course they are trying to like weasel their way in in america but they're much more
02:03:36.020 tactful in america because because they they know the stakes are higher they they know that patriots
02:03:41.060 are out there they're much more sensitive to tyranny they're much more sensitive to this
02:03:45.020 bullshit not here in canada we're too chill here that's a problem and unfortunately the people who
02:03:50.720 know what's going on are too demoralized but little catchphrases like that like i would take 0.54
02:03:56.060 a slim chance over no chance like let's do this let's let's freaking go we need to politically
02:04:02.920 activate people in this country and i think we can do it i think we can do it um yeah and okay
02:04:11.840 just to address some people hey hey i'm getting banned in chat blah blah blah listen it's important
02:04:16.940 that we represent say free speech properly to reach as many people as possible so don't take 0.83
02:04:22.940 it personally oh you're a hypocrite you're you're banning people in chat you're you're you're you're
02:04:27.340 free speech advocate you're not letting me speak freely listen okay we got we we got to focus on 0.87
02:04:35.820 the the bigger picture here we got to focus on on doing damage in this in this um you know in this
02:04:41.620 what do you call it fifth generational warfare okay we got to make striking blows here and i
02:04:49.700 love the chat i love reading the chat but you know a little comment on chat in chat is not going to
02:04:54.040 necessarily change change the world here especially if it's something that no offense one of my
02:04:58.940 moderators deems is like not productive to the conversation okay so don't take don't take it
02:05:04.440 personally we're here to fry bigger fish much bigger fish let's focus on the bigger fish
02:05:10.880 respectfully okay thank you again scarecrow for the 50 donation really appreciate it let's put it
02:05:17.400 on screen one more time. We don't stop this by giving in. So true. Our only chance is to fight
02:05:24.380 it. I'll take a slim chance over no chance. Bars, straight up bars from Scarecrow. Thank you again
02:05:31.940 for the $50 donation. If you want to donate, Hey, we cracked 3,900. Let's go. If you want to
02:05:37.260 donate, go to gifts and go.com slash safe free speech. I do need to, uh, get a drink and go to
02:05:45.200 the bathroom though so we can so we can push out and finish this uh finish this we'll probably
02:05:50.480 wrap this up within the next hour but uh what should i put on screen for you guys while i go
02:05:55.280 take a little quick break um uh let's oh you know what i know what we'll bring up
02:06:03.420 you know what i might be uh interviewing this gentleman
02:06:10.560 shout outs to um shout outs to wiretap media he um he posted this video today cbc news acts my
02:06:22.840 interview this guy is exposing um the car theft epidemic and the rise of crime in canada
02:06:29.160 he's a car guy cool guy sounds like a great guy if things go well i'm going to be interviewing him
02:06:35.880 uh on thursday so stay tuned for that on uh on my thursday stream but let's take a look at this
02:06:43.560 video here i'm gonna go get a drink and yeah shout out shout out to uh wiretap media for uh
02:06:49.960 for posting this i'm gonna hide myself for a second and what am i gonna do here come on
02:06:59.000 let's get that a little bigger all right i'll be back in a sec
02:07:06.060 whoops
02:07:10.120 was to canadians this is so clear to me because they came to interview me about the car theft
02:07:18.160 situation that is happening across the country they canned the interview they did not release
02:07:23.240 the interview because it did not fit the narrative that they were hoping to go ahead and spin to
02:07:27.440 canadians you're going to ask me well what did you say saundran in the interview they asked me
02:07:32.240 what is the situation going on with car thefts here's the high level and this they canned the
02:07:36.240 interview because of this i told them one it's a failure of our government our leadership team
02:07:40.820 the justice system that has allowed this to happen in the first place number one number two we need
02:07:45.940 to hold these leaders accountable because now canadians feel like their communities are no
02:07:50.340 longer safe and there's stats to prove this 50 percent year over year increase in car thefts in
02:07:55.960 ontario 400 increase in home invasions in ontario that lead to a car theft i shared all of these
02:08:02.840 stories and these stats with cbc and they didn't want to go ahead and share this because this does
02:08:07.240 not fit their narrative i let them know why are we not treating criminals in this country like
02:08:12.400 criminals people are terrorizing canadians hundreds of you have sent me the stories of
02:08:18.000 your family's being traumatized and i shared those stories with cbc they didn't want to hear it it
02:08:23.360 doesn't fit the narrative they don't want to hear the canadian side of the story they have their own
02:08:28.260 specific side of the story that they want to share this is such a big issue our newscasters in canada
02:08:33.660 are completely biased and i can tell the person that was interviewing me did not like the answers
02:08:38.740 i was giving which was the truth why do we not have proper authentication in vehicles today it's
02:08:44.240 2024 why do we have all of this crime happening across the country because we don't hold these
02:08:49.400 criminals accountable the toronto police chief just said we are catching people and releasing
02:08:54.280 them right away their entire police force feels disenfranchised because what are they doing what
02:08:59.040 are they doing and then how are we shipping tens of thousands cars across the port of montreal it's
02:09:03.940 like we are blind and i shared this common sense story that all of you are sharing with me and they
02:09:10.280 can the interview this is a message to all canadians do not trust cbc news they do not
02:09:15.640 share both sides of the story they just share the side of the story that benefits their benefactors
02:09:21.160 cbc news is all about spinning a narrative they're not there to actually share the news
02:09:27.400 uh bars uh absolute bars by this man uh yeah i'm excited to chat with this guy i just made
02:09:35.640 made some commentary on the save free speech page.
02:09:38.440 Save free speech page.
02:09:40.280 This is why it's relevant to the fight to save free speech.
02:09:43.400 We hear some Canadians say,
02:09:45.420 if my freedoms are being violated,
02:09:47.220 I would hear about it on the news.
02:09:49.640 Um, no, you would not.
02:09:51.640 CBC refuses to report on the violent crime
02:09:53.860 and car theft epidemic happening in Canada.
02:09:56.000 If our safety isn't their priority,
02:09:58.120 then why would our freedoms be?
02:10:01.300 We're passing Bill C-63.
02:10:03.200 We're going to keep, we're going to keep you safe.
02:10:04.820 we're gonna keep you safe safety is our priority yeah i doubt that you're not gonna report on the
02:10:10.920 rising crime happening in this country it's absolutely insane cbc news does not and also
02:10:16.880 to uh to also echo that fact where is it right here also also from wiretap media he posted five
02:10:25.660 stories that didn't make canadian media headlines and i wanted to focus on this one canadian citizens
02:10:32.060 were impacted by data security breach
02:10:34.080 when 3 billion social security numbers
02:10:35.700 were stolen from data centers
02:10:36.900 and sold on the dark web.
02:10:39.080 You think this is going to be mentioned
02:10:41.020 in Bill C-63 to keep us safe online harms?
02:10:45.760 How about my data being breached
02:10:48.220 by who God knows what kind of bad actors?
02:10:52.480 What about that part of safety?
02:10:54.300 Oh, you're just concerned
02:10:55.000 about censoring dissidents, aren't you?
02:10:56.780 Okay, great.
02:10:57.880 Great, fantastic.
02:10:59.820 Fantastic.
02:11:01.140 Yeah, shout outs to
02:11:01.960 wiretap uh media he's making some good edits this is him on twitter
02:11:06.460 all right let's get back into reacting to the cj video about bill c63
02:11:17.860 um yeah chocolate milk milk and cookies no this is my beverage of choice it's the best
02:11:27.340 it's the best pop they got going i'm gonna say pop because it's very canadian
02:11:32.620 i'm just gonna i mean i'm gonna take a quick break here and have a have a couple sips
02:11:41.780 couple sip break before we hop back into this reaction
02:11:45.940 oh no was there a tim horton commercial oh bro okay i try i may have turned on monetization
02:12:01.800 on this one i i apologize but just so you know it does it does help if sit through the ad
02:12:09.460 you're helping me out you're helping me out so you're helping me support myself so i can focus
02:12:14.340 on saving free speech in Canada so just know if you're watching some annoying ad that uh you're
02:12:21.520 helping save free speech on my channel but if you're watching an ad on some other um some other
02:12:25.800 video no but uh no I know what that's like you know I actually speaking of anger and hate I I was
02:12:37.180 i'm watching the end of game of thrones right now and i am so filled with rage because the person
02:12:45.420 who operates the crave account where i watch it that's streaming service on crave this is kind of
02:12:50.300 a tangent but they got the cheaper version of crave and now there's ads there is four ads
02:13:00.580 every 10 minutes of content for so i'm watching like the long night like one of the most epic
02:13:10.060 uh game of thrones episodes there is it's an hour and 20 minutes long it's like this movie
02:13:14.880 and then like right at the most intense moments it's like you could say 15 or more i'm like oh
02:13:21.800 my god i forgot how painful this is it's like it's just pure like when when the ad starts it's
02:13:28.760 just pure pure pure rage pure rage where it's just like i i just really want to
02:13:36.520 like i shouldn't say it but like i really just like i i feel like snuffing the life out of
02:13:44.620 something right now you interrupted the most important part i was there emotionally i was
02:13:50.620 i was suspending disbelief i was in the world of of john snow and the dragon queen
02:13:57.360 And then it's like
02:13:59.180 You could save 50% or more on car insurance
02:14:02.500 It makes it
02:14:06.320 It does make me a little grateful though
02:14:07.920 Of like how
02:14:08.860 Because we used to have to watch ads all the time right
02:14:11.780 Like I'm a millennial
02:14:13.060 I'm 35 now
02:14:14.800 And yeah I grew up watching TV
02:14:17.960 Those ads
02:14:19.340 Sometimes those ads were brutal
02:14:20.980 The difference is though
02:14:22.780 They designed TV to have an ad break
02:14:25.940 And a natural break
02:14:27.120 they don't have that in in like modern tv shows on netflix and crave and hbo and all this
02:14:32.880 there's no natural break for a commercial so it's like right in the middle of some
02:14:38.340 emotional beat something so dramatic and ah
02:14:41.920 hey do you do you want to sign up to audible and on top of that the audio isn't mixed properly
02:14:51.900 so you'll hear like an ad with like super loud music super low it's unbelievable anyway let's
02:14:56.660 focus here. What are we doing? Guys, what are we doing? Focus. Come on.
02:15:08.160 Dark Derek Nelson. There's been at least a dozen ads on this live stream alone
02:15:12.160 on iPhone. Oh, bro.
02:15:15.760 When Greg inadvertently makes a commercial for piracy. Did I?
02:15:21.260 Oh, yeah. Okay. I see what you're saying. Yeah. Yeah. Yeah.
02:15:25.260 just pirate it bro um commercials ruined my life
02:15:30.640 i'm let's take a victim-centered perspective on this i'm just trying to enjoy my game of thrones
02:15:39.660 and then i get violated by ads violated crave violates me with their advertising
02:15:47.100 anyway i'm going to arrest crave for uh their hate speech their online yeah you know what you
02:15:54.760 want to talk about online harms let's talk about these ads on the crave streaming service okay i'm
02:16:00.380 trying to enjoy my tv show and then i'm attacked i am my senses are raped okay by this ad interrupting
02:16:09.080 my tv show i use this tv show for escapism it's the one part of my day where i get to escape the
02:16:16.340 stresses of being a victim to capitalism of being a victim to the patriarchy of white supremacy i'm
02:16:23.180 trying to watch my tv show and then an ad comes up violating me
02:16:29.300 it's harmful it's bullying you know what it is it's bullying that's what it is that's what it
02:16:41.100 is and that's why we need to pass bill c63 the online harms act so i can fine crave for money
02:16:48.880 I want to find them
02:16:50.500 I want to throw some of them in jail
02:16:52.200 Because I had a bad day today
02:16:53.800 And I don't know how to deal with my emotions
02:16:55.840 So we need this bill
02:16:58.040 So I can feel better about myself
02:17:00.340 For punishing people
02:17:01.780 Who make me uncomfortable
02:17:03.640 And make me have to self-reflect on the fact that
02:17:06.400 I'm overweight and don't take care of myself
02:17:08.680 And haven't dealt with my emotional problems
02:17:10.880 Oh and we want to protect kids online too
02:17:14.520 That as well
02:17:16.140 Okay yeah thanks
02:17:17.640 Perfect
02:17:18.220 all right let's get back into this
02:17:27.740 what's fact and what's fiction the new online harms bill all right hopefully we speed this up a
02:17:32.700 bit yeah thank you so quick question just so uh to help me uh understand like a very touchless
02:17:39.660 concrete thing so i see something on twitter that i think is very hateful if if this bill is in place
02:17:45.900 what do i do what what are the possibility the options that that that as a jewish person i i can
02:17:51.580 i can do you walk away from the computer you can put your phone down um how hateful is it
02:18:03.260 how hateful is it how how do we like first question would be how do we determine uh is it
02:18:10.380 Where was the tweet?
02:18:18.080 Disdains, but not detests.
02:18:20.840 Sorry.
02:18:21.680 Try again next time.
02:18:23.780 Oh, but someone in the government, someone at the ombuds person said,
02:18:27.640 actually, no, we can make the argument that that's detestation.
02:18:29.900 Sure.
02:18:30.440 Yeah. 1.00
02:18:30.780 Let's find this motherfucker.
02:18:33.280 Pardon my language.
02:18:36.020 So he poses the question.
02:18:38.140 Good question.
02:18:38.620 hey i find something hateful on twitter what do i do as a jewish person uh can one of you answer
02:18:44.440 that yes Emily maybe i'll just jump in because i um that was a bit where i started which is well
02:18:48.520 the first step would be to claim this to the social media platform i think we're seeing of
02:18:52.260 course on x that they're not taking very seriously any complaints about hate um the other option
02:18:58.040 then is to submit let's should we start counting should we start counting maybe we'll start
02:19:05.740 counting. That might be fun. I should have started counting
02:19:09.880 from the beginning, but let's see if this is worth it or just a
02:19:14.940 big waste of time. Is there a way to make this? Okay. So this
02:19:24.580 number is the number of times hate is mentioned without
02:19:27.500 clarifying what hate is. So that's one. Actually, no, it's
02:19:31.520 two because he it's two because he said the tweet and he
02:19:35.500 didn't uh specify all right uh that information to the digital safety commission they're not going
02:19:42.160 to act on your complaint that will be information that feeds whether they do a bigger investigation
02:19:46.500 or bigger hearing about the social media service or you can go to the ombudsperson who can't do
02:19:51.740 anything exactly but provide support to help you figure out what to do so that would be perhaps a
02:19:56.520 first point of call as well and they can help you navigate um and the last two options would be if
02:20:01.880 think it's really it's reached that threshold go to the police or make a complaint to the canadian
02:20:05.880 human rights commission go to the police
02:20:14.280 again maybe if they had defined what the hate is but this person without even flinching is saying
02:20:20.600 you should maybe call the cops actually if you read the question was what do you do when you
02:20:25.080 read a hateful tweet she says maybe call the cops yeah i might want to call the police
02:20:31.880 if you read a hateful tweet without even clarifying anything about what the tweet might
02:20:38.160 have said don't you think that's kind of an important thing to clarify no let's just it
02:20:44.460 was a hateful tweet i'm call the police of course that's what i'd do this is insane hope please
02:20:51.420 please michael geist say something in response about defining hate please i hope our man michael
02:20:58.480 guy who says something here yeah if i can for just a moment we should pick up on that i think
02:21:02.740 that that provides a really good review of the different options i will say that you know the
02:21:06.460 idea that this is suddenly going to result in you know when you face something online that it comes
02:21:09.980 down or that you get justice is unlikely it still will depend on uh especially initially some of
02:21:14.380 the things social media companies do i face a lot of this sort of stuff especially on twitter on my
02:21:18.320 own account when i post on anti-anti-semitism related concerns and don't have really any
02:21:22.120 expectation that the law would directly change that but what can we get some specificity
02:21:27.780 of of the anti-semitism of the hate i feel like this is kind of important it will change and i
02:21:35.300 think this is a positive is that the process around dealing with some of this content where
02:21:38.940 there are complaints about the prospects sometimes of content coming down is oftentimes very opaque
02:21:42.980 i on a on a less controversial site i had content that was removed when i posted an anti-semitic
02:21:48.360 anti-semitic image trying to highlight the fact that this is what the community is facing day in
02:21:51.980 day out right now in canada in 2024 it's it's really unthinkable and quite astonishing i posted
02:21:56.820 it on linkedin that that content came down it was presumably an algorithm or ai of some sort that
02:22:01.860 decided it came down because there wasn't the context it went back up it came back down again
02:22:05.860 ultimately there was some sort of review but the entire process about how those decisions are made
02:22:09.860 something that we don't have much insight into and what this legislation will do as part of a
02:22:13.860 digital safety plan is not necessarily tell you what happened to your particular piece of content
02:22:18.260 but it will add i think far more transparency to a process that will require companies to disclose
02:22:23.300 complaint data information on how they respond to that data and so that people may in the
02:22:28.180 individual case be left unhappy or happy with what the resolution was but at least we're going to have
02:22:32.180 i think far better more open information about what is actually taking place behind the curtain
02:22:36.580 of many of these social media companies which right now very often you file a complaint you
02:22:40.020 have no real idea of what comes of it hmm so what i'm hearing is instead of getting access to the
02:22:47.300 algorithms of how they feed different canadian citizens their information and algorithms on
02:22:52.660 on their feed now they want access to how social media companies deal with censoring content or
02:22:59.400 taking down content they want to see what's inside of the big tech companies they want
02:23:05.080 more data they want it to be more transparent um and like i think the added piece that is not
02:23:16.140 mentioned there is you want more bureaucracy involved in that so he i think it's interesting
02:23:22.080 how he's like hey it's a messy process i posted something that was anti-semitic but i wanted to
02:23:27.720 highlight that it was anti-semitic and it got taken down then it got back up and it was a whole
02:23:31.620 mess and it's like okay so adding adding canadian bureaucracy to this is going to help how
02:23:36.500 you know the big tech platforms are already struggling with this and you think adding the
02:23:41.780 canadian government is going to help deal with the you know tricky nature of you know moderating
02:23:50.100 online content if anything it's going to be obviously more politically biased like that's
02:23:56.700 like that's all i can that's all i can say but it's it's um once again it's wanting power it's
02:24:02.320 wanting more power over how the social media algorithms work i'm i'm quite disappointed so
02:24:08.340 far that mr geist did not bring up like a hey maybe we should define what hate is he kind of
02:24:16.980 just said yeah yeah i got a lot of hateful comments too on my on my social media but i don't think
02:24:21.020 we're going to get any justice there uh can't like i hate to say it but can they even make
02:24:28.440 something like this can they make something like this they don't even have anything like this
02:24:33.100 to you know classify what they're talking about i mean i mean they've said it in words but if
02:24:40.140 they're not going to hone in closer well i mean it's a tyrannical bill so i i really don't care
02:24:45.380 them to hone in closer closer because i know it's all nonsense but it's silly that they're even
02:24:49.700 talking about this with more of a kind of defined uh definition of what's going on here of what what
02:24:54.740 the actual content is okay thank you it's hard to have this conversation uh without taking the
02:25:01.700 context into into question and we're seeing accusations that israel is committing genocide
02:25:07.300 uh that if you're supporting israel you're supporting genocide um and which
02:25:12.260 Which is an argument that people are making and there's validity to that argument. 0.77
02:25:23.460 So, you know, like it gets so hairy so quickly, especially on the Palestine Israel topic, because if you haven't seen, there are people on both sides saying I want to kill that other side.
02:25:39.640 all of them. There's Palestinians or Hamas supporters saying this and there's Israelis saying
02:25:45.680 this. They're both calling for genocide in one form or another. Both sides. Now of course we're
02:25:52.660 watching Sija so it's mostly all Jewish people on this call so obviously they're going to have a 0.89
02:25:56.920 very very strong bias but anyway it's interesting it'll be interesting to see where they go with
02:26:01.180 this. Without taking the context into into question and we're seeing accusations that
02:26:08.640 that Israel is committing genocide, that if you're supporting Israel, you're supporting
02:26:12.500 genocide. And there's concern out there that, and given also the context also that Jews
02:26:19.660 are often portrayed at least by some people as oppressors, both in Israel, because Israel
02:26:25.680 is a powerhouse in the region, and in Canada as being white settler colonialists and all
02:26:31.940 that stuff. And we've heard concerns that being a pro-Israel advocate, being a defender
02:26:37.840 of zionism of the right of the jewish people to self-determination could uh lend somebody into
02:26:43.220 legal trouble with this legislation and and i was getting emails just before this this uh this panel
02:26:49.680 started so i really like to to have your thoughts on this because we're hearing it and and and your
02:26:54.920 opinion matters on this so uh emily if i could start with you and then go to michael this is a
02:26:58.840 great hey bravo to the panel guy because this is actually a really good question uh hey everyone
02:27:04.600 who is uh pro-israel could technically be accused of supporting genocide so how does that work
02:27:11.780 with this legislation it's a really important question and i would say that that concern about
02:27:18.940 what the threshold is for hatred is probably one of the key concerns about this legislation
02:27:23.380 again key concern you wrote the legislation and you don't know there's not a clear answer here
02:27:30.480 right you wrote the legislation there's no clear it's almost like it's terribly written
02:27:35.960 legislation that's incredibly broad even though you said it wasn't broad earlier but anyway um
02:27:41.640 so uh let me identify where there's i don't think there should be concern and perhaps where there
02:27:46.080 should be um the the supreme court has set a very high threshold for what hate speech means um and
02:27:54.460 so and it they really have not they've said a subjective one but please continue and and none
02:28:02.540 of what you're talking about would be captured by that um what what are you gonna say the magic
02:28:09.360 word vilification and detestation if if you're saying that you know if you make the argument 0.55
02:28:15.760 that that that israel is committing genocide or you say that they're villains for committing
02:28:21.100 genocide or you you support um israel and say hey um palestinians are villainous or palestinians are 0.60
02:28:30.940 detestable or hamas is detestable that's the threshold you pass the threshold for the this
02:28:37.700 new depth supreme court definition of speech so hate speech is ultimately at that level where
02:28:46.580 it's beyond even disdain it's beyond offense um that you're essentially uh communicating
02:28:52.180 statements are predicated on the destruction of an entire group of people and the threshold
02:28:55.580 should be incredibly high there's so that's not even correct that's not even correct let's look
02:29:02.840 it up destruction of an entire group of people detestation and vilification are the words
02:29:14.540 the definition detestation detestation intense dislike
02:29:19.540 he is the detestation of the neighborhood
02:29:23.660 woodworth detestation of eritocracy this is one of the words this is one of the words
02:29:34.160 that the supreme court has defined as criminal hate speech what's the other one vilification
02:29:39.780 vilification vilification abusively disparaging speech or writing the widespread vilification
02:29:51.240 of politicians you know google google definitions are it's cooking right now i'm loving this
02:29:59.280 vilification um so destruction of an entire people is not uh part of vilification or detestation
02:30:11.300 which is the supreme court definition broadened definition of criminal hate speech so the woman
02:30:18.060 who wrote this legislation or helped contribute to developing the legislation doesn't even know 1.00
02:30:25.000 what she's talking about. She's not going to make it. Okay. So hate speech is ultimately at that
02:30:33.840 level where it's beyond even disdain. It's beyond offense that you're essentially communicating
02:30:40.040 statements that are predicated on the destruction of an entire group of people and the threshold
02:30:43.480 should be incredibly high. There's still a risk of a chilling effect because people are fearful
02:30:47.800 of bad decisions by regulatory bodies, by companies, by law enforcement or Canadian
02:30:54.440 Human Rights Commission about what that means and being pulled into a process that they don't
02:30:59.660 belong in. Facts. Totally real. Totally true. Right. I would say that when it comes to the
02:31:06.540 idea of the regulator, like the Digital Safety Commission, ultimately they have oversight of
02:31:11.580 social media about broadly how they're dealing with these issues. So it doesn't really come up
02:31:16.420 there right because emily you said it wasn't broad broadly broadly we have ways to to deal
02:31:23.980 to deal with it that's such a non-answer broadly we we have a plan to deal with this i'm not going
02:31:29.080 to go into detail but uh broadly it's not broad but broadly yeah but what i would say is that
02:31:36.200 there should be some sort of mandate about what the expertise is of those commissioners in
02:31:40.180 particular you know very clear legal understanding about freedom of expression i think that's key
02:31:45.300 And that's core to that body.
02:31:47.280 Okay, great.
02:31:47.780 So the legislation sucks.
02:31:50.640 It's incomplete.
02:31:51.200 And now we need a mandate on top of the legislation.
02:31:53.860 There's the mandate going to be added to the legislation.
02:31:56.460 What's the mandate going to be?
02:31:58.180 So many question marks here.
02:32:00.900 You helped contribute to writing this legislation.
02:32:03.400 Yet you're saying, no, actually, you know what?
02:32:04.820 We actually, you know what?
02:32:05.540 Actually, you know what?
02:32:06.140 We need a mandate as well.
02:32:10.760 When it comes to law enforcement,
02:32:12.380 it all they've done um in the with the criminal code amendments has been to kind of embed in it
02:32:18.100 the Supreme Court definition of hate all they've done all they've done
02:32:23.600 this is actually creepy because who was I talking to before they they they did this it was another
02:32:31.960 lawyer I think and they very they hand waved away this thing and they said no I know it's part of
02:32:36.540 the Supreme Supreme Court it's part of the Supreme Court the definition of hate and it's like guys
02:32:42.380 this silly meme created by lee stewie by the way shout outs to lee stewie this silly meme is is
02:32:48.480 literally the supreme court this is the supreme court right here this is the supreme court's like
02:32:53.660 epic uh this oh well as long as they don't hit the red they won't get thrown in jail by the feds
02:33:01.380 okay detests or vilifies she didn't even bring up the magic words by the way
02:33:07.920 she just said oh no no no this is totally this is totally totally handled it's it's just the
02:33:13.280 supreme court definition i guess we just we never questioned the supreme court of canada i guess
02:33:17.240 that's a mistake you know very clear legal understanding about freedom of expression
02:33:23.060 i think that's key and that's core to that body um when it comes to law enforcement it all they've
02:33:28.140 done um in the with the criminal code amendments has been to kind of embed in it the supreme court
02:33:33.560 definition of hate but i think that there's other problems with the criminal code provisions which
02:33:36.800 michael and i can talk about um the canadian human rights commission again it's a it's a
02:33:42.260 high definition but that's where i think the greater risk is and i and i'm still i think
02:33:46.900 that's where you're going to see that risk right and you're pulled into a process that that you
02:33:50.000 don't belong in because of different views of what hate means that's i feel like that's just
02:33:58.160 such a huge lie to say that it's a high definition it's a high threshold for hate it's really not
02:34:05.060 like it's it's it's just it's such a it's such a circus we're talking about genociding people
02:34:15.040 and they're like uh yeah no it's a really high definition it's like wait what wait what uh
02:34:20.500 the accusation is that you're a genociding people yeah no but it's a really high definition don't
02:34:25.860 worry about it don't worry um at least what's meaningful to people that's where i go to the
02:34:32.440 question of i want to find a way to tighten it to make it better um i think michael you might be
02:34:36.760 landing saying the risks are too high and we should get rid of it i'm curious so i i'll get
02:34:40.540 to michael but i just want to push you on this a bit emily so when you say that the risks are
02:34:44.060 higher at the human rights uh in the human rights setting uh is there is there anything like does
02:34:49.260 the bill include i guess boundaries or ways to to avoid frivolous people being uh looped in or
02:34:55.400 brought into frivolous litigation and and if there is are they strong enough or should we be asking
02:35:01.020 uh for for those bellies for those protections to be higher please answer this and then michael
02:35:06.160 another good question from this guy have you had a few words please well so there's strong
02:35:10.700 protections and they might be sufficient is that if it's evident that this isn't hate speech then
02:35:14.820 the commission can um dismiss the complaint if there's evidence that this isn't hate speech
02:35:21.360 what the fuck does that mean we're at we're at three uh uses of the term hate without defining
02:35:30.740 it if it's frivolous they can just they can impose costs so it basically dissuades individuals from
02:35:36.060 making a complaint that's specious um that's just made in bad faith so there are things built into
02:35:40.700 it and they also narrow the definition of section 13 compared to the old version so people might be
02:35:45.860 excited at this of like oh oh there's going to be consequences for people who wrongly accuse me of
02:35:50.520 hate i got a better idea let's just not pass the bill so we don't have to deal with people accusing
02:35:55.320 us of hate speech crazy i know great it's such a great because here's the thing the courts the
02:36:02.780 digital safety commission will probably be slanted to the far left so this idea of like oh maybe we'll
02:36:09.220 get a little cookie and people who wrongly accuse us of hate speech we might be able to like
02:36:13.040 find them can you imagine like the culture of canada would be destroyed by this bill it'll be
02:36:20.280 a crazy tattletales if people people telling on the teacher telling the principal he called me
02:36:25.020 this he come out well no i didn't call her that i know she has to pay me money like it's it's i
02:36:29.000 could see it just being a total nightmare and it would just cause paranoia and social decay
02:36:35.220 like never before seen so it is supreme court counter jurisprudence so it might take care of
02:36:40.400 things um what might it might once again the person who contributed to this legislation not
02:36:46.520 sounding too confident concerns me which is what michael i will say is the one who's mentioned this
02:36:50.540 me is the fact that there can still be coordinated campaigns of well-funded groups that might target
02:36:57.180 individuals okay she that's a bar okay that's a bar that's a bar she actually had that's a that's 0.92
02:37:04.220 that's a that's actually a bar emily's got some emily's got some she's spitting actually
02:37:11.260 yeah i'm definitely concerned about that as well
02:37:13.660 good point emily emily scoring some points here so it is supreme court counter jurisprudence so
02:37:21.780 it might take care of things um what concerns me which is what michael i will say is the one who's
02:37:26.500 mentioned this to me is the fact that there can still be coordinated campaigns of well-funded
02:37:32.080 groups that might target individuals i that needs to be solved somehow in the legislation so i don't
02:37:39.040 know if it is a fee i don't know if it's some other metric to dismiss it um that and i haven't
02:37:45.080 solved that problem yet i haven't figured out a way through that it's what i'm wondering about
02:37:48.840 thank you what you're wondering about am i crazy they said that she helped contribute to this
02:37:54.760 legislation did they not am i crazy she's she's got she's got so many problems with this own
02:38:01.200 bill that she that she contributed to this is crazy oh my god michael yeah sure i'll pick up
02:38:09.920 on a few things that i said and we do agree on most i first want to actually emphasize that
02:38:13.700 there is already a chilling effect uh for for anyone in our community and frankly in a number
02:38:17.960 of communities that speak out on these issues there is a chilling effect uh the backlash that
02:38:21.480 you invariably face causes i think many people to think twice about whether they want to step
02:38:24.780 out and comment now and it's not just online there's a chilling effect offline as well so
02:38:28.840 okay so people already self-censor themselves interesting
02:38:35.900 they already self-censor themselves
02:38:40.980 there already is the chilling effect
02:38:46.100 is that supposed to be like an encouragement to pass it like there wouldn't be more of a chilling
02:38:53.500 effect these issues are very real and they don't and many of them will not be solved by legislation
02:39:00.620 no matter what the legislation says okay that's a bar too that's another bar
02:39:05.480 and and and that's and that's really like you know the common sense sort of argument for a lot
02:39:13.600 of this stuff is like you really you really think we're going to be able to let all human
02:39:17.320 beings get along and not offend each other are you crazy the online harms act imagine imagine a
02:39:26.860 classroom full of it's a it's a busy classroom it's like it's it's over it's a packed classroom
02:39:32.380 it's like 35 kids huge classroom they're in grade six so they're at that age where they're kind of
02:39:37.820 getting they're like they're talking back they're smart asses kids naturally group together and 0.96
02:39:42.640 bully other kids that's just what happened kids go into cliques whatever maybe there's brown kids
02:39:49.240 white kids black kids asian kids you know do you think that all these kids are going to get along
02:39:55.800 and that there's never going to be any bullying that happens and there's never going to be any
02:40:00.060 disagreements that happen and none of the kids are going to be offended by any of the other kids
02:40:04.500 ever of course not that's absurd the book lord of the flies kind of reflects this whole notion
02:40:15.020 that you know our base our base instincts take take over and it becomes tribal and it becomes
02:40:19.920 ugly okay and i'm not saying that we should like keep it legal for uh for what's his name again
02:40:27.240 the fat kid who gets killed by the rock or whatever i don't think i think i don't think
02:40:30.680 that should be allowed anyway this is wrong tangent but the point is is like they're trying
02:40:36.500 to legislate human behavior they're trying to control human behavior and emotions and it's it's
02:40:43.180 very utopian of we're going to make sure that everyone gets along and we're going to end hate
02:40:49.040 forever and you the people watching know this it's it's that would be that's a tyrannical thing
02:40:54.240 you say you're going to help everybody but you're actually just going to usurp more control
02:40:57.860 and people are going to hate each other.
02:41:00.300 That's actually going to be the net result.
02:41:02.560 Bullying is human nature.
02:41:03.880 Exactly 420 all day.
02:41:06.180 Bullying is precisely human nature
02:41:08.220 and this idea that we could legislate it away
02:41:10.440 is insane.
02:41:12.820 You could look at any group
02:41:14.300 around the world,
02:41:16.960 any human culture, anywhere, 0.50
02:41:18.600 even if they live in the woods
02:41:20.920 and they've never even discovered civilization.
02:41:23.980 You bet your ass there's bullying.
02:41:25.680 You bet your ass there's some sort of social hierarchy that naturally happens.
02:41:30.220 People get offended.
02:41:31.300 People's feelings get hurt.
02:41:32.320 It's part of being human.
02:41:34.860 It's part of being human.
02:41:37.680 All right, let's keep going.
02:41:39.740 Let's keep going.
02:41:40.640 Let's get through this, guys.
02:41:42.680 The core elements in terms of dealing with the platforms don't address individuals at all.
02:41:47.120 So the idea that an individual will be liable under the elements of the law that seek to create a duty back responsibly for the platforms,
02:41:54.000 it doesn't really affect individuals in that way.
02:41:55.420 about what the platforms do the platforms might themselves adopt certain policies but uh and
02:41:59.340 perhaps encourage to do so through the legislation but no one's no one's landing in jail clearly
02:42:03.420 anyone's landing in jail for these kind of comments period but they certainly aren't
02:42:06.140 affected in that way based on the social media side of it the criminal code provisions do as
02:42:10.300 emily notes raise concerns but but i don't think that the kind of uh speech that you're talking
02:42:14.460 about and comes anywhere near the the sort of level that would lead to any sort of prosecution
02:42:18.860 it is as emily suggests at the human rights commission that um under the human rights act
02:42:22.780 that I think there are real risks.
02:42:23.920 And I really, I genuinely fear
02:42:25.460 based on what we see
02:42:26.200 that this will cause
02:42:27.060 a weaponization of complaints
02:42:28.400 that will flow in all directions.
02:42:30.660 And this notion that
02:42:32.100 there is the ability
02:42:33.220 to dismiss complaints,
02:42:34.140 which there is.
02:42:34.880 And so those powers are,
02:42:36.040 I think, important.
02:42:37.000 But I think the prospect
02:42:38.620 that an individual begins,
02:42:40.260 faces even one of these complaints,
02:42:41.640 never mind potentially hundreds,
02:42:43.340 where there are organized campaigns
02:42:45.260 to target certain people
02:42:46.400 that speak actively
02:42:47.560 on a part of a community
02:42:48.440 and claiming what they are doing
02:42:49.380 is engaged in hate,
02:42:50.400 I think must surely have,
02:42:51.680 even more than a chilling effect.
02:42:54.760 Another hate mention.
02:43:02.620 I'm just going to wind it a tiny bit.
02:43:05.020 ...are, I think, important.
02:43:06.400 But I think the prospect that an individual
02:43:09.060 faces even one of these complaints,
02:43:11.020 never mind potentially hundreds,
02:43:12.740 where there are organized campaigns
02:43:14.700 to target certain people that speak
02:43:16.240 actively on a part of a community
02:43:17.840 and claiming what they are doing
02:43:18.780 is engaged in hate,
02:43:19.560 I think, must surely have even more than a chilling effect. I think it creates real fear
02:43:23.940 in individuals. I mean, it is not a comfortable thing. I've never received one, but I'm quite
02:43:27.260 certain it is not a comfortable thing to be advised that there has been a complaint filed
02:43:30.860 against you at the Human Rights Commission based on something that you said on social media.
02:43:34.800 And so looking at this as a mechanism to address this issue, I think, carries some great risk.
02:43:39.940 And if it had a realistic prospect of significantly altering what, Richard, you started off by
02:43:45.800 talking about a community genuinely in fear and deeply concerned about the the level of
02:43:50.300 anti-semitism that's being faced if there was i think a realistic chance that this process would
02:43:54.320 would have a positive effect in trying to address those issues then one might be able to conclude
02:43:58.280 that even that weapon weaponization might still be worth the risk but i frankly don't see a lot
02:44:02.540 of evidence to suggest that when you think about what's taking place that that those that engage
02:44:06.620 in that weaponization are going to be particular particularly persuaded to stop based on on almost
02:44:11.400 anything and so um this is so crazy uh man so michael says it's a concern that this could
02:44:19.960 definitely be weaponized by many groups however maybe weaponizing this could work
02:44:25.000 you know weaponization of this legislation would be bad but at the same time you know we might be
02:44:32.660 able we might be able to use this to our advantage to uh you know to oppose the people we don't like
02:44:38.100 am i reading that wrong michael that's that's what it sounded like um
02:44:45.540 weaponization bad but at the same time maybe we can use this certainly not when those complaints
02:44:52.420 might be dismissed which unless you start raising penalties for those kinds of complaints but
02:44:55.620 otherwise i think that we would continue to see the kind of hate regardless of the
02:44:59.860 the availability of this remedy and there are real risks i think to individuals who
02:45:03.900 might find themselves targeted as a result of it
02:45:05.500 Yeah, obviously there's a risk to the people
02:45:10.020 Who will be targeted because of this
02:45:11.840 I mean
02:45:16.080 I feel like he brushed over a couple things there
02:45:20.380 That I think he
02:45:21.520 You know, he kind of, again, waved away the whole
02:45:25.360 The new definition of criminal hate speech
02:45:27.660 No, no, no, people who support Israel won't get charged with
02:45:30.620 With criminal hate speech
02:45:32.940 uh it's detest you don't think that there's any pro israel person who could be
02:45:40.920 how about netanyahu should i should i look up a netanyahu clip right now
02:45:45.820 there's got to be a netanyahu clip uh let's see
02:45:51.780 netanyahu
02:45:56.120 um
02:45:59.300 all right here we go i think this looks like this is what i'm looking for
02:46:15.320 good old netton
02:46:18.840 so it would this be would this qualify as criminal hate speech under the supreme court
02:46:25.340 would this consider vilification or detestation in the middle east iran's axis of terror
02:46:32.780 confronts america israel and our arab friends this is not a clash of civilizations
02:46:41.260 it's a clash between barbarism and civilization
02:46:47.100 so that sounds like he's vilifying uh iran to say we are civilized they are barbarians
02:46:53.420 that kind of implies that
02:46:55.840 they are the enemy, they are villains
02:46:57.900 and also 1.00
02:46:58.720 Iran is an axis of terror. 1.00
02:47:03.380 I think you could 1.00
02:47:04.000 make the argument that that is
02:47:05.760 vilifying Iran and vilifying
02:47:08.140 the enemies of Israel.
02:47:11.300 But
02:47:11.860 for whatever reason
02:47:13.920 we're just hand-waving away
02:47:16.080 that the threshold is so high.
02:47:18.520 The threshold is so high
02:47:19.880 for the new hate speech definition.
02:47:22.220 No, it's not.
02:47:23.420 I think it's pretty clear to say that Netanyahu detests Iran and they are an axis of terror.
02:47:40.800 They are barbarians.
02:47:41.700 I think it'd be pretty easy to make the argument that Netanyahu, the prime minister of Israel is vilifying his enemies, which would be criminal hate speech.
02:47:53.420 under bill c63 according to the supreme court which i totally disagree with i think i don't
02:47:58.220 know how i don't know how that supreme court decision is they're going to try to weasel this
02:48:01.980 in into the new definition of hate speech but anyway the fact that they're glossing over that
02:48:05.980 is crazy to me so if i can is it okay by jumping with one short because why i land differently on
02:48:11.660 this is that um i don't view it as uh reducing hate access should i keep counting or is this
02:48:20.380 stupid let me know if I should keep counting maybe it's not that exciting eh it's just kind
02:48:24.620 of interrupting the flow isn't it to the Canadian Human Rights Commission I view it as access to
02:48:30.300 justice and at the moment and if it remains this is the victim-centered perspective this isn't
02:48:37.600 about stopping hate this is about access to justice what the if I can jump in with one short
02:48:45.540 because why I land differently on this is that I don't view it as reducing hate access to the
02:48:52.980 Canadian Human Rights Commission. I view it as access to justice and at the moment and if it
02:48:58.080 remains unchanged individuals the only thing available to them is is to go to law enforcement
02:49:03.740 many groups don't feel comfortable with that law enforcement is overwhelmed and yes we need to make
02:49:07.440 major steps to better resource better train law enforcement to deal with this but it's not always
02:49:11.840 are now they still haven't defined hate and they're still saying all they can do is go to
02:49:17.560 law enforcement go in law enforcement go to law enforcement for what like does it reach the
02:49:23.980 threshold of being against the law no maybe there's a reason for that but there's a reason
02:49:32.140 we've had that in place for so long in free free western countries necessarily the appropriate
02:49:38.940 solution many provincial human rights bodies don't uh even though they have hate speech
02:49:43.500 provisions won't hear them if they're related to the internet um they see that as federal and so
02:49:48.000 it means that except for going to the police there's nothing there's absolutely nothing
02:49:52.900 available for an individual that's not true you could put your phone down emily you could you
02:49:57.720 could walk away from the computer i know these are these are radical suggestions but you could you
02:50:03.580 could you could literally put your phone down that might solve the problem of someone disagreeing
02:50:09.700 with you or saying something hateful on the internet turning the internet off unplugging
02:50:15.060 your wi-fi router maybe instead of calling the police you should just unplug your wi-fi router
02:50:20.380 visual who sees sees it as appropriate that they should access some sort of justice and that's why
02:50:25.260 i land that we i don't have a solution to the weaponization yet i just want to try to solve
02:50:29.260 that one because otherwise i do think that as reintroduced it's it's important it doesn't have
02:50:34.460 an answer to the weaponization so yeah we still got this kind of matzo ball hanging out here of
02:50:38.220 just weaponizing uh weaponizing legislation citizens weaponizing legislation against one
02:50:45.400 another haven't solved that yet i helped contribute it to the legislation though it is if i can quickly
02:50:49.640 and we'll turn it into true debate i i i think you're right about that there isn't that remedy
02:50:53.520 but when you think of what that remedy involves potentially years before as this process works
02:50:57.960 its way through the commission and then in through the tribunal the risk that even the
02:51:01.260 fact that you participated in that results in in a further backlash as it continues i'm not sure
02:51:06.040 that that brings a whole lot of justice when if presented with the alternative saying is that are
02:51:10.120 there mechanisms to try to stop the amplification of this hate to try to are there mechanisms to
02:51:16.420 stop the implication of this hate i haven't found hate yet to support the community it's got such a
02:51:22.040 hate has such a high threshold though you know the whole even in the absence of myself being
02:51:26.780 made potentially whole for the harm that that occurred i think many would say listen i really
02:51:29.960 just want this to go away um and if there are mechanisms to reduce the amplification to reduce
02:51:33.700 the harm that's where i want to put our resources and thinking as opposed to saying that you know
02:51:37.780 what's so stupid about that because this is something that a reef ronny said he said i just 0.57
02:51:43.000 want that to go away i just don't want to see that meme tweet anymore i just want that to go away
02:51:51.580 we're literally pandering to like the most precious people with like these precious
02:51:59.460 neurotic control freaks am i wrong here like this is this is i just want that to go away
02:52:06.360 they just want it taken down i remember a reef saying that they just want it taken down
02:52:10.420 i just don't want to read i don't want to know that people like that exist and have those opinions
02:52:15.340 just take it down please yeah let's let's pass laws based on that attitude that's a great idea
02:52:22.200 well i got mine in terms of some justice in a particular case yeah no i do yeah okay i'll stop
02:52:27.800 there we can talk about this i'm also mindful of the time so i'd like to switch a little bit
02:52:32.520 because um the internet is a global phenomenon um in in other countries or institutions that
02:52:38.620 have started to grapple with us the european commission the european union has started
02:52:42.140 Australia has started. When you compare C63 with what other jurisdictions have done, how does it
02:52:49.460 compare? Is Canada, were C63 adopted? Would it be an outlier? Would it be around in the same
02:52:56.800 ballpark as what the European and the Australians, for example, have done? Maybe Emily and then
02:53:02.060 Michael. Yeah, so it has picked elements from the different regimes and I think that
02:53:08.340 was that a slip was that it was that a freudian slip she used the r word not that r word
02:53:19.460 regimes for example have done uh maybe emily and then michael yeah so uh it has picked elements
02:53:30.040 from the different regimes and i think the best ones it is um it aligns with them um but it is
02:53:36.480 narrower so the digital safety act uh in europe covers a broader array of players we have aren't
02:53:43.680 covering search engines we're not covering private messaging um and the obligations are
02:53:47.840 much broader including all illegal content um they're even tackling disinformation things that
02:53:53.260 yeah uh that that must have been a freudian slip there that's hilarious
02:53:58.900 uh yeah we are taking dibs from uh we are taking leads from other regimes a government especially
02:54:05.600 an authoritarian one a system or planned way of doing things especially one imposed from above
02:54:11.520 regime regime that's funny yeah we're taking uh we're taking cues from other regimes absolutely
02:54:20.480 uh the most tyrannical ones the most ruthless ones yeah we want to do our jobs right
02:54:27.440 errors we have aren't covering search engines we're not covering private messaging
02:54:32.080 And the obligations are much broader, including all illegal content.
02:54:37.460 Illegal content is already illegal.
02:54:39.940 I never understand why they loop this in.
02:54:42.320 I don't get that.
02:54:43.720 It's already illegal.
02:54:44.740 And if it's illegal, it's probably already cracked down on aggressively by big tech anyway.
02:54:50.120 I feel like this is one of their kind of like talking points that they love to bring up.
02:54:54.700 But it's like, what does it even mean?
02:54:56.340 What does that even mean?
02:54:57.600 What would change?
02:54:59.040 It's already illegal.
02:55:00.080 big tech already takes down illegal things you know it's it's it they use this a lot especially
02:55:06.360 with like the the sexual stuff and the pornography and it's like well what actually changes there
02:55:12.460 because big tech already does this and that stuff's already illegal it's crazy tackling
02:55:16.900 disinformation things that undermine democracy none of that doesn't go for our so well that's
02:55:24.760 that's actually that's actually uh positive again not this bill is trash throwing it out
02:55:32.400 in europe covers a broader array of players we have aren't covering search engines we're not
02:55:37.960 covering private messaging and the obligations are much broader including all illegal content
02:55:43.420 they're even tackling disinformation things that undermine democracy none of that's in scope for
02:55:48.620 ours so ours is much more narrowly targeted um and i'm gonna have to fact check fact check her
02:55:53.240 that because she's been wrong wrong already
02:55:57.880 they're probably going to try to add it in what was it misinformation
02:56:00.840 undermining democracy
02:56:05.800 and um we're we have elements of the uk one i think there's a bit more
02:56:10.200 similarities but even the uk one is much broader so i think that we've
02:56:14.040 landed in the right place given i think the legal climate and
02:56:18.120 um where we've been the legal climate balance freedom of expression with
02:56:23.000 other rights and also really the capacity of a regulator if you're building something from
02:56:26.180 scratch there's only so much you can scope in australia started very narrow and built out
02:56:30.440 we aren't as narrow whereas where australia started they were just for child safety but
02:56:34.360 that ship has sailed i mean there's no way we could have just started with child safety
02:56:37.240 um but uh it's a good start speaking of child safety throughout this 30 minutes we've watched
02:56:43.720 it's been brought up maybe two or three times and never with any sort of detail like the whole
02:56:49.880 child safety thing it's like we haven't been talking about that at all we've been talking
02:56:53.620 about hate speech we've been talking about hey can i get thrown in jail for this can i throw
02:56:57.420 somebody else in jail weaponizing these laws to uh bully people who have the wrong opinion
02:57:01.660 or to stop bullying nothing has come up about the child safety stuff michael yeah i think emily and
02:57:08.640 i agree on this it's worth noting i mentioned off the top of uh of my opening comments that
02:57:12.380 where the government started in this process when they consulted is a much different place
02:57:16.380 from where we are now and i think that at one time they were thinking about importing
02:57:19.360 some provisions from other countries that were far more problematic. And I think,
02:57:22.340 frankly, thankfully, they've dispensed with that. The one part that I think that's worth
02:57:26.000 emphasizing that Emily's response, and frankly, my response to this, will focus on the online
02:57:29.560 harms piece. And so the emphasis will be, as a point of comparison, what are you doing with
02:57:33.560 the platforms? What kind of responsibility and potential liability are you imposing on those
02:57:37.620 platforms so that they step up to deal with these various harms? And in that respect, I think Canada
02:57:41.520 has done a nice job of striking a balance and drawing from what they've seen elsewhere.
02:57:45.780 The problem, come back to it, and it's inescapable when you start debating this legislation,
02:57:49.860 is that that's not the whole bill.
02:57:50.960 And when you get the second part of the bill, having thrown in the criminal code provisions
02:57:54.280 and the Human Rights Act provisions, those are not the parts that we're drawing lessons
02:57:58.060 from.
02:57:58.280 And I know less about how those other jurisdictions deal with those issues.
02:58:03.440 But where the government really did, I think, engage in that comparative analysis, especially
02:58:06.940 with respect to, you know, we recognize the power that the platforms have.
02:58:10.840 That law, as it exists right now, alone is insufficient to deal with some of the concerns,
02:58:14.880 especially around the algorithms the amplification the the opacity in how these platforms deal with
02:58:20.240 these issues in that respect we've i think done a good job and drawn some powerful lessons from
02:58:23.440 from what has occurred elsewhere interesting interesting so geist likes
02:58:33.760 the fact that uh the canadian government would have more control
02:58:38.960 over uh how big tech operates on the inside when it comes to censoring things
02:58:45.200 which is totally counterintuitive or totally like the different opinion he had of bill c11
02:58:50.280 which was about seeing what uh the government having control over algorithms right but i guess
02:58:56.660 i mean the thing is is that i feel like that point is so redundant it's like we have big tech
02:59:03.360 they already have their own system of uh preventing online harms and i'm pretty sure they use it the
02:59:09.180 same term but michael's like no no i think we should help them i think we should be able to
02:59:13.660 see it i think we should be able to see what they do um but everything else in this bill he did
02:59:18.460 bring up a good point of how the the big problem is that's not the whole bill right it's it's a
02:59:22.800 trojan horse for a whole bunch of other nonsense i wish he would push harder against that or push
02:59:27.800 that point harder because it's you know this whole time the pro the pro the woman emily who
02:59:32.720 is supporting the bill who helped contribute to the bill is constantly saying how there's
02:59:36.160 problems with it throw this thing out in the trash does the fact that most of the platforms
02:59:42.220 are based outside of canada play a role in what canadian canadian government or canadian
02:59:47.120 legislation can do sure i can kick off if you want i think of course it does you know i think
02:59:53.460 that the legislation talks about mitigating these harms doesn't talk about eliminating them and i
02:59:56.920 think there's a recognition that no legislation is going to entirely eliminate but if you target
03:00:01.100 the large players and potentially identify some other players along the way, that's enough of a
03:00:06.420 starting point. It captures so many of the average users that at least you've had a positive effect
03:00:11.420 in mitigating against some of the harms. And I think that's what we're trying to do here. And so
03:00:15.440 I think there's a recognition that there will be thresholds about which organizations, which
03:00:19.620 entities this applies to. And I think that while some are skeptical about law on the internet,
03:00:24.000 are these companies really going to apply with Canadian law? And I think there's every reason
03:00:26.860 to believe that especially the larger players will. I mean, I think that they recognize they
03:00:29.860 do have a responsibility there will be a willingness indeed a desire to ensure that
03:00:33.600 they are compliant with with this legislation they they may push back on certain elements of it
03:00:38.100 but by and large platforms aren't themselves happy to see the kind of harms that are taking place
03:00:41.880 online and if we impose these kinds of obligations most will say yeah this is something that they're
03:00:45.160 willing to comply with that's really interesting for michael to say that based on the fact that
03:00:52.160 the latest bill to pass about the canadian internet was bill c18 which was trying to get
03:00:57.880 social media companies to pay to distribute news in canada and now meta has basically said go f
03:01:07.000 yourself and to some extent google has said like you know they called the bluff of the canadian
03:01:11.920 government with poor legislation and here's michael geist saying no no no you know i think
03:01:17.340 the platforms will be totally compliant with everything the canadian government says
03:01:20.280 really because the last time we tried this now i can't get news on meta anymore i can't get news
03:01:26.100 on Instagram or Facebook anymore.
03:01:28.100 Now, all of a sudden,
03:01:29.260 you think they're going to be compliant.
03:01:32.040 Where is this coming from?
03:01:33.760 Based on what?
03:01:34.640 Based on a hunch?
03:01:35.940 Based on what evidence?
03:01:37.500 It's based on this goodwill argument
03:01:39.000 that like, no, no, no,
03:01:40.100 we all want to stop hate, right?
03:01:41.380 So you'll let the Canadian government
03:01:42.820 in on what you're doing
03:01:44.700 and you, a private organization,
03:01:48.840 massive company who's like super dialed in
03:01:51.440 and super well operating,
03:01:52.860 you think you're going to let
03:01:53.720 the Canadian government,
03:01:54.660 yeah they'll let the canadian government come see what we're doing right that's crazy
03:02:00.700 thank you emily yes actually that being said now now i'm thinking that maybe that point isn't so
03:02:07.820 strong maybe there's already a president set for this with australia and the uk and how that works
03:02:13.920 so i guess with that being said like the infrastructure is already there for be like
03:02:18.300 okay we got another another canadian now the canadian idiot's gonna come in and see how we
03:02:23.160 do the algorithms. That's interesting. I'll have to look
03:02:26.980 into that.
03:02:48.060 All right, let's get through this, guys. We're almost there.
03:02:50.660 uh i i think that enforcement has always been an issue i mean it's been this that that wicked
03:02:55.460 issue with internet regulation since it was commercialized but um i don't think it's a
03:02:59.220 reason not to pass laws and we have another context i mean privacy commissioners investigate
03:03:03.620 companies that impact canadians um that are based elsewhere and oh i i do want to add uh
03:03:10.740 i brought this up earlier but it's worth it bears repeating they said oh it's going to be hard for
03:03:15.460 big tech companies who aren't based in Canada to get them to follow the rules.
03:03:19.860 And there's actually a massive pornography company based in Canada that once again,
03:03:26.420 if they wanted to protect children online and the re-victimization, blah, blah, blah,
03:03:30.300 the sex exploitation, blah, blah, blah.
03:03:31.900 Maybe they would actually focus on that company that's in Canada,
03:03:34.140 but it's not going to get mentioned in this legislation.
03:03:36.820 They're not going to frame it in that way to target this massive porn company
03:03:39.740 because they don't care about kids, right?
03:03:42.340 They don't care about protecting children online.
03:03:44.400 that's just a trojan horse uh i think that where we fit in is that we're not the first player
03:03:49.600 stepping onto the scene because there are laws in the eu in the uk and australia and we're targeting
03:03:54.400 the major global players you know i agree with michael that i i don't think that they're going
03:03:57.760 to have an issue we're going to have an issue with with uh you know them trying to shirk compliance
03:04:02.160 in any way because they're having to already comply with more onerous obligations in other
03:04:06.240 jurisdictions thank you obviously all right well i guess that answers my question but we'll we'll
03:04:12.560 have to uh look into it nonetheless to see if that's actually true or not part of this conversation
03:04:17.120 is is the issue of free expression and free speech and a lot of criticism are saying that
03:04:21.360 it's an attempt to censor to to to limit free speech um and one of the ways that i think the
03:04:27.040 government is trying to land in a place that that is legal is the use of the definition of hatred
03:04:31.600 that flows from and emily you were mentioning this the supreme court canada decision in in watcott
03:04:36.480 back in 2013 if i if i remember correctly um the the at the time the supreme court kind of landed
03:04:43.360 at a place that said uh that uh the use of the of the human rights code of saskatchewan was yes a
03:04:48.880 limit to free speech but was reasonable uh in a free democratic society according to section one
03:04:52.720 of the charter broadly speaking um do you think that that where the bar is set in the bill in
03:04:58.000 terms of of what is what is hatred is is high enough to on the one hand protect solidly protect
03:05:05.520 freedom of expression which is a fundamental values of canada and i would say that for jews 0.64
03:05:10.480 who have been minorities everywhere they've been but for but for israel they the the right to be
03:05:15.600 outside of the mainstream sometimes is very important has been historically important
03:05:18.800 um so on the one hand that that freedom of expression is protected while at the same time
03:05:21.920 catching the worst elements of hatred like is is the government setting the bar at the right level
03:05:28.080 in two um is the definition clear enough or tight enough that it will do what the government wants
03:05:34.800 it to do i will start with michael and then emily okay these are these are fantastic questions i
03:05:39.520 really like this guy yeah no i i think the i think the fact that they have relied on prior jurisprudence
03:05:44.480 is another positive within the legislation you know i think that had they worked to work to
03:05:49.280 establish an entirely new definition because this invariably does involve some very difficult line
03:05:53.200 drawing uh it would quite clearly be challenged on those grounds alone and so the fact that they're
03:05:57.360 relying on something that is somewhat tested i think already makes makes quite a lot of sense
03:06:01.600 you know is this going to have a chilling effect you know the question becomes well where does this
03:06:05.280 apply in a in a criminal code context i must admit i i don't think that the this inclusion
03:06:09.760 i already mentioned it i don't think the inclusion of the additional penalties makes much sense i
03:06:12.560 don't think it has any real um effect in terms of uh on the ground what this is what you know the
03:06:17.120 kind of hate that takes place i think that this is primarily an enforcement issue i wish law
03:06:21.120 enforcement would enforce the laws as they stand now and and then we can see whether or not we've
03:06:24.400 got shortcomings but i think we've got more failure of enforcement than we do failure of the existing
03:06:28.320 legislation when it comes to some of these languages uh some of these legislation is there
03:06:31.680 the potential for a chilling effect i mean the answer may well be for i mean there might i can't
03:06:35.360 speak for every individual i mean there may be some that look at this and say well this might
03:06:38.720 have an effect and i'm really concerned about it i think for the vast majority of people they don't
03:06:41.840 really come close it is a high bar there are other safeguards and guard rails that are built into the
03:06:45.840 system to seek to guard against it and so you know i think many should be comforted by that and i
03:06:50.160 think that and i alluded to it a moment ago i think that we need to recognize that alongside
03:06:55.040 the risk of a chilling effect from the legislation is the real world chilling effect that we have
03:06:58.640 right now so that when we have individuals who are afraid to speak um because of the kind of
03:07:03.040 hate that they themselves may face that of course represents a significant issue and a limitation on
03:07:07.040 their own freedom of expression and we need to be conscious of that as we try to strike a balance
03:07:10.320 with some of these issues thank you man michael no what that was oh that was rough bro that was
03:07:23.120 rough the supreme court totally correct about everything uh will there be a chilling effect
03:07:28.860 he did not say yes or no he basically said no uh with a big wall of text um and he's giving
03:07:37.540 the classic arif verani argument in so many words which is this will actually encourage
03:07:46.080 free speech because people aren't free to talk right now because they're afraid the implication
03:07:52.260 being that this law will lock up all the scary people so us jewish people can speak freely
03:08:02.020 without being threatened what this it's so it's not going to have a chilling effect
03:08:09.080 but people are too afraid to speak out right now and this legislation will help us speak out
03:08:16.700 because other people like the implication is because other people won't be able to speak out
03:08:22.340 isn't it like like how would how would you speaking out how will more speech legislation
03:08:29.460 inspire jewish people to speak out more
03:08:33.760 if the speech legislation can only do anything but silence people 0.97
03:08:40.420 you totally you totally lost me mr geist on that answer hate to speak um because of the kind of
03:08:49.800 hate that they themselves may face that of course represents a significant issue and a limitation on
03:08:53.740 their own freedom of expression and we need to be conscious of that as we try to strike a balance
03:08:57.020 with some of these issues thank you uh emily yes there are other safeguards and guardrails that
03:09:03.040 are built into the system to seek to guard against it and so you know i think many should be comforted
03:09:07.400 by that. And I think that, and I alluded to it a moment ago, I think that we need to recognize
03:09:11.560 that alongside the risk of a chilling effect from the legislation is the real world chilling effect
03:09:15.920 that we have right now. So that when we, you know, sure, maybe there's a chilling effect
03:09:24.520 from the legislation, but what about the chilling effect we have right now? People are afraid to
03:09:29.460 speak out. Yeah. Maybe this legislation will add a chilling effect, but people are already
03:09:34.060 having a chilling effect so let's just add to that chilling effect
03:09:36.680 we have individuals who are afraid to speak um because of the kind of hate that they themselves
03:09:43.140 may face that of course represents a significant issue and a limitation on their own freedom of
03:09:47.080 expression and we need to be conscious of that as we try to strike a balance with some of these
03:09:50.860 issues so people are afraid to speak out because they're afraid they're going to get hated
03:09:55.480 so the idea the idea is this legislation will help jewish people speak out because the people
03:10:02.480 that hate them will be silenced is that kind of the implication there
03:10:08.620 it's crazy thank you uh emily yes and and that's a good reminder that the the issues of hate here
03:10:17.320 are so far beyond you know what this legislation is is tackling um i don't think that we can get
03:10:23.400 more precise than the definition that has been provided it is as michael said set out in spring
03:10:27.160 court jurisprudence um there's criticism about it's great crazy that they're just brushing over
03:10:32.440 this supreme court definition crazy like crazy that they're just oh my god the need to pinpoint
03:10:39.440 something more precise but you can't because we need to be able to engage in a contextual analysis
03:10:43.140 so you don't need to be more precise because you need to engage in a contextual analysis
03:10:47.940 this is just getting broader and broader and broader uh just uh two points one is uh so i
03:10:55.060 wouldn't change the definition. What I would do for at least the Digital Safety Commission
03:10:58.600 component, that Online Harms Act, I think that the private platforms play such a key role here
03:11:04.560 in setting out in their own terms and conditions how they manage this. Their digital safety plan
03:11:08.820 shouldn't just be how they mitigate risks of harm. I think they also need to set out how they are
03:11:12.900 promoting and protecting freedom of expression. And also, I would add privacy and equality. But
03:11:16.960 I think that needs to be front of mind in how they manage their services. Right now, that provision
03:11:20.860 is not in there. The only obligations on the commission, not the company's. When it comes to
03:11:24.920 criminal code provisions that is such a non-point does she really just say that we're concerned that
03:11:31.320 the big tech isn't worried about free expression that's crazy i mean it sets out to define hatred
03:11:39.560 fine you know it doesn't change anything but it offers some clarity perhaps um more concerning
03:11:44.840 are the other dimensions of the criminal code provision which i think that's where that chilling
03:11:48.200 effect will happen um that stand-alone provision uh uh risking you know life in prison i i think
03:11:54.280 that they i mean i suppose you could tweak it slightly to make a little bit better not make it
03:11:58.840 life in prison not remove from it that it's violating any legislation maybe just keep it to
03:12:03.560 other criminal offenses or you just scrap it entirely that might be the easiest solution here
03:12:08.760 wow scrap it entirely emily who contributed to building this legislation maybe just scrap it
03:12:18.760 there's a thought you mean hiving the the criminal code amendments off of this legislation so what
03:12:28.040 i just want to supplement on that because i think both are in agreement that uh it might
03:12:32.200 have made sense to have excluded that i have to say that i think there's even there was a
03:12:35.640 particularly compelling case to have stuck specifically with the online harms as its
03:12:39.400 own piece of legislation i think frankly the way that they've targeted it made a lot of sense and
03:12:42.920 frankly would have gone a long way to addressing some of these issues and at a time when there
03:12:46.120 there has been enormous frustration at the lack of action by the federal government,
03:12:49.660 by many governments, in dealing with online hate.
03:12:51.700 They could have, or dealing with hate more broadly, not just online,
03:12:53.980 they could have packaged the Criminal Code and the Human Rights Act provisions
03:12:57.000 in a separate piece of legislation.
03:12:58.400 I still would have problems with them,
03:12:59.660 but at least it would not have sidetracked the good work that takes place,
03:13:02.100 I think, with respect to the platforms.
03:13:03.120 And frankly, the government would have had something that it could have pointed to
03:13:05.680 to say, okay, we are looking to try to deal with some of these issues.
03:13:08.180 As it is, they run the risk, frankly, of bringing it all down
03:13:10.720 and really bogging down and clouding the debate at a time when what we need,
03:13:14.560 especially with some of the online harms related issues is more is more clear cut leadership and
03:13:19.080 expeditious action so that's good i mean i'm glad he said that hey we really need to throw this part
03:13:26.680 out let's throw this part out it's not good it's gonna weigh you down it is it is gonna weigh it
03:13:32.660 down without a doubt that's probably like easily over half of the bill what he was talking about
03:13:45.560 by the way hey this whole part this huge chunk all this criminal stuff yeah just throw all of
03:13:51.840 that out i wouldn't like i don't like the uh michael stance here though because he still says
03:13:57.180 that you know hey we don't we want the government to have control over uh or be able to mess with
03:14:04.000 big tech platforms to help them censor speech better big it's redundant big tech already does
03:14:09.740 this and this this idea it really comes down to them i hate to like summarize it like this
03:14:16.920 but i really get the impression that some of the people on the zoom call are really just mad at
03:14:22.140 elon musk and they want to have more control or influence or be able to fine elon musk for uh
03:14:30.700 anti-semitic tweets that they read uh in the morning on their phone i want to go i want to
03:14:36.140 go back to something that both of you mentioned about the uh safely in their nice home by the
03:14:41.740 way they're totally safe they're just reading someone's opinion they don't like on the internet
03:14:44.860 it's not it's not that big a deal the digital safety uh commissioner um two questions that
03:14:49.740 i'm seeing in the chats uh one is how independent would would that regulator be because there's this
03:14:55.260 concern by some people that they would be at the direction of the government uh so is is the
03:15:01.340 independence of that that regulator protected or clear in the in the legislation and if not should
03:15:06.140 it be uh clarified and two michael in your introduction you were mentioning the that that
03:15:11.980 commissioner might have too much power um i'd like to hear both of your um of your insights on this
03:15:17.980 so we'll start with emily questions yes it's um i i mean the body needs to be the commission needs
03:15:24.780 to be independent from government um i think it is inaccurate uh what i have read that um criticism
03:15:31.820 saying that it would be government censorship this isn't this is set up as an independent body
03:15:36.620 um and the commissioners have how does that how does that that's such a naive thing to say
03:15:42.220 in terms of i think five years i mean as it compared i would say it's more comparable to
03:15:46.300 the idea that we have our privacy commissioner um and who i think is seven year terms where there
03:15:51.340 is some risk is the fact that in the end um government appoints the the individuals um it
03:15:58.140 needs to be approved i think it's not going to be government censorship we need to have a right
03:16:04.900 independent person but the government is going to approve the people it's not going to be
03:16:09.520 government censorship but the government is going to choose who has this power cool how naive can
03:16:16.040 you be yeah if i recall it needs to be approved by by parliament so oh thank you okay no it's
03:16:21.620 fine it's gonna be approved by parliament we're good we're good guys we're fine
03:16:30.600 everything's fine don't worry it's gonna be government's gonna choose the non-biased person
03:16:36.780 don't worry the parliament the parliament buildings filled with traitors
03:16:42.100 are going to decide the right people to control speech online perfect it has a more bipartisan
03:16:48.460 kind of nature to it that the individuals in the commission would actually have to be
03:16:51.820 approved by those bodies so it should operate independently it should it might it should
03:16:57.760 listen to all these unconfident words um and again remember it's oversight is of companies
03:17:03.040 not of individuals they're not making individual content decisions or holding individuals
03:17:08.020 accountable here um so i think that when it comes to their their power you know what
03:17:13.140 that's such a sleight of hand they did the same thing with uh bill c11 i'm just realizing this
03:17:21.520 now i should have realized this earlier they're saying no because with bill c11 they're like
03:17:25.520 we're not dealing with individual pieces of content and individual users we're just dealing
03:17:30.920 with big tech it's like yeah you're but you're dealing with big tech to affect things that will
03:17:36.340 affect individual users at the end of the day it's going to be the individual content sure
03:17:41.880 you're going through big tech but it's going to the result is going to be controlling the content
03:17:48.020 of individual users on the big tech platforms that's such a sleight of hand i just i can't
03:17:53.000 believe i didn't realize that sleight of hand they were doing there my god
03:17:58.560 no we're just holding the big tech accountable yeah holding the big tech accountable
03:18:05.860 to censor individuals.
03:18:13.040 How are we doing in chat here?
03:18:14.900 Are we falling asleep yet?
03:18:17.820 All hail Trudolph. 0.98
03:18:19.100 This woman trips over her own phrases again and again. 1.00
03:18:21.860 She's an absolute hypocrite and liar. 1.00
03:18:23.560 Hey, I mean, I didn't say anything. 1.00
03:18:26.720 Let's let her, you know, 0.98
03:18:28.500 bury her own, 1.00
03:18:29.140 dig herself, dig her own hole here. 1.00
03:18:33.140 Guilty of tweaking with greasy hair, 1.00
03:18:34.980 says andrew simpson what else we got alberta climber so independent of government funding
03:18:41.680 as well okie dokie yeah exactly exactly oh my god
03:18:48.840 super villain says hope the monetization is worth it probably not lol hey man i'm putting all my
03:18:58.200 effort into this to try and save free speech i hate ads too if you look back in the in the in
03:19:03.360 the live i was ranting about how i hate ads as well but if you can if you can sit through some
03:19:08.620 of these i will change the ad setting next time because to be fair i may i may have put it on like
03:19:13.500 i you know what i'll i'm sorry i'm sorry i don't like watching ads either
03:19:19.240 daryl scurd says what in god's name does michael want to say that he's afraid to say he's kind of
03:19:26.640 afraid isn't he i was kind of getting that vibe too that he's kind of like petrified and neurotic
03:19:31.660 uh let's get through this oh my god we're almost done it's 15 minutes although the first 15 minutes
03:19:38.320 took us like an hour oh my god three hours oh my god okay let's do this guys come on let's go
03:19:42.560 let's go i'm out of my drink too or um if i can even jump ahead and and guess i i know
03:19:50.040 where some of michael's concerns are and i will say that this is an administrative body it doesn't
03:19:56.280 operate like a court. It has to follow. That means that the rules of evidence and how it proceeds
03:20:03.060 is softer than a court. What it does have to follow is rules of procedural fairness. It has
03:20:08.100 to be non-biased, has to be open, has to basically have due process rules in how it operates. And
03:20:13.220 ultimately, if there is an issue, it can be able to be taken to the federal court. So I'm okay with
03:20:19.480 that process. I think where you can have debate is what should be detailed in the legislation
03:20:23.860 versus later the later parts there's quite a bit of power to this commission this body of three to
03:20:29.000 five people to write the regulations for the social media companies to write out what they
03:20:34.860 want in digital safety plans to write out what they think kids safety should look like
03:20:38.380 hey kids safety came up only 45 minutes in 47 29
03:20:47.120 again this this is this sounds so redundant to me
03:20:56.640 i'm just writing notes just writing notes because i'm going to be you know making content
03:21:08.360 about this later to kind of cut through all this nonsense uh write digital safety so they want this
03:21:15.360 commission to write, uh, you know, some sort of manifesto of what they want big tech to do.
03:21:21.860 I'm sure that big tech already has this. This sounds so redundant. Does the UK and Australia
03:21:29.020 have their specific manifesto of what they want, uh, with their censorship? I mean,
03:21:35.320 it's crazy. Like it's like, I'm just picturing the U the UK digital safety commission,
03:21:41.980 the australia day safety commission now the canadian safety commission we want you to ban
03:21:46.780 child porn and it's like yeah we already we already do we already do that uh we already
03:21:55.260 try to do that that's we actually have thousands of employees here that already do that but no no
03:22:02.100 okay yeah okay sure you're gonna try to mess with us foreign government sure yeah yeah we'll we'll
03:22:08.400 look out for the child porn we're already doing that but okay i think that's okay because i think
03:22:14.160 that that is what's needed there are guardrails saying this is what you have to have and say a
03:22:18.360 safety plan so i i think it's landed appropriately um but um i think that reasonable people might
03:22:24.600 have different views about where that balance should be it's funny that you know it's not
03:22:29.340 government censorship isn't a problem but we're going to get people elected or chosen by the
03:22:34.740 government or the parliamentarians to tell big tech uh what is hate and what is harmful content
03:22:39.660 these people are lying through their teeth or they're incredibly naive or like i said earlier
03:22:47.080 they clearly are into they're being manipulated by tyrants to do this blink twice if you're if
03:22:53.940 you're being held captive emily blink twice if you're being held captive uh these i mean these
03:23:00.460 people can't be serious no there's not going to be any government censorship at all that's not a
03:23:04.520 concern at all for mine we just gotta like the right people thank you michael okay all right
03:23:10.560 hopefully i'll be reasonable in saying that that i do have some concerns and i'll start just by
03:23:14.300 noting first come on that we do need something to enforce this law and we don't i don't think
03:23:18.700 we have a readily readily available option so that it's not going to be the government you
03:23:22.660 don't want the government making decisions on content in this way it's not going to be the
03:23:25.280 crtc the broadcast and telecom regulator which i think is already out of its element in dealing
03:23:29.180 with certain internet related issues so we do need some hey okay so i mean there's not the
03:23:34.980 strongest points but they're points nonetheless okay uh i wish you'd harp on this a lot more
03:23:40.700 because there's a lot of criticisms you can give to the crtc and how they they are not set up to
03:23:46.160 control the internet that's a huge point you know you can basically say uh this has all this this
03:23:52.820 will like you know the crtc has been a failure for xyz and um
03:23:59.940 the crtc has been a failure and similarly why would you get a government body to do this type
03:24:08.440 of stuff um if the crtc the government body has already been a failure in doing this but he's
03:24:15.340 saying we do need something which is weird let's say we're going to see let's see what he's going
03:24:18.840 to say next something and i also want to emphasize that i think trust in this legislation really
03:24:23.980 depends in great measure on this commission that i think that you know some of the concerns that
03:24:27.620 you've been highlighting richard i think they are very real at the end of the day much of the public
03:24:31.160 confidence in this legislation is going to be dictated by this commission and i think that
03:24:34.240 there are from my own perspective reasons for interesting they make it seem like they're going
03:24:39.160 to get the avengers together you know who's this commission going to be the public trust
03:24:45.160 is going to determine who the commission is they're going to get fucking canada man they're
03:24:50.720 going to get uh mr america team america captain america they're going to get spider-man on the
03:24:57.120 safety commission what kind of bureaucrat are they going to get that's going to win the public's
03:25:03.460 hey we we found this bureaucrat from the the sewage and and the the swamp in ottawa and the
03:25:10.820 public is going to trust them concerns i mean emily's articulated you know why this you know
03:25:16.700 it's independence and i agree with that i don't see the independence issue as an issue uh i have
03:25:20.660 concerns about the size of the commission quite frankly when i think of just the wide range of
03:25:25.520 kinds of issues that are going to come up in this context you want people both that have uh expertise
03:25:29.840 on issues like freedom expression and privacy you also want them to have an understanding of hate
03:25:33.780 and harm and child uh endangerment and child related issues there's just and how these
03:25:37.800 platforms themselves function you want some technical expertise and then of course you
03:25:40.500 so something to point out here emily did the same thing when describing this is what the safety
03:25:48.580 yeah the safety commission should have this should have that should have this should have that should
03:25:52.460 have this none of this is in the legislation by the way none of this is clearly defined in the
03:25:56.740 legislation you are you guys are just kind of wishing this you're wishing that yeah yeah wouldn't
03:26:01.920 it be great if we had this perfect digital safety commission that didn't censor uh what the
03:26:06.940 government wanted to censor and was just so perfectly perfect on being you know being able
03:26:11.160 to solve complex subjective issues when it comes to human expression on the internet wouldn't that
03:26:16.120 be beautiful it's total utopian garbage what both of these people are talking about just wishful
03:26:21.760 thinking yeah no the government won't be tyrannical at all we just need to elect the right people
03:26:26.340 want people to bring an independent view to all of this i'm not convinced that the size of the
03:26:30.800 commission we have larger commissions certainly with the crtc is is the right size but even more
03:26:35.260 than that the kind of power that it does have is is is is remarkable in certain respects it can
03:26:39.760 issue rulings to make content inaccessible conducts obviously investigations it can demand
03:26:43.200 any information it wants from these regulated services it can hold hearings that if it decides 0.56
03:26:47.300 is appropriate under the circumstances can be held in secret the default is in it damn
03:26:51.540 i'm really glad that i watched this because this is a lot of great uh stuff on what's actually in
03:27:04.160 bill itself and this panelist is really asking incredible questions is open but it can say no
03:27:09.040 in this instance we want to hold it in secret that it will be establishing both regulations
03:27:12.720 and codes of conduct and this is very very broad the penalties are huge potentially up to six to
03:27:17.120 eight percent of global revenues and so it wields a lot of power and then it does act you know on
03:27:22.240 its own with some amount of informality and to me this is is is a source of concern so for example
03:27:27.360 the legislation specifically states that commission's not subject any legal or technical rules
03:27:31.200 of evidence and it speaks to the need to act informally and expeditiously and while i recognize
03:27:35.520 that it is positioned to establish some of its own set of rules you know with great power comes great
03:27:39.920 responsibility and at a minimum it seems to me that it was incumbent it is incumbent on the
03:27:44.320 government to flesh out in far more detail where the limits where the guardrails are around the
03:27:49.040 commission so that we aren't basically saying adopting a trust us approach with respect to
03:27:54.080 the commission as part of so many of the elements of how this legislation ultimately gets interpreted
03:27:58.160 and play that plays out this is so interesting because geist has supported like the bill in a
03:28:05.680 way like you know he's been saying there are some good things that we need to deal with
03:28:08.980 he's also like oh yeah all of this stuff about speech and criminal speech we should throw that
03:28:14.240 out and oh yeah the one thing that we should keep the digital safety commission we need to go back
03:28:19.160 to the drawing board there needs to be far more detail on this so what is actually good in it
03:28:24.660 like it's it's it's weird how geist has kind of implied that he supports it but when you get into
03:28:30.240 the minutiae he's like no it's you need far more detail here throw all of that out i i just maybe
03:28:37.760 i don't know maybe he's just being timid in this zoom call but uh it's interesting how he's kind
03:28:44.320 of implied that he supports the bill but he's also like oh yeah no we need far more detail here
03:28:48.120 that this could be a huge fucking mess.
03:28:51.680 A quick comment on that is okay,
03:28:53.940 because I will say, you know,
03:28:55.540 it's a really interesting point
03:28:56.560 about the size of the commission.
03:28:57.620 And I think that the list you gave, Michael,
03:28:58.880 of the expertise is exactly the list
03:29:00.260 of the kinds of expertise we need
03:29:01.660 so that supports the idea of a bigger commission.
03:29:03.940 I would say that the idea of rules of technical evidence,
03:29:06.500 that was the, so that is actually in most,
03:29:09.500 I mean, that is administrative law, isn't it?
03:29:11.300 That is what the Privacy Commission or CRTC,
03:29:13.720 I mean, all those regulators set up that way.
03:29:15.580 And same when it comes to the inspection powers,
03:29:18.120 that informality is is kind of core to what administrative tribunals are um and um and i
03:29:25.800 think that i wouldn't call them secret hearing i would just say that they're closed hearings and
03:29:29.080 courts have closed hearings at times too and there's some we really that's a bad pr work
03:29:34.600 michael we don't like secret they're just they're special they're they're interesting they're behind
03:29:39.560 closed doors okay it's just a you know the secret makes us sound like we're sneaky and we're up to
03:29:44.360 something really special scenarios here if you're talking about child sexual abuse images and
03:29:49.600 intimate images i think it's fair to say well we want to know the circumstances around it but
03:29:53.860 um you know that i suppose that is some of that trust element that that you set out but i mean
03:29:58.440 i mean just to be like the counterpoint to that would be shouldn't we know more if you know sure
03:30:05.040 these are sensitive things of child sexual abuse material but the public almost deserves to know
03:30:10.200 where that's coming from who's responsible for it who uploaded this you know like again we're
03:30:15.840 supposed to be defending kids i kind of don't feel comfortable having a bunch of bureaucrats
03:30:20.740 talking about that behind closed doors i'd much rather that that part be out in the open so we
03:30:26.180 can i don't know condemn vile people who distribute and produce child sexual abuse material but
03:30:33.620 instead we're going to have a bunch of ottawa bureaucrats privately talking about what we should
03:30:38.660 do about uh you know some of the most the the most vile content on the internet to me just to
03:30:44.960 be devil's advocate maybe shouldn't shouldn't that be more uh public information so we can
03:30:50.620 condemn these these uh vile evil monsters but it's about censoring speech it's not actually
03:31:01.240 about stopping pedophiles it is consistent too with what we see with what courts courts do and
03:31:06.560 that power to take down images is extraordinary it is but it is for two things it's child sexual
03:31:12.040 abuse material and it is for intimate images um i mean once again if they brought if they brought
03:31:18.240 up the canadian company that's a huge like you know porn empire this this what they're saying
03:31:23.400 right now would sound a lot more genuine um but they don't they just they just imply that uh no
03:31:30.700 it's only going to be facebook where people are going to be uploading uh cp that's that's child
03:31:36.380 porn this sucks like you know i don't i i can't even believe that this is i turned off the ads 0.64
03:31:41.520 by the way i turned off the ads by the way i'm surprised they're even sharing ads i'm swearing 0.82
03:31:45.900 i'm saying swear words anyway we're almost through it for me i've always landed that you need a quick
03:31:52.440 and easy remedy in that particular case and if you were setting it out to your hearing or tribunal
03:31:57.420 just don't even bother what you want as a commissioner with that power that is consistent
03:32:00.540 with what australia is doing with their e-safety commissioner thank you i'd like to go briefly
03:32:05.340 because i'm looking at the timeline for the time the issue of peace bond has been uh of attention
03:32:10.940 um so i maybe one of you could explain what a peace bond is but but one of the when i was
03:32:16.460 speaking to the government one of the one of the examples they gave me is that let's say
03:32:20.460 jane john doe is writing crazy anti-semitic stuff on the internet um and one that that
03:32:27.100 that raises a lot of a lot of question uh and the fact that there could be a peace bond with all the
03:32:32.060 process that that that peace bond that that all the all the the steps that need to be gone through
03:32:38.220 before a peace bond is issued uh that would be okay or reasonable to have all you know this john
03:32:42.940 doe should not be allowed within 100 meters of a synagogue or jewish community center and the like
03:32:47.420 so that's often the the example that that the government has has given us and and i would like 1.00
03:32:52.060 to hear your thought is is the peace bond in those circumstances does it make sense to use
03:32:57.580 that tool that's already in existence in the context of online online arms uh michael and then
03:33:02.220 emily okay well first of all i mean there's an online element there may be an offline element
03:33:05.820 of course there as well you know i i think this is a more defensible uh provision i think there
03:33:10.460 is still i think going to be sources of concern in part because this ventures peace bonds into
03:33:14.940 an area that we traditionally wouldn't see them so you know the idea of peace bonds which which
03:33:18.540 one might think of as prior restraint you know is is used in a range of different spaces and
03:33:22.620 oftentimes one can well understand why you would want to use it if someone is is fearful of assault
03:33:26.700 in a domestic situation you want to you may want to ensure that you've got a restraint put in place
03:33:31.260 to to try to guard against the event actually happening and that's what this is that example
03:33:36.620 is a real life violence uh domestic assault okay so the crime that's already illegal trying to do
03:33:43.020 it's you've got enough evidence to believe that this is likely to happen you want to prevent it
03:33:46.780 it from happening rather than find yourself in a position where you now have to remedy after the
03:33:50.880 fact and that and the harm has been caused putting this in so so michael geist supports
03:33:57.920 pre-crime the speech context though is different and because it is certainly unusual to do it and
03:34:03.640 so i can well understand why there would be concerns associated with it there are a significant
03:34:07.580 amount of approvals and guardrails that are built into the process and so i think it's something
03:34:11.800 that i think we we need to be working through with respect to um the review that will ultimately
03:34:15.900 take place before a committee to better understand you know the precise uses and the guardrails that
03:34:19.660 exist i think as a concept the idea of trying to prevent harm when we have a real notion uh there's
03:34:24.540 a real likelihood that it is going to occur is certainly better than sort of after the fact
03:34:28.540 trying to remedy or deal with the harm that takes place and i further think that in we gotta listen
03:34:33.340 to that again this is really this stuff's important through with respect to um the review that will
03:34:39.580 ultimately take place before a committee to better understand you know the precise uses and the
03:34:42.940 guardrails that exist i think as a concept the idea of trying to prevent harm when we have a
03:34:47.420 real notion uh there's a real likelihood that it is going to occur is certainly better than sort of
03:34:51.820 after the fact trying to remedy or deal with the harm that takes place and i further think that and
03:34:55.660 we've referenced it a few times that in some ways the criminal code provisions have been
03:34:59.820 tarnished by the inclusion of things like the life in prison that it has muddied the debate around
03:35:04.300 some of the other provisions as well and and that has made made much of this discussion much more
03:35:08.140 difficult wow i did not see that coming from from mr geist so that's because he supports one of the
03:35:18.520 most egregious things in my opinion that is by far one of the most egregious things you know
03:35:23.880 what if we can prevent the harm we should so all of a sudden all of a sudden you know punishing
03:35:29.480 people peace bonds house arrest ankle monitors based on a suspicion that someone might commit
03:35:36.700 the hate crime based on what exactly because there's safe guardrails that are like we're
03:35:43.920 really good at predicting this is this is uh minority report this is minority report
03:35:50.140 and michael geist supports this that's disappointing that's very very disappointing
03:35:57.160 come on michael this is this this is this is um also
03:36:03.400 does this exist in the uk and australia because this this this part of the bill is especially
03:36:11.380 insane let's see what emily says thank you emily i entirely agree with with michael i think that
03:36:18.500 there this is where we can maybe clarify understanding is there this is not a thought
03:36:23.880 crime a peace bond it's it's um you know this isn't minority report is it totally it totally
03:36:28.960 is it totally is you're going to prevent crime before it happens based on a hunch because
03:36:35.400 somebody said something hateful or anti-semitic online what was once again we've been talking
03:36:40.380 about hate this whole time no one's defined any sort of oh the the supreme court definition is
03:36:46.000 very clear you guys haven't clarified anything this has been accused to be what but there is a
03:36:51.400 legitimate discussion to be had about whether it should be used in the context of a speech crime
03:36:54.640 and i suppose you could say speech crime that's a new one it's not it's not guys it's not it's not
03:37:03.840 minority report but speech crime is real yeah it can be used right now for intimate images um so i
03:37:10.680 guess you could call that you know that's essentially a communication and speech as well
03:37:13.660 um but it is a de-escalation tool so it's actually soft supposed to be a softer mechanism than the
03:37:20.280 other so there might be a legitimate use for it um but i am waiting to see uh you know how this
03:37:26.200 is discussed at committee and and how they might ensure some constraint of it but but i do think
03:37:30.440 that that one there there might be a real uh potential value for that one to to help address 0.97
03:37:35.060 hate thank you last one that's crazy man she's running out of gas she's running out of gas in
03:37:41.020 this zoom call no that's a great way to address hate yeah that'll work that'll work great
03:37:46.360 question um so you are uh given full authority to legislate on online hate in in canada and you
03:37:57.220 have ultimate power to decide what to do with c63 uh what do you do do you adopt the bill as is uh
03:38:02.520 do you tinker with the bill but ultimately adopt something that is you know looks like c63 do you
03:38:08.300 start from scratch um or do you say you know what i it's just a question of of using existing
03:38:14.840 legislation there's enough tools in the toolbox to combat this this hate online and we should push
03:38:20.120 uh for proper enforcement of of existing legislation emily then michael i would say
03:38:26.600 that the online harms act proper that's regulating social media i think that could be passed now it's
03:38:31.480 just minor tinkering with it like i think there's a few things i would change about it um the criminal
03:38:35.880 code provisions major concerns major revisions uh scrap it or or i think they need to to make
03:38:42.040 significant changes to that i the canadian human rights act oh i i'm now struggling with that i
03:38:46.680 think if you even asked me a few days ago i'd say no pass that one as is they've made important
03:38:50.920 advances but that weaponization issue really is weighing on me at the moment thank you michael
03:38:55.640 yeah i think i hinted at at least my starting point here which i would separate out the bill
03:38:58.840 and so i would separate i would put the criminal code and human rights act provisions perhaps in
03:39:02.120 their own separate bills and have a solid robust debates around both of these issues which would
03:39:06.760 place the online harms piece if not on a fast track because i do think that there are still
03:39:10.040 some areas where there is need of reform and there is need for study and potentially identifying
03:39:14.660 issues that are not addressed in the legislation that perhaps could improve the legislation
03:39:19.160 itself. So I certainly wouldn't toss it aside. I would seek to separate it, so to speak,
03:39:22.420 not a Solomon-esque separation, but rather moving ahead directly with the elements on
03:39:26.480 the online harms piece. The one other piece, though, that I would say is that could improve
03:39:30.620 the legislation itself. So I certainly wouldn't toss it aside. I would seek to separate it,
03:39:34.160 so to speak, not a Solomon-esque separation, but rather Solomon-esque?
03:39:41.000 Is that a reference to the Bible?
03:39:47.640 Interesting.
03:39:49.300 They're moving ahead directly with the elements on the online harms piece.
03:39:52.820 The one other piece, though, that I would say is that I do not think that government gets to hold up a mission accomplished sign by saying, hey, we introduced this.
03:39:59.480 It even doesn't get to put up a mission accomplished sign, even if it passes this.
03:40:02.200 There is a need for leadership right now when we deal with some of these issues.
03:40:05.480 And that means using the tools we have right now, both in showing leadership, in speaking out against anti-Semitism and hate, both online and offline, enforcing the laws we have right now.
03:40:14.920 So government certainly needs to pay attention to this and prioritize this from a political perspective.
03:40:18.720 But there is so much more that needs to be happening right now.
03:40:21.140 And I think we've really suffered, at least in part, because governments at multiple levels haven't stepped up in the way that they must when it comes to this issue.
03:40:28.080 Wow. That's I mean, Mike, Michael is sounding like a tyrant himself right now.
03:40:33.900 he's basically saying hey we should we should be arresting people by now we got plenty of laws that
03:40:38.860 we could use to you know fuck with these anti-semites that that's that's what i'm hearing
03:40:43.440 from this but am i reading that properly so thank you very much both of you before i i let my my 0.66
03:40:49.160 chair finish uh close do the closing remarks i want to personally thank you i wanted to say to
03:40:53.580 the people uh try to go i tried to go through all the questions that that were in the chat okay okay
03:40:57.960 that's good enough wow wow guys wow we made it through we made it through oh my gosh i'm
03:41:09.600 exhausted i'm exhausted that was um that was really good though that was really good stuff
03:41:15.720 that was um really good clips in there lots to go over uh man
03:41:24.820 Michael Michael Geist I mean there there was some good things that Michael brought up
03:41:30.900 he is more of a legal expert than me so I do at least respect that you know there are some
03:41:37.500 decent clips there are some decent points that he brought up that I'm definitely going to 0.62
03:41:41.900 weaponize to try and stop Bill C63 um but uh yeah I'm also thinking that like you know
03:41:51.280 when it comes to politics and pushing for legislation or pushing for what you want
03:41:58.780 the same tactics and the same sort of principles apply as they would for sales or for negotiation
03:42:06.820 and if you look at like the art of negotiation um and how to get what you want when negotiating
03:42:15.000 um you ask for more than that you want you ask for more than what you truly want
03:42:23.960 um in sales they kind of make like an example in sales would be uh price anchoring so let's say
03:42:33.340 you're sales you're selling something and you want to sell it for uh you know 50 bucks okay
03:42:40.540 you can you can an easy way to do that is to say hey this is a hundred dollars but it's actually
03:42:48.160 on sale for 75 and then the and then the person there is like i don't know if i really want 75
03:42:55.680 seems like a little bit much and then you say you know what how about this i'll give it to you for
03:43:00.340 50 i'll give you a deal and then and then the person walks off thinking they got a great deal
03:43:05.380 hey i got this for 50 bucks when in reality the salesperson was like well you know i i didn't lie
03:43:12.040 but like i misleaded them and said yeah i want 100 bucks for it this is how much it's worth
03:43:15.800 and then they get haggled down negotiated down to 50 which is where they wanted to sell it for right
03:43:21.060 i suspect the same thing because let's face it you got to give credit to this you know this
03:43:27.400 canadian regime they've been messing with us they've been pulling off insane stuff in terms 0.87
03:43:31.560 of robbing our rights in Canada. 0.93
03:43:34.540 They know what they're doing.
03:43:35.700 The propaganda of Justin Trudeau
03:43:37.060 should not be taken lightly.
03:43:40.900 They've brainwashed our nation
03:43:42.320 to hate themselves, guys.
03:43:44.000 They're doing a good job
03:43:45.400 in terms of what their goals are.
03:43:47.580 And along with that,
03:43:48.400 when it comes to passing this legislation,
03:43:52.000 I'm now thinking that
03:43:53.360 this is part of their negotiation tactic
03:43:57.340 to pass parts of Bill C-63, right?
03:44:01.380 So, I want to just get the Digital Safety Commission.
03:44:05.840 I just want to get the Digital Safety Commission
03:44:08.100 so the Canadian government can start to control the algorithms
03:44:10.940 and start to slowly censor Canadians, right?
03:44:13.640 So, I'm not going to start by just saying that's all I want.
03:44:16.440 No, I'm going to say that I'm going to pass a bunch of speech laws as well
03:44:19.300 to broaden the definition of hate speech.
03:44:21.460 And I'm going to threaten to bring back the Human Rights Act
03:44:24.540 or the Human Rights Code, Section 13 of the Human Rights Code.
03:44:28.300 That's $100. It's a lot.
03:44:30.520 And then people are going to push back.
03:44:32.080 That's way too expensive.
03:44:32.920 That's way too expensive.
03:44:33.700 That's way too much.
03:44:34.500 As you were seeing in this kind of chat with Seja.
03:44:37.020 And then they're like, you know what?
03:44:38.940 How about this?
03:44:40.480 How about I'll just sell it to you for $50.
03:44:43.700 And we just passed the Digital Safety Commission.
03:44:46.680 And we'll take out that other stuff.
03:44:48.440 And then everyone's like, yeah, all right.
03:44:49.660 I got this thing for $50.
03:44:51.020 When in reality, we got fucking ripped off.
03:44:53.560 And now the government is going to add another layer of censorship
03:44:57.600 onto onto social media platforms so they're going to be able to censor people even more effectively
03:45:02.940 um that's the net result i mean it's certainly not it's certainly not as um
03:45:10.160 like you know the pressure would obviously be off in terms of like just being able to
03:45:15.760 easily throw people in jail and persecute them but still i i reject that entirely
03:45:20.080 just because australia and the uk did it doesn't mean that we need it too i need to do
03:45:26.620 specific research though on the differences uh of this of what the uk and australia has
03:45:33.300 and what they're suggesting with us i know there's a lot of there's a huge amount of
03:45:38.660 similarities i'm sure but um i'm just yeah i i mean you know what it doesn't even matter
03:45:50.720 we're gonna get the whole bill thrown out anyway guys if you want to support my mission to save
03:45:55.180 free speech in canada we have a documentary coming up that's going to be exposing all of this stuff
03:46:00.020 and it's going to be very very um good it's going to be illuminating for people we're going to get
03:46:05.040 more canadians into the fold and upset about what's happening in our country and letting them
03:46:08.940 know how they can do something about it um you can go to give send go.com save free speech
03:46:15.640 that's how you can support this documentary is going to be like as i said exposing the people
03:46:21.580 exposing the true agenda behind these censorship bills behind this tyrannical bills hey we got a
03:46:26.680 huge donation by man on the mountain hey shouts out shout out to you man thank you so much for
03:46:33.080 the hundred dollars towards saving free speech in canada most important issue in canada right now
03:46:38.080 thank you so much sir appreciate the donation i assumed his ginger his gender because he said
03:46:44.140 man on the mountain anonymous giver gives 20 thanks for fighting this essential fight thank
03:46:49.320 you sir thank you sir for the donation appreciate it um and yeah we're also on social media
03:46:57.140 we're also on facebook and twitter at sfs canada um that's sfs canada let me bring it up
03:47:09.240 here we are on twitter
03:47:14.220 sfscanada and it's the same at on facebook so if you go to facebook.com slash sfscanada
03:47:22.280 the same thing will come up and our website's savefreespeech.ca and on instagram if you go
03:47:29.460 at savefreespeech.ca one word you'll also get our instagram well lots of stuff coming down the pipe
03:47:36.640 that was a really productive uh really productive stream actually i'm really glad he went over all
03:47:41.040 that i know it was very long it's uh shout outs hey oh sevens in chat if you hang hung around the
03:47:46.780 whole time it's amazing yeah edgy tv says even if bill 63 doesn't pass that they'll still
03:47:54.660 relentlessly try to censor and attack us yeah exactly you know the cops are already showing
03:47:58.960 up at people's houses for what they're posting online so uh hello uh it's crazy how watching
03:48:06.120 that stuff will like trying to lull you to sleep of like maybe maybe the legislation isn't that bad
03:48:10.120 no no eye on the prize they're tyrants we know what they want we know what they want
03:48:17.660 andrew simpson diversity is a strength amongst our own greg greg does great work oh seven thanks
03:48:24.420 andrew simpson i mean diversity of thought is great we love that neuro-linguistic programming
03:48:32.700 says art story I wonder what you're referring to there 420 all day says that was fucking painful
03:48:40.140 yeah yeah yeah wow that honestly I'm glad we sat through all that there's a lot of good stuff in
03:48:49.360 there John Smith do you simulcast any everywhere I do not right now I'm only streaming on YouTube
03:48:55.740 I do want to change that I want to set up my rumble channel and do all that post in chat
03:49:02.060 where you want me to stream right now post in chat where you want this stream to be other than
03:49:08.980 youtube uh i might invest in like a streaming service that streams to different platforms
03:49:14.900 blah blah blah but please in chat let me know where you want this to stream guys thanks so
03:49:20.580 much for hanging around we gotta wrap it up it's been it's been fun hanging out with you guys
03:49:26.600 it's been fun it's nice to have the uh the group chat there hanging out
03:49:32.520 i really enjoyed the kind of funny comments bullying some of the people that we were
03:49:38.340 responding to it's a good it's a good it's a good like spirit in the spirit of saving free speech i
03:49:43.900 think we should probably kind of be poking fun and bullying people a little bit and making fun 0.97
03:49:47.440 of them um i feel like i need to shower though after watching all those neurotic jewish people 0.96
03:49:56.160 go on about how worried they are about everything oh my god oh my god um yeah do you know how many 0.95
03:50:04.840 churches have burnt down in canada like burns to the ground gone and again i actually i kind
03:50:12.900 of resent the fact that i'm playing you know victim olympics but from a victim perspective
03:50:18.840 as emily was saying we need to look at this from a victim perspective
03:50:22.420 um daryl skirt says rumble does rumble charge for rumble studio uh probably i should look
03:50:31.380 into rumble studio actually that's a good idea lee stewie says telegram trent dabb says stream
03:50:37.920 everywhere john smith says everywhere edgy tv says rumble 420 all day says cnn maybe i should
03:50:44.820 stream on cnn good idea that might work uh like 150 by now what was that daryl skirt says rumble
03:50:56.200 and ember tree would be good to you to g gtv might be a good option too yeah well hey guys thanks
03:51:04.000 thanks for hanging out thanks for spending time i'm starving i have all these notes to now
03:51:10.620 organize which is great but uh a lot of great clips in there a lot of great clips in there
03:51:17.420 i think to be safe i'm just gonna download that video right now too just to have like a raw version
03:51:21.980 of it all right guys thanks again for watching shout outs to do to flip sr5.twists shout outs
03:51:32.060 to lee stewie for hanging out raging against machine daryl scurd trent dabs thanks for
03:51:37.900 for moderating edgy tv and lee for moderating andrew simpson fight for freedom thanks for
03:51:45.740 hanging out derrick nelson as an american in minnesota we're going through our own things
03:51:51.080 but i have so much love for you all in canada if there is going to be a fight i want to be there
03:51:55.880 with you appreciate it man appreciate it it's always nice connecting with americans who get it
03:52:03.360 uh i feel i feel like americans are on another level though in terms of opposing this tyranny
03:52:09.380 and just really hating it and standing up and having having no patience for it but thank you
03:52:13.560 for hanging out derrick uh kristin benny at some point it would be nice not playing defense and be
03:52:19.380 ahead of them yeah totally totally absolutely totally agree and and that's that's why i'm
03:52:28.020 trying to you know get ahead of the conservative party on this issue and to try and help them lead
03:52:34.020 the charge or not help them lead the charge but lead the charge to for them to like fall in line
03:52:38.760 uh because sure we're opposing the legislation but kind of in the background or at its core
03:52:45.120 this is about saying fuck political correctness this is about saying like like actually fuck
03:52:52.100 woke progressives and this sort of tyrannical attitude they have towards everything you know
03:52:57.700 Canadians are already afraid to speak their mind 0.69
03:53:01.260 we need to change that
03:53:03.400 not only do we need to stop this bill
03:53:05.280 but we need to encourage Canadians
03:53:07.480 to actually use your free speech
03:53:09.720 while you have it
03:53:11.120 let's turn this around
03:53:13.900 I feel like this is a good
03:53:15.700 moment in our history
03:53:18.180 to be like hey remember when we almost lost free speech
03:53:20.340 yeah let's start to say fuck you
03:53:21.900 and turn this ship around
03:53:24.420 okay
03:53:25.420 it's absurd
03:53:27.100 anyway guys thanks so much for hanging out
03:53:29.960 07's in chat
03:53:32.220 we will
03:53:33.620 talk to you on Thursday
03:53:35.280 I'm going to be interviewing that gentleman
03:53:37.460 who got his CBC
03:53:39.380 canned
03:53:40.600 his interview canned off of CBC
03:53:43.220 but thanks for hanging out once again
03:53:45.760 have a good evening
03:53:48.280 I'll talk to you soon
03:53:50.320 maybe I'll play a song
03:53:51.920 next time
03:53:52.580 later
03:53:57.100 Thank you.
03:54:27.100 Thank you.
03:54:57.100 Thank you.