00:06:45.380good evening ladies and gentlemen um how are you doing how are you feeling
00:06:55.980uh another stream another stream to do oh man how do these guys do it how do these guys do it it's
00:07:06.640all the lights everything how are we feeling how are we feeling i saw trent and trent dabs
00:07:11.580in chat said what about the chocolate milk see the thing i love chocolate milk but
00:07:19.000chocolate milk is really bad for broadcasting or singing or like anything to do with speaking
00:07:25.600or using your mouth because it's just so thick and delicious it's literally
00:07:32.880the worst thing though if you're trying to be a broadcaster and trying to talk and
00:07:38.560stuff um but yeah we're gonna get into some stuff i'm just gonna try to clean this up a bit this
00:07:46.300looks like a mess look at this so come on i eventually would like to be you know
00:07:53.800streaming on rumble and other platforms as well but you know i think it's most important to just
00:08:00.360actually be here to actually stream you know gotta stream we gotta make sure that i'm actually
00:08:06.800streaming streaming comes first always gotta be there for the people
00:08:15.140um yeah we're trying to save free speech in canada if you're just tuning in i'm just setting
00:08:21.640some stuff up for the stream here. What are we going to be talking about tonight? We're going
00:08:24.900to be going over this CIJA video. CIJA is the Centre for Israel and Jewish Affairs. They actually have
00:08:34.140a debate about Bill C-63. Spoiler alert, I'm pretty sure they're all for the bill. They have
00:08:41.740some concerns with some aspects of it, but they are more or less pushing the bill.
00:08:48.720So it's a good learning opportunity for us.
00:08:51.200If you've tuned into the stream before, it's all about steel manning the argument, steel manning the argument in favor of, you know, stopping free speech or to pass this bill C63.
00:09:06.540I want to look into that chat that they have. We're going to be reacting to some other content and also going over the fact that, in case you didn't know,
00:09:13.420bill c63 is already being included in the federal budget, or at least being talked about
00:09:21.100and budgeted out: 300 employees, 200 million dollars to police your internet comments
00:09:28.220the thought police are coming and uh yeah it's kind of okay and kind of fun to make fun of it
00:09:35.820now well we should make fun of it even if it were to pass but it's best to just humiliate these
00:09:41.060tyrants now so the bill gets thrown out and then we don't actually have to deal with this
00:09:45.980in real life you know let's do this thing called standing up standing on guard for thee
00:09:50.940and defending the freedoms of this country which made it so good to begin with
00:09:55.600uh yeah it feels like we're in the fourth quarter and we really need to like pull off
00:10:00.500a miracle here but i think we can do it i think we can do it um i don't know what this
00:10:07.960page is going to look like but hopefully let's see this might be a mess one second here oh there's
00:10:12.960nothing there let's see what this looks like you guys can't hear me it's gonna be quiet
00:10:22.660for a sec just everybody sit tight okay it's gonna be an awkward silence here just sit tight
00:10:38.880And we're back. Did you miss me? We're back. Did you miss me?
00:10:44.600All right. Is that looking good? All right. All right. All right.
00:19:42.540The report looks at the federal government's pledge to establish a digital safety commission to regulate social media companies and force them to limit harmful content online.
00:20:12.540online harms, which, if you witnessed it during COVID, yeah, it was a whole industry of
00:20:18.900censorship and silencing people who, uh, ended up having the right opinion on, uh, on things
00:20:25.860during lockdowns. So, uh, yeah, they already have a censorship problem. Quite frankly,
00:20:31.960a lot of the big tech platforms probably aside from X, but, um, but yeah, this idea that no,
00:20:39.400the government actually is going to do a better job at censoring the internet.
01:21:47.980The online harms piece, the platform piece, is really the main part of the legislation.
01:21:56.320The liabilities might be for things that take place on their platforms.
01:21:59.480And then there is, in a sense, almost the second bill that includes changes to both the Criminal Code and the Canada Human Rights Act that is focused on hate and more particularly on individuals, not on the platforms.
01:22:11.160Hate on individuals, not on the platforms.
01:22:15.820So that has nothing to do with protecting kids online and everything to do with targeting dissidents, targeting people who are saying hateful things.
01:22:25.500the online harms piece the platform piece is really the main part of the legislation
01:22:32.680it focuses on social media services and it targets seven harms in particular sexually
01:22:38.700victimizing children bullying inducing a child to harm themselves extremism or terrorism inciting
01:22:45.440violence fomenting hatred and intimate content that is sent without consent often referred to
01:22:50.920as revenge porn so i'm just realizing now i think one of the biggest
01:22:57.020vectors of attack of this bill is going to be bullying believe it or not because bullying
01:23:02.720although it may have a negative connotation for some people who are sensitive to that
01:23:08.680it's also the flimsiest sort of premise so you're gonna stop bullying on the internet really who's
01:23:18.940going to define what bullying is like that is that's crazy that's it's such a crazy premise
01:23:27.380that you're going to stop bullying you're going to stop bullying on the internet is that right
01:23:33.960like that just sounds so absurd to stop all bullying can we make a meme about this can we
01:23:45.320make a meme about that no that's bullying says who the government the person complaining
01:23:49.780online like what a mess what a mess that these people have the audacity to say that
01:23:56.860we're going to control the internet to the extent that we are going to stop bullying crazy those are
01:24:03.080the harms there are also for these platforms that are subject to this three duties
01:24:08.220there's a duty to act responsibly that'll i hate this so much because i've listened
01:24:14.040to this stuff in various forms already but like hey here's our grocery list of things that
01:24:18.760we're gonna stop we're gonna stop this and this and bullying and this and then there's like these
01:24:23.440are the three things that big tech needs to do they need to act responsibly they need to act
01:24:30.640responsibly like is that really in the legislation to act responsibly it's just
01:24:41.140such gobbledygook to try and pass nonsense, obviously. Hey, we're going to pass
01:24:48.900a really serious piece of legislation. What do we want these big tech platforms to do? Act responsibly. Yeah,
01:24:55.220we're the Canadian government. We want big tech to act responsibly. What does that even mean?
01:25:00.700I'll come to in just a moment. There's a duty to protect children, which is largely focused on how
01:25:06.420they design their platforms to better protect kids like it's correct once again michael geist
01:25:11.260it's crazy we'll see if he mentions it but like do you not think i'm hoping he'll
01:25:18.280mention this i'm hoping he'll mention this you know what i'm gonna stop talking because there's
01:25:22.420a lot more to add into maybe he will mention it this is just the introductory statement but
01:25:25.620i was just gonna say there's already a lot of things that big tech does to try and protect kids
01:25:31.920so you know they talk about we need a better way to report things on social media there's literally
01:25:38.020a report button everywhere on big tech i can do it right here i can do it right here look
01:25:45.260i could report this video it's everywhere on any big tech platform there's a report button
01:25:52.040but anyway let's keep going and there is a requirement to make certain content inaccessible
01:25:57.420essentially to block content really only two kinds of content revenge porn and child
01:26:02.860endangerment the duty to act responsibly for these platforms is really the the core element
01:26:08.780of this legislation with platforms it would include things like ensuring that you've got
01:26:13.260the ability for users to flag content that may you can already do that you can already flag
01:26:20.340content they all have a flag button violate the law the ability to block other users so if you're
01:26:25.680a social media the block button canadian government literally is like hey can you guys make a block
01:26:31.520button oh you already have that yeah you guys are writing legislation for the internet and you're
01:26:36.000suggesting a block button tell me you know nothing about the internet without telling me
01:26:40.800you know nothing about the internet site you're being harassed you have the ability to block them
01:26:45.920and a requirement to create a digital safety plan where you identify how you're going to deal with
01:26:51.440this be far more transparent about how you deal with these kinds of harms big tech is going
01:26:59.280like big tech already probably behind closed doors hates these different western countries coming up
01:27:06.640with all of this nonsense hey can you make a report about what your digital safety plan is
01:27:11.060so now either meta and google have to play ball and come up with all these like reports for
01:27:18.700the canadian government or more likely they might just say f you we're not even going to operate in
01:27:24.140canada anymore and you may think that's ridiculous but of course bill c18 the online news act you
01:27:29.340already can't get news on meta platforms instagram and facebook anyway all of this is overseen by
01:27:35.500a new digital safety commissioner and there is also the creation of an ombudsperson that will
01:27:40.620assist and emily may get into that excuse me now the second part of the bill involves several
01:27:47.240changes to the criminal code, including the prospect of a new offense where another offense
01:27:53.220is motivated by hatred, and that carries penalties that could go as high as life in prison. It's been
01:27:58.720pretty controversial, and I'm sure we'll come to it. There's also a new provision involving
01:28:02.180genocide or promoting genocide, which also carries life in prison, and the creation of a new peace
01:28:08.800bond, essentially a prior restraint, the ability to have some prior restraint in and around some
01:28:15.040of these issues there is also within the canada human rights oof oof come on michael geist are
01:28:22.880you going to be honest about the pre-crime stuff come on bro the return of section 13 that
01:28:27.920many here may be familiar with it was quite controversial it was in the
01:28:33.020law then it was removed it's back again and it creates a provision dealing with
01:28:37.640the communication of hate speech and the prospect of complaints to the human rights commission and
01:28:42.420and then potentially later to the tribunal, which would create actual real liability to the victim
01:28:46.880and potentially to the government as well. So that's the core of the legislation very quickly.
01:28:52.080If I might, just for a couple of minutes, provide you with a quick assessment of sort of the good
01:28:56.480and the bad. On the good side, I think there were many that breathed a sigh of relief that
01:29:00.560what the government had initially in mind, much of that was tossed to the side. And instead,
01:29:05.820I don't even know if that's necessarily true, because I feel like a lot of the stuff in the
01:29:11.140previous legislation is still here it's just coded slightly differently uh so i'll have to
01:29:19.340fact check uh michael on that because i feel like a lot of the stuff in bill c36 is still in bill c63
01:29:25.780so i don't really know what he's talking about there that really did listen to the
01:29:33.340experts particularly around the issue of online harms and i think some of the approaches that
01:29:38.360they've taken really are quite sensible we can debate some of this around the edges
01:29:43.480but it's a pretty good starting point do you really believe that michael you're the guy that
01:29:48.780opposed bill c11 this guy opposed bill c11 man and bill c11 was you know not so much about speech
01:29:57.500but about like basically giving the keys to big tech algorithms to the government
01:30:04.020and he was talking about how silly this is and getting into the minutia of how silly this is
01:30:09.900and now he's kind of saying hey you know what they turned bill c63 around i think
01:30:17.080there's a lot of really good things in here are you sure about that so i have relief that
01:30:22.700what the government had initially in mind look how erratic this
01:30:28.620man is right now much of that was tossed to the side and instead really did listen to
01:30:33.940the experts particularly around the issue of online harms and i think some of the approaches
01:30:38.880that they've taken really are quite sensible we can debate some of this around the edges
01:30:44.200but it's a pretty good starting point debate around the edges you're a lawyer
01:30:50.740should there not be zero debate around the edges if it's good legislation right like just as
01:30:55.940a base point mr geist dr geist i don't know like should there not be a sort of like
01:31:03.220no debate well anyway we'll see how the debate goes but doesn't seem like he's really opposing
01:31:09.100the free speech concerns like at all which is really concerning i feel like i kind
01:31:17.380of saw this coming because you know i was a big fan of michael geist i like his work i still look
01:31:21.640at his work of what he talks about when it comes to this bill but when bill c63 came out i saw him
01:31:27.560kind of passively being like oh there might be some problems but i guess i support it and it's
01:31:31.940like oh boy i was hoping he'd be more of a champion to oppose this just on the principle of i don't
01:31:37.000know the charter of rights and freedoms uh the potential way that this bill could be weaponized
01:31:41.100for political reasons the fact that there's a lot of evidence that uh laws have already been
01:31:47.160weaponized for political reasons um against political dissidents anyway where there are
01:31:53.360concerns i think that frankly the duty to act responsibly is a little bit both a feature and a
01:31:58.320bug we know the broad brush strokes of what that might include for a platform but there's
01:32:03.700a lot of uncertainty still so it's easy to like it when we don't know the specifics perhaps when
01:32:08.380we get to the specifics there may be some concerns i think the oversight mechanisms definitely are a
01:32:13.460cause for some concern there is this new digital safety commissioner i believe there's a lack of
01:32:17.640oversight a real lack of specifics especially with what are truly quite enormous powers that
01:32:23.620are vested in the digital safety commissioner and i believe trust in this
01:32:28.460legislation is going to depend in large measure on trust in how it's enforced with the digital
01:32:33.600safety commissioner the other major concern quite frankly has okay time stamp this is a good clip
01:32:40.300i'm an expert i'm michael geist and a lot of this really depends on how much we can trust these
01:32:47.600people because we're giving them a lot of power uh and if and if they're ideologically biased
01:32:53.900that would probably be a big problem that's basically the translation that's what he's saying
01:34:57.880But yeah, so it was: I'm concerned about the bias that people might have and the power that people might have on the Digital Safety Commission.
01:35:07.340And then, yeah, I don't know if throwing people in jail for life imprisonment for speech is necessarily justified.
01:35:13.800That's as strong as we got in the Zoom call.
01:35:17.300But maybe he'll have some stronger points later on.
01:35:20.000Let's see what the opposition says.
01:35:22.500And having by having this overview of the bill, Emily, if you could share your perspective for a few minutes.
01:35:29.120Yeah. Thank you, Richard. And thank you, Michael, for the great overview.
01:35:34.020And I think that we do have many shared concerns.
01:35:37.940And I'm sure we could just sit down and sort out the tweaks that need to be made to this legislation.
01:36:45.380uh yeah maybe we should tweak it yeah maybe we should throw it out actually i think we
01:36:51.460should throw it out because uh you guys are bringing up so many massive loopholes here and
01:36:55.900you can't even seriously talk about this bill without both of you being kind of fidgety so
01:37:00.820complement this too by talking about how this would work from a victim-centered perspective
01:37:06.140oh what how this would work from a victim-centered perspective so you know how we know the term like
01:37:21.500climate change and like anti-racist and not just
01:37:27.800being not racist but being anti-racist these terms didn't just fall out of the sky these terms were
01:37:34.100workshopped by people they were workshopped by liberal ministers and other people in academia
01:37:39.100to try and push forth what's the word total nonsense bullshit that sounds all nice and
01:37:47.080fluffy and progressive and this victim-centered perspective oh my god because
01:37:55.680you got to understand guys if this bill passes which it will not but if miraculously i get hit
01:38:01.660by a bus and this bill passes um that's going to be like a term probably used like five years down
01:38:08.580the line well as in a victim-centered perspective this is called a genocide see how
01:38:15.520this person said hey i don't like your politics i'm going to work out at the gym that's actually
01:38:21.300called a genocide and if you take the victim-centered perspective then we need to throw this
01:38:27.300person in jail forever because the feelings of the victim really should determine whether or not
01:38:32.660someone should be locked up and have any freedoms or not let's listen to the victim let's listen to
01:38:38.080the emotions and thoughts of the mentally ill victim who is super neurotic and as i mentioned
01:38:48.100mentally ill let's listen to the victim and let's see what that oh you think this person should be
01:38:53.080thrown in jail indefinitely okay let's listen to the mentally ill victim here in this situation
01:38:58.140the victim yeah let's listen to the victim of bullying who is going to cry every time that
01:39:03.100they open twitter that's who we should listen to to see if people should have rights or not
01:39:07.620so let's say sorry i gotta time stamp that victim centered it's like a victim
01:39:17.400centered perspective oh my god oh my god i just wanted to take a victim
01:39:29.400centered perspective on this real quick guys that's so crazy
01:39:36.360do you think the three young girls who got murdered in the uk by a knife attack should we
01:39:43.040take the victim-centered perspective on that well they're dead actually they got murdered
01:39:47.700they got killed by the same guy a descendant of i believe migrants from africa or something
01:39:56.940can we take the so what are the parents saying have we heard from the parents
01:40:01.940of the three white kids who have been murdered can we take a victim-centered perspective on that
01:40:07.020oh no that's right we're gonna throw violent criminals out of jail into the street so we can
01:40:11.860get the people complaining about it into a jail cell i would actually like to
01:40:18.500know where are the parents of the victims in the uk of the
01:40:24.480southport murders of those young girls i would like to hear what they have to say anyway you
01:40:30.500know something you know awful is happening online and we'll you know deal with the real world
01:40:35.360anti-semitism that we've been seeing flowing online what is available to an individual
01:40:41.540so that's such a oh man they're so good at this i've mentioned this when we went over the arif
01:40:48.000virani press conference but they do this thing where they go you know let's say something's
01:40:53.000happening online and there's also like real world things that happen and like really there's no
01:40:57.200difference between what happens online like a comment online and what's happening in real life
01:41:01.920Like they totally conflate the real world with comments online as if they're synonymous.
01:41:09.200How this would work from a victim-centered perspective.
01:41:12.720So let's say, you know, something, you know, awful is happening online and we'll, you know, deal with the real world anti-Semitism that we've been seeing flowing online.
01:41:26.040um now what i love about that is that it's like has a crime been committed
01:41:34.040what's the crime this is what's so funny about these people who justify online harms
01:41:41.660like the line of like what is a violent crime becomes irrelevant to them because they're
01:41:48.020trying to move the line to be like oh okay no no now speech is violence now your
01:41:53.780opinion is violence like that's really what this game is here if you kind of boil it
01:41:59.080down what this game is in passing bill c63 is to move the line of what is criminal to include
01:42:05.960speech to include certain political opinions that you don't like that's what it boils down to and
01:42:12.940what they're going to do i'm guessing just off the top of my head is they're going to try everything
01:42:17.500they can to do the guilt by association game and be like do you see this violence that happened
01:42:21.860that speech is basically the same thing as that uh well is there any sort of evidence of proving
01:42:27.140that the person who said this is actually violent or are you just kind of extrapolating an internet
01:42:33.720comment to make it a violent crime or a crime all of a sudden like that's what
01:42:40.280their game is and i mean look they're not gonna make it guys like i'm sorry
01:42:46.780emily you're working way too hard trying to tie these things together
01:42:51.680you're not going to make it good luck but i don't think you're going to be able to
01:42:57.360tie these dots together with legislation she's working around the clock and this is the
01:43:02.240best she's got victim-centered perspective good luck when michael said how there's three different
01:43:08.420you know there's almost two or three different bills within one um there's good and bad about
01:43:15.380that that we can debate uh from a victim perspective when something bad happens you
01:43:20.520could say well what are the responsibilities of the social media company and that would be i love
01:43:24.640how she's not defining what something bad is she said when something bad happens when there's a mean
01:43:30.600tweet is that what you mean when someone says a mean tweet when something bad happens
01:43:40.880mean tweets have you seen the comment section that's another thing i bet the people who make
01:43:49.360this legislation like don't have a platform where they post content regularly you know it
01:43:55.940would be such a joke for any even like micro influencer like myself or especially like a
01:44:01.160larger influencer to say like yeah hey guys we got to stop hate online any any big platform would be
01:44:06.760like what do you mean hate is everywhere online it doesn't matter like all sorts of people get
01:44:13.760hate constantly on the internet it's just so synonymous with the internet maybe
01:44:21.060that's a really good argument actually like can you even have the internet without hate
01:44:27.080that's one of the best parts of the internet you hating people people hating you you know it being
01:44:33.900of much lower consequence because it is the internet that's kind of one of
01:44:39.060the joys i would argue is that you can kind of vent your anger in a safe place hello you know
01:44:46.840it's like when something bad happens when someone gets angry on the internet no when someone actually
01:44:53.400gets angry about a very specific political agenda that is against our agenda that's what i mean
01:44:58.540you know it's so hard to separate this stuff from the political agenda and the you know
01:45:04.380tyrannical intention that being said you know the people in this meeting i think they
01:45:10.040could be genuinely afraid of anti-semitism and genuinely afraid that like
01:45:16.340oh my god we need to do something so it's not necessarily i just want to kind of give
01:45:23.220some sympathy because there are the people like arif virani and the crooks in parliament who
01:45:29.140are definitely like you know evan balgord and people with the anti-hate network they
01:45:33.780definitely have a very tyrannical intention of like censoring certain people and
01:45:38.480that's their goal but along with that they bring along the victims they bring
01:45:43.060along the people and they feed them the story of like no no like we're gonna save the world
01:45:48.080by censoring people we don't like but we're gonna save the world from the hate and we're actually
01:45:52.480gonna stop the hate and the bullying and there's a lot of people who go oh okay okay i believe you
01:45:58.460okay yeah that's what we're gonna do and they're you know they think they're doing the right thing
01:46:03.720but uh they're not they're being manipulated by psychopaths and tyrants unfortunately anyway let's
01:46:11.260continue be the online harms act that they're looking to create with the digital safety
01:46:16.460commission or has a crime been committed and you want to go to law enforcement or are you looking
01:46:22.540at something else where you know that isn't at the level of criminal enforcement but some sort
01:46:28.500of accountability for the individual which is when you can make a complaint to the human rights
01:46:32.880Commission. But for hate speech or advocating genocide, if the goal is to impose responsibility
01:46:44.160on social media services, to put into place some sort of mechanisms to mitigate risk,
01:46:49.280an individual can go and make a submission to this new Digital Safety Commission.
01:46:55.640The Commission doesn't then investigate that. It's not a complaint to the Commission. Basically,
01:47:01.320it receives that information and then it becomes part of the information that it might share
01:47:06.600publicly about social media or it might be the basis then that the digital safety commission
01:47:12.360might launch a hearing essentially like an investigation but different where it's investigating
01:47:18.520Wait a minute. Wait a minute. It's like an investigation.
01:47:32.380That's something else where, you know, that isn't at the level of criminal enforcement, but some sort of accountability for the individual, which is when you can make a complaint to the Human Rights Commission.
01:47:43.540but for hate speech or advocating genocide if the goal is to impose
01:47:53.100responsibility on social media services to put in place some sort of mechanisms to mitigate risk
01:47:56.660an individual can go and make a submission to this new digital safety commission there is
01:48:00.540no the commission doesn't then investigate that it's not a complaint okay so they don't
01:48:05.060investigate that it receives that information and then it becomes part of the information that it
01:48:14.540might share publicly about social media or it might be the basis then that the digital
01:48:18.220safety commission might launch a hearing essentially like an investigation but different
01:48:21.800where it's so it's not an investigation but it also is an investigation
01:48:27.300oh come on emily you know i don't know if she's gonna make it guys i don't know if she's gonna
01:48:35.840make it here investigating not your individual piece of content but more broadly say if a social
01:48:40.320media service is complying with you know what michael talked about is that duty to act responsibly
01:48:44.500and that duty involves taking adequate measures to mitigate the risks of exposure to
01:48:51.100the harmful content to exposure to say hate speech to content that advocates genocide the
01:48:57.000individual can go to this ombudsperson i think this is really important it is there to provide
01:49:00.960victim support to say victim support got victim support yeah i mean we've already seen
01:49:14.440the trend with progressive politics to empower victims and to like make people feel special for
01:49:18.960being a victim and you know they're gonna salivate at this legislation oh my god i can i can arrest
01:49:26.140people online i can be financially incentivized to arrest people online because i'm gonna like
01:49:32.740play up the fact that i'm a victim this would be a nightmare this would be a nightmare look this is
01:49:38.620what happened these are your options this is how you can navigate it the ombudsperson
01:49:43.040would be able to pass on information to that commission also the digital safety commission
01:49:47.520a major part of their mandate is education and research supported by a digital safety office
01:49:52.160so part of this is taking the data that they get from these companies and saying okay
01:49:55.980where do we go from here what do we know about the landscape of what's going on what are the
01:50:00.020next steps more broadly about some supporting canadians and in addressing these types of
01:50:04.400harmful content this is so interesting because it's like okay we want to get a bunch of data
01:50:09.280from big tech to see like what is going on here it's like so you want to data scrape
01:50:16.960people who are like saying the wrong things online for what like it's so unspecific other than
01:50:23.840we just want more control we want to see what's going on you know we want
01:50:28.400to get our hand in the cookie jar there and just kind of know what's going on here we want
01:50:32.000more power we want more power is what we want but for individuals for their individual piece of
01:50:37.840content in the end when it comes to social media services you're looking at still going
01:50:42.720directly to the social media service but it isn't it's a long-term project with the digital safety
01:50:46.960commission it's about harm reduction and it's about setting minimum standards for these companies
01:50:51.200about how they manage these risks of harm again as if these companies don't have a minimum
01:50:58.240already you're looking at still going directly to the social media service but it is
01:51:03.360a long-term project with the digital safety commission it's about harm reduction and
01:51:07.040it's about setting minimum standards for these companies about how they manage these risks of
01:51:11.440harm and so hopefully i mean if this is working at harm reduction the goal would be that you're
01:51:16.560You're going to see better standardization about how this is managed by social media services.
01:51:45.020you don't think that big tech has some sort of bare minimum to enforce harmful content like
01:51:52.060what it's so disingenuous it's like these people are just completely ignorant
01:52:00.060and daft and incredibly stupid in terms of not knowing what big tech does
01:52:08.860or it's just straight up malicious it's straight up malicious it's straight up sort of
01:52:14.300of, uh, you know, I'm here to push the tyranny. We're
01:52:18.580here to get the power. We're here to get the money. Like
01:52:21.440there's no in between. You're either incredibly stupid and
01:52:24.180naive and literally don't know anything about the
01:52:28.880so-called big tech companies you're trying to regulate or
01:52:32.960you're literally an evil person who knows exactly what they're
01:52:37.240doing. And this is all just a show to get more power and
01:52:40.360control go to law enforcement i i have many concerns about the criminal code provisions
01:52:46.580like michael that we can dig into you wrote the legislation
01:52:50.880ms laidlaw emily laidlaw you wrote the legislation you have concerns
01:52:56.640you have concerns you wrote it you wrote it and you have concerns must be a real good piece of
01:53:04.280legislation. Good job, Arif Virani. Fantastic. The person who helped contribute to this also
01:53:11.160has concerns. Or you can go to the Human Rights Commission. I think that reintroducing Section 13
01:53:16.820was always going to be controversial. I think that some important steps have been taken to
01:53:22.160narrow the scope of it, a very clear, a much clearer definition of hate, the ability to dismiss
01:53:26.760complaints, to be able to impose costs on parties that are frivolous and vexatious in the complaints
01:53:32.660that they make um i take seriously though michael this is the more clear definition of hate
01:53:38.400right here by the way this chart is the more clear definition of hate
01:53:44.000is it detestation is it detestation that is not allowed disdain that's okay humiliates
01:53:55.680that's okay dislikes okay offends okay but no no detests and it's very clear it's very clear what
01:54:02.300the difference between disdain and detest is. We're a tyrannical government. You can trust us.
01:54:10.160Liz commented separately on a panel with me about the concerns of weaponization of that process.
01:54:16.260And what I have been trying to mull are what are ways that, you know, that can be contained,
01:54:21.460because there's an important access to justice element to being able to go to the Human Rights
01:54:25.180Commission
01:54:33.900why do i feel like what she's talking about is coming from a victim perspective
01:54:39.940justice to the victim right justice to the mentally ill uh person who is terrified of
01:54:48.000internet comments their their form of justice justice is now taking on a a new definition here
01:54:54.320uh according to the human rights uh person they hurt my feelings and they're not in jail yet i
01:55:00.400want justice unbelievable all right this this video is really long so we got to speed this up
01:55:07.260i'm only 15 minutes in it's an hour long um the last point i'll make is about the regulatory
01:55:13.040structure of the digital safety commission i i don't have the concerns that michael has about
01:55:18.520the structure of the commission itself um i think that there is oversight through the ordinary
01:55:23.740processes um through the federal court um there are we need a commission that has power the power
01:55:29.900to investigate the power to impose fines um there is an important duty on the digital safety
01:55:34.820commission to take account of privacy freedom of expression and equality among other things
01:55:38.900in setting out any of the guidelines it writes or um any regulations um it's so funny
01:55:45.940like it's it's just doublespeak it's all like everything baked in here is so doublespeak she
01:55:51.260says we need the power so it's important we have the power also we're concerned about freedom of
01:55:57.740expression court um there are we need a commission that has power the power to investigate the power
01:56:03.100to impose fines um there is an important duty on the digital safety commission to take account of
01:56:08.540privacy freedom of expression and equality among other things yeah that's important too
01:56:13.580power to fine people but we also need to be concerned about equality and freedom of speech
01:56:19.500I can just see Arif Virani being like, okay, make sure, hey, make sure every time you talk about
01:56:25.080the power and the fines and throwing people in jail, make sure you always follow that up with
01:56:29.660a comment about how much you love freedom of speech. All right, honey? Good job. You're doing
01:56:33.820great, kid. In setting out any of the guidelines it writes or any regulations. And much of this
01:56:41.200needs to be left to be developed later because of evolving tech and social media. You know,
01:56:45.100if you're oh oh really that's a great caveat isn't that a great excuse to pass through shit
01:56:53.220legislation let's hear that again let's hear that again i love that i love beautiful bureaucratic
01:56:59.420excuses like this this is just privacy freedom of expression and equality among other things
01:57:03.460in setting out any of the guidelines it writes or um any regulations and much of this needs
01:57:09.880to be left to be developed later because of evolving tech and social media you know if
01:57:13.680Their powers are essentially to write regulations about risk management, digital safety, kids design, like what the obligations are to keep kids safe.
01:57:23.020No, we just need to rush this legislation through and figure it out later because the Internet's always changing, you guys.
01:57:29.340Let's just make sure we have this huge bureaucratic body with a lot of power that can fine people.
01:57:34.340And we'll figure it out later because kids safety.
01:57:37.600You got to give them credit, you know, because it's like these are their best talking points, I think.
01:57:43.680And I could see how this would make sense to someone who's like, oh, okay, yeah, yeah, internet, it's explained later, yeah, okay, all right, yeah, let's just pass it through, fair enough, all right.
02:18:38.620hey i find something hateful on twitter what do i do as a jewish person uh can one of you answer
02:18:44.440that yes Emily maybe i'll just jump in because i um that was a bit where i started which is well
02:18:48.520the first step would be to complain to the social media platform i think we're seeing of
02:18:52.260course on x that they're not taking very seriously any complaints about hate um the other option
02:18:58.040then is to submit should we start counting should we start counting maybe we'll start
02:19:05.740counting. That might be fun. I should have started counting
02:19:09.880from the beginning, but let's see if this is worth it or just a
02:19:14.940big waste of time. Is there a way to make this? Okay. So this
02:19:24.580number is the number of times hate is mentioned without
02:19:27.500clarifying what hate is. So that's one. Actually, no, it's
02:19:31.520two because he said the tweet and he
02:19:35.500didn't uh specify all right uh that information to the digital safety commission they're not going
02:19:42.160to act on your complaint that will be information that feeds whether they do a bigger investigation
02:19:46.500or bigger hearing about the social media service or you can go to the ombudsperson who can't do
02:19:51.740anything exactly but provide support to help you figure out what to do so that would be perhaps a
02:19:56.520first point of call as well and they can help you navigate um and the last two options would be if
02:20:01.880you think it's really reached that threshold go to the police or make a complaint to the canadian
02:20:05.880human rights commission go to the police
02:20:14.280again maybe if they had defined what the hate is but this person without even flinching is saying
02:20:20.600you should maybe call the cops actually the question was what do you do when you
02:20:25.080read a hateful tweet she says maybe call the cops yeah i might want to call the police
02:20:31.880if you read a hateful tweet without even clarifying anything about what the tweet might
02:20:38.160have said don't you think that's kind of an important thing to clarify no let's just it
02:20:44.460was a hateful tweet i'll call the police of course that's what i'd do this is insane please
02:20:51.420please michael geist say something in response about defining hate please i hope our man michael
02:20:58.480geist says something here yeah if i can for just a moment we should pick up on that i think
02:21:02.740that that provides a really good review of the different options i will say that you know the
02:21:06.460idea that this is suddenly going to result in you know when you face something online that it comes
02:21:09.980down or that you get justice is unlikely it still will depend on uh especially initially some of
02:21:14.380the things social media companies do i face a lot of this sort of stuff especially on twitter on my
02:21:18.320own account when i post on anti-semitism related concerns and don't have really any
02:21:22.120expectation that the law would directly change that but can we get some specificity
02:21:27.780of the anti-semitism of the hate i feel like this is kind of important it will change and i
02:21:35.300think this is a positive is that the process around dealing with some of this content where
02:21:38.940there are complaints about the prospects sometimes of content coming down is oftentimes very opaque
02:21:42.980i on a less controversial site i had content that was removed when i posted an
02:21:48.360anti-semitic image trying to highlight the fact that this is what the community is facing day in
02:21:51.980day out right now in canada in 2024 it's it's really unthinkable and quite astonishing i posted
02:21:56.820it on linkedin that that content came down it was presumably an algorithm or ai of some sort that
02:22:01.860decided it came down because there wasn't the context it went back up it came back down again
02:22:05.860ultimately there was some sort of review but the entire process about how those decisions are made
02:22:09.860is something that we don't have much insight into and what this legislation will do as part of a
02:22:13.860digital safety plan is not necessarily tell you what happened to your particular piece of content
02:22:18.260but it will add i think far more transparency to a process that will require companies to disclose
02:22:23.300complaint data information on how they respond to that data and so that people may in the
02:22:28.180individual case be left unhappy or happy with what the resolution was but at least we're going to have
02:22:32.180i think far better more open information about what is actually taking place behind the curtain
02:22:36.580of many of these social media companies which right now very often you file a complaint you
02:22:40.020have no real idea of what comes of it hmm so what i'm hearing is instead of getting access to the
02:22:47.300algorithms of how they feed different canadian citizens their information and algorithms on
02:22:52.660their feed now they want access to how social media companies deal with censoring content or
02:22:59.400taking down content they want to see what's inside of the big tech companies they want
02:23:05.080more data they want it to be more transparent um and like i think the added piece that is not
02:23:16.140mentioned there is you want more bureaucracy involved in that so i think it's interesting
02:23:22.080how he's like hey it's a messy process i posted something that was anti-semitic but i wanted to
02:23:27.720highlight that it was anti-semitic and it got taken down then it got back up and it was a whole
02:23:31.620mess and it's like okay so adding canadian bureaucracy to this is going to help how
02:23:36.500you know the big tech platforms are already struggling with this and you think adding the
02:23:41.780canadian government is going to help deal with the you know tricky nature of you know moderating
02:23:50.100online content if anything it's going to be obviously more politically biased like that's
02:23:56.700like that's all i can say but it's um once again it's wanting power it's
02:24:02.320wanting more power over how the social media algorithms work i'm quite disappointed so
02:24:08.340far that mr geist did not bring up like a hey maybe we should define what hate is he kind of
02:24:16.980just said yeah yeah i got a lot of hateful comments too on my on my social media but i don't think
02:24:21.020we're going to get any justice there uh like i hate to say it but can they even make
02:24:28.440something like this can they make something like this they don't even have anything like this
02:24:33.100to you know classify what they're talking about i mean i mean they've said it in words but if
02:24:40.140they're not going to hone in closer well i mean it's a tyrannical bill so i really don't care for
02:24:45.380them to hone in closer because i know it's all nonsense but it's silly that they're even
02:24:49.700talking about this with more of a kind of defined uh definition of what's going on here of what what
02:24:54.740the actual content is okay thank you it's hard to have this conversation uh without taking the
02:25:01.700context into question and we're seeing accusations that israel is committing genocide
02:25:07.300uh that if you're supporting israel you're supporting genocide um and which
02:25:12.260Which is an argument that people are making and there's validity to that argument.
02:25:23.460So, you know, like it gets so hairy so quickly, especially on the Palestine Israel topic, because if you haven't seen, there are people on both sides saying I want to kill that other side.
02:25:39.640all of them. There's Palestinians or Hamas supporters saying this and there's Israelis saying
02:25:45.680this. They're both calling for genocide in one form or another. Both sides. Now of course we're
02:25:52.660watching CIJA so it's mostly all Jewish people on this call so obviously they're going to have a
02:25:56.920very very strong bias but anyway it's interesting it'll be interesting to see where they go with
02:26:01.180this. Without taking the context into question and we're seeing accusations that
02:26:08.640that Israel is committing genocide, that if you're supporting Israel, you're supporting
02:26:12.500genocide. And there's concern out there that, and given also the context also that Jews
02:26:19.660are often portrayed at least by some people as oppressors, both in Israel, because Israel
02:26:25.680is a powerhouse in the region, and in Canada as being white settler colonialists and all
02:26:31.940that stuff. And we've heard concerns that being a pro-Israel advocate, being a defender
02:26:37.840of zionism of the right of the jewish people to self-determination could uh land somebody in
02:26:43.220legal trouble with this legislation and i was getting emails just before this uh this panel
02:26:49.680started so i'd really like to have your thoughts on this because we're hearing it and your
02:26:54.920opinion matters on this so uh emily if i could start with you and then go to michael this is a
02:26:58.840great hey bravo to the panel guy because this is actually a really good question uh hey everyone
02:27:04.600who is uh pro-israel could technically be accused of supporting genocide so how does that work
02:27:11.780with this legislation it's a really important question and i would say that that concern about
02:27:18.940what the threshold is for hatred is probably one of the key concerns about this legislation
02:27:23.380again key concern you wrote the legislation and you don't know there's not a clear answer here
02:27:30.480right you wrote the legislation and there's no clear answer it's almost like it's terribly written
02:27:35.960legislation that's incredibly broad even though you said it wasn't broad earlier but anyway um
02:27:41.640so uh let me identify where i don't think there should be concern and perhaps where there
02:27:46.080should be um the the supreme court has set a very high threshold for what hate speech means um and
02:27:54.460so they really have not they've set a subjective one but please continue and none
02:28:02.540of what you're talking about would be captured by that um what are you gonna say the magic
02:28:09.360word vilification and detestation if you're saying that you know if you make the argument
02:28:15.760that israel is committing genocide or you say that they're villains for committing
02:28:21.100genocide or you support um israel and say hey um palestinians are villainous or palestinians are
02:28:30.940detestable or hamas is detestable that's the threshold you pass the threshold for this
02:28:37.700new supreme court definition of hate speech so hate speech is ultimately at that level where
02:28:46.580it's beyond even disdain it's beyond offense um that you're essentially uh communicating
02:28:52.180statements are predicated on the destruction of an entire group of people and the threshold
02:28:55.580should be incredibly high so that's not even correct that's not even correct let's look
02:29:02.840it up destruction of an entire group of people detestation and vilification are the words
02:47:41.700I think it'd be pretty easy to make the argument that Netanyahu, the prime minister of Israel is vilifying his enemies, which would be criminal hate speech.
02:47:53.420under bill c63 according to the supreme court which i totally disagree with i don't
02:47:58.220know how that supreme court decision is they're going to try to weasel this
02:48:01.980into the new definition of hate speech but anyway the fact that they're glossing over that
02:48:05.980is crazy to me so if i can is it okay if i jump in with one short because why i land differently on
02:48:11.660this is that um i don't view it as uh reducing hate access should i keep counting or is this
02:48:20.380stupid let me know if I should keep counting maybe it's not that exciting eh it's just kind
02:48:24.620of interrupting the flow isn't it to the Canadian Human Rights Commission I view it as access to
02:48:30.300justice and at the moment and if it remains this is the victim-centered perspective this isn't
02:48:37.600about stopping hate this is about access to justice what the if I can jump in with one short
02:48:45.540because why I land differently on this is that I don't view it as reducing hate access to the
02:48:52.980Canadian Human Rights Commission. I view it as access to justice and at the moment and if it
02:48:58.080remains unchanged individuals the only thing available to them is is to go to law enforcement
02:49:03.740many groups don't feel comfortable with that law enforcement is overwhelmed and yes we need to make
02:49:07.440major steps to better resource better train law enforcement to deal with this but it's not always
02:49:11.840now they still haven't defined hate and they're still saying all they can do is go to
02:49:17.560law enforcement go to law enforcement go to law enforcement for what like does it reach the
02:49:23.980threshold of being against the law no maybe there's a reason for that but there's a reason
02:49:32.140we've had that in place for so long in free western countries necessarily the appropriate
02:49:38.940solution many provincial human rights bodies don't uh even though they have hate speech
02:49:43.500provisions won't hear them if they're related to the internet um they see that as federal and so
02:49:48.000it means that except for going to the police there's nothing there's absolutely nothing
02:49:52.900available for an individual that's not true you could put your phone down emily you could
02:49:57.720walk away from the computer i know these are radical suggestions but you could
02:50:03.580literally put your phone down that might solve the problem of someone disagreeing
02:50:09.700with you or saying something hateful on the internet turning the internet off unplugging
02:50:15.060your wi-fi router maybe instead of calling the police you should just unplug your wi-fi router
02:50:20.380individual who sees it as appropriate that they should access some sort of justice and that's why
02:50:25.260i land where i do i don't have a solution to the weaponization yet i just want to try to solve
02:50:29.260that one because otherwise i do think that as reintroduced it's it's important it doesn't have
02:50:34.460an answer to the weaponization so yeah we still got this kind of matzo ball hanging out here of
02:50:38.220just weaponizing legislation citizens weaponizing legislation against one
02:50:45.400another haven't solved that yet i helped contribute to the legislation though it is if i can quickly
02:50:49.640and we'll turn it into true debate i think you're right about that there isn't that remedy
02:50:53.520but when you think of what that remedy involves potentially years before as this process works
02:50:57.960its way through the commission and then through the tribunal the risk that even the
02:51:01.260fact that you participated in that results in a further backlash as it continues i'm not sure
02:51:06.040that that brings a whole lot of justice when if presented with the alternative saying is that are
02:51:10.120there mechanisms to try to stop the amplification of this hate to try to are there mechanisms to
02:51:16.420stop the amplification of this hate i haven't found hate yet to support the community it's got such a
02:51:22.040hate has such a high threshold though you know even in the absence of myself being
02:51:26.780made potentially whole for the harm that that occurred i think many would say listen i really
02:51:29.960just want this to go away um and if there are mechanisms to reduce the amplification to reduce
02:51:33.700the harm that's where i want to put our resources and thinking as opposed to saying that you know
02:51:37.780what's so stupid about that because this is something that arif virani said he said i just
02:51:43.000want that to go away i just don't want to see that meme tweet anymore i just want that to go away
02:51:51.580we're literally pandering to like the most precious people these precious
02:51:59.460neurotic control freaks am i wrong here like this is i just want that to go away
02:52:06.360they just want it taken down i remember arif saying that they just want it taken down
02:52:10.420i just don't want to read i don't want to know that people like that exist and have those opinions
02:52:15.340just take it down please yeah let's let's pass laws based on that attitude that's a great idea
02:52:22.200well i got mine in terms of some justice in a particular case yeah no i do yeah okay i'll stop
02:52:27.800there we can talk about this i'm also mindful of the time so i'd like to switch a little bit
02:52:32.520because um the internet is a global phenomenon um in in other countries or institutions that
02:52:38.620have started to grapple with this the european commission the european union has started
02:52:42.140Australia has started. When you compare C63 with what other jurisdictions have done, how does it
02:52:49.460compare? Were C63 adopted, would Canada be an outlier? Would it be around in the same
02:52:56.800ballpark as what the European and the Australians, for example, have done? Maybe Emily and then
02:53:02.060Michael. Yeah, so it has picked elements from the different regimes and I think that
02:53:08.340was that a slip was that a freudian slip she used the r word not that r word
02:53:19.460regimes for example have done uh maybe emily and then michael yeah so uh it has picked elements
02:53:30.040from the different regimes and i think the best ones it is um it aligns with them um but it is
02:53:36.480narrower so the digital services act uh in europe covers a broader array of players we aren't
02:53:43.680covering search engines we're not covering private messaging um and the obligations are
02:53:47.840much broader including all illegal content um they're even tackling disinformation things that
02:53:53.260yeah uh that that must have been a freudian slip there that's hilarious
02:53:58.900uh yeah we are taking cues from other regimes. regime: a government especially
02:54:05.600an authoritarian one a system or planned way of doing things especially one imposed from above.
02:54:11.520regime. that's funny yeah we're taking cues from other regimes absolutely
02:54:20.480uh the most tyrannical ones the most ruthless ones yeah we want to do our jobs right
02:54:27.440players we aren't covering search engines we're not covering private messaging
02:54:32.080And the obligations are much broader, including all illegal content.
03:39:49.300They're moving ahead directly with the elements on the online harms piece.
03:39:52.820The one other piece, though, that I would say is that I do not think that government gets to hold up a mission accomplished sign by saying, hey, we introduced this.
03:39:59.480It doesn't even get to put up a mission accomplished sign, even if it passes this.
03:40:02.200There is a need for leadership right now when we deal with some of these issues.
03:40:05.480And that means using the tools we have right now, both in showing leadership, in speaking out against anti-Semitism and hate, both online and offline, enforcing the laws we have right now.
03:40:14.920So government certainly needs to pay attention to this and prioritize this from a political perspective.
03:40:18.720But there is so much more that needs to be happening right now.
03:40:21.140And I think we've really suffered, at least in part, because governments at multiple levels haven't stepped up in the way that they must when it comes to this issue.
03:40:28.080Wow. I mean, Michael is sounding like a tyrant himself right now.
03:40:33.900he's basically saying hey we should be arresting people by now we got plenty of laws that
03:40:38.860we could use to you know fuck with these anti-semites that that's that's what i'm hearing
03:40:43.440from this but am i reading that properly so thank you very much both of you before i let my
03:40:49.160chair do the closing remarks i want to personally thank you i wanted to say to
03:40:53.580the people uh i tried to go through all the questions that were in the chat okay okay
03:40:57.960that's good enough wow wow guys wow we made it through we made it through oh my gosh i'm
03:41:09.600exhausted i'm exhausted that was um that was really good though that was really good stuff
03:41:15.720that was um really good clips in there lots to go over uh man
03:41:24.820Michael Geist I mean there were some good things that Michael brought up
03:41:30.900he is more of a legal expert than me so I do at least respect that you know there are some
03:41:37.500decent clips there are some decent points that he brought up that I'm definitely going to
03:41:41.900weaponize to try and stop Bill C63 um but uh yeah I'm also thinking that like you know
03:41:51.280when it comes to politics and pushing for legislation or pushing for what you want
03:41:58.780the same tactics and the same sort of principles apply as they would for sales or for negotiation
03:42:06.820and if you look at like the art of negotiation um and how to get what you want when negotiating
03:42:15.000um you ask for more than what you truly want
03:42:23.960um an example in sales would be uh price anchoring so let's say
03:42:33.340you're selling something and you want to sell it for uh you know 50 bucks okay
03:42:40.540an easy way to do that is to say hey this is a hundred dollars but it's actually
03:42:48.160on sale for 75 and then the person is like i don't know if i really want it 75
03:42:55.680seems like a little bit much and then you say you know what how about this i'll give it to you for
03:43:00.34050 i'll give you a deal and then and then the person walks off thinking they got a great deal
03:43:05.380hey i got this for 50 bucks when in reality the salesperson was like well you know i didn't lie
03:43:12.040but like i misled them and said yeah i want 100 bucks for it this is how much it's worth
03:43:15.800and then they get haggled down negotiated down to 50 which is where they wanted to sell it for right
03:43:21.060i suspect the same thing because let's face it you got to give credit to this you know this
03:43:27.400canadian regime they've been messing with us they've been pulling off insane stuff in terms