00:56:31.100And this week, we're going to take a deep dive into Bill C-63, the good, the bad, and the ugly.
00:56:37.580There's been a lot of commentary that this is a bill that should actually be cut in two.
00:56:41.600The online harms portions dealing with children, revenge porn, etc., put into one.
00:56:46.680And then all the elements that deal with regulating speech, hate speech, etc., even advocating genocide, should be a separate bill and get longer, deeper, more thoughtful study.
00:56:58.220Is that the way to go? What are the portions that we need to worry about? Or do we have some of them taken out of context?
00:57:06.340Ian Runkle is a lawyer in Edmonton, criminal defense lawyer.
00:57:09.880He also has a very large following online with his YouTube channel, Runkle of the Bailey, where he talks about legal issues, especially firearms, on a regular basis.
00:57:19.400I do want to say, when it comes to splitting the bill, one of the top comments here is:
00:57:25.100The bill shouldn't be split.
01:03:12.040Two's in chat if you think it will not come up in this conversation.
01:03:14.780Well, a lot of the policing, they plan to push on to social media organizations.
01:03:21.560And the problem is that in doing so, they're going to create tremendous costs for the social media organizations, both in terms of their internal moderation and processes, but also in terms of potential liability fines.
01:03:36.460if they're within Canada, as well as the possibility of inspections.
01:03:42.340And so what we might actually see is some of these companies fleeing Canada
01:03:46.380in order to take up residence elsewhere to avoid these particular rules.
01:03:52.240Okay, so let's get into that in a little bit, but let's go back to...
01:08:15.080that shows a child who's being subjected
01:08:17.920to cruel, inhuman, or degrading acts of physical violence,
01:08:21.080which would include, for instance, the movie Carrie.
01:08:23.940This is... it's very difficult to craft these... okay, well, look, these restrictions. The movie
01:08:31.920Carrie, there's going to be... I want to hear this again, because, you know, this is what we need. We
01:08:37.380need an autistic lawyer looking at this specific legislation and be like, you could totally do this
01:08:41.880to ban a movie from the 80s. ...Children are actually quite... or will be sort of forbidden...
01:08:51.680Unpack that for me. For instance, one of the things that is
01:08:59.040forbidden, or will be sort of forbidden, is any sort of visual representation that shows
01:09:05.920a child who's being subjected to cruel, inhuman, or degrading acts of physical violence, which would
01:09:11.680include, for instance, the movie Carrie. So I think his point there, based on the verbiage
01:09:18.720in the legislation, is it could be a fictional movie, technically; could be a TV show, technically;
01:09:27.200could be like a sketch on YouTube or a short film on YouTube, technically. And yeah, that matters,
01:09:35.580because it's stuff we're writing into law, so that kind of matters, especially if that
01:09:40.800exception is not accounted for. ...This is... it's very difficult to craft these... okay, well, look,
01:09:48.280these restrictions. The movie Carrie, there's going to be an awful lot of people. I know the
01:09:52.440movie you're talking about. I watched it as a kid in the 70s. There's going to be an awful lot of
01:09:56.880people who have no idea what we're talking about. This is an old movie. It's an old movie. What is
01:10:04.100it about Carrie that would trigger this? It's about a high school student who is very badly
01:10:10.960bullied by her peers. And it becomes a horror movie because, you know, she's got magical powers.
01:10:18.280But it includes things like a scene quite famously where they trick her into a situation where they can dump pig's blood on her specifically for the purpose of degrading and humiliating her.
01:10:30.920There's a lot of like high school movies that include these kinds of scenes of bullying and would be about children because it's high school students.
01:10:40.140Yeah, and so you can say, like, really, we don't want to have videos of child abuse out there.
01:10:46.480But the other thing is that there may be good reason to actually show
01:10:52.540even those. Somebody who sees somebody abusing a child, you know, in public may videotape that
01:11:00.180and put it online in order to condemn that behavior, and yet run afoul of this. So it becomes
01:11:07.460very difficult to say where the line is, and when they're sort of painting these broad,
01:11:12.460broad strokes, they're going to get a lot of things wrong, and there's going to be a lot of
01:11:17.980situations where it may or may not be covered. Perhaps more contemporary: South Park. He brings
01:11:25.040up a great point there. And I brought this up on other streams, but, you know, what if there
01:11:30.640is a legitimate child abuser who gets caught in public or something or other, like, where there's
01:11:37.480a clip of somebody capturing this person being a vile monster? Technically, you could use
01:11:42.980this legislation to take that down. Like, imagine... I'm not saying this is true, but imagine
01:11:48.080if, hypothetically, Justin Trudeau was friends with a child abuser, and there's, like, some
01:11:53.260scandalous video that gets leaked, or somebody is able to capture a horrible, monstrous thing
01:11:59.980happening: one of Trudeau's friends, who's a monster, abusing a child. Then you can say, hey,
01:12:04.160we'll just use that legislation to just take that content off the internet, and, you know, Trudeau's
01:12:09.580friend is innocent, because there's no evidence, because we wiped it off the internet, because we
01:12:13.860have the power to do that now. We have the power to take things off the internet now that are on
01:12:20.040big tech platforms, because we'll threaten to, like... what is it, take eight percent of their annual
01:12:27.340global profits, or 10 million dollars. You know, think of the decision
01:12:34.000if you're a big tech platform and the Canadian government says, hey, if you don't
01:12:40.660take down this video, then you have to pay us 10 million
01:12:47.120dollars, or however many million dollars it would be. Like, it's crazy. They probably will
01:12:51.740leave the country. You know, we'll probably just get no big tech platforms in
01:12:56.100Canada anymore if this bill passes. But let's just make sure it doesn't pass, how about that?
01:13:03.820Lee Stewie said: one time I messaged Runkle for legal advice and he got back to me right away.
01:13:12.940Good guy. Did you message him on Twitter, Lee? I haven't reached out to him yet. I figured I'd
01:13:17.820watch this first, just to kind of get his best takes already on the bill, to bring
01:13:23.700me up to speed with where he's at. Anyway. Art Story said: the person sharing the video, like
01:13:33.940the child abuse video, would be guilty of prejudice or something. Yeah, I mean, you never know.
01:13:37.480You never know. Runkle was really cooking there, in terms of, like, they usually get
01:13:43.760it wrong when it's so broad like this. There's so many little loopholes that could be
01:13:49.600exploited. This is the law. This is the law. This is important. It's important to get this right,
01:13:56.120because it's harder to uninstall a law once it's there. Let's throw the whole thing out. Let's
01:14:02.680see... Because what's interesting here, too, is, if I'm not mistaken, what Runkle is criticizing
01:14:08.960is the part that Lily thinks is more reasonable: about, hey, let's just get rid of the
01:14:14.140criminal stuff and the human rights stuff, and let's just focus on the,
01:14:20.220you know, childhood sexual online harm stuff, which is, again, what the other podcast I listened to was
01:14:26.080saying. But he's pointing out even the problems with that as well. Oh my god, they
01:14:31.720killed Kenny! Yes, that's South Park. South Park runs afoul of Bill C-63, potentially. I mean,
01:14:40.440there's exceptions there in terms of whether or not it has sufficient artistic merit, which
01:14:45.880when South Park came out was a hugely debated issue. I think that they've perhaps found some
01:14:51.780social traction, but what happens with all of this? I can see the argument for and against
01:14:58.500South Park having artistic merit. There are times when expression. I hate when I do that.
01:15:04.980Here's the thing, though: what are we actually gaining from passing this bill? Because the
01:15:12.160assumption with all these conversations is, like, well, we have to do something about online harms.
01:15:16.040And it's like, do we? What is truly the... Because all I'm seeing is a bunch of far
01:15:22.460leftists complaining about people whose opinion they don't like. That's what I see, mostly. And sure,
01:15:28.340there's the egregious things, like pornography of, you know, abusing children. All of this stuff's
01:15:34.780already illegal, though. It's already against the law. Big tech already does a whole bunch of stuff
01:15:40.460to circumvent that. And I brought this up on another stream: if they were more serious about
01:15:47.000wanting to stop the monstrous stuff of abusing kids and recording it and putting it online,
01:15:51.060maybe they'd actually name names of large porn companies and actually home in on that. They don't
01:15:58.900at all. They're just blanketing the entire internet. It's ridiculous. And for those
01:16:04.980who don't know, yeah, there's a massive porn company, like, you know, empire, that is
01:16:11.840headquartered, originally I believe, in Montreal. And it's like, if they were serious about, like,
01:16:16.280getting that content off the internet, then maybe they would actually mention this and actually gear
01:16:20.080this bill towards that. But obviously they aren't. It's... I really believe it's much
01:16:24.620more about a power grab. Crap, I lost the spot, didn't I? I think we were around here.
01:16:31.000You know, it's like, oh, this bill would, you know, even make South Park
01:16:38.240potentially illegal, or content to be taken down. Yeah, do we really need this bill that badly?
01:16:44.400Like, what truly is the benefit, other than patting ourselves on the back and acting like
01:16:48.260we're gonna save kids from pedophiles? Pedophiles happen in real life, by the way. That's
01:16:54.500sort of really where... Anyway, that's a whole other topic. Let's keep going. ...Exceptions there,
01:16:59.300in terms of whether or not it has sufficient artistic merit, which, when South Park came out,
01:17:05.220was a hugely debated issue. I think that they've perhaps found some social traction. But what
01:17:13.060happens with... the argument for and against South Park having artistic merit. There are times when
01:17:20.020it cracks me up and times where I cringe. And I think most of us are like that. And so, yeah,
01:17:26.460you could have that debate. You could have that argument. But depending on who's interpreting
01:17:31.160the law, you're essentially saying, could result in unforeseen instances of people being caught up
01:17:40.960in this. Absolutely. And one of the things is that content that is actual child sexual abuse
01:17:48.700material is already one of the most vilified and illegal sorts of things you can find online
01:17:55.060Yep, yep, exactly. It's refreshing to listen to Runkle, because it's like...
01:18:04.260all of the very obvious arguments like that. It's like, he's one of the only people who I've heard
01:18:12.180who kind of mentioned the most obvious stuff about this bill, and it's just
01:18:17.080very refreshing. Appreciate you, Runkle. ...A provider, like social media and so forth,
01:18:25.180as a major category, polices the heck out of this, and watches for that kind of content on their
01:18:32.140media, and actively reports it to police. One thing that I hope they'll fix, because it's very
01:18:37.920concerning in this bill, is that this could actually make it very difficult to prosecute
01:18:43.420people who are sharing some of that content online. In what way? Wow. There's a
01:18:50.360provision that if content gets a notice, essentially, somebody says this content is
01:18:57.120objectionable, which hopefully somebody who saw that material would say, this content is objectionable,
01:19:02.300the provider is required to, within 24 hours, provide a notice to the person who posted
01:19:10.340that content. And what are they going to do when they see, hey, your content was removed, we
01:19:16.820have noticed this? They may start destroying evidence. The police will often engage in
01:19:23.860lengthier sting operations, where they're watching to see where this material goes, who's sharing it,
01:19:30.740and then you'll see those big busts where they arrest, like, a hundred people. Okay. What we may
01:19:36.580instead see is one person who gets a notice... Because of these notices? Yeah. Somebody who gets
01:19:43.860one of these notices is very likely to just say, whoa, I need to expect a police visit, and then
01:19:50.760burn the hard drives and get rid of everything. Yeah. Sheesh. I've never heard this
01:19:57.780argument yet. That is crazy. So, I mean, it is kind of par for the course for the Liberal Party,
01:20:04.520right? We're going to help protect kids online. We're going to make it easier for pedophiles
01:20:08.760to get away with it as well. That's... wow. Big brain. Look, I told you guys, look how big this
01:20:17.600man, this man's brain is so big, so big. He's, he's, he's 10 steps ahead. That's very chilling
01:20:26.460and disturbing. And he, he makes a fantastic point that I didn't think about. He is a criminal
01:20:30.420lawyer, so, you know, he's right there, he's right on the ball, right on the money. Making it
01:20:35.860easier for pedophiles to get away with it. Yeah, if you want to protect kids online, then you should
01:20:44.340be talking about this potentiality, A Riverani. If you're saying that you want
01:20:49.840to protect kids online with Bill C-63, then this is something that you need to consider, and clearly
01:20:55.600have not. Almost like you're full of shit and just want to usurp more power from big tech, and for
01:21:04.020the Canadian government to control its own citizens, and potentially protect your pedophile
01:21:08.980friends. I don't know... who knows... I don't know who you hang out with, The Reef. Geez. So, I mean, hopefully
01:21:16.540they'll put in some sort of exception of like maybe we don't have to give people advance notice
01:21:22.380if they're sharing stuff that is that kind of harmful, because I'd much rather see those people
01:21:28.180face an actual... Is Lily starting to sweat? He's starting to sweat a lot. Suddenly...
01:21:33.840am I crazy? ...arrest and prosecution. It's absolutely... we should be taking that material
01:21:40.580down. But prosecution is important. And if it can interfere with that, that's a problem.
01:21:46.520You mentioned earlier that when people wrap themselves up in we're protecting the children,
We should be worried. And it reminded me of... correct me if I've got the bill wrong... Bill C-51, I think it was, in the Harper era. And there were a whole bunch of measures in that to try and deal with different elements that they wanted dealt with.
But I believe it was then Justice Minister Vic Toews or Public Safety Minister Vic Toews saying, you're either with the child pornographers or you're with us.
01:22:20.780And it was a horrible bill. There were horrible elements to it.
01:22:24.800And but he tried to wrap himself up in I'm defending children.
01:22:29.420Why aren't you? It is kind of a thing that politicians will do.
01:22:34.560Politicians of all stripes. I'm not being partisan here.
I'm picking on Vic Toews as well as worrying about what the current government will do.
01:22:41.540Is this something that you worry about when you see those statements, when you see these bills that they're just trying to hide other things?
01:22:51.940I mean, I'm sure that there is some motivation of, you know, we've got to protect the kids.
The problem is that it's one of those things that's a real red flag.
01:23:01.680Almost every time you see it, there's something in those bills that is really a problem.
01:23:06.160And this bill has a lot of issues that may create real problems, where
01:23:12.340the various efforts to regulate the internet that we're seeing, including ones that are supported by
01:23:18.900the Conservative Party right now, the sort of porn ID law, have a real possibility of sort of
01:23:25.860creating a balkanized internet where Canada gets a lesser internet. Hold on, though. Hold on.
01:23:31.860Why? Let him cook, Ryan. Oh, is he about to criticize the Conservative Party? Whoa, whoa, whoa.
01:23:39.960Hold on, hold on there. We're here to shit on Justin Trudeau and the Liberals, okay? Okay.
01:23:48.480Pierre Poilievre was asked, do you think that people should have to verify their age? You know,
01:23:55.080that's vastly different than you've got to, you know, scan your driver's license when you're going
01:24:00.040on a porn site. There's a whole lot of gray in there. And that was used by the Liberal
01:24:07.480Party, just before they bring out this law, to attack him. I mean, there's politics on all sides on
01:24:12.760this. Oh, for sure. It kind of drives me nuts. It's like, we're going to stop the child pornographers,
01:24:18.680but don't ask anybody their age if they're uploading or watching on Pornhub. Well, the
01:24:25.640concern there is not just the "don't ask anybody their age"; it's got to be verified by some
01:24:31.000means. So it may actually be that you have to post your ID, or have a picture taken of you...
01:24:38.120you know, there's technologies that purport to guess your age based on a picture.
01:24:44.040But it covers any site that has, you know, adult content on it, which
01:24:49.640isn't just Pornhub. It's also Twitter and Reddit. And at that point, if you're needing to have an
01:24:56.960ID for everything you do on places like Reddit and Twitter, then we've essentially created a
01:25:02.780digital ID, which is a real problem because that can very much suppress speech that might be
01:46:01.120This is just what I've seen. I'm a professional... so, I am a lawyer. You know, there is this real
01:46:12.060possibility that you'll get people facing this very serious charge, this enhancement.
01:46:18.360And the other thing that happens is you get people who get into a dispute,
01:46:22.860and they get angry, and they say something offensive specifically to upset the person
01:46:29.460that they're angry with. And it might be something that they wouldn't otherwise say. Maybe they've
01:46:33.680been punched a couple of times in the face before they get to saying
01:46:39.040something like that. But now they've gone from, like, we're having a bar fight, to, I'm on the hook for...
01:46:45.940like, there's a maximum of life, even if they're not actually likely to get a life sentence.
01:46:51.600And look, it's still silly. It's still silly, you know? Like,
01:46:57.700I brought this up earlier: what are the benefits? What are the good things of this bill?
01:47:04.840What are the good...? I'm like, I'm not seeing any. I'm just seeing so many excuses to
01:47:11.100be able to throw more people in jail whose opinion we don't like. Like, the more I research...
01:47:17.360The more I do research on this bill, the more I see nothing good, very few things good; anything that it might actually help with are already things that big tech silences and censors and takes off their platforms anyway, or are things that are already objectively illegal in this country, like the child porn stuff.
01:47:37.360So all the good parts are things that are already reduced by big tech: very well-dialed-in,
01:47:44.240professional, transnational companies that have their shit together, probably hundreds of
01:47:50.980employees at each big tech company dedicated to online harms already. Or the good
01:47:57.200thing about the bill is that we're going to stop something that's already illegal.
01:48:00.340Other than that, the things that this bill is adding are just, like...
01:48:06.680It's just problems on problems on problems on problems, loopholes to attach charges to this person or that person based on something they said.
01:48:14.920It's a bunch of political correct nonsense that will be used to persecute dissidents, persecute people who have the wrong opinion.
Obviously, like, we've kind of... we've seen this already happen:
01:48:30.680the police are showing up at people's doors for things they're saying on social media,
01:48:34.980and it's always a thing that is kind of politically charged, or it's anti-immigrant, or whatever. It's
01:48:40.860totally legal to say that. It's legal to criticize things in this country, and police
01:48:45.760are already showing up at people's doors. You know, they're obviously trying to take
01:48:51.360away our freedom of speech with this bill, because, again, like I just said, there's no benefits, there's
01:48:56.340no real benefits, other than a whole bunch of loopholes for bureaucrats to use to persecute
01:49:00.840people they don't like because of their politics. I'm not against tough sentences. I'm a tough
01:49:06.040sentence kind of guy. It just seems like compared to where this current government is on other
01:49:13.820issues, getting rid of a mandatory minimum for your third conviction of gun smuggling under C5,
01:49:20.840get, you know, reducing sentences for all kinds of other crimes. And then they come up with this.
01:49:26.180And I think there's four instances where they've moved the maximum sentence up from two years to
01:49:31.720five years to show that they're really, really serious about this. To me, this side of the bill
01:49:38.100seems like it's performative, that it is for show rather than being a serious legal attempt to deal
01:49:47.720with real problems. And I think that we're going to see a lot of situations where, I mean, nobody's
01:49:54.400actually going to get life. It's the maximum on this. The minimum is nothing, but it's a straight
01:50:00.860indictable offense, which means that they can't go by summary conviction. It cuts off certain options
01:50:05.740in terms of sentencing. We've got summary convictions for terrorism offenses. You hide a
01:50:12.000terrorist, you can get a summary conviction, which means if it isn't spelled out in the legislation,
It's a maximum of six months in jail.
02:06:29.720For instance, a peace bond. If it looks like you're going to get into a fight with somebody...
02:06:36.740you may never have done so, but they do sometimes impose peace bonds, for instance,
02:06:41.860when, like, two neighbors can't get along and they keep getting into situations.
02:06:47.300But they do need evidence for that. And I don't think we're going to see this used a lot. I think...
02:06:54.240we're assuming it's going to pass... I don't think we're going to see this used a lot.
02:07:03.820Why would we? Why do we need it? Why do we need to have it? Why would we want to have a pre-crime bill?
02:07:11.880It's literally a thought-crime bill. When you have a big enough fear something's
02:07:16.180going to happen, then you get to charge somebody. Crazy. I suspect we'll see
02:07:23.260this used, actually, in situations where people have already committed some sort of hate propaganda
02:07:29.620offense or hate crime, but where the hate propaganda offense might be difficult for them to lay out the full
02:07:38.220proof, or where the circumstances might be such that... All right, so let me give you an example, and
02:07:44.560tell me if I'm on the right track or wrong track. I'm some yahoo, some jerk, who every weekend goes
02:07:51.780out to a church, the synagogue, a mosque, temple, what have you. And I'm always outside with the
really nasty slogans. I'm screaming in everyone's face. He's talking about, you know, pro-
02:08:04.100Palestine, or Hamas... a Hamas supporter, probably, hypothetically. But I think he's kind of...
02:08:09.580anyway. Every weekend. Could they eventually just say, you know what? We think he's going to keep
02:08:15.420coming back. We need to do something. Is that where this provision would come in, or is it
02:08:21.260something else? Now, the thing is, they could probably already use provisions like the existing peace
02:08:27.900bond provisions, depending on what it is that you're saying. Which seems like a real issue with this
02:08:33.500bill: there's a lot of it where there's already provisions in the law. Yeah. I mean, one of
the things is that they could put, like, an electronic monitoring device on you, which...
02:08:44.060that is concerning to me, because those are actually incredibly expensive, and you pay for
02:08:48.540that if they impose it on you. Normally you see those on, like, bail conditions, where somebody's
02:08:54.620like, I will agree to pay for this, because otherwise the alternative is that I will
02:08:59.580not be let out. I'm concerned that they may put these conditions on people, and then, like,
02:09:05.340we've imposed this on you and you have to pay for it. That might be very concerning.
02:09:09.340And so, if you can't pay... Why is that the most concerning part, that you have to pay for it?
02:09:13.420Why is having to pay for it the most concerning part? Not the part about, hey, we
02:09:22.620made an argument that we're afraid of you, what you might do in the future?
02:09:26.080Uh oh, but then he had to pay for his own ankle monitor! What injustice! Pay for it, or you've
02:09:34.720got to stay in jail. Well, they can't really jail you on this, because it's a peace bond. It's just,
02:09:40.500like, are people going to end up with these debts that can then be enforced? I'm not sure how that's
02:09:44.920going to end up working. But the standards for this, I think, are going to actually be fairly
02:09:49.340high. It has to be reasonable grounds. Again, I think it's going to be reasonably high.
02:09:54.660What if everyone involved in the decision-making process is totally biased, for political reasons,
02:10:01.480against the person who... whatever argument is being made... the person who might commit a hate crime in
02:10:06.840the future? Because you can just look at the scenario of the trucker convoy, with Jeremy MacKenzie
02:10:13.200or Artur Pawlowski or any other of these political dissidents, and see how this law would
02:10:21.140obviously be used against them. Obviously. And they would work around the clock to make an argument
02:10:26.960to get them locked up, or monitored with an ankle monitor, or what have you. Just extra,
02:10:32.600extra legislation to pile on. Because we already saw them pile on legislation to all these people
02:10:39.720who opposed the government agenda during COVID. We already saw them pile on all sorts of trumped-
02:10:45.820up charges. And the fact that they're not bringing up this angle to this bill, of, hey, remember
02:10:52.740political prisoners? Remember political dissidents? Remember how the government has an
02:10:57.060agenda to specifically pick on specific people? The fact that they're not covering this angle is
02:11:03.260very, very disappointing. Anyway. ...And usually the main use that you
02:11:10.120actually see for peace bonds is situations where somebody has already been charged with a criminal
02:11:15.060offense and it gets sort of pled down, even though a peace bond isn't really a plea. So you'll see
02:11:21.220situations where, like, two guys get into a bar fight, and instead of saddling them with a criminal
02:11:26.360record, they'll say, how about you guys, you know, how about you just go on a peace bond and stay
02:11:31.760away from that dude, maybe get some treatment. You can't both be at Billy's bar
02:11:37.200on Friday, and you've got to stay away from each other. Exactly. So here you may see situations
02:11:42.940where somebody has already committed, or likely committed, a hate propaganda offense or a hate
02:11:48.720crime, and instead of going through that full prosecution, you may see them imposing this as,
02:11:54.380you know, a step down. And so it's, like, rather than proceeding with charges. I think that'll be
02:12:00.480the main use that you see for this: somebody who's already committed it. Ian, you are fairly...
02:12:06.420I don't know, this all feels like... all the hypotheticals they're talking
02:12:12.220about seem to exist in a world where the government is completely not malicious at all,
02:12:19.160completely not sort of ruling with an iron fist at all. Like, you know, the trucker convoy
02:12:26.660never happened, there's no evidence of political prisoners. No, it's just going to be, how is this
02:12:31.140going to affect citizens? How will they use this bill or that bill? You know, there's nothing
02:12:36.820wrong with the government whatsoever. It just... I don't know, it just seems like a very
02:12:43.120sort of innocent worldview that they're looking at this from, as if it's just citizens who
02:12:49.680are going to be, you know, accusing other citizens with it, without any sort of political bias, any
02:12:54.180sort of political agenda at play. It's not part of the larger picture whatsoever. I mean, it would be
02:13:00.280really interesting, actually, to hear Runkle's opinion on this bill, since everything that's
02:13:04.020happening in the UK. That's actually a really good place to start from. ...Robust in
02:13:13.520defending civil liberties. You would never be charged for criticizing a group of
02:13:18.300people. People complaining about immigrants in the UK are literally getting thrown in jail right now, and
02:13:22.720charged, constantly. That would never happen here, though. And you seem to be saying there
02:13:28.920could be some problems with this, but don't worry too much about this section.
02:13:32.860This isn't the part that actually worries me the most. So what does... It's largely
02:13:40.100the material about what they can take off... Sorry, real quick: it doesn't worry him the
02:13:46.020most, but still, as a lawyer, as someone who follows legislation and law very closely... what
02:13:52.840has existed that is similar to this part of the legislation,
02:14:02.740where, hey, if we make the argument that we're afraid enough that somebody might do something
02:14:07.840in the future, then we're going to persecute them? When has this ever happened in history,
02:14:14.200like, anywhere? I don't know that much about the law and the history of law, but
02:14:18.240again, I feel like they're kind of just glossing over this part.
02:14:23.540they're saying yeah no if we just if you can make the argument that you think someone's going
02:14:31.000to do something bad in the future then yeah we should be able to oh they haven't committed a
02:14:34.800crime yet oh yeah yeah i mean a lot of the examples that they use they're just their
02:14:43.460example like you know they're i feel like they're really not um there's a naivete in the air in this
02:14:50.160conversation especially when you're not considering um the evidence of political prisoners in canada
02:14:56.820since uh 2022 offline and what they can um rewind it a bit you know the material
02:15:04.040most replayed parts coming up so what does that um it's largely the
02:15:12.100material about what they can take offline and what they can um you know one of the concerns i
02:15:18.800have is actually with the uh the intimate images the revenge porn aspect of things and i think
02:15:25.180that that is way too broad it's very and i mean this is one of those things where everyone is
02:15:30.540like you shouldn't be doing that you shouldn't be sharing i don't know anyone in favor of revenge
02:15:36.060porn except that you will actually if you frame the question right find everyone in favor of
02:15:41.900revenge porn okay so what do you think of the sistine chapel beautiful piece of art beautiful
02:15:49.660piece of art and it's actually revenge porn believe it or not uh at least portion of it uh
02:15:55.420one of the pope's sort of figures uh was criticizing the painting of it because he felt
02:16:01.340that there were too many nudes on it and so michelangelo as a bit of a sort of a you know
02:16:07.260take this uh amended one of the figures to put that guy's face on it and then depicted that person
02:16:14.220being bitten south of the equator by a snake and that is that's permanently on there it's a nude
02:16:22.340figure um and that was specifically done as a screw this guy and it's one that now persists
02:16:29.660throughout history but it's also a great work of art is this going to be something that we go
02:16:35.080and we know that this person gave no consent to this um there's actually a lengthy history of
02:16:41.960using nudity in political commentary, depicting your political opponents as nude, typically
02:16:49.040in unflattering ways. We've seen this with Trump, for instance. There's lots of people
02:16:54.800who have depicted Trump sort of having a small penis. Obama made a comment about Trump having
02:17:03.060a small penis, I think, the other night at the DNC, the Democratic National Convention.
02:17:08.060in the nude and sort of de-emphasize certain uh body parts well trump has not given his consent
02:17:16.300to that but it also is very broad it is if it is reasonable to suspect that the person had a
02:17:22.840reasonable expectation of privacy and the person does not consent to the recording being communicated
02:17:27.260well reasonable to suspect is a very very low standard and it doesn't mean that there's no
02:17:35.520sort of proof required and so for example if there is um any artificially created image
02:17:43.040that depicts somebody in the nude how do you know as if you're the provider how do you know
02:17:48.880who this person is there's no requirement to track this person down and so if it just
02:17:54.400if somebody creates an image of nobody who actually exists it's just an image of something
02:18:01.280that looks like a person but is not actually duplicating a person and with ai right now
02:18:07.200and the image is coming out yep um then they would have to take that down
02:18:14.240notwithstanding the fact that there is no actual person who looks like that because
02:18:19.120the organization doesn't know they would say it's reasonable to suspect that this is a person
02:18:25.280and so we get into this real thing of policing content and all of this stuff basically they
02:18:30.800have to respond within 24 hours when they get a notice and so we may see a lot of
02:18:37.600situations where people are flagging content and it has to be taken down because there's no way
02:18:43.400that they can determine that it's not actually you know intimate content communicated without
02:18:48.360consent and that may also apply to things that are not even pornographic because if they have
02:18:53.920a 24-hour turnaround and they get something flagged it may well be that the response is
02:18:59.380just to take something down so for instance you have your podcast somebody flags it as containing
02:19:05.920you know whatever material it might have to be taken down offline until somebody looks at it and
02:19:12.400they may not want to go through like if your podcast is an hour they may say it's easier for
02:19:17.500us to just take it down and never repost it rather than go through it right so he's talking about how
02:19:24.520So when you flag something under this part of the legislation, it would be up to some employee at Facebook or Google to look through the content to make sure it's not, like, you know, violating some crime or some content violation according to not Facebook or Google, but to the Canadian government's classification of online content.
02:19:48.620see how convoluted that is if you're the facebook employee you have to what watch the content consume
02:19:55.220the content make this judgment call um or you can just take it down uh because the consequence of
02:20:02.580making the wrong decision is that your corporation facebook might get fined
02:20:09.480millions of dollars so a lot easier just to take it down for me as a content creator
02:20:17.500your first 24 hours up online, your first 48 hours up online, and you know this from
02:20:23.600your very popular YouTube channel, that's when you get most of your views. That's when you get
02:20:28.580most of your clicks. And so if you get flagged, if you get essentially swatted, and it's taken
02:20:35.900down, you're not getting that back. That's especially, you know, the graph is basically,
02:20:41.780it goes straight up and then plateaus in terms of views. You get almost everything within that first
02:20:47.26024, 48 hours. And if you have a really good 24 to 48 hours, you can keep going.
02:20:53.260But if you don't, it's just flat the whole time. Yeah. Oh, absolutely. If you got hit
02:20:59.200in that time and especially you could also consider what if you happen to get the big scoop?
02:21:04.380You've got you've got the story no one else has. You know, you get a picture of a politician
02:21:11.080like breaking into his ex-wife's house you know something stunning like that yeah you post it online
02:21:18.560you know the political party that this targets might have an incentive to make a false
02:21:24.660report it gets knocked down for 24 hours everyone's talking about it but you're not getting the
02:21:30.600you know you're not getting the traffic of everyone wanting to come and see your pictures
02:21:35.140and by the time you get it back online those pictures are probably now everywhere and so
02:21:41.400they've managed to effectively punish you for getting this report by preventing you from
02:21:45.760successfully monetizing it and if you can't make money in this game then well you're out of
02:21:53.320business so well you just keep doing that to if there's a particular news organization that is
02:21:59.200critical of a particular party you just keep sniping them every time they try to make money
02:22:04.420pretty soon they go out of business the abuse potential for this becomes huge
02:22:09.720interesting how once again they're talking about the monetary implications
02:22:14.980of uh you know having your stuff taken down they never really brought up the i mean they were they
02:22:22.460never used the word censorship but like that's kind of it could easily be used that way too
02:22:30.840right if there's information that's out there that you don't want out there
02:22:35.680that makes xyz look bad um then yeah you get it taken down within 24 hours doesn't go nearly as
02:22:43.240viral right so there's censorship potential imagine there was uh you know a government
02:22:50.000that wanted to censor its own citizens i know that's kind of like hard it's not part of the
02:22:54.060conversation we're having here apparently that's not really coming up but um
02:22:59.300imagine a world where the government didn't want to censor its own citizens and had no intention
02:23:06.760of doing that ever let me ask you about section 13 coming back i was part of the campaign from
02:23:14.380about 05 06 until 2011 i think it was withdrawn uh to get rid of section 13 and
02:23:22.960And for this to come back, it seems like the language is trying to be a bit more nuanced.
02:23:32.200It's trying to have some parameters around it in ways that it didn't before.
02:23:37.000But I'm still concerned about the Human Rights Commission and the Human Rights Tribunal being able to fine someone $20,000 based on an anonymous complaint from someone who may or may not have been actually harmed by what was said or done.
02:23:52.960um and then it becomes a money-making opportunity for people like we were saying instead of taking
02:23:59.340away the monetization ability for a news organization let's say you're making it so
02:24:05.360that someone can make 20 grand a pop every time they complain that ian runkle well he said something
02:24:11.280i don't agree with ian runkle said something that's against the human rights code and i think
02:25:02.080Now they're tied up in this human rights tribunal stuff.
02:25:05.640It can become very difficult to be any sort of political commentator at that point if you're always getting tied up in it.
02:25:12.360I think back to Mark Steyn's famous case with this. One of the complaints about it was, and this is very timely because we're hitting the month of Ramadan, and this joke only makes sense to you and is only funny if you actually understand Islam and Ramadan.
02:25:31.900And one of Mark's comments was, is it just me or does Ramadan come earlier every year?
02:25:39.040And of course it does because it comes around every 11 and a half months.
02:27:52.260What is this? Come on, man. Like the demoralization in that comment is crazy.
02:28:00.660Happens if somebody is critical of, you know, any sort of sexuality, et cetera. If you're on the left, think about what happens when somebody posts something in favor of, you know, diversity, equity and inclusion sorts of things.
02:28:16.460All of these things could be targeted under these provisions and tie people up.
02:28:21.680And we may end up with a world where we just can't talk online without people sniping us into court.
02:29:21.580I don't deny that either on the child exploitation, the revenge porn side, or on the other side, that there are serious issues here.
02:29:30.560But do you get the sense that the government is trying to show that they take it very seriously by having very serious sentences, very serious fines, that they are using a sledgehammer to take on a fly when they could have done this in a much more nuanced way, dealt with the issues that are bothering people in a way that's charter compliant, that doesn't lead to the swatting, and that isn't just about performance?
02:29:59.100i would love to see a much more nuanced much more targeted bill and there's no reason why
02:30:07.080they can't approach this incrementally they can say listen let's go after the most like the worst
02:30:12.700stuff that everyone agrees is bad because there are provisions in here that i think are good
02:30:18.460and provisions that i think are well-intentioned that can be later cut you know later fixed in
02:30:25.680ways that will the road to hell is paved with good intentions it's got good intentions what does that
02:30:32.360ever count for though runkle and legislation who cares if it has good intentions what does it
02:30:37.720actually say what's the actual result will make them good but they just want to cover so much
02:30:46.100provisions that i think are well intentioned that can be later cut you know later fixed in ways that
02:30:52.980will make them good but they just want to cover so much stuff that this ends up including everything
02:31:00.520from really really bad content to hurt feelings and it's the hurt feelings area that really can
02:31:07.440you know can cause huge problems and it can end up the courts may end up having to look at these
02:31:14.180provisions and because they're so broad it can be listen we have to throw out the baby with the
02:31:20.360bathwater because you haven't given us any other choice throw it out throw the whole thing out i
02:31:26.800agree with that i i think we do need some additional provisions to deal with things like
02:31:32.700revenge porn and you know targeted deep fakes and these kinds of things but they have to be
02:31:39.220done carefully and they have to be done with nuance and i think this bill needed a lot more
02:31:44.940study than it got before it was uh brought forward again so oh well it needs a lot more study arif
02:31:52.740was talking about hey we put so much time into this we put so much time into this years we know
02:32:00.640we took years we took years into this bill and we believe that we have struck the best balance
02:32:05.960well runkle disagrees with you arif oh i have real concerns all right ian runkle thank you very
02:32:14.200much for your time today and they have to be done with nuance and i think this bill needed a lot
02:32:19.680more study than it got before it was i i think we do need some additional provisions to deal with
02:32:26.960things like revenge porn and you know targeted deep fakes and these kinds of things i mean just
02:32:34.480to be like devil's advocate for this it's like what if we just need to encourage better uh internet
02:32:41.580literacy and better sort of you know more cautionary messages to young people of how to
02:32:48.000use and consume the internet and i mean at the same time even as i say that i feel like
02:32:53.320kids probably on average know more than the grown-ups when it comes to like how to use
02:33:00.520the internet and and move around on the internet uh so i do think it's like they're they're kind
02:33:05.960of closer to the internet than the older generation like you know people in their 20s
02:33:11.200are much better versed on internet culture how to discern if something is fake or real
02:33:17.000or not than the older generations or at least that's what i've found and um yeah i just feel
02:33:24.740like this idea of we need to control the internet is totally um misguided because it is you know
02:33:37.440Because it's all it's going to amount to is what you have in China.
02:33:40.040At the end of the day, like, you know, maybe it's just like a fallacy to try or it's like it's it's misguided to try and control the Internet because you're always just going to get a greedy government that wants more money and control.