The ‘Online Harms’ Act could censor Twitter, Netflix, and us
Episode Stats
Words per Minute
165.31483
Summary
The Online Harms Act, also known as Bill C-63, is the latest attempt by the Trudeau Liberals to regulate the online world. There are some good parts of this bill, including parts that deal with childhood online sexual exploitation. But are there parts of the bill that go too far?
Transcript
00:00:00.000
Ontario, the wait is over. The gold standard of online casinos has arrived. Golden Nugget
00:00:06.000
Online Casino is live, bringing Vegas-style excitement and a world-class gaming experience
00:00:11.040
right to your fingertips. Whether you're a seasoned player or just starting, signing up is fast and
00:00:16.680
simple. And in just a few clicks, you can have access to our exclusive library of the best slots
00:00:21.740
and top-tier table games. Make the most of your downtime with unbeatable promotions and jackpots
00:00:27.220
that can turn any mundane moment into a golden opportunity at Golden Nugget Online Casino.
00:00:32.820
Take a spin on the slots, challenge yourself at the tables, or join a live dealer game to feel the
00:00:37.920
thrill of real-time action, all from the comfort of your own devices. Why settle for less when you
00:00:43.140
can go for the gold at Golden Nugget Online Casino? Gambling problem? Call Connex Ontario,
00:00:49.320
1-866-531-2600. 19 and over, physically present in Ontario. Eligibility restrictions apply. See
00:00:56.340
goldennuggetcasino.com for details. Please play responsibly.
00:01:03.700
The Online Harms Act, also known as Bill C-63, it's the latest attempt by the Trudeau Liberals to
00:01:09.760
try and deal with regulating the online world. Look, there's some good parts of this. Parts that
00:01:15.000
say they're going to deal with childhood online sexual exploitation. Nobody I know is in favor of
00:01:20.420
that. There are portions to deal with revenge porn. Well, again, I don't know anybody in favor of
00:01:25.880
revenge porn. But are there elements of the bill that go too far? Will they infringe on freedom of
00:01:30.820
expression? Will they lead to people abusing this for political means? Those are very real concerns.
00:01:37.580
Hello, welcome to the Full Comment Podcast. My name is Brian Lilley, your host. And this week,
00:01:42.060
we're going to take a deep dive into Bill C-63, the good, the bad, and the ugly. There's been a lot of
00:01:48.660
commentary that this is a bill that should actually be cut in two. The online harms portions, dealing
00:01:54.260
with children, revenge porn, etc., put into one. And then all the elements that deal with regulating
00:01:59.920
speech, hate speech, etc., even advocating genocide, should be a separate bill and get longer, deeper,
00:02:06.900
more thoughtful study. Is that the way to go? What are the portions that we need to worry about? Or do we
00:02:13.380
have some of them taken out of context? Ian Runkle is a criminal defense lawyer in Edmonton. He
00:02:20.400
also has a very large following online with his YouTube channel, Runkle of the Bailey, where he
00:02:25.300
talks about legal issues, especially firearms, on a regular basis. We reached out to Ian to discuss this
00:02:31.800
issue and figure out what he sees as problems from a criminal defense point of view. Ian Runkle,
00:02:37.920
thanks for the time. Oh, thank you for having me. So let me ask you off the top about the online harms
00:02:44.000
bill. It seems to be two bills in one, in my view. You've got a whole area that's aimed at stopping
00:02:52.860
child sexual exploitation, revenge porn, things like that. And then you've got hate speech,
00:02:59.820
incitement to genocide, all these other things. Does this seem like they've kind of jammed a bunch of
00:03:05.060
things together that the only thing they have in common is that they might happen online?
00:03:11.440
This is really, I think, sort of an omnibus bill of all of the things that happen online that we don't
00:03:16.980
like, sort of wedged into one big thing. But it also will have some offline applications as well in
00:03:24.820
terms of things like the abilities to get peace bonds applied to people who they think might do
00:03:33.260
something, which can include things like house arrest. So those provisions,
00:03:40.480
it seems that they maybe should sever that off into a separate bill. But there's a lot of problems with
00:03:46.120
this because the government really has this big idea that they can sort of tame the internet. And
00:03:54.640
I mean, tame the internet. I don't know what it's like in Edmonton, but we've had a dinner between
00:04:03.120
two G7 leaders shut down because of pro-Hamas terror groups taking over the event venue. We've had a
00:04:14.180
political fundraiser for local Liberal constituency associations shut down because the same pro-Hamas
00:04:24.060
terror group showed up and took over the restaurant in a very affluent, nice part of town. I'm not sure
00:04:30.120
that you can police the internet when you can't police the streets.
00:04:33.680
Well, a lot of the policing, they plan to push on to social media organizations. And the problem is
00:04:41.620
that in doing so, they're going to create tremendous costs for the social media organizations,
00:04:45.940
both in terms of their internal moderation and processes, but also in terms of potential liability
00:04:54.600
fines if they're within Canada, as well as the possibility of inspections. And so what we might
00:05:02.000
actually see is some of these companies fleeing Canada in order to take up residence elsewhere to...
00:05:10.280
Okay, so let's get into that in a little bit, but let's go back to the first question that I asked
00:05:15.800
you, and you kind of hinted at it in the first answer. Should this bill be split? Because on the
00:05:23.120
question of sexual exploitation of children online, on the question of revenge porn, things like that,
00:05:29.240
I don't hear a lot of pushback. I don't hear a lot of, oh, no, no, no, we need to have that.
00:05:34.860
But there seems to be unanimous consent, unanimous viewpoint that these are things that we should
00:05:43.540
be looking after. And to me, it seems like if you had that as a separate bill, that would pass
00:05:49.580
very quickly versus all the other stuff that you and I are going to spend the next half hour or so
00:05:54.800
unpacking. So should this bill be split? It's certainly a good idea in terms of separating out,
00:06:01.620
for instance, the Criminal Code and Human Rights Act stuff. The one thing I will say is anytime the
00:06:08.000
government tells you we're here to protect the children, you should be very, very concerned,
00:06:12.500
because that's usually a sort of wrapper for very bad legislation. And I think that a lot of these
00:06:19.420
portions that may very well pass without too much criticism, simply because they say this is to
00:06:26.060
protect children, are actually quite dangerous and quite Charter-infringing provisions.
00:06:33.560
So some of them may actually... The child exploitation things, you're worried that there are
00:06:37.640
charter violations within that? Absolutely. For instance, there's... Unpack that for me.
00:06:46.880
For instance, one of the things that is forbidden to or will be sort of forbidden is any sort of visual
00:06:54.080
representation that shows a child who's being subjected to cruel, inhuman, or degrading acts of physical
00:07:00.020
violence, which would include, for instance, the movie Carrie. This is... It's very difficult to craft these...
00:07:09.140
Okay, well, look, the movie Carrie, there's going to be an awful lot of people. I know the movie you're talking
00:07:15.260
about. I watched it as a kid in the 70s. There's going to be an awful lot of people who have no idea what we're
00:07:21.420
talking about. This is an old movie. It's an old movie, but... What is it about Carrie that would
00:07:27.940
trigger this? It's about a high school student who is very badly bullied by her peers, and it becomes
00:07:37.000
a horror movie because, you know, she's got magical powers, but it includes things like a scene, quite
00:07:42.940
famously, where they trick her into a situation where they can dump pig's blood on her specifically
00:07:49.360
for the purpose of degrading and humiliating her. There's a lot of, like, high school movies that
00:07:55.820
include these kinds of scenes of bullying and would be about children because it's high school
00:08:01.780
students. Yeah. And so you can say, like, it's really... We don't want to have videos of child abuse
00:08:08.100
out there, but there may actually... And the other thing is that there may be good reason to actually
00:08:14.260
show even those. Somebody who sees somebody abusing a child in, you know, in public may videotape that
00:08:22.320
and put it online in order to condemn that behavior and yet run afoul of this. So it becomes very difficult
00:08:30.300
to say where the line is. And when they're sort of painting these broad strokes, they're going to get a
00:08:37.940
lot of things wrong, and there's going to be a lot of situations where it may not be covered.
00:08:44.040
Perhaps more contemporary, South Park. Oh my God, they killed Kenny. Yes.
00:08:55.880
Potentially. I mean, there's exceptions there in terms of whether or not it has sufficient artistic
00:09:01.020
merit, which when South Park came out was a hugely debated issue. I think that they've perhaps found
00:09:07.820
some social traction, but what happens with all of this?
00:09:12.520
I can see the argument for and against South Park having artistic merit. There are times when it,
00:09:18.960
you know, it cracks me up and times where I cringe. And I think most of us are like that. And so,
00:09:24.960
yeah, you could have that debate. You could have that argument. But depending on who's interpreting
00:09:29.780
the law, you're essentially saying, could result in, you know, unforeseen instances of people being...
00:09:40.480
Absolutely. And one of the things is that content that is actual, like, child sexual abuse material
00:09:47.760
is already one of the most vilified and illegal sorts of things you can find online.
00:09:53.700
Every provider, like social media and so forth, as a major category, polices the heck out of this
00:10:01.540
and watches for that kind of content on their media and actively reports it to police.
00:10:07.400
One thing that I hope they'll fix, because it's very concerning in this bill,
00:10:11.660
is that this could actually make it very difficult to prosecute people who are sharing some of that
00:10:18.100
content online. In what way? There's a provision that if content gets a notice, essentially somebody
00:10:28.000
says this content is objectionable, which hopefully somebody who saw that material would say this
00:10:33.360
content is objectionable. The provider is required to, within 24 hours, provide a notice to the person
00:10:42.140
who posted that content. And what are they going to do when they see, hey, your content
00:10:47.940
was removed, we have noticed this, is they may start destroying evidence. The police will often
00:10:54.080
engage in lengthier sting operations where they're watching to see where this material goes, who's
00:11:02.060
sharing it, and then you'll see those big busts where they arrest, like, 100 people.
00:11:08.360
What we may instead see is one person who gets a notice.
00:11:11.100
Those go by the wayside because of these notices.
00:11:15.420
Somebody who gets one of these notices is very likely to just say, whoa, I need to expect a
00:11:21.160
police visit. And then burn the hard drives and get rid of everything.
00:11:27.280
Yeah. So, I mean, hopefully they'll put in some sort of exception of, like,
00:11:32.200
maybe we don't have to give people advance notice if they're sharing stuff that is that
00:11:38.340
kind of harmful. Because I'd much rather see those people face an actual, you know,
00:11:44.420
arrest and prosecution. It's absolutely, we should be taking that material down, but
00:11:49.440
prosecution's important. And if it can interfere with that, that's a problem.
00:11:54.260
You mentioned earlier that when people wrap themselves up in, we're protecting the children,
00:11:59.500
we should be worried. And it reminded me of, correct me if I've got the bill wrong, Ian.
00:12:08.480
Bill C-51, I think it was, in the Harper era. And there were a whole bunch of measures in that to
00:12:15.680
try and, you know, deal with different elements that they want to deal with. But I believe it was
00:12:21.280
then Justice Minister Vic Toews, or Public Safety Minister Vic Toews, saying, you're either with the
00:12:26.100
child pornographers, or you're with us. And it was a horrible bill. There were horrible elements
00:12:31.640
to it. And, but he tried to wrap himself up in, I'm defending children, why aren't you? It is kind of
00:12:40.020
a thing that politicians will do. Politicians of all stripes, I'm not being partisan here, I'm picking
00:12:45.420
on Vic Toews, as well as being worried about what the current government will do. Is this something that
00:12:52.240
that you worry about when you see those statements, when you see these bills that they're
00:12:57.660
just trying to hide other things? I mean, I'm sure that there is some motivation of, you know,
00:13:03.680
we've got to protect the kids. The problem is that it's one of those things that's a real
00:13:08.920
red flag. Almost every time you see it, there's something in those bills that is really a problem.
00:13:14.180
And this bill has a lot of issues that may create real problems, where the various efforts to
00:13:23.200
regulate the internet that we're seeing, including ones that are supported by the Conservative Party
00:13:27.800
right now, the sort of porn ID law, have a real possibility of sort of creating a balkanized
00:13:34.980
internet where Canada gets a lesser internet. Hold on, though, hold on. Pierre Poilievre was asked,
00:13:41.900
do you think that people should have to verify their age? You know, that that's vastly different
00:13:48.420
than you've got to, you know, scan your driver's license when you're going on a porn site.
00:13:54.920
There's a whole lot of grades in there. And that was used by the Liberal Party just before they brought
00:14:00.820
out this law to attack him. I mean, there's politics on all sides on this. Oh, for sure.
00:14:06.200
Which kind of drives me nuts. It's like, we're going to stop the child pornographers, but don't
00:14:12.180
ask anybody their age if they're uploading or watching on Pornhub. Well, the concern there is
00:14:18.920
not just that it's don't ask anybody their age, it's got to be verified by some means. So it may
00:14:24.040
actually be that you have to post your ID, or have a picture taken of you for various, you know,
00:14:31.780
there's technologies that purport to guess your age based on a picture. But that would also not
00:14:37.340
just cover it; it covers any site that has, you know, adult content on it, which isn't just Pornhub.
00:14:43.420
It's also Twitter and Reddit. And at that point, if you're needing to have an ID for everything you
00:14:50.320
do on places like Reddit and Twitter, then we've essentially created a digital ID, which is a real
00:14:56.460
problem, because that can very much suppress speech that might be legal, but controversial.
00:15:04.100
And that becomes, that's my concern is that it gets wrapped up as we need to protect children.
00:15:09.900
But it instead is something that affects, affects online discourse and speech, including
00:15:16.740
legitimate lawful speech. So that's the concern. And, I should note,
00:15:24.380
all of the concerns that apply to that bill are wrapped up in C-63, in particular, because there's
00:15:31.640
a section, section 65. And it just says, it's under this category of duty to protect children.
00:15:38.280
It says design features, an operator must integrate into a regulated service that it operates any design
00:15:44.180
features respecting the protection of children, such as age-appropriate design, that are provided for
00:15:49.580
by regulations. Those regulations won't come to a vote, they can just be, it's like the order in
00:15:56.000
council to, you know, ban firearms and so forth, they can just be put forward with a stroke of a pen.
00:16:02.540
And they could say you must have this kind of age verification tomorrow. You know, assuming that
00:16:08.720
this law was in place, they could just drop that down. And suddenly, all of these companies have
00:16:15.200
to adjust to that. They could do worse than that. They could do more content...
00:16:20.300
I'm glad that you're raising that because an awful lot of people don't realize how much in legislation
00:16:25.580
is left to regulation. And regulation is not passed by parliament. It's not seen by parliament for the
00:16:32.500
most part. It is decided by bureaucrats after the fact. And what you just described is something that
00:16:38.060
would leave an awful lot of the details up to someone who is not elected, who we have no recourse
00:16:47.920
with, who has no accountability, and they're going to decide how the law is applied. Correct?
00:16:55.500
Absolutely. And typically what we see is a sort of ever expanding sphere of what they feel should be
00:17:01.900
their influence. Bureaucrats very rarely say, I should have less power and I should have fewer things
00:17:07.840
under me because the more things that they deal with, the more money and budget and so forth,
00:17:14.000
and the more important they get to be. And so we can imagine that this is going to grow,
00:17:21.420
not shrink. So that's a very big concern to me. And I mean, I want to see a healthy Canada economically.
00:17:32.320
And one of the things that I think Canada does well is our sort of ground game on social media
00:17:39.480
and on this kind of content, but it will be very difficult for these companies to consider locating
00:17:47.780
themselves in Canada if they may have to completely change their system at the stroke of a pen by a
00:17:53.660
bureaucrat. Why would you want to subject yourself to that when it might be literally millions of
00:17:59.160
dollars in order to adjust these things? And then you might have to do it again in six months.
00:18:06.600
You know, I've talked to people who are either in these major tech companies or represent
00:18:13.000
them. And as you know, my industry, my company, they've had a lot of issues with big tech and the
00:18:20.580
whole argument over what percentage of ad dollars go to Google and Facebook versus everyone
00:18:26.580
else. They're still major players and we're still partners with them. And they're part of the,
00:18:34.020
the internet economy, as you say. And yet this government keeps coming up with bills that make
00:18:39.400
it harder and harder for them to operate. And yet there are conversations about, should we leave?
00:18:46.020
Or should we withdraw certain services from Canada? Should we scale back in Canada? Which is
00:18:53.460
really bothersome when you think about how much all of us rely on, whether it's storage services,
00:19:03.160
email services, social media services, all of these things that are part of the economy that we
00:19:09.700
wouldn't have thought of 15 or 20 years ago. They're there now. We're going to add
00:19:14.840
these regulations on, and that might lead to some of them saying, you know what? We don't need Canada
00:19:20.040
because they don't want us. And it is a very real concern. And at those points,
00:19:26.660
we end up with situations where these companies do flee, and then we can't regulate them at all.
00:19:33.060
The other thing that we, that we may see with some of these bills is if you prevent people from doing
00:19:40.320
things legally, they may move things into, you know, there may be a growth in illegal distribution
00:19:46.600
channels, which have very few brakes. And so we may actually end up making our internet worse
00:19:53.760
by driving out the people who are sort of the good actors and only creating ones that have...
00:20:03.200
Essentially, you're talking about the dark web.
00:20:06.180
Those sorts of things. Like if we chase out, you know, pornography is controversial,
00:20:11.180
but we're not going to get it to go away. And so for instance, if we drive out companies that try
00:20:17.580
their best to make sure that things are not too toxic, what you'll instead see is a massive growth
00:20:24.880
of people seeking out that content on other sources, which may not follow rules at all. And that's going
00:20:32.680
to be a major concern I have with some of these bills as well. Um, and when you look
00:20:41.260
at just the, uh, the legal exposure of operating in Canada, um, right now, I'm not sure what the global
00:20:47.920
revenue is for Twitter right now. I'm just looking, um, is it in millions? Uh, like if the, uh, sorry,
00:20:58.860
I'm not, I shouldn't be Googling right now, but it's eight percent, the fines allowed for penalties
00:21:03.740
are up to 8% of the operator's gross global revenue or 25 million. Yeah. Or 25 million, whichever is
00:21:12.180
greater. So if you're talking about a company like Facebook or, uh, Google, uh, Google's maximum fine
00:21:19.440
for violating this is a billion dollars. Yeah. If I'm Google, I'm starting to say, you know what?
00:21:28.080
Uh, I don't need to, uh, to sell Brian his Google Drive account anymore. I don't need to sell
00:21:33.260
the storage and all the other services that various people buy. I'm going to
00:21:38.400
decamp. I'm going to leave. And so we may actually see measures to just say, listen,
00:21:43.600
um, I mean, as a market in terms of our money, Canada isn't the big player.
00:21:49.920
We're basically the equivalent of the kid who goes to the grocery store once a week to buy a candy bar
00:21:54.340
out of his allowance. And so when we say, listen, if you want to play in our sandbox,
00:22:00.520
you have to follow all of these rules. They may say, we're just happy to block Canadians.
00:22:05.620
And if we start getting blocked off of things like Netflix or, you know, Reddit or, you know,
00:22:11.500
or Twitter, we end up with this world where as Canadians, we have a lot less access to the net.
00:22:18.040
And that's becoming a very real possibility. Some of these bills, uh, not C-63, uh, C-63 purports to
00:22:25.680
give orders to take specific content offline, but you've also got C-11, C-18 and so on.
00:22:31.980
Yeah. And some of those include provisions to say that, uh, that websites have to be blocked if they
00:22:37.460
don't comply. Um, you know, if you don't have enough Canadian content and you're not willing to
00:22:42.860
play ball, then we can block you from Canada. And that's a problem. If you're looking
00:22:50.100
at things like, you know, uh, the BBC, we're never going to convince the BBC to put Canadian
00:22:56.920
content on because that's not what they do. They're the British Broadcasting Corporation.
00:23:01.760
Um, all of these are very, you know, serious concerns. And to sort of tie that back to Bill C-63,
00:23:08.940
when we start saying, listen, if you want to operate in Canada, we've got all of these rules
00:23:13.660
and, you know, billion dollar fines. It's easy to just say, you know what, we're going to decamp
00:23:21.360
Canada. You do what you want to do. We're just going to operate from, uh... Yeah. Uh, Ian, we've got to
00:23:28.440
take a quick break right now, but when we come back, I want to get into the rest of bill C63, because
00:23:33.000
all of the issues around free speech, um, those are in the other part of the bill that we haven't
00:23:39.620
talked about yet. And I want to get your take on, uh, the dangers, whether there's anything
00:23:45.300
salvageable in that part of the bill, or do they need to go back to the drawing board? More when we
00:23:50.120
come back. So we've talked about the issues surrounding things like the child exploitation,
00:23:54.880
the revenge porn side of things on Bill C-63, but there's a whole other element.
00:24:01.200
And this is the one that most critics are saying huge impact on freedom of expression,
00:24:07.400
freedom of speech. Um, also, uh, you know, let me start here. I wrote a column when
00:24:14.420
it first came out, my first read through. And of course, these are big bills and it takes
00:24:18.120
a few days to digest and really understand because what people don't understand when you
00:24:22.380
get a piece of legislation like this is folks like you and I have to spend our time going back
00:24:28.640
and forth between the existing law and what's there, because you'll get this section that says,
00:24:33.940
and we replace this section with this. And you're like, well, what did it say before? And you've got
00:24:37.960
to go back and forth and it's dizzying at times. And so it takes a long time for us to break it out.
00:24:44.380
But my first read on this was, this is not a serious bill for dealing with what are absolutely
00:24:52.500
issues that folks want dealt with. But when you have a bill that says you're going to get,
00:24:58.960
um, let me read this to you. Potentially up to life in jail for either
00:25:04.920
advocating genocide, or let me read this part. Everyone who commits an offense under this act
00:25:11.060
or any other act of parliament, if the commission of the offense is motivated by hatred based on race,
00:25:16.620
national or ethnic origin, language, color, religion, sex, age, mental or physical disability,
00:25:21.820
sexual orientation, or gender identity or expression, is guilty of an indictable offense
00:25:26.900
and liable to imprisonment for life. We don't put people in prison for life for murder.
00:25:33.480
Now, I think it's worth noting, well, nobody's actually going to get a life sentence for most of
00:25:38.820
these things, right? It's a maximum penalty. It's not going to be the average penalty. But it is going to
00:25:44.880
be very interesting, because some of this could cover some very minor offenses overall. One of the
00:25:51.640
offenses that people commit that might trigger this is mischief. Somebody spray paints on, you know,
00:26:00.540
a religious building, something hateful to that religion. Well, okay, that's a detestable
00:26:09.800
crime. But it's spray paint, right, at the end of the day. And so it'll be like, okay, you're facing
00:26:15.620
months in prison as a maximum or up to life. But life because of what you spray painted. And
00:26:23.500
what that'll end up meaning is you might end up with a, you know, a jury trial over a spray painting,
00:26:31.220
because you're going to have certain rights. But also, any other act of parliament could be really,
00:26:37.460
really interesting. For instance, out on the East Coast, we're seeing disputes over fisheries.
00:26:44.120
And so if they say, well, you decided to fish lobster out of season, in order to, you know,
00:26:51.120
because you dislike these other groups, then now you're facing a life sentence. It's like,
00:26:57.340
what? And for people that don't follow this issue, I follow it, you know, just gently. You're
00:27:07.580
right. That could happen. Someone could say this is a hate crime.
00:27:11.700
Yeah. And I mean, the courts are usually reluctant to find something as a hate crime,
00:27:17.300
because a lot of the time when people say something is a hate crime, it's more somebody
00:27:22.120
with mental illness. But, you know, there is this real possibility that you'll get people
00:27:28.180
facing situations where the very serious charge is this enhancement. And the other thing that happens
00:27:34.860
is you get people who get into a dispute, and they get angry, and they say something offensive,
00:27:41.000
specifically to upset the person that they're angry with. And it might be something that they
00:27:46.260
wouldn't otherwise say, maybe they've been punched a couple of times in the face
00:27:51.220
before they, they get to saying something like that. But now they've gone from like, we're having
00:27:57.160
a bar fight to, I'm on the hook for like, there's a maximum of life, even if they're not actually
00:28:03.900
likely to get a life sentence. And look, I'm not against tough sentences. I'm a tough sentence kind
00:28:10.380
of guy. It just seems like compared to where this current government is on other issues, getting rid of
00:28:19.560
a mandatory minimum for your third conviction of gun smuggling under C-5, you know, reducing
00:28:26.220
sentences for all kinds of other crimes. And then they come up with this. And I think there's four
00:28:31.580
instances where they've moved the maximum sentence up from two years to five years, to show that they're
00:28:37.540
really, really serious about this. To me, this side of the bill seems like it's performative,
00:28:45.020
that it is for show, rather than being a serious legal attempt to deal with real problems.
00:28:54.060
And I think that we're going to see a lot of situations where, I mean, nobody's actually going
00:28:58.760
to get life. It's the maximum on this. The minimum is nothing. But it's a straight indictable offense,
00:29:05.560
which means that they can't go by summary conviction. It cuts off certain options in terms
00:29:10.100
of sentencing. We've got summary convictions for terrorism offenses. You hide a terrorist,
00:29:16.740
you can get a summary conviction, which means if it isn't spelled out in the legislation, it's a
00:29:21.540
maximum of six months in jail. Am I correct? I'd have to check. I sadly don't have
00:29:29.620
all of the sentencing memorized. I always check.
00:29:32.660
But if it isn't spelled out in legislation, a summary conviction, you're not going to jail for
00:29:37.840
more than six months, right? I think they upped it to 18.
00:29:41.200
Oh, did they? Okay. So, but yeah, it's like you've got summary convictions for things related to
00:29:45.800
terrorism, and you don't have summary convictions here. As you say, it could be spray paint.
00:29:50.640
Yes, but life? This may actually discourage the laying of these charges and sort of if I'm a
00:30:01.840
prosecutor, let's say you've got a guy who has, you know, spray painted a
00:30:06.900
shed somewhere and he basically says like, whatever. Now I'm considering whether I want to lay this
00:30:13.360
charge. Well, if I lay this charge, now this person, because they're liable to imprisonment for
00:30:18.880
life, has an entitlement to things like a jury trial. And do I really want to run that on
00:30:25.440
a shed spray painting? I might say, well, no, I don't. I'm going to just leave that out.
00:30:32.620
And so that also becomes a concern. The other thing...
00:30:36.500
And then people who are upset at the offense, and rightly so, say, well, why aren't you giving
00:30:42.460
them the full charges? And it's like, well, they lose faith in the justice system, because they
00:30:49.020
say, well, but you could be charging them with more, and you're not. So you're not taking this
00:30:53.860
seriously. And like, let's say it's a religious place of worship, as you were saying earlier.
00:30:59.700
Yeah. You know, if your church, your synagogue, your mosque is attacked, that is very personal,
00:31:07.300
is very visceral for you. And you're going to want someone to face full consequences. But because
00:31:14.840
this bill goes so far, you're saying there's a good chance prosecutors will just, meh, let's leave
00:31:21.160
that aside. And then you've got the public feeling like the prosecutors aren't taking it seriously.
00:31:27.560
And what they really ought to be doing on this, I think, is splitting it into, so, you know,
00:31:35.080
there should be a summary conviction option for this, for those situations that are less serious.
00:31:39.300
I can certainly see situations where it might be really, really serious. If you've got somebody,
00:31:45.480
you know, they've got the swastika tattoo on their forehead, they go to a synagogue,
00:31:51.220
they barricade the doors, and they set it on fire while everyone is inside. I mean, first of all,
00:31:56.900
they'd be looking at life imprisonment anyway. But I still think, you know, that's a situation where
00:32:03.560
you're like, okay, that's where we want to take it up to a life imprisonment maximum. But we don't
00:32:08.360
necessarily want to do this over somebody who, you know, at a bus stop, throws an empty coffee cup.
00:32:15.380
Like, is this really the same category? Do we need a life imprisonment, like maximum? Does it have to
00:32:21.780
be that indictable offense that triggers all of these protections? Or should maybe the Crown be able
00:32:27.180
to say, we're going to proceed summarily in this case, because, well, it might be, you know, might be
00:32:33.620
more serious. But we still don't need to, to go that far. You know, life doesn't have to be on the table
00:32:40.820
I want to hear your thoughts on what some have, in my view, unfairly characterized as thought crimes.
00:32:48.580
Look, I don't like the portions of the bill related to the peace bond for fear of committing a hate
00:32:53.280
crime. Yeah. Not my favorite part of the bill. But it's not. First off, the Harper
00:33:02.380
government put this in, and maybe you would know if it's ever been challenged. The Harper government put
00:33:07.000
this in, in one of their last bills in 2015 for fear of committing terrorism. And it's still on the
00:33:15.560
books. So if it's been challenged, it hasn't been successfully challenged. But there is a requirement
00:33:21.960
that if you're going to try and have someone get a peace bond for fear that they will commit a hate
00:33:30.220
crime, you've got to get the attorney general involved. That's a pretty high bar. Yeah. Then you've
00:33:35.080
got to take it to a judge and present the evidence. That's a pretty high bar. There are at least checks
00:33:40.640
and balances on that, that there aren't in other provisions of this bill, like the section 13 that
00:33:46.540
we can get into in a moment. I still don't like it, but do you think that sort of thing would,
00:33:51.680
would face a challenge or would successfully survive a challenge? Has this sort of thing been
00:33:58.140
challenged before, this fear of a crime? I mean, that just leaves me feeling uneasy that you're
00:34:03.260
going to get something for fear of a crime. The thing is that they have to actually bring
00:34:08.140
evidence to say that that fear is reasonable and is supported. So they can't simply, I don't think
00:34:14.180
that they'd be able to get this on the basis of you have posted online and you dislike a particular
00:34:20.520
group. It would have to be something a little more serious than that, because if they start using it to
00:34:25.360
just police speech, then they're going to run into serious charter issues. But we do actually have a
00:34:31.660
number of these provisions that allow for, for instance, a peace bond. If it looks like you're
00:34:36.800
going to get into a fight with somebody, you may never have done so, but they do sometimes
00:34:42.060
impose peace bonds, for instance, when like two neighbors can't get along and they keep getting
00:34:48.300
into situations, but they do need evidence for that. And I don't think we're going to see this used
00:34:54.460
a lot. I suspect we'll see this used actually in situations where people have
00:35:00.520
already committed some sort of hate propaganda offense or hate crime, but it might be difficult
00:35:07.120
for them to lay out the full proof, or where the circumstances might be such that...
00:35:12.380
So let me give you an example and tell me if I'm on the right track or wrong track.
00:35:17.700
Uh, I'm some yahoo, some jerk who every weekend goes out to a church, a synagogue, a mosque,
00:35:24.880
temple, what have you. And I'm always outside with the really nasty slogans. I'm screaming in
00:35:30.780
everyone's face every weekend. Could they eventually just say, you know what? We think he's going to
00:35:36.700
keep coming back. We need to do something. Is that where this provision would come in, or...
00:35:43.480
Now, the thing is, they could probably already use provisions, like just the existing peace bond
00:35:49.680
provisions, depending on what it is that you're saying.
00:35:52.520
Which seems like a real issue with this bill, is that there's a lot of it where it's...
00:36:00.580
Yeah. I mean, one of the things is that they could put like an electronic monitoring device on you,
00:36:04.740
which, um, that is concerning to me because those are actually incredibly expensive and you pay for
00:36:10.080
that if they impose it on you. Um, normally you see those on like bail conditions where somebody's
00:36:16.180
like, I will agree to pay for this because otherwise the alternative is that I will not be
00:36:21.520
let out. I'm concerned that they may put these conditions on people and then like we've imposed
00:36:27.560
it on you and you have to pay for it. That might be very concerning.
00:36:31.080
And so, um, so if you can't pay for it, you've got to stay in jail.
00:36:33.480
Well, they can't really jail you on this because it's a peace bond. It's just like,
00:36:38.880
are people going to end up with these debts that can then be enforced? I'm not sure how that's going
00:36:43.060
to end up working, but the standards for this, I think are going to actually be fairly high. It
00:36:48.400
has to be reasonable grounds for the fear. And usually, like, the main use where you
00:36:53.960
actually see peace bonds is situations where somebody has already been charged with a criminal
00:36:58.920
offense and it gets sort of pled down, even though a peace bond isn't really a plea. So you'll see
00:37:05.080
situations where like two guys get into a bar fight and instead of saddling them with a criminal
00:37:10.220
record, they'll say, how about you guys, you know, how about you just go on a peace bond and stay away
00:37:15.840
from that dude? Maybe get some, uh, maybe get some treatment. You can't both be at Billy's bar on
00:37:21.320
Friday and you got to stay away from each other. Exactly. So here you may see situations where
00:37:27.060
somebody has already committed or likely committed a hate propaganda offense or a hate crime.
00:37:33.440
And instead of going through that full prosecution, you may see them imposing this as a, you know,
00:37:39.300
as a step down. And so it's like, rather than proceeding with charges, I think that'll be the
00:37:44.480
main use that you see for this is somebody who's already committed it. Ian, you are fairly, um,
00:37:51.040
robust in defending civil liberties. And you seem to be saying, uh, could be some problems with
00:37:58.000
this, but don't worry too much about this section. This isn't the part that actually worries me the
00:38:02.800
most. Um, it's largely the material about what they can take offline and what
00:38:11.460
they can, um, you know, one of the concerns I have is actually with the, uh, the intimate images,
00:38:18.120
the revenge porn aspect of things. And I think that that is way too broad. It's very, and I mean,
00:38:25.420
this is one of those things where everyone is like, you shouldn't be doing that. You shouldn't
00:38:29.240
be sharing. I don't know anyone in favor of revenge porn, except that you will actually,
00:38:35.980
if you frame the question right, find everyone in favor of revenge porn. Okay. So what do you think
00:38:40.820
of the Sistine Chapel? Beautiful piece of art, beautiful piece of art. And it's actually revenge
00:38:48.720
porn, believe it or not, uh, at least a portion of it. Uh, one of the Pope's sort of figures, uh, was
00:38:55.200
criticizing the painting of it because he felt that there were too many nudes on it. And so Michelangelo
00:39:01.160
as a bit of a sort of a, you know, take this, uh, amended one of the figures to put that guy's face on
00:39:08.840
it. And then depicted that person being bitten south of the equator by a snake. And that is,
00:39:17.000
that's permanently on there. It's a nude figure. Um, and that was specifically done as a screw this
00:39:24.220
guy. And it's one that now persists throughout history, but it's also a great work of art.
00:39:29.560
Is this going to be something that we go, and we know that this person gave no consent to this.
00:39:34.740
Um, there's actually a lengthy history of using nudity in political commentary,
00:39:41.580
depicting your political opponents as nude, um, typically in unflattering ways. Um, we've seen
00:39:48.780
this with Trump, for instance, there's lots of people who have depicted Trump, um, sort of in the
00:39:55.080
nude and sort of de-emphasize certain, uh, body parts. Well, Trump has not given his consent to that,
00:40:02.280
but it also is very broad. It is, if it is reasonable to suspect that the person had a
00:40:08.360
reasonable expectation of privacy and the person does not consent to the recording being communicated.
00:40:13.340
Well, reasonable to suspect is a very, very low standard. I mean,
00:40:20.400
there's no sort of proof required. And so, for example, if there is, um, any artificially
00:40:27.280
created image that depicts somebody in the nude, how do you know, if you're the provider, how do
00:40:34.300
you know who this person is? There's no requirement to track this person down. And so if it just,
00:40:40.060
if somebody creates an image of nobody who actually exists, it's just an image of a,
00:40:46.720
something that looks like a person, but it's not actually duplicating a person.
00:40:50.780
And with AI right now and the images coming out. Yep. Um, then they would have to take that
00:40:58.720
down, notwithstanding the fact that there is no actual person who looks like that
00:41:03.940
because the organization doesn't know. They would say it's reasonable to suspect that this
00:41:08.900
is a person. And so we get into this real thing of policing content and all of this stuff, basically,
00:41:16.140
they have to respond within 24 hours when they get a notice. And so we may see a lot of, uh, a lot
00:41:23.000
of situations where people are flagging content and it has to be taken down because there's no way
00:41:28.920
that they can determine that it's not actually, you know, intimate content communicated without
00:41:33.880
consent. And that may also apply to things that are not even pornographic because if they have a 24
00:41:39.920
hour turnaround and they get something flagged, it may well be that the response is just to take
00:41:45.440
something down. So for instance, you have your podcast, somebody flags it as containing,
00:41:51.440
you know, whatever material; it might have to be taken offline until somebody looks at it, and
00:41:57.920
they may not want to go through. Like if your podcast is an hour, they may say it's easier for
00:42:03.020
us to just take it down and never repost it rather than go through it. For me, as a content creator,
00:42:10.220
your first 24 hours up online, your first 48 hours up online, and you know this from
00:42:16.320
your very popular YouTube channel, that's when you get most of your views, that's when you get most of
00:42:21.640
your clicks. And so if you get flagged, if you get essentially swatted and it's taken down...
00:42:32.100
That's especially, you know, the graph is basically, it goes straight up and then plateaus
00:42:36.880
in terms of views. You get almost everything within that first 24, 48 hours. And if you have
00:42:42.700
a really good 24 to 48 hours, you can keep going, but if you don't, it's, it's just flat the whole
00:42:48.900
time. Yeah. Oh, absolutely. If you got hit in that time and especially you could also consider
00:42:54.920
what if you happen to get the big scoop? You've got, you've got the story. No one else has,
00:43:00.600
you know, you get a picture of a politician like breaking into his ex-wife's house, you know,
00:43:07.700
something stunning like that. Yeah. You post it online, and, you know, the political party
00:43:14.300
that this targets might have an incentive to make a false report. It gets knocked down for 24 hours.
00:43:20.640
Everyone's talking about it, but you're not getting the, you know, you're not getting the traffic of
00:43:25.900
everyone wanting to come and see your pictures. And by the time you get it back online, those pictures
00:43:31.580
are probably now everywhere. And so they've managed to effectively punish you for getting this report
00:43:36.940
by preventing you from successfully monetizing it. And if you can't make money in this game, then
00:43:44.660
you're out of business. Well, you just keep doing that: if there's a particular news organization
00:43:51.400
that is critical of a particular party, you just keep sniping them every time they try to make money
00:43:57.140
and pretty soon they go out of business. The abuse potential for this becomes very high. Huge. Let me
00:44:03.540
ask you about Section 13 coming back. I was part of the campaign from about '05, '06 until 2011, I think
00:44:12.140
it was, when it was withdrawn, to get rid of Section 13. And for this to come back, it seems like the language is
00:44:22.560
trying to be a bit more nuanced. It's trying to have some parameters around it in ways that it didn't
00:44:29.660
before, but I'm still concerned about the human rights commission and the human rights tribunal
00:44:35.920
being able to fine someone $20,000 based on an anonymous complaint of someone who may or may
00:44:42.120
not have been actually harmed by what was said or done. And then it becomes a money-making
00:44:49.800
opportunity for people. Like we were saying, instead of taking away the monetization ability
00:44:54.580
for a news organization, let's say, you're making it so that someone can make 20 grand a pop every
00:45:01.620
time they complain that Ian Runkle, well, he said something I don't agree with. Ian Runkle said
00:45:06.760
something that's against the human rights code. And I think we will. We've already seen
00:45:12.760
Jessica Yaniv, for instance, filed a bunch of complaints and has not had a whole lot of success
00:45:19.940
at the human rights tribunal, but has filed complaints that have been found to be unfounded.
00:45:26.280
And apparently for, it looks to me like, money-making purposes. And so if we see that kind
00:45:35.100
of practice going on, you may have bad actors who are just like, all right, let's go like sniping
00:45:41.760
these for profit. But you may also see a lot of reporting just from people saying, well, this is a
00:45:48.960
way to tie someone up. Um, I don't like the, I don't like what they're saying. I'll make a complaint.
00:45:55.640
Now they're tied up in this human rights tribunal stuff. It can become very difficult to be any sort
00:46:02.200
of political commentator at that point, if you're always getting tied up in it. I think back to Mark
00:46:08.380
Steyn's famous, um, case with this. One of the complaints about it was, and this is very timely
00:46:16.300
because we're hitting the month of Ramadan, and this joke only makes sense, and it's
00:46:22.240
only funny if you actually understand Islam and Ramadan. And one of Mark's comments was,
00:46:27.940
is it just me or does Ramadan come earlier every year? And of course it does because it comes around
00:46:34.560
every 11 and a half months. Yes, that's funny. But somebody said, no, that's an insult to Islam.
00:46:42.500
No, it's not. It comes around earlier every year. It's a calendar joke. Yes. And, uh, and he was tied
00:46:51.140
up with that for a very long time. Um, and that becomes expensive. And then you start self-censoring.
00:46:59.040
We saw the case of the, uh, Quebecois comedian who made jokes about, um, a disabled child
00:47:07.120
singer, who then ended up having to go to the Supreme Court as to whether or not that was, um, that was
00:47:13.460
a violation. I'm sure that was a ton of money on the line for that process. Um, every time
00:47:23.000
somebody is controversial and I know that there's people who are going to say, oh, this just affects,
00:47:28.000
you know, the people on the other side of politics. So I don't care, because it'll go after the people I
00:47:34.120
don't like. It will, I guarantee you be used against the side that you like as well. So if you
00:47:41.720
are a right winger, you know, think about what happens if somebody is critical of, you know, any
00:47:47.960
sort of sexuality, et cetera. If you're on the left, think about what happens when somebody posts
00:47:53.980
something, uh, in favor of, you know, diversity, equity, and inclusion sorts of things. All of these
00:48:01.460
things could be targeted under these provisions and tie people up. And we may end up with a
00:48:07.920
world where we just can't talk online without people sniping us into court. Apparently the human
00:48:15.580
rights tribunal, by the way, is already fully backed up without these provisions. Um, so all of this is
00:48:22.700
going to cost us a lot more because we're going to have to hire a lot more people to police everything
00:48:26.620
we're saying and doing and thinking. Um, Ian, is this... I don't deny that, either on the child
00:48:36.620
exploitation, the revenge porn side, or on the other side, there are serious issues here, but do
00:48:43.660
you get the sense that the government is trying to show that they take it very seriously by having very
00:48:51.580
serious sentences, very serious fines, that they are using a sledgehammer to take on a fly, when they
00:48:59.680
could have done this in a much more nuanced way, dealt with the issues that are bothering people
00:49:04.660
in a way that's charter compliant, that doesn't lead to the swatting, and that isn't just about
00:49:09.980
performance. I would love to see a much more nuanced, much more targeted bill. And there's no
00:49:18.060
reason why they can't approach this incrementally. They can say, listen, let's go after the most,
00:49:23.220
like the worst stuff that everyone agrees is bad. Uh, cause there are provisions in here that I think
00:49:29.420
are good and provisions that I think are well-intentioned that can be, you know,
00:49:35.960
later fixed in ways that will make them good, but they just want to cover so much stuff that this ends up
00:49:43.500
including everything from really, really bad content to hurt feelings. And it's the hurt feelings
00:49:50.620
area that really can, you know, can cause huge problems and it can end up, the courts may end
00:49:57.680
up having to look at these provisions. And because they're so broad, it can be, listen, we have to throw
00:50:03.460
out the baby with the bathwater because you haven't given us any other choice. I think we do need some
00:50:11.060
additional provisions to deal with things like revenge porn and, you know, targeted deep fakes
00:50:17.160
and these kinds of things, but they have to be done carefully and they have to be done with nuance.
00:50:22.980
And I think this bill needed a lot more study than it got before it was brought forward again. So
00:50:29.420
I have real concerns. All right. Ian Runkle, thank you very much for your time today.
00:50:35.300
Thank you so much for having me. Let us know what you think. Are you concerned about portions
00:50:41.300
of this? Has the government hit the right spot? Do they need to have a bit more study? Do they need to
00:50:46.580
change this bill a little bit? Let us know what you think. Drop a comment, send us an email,
00:50:51.660
reach out and make sure of course that you're sharing this online. Full Comment is a Postmedia
00:50:56.580
podcast. My name is Brian Lilley, your host. This episode was produced by Andre Pru with theme music
00:51:01.700
by Bryce Hall. Kevin Libin is the executive producer. You can listen to or watch Full Comment on
00:51:07.860
YouTube, on Apple Podcasts, Google, Spotify, anywhere that you get your podcasts. We are there
00:51:13.900
and available. Listen through the app or Alexa enabled devices. And of course you can help us out
00:51:18.620
by leaving us a review, giving us a rating, and of course, tell your friends about us, share it on
00:51:23.760
social media where you can. Until next time. Thanks for listening. I'm Brian Lilley.