Juno News - February 11, 2021


PURGED: A panel discussion on Big Tech censorship


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

181.54

Word Count

13,616

Sentence Count

219

Misogynist Sentences

5

Hate Speech Sentences

4
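
The words-per-minute figure above is presumably just the word count divided by the episode duration in minutes. A minimal sketch of that arithmetic (the function name and rounding are our own; the page's extra decimal places suggest a duration a hair over 75:00):

```python
def words_per_minute(word_count: int, duration_seconds: float) -> float:
    """Return the average speaking rate in words per minute."""
    return word_count / (duration_seconds / 60)

# Figures from the stats table: 13,616 words over 1 h 15 min.
wpm = words_per_minute(13_616, 75 * 60)
print(round(wpm, 2))  # ~181.55, close to the page's 181.5394
```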


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 welcome everyone to purged a panel discussion on big tech censorship hosted by true north in
00:00:08.580 partnership with civitas canada my name is andrew lawton from true north your moderator for this
00:00:14.060 evening's very exciting panel i don't think i am speaking in a way that is overstating it because
00:00:19.300 we have a fantastic array of panelists to cover an issue that has become more and more relevant
00:00:24.060 with each passing day especially in recent weeks and months and that is the power that big tech
00:00:29.060 has over the online speech space and sphere and there are many different responses to this even
00:00:35.660 from within what many would argue is the political right from libertarians classical liberals
00:00:40.800 conventional conservatives we've heard free market answers to it we've heard regulatory answers so we
00:00:46.400 want to tackle this from the cultural perspective from the regulatory perspective and also from its
00:00:51.800 impact on people that are trying to engage on social media and utilize these big tech platforms
00:00:57.660 As mentioned, we've got a great group of panelists here. So let's get right to it.
00:01:02.120 Joining us this evening, Robby Soave, who's a senior editor with Reason Magazine and actually
00:01:06.920 has a book coming out later this year called Tech Panic. Kelly Jane Torrance, who's a member
00:01:12.060 of the editorial board at the New York Post, which, as we know, has a very direct reason to be
00:01:17.600 frustrated with the power of big tech companies. And Bruce Pardy, a law professor at Queen's
00:01:22.680 University, and someone who's been very good on online speech and free speech in general.
00:01:27.640 So all of you, thank you so much for tuning in. And to the panelists,
00:01:30.380 thank you very much for joining us this evening.
00:01:34.760 We are going to get right into things here. I want to start with you, Kelly, because we saw
00:01:40.300 during the US election, and I should note that you are a Canadian who's living and working in
00:01:45.060 the US. So you've seen both sides of the border at work on this. The New York Post had a bombshell
00:01:49.840 story that in any other year would have been the story of the election cycle on hunter biden and
00:01:54.580 content found on his laptop but very swiftly this was blocked by twitter and facebook the new york
00:02:01.440 post account was taken offline so i'm very curious what's your perspective on this i mean where is
00:02:05.800 the lay of the land now for big tech censorship and and what are your thoughts on how we respond
00:02:10.000 to it i haven't muted myself now uh it's a great question and a big question of course
00:02:19.640 which is, you know, why the four of us can have an evening's conversation about this.
00:02:25.440 But yeah, really, what happened to the New York Post during the very tail end of the election campaign
00:02:32.800 really drove home for me what these tech companies are up to and why we should be worried about it.
00:02:39.540 So, you know, the New York Post, some reporters got a copy of a laptop that Hunter Biden had left
00:02:46.240 at a computer repair shop in Delaware and then apparently forgot about. Some people thought that
00:02:52.440 that was ridiculous. Why would the son of the former vice president do something so silly?
00:02:59.180 Well, Hunter Biden also left a crack pipe in a rental car once after he returned it. So this
00:03:05.720 isn't entirely out of character for him. And the idea that we were not, that we were publishing
00:03:12.700 something that was so suspicious that nobody could believe it i think was was just ridiculous and
00:03:19.420 what really struck me was you know soon after that story was published uh one of the executives
00:03:25.740 at facebook actually went on twitter um actually bragging about how he was actually suppressing
00:03:32.620 the story on facebook uh you know they're not even hiding what they're doing they're trying
00:03:38.620 to suppress a story about one of the major candidates for election and they're bragging
00:03:43.660 about it it was incredible and soon after that of course as you mentioned um twitter not only uh
00:03:50.220 made it impossible for us to share the story it actually locked the new york post twitter account
00:03:55.580 and that account was locked for weeks um i am very proud we did not back down twitter demanded
00:04:02.460 that we delete our posts about the story uh keep in mind that twitter eventually decided that it
00:04:08.300 had made a mistake in suppressing our story it first said that it has a policy against distributing
00:04:15.820 hacked materials well the materials weren't hacked and in fact hunter biden had never claimed that
00:04:21.660 he'd been hacked uh and i will say hunter biden and joe biden neither of them has ever said
00:04:28.380 anything in our story was false they have said it's a smear job it's russian disinformation
00:04:34.300 but they have never actually denied the accuracy of the emails from the laptop that we published,
00:04:40.860 which showed that apparently Hunter Biden had introduced a Burisma executive, someone at that
00:04:47.180 Ukrainian gas company that was paying Hunter so much money, that he'd introduced them to his
00:04:52.060 father while Joe was vice president. And there was a lot of other information in there, questions
00:04:57.260 about whether Joe Biden was the big guy for whom they were saving a stake in a Chinese company.
00:05:02.620 lots of lots of great fodder there for investigative journalists but journalists uh did not want to
00:05:09.280 investigate in fact they didn't even want our story coming out npr actually had an article about
00:05:14.960 why they weren't reporting our story and they said it was because it was a waste of time uh it was
00:05:20.840 really i mean it was just quite something andrew i i have to tell you and and so um we refused to
00:05:26.340 delete um the the uh tweets and we finally won it took weeks but twitter finally gave us our
00:05:34.060 account back and uh within days we actually had far more followers than we had before so in a way
00:05:39.380 it was it was really a triumph um for us and it's only after the election do you now see other media
00:05:45.640 outlets reporting on this story uh after joe biden won we suddenly saw some people looking into this
00:05:52.320 And I think everyone knows now that the stuff we published was completely accurate, raises
00:05:57.600 a lot of questions.
00:05:59.340 You know, we even have, we've had Joe Biden's brother putting out an ad on Inauguration
00:06:03.360 Day, touting his connections to the new president in trying to drum up some business for himself.
00:06:10.520 I mean, they're not even hiding it now, it seems like, but they don't have to because
00:06:15.480 Joe Biden got elected.
00:06:17.680 And I think the media and social media helped him do that.
00:06:21.320 And so that's, I mean, that's one of the interesting questions. You know, in journalism, when the internet started, it was really an option for people to sort of hear other voices besides those in mainstream media, which of course we've been complaining for years, and I think rightly so, that they have, they tend to have a liberal bias.
00:06:40.820 Well, the internet made it possible to hear from more voices. You could search people out, blogs, and then social media came along. And it was a powerful, I think, platform for outsiders, people who don't share the views of the elites.
00:06:56.200 And in fact, social media has now gotten so popular and so powerful, it's able to actually
00:07:02.540 suppress those voices now.
00:07:04.220 So it's really quite a change from how we first saw those outlets, those platforms develop.
00:07:13.440 And so, you know, the question I think we have as classical liberals, that's how I describe
00:07:18.640 myself, and whether you're a conservative libertarian, you think that private companies
00:07:23.520 have the right to do their business as they see fit and i certainly think that twitter has the
00:07:29.600 right to suppress the new york post and others if it wants to uh you know i lost thousands of
00:07:34.800 followers when it uh you know after it banned donald trump and began um banning and suspending
00:07:41.840 a lot of conservative accounts but that doesn't mean i think it's wise and i don't and i do think
00:07:46.320 it's something we need to be concerned about what can we do we we we sort of don't want the heavy
00:07:51.120 handed uh government uh regulation on it but at the same time is this a case of collusion
00:07:57.440 for example uh you know parler it's interesting that apple google amazon all acted almost at the
00:08:04.560 same time against this company you might ask is there is there collusion i'm not even sure it
00:08:09.840 needed to be behind the scenes i think uh you know once michelle obama for example called for
00:08:15.200 twitter to ban trump it happened very quickly and you know all the um outlets you know facebook
00:08:22.720 twitter or youtube uh moved right away to do that and i you know it's it's not even going to serve
00:08:28.800 the goals i think of of what these um you know liberal institutions want uh you know it's much
00:08:35.360 easier to get people to uh reject their bad ideas if they're forced to defend them uh what we're
00:08:42.480 seeing is they want to put you know right-wing voices I think in some ways into a sort of ghetto
00:08:48.240 where you're only surrounded by like-minded people that's a terrible idea if we think that
00:08:53.420 there's some ideas out there that are very bad the best thing to do is to bring them out in the open
00:08:58.060 make people defend them you know I've had family members send me things that I think they heard
00:09:04.040 from you know probably started with QAnon and asked me hey is this true and I was able to point
00:09:09.260 out to them no it's not and this is why um so you know hiding that putting it into a ghetto
00:09:16.860 i think is a very bad idea and of course you know with parler investigators actually used data
00:09:22.620 before it was shut down to find some of the rioters that were causing trouble at the capitol
00:09:28.460 so you know there's just so many reasons not to keep this stuff closed down um but what's
00:09:34.140 interesting is that you know jack dorsey the ceo of twitter actually had an interesting thread in
00:09:40.300 which uh he actually sounded like he did not want to ban donald trump from the platform but what
00:09:46.140 happened was younger workers at the company sort of pressured him and demanded it and we're seeing
00:09:51.420 that uh across all kinds of institutions these days the new york times i think is a perfect
00:09:56.300 example uh it just claimed another head um you know star science reporter excuse me donald mcneil
00:10:03.700 was uh basically forced out because the younger staff didn't think he'd been disciplined enough and
00:10:09.800 we're seeing this in in all these cases you know the employees of google are are um you know upset
00:10:14.960 about certain things the company does so it's it's it's we're seeing sort of a transition from
00:10:19.980 one old guard to a newer guard and it's amazing how much power the younger people have even over
00:10:25.580 the CEO of their own company um and um you know I think my time's almost up so I'll try to wrap
00:10:33.520 up here but um again the question about what to do with it I find it interesting that Facebook
00:10:38.080 is actually urging uh governments in the U.S. and Canada to regulate uh of course they would
00:10:44.880 prefer if everybody was on the same uh playing field and they feel like hey we're trying to
00:10:50.740 regulate some content it'd be much easier if everybody was forced to do that and Andrew actually
00:10:55.180 sent a very interesting article right before this event to us um facebook is is going to
00:11:01.340 um have less political content uh coming up on your feed they're going to change their
00:11:06.540 algorithms they're testing this in i think three countries including canada and they hope to do
00:11:11.260 this in the united states and of course the question about what is political content who
00:11:16.780 decides even that um you know i go to this knitting retreat uh don't laugh i go to a
00:11:21.740 knitting retreat every year and each year because of these you know divided times in the united
00:11:27.740 states the organizer says hey let's not have any political discussions here let's let's you know
00:11:33.020 keep this non-political we don't want to get into any arguments and then the very next day once for
00:11:38.220 example uh she started talking about how new york had banned fracking and how this was a great thing
00:11:43.420 well to her that wasn't a political issue uh to her it was common sense of course you'd want to
00:11:48.300 to ban fracking well i think those of us who know that uh you know fracking is a much much better
00:11:54.200 way of getting energy and using energy than of course coal which is really dirty we would have
00:12:00.000 many reasons to disagree with her but i think that shows you that you know once you let people start
00:12:05.440 deciding what can be said what can't and even what's political and what's not um you're going
00:12:11.520 to have problems and i will say that uh the article andrew sent uh facebook says it's going
00:12:16.860 to exempt certain government uh and and big organizations like the world health organization
00:12:23.740 well can you think right now of a more political organization uh than the world health organization
00:12:29.180 uh you know everyone from the new york times to the associated press has shown how the WHO helped
00:12:36.300 china cover up what happened with the coronavirus and the WHO then went to china and claims that oh
00:12:42.460 it's exactly like China said, it came from this live market, nothing to do with the lab. I mean,
00:12:48.420 it's incredible. But, you know, because the liberal elites who run Facebook see, you know,
00:12:54.580 UN organizations as apolitical, and in fact, just right, they deem that non-political. So it's a lot
00:13:02.120 of, you know, tough questions. And I look forward to seeing what our other panelists have to say.
00:13:06.760 Thank you very much, Kelly Jane Torrance. And I should say to our viewers,
00:13:10.120 that was not a really long answer to a question; we were giving all the panelists opening statements
00:13:14.920 so don't worry and i'm glad you covered so much ground there uh kelly because this is a huge issue
00:13:19.700 and i think you've done a very good job at explaining uh what can happen in a very real
00:13:24.080 and very recent way when tech companies do decide for whatever reason that they don't think something
00:13:29.060 should be on their platform and i will tell all the viewers if you want to weigh in on this
00:13:33.460 discussion on the big tech platforms about which we're speaking the hashtag for this panel
00:13:37.440 is purged panel. So you can chat away on Twitter about that if you'd like. I want to turn to you,
00:13:44.560 Robbie, because this is an issue you've been covering at great length. And I know you get
00:13:48.100 a lot of resistance from the right from your perspective as well. But I'm very curious how
00:13:53.140 you approach this issue. Sure. So I agree with a lot of what Kelly just had to say. Obviously,
00:13:58.060 the decisions taken regarding the New York Post story are impossible to defend and should
00:14:03.180 not be defended and i have a lot of criticism of the various moderation decisions that facebook and
00:14:08.860 twitter have made over the years and of course recently um but the issue becomes what to do
00:14:14.620 about it and there aren't really any good answers as kelly explained these are private companies
00:14:20.220 we're talking about you know when they take down your post well maybe it's
00:14:25.340 unfair but they don't have to be fair they're private companies and uh and and they find
00:14:29.740 themselves in a very difficult position because on the at least in an american context the progressive
00:14:36.620 left democrats broadly want to aggressively moderate facebook twitter other social media
00:14:42.380 companies for not doing enough content moderation they think facebook twitter etc have allowed
00:14:48.620 disinformation and conspiracy theories and covid denialism and q anon organizing and hate speech
00:14:55.900 and all this other nasty content that the platforms are responsible for violence and
00:15:01.740 conspiracy theories etc and that their failure to take more aggressive moderating actions means
00:15:08.540 that they should be regulated that's what the left says that's what explicitly elizabeth warren and
00:15:13.340 joe biden have said elizabeth warren wants to get rid of section 230 which is liability protection
00:15:18.140 for the platforms etc so that's the left the right uh also wants to get rid of section 230 liability
00:15:24.300 protections for the opposite reason they think facebook twitter etc have done too much moderating
00:15:29.820 have taken down too much content that they would prefer to be left up um so even if you
00:15:36.300 don't like a lot of what facebook and twitter have done and i frequently find myself in that place
00:15:41.100 i have to feel a little sorry for mark zuckerberg jack dorsey etc that you have a hundred um senators
00:15:47.340 potential regulators who who basically all want you to be regulated for conflicting reasons so
00:15:52.780 there's no way they can appease this because they're
00:16:02.380 opposite uh the left wants more censorship the right wants less censorship being not exactly
00:16:02.380 the operative word because we're not talking about government doing it although we actually are talking
00:16:05.580 about government doing it when the senate is saying we want to regulate you unless you do xyz speech
00:16:10.460 decision um so anyway it prompts a lot of interesting problems um i think
00:16:20.140 uh broadly speaking that additional regulation of social media uh would be a disaster for the
00:16:26.300 political right because facebook has in particular has allowed for so much conservative content to
00:16:34.860 flourish um if you look at any given week you will find that ben shapiro dan bongino breitbart
00:16:41.260 the daily wire fox news other conservative news outlets are routinely the top articles on facebook
00:16:47.420 that's probably one of the reasons elizabeth warren wants to regulate facebook out of existence
00:16:50.780 because she sees it as a powerful engine for conservative speech so even when they make a
00:16:55.660 decision that i think is as stupid as what happened with the new york post article i still
00:17:01.260 think well i would you know if it was left to the mainstream media's own devices to cover that story
00:17:06.540 right it would have been even uh greater censorship if you want to use that word so every time you
00:17:11.900 know the new york times and again kelly referenced some of these uh decisions that the mainstream
00:17:16.700 media has made but the new york times has essentially vowed never to run an opinion
00:17:21.020 piece from a republican senator ever again because of the blowback they received from their own from
00:17:25.660 their own staff so i don't want to return to a world where they are the gatekeepers of information
00:17:31.180 because i think that's even more unfriendly uh to conservative voices in particular the idea of
00:17:37.100 getting rid of the liability protection that's section 230 which is this american law that says
00:17:42.460 that facebook is not responsible for the content so if i post a a libelous facebook comment or post
00:17:49.980 you can sue me but you can't sue facebook that's because of section 230. a concern a lot of
00:17:55.260 conservatives want to get rid of it because they say that's this unfair special perk and if facebook
00:17:59.660 is making moderating decisions then it's like a publisher like simon and schuster or the new york
00:18:04.140 post or reason magazine you could sue all of those entities if someone not and not just the person
00:18:09.580 but the entity for libel you can't sue facebook does facebook really deserve that protection
00:18:14.540 maybe it doesn't but in a world where facebook doesn't have that protection i think again they
00:18:19.340 will just their their fallback then would be to censor so much more content to protect themselves
00:18:25.020 from light from liability and it it is not obvious to me that that would benefit conservative speech
00:18:30.700 in fact it seems very likely to backfire on conservative speech so i'm really not persuaded
00:18:35.100 so even where i agree with some of the criticisms of social media i am so unpersuaded by the
00:18:39.900 solutions um which all which the solutions also could run into their own first amendment problems
00:18:45.500 in some sense it's the government saying we will make contingent
00:18:51.180 this benefit to your company on you making some speech or ideology decision we agree with
00:18:57.900 i i think it's possible at least our supreme court could have a huge problem with that our supreme
00:19:02.860 court currently is like wildly in favor of of free speech um in a way that i think they might
00:19:09.740 scrutinize some of that so it's uh it's a tough issue obviously when you have issues like you had
00:19:15.740 with the new york post story with faith with not so much the leadership of these social media
00:19:20.060 companies but their very liberal very progressive employees uh who often
00:19:24.860 were staffers for democratic politicians before they came to facebook and twitter um demanding
00:19:30.060 this kind of behavior uh it's not ideal but again but and and to the extent these arguments rely on
00:19:37.980 so so like i don't love uh what happened to parler i think a lot of the organizing of
00:19:45.260 the violence of the capitol riots was also done on facebook and twitter so why is parler being
00:19:49.740 held accountable when they're not um at the same time there's certain uh liability that the
00:19:56.860 companies have i read an interview with the parler ceo it was in the washington
00:20:01.660 post where he didn't seem to understand that 230 does not protect you from having to take
00:20:07.500 uh action against for instance like child pornography um the platforms are obligated to
00:20:13.020 take down sex trafficking content some other kinds of content under existing federal law
00:20:17.420 and he made some comment like well i don't know if that's on our platform because i'm not looking
00:20:20.940 for it that's not how it works you have to take it down uh so to the extent that like amazon and
00:20:26.300 apple de-platformed parler because they are afraid of the liability risk
00:20:33.180 they would incur by hosting some service that isn't complying with these federal laws because
00:20:37.020 it took a position that might be might be admirable or might be the position you'd want these companies
00:20:41.820 to take but actually is not allowed under existing federal law so they so these issues tend to get
00:20:46.540 thornier uh thornier than they they seem so i'll i'll i'll leave it at that those are some of my
00:20:51.900 my opening thoughts on these issues thank you very much for that robby soave and one of the
00:20:58.540 notable examples of what happens when government does decide to get involved in this discussion is
00:21:03.660 actually happening in canada right now if you're an american this will just reinforce your worst
00:21:08.620 uh stereotypical thoughts about canada and if you're a canadian well you've probably heard me
00:21:12.940 yammer on about this a fair bit and that is when a government starts to look at deputizing
00:21:18.220 social media companies to censor content which is the crux of a proposed law it hasn't actually
00:21:24.360 been tabled yet from the liberal government in Canada that will fine social media companies if
00:21:29.920 they don't remove from their platforms what the government determines is hate speech which may
00:21:34.140 actually be defined in a manner that is distinct from the existing criminal threshold for hate
00:21:39.680 speech so a level of content that the government would be telling social media companies they have
00:21:44.440 to take off. One person who's been very good on these issues and a lot of other free speech issues
00:21:49.560 in Canada is Bruce Pardy, Queen's University law professor. Bruce, what's your assessment of where
00:21:55.540 things are on this issue? Andrew, this is a bad news story all around. Philosophically, I'm not
00:22:03.740 in disagreement with Kelly and Robbie on this, but I think we have to take a step back. One of the
00:22:09.640 weaknesses of the right, if I may put it this way, is our disbelief. We tend to be naive and try not
00:22:21.080 to see what's happening right in front of our eyes. But as Kelly said, they're not even hiding
00:22:26.640 it anymore. And there is, as the title suggests, a purge going on. And it's not censorship,
00:22:34.740 as Robbie points out, in the strict legal or constitutional sense. It's not a violation of
00:22:39.700 the Constitution because it's being done by a private company. But the problem is that they
00:22:45.980 are essentially an oligopoly, and that in this sense, big tech and governments are aligned.
00:22:56.220 And if anything, as you point out with the proposed legislation from the Heritage Minister,
00:23:00.940 when the question of regulation comes up it's that governments want these tech companies to
00:23:07.080 do more of this not less and so i would be inclined i mean so let's put it this way
00:23:13.860 here's the situation that we have we have a set of private companies which are private
00:23:19.140 and in an ideal world i would agree with robbie that private companies should do what they like
00:23:24.880 and the less regulation on them the better but that's not actually what we have
00:23:30.200 Right now, we have private companies of all kinds who are restricted in what they do by all kinds of laws, including human rights laws that prohibit discrimination.
00:23:40.960 So when you go to the baker, the baker has to serve you no matter who you are or what you think.
00:23:49.560 But the big tech companies can censor you if you're a conservative or a libertarian.
00:23:54.240 and what we have on the table is not getting rid of regulation so that private companies can be
00:24:02.200 private that's just not going to happen and so in this real world if you ask me whether or not
00:24:09.120 it would be a good idea to regulate big tech companies so as to restrict them from censoring
00:24:15.720 just right-wing voices i would say well that's a fair thing because right now
00:24:21.600 um left-wing causes and and identities are all protected and the right-wing identities and
00:24:30.740 causes are not and so if you want to play apples and apples that kind of restriction should apply
00:24:37.140 to everybody my preference would be to get rid of all of it but that's not on the table
00:24:41.800 problem is this that's not going to happen either as i say big tech and and governments
00:24:50.500 especially in canada but i think even in the states are aligned on this and it's the governments that
00:24:56.260 are pushing the tech companies to do more of what they're doing and they're they're not pretending
00:25:03.620 anymore it's not subtle it's out in the open and the the only solution here is this is going to get
00:25:12.900 worse it's going to spread it's i mean people have suggested it's going to next go to your email
00:25:19.380 or to your phone use, your texts.
00:25:22.260 So imagine this.
00:25:23.960 Imagine you have a phone with Rogers or TELUS or Bell.
00:25:27.620 And they start to say, well, yeah, you can use your phone, of course.
00:25:30.260 But you can't send messages over your phone that contain misinformation.
00:25:38.640 And what they mean by misinformation in the current context is information that goes against
00:25:45.400 the politically current view of the truth.
00:25:49.380 And that means the phone company is going to start to watch over you and censor you and restrict your ability.
00:25:56.220 Just like it's possible. I mean, who knows where this could go?
00:26:01.200 Will you not be able to get on a plane with Air Canada if you have the wrong kinds of views?
00:26:06.420 So it's all very well to say, oh, these are private companies.
00:26:09.680 But the problem is we don't have a free market.
00:26:12.860 We don't have a market in which private companies are private.
00:26:15.640 You'll note this pattern.
00:26:16.740 When private companies are encroaching upon an interest of the left, a private company is called quasi-public.
00:26:28.060 But when a private company is encroaching upon an interest of the right, a private company is private and can do whatever the heck it likes.
00:26:36.140 That's not even-handed.
00:26:38.020 You've got to have the same thing happening to both sides.
00:26:41.780 And let's not mince words.
00:26:43.740 This is a question of two sides.
00:26:46.320 We're polarized.
00:26:48.040 All these calls of unity have only come up once the proper side is now in control.
00:26:55.220 So you're not going to get regulation from governments right now that fix this problem.
00:27:01.120 You're not going to get big tech backing down.
00:27:03.240 This is going to get worse.
00:27:04.900 And it could very well spread, as I said.
00:27:08.220 I'm afraid the only possible solution is a very difficult one, which is that those people who think this is wrong,
00:27:14.220 are going to have to build a parallel system,
00:27:19.260 a parallel set of companies,
00:27:21.120 parallel media, parallel platforms.
00:27:24.640 And, you know, myself, I don't know how to do that.
00:27:27.960 But, you know, and we've made a start,
00:27:30.780 there's Parler, we have True North.
00:27:33.300 We need more of that.
00:27:34.300 We need more of the alternative media
00:27:37.540 and big technological companies
00:27:40.080 that will provide access to people
00:27:43.740 who the left are now saying are preaching misinformation and making the internet an
00:27:51.040 unsafe place which is total nonsense but we're we're in a dark place this is a bad corner to be
00:27:58.980 in and the way out is not obvious it's not going to be legal i mean i'm not saying that they that
00:28:05.560 there wouldn't be a legal solution if you had governments interested in one but i can't think
00:28:10.740 of any political party in power in canada right now with the spine to do this kind of work
00:28:19.220 so i think we have a problem thank you very much bruce pardy for that you actually i think
00:28:27.140 touched on something that makes a great launching point as we move into the more open discussion
00:28:31.460 part of this panel which is the parallel societies angle because one of the greatest criticisms that
00:28:37.060 I think was directed to people on the right that were taking aim at conservatives for going after
00:28:43.140 Facebook and Twitter was, well, if you don't like it, build your own. And listen, I mean, I'm more
00:28:47.620 of a libertarian on a lot of things. I love build your own, where, okay, if you don't like Facebook,
00:28:51.780 build your own Facebook. And Parler did that. And then we saw in the last couple of, well,
00:28:56.980 the last month and a half, the breakdown of that, of how many layers of things do you need to build
00:29:01.780 your own of? Do you need to build your own cell phone so that you can have your own app store?
00:29:06.020 do you need to build your own internet so that you don't have hosting companies like amazon web
00:29:10.340 services do you need to build your like how many things do you need to build your own of for that
00:29:14.660 to work and then are we any better off as a society if we just have you know the the liberal
00:29:19.780 internet and the conservative internet for lack of a better term i'll go to you on this one first
00:29:24.580 robbie i mean what do you think of that argument is it is it practical and is it a good idea to
00:29:29.780 push that segregation if you will yeah build your own is clearly not a satisfying um actual
00:29:38.500 answer to this problem as because you can i mean you can see what happens to parler so i never
00:29:43.300 say that um because it doesn't work there's too much um uh uh it is too hard to do now i i don't
00:29:51.860 think uh i don't believe that the current tech companies the current giants we have are so uh
00:29:58.580 powerful and absolute that they could never be challenged or they could never have some rival
00:30:02.820 come along um if you think that it's naive thinking because every step of the way if you
00:30:08.420 look at the top you know 10 internet companies they have changed over and over again year after
00:30:13.300 year sure google facebook etc look pretty dominant now but you know when i was a teenager myspace
00:30:19.220 looked utterly dominant and then what happened myspace um myspace versus facebook and myspace
00:30:25.060 just made a bunch of kind of dumb like decisions for what its site looks like it's like got really
00:30:29.700 buggy with ads it's like no we're just going to focus on music facebook said let's branch
00:30:33.860 out beyond college let's try to sign everybody up for facebook so truly for reasons that only
00:30:39.540 have to do with business decisions myspace crashed and facebook took off you can't convince me that
00:30:44.980 that couldn't happen again uh with some company to come along yes they face they'll they'll face
00:30:49.700 challenges from these other companies but importantly when people talk about sort of like
00:30:53.620 monopolies and antitrust law so there's no at least in u.s law there's no foundation to punish
00:31:00.900 facebook or google or amazon for not wanting to play nice with its competitors right
00:31:06.500 the foundation of antitrust is some harm to the consumer and thus far you can't really
00:31:11.300 demonstrate a harm you can demonstrate i guess harm to individual entities that think they've
00:31:16.020 been wrongly moderated or censored but you can but the the consumer overall is is not is not
00:31:22.900 like losing out it's because these are free products right the concern with monopolies
00:31:27.860 is you know if all if standard oil has all the oil then they can raise the price of oil and
00:31:31.940 everybody needs it but facebook isn't charging you for the pictures website
00:31:37.780 it has right and it's not even that important if you actually if you don't want to be on it
00:31:42.420 so the harm to the consumer has not which is the entire basis of this category of law at least in
00:31:47.140 the u.s has not been stated and thus i think antitrust challenges to these kinds of companies
00:31:52.740 will end up as a practical matter even if you don't like the companies will end up falling
00:31:56.740 completely flat i want to go to kelly next but i want to add something to the discussion based on
00:32:02.420 what robbie just mentioned which is that a lot of media companies in particular have become very
00:32:08.500 reliant on facebook and twitter to get their content out i mean we know that print subscriptions
00:32:14.500 are down. We know that cable viewership has its demographics, of course, but a lot of media
00:32:20.520 companies have built their business models around being able to Facebook and tweet links. And in
00:32:24.820 that sense, they are reliant. So I'm not sure if it is as insignificant based on some business
00:32:30.520 models. As Robbie is saying, I think these services are generally in our individual lives.
00:32:36.020 What's your thought on that, on that or on the parallel societies aspect?
00:32:39.640 Yeah, I think you bring up a very good point, Andrew. And, you know, I tend to agree with Robbie that I, you know, the less regulation, the better. But Bruce made some good points about this is not necessarily a free market right now.
00:32:53.140 You know, how many Americans have a Twitter account? Not the majority. But think about it. Since Donald Trump was banned from Twitter and Facebook, have you heard much from him lately?
00:33:06.540 The only thing I can think of that I've really heard Trump in his own words was that letter he sent to the Screen Actors Guild, you know, resigning his membership, which was hilarious.
00:33:17.620 and if you haven't read it you need to um but that's the only thing i can think of since he's
00:33:22.660 been banned from social media that we heard trump in his own words he really used social media and
00:33:28.900 again i mean everybody knows of course that he was uh the ultimate politician who learned how
00:33:33.780 to use social media to get his words out and to speak directly to the american people because he
00:33:40.260 felt and i think he had uh some right to think so in many cases that the media were not treating him
00:33:46.740 fairly and we're not uh representing him fairly uh you know yes he's he's out of office now but
00:33:53.620 uh you know i think it's strange how little uh we've heard from him his own words and i do think
00:33:58.020 that you know while not every american of course has a twitter or facebook account that is a way
00:34:04.260 that a lot of people uh find out about what's going on and of course you know we also had
00:34:09.300 mainstream media reporting what he said on facebook and twitter so um you know there's
00:34:15.220 there's some questions there i think and you know i think that i was also going to mention you know
00:34:20.180 robbie had mentioned um you know that a lot of the top uh you know uh users on facebook that are
00:34:26.740 getting a ton of hits are conservatives well i think that's because uh there's there's not a lot
00:34:32.580 of other ways to get those ideas out there uh your average media is not letting conservative voices
00:34:38.500 be heard that much and so they're turning to facebook and of course people are sharing these
00:34:43.300 things because they're not seeing them uh on the nightly news in the newspapers um and sorry what
00:34:49.060 was what was the other question well i think you answered very effectively the other aspect was the
00:34:54.820 just the split world of the liberal internet conservative internet liberal apps conservative
00:34:59.380 apps that sort of one quick thing on that too which yeah i mean i think it's a bad idea
00:35:03.620 you know i i follow liberals progressives conservatives libertarians classical liberals
00:35:08.980 on twitter because i like to know what everybody's saying um you know i uh i went to i did some some
00:35:14.340 grad studies in philosophy and my my guy was karl popper who you know taught that this is how we
00:35:20.500 learn is by putting ideas out there and getting feedback on them john stuart mill also a great
00:35:27.700 proponent of free speech who understood this so i think i think it would be bad for all of us
00:35:33.220 left right middle if we only talk to people who agreed with us and even on the right as we know
00:35:38.820 i mean i don't i don't know a lot of people who agree with me a hundred percent on every issue uh
00:35:43.780 you know that would be awfully hard to find but what's scary you know and this is sort of a point
00:35:48.660 that um i think bruce alluded to is people using uh people really trying to shut down entire uh
00:35:57.700 platforms. I mean, you have CNN's Oliver Darcy and Brian Stelter who are musing about, well,
00:36:04.640 can we get Comcast and other cable companies to quit carrying things like Fox News because they
00:36:10.940 are lying to their viewers? I mean, it's incredible to me that hardly anyone seems to call them on
00:36:16.540 that. They're actually talking about getting their competitors off the air by going sort of above
00:36:23.240 them to the companies that you know put them on your tv at night um and you know that's that's
00:36:29.400 that's dangerous i think and again you know as as you guys have i think all mentioned we
00:36:35.080 we love that build your own thing but um you know some of these things are are expensive and
00:36:40.680 take expensive um infrastructure that it's not going to be easy to come up with our own immediately
00:36:46.760 Yes, and I would agree, I think generally, that the ideal scenario here is that culturally in society, everyone supports free speech, supports open debate, that people aren't lobbying web hosts and email servers to cancel their relationship with clients of political stripes.
00:37:06.400 And that isn't happening.
00:37:07.560 And I guess the problem is when you know that the utopia we're all chasing after isn't there, do you need to go further than that?
00:37:13.640 And I wanted to go back to something you mentioned in your opening statement there, Bruce, about the infamous Christian Baker cases.
00:37:21.040 And I know that this is something that differs in Canada from much of the United States.
00:37:25.980 But even so, I find that a lot of the people who are on the right, who defend the right of the Christian Baker to not bake a gay wedding cake and similar cases, are on the other side of this in a big tech context.
00:37:39.180 And they actually don't support the right of big tech companies making very similar determinations.
00:37:45.560 And I'm wondering how you rationalize those two seemingly contradictory positions that we see in a lot of the discourse around these platforms.
00:37:53.460 I don't know. I wondered the same question myself.
00:37:57.560 It's a it's a it's a bit of a puzzle to me.
00:38:00.820 there seems to be an inclination on the part of some to say, well, anti-discrimination laws are
00:38:06.540 fine, they're good, they're reasonable, we should do that. And yet, when it comes to this kind of
00:38:11.380 discrimination, they think, oh, you can't touch that, it's a private company. I don't understand
00:38:16.080 that. Yeah, and sorry, to your point there, we do see it both. The left does the opposite
00:38:21.420 juxtaposition as well, for sure. Sure, sure. But so, under our human rights codes,
00:38:27.280 discrimination is prohibited on the basis of a number of grounds in some of those codes
00:38:33.040 you have the word and along with you know religion and sex and race and so on you have the word
00:38:38.320 creed now some of us had argued that creed should include political opinion but that's not the way
00:38:44.960 it's been interpreted so far but so right now for example there's a section in the bc human rights
00:38:50.880 code that prohibits you publishing a statement that discriminates on any of the grounds listed
00:38:57.660 there. So you can't discriminate, you can't publish a statement that discriminates against
00:39:03.260 a transgender person or a person of a certain race, but you can publish a statement that
00:39:08.900 discriminates against the political right. And so that seems to me to be unfair. Why wouldn't you
00:39:15.220 have a level playing field for all of the various causes and identities that that everybody
00:39:20.920 embraces i would prefer not to have that section at all i think that's a terrible infringement of
00:39:26.680 free speech terrible but the political climate is such that you're not about to get rid of it
00:39:32.600 so the only case left to make is look this is this is not fair you're protecting some and not others
00:39:40.000 so you know you try to even it up but even that is going to be a problem
00:39:44.220 to to the question about whether or not this is a good thing to get silos of left and right
00:39:50.640 i mean i agree with you it's a terrible idea a terrible idea that's not the way things should
00:39:54.940 be but but let's let's not be naive the the direction that we're heading is between having
00:40:03.280 two silos or having one that is left. And that's all. That's what I mean by disbelieving. Listen,
00:40:13.480 the left really means it. And one of our problems is that we don't believe what they have said
00:40:20.820 out loud. Very, very well said. Thank you, Bruce. This is a question that touches on
00:40:31.160 the New York Post story and also Section 230. So I don't know if Robbie or Kelly wants to jump in
00:40:37.300 on this one first. And it tends to cut to the selective nature of some platforms using Section
00:40:44.480 230. And I should note that this is a question from a Civitas member or a True North insider who
00:40:50.120 are all invited to submit questions in advance here. And it's that if Section 230 provides
00:40:55.180 protection for social media platforms, they're not publishers, they're not editors, they can't
00:40:59.260 held accountable for content why would they intervene to get rid of the new york post hunter
00:41:04.620 biden story knowing they're protected and making that argument quite fervently in other contexts
00:41:10.220 that they are protected and don't bear right so i guess the question underlying that is why are
00:41:14.700 these companies so selective about when they want to be a platform and when they want to be
00:41:18.700 a publisher that's actively moderating i don't know who wants to take that one first can i just
00:41:23.420 real quick so there's one there's a common and this didn't quite get to it but there's a common
00:41:29.260 misconception about section 230 that i see a lot of people on the right fall into what they say
00:41:35.020 what what they people falsely claim is that section 230 says well you're either a publisher or a
00:41:40.140 platform if you start making editorial decisions if you start moderating content in an unfair way
00:41:46.300 then you are not a a platform a neutral if you stop being a neutral platform you're a publisher
00:41:51.660 and you're no longer entitled to section 230 protection i hear that argument all the time
00:41:55.260 it's totally wrong because section 230 actually says even if they make moderation decisions
00:42:01.980 they are still to be treated as publishers so section 230 was specifically written when it was
00:42:08.700 written in the 90s to let social media companies engage in moderation without then taking on
00:42:16.140 the designation of being publishers who are subjected to liability if they don't moderate
00:42:20.460 everything totally correctly so that's why they now you could say that law is bad and should be
00:42:24.700 changed we should do that but under existing law they can they act like under the way it is written
00:42:30.060 they can take down stuff for whatever reason and they're they're not violating the law the law says
00:42:34.860 they can take down stuff for whatever reason it was initially to deal with if you know these are
00:42:39.100 social media platforms going back to the early 90s that were like compuserve online forums
00:42:44.140 these kinds of things where there would be hate speech libelous comments maybe there's pornography
00:42:49.260 maybe there's whatever and the platforms were like can we take this down if we don't if we take this
00:42:53.980 down then are we going to be treated like we're the new york times or simon & schuster or something
00:42:58.620 and so so the law was designed to give them the protection to take that stuff down if they want
00:43:02.940 to so so the short answer is they can do whatever they want that's what section 230 exists the power
00:43:08.460 to give them yeah i'll jump in on on the other part of the question um yeah why are they doing
00:43:16.940 this i mean you know some of the arguments that they've given rationales that they've given
00:43:23.100 are for example they're worried about public safety so after the january 6 uh riot at the
00:43:29.180 capitol they were worried about public safety so they banned people like donald trump from twitter
00:43:34.620 because they worried he could incite violence well i don't believe that um you know the supreme
00:43:40.860 leader of iran regularly tweets um genocidal things uh he's talked about giving support to
00:43:48.940 anyone who wants to help destroy israel um you know if that's not trying to incite violence
00:43:55.340 what is um you know you have uh you know government accounts from china for example that are
00:44:02.380 simply lying uh you know there's finally some outrage and they finally deleted a tweet recently
00:44:09.260 in which someone from the Chinese foreign ministry tried to say that they've been helping the Uyghur
00:44:15.100 women not become baby-making machines anymore. I mean, it's ridiculous. But the fact that
00:44:21.260 much of this stuff remains, you have to ask why. For example, just do a search on Twitter for
00:44:28.920 hashtag assassinate Trump or hashtag kill Trump. There's plenty of people calling for violence.
00:44:35.500 So I don't believe the rationales that these companies gave. And again, you have to say, so why would Facebook brag about suppressing it and Twitter suppress and all these other places, NPR as well mentioned it, suppress a story that might make the Democratic presidential candidate look bad?
00:44:57.540 And, you know, to me, there's the, you know, Occam's razor, the most obvious answer is that they wanted Joe Biden to win. And again, you know, I don't think anybody, you know, any of the executives of these companies and many of their workers have made any secret of the fact that they have liberal political views.
00:45:17.580 so yeah why are they doing this why are they attacking one side much more than the other
00:45:24.380 why do they ban Donald Trump but let Ali Khamenei tweet the answer to me seems seems pretty obvious
00:45:33.240 but if anyone has any other ideas I'd love to hear them I agree with that we're going to get
00:45:37.800 the Ayatollah on the next panel I think Bruce there's a question for you here that or at least
00:45:44.420 to start off for you that i wanted to to ask and it's about the the legal jurisdiction here because
00:45:49.460 these companies are almost exclusively at least the ones that are a part of our everyday lives
00:45:54.620 domiciled in the united states but each country around the world is finding that they want to
00:46:00.160 regulate how these companies operate there a notable example of this is australia trying to
00:46:05.360 make social media platforms pay media pay news publishers for content and we have a bill going
00:46:11.520 forward in the canadian parliament right now that's again looking at bringing a lot of online
00:46:16.080 publishers under the auspices of broadcast regulations even non-canadian publishers
00:46:21.440 how effectively uh you're a canadian lawyer so we'll talk about it in a canadian context but how
00:46:26.560 effectively can canadian laws govern these u.s platforms without actually really punching way
00:46:35.120 above their weight well the question of effectiveness is different from the question
00:46:40.000 of the strict legal jurisdiction fair enough jurisdictionally of course anything that that
00:46:45.520 happens in canada or or you know with canadians on canadian soil you know where where signals go
00:46:52.720 and they have servers and so on that that can be legally done now to what extent you're going to be
00:46:58.960 effective at changing the behavior of these very large companies based in the us or elsewhere
00:47:04.560 is another question you can try your you can try your damnedest and you can pass laws and as in
00:47:09.520 australia you can try and get them to pay uh and and in some places like in europe they do have
00:47:15.280 rules and they seem to be sort of following them um so it's not that it can't be done it's very
00:47:21.440 awkward um but but it sounds like they're determined to do so and we'll just have to
00:47:26.880 have to wait and see to what extent it it works so let's turn to i i think one of the ways in which
00:47:35.040 there tends to be, as Robbie alluded earlier, a lot of agreement between the political left and
00:47:40.800 the political right, which is, you know, that these are, you know, these big evil corporations
00:47:45.060 that can't really be trusted. There's not a lot of love for Facebook and Twitter. So I guess that
00:47:50.400 leads to the question of what are they doing or what do they need to do to regain this trust?
00:47:56.660 Because ultimately, a lot of these political fights that exist are putting, and I think Robbie
00:48:01.800 alluded to this very effectively they're putting these companies in the middle of a fight that
00:48:05.560 they can't really win is there a way for big tech to open up so to speak in a way that is transparent
00:48:13.720 that doesn't lead a side to feel like it's being picked on to use a trite term or is this kind of
00:48:19.640 just destined to be this tug of war i don't know who wants to take that one well go ahead robbie
00:48:24.680 i would i you know i would qualify what you just said a little bit because i think uh the social
00:48:29.640 media companies are unpopular with um conservative and liberal commentators and politicians for
00:48:36.760 various reasons for the very online because we're paying a special attention to how they've treated
00:48:41.640 the new york post and other places and it and again it is rightly objectionable but broadly
00:48:46.280 speaking these companies are actually quite popular um and i mean we we focus narrowly on
00:48:50.920 facebook and twitter amazon is wildly popular uh and i think for good reason because it has mastered
00:48:57.720 the important art of of cheaply promptly delivering you the goods you need to like function um
00:49:05.800 meanwhile congress the legislative body that purports to rule these companies or break them
00:49:11.560 uh is like routinely the least popular institution in all of american life
00:49:16.360 so who who should be regulating whom now um i i like i find myself frequently asking that i
00:49:22.600 I especially think when we're coming out of this, hopefully coming out of this just horrible year, the pandemic, I, for one, am, and I think most people feel this way, am, you know, warts and all, very thankful for social media, which has been as miserable as things have been.
00:49:41.920 And I mean, thank God we can at least we have Zoom.
00:49:45.320 It's better to have it than not.
00:49:46.420 I would rather be in person discussing this.
00:49:48.160 But I think a lot of young people that have been out of school for an entire year, either
00:49:52.860 we know their depression rates are way up, but they are better off that they have Snapchat
00:49:58.540 and TikTok and which are the apps that they prefer to engage in some socialization.
00:50:03.840 So we attack the companies often rightly for missteps.
00:50:10.620 But, man, I would not want to imagine this year without them.
00:50:15.060 And I think a lot of people feel that way.
00:50:17.480 Well, Robbie raises something important there.
00:50:19.860 Is this actually, I mean, what these companies do affects people.
00:50:24.040 It affects people, even those who aren't paying attention to these political debates.
00:50:27.600 But are these issues actually very niche?
00:50:30.060 And are they not actually resonating with the people generally out there?
00:50:34.980 The average, not the New York Post editor or social media manager,
00:50:37.980 but the person who might read the new york post or to bruce's context the person that might share
00:50:42.180 something online that might be a little controversial that might fall under the
00:50:46.220 government law do ordinary people care about these issues in your view are you asking me
00:50:51.840 go ahead i'm asking anyone but i saw kelly unmuted first yeah go ahead kelly just uh you know i'm not
00:50:56.640 sure i'm going to answer you directly maybe bruce will do that but you know these these companies
00:51:01.020 and things do have an effect um you know again i can't help but you know keep referring to the
00:51:06.720 new york post story since i i work there um but you know there was actually an increase in searches
00:51:13.480 for uh things like can i change uh my vote after i've mailed in my ballot um after our hunter
00:51:21.480 biden story came out there this is real there was an increase in those kind of searches and there
00:51:26.880 was even a poll that found a great number of people uh would actually like to would have
00:51:33.640 considered changing their vote if they had known that hunter biden stuff before they voted now of
00:51:38.640 course that's one reason i think early voting is a terrible idea um but you know these things do i
00:51:43.700 mean think about uh you know what effect the suppression of that story might have had on the
00:51:48.900 presidential race i i you know it's whether it would be enough to change uh you know i mean this
00:51:54.920 was a relatively close election and that it was decided in a few states by not that many votes
00:52:00.980 You know, Joe Biden, of course, got way more votes overall than Donald Trump, but those heavily contested states, it was pretty close in some cases. So I think it could have an effect on people. And yeah, you know, I know people, you know, admittedly, most of the people I know, not all of them, but a lot of the people I know are more politically engaged maybe than your average Canadian or American.
00:52:23.460 but i've had friends who have been in facebook jail as it's called um you know they posted
00:52:28.300 something that somebody uh found offensive and reported it and they you know they were not
00:52:32.880 allowed to use facebook for a couple weeks um so yeah this is something that is affecting people
00:52:37.660 who aren't editors writers politicians um you know you have people who are are really engaged
00:52:44.520 in this stuff and and concerned about it you know my sister uh for example she lives in northern
00:52:49.580 Alberta. She's not a politician or a writer. She runs, owns and runs a shop that sells work gear
00:52:57.600 and safety gear for the guys on the oil rigs. And she's very concerned about this stuff. You know,
00:53:02.420 we talk about it now and then, and she does feel that people with views like hers are really,
00:53:09.600 they're trying to get rid of those views. They're trying to suppress those views. So I think your
00:53:15.860 average person um you know maybe they've experienced it themselves maybe they've been put
00:53:20.820 into facebook jail but you know they've seen uh you know they find out a story was suppressed
00:53:26.020 that they would have been very interested in reading so uh of course you know journalists
00:53:30.980 and and political types like us we and professors uh we tend to be very interested in these issues
00:53:37.300 and and maybe focus more on them than the average person but i do think they they affect the average
00:53:42.500 person and there is some concern about that um i worry though that in some ways uh you know places
00:53:48.580 like canada that doesn't have quite the same free speech tradition i would say than the united
00:53:54.240 states there's much more willingness um to have censorship and have the government um decide
00:54:02.300 what's hate speech and what's not and and you know that's a dangerous thing agreed we'll go to you
00:54:08.780 Bruce. I agree. I do not think we should underestimate the effects of these practices
00:54:16.620 on the lives of people in their own personal lives or in the larger political situation. I mean,
00:54:23.840 there's a very good chance that the big tech censorship and control in this most recent
00:54:30.160 presidential election might have changed the outcome. It might have changed the outcome of
00:54:34.660 a presidential election. And the Twitter CFO just this morning was being, was asked whether or not
00:54:42.860 if Trump happens to run again and win, would they allow him back on the platform? He said, nope.
00:54:51.940 Nope, no way. Okay. If that doesn't tell you something, then I, this is like, it's a kind
00:54:58.040 of willful blindness and i don't i think we should distinguish between the the the desirability of
00:55:06.400 the technology and the behavior of the companies i mean robbie was saying well people are better
00:55:11.940 off with it this way people are better off with a facebook and a twitter and so on and that's true
00:55:17.200 but let's go let's go back a few decades and ask the same question about the telephone
00:55:21.260 if you were if you were ordering a telephone in i don't know 1970 you had to order it from bell
00:55:30.140 canada and that meant you had to wait three weeks and maybe it would show up maybe it wouldn't
00:55:37.260 now the question are you better off with a phone well of course telephone technology is great
00:55:43.020 especially in 1970 how do we feel about bell canada terrible terrible terrible monopoly
00:55:49.660 inefficient not responsive terrible company at the time we're not we're a little better off now
00:55:56.760 because we have two other big phone companies but but they're not really responsive either
00:56:01.960 so yeah technology is one thing but i don't think we should confuse that question with whether or
00:56:08.520 not the behavior of these oligarchs is appropriate and i think the answer to that is clearly no
00:56:15.260 you got to watch talking about the telecom oligopoly in front of an american libertarian
00:56:21.480 bruce i think you're breaking robbie's heart here and you're breaking my heart as well we
00:56:24.640 we know how bad things are there kelly mentioned something earlier that i thought was very
00:56:29.260 important which is that we actually haven't heard much from donald trump since his twitter account
00:56:33.800 was suspended and there are a lot of people who support that outcome and don't care how we get
00:56:39.240 there they think that hearing less from trump is is great so they don't care how we got there
00:56:42.860 Another example is Milo Yiannopoulos. I remember when he was all the rage in a lot of political discussions, for better or for worse, but he really hasn't been heard from since Facebook and Twitter went after him and other people that have been forced into the memory hole.
00:56:57.480 We can't go through a big tech panel without quoting an Orwell reference, so the memory
00:57:01.920 hole, so to speak.
00:57:02.940 And I do find that there are a lot of people, and this really clouds the discussion culturally,
00:57:07.920 who support the outcomes because they don't happen to like the targets of it, but aren't
00:57:13.780 really seeing the broader implications and longer-term implications of this.
00:57:17.920 Do you think that's fair?
00:57:19.400 I'll go to you, Robbie.
00:57:21.100 Yeah.
00:57:21.360 um so i think one issue uh well i mean with a lot of these cases it comes to do you think
00:57:29.280 um twitter and facebook the rules they have should they be enforced if they can't be enforced
00:57:36.180 evenly on every user of the platform right that's the issue because so milo i think was
00:57:43.380 deplatformed for you know violating some rule against doxing or or siccing his followers to
00:57:49.300 harass someone else um and i you know i i mean the trump case is obviously more contentious i i i
00:57:56.020 don't think it is maybe i would am in disagreement with the rest of the panel here i don't know what
00:58:00.100 they think i i well i question the judgment of leaving him off permanently i i think the things
00:58:05.860 he was tweeting at the time of the capitol riot present a at least a fine um case for having taken
00:58:12.180 him down uh temporarily uh but then it becomes so but you're right why not ayatollah khamenei
00:58:17.700 why not the tweets about the uyghur muslims right you can all you can point to a thousand or ten
00:58:22.260 thousand other examples on all these social media companies because they're so vast and there's so
00:58:26.820 much content of the rules not being applied because not everything is it's not as if it's
00:58:32.420 reviewed and then it goes live it goes live and then if people notice it and complain maybe there's
00:58:37.700 action taken this is most apparent on youtube where there is really no way unless someone flags
00:58:43.940 it for you i mean youtube it's i it's something it's some insane number i recently had to look
00:58:48.820 it up for my book but it's like thousands of new hours of content every couple minutes so there's
00:58:53.780 no way to watch it and then have it go live it just goes live and later they make some kind of
00:58:59.060 moderation decision if someone complains so often what looks like biased enforcement of rules is
00:59:04.420 instead biased complaining on behalf of the user base someone noticed this and said something or
00:59:09.860 10 or a hundred or a thousand people which made them do something quicker but no one did something
00:59:13.700 about all these other things so it that's the that's a problem of the vast amount of content
00:59:19.540 that the rules will never be enforced evenly ever it's impossible to do it because there's just too
00:59:24.100 much content and i ironically the people in a better position to enforce rules on taking down
00:59:30.180 content are the the big players already so any pro any proposal especially a legislative or
00:59:35.780 government proposal that would require more moderation or more fair moderation or making
00:59:41.620 sure they're taking down or not taking down the right people well guess what facebook has 15 000
00:59:46.180 people who work for the company to do moderation twitter has only 1500 and any other company
00:59:51.220 smaller than that would have even fewer so so a some of these proposals are actually i think
00:59:58.100 would have the effect of further entrenching the biggest players because they are best positioned
01:00:02.580 to take on any additional requirements that you would make, such as the ones proposed.
01:00:09.860 Yeah, that's a fair point. And one interesting thing I find is that because of these bona fide
01:00:14.780 examples of big tech censorship, all of the situations we've talked about, whether it's
01:00:19.620 Trump's suspension or Milo or the New York Post story, we know it's real, we know it happens,
01:00:24.160 but it forces a lot of people, especially on the right, admittedly, to have this hair-trigger
01:00:28.820 response to any glitch that exists on a social media platform. So people forget that just
01:00:34.200 sometimes tech doesn't work. I had, for example, my Facebook page once disappear, and I immediately
01:00:39.160 went into like warrior mode. And then half an hour later, it was back. And that was that. I know
01:00:43.980 that I've lost a couple of thousand followers after Twitter did the supposed QAnon purge. And
01:00:49.840 that one was timed with other people. So I knew there was something to it. But I get people, and
01:00:55.420 No offense to anyone viewing, but people who I don't think are the most adept at technology that, you know, they like an email is in the wrong place and they think that, you know, someone's after them.
01:01:03.460 And there is this problem there that is real, but everyone seems to, sorry, I shouldn't say everyone, but a lot of people that are keenly following this issue seem to view any interaction they have with technology through that lens.
01:01:17.820 Kelly, where do you think we break through that?
01:01:20.420 Or do we?
01:01:20.980 Well, first, before answering that, I just want to say that, you know, Rob, you made some great points. This is why I think, in my view, they shouldn't try to moderate because it's impossible to do. And once you start picking and choosing, people wonder why did you pick this and not that.
01:01:36.960 But, you know, the Israeli government actually asked Twitter to take down some of Khamenei's tweets about Israel, and Twitter declined to do it.
01:01:49.000 I believe they said that this is a government official.
01:01:52.260 It's in the public interest to have, you know, what he says out there.
01:01:56.740 So, yeah, in some cases, they're just not finding it.
01:02:00.040 They're not quick enough.
01:02:00.880 but we do see that there's cases when there have been people complaining when a government
01:02:05.080 has complained and Twitter decided not to do anything so again I think um you know some of
01:02:11.340 it is there's too much stuff but some of it is they're just they have double standards these
01:02:15.220 these uh people are are hypocrites um but yeah you know it's you know it's it's it's funny Andrew
01:02:20.900 because sometimes I'll post something on Twitter and it doesn't get as much of a response as I
01:02:25.240 thought it would and then you know have I been shadow banned by by Twitter this is something
01:02:29.160 you know a lot of uh people on the right worry about yeah or was it just a joke that didn't land
01:02:33.940 now it's like oh no no it's jack dorsey's fault my joke was great exactly so you know it could be
01:02:38.740 hard to to tell sometimes but i mean the fact that it's a real thing and it's happened to some people
01:02:44.520 makes us suspicious i think you know if the social media companies weren't doing that kind of thing
01:02:50.600 people wouldn't be paranoid about them doing it to them it's it's a real phenomenon and um yeah i
01:02:58.780 mean again you know why are so many uh conservatives feel that their views are being suppressed that
01:03:05.980 people want to drive them out of the public square because that's to some extent what they've been
01:03:12.420 doing to other people and so it creates that suspicion um you know i mean admittedly facebook
01:03:17.900 already had people uh you know a little suspicious of it as robbie said earlier it's it's gotten
01:03:23.560 attacks from the left and the right and of course just regular people i mean who hasn't had that
01:03:28.340 experience where they were talking to somebody about some product and then saw a Facebook ad for
01:03:33.200 it a few minutes later. I mean, that kind of thing creeps people out. And so I think Facebook has a
01:03:39.100 lot of issues around privacy and that sort of thing, even before all of this stuff started
01:03:45.820 happening. But, you know, again, when you have, you know, Mark Zuckerberg and other people saying,
01:03:51.760 you know, we, you know, we want to lower, you know, in the piece that Andrew sent about wanting
01:03:57.420 to have less political speech on Facebook.
01:04:01.820 You know, someone said that they wanted
01:04:03.200 to cool things down, lower the tone.
01:04:06.860 I mean, again, this is a company saying
01:04:09.140 that it wants to set the tone
01:04:10.980 of how people in certain countries are feeling
01:04:14.660 about what's going on around them.
01:04:17.660 I mean, that's a very powerful, ambitious project.
01:04:21.340 And I think we should all be concerned about it
01:04:24.160 and how it might affect us.
01:04:26.120 I agree.
01:04:26.920 how how how could you not be suspicious given what they've said they want to do and in fact
01:04:33.080 what they are doing i mean and on twitter for example if you're engaging in conversations about
01:04:40.760 the the efficacy of lockdowns or vaccines or policies regarding covid they will often put
01:04:46.920 a warning on the tweet or not allow the tweet to go through i mean how could you not be suspicious
01:04:53.960 of a company that is transparent about its intent to censor you if you say what they regard as the
01:05:01.840 wrong thing. Sure, we jump to conclusions when there's a glitch and that might be an extreme
01:05:06.920 reaction, but there's no reason to expect reasonable behavior here because they've promised
01:05:12.900 not to give that to us. Yeah, that's very fair. Unfortunately, we are coming to the end of our
01:05:20.400 time here but i want to go around the virtual table here and and get just some final thoughts
01:05:26.160 if we can and and especially if you have a recommendation that you would put either culturally
01:05:31.600 socially politically from a regulatory way on on something you think should be done because
01:05:36.400 that's always the challenge with these sorts of panels as we unearth a lot of problems and
01:05:39.920 and people always say what now so i'll as we wind things down we'll start with robbie if you want
01:05:44.160 to just offer some closing thoughts on on all we've discussed sure uh one thing i just wanted
01:05:48.720 add on the on the you know the choosing choosing what to prioritize facebook maybe saying we want
01:05:54.160 less political content because we want to turn down the the dial you know it's interesting though
01:05:58.480 because so they've done that before i remember there was a time when uh so clickbait headlines
01:06:05.600 and by clickbait i mean like upworthy independent journal review uh there were conservative sites that
01:06:10.720 did it too and they would they were getting all this traffic from facebook and then facebook
01:06:14.640 decided you know what we think those articles are designed to be emotionally manipulative they have this
01:06:20.240 crazy headline and then it's and then people click through and then they quickly click away
01:06:24.560 um so facebook punished those kinds of headlines and those those news websites died but facebook
01:06:30.880 gets to do that because it decided it'd be a better user experience if they punish that kind of
01:06:35.200 stuff if one day they decide you know what we want to we want people to see more on their facebook
01:06:40.720 feeds we want people to see more photos of their friends and loved ones and less like divisive
01:06:45.680 political content they get to do that and if you don't like it you can get off facebook and if you
01:06:50.640 do like you might find you do like it and it's like their company to decide that so we still
01:06:54.480 like arrive at that philosophical point i think uh so i i will the area of social media kind of
01:07:00.800 regulation which i am more sympathetic to regulation it's not on the political speech
01:07:05.520 bias frame because again our laws just even if there is a problem are the first amendment i think
01:07:10.640 prohibits so much of anything you could do with that in a united in the u.s at least maybe not
01:07:14.800 in canada or elsewhere um i i'm interested in kinds of things like right to be forgotten which
01:07:20.160 is something europe is interested in where you could like you can ask you can petition google
01:07:24.960 to get rid of your search results you want to get rid of something like embarrassing from your teen
01:07:29.600 years because you can't get a job because it shows up in google search that kind of thing
01:07:33.520 obviously we can't have a law that forces google to do that in the us because of the first amendment
01:07:37.920 but in general i'm i'm more um uh on the kind of privacy aspect of social media we there there is
01:07:46.480 a little bit of a of a tradition even in u.s law of making trade-offs between privacy and speech
01:07:52.880 uh where you have so so for instance i think efforts to criminalize like um revenge porn so
01:07:58.640 the intentional leaking of people's nude photos if you if you got them and you share them on social
01:08:03.920 media right now it's like a gray area do they have to take it down section 230 they probably
01:08:09.840 actually don't have to take it down maybe you could tweak it so they do have to take down
01:08:13.760 something like that or you could have something like the copyright protections where uh where
01:08:18.000 they're not immediately in trouble if if content goes up that violates someone's copyright but they
01:08:22.240 have a certain amount of time period where then they do have to take it down or you could sue them
01:08:25.760 that's a kind that's that's the area of social media where i think potentially libertarian but
01:08:32.240 potentially you could have some kind of regulatory change that would do some good i don't think it's
01:08:37.480 really in the in the um category of of speech or political speech because man there is a lot of
01:08:44.660 leeway granted to private entities to to police to handle their own political speech matters under
01:08:50.260 the u.s constitution yeah very good and we could do an entire entirely different panel on the
01:08:56.540 privacy issues and and probably four or five of them and and still barely scratch the surface so
01:09:01.180 i'm glad you brought that up even if it was in the the end here given the format that i laid out so
01:09:05.980 thank you for that uh kelly yeah i've enjoyed the discussion with the panelists and i i have to
01:09:13.100 admit i've been peeking at twitter a little bit to see what people are saying about the panel
01:09:16.940 on twitter and it seems like a lot of uh people out there are enjoying the discussion and and
01:09:21.180 participating in it as well uh so that's great to see but yeah you know i think it's sort of
01:09:25.340 one of the points that i made at the beginning really i think has been driven home which is that
01:09:29.180 this this is a difficult issue it's thorny and especially for those of us on the right who don't
01:09:35.020 like the idea of government intervention in private business we it's what is the solution
01:09:41.180 what what can we do um you know again i'm i'm reticent to to tell private companies how to
01:09:46.780 behave but at the same time this is a real problem um and i think maybe the ultimate solution is
01:09:55.340 unfortunately the same a very tough one but it's it's really maybe more of a cultural uh solution
01:10:01.500 you know i'm i'm really struck by again i mentioned this earlier how many uh situations have people
01:10:07.260 gotten cancelled or um you know banned you know all these different things because younger workers
01:10:13.340 at companies were pressuring the older people in charge um we you know it's sad we've somehow
01:10:20.220 failed as a society i think if journalists are now encouraging suppression of speech and that's
01:10:27.500 the place we've gotten to um not to beat a dead horse but with the new york post story i was
01:10:32.940 amazed at how many journalists were on twitter saying yeah this is russian disinformation i
01:10:38.460 wouldn't i wouldn't talk about this and i'm glad that they um you know suppressed this story uh
01:10:44.540 you know there was a time when journalists stood in solidarity with one another and and you know
01:10:49.820 know would have gotten upset and angry at these this kind of suppression but now we have journalists
01:10:55.820 cheering it um and i think you know if you you look at polls young people these days don't feel
01:11:01.020 the same way about in america the first amendment and in in both our countries free speech generally
01:11:07.100 as people used to um you know i'm sure that the academy has had something to do with this i i
01:11:12.620 doubt bruce is is teaching these sorts of ideas but a lot of other professors are um and you know
01:11:19.020 people have discovered that uh you know if you can claim harm if you can claim someone offends you
01:11:25.420 offends you and get that person uh fired or or suppressed um you're gonna start maybe start
01:11:32.140 enjoying that power maybe using it even more so you know i think ultimately we need to have a
01:11:37.820 society that values free expression that values tolerance that values free speech i mean i think
01:11:44.220 i think a lot of us would agree that the liberals these days are very illiberal they don't support
01:11:49.580 those liberal values of tolerance and open debate and you know it's how do we get back to that that's
01:11:55.660 going to be a lot tougher than even figuring out how to uh how and if we should regulate social
01:12:01.580 media companies but i think that ultimately uh that's the way we're going to have to go
01:12:07.260 thank you kelly and last but certainly not least we go to bruce bruce some final thoughts
01:12:11.420 I hate to be negative, but I think it is later than we think.
01:12:20.940 If you consider what has become mainstream in terms of views about speech and safety
01:12:28.540 and misinformation, just notice the gap between that and what we regard as our political and
01:12:37.340 legal norms about speech it's hard to ignore the gap that's growing between those two things
01:12:46.300 and we seem determined to ignore it and we shouldn't that'd be a mistake good regulation
01:12:52.940 is not coming bad regulation probably is bad regulation which will increase this not decrease
01:12:59.980 it so let me leave you with a a great quote i like from frank herbert in his book children
01:13:07.180 of dune it goes like this when i am weaker than you i ask you for freedom because that is according
01:13:15.200 to your principles when i am stronger than you i take away your freedom because that is according
01:13:23.940 to my principles and that's what we got thank you very much bruce i think i saw robbie smirk when
01:13:31.680 you mentioned children of dune so we have another uh sci-fi sci-fi reader on the panel
01:13:36.380 all right well i want to give a big thank you to all of those who view who have tuned in and
01:13:41.780 especially those who have hashtagged along on big tech well you still can to put the ominous
01:13:46.120 dour note there and to civitas canada and true north for hosting this and allowing me to moderate
01:13:51.800 this discussion big thanks to queen's university law professor bruce pardy reason editor robbie
01:13:58.180 suave and again a plug that book coming out later this year is tech panic and new york post
01:14:03.840 editorial board member kelly jane torrance all of you thank you so much for uh joining this evening
01:14:08.480 it's been a fascinating discussion so really appreciate it thanks andrew great thank you
01:14:12.840 all right well that is it for us we say farewell but thank you to you all
01:14:16.900 and have a great day stay safe and free
01:14:28.180 .
01:14:58.180 You