The Megyn Kelly Show - January 13, 2021


Tech Censorship And Independent Media, with Glenn Greenwald and the CEOs of Parler and Substack | Ep. 50


Episode Stats

Length

2 hours and 10 minutes

Words per Minute

185.24

Word Count

24,109

Sentence Count

1,440

Misogynist Sentences

16

Hate Speech Sentences

18


Summary

After the Capitol Hill riot last week, Megyn Kelly and Glenn Greenwald take a look at the tactics being used by the left to silence President Trump and others on the right. They discuss the dangers of big tech's silencing of free speech, and the need for law enforcement to do more to protect the public.


Transcript

00:00:00.000 Your business doesn't move in a straight line.
00:00:02.840 Make sure your team is taken care of through every twist and turn
00:00:05.980 with Canada Life Savings, Retirement and Benefits Plans.
00:00:09.660 Whether you want to grow your team, support your employees at every stage
00:00:13.120 or build a workplace people want to be a part of,
00:00:16.200 Canada Life has flexible plans for companies of all sizes
00:00:19.400 so it's easy to find a solution that works for you.
00:00:22.840 Visit canadalife.com slash employee benefits to learn more.
00:00:26.540 Canada Life. Insurance. Investments. Advice.
00:00:30.740 When I found out my friend got a great deal on a wool coat from Winners,
00:00:34.500 I started wondering, is every fabulous item I see from Winners?
00:00:39.060 Like that woman over there with the designer jeans.
00:00:41.980 Are those from Winners?
00:00:43.520 Ooh, or those beautiful gold earrings?
00:00:45.980 Did she pay full price?
00:00:47.300 Or that leather tote? Or that cashmere sweater?
00:00:49.560 Or those knee-high boots?
00:00:50.940 That dress? That jacket? Those shoes?
00:00:53.120 Is anyone paying full price for anything?
00:00:56.540 Stop wondering. Start winning.
00:00:58.640 Winners. Find fabulous for less.
00:01:01.220 Welcome to The Megyn Kelly Show.
00:01:03.320 Your home for open, honest, and provocative conversations.
00:01:07.600 Trump and others on the right are being banned from Twitter and Facebook.
00:01:11.660 Parler booted entirely by big tech.
00:01:14.280 Its CEO is here, along with a Substack founder and Glenn Greenwald.
00:01:19.140 Now.
00:01:26.540 Hey everyone, it's Megyn Kelly.
00:01:28.680 Welcome to The Megyn Kelly Show.
00:01:30.180 We are going to get to our jam-packed show with a bunch of great guests for you in just
00:01:35.600 one second.
00:01:36.120 And there's much to discuss.
00:01:37.640 But I've got some thoughts on what we're seeing right now that I wanted to add before
00:01:41.020 we go there.
00:01:42.360 I've been thinking over the past few days of the old Rahm Emanuel motto, never let a crisis
00:01:46.600 go to waste.
00:01:47.720 Remember that?
00:01:48.380 Under President Obama?
00:01:49.180 Well, it's rearing its ugly head again in the wake of last week's Capitol Hill riot.
00:01:56.320 Everyone was outraged by what they saw at the Capitol last week.
00:01:59.520 I have yet to hear anybody defend it.
00:02:01.640 A cop was murdered.
00:02:02.960 Another was beaten mercilessly with American flags.
00:02:07.580 At least four other people died, including that one female protester who got it in her
00:02:11.380 head that she would, quote, die for President Trump.
00:02:13.940 Um, pushing, you know, ridiculous claims and storming the Capitol, a place she never should
00:02:20.000 have been.
00:02:21.440 You've seen these tapes of the lawmakers who try to serve the country honorably.
00:02:26.520 A lot of them do.
00:02:27.380 And their staffs forced to cower under their desks as the Capitol police were drawing guns
00:02:31.860 to prevent the breaching of the congressional chambers.
00:02:33.940 It was disgusting.
00:02:35.200 It was stomach turning.
00:02:36.840 Every one of these rioters should be tracked down and prosecuted.
00:02:39.400 And it's happening.
00:02:40.020 And law enforcement, which, as it turns out, had been warned that this was coming, but
00:02:44.780 did not take it seriously, needs to reevaluate its approach to these threats, especially
00:02:49.480 in the next week, because we're hearing more chatter about more events.
00:02:53.720 But the answer is not to change the parameters of free speech in America.
00:02:58.900 And that, to me, that seems to be the left's and the media's and big tech's solution to what
00:03:06.080 we saw.
00:03:06.920 In fact, my impression is many seem to be salivating.
00:03:10.020 At the opportunity here to shut down not just Trump, but conservative platforms, speech
00:03:15.320 and any viewpoints they deem problematic, stuff they've wanted to shut down for years
00:03:19.600 now.
00:03:20.020 They want to use this as the excuse to do it.
00:03:22.820 And this, to me, does not appear to be about preventing another attack at this point or curbing
00:03:27.200 someone's dangerous rhetoric.
00:03:28.640 It appears to be about silencing one's political adversaries.
00:03:33.580 First, they've come for Trump.
00:03:35.740 The president of the United States.
00:03:37.240 OK, I realize Trump is controversial, but he's still the president.
00:03:40.480 Banned now forever from Twitter and at least temporarily from Facebook and Instagram.
00:03:47.140 Even the company responsible for processing donations to, I think, his campaign has severed
00:03:52.500 its relationship with them.
00:03:53.440 So it's been a clear and concerted effort to silence the president.
00:03:59.900 And frankly, it started before last week.
00:04:01.740 Remember when CNN and MSNBC and even some shows on Fox cut away from the president when
00:04:06.700 he veered off into his unsupported electoral gripes instead of simply fact checking him after
00:04:11.940 the leader of the free world?
00:04:12.980 How did that work out?
00:04:14.560 Was all of America suddenly disabused of its trust in President Trump or in their suspicions
00:04:21.280 that there might have been electoral fraud?
00:04:23.780 No, because that's a that's a left wing fantasy.
00:04:27.740 That's not how this works.
00:04:30.800 Now it's morphed into a total shutdown of his communication with the American people.
00:04:36.160 The vast majority of Americans, contrary to the belief here, are people who can separate
00:04:43.780 fact from fiction and who even when they fail to, even when they've been lied to and they
00:04:49.860 don't see the difference between the lies and the truth, can control their anger and would
00:04:54.960 never dream of storming the Capitol and hurting people.
00:04:58.220 Now, are there some nutcases out there?
00:05:00.020 Obviously.
00:05:01.280 Can we solve that with Twitter and other electronic censorship?
00:05:05.640 No.
00:05:06.160 I don't think we can.
00:05:07.860 That's why we had riots and societal outrage and breakdowns long before we had the Internet.
00:05:12.720 The Internet gives us a peek at what these groups are up to, what they're saying and
00:05:18.380 planning.
00:05:18.800 It gives us a heads up if we would just listen and plan accordingly.
00:05:24.380 The Internet, while far from perfect, is not the cause of the angst and the outrage seething
00:05:30.200 in the country right now.
00:05:32.740 For that, we have to look at so much more.
00:05:34.940 You know, the partisan media, the hyper partisanship in government, these shutdowns, interminable
00:05:44.120 and indiscriminate, that aren't followed by the people imposing them on us.
00:05:48.460 Political elitism, where they turn up their noses at middle America and say, you don't
00:05:53.400 matter, and much more.
00:05:56.540 And don't even get me started on Trump and, you know, all of his rhetoric and untethered
00:06:02.960 relationship with the truth, as I've said.
00:06:04.640 But only that last thing is being discussed.
00:06:07.080 Now, sometimes the Internet is a place for confused people to traffic in baseless conspiracy theories.
00:06:14.740 But that's America.
00:06:16.580 You're allowed to be a nut.
00:06:19.140 When things veer over into threats that clearly incite violence, speech can and should be curtailed.
00:06:25.660 But the standard should not be arbitrarily applied based on politics.
00:06:30.320 That's what we're seeing.
00:06:31.420 And people like the president should be given as wide a berth as possible.
00:06:37.380 The Twitter crackdown on Trump was absurd, in my view.
00:06:40.560 He did not call for violence or any kind of attack.
00:06:44.000 He was obviously slow to denounce it.
00:06:46.960 And some of his surrogates got a lot closer to the line than he did.
00:06:51.560 But Twitter, you know, holier than thou Twitter, they allow the Ayatollah Khamenei to urge jihad against
00:06:57.000 Israel, saying, and I quote, everyone must help the Palestinian fighters and saying, quote,
00:07:02.700 the Zionist regime is a deadly cancerous growth that must be uprooted and destroyed.
00:07:09.460 Twitter says, well, that doesn't violate his policies because it's, quote, discourse by a world leader.
00:07:16.420 Give me a break.
00:07:18.440 So what was Trump's discourse that ultimately got him booted off of the platform?
00:07:22.680 Two tweets in particular, says Twitter.
00:07:24.600 One, he says, the 75 million great American patriots who voted for me will have a giant
00:07:30.860 voice long into the future.
00:07:33.680 OK, we're waiting for it to get controversial.
00:07:36.600 They will not be disrespected or treated unfairly in any way, shape or form.
00:07:40.740 Boom.
00:07:41.960 Incitement.
00:07:44.100 OK.
00:07:45.380 The second one was, he said, to those who have asked, I will not be going to the inauguration
00:07:50.060 on January 20th.
00:07:51.140 Oh, the argument is he was telegraphing that violence could take place there, even though
00:07:59.720 he wasn't going to be there based on what?
00:08:02.000 Maybe he was just finally letting us know something we've been speculating about for months in this
00:08:05.960 country, which is, would he go if he lost?
00:08:09.040 And by the way, his daughter, Ivanka, already says she's going.
00:08:11.380 So please on that other theory.
00:08:14.240 So basically, where are we?
00:08:15.340 Destroy the Zionist regime with jihad?
00:08:17.600 No problem.
00:08:18.040 American patriots will not be treated unfairly.
00:08:21.020 Off you go, Trump.
00:08:22.900 And now we find out that Parler, that's the Twitter competitor popular with conservatives,
00:08:28.500 has been shut down.
00:08:30.700 Not only did Apple and Google remove the Parler app from their smartphone app stores, but Amazon
00:08:37.540 suspended Parler from being able to use its server.
00:08:39.620 All of them claiming too much hateful or violent content is appearing on Parler and it's allegedly
00:08:46.580 posing a danger and did prior to the riot in the Capitol.
00:08:51.960 That is effectively the end of Parler for right now, though the CEO is going to be here in a
00:08:57.220 second, the founder.
00:08:58.100 We're going to talk about whether that's true, what his plan is.
00:09:01.140 But this is selective outrage, people.
00:09:04.080 Those who have been looking into this actually say Facebook is where much of the planning took
00:09:07.720 place for the Capitol riot, not Parler.
00:09:10.400 We'll talk about that with Glenn Greenwald.
00:09:12.420 Already we're seeing some media analysts and others say more, more need to be banned.
00:09:17.180 More speech needs to be silenced, not just Trump.
00:09:19.000 The media critic over at CNN wants Fox News to be silenced.
00:09:23.440 They want Fox gone.
00:09:25.160 Anyone who supported President Trump now being lumped in with the nutcases who committed murder
00:09:29.400 at the Capitol last week.
00:09:31.780 Future career and other opportunities are being threatened.
00:09:34.820 And if you want to raise questions about, quote, widespread voter fraud in this election,
00:09:41.120 good luck.
00:09:42.520 YouTube's already banning those discussions, which gets censored on Twitter and elsewhere,
00:09:46.100 too.
00:09:46.180 Now, look, there's been zero proof of widespread voter fraud.
00:09:51.160 The Sidney Powell Kraken never came through.
00:09:53.640 People admit it.
00:09:55.360 It wasn't there.
00:09:57.260 They were trying to craft a soft exit for the president where he didn't have to admit
00:10:02.340 he lost, but he did.
00:10:03.620 So people do need to accept reality.
00:10:05.460 But if you choose not to, that's up to you.
00:10:07.460 And frankly, if you want to continue discussing your beliefs, even if they're unfounded, you
00:10:11.820 should be able to.
00:10:13.100 What is this?
00:10:13.640 East Germany?
00:10:14.200 I mean, speaking of East Germany, by the way, even Angela Merkel, who grew up there,
00:10:19.380 is criticizing Twitter's decision to ban Trump.
00:10:22.580 But OK, those claims are based on lies, you say, right?
00:10:26.840 It's not true.
00:10:27.780 There wasn't widespread voter fraud.
00:10:29.380 OK, so people are saying, why shouldn't YouTube and others shut down these discussions?
00:10:34.140 Those are the lies that influence those people who stormed the Capitol.
00:10:37.220 First of all, do you have any idea how many lies there are on YouTube and the Internet?
00:10:41.120 And not just lies like I was abducted by aliens and lost my virginity to an alien hologram
00:10:45.720 at age five, which is which is what I actually just watched.
00:10:50.220 OK, that's on YouTube now.
00:10:52.300 But lies that matter, lies that have led to deep anger, division and even danger.
00:10:58.920 Look at what happened just this past summer.
00:11:02.500 The lies told about police.
00:11:03.980 LeBron James saying cops are literally hunting black men, hunting them in the streets.
00:11:09.820 Did Twitter put a warning on that as disputed?
00:11:14.000 Lies like America was founded to preserve slavery, which appeared in The New York Times
00:11:18.060 and won a Pulitzer, despite the fact that it's been universally discredited.
00:11:24.180 They're not even admitting that they lied.
00:11:27.280 Lies like Michael Brown really did say hands up, don't shoot before he was killed by a cop
00:11:32.060 in Ferguson, Missouri, a lie that is still being peddled by race hustlers like Al Sharpton.
00:11:38.720 No problem.
00:11:40.640 Well, you might say those aren't lies.
00:11:42.060 They're just opinions.
00:11:43.200 But the same could be said about the widespread voter fraud claims we're hearing.
00:11:46.700 Well, those lies about, say, cops and race, they're not being peddled by elected officials
00:11:52.340 with huge power and influence.
00:11:54.560 You mean like Kamala Harris, a U.S. senator, now vice president-elect, who said the cop who
00:11:59.900 shot Michael Brown, who was, by the way, exonerated by Eric Holder's DOJ, committed murder.
00:12:05.360 That was her word.
00:12:06.300 He committed murder, totally unfounded, untrue and outrageous.
00:12:10.940 But that's fine because why again?
00:12:15.040 Well, I'll tell you why, because this is America.
00:12:16.700 And she's entitled to her opinion, even if it's unfounded.
00:12:19.560 And if you think there was widespread voter fraud and the Kraken was suppressed, it's not
00:12:23.700 true.
00:12:24.060 But you're entitled to your opinion and you're entitled to discuss it.
00:12:28.340 It's it's crazy to me, the standards.
00:12:30.260 You know, meantime, of course, The New York Post is shut down from circulating online.
00:12:33.480 A true story about Hunter Biden's foreign corruption deals because, you know, big tech
00:12:37.800 and its defenders tell us honesty matters, except in certain instances.
00:12:41.520 It's like the corona protests, like the coronavirus, you know, and the shutdowns,
00:12:48.600 like stay inside.
00:12:49.820 It's dangerous.
00:12:50.940 Socially distance unless you're protesting BLM or mourning RBG or celebrating Joe Biden.
00:12:57.780 Right.
00:12:57.940 Like the double standards are obvious to anybody paying attention.
00:13:00.680 Here's the bottom line.
00:13:03.040 These big tech giants are so powerful.
00:13:05.100 They're so huge and they're so monopolistic in their control of the Internet that they need
00:13:09.900 to be treated like a governmental entity.
00:13:12.220 The Wall Street Journal has an op-ed dated Monday by Vivek Ramaswamy and Jed Rubenfeld that
00:13:18.100 sums it up perfectly.
00:13:19.160 They say these guys, these guys in Silicon Valley have the power of a governmental entity
00:13:25.320 without the accountability.
00:13:27.420 And the authors end the piece by saying, quote, Silicon Valley seized on last week's attack
00:13:35.120 to do what Congress couldn't by suppressing the kind of political speech the First Amendment
00:13:40.500 was designed to protect.
00:13:42.020 Aggrieved plaintiffs should sue these companies now to protect the voice of every American
00:13:48.560 and our constitutional democracy.
00:13:52.580 Amen.
00:13:55.380 We'll get to our guests in one second.
00:13:57.020 But first, insurance can be complicated.
00:14:00.140 That's why the zebra was created.
00:14:02.620 Oh, have I gotten your attention?
00:14:03.880 The zebra.
00:14:05.020 When you use thezebra.com, insurance finally feels like it's in black and white.
00:14:10.080 No more confusion.
00:14:10.820 Just honest rates from real companies.
00:14:14.280 The zebra is the nation's leading insurance comparison site for car and home insurance.
00:14:19.860 They make it easy for you and they can help you save money today.
00:14:23.340 Go to thezebra.com and answer just a few questions to compare accurate insurance quotes for free.
00:14:30.080 That's so great.
00:14:30.800 So they'll help you figure out what's the best policy for you.
00:14:33.720 The zebra will protect your personal information and make sure that there are no hidden fees
00:14:37.480 or surprises along the way.
00:14:38.900 You can secure your insurance from thezebra.com or over the phone from one of their licensed insurance agents.
00:14:45.960 So either way, online or by phone.
00:14:48.500 How much money can you save on car or home insurance?
00:14:51.140 Well, visit thezebra.com slash Kelly to find out.
00:14:54.980 That's T-H-E-Z-E-B-R-A.com slash Kelly for insurance in black and white.
00:15:02.880 And now, without further ado, we're going to get to John Matze.
00:15:05.960 He's the CEO and founder of Parler, which is kind of like Twitter, not quite as big,
00:15:11.880 but was on the rise up until they just got shut down by Apple, Google and ultimately Amazon,
00:15:17.800 which has its server and Parler has become very, very popular with conservatives who have kind
00:15:23.800 of had it up to here with Twitter.
00:15:26.220 Unfortunately for John Mates and those who like Parler, as of now, it's been, if not killed,
00:15:32.180 placed on life support.
00:15:33.400 And we're going to get into why.
00:15:35.560 Here's John.
00:15:35.920 John Matze, CEO of Parler.
00:15:42.480 Thank you so much for being here.
00:15:43.680 How are you?
00:15:44.860 Thank you.
00:15:45.500 I'm doing OK, given the circumstances, but, you know, it's got to be a scary time for you.
00:15:51.400 I mean, this is your company.
00:15:52.440 You decided to create and innovate a space where people could speak freely with their opinions
00:15:57.280 and not have to worry about, in particular, liberal censorship, which we've seen.
00:16:00.480 And the answer this week has effectively been not only do the does the conversation on Parler
00:16:07.160 need to change, Parler itself is done.
00:16:10.480 That's effectively the message you've been given.
00:16:12.720 Well, that's what they'd like.
00:16:14.360 I mean, they they tried pretty hard to basically erase us from the Internet completely, you know,
00:16:23.340 by terminating web services, by terminating our app store.
00:16:26.760 You know, the app stores alone is devastating, but, you know, they pulled the plug for our
00:16:32.320 servers, too.
00:16:33.720 And with that, the reputational damage and the, you know, a lot of the, you know, fake
00:16:40.420 information out there about what Parler actually stands for, you know, it left such a big impact
00:16:46.920 that we can't find other service providers.
00:16:50.200 Now, we are finally identifying a few who will stand up.
00:16:54.100 But, you know, even the few that we have come in contact with, they say they get attacked
00:16:59.460 in the press, they're getting attacked by hackers, they're getting attacked by everybody
00:17:03.240 as a result of what what's happened from Amazon.
00:17:06.560 I mean, these these what they did is evil.
00:17:10.240 So Parler, just just to take one step back, I think people know this at this point, but
00:17:13.680 Parler is basically like Twitter.
00:17:15.220 It's a forum in which people can post their opinions and, you know, sort of quick, short
00:17:20.720 posts.
00:17:21.700 And it encourages back and forth and so on.
00:17:24.780 But it's not as big as Twitter.
00:17:27.540 And as of a couple of months ago, it wasn't yet profitable.
00:17:30.320 It's only been around for a couple of years, but you were growing.
00:17:32.800 And just to take a again, step back, just tell us why you thought it was necessary to
00:17:36.700 create the forum.
00:17:37.340 Well, I guess it goes back to kind of the root of what we call a post.
00:17:43.020 So when you post on Twitter, it's a tweet; you post on Facebook, I guess it's just a post.
00:17:47.240 But when you post on Parler, you create a parley.
00:17:50.480 And if you look up the definition of a parley, it'll be two opposing parties.
00:17:55.800 You know, maybe they are maybe they hate each other.
00:17:58.580 Maybe they're just in a disagreement.
00:18:00.400 Maybe they violently disagree with one another.
00:18:02.260 That and then a parlay is bringing those two people together or those two parties together
00:18:07.640 to have a civil discussion and to work things out.
00:18:10.460 They may not leave the discussion agreeing.
00:18:12.620 Maybe they found a better answer.
00:18:14.180 But when they leave the discussion, you know, they don't hate each other anymore.
00:18:18.080 That's the idea of a parley.
00:18:20.380 And Parler was founded on the principles of free speech, not conservatism, not liberalism,
00:18:27.600 not political.
00:18:28.740 You know, the idea that, you know, we all have an innate right to have a voice and that
00:18:36.680 we shouldn't be tracked or stalked or surveilled in the process of us speaking online.
00:18:43.460 And so those are our two, you know, founding principles, which I think, you know, it doesn't
00:18:48.720 get more American than that.
00:18:50.340 And it doesn't get more, you know, it's not, you know, this is what we need now more than
00:18:55.840 ever.
00:18:56.080 Well, I know that one of the one of the thoughts you had and you only you only graduated from
00:19:01.160 the University of Denver in 2014.
00:19:02.820 You're a young guy was you were sick of the ideological suppression you felt you were seeing
00:19:08.980 of conservatives by big tech.
00:19:11.940 And when I read that, I thought, oh, the irony, right?
00:19:14.500 The irony.
00:19:15.060 Little did you know what was coming your way, which which would be what certainly looks like
00:19:19.060 ideological suppression of your entire business by big tech.
00:19:22.220 Yes.
00:19:23.360 And and compounded on it, you know, a lot of a lot of angry people on the Internet, you
00:19:30.220 know, on places like Twitter, which, you know, I have described as being a very hateful place,
00:19:35.640 you know, are compounding it by spreading information.
00:19:40.020 You know, they started a rumor the other day that we were hacked, but it's not true, you
00:19:44.600 know, and that went on for a full day, you know.
00:19:48.140 So it's it's pretty crazy.
00:19:50.440 And and people genuinely, as a result, you know, hate us.
00:19:56.220 And I don't.
00:19:57.360 But we also have a lot of support.
00:19:58.580 Right.
00:19:58.980 I mean, it's not all doom and gloom.
00:20:00.580 We have a lot of people, millions of Americans all over the country who can see through it and
00:20:04.120 they know what's happening is wrong.
00:20:06.400 Well, I think most conservatives like like Parler and I mean, I'm on Parler and I like
00:20:10.400 it a lot and I actually do find it far less hateful than Twitter, far less.
00:20:13.800 And the people who don't think Twitter are hateful are liberals.
00:20:16.420 That's their forum.
00:20:17.600 You know, they get their worldview reinforced and they get people who are conservatives
00:20:20.640 shamed and ratioed.
00:20:22.700 And and Parler's like, you know, I lean more right, although I'm not a Republican, but
00:20:28.020 it's it reminds me of a line I heard from Ann Coulter years ago at the Republican National
00:20:32.480 Convention.
00:20:33.080 I saw her.
00:20:33.560 I said, Ann, how's it going?
00:20:35.040 How are you feeling?
00:20:35.740 And she goes, I feel great.
00:20:37.860 I'm in a sea of Republicans.
00:20:40.760 And I think, you know, Republicans have very few places they can go where they feel like
00:20:45.260 that.
00:20:45.520 And if Parler has turned into that for them.
00:20:48.220 Great.
00:20:48.800 Fine.
00:20:49.180 That's the marketplace working.
00:20:50.380 I mean, you found a market and you were growing.
00:20:53.820 But as with so many things, as with Fox News and even these more, I don't know, I mean,
00:20:58.680 OAN is more fringy, but it has a base.
00:21:02.040 Fine.
00:21:02.760 You may not like it.
00:21:03.840 You don't have to watch it.
00:21:04.720 You don't have to join Parler.
00:21:06.400 But to shut it down is to take it to a new place.
00:21:09.020 And so what were you told specifically?
00:21:11.600 Forget I know Google and Apple removed the Parler app, which is bad enough, but Amazon
00:21:16.360 removing your ability to even exist.
00:21:19.300 Basically, they shut down your ability to access the cloud, which means you can't get
00:21:24.860 Parler on your phone anymore.
00:21:27.440 You know, you'll see nothing.
00:21:29.360 So when you heard that, what was your first reaction?
00:21:32.920 You know, I was pretty shocked.
00:21:35.540 In theory, they could have done that, right?
00:21:38.740 In theory.
00:21:39.300 They've not, it's very, very rare, slash, you know, I've never really seen Amazon do
00:21:44.520 that to anybody.
00:21:47.560 But, you know, I thought, you know, it's feasible.
00:21:49.940 But I was assured from our rep that they would never do something like that.
00:21:54.220 You know, it's not in writing.
00:21:55.520 It was over telephone conversations.
00:21:57.340 But I felt really confident.
00:21:59.040 I used to be an Amazon Web Services employee.
00:22:01.280 I know the staff there.
00:22:03.320 They felt very open-minded to me.
00:22:06.020 And so it was very shocking to know that my former, you know, co-workers even did this
00:22:11.080 to us.
00:22:12.360 And, you know, it's worse than just having us taken offline, too.
00:22:17.780 Because, yes, you can't access Parler, but we can't even access our own code right now.
00:22:21.600 So our developers cannot work because our services that were hosted, that, you know, we even got
00:22:30.000 our own Git repositories and everything.
00:22:33.660 But they were all hosted between two data centers, one in Amazon and one in another data center.
00:22:38.420 The other data center also dropped us because they saw Amazon did.
00:22:41.800 And they were our hedge against Amazon.
00:22:44.200 And so, you know, it's more devastating than that.
00:22:47.940 You know, we lost a lot of communication.
00:22:49.640 Our Parler jury, who oversees the platform, and make sure that violent content doesn't
00:22:56.420 get out and make sure that these things, you know, don't happen.
00:22:59.440 The things that they're accusing us of, the group that actually organizes to stop that,
00:23:04.040 the messaging service that we use to talk with them also dropped us.
00:23:07.520 So we don't have any contact with our 600 jurors anymore.
00:23:11.380 So, I mean, they really, and even if we wanted to contact them, we can't access our servers
00:23:15.440 because Amazon shut them off.
00:23:16.640 So we can't get their email addresses, even if we wanted to email them.
00:23:19.640 So, I mean, this is, it's really what they've done is, uh, is devastating.
00:23:25.680 We're going to overcome it.
00:23:27.120 We're not, you know, I would, I would hate to describe Parler as done because we're not,
00:23:30.480 we're going to come back.
00:23:31.520 When somebody does something that's evil, you don't let them get away with it.
00:23:36.020 Well, and that's where you're, you're fighting back and you filed this antitrust lawsuit against
00:23:39.480 them, which I'll get to in a second.
00:23:40.700 I mean, I'll tell you just a headline.
00:23:42.720 I think you filed your, I think you filed your lawsuit on this, on the wrong basis.
00:23:47.340 I don't think this is an antitrust case.
00:23:48.780 I think this is a first amendment case.
00:23:50.200 And I think you should be going off of the, um, op-ed in the wall street journal written
00:23:54.880 by some constitutional scholars yesterday saying these guys should be treated as first amendment
00:23:58.800 actors and they should be held to the same first amendment, uh, or as, as government actors
00:24:03.000 and they should be held to the same first amendment standards as the state would be.
00:24:06.940 Um, that they can't shut down this kind of a speech.
00:24:09.400 It's, it's, they, they may not now that they control the internet, the way they do, they
00:24:14.140 have to be subject to different restrictions than your average private company.
00:24:17.660 This is just, this was never foreseen, but I'll get your reaction that one second.
00:24:21.220 Cause I do want to talk to you about the lawsuit before we get to that.
00:24:23.860 You know, what Amazon is saying is, look, we, we didn't want to shut them down, but there's
00:24:28.740 been a steady increase in violent content.
00:24:31.040 And what we've seen, especially leading up to the Capitol riot is you don't have a reliable
00:24:35.620 process to prevent violations of your, of Amazon's terms of service, which don't allow
00:24:41.240 that kind of violent content to post.
00:24:44.160 And they say this has been going on for a while and that they're not satisfied with
00:24:49.560 your jury.
00:24:50.400 I mean, over at Twitter, they've got Jack Dorsey who basically will ultimately can ultimately
00:24:54.540 decide this is an okay poster.
00:24:56.240 This isn't. Um, you guys have this
00:24:58.740 jury of one's peers, where it goes out to, let's say, five people who volunteer to
00:25:02.840 serve as jurors, random people brought together who have sort of been taught what the standards
00:25:07.700 should be.
00:25:08.720 And if they like it, it can stay.
00:25:10.440 If they think the post is okay, it can stay.
00:25:11.820 If they don't, it goes.
00:25:13.360 But what do you say to, to Amazon's claim that your system's not working and it's leading
00:25:17.940 to, as they say, quote, a steady increase in violent content?
00:25:21.260 You know, our, uh, chief policy officer and I, um, have also been in contact
00:25:29.140 with them, you know, have calls all the time with Amazon. They sent
00:25:36.300 us emails anytime they found stuff that they didn't like, they'd send it to us and say,
00:25:39.920 Hey, just by the way, just one or two complaints.
00:25:41.840 We said, should we be worried?
00:25:42.840 Do you have any problem with what we're doing?
00:25:44.040 No, no, no.
00:25:44.640 Continue.
00:25:45.160 You're doing fine.
00:25:46.160 You know, just keep on going.
00:25:48.080 Um, but you know, just take a look at these couple of things, but otherwise, you know,
00:25:51.620 you guys are doing great.
00:25:52.400 You're fine.
00:25:53.140 You know, and we get calls, you know, how can we help you scale more?
00:25:55.760 We love your business.
00:25:56.600 We love what you're doing.
00:25:57.520 How can we help you get more services?
00:25:59.400 Can we sell you more Amazon products?
00:26:00.940 Can we help you here?
00:26:01.820 Can we help you there every week?
00:26:03.300 It's the same thing.
00:26:04.600 And so out of the blue to pull the plug, and then to slander us by claiming that, you know,
00:26:11.020 they were unhappy for a long time, is false.
00:26:14.400 They weren't unhappy for a long time.
00:26:16.300 In fact, even the week of the ban, you know, we were having conversations with them,
00:26:21.700 asking how they could help us grow our business and help us succeed more, because
00:26:25.760 they liked what we were doing.
00:26:27.100 And, you know, they claim they couldn't find a quick remedy.
00:26:31.320 We even offered, we said, Hey, Amazon has products that can detect violent
00:26:39.660 content.
00:26:40.660 They have machine learning algorithms, Amazon Rekognition, with a K.
00:26:46.440 And, you know, we offered to use that, and they said that wouldn't remedy it.
00:26:52.100 And I'm like, well, if your own products can't remedy it, you know, this doesn't really seem...
00:26:56.140 Yeah.
00:26:56.700 How are we supposed to remedy it?
00:26:58.620 So what about the people who are looking at specific parlays now and saying, well,
00:27:05.040 look, this is egregious?
00:27:06.000 And the one that keeps getting circulated is the parlay by Lin Wood, who's the
00:27:11.120 pro-Trump lawyer.
00:27:12.200 Um, he's been permanently suspended from Twitter, and he tweeted out, or, forgive me,
00:27:17.700 I'm used to that language, but he sent out a parlay saying, get the firing squads
00:27:22.320 ready.
00:27:22.840 Pence goes first.
00:27:25.020 Um, that, you know, that's a problem.
00:27:28.520 That's a problem on Twitter, at least.
00:27:29.780 Is that a problem for you?
00:27:32.000 Well, that was removed off Parler.
00:27:35.240 That exact parlay that you're talking about from Lin Wood was deleted and was deleted
00:27:39.680 very quickly.
00:27:41.480 When did it go up, and when was it taken down?
00:27:44.700 I don't know the exact timing, but it was within 24 hours.
00:27:48.220 And the only place that that parlay existed on the internet, where people could see
00:27:54.280 it and spread that information, was on Twitter,
00:27:57.620 one of Amazon's clients, and they weren't taken down over it.
00:28:02.000 But it was mostly on Twitter, becoming a trending thing, because people were
00:28:06.120 so outraged about it.
00:28:07.120 It got retweeted and retweeted because people were like, what the hell?
00:28:10.500 Yeah.
00:28:10.600 But nobody informed them that it was removed off of Parler.
00:28:16.660 Well, so that wasn't the only one, right?
00:28:19.240 If you go through the list, there was a lot of talk of sort of insurrection
00:28:23.260 and violence and going to the Capitol on Parler.
00:28:26.240 And now that's being used against you saying your system didn't work well enough to call
00:28:30.580 out these communications.
00:28:32.560 Do you take any responsibility for them?
00:28:34.620 And is that true at all, that your system did not work very well to call out those
00:28:38.880 communications?
00:28:39.440 Well, we definitely didn't do our best job in the last few days because, you know, we
00:28:47.260 had a large influx of people and our backlog of, you know, cases that needed adjudicating
00:28:53.520 by the jury had skyrocketed.
00:28:58.340 And, you know, that was a problem that had been a problem in the past too.
00:29:03.140 And Amazon actually never talked to us about this stuff.
00:29:08.080 They never gave us any indication.
00:29:09.020 Actually, Apple was the only one who ever gave us any indication.
00:29:12.400 Apple in the past had communicated to us and said, you know, Hey,
00:29:18.300 looks like you're falling behind here.
00:29:19.500 And we said, yes, we are.
00:29:20.300 But, you know, give us, you know, we're, we're just really growing a lot.
00:29:23.400 Give us a few days.
00:29:24.120 We've got to grow out the jury some more.
00:29:25.240 We're taking this really seriously.
00:29:26.420 We just need to get more people on this.
00:29:27.980 And in the past, it's been acceptable.
00:29:29.840 And, you know, it never got to the point where they were threatening to remove us.
00:29:33.520 They just said, Hey, you guys should really do something about this.
00:29:35.300 And we said, yes, of course we are.
00:29:36.400 We'll take care of it.
00:29:37.480 And so, yeah, we were behind, but, you know, most of those things are
00:29:41.800 against our rules.
00:29:43.020 And when the jury would get to them, you know, they're supposed to take them down, and the
00:29:47.020 jury is trained and, contrary to popular belief and opinion, they are paid to do this.
00:29:52.540 And, you know, legally speaking, we want to follow the law, and we
00:30:01.940 want to make sure that that doesn't happen. But also, from a personal responsibility standpoint,
00:30:05.740 and in my personal opinion on the matter, we don't want violence.
00:30:09.720 I don't want violence.
00:30:10.780 If anybody followed me on Parler, they would know that I'm
00:30:16.880 really mostly a pacifist.
00:30:19.180 And so when I see this kind of stuff, you know, it, it makes me sick.
00:30:23.300 I, you know, I don't want to see that.
00:30:24.800 And that's not what our platform's for. Insurrection,
00:30:31.160 violence, all this stuff has no place to be spread on social media, especially not Parler.
00:30:36.520 Right.
00:30:37.200 And that's always been my belief.
00:30:39.320 So, but what do you, I mean, when you speak broad brush
00:30:43.920 and say there was sort of violent rhetoric, you say, okay, that's not good.
00:30:47.520 And then you look at the specifics, and I'm just going by what I've seen reported in the Washington
00:30:51.020 Post and elsewhere, saying there were specific posts on Parler.
00:30:55.540 Like you want a war?
00:30:57.780 Well, you're asking for one to the American people on the ground in DC today and all over
00:31:01.440 this great nation, be prepared for anything.
00:31:03.080 Now we're here.
00:31:04.080 Now they get what they want and going on saying, um, bring our weapons, bring your weapons in
00:31:11.880 support of our nation's resolve, um, will come in numbers that no standing army or police can
00:31:17.840 match.
00:31:18.380 The police are not our enemy unless they choose to be. Come armed, you know, on and on.
00:31:24.380 So it was like, those specific words are problematic.
00:31:28.580 You know, I wouldn't say I've defended President Trump's rhetoric around this issue,
00:31:34.080 but his specific tweets that got him in trouble, I do not think, are legal incitement.
00:31:39.660 These are a lot closer.
00:31:41.380 Sure.
00:31:42.080 And on those days leading up to those events, that content was viral and happening
00:31:47.060 on every platform.
00:31:48.520 They just looked at us.
00:31:50.500 In fact, a lot of these events were organized on places like Facebook and Telegram.
00:31:55.920 In fact, there was an account on Parler that had over a hundred
00:32:01.300 thousand people following it,
00:32:02.720 and it was recruiting people to Telegram, where they were organizing stuff like this.
00:32:06.780 We banned them weeks, at least a week in advance of the event.
00:32:12.740 What about that?
00:32:14.320 Because we're going to have Glenn Greenwald coming up on the program in a minute, and
00:32:18.880 he's going to be talking about what he saw on Facebook.
00:32:21.000 Um, do you see a double standard in the way Parler's being treated versus the way Facebook
00:32:27.180 has been, with respect to this particular incident?
00:32:30.100 Oh yeah, completely.
00:32:32.900 Facebook's gotten a pass.
00:32:34.300 Twitter's gotten a pass.
00:32:36.200 Um, and all of them have gotten a pass, really.
00:32:38.980 It's just, they wanted to find a scapegoat and blame them for what's happening.
00:32:42.800 And, you know, if you kind of look at the way the world's sitting right now, you
00:32:47.140 have a lot of angry people in the United States.
00:32:49.340 You have a lot of angry people who are upset about what happened on the 6th.
00:32:54.220 They're very upset about what happened and understandably so, right?
00:32:58.080 Nobody wants to see a country fall apart.
00:33:00.060 Nobody wants to see violence.
00:33:01.240 No one wants to see the Capitol attacked.
00:33:02.980 That was awful.
00:33:04.880 And so there's a lot of angry people and understandably so.
00:33:08.400 And so the irresponsible thing that has happened is these tech companies have acted as
00:33:15.540 cowards and have blamed Parler for it, or are insinuating it.
00:33:21.240 Maybe they're not outright saying it.
00:33:23.260 Apple definitely insinuated it in the email
00:33:25.840 they sent out to the press, as well as to us.
00:33:29.940 You know, that's what they insinuated.
00:33:32.080 And that's not what the country needs.
00:33:34.520 We need people to look at each other as humans again and say, we are not each other's enemies.
00:33:39.800 You know, we can't let our leaders blame each other for these things.
00:33:44.040 We have to come together as a country and move on.
00:33:46.880 What do you make of the fact, John, that not only are they now saying, you know,
00:33:50.580 the violent talk, what appeared to be plans for violence, is problematic, but
00:33:55.660 now they're going after Parler and others for any talk that was allowed
00:34:01.420 about widespread voter fraud, that that in and of itself should have been banned the
00:34:07.900 way Twitter kept doing the way we saw even some cable news channels do that, that those
00:34:13.220 discussions themselves were dangerous.
00:34:16.560 Well, that's a bit extreme.
00:34:17.560 If there was widespread voter fraud, wouldn't you think that, you know, talking about it,
00:34:24.280 at least finding some transparent answer that satisfies everybody, would be the answer?
00:34:28.020 But, you know, they're saying there wasn't any. You know, once those cases were settled
00:34:32.320 and it was proven that there was no widespread voter fraud, which by the way, I agree
00:34:36.780 there wasn't, they could not prove it in court.
00:34:39.020 That's for sure.
00:34:40.460 That the discussion, the mere allowance of an ongoing discussion about it was dangerous
00:34:46.940 and fired up these crazies.
00:34:48.780 That's essentially their argument.
00:34:50.860 Well, they didn't convince them.
00:34:54.120 You know, I don't know what to say.
00:34:56.060 You know, you have to talk to people.
00:34:58.100 Do you believe it's okay to have a discussion about something like that?
00:35:01.660 I mean, do you believe that people should be able to talk to one another and convince
00:35:05.520 each other?
00:35:06.100 Or do you think that it's up to, you know, an authoritative state or an authoritative
00:35:10.220 central point of power to sit there and say, this is what you may, and this is what you
00:35:14.120 may not talk about?
00:35:15.620 I know.
00:35:15.980 I have to say, I find it really problematic.
00:35:18.400 I've been fine all along saying, okay, they're holding on.
00:35:22.060 Let's talk about it.
00:35:22.920 Here's what, here's what I think.
00:35:24.260 They haven't been able to prove it in court.
00:35:25.440 In fact, when asked to specifically offer their proof of fraud claims in
00:35:29.320 court, they keep saying they have no fraud claim.
00:35:31.620 They don't have it.
00:35:32.920 They say a different thing in front of the microphones than they say in front of a judge,
00:35:35.860 which tells you what you need to know if you're paying attention to this.
00:35:38.600 And I think the more discussions we can have like that, the better.
00:35:41.760 But I don't believe in just saying, it's banned,
00:35:44.580 the words may not come out of your mouth because they're too dangerous, because there is a faction
00:35:49.360 of crazies in this country that, you know, consider themselves sort of an armed
00:35:55.480 but un-uniformed militia.
00:35:58.000 And that, you know, they're going to do what they're going to do.
00:35:59.960 I don't know that the solution is to play whack-a-mole and try to shut down those discussions
00:36:06.960 wherever they happen.
00:36:08.060 I just don't know, A, if that's going to work, and B, if that's the way this country
00:36:14.020 is supposed to work.
00:36:16.200 If you believe you've been wronged and you want to talk about it, even
00:36:23.160 if you're a little hysterical because you've been wronged, right.
00:36:25.720 And you feel you have been.
00:36:28.340 And so you want to talk about it.
00:36:29.920 The last thing that's going to de-escalate that individual or those groups of individuals
00:36:34.540 is going to be to ban them or to get rid of them or to censor them.
00:36:38.500 If you ban them, censor them, get rid of them.
00:36:41.380 It's going to make the problem worse.
00:36:43.020 You're pouring gas on a fire.
00:36:45.080 You're not solving it.
00:36:46.180 What you need to do is come out and say, I understand your concerns.
00:36:49.360 How do we get through this together?
00:36:51.280 And how do we get everybody on the same page here?
00:36:53.520 Because we're moving forward.
00:36:55.260 You know, we're not going to collapse over this.
00:36:57.340 We're moving forward and we're moving forward as a country.
00:36:59.580 So we need leaders to set that example.
00:37:01.400 We need the press to set that example.
00:37:02.880 Instead, people are just kind of riling them up more to get clicks and to get links to
00:37:09.360 their website so they can read about this outrage and get more ad revenue or, you know,
00:37:14.200 dehumanize the other side.
00:37:16.300 You know, this needs to stop.
00:37:17.620 This is more of a societal problem than trying to blame a specific social media platform for saying
00:37:23.620 free speech is allowed.
00:37:24.840 If free speech is not allowed and free speech is dead, then I think our democracy is too.
00:37:30.320 We have to allow people to speak their mind.
00:37:34.760 Since Amazon has cracked down only on you and not on Facebook, which also uses its cloud,
00:37:42.500 what is this?
00:37:43.260 Is this a PR stunt by Amazon?
00:37:45.220 I don't know exactly what their intention is.
00:37:47.960 I can speculate on a few things.
00:37:50.800 You know, I can speculate that they were probably worried maybe Donald Trump would join and they
00:37:54.920 wanted to make sure he was off the Internet.
00:37:56.360 And getting rid of us was a good way because they knew that we wouldn't ban somebody just
00:38:00.680 because of their name.
00:38:02.740 You know, I don't agree with his politics much of the time.
00:38:06.260 In fact, you know, I made that very clear during, you know, some of the time on Parler and some
00:38:11.480 of my parlays.
00:38:12.300 But, you know, he still has a right to speak.
00:38:14.380 And, you know, that is true.
00:38:17.900 But maybe that's part of the reason.
00:38:19.300 Maybe part of the reason is, you know, people were looking to blame Facebook, Twitter and
00:38:23.240 the tech companies and they wanted to cover up.
00:38:25.120 Or maybe they just wanted to get rid of a competitor because they saw us as a threat.
00:38:30.140 We had a large percentage.
00:38:32.000 You know, maybe not large, but a sizable percentage of U.S. voters on Parler.
00:38:37.540 Um, and, uh, people were getting a lot of real interaction.
00:38:41.800 A lot of links were being clicked.
00:38:43.480 People were actually viewing the material.
00:38:45.580 Um, you know, I know you put a lot of your podcasts and, uh, and, and some links on Parler.
00:38:50.020 Um, they probably performed very well.
00:38:53.200 Yeah.
00:38:53.820 So that's the threat.
00:38:54.500 Well, but I mean, you're tiny in comparison to Twitter.
00:38:58.500 You've got what, 10 million users and Twitter has almost 200 million, right?
00:39:04.040 So they dwarf you. You're just getting started, which doesn't mean that
00:39:09.260 you're not a threat at all to them, but you're a small threat.
00:39:12.560 So, you know, is it too self-aggrandizing to say they saw a threat and wanted to squash
00:39:17.240 it?
00:39:17.820 Or do you think you were just a political irritant?
00:39:20.980 Well, Twitter has a lot of accounts all over the world, right?
00:39:23.680 They have, in the Middle East alone, about 20 million. You know, Parler is concentrated in
00:39:28.040 the United States, which is where most of the revenue
00:39:32.660 comes from too, by the way.
00:39:34.640 So, you know, the United States is a smaller subset
00:39:40.820 of the accounts on Twitter.
00:39:42.180 Now, our accounts are harder to make on Parler.
00:39:44.620 You have to have an SMS number.
00:39:46.080 It limits the number of accounts you can make and the time it takes to get in.
00:39:48.380 And so, you know, they might be using a standard to measure user numbers that is not the same
00:39:53.640 as ours.
00:39:54.480 But in fact, we had a large number of people on Parler and I do feel that it was a threat
00:39:59.960 to their business model.
00:40:00.720 If I were them, I would have been afraid and I would have taken it seriously because if
00:40:05.480 you look at the engagement on Parler, you know, a lot of prominent
00:40:10.740 people, let's say Devin Nunes or Dan Bongino, even
00:40:15.100 Hannity, they were getting more followers and more engagement on Parler than they were on
00:40:19.240 Twitter.
00:40:19.660 People were using it.
00:40:20.960 They were getting real results.
00:40:22.700 Let me ask you this.
00:40:24.060 So that makes sense.
00:40:25.280 That explains why Twitter might be threatened by you, but why would Amazon be threatened by you?
00:40:29.740 Because you've now filed this antitrust suit trying to fight back.
00:40:32.340 And I admire you for trying to fight back.
00:40:33.920 I do think, in a lot of what we've seen this year, people have been too reluctant
00:40:38.300 to use the law, which will help a lot of people in these kinds of situations.
00:40:44.120 But your lawsuit is against Amazon, and it's alleging an antitrust violation.
00:40:51.440 And I don't see why would Amazon want to squash you for competitive reasons?
00:40:57.840 You're a client, Twitter's a client. What, competitively? Because that's what antitrust
00:41:03.300 law looks at: trying to squash competition.
00:41:05.680 Why would Amazon want to do that?
00:41:06.900 Well, a few things.
00:41:09.880 One, I'm not particularly well-versed as a legal scholar.
00:41:15.260 I'm an engineer, so I wouldn't be able to answer any legitimate legal question in
00:41:22.080 any proper way.
00:41:23.000 But, you know, it isn't just against Amazon, though.
00:41:26.500 It's my belief that they work together with the big tech community,
00:41:31.500 with Amazon, Apple, Google, and all of these other companies that banned us within a 24-hour
00:41:37.420 period too.
00:41:39.120 They all work together to benefit the established companies such as Twitter and Facebook, who
00:41:45.680 are very large spenders on Amazon and have just recently made large long-term commitments
00:41:51.280 to using their services.
00:41:53.700 Who else dropped you?
00:41:54.860 I mean, it's a pretty long list.
00:41:56.300 I'll give a couple examples. One of our legal teams, we had many
00:42:04.160 legal teams, but one that we relied on heavily at that time dropped us.
00:42:08.260 Um, and Slack.
00:42:09.500 Have you said who that is yet publicly?
00:42:11.720 No, um, I'm not prepared to yet.
00:42:14.020 I'd like to get a lot of it approved, but you know, some of them came out publicly.
00:42:18.600 You know, you have Zendesk, which banned us, and Slack.
00:42:21.520 We had Twilio ban us.
00:42:23.520 Twilio was pretty unfortunate too.
00:42:26.080 They are the largest SMS company in the world.
00:42:30.480 They send text messages.
00:42:32.100 So when people try to log in, we tell them, Hey, send this person a text message so we
00:42:36.320 can verify that that is their account,
00:42:39.900 and so that we can verify they own that phone.
00:42:43.040 That's how we kept out a lot of bots, a lot of hackers, especially a lot of foreign bad
00:42:47.080 actors.
00:42:47.780 We were able to track them effectively by making sure, like, Hey, this isn't a U.S. country
00:42:52.460 code.
00:42:53.080 Okay.
00:42:53.480 It's interesting.
00:42:54.220 Now they're posting a lot of U.S. news and they're pretending to be a U.S. media outlet.
00:42:57.880 Okay.
00:42:58.080 That's nice.
00:42:58.680 Now we know that, you know, this is good information.
00:43:01.600 So, you know, they were really helpful in that regard.
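The screening pattern Matze describes, requiring an SMS number at signup and watching for accounts with a non-U.S. country code that pose as U.S. media outlets, can be sketched like this. The function name, the `+1` prefix test, and the flagging rule are illustrative assumptions, not Parler's actual code.

```python
# Hypothetical sketch of the SMS-based screening described above. Everything
# here is an assumption for illustration; Parler's real logic is not public.
def flag_suspicious_account(phone_number: str, claims_us_outlet: bool) -> bool:
    """Return True when the account deserves closer tracking."""
    is_us_number = phone_number.startswith("+1")  # North American country code
    # The pattern described in the interview: a non-U.S. phone number behind an
    # account pretending to be a U.S. media outlet.
    return claims_us_outlet and not is_us_number

print(flag_suspicious_account("+44 20 7946 0000", claims_us_outlet=True))   # True
print(flag_suspicious_account("+1 212 555 0100", claims_us_outlet=True))    # False
```

The point of the Twilio dependency in the interview is that losing the SMS provider removes exactly this kind of signal.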
00:43:04.520 It's interesting because, for a lot of the things that they were complaining about,
00:43:09.040 they were taking away the tools, which would make it even worse
00:43:14.060 for us.
00:43:14.600 Right.
00:43:16.340 Which just goes to show you, like, they're trying to send a message
00:43:19.820 to America that they're not on your side, that they won't be associated with anybody
00:43:25.020 who allows violent discussion that leads to an event such as we saw last week, period.
00:43:30.560 And they're not really interested in the specifics, right?
00:43:33.980 Like, from what I can see, they don't care that Twitter
00:43:37.320 did it too, that Facebook did it worse.
00:43:39.860 You've been made the scapegoat, and now everybody's running without looking.
00:43:45.420 Do you think that storm is going to calm down, where people are going to say, eh, maybe we
00:43:51.280 can't really blame this on Parler and we need to lighten up?
00:43:54.020 I think the public has already started coming around a lot to it.
00:43:57.720 I think it's a small fringe group of very angry people,
00:44:01.540 mostly going viral on Twitter, who believe that Parler is evil.
00:44:07.060 And they kind of are experiencing what I'd call the five minutes of hate, where
00:44:11.980 they just learn to hate and get really angry.
00:44:14.380 And then they all want to just go after it.
00:44:16.220 They probably never used Parler.
00:44:18.060 They probably don't know what our core values are.
00:44:20.060 They probably just read a couple of pieces in BuzzFeed or the Washington Post, and
00:44:25.600 they go, you know, this place is evil, and they think it's a horrible right-wing
00:44:30.400 neo-Nazi place, which is not true.
00:44:33.280 Couldn't be further from the truth.
00:44:35.400 Uh, and you know, if I thought those things about a place, I would probably be outraged too.
00:44:42.260 And I wouldn't have any sympathy if
00:44:47.600 I thought, Hey, that place is all about spreading evil, violent things.
00:44:52.460 The problem is, they've been misled.
00:44:54.700 And so I don't blame the people who are angry and coming after us and calling for us to get
00:44:59.160 canceled.
00:44:59.540 Although I am against cancel culture.
00:45:02.020 Um, I don't blame them for their anger.
00:45:04.940 I just blame the leadership of these media outlets that
00:45:10.520 don't show journalistic integrity.
00:45:12.140 They're not telling the truth. And I blame these companies
00:45:17.040 for falsely portraying our problem as being bigger than the problems of our competitors.
00:45:23.360 You know, that's the problem.
00:45:26.360 What do you make of, you know, some of the tweets we discussed earlier that do have the
00:45:29.540 violent rhetoric in them?
00:45:30.980 Does it make sense at all to have somebody at the company who monitors
00:45:36.260 these parlays and says, when appropriate, we're going to get law enforcement involved?
00:45:41.920 I mean, was there any thought to saying, that's on our site and we're going to take
00:45:48.640 it down, or we're going to notify the cops?
00:45:53.080 We do.
00:45:53.960 We already do that.
00:45:55.880 You know, we have multiple authorities in different jurisdictions all over the United
00:46:01.700 States that our policy team works with almost daily for stuff like that.
00:46:09.400 Okay.
00:46:09.940 So is there any way of ramping that up?
00:46:12.040 If you're not going to beat Amazon, and again, I do think you have a different kind of lawsuit
00:46:15.620 against them, that you should at least try the First Amendment case, but if you're
00:46:19.780 not going to necessarily beat them and it's tough to go up against Amazon, what about trying
00:46:24.580 to comply with their demands in a way that doesn't gut your commitment to the free exchange
00:46:29.580 of ideas and speech?
00:46:31.620 Well, that's the crazy part. By the time Sunday came around
00:46:37.860 and they banned us, we had implemented a proactive algorithm.
00:46:41.640 It was only online for
00:46:48.080 a few hours, but it would analyze any and all comments and parlays before they went
00:46:54.500 out, to check to see if they were violent or toxic and potentially inciting violence,
00:47:00.220 and it would flag it and send it to the jury before it could go out.
00:47:06.780 And then, once the jury approved it, it could go out.
00:47:11.000 And we said, is that good?
00:47:12.480 And then by the way, also by the end of Sunday, we worked through most of our backlog that was
00:47:16.480 building up too.
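The proactive pre-publication flow described here can be sketched as follows. This is a minimal hypothetical illustration: the keyword heuristic, names, and queue stand in for whatever classifier Parler actually ran, which was not described in detail.

```python
# Minimal sketch of the proactive algorithm described above: scan every comment
# and parlay before publication, and route anything that looks violent or toxic
# to the jury queue instead of posting it immediately. The flag terms and all
# names here are assumptions for illustration.
from collections import deque

FLAG_TERMS = {"firing squad", "bring your weapons"}

jury_queue = deque()  # backlog of cases "that needed adjudicating by the jury"
published = []

def submit(post: str) -> str:
    lowered = post.lower()
    if any(term in lowered for term in FLAG_TERMS):
        jury_queue.append(post)  # held for jury review before it can go out
        return "held"
    published.append(post)       # nothing flagged: publish immediately
    return "published"

print(submit("Lovely weather today"))          # published
print(submit("Get the firing squads ready"))   # held
```

Note how a sudden influx of users inflates `jury_queue` faster than jurors can clear it, which is the backlog problem Matze describes.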
00:47:18.160 And, you know, they said, oh, that's great.
00:47:21.360 That's exactly what we asked for.
00:47:25.880 But no. We're like, well, then why did you have this meeting with us
00:47:29.680 if you already knew the answer was no? And I said, well, what if we use your
00:47:29.680 tools to do that?
00:47:30.700 Would that be good enough?
00:47:31.620 No, we just don't think you can handle it.
00:47:35.400 And I said, well, if we used your tools, do you not believe in your tools?
00:47:38.240 So, I mean, they had already made up their minds.
00:47:40.620 They were not talking with us in good faith.
00:47:43.060 They were talking with us, in my opinion, in bad faith.
00:47:47.360 And you don't think there's any way around that because right now this is about PR, right?
00:47:52.380 I mean, maybe it's principle, maybe it's PR.
00:47:55.880 But your point is, it's not actually about a violation of policy.
00:48:00.240 No, it has nothing to do with a violation of policy.
00:48:02.780 They made that up.
00:48:03.500 If it was a violation of policy, they would have talked to us a month ago.
00:48:06.360 They would have talked to us, you know, a week ago even about it.
00:48:10.840 You know, they gave us an extremely small amount of notice, an extremely
00:48:18.280 tight timeframe, on a Friday, during a weekend, to find new hosts.
00:48:24.500 People aren't even working on the weekend to sign the contracts.
00:48:28.280 We had to wake up CEOs of other billion-dollar companies to try to get them to sign paperwork.
00:48:34.120 And some of them did.
00:48:35.600 Some of them did on a Sunday or a Saturday or even a Friday.
00:48:39.640 We got one on Friday that agreed.
00:48:42.820 He got 60 employees up at one of his data centers.
00:48:45.840 And I'm not going to say the name of the company because I don't want to, they were genuinely trying to help us.
00:48:51.180 He got 60 employees up, a lot of them, you know, people on Parler.
00:48:56.400 And this was a billion-dollar company, very large data center.
00:48:59.940 They all started setting up servers to start hosting us for 24 hours.
00:49:05.860 Sunday comes around.
00:49:07.040 We're almost ready to switch over and use them.
00:49:09.240 Apparently, somebody on their board came to the CEO and said, you can't host Parler.
00:49:15.020 They dumped us on Sunday at the last minute.
00:49:17.600 We would have been online Sunday at midnight had that not happened.
00:49:21.900 And so these are the things that keep happening to us over and over again.
00:49:27.120 Is there any hope?
00:49:29.340 I mean, you mentioned at the top of the conversation, you had a few glimmers of light.
00:49:33.240 What are they?
00:49:33.880 Or can you talk about it at all?
00:49:35.140 Well, I mean, there's only one reality, and we have to rebuild from scratch on our own.
00:49:40.500 I mean, that's what we're going to have to do.
00:49:42.800 And we're going to have to be self-reliant.
00:49:45.380 We're going to have to rebuild most of the core infrastructure that most of these companies never have to build because they rely on, you know, people like Amazon or, you know, Twilio or these other companies to do for them, which is the industry standard.
00:49:57.200 And so that's the only way we can be really self-reliant and can't be blamed for, you know, things like this in the future.
00:50:07.380 That's the only way.
00:50:08.400 And so it's going to take time, but we're going to come through and we're going to do it.
00:50:12.300 We've got a lot of people motivated and a lot of people supporting us.
00:50:15.160 You know, you have millions of people all over the United States who want this back.
00:50:18.300 You know, we had almost 20 million accounts on Parler by the time we were, you know, booted.
00:50:23.000 So there's a huge number of people who want this back.
00:50:27.880 I'm one of them.
00:50:29.100 I mean, I'm sure there has been violent rhetoric on there.
00:50:32.120 I've seen a lot on Twitter.
00:50:33.320 I myself was subjected to about 200 tweets from our president that were pretty unfortunate and misleading and had me with, like, Saudi sheikhs who are known for bad conduct.
00:50:42.180 All made up and Photoshopped.
00:50:43.840 I never thought Twitter should ban the president.
00:50:46.320 I thought this is going to create a problem in my life.
00:50:48.520 I'm going to have to deal with it.
00:50:49.320 It's unpleasant.
00:50:50.140 It's America.
00:50:51.440 We're allowed to say stuff that's not true.
00:50:53.640 You know, I understand if it's, if it's insane.
00:50:57.200 If it's incitement, legally, that it could lead to immediate, you know, violence,
00:51:01.380 it's in a different category.
00:51:03.380 But you know what I think the standard for you guys needs to be? There needs to be a good system that goes after quote-unquote dangerous speech.
00:51:12.320 But the standard for what counts as dangerous speech has to be pretty generous, because this is America.
00:51:17.200 As long as you're involving law enforcement, and as long as you, maybe, get better and faster at cracking down on some of this stuff.
00:51:24.280 But I also think, I mean, you tell me, because I also really believe that the way forward for you is... Because I'm going to predict that your antitrust lawsuit is going to go away.
00:51:34.820 I think it's going to get thrown out. As a lawyer,
00:51:37.080 I'll tell you that.
00:51:37.720 But I do think that you've got a real argument that even though the free speech clause of the First Amendment prohibits the government, not private parties, from curtailing speech, that these entities, these that big tech is now effectively acting as a government arm.
00:51:56.480 I mean, they've been threatened by Democrats in Congress to do exactly this kind of thing.
00:52:01.820 So that's, you know, the stick, as this Wall Street Journal op-ed put it.
00:52:06.700 And they've been sort of lured into doing it like, you know, we're going to be super happy with you because all of America hates these conservative leaning publications, right?
00:52:16.480 All of the media, at least.
00:52:17.960 And I think you've got a good argument that these are now government actors and should be treated as such.
00:52:23.480 And that means they cannot abridge your free speech.
00:52:25.640 They cannot shut you down based on your political viewpoint.
00:52:28.720 What do you think?
00:52:30.080 Well, it's certainly a good idea.
00:52:32.740 We're exploring a lot more options.
00:52:34.960 So, you know, maybe this is something I can bring up with our team.
00:52:37.940 You know, we can talk to you this morning.
00:52:40.880 Which one?
00:52:42.280 Okay.
00:52:42.700 It's, well, I don't know.
00:52:44.760 Did I post it on Parler?
00:52:46.040 Abby's here with me.
00:52:46.800 She's my assistant.
00:52:47.580 She, I don't know if I posted it, if I reposted it.
00:52:50.840 But it's called, it's a Wall Street Journal opinion piece called Save the Constitution from Big Tech.
00:52:54.820 And the reason it caught my attention is because one of the authors is Jed Rubenfeld, who's a constitutional scholar at Yale Law School.
00:53:02.640 Really smart guy, and he goes through exactly why they've crossed over into state actors, and like the immunity that the government has provided to these guys.
00:53:15.580 The Section 230 thing, that actually works to your benefit.
00:53:18.600 The threats that the congressional Democrats have made to the social media giants if they fail to censor speech that the lawmakers don't like.
00:53:25.760 The tweet by Jen Palmieri, Hillary Clinton's former communications director, that basically said it didn't escape my notice that this happened just as the Democrats were about to take over all the committees that oversee these guys.
00:53:35.940 In other words, you know, real nice company you have there.
00:53:39.480 I hate to see anything happen to it, Parler.
00:53:41.080 All of that and the control that they operate, the monopoly they have over these clouds and so on, give you a much better argument, I think, that they have to behave as the government would, which means they can't have viewpoint discrimination in the way they manage these companies.
00:53:57.440 That seems like a fair argument to me.
00:53:59.940 We really need to stop what they're doing.
00:54:02.420 You know, this is sick because it's not just Parler.
00:54:04.200 It could happen to anybody they don't like, you know, at any time.
00:54:08.760 And if they could get away with something like this, that sets a horrible precedent, you know, where you live in a country that's ruled by, you know, Jeff Bezos and Tim Cook and a few other people.
00:54:20.720 You know, they just rule everything.
00:54:22.360 And if they don't like your business, they can shut you down.
00:54:24.520 I don't know what's worse, a tyrannical government or a tyrannical group of like three or four tech tyrants that, you know, give in to the rage mob online.
00:54:32.980 I mean, that's pretty crazy.
00:54:36.020 Well, and all of whom share the same politics.
00:54:38.340 So you're basically shutting down the voice of half of the country, which already doesn't see its voice expressed in media, in Hollywood, in these sports, you know, social justice activities we've seen, in corporate America's recent crackdown on beliefs.
00:54:53.320 None of it reflects what most Republicans believe.
00:54:56.480 And now the people who control the biggest means of communication in the United States, you know, big tech and everything we do online, are declaring,
00:55:04.640 I mean, it's a war, that it's on.
00:55:06.780 This is about much more than Parler.
00:55:08.500 So I await your amended complaint.
00:55:11.580 There's my unsolicited legal advice for you, John.
00:55:14.580 And I really hope you fight the good fight because I think this is much bigger than you guys.
00:55:20.120 And my experience of your company has been nothing but positive.
00:55:24.440 Thank you very much.
00:55:25.260 Yeah.
00:55:25.500 I mean, people are on our side, not just Republicans.
00:55:29.340 We've got people all over the spectrum on our side.
00:55:32.320 And, you know, I don't think that people are with these big tech companies.
00:55:35.380 There's a small, very vocal minority that is.
00:55:39.060 And I think we're going to win.
00:55:41.040 And I think we're going to come through just fine.
00:55:42.920 We might, we might come through this, you know, better in the end, if we can prove that what they did was so wrong and not legal.
00:55:52.260 It's certainly the best name recognition boost you could have asked for.
00:55:57.380 I wouldn't say good PR, but just your name recognition has gone through the roof.
00:56:00.860 So, you know, there's there's something to be said for that.
00:56:03.760 Look, all the best and good luck with it.
00:56:05.960 Thank you very much.
00:56:06.940 Take care.
00:56:09.300 From the founder of one new platform to the founder of another, Substack, its founder and CEO, Chris Best, is going to be here on independent media and what he's most worried about.
00:56:20.680 And then in a bit, we'll talk to Glenn Greenwald, who's on fire.
00:56:24.020 Stay tuned for him.
00:56:24.920 But before we get to any of that, listen, sticking to your New Year's resolution, we're not yet in February.
00:56:29.260 Have you blown it yet?
00:56:29.940 It's a matter of making one right decision at a time.
00:56:34.200 If you're looking to institute some healthy habits and improve your lifestyle this year, you need to check out SuperBeets Heart Chews.
00:56:40.400 In my family, we are trying to do less TV.
00:56:43.260 And actually, it's going pretty well.
00:56:44.680 That's also healthy for you and will lower your blood pressure, especially if what you're watching on television is cable news.
00:56:49.780 So eat your SuperBeets Heart Chews and avoid cable news, and you're going to be healthier before you even get to February.
00:56:55.620 All you need is two of these things, two SuperBeets Heart Chews per day, which will give you the cardiovascular support and promote the heart-healthy energy that you need to chase your goals.
00:57:06.540 SuperBeets Heart Chews combine non-GMO beets and clinically researched grapeseed extract, shown to be two times as effective at supporting normal blood pressure as a healthy lifestyle alone.
00:57:17.360 When it comes to implementing healthy habits this year, adding SuperBeets Heart Chews to your daily routine is an easy and smart decision to make.
00:57:25.160 And now you can get a free 30-day supply of SuperBeets Heart Chews, plus a free 30-day supply of their new delicious flavor, Super Grapes, with your first purchase.
00:57:35.360 Just go to getsuperbeets.com slash MK.
00:57:38.700 That's two free gifts, peeps, valued at over 50 bucks.
00:57:43.360 And it's only available at getsuperbeets.com slash MK.
00:57:47.300 That's getsuperbeets.com slash MK.
00:57:50.320 Chris Best, thank you for being here.
00:57:55.780 Thanks for having me.
00:57:56.760 We just finished speaking with the Parler founder and CEO.
00:58:02.260 And look, he's scared.
00:58:04.180 I think he's mad and feels like they've been targeted, they've been singled out by Amazon, by Apple, by Google.
00:58:12.840 Do you think he's right?
00:58:14.240 I think so.
00:58:14.780 I mean, it's definitely concerning seeing, especially people higher up or lower down, I should say, in the tech stack, like having AWS take action against somebody, I think, is a different kettle of fish than, you know, Facebook deciding what page to leave up or not.
00:58:32.560 And it's definitely, definitely concerning.
00:58:35.000 So, I mean, when you look at this as a guy who started his own company, Substack, that allows for independent media to get their views out, do you worry?
00:58:46.360 You know, could you be next?
00:58:48.200 I worry about this stuff in general.
00:58:50.480 I think we're at a really bad moment in our sort of shared understanding of norms and values.
00:58:58.900 And, you know, all this stuff that's happening right now, I think there's a lot of cases where there aren't good answers.
00:59:07.460 Like, we've gotten to a place where every alternative that all of these platforms and all of these people have is kind of bad.
00:59:13.720 And the fact that we're having to have the conversation, like, hey, you know, when is it appropriate for social media platforms to kick the president off?
00:59:23.740 The fact that we're in the place where we're having that conversation means that something went very wrong, like some time ago.
00:59:32.360 Like, things have not been in a good trajectory for a while.
00:59:37.620 And the thing that I'm fascinated with, although I know this is all kind of like really hot in the moment right now, the thing that fascinates me is like what has been happening to our information ecosystem over the past several years, the past decade, that's got us to this point?
00:59:55.700 And how can we improve those underlying causes?
01:00:01.180 Because we're at a place now where, like, whatever happens, it's not like whether, you know, when do you decide to kick the president's Twitter account off?
01:00:09.720 When does, you know, is it only hosting, like should hosting providers be able to kick people off for this or that?
01:00:14.860 None of the outcomes of these decisions are going to really help things.
01:00:20.360 It's kind of going to be bad, whatever happens in my estimation.
01:00:24.160 No, I sent out a tweet yesterday saying these companies are going to rue the day when they did this, because they may think this is solving the problem of Trump or Parler.
01:00:33.720 It's creating a whole different kind of hornet's nest for them.
01:00:36.800 I just really think the lawsuit's going to come alleging that these guys have now become government actors and need to be treated as such.
01:00:44.720 And that's going to change their companies forever.
01:00:47.860 That's going to change the Internet forever.
01:00:50.920 Yeah, I think that's...
01:00:54.160 People are right to be concerned about that.
01:00:57.420 I also think that we're like I'm somebody that has a very strong bent towards kind of free speech and freedom of the press.
01:01:04.720 It shapes what we do at Substack.
01:01:06.580 I also think a lot of people who are wringing their hands about this are wringing their hands a little over dramatically and not selectively enough.
01:01:16.240 Like I think the question of, you know, what role do platforms have in shaping and like deciding who's who gets to use them and who who doesn't is actually a pretty nuanced question.
01:01:30.580 And the answer is not an obvious, simple rule, even if you're someone that strongly believes in free speech and a principle of the free press.
01:01:40.140 What do you mean? I mean, what all we know, all the regular layperson knows is apparently Parler had to use Amazon to get out to the people.
01:01:49.700 And now Amazon has said, you're booted, and they have precious few other options.
01:01:54.980 Yeah. And I definitely think that hosting providers like Amazon.
01:02:00.380 So Amazon runs like the servers that Parler runs on.
01:02:04.180 And I think that's an instance that is more concerning and deserves more scrutiny, of like, hey, you know, the lower you are in the stack, which is kind of like the further away you are from the end user, and the more that you're just powering the inner workings of how servers run.
01:02:23.280 And I think there's a much stronger case to be made that intervening in these kinds of decisions is a big mistake and a bad precedent.
01:02:31.360 Right. You're kind of closer to that. You're getting closer in that world to being like, you know, the utility company.
01:02:37.300 I agree. It's like the phone company stepping in to turn off your phone if you're talking about white supremacy or something really bad.
01:02:44.980 But, you know, up until this point in history, at no point has AT&T been stepping in to say you're disconnected.
01:02:54.460 Exactly. And I think that's a really strong argument, and I agree with that. I think that AWS doing that is a mistake.
01:03:02.140 I do think that the same people who are saying that now are saying the exact same things about, hey, you know, I don't think Twitter should ever be allowed to kick the president off.
01:03:10.300 I don't think Facebook should ever be allowed to moderate these things.
01:03:13.400 Because, and I think when you get into that world, a lot of that is less compelling, or at least less clear-cut.
01:03:21.640 And everyone in this debate, it feels to me, has kind of like rapidly switched sides of what they believe due to circumstances.
01:03:32.720 Like you got a bunch of people kind of on the left saying, you know, hey, of course, it's of course, corporations have the right to like intervene in political speech because that's really important and good.
01:03:42.020 Because it's convenient in this case. And, you know, which is tougher for me to criticize, you get people who are making kind of vaguely free speech arguments, I would say disingenuously, saying that, hey, you know, none of this stuff is actually a problem.
01:03:58.240 You know, all of this is just protected political speech, yada, yada, yada, yada, and kind of refusing to address the elephant in the room, which is at some point, if you're inciting, you know, a violent mob to march on the Capitol, it's different than expressing, you know, expressing your beliefs.
01:04:16.680 Mm hmm. Well, that's there's no question. It is different. But and even under the First Amendment, even if this were a government making these decisions, the Supreme Court has been very clear that time, place and manner restrictions on speech are OK.
01:04:28.620 You know, you can say you can protest that we saw this with those crazy lunatics at the Westboro Baptist Church who were protesting the funerals of dead soldiers with horrific signs.
01:04:39.460 And they said that they had the First Amendment right to do it. And the Supreme Court agreed, but said there can be time, place and manner restrictions placed on you.
01:04:51.020 So you may have the right to show up and say crazy-ass things, but the local government of the state can say you've got to be 100 yards away from the church or the cemetery, and it can only be done during the following hours.
01:05:03.160 Like, all those things are OK. So I feel like if Twitter, if Facebook, if Amazon get treated as government actors, which I do think is where this is going, they, too, will be allowed to use time, place and manner restrictions, neutral, content-neutral restrictions.
01:05:18.220 Right. You can say we can't have illegal postings, like child pornography. You can say we can't have violent postings that may place somebody in danger, but they have to be applied uniformly.
01:05:28.600 You can't single out just the one conservative platform. And I really think that's how this looks, right? Because Twitter has a lot of violent postings.
01:05:39.480 Now, they say that their system's better for calling it out. But Facebook, you know, they have a lot of violent postings leading up to the Capitol protest, and they're just fine with Amazon. Amazon's not cracking down on them.
01:05:50.200 Yeah, I think we're at a delicate moment here, where the things that I worry about a lot are kind of the overreactions in both directions to what's happening.
01:06:02.220 Right. Either people who are upset about the, you know, the march on the Capitol saying, oh, this is a great excuse to kind of come in and have like a massive crackdown on not just this one thing, but all of these things that we've disagreed with all along.
01:06:15.400 And, you know, anybody who doesn't buy into all of that is not with the program, and they're bad. And I think that way lies madness.
01:06:22.480 But I think that there's an equal threat in the other direction, saying, hey, you know, we've always said that these private companies, it's their platform.
01:06:30.120 They can allow or not what they want. But now that they're cracking down on people that I like, I actually don't like that.
01:06:35.080 And I think the government should be stepping in and, you know, saying what the moderation decisions on these platforms should be.
01:06:42.480 And I think if you have any kind of a liberty loving bone in your body, that should give you the major creeps.
01:06:48.820 But who wants Nancy Pelosi and Chuck Schumer regulating what we say?
01:06:53.700 And on the other side, you know, President Trump regulating what we say.
01:06:57.500 Nobody wants that. No one, no sane person should be putting this anywhere near the hands of the government.
01:07:02.240 But right now it's in the hands of Mark Zuckerberg and Jeff Bezos and, you know, all the others who run these companies.
01:07:08.220 And that's not OK either because, as you know very well, these are all left-leaning guys. Even if they weren't, even if there was a diversity of views, it'd be problematic.
01:07:16.700 But you know how this is going to go. One side of the country, by the way, Republicans are half the country, is going to get shut down.
01:07:23.740 And you want to talk about insurrection? This isn't going to help prevent the next one.
01:07:28.640 All right. Let me before we go forward with that, let me just give people your background, because I don't know that everybody knows what Substack is.
01:07:34.420 And I'll tell you, it first came to my attention because people I really liked reading, like Matt Taibbi went over there.
01:07:40.760 I'm like, what is this place? Why is he writing over there? He works for Rolling Stone.
01:07:44.820 Why doesn't he just write his stuff on Rolling Stone?
01:07:46.740 And then I found out just by reading what he's been writing and listening to him.
01:07:51.760 And then I had him on the show.
01:07:52.720 So Matt Taibbi doesn't necessarily feel he can print everything he wants to print in Rolling Stone and wanted a more open forum where he could write what he wanted to write.
01:08:02.780 And that's what Substack is, as I understand it.
01:08:05.340 It's a place where independent writers can go, journalists, what have you.
01:08:09.720 They can they can write what they want to write.
01:08:11.480 They can have a direct relationship with consumers where the consumers pay, let's say, five bucks a month and then they get to read whatever Matt Taibbi writes or Glenn Greenwald now without an editor stepping in to say no, without Amazon so far stepping in to say no, without Chris Best, who owns and runs the company, stepping in to say no.
01:08:34.520 Do I have it right?
01:08:36.180 Yeah. Yeah, that's right.
01:08:37.180 And there's you say sort of have a direct connection with your audience.
01:08:39.560 And there's two really important pieces of that.
01:08:41.480 One is that, yeah, you know, you can have paid subscriptions on Substack.
01:08:44.800 People can pay you directly.
01:08:45.900 And so you're kind of hired and fired by your readers.
01:08:48.440 And the other piece is that because it's based on kind of an email list, like you have this direct connection where you get to, you know, send your stuff to your readers who have opted into it whenever you want, unmediated by an algorithm that doesn't necessarily have your interest at heart.
01:09:04.320 So this right now, I would think, is looking very attractive to people who don't have the correct views.
01:09:12.200 And I speak, of course, of conservatives or people who support President Trump.
01:09:15.160 That is not the right view to have at the moment.
01:09:17.300 And I've listened.
01:09:19.480 I mean, I listen to conservative radio and podcasts and television along with the regular mainstream.
01:09:23.980 And I know that conservatives are scared right now.
01:09:26.400 I mean, if I told you the number of people, Chris, who have contacted me by text over the past few days saying, let's make sure we have each other's numbers.
01:09:33.500 We could all get shut down.
01:09:35.200 People are scared.
01:09:36.520 So I think a place like Substack is probably looking good to folks who are worried they might get booted or their platform might get, you know, attacked.
01:09:43.760 But you're not immune from this, are you?
01:09:46.740 Are you immune to this, to this kind of crackdown?
01:09:48.560 Because you also have servers that you probably don't control, no?
01:09:52.360 Yeah.
01:09:52.740 I mean, you know, that's part of why this stuff is worrying to me, too.
01:09:56.480 Like, I don't think that we should be having people shut down people's servers and all that stuff.
01:10:00.260 I think that that way lies madness.
01:10:02.140 To me, the more interesting question here, though, is not like, you know, yes, our present moment is crazy.
01:10:08.520 Everyone's starting to kind of, like, realize that things have gone off the rails.
01:10:11.560 But things have been going off the rails, like, for some time.
01:10:15.720 And this is sort of the reason we started Substack.
01:10:20.740 It's the reason we started this company: I kind of feel like the place we're at now is a necessary consequence of the way that, like, the internet and our reading habits
01:10:31.220 and the incentive structure within media has been pushing the whole discourse for the past sort of decade.
01:10:41.600 Outrage, outrage, outrage.
01:10:43.020 Outrage, outrage, outrage, right?
01:10:44.340 You have all these things where, like, you know, on social media, on these feeds that are maximizing for engagement, you know, it's all about these takes, right?
01:10:51.980 The people are vulnerable.
01:10:52.800 It doesn't matter who says the take.
01:10:53.720 As long as someone says the take, it's going to, like, take off and be great.
01:10:57.240 And that means that you get this kind of, you know, market for outrage where even if you're an honest person and you don't want to take the cheap take, someone's going to take it, right?
01:11:07.100 So any incendiary take that can exist will eventually exist.
01:11:11.700 The truth gets devalued.
01:11:14.440 We have this, it completely breaks our ability to, like, see our own society and understand it, right?
01:11:19.800 Because you're always seeing the craziest takes from everybody that you see as your enemies.
01:11:23.840 And you get this funhouse mirror view of society where everybody that's not on my team is completely crazy.
01:11:30.380 It completely justifies whatever my team does.
01:11:32.480 And if enough people believe that, if enough people, you know, even if you don't believe in what people say on social media, people tend to believe that other people believe it.
01:11:41.880 And then you get this effect where that becomes a reality, right?
01:11:47.160 People start to organize around these crazy lines where whatever we do is justified.
01:11:52.560 Whatever they do is awful.
01:11:53.740 Everything is in this complete fight to the death.
01:11:56.060 And then you get this groupthink, right, where anybody that's critical of your own tribe lives in fear of being ostracized, of being shut out.
01:12:08.280 And you really do see this across the spectrum, right?
01:12:11.760 Like, you do see this, you know, I think you're probably familiar with, like, the left-wing cancel culture, you know, throw people out.
01:12:18.080 This happens on the right too, right?
01:12:19.460 People that are insufficiently supportive of Trump have been being thrown out of things for a long time.
01:12:25.720 And it breeds this place of craziness that no individual moderation decision or rule about a company kicking people off or not, none of that's going to help unless we can rewrite the rules of how we're having these conversations in the large scale.
01:12:44.540 Well, and you're starting that, I mean, by letting, by providing a platform for these journalists, you know, Andrew Sullivan's another one, to go direct to consumer.
01:12:54.880 I mean, I guess it's not totally direct because they're using you.
01:12:58.200 But I, you know, the pushback we've heard so far against places like Substack is, you know, by the mainstream journalism outlets, which, of course, feel threatened by you.
01:13:08.160 The New York Times does not like a group like you because great writers can reject them and go write on Substack and make a lot of money and have all the same followers who would have read them in the Times.
01:13:19.040 But the pushback, of course, is there's no control of the information.
01:13:24.580 There's no fact checker.
01:13:26.340 There's no editor, the way we have at the Times, who, of course, is the entity that led the 1619 Project to be published, which is a disgrace.
01:13:34.140 My favorite one is, I think it was the, was it New Yorker magazine said, but is all of this journalistic individualism really good for the collective?
01:13:44.940 The collective.
01:13:45.980 The editor thing is a bit of a canard, by the way.
01:13:49.400 You know, on Substack, you can use an editor, right?
01:13:52.560 And often a lot of great writers really like having a good editor and benefit from having a relationship like that.
01:14:00.320 The difference is that on Substack, the writers and the readers are in charge.
01:14:05.720 So if, you know, Matt Taibbi or Heather Cox Richardson or any of these people, you know, it's up to them what they're going to publish.
01:14:13.280 And if they want an editor that's going to come and help them make it better, they can do that.
01:14:17.700 But there's no one that's coming in and telling them, you know, drop that story.
01:14:23.260 It's too hot or whatever.
01:14:24.680 Well, so and given the freedom that they have and these these guys who have been going over and gals there, and I say this in a good way, they're kind of like outlaws.
01:14:35.840 You know, they're the ones who maybe used to be acceptable to, in particular, the mainstream, the left.
01:14:40.860 And then, you know, when they just refused to bow to what they were being told they must write became unacceptable.
01:14:47.620 And they said, you know, screw you.
01:14:49.160 I'm not compromising my integrity.
01:14:50.600 The truth is the truth.
01:14:51.520 I'm thinking about Glenn in particular, and I'm going to keep doing it.
01:14:55.340 And that, I think, makes you a big target.
01:14:58.220 And so I do wonder just I mean, when you say it, you worry about it sometimes.
01:15:02.200 Like, how could they shut you down if if somebody wrote something, you know, let's say it was factually incorrect and it led somebody to do something nuts?
01:15:12.580 That'll be the argument.
01:15:14.080 I don't know how much I believe in like this one person's words led this other person to do X.
01:15:19.240 But anyway, it can happen.
01:15:21.140 And they come after you.
01:15:22.240 How could you get shut down?
01:15:23.280 I think one important piece about Substack is the way that you own the audience is that you have the email list.
01:15:32.260 And so one of the things that we put in as a deliberate valve for this is the fact that you can actually leave Substack.
01:15:38.920 You have exit rights.
01:15:40.660 And so, you know, the worst case scenario from any way that someone could get shut down on Substack is basically, you know, hey, you can't publish on Substack anymore.
01:15:50.620 But you own all your content, you own the rights to it, you have all your content, you have your email list, right?
01:15:56.460 Everyone that's kind of opted into having this relationship with you, you have it, you can take it with you.
01:16:02.160 And that's true, you know, whatever kind of a way for whatever you worry about, right?
01:16:07.000 Whether Substack turns evil and starts kicking people off or, you know, the government comes in and shuts our servers down or whatever thing you worry about.
01:16:15.480 I actually think that's the best sort of practical answer that we have to this whole thing is to just say, like, look, there's not going to be a world where we can promise that we'll never get shut down.
01:16:27.880 There's not going to be a world where we can promise that we're never going to shut anyone down.
01:16:31.440 Like, we're, you know, we're strong proponents of a free press, but we have a conduct policy.
01:16:35.260 There's not, you know, we don't have to allow everything in the world.
01:16:38.080 However, if that does happen, you know, you own the rights to your stuff.
01:16:43.740 You have the relationship with your people, and you can leave.
01:16:47.160 I think that's a robust answer to that.
01:16:50.400 No, that's good because, you know, it's like on this show, for example. This is a podcast that, you know, it's hosted everywhere, but we're subject to their whims, right?
01:17:01.020 So if they wanted to pull it, they could pull it.
01:17:02.700 And my pal, Ben Shapiro has a hybrid model where he offers a free podcast, but he also asks people to subscribe to the Daily Wire because he's hedging his bets.
01:17:13.120 He realizes he could be targeted and he wants to maintain that direct relationship.
01:17:17.760 So he's found, I think, a pretty clever way of doing it.
01:17:21.060 And I'd hate to believe we all are going to need to be in that role where we have to hedge the bet, but we probably will, because this sort of interference in speech and opinion-giving is getting worse, not better.
01:17:35.760 And I take your point very well about how the opinion industry has gotten pretty disgusting writ large, right?
01:17:42.980 Like, the need to stoke outrage and not be based in fact and so on. But I don't know what the answer to it is.
01:17:49.480 Well, and as much as, I mean, here's, here's part of our answer, as much as all of that is scary, right?
01:17:54.800 Like, yes, there's a bunch of, you know, it's better to be a pirate than join the Navy, and you get all of the rebels coming to Substack.
01:18:00.900 That's a massive tailwind and advantage for us.
01:18:05.060 And for this kind of a model, because what it means is that all of the people who at some point placed their integrity above the kind of, like, whatever pressures were on them to deviate from, you know, either telling the truth as they saw it, or doing the work they thought was most important, or whatever that thing is.
01:18:25.240 The people that were kind of willing to stand up to the group think that they were pressured by wherever they were, like those tend to be the best people.
01:18:34.400 Those tend to be the people that I, as a reader, if I'm choosing, like, who do I want to trust?
01:18:39.540 Those are the people that I want to trust.
01:18:41.180 Those are the people that, that paid some personal cost for sticking up for their integrity.
01:18:47.020 And like, if I'm thinking about, hey, who do I want to chip in 10 bucks a month to support?
01:18:51.420 It's them.
01:18:52.740 And I think substack is a really appealing place for those folks.
01:18:56.200 So does your company grow at all from the written word?
01:18:58.560 I mean, I feel like it's, it's really more for obviously the written word, but do you do, are you going to do podcasting?
01:19:03.180 Are you going to do digital television?
01:19:05.440 What are you going to do?
01:19:05.980 Um, so we're very focused on writers right now, um, because we think that that's, there's just such a need and there's so much room to grow within that space.
01:19:18.100 It's the thing that's kind of like the most underserved, has the most broken ecosystem, in our estimation.
01:19:24.240 Um, and so we're, we're very focused on writers that said, it turns out that a lot of writers want to have a podcast.
01:19:31.120 And so we do actually have a podcast, some beta podcast functionality that's in the platform now.
01:19:36.080 And I think over time that will give us a natural way to expand.
01:19:40.040 It's like people that are great writers, sometimes they want to do a podcast.
01:19:43.320 Sometimes they want to do a video lecture.
01:19:44.540 Sometimes they want to do this other stuff and staying focused on writers, uh, gives us kind of like the right way to focus and prioritize that stuff.
01:19:53.980 So running back to the top of our discussion, what do you think is the answer?
01:19:57.520 I mean, do you think, let's start with this.
01:19:59.820 Do you think Twitter was right to impose a lifetime ban on Trump?
01:20:05.140 I think that, as I said before, by the time Twitter is put to these decisions, you know, by the time Twitter's having to make this call.
01:20:16.760 And it would be easy for me to criticize Twitter because we kind of, you know, are positioned against Twitter, positioned against the social media status quo.
01:20:25.140 It would be convenient for me to say, oh, crazy Twitter is coming to, to stomp on your, you know, stomp on your free speech.
01:20:32.660 And there are other decisions they've made where I absolutely believe that.
01:20:35.420 Like, I think their, their censorship of the, the New York post story of Hunter Biden was a, was a colossal error.
01:20:42.460 Um, in the case of banning Trump, I don't know if it's exactly the right moment or not, but I'm kind of sympathetic.
01:20:49.660 I'm like, you know, at some point there's a line somewhere, and you can quibble about when you drew the line and whether it was the right thing, you know, whether it was the right action or whether the process was good enough.
01:21:03.840 But to be honest, I don't think that relitigating that decision is actually even close to the overall solution.
01:21:13.680 I think the solution is we have to play an entirely different game.
01:21:19.320 The place we're in has happened because of what you said, because of the outrage, like the outrage market, right?
01:21:25.080 This media ecosystem we live in where there's this really, you know, hyper, uh, hyper competitive market for attention, right?
01:21:34.080 Markets are powerful.
01:21:35.320 You got to be careful what you create a market for.
01:21:37.180 We've created a market for outrage.
01:21:39.160 And as long as that's how the business works, it's going to lead to bad outcomes, no matter what everyone's moderation policy is.
01:21:47.160 And the only solution to that is to get people by their own free will to choose to play a different game, to say, I'm not actually going to spend all of my waking life on Twitter anymore.
01:21:57.020 I'm going to go somewhere else where I'm not the product and where the business model works a different way that actually serves me.
01:22:05.840 And, you know, Substack, we think that that's, people should pay for stuff they think is great, right?
01:22:11.300 If you, if you are paying to read a writer, to have an ongoing relationship with a writer that you trust, that fundamentally changes the kind of work that that writer is going to do.
01:22:21.780 They're not in the outrage-take market anymore.
01:22:25.200 They're in the keep your trust game.
01:22:27.200 Well, I like what you're saying as somebody who's now in independent media.
01:22:31.100 I like that because people get to choose.
01:22:32.840 And if you trust me, then you'll come to me.
01:22:34.400 If you don't, or you don't find me, you know, interesting, then you won't.
01:22:37.640 But I think it's different.
01:22:39.280 We're talking about the president of the United States.
01:22:40.880 Like it's the president of the United States.
01:22:45.340 Why can't he communicate with the people?
01:22:50.820 I mean, I realize a lot of what he says is totally untrue.
01:22:55.300 That's why we have fact checkers.
01:22:57.360 Let me put this to you.
01:22:58.600 Is there anything that he could say on Twitter that, you know, if you were the queen of Twitter, you would shut it down?
01:23:07.620 Is there anything that you could say?
01:23:09.600 Definitely.
01:23:10.160 I think, well, I mean, we already have rules that make sense.
01:23:15.100 Like you can't post child pornography, things that would be illegal.
01:23:19.460 And you can't dox somebody.
01:23:19.720 So you agree that there is a line.
01:23:21.600 Of course.
01:23:21.920 And we're quibbling over where the line is.
01:23:23.480 So it's not just, he's the president.
01:23:25.300 There's no line.
01:23:25.960 There's still a line.
01:23:27.220 It's just a question.
01:23:27.680 But I don't think he's gotten anywhere near it.
01:23:29.680 I don't think he's anywhere near the line.
01:23:32.120 I really don't.
01:23:32.660 I don't like what he says.
01:23:34.000 I don't believe much of it is true.
01:23:36.000 I don't think it's particularly helpful or necessarily always a good force in America.
01:23:40.380 But I don't believe in banning the president of the United States unless it is disgustingly egregious and absolutely has to be shut down.
01:23:48.140 And the two tweets that they actually banned him over are absurd.
01:23:52.980 Do you, I mean, they're nothing.
01:23:54.840 It's, they're not even arguably over the line.
01:23:57.840 It's like, um, my American patriots are going to have a giant voice long into the future.
01:24:04.620 They shouldn't be treated unfairly.
01:24:06.100 And I'm not going to be at the inauguration.
01:24:08.960 After all the tweets, that's what did it.
01:24:12.100 And even his tweets about voter, massive voter fraud, which aren't true.
01:24:15.040 There wasn't massive voter fraud.
01:24:16.220 There was voter fraud, which we should look into.
01:24:18.160 We should improve the system.
01:24:19.620 Um, no, you don't ban the president of the United States.
01:24:22.300 You don't, you don't interrupt him while he's speaking.
01:24:24.300 You don't ban his Twitter feed.
01:24:25.700 You fact check.
01:24:26.860 That's America.
01:24:27.800 The answer to speech you do not like is not less speech.
01:24:30.720 It's more.
01:24:31.460 Now, what do you think, Chris?
01:24:32.300 You, you put it on me.
01:24:33.320 Now, what do you think?
01:24:34.440 That's my answer.
01:24:35.160 I think, I mean, I sort of said this up top.
01:24:39.080 I think it's tough.
01:24:40.160 I think, you know, we agree that there's a line somewhere. When do
01:24:44.200 you cross the line?
01:24:44.660 I think it's, you know, it's not the content of the tweets by themselves.
01:24:49.080 Like if there wasn't a mob that stormed the Capitol, would he have been banned because
01:24:52.100 of that stuff?
01:24:52.700 No, obviously not.
01:24:54.480 Um, but I do think that there, you know, because of the nature of these platforms, because
01:25:00.140 it's not just the case that people have these direct relationships, but rather that the
01:25:04.200 platform is kind of like, has this algorithm that's, that's taking the most incendiary
01:25:10.520 stuff and sending it out to all the people so that it can keep them
01:25:15.080 maximally riled up.
01:25:16.180 And these effects do exist.
01:25:17.680 And it is sort of having these real world consequences.
01:25:21.820 I'm kind of like, you know, I, I understand the decision, to be honest.
01:25:26.380 But there's no proof of that.
01:25:27.500 I mean, there's, as a lawyer, I want proof.
01:25:29.240 What, what is the proof that Trump's tweets are the proximate cause of anything we saw at
01:25:33.020 the Capitol?
01:25:33.680 These people are disaffected and they're pissed off and they've been angry for a long time.
01:25:38.580 And I'm not going to deny that they've been egged on by him.
01:25:40.840 I've, I've said that publicly.
01:25:42.180 He's been really irresponsible in his rhetoric, but to blame it all on Trump is to deny reality.
01:25:49.260 They, they live in an alternate universe and with or without the internet, without Twitter,
01:25:52.740 without Parler, without Trump, they find a way to communicate.
01:25:55.440 They're, they're angry.
01:25:56.940 They're angry at the media, at the government, the Democrats, you could go on.
01:26:00.060 Right.
01:26:00.720 I, I, I just don't.
01:26:02.040 We all live in alternate universes now.
01:26:03.920 It's no good.
01:26:04.700 Just, this is like, you know, when people blamed, it was a Bernie Sanders
01:26:08.460 supporter who went and shot up the congressional baseball game and shot Steve Scalise, and
01:26:13.420 that wasn't Bernie Sanders' fault.
01:26:16.340 Some people are nutcases.
01:26:18.360 Some people are loons.
01:26:19.440 Some people are totally disaffected and are going to do bad things.
01:26:22.580 And you can't, the answer is not to run around playing whack-a-mole with people's speech.
01:26:27.500 It's to have a generally responsible system that allows for as much debate back and forth
01:26:33.520 and monitoring in case we cross over into the dangerous place as possible.
01:26:37.600 Yeah, I agree with that.
01:26:38.320 I think emphasis on the generally responsible system.
01:26:41.700 And I would say if you're in a place where you have an irresponsible system, then making
01:26:47.720 the individual moderation decisions different one way or another, like by the time we're
01:26:51.740 at the place where we have to talk about what, like, when are we banning the president?
01:26:56.100 Like something has gone wrong far upstream.
01:26:58.960 And that's, that's sort of where I want to focus my time and energy is like, how do we,
01:27:04.940 how do we change the underlying root causes of all of this?
01:27:09.220 Because, you know, whether they banned him this week or two weeks from now or not, like,
01:27:13.720 it's the fact that you have a set of people that think it's reasonable to storm
01:27:18.280 the Capitol and all of this other stuff that's going on.
01:27:21.000 Uh, it's not, it's not good.
01:27:25.380 Well, listen, I thank you for being part of the solution, which I really think you are.
01:27:29.100 I think it's an innovative and it's exciting what you're doing and, uh, totally rooting
01:27:32.720 you on.
01:27:33.160 So go get them.
01:27:34.500 Thank you so much.
01:27:35.260 In a minute, we're going to have Glenn Greenwald.
01:27:39.400 And I think you're going to love Glenn today.
01:27:41.580 I did.
01:27:42.060 My team did.
01:27:42.820 He's fired up and I'm fired up.
01:27:44.260 And he says all the things that need saying in the way only Glenn can do.
01:27:48.420 He's really good at this.
01:27:49.660 But listen, before we get to him, let's talk about ScoreMaster.
01:27:53.680 Now I shared a hot story a couple of weeks ago.
01:27:56.060 It nearly crashed the ScoreMaster website.
01:27:58.080 And the story is the average American has 97 points, three short of a hundred, that they
01:28:04.300 can quickly add to their credit score.
01:28:07.120 Can you believe that?
01:28:07.740 97 is a lot.
01:28:08.760 It can actually change your future, but most people have no idea how to get it.
01:28:13.500 ScoreMaster credit scientists discovered this algorithm, you see, that will super boost
01:28:18.320 your credit score.
01:28:19.380 Not just a few points, but 97 points and fast.
01:28:23.400 Imagine 97 points on top of your existing credit score.
01:28:26.160 It's super important if you're refinancing your home or buying a car, applying for credit.
01:28:31.300 Let's say you have okay credit and you're buying a car.
01:28:33.860 You go to ScoreMaster first and boost your credit score.
01:28:36.200 Let's forget 97 for now, right?
01:28:37.660 Forget that.
01:28:38.300 Let's just say the average of 61 points that could save you 9,000 bucks on your car loan.
01:28:44.080 And if you go to ScoreMaster and boost your credit score, just the average, again, a number
01:28:47.300 before you apply for the home loan, you could save almost a hundred thousand dollars over
01:28:51.940 the life of your loan.
01:28:54.260 If you own a business, same thing.
01:28:55.780 You got to get loans to fund projects or finance equipment.
01:28:59.240 You can super boost your business credit score and do much better and save yourself a fortune.
01:29:04.160 ScoreMaster will put you in control of your finances.
01:29:06.860 Enroll in just minutes and see how many plus points ScoreMaster can add to your credit score.
01:29:11.140 What do you have to lose?
01:29:12.520 Visit scoremaster.com slash MK scoremaster.com slash MK.
01:29:17.360 And now journalist Glenn Greenwald, who co-founded The Intercept. He recently left
01:29:25.060 it because they'd been trying to sort of put him under their thumb and say, only great things
01:29:29.440 about Biden.
01:29:30.240 I'm short forming.
01:29:31.580 But he said, forget this.
01:29:33.080 I'm going to do my independent thing and is now writing on Substack, where I recommend
01:29:36.960 all of you read him.
01:29:38.640 And happily, we have him here with us today.
01:29:47.740 Glenn Greenwald, thank you so much for coming back.
01:29:50.440 It's great to be back, Megyn.
01:29:51.720 Thank you for having me.
01:29:53.140 All right.
01:29:53.400 So I've been reading your tweets and your column on Substack, and you're just as,
01:29:59.320 even more fired up about this whole thing than I am.
01:30:03.020 And, you know, it isn't about Trump.
01:30:05.940 Let's just start with Trump and Twitter and then we'll talk about Parler, too.
01:30:09.100 But it's not about Trump.
01:30:10.420 You don't have to love Trump to defend him on the total shutdown of his ability to communicate.
01:30:16.320 You don't have to love Parler and the conservative discussions that happen on there.
01:30:20.360 They're not all conservative, but right-leaning, to defend what's, you know, to attack what's happening to Parler.
01:30:25.500 happening to Parler.
01:30:26.640 What's your what's your take right now?
01:30:29.040 Well, in general, when you're talking about free speech or monopolistic behavior, both
01:30:33.880 of which are at play here, it never really matters what your opinion is of the person
01:30:39.560 to whom that power is being applied.
01:30:41.680 And, you know, we can say it's not about Trump, but the evidence of that is that two of the
01:30:46.300 world leaders with whom Trump has the most acrimony, the chancellor of Germany, Angela
01:30:52.300 Merkel, and the president of Mexico, AMLO, who's a leftist, have both come out and very
01:30:58.080 vocally denounced Twitter and Facebook's banning of Trump on the grounds not that they love Trump
01:31:04.260 and think Trump should be heard, obviously, but on the grounds that they think that what's
01:31:08.520 happening is that Silicon Valley is essentially ascending to a position far more powerful than
01:31:14.880 any democracy and controls the discourse and the politics of world democracies by exercising
01:31:22.040 these censorship powers.
01:31:23.320 And they know it's not just confined to the United States, but to their countries as well.
01:31:27.520 So, you know, if anyone has any doubts about what you said, that it's not about Trump, not
01:31:30.960 about Parler, just look at the people.
01:31:33.180 Also, ministers in France who have said the same thing, who obviously have no love harbored
01:31:38.260 for Trump, and yet are just as concerned and troubled by this as are we, as is the ACLU,
01:31:44.060 by the way, who told the New York Times that what in particular was done to Parler is extremely
01:31:48.860 troubling.
01:31:50.200 Yeah, the ACLU, I mean, they've changed a lot from what they were born to be.
01:31:55.660 But, you know, when they're saying it's wrong and shutting down any sort of right leaning
01:31:59.960 speech or outlet, you've got to pay attention.
01:32:02.180 All right, so let me let me play the devil's advocate or argue the other side, OK, because
01:32:07.440 you and I probably agree, but I want people to understand both arguments.
01:32:11.440 Now, what Kara Swisher or somebody like her would say is, but violence, but five people
01:32:18.200 are dead.
01:32:19.220 But allowing Trump's rhetoric and the discussions on Parler, which got pretty specific in terms
01:32:24.580 of bring your guns and, you know, get Mike Pence first, can be directly tied to the dangers
01:32:31.680 that we saw unfold with our very eyes last Wednesday.
01:32:34.760 Right.
01:32:34.920 Well, first of all, on 9-11, 3,000 people died.
01:32:39.740 Three commercial jets filled with innocent human beings were flown into the World Trade
01:32:45.600 Center in lower Manhattan, and the Pentagon.
01:32:47.180 And yet we recognize that despite how traumatizing that event was, how evil and horrible it was,
01:32:53.620 not even in the same universe as what happened last week at the Capitol, still major excesses
01:32:58.640 were committed in terms of powers that the state seized in the name of combating it.
01:33:02.860 Most people recognize that now.
01:33:05.060 Some people took a longer time to recognize that than others.
01:33:08.600 That's the first thing.
01:33:09.520 The second thing is, you know, there's always a cost to free speech.
01:33:13.840 There's a cost to every liberty.
01:33:15.920 We have a restraint on the police that says the police can't come into our homes and search
01:33:21.560 our homes without obtaining a search warrant from a court, which means that the police often
01:33:26.260 are hamstrung in finding murderers or in finding rapists.
01:33:30.160 And sometimes people are able to rape and murder again because of that constraint.
01:33:33.500 But that liberty is something we all cherish and don't want to give up, even though it has
01:33:37.520 a cost.
01:33:38.100 Free speech also sometimes has costs, which are that people use that free speech to disseminate
01:33:43.060 dangerous ideas.
01:33:44.720 And so when you're weighing this framework, you can't just look at the speech being suppressed
01:33:50.220 and ask whether or not you think it's a good or a bad thing for that particular person to
01:33:54.680 be heard.
01:33:55.480 You have to look at the other side of the equation, which is the dangers that emanate
01:34:00.260 from empowering tech companies completely outside of a democratic framework.
01:34:05.460 No one elected Mark Zuckerberg or Jeff Bezos or Tim Cook or Google executives.
01:34:09.440 And yet they're exercising deeply political power, which has its own dangers.
01:34:13.780 And then finally, the irony of all of this, Megyn, is that most of the planning and advocacy
01:34:20.640 of that breach at the Capitol was not done on Parler.
01:34:24.340 The first 13 people arrested, at least as of Monday, none of them appear to be active users
01:34:28.600 on Parler at all.
01:34:29.900 Most of the planning was done on YouTube, owned by Google, Facebook and Twitter.
01:34:34.280 So why aren't Democratic politicians and people who are making that argument calling for the
01:34:38.960 removal of Facebook from the Apple store or from Google Play?
01:34:43.140 It's because this isn't about that.
01:34:45.620 It's an opportunity to destroy a platform that is a new competitor to Silicon Valley giants
01:34:51.340 and politically to destroy one that's associated with right wing ideology.
01:34:55.720 So they would say Facebook, Twitter, Instagram, they at least try to play the whack-a-mole.
01:35:04.680 They at least try to shut down those tweets and so on when they see them.
01:35:07.940 And Parler's too relaxed.
01:35:10.420 Well, it's not even true.
01:35:11.480 It's so funny.
01:35:12.480 It's not that, you know. I wonder, I know you signed up for Parler.
01:35:16.440 I know you've used it before.
01:35:17.540 I would be willing to bet that all but an infinitesimal percentage of people who are advocating Parler's
01:35:24.640 destruction and making claims about how it functions have never been on it, have no idea
01:35:29.500 what it is, have no idea what its rules are.
01:35:32.760 They have terms of service that explicitly preclude and prohibit advocacy of violence.
01:35:38.980 They have a team of paid, trained moderators who are there to delete any tweets or postings,
01:35:44.560 as they call it, in violation of their terms of service, just like Facebook and Twitter
01:35:49.360 do.
01:35:49.780 So imagine if someone didn't know what Facebook and Twitter were like, and
01:35:53.460 you cherry-picked the worst possible tweets or the worst possible Facebook postings
01:35:58.800 to show them, they would say, oh, my God, this is like a neo-Nazi site.
01:36:01.660 This is an insane violence site, right?
01:36:04.140 You know how many times I've seen people advocating my death or violence against me?
01:36:08.340 I'm sure you have, too, on Twitter or Facebook.
01:36:10.100 Sometimes it's deleted a few days later.
01:36:11.720 Sometimes it's not.
01:36:13.200 Same with Parler.
01:36:14.120 They have exactly the same terms of service and exactly the same moderation practices
01:36:18.820 as Facebook and Twitter and YouTube.
01:36:21.080 And as I said, the planning was done overwhelmingly on those larger sites, not on Parler.
01:36:27.460 Yeah, no.
01:36:28.280 And I said, Jane Fonda, you need to stop that.
01:36:31.260 Just kidding.
01:36:34.100 Just get over it.
01:36:36.840 No, you're exactly right.
01:36:38.440 Even Instagram.
01:36:39.380 It's like Instagram is usually the sweetest place of the nicest pictures.
01:36:42.320 And it's like you can find yourself in certain pockets where you're like, oh, no, it's horrible.
01:36:47.820 No, you're right.
01:36:48.220 I know, it's more traumatizing on Instagram because you expect like everybody to be sweet and nice
01:36:52.780 there.
01:36:53.020 But you're right.
01:36:53.640 And now you can find that there, too.
01:36:55.760 Right.
01:36:56.020 That's supposed to be the rainbows and unicorns site.
01:36:58.160 But you're 100 percent right.
01:36:59.540 None of these people has ever been on Parler.
01:37:01.280 They have heard that it's some.
01:37:02.440 I mean, I see it described now in all the articles, this right wing site.
01:37:06.180 Well, it's not that the CEO is a nonpartisan guy.
01:37:08.520 He's more right leaning, I'd say.
01:37:09.920 And he partnered with Rebecca Mercer, who's a right wing person who's funding this adventure.
01:37:15.220 But that doesn't mean it's unfair and it's crazy and it needs to go.
01:37:19.880 This is how they view Republicans, Trump supporters and the right wing in general.
01:37:26.000 It's, it's like, what is it, confirmation bias?
01:37:29.540 I think that we can look at how 9-11 played out for lessons, not because what happened
01:37:34.880 at the Capitol is comparable.
01:37:35.920 It's not.
01:37:36.460 But people are talking about it as though it is.
01:37:38.280 And a lot of the behavior is is the same.
01:37:41.060 You know, what happened with 9-11 is when the, you know, trauma of that event took hold,
01:37:49.400 everybody said, well, we have to go get the terrorists, which meant the people who are members
01:37:54.160 of the group that actually perpetrated those attacks.
01:37:57.920 And over time, what was and wasn't a terrorist expanded so radically that the government almost
01:38:04.580 had carte blanche to do anything it wanted just by simply calling somebody a terrorist
01:38:09.760 with no due process.
01:38:11.340 And you just call someone a terrorist and it was like, throw them in Guantanamo for 20
01:38:14.680 years with no charges, no due process, don't prove anything.
01:38:17.620 They're a terrorist.
01:38:18.440 Do whatever you need to do.
01:38:19.540 And that's what's happened here is these terms like white supremacist and terrorist and inciting
01:38:24.800 violence are all rapidly expanding so that at this point, Megyn, as you know, white supremacy,
01:38:31.320 domestic terrorist basically means Trump supporter.
01:38:34.060 That's how Democrats see the world now.
01:38:36.480 And they absolutely intend to merge the power that they already have over the culture with
01:38:44.120 the power of the state, the FBI, the NSA,
01:38:49.360 the CIA, the Justice Department, and with Silicon Valley squarely in their corner.
01:38:54.280 Do you know who the first politician was who demanded that Apple kick off Parler?
01:38:59.580 It was Alexandria Ocasio-Cortez to her 9 million followers.
01:39:03.360 And Jennifer Palmieri, who was a longtime high level aide of the Clintons, tweeted last week
01:39:11.720 after Twitter and Facebook banned Trump, she said it outright.
01:39:15.860 She said, oh, I find it so interesting that Facebook and Twitter finally acknowledge that
01:39:21.000 they are capable of silencing Trump on the same day that they recognize, this was right
01:39:26.380 after the Georgia races were decided, that it's now Democrats who are going to be controlling
01:39:31.780 the committees that oversee their industry, meaning they're using their censorship power
01:39:36.580 to appease the party that's about to be in power.
01:39:38.900 She evidently thought that was a good thing, but that's incredibly alarming.
01:39:42.500 But that is what's happening.
01:39:44.280 Yeah, she said the thing out loud that you're not supposed to say out loud.
01:39:47.020 So let me, I want to get to sort of the way they're talking about this and also the analogy
01:39:53.380 to 9-11, which I think is an interesting one.
01:39:56.400 But here's the question.
01:39:57.580 I actually just had it raised in the last interview.
01:40:00.640 Do you agree that there is a line, a line that these companies can draw when it comes
01:40:05.900 to speech on their platforms that, you know, that's appropriate for them to be drawing?
01:40:12.680 Sure, sure.
01:40:14.020 First of all, it is true that these companies are not the state.
01:40:17.780 So it is different if they have terms of service that they're applying that, you know,
01:40:25.840 may be justified in censoring certain things that not even the state, under the
01:40:30.480 First Amendment, would have the right or the ability to censor.
01:40:35.040 So I think, for example, if you say people explicitly calling for violence against innocent
01:40:40.260 people, if a platform wants to say, we don't want to be associated with those, that kind
01:40:45.040 of language, then I think it's appropriate for those platforms to adopt that rule.
01:40:50.420 The problem is that these companies are monopolies.
01:40:56.360 They're monopolistic in nature, which changes everything.
01:40:59.920 So it's so ironic, you know, if you say to liberals, well, there's a problem with these
01:41:03.620 sites inconsistently or selectively applying these standards, they'll say, oh, well, they're
01:41:08.540 a private company.
01:41:09.340 They have the right to do whatever they want.
01:41:10.660 I don't know when liberals became libertarians.
01:41:12.600 These are the same liberals who want to force, you know, tiny little shop owners who are bakers
01:41:18.620 who don't want to make cakes for gay weddings on the grounds that it offends their religion
01:41:22.580 to force them to do so.
01:41:24.160 But suddenly now they're saying, you know, private companies can do what they want.
01:41:28.180 That's true up until the point where they become monopolies or violating antitrust.
01:41:33.580 Then there is a public interest in how they're exercising their power.
01:41:37.060 And I think that if we acknowledge that they might have some lines that they want to draw,
01:41:41.940 then they have to draw them consistently.
01:41:43.840 Why were leading liberals and Democrats permitted over the course of the three months following
01:41:50.700 the George Floyd killing to endorse violence and arson and property destruction in the name
01:41:57.460 of that cause without ever being banned?
01:41:59.860 I think that's the problem people have is if these standards were being applied consistently,
01:42:04.280 then people would be more comfortable that this power wasn't being politicized, but it
01:42:09.040 clearly is not.
01:42:10.760 Well, that's exactly it, because I opened the show by talking about misinformation on the
01:42:16.560 internet, on YouTube, on Twitter, on Facebook.
01:42:19.440 And if they really want to get into the business of banning misinformation that leads to social
01:42:27.920 unrest, potentially riots and death, they're going to have to take a hard look at what
01:42:33.540 happened over this past summer, because the misinformation about police officers and the
01:42:39.720 number of unarmed deaths involving black suspects was egregious.
01:42:45.960 I mean, just blatant lies were being told over and over.
01:42:49.440 And then riots happened and they were celebrated.
01:42:53.180 They weren't even just permitted.
01:42:54.880 They were celebrated.
01:42:55.840 And so now, now that misinformation about massive voter fraud has been discussed and there's a
01:43:02.880 large swath of Trump supporters that are angry about it, some of whom were at the
01:43:07.140 Capitol.
01:43:07.780 It's a totally different standard.
01:43:09.560 We cannot have lies that confuse or upset people because it's un-American.
01:43:14.740 You know, free speech has its limits and we have a responsibility as a social media company
01:43:19.140 to stop it.
01:43:20.720 Yeah, absolutely.
01:43:21.380 I mean, I think two important points about this, you know, number one is YouTube and then
01:43:28.380 other platforms prohibited any person from contesting the legitimacy of the 2020 election.
01:43:36.160 If you make a YouTube video arguing that you believe there is substantial evidence that
01:43:41.200 systemic fraud was committed and that the legitimacy of the outcome is in doubt, you will be banned.
01:43:46.660 If, however, you say that the Republicans stole the 2000 election with the corrupt Supreme
01:43:53.100 Court, that Al Gore was really the winner, as many, many liberals believe, or if you say
01:43:59.020 that you think Karl Rove tampered with Diebold machines in Ohio in 2004 to make George Bush
01:44:05.880 the illegitimate winner when John Kerry really should have won, which a lot of liberals believe
01:44:09.880 Barbara Boxer objected to the certification of the Electoral College vote on those grounds, you're
01:44:15.560 perfectly fine.
01:44:16.320 If you want to say that you think the Russians invaded the voting system in 2016 and converted
01:44:23.180 Hillary Clinton votes to Donald Trump votes and that's why he won, as two-thirds of Democrats
01:44:28.660 believe, two-thirds of Democrats believe that insane conspiracy theory, you can go all day
01:44:33.600 on Twitter, Facebook, and YouTube and say that, and people do, but then suddenly you're not allowed
01:44:38.180 to say that about the 2020 election.
01:44:40.280 That's the kind of political inconsistency that people perceive and don't accept.
01:44:46.940 The other issue is, you know, I think, and I know you, having gone to law school,
01:44:50.920 studied this a lot.
01:44:52.060 Um, but it's one of the major misconceptions, this concept of inciting violence.
01:44:56.920 You know, we're sitting here doing a show right now, um, where we're talking about the
01:45:02.380 evils of the actions of big tech.
01:45:05.140 It's possible someone in the audience listening to me might get so riled up that they might
01:45:10.680 want to go and do violence against a Google site or an Amazon facility.
01:45:14.920 Obviously they shouldn't do that, but they could.
01:45:16.940 If you are a pro-choice advocate and you say, I think pro-life activists are endangering the
01:45:23.800 lives of women by trying to make abortion illegal and forcing us to get unsafe abortions, as
01:45:28.760 many of them say, someone might hear that and say, wait, pro-life activists are killing
01:45:33.380 women.
01:45:34.120 I'm going to go firebomb a pro-life office.
01:45:36.760 Anything can be incitement.
01:45:38.180 So we have to be very careful that if someone says, you know, I think the 2020 election was
01:45:46.640 fraudulently determined, that that isn't incitement of violence unless they're explicitly saying,
01:45:53.600 and therefore you ought to go, you know, use violence against the people who did it.
01:45:59.180 And this is one of the key distinctions that's getting lost.
01:46:02.200 You know, any fiery or passionate political rhetoric can be incitement.
01:46:08.180 Um, but it's not incitement unless you're imminently directing people to go burn something
01:46:13.180 down or kill people.
01:46:14.440 And it's not, I think that's a really, right.
01:46:17.180 I completely agree with you.
01:46:18.280 People are missing this point.
01:46:19.480 And the point I'm trying to make is not that any of those comments about cops
01:46:24.200 and black defendants were incitement and should have
01:46:28.260 been banned.
01:46:28.640 They shouldn't have been.
01:46:29.300 People are entitled to their opinions, whether supported by the facts or not.
01:46:33.000 They're entitled to their opinions.
01:46:34.240 This is America, but it needs to work for both sides.
01:46:37.480 It needs to work on both issues and incitement.
01:46:40.000 While you could certainly make the argument, um, for some of the rhetoric we
01:46:44.660 saw online, um, about like, let's get our guns and go now.
01:46:49.120 That's a lot better case than Trump saying my supporters will not be disrespected
01:46:55.780 or even what he actually said right before the rally, which is we're going to march to Capitol
01:47:00.460 Hill and we're going to make our voices heard.
01:47:03.100 At no point did he come anywhere close to the legal standard for incitement, nor did
01:47:10.680 Josh Hawley, nor did Ted Cruz.
01:47:12.600 I mean, Megyn, the chairman of the Homeland Security Committee in the House, Bennie
01:47:18.800 Thompson, a Democrat from Mississippi, gave an interview yesterday saying that he thinks
01:47:22.340 Josh Hawley and Ted Cruz should be put on the no fly list because they're essentially terrorists.
01:47:29.240 They incited, how did Josh Hawley incite violence in any way?
01:47:34.820 He exercised this legislative prerogative that he has, that Barbara Boxer used, that lots of House
01:47:40.240 Democrats have tried to use.
01:47:41.400 Maxine Waters.
01:47:42.400 Is she going on the no-fly list?
01:47:44.900 Because she did exactly what Hawley did, stood up and questioned the electoral results.
01:47:48.820 That was four years ago exactly.
01:47:51.240 They did it in 2004 too.
01:47:52.900 And that time with a Senator, and they had to go into that two-hour session, which, so
01:47:56.760 you know, look, I don't think what Josh Hawley did was particularly advisable.
01:48:01.660 Um, and it was clear pandering.
01:48:05.080 Of course he wants the Trump lane and he had just spent the last three weeks partnering with
01:48:09.180 Bernie Sanders to get $2,000 checks for people.
01:48:11.340 So he wanted to show his conservative bona fides.
01:48:13.340 So fine, that's the case of like a politician being, um, opportunistic.
01:48:18.380 Like if that's terrorism, you know, they all belong in Guantanamo, but this language of
01:48:24.040 like, these members of Congress who exercised their legislative prerogative,
01:48:28.760 but never encouraged anyone to engage in violence, are now terrorists, is madness.
01:48:34.000 It is madness.
01:48:35.140 But that is what's happening with the discourse.
01:48:37.940 I totally agree with everything you just said.
01:48:40.120 A hundred percent.
01:48:40.960 And there's also a push by some 7,000 law students and law professors and lawyers to
01:48:46.860 have, uh, Hawley and Cruz disbarred because they had the temerity to stand up and question
01:48:52.780 the legal opinions finding that the election was legit.
01:48:57.260 Do they have any idea what lawyers do?
01:48:58.900 That's exactly what you do, you question legal opinions.
01:49:01.780 There's always one against you.
01:49:03.060 And when you lose at the lower court, you spend the rest of the case questioning the lower
01:49:07.200 court's legal opinion.
01:49:07.940 I agree with you.
01:49:08.660 It was baseless.
01:49:09.360 Those guys didn't have a legal leg to stand on, but that is not a basis for censure, disbarment,
01:49:15.340 the no-fly list.
01:49:16.700 It's gotten to the point of absurdity, Glenn. Absurdity.
01:49:20.920 And as I watch it, I get angrier because they won't learn their lesson.
01:49:27.180 They won't learn their lesson. I realize no one is celebrating these losers who went to
01:49:34.000 the Capitol and did what they did.
01:49:35.520 No one is.
01:49:36.040 We all want to see them prosecuted, but to double down on the sneering elite, like, look at these
01:49:44.660 disgusting lowlifes, it's risky.
01:49:49.760 And I don't think it helps just by way of example.
01:49:53.840 Um, there was an article in the Atlantic talking about what happened.
01:49:56.580 I think it was called Worst Revolution Ever.
01:49:58.720 And the writer says here, they were a coalition of the willing: deadbeat dads, YouPorn enthusiasts,
01:50:06.680 slow students, and MMA fans.
01:50:09.900 They had pulled into the swamp with bellies full of beer and sausage McMuffins, maybe a little
01:50:16.640 high on Adderall, ready to get it done.
01:50:20.320 That on the heels of Anderson Cooper, I think we actually have this, um, sneering
01:50:26.700 about the Olive Garden here.
01:50:27.880 Listen, here's Anderson.
01:50:28.940 Look at them.
01:50:29.420 They're high-fiving each other for this deplorable display of completely unpatriotic, completely
01:50:37.100 against law and order, completely unconstitutional behavior.
01:50:42.420 It's stunning.
01:50:43.320 And they're going to go back, you know, to the Olive Garden and to the Holiday
01:50:47.620 Inn that they're staying at in the Garden Marriott.
01:50:50.060 And they're going to have some drinks and they're going to talk about the great day that
01:50:53.460 they had in Washington.
01:50:54.320 And they really did something and stand up for something.
01:50:56.940 And they stood up for nothing other than mayhem.
01:51:00.300 Okay.
01:51:00.740 I was with him until the words, it's stunning.
01:51:02.480 I hadn't heard that.
01:51:03.760 I was with him right through.
01:51:04.920 It's stunning.
01:51:05.700 That, I mean, right.
01:51:06.720 It is deplorable behavior.
01:51:08.260 No one's defending what they did, but the disgust, like, why does it have to be, they're MMA
01:51:12.340 fans?
01:51:13.100 They're slow.
01:51:14.420 Bellies full of beer, sausage McMuffins, staying at the Holiday
01:51:18.580 Inn.
01:51:19.620 No, right.
01:51:20.060 The Holiday Inn, the Olive Garden.
01:51:21.400 Screw them.
01:51:22.780 So, so much of this is, you know, about culture and class.
01:51:26.840 So much of it.
01:51:27.580 I mean, Anderson Cooper was born into one of the richest families in the United States.
01:51:34.120 Yes, he's a Vanderbilt.
01:51:35.180 You know, yeah, he's a Vanderbilt.
01:51:37.200 He grew up on the Upper East Side.
01:51:39.060 He lives in this sprawling, you know, mansion-like West Village, uh, townhouse that
01:51:46.900 has been done by an architect, you know, he makes $10 million a year.
01:51:49.760 Nancy Pelosi is one of the richest members of Congress.
01:51:52.240 This is absolutely what a lot of the reaction is. Just one other
01:51:56.660 point.
01:51:57.260 He literally owns a castle just north of New York.
01:52:01.060 His little weekend getaway is literally a castle.
01:52:05.880 I mean, who is he to sneer about the Holiday Inn?
01:52:11.440 And so all that rhetoric, it only makes the core Trump supporters angrier.
01:52:16.320 Do you think that makes them trust Anderson Cooper when he says there was no widespread
01:52:19.660 voter fraud?
01:52:20.740 I mean, and you know, this is it.
01:52:22.940 This is happening in so many democracies around the world, which is that the, you know, ruling
01:52:30.580 class is becoming more and more like the French monarchs that lived behind the walls
01:52:37.180 of Versailles and were disgusted by the people begging for bread.
01:52:41.860 And when they, you know, would try and breach the walls, it was like this kind of great offense.
01:52:47.320 So much of the reaction was about exactly that, that these are the august halls, that this is
01:52:52.600 Nancy Pelosi's office.
01:52:53.960 And the people who are protesting are unworthy.
01:52:57.120 They're kind of, you know, the dirty dregs of society that ought to be unseen.
01:53:02.140 And again, that's not to justify anything that they did.
01:53:04.780 But I think the insanity of the reaction, the completely disproportionate way that it's
01:53:10.380 being talked about, a major part of it is that class and culture perspective.
01:53:14.600 And that goes to your point, you wrote the other day, which is that in the same way the Democrats
01:53:23.240 and the media were allies in the war on terror and the sweeping, you know, curtailing of civil
01:53:32.160 liberties we saw right after 9/11, it's happening again.
01:53:37.360 Yeah, I mean, you know, obviously the initial burst of civil liberties restrictions was done
01:53:45.640 by neocons under the Bush-Cheney administration, but with the full support of most of the Democratic
01:53:51.060 Party.
01:53:52.020 So it was this kind of like unity.
01:53:54.200 And, you know, unity can be a positive thing when a country comes together in response to
01:54:02.360 a traumatic event, as some of the unity after 9/11 was.
01:54:02.360 But it can also be very dangerous, because once that happens, there's no more room for
01:54:08.100 dissent.
01:54:08.760 Those emotions are so powerful.
01:54:11.720 And, you know, one of the roles of the media is to, at those key moments, start questioning
01:54:18.060 the consensus that has arisen.
01:54:21.220 But the media is now so integrated into this class.
01:54:24.840 You know, like 40 years ago, the media prided themselves on being outsiders.
01:54:29.000 Now they work in the Capitol.
01:54:30.440 They're friends with all those aides.
01:54:31.740 They're friends with all the members of Congress and Senate.
01:54:34.160 Their kids go to school together.
01:54:35.600 They live in the same neighborhoods.
01:54:37.220 So there's no separation anymore between the media and the ruling class.
01:54:41.820 And so the reaction that you get is not just coming from the political class.
01:54:46.340 It's always amplified unquestioningly by the media class to the point where the people
01:54:51.340 demanding censorship most aggressively aren't even politicians.
01:54:55.760 They're these teams of reporters they have at CNN and NBC and the New York Times, whose
01:55:00.880 only purpose, from what I can tell, is to trawl through Facebook and Twitter and 4chan and
01:55:06.120 demand that various obscure citizens have their platform taken away from them, which
01:55:11.140 has that same kind of class imbalance, right?
01:55:14.760 That like only we, the people who work at these major news corporations, have the right to
01:55:19.720 be heard and are responsible enough to disseminate information.
01:55:22.820 You are pretenders.
01:55:24.020 You are too reckless and uneducated to be trusted with this.
01:55:27.580 And that is the dynamic that people perceive.
01:55:29.640 Right.
01:55:30.800 The media, in both cases, right after 9/11 and now, jumped on board to be the advocates
01:55:37.000 of saying goodbye to civil liberties if it serves their cause.
01:55:42.400 And I agree with you, after 9/11.
01:55:43.780 We've talked about this before.
01:55:44.800 I, too, was one of those scared people.
01:55:46.880 I wasn't in journalism on 9/11.
01:55:48.500 But, you know, you're scared and you want a curtailing of civil liberties because you
01:55:52.080 don't want to get bombed.
01:55:53.260 You know, and then over time it loosens up and you get a return to normal.
01:55:57.680 And now they're treating this one event, as awful as it was, as a reason to get
01:56:05.240 rid of free speech.
01:56:06.440 And of course, conveniently, it's only speech on the right.
01:56:10.420 Right.
01:56:10.600 And it's not just let's go kill people at the Capitol.
01:56:14.200 It's, don't discuss voter fraud ever again.
01:56:18.000 You know, and like you say, the guy, Oliver Darcy at CNN, who's trying to shut down
01:56:22.640 half of the Fox News lineup. I'm worried about where it's going to go because
01:56:27.680 now the Democrats are in charge of every branch of government.
01:56:31.100 Well, you know, the White House and Congress, and big tech obviously is 100 percent on board.
01:56:37.600 And I really am worried about how emboldened they are.
01:56:43.140 Oh, me, too.
01:56:44.140 I mean, look, you know, I pretty much devoted my adult career, first as a lawyer and then as
01:56:48.940 a journalist, first and foremost to defending civil liberties in general and free speech
01:56:54.660 in particular. I left law and started writing about politics principally motivated by a
01:57:00.100 concern about the erosion of due process rights and the political climate that arose after
01:57:04.880 9-11 in the years after the Iraq War.
01:57:07.620 And I would say that without doubt, this is the worst free speech crisis, the worst civil
01:57:12.760 liberties crisis that has emerged in the entire time, at least that I've been doing that work,
01:57:20.540 because all of this, Megyn, as you know, is stemming from this very ingrained fury that
01:57:29.880 they have that Trump won in 2016.
01:57:32.460 That is what this is all about.
01:57:34.300 This whole thing with impeachment and the 25th Amendment and wanting to kick his supporters
01:57:39.040 off the Internet.
01:57:39.860 It all comes from this sense that this is their society and that these people came and took
01:57:45.900 it from them unjustly and illegitimately, and they've been wanting to punish them for
01:57:51.000 so long and they're now seizing on this as a pretext.
01:57:54.340 And that's what concerns me the most is if it were just one event, I would say, OK, I get
01:57:59.460 that like people in Congress that day are scared.
01:58:01.880 I get that it's traumatizing.
01:58:03.300 I'm sure it was.
01:58:04.060 I don't make light of that.
01:58:05.400 But eventually, as you said, especially if it's a one time event and not like two huge
01:58:10.340 towers falling on top of 3000 people, but, you know, just like a kind of three hour, four
01:58:15.580 hour long breach of the Capitol, eventually it's going to subside and people are going
01:58:20.260 to come to their senses.
01:58:21.120 But this is an ongoing multi-year rage that is driving all of this.
01:58:27.360 And this latest incident is just the kind of opportunity to finally justify doing what
01:58:32.100 they wanted to do all along, which is humiliate these people and silence them for good.
01:58:36.800 Absolutely right.
01:58:37.720 And the willingness to try to blame the actions of those people on Capitol Hill
01:58:44.920 on any Trump supporter or somebody who, you know, was even just open minded to Trump has
01:58:50.680 been infuriating.
01:58:52.540 You know, how many tweets have you seen like, this is on all Republicans?
01:58:56.820 This is on anyone who defended President Trump, you know, for any of his agenda items.
01:59:01.360 You're no longer allowed to talk about anything good
01:59:03.600 President Trump did, or you're complicit in what happened at the Capitol.
01:59:08.680 Yeah.
01:59:09.000 And not only that, you know, but if you are questioning any of the things
01:59:14.080 they want to do in the name of, you know, preventing this from happening again or punishing
01:59:20.240 the people they hold responsible, not just the people who actually invaded the Capitol,
01:59:23.420 but everyone they think is complicit.
01:59:24.780 That, too, will subject you to accusations that you must be a sympathizer to the people
01:59:32.620 who wanted to do violence.
01:59:34.860 You know, it's kind of like the famous George Bush framework after 9/11, which is
01:59:39.020 you're either with us or you're with the terrorists, which got interpreted to mean if you oppose
01:59:43.520 anything that we're saying we want to do, you're going to be subjected to claims that you're on the
01:59:47.140 side of the terrorists.
01:59:47.940 That is 100% the framework.
01:59:50.320 So if you say, I'm really worried about what was done to Parler, or I think there's a lot
01:59:55.120 of serious repercussions from a private tech monopoly silencing the elected president, they'll
02:00:01.600 tell you that that must mean that you're in favor of Nazis or you want white supremacy speech
02:00:06.760 to flourish.
02:00:08.460 That is the tactic being used.
02:00:10.040 It's really repressive and kind of a despotic way of conducting
02:00:15.580 debates.
02:00:16.420 I talked about this on the show the other day.
02:00:18.600 It's a small, stupid example, but just as an example, um, there's some guy
02:00:23.980 affiliated with the Lincoln Project.
02:00:25.440 His name is Tom Nichols.
02:00:27.620 And, uh, he, he was trying to scare me, warn me in a nasty way, not in a friendly way on
02:00:34.660 Twitter the other day, because I said something to the effect of, you know, what happened in
02:00:38.740 the Capitol was wrong, obviously, but these pundits and Democrats who are trying
02:00:43.280 to use it to justify four years of essentially Trump derangement
02:00:49.020 syndrome, criticizing everything he did as bad, uh, are absurd too.
02:00:52.760 Like this, this doesn't now excuse four years of bias and overreach on things like Russia.
02:00:59.080 And he actually, you know, started to get something going, like, remember, she said this, it'll be
02:01:04.260 used.
02:01:04.740 You remember? I'm like, this is insane, Glenn, you know?
02:01:08.740 They're all sort of ganging up, the left, the media, um, to make sure anybody who shares
02:01:16.920 any viewpoints with President Trump, um, can understand him objectively, uh, supports any
02:01:22.440 of his agenda, defended him on Russiagate, sees the Democrats as having overreached on
02:01:27.000 impeachment.
02:01:28.140 Everyone's, quote, complicit. Another attempt to silence and scare.
02:01:33.760 Like we've been seeing all summer with, you know, the, uh, cancel culture.
02:01:37.760 And this is cancel culture on steroids.
02:01:40.820 Totally.
02:01:41.620 And, you know, look, I mean, you are in a position in your work and in your life where you're not
02:01:48.460 really subject to those kinds of threats.
02:01:51.120 You know, I feel very similar, but we are a tiny minority of people.
02:01:55.680 Imagine, you know, the Lincoln Project is launching a kind of, um, campaign to compile a
02:02:03.640 list of people that they believe are responsible for everything bad about the Trump era, including
02:02:09.760 this, you know, breach of the Capitol.
02:02:11.500 And they're essentially trying to pressure corporations to disassociate themselves from
02:02:16.220 anybody who's on their list.
02:02:18.320 And most people are not invulnerable to those kinds of pressures.
02:02:22.020 Most people have jobs and are at risk of losing them.
02:02:24.560 We're in a pandemic with an unemployment crisis.
02:02:27.000 The media is contracting all the time.
02:02:30.340 So if you're a journalist, of course, you're going to be very concerned about getting on
02:02:33.600 that list or some other list.
02:02:35.260 This is a really intimidating and thuggish climate that people like that guy at the Lincoln
02:02:39.880 Project and others are purposely trying to cultivate in order to coerce everybody's acquiescence
02:02:46.640 to whatever orthodoxies they're trying to impose.
02:02:49.020 It's so ironic that they constantly accuse the Trump movement and Trump of being authoritarian
02:02:55.400 and fascist while at the same time, they're using the weapons and tactics that are the
02:03:00.060 hallmark of both of those pathologies.
02:03:02.860 Think about it.
02:03:03.620 If they can say that to me, I mean, I defy anybody to find somebody at the Lincoln Project,
02:03:08.260 with all their stupid ads, who raised a more publicized question about Trump's temperament
02:03:16.140 than me.
02:03:16.980 Good luck.
02:03:18.640 My question about Trump's temperament.
02:03:20.620 You are public enemy number one of the Trump movement for like a year and a half because
02:03:25.380 in front of the whole country, you challenged him in a way that they never did.
02:03:28.440 You needed security and guards.
02:03:30.340 I mean, it was a really serious thing that you were subjected to because you did your
02:03:36.540 job as a journalist.
02:03:37.580 And then these people who risked nothing, who got on board the anti-Trump train only once it was popular
02:03:42.160 and cost-free to do, actually much to their profit, are going to be threatening you with
02:03:48.120 career repercussions if you don't snap into line.
02:03:50.520 Exactly, for reporting on him fairly, you know, despite whatever my personal feelings are
02:03:55.160 for just bending over backwards to report on the guy fairly and try to take myself out
02:03:59.760 of it.
02:04:00.900 But I only raise myself because that's me with a big microphone and people know who
02:04:06.520 I am.
02:04:06.940 And, you know, Tom Nichols cannot destroy me.
02:04:09.480 Bigger, more important men have tried, Tom.
02:04:12.900 But I worry about people who aren't in my position.
02:04:15.720 I do.
02:04:16.420 And actually, no, if you'll let me, that leads me to something I want to ask you from
02:04:20.400 one of our listeners, because we have a segment on the show called Asked and Answered.
02:04:24.160 And it's where the listeners write in with a question about, could be about anything,
02:04:26.980 could be the news of the day or a personal question.
02:04:29.400 Today, it's about news of the day.
02:04:30.700 And it's for you, really, based on your column that you just posted on Substack.
02:04:35.280 And Steve Krakauer, our executive producer, has got it.
02:04:38.280 And we'd love to ask it directly of you.
02:04:40.040 Yeah, Megyn and Glenn.
02:04:41.220 This is from Jackie LeFevre.
02:04:42.560 She's a 25-year-old who's outside the journalism and political arena and read Glenn's piece
02:04:47.460 and asked a question about that.
02:04:49.280 And she wants to know, what can we do?
02:04:51.880 How do we keep going forward when it feels like we've been defeated by these major tech
02:04:55.300 giants and there's no way to help?
02:04:57.340 So any solutions there, Glenn?
02:04:59.840 Well, you know, I am actually encouraged because a lot of times when someone has a just cause,
02:05:09.020 like, say, denouncing the people who entered the Capitol, especially the ones who did it
02:05:14.600 with an intention to carry out violence or who actually did commit violence, when they
02:05:19.440 overplay their hand, they turn allies and sympathizers into adversaries and enemies.
02:05:27.240 And I think very much that's what has happened over the past four to five years.
02:05:31.280 So if you look at the ascension of independent media, the success, for example, of Joe Rogan,
02:05:41.260 who, despite barely ever being talked about in mainstream media circles, has become one
02:05:46.160 of the most politically and culturally influential people in the country simply by having a YouTube
02:05:51.440 program where he's just open to different ideas and independent of any faction or dogma and
02:05:57.620 has, you know, doubled the audience of, um, these corporate television programs, just
02:06:03.720 because he seems honest and eager to kind of just speak freely and air differing opinions
02:06:10.960 so that people can decide for themselves instead of trying to manipulate them.
02:06:14.560 And you look at the success of podcasts, like, Megyn, you have a brand new podcast with already
02:06:19.280 a big audience.
02:06:20.420 Um, and there's other people with podcasts who are doing the same and obviously Substack,
02:06:24.760 which has become, I know you just interviewed the founder.
02:06:27.400 They're super, they know. All these independent platforms,
02:06:30.980 you know, it's going to happen to Spotify.
02:06:32.800 It's going to happen to anyone carrying independent media, including Substack and Patreon, that soon
02:06:38.620 the guns of the New York Times and CNN and NBC are going to be turned on them.
02:06:42.100 And they're going to start saying, you are platforming these extremists.
02:06:47.660 You are responsible for the dissemination of this.
02:06:50.300 We demand that you kick this person off the platform and kick this one off the platform.
02:06:53.960 So if you're concerned, as I hope everyone rational is about the monopolistic power of Silicon
02:07:01.400 Valley, joining with the power of the state under Democratic rule, to suffocate discourse,
02:07:06.940 what you should do is support independent media, make it possible that sites like Patreon and
02:07:15.020 Substack and, you know, independent outlets can resist corporate media because they have funding,
02:07:21.460 they have subscribers, they have audience, you know, that kind of lets them react the way
02:07:26.860 we react, Megyn, right?
02:07:27.920 Like we say, look, we're in a position where we're not vulnerable to your threats.
02:07:31.940 We have enough success, enough of a platform built that there's nothing you can do to me.
02:07:36.040 That's what needs to happen.
02:07:37.300 And people can make that happen to fortify these outlets who do want to kind of reject and
02:07:43.700 refuse these censorship attempts and this homogeneity that everyone wants to impose.
02:07:48.720 In order to do that, they're going to need support, financial support, audience support.
02:07:53.720 And I think that that's what any, you know, person who wants to help and do something
02:07:59.860 about this can do.
02:08:01.820 It's great advice.
02:08:03.100 And I'm going to start a subscription piece to my podcast immediately.
02:08:07.760 I'm inspired.
02:08:09.380 Glenn, always such a pleasure to talk to you.
02:08:11.120 It's like, it's like, I don't know, it's like a warm blanket because you just, you speak
02:08:14.940 sense and you have such a beautiful way with words and you're always so spot on in your
02:08:19.440 analysis and your historical examples.
02:08:22.000 It's, it's lovely listening to you.
02:08:23.900 Great to talk to you, Megyn.
02:08:24.880 I think you're doing a great job and an important job.
02:08:28.120 And I'm happy to come back on anytime.
02:08:29.600 Our thanks to all of our guests today, Glenn, John Matze, Chris Best, and to all of you
02:08:34.380 for listening.
02:08:35.620 This hour was brought to you in part by The Zebra.
02:08:38.180 Find out how much money you can save on car or home insurance by visiting thezebra.com
02:08:43.780 slash Kelly.
02:08:45.400 Now, check it out.
02:08:47.080 On Friday, we decided to stay nimble and we're just going to stay on news of the day because
02:08:51.580 this is a big news week and we don't want to miss a moment of it for you.
02:08:55.060 Things are happening day to day, both with the president, with these Democrats and in the
02:08:58.900 House, which, you know, we didn't even get to the fact that they're trying to impeach
02:09:01.300 the president.
02:09:01.900 Oh yeah, by the way, there's, there's an impeachment underway.
02:09:04.000 I mean, that's the crazy news cycle.
02:09:07.480 So we'll have it covered for you on Friday and we'll respond to the day's news and we
02:09:10.940 hope you'll join us for that.
02:09:12.460 Talk to you soon.
02:09:13.860 Thanks for listening to The Megyn Kelly Show.
02:09:15.800 No BS, no agenda, and no fear.
02:09:20.480 The Megyn Kelly Show is a Devil May Care media production in collaboration with Red Seat Ventures.
02:09:24.860 Your business doesn't move in a straight line.
02:09:41.700 Some days bring growth, others bring challenges.
02:09:44.840 But what if you or a partner needs to step away?
02:09:47.720 When the unexpected happens, count on Canada Life's flexible life and health insurance to
02:09:52.980 help your business keep working, even when you can't.
02:09:55.900 Don't let life's challenges stand in the way of your success.
02:09:59.340 Protect what you've built today.
02:10:01.380 Visit canadalife.com slash business protection to learn more.
02:10:05.180 Canada Life.
02:10:06.600 Insurance.
02:10:07.540 Investments.
02:10:08.380 Advice.