The Megyn Kelly Show - February 14, 2023


America's "Reality Crisis," and Free Speech and Censorship Today, with Spencer Klavan, Will Chamberlain, and Kate Tummarello | Ep. 493


Episode Stats

Length: 1 hour and 34 minutes
Words per Minute: 200.3
Word Count: 18,993
Sentence Count: 1,044
Misogynist Sentences: 16
Hate Speech Sentences: 16


Summary

Spencer Klavan is the host of the Young Heretics podcast and author of the new book, How to Save the West: Ancient Wisdom for Five Modern Crises, which explains why Western civilization is in a state of crisis.


Transcript

00:00:00.480 Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.860 Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
00:00:15.160 Oh, we have an interesting debate for you coming just a bit later about free speech,
00:00:20.320 big tech, and censorship in America. Just how censorious do you want big tech to be?
00:00:27.400 Maybe you're thinking, I want them to be less censorious. I'm the one whose viewpoints always
00:00:32.260 get censored, and therefore I'm against protecting big tech when it comes to their censoring pen with
00:00:39.960 the big eraser. It's more complicated than that, because if we take away the things that protect
00:00:45.860 them, who's going to pay, big tech or us? It's actually a really interesting debate. We're
00:00:52.720 going to get into it in just a bit. But we begin with the return of one of my favorite guests and
00:00:56.880 quite possibly the smartest man I know. Western civilization is in a state of crisis. Perhaps
00:01:02.920 you've noticed. The virtual and digital are replacing genuine experience, right? The metaverse? What the
00:01:09.540 hell is that? How about just like the universe we're already in? Why do we need a new digital
00:01:13.460 universe? Feelings too often are replacing facts. How we navigate all these issues in our society
00:01:21.180 will determine no less than whether we can save the Western world. Spencer Klavan is one of the people
00:01:28.220 who can help do that. He could actually save the Western world all on his own if we would just do
00:01:32.940 what he tells us. He is the host of the Young Heretics podcast and author of the new book out today,
00:01:39.220 which I highly, highly recommend to you, How to Save the West: Ancient Wisdom for Five Modern Crises.
00:01:47.260 Please. Spencer, welcome back. Great to have you. Oh, Megyn, it's so great to be here. And I'm going
00:01:52.080 to tattoo "Megyn Kelly says I'm the smartest man she knows" onto my forehead. That's going to be
00:01:58.140 I don't throw that out there loosely. I know a lot of smart people. I've interviewed a lot of smart
00:02:03.700 people. Your brain is special. Oh, thank you. It's so lovely to be back with you.
00:02:10.380 Oh, and you come by it honestly, because your dad is Andrew Klavan, who we also love.
00:02:13.700 Sorry to mention your daddy every time you come on, but we just people know the last name and
00:02:18.080 we're all such fans. I'm proud to be associated with him, despite my constant protestations that
00:02:23.760 I'm not related to him. I actually am very proud to call him my dad. So more than happy to hear you
00:02:28.760 mention his name. All right. So let's set it up. Let's set up because what I love about this book
00:02:32.980 is we're all going through these feelings of like, what's happening? Wait a minute.
00:02:36.640 What happened to truth? What happened to God? What's going on with this gender craze?
00:02:43.040 What's like what's going on in our society? You can feel something very different from the way we
00:02:47.480 used to be. And this book diagnoses why that's happening. Yes, it is happening. Why it's happening.
00:02:53.820 What are the crises we're in the midst of? And then takes a look back at history, ancient history
00:02:59.640 to reassure us. I think that none of this is new. We've been through virtually all of this before,
00:03:05.580 and there are really great minds to give us some wisdom into how to navigate what's likely to come
00:03:11.980 next, what's winnable, what's not. And you as a classics expert know all of that stuff. You've read
00:03:19.260 all of that stuff and you're living the modern-day crises with us all. So you've sort of been able
00:03:24.060 to meld history with modern-day problems to give us some insight and some wisdom. So let's start
00:03:30.700 with the crisis as you see it. Why does the West need saving? What are we going through?
00:03:36.500 Well, I think you really put your finger on it when you describe that feeling like everything we're up
00:03:41.440 against is kind of new and confusing. This sense we have that nobody has ever faced these problems
00:03:47.780 before because our technology is totally new. And the digital revolution has just reshaped the
00:03:54.000 way we look at ourselves and see the world. And on one level, of course, it's true. The Internet
00:03:59.840 did not exist in ancient Greece. I am reliably informed. And yet, you know, at the same time,
00:04:06.140 the problems that we're being faced with by this new technology, questions like what is a human being
00:04:12.640 and what is our place in the universe? And you mentioned the question of God. Those are actually
00:04:17.980 fundamental, eternal questions. And what that means is that they've been around for as long
00:04:24.500 as human beings have been around. And there have been deep thinkers throughout the centuries in this
00:04:30.040 Western tradition that we're all inheritors of who have raised really beautiful answers to these
00:04:36.280 questions that can help us see our way forward. And what that means is we're not alone. I think when
00:04:42.140 people tell you, you know, oh, it's a brave new world and all the old books are primitive and
00:04:47.360 superstitious, what they're really trying to do is deprive you of the community of the past. And
00:04:52.520 I grew up, as you know, Megyn, in a house filled with books, old books that I would pull down off the
00:04:58.080 shelf. And soon, soon I realized that being surrounded with books meant being surrounded by
00:05:02.980 friends. And so I wrote this book, How to Save the West, because I wanted people to have access to some
00:05:08.980 of that stuff, have ownership over this wisdom that comes down to us from Athens and Jerusalem,
00:05:13.400 so that we can answer some of these questions that are being raised, these five questions. Is there
00:05:18.540 absolute truth? What do I do with my body? Does the world have meaning? Is there a God? And what's
00:05:24.740 going to happen to America? Those are questions that we can answer or help to answer using the wisdom
00:05:29.820 that comes down from the past and not just using whatever the CDC or the WEF tells us today and tomorrow.
00:05:36.780 Mm hmm. It's funny because during the Trump administration, Kellyanne Conway famously,
00:05:42.260 infamously said, alternative facts. These are alternative facts. And people started to question
00:05:46.740 whether we really are in this post-truth world where one side has its facts and the other side
00:05:51.140 has its alternative facts. And that's only continued. You know, she described it on this show as
00:05:56.540 sort of a flub. You know, it's just she was stepping on her own words and she wasn't really trying to
00:06:00.620 create that narrative. But since then, it's become even worse. I mean, COVID is a great example of how,
00:06:06.020 you know, you could take the same issue and find two different experts with diametrically opposed
00:06:11.600 views. And depending on which one is the leftist view versus the more heterodox view,
00:06:17.520 that'll dictate how it's covered in the mainstream media. So people really are in a place of rejecting
00:06:23.020 what we used to see as truth. What appears on the nightly news? What appears in the paper? What
00:06:30.720 your trusted politicians tell you? That's gone. And a lot of us feel untethered now in trying to
00:06:36.600 figure out truth. So that's one of the crises we're facing is the reality crisis, which is related
00:06:43.500 to truth. So how do you analyze that? Yeah, well, I mentioned that moment with Conway in the book.
00:06:51.100 And what's so funny about that is all of a sudden, when Bad Orange Man came along,
00:06:56.660 it was like, we have a crisis of truth in the news, and we're having a post truth politics. And it's
00:07:03.060 like, I am old enough to remember when Bill Clinton said, it depends on what the meaning
00:07:09.140 of the word "is" is. Like, Donald Trump's team did not invent this problem. And if you like your doctor,
00:07:16.500 you can keep your doctor. There's a great example. Sure. And there were the "fake
00:07:21.440 but accurate" memos about George Bush, which Dan Rather put forward. And you know, as you've said,
00:07:27.400 it's only gotten worse. And it's particularly gotten worse because of what you were mentioning
00:07:31.900 earlier, the tech and the fights over censorship that we're having. One of the reasons I think we're
00:07:38.120 having these fights where people want to shut down free speech is because they believe there's this
00:07:44.100 idea that if you can stop people from saying something, it'll stop being true. And if you can take
00:07:49.760 control of the narrative, you can decide which COVID facts get spread and which don't, you know,
00:07:54.780 then you will actually have created reality, like as if it were just some kind of metaverse that we're
00:08:00.200 all living in. And, you know, an idea that has come up since I wrote the book is the idea of
00:08:04.540 malinformation, not just disinformation or misinformation, but malinformation is where
00:08:08.960 you're saying true facts, but you're using them wrong, right? It's bad to use those facts in
00:08:15.160 those ways to make that point. And so, when it gets down to it, we really are in a crisis,
00:08:20.440 which I call the reality crisis. Is there anything which is true or false, whether or not you're
00:08:24.880 allowed to say it, even if everybody, even if all the censors try to shut you up, is it still true
00:08:29.360 that one plus one is two? And what I show in the book is that this is actually the crisis with which
00:08:35.560 Greek philosophy kind of begins. It's the origin story of Western philosophy in some sense,
00:08:40.800 that in Athens, the great democracy, you have this crisis where people are arguing that whatever
00:08:47.860 you can get voted in, whatever you can argue for and present before the assembly, that's what goes.
00:08:54.400 And so the justice is just the interests of the stronger, the strong do what they can while the
00:08:59.980 weak suffer what they must. And what I'm arguing in this book is that, in fact, if you want to take
00:09:06.340 that pill, you're going to go all the way. It's not going to be a happy, blissful, you know,
00:09:11.460 metaverse kind of universe beyond your world. It's actually going to be a war of all against
00:09:16.800 all. It's going to be power politics, because if there's no such thing as absolute truth,
00:09:20.560 all you're left with is strength and strength amounts essentially to the threat of violence.
00:09:26.080 And I think that a lot of people, as they start to lose their grip on, you know, what the
00:09:32.280 quote unquote official narrative is, they feel like there's no way of discovering truth.
00:09:36.000 But the fact is that we actually have an apparatus for searching out the truth. It's
00:09:41.000 called reason. The Greeks called it logos. And we can recover ownership over our own reason
00:09:46.860 and confidence, which is what a lot of people are beginning to do as they start to reject
00:09:50.760 the experts, which I think is the right move. I think we should move further in that direction
00:09:55.000 as we form our own opinions and open the discourse as much as we can to seeking, you know, the absolute
00:10:02.820 truth, which is the beauty and the goodness with which philosophy begins.
00:10:07.160 How does the rejection of God, of religion, of any sort of higher power factor into this?
00:10:17.660 Right. Well, it's difficult, I think, when you start talking about this, especially in an
00:10:22.800 American context, you start to feel like you're just forcing your religion down people's throats,
00:10:29.400 right? People will say, don't force your God on me. We have separation of church and state.
00:10:33.020 And one of the things I say in the book is I'm not trying to convert everybody to my church.
00:10:40.800 It would be great if everybody went to my church. I would love that. But that's really not the problem
00:10:45.240 that we're up against. The real problem that we're up against is we all actually kind of know that
00:10:49.580 some things are true and some things are false. And it's not just like physical facts that are true
00:10:54.740 and false. Like this table is sitting here in front of me right now. There's also moral truths that
00:11:00.020 are true and false. It's wrong to take innocent life without cause, for instance. These moral
00:11:05.960 truths also have a kind of absolute reality that we can't just wish away. And spiritual truths are
00:11:12.300 part of that universe, the universe beyond just our flesh. And if we want to believe in that,
00:11:18.100 which we all have to in order to form a functioning society, we do have to believe. We don't have to
00:11:24.120 subscribe to this or that faith tomorrow or believe the Nicene Creed right now. But we do have to admit
00:11:30.240 that everybody acts as if there is such a thing as a highest truth and a highest good. Bob Dylan,
00:11:36.520 the great poet and prophet, says you got to serve somebody. And the Bible's version of this is that
00:11:42.260 the fool hath said in his heart, there is no God. You know, we have this idea that that line just means
00:11:47.100 like, oh, atheists are dumb or something. But that's actually I don't think that's what the Psalms are
00:11:52.440 saying. When you say the fool has said in his heart that there is no God, what it means is, you know,
00:11:57.880 when you tell yourself you're not worshiping, when you tell yourself there's no higher power,
00:12:03.240 you're actually fooling yourself. You're making yourself into a fool because you're deceiving
00:12:07.180 yourself. Everybody operates as if there's a highest good behind everything he or she does. And if you
00:12:13.200 pretend that's not true, you just end up worshiping without knowing it, which is what we saw, you know,
00:12:17.780 in the summer of 2020, when people were kneeling before these Black Lives Matter protesters begging
00:12:23.580 for forgiveness and absolution, or when they were referring to the science, capital S, as a kind of,
00:12:29.620 you know, a cult authority that could tell them what to do. And Dr. Fauci represents it. He is its
00:12:34.060 priest. Amen. You know, this is kind of how people are starting to behave. And I think the real thing we
00:12:40.300 need is not so much a conversion as a surrender to realize that what we are doing already implies a kind
00:12:47.700 of worship. And we should be self-aware about that. And we should look to the great traditions of
00:12:52.580 scripture and the church to help us understand what is worthy of worship. What's the highest good that
00:12:58.220 we could seek that would actually ennoble us rather than making us slaves? Let's start there.
00:13:03.880 One of the aggravating things about that truth you just told us is if we could actually get those
00:13:11.160 beliefs recognized as a religion, then we could stop them from permeating the public square and being
00:13:17.320 pushed on us by government, which isn't allowed to favor one religion over another. And yet we can't,
00:13:22.700 it hasn't been recognized, wokeism, as a religion, and therefore it can be pernicious in how it gets
00:13:28.480 pushed on us in the schools, in our jobs, at the corporate, at the government level, as we're seeing
00:13:35.220 now with the Biden administration. But you write in the book the following. In the 21st century,
00:13:40.540 political demands often boil down to the assertion that the speaker's point of view or identity should be
00:13:45.760 taken as an absolute authority. The various slogans we chant show this. Believe women. That's my truth.
00:13:51.320 Elevate black voices. And your point is that without God, again, quoting, without some shared,
00:13:57.540 stable, objective basis for understanding what is true, moral, and real, we are left only with
00:14:03.140 competing demands for power and competing attempts to control the facts. This is a very smart way of
00:14:11.400 talking about this void. The more we remove God and the principles that we associate with God and
00:14:18.500 with a higher power, the more we create a vacuum that gets filled with utter banalities. That's to
00:14:24.300 be charitable. Really, the truth is what we fill it with is downright dangerous.
00:14:29.780 Well, sure. That's absolutely right. And I think that, you know, the kind of religious nature of all of
00:14:34.820 these belief systems can really be seen when you start to ask, well, you know, what's the basis for
00:14:42.160 believing, for instance, that, you know, a man can become a woman simply by saying so, or, you know,
00:14:48.940 that men can get pregnant, all these kind of abstract ideas that we use to divide sex and gender and to
00:14:58.100 suggest that they're both kind of infinitely malleable. Well, it's not like, you know, science has
00:15:03.540 discovered that this is true. You get a kind of pseudoscientific language around it. They've
00:15:08.800 claimed to have, you know, proven this in some objective way. But in fact, since notions like
00:15:14.880 gender, which is kind of a purely spiritual concept, those notions don't actually exist anywhere on like
00:15:21.340 a brain scan. They are ideas about the soul. They're really actually closely tied to some very
00:15:28.180 ancient notions like the Neoplatonic idea that there's our body. And that's just
00:15:34.180 like flesh, or it's a plaything, or it's a doll to be molded. And then there is the soul, which kind
00:15:39.200 of lives in this perfect sphere. I mean, nowadays, we talk as if it lives kind of online or in the
00:15:44.880 cloud, you know, but that division between body and soul, which is very close to the heart of the
00:15:51.580 sort of trans extremist movement, the post gender third wave feminism, whatever you want to call it,
00:15:57.160 you know, you read like Judith Butler's Gender Trouble, where she really kind of goes into this
00:16:01.620 stuff. And it's, it's totally Neoplatonic and Cartesian. It's like, I'm dividing my body from
00:16:07.700 my soul. My soul is the true me and everything else. It's like, well, maybe I get surgery today.
00:16:12.820 Maybe I reconstruct my body tomorrow, or I, you know, put horns on my head or whatever, because
00:16:17.580 my body is just a kind of appendage or a toy that I'm playing with. Now, whatever else that is,
00:16:24.140 it's definitely an article of faith, right? It's definitely a profession of something that you,
00:16:29.720 some spiritual idea that you believe, rather than, you know, some scientific objective facts
00:16:35.440 that everybody has to accept tomorrow, or else you're a bigot, and you're just ignorant
00:16:39.240 and wrong, right? These are spiritual claims. And one way of measuring a spiritual claim
00:16:44.960 is to see what kind of behavior it produces and what kind of results it produces for people.
00:16:50.900 And that's where the danger that you're talking about comes in, because, you know, you ask,
00:16:54.840 how's it working out for you to be perpetrating these, you know, terrible invasive surgeries on
00:17:05.440 kids and whatever. And the answer is, it's making us sicker and more depressed and tearing apart
00:17:10.740 the fabric of our social life and our society. And since it's simply an article of faith
00:17:10.740 that this is going to do anything good for us, I don't think it's working out that well. I think
00:17:15.100 it's pretty obvious that the older idea, which is that your body is the language for your soul,
00:17:20.480 that we are in some sense, embodied souls would be a truer religion that we could actually adopt
00:17:26.120 in place of this kind of neo-gnostic trans extremism. So how do we look at, you know,
00:17:33.340 some of the ancient philosophers and get an answer to this reality crisis? I know the book mentions
00:17:37.380 Socrates, always some wisdom there. Like, is there, is that just a cautionary tale? Is that a cautionary
00:17:43.260 tale or is that an answer? I mean, it can definitely start to look like a cautionary tale,
00:17:48.500 especially when you remember they made him kill himself, right? I mean, it's not
00:17:51.320 like this stuff is guaranteed to turn out right or to turn out well. But there's an interesting
00:17:56.280 thing that you see when you start to read these texts. And I point this out in the book that it's
00:18:02.440 often the case that the thing which will get you hounded out of town today is the seed that's going
00:18:08.700 to grow into the tree of tomorrow's civilization. And that's what happened with Socrates. Now,
00:18:14.660 nobody would wish Socrates fate on anybody. And I don't think that you're destined to be attacked
00:18:20.480 by an angry mob if you stand up for these realities that we're talking about here. But I do think that
00:18:26.280 we should recognize, you know, that the world being what it is, the world being fallen, you're always
00:18:32.560 going to be facing some opposition when you're seeking the true, the good and the beautiful. Those
00:18:38.240 things are, to say the least, they're inconvenient to the powers that be. And without developing a
00:18:44.540 persecution complex, we should be realists about knowing that, you know, it's going to be tough
00:18:49.320 out there. But I always think about this moment in Lord of the Rings, of all places, where
00:18:54.820 Frodo says, I wish the ring had never come to me. Gandalf, the wise wizard says, so do all who live to see
00:19:01.520 such times, but that is not for them to decide. All we can do is decide what to do with the time that's
00:19:06.760 given us. And I think that's the position that, you know, Socrates was in. It's the position,
00:19:11.840 for instance, that Marcus Cicero was in, another thinker I write about in the book, who lived at the
00:19:17.420 very end of the Roman Republic. And, you know, his ideas didn't win the day, but they carried forward
00:19:22.740 into the future until at last, you know, they helped to build this country. And, you know, when
00:19:29.640 we're talking about the reality crisis, we're talking about something that these thinkers have been
00:19:33.180 wrestling with again and again. I also mentioned Aristotle in the book. He's an important figure
00:19:37.880 to turn to. But I think really the biggest question when it comes to despair, right, is are we just
00:19:44.080 looking at cautionary tales here? I think what we're really trying to do is to understand ourselves as
00:19:50.020 inheritors of a tradition that will outlast us. Because even if things fall apart, and I'm not saying
00:19:57.000 they're going to, I'm not a, you know, determinist about this, but even if things fall apart,
00:20:01.500 you want to have been preserving the flame that future generations will be able to pick up. That
00:20:06.920 effort is never wasted, which is one of the things you learn from studying history in the long view.
00:20:13.000 Well, at least we're going to go down swinging. You write about Plato's cave, and this is
00:20:20.300 interesting because it's a story that helps us understand the importance of distrusting
00:20:27.400 elites, which is something you mentioned just a couple minutes ago. Can you get into that?
00:20:31.780 Yeah. Yeah, sure. I mean, this is kind of the original virtual reality dystopia is Plato's cave,
00:20:39.120 and we're all already living in it is sort of the idea. Plato famously in the Republic book seven,
00:20:45.200 he talks about this cave where all of humanity is shackled, and all they can see is shadows on a wall.
00:20:52.700 There's a fire, and there are kind of puppet masters that manipulate the shadows. So people
00:20:58.440 think that's reality, but the truth is that actually outside the cave, the sun is shining,
00:21:03.700 and that's the true light, which is the beautiful and the good, which we only dimly at a distance
00:21:08.520 see reflected in the sort of day-to-day experience that we might have. And I think one of the things
00:21:14.260 that is so powerful about that image is that it gives you a third person outsider's view. It lets you
00:21:21.200 see that even though the people in the cave think they're perceiving reality, they're actually at
00:21:27.400 the whim of the powerful, people who have just one more degree of information and power than they do.
00:21:35.160 And as we start to play around with the virtual reality idea, as it becomes more and more possible
00:21:41.080 to think about ourselves, quote unquote, in the metaverse or uploading our consciousness into some
00:21:46.840 kind of virtual reality cloud, suddenly this idea, which has been the subject of dystopian horror
00:21:54.080 for centuries, becomes like a sales pitch. It becomes, you know, oh, this is something we should
00:22:00.200 all like and want to do. And I think if we look back to Plato's Cave, and if we even look back to,
00:22:06.720 you know, stories after that, which have kind of followed on, like, you know, The Matrix or WALL-E or
00:22:11.960 Snow Crash, these sort of dystopian fiction stories that we've written,
00:22:17.520 they show us our intuition of something which is really true, which is that if you give up the
00:22:21.840 ability to determine true and false, if you give it over, you're always giving it over to somebody.
00:22:27.480 And that person has interests of his own, which might or might not be entirely salutary.
00:22:32.820 So when somebody comes and says to you, you know, digital tech has made it so there's no need for
00:22:37.180 true and false anymore. You can just have everything you want.
00:22:39.840 If you will simply strap on these goggles and live in a virtual reality, the alarm that I'm
00:22:46.160 sounding in this book, and that I think Plato is sounding in all of these, you know, sort of
00:22:50.940 fiction writers after him is, you know, that's always a bum deal, because the person that you're
00:22:56.060 handing over your data to handing over your mind to handing control over to has his own agenda.
00:23:03.060 And that's the elites that you're talking about, as people begin to discover that elites
00:23:07.340 are actually not infallible. And in fact, they have many of the same kinds of sins and temptations
00:23:13.160 that just you and I have. I think it's really healthy and salutary that we're realizing, you know,
00:23:18.400 actually, these guys are not gods. They're not, you know, beyond the problems that mankind has
00:23:24.500 faced for generations. And maybe we should think twice before we hand over our lives to them.
00:23:28.860 Hmm. I'm thinking about this exchange I had. I've mentioned this to the audience before
00:23:33.840 at the beginning of covid when Trump had that very weird, disjointed news conference where he was
00:23:39.800 like, I'm shutting down all travel. And then they're like, no, it's not shutting down all travel.
00:23:43.560 Like he's had like five things he had to correct as soon as the press conference was over.
00:23:47.560 I remember tweeting out something to the effect of I wish I knew who to trust.
00:23:52.620 I recognize I cannot trust what he is saying, but I also recognize I cannot trust what the media is
00:23:58.580 saying about him and about this virus either, because they have an agenda prior to an election
00:24:03.440 and just getting him out and saying whatever he says is wrong. And it was a great frustration that
00:24:08.240 I recognized early on in covid and many people shared. And I love Ann Curry, by the way. She's
00:24:13.300 such a sweet person and I think the world of her. But at the time she tweeted sort of at me,
00:24:18.660 trust the WHO, the CDC, Anthony Fauci. And this was early enough in the pandemic. We weren't yet
00:24:28.280 where we are on them. You know what I mean? Like most of us had to be lied to repeatedly before the
00:24:33.940 light bulb went off about these organizations. But I remember being like, hmm. And to your point,
00:24:39.060 like, think of that. It's the same kind of thing. There's this group that you're supposed to trust.
00:24:44.600 They're the elites. And supposedly they had a little bit more information than we had in the
00:24:49.800 cave. And yet they didn't. And a healthy distrust was very much warranted. And now, you know, most of
00:24:56.720 our view of these people and these groups has completely changed, at least for most people on
00:25:02.300 the right and in the center of the country. That's right. And it's been transformative. It's
00:25:07.060 been transformative for me. That's for sure. I mean, I would have been much more
00:25:11.260 sympathetic to somebody saying, trust the WHO, before the pandemic than I would be now.
00:25:17.720 And it's because, you know, the people who the human beings, the fallible human beings who make
00:25:21.780 up those institutions have betrayed our trust. And that's not you know, that's something that has
00:25:27.200 happened in the past. Machiavelli says that when the elites betray the trust of the people,
00:25:31.660 they do two kinds of damage. They damage their own credibility, but they also damage the
00:25:36.320 credibility of the regime of the country or the nation that they are a part of. And that's why
00:25:42.180 it's so evil is because we don't just lose our faith in this or that governing body. We also lose
00:25:48.160 our faith in the whole kind of structure of power that we're supposed to be kind of believing in and
00:25:53.580 participating in. And this is, I think, really importantly, why our founding fathers who deserve,
00:26:00.560 as far as I am concerned, to be counted among the great thinkers of the Western tradition,
00:26:04.880 you know, they established a principle that actually, you know, the nation is sovereign
00:26:10.820 among nations and the individual is sovereign, you know, within his own personal life and personal
00:26:16.260 decisions. And the reason for that is it's not like there's no such thing as knowledge, right?
00:26:21.100 It's not like there's no such thing as people who know stuff that we don't know and can give us
00:26:25.640 information we don't have. It's that politics and the decision about what to do is actually an ethical
00:26:32.680 decision. We're actually making moral choices, not just about how infectious is this disease or,
00:26:39.360 you know, what's the number of molecules that are operative before you get
00:26:45.320 infected, whatever, but actually about what we should and should not do. And in those questions,
00:26:50.460 questions of ethics, questions of politics, it's not the same thing as a math problem. It's not
00:26:55.980 something that you can trust a scientist to go away and run the model and do the calculation and tell
00:27:00.420 you, Oh, climate change is this deadly. And so unfortunately we have to, you know, we must
00:27:04.340 pass this law. Um, no, no, that is not the idea this country was, was founded on. We believe that
00:27:10.020 when it comes to ethical decisions, uh, it's not a math problem, it's a soul problem. And we together
00:27:15.680 as the people elect representatives who make these choices for us, and we don't just, uh, outsource,
00:27:22.300 you know, our ethical or moral responsibility to these absolute bodies of, of total power and control.
00:27:28.820 Now that whole idea was called into question by the capital P progressives, right? There was this
00:27:34.020 notion that history had moved beyond our system and actually the constitution was kind of outdated.
00:27:39.060 And, you know, what we really just need to do is outsource all of this to governing bureaucracies.
00:27:44.440 It's the birth of the modern administrative state, you know. And this whole notion,
00:27:48.980 which is now kind of deep-seated among, you know, one portion of our polity, um, it's gotta go.
00:27:56.100 If we want to recover the American idea, which is that, yes, there are people who know things. Yes,
00:28:01.620 there are scientists. Indeed. There are even legitimate experts out there. Um, but what they
00:28:06.100 aren't is kings, and they are not, you know, they're not designed by God to rule over us. We are
00:28:13.380 designed to rule ourselves. And in the last analysis, we get to make the decisions.
00:28:17.620 It's, it's so stimulating listening to you. I have to tell you, it's like great for my brain. I love
00:28:23.460 it all, but I'm thinking about right now, just, we've never had a stronger executive, uh, in this
00:28:28.260 country and it was never meant to be, you know, we were fleeing a King. We didn't want that. The
00:28:31.700 founders who were brilliant, didn't want that at all. They wanted the presidency to be the smallest
00:28:35.860 branch, the least powerful. And yes, the administrative state has grown beyond anything they ever envisioned,
00:28:41.220 but even just the powers that we cede, you know, look at Joe Biden just over the past, you know,
00:28:45.460 whatever year, trying to extend the rent abatement program and just all
00:28:53.700 these things that he acknowledged would be struck down by the courts, but he did it anyway, because
00:28:58.900 he thought it would help him politically not to mention the student loan, uh, nonsense that he knows,
00:29:05.860 he knows will not be upheld. Why is he doing that? Because he's, he's acting like a King,
00:29:10.020 right? Congress was meant to rein in the excessive president, the excessive executive
00:29:16.340 branch, and they won't. And now I look at Congress who were supposed to be a bunch of regular folks
00:29:21.140 who decided to serve their country and, and bring their farmer ideals into the office and sort of keep
00:29:26.340 a, a realistic pulse, you know, finger on the pulse of the nation. Now they're a bunch of morons.
00:29:31.520 They're a bunch of stupid morons who just want attention for themselves. And they're congressional
00:29:35.100 Kardashians. Is it any wonder that our politicians feel like they've completely failed us
00:29:39.900 and don't relate to us at all? And they've given over, Congress has given over so much of the power
00:29:45.900 that now exists in these bureaucracies and in the executive branch. You're absolutely right.
00:29:50.800 And this is an area where it is really easy to get into a despair cycle real quick.
00:29:56.060 It's happening. Structural. Spiraling. So here we are. We're in the spiral at this moment. Let me see
00:30:01.540 if I can like grab a, a handhold out of here. Um, in the book, what I discuss, describe is the
00:30:09.200 sort of history of political philosophy that got us up to the place where the founders were able to
00:30:14.060 say, you know what, let's have a Republic, right? There's this long tradition of thinking about
00:30:18.280 what's called anacyclosis, the cycle of regimes. And the basic idea is, you know,
00:30:23.140 let me just say, let me just say, this is like the most interesting part of the book to me.
00:30:26.440 Everybody needs to pay attention to this. This is actually really important. Go ahead. Okay, cool.
00:30:30.220 Yeah. Yeah. No, I'm, I'm glad, I'm glad you like it. Cause I, I find this stuff endlessly
00:30:33.560 fascinating. Um, and so let's go into it. So there's, um, there's three basic kinds of
00:30:39.580 government. Politike in the Greek is the idea of how do you run your society? How does
00:30:45.420 your civilization function? What are the rules and who gets to make these decisions? Right. Um, and,
00:30:51.020 you know, the Greek idea of the polis, which is the city state is kind of the, you know,
00:30:54.920 the petri dish for thinking about this sort of stuff. And Aristotle, who's one of the great
00:30:58.980 thinkers on this topic in his politics, he says, there's three ways that you can organize this.
00:31:03.360 There's three ways that you can arrange the system. Uh, fundamentally one is monarchy. One
00:31:08.920 person rules, right? Another is aristocracy. The best, a few people rule and they're the best people,
00:31:15.320 uh, all the best people as, as Trump might say, are in charge in an aristocracy. Then you have what
00:31:20.200 we would now call democracy, although Aristotle uses slightly different language. But, you know,
00:31:23.520 rule by the many who collectively make decisions. And Aristotle, crucially, this is so
00:31:30.620 important for us to remember because it relates to what you were saying about the breakdown of our
00:31:34.240 system. There is no system that you can construct that will not suffer decay because human beings
00:31:41.180 are a mess. And over time we fall victim to our peccadilloes and our flaws and all of these sorts of
00:31:48.020 things. And he says, the thing that makes the difference. Yes. Hubris is classic. You're right.
00:31:52.220 That's the classical example, pride, overweening pride. And the thing that makes the difference,
00:31:58.000 and this is also crucial. Aristotle says between the virtuous version of a government and the evil
00:32:03.680 version of government has to do with love. It has to do with what the point is of doing politics
00:32:09.740 together at all. If the rulers are ruling for the benefit of the ruled out of love for their
00:32:15.100 citizenry, then you have a good state. If the King loves his people and makes decisions with their
00:32:19.980 interests in mind, he's a good King. Let's say his son comes along now and he's a spoiled brat
00:32:25.460 and he decides actually that he's going to rule for his own benefit. He's going to tax the people
00:32:30.920 heavily so that he can have a more beautiful palace, let's say, or he's going to go to war
00:32:35.260 out of pure spite with somebody, not because he needs to protect the nation. That's what we call
00:32:41.120 a tyrant. And that's the decayed version of monarchy. Now, if a monarchy,
00:32:46.040 which is sort of the natural way of living, where somebody rises up like a strongman or something
00:32:50.100 to run society, decays into a tyranny, it's possible that you'll get an aristocratic uprising.
00:32:55.460 The nobles will say, I've had enough of this taxation. I'm going to take over. So the aristocrats
00:32:59.980 are going to be in charge. The decayed version of this, when they start dealing for their own
00:33:04.380 benefit, for their own self-love, that's called an oligarchy. And we're very familiar with this kind
00:33:09.220 of decay. It's when the elites become corrupt and they rule for themselves and their cronies.
00:33:13.560 And when that happens, it's very possible you get a democratic uprising. The people take control.
00:33:18.600 They take back that power of the system. If the democratic regime decays,
00:33:24.800 then it becomes one of my favorite classical words that we've totally lost: an ochlocracy, which is the rule
00:33:31.180 by the ochlos, or the mob. It's mob rule. And in mob rule, of course, it's very easy for a strong man
00:33:36.540 to come in and take over. And the cycle of regimes begins again. So you get this kind of theory of history
00:33:41.740 that it just goes over and over again. And the whole cycle begins anew. The whole point of a
00:33:48.280 republic, which is what our system is, is to create a perpetual motion machine, take all these
00:33:54.220 different forms of government, these different kinds of power, and you combine them together and
00:33:58.520 you balance them against one another. That's checks and balances, right? And so now you have these
00:34:03.060 different kinds of parts of the government, like an executive branch that has kind of unitary power,
00:34:08.740 but then also, you know, everybody's accountable to the people. So you have that democratic power,
00:34:13.020 but you also have a legislative elite that's supposed to sort of serve as the aristocratic
00:34:17.220 branch and they work together, play off against one another. How is it possible that this beautifully
00:34:23.500 designed system has fallen apart into the decay that you, you described earlier on, right?
00:34:29.780 The answer Machiavelli tells us, and Plato kind of hints at this as well, is class warfare. Once you
00:34:37.640 get to a point where the different parts of the society, the aristocratic parts, the popular parts,
00:34:42.940 the populists and the elites, don't think of themselves as fellow citizens, but think of
00:34:46.880 themselves as members of a tribe. You're a white person, so you're inherently racist. You're a man,
00:34:52.920 so you're inherently sexist. You're straight, so you're inherently homophobic, right? Once you get
00:34:56.980 people thinking that way, you have poisoned the mechanism of the Republic and you have destroyed,
00:35:02.020 going back to that very crucial thing that Aristotle talks about, you've destroyed civic
00:35:06.380 love. It's love and friendship and neighborliness that makes a civilization what it is. These sort of
00:35:12.640 small daily acts of marrying and being given in marriage, of forming rituals together, going to
00:35:18.500 ballgames together. These things sound so simple, but they are the stuff that the community is made out
00:35:24.140 of. And this finally is the foothold that I think we can get out of our despair cycle because we're
00:35:29.980 not going to rewrite the system so that it all gets fixed overnight. But what we can do and what
00:35:35.760 in some places we already are doing is reinvest in that philia, that local love and neighborly
00:35:42.260 friendship that makes a civilization what it is. You're starting to see this in states, I think,
00:35:47.340 like Florida, out here in Tennessee, in Nashville. I see it happening every day. These local societies,
00:35:53.240 associations that go to the school board and fight for what they believe in and talk it out with their
00:35:59.420 neighbors, figure out how we're going to rule ourselves. It's bottom up, not top down. And I
00:36:05.400 think that's the way to kind of reclaim some ownership and start to move in the direction
00:36:09.740 of fixing the problems that ail the country. Of course, it's going to take many years. And of course,
00:36:14.160 there's still national elections to worry about and all that. But unless we get a sense that actually
00:36:19.160 we have ownership over our communities, we won't even get started. It's philia, political love,
00:36:25.800 civic friendship that really needs recovering in this hour. We have to talk about that more. I want
00:36:31.160 to talk more about how we can make that happen if we don't feel like it's happening in our community.
00:36:35.480 But I will say just listening to you, I was reminded of just the other night I was watching the Super
00:36:39.420 Bowl with my kids. And, um, of course, we had to have the black national anthem before we had
00:36:45.720 the actual national anthem, which in this context is divisive. You know, the kids are
00:36:51.040 sitting there like, well, what is that? Why is there a special
00:36:54.900 anthem just for black people and not for the white people? It's like, I don't, you know, how are you
00:36:59.220 supposed to explain this? Right? So it's like, yes, it is divisive. It is divisive. Not in this
00:37:03.260 context. It's divisive. Um, absolutely. You know, and then they played, we're just, I'll just
00:37:08.440 finish it, but then they played the national anthem, the actual national anthem. And of course,
00:37:12.960 I made my kids even sitting in my kitchen, stand up, put your hand on your heart and they did it.
00:37:17.160 And why did I need to do that? Nobody could see us. It didn't count for anything.
00:37:21.820 It counted because of the principles you're talking about right now, because I bet there were kids and
00:37:26.500 adults all over the country doing the same thing. Love of country, love of the ideals that this
00:37:31.600 country was built upon and stands for still. And that many of us are still trying to live by
00:37:35.800 like, that's what we salute. That's what's important. Listen to the words of that song.
00:37:40.040 Listen to the principles that have been handed down, not dividing us based on race.
00:37:46.060 There's something so much bigger that ties us all together. We need to get back to that. Go ahead.
00:37:51.160 Oh no. Amen. I'm, I'm glad that you, you finished there because I had a sort of similar experience
00:37:57.260 recently where I think it was in Orange County. Somebody was saying, I can't believe that they
00:38:02.780 voted to take down the pride flag outside of this public building. Um, and it's just a sign of
00:38:08.740 hatred and more of the right wing campaign to yada, yada, yada. You know how this stuff goes.
00:38:13.800 And you know, they always make it out as if, you know, if you, if you're in favor of taking the
00:38:18.640 pride flag down, you're a bigot. And they hold people emotionally hostage because, as I well know,
00:38:23.880 you know, people who love whatever, who have a gay person in their life that they love,
00:38:27.840 they feel like, you know, if I don't go along with this political movement, with this other flag
00:38:33.400 outside of my public buildings, then, you know, I'm betraying my family. I'm betraying my loved one
00:38:38.660 and I'm, I'm hateful and I'm sorry, but my flag is the American flag and the flag that flies outside
00:38:45.920 my public building should be the American flag. And I will not be used as a prop in somebody else's
00:38:51.120 kind of neo-Marxist campaign. You know, they do this in a million different ways. They just find
00:38:56.060 the thing that you care about, you know, Oh, you're bad because you hate women. If you think
00:39:00.280 that like, you know, men can't magically become women or you're bad, right? Because you don't
00:39:04.600 want to sing the black national anthem before the, you know, before the Super Bowl. The other one
00:39:10.200 that really got to me was the pandemic of the unvaccinated, which president Biden said at one point,
00:39:15.340 it's like, okay, so there's a whole portion of the population that is tantamount to a disease.
00:39:20.480 That's really what our political rhetoric is going to be. And this is the kind of hope and
00:39:24.500 change and transformation and the return to normalcy. I'm sorry, but you know, first and
00:39:28.940 foremost, before anything else, when it comes to politics, I am an American. If we can't say that,
00:39:34.000 then we're in trouble. But I think there's a lot of people out there that are ready to say that
00:39:37.360 if, if we have courage and sort of lead in that regard. I saw you had a great comment,
00:39:42.600 in connection with that flag controversy, quoting Inez Fletcher of the Claremont
00:39:47.140 Institute, saying she must surely be right that no actual homosexual can possibly have been involved
00:39:52.060 in the design of something so grotesquely tacky as the LGBTQ flag.
00:39:58.000 These days, every day now they put some new color on it. It's like they can't even,
00:40:02.820 you can't even get a gay person to hang drapes that don't match the carpet. And you're trying
00:40:07.440 to convince me that there's some, you know, coalition of like all the gay people out there
00:40:12.420 are building. I don't buy it for a second. I think it's the Borg flag. I think it's like
00:40:16.360 this sort of weird amalgamation of everything that they want to use to destroy the country.
00:40:21.440 I had never considered that before, but it is absolutely true. I'll never look at the flag
00:40:25.960 the same again, Spencer. Thank you. Stand by much, much more on the opposite side of this break
00:40:30.760 with Spencer Klavan. Spencer Klavan is my guest today. He's the author of the new
00:40:39.320 must, must read book, How to Save the West, which is out today. I'm telling you,
00:40:44.920 I don't say this about every book, every book. You must buy this. It is short. He actually makes
00:40:49.440 it an easy read, even for those of us who are dummies when it comes to classics. Another thing
00:40:54.320 I love about his podcast. So buy the book, you won't be sorry: How to Save the West. I blurbed it for a
00:41:00.120 reason, not just because Spencer's a friend, but I truly want everybody to read this. And people
00:41:04.120 write in Spencer all the time saying, is there anything I can read to help me make sense of
00:41:07.300 the craziness happening in our world right now? And I've been recommending this, so I'm glad it's
00:41:10.400 finally out because I had the pleasure of the advanced read and now everybody gets to have it.
00:41:15.820 All right. I want to follow up on the cycle. So monarchy into aristocracy slash oligarchy
00:41:21.920 into democracy, into mob rule. And then does it go again back to monarchy? And where are we
00:41:29.860 in the cycle? Where is America right now? Obviously, we're technically a republic,
00:41:34.160 but I would imagine we're in the democracy into mob rule phase. But is monarchy coming our way if
00:41:38.800 this all fails? What's happening next? Well, it's really interesting. The way I think about this
00:41:44.460 theory, this theory of the cycle of regimes is it's not a prescription and it's not a prediction of
00:41:51.460 what's going to happen tomorrow. It's like a template. And once you have it in your mind,
00:41:56.520 you can see pieces of it playing out sort of like snatches from a familiar tune, you know,
00:42:02.020 like when, for instance, the barons rise up against King John because of his oppressive taxation,
00:42:07.800 you start to see that force of the aristocratic rebellion against a tyrannical king. And that
00:42:13.380 idea is really what we're looking at is dynamics that are always at play. They're eternal because
00:42:19.060 they're part of human nature, so they never go away. And we can use them to understand each new
00:42:23.860 thing that comes up. So what I say in the book is what I think we're in is kind of an interesting
00:42:28.740 position where at home, we're sort of looking like a decaying republic. And when republics decay,
00:42:36.340 they turn into oligarchies. They get seized by an elite and you start to get that war between the
00:42:43.140 different classes, the social classes. And by the way, a lot of this was done on purpose by the new left,
00:42:49.100 by Marxists who figured out that there wasn't going to be an economic revolution in America.
00:42:55.760 So the way to bring about revolution here was to foment different kinds of classes, you know,
00:43:00.380 to make them hate one another. And this is where you get ideas like white privilege, you know,
00:43:04.900 people like Noel Ignatiev and some of his colleagues talking about white skin privilege. That's where this
00:43:09.900 stuff comes from. And it's how we decay from a republic at home into this kind of oligarchic,
00:43:15.580 you know, weird court state. But overseas, you know, we have this similarly strange thing going
00:43:22.260 on where we're kind of almost an empire, right? We've extended our power across the world so
00:43:27.960 enormously. And we've done so kind of informally in all of these ways, you know, through NGOs and
00:43:34.000 with all of these, you know, many different ways that we exert influence over other nations.
00:43:37.580 Sometimes the influence is good. Sometimes it's not so good. But the truth is that we have this kind
00:43:42.900 of global network of influence that lands us in all sorts of trouble and complications. But
00:43:48.500 I think the real issue is not so much that, you know, those networks are falling apart,
00:43:54.780 but that they're falling apart because we are falling apart at home. And as I said,
00:44:00.500 the way we're falling apart is that class warfare. Machiavelli, who's somebody that maybe not
00:44:06.240 everybody will be familiar with as, you know, the great theorist of republics, they think of him as
00:44:10.480 the kind of realpolitik scheming, you know, author of The Prince. But he has a book, The Discourses on
00:44:16.840 Livy. It's a really beautiful examination of the great Roman historian, Livy, who told the story of
00:44:23.540 the transition from monarchy into republic among the Romans. And
00:44:29.400 Machiavelli has this amazing passage where he sits around and he basically tries to figure out whether the
00:44:35.200 elites or the people are worse. Like which one is, you know, is it the populists or the elites that are
00:44:40.700 that are worse? And it's a very relevant passage for our times because we, of course, and we've been
00:44:45.740 talking here on the show about how terrible our elites are and our experts. And I believe all of
00:44:51.640 that stuff. But I also see how people could say, yeah, but the populists can be just as bad,
00:44:56.600 right? They could be. What about January 6th? What about these, you know, kind of excesses of
00:45:01.220 populism that we flirted with and all of this? And what Machiavelli ultimately concludes is that
00:45:07.140 although both of these things are a danger, elite decay, elite corruption is the most dangerous
00:45:15.580 thing because it destroys faith in the system. It destroys, betrays the trust, not just of the people
00:45:23.080 in the elites, but of the people in the country that elevated those people to positions of power.
00:45:28.220 And so you start to get that despair cycle again. It's like, how do we, you know, even operate in
00:45:33.080 this country when the systems that elevate people into positions of power are so broken? And so,
00:45:40.060 you know, there's a couple of ways out of that, not all of them very pretty. And as I say in the book,
00:45:44.620 you know, we do not want to head into another civil war, into another form of secession. These are
00:45:49.440 all things that I, at least, very deeply, fervently pray will not come to pass. And so we ought to think
00:45:54.720 about what's the remedy to, uh, elite capture that doesn't involve all those, those terrible
00:46:00.420 outcomes. And the one that I pull out of these classical texts is that investment in the local
00:46:08.100 community. You know, when, when this country was founded, there was a big debate going on about
00:46:12.380 whether you could even do a republic over such a large extended space. Many of the European
00:46:18.720 theorists, especially the Baron de Montesquieu in France, had this idea that, you know,
00:46:24.140 republics had kind of been tried, uh, Rome did it pretty well, but then they got too big, and
00:46:29.380 that's when you start to see all these problems, it kind of fell apart. And our founding fathers, especially
00:46:33.580 James Madison had this argument that actually, uh, a big country is an advantage for, uh, a Republic
00:46:42.020 because there's room to breathe. And he said, if you extend the sphere of your, of your country,
00:46:47.720 you're going to end up with all these little pockets of community where people can do things
00:46:52.220 in ways that maybe they don't approve of back in Washington. You know, maybe you get like little
00:46:56.940 Amish communities or you get places like Florida where, you know, they're not going to lock down
00:47:01.660 for the COVID mandates, or at least, you know, they're going to be a lot less intense about it.
00:47:05.880 And I think even though that system has been attacked a lot, it remains our best hope. And it
00:47:12.800 remains where I see the most exciting action going on because it's in those communities.
00:47:17.160 Um, and it's in those local neighborhoods and then up to the state level, um, that the problems
00:47:22.460 become human sized and they become at a level where people can talk to one another. They can
00:47:28.180 see each other face to face. Um, we don't reduce one another into these kinds of abstract concepts
00:47:33.400 like, you're a, whatever, a blue-haired lib and I'm a fascist Republican or whatever. Um, we can
00:47:39.320 actually talk at a human level about particular solutions to the particular problems that face
00:47:43.660 us. And I think that's why you're seeing so much movement, say on the school boards. Um,
00:47:48.860 you're seeing a lot of hope coming out of states like Florida where people are flocking to, you
00:47:53.320 know, they can't move there fast enough. Um, it's because in those, uh, local communities,
00:47:58.280 those little platoons, as Edmund Burke called them, you can actually establish philia. You can establish
00:48:04.180 love, civic friendship. And if there's one thing I draw out of Aristotle in this book, it's that
00:48:09.500 civilization building for all that it's political, for all that it involves voting and fighting and
00:48:14.220 whatever at the bottom civilization building is an act of love. And we've got to recover that.
00:48:19.620 We can't be ashamed about that. We have to re imagine ourselves as, uh, neighbors and citizens
00:48:25.640 in a community built on love. Hmm. I love everything you just said. I also think it's a good reminder
00:48:32.040 that doing all of that is not an online activity. It does not happen on Twitter. It does happen
00:48:38.300 in your actual neighborhood. I always say that all my friends in my Upper West Side neighborhood,
00:48:42.760 who are still my best friends, they're all liberals. I love them. I see how our love for
00:48:47.900 each other can be the foundation for the renewal of our society. I couldn't care less what their
00:48:52.040 politics are. I care about who they are as women. So it's, it's just a reminder. It's not the
00:48:58.220 metaverse. It's not Twitter. It's not Facebook. It's the people within 15 feet, the family,
00:49:03.300 the neighborhood, the friends, where we cultivate the solutions. How to Save the West: read the
00:49:10.560 book, listen to Young Heretics. And aren't we all so lucky to have Spencer Klavan available to us?
00:49:16.160 Thank you so much for being here. Oh, Megan, I'm the lucky one. Thank you so much for having me.
00:49:20.800 It's such a pleasure. Hope to see you soon. And up next, we have more goodness for you as we take
00:49:25.720 a deep dive. It's going to be a fair and balanced debate on tech censorship. You've heard about Section
00:49:30.380 230. You don't know what it is. You're going to know. And you're going to know about the debate
00:49:34.500 that's underway right now at the Supreme Court and beyond when it comes to
00:49:39.040 big tech and free speech in America. Are you concerned at all about censorship on tech platforms
00:49:49.240 and free speech in America? Do you feel like you've been targeted? Do you feel like what you can see online
00:49:54.720 has been targeted in a way that makes certain viewpoints unavailable to you? This affects
00:50:00.860 everyone. But what is the right solution? It sounds kind of wonky, but the topic of section 230 is
00:50:08.340 important and it's affecting your daily life, whether you know it or not. And it's also being kicked around
00:50:13.680 right now by the U.S. Supreme Court. Section 230 is a landmark U.S. law that shields social media
00:50:19.800 companies from liability over content their users post. So if I go online on the YouTube comments
00:50:28.660 section and I see something totally defamatory that's not true about somebody, I could get sued
00:50:34.860 potentially, but YouTube can't. YouTube, they're not responsible to police my thoughts because they're
00:50:40.920 not considered really like a publisher the way, let's say, a newspaper is. Remember when Amber Heard got sued by Johnny
00:50:47.380 Depp for defamation? She posted something in the Washington Post. See, it's more dicey when you're
00:50:53.640 the newspaper than when you are a social media company. So like newspapers are held to a higher
00:51:00.140 level. Social media companies are held to a lower level. Some people think that should change and many
00:51:05.800 people do not. All right. So now the Supreme Court is going to hear next week a case involving this and
00:51:10.800 Google. And today we decided to get together two true experts on this issue who have vastly different
00:51:17.340 opinions on these very important topics. We're going to have a good, respectful debate between
00:51:22.720 Kate Tummarello, Executive Director of Engine, and Will Chamberlain, Senior Counsel for the Internet
00:51:29.420 Accountability Project. Kate and Will, thanks so much for being here. Yeah, thank you for having me.
00:51:35.900 Yeah, thanks for having us. Yeah, great to have you both. All right. So let's just, we're going to keep
00:51:39.840 it simple so people who don't follow this anywhere nearly as closely as you guys do can follow it. So Kate,
00:51:44.540 I'll start with you: briefly describe what 230 is. We throw this term around, 230, 230. Should
00:51:49.860 it be repealed? Should it not be repealed? Josh Hawley doesn't like it. Elizabeth Warren doesn't
00:51:54.600 like it. Very weird bedfellows. But we know that the big tech platforms do like it. So we don't know
00:51:59.860 what side to be on. Right. Because it's like, wait, even conservatives are like, I don't want to
00:52:03.920 side with Elizabeth Warren. Right? So they get confused. What is 230?
00:52:08.260 Yeah, so very simply, Section 230 is a 1996 law that essentially says, whoever created the content
00:52:16.500 should be legally responsible for it, not the platform that's hosting the content. And it often
00:52:21.600 gets talked about in the context of social media, that makes a lot of sense. That's how most of us
00:52:25.260 deal with the internet these days. But it's actually much broader than that. It applies to all internet
00:52:30.080 platforms of all sizes, whether you're hosting social media posts, like tweets, or YouTube videos,
00:52:35.420 or Facebook posts, or Instagram photos. But it's also things like reviews and ratings and
00:52:40.260 photos and videos that you may be sharing on a smaller scale. So while the debate is often about
00:52:45.200 kind of 230 and big tech, Engine is a nonprofit that works with startups and internet creators.
00:52:49.900 And we're coming at it from the perspective of kind of the whole ecosystem and why the law and
00:52:53.560 the liability shield really empowers users to speak and share content and information online.
00:52:59.960 So, Will, what most people say, and I've listened to Ben Shapiro, who, of course,
00:53:03.260 has created and helps run the Daily Wire, he says, you know, he's got his reservations,
00:53:09.000 but he likes 230 because he's got the Daily Wire and he doesn't want to be held liable. If somebody
00:53:14.180 posts a comment on the Daily Wire that turns out not to be true. And then, okay, that person gets
00:53:19.420 sued for defamation, but now Ben's getting sued for defamation. What's Ben going to do? He's going to
00:53:22.780 say, forget the comment section, forget all of this. I'm not going to engage in this business
00:53:27.080 if I'm liable for what my users post. So most tech platforms, whether it's the Daily Wire,
00:53:33.240 YouTube, Twitter, whatever, they like this shield because they don't really view themselves as in
00:53:38.560 the business of endorsing the content that their users post. Right. And it's important to realize
00:53:44.480 that the immunity in 230 is bigger than merely, what our users post, we're not responsible for.
00:53:50.240 They also have an immunity protection for anything that these companies remove. And that's granted at
00:53:55.200 the federal level. And so one of the big issues with 230 is that this federal grant of immunity
00:54:00.000 for any type of removal of content, anything deemed even otherwise objectionable, that ability to remove
00:54:07.660 it has thwarted laws at the state level that have been trying to protect individuals' right to
00:54:13.200 speak freely online. So I think we actually might be in agreement to the extent that, yeah, your
00:54:18.080 average message forum or even social media platform shouldn't be held responsible for every single
00:54:22.880 thing its users post. I mean, the scale is enormous. But at the same time, I don't think
00:54:27.000 they should be given carte blanche to censor people for political reasons. All right. So let me
00:54:32.140 stay with you for a minute. So you're going to explain to us why 230 is problematic in your view
00:54:37.700 in its current form. And it sounds like you're focusing in on this this ability to remove with
00:54:43.120 impunity. Is this what they use to censor? And I realize it can be any viewpoint, but it's oftentimes
00:54:49.740 the conservative one. Is this problematic because the right so often is the brunt of it?
00:54:58.140 Right. So there's two major sections of Section 230, two subsections. The first one is the one that
00:55:04.720 you and Kate have been talking about, which is the one that says that just if you are the platform
00:55:09.520 hosting the content, you are not the speaker of the content that is created by your users. And that is
00:55:15.240 sort of there to insulate you from defamation liability. Like just because somebody posts
00:55:19.240 something defamatory on Facebook doesn't mean Facebook should be getting sued by the person
00:55:23.820 who was lied about. But then there's the second part, subsection C2. And that part says that all
00:55:31.000 of these tech companies and platforms basically are immunized from liability for removing any content
00:55:35.920 that's lewd, lascivious, and a whole bunch of other negative words or otherwise objectionable.
00:55:40.760 And so that does give them some protection to remove things like pornography, for example.
00:55:45.660 But it's so broad that it means that the companies can say, no, we just want to kick you
00:55:50.780 off the platform and remove your ability to speak. And not only do you have no recourse just generally,
00:55:55.880 but if a state or even the federal government tries to create a law and give you some recourse
00:56:00.740 against these companies doing that, if you're a state, it would be what's called
00:56:04.760 preempted by the federal law. And so that-
00:56:07.700 Let me jump in. Let me jump in and ask you, what does give me a real life example of how
00:56:12.560 this has been used in a way you find problematic?
00:56:15.780 Oh, sure. A good example is, Florida passed a law. This was in the news, I think, about a year ago.
00:56:21.980 Florida passed a law that said that their citizens shouldn't be censored on social media
00:56:27.600 for no good reason, and that if they are being censored for political reasons, they should have
00:56:31.120 the right, I believe this was the Florida law, they should have the right to sue the big tech
00:56:35.560 companies. That might have been Texas, actually. I'm pretty sure it was one of those
00:56:39.720 two. But in any event, at least for the Florida law, the Florida law was found to be unconstitutional
00:56:44.500 in the courts, I'm sorry, invalid, in the courts. And part of the reason was it was just
00:56:49.800 preempted by Section 230. And the way our federal system works is that if a federal law speaks to an
00:56:55.680 issue and it's a power within the federal government's power, it trumps any state law that
00:57:00.720 would contradict it. So having this broad grant of federal power really hurts the ability of states
00:57:07.140 to protect their citizens from censorship. OK, but wait, let me follow up. Because what I
00:57:10.160 would like is a specific example. You know, let's take, I don't know, Steven Crowder.
00:57:16.440 He got demonetized by YouTube. Is that in this lane? Or, like, give me an example of somebody
00:57:22.580 we would know or some kind of incident we would know where they exercise the power to remove
00:57:27.200 in a way that you don't want them to be shielded for? Oh, sure. OK, Laura Loomer was removed from
00:57:33.420 Twitter, for example, and she was actually the plaintiff in a major lawsuit against Twitter.
00:57:37.620 She was represented by a lawyer, Ron Coleman, who actually wrote the white paper
00:57:41.760 talking about 230 reform. And she was essentially suing Twitter to get her account reinstated,
00:57:46.920 saying she was censored for political reasons. And that suit was thrown out of court on,
00:57:53.380 among other grounds, the idea that Section 230 precluded any sort of remedy.
00:57:59.000 Hmm. All right. What about that, Kate? So now we've sort of discussed some of the virtues of 230,
00:58:04.020 but we'll zero in on C2, the removal and how it was used. Laura Loomer is a controversial person.
00:58:10.360 But this is always going to be used against, for the most part, people who you could dub controversial.
00:58:14.680 I mean, half the time people call me controversial. It's like, OK, fine.
00:58:16.920 So this is the problem, because in America, we don't tend to censor controversial speech. We tend
00:58:22.200 to believe that the answer to controversial speech or speech you don't like is more speech,
00:58:25.140 not less, not to censor the original offending comment. Yeah, I think there's a lot to dig into
00:58:31.300 here. And one of the most important pieces of context is that C2, right, is part of Section 230,
00:58:36.720 but it doesn't operate in a vacuum. All platforms, all businesses, all people are protected by the First
00:58:41.580 Amendment. And that includes, right, the government can't censor speech, but the government also can't
00:58:45.880 compel speech. No government, federal or state, can compel anyone to host speech
00:58:50.800 or to make speech. And that's actually the main reason that the Texas and Florida laws are having
00:58:56.020 so much trouble and what the courts have been looking at. You know, 230 obviously is involved,
00:58:59.880 but the real question at the heart of those cases is, do those laws violate the First Amendment?
00:59:03.460 And some courts have said yes, some courts have said no. It's likely the Supreme Court,
00:59:07.320 they're looking at it right now, they will decide whether to pick it up. They just asked for the
00:59:11.640 Solicitor General to weigh in. And so I think that there's kind of this idea that C2 is what enables
00:59:17.340 platforms to take down speech. But in practice, it's really the First Amendment.
00:59:22.260 And that's, that's pretty in line with a lot of- Wait, wait, just to stop you. It's,
00:59:26.180 it's the social media company's First Amendment right to take away-
00:59:30.560 the same way that, you know, the court has found that a bakery has a First Amendment right to refuse
00:59:36.120 to bake a gay wedding cake. The social media companies have their own First Amendment rights
00:59:40.620 to not be censored and to not be compelled to host speech. So C2 really is a modifier to C1.
00:59:47.260 This dates back to pre-internet laws about, kind of, like, bookstores, and courts looking at,
00:59:53.940 you know, someone who distributes someone else's speech,
00:59:57.400 when can we hold them responsible? And in the 1990s, the courts were looking at different cases
01:00:02.720 and said, if you moderate heavily and you miss something that is illegal,
01:00:06.780 then you're held responsible. And Congress stepped in and said, whoa, whoa, we don't want a world where
01:00:10.760 you have a disincentive against moderation. We want platforms to be able to appeal to niche audiences,
01:00:16.480 to be able to cater to specific people, to be able to provide safe spaces without things like porn
01:00:21.040 or harassment or spam. And so they, that's why they put C2 in the law. And so while it is certainly an
01:00:26.960 important part of the law, the First Amendment underlies kind of all content moderation.
01:00:31.060 So I think that's, that's just worth calling out. In addition to the First Amendment,
00:33:920 when you sign up for a social media service, you agree to terms of use, you know, a terms of service
01:00:39.060 agreement that usually bans lots of things. And it's really up to the platform what they include
01:00:43.800 in that. But if you violate those terms of service, then you're totally allowed to be kicked off the
01:00:49.240 platform. You violated a contract with the company. And so it's not as if, absent 230, Twitter or whoever
01:00:56.480 has a legal obligation to host your speech. It's just that, absent 230, they don't have to worry about
01:01:01.860 fighting it out in court, which can cost millions of dollars versus getting it dismissed under 230,
01:01:06.540 which only costs somewhere between tens of thousands and maybe a hundred thousand dollars.
01:01:10.820 What about that, Will? So if Laura Loomer, if there were no Section 230, what would be her grounds for
01:01:16.620 a lawsuit against Twitter for bouncing her off? Because it's a private company. They're like,
01:01:20.500 we don't like you. We don't like the color of your hair. You're booted.
01:01:22.580 Right. Well, I mean, there's regulations on private companies, and there have been serious
01:01:28.100 ones going back to the teens when it comes to common carrier type regulations or the 1960s with
01:01:33.140 civil rights and public accommodation. Slews of private companies are under various regulations
01:01:38.660 that say you're compelled to provide service to people and you're not allowed to arbitrarily
01:01:42.080 terminate it. That's not true in every aspect of the American economy, but it's true in many of them.
01:01:47.440 And we don't see those necessarily as First Amendment violations of freedom of association.
01:01:50.840 And another point, I mean, we were just talking about subsection C1, the point that because these
01:01:56.480 platforms aren't speaking, because they're just hosting tons of user-generated content,
01:02:01.700 well, they're obviously not the speaker, so they should be insulated from liability.
01:02:05.800 But then all of a sudden, when you want to say, okay, well, you're not the speaker,
01:02:09.340 so you should still be forced to host other people's speech because your platform is so huge.
01:02:14.980 It's the public square. And then these companies all of a sudden say, whoa, whoa, whoa,
01:02:18.240 you're violating our First Amendment rights to speech. I think it's unfair for these companies
01:02:21.760 to have it both ways. If they want liability protection for hosting user-generated content,
01:02:27.000 then they should concede that they are not the speaker of that user-generated content and that
01:02:31.100 their speech rights aren't being violated when a state, in its wisdom,
01:02:36.420 decides to give its citizens the right to speak on that platform.
01:02:39.560 How about that, Kate? How can they argue out of one side of their mouths,
01:02:41.780 we are not in control of the content, and out of the other side,
01:02:45.620 oh, we are very much in control of the content.
01:02:49.140 I don't think any platform would ever say that they're in control of the content. I don't think
01:02:53.260 that's a fight anyone thinks they can win. And I think this really kind of comes back to-
01:02:57.640 Well, that's what C2 is. That's C2. The reason they are able to remove porn and Laura Loomer is C2.
01:03:05.560 They are in control, ultimately, of the content.
01:03:07.480 I mean, they're in control of kind of the environment they create. But, you know, we live in a world because of the internet and because of 230 where anybody can create and share content on the internet instantaneously and not have to worry about, you know, like YouTube, for instance.
01:03:23.960 It used to be if you wanted to distribute a TV show, you had to hope that a cable channel and a TV studio and all the people necessary to participate in that process would agree to host you and to do that.
01:03:33.780 You don't need that now. Anybody can upload anything to YouTube.
01:03:36.180 And YouTube, of course, has rules around what you can upload. They don't want things like terrorist content, which is at the heart of Gonzalez.
01:03:42.040 They don't want porn. That's not the business they're in. So they're able to make those decisions.
01:03:46.280 That doesn't make them ultimately liable for the speech, though.
01:03:48.700 And I think if we lived in a world where they were liable for the speech, you would either have so much money and time spent on content moderation, to the point where the internet stopped working the way we're used to, where some human had to review every YouTube video before it's shared, which is crazy and not feasible considering how much content is shared, or companies would err on the side of not hosting things.
01:04:09.200 And so to your point earlier, Megan, if someone's deemed controversial, maybe it just automatically gets taken down. Maybe they just don't host any of that person's speech anymore.
01:04:16.040 There's no platform at all. YouTube's like, you're out. Yeah, you're out. Forget monitoring your content and your comments. You're done. You're not worth the risk. What about that, Will?
01:04:27.600 I mean, I basically agree that there would be a parade of horribles in a world where there wasn't some liability protection from user generated content, and that these companies would have a very difficult time doing business. But what are we concerned about at the Internet Accountability Project and those of us on the right who want to regulate big tech? Well, we're concerned about censorship.
01:04:46.400 We're concerned about major big tech monopolies using their monopoly power to censor political opinions they don't like. And we want to change the law to constrain their ability to censor us. We see it as a sort of collective regime of private discrimination. And the remedy to that is state and federal law.
01:05:05.420 And so from my perspective, I think the First Amendment debate will ultimately shake down in our favor because there are a variety of precedents that suggest that if you're not seen as the speaker, that states can protect the right of people to speak on your property and essentially compel you to allow them to speak on your property.
01:05:23.060 And that's pretty analogous here. So the 230 problem is that there's a federal law that stops states from protecting the rights of their own citizens.
01:05:30.620 And so I'm willing to effectively concede the idea that these companies should have liability protection from user-generated content and then use that to say, you're not the speaker. We should have the right to constrain your ability to censor people if your platform is large.
01:05:44.060 So how would it work? Like if you could revise 230, well, you'd leave number one, C1 in place where it says you're immune. You're not going to be liable for a comment. In other words, one of the good examples was Yelp.
01:05:58.700 If you read a negative review of a restaurant on Yelp, the restaurant can sue you as the commentator, but they can't sue Yelp. And you guys are both in agreement that under C1, Yelp cannot get sued and we don't want to mess with that.
01:06:11.420 So how would you like to see C2, the one that lets them remove certain things like terrorist content, pornography and so on, how would you like to see it changed to pull back on some of what many of us believe is political censorship for the most part of one particular viewpoint?
01:06:30.100 Sure. So one simple idea would be to remove the catch-all term that says that companies can remove content that is otherwise objectionable.
01:06:39.540 That gives them a huge amount of leeway to remove content and preempts laws that would protect against political censorship.
01:06:45.100 And another tweak to the law would just say that their good faith belief that the content is violative of their terms of service would have to be objectively reasonable rather than subjective.
01:06:54.880 That would be another tweak to that law that would make it so that states could write their own laws protecting their citizens from censorship that also go along with an objectively reasonable good faith standard.
01:07:06.120 And I think the ultimate world we're trying to lead to is one where people have a meaningful remedy and meaningful predictability about what content will get them kicked off platforms and what won't.
01:07:14.880 And if they are arbitrarily censored, they'll have a remedy: they can go to court.
01:07:18.980 The thing that's attractive about that, Kate, is what Will was saying a moment ago, the public square, right, that these social media companies have become these behemoths that we never envisioned, you know, 20 years ago.
01:07:32.400 I was just thinking about this because I talked to my old pal from Fox News, Rick Leventhal, the other day, and I was telling him about how the coverage I saw on 9-11-2001 was the reason I left the law and got into news.
01:07:44.880 Well, that same time, I remember I was in Chicago, 2001-2002, and a friend of mine was dating a guy, and she said, oh, I Googled him.
01:07:55.220 And I was like, what does that mean?
01:07:57.000 This is 2002, right?
01:07:58.540 What does it mean to Google somebody?
01:08:00.520 That's 20 years ago.
01:08:01.720 That's when, even before that, these laws were being passed to govern a body, the internet, the social media companies, when the lawmakers had no clue what they would look like, what they would be like, how important they would be to the public dialogue, right?
01:08:19.820 So I see his point that, like, this is the public square, and shouldn't we be revising the regulations that govern them, understanding now what the pros and cons of this whole thing are?
01:08:30.560 And I would say to that, if you like the way Twitter or Google or Meta is moderating content and you want more of that, then yeah, revise 230, because that will ensure that those are the only companies that can continue to exist, given the framework that 230 created.
01:08:46.540 There's a reason we are the global leader in internet companies that host content, and there's a reason that U.S. policy has led to the vibrant internet world we have today, and changing it will ensure that only the largest companies, which can afford to fight lawsuits, can afford to spend hundreds of millions of dollars on content moderation technology, can afford to hire tens of thousands of content moderators, can stick around.
01:09:08.080 There's a reason that those companies will be fine if you change 230, and in fact, several large companies probably would be okay with you changing 230.
01:09:14.460 We come at it from the perspective of, what about the next generation of innovative and novel social media, but also other companies?
01:09:21.060 And I think it's really easy, especially in political circles, to think that the internet is Twitter and the internet is Facebook, but it's not.
01:09:27.500 There's a lot more than that.
01:09:28.500 We talk to companies every day that are doing really cool, unique, new things, including in the social media space, and if they don't have 230, they won't make it to be the next Facebook.
01:09:38.960 And even looking at 10 years from now, let's say, I don't think there's a guarantee that Facebook or Twitter or Google will be the size they are anymore.
01:09:46.500 I think there's been so much innovation in the social media space.
01:09:49.580 It can feel, again, because we're kind of in this echo chamber, that we're just all on Twitter.
01:09:55.040 But Discord is being used in new and innovative ways to essentially mimic social media.
01:10:00.600 I wouldn't have predicted the rise of TikTok, and that's such a big platform for a lot of people.
01:10:04.300 So I think it's dangerous to think that, because we're in this moment, we need to regulate specifically for this moment,
01:10:09.820 when doing so will tip the scales to ensure that only the largest companies can stick around.
01:10:14.600 And I would like to see different companies in place in the next 10 years, but they need 230 to grow, because otherwise they will be sued out of existence before they get a chance to really take off.
01:10:24.400 That's a good point, Will.
01:10:25.280 You can take one thing to the bank.
01:10:26.620 It's the litigiousness of the American people.
01:10:29.840 Well, yeah.
01:10:30.240 I mean, there's a reason I said that I don't really want to touch C1.
01:10:33.160 I take Kate's point, and I've believed for a long time, that small startups especially need liability protection from user-generated content.
01:10:41.140 That's not at all what my argument is.
01:10:43.140 And I think my argument is indeed focused on the major tech platforms and the laws I advocate that protect people's right to speak, essentially a private right of action for individuals so they could walk into court and sue.
01:10:56.340 Those laws would require that the defendant company have something like 100 million users or more.
01:11:02.520 I think it's not impossible to modify our laws in such a way that protects the immunity protections that startups and smaller tech companies have, while also protecting Americans' right to speak on these huge monopoly social media platforms.
01:11:15.380 And whether those monopoly social media platforms change, it does seem to be the case that even if a new platform takes the place of an old one, most speech at any one given time is happening on one or two or three major social media platforms.
01:11:27.980 And I think there is a genuine public interest in ensuring that whatever those social media platforms are, if they're Twitter today or TikTok today or something else tomorrow, whatever those platforms are, that Americans have a right to speak on them.
01:11:38.560 Okay, I like this case.
01:11:40.420 So now he's, to take it to an analogy from when I was growing up, he's not going to go after Ma Bell for the conversations you're having on the phone, wait, no, he is going to go after them.
01:11:50.140 He's going to make them subject to liability, but he's not going to get the two kids with the Campbell's soup cans and the string attaching them having their conversation.
01:11:57.620 How about that?
01:11:58.280 Does that assuage your concerns about the small startups?
01:12:00.400 No, I think, you know, yes, we have the startup perspective.
01:12:05.520 We talk to startup founders all the time, but we also are increasingly talking to internet creators who are also small business owners.
01:12:11.260 I think it's really easy to dismiss internet creators as like dances on TikTok.
01:12:15.360 But right, these are advocates and educators and comedians and musicians and artists.
01:12:20.320 So it's a whole community, again, that I think only exists because of Section 230.
01:12:24.160 And these people use these large platforms to run their businesses.
01:12:28.040 And, you know, they don't, by and large, want to live in a world where their content is served next to hate speech or where their content is served next to harassment.
01:12:36.760 Taking away the ability of even large platforms to remove speech that they feel makes their platform a dangerous and unwelcoming and irrelevant place to be, it's still really dangerous.
01:12:45.100 And maybe the startups will be fine.
01:12:46.580 Although, of course, every startup wants to be big enough to kind of trigger some of these concerns and have that market power.
01:12:51.940 But even taking the smaller platforms out of it, so many people, and you can disagree that the platforms are doing a good job.
01:12:59.160 I actually don't think anybody thinks the platforms are doing a great job.
01:13:01.400 So that's like a conversation that's worth having.
01:13:03.500 But the legal framework that underlies the ability for a platform to say, no, actually, we're trying to be something specific.
01:13:10.200 And we don't want this hate speech.
01:13:12.000 We don't want this racism.
01:13:13.140 We don't want this harassment, this bullying.
01:13:15.260 We don't want it on our platform.
01:13:16.560 That's really important. To the extent that the internet is healthy and working, it's because platforms can make those decisions.
01:13:22.980 Oh, boy.
01:13:23.540 You know, I was kind of with you until that last sentence.
01:13:26.740 But the problem is what they think is hate speech. It's absurd.
01:13:31.980 I mean, truly, it could be: it is not possible to change your sex.
01:13:36.840 If you are born a biological man, you're a man.
01:13:38.960 Like, that has been labeled hate speech on Twitter.
01:13:41.120 Sure. So those of us who are in the camp of sanity are sick and tired of getting our factual-based conversations shut down.
01:13:50.760 COVID was probably an even better example, right, where people are tweeting out, like, the vaccines do not prevent the spread of COVID.
01:13:58.760 Censored. Censored.
01:14:00.380 Right? Like, that's what Will is trying to fight back against.
01:14:03.300 Like, it's actually having a seriously deleterious effect on our national conversation on knowable fact discussion.
01:14:11.200 So how would you get after that?
01:14:13.520 And I think that's a perfect example of kind of why content moderation is so difficult.
01:14:18.400 Like I said, I don't think anybody says they're doing it perfectly; nobody would, like, credit the companies with doing it perfectly.
01:14:23.420 But really, content moderation, especially at scale, is incredibly difficult.
01:14:28.400 And that's why these companies invest so much in technological tools and human beings, both inside and outside the company, to try to review content.
01:14:36.060 No, no, no, no. I got to jump in.
01:14:37.820 No, no, no, no, no.
01:14:38.520 I don't care how much they invest.
01:14:40.380 And I'm not taking a side on this, but on this point, I don't care how much.
01:14:44.200 You know what they need to invest in?
01:14:45.880 Ideological diversity amongst the people making these calls.
01:14:48.720 Because nine times out of ten, these calls go against conservatives or people who are pushing back against liberal dogma.
01:14:54.440 Take a look at John Stossel's exposé on what happened to him when he tried to report on climate change.
01:14:58.980 Facts, actual facts.
01:15:00.420 And then had to run it up the Facebook authoritarian chain because they labeled it something like disinformation.
01:15:06.460 And he got real live people, it was amazing,
01:15:07.760 to explain why they labeled it this.
01:15:10.340 And even when he had proven to them that they were wrong and he wasn't, they wouldn't stand down.
01:15:14.420 Like, that's completely aggravating.
01:15:17.020 And I can speak to this personally.
01:15:20.380 I went out to Facebook.
01:15:22.420 I went to Google.
01:15:24.340 I sat with the executives at YouTube.
01:15:26.880 All of these social media.
01:15:28.420 Back in 2016, they all invited me out to speak.
01:15:30.840 So I did it.
01:15:31.620 And they asked me, as I was like their favorite possible conservative.
01:15:36.720 So I was at Fox at the time, but they loved me because I challenged Trump.
01:15:39.500 Right?
01:15:39.720 This is back another day.
01:15:41.940 What should we do?
01:15:42.940 What should we do to solve some of these bias issues?
01:15:44.920 And I said, for the love of God, get more conservatives on your editorial boards or who's ever making these censorship decisions.
01:15:51.340 Real conservatives, not fake Lincoln Project, whatever, conservatives.
01:15:56.500 And they didn't do it, Kate.
01:15:58.120 It's not about money.
01:15:59.320 It's about an ideological bent that they refuse to get off of because they share in it.
01:16:03.500 And listen, I don't work for the large companies.
01:16:07.220 I don't need to defend them.
01:16:08.320 They have their own time and resources to do that.
01:16:10.200 But I do think this just kind of speaks to the fact that there's no perfect answer because all of the problems that you and Will have highlighted, those are pros to some members of Congress.
01:16:21.220 You know, you might say that Josh Hawley and Elizabeth Warren are united in hating 230.
01:16:25.780 That is true.
01:16:26.780 But they want very different things.
01:16:28.180 And for every time Facebook doesn't label something that is, you know, allegedly misinformation as misinformation, you have Democratic members of Congress writing to them, asking them why.
01:16:38.280 And so the path on 230 reform isn't straight because for every complaint you have, there's someone with the opposite complaint.
01:16:44.940 And that pulls these companies in impossible directions, which is why I'm so worried about 230 reform.
01:16:50.120 But my complaint is real.
01:16:52.220 And I think Elizabeth Warren would say her complaint is real, too.
01:16:55.000 So, Will, I realize Elizabeth Warren, she's not worried about a right-wing bias amongst the social media companies at all.
01:17:03.040 She's a little odd, but she's not dumb.
01:17:05.740 Elizabeth Warren is a smart lady.
01:17:07.780 She's worried about this hate speech and, you know, quote-unquote hate speech and all that stuff.
01:17:11.780 So what do you make of the exchange that Kate and I just had on what the real problem is inside of these companies and how we get at it?
01:17:19.720 Well, I think, I mean, Kate makes the point that social media moderation is very difficult because there's pressure coming from all sorts of ways and you don't know exactly what speech you should remove.
01:17:27.500 And I sympathize with that, right?
01:17:29.320 These companies are vulnerable to both external and internal pressure campaigns from activists, employees, and also pressures from federal government agencies, as we saw with the Twitter files when they go at these companies and say, hey, you should be doing this or that.
01:17:40.860 And so part of why I think my proposals and the proposals for these private rights of action are so useful and would be very good for these companies is that it would say, no, no, all this is against the law, right?
01:17:50.440 If we do what you say, activist, if we remove this person from their account or we ban this COVID misinformation, whatever you say, if we did that, we would be sued and we would lose.
01:18:00.120 So we're not going to do that.
01:18:01.360 It would allow them to fire a slew of all these people who are essentially interacting with all these activists and really liberate them to just focus on, okay, what are the core elements of moderation we need to focus on?
01:18:12.540 Child sexual abuse material, good, yeah, let's get rid of that, porn, et cetera.
01:18:16.480 But in terms of, they would be completely out of the political censorship debates, which is, I think, where they should be.
01:18:23.120 And I think, I mean, you're absolutely right that these companies have a liberal bias, I mean, in general.
01:18:28.020 But I think really a good analogy is actually, you go back to the civil rights era, a lot of small businesses that were enforcing Jim Crow didn't necessarily want to be.
01:18:37.300 They weren't making money hand over fist, but they were faced with a collective regime of private discrimination in the South.
01:18:42.680 And it was federal civil rights law that created the environment in which they could say to, say, a racist customer, guess what?
01:18:48.140 No, we're not discriminating because if we did so, it would be against the law.
01:18:51.540 Now, these are essentially liberating constraints for these companies that would allow them to get out of the business of censorship entirely and focus on what matters.
01:19:00.520 Okay, but you're talking about, like, the non-racist shop owner who wanted to be liberated.
01:19:05.240 And what I'm telling you is there is not the non-liberal shop owner amongst the social media giants in control who wants to be liberated.
01:19:12.840 Well, sure, I'm giving Kate a steel man here where we assume that, indeed, the problem is not the liberal bias of these companies, but instead how messy and difficult moderation is.
01:19:20.720 It's like, okay, well, even if that's the case, then you should be looking for laws that constrain you from having to engage in these muddy censorship debates in the first place and just say, you know, the whole category of this censorship would be against the law and would get you sued.
01:19:35.080 You're basically saying, I'm going to get Josh Hawley and Elizabeth Warren off your backs here.
01:19:38.680 I'm here to save the day.
01:19:39.940 By the way, I should say, there are some conservatives inside of YouTube and Facebook and so on.
01:19:44.100 It's just obviously not at the top, top in control and deciding all the editorial direction.
01:19:48.540 Go ahead, Kate.
01:19:48.880 This falls back onto the First Amendment problem, right?
01:19:52.760 And it's even more complicated because these companies are in the business of hosting speech, and under the First Amendment they can't be compelled to host speech that they don't want to.
01:20:01.620 I know there's like the shopping mall analogy and there's past cases that look at like similar sounding things.
01:20:07.560 But, you know, the courts that are examining the Florida and Texas laws feel differently.
01:20:12.380 And ultimately, maybe the Supreme Court will weigh in.
01:20:14.320 Wait, can I jump in there just to make sure I'm in the right space mentally.
01:20:21.900 So like AT&T, I mentioned them before, right?
01:20:24.340 It was a private company and they wouldn't have been allowed to jump in on my private phone conversation and say, you can't say that and cut the line, right?
01:20:31.320 It's a public utility, like you can't, even though it was a private company.
01:20:35.320 Um, and there was a piece by Vivek Ramaswamy, um, and Jed Rubenfeld of Yale Law School in the Wall Street Journal a year plus ago, arguing that the social media companies should be treated as such, that they, you know, have crossed over into the public square.
01:20:49.960 Again, they're so big and they're so vital to the national conversation.
01:20:52.620 Now they should be treated more like a public, public entity that doesn't have the liberties they once had.
01:20:59.060 Is it, are we on the same page about like the difference?
01:21:02.340 Because you're saying a private company and they should be able to moderate the speech, whatever they feel.
01:21:05.920 And there is an argument that they're, they're not just this little private company anymore.
01:21:10.220 Well, and, and Will mentioned earlier, uh, the idea of a common carrier, right?
01:21:13.000 And the common carrier legally is something very specific.
01:21:15.000 Like, uh, it is a company usually that's much more highly regulated, um, and holds itself out as a neutral conduit.
01:21:21.060 So the phone company, right?
01:21:22.840 Like anybody can't just go launch their own phone company.
01:21:25.160 You have to dig up a million miles of land and put in cell towers and all this, right?
01:21:30.840 So it's not as if that's something that anybody can just get up and do.
01:21:34.060 Um, and it is very highly regulated.
01:21:35.500 Phone companies are very highly regulated by the Federal Communications Commission.
01:21:38.540 Um, and there, there is a lot of government interplay with how phone companies work and what they need from the government.
01:21:43.320 So they are common carriers, uh, and they can't step in.
01:21:46.060 They don't pretend to, right?
01:21:47.260 When you sign up for a phone, you don't sign an agreement saying, I won't, uh, use certain words or I won't say certain things or I won't talk about certain topics.
01:21:53.660 So they are a common carrier.
01:21:55.460 Um, social media platforms have never made that promise.
01:21:57.700 That's not what they're out here to do.
01:21:59.440 Um, and in fact, something I think is worth talking about when it comes to moderation: curation and moderation can be really valuable.
01:22:05.060 Um, for instance, like, you know, if there's an example, we always come back to in this space around Reddit.
01:22:09.780 There's a subreddit of cats standing on their hind legs.
01:22:12.520 It is literally just people posting pictures of cats on their hind legs.
01:22:15.280 And if you upload a picture of a dog on its hind legs, it will get removed because that's not the purpose of that forum.
01:22:20.520 Um, and, and Twitter or YouTube or Facebook or anyone can say, and the purpose of our forum is not, you know, parade of horribles that we're worried about.
01:22:27.940 Um, and so I don't know that legally there's a pathway forward for these companies to be considered common carriers, but that's certainly not how they hold themselves out.
01:22:35.640 And all of the startups we work with who want to compete with these companies, they don't want to be competing to get one day essentially like taken over by government regulation.
01:22:43.580 They don't want to live in a world where if they get big enough, they become a public utility and they get regulated as such.
01:22:48.720 And you would have a very hard time attracting investors if you had to know that at the end of the successful road was government intervention, um, because you've been deemed too big.
01:22:57.100 The crackdown.
01:22:58.040 Will, can I ask you, and then we'll take a quick break.
01:22:59.620 Um, right now, am I correct that if Twitter or, let's say, Facebook wanted to say all conservative viewpoints are censored.
01:23:09.480 We don't want any conservative viewpoint.
01:23:11.960 Get out.
01:23:12.420 It's all going to be pulled.
01:23:14.000 It'd be a bad business decision, but legally they can do that.
01:23:17.640 Right.
01:23:19.200 Um, we might have a remedy in Texas and we might have a remedy in Florida, uh, depending on whether or not those laws have gotten through the courts.
01:23:27.100 Yeah.
01:23:27.220 I mean, there's, there's some tweaking, but as a general rule, I don't know that we would have a legal remedy under current law.
01:23:33.100 I think that's obviously why that law needs to change.
01:23:35.780 I mean, and I think one of the things in general is that conservatives have relied on the fact that it seems obviously in the entrepreneurial interests of these companies, not to mistreat conservatives.
01:23:46.300 Um, and so we don't need to regulate them, but it's pretty apparent after the last four years that that entrepreneurial interest is not enough, um, to defeat the behavior of these monopolists.
01:23:57.000 And the fact that they have a monopoly is the reason why they feel so comfortable censoring in the first place, why they have the freedom to censor, if you will.
01:24:03.240 And just to be clear, they cannot say no blacks on the platform, no disabled people, no women.
01:24:11.720 That's not lawful, right?
01:24:13.580 Cause those are protected classes, but political thought, you know, your ideology as a conservative or otherwise, that's not protected.
01:24:20.580 And then more and more, even though, you know, conservatives on the internet, at least, are kind of treated like they're other, they're not in the country and therefore they don't have protected status.
01:24:31.560 And it's sort of this tension because the libs do control most of these platforms.
01:24:38.300 They control Hollywood, they control sports, they control media.
01:24:40.980 And so the conservatives are actually a minority, but they're not recognized as a protected class.
01:24:47.020 Right.
01:24:47.500 And I think we don't have to get into detail about which classes need to be protected to just say that everybody should have the right to be able to speak on social media and that your first amendment right to speak is not particularly meaningful.
01:24:57.920 If you can't speak on Facebook, Twitter, Instagram, the major social media platforms of the day, because that's where a political debate is.
01:25:03.460 Yeah, but it's not a government company.
01:25:04.620 I mean, this is one of the things, right? If this were run by Joe Biden, it would be a First Amendment issue.
01:25:10.280 But it's run by a private company, which can say, you have red hair and I won't serve you.
01:25:16.480 Right.
01:25:16.920 Well, I mean, I mean, that's why we need to regulate private companies.
01:25:19.080 And I think, contrary to Kate, this idea that these companies can't be regulated as common carriers because they are not currently heavily regulated.
01:25:25.700 Well, that's a circular argument.
01:25:26.820 You're saying we can't regulate them because they're not currently regulated.
01:25:40.280 Historically, common carrier regulation has been imposed upon companies that didn't want to be common carriers, that didn't want the obligation to serve everybody equally imposed on them.
01:25:40.280 That was, I mean, that was the, you know, the trains, I think in the 1880s was the first time that this came about because you had, you had monopoly train lines in the very first days of train tracks going across the country.
01:25:50.300 And they had the ability to discriminate and price discriminate against different customers.
01:25:53.500 And the customers were just SOL if they, you know, had a problem with that.
01:25:57.540 So, you know, the federal government said in their wisdom, these are private companies.
01:26:02.760 Yes, but for the good of all, they need to be regulated and we're going to make you serve everybody.
01:26:08.060 And this is so interesting.
01:26:08.960 I'm really enjoying this.
01:26:10.200 All right.
01:26:10.320 Let me, let me, let me take a quick break and then we'll come back.
01:26:13.120 We'll talk about what's happening at the Supreme Court and how it relates to everything we've just discussed.
01:26:16.240 So the audience is going to be super smart when that case gets, gets decided.
01:26:19.240 Stand by.
01:26:23.120 All right, Kate.
01:26:23.820 So the Supreme Court is getting involved in this in a case called Gonzalez, Gonzalez versus Google.
01:26:29.660 And I understand you're involved in this case.
01:26:31.920 So give us Gonzalez for dummies.
01:26:34.080 What is this?
01:26:34.980 Yeah, absolutely.
01:26:35.540 And to be clear, not involved personally, but we have filed an amicus brief and helped another group file an amicus brief as well.
01:26:41.300 So Gonzalez is essentially asking the question, who should be responsible for terrorist content online?
01:26:48.740 And is it the platform that hosts it?
01:26:51.200 And what if the platform might allegedly recommend the content?
01:26:54.980 Then at what point does the platform become responsible in addition to obviously the terrorist group posting the content?
01:27:01.260 Okay.
01:27:01.520 And so is Google alleging that it should not be responsible if terrorist content gets posted on, say, YouTube?
01:27:10.000 Yeah.
01:27:10.340 So this stems from, again, this is all allegations, but the accusation here is that YouTube not only hosted, but recommended ISIS content.
01:27:19.020 And then that there was an ISIS terrorist attack and there were unfortunately victims and the victim, one of the victim's family is suing.
01:27:25.920 And Google has told the court that its hosting and recommendations are both protected by 230.
01:27:32.780 And lots of other people have weighed in, including us.
01:27:35.520 We feel strongly that recommendations should be protected by 230.
01:27:37.800 Lots of startups use recommendations to kind of, as their competitive advantage, that's how they appeal to their users, by being able to recommend and curate specific content.
01:27:45.960 And then additionally, we worked with several internet creators, so YouTube creators, creators on TikTok and other platforms, to explain why recommendations are so important as people are trying to build out an audience.
01:27:57.780 Does Google acknowledge that this was a mistake, that this was not a good thing to do?
01:28:01.160 So Google certainly, and to their credit, all of the large companies invest heavily in finding and removing terrorist content.
01:28:08.020 This is one of the most collaborative and aggressive places that content moderation exists on the internet today is around terrorist content.
01:28:14.280 So nobody is saying terrorist content online is a good thing.
01:28:17.400 Nobody's arguing that.
01:28:18.800 I'm not sure that it is even true that Google or that YouTube did host this content.
01:28:22.600 I think that's something that would be discussed in court if we were talking about a kind of full jury trial or we're looking at the facts of the case.
01:28:29.940 But because we're talking about the legal mechanisms here, that's like not even a question.
01:28:33.500 It's just, should they be able to get sued if they did host and recommend content?
01:28:38.980 And where is the case coming up to the Supreme Court from?
01:28:41.400 What circuit was it decided in and how did it go at the lower court level?
01:28:44.720 So Google did win at the lower court level.
01:28:47.740 I'm not sure exactly which circuit it came from, but I think kind of this follows traditional 230 jurisprudence.
01:28:53.700 Normally, when it comes to content somebody else created, platforms are able to assert 230 and they're able to win.
01:29:00.340 And so this is the victim's family trying to challenge that ruling.
01:29:05.000 So is this a C1 case?
01:29:06.460 Like somebody else posted the controversial stuff and we don't have any liability for it?
01:29:11.300 Yes, this is C1.
01:29:12.200 And the Supreme Court is also thinking about taking up these two Texas and Florida cases, which, again, are being challenged on the First Amendment, but are part of the C2 conversation.
01:29:20.920 But that would be separate and likely next year at this point.
01:29:23.300 This is, like you said, happening next week.
01:29:25.880 And it's just about C1 and liability for hosting and recommending the content.
01:29:30.800 Usually, Will, it's not a good sign if you won at the lower court level and the Supreme Court takes the case.
01:29:36.180 Yeah. So I'm sure Google's not feeling too great about the fact that they're being forced to argue this in front of SCOTUS.
01:29:42.780 But, you know, this isn't necessarily something that divides along ideological lines perfectly.
01:29:48.460 So I don't know.
01:29:50.040 How do you think this is likely to go?
01:29:51.720 And what's unique about it if C1 is kind of not as controversial?
01:29:55.260 So I think it's probably going to get reversed.
01:29:58.640 I think that Google is probably going to lose because I don't think, as you suggest, I don't think the Supreme Court would have taken it if they weren't leaning in that direction.
01:30:05.640 I think it's really a question about the breadth of the C1 immunity grant.
01:30:09.280 And I think, you know, in other cases, courts have interpreted that really broadly to protect almost everything that these companies are doing with the relation to user-generated content on the Internet.
01:30:18.800 And so I think the Supreme Court sees this as an opportunity to narrow that grant of immunity to merely like you're not liable.
01:30:24.400 You're not the publisher of the speech.
01:30:25.760 We get that.
01:30:26.840 But that doesn't necessarily insulate you from liability for the act of overtly recommending the content to others.
01:30:33.540 That's not to say we couldn't have other laws if Congress, in its wisdom,
01:30:36.720 deemed it was smart to do that.
01:30:37.480 But I think, you know, I'm not super familiar with this case because I'm not Google.
01:30:42.040 I'm not involved in the case.
01:30:43.420 We didn't file an amicus or anything like that.
01:30:46.220 But I lean towards the side of saying the courts have probably interpreted the C1 grant of immunity too broadly beyond its text and that this is a good opportunity to constrain it and say that, you know, companies actually are ultimately responsible for the things they overtly do.
01:31:00.940 And that ultimately there is somebody harmed at the end of the day here.
01:31:04.080 So it's not a bad idea to constrain, to interpret these liability grants somewhat narrowly.
01:31:09.700 Yeah, because, Kate, correct me if I'm wrong, but the thing here is that promotion is not the same as just hosting the content.
01:31:17.340 I think in a lot of ways it is the same.
01:31:20.560 And I actually really worry about the ability of the Supreme Court from like a technological level to distinguish.
01:31:24.860 But I mean, every time you search something, whether it's on Google or Bing or anywhere else, right, like that's that's an algorithm telling you what it thinks you want to see.
01:31:32.940 So so it's not just kind of like the YouTube recommendations that we're worried about, although I also think YouTube recommendations play a large role in a lot of content discovery.
01:31:43.040 There's a lot of YouTube out there. There's a lot of Internet out there.
01:31:45.620 And recommendations are really what enable platforms to try to give their users an experience they think the user wants to see.
01:31:52.320 And absent those recommendations, because if they remove 230, why would anybody recommend anything ever again?
01:31:56.760 You would be taking full responsibility for it.
01:31:59.880 Absent 230 and absent recommendations, I worry the Internet becomes kind of like a needle in a haystack, a phone book that's unalphabetized,
01:32:07.100 a hodgepodge, trying to figure out what you actually want to see and how to find it online.
01:32:10.180 Wait, so let me ask you something. So that's a valid concern.
01:32:12.980 So if let's say they lose this case, Google owns YouTube.
01:32:18.240 So let's say they lose this case and just the mere promotion.
01:32:21.940 I mean, when I hear promotion, I think, hey, look at this ISIS video.
01:32:25.900 But it could just be the algorithm returning a result to people who are searching for something.
01:32:32.100 So it could be less pernicious than that.
01:32:35.040 So could it be the case that, you know, as a result of a negative ruling for them,
01:32:40.180 you've got Google and YouTube saying, we're not going to promote.
01:32:44.100 Like, here's a list of a thousand people who are considered controversial who we're not going to promote at all.
01:32:50.040 Like YouTube is no longer going to promote.
01:32:51.760 Hey, check in again.
01:32:54.020 Some people say I'm controversial.
01:32:55.320 No sane person says it.
01:32:56.540 But like Megyn Kelly's on the list.
01:32:58.000 Don't promote her show.
01:32:58.780 Or let's take somebody more realistic like Stephen Crowder, who is controversial.
01:33:02.280 So it could just be like, don't even touch it.
01:33:05.400 Don't promote it.
01:33:06.020 Don't don't allow YouTube to promote it.
01:33:07.760 That could be an outcome of this.
01:33:09.520 Absolutely.
01:33:10.100 I mean, I have no idea what YouTube is thinking.
01:33:12.220 Again, I don't work there.
01:33:13.280 I'm not sure what they're talking about internally, although I'm sure they're talking about something.
01:33:16.760 But it's even broader than that.
01:33:18.300 Right.
01:33:18.440 It could be politics is controversial.
01:33:21.260 Medicine is controversial.
01:33:23.020 We're just not going to recommend anything in that space.
01:33:25.080 And I think that that gets really scary and really dangerous.
01:33:27.920 And it's something we told the court.
01:33:29.900 We think that this could have much broader impacts than just algorithmic recommendations
01:33:33.260 and terrorist content.
01:33:34.540 But I think lots of people have weighed in with concerns about the impact this will have
01:33:38.460 on free expression.
01:33:39.780 Oh, all right.
01:33:40.720 Well, I'll give you the last word on this because we had to wrap it up.
01:33:42.860 But how do you think people should be looking at this issue?
01:33:45.900 I mean, fair minded people who don't want these catastrophic outcomes that Kate is talking
01:33:49.800 about, but also feel like there's an injustice in this system.
01:33:52.780 I mean, yeah, you don't have to abolish 230 to regulate social media companies in a way
01:33:59.100 that protects people's right to speech.
01:34:00.700 There's you know, we are allowed to write new laws to solve new problems.
01:34:05.020 230 was designed to solve a problem of the early 90s.
01:34:08.160 Internet forums being unable to moderate any sort of content at all without becoming liable
01:34:14.160 for defamation.
01:34:15.160 We just need to write a new law that says here's a way to solve social media censorship on political
01:34:19.540 grounds.
01:34:19.880 And are you an Elizabeth Warren supporter now?
01:34:23.720 No, I didn't think so.
01:34:27.200 You guys, great debate.
01:34:28.180 Really super appreciate all of your information and thoughtful exchanges.
01:34:32.820 Thanks for being here.
01:34:36.140 Thanks for listening to The Megyn Kelly Show.
01:34:38.360 No BS, no agenda and no fear.