Spencer Klavan is the host of the Young Heretics podcast and author of the new book, How to Save the West: Ancient Wisdom for Five Modern Crises, which explains why Western civilization is in a state of crisis.
00:00:00.480Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
00:00:11.860Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
00:00:15.160Oh, we have an interesting debate for you coming just a bit later about free speech,
00:00:20.320big tech, and censorship in America. Just how censorious do you want big tech to be?
00:00:27.400Maybe you're thinking, I want them to be less censorious. I'm the one whose viewpoints always
00:00:32.260get censored, and therefore I'm against protecting big tech when it comes to their censoring pen with
00:00:39.960the big eraser. It's more complicated than that, because if we take away the things that protect
00:00:45.860them, who's going to pay, big tech or us? It's actually a really interesting debate. We're
00:00:52.720going to get into it in just a bit. But we begin with the return of one of my favorite guests and
00:00:56.880quite possibly the smartest man I know. Western civilization is in a state of crisis. Perhaps
00:01:02.920you've noticed. The virtual and digital are replacing genuine experience, right? The metaverse? What the
00:01:09.540hell is that? How about just like the universe we're already in? Why do we need a new digital
00:01:13.460universe? Feelings too often are replacing facts. How we navigate all these issues in our society
00:01:21.180will determine no less than whether we can save the Western world. Spencer Klavan is one of the people
00:01:28.220who can help do that. He could actually save the Western world all on his own if we would just do
00:01:32.940what he tells us. He is the host of the Young Heretics podcast and author of the new book out today,
00:01:39.220which I highly, highly recommend to you, How to Save the West: Ancient Wisdom for Five Modern Crises.
00:01:47.260Please. Spencer, welcome back. Great to have you. Oh, Megyn, it's so great to be here. And I'm going
00:01:52.080to tattoo "Megyn Kelly says I'm the smartest man she knows" onto my forehead. That's going to be
00:01:58.140I don't throw that out there loosely. I know a lot of smart people. I've interviewed a lot of smart
00:02:03.700people. Your brain is special. Oh, thank you. It's so lovely to be back with you.
00:02:10.380Oh, and you come by it honestly, because your dad is Andrew Klavan, who we also love.
00:02:13.700Sorry to mention your daddy every time you come on, but people know the last name and
00:02:18.080we're all such fans. I'm proud to be associated with him, despite my constant protestations that
00:02:23.760I'm not related to him. I actually am very proud to call him my dad. So more than happy to hear you
00:02:28.760mention his name. All right. So let's set it up. Let's set up because what I love about this book
00:02:32.980is we're all going through these feelings of like, what's happening? Wait a minute,
00:02:36.640what happened to truth? What happened to God? What's going on with this gender craze?
00:02:43.040What's going on in our society? You can feel something very different from the way we
00:02:47.480used to be. And this book diagnoses it: yes, it is happening, and here's why it's happening.
00:02:53.820What are the crises we're in the midst of? And then takes a look back at history, ancient history
00:02:59.640to reassure us, I think, that none of this is new. We've been through virtually all of this before,
00:03:05.580and there are really great minds to give us some wisdom into how to navigate what's likely to come
00:03:11.980next, what's winnable, what's not. And you as a classics expert know all of that stuff. You've read
00:03:19.260all of that stuff and you're living the modern-day crises with us all. So you've sort of been able
00:03:24.060to meld history with modern-day problems to give us some insight and some wisdom. So let's start
00:03:30.700with the crisis as you see it. Why does the West need saving? What are we going through?
00:03:36.500Well, I think you really put your finger on it when you describe that feeling like everything we're up
00:03:41.440against is kind of new and confusing. This sense we have that nobody has ever faced these problems
00:03:47.780before because our technology is totally new. And the digital revolution has just reshaped the
00:03:54.000way we look at ourselves and see the world. And on one level, of course, it's true. The Internet
00:03:59.840did not exist in ancient Greece, I am reliably informed. And yet, you know, at the same time,
00:04:06.140the problems that we're being faced with by this new technology, questions like what is a human being
00:04:12.640and what is our place in the universe? And you mentioned the question of God. Those are actually
00:04:17.980fundamental, eternal questions. And what that means is that they've been around for as long
00:04:24.500as human beings have been around. And there have been deep thinkers throughout the centuries in this
00:04:30.040Western tradition that we're all inheritors of who have raised really beautiful answers to these
00:04:36.280questions that can help us see our way forward. And what that means is we're not alone. I think when
00:04:42.140people tell you, you know, oh, it's a brave new world and all the old books are primitive and
00:04:47.360superstitious, what they're really trying to do is deprive you of the community of the past. And
00:04:52.520I grew up, as you know, in a house filled with books, old books that I would pull down off the
00:04:58.080shelf. And soon I realized that being surrounded with books meant being surrounded by
00:05:02.980friends. And so I wrote this book, How to Save the West, because I wanted people to have access to some
00:05:08.980of that stuff, have ownership over this wisdom that comes down to us from Athens and Jerusalem,
00:05:13.400so that we can answer some of these questions that are being raised, these five questions. Is there
00:05:18.540absolute truth? What do I do with my body? Does the world have meaning? Is there a God? And what's
00:05:24.740going to happen to America? Those are questions that we can answer or help to answer using the wisdom
00:05:29.820that comes down from the past and not just using whatever the CDC or the WEF tells us today and tomorrow.
00:05:36.780Mm hmm. It's funny because during the Trump administration, Kellyanne Conway famously,
00:05:42.260infamously said, alternative facts. These are alternative facts. And people started to question
00:05:46.740whether we really are in this post-truth world where one side has its facts and the other side
00:05:51.140has its alternative facts. And that's only continued. You know, she described it on this show as
00:05:56.540sort of a flub. You know, it's just that she was stepping on her own words and she wasn't really trying to
00:06:00.620create that narrative. But since then, it's become even worse. I mean, COVID is a great example of how,
00:06:06.020you know, you could take the same issue and find two different experts with diametrically opposed
00:06:11.600views. And depending on which one is the leftist view versus the more heterodox view,
00:06:17.520that'll dictate how it's covered in the mainstream media. So people really are in a place of rejecting
00:06:23.020what we used to see as truth. What appears on the nightly news? What appears in the paper? What
00:06:30.720your trusted politicians tell you? That's gone. And a lot of us feel untethered now in trying to
00:06:36.600figure out truth. So that's one of the crises we're facing is the reality crisis, which is related
00:06:43.500to truth. So how do you analyze that? Yeah, well, I mentioned that moment with Conway in the book.
00:06:51.100And what's so funny about that is all of a sudden, when Bad Orange Man came along,
00:06:56.660it was like, we have a crisis of truth in the news, and we're having a post truth politics. And it's
00:07:03.060like, even I am old enough to remember when Bill Clinton said, it depends on what the meaning
00:07:09.140of the word "is" is. Donald Trump's team did not invent this problem. And "if you like your doctor,
00:07:16.500you can keep your doctor." There's a great example. Sure. And there were the "fake
00:07:21.440but accurate" memos about George Bush, which Dan Rather put forward. And you know, as you've said,
00:07:27.400it's only gotten worse. And it's particularly gotten worse because of what you were mentioning
00:07:31.900earlier, the tech and the fights over censorship that we're having. One of the reasons I think we're
00:07:38.120having these fights where people want to shut down free speech is because of this
00:07:44.100idea that if you can stop people from saying something, it'll stop being true. And if you can take
00:07:49.760control of the narrative, you can decide which COVID facts get spread and which don't, you know,
00:07:54.780then you will actually have created reality, like as if it were just some kind of metaverse that we're
00:08:00.200all living in. And, you know, an idea that has come up since I wrote the book is the idea of
00:08:04.540malinformation, not just disinformation or misinformation, but malinformation is where
00:08:08.960you're saying true facts, but you're using them wrong, right? It's bad to use those facts in
00:08:15.160those ways to make that point. And so, when it gets down to it, we really are in a crisis,
00:08:20.440which I call the reality crisis. Is there anything which is true or false, whether or not you're
00:08:24.880allowed to say it? Even if all the censors try to shut you up, is it still true
00:08:29.360that one plus one is two? And what I show in the book is that this is actually the crisis with which
00:08:35.560Greek philosophy kind of begins. It's the origin story of Western philosophy in some sense,
00:08:40.800that in Athens, the great democracy, you have this crisis where people are arguing that whatever
00:08:47.860you can get voted in, whatever you can argue for and present before the assembly, that's what goes.
00:08:54.400And so justice is just the interest of the stronger, the strong do what they can while the
00:08:59.980weak suffer what they must. And what I'm arguing in this book is that, in fact, if you want to take
00:09:06.340that pill, you're going to go all the way. It's not going to be a happy, blissful, you know,
00:09:11.460metaverse kind of universe beyond your world. It's actually going to be a war of all against
00:09:16.800all. It's going to be power politics, because if there's no such thing as absolute truth,
00:09:20.560all you're left with is strength and strength amounts essentially to the threat of violence.
00:09:26.080And I think that a lot of people, as they start to lose their grip on, you know, what the
00:09:32.280quote unquote official narrative is, they feel like there's no way of discovering truth.
00:09:36.000But the fact is that we actually have an apparatus for searching out the truth. It's
00:09:41.000called reason. The Greeks called it logos. And we can recover ownership over our own reason
00:09:46.860and confidence, which is what a lot of people are beginning to do as they start to reject
00:09:50.760the experts, which I think is the right move. I think we should move further in that direction
00:09:55.000as we form our own opinions and open the discourse as much as we can to seeking, you know, the absolute
00:10:02.820truth, which is the beauty and the goodness with which philosophy begins.
00:10:07.160How does the rejection of God, of religion, of any sort of higher power factor into this?
00:10:17.660Right. Well, it's difficult, I think, when you start talking about this, especially in an
00:10:22.800American context, you start to feel like you're just forcing your religion down people's throats,
00:10:29.400right? People will say, don't force your God on me. We have separation of church and state.
00:10:33.020And one of the things I say in the book is I'm not trying to convert everybody to my church.
00:10:40.800It would be great if everybody went to my church. I would love that. But that's really not the problem
00:10:45.240that we're up against. The real problem that we're up against is we all actually kind of know that
00:10:49.580some things are true and some things are false. And it's not just like physical facts that are true
00:10:54.740and false. Like this table is sitting here in front of me right now. There's also moral truths that
00:11:00.020are true and false. It's wrong to take innocent life without cause, for instance. These moral
00:11:05.960truths also have a kind of absolute reality that we can't just wish away. And spiritual truths are
00:11:12.300part of that universe, the universe beyond just our flesh. And we do have to believe in that,
00:11:18.100all of us, in order to form a functioning society. We don't have to
00:11:24.120subscribe to this or that faith tomorrow or believe the Nicene Creed right now. But we do have to admit
00:11:30.240that everybody acts as if there is such a thing as a highest truth and a highest good. Bob Dylan,
00:11:36.520the great poet and prophet, says you got to serve somebody. And the Bible's version of this is that
00:11:42.260the fool hath said in his heart, there is no God. You know, we have this idea that that line just means
00:11:47.100like, oh, atheists are dumb or something. But I don't think that's actually what the Psalms are
00:11:52.440saying. When you say the fool has said in his heart that there is no God, what it means is, you know,
00:11:57.880when you tell yourself you're not worshiping, when you tell yourself there's no higher power,
00:12:03.240you're actually fooling yourself. You're making yourself into a fool because you're deceiving
00:12:07.180yourself. Everybody operates as if there's a highest good behind everything he or she does. And if you
00:12:13.200pretend that's not true, you just end up worshiping without knowing it, which is what we saw, you know,
00:12:17.780in the summer of 2020, when people were kneeling before these Black Lives Matter protesters begging
00:12:23.580for forgiveness and absolution, or when they were referring to the science, capital S, as a kind of,
00:12:29.620you know, a cult authority that could tell them what to do. And Dr. Fauci represents it. He is its
00:12:34.060priest. Amen. You know, this is kind of how people are starting to behave. And I think the real thing we
00:12:40.300need is not so much a conversion as a surrender to realize that what we are doing already implies a kind
00:12:47.700of worship. And we should be self-aware about that. And we should look to the great traditions of
00:12:52.580scripture and the church to help us understand what is worthy of worship. What's the highest good that
00:12:58.220we could seek that would actually ennoble us rather than making us slaves? Let's start there.
00:13:03.880One of the aggravating things about that truth you just told us is if we could actually get those
00:13:11.160beliefs recognized as a religion, then we could stop them from permeating the public square and being
00:13:17.320pushed on us by government, which isn't allowed to favor one religion over another. And yet
00:13:22.700wokeism hasn't been recognized as a religion, and therefore it can be pernicious in how it gets
00:13:28.480pushed on us in the schools, in our jobs, at the corporate, at the government level, as we're seeing
00:13:35.220now with the Biden administration. But you write in the book the following. In the 21st century,
00:13:40.540political demands often boil down to the assertion that the speaker's point of view or identity should be
00:13:45.760taken as an absolute authority. The various slogans we chant show this. Believe women. That's my truth.
00:13:51.320Elevate black voices. And your point is that without God, again, quoting, without some shared,
00:13:57.540stable, objective basis for understanding what is true, moral, and real, we are left only with
00:14:03.140competing demands for power and competing attempts to control the facts. This is a very smart way of
00:14:11.400talking about this void. The more we remove God and the principles that we associate with God and
00:14:18.500with a higher power, the more we create a vacuum that gets filled with utter banalities. That's to
00:14:24.300be charitable. Really, the truth is what we fill it with is downright dangerous.
00:14:29.780Well, sure. That's absolutely right. And I think that, you know, the kind of religious nature of all of
00:14:34.820these belief systems can really be seen when you start to ask, well, you know, what's the basis for
00:14:42.160believing, for instance, that, you know, a man can become a woman simply by saying so, or, you know,
00:14:48.940that men can get pregnant, all these kinds of abstract ideas that we use to divide sex and gender and to
00:14:58.100suggest that they're both kind of infinitely malleable. Well, it's not like, you know, science has
00:15:03.540discovered that this is true. You get a kind of pseudoscientific language around it. They've
00:15:08.800claimed to have, you know, proven this in some objective way. But in fact, notions like
00:15:14.880gender, which is kind of a purely spiritual concept, don't actually exist anywhere on, like,
00:15:21.340a brain scan. They are ideas about the soul. They're really actually closely tied to some very
00:16:28.180ancient notions, like the Neoplatonic idea that there's our body, and that's just
00:16:34.180like flesh, or it's a plaything, or it's a doll to be molded. And then there is the soul, which kind
00:15:39.200of lives in this perfect sphere. I mean, nowadays, we talk as if it lives kind of online or in the
00:15:44.880cloud, you know, but that division between body and soul, which is very close to the heart of the
00:15:57.160sort of trans extremist movement, the post-gender third-wave feminism, whatever you want to call it,
00:15:57.160you know, you read like Judith Butler's gender trouble, where she really kind of goes into this
00:16:01.620stuff. And it's, it's totally Neoplatonic and Cartesian. It's like, I'm dividing my body from
00:16:07.700my soul. My soul is the true me and everything else. It's like, well, maybe I get surgery today.
00:16:12.820Maybe I reconstruct my body tomorrow, or I, you know, put horns on my head or whatever, because
00:16:17.580my body is just a kind of appendage or a toy that I'm playing with. Now, whatever else that is,
00:16:24.140it's definitely an article of faith, right? It's definitely a profession of
00:16:29.720some spiritual idea that you believe, rather than, you know, some scientific objective facts
00:16:35.440that everybody has to accept tomorrow, or else you're a bigot and you're just ignorant
00:16:39.240and wrong, right? These are spiritual claims. And one way of measuring a spiritual claim
00:16:44.960is to see what kind of behavior it produces and what kind of results it produces for people.
00:16:50.900And that's where the danger that you're talking about comes in, because, you know, you ask,
00:16:54.840how's it working out for you to be perpetrating these, you know, terrible invasive surgeries on
00:16:59.760kids and whatever. And the answer is, it's making us sicker and more depressed and tearing apart
00:17:05.440the fabric of our social life and our society. And since it's simply an article of faith
00:17:10.740that this is going to do anything good for us, I don't think it's working out that well. I think
00:17:15.100it's pretty obvious that the older idea, which is that your body is the language for your soul,
00:17:20.480that we are in some sense, embodied souls would be a truer religion that we could actually adopt
00:17:26.120in place of this kind of neo-gnostic trans extremism. So how do we look at, you know,
00:17:33.340some of the ancient philosophers and get an answer to this reality crisis? I know the book mentions
00:17:37.380Socrates, always some wisdom there. Like, is that just a cautionary
00:17:43.260tale, or is that an answer? I mean, it can definitely start to look like a cautionary tale,
00:17:48.500especially when you remember they made him kill himself, right? I mean, it's not
00:17:51.320like this stuff is guaranteed to turn out right or to turn out well. But there's an interesting
00:17:56.280thing that you see when you start to read these texts. And I point this out in the book that it's
00:18:02.440often the case that the thing which will get you hounded out of town today is the seed that's going
00:18:08.700to grow into the tree of tomorrow's civilization. And that's what happened with Socrates. Now,
00:18:14.660nobody would wish Socrates' fate on anybody. And I don't think that you're destined to be attacked
00:18:20.480by an angry mob if you stand up for these realities that we're talking about here. But I do think that
00:18:26.280we should recognize, you know, that the world being what it is, the world being fallen, you're always
00:18:32.560going to be facing some opposition when you're seeking the true, the good and the beautiful. Those
00:18:38.240things are, to say the least, inconvenient to the powers that be. And without developing a
00:18:44.540persecution complex, we should be realists about knowing that, you know, it's going to be tough
00:18:49.320out there. But I always think about this moment in The Lord of the Rings, of all places, where
00:18:54.820Frodo says, I wish the ring had never come to me. Gandalf, the wise wizard, says, so do all who live to see
00:19:01.520such times, but that is not for them to decide. All we can do is decide what to do with the time that's
00:19:06.760given us. And I think that's the position that, you know, Socrates was in. It's the position,
00:19:11.840for instance, that Marcus Cicero was in, another thinker I write about in the book who lived at the
00:19:17.420very end of the Roman Republic. And, you know, his ideas didn't win the day, but they carried forward
00:19:22.740into the future until at last, you know, they helped to build this country. And, you know, when
00:19:29.640we're talking about the reality crisis, we're talking about something that these thinkers have been
00:19:33.180wrestling with again and again. I also mentioned Aristotle in the book. He's an important figure
00:19:37.880to turn to. But I think really the biggest question when it comes to despair, right, is are we just
00:19:44.080looking at cautionary tales here? I think what we're really trying to do is to understand ourselves as
00:19:50.020inheritors of a tradition that will outlast us. Because even if things fall apart, and I'm not saying
00:19:57.000they're going to, I'm not, you know, a determinist about this, but even if things fall apart,
00:20:01.500you want to have been preserving the flame that future generations will be able to pick up. That
00:20:06.920effort is never wasted, which is one of the things you learn from studying history in the long view.
00:20:13.000Well, at least we're going to go down swinging. You write about Plato's cave, and this is
00:20:20.300interesting because it's a story that helps us understand the importance of distrusting
00:20:27.400elites, which is something you mentioned just a couple minutes ago. Can you get into that?
00:20:31.780Yeah. Yeah, sure. I mean, Plato's cave is kind of the original virtual reality dystopia,
00:20:39.120and the idea is we're all already living in it. Famously, in Book Seven of the Republic,
00:20:45.200Plato talks about this cave where all of humanity is shackled, and all they can see is shadows on a wall.
00:20:52.700There's a fire, and there are kind of puppet masters that manipulate the shadows. So people
00:20:58.440think that's reality, but the truth is that actually outside the cave, the sun is shining,
00:21:03.700and that's the true light, which is the beautiful and the good, which we only dimly at a distance
00:21:08.520see reflected in the sort of day-to-day experience that we might have. And I think one of the things
00:21:14.260that is so powerful about that image is that it gives you a third person outsider's view. It lets you
00:21:21.200see that even though the people in the cave think they're perceiving reality, they're actually at
00:21:27.400the whim of the powerful, people who have just one more degree of information and power than they do.
00:21:35.160And as we start to play around with the virtual reality idea, as it becomes more and more possible
00:21:41.080to think about ourselves, quote unquote, in the metaverse or uploading our consciousness into some
00:21:46.840kind of virtual reality cloud, suddenly this idea, which has been the subject of dystopian horror
00:21:54.080for centuries, becomes like a sales pitch. It becomes, you know, oh, this is something we should
00:22:00.200all like and want to do. And I think if we look back to Plato's Cave, and if we even look back to,
00:22:06.720you know, stories after that, which have kind of followed on, like, you know, The Matrix or WALL-E or
00:22:11.960Snow Crash, these sort of dystopian fiction stories that we've written,
00:22:17.520they show us our intuition of something which is really true, which is that if you give up the
00:22:21.840ability to determine true and false, if you give it over, you're always giving it over to somebody.
00:22:27.480And that person has interests of his own, which might or might not be entirely salutary.
00:22:32.820So when somebody comes and says to you, you know, digital tech has made it so there's no need for
00:22:37.180true and false anymore. You can just have everything you want.
00:22:39.840If you will simply strap on these goggles and live in a virtual reality, the alarm that I'm
00:22:46.160sounding in this book, and that I think Plato is sounding, along with all of these, you know, sort of
00:22:50.940fiction writers after him, is: that's always a bum deal, because the person that you're
00:22:56.060handing over your data to, handing over your mind to, handing control over to, has his own agenda.
00:23:03.060And that's the elites that you're talking about. As people begin to discover that elites
00:23:07.340are actually not infallible. And in fact, they have many of the same kinds of sins and temptations
00:23:13.160that you and I have. I think it's really healthy and salutary that we're realizing, you know,
00:23:18.400actually, these guys are not gods. They're not, you know, beyond the problems that mankind has
00:23:24.500faced for generations. And maybe we should think twice before we hand over our lives to them.
00:23:28.860Hmm. I'm thinking about this exchange I had. I've mentioned this to the audience before
00:23:33.840at the beginning of COVID when Trump had that very weird, disjointed news conference where he was
00:23:39.800like, I'm shutting down all travel. And then they're like, no, it's not shutting down all travel.
00:23:43.560Like there were like five things he had to correct as soon as the press conference was over.
00:23:47.560I remember tweeting out something to the effect of I wish I knew who to trust.
00:23:52.620I recognize I cannot trust what he is saying, but I also recognize I cannot trust what the media is
00:23:58.580saying about him and about this virus either, because they have an agenda prior to an election
00:24:03.440and just getting him out and saying whatever he says is wrong. And it was a great frustration that
00:24:08.240I recognized early on in covid and many people shared. And I love Ann Curry, by the way. She's
00:24:13.300such a sweet person and I think the world of her. But at the time she tweeted sort of at me,
00:24:18.660trust the WHO, the CDC, Anthony Fauci. And this was early enough in the pandemic. We weren't yet
00:24:28.280where we are on them. You know what I mean? Like most of us had to be lied to repeatedly before the
00:24:33.940light bulb went off about these organizations. But I remember being like, hmm. And to your point,
00:24:39.060like, think of that. It's the same kind of thing. There's this group that you're supposed to trust.
00:24:44.600They're the elites. And supposedly they had a little bit more information than we had in the
00:24:49.800cave. And yet they didn't. And a healthy distrust was very much warranted. And now, you know, most of
00:24:56.720our view of these people and these groups has completely changed, at least for most people on
00:25:02.300the right and in the center of the country. That's right. And it's been transformative. It's
00:25:07.060been transformative for me. That's for sure. I mean, I would have been much more
00:25:11.260sympathetic to somebody saying, trust the WHO, before the pandemic than I would be now.
00:25:17.720And it's because, you know, the people, the fallible human beings, who make
00:25:21.780up those institutions have betrayed our trust. And, you know, that's something that has
00:25:27.200happened in the past. Machiavelli says that when the elites betray the trust of the people,
00:25:31.660they do two kinds of damage. They damage their own credibility, but they also damage the
00:25:36.320credibility of the regime of the country or the nation that they are a part of. And that's why
00:25:42.180it's so evil is because we don't just lose our faith in this or that governing body. We also lose
00:25:48.160our faith in the whole kind of structure of power that we're supposed to be kind of believing in and
00:25:53.580participating in. And this is, I think, really importantly, why our founding fathers, who deserve,
00:26:00.560as far as I am concerned, to be counted among the great thinkers of the Western tradition,
00:26:04.880you know, they established a principle that actually, you know, the nation is sovereign
00:26:10.820among nations and the individual is sovereign, you know, within his own personal life and personal
00:26:16.260decisions. And the reason for that is it's not like there's no such thing as knowledge, right?
00:26:21.100It's not like there's no such thing as people who know stuff that we don't know and can give us
00:26:25.640information we don't have. It's that politics and the decision about what to do is actually an ethical
00:26:32.680decision. We're actually making moral choices, not just about how infectious this disease is or,
00:26:39.360you know, what number of molecules are operative before you get
00:26:45.320infected, whatever, but actually about what we should and should not do. And in those questions,
00:26:50.460questions of ethics, questions of politics, it's not the same thing as a math problem. It's not
00:26:55.980something that you can trust a scientist to go away and run the model and do the calculation and tell
00:27:00.420you, oh, climate change is this deadly, and so unfortunately we must
00:27:04.340pass this law. No, no, that is not the idea this country was founded on. We believe that
00:27:10.020when it comes to ethical decisions, it's not a math problem, it's a soul problem. And we together
00:27:15.680as the people elect representatives who make these choices for us, and we don't just outsource,
00:27:22.300you know, our ethical or moral responsibility to these absolute bodies of total power and control.
00:27:28.820Now that whole idea was called into question by the capital-P Progressives, right? There was this
00:27:34.020notion that history had moved beyond our system and actually the constitution was kind of outdated.
00:27:39.060And, you know, what we really just need to do is outsource all of this to governing bureaucracies.
00:27:44.440It's the birth of the modern administrative state, you know, and this whole notion,
00:27:48.980which is now kind of deep-seated among, you know, one portion of our polity, um, it's gotta go.
00:27:56.100If we want to recover the American idea, which is that, yes, there are people who know things. Yes,
00:28:01.620there are scientists. Indeed. There are even legitimate experts out there. Um, but what they
00:28:06.100aren't is kings, and, you know, they're not designed by God to rule over us. We are
00:28:13.380designed to rule over us. And, in the last analysis, we get to make the decisions.
00:28:17.620It's so stimulating listening to you. I have to tell you, it's like great for my brain. I love
00:28:23.460it all, but I'm thinking right now, just, we've never had a stronger executive, uh, in this
00:28:28.260country, and it was never meant to be, you know. We were fleeing a king. We didn't want that. The
00:28:31.700founders who were brilliant, didn't want that at all. They wanted the presidency to be the smallest
00:28:35.860branch, the least powerful. And yes, the administrative state has grown beyond anything they ever envisioned,
00:28:41.220but even just the powers that we cede, you know, look at Joe Biden just over the past, you know,
00:28:45.460whatever year trying to extend the rent abatement program and just all
00:28:53.700these things that he acknowledged would be struck down by the courts, but he did it anyway, because
00:28:58.900he thought it would help him politically not to mention the student loan, uh, nonsense that he knows,
00:29:05.860he knows will not be upheld. Why is he doing that? Because he's acting like a king,
00:29:10.020right? Congress was meant to rein in the excessive president, the excessive executive
00:29:16.340branch, and they won't. And now I look at Congress who were supposed to be a bunch of regular folks
00:29:21.140who decided to serve their country and bring their farmer ideals into office and sort of keep
00:29:26.340a realistic finger on the pulse of the nation. Now they're a bunch of morons.
00:29:31.520They're a bunch of stupid morons who just want attention for themselves. And they're congressional
00:29:35.100Kardashians. Is it any wonder that we feel like our politicians have completely failed us
00:29:39.900and don't relate to us at all? And Congress has given over so much of the power
00:29:45.900that now exists in these bureaucracies and in the executive branch. You're absolutely right.
00:29:50.800And this is an area where it is really easy to get into a despair cycle real quick.
00:29:56.060It's happening. Structural. Spiraling. So here we are. We're in the spiral at this moment. Let me see
00:30:01.540if I can like grab a handhold out of here. Um, in the book, what I describe is the
00:30:09.200sort of history of political philosophy that got us up to the place where the founders were able to
00:30:14.060say, you know what, let's have a Republic, right? There's this long tradition of thinking about
00:30:18.280what's called anacyclosis, the cycle of regimes. And the basic idea is, you know,
00:30:23.140let me just say, let me just say, this is like the most interesting part of the book to me.
00:30:26.440Everybody needs to pay attention to this. This is actually really important. Go ahead. Okay, cool.
00:30:30.220Yeah. Yeah. No, I'm glad you like it. Cause I find this stuff endlessly
00:30:33.560fascinating. Um, so let's go into it. So there's, um, three kinds of basic
00:30:39.580government. Politike in the Greek is the idea of how do you run your society? How does
00:30:45.420your civilization function? What are the rules and who gets to make these decisions? Right. Um, and,
00:30:51.020you know, the Greek idea of the polis, which is the city state is kind of the, you know,
00:30:54.920the petri dish for thinking about this sort of stuff. And Aristotle, who's one of the great
00:30:58.980thinkers on this topic, in his Politics he says, there's three ways that you can organize this.
00:31:03.360There's three ways that you can arrange the system. Uh, fundamentally one is monarchy. One
00:31:08.920person rules, right? Another is aristocracy, "the best": a few people rule and they're the best people,
00:31:15.320uh, all the best people, as Trump might say, are in charge in an aristocracy. Then you have what
00:31:20.200we would now call democracy, although Aristotle uses slightly different language: you know,
00:31:23.520rule by the many who collectively make decisions. And Aristotle, crucially, this is so
00:31:30.620important for us to remember because it relates to what you were saying about the breakdown of our
00:31:34.240system. There is no system that you can construct that will not suffer decay because human beings
00:31:41.180are a mess. And over time we fall victim to our peccadilloes and our flaws and all of these sorts of
00:31:48.020things. And he says, the thing that makes the difference. Yes. Hubris is classic. You're right.
00:31:52.220That's the classical example, pride, overweening pride. And the thing that makes the difference,
00:31:58.000and this is also crucial. Aristotle says between the virtuous version of a government and the evil
00:32:03.680version of government has to do with love. It has to do with what the point is of doing politics
00:32:09.740together at all. If the rulers are ruling for the benefit of the ruled out of love for their
00:32:15.100citizenry, then you have a good state. If the king loves his people and makes decisions with their
00:32:19.980interests in mind, he's a good king. Let's say his son comes along now and he's a spoiled brat
00:32:25.460and he decides actually that he's going to rule for his own benefit. He's going to tax the people
00:32:30.920heavily so that he can have a more beautiful palace, let's say, or he's going to go to war
00:32:35.260out of pure spite with somebody, not because he needs to protect the nation. That's what we call
00:32:41.120a tyrant. And that's the decayed version of monarchy. Now, if you have a monarchy,
00:32:46.040which is sort of the natural way of living that somebody rises up like a strong man or something
00:32:50.100to run society, decays into a tyranny, it's possible that you'll get an aristocratic uprising.
00:32:55.460The nobles will say, I've had enough of this taxation. I'm going to take over. So the aristocrats
00:32:59.980are going to be in charge. The decayed version of this, when they start dealing for their own
00:33:04.380benefit, for their own self-love, that's called an oligarchy. And we're very familiar with this kind
00:33:09.220of decay. It's when the elites become corrupt and they rule for themselves and their cronies.
00:33:13.560And when that happens, it's very possible you get a democratic uprising. The people take control.
00:33:18.600They take back that power of the system. If the democratic regime decays,
00:33:24.800then it becomes one of my favorite classical words that we've totally lost: an ochlocracy, which is the rule
00:33:31.180by the ochlos, or the mob. It's mob rule. And in mob rule, of course, it's very easy for a strong man
00:33:36.540to come in and take over. And the cycle of regimes begins again. So you get this kind of theory of history
00:33:41.740that it just goes over and over again. And the whole cycle begins anew. The whole point of a
00:33:48.280republic, which is what our system is, is to create a perpetual motion machine, take all these
00:33:54.220different forms of government, these different kinds of power, and you combine them together and
00:33:58.520you balance them against one another. That's checks and balances, right? And so now you have these
00:34:03.060different kinds of parts of the government, like an executive branch that has kind of unitary power,
00:34:08.740but then also, you know, everybody's accountable to the people. So you have that democratic power,
00:34:13.020but you also have a legislative elite that's supposed to sort of serve as the aristocratic
00:34:17.220branch and they work together, play off against one another. How is it possible that this beautifully
00:34:23.500designed system has fallen apart into the decay that you, you described earlier on, right?
00:34:29.780The answer Machiavelli tells us, and Plato kind of hints at this as well, is class warfare. Once you
00:34:37.640get to a point where the different parts of the society, the aristocratic parts, the popular parts,
00:34:42.940the populists and the elites, don't think of themselves as fellow citizens, but think of
00:34:46.880themselves as members of a tribe. You're a white person, so you're inherently racist. You're a man,
00:34:52.920so you're inherently sexist. You're straight, so you're inherently homophobic, right? Once you get
00:34:56.980people thinking that way, you have poisoned the mechanism of the Republic and you have destroyed,
00:35:02.020going back to that very crucial thing that Aristotle talks about, you've destroyed civic
00:35:06.380love. It's love and friendship and neighborliness that makes a civilization what it is. These sort of
00:35:12.640small daily acts of marrying and being given in marriage, of forming rituals together, going to
00:35:18.500ballgames together. These things sound so simple, but they are the stuff that the community is made out
00:35:24.140of. And this finally is the foothold that I think we can get out of our despair cycle because we're
00:35:29.980not going to rewrite the system so that it all gets fixed overnight. But what we can do and what
00:35:35.760in some places we already are doing is reinvest in that philia, that local love and neighborly
00:35:42.260friendship that makes a civilization what it is. You're starting to see this in states, I think,
00:35:47.340like Florida, out here in Tennessee, in Nashville. I see it happening every day. These local societies,
00:35:53.240associations that go to the school board and fight for what they believe in and talk it out with their
00:35:59.420neighbors, figure out how we're going to rule ourselves. It's bottom up, not top down. And I
00:36:05.400think that's the way to kind of reclaim some ownership and start to move in the direction
00:36:09.740of fixing the problems that ail the country. Of course, it's going to take many years. And of course,
00:36:14.160there's still national elections to worry about and all that. But unless we get a sense that actually
00:36:19.160we have ownership over our communities, we won't even get started. It's philia, political love,
00:36:25.800civic friendship that really needs recovering in this hour. We have to talk about that more. I want
00:36:31.160to talk more about how we can make that happen if we don't feel like it's happening in our community.
00:36:35.480But I will say just listening to you, I was reminded of just the other night I was watching the Super
00:36:39.420Bowl with my kids. And, um, of course, we had to have the black national anthem before we had
00:36:45.720the actual national anthem, which in this context is divisive. It is. You know, the kids are
00:36:51.040sitting there like, well, what is that? There's a special
00:36:54.900anthem just for black people and not for the white people? It's like, you know, how are you
00:36:59.220supposed to explain this? Right? So it's like, yes, it is divisive. In this
00:37:03.260context, it's divisive. Um, absolutely. You know, and then, I'll just
00:37:08.440finish it, but then they played the national anthem, the actual national anthem. And of course,
00:37:12.960I made my kids, even sitting in my kitchen, stand up, put your hand on your heart, and they did it.
00:37:17.160And why did I need to do that? Nobody could see us, so it didn't count for anything.
00:37:21.820It counted because of the principles you're talking about right now, because I bet there were kids and
00:37:26.500adults all over the country doing the same thing. Love of country, love of the ideals that this
00:37:31.600country was built upon and stands for still. And that many of us are still trying to live by
00:37:35.800like, that's what we salute. That's what's important. Listen to the words of that song.
00:37:40.040Listen to the principles that have been handed down, not dividing us based on race.
00:37:46.060There's something so much bigger that ties us all together. We need to get back to that. Go ahead.
00:37:51.160Oh no. Amen. I'm glad that you finished there because I had a sort of similar experience
00:37:57.260recently where I think it was in Orange County. Somebody was saying, I can't believe that they
00:38:02.780voted to take down the pride flag outside of this public building. Um, and it's just a sign of
00:38:08.740hatred and more of the right wing campaign to yada, yada, yada. You know how this stuff goes.
00:38:13.800And you know, they always make it out as if, you know, if you're in favor of taking the
00:38:18.640pride flag down, you're a bigot. And they hold people emotionally hostage because, as I well know,
00:38:23.880you know, people who love whatever, who have a gay person in their life that they love,
00:38:27.840they feel like, you know, if I don't go along with this political movement, with this other flag
00:38:33.400outside of my public buildings, then, you know, I'm betraying my family. I'm betraying my loved one
00:38:38.660and I'm, I'm hateful and I'm sorry, but my flag is the American flag and the flag that flies outside
00:38:45.920my public building should be the American flag. And I will not be used as a prop in somebody else's
00:38:51.120kind of neo-Marxist campaign. You know, they do this in a million different ways. They just find
00:38:56.060the thing that you care about, you know, Oh, you're bad because you hate women. If you think
00:39:00.280that like, you know, men can't magically become women or you're bad, right? Because you don't
00:39:04.600want to sing the black national anthem before the, you know, the Super Bowl. The other one
00:39:10.200that really got to me was the pandemic of the unvaccinated, which President Biden said at one point,
00:39:15.340it's like, okay, so there's a whole portion of the population that is tantamount to a disease.
00:39:20.480That's really what our political rhetoric is going to be. And this is the kind of hope and
00:39:24.500change and transformation and the return to normalcy. I'm sorry, but you know, first and
00:39:28.940foremost, before anything else, when it comes to politics, I am an American. If we can't say that,
00:39:34.000then we're in trouble. But I think there's a lot of people out there that are ready to say that
00:39:37.360if we have courage and sort of lead in that regard. I saw you had a great comment
00:39:42.600in connection with that flag controversy, quoting Inez Fletcher of the Claremont
00:39:47.140Institute saying she must surely be right, that no actual homosexual can possibly have been involved
00:39:52.060in the design of something so grotesquely tacky as the LGBTQ flag.
00:39:58.000These days, every day now they put some new color on it. It's like
00:40:02.820you can't even get a gay person to hang drapes that don't match the carpet. And you're trying
00:40:07.440to convince me that there's some, you know, coalition of, like, all the gay people out there
00:40:12.420building this. I don't buy it for a second. I think it's the Borg flag. I think it's like
00:40:16.360this sort of weird amalgamation of everything that they want to use to destroy the country.
00:40:21.440I had never considered that before, but it is absolutely true. I'll never look at the flag
00:40:25.960the same again, Spencer. Thank you. Stand by much, much more on the opposite side of this break
00:40:30.760with Spencer Klavan. Spencer Klavan is my guest today. He's the author of the new
00:40:39.320must-read book, How to Save the West, which is out today. I'm telling you,
00:40:44.920I don't say this about every book. You must buy this. It is short. He actually makes
00:40:49.440it an easy read, even for those of us who are dummies when it comes to classics. That's another thing
00:40:54.320I love about his podcast. So buy the book, you won't be sorry: How to Save the West. I blurbed it for a
00:41:00.120reason, not just because Spencer's a friend, but I truly want everybody to read this. And people
00:41:04.120write in, Spencer, all the time saying, is there anything I can read to help me make sense of
00:41:07.300the craziness happening in our world right now? And I've been recommending this, so I'm glad it's
00:41:10.400finally out because I had the pleasure of the advanced read and now everybody gets to have it.
00:41:15.820All right. I want to follow up on the cycle. So monarchy into aristocracy slash oligarchy
00:41:21.920into democracy, into mob rule. And then does it go again back to monarchy? And where are we
00:41:29.860in the cycle? Where is America right now? Obviously, we're technically a republic,
00:41:34.160but I would imagine we're in the democracy into mob rule phase. But is monarchy coming our way if
00:41:38.800this all fails? What's happening next? Well, it's really interesting. The way I think about this
00:41:44.460theory, this theory of the cycle of regimes is it's not a prescription and it's not a prediction of
00:41:51.460what's going to happen tomorrow. It's like a template. And once you have it in your mind,
00:41:56.520you can see pieces of it playing out sort of like snatches from a familiar tune, you know,
00:42:02.020like when, for instance, the barons rise up against King John because of his oppressive taxation,
00:42:07.800you start to see that force of the aristocratic rebellion against a tyrannical king. And the
00:42:13.380idea is that really what we're looking at is dynamics that are always at play. They're eternal because
00:42:19.060they're part of human nature, so they never go away. And we can use them to understand each new
00:42:23.860thing that comes up. So what I say in the book is that I think we're in kind of an interesting
00:42:28.740position where at home, we're sort of looking like a decaying republic. And when republics decay,
00:42:36.340they turn into oligarchies. They get seized by an elite and you start to get that war between the
00:42:43.140different classes, the social classes. And by the way, a lot of this was done on purpose by the new left,
00:42:49.100by Marxists who figured out that there wasn't going to be an economic revolution in America.
00:42:55.760So the way to bring about revolution here was to foment conflict between different classes, you know,
00:43:00.380to make them hate one another. And this is where you get ideas like white privilege, you know,
00:43:04.900people like Noel Ignatiev and some of his colleagues talking about white skin privilege. That's where this
00:43:09.900stuff comes from. And it's how we decay from a republic at home into this kind of oligarchic,
00:43:15.580you know, weird court state. But overseas, you know, we have this similarly strange thing going
00:43:22.260on where we're kind of almost an empire, right? We've extended our power across the world so
00:43:27.960enormously. And we've done so kind of informally in all of these ways, you know, through NGOs and
00:43:34.000with all of these, you know, many different ways that we exert influence over other nations.
00:43:37.580Sometimes the influence is good. Sometimes it's not so good. But the truth is that we have this kind
00:43:42.900of global network of influence that lands us in all sorts of trouble and complications. But
00:43:48.500I think the real issue is not so much that, you know, those networks are falling apart,
00:43:54.780but that they're falling apart because we are falling apart at home. And as I said,
00:44:00.500the way we're falling apart is that class warfare. Machiavelli, who's somebody that maybe not
00:44:06.240everybody will be familiar with as, you know, the great theorist of republics, they think of him as
00:44:10.480the kind of realpolitik scheming, you know, author of The Prince. But he has a book, The Discourses on
00:44:16.840Livy. It's a really beautiful examination of the great Roman historian, Livy, who told the story of
00:44:23.540the transition from monarchy into republic among the Romans. And
00:44:29.400Machiavelli has this amazing passage where he sits around and he basically tries to figure out whether the
00:44:35.200elites or the people are worse. Like, which one is it, you know, the populists or the elites
00:44:40.700that are worse? And it's a very relevant passage for our times because, of course, we've been
00:44:45.740talking here on the show about how terrible our elites are and our experts. And I believe all of
00:44:51.640that stuff. But I also see how people could say, yeah, but the populists can be just as bad,
00:44:56.600right? They could be. What about January 6th? What about these, you know, kind of excesses of
00:45:01.220populism that we flirted with and all of this? And what Machiavelli ultimately concludes is that
00:45:07.140although both of these things are a danger, elite decay, elite corruption is the most dangerous
00:45:15.580thing because it destroys faith in the system. It betrays the trust, not just of the people
00:45:23.080in the elites, but of the people in the country that elevated those people to positions of power.
00:45:28.220And so you start to get that despair cycle again. It's like, how do we, you know, even operate in
00:45:33.080this country when the systems that elevate people into positions of power are so broken? And so,
00:45:40.060you know, there's a couple of ways out of that, not all of them very pretty. And as I say in the book,
00:45:44.620you know, we do not want to head into another civil war, into another form of secession. These are
00:45:49.440all things that I at least very deeply, fervently pray will not come to pass. And so we ought to think
00:45:54.720about what's the remedy to, uh, elite capture that doesn't involve all those, those terrible
00:46:00.420outcomes. And the one that I pull out of these classical texts is that investment in the local
00:46:08.100community. You know, when this country was founded, there was a big debate going on about
00:46:12.380whether you could even do a republic over such a large extended space. Uh, many of the European
00:46:18.720theorists, especially the Baron de Montesquieu in France, had this idea that, you know,
00:46:24.140republics had kind of been tried, uh, Rome did it pretty well, but then it got too big. And that's
00:46:29.380when you start to see all these problems; it kind of fell apart. And our founding fathers, especially
00:46:33.580James Madison had this argument that actually, uh, a big country is an advantage for a republic
00:46:42.020because there's room to breathe. And he said, if you extend the sphere of your country,
00:46:47.720you're going to end up with all these little pockets of community where people can do things
00:46:52.220in ways that maybe they don't approve of back in Washington. You know, maybe you get like little
00:46:56.940Amish communities or you get places like Florida where, you know, they're not going to lock down
00:47:01.660for the COVID mandates, or at least, you know, they're going to be a lot less intense about it.
00:47:05.880And I think even though that system has been attacked a lot, it remains our best hope. And it
00:47:12.800remains where I see the most exciting action going on because it's in those communities.
00:47:17.160Um, and it's in those local neighborhoods and then up to the state level, um, that the problems
00:47:22.460become human-sized, and they come to a level where people can talk to one another. They can
00:47:28.180see each other face to face. Um, we don't reduce one another into these kinds of abstract concepts
00:47:33.400like you're a, whatever, a blue haired lib and I'm a fascist Republican or whatever. Um, we can
00:47:39.320actually talk at a human level about particular solutions to the particular problems that face
00:47:43.660us. And I think that's why you're seeing so much movement, say on the school boards. Um,
00:47:48.860you're seeing a lot of hope coming out of states like Florida where people are flocking to, you
00:47:53.320know, they can't move there fast enough. Um, it's because in those, uh, local communities,
00:47:58.280those little platoons as Edmund Burke called them, you can actually establish philia. You can establish
00:48:04.180love, civic friendship. And if there's one thing I draw out of Aristotle in this book, it's that
00:48:09.500civilization building for all that it's political, for all that it involves voting and fighting and
00:48:14.220whatever, at bottom, is an act of love. And we've got to recover that.
00:48:19.620We can't be ashamed about that. We have to re imagine ourselves as, uh, neighbors and citizens
00:48:25.640in a community built on love. Hmm. I love everything you just said. I also think it's a good reminder
00:48:32.040that doing all of that is not an online activity. It does not happen on Twitter. It happens
00:48:38.300in your actual neighborhood. I always say that all my friends in my Upper West Side neighborhood,
00:48:42.760who are still my best friends, they're all liberals. I love them. I see how our love for
00:48:47.900each other can be the foundation for the renewal of our society. I couldn't care less what their
00:48:52.040politics are. I care about who they are as women. So it's, it's just a reminder. It's not the
00:48:58.220metaverse. It's not Twitter. It's not Facebook. It's the people within 15 feet, the family,
00:49:03.300the neighborhood, the friends, where we cultivate the solutions on how to save the West, read the
00:49:10.560book, listen to Young Heretics. And aren't we all so lucky to have Spencer Klavan available to us?
00:49:16.160Thank you so much for being here. Oh, Megyn, I'm the lucky one. Thank you so much for having me.
00:49:20.800It's such a pleasure. Hope to see you soon. And up next, we have more goodness for you as we take
00:49:25.720a deep dive. It's going to be a fair and balanced debate on tech censorship. You've heard about section
00:49:30.380230. You don't know what it is. You're going to know. And you're going to know about the debate
00:49:34.500that's underway right now at the Supreme Court and beyond when it comes to
00:49:39.040big tech and free speech in America. Are you concerned at all about censorship on tech platforms
00:49:49.240and free speech in America? Do you feel like you've been targeted? You feel like what you can see online
00:49:54.720has been targeted in a way that makes certain viewpoints unavailable to you? This affects
00:50:00.860everyone. But what is the right solution? It sounds kind of wonky, but the topic of section 230 is
00:50:08.340important and it's affecting your daily life, whether you know it or not. And it's also being kicked around
00:50:13.680right now by the U.S. Supreme Court. Section 230 is a landmark U.S. law that shields social media
00:50:19.800companies from liability over content their users post. So if I go online on the YouTube comments
00:50:28.660section and I see something totally defamatory that's not true about somebody, I could get sued
00:50:34.860potentially, but YouTube can't. YouTube is not responsible to police my thoughts because they're
00:50:40.920not really considered like a publisher. Let's say, remember when Amber Heard got sued by Johnny
00:50:47.380Depp for defamation. She posted something in the Washington Post. See, it's more dicey when you're
00:50:53.640the newspaper than when you are a social media company. So like newspapers are held to a higher
00:51:00.140standard. Social media companies are held to a lower one. Some people think that should change and many
00:51:05.800people do not. All right. So now the Supreme Court is going to hear next week a case involving this and
00:51:10.800Google. And today we decided to get together two true experts on this issue who have vastly different
00:51:17.340opinions on these very important topics. We're going to have a good, respectful debate between
00:51:22.720Kate Tamarello, Executive Director of Engine, and Will Chamberlain, Senior Counsel for the Internet
00:51:29.420Accountability Project. Kate and Will, thanks so much for being here. Yeah, thank you for having me.
00:51:35.900Yeah, thanks for having us. Yeah, great to have you both. All right. So let's just, we're going to keep
00:51:39.840it simple so people who don't follow this anywhere near as closely as you guys do can follow it. So Kate,
00:51:44.540I'll start with you: briefly describe what is 230. We throw this term around, 230, 230. Should
00:51:49.860it be repealed? Should it not be repealed? Josh Hawley doesn't like it. Elizabeth Warren doesn't
00:51:54.600like it. Very weird bedfellows. But we know that the big tech platforms do like it. So we don't know
00:51:59.860what side to be on. Right. Because it's like, wait, even conservatives are like, I don't want to
00:52:03.920side with Elizabeth Warren. But, right, so they get confused. What is 230?
00:52:08.260Yeah, so very simply, Section 230 is a 1996 law that essentially says, whoever created the content
00:52:16.500should be legally responsible for it, not the platform that's hosting the content. And it often
00:52:21.600gets talked about in the context of social media, that makes a lot of sense. That's how most of us
00:52:25.260deal with the internet these days. But it's actually much broader than that. It applies to all internet
00:52:30.080platforms of all sizes, whether you're hosting social media posts, like tweets, or YouTube videos,
00:52:35.420or Facebook posts, or Instagram photos. But it's also things like reviews and ratings and
00:52:40.260photos and videos that you may be sharing on a smaller scale. So while the debate is often about
00:52:45.200kind of 230 and big tech, Engine is a nonprofit that works with startups and internet creators.
00:52:49.900And we're coming at it from the perspective of kind of the whole ecosystem and why the law and
00:52:53.560the liability shield really empowers users to speak and share content and information online.
00:52:59.960So, Will, what most people say, and I've listened to Ben Shapiro, who, of course,
00:53:03.260has created and helps run the Daily Wire, he says, you know, he's got his reservations,
00:53:09.000but he likes 230 because he's got the Daily Wire and he doesn't want to be held liable. If somebody
00:53:14.180posts a comment on the Daily Wire, that turns out not to be true. And then, okay, that person gets
00:53:19.420sued for defamation, but now Ben's getting sued for defamation. What's Ben going to do? He's going to
00:53:22.780say, forget the comment section, forget, forget all of this. I'm not going to engage in this business
00:53:27.080if I'm liable for what my users post. So most tech platforms, whether it's the Daily Wire,
00:53:33.240YouTube, Twitter, whatever, they like this shield because they don't really view themselves as in
00:53:38.560the business of endorsing the content that their users post. Right. And it's important to realize
00:53:44.480that the immunity in 230 is bigger than merely, we're not responsible for what our users post.
00:53:50.240They also have an immunity protection for anything that these companies remove. And that's granted at
00:53:55.200the federal level. And so one of the big issues with 230 is that this federal grant of immunity
00:54:00.000for any type of removal of content, even content that is merely "otherwise objectionable," has
00:54:07.660thwarted laws at the state level that have been trying to protect individuals' right to
00:54:13.200speak freely online. So I think we actually might be in agreement to the extent that, yeah, your
00:54:18.080average message forum or even social media platform shouldn't be held responsible for every single
00:54:22.880thing its users post. I mean, the scale is enormous. But at the same time, I don't think
00:54:27.000they should be given carte blanche to censor people for political reasons. All right. So let me
00:54:32.140stay with you for a minute. So you're going to explain to us why 230 is problematic in your view
00:54:37.700in its current form. And it sounds like you're focusing in on this ability to remove with
00:54:43.120impunity. Is this what they use to censor? And I realize it can be any viewpoint, but it's oftentimes
00:54:49.740the conservative one. Is this problematic because the right so often bears the brunt of it?
00:54:58.140Right. So there's two major sections of Section 230, two subsections. The first one is the one that
00:55:04.720you and Kate have been talking about, which is the one that says that just if you are the platform
00:55:09.520hosting the content, you are not the speaker of the content that is created by your users. And that is
00:55:15.240sort of there to insulate you from defamation liability. Like just because somebody posts
00:55:19.240something defamatory on Facebook doesn't mean Facebook should be getting sued by the person
00:55:23.820who was lied about. But then there's the second part, subsection C2. And that part says that all
00:55:31.000of these tech companies and platforms basically are immunized from liability for removing any content
00:55:35.920that's lewd, lascivious, and a whole bunch of other negative words or otherwise objectionable.
00:55:40.760And so that does give them some protection to remove things like pornography, for example.
00:55:45.660But it's so broad that it means that the companies can say, no, we just want to kick you
00:55:50.780off the platform and remove your ability to speak. And not only do you have no recourse just generally,
00:55:55.880but if a state or even the federal government tries to create a law and give you some recourse
00:56:00.740against these companies doing that, then, if you're a state, that law would be what's called
00:56:04.760preempted by the federal law. And so that-
00:56:07.700Let me jump in. Let me jump in and ask you: give me a real-life example of how
00:56:12.560this has been used in a way you find problematic?
00:56:15.780Oh, sure. A good example: Florida passed a law. This was in the news, I think about a year ago.
00:56:21.980Florida passed a law that said that their citizens shouldn't be censored on social media
00:56:27.600for no good reason, and that if they are being censored for political reasons, they should have
00:56:31.120the right, I believe this was the Florida law, the right to sue the big tech
00:56:35.560companies. That might have been Texas, actually. I'm pretty sure it was one of those
00:56:39.720two. But in any event, at least for the Florida law, the Florida law was found to be unconstitutional,
00:56:44.500I'm sorry, invalid, in the courts. And part of the reason was it was just
00:56:49.800preempted by Section 230. And the way our federal system works is that if a federal law speaks to an
00:56:55.680issue and it's a power within the federal government's power, it trumps any state law that
00:57:00.720would contradict it. So having this broad grant of federal power really hurts the ability of states
00:57:07.140to protect their citizens from censorship. OK, but wait, let me follow up. Because what
00:57:10.160I would like is a specific example. You know, let's take, I don't know, Steven Crowder.
00:57:16.440He got demonetized by YouTube. Is that in this lane? Or give me an example of somebody
00:57:22.580we would know or some kind of incident we would know where they exercise the power to remove
00:57:27.200in a way that you don't want them to be shielded for? Oh, sure. OK, Laura Loomer was removed from
00:57:33.420Twitter, for example, and she was actually the plaintiff in a major lawsuit against Twitter.
00:57:37.620She was represented by a lawyer, Ron Coleman, who actually wrote the white paper
00:57:41.760talking about 230 reform. And she was essentially suing Twitter to get her account reinstated,
00:57:46.920saying she was censored for political reasons. And that suit was thrown out of court on,
00:57:53.380among other grounds, the idea that Section 230 precluded any sort of remedy.
00:57:59.000Hmm. All right. What about that, Kate? So now we've sort of discussed some of the virtues of 230,
00:58:04.020but we'll zero in on C2, the removal and how it was used. Laura Loomer is a controversial person.
00:58:10.360But this is always going to be used against, for the most part, people who you could dub controversial.
00:58:14.680I mean, half the time people call me controversial. It's like, OK, fine.
00:58:16.920So this is the problem, because in America, we don't tend to censor controversial speech. We tend
00:58:22.200to believe that the answer to controversial speech or speech you don't like is more speech,
00:58:25.140not less, not to censor the original offending comment. Yeah, I think there's a lot to dig into
00:58:31.300here. And one of the most important pieces of context is that C2 is part of Section 230,
00:58:36.720but it doesn't operate in a vacuum. All platforms, all businesses, all people are protected by the First
00:58:41.580Amendment. And that includes, right, the government can't censor speech, but the government also can't
00:58:45.880compel speech. No government, federal or state governments, can compel anyone to host speech
00:58:50.800or to make speech. And that's actually the main reason that the Texas and Florida laws are having
00:58:56.020so much trouble and what the courts have been looking at. You know, 230 obviously is involved,
00:58:59.880but the real question at the heart of those cases is, do those laws violate the First Amendment?
00:59:03.460And some courts have said yes, some courts have said no. It's likely the Supreme Court,
00:59:07.320they're looking at it right now, they will decide whether to pick it up. They just asked for the
00:59:11.640Solicitor General to weigh in. And so I think that there's kind of this idea that C2 is what enables
00:59:17.340platforms to take down speech. But kind of in practice, it's really the First Amendment.
00:59:22.260And that's, that's pretty in line with a lot of- Wait, wait, just to stop you. It's,
00:59:26.180it's the social media company's First Amendment right to take away-
00:59:30.560the same way that, you know, the court has found that a bakery has a First Amendment right to refuse
00:59:36.120to bake a gay wedding cake. The social media companies have their own First Amendment rights
00:59:40.620to not be censored and to not be compelled to host speech. So C2 really is a modifier to C1.
00:59:47.260This dates back to pre-internet laws about, kind of, bookstores, and courts looking at,
00:59:53.940you know, how does someone who distributes someone else's speech,
00:59:57.400when can we hold them responsible? And in the 1990s, the courts were looking at different cases
01:00:02.720and said, if you moderate heavily and you miss something that should be illegal,
01:00:06.780then you're held responsible. And Congress stepped in and said, whoa, whoa, we don't want a world where
01:00:10.760you have a disincentive against moderation. We want platforms to be able to appeal to niche audiences,
01:00:16.480to be able to cater to specific people, to be able to provide safe spaces without things like porn
01:00:21.040or harassment or spam. And so that's why they put C2 in the law. And so while it is certainly an
01:00:26.960important part of the law, the First Amendment underlies kind of all content moderation.
01:00:31.060So I think that's just worth calling out. In addition to the First Amendment,
01:00:33.920when you sign up for a social media service, you sign a terms-of-use, you know, terms-of-service
01:00:39.060agreement that usually bans lots of things. And it's really up to the platform what they include
01:00:43.800in that. But if you violate that terms of service, then you're totally allowed to be kicked off the
01:00:49.240platform. You violated a contract with the company. And so it's not as if, absent 230, Twitter or whoever
01:00:56.480has a legal obligation to host your speech. It's just that, absent 230, they'd have to worry about
01:01:01.860fighting it out in court, which can cost millions of dollars, versus getting it dismissed under 230,
01:01:06.540which only costs somewhere between, say, ten and maybe a hundred thousand dollars.
01:01:10.820What about that? Well, so if Laura Loomer, if there were no section 230, what would be her grounds for
01:01:16.620a lawsuit against Twitter for bouncing her off? Because it's a private company. They're like,
01:01:20.500we don't like you. We don't like the color of your hair. You're booted.
01:01:22.580Right. Well, I mean, there's regulations on private companies, and there have been serious
01:01:28.100ones going back to the teens when it comes to common carrier type regulations or the 1960s with
01:01:33.140civil rights and public accommodation. Slews of private companies are under various regulations
01:01:38.660that say you're compelled to provide service to people and you're not allowed to arbitrarily
01:01:42.080terminate it. That's not true in every aspect of the American economy, but it's true in many of them.
01:01:47.440And we don't see those necessarily as First Amendment violations of freedom of association.
01:01:50.840And another point, I mean, we were just talking about subsection C1, the point that because these
01:01:56.480platforms aren't speaking, because they're just hosting tons of user-generated content,
01:02:01.700well, they're obviously not the speaker, so they should be insulated from liability.
01:02:05.800But then all of a sudden, when you want to say, okay, well, you're not the speaker,
01:02:09.340so you should still be forced to host other people's speech because your platform is so huge.
01:02:14.980It's the public square. And then these companies all of a sudden say, whoa, whoa, whoa,
01:02:18.240you're violating our First Amendment rights to speech. I think it's unfair for these companies
01:02:21.760to have it both ways. If they want liability protection for hosting user-generated content,
01:02:27.000then they should concede that they are not the speaker of that user-generated content and that
01:02:31.100their speech rights aren't being violated when a state, in its wisdom,
01:02:36.420decides to give their citizens the right to speak on that platform.
01:02:39.560How about that, Kate? How can they argue out of one side of their mouths,
01:02:41.780we are not in control of the content, and out of the other side,
01:02:45.620oh, we are very much in control of the content.
01:02:49.140I don't think any platform would ever say that they're in control of the content. I don't think
01:02:53.260that's a fight anyone thinks they can win. And I think this really kind of comes back to-
01:02:57.640Well, that's what C2 is. That's C2. The reason they are able to remove porn and Laura Loomer is C2.
01:03:05.560They are in control, ultimately, of the content.
01:03:07.480I mean, they're in control of kind of the environment they create. But, you know, we live in a world, because of the internet and because of 230, where anybody can create and share content on the internet instantaneously and not have to worry. Take YouTube, for instance.
01:03:23.960It used to be if you wanted to distribute a TV show, you had to hope that a cable channel and a TV studio and all the people necessary to participate in that process would agree to host you and to do that.
01:03:33.780You don't need that now. Anybody can upload anything to YouTube.
01:03:36.180And YouTube, of course, has rules around what you can upload. They don't want things like terrorist content, which is at the heart of Gonzalez.
01:03:42.040They don't want porn. That's not the business they're in. So they're able to make those decisions.
01:03:46.280That doesn't make them ultimately liable for the speech, though.
01:03:48.700And I think if we lived in a world where they were liable for the speech, you would either have so much money and time spent on content moderation to the point where the internet stopped working the way we're used to it, where somebody, some human had to review every YouTube video before it's shared, which is crazy and not feasible considering how much content is shared, or companies would err on the side of not hosting things.
01:04:09.200And so to your point earlier, Megyn, if someone's deemed controversial, maybe it just automatically gets taken down. Maybe they just don't host any of that person's speech anymore.
01:04:16.040There's no platform at all. YouTube's like, you're out. Yeah, you're out. Forget monitoring your content and your comments. You're done. You're not worth the risk. What about that, Will?
01:04:27.600I mean, I basically agree that there would be a parade of horribles in a world where there wasn't some liability protection from user generated content, and that these companies would have a very difficult time doing business. But what are we concerned about at the Internet Accountability Project and those of us on the right who want to regulate big tech? Well, we're concerned about censorship.
01:04:46.400We're concerned about major big tech monopolies using their monopoly power to censor political opinions they don't like. And we want to change the law to constrain their ability to censor us. We see it as a sort of collective regime of private discrimination. And the remedy to that is state and federal law.
01:05:05.420And so from my perspective, I think the First Amendment debate will ultimately shake out in our favor, because there are a variety of precedents that suggest that if you're not seen as the speaker, then states can protect the right of people to speak on your property and essentially compel you to allow them to speak on your property.
01:05:23.060And that's pretty analogous here. So the 230 problem is that there's a federal law that stops states from protecting the rights of their own citizens.
01:05:30.620And so I'm willing to effectively concede the idea that these companies should have liability protection from user-generated content and then use that to say, you're not the speaker. We should have the right to constrain your ability to censor people if your platform is large.
01:05:44.060So how would it work? Like if you could revise 230, well, you'd leave number one, C1 in place where it says you're immune. You're not going to be liable for a comment. In other words, one of the good examples was Yelp.
01:05:58.700If you read a negative review of a restaurant on Yelp, the restaurant can sue you as the commenter, but they can't sue Yelp. And you guys are both in agreement that under C1, Yelp cannot get sued and we don't want to mess with that.
01:06:11.420So how would you like to see C2, the one that lets them remove certain things like terrorist content, pornography and so on, how would you like to see it changed to pull back on some of what many of us believe is political censorship for the most part of one particular viewpoint?
01:06:30.100Sure. So one simple idea would be to remove the catch-all term that says that companies can remove content that is otherwise objectionable.
01:06:39.540That gives them a huge amount of leeway to remove content and preempts laws that would protect against political censorship.
01:06:45.100And another tweak to the law would just say that their good-faith belief that the content is violative of their terms of service would have to be objectively reasonable rather than subjective.
01:06:54.880That would be another tweak to that law that would make it so that states could write their own laws protecting their citizens from censorship that also go along with an objectively reasonable good faith standard.
01:07:06.120And I think the ultimate world we're trying to lead to is one where people have a meaningful remedy and meaningful predictability about what content will get them kicked off platforms and what won't.
01:07:14.880And if they are arbitrarily censored, they'll have a remedy: they can go to court.
01:07:18.980The thing that's attractive about that, Kate, is what Will was saying a moment ago, the public square, right, that these social media companies have become these behemoths that we never envisioned, you know, 20 years ago.
01:07:32.400I was just thinking about this because I talked to my old pal from Fox News, Rick Leventhal, the other day, and I was telling him about how the coverage I saw on 9-11-2001 was the reason I left the law and got into news.
01:07:44.880Well, around that same time, I remember I was in Chicago, 2001-2002, and a friend of mine was dating a guy, and she said, oh, I Googled him.
01:08:01.720That's when, and even before that, these laws were being passed to govern a body, the internet, the social media companies, and the lawmakers had no clue what they would look like, what they would be like, how important they would be to the public dialogue, right?
01:08:19.820So I see his point that, like, this is the public square, and shouldn't we be revising the regulations that govern them, understanding now what the pros and cons of this whole thing are?
01:08:30.560And I would say to that, if you like the way Twitter or Google or Meta is moderating content and you want more of that, then yeah, revise 230, because that will ensure that those are the only companies that can continue to exist once the framework that 230 created is gone.
01:08:46.540There's a reason we are the global leader in internet companies that host content, and there's a reason that U.S. policy has led to the vibrant internet world we have today. And changing it will ensure that only the largest companies, the ones that can afford to fight lawsuits, can afford to spend hundreds of millions of dollars on content moderation technology, can afford to hire tens of thousands of content moderators, will survive.
01:09:08.080There's a reason that those companies will be fine if you change 230, and in fact, several large companies probably would be okay with you changing 230.
01:09:14.460We come at it from the perspective of, what about the next generation of innovative and novel social media, but also other companies?
01:09:21.060And I think it's really easy, especially in political circles, to think that the internet is Twitter and the internet is Facebook, but it's not.
01:09:28.500We talk to companies every day that are doing really cool, unique, new things, including in the social media space, and if they don't have 230, they won't make it to be the next Facebook.
01:09:38.960And even looking at 10 years from now, let's say, I don't think there's a guarantee that Facebook or Twitter or Google will be the size they are anymore.
01:09:46.500I think there's been so much innovation in the social media space.
01:09:49.580It can feel, again, because we're kind of in this echo chamber, that we're just all on Twitter.
01:09:55.040But Discord is being used in new and innovative ways to essentially mimic social media.
01:10:00.600I wouldn't have predicted the rise of TikTok, and that's such a big platform for a lot of people.
01:10:04.300So I think it's dangerous to think that, because we're in this moment, we need to regulate specifically for this moment,
01:10:09.820when doing so will tip the scales to ensure that only the largest companies can stick around.
01:10:14.600And I would like to see different companies in place in the next 10 years, but they need 230 to grow, because otherwise they will be sued out of existence before they get a chance to really take off.
01:10:30.240I mean, there's a reason I said that I don't really want to touch C1.
01:10:33.160I take Kate's point, and I've believed for a long time, that small startups especially need liability protection from user-generated content.
01:10:41.140That's not at all what my argument is.
01:10:43.140My argument is indeed focused on the major tech platforms, and the laws I advocate would protect people's right to speak, essentially a private right of action for individuals so they could walk into court and sue.
01:10:56.340Those laws would require that the defendant company have something like 100 million users or more.
01:11:02.520I think it's not impossible to modify our laws in such a way that protects the immunity protections that startups and smaller tech companies have, while also protecting Americans' right to speak on these huge monopoly social media platforms.
01:11:15.380And whether those monopoly social media platforms change, it does seem to be the case that even if a new platform takes the place of an old one, most speech at any one given time is happening on one or two or three major social media platforms.
01:11:27.980And I think there is a genuine public interest in ensuring that whatever those social media platforms are, if they're Twitter today or TikTok today or something else tomorrow, whatever those platforms are, that Americans have a right to speak on them.
01:11:40.420So now he's, to take it to an analogy from when I was growing up: he is going to go after Ma Bell for the conversations you're having on the phone.
01:11:50.140He's going to make them subject to liability, but he's not going to go after the two kids with the Campbell's soup cans and the string attaching them, having their conversation.
01:11:58.280Does that assuage your concerns about the small startups?
01:12:00.400No, I think, you know, yes, we have the startup perspective.
01:12:05.520We talk to startup founders all the time, but we also are increasingly talking to internet creators who are also small business owners.
01:12:11.260I think it's really easy to dismiss internet creators as, like, dancers on TikTok.
01:12:15.360But right, these are advocates and educators and comedians and musicians and artists.
01:12:20.320So it's a whole community, again, that I think only exists because of Section 230.
01:12:24.160And these people use these large platforms to run their businesses.
01:12:28.040And, you know, they don't, by and large, want to live in a world where their content is served next to hate speech or where their content is served next to harassment.
01:12:36.760The ability of even large platforms to remove speech that they feel makes their platform a dangerous and unwelcoming and irrelevant place to be, taking that away is still really dangerous.
01:13:16.560That ability is really important. To the extent that the internet is healthy and working, it's because platforms can make those decisions.
01:14:13.520And I think that's a perfect example of kind of why content moderation is so difficult.
01:14:18.400Like I said, I don't think anybody would say they're doing it perfectly; nobody would, like, credit the companies with doing it perfectly.
01:14:23.420But really, content moderation, especially at scale, is incredibly difficult.
01:14:28.400And that's why these companies invest so much in technological tools and human beings, both inside and outside the company, to try to review content.
01:16:08.320They have their own time and resources to do that.
01:16:10.200But I do think this just kind of speaks to the fact that there's no perfect answer because all of the problems that you and Will have highlighted, those are pros to some members of Congress.
01:16:21.220You know, you might say that Josh Hawley and Elizabeth Warren are united in hating 230.
01:16:28.180And for every time Facebook doesn't label something that is, you know, allegedly misinformation as misinformation, you have Democratic members of Congress writing to them, asking them why.
01:16:38.280And so the path on 230 reform isn't straight because for every complaint you have, there's someone with the opposite complaint.
01:16:44.940And that pulls these companies in impossible directions, which is why I'm so worried about changing 230.
01:17:07.780She's worried about this hate speech and, you know, quote-unquote hate speech and all that stuff.
01:17:11.780So what do you make of the exchange that Kate and I just had on what the real problem is inside of these companies and how we get at it?
01:17:19.720Well, I think, I mean, Kate makes the point that social media moderation is very difficult because there's pressure coming from all directions and you don't know exactly what speech you should remove.
01:17:29.320These companies are vulnerable to both external and internal pressure campaigns from activists, employees, and also pressures from federal government agencies, as we saw with the Twitter files when they go at these companies and say, hey, you should be doing this or that.
01:17:40.860And so part of why I think my proposals and the proposals for these private rights of action are so useful and would be very good for these companies is that it would say, no, no, all this is against the law, right?
01:17:50.440If we do what you say, activist, if we remove this person from their account or we ban this COVID misinformation, whatever you say, if we did that, we would be sued and we would lose.
01:18:01.360It would allow them to fire the slew of people who are essentially interacting with all these activists, and really liberate them to just focus on, okay, what are the core elements of moderation we need to focus on?
01:18:12.540Child sexual abuse material, good, yeah, let's get rid of that, porn, et cetera.
01:18:16.480But they would be completely out of the political censorship debates, which is, I think, where they should be.
01:18:23.120And I think, I mean, you're absolutely right that these companies have a liberal bias, I mean, in general.
01:18:28.020But I think really a good analogy is actually, you go back to the civil rights era, a lot of small businesses that were enforcing Jim Crow didn't necessarily want to be.
01:18:37.300They weren't making money hand over fist, but they were faced with a collective regime of private discrimination in the South.
01:18:42.680And it was federal civil rights law that created the environment in which they could say to, say, a racist customer, guess what?
01:18:48.140No, we're not discriminating because if we did so, it would be against the law.
01:18:51.540Now, these are essentially liberating constraints for these companies that would allow them to get out of the business of censorship entirely and focus on what matters.
01:19:00.520Okay, but you're talking about, like, the non-racist shop owner who wanted to be liberated.
01:19:05.240And what I'm telling you is there is not the non-liberal shop owner amongst the social media giants in control who wants to be liberated.
01:19:12.840Well, sure, I'm giving Kate a steel man here where we assume that, indeed, the problem is not the liberal bias of these companies, but instead how messy and difficult moderation is.
01:19:20.720It's like, okay, well, even if that's the case, then you should be looking for laws that constrain you from having to engage in these muddy censorship debates in the first place and just say, you know, the whole category of this censorship would be against the law and would get you sued.
01:19:35.080You're basically saying, I'm going to get Josh Hawley and Elizabeth Warren off your backs here.
01:19:48.880This falls back onto the First Amendment problem, right?
01:19:52.760And it's even more complicated because these companies are in the business of hosting speech, and under the First Amendment they can't be compelled to host speech they don't want to.
01:20:01.620I know there's like the shopping mall analogy and there's past cases that look at like similar sounding things.
01:20:07.560But, you know, the courts that are examining the Florida and Texas laws feel differently.
01:20:12.380And ultimately, maybe the Supreme Court will weigh in.
01:20:14.320Wait, can I jump in there? Just to, I want to make sure I'm in the right space mentally.
01:20:21.900So like AT&T, I mentioned them before, right?
01:20:24.340It was a private company and they wouldn't have been allowed to jump in on my private phone conversation and say, you can't say that and cut the line, right?
01:20:31.320It was a public utility, like, you couldn't do that, even though it was a private company.
01:20:35.320And there was a piece by Vivek Ramaswamy and Jed Rubenfeld of Yale Law School in the Wall Street Journal a year-plus ago, arguing that the social media companies should be treated as such, that they, you know, crossed over into the public square.
01:20:49.960Again, they're so big and they're so vital to the national conversation.
01:20:52.620Now they should be treated more like a public entity that doesn't have the liberties they once had.
01:20:59.060Is it, are we on the same page about like the difference?
01:21:02.340Because you're saying they're a private company and they should be able to moderate the speech however they feel.
01:21:05.920And there is an argument that they're not just this little private company anymore.
01:21:10.220Well, Will mentioned earlier the idea of a common carrier, right?
01:21:13.000And the common carrier legally is something very specific.
01:21:15.000Like, it is usually a company that's much more highly regulated and holds itself out as a neutral conduit.
01:21:47.260When you sign up for a phone, you don't sign an agreement saying, I won't use certain words or I won't say certain things or I won't talk about certain topics.
01:21:55.460Social media platforms have never made that promise.
01:21:57.700That's not what they're out here to do.
01:21:59.440And in fact, something I think is worth talking about when it comes to moderation: curation and moderation can be really valuable.
01:22:05.060For instance, there's an example we always come back to in this space, around Reddit.
01:22:09.780There's a subreddit of cats standing on their hind legs.
01:22:12.520It is literally just people posting pictures of cats on their hind legs.
01:22:15.280And if you upload a picture of a dog on its hind legs, it will get removed, because that's not the purpose of that forum.
01:22:20.520And Twitter or YouTube or Facebook or anyone can say, the purpose of our forum is not, you know, the parade of horribles that we're worried about.
01:22:27.940And so I don't know that legally there's a pathway forward for these companies to be considered common carriers, but that's certainly not how they hold themselves out.
01:22:35.640And all of the startups we work with who want to compete with these companies, they don't want to compete only to one day essentially be taken over by government regulation.
01:22:43.580They don't want to live in a world where, if they get big enough, they become a public utility and get regulated as such.
01:22:48.720And you would have a very hard time attracting investors if you knew that at the end of the successful road was government intervention, because you've been deemed too big.
01:23:19.200We might have a remedy in Texas and we might have a remedy in Florida, depending on whether or not those laws have gotten through the courts.
01:23:27.220I mean, there's some tweaking, but as a general rule, I don't know that we would have a legal remedy under current law.
01:23:33.100I think that's obviously why that law needs to change.
01:23:35.780I mean, I think one of the things in general is that conservatives have relied on the fact that it seems obviously in the entrepreneurial interest of these companies not to mistreat conservatives,
01:23:46.300and so we don't need to regulate them. But it's pretty apparent after the last four years that that entrepreneurial interest is not enough to defeat the behavior of these monopolists.
01:23:57.000And the fact that they have a monopoly is the reason why they feel so comfortable censoring in the first place, why they have the freedom to censor, if you will.
01:24:03.240And just to be clear, they cannot say no Blacks on the platform, no disabled people, no women,
01:24:13.580because those are protected classes. But political thought, your ideology as a conservative or otherwise, that's not protected.
01:24:20.580And more and more, conservatives on the internet, at least, are kind of treated like they're the other; they're not a protected class in this country, and therefore they don't have protected status.
01:24:31.560And it's sort of this tension because the libs do control most of these platforms.
01:24:38.300They control Hollywood, they control sports, they control media.
01:24:40.980And so the conservatives are actually a minority, but they're not recognized as a protected class.
01:24:47.500And I think we don't have to get into detail about which classes need to be protected to just say that everybody should have the right to speak on social media, and that your First Amendment right to speak is not particularly meaningful
01:24:57.920if you can't speak on Facebook, Twitter, Instagram, the major social media platforms of the day, because that's where the political debate is.
01:25:03.460Yeah, but it's not a government company.
01:25:04.620I mean, this is one of the things: if this were run by Joe Biden, it would be different.
01:25:10.280It would be a First Amendment issue. But it's run by a private company, which can say, you have red hair and I won't serve you.
01:25:16.920Well, I mean, that's why we need to regulate private companies.
01:25:19.080And contrary to Kate, the idea that these companies can't be regulated as common carriers because they're not currently heavily regulated is circular.
01:25:26.820You're saying we can't regulate them because they're not currently regulated.
01:25:30.320Historically, common carrier regulation has been imposed upon companies that didn't want to be common carriers, that didn't want the obligation to serve everybody equally imposed on them.
01:25:40.280I mean, the trains, I think in the 1880s, were the first time this came about, because you had monopoly train lines in the very first days of train tracks going across the country.
01:25:50.300And they had the ability to discriminate and price discriminate against different customers.
01:25:53.500And the customers were just SOL if they, you know, had a problem with that.
01:25:57.540So, you know, the federal government said in their wisdom, these are private companies.
01:26:02.760Yes, but for the good of all, they need to be regulated and we're going to make you serve everybody.
01:27:10.340So this stems from, again, this is all allegations, but the accusation here is that YouTube not only hosted, but recommended ISIS content.
01:27:19.020And then there was an ISIS terrorist attack, there were unfortunately victims, and one of the victims' families is suing.
01:27:25.920And Google has told the court that its hosting and recommendations are both protected by 230.
01:27:32.780And lots of other people have weighed in, including us.
01:27:35.520We feel strongly that recommendations should be protected by 230.
01:27:37.800Lots of startups use recommendations as their competitive advantage; that's how they appeal to their users, by being able to recommend and curate specific content.
01:27:45.960And then additionally, we worked with several internet creators, so YouTube creators, creators on TikTok and other platforms, to explain why recommendations are so important as people are trying to build out an audience.
01:27:57.780Does Google acknowledge that this was a mistake, that this was not a good thing to do?
01:28:01.160So Google certainly, and to their credit, all of the large companies invest heavily in finding and removing terrorist content.
01:28:08.020Terrorist content is one of the areas where content moderation on the internet today is most collaborative and aggressive.
01:28:14.280So nobody is saying terrorist content online is a good thing.
01:28:18.800I'm not sure that it is even true that Google or that YouTube did host this content.
01:28:22.600I think that's something that would be discussed in court if we were talking about a kind of full jury trial or we're looking at the facts of the case.
01:28:29.940But because we're talking about the legal mechanisms here, that's like not even a question.
01:28:33.500It's just, should they be able to get sued if they did host and recommend content?
01:28:38.980And where is the case coming up to the Supreme Court from?
01:28:41.400What circuit was it decided in and how did it go at the lower court level?
01:28:44.720So Google did win at the lower court level.
01:28:47.740I'm not sure exactly which circuit it came from, but I think kind of this follows traditional 230 jurisprudence.
01:28:53.700Normally, when it comes to content somebody else created, platforms are able to assert 230 and they're able to win.
01:29:00.340And so this is the victim's family trying to challenge that ruling.
01:29:12.200And the Supreme Court is also thinking about taking up these two Texas and Florida cases, which, again, are being challenged on First Amendment grounds but are part of the 230(c)(2) conversation.
01:29:20.920But that would be separate and likely next year at this point.
01:29:23.300This is, like you said, happening next week.
01:29:25.880And it's just about (c)(1) and liability for hosting and recommending the content.
01:29:30.800Usually, Will, it's not a good sign if you won at the lower court level and the Supreme Court takes the case.
01:29:36.180Yeah. So I'm sure Google's not feeling too great about the fact that they're being forced to argue this in front of SCOTUS.
01:29:42.780But, you know, this isn't necessarily something that divides along ideological lines perfectly.
01:29:50.040How do you think this is likely to go?
01:29:51.720And what's unique about it, if (c)(1) is kind of not as controversial?
01:29:55.260So I think it's probably going to get reversed.
01:29:58.640I think that Google is probably going to lose because, as you suggest, I don't think the Supreme Court would have taken it if they weren't leaning in that direction.
01:30:05.640I think it's really a question about the breadth of the (c)(1) immunity grant.
01:30:09.280And I think, you know, in other cases, courts have interpreted that really broadly, to protect almost everything these companies are doing in relation to user-generated content on the internet.
01:30:18.800And so I think the Supreme Court sees this as an opportunity to narrow that grant of immunity to merely saying you're not liable,
01:30:24.400you're not the publisher of the speech.
01:30:43.420We didn't file an amicus or anything like that.
01:30:46.220But I lean towards the side of saying the courts have probably interpreted the (c)(1) grant of immunity too broadly, beyond its text, and that this is a good opportunity to constrain it and say that companies actually are ultimately responsible for the things they overtly do.
01:31:00.940And that ultimately there is somebody harmed at the end of the day here.
01:31:04.080So it's not a bad idea to interpret grants of immunity somewhat narrowly.
01:31:09.700Yeah, because, Kate, correct me if I'm wrong, but the thing here is that promotion is not the same as just hosting the content.
01:31:17.340I think in a lot of ways it is the same.
01:31:20.560And I actually really worry about the ability of the Supreme Court, at a technological level, to distinguish the two.
01:31:24.860But I mean, every time you search something, whether it's on Google or Bing or anywhere else, that's an algorithm telling you what it thinks you want to see.
01:31:32.940So it's not just the YouTube recommendations that we're worried about, although I also think YouTube recommendations play a large role in a lot of content discovery.
01:31:43.040There's a lot of YouTube out there. There's a lot of Internet out there.
01:31:45.620And recommendations are really what enable platforms to try to give their users an experience they think the user wants to see.
01:31:52.320And absent those recommendations, I mean, if they remove 230, why would anybody recommend anything ever again?
01:31:56.760You would be taking full responsibility for it.
01:31:59.880Absent 230 and absent recommendations, I worry the internet becomes kind of like a needle in a haystack, a phone book that's unalphabetized,
01:32:07.100a hodgepodge where you're trying to figure out what you actually want to see and how to find it online.
01:32:10.180Wait, so let me ask you something. So that's a valid concern.
01:32:12.980So let's say they lose this case. Google owns YouTube.
01:32:18.240And let's say it's about just the mere promotion.
01:32:21.940I mean, when I hear promotion, I think, hey, look at this ISIS video.
01:32:25.900But it could just be the algorithm returning a result to people who are searching for something.
01:32:32.100So it could be less pernicious than that.
01:32:35.040So could it be the case that, as a result of a negative ruling for them,
01:32:40.180you've got Google and YouTube saying, we're not going to promote.
01:32:44.100Like, here's a list of a thousand people who are considered controversial who we're not going to promote at all.
01:32:50.040Like YouTube is no longer going to promote.