The Saad Truth with Dr. Saad - March 24, 2026


Jeff Kosseff - The Future of Free Speech (The Saad Truth with Dr. Saad_976)


Episode Stats

Length

45 minutes

Words per Minute

144.9166

Word Count

6,524

Sentence Count

298

Misogynist Sentences

1

Hate Speech Sentences

3
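The stats above (word count, sentence count, words per minute) can be recomputed from the transcript text. A minimal stdlib sketch, assuming a plain-text transcript and the stated runtime; the naive sentence splitter here is an assumption and will differ slightly from the tooling actually used for this page:

```python
import re

def episode_stats(transcript: str, minutes: float) -> dict:
    """Compute word count, sentence count, and words per minute
    for a transcript. Sentence splitting is a crude heuristic:
    split on runs of ., !, ? followed by whitespace."""
    words = transcript.split()
    sentences = [s for s in re.split(r"[.!?]+\s+", transcript.strip()) if s]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "words_per_minute": round(len(words) / minutes, 4),
    }

demo = "Hi everybody, this is Gad Saad. Today I have one of the two authors. How are you?"
print(episode_stats(demo, 0.1))
```

Note that 6,524 words over exactly 45 minutes would give about 145.0 WPM; the 144.9166 figure above implies the runtime was rounded for display.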


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode, Gad Saad interviews the co-author of the new book, "The Future of Free Speech," Jeff Kosseff, about the future of freedom of speech in the 21st century. They discuss the role of technology and AI in shaping our understanding of the world, and how technology shapes the way we think about free speech.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 I'm delighted to report that I have joined, as a scholar, the Declaration of Independence Center
00:00:06.120 for the Study of American Freedom at the University of Mississippi.
00:00:10.800 The center offers educational opportunities, speakers, internships, and reading groups for
00:00:17.020 the University of Mississippi community. It is named in honor of the United States founding
00:00:22.720 document, which constitutes the nation as a political community and expresses fundamental
00:00:28.820 principles of American freedom, including the recognition of the importance of Judeo-Christian
00:00:34.960 values in shaping American exceptionalism. Dedicated to the academic and open-minded
00:00:41.100 exploration of these principles, the Center exists to encourage exploration into the many
00:00:47.740 facets of freedom. It will sponsor a speaker series and an interdisciplinary faculty research
00:00:54.360 team. If you'd like to learn more about the center, please visit Ole Miss, that's O-L-E-M-I-S-S dot
00:01:02.300 edu slash independence slash. Hi everybody, this is Gadsad. Today I have one of the two authors of
00:01:11.180 this book, The Future of Free Speech. I've got Jeff Kosseff with me. How you doing, Jeff?
00:01:18.020 How are you? Good, very good. And the other co-author who unfortunately couldn't join us
00:01:23.160 is someone who came on my show. I went back into my archives. He came on apparently two days after
00:01:29.860 the disastrous October 7th. I think he came on October 9th, 2023. And this was his previous
00:01:36.460 book also on free speech. It's Jacob Mchangama. Is that right? Did I pronounce that correctly?
00:01:42.260 Yep. Well, it's a pleasure to meet you, Jeff. Why don't we just get into it? Give us the big
00:01:47.440 synopsis of this book coming out soon. Yeah. So the book that we wrote is really
00:01:53.220 looking at how free speech is not just one right among many, that it's really foundational for
00:01:59.560 democracy itself. And what we're looking at is the unfortunate situation that in the first
00:02:06.760 quarter of the century, democracies have been entering what we see as really a free speech
00:02:13.220 recession, where for various reasons, whether they're noble or not, democracies are more
00:02:20.940 willing to accept some backsliding on freedom of speech.
00:02:24.860 And we really make the case that this is not only wrong, but it is short-sighted.
00:02:30.980 And we present alternative scenarios and other solutions to what governments see as really
00:02:40.080 large social problems that don't involve restricting speech.
00:02:43.980 Okay, so we'll get to some of those solutions, maybe that, you know, downstream in our conversation.
00:02:49.040 But for now, what would be some of the forces that have caused what you aptly call a free
00:02:56.080 speech recession?
00:02:58.180 Well, so to understand it, I think it's helpful to look at what happened in the 20th century,
00:03:02.300 especially toward the latter part of the 20th century, where we had really an expansion
00:03:09.400 of free speech. We had governments recognizing the potential for protecting freedom of expression
00:03:15.320 and resisting authoritarian urges. And we also had really the emergence of the modern commercial
00:03:23.160 internet really posing this great potential for speech. So we had governments really embracing it
00:03:29.420 during the end of the 20th century. And then what we started to see in this century has been
00:03:37.660 governments becoming nervous that people might have too much power and that the harms might be
00:03:45.040 too great, whether it's hate speech or whatever they would term as misinformation. And now we're
00:03:52.020 even seeing it with concerns about AI. And I think that the people in power felt like they were
00:03:58.940 losing control because of this expansive free speech. And that's really driving so much of
00:04:06.960 what you see in terms of proposals, and unfortunately, laws, regulations, and court rulings that really
00:04:13.500 crack down on people's ability to freely speak without government intervention.
00:04:19.180 Do you feel that, and this is something that I actually addressed in some of my own writings,
00:04:24.000 do you feel that the default reflex or default penchant, you know, within the architecture of
00:04:30.880 the human mind is actually to not permit, you know, unfettered free speech if I'm in power.
00:04:38.620 And it really is a blip in history that you actually have the, you know, institutional
00:04:45.220 mechanisms and the associated values that have allowed free speech to flourish. And if that's,
00:04:51.160 if what I'm saying is true, then in a sense, your book could be one that could be rewritten
00:04:57.880 every 10, 20, 100 years, because we always return to the default value, which is when I'm in power,
00:05:05.360 I want to control what you can say. I think that's exactly right. And there are some
00:05:10.440 exceptions where people in power do really stand up for free speech. But in general,
00:05:15.500 once you have the power, it's very tempting to say, well, I want to control criticism
00:05:21.700 of my government. And this goes even beyond just political power; companies as well. Corporate
00:05:30.460 executives also don't like it when they're criticized. I think you're exactly right. It is
00:05:35.240 a very natural reaction to want to suppress criticism. And unfortunately, when people are
00:05:45.500 in power, they can use the force of law, the force of the police, of the court system to really make
00:05:53.740 it difficult, if not impossible, to speak out against those in power. So I fully agree. Now,
00:06:00.620 in my own writings, I've argued that the reason why there is constantly an attack on freedom of
00:06:07.780 speech, certainly within the Western tradition, is because you, the ruler, you're conflating
00:06:16.280 the distinction between deontological ethics and consequentialist ethics. And so deontological
00:06:22.260 ethics would be absolute principles, inviolable principles. Presumption of innocence should be
00:06:28.460 a deontological principle. Freedom of speech should be a deontological principle. Freedom of
00:06:33.080 of inquiry should be a deontological principle. A consequentialist ethic would be, it's okay to
00:06:39.300 lie if you're trying to spare someone's feelings. And so I think that not all, but some of the
00:06:47.120 attacks on freedom of speech in the West do not necessarily come from a sinister,
00:06:53.460 nefarious purpose, but it's because we are the benevolent and kind, empathetic overlords who
00:07:00.100 need to, you know, hold back some speech for some valid, consequentialist, empathetic reasons. So,
00:07:09.100 for example, don't do research that demonstrates that one particular religious group is less
00:07:15.220 likely to assimilate in Western countries than another, because that's just mean. This is what
00:07:19.900 I call epistemological empathy. So, do you at all distinguish, and even if you don't, maybe we can
00:07:27.020 talk about it. There is the nefarious, sinister, you know, squashing of speech, and then there's
00:07:33.520 the empathetic, benevolent, Plato's noble-lie type of squashing. Does that matter to you, that
00:07:40.040 distinction, or is all squashing bad? So I come from the first principle that all squashing is bad, and
00:07:49.080 I fully agree that there are different motivations, and some I might agree with
00:07:55.520 the reasons behind some of the motivations, but I still think that giving into that impulse
00:08:03.320 is wrong. I'll give you one example that is from about five years ago. I wrote about it in my last
00:08:09.140 book. This was proposed by a few Democratic senators in Congress in the United States when
00:08:18.280 they were very concerned about what they viewed as COVID misinformation. And so they proposed
00:08:24.460 a bill that would have essentially created greater liability under a law that I've written
00:08:31.980 about called Section 230 for platforms, online platforms, social media that transmit health
00:08:40.080 misinformation. And so it would basically become more costly. They could be sued more easily if
00:08:46.200 they algorithmically promote health misinformation. And you look in the bill and you think, huh,
00:08:50.460 what is health misinformation? And you read the definition and it says that
00:08:56.940 health misinformation is determined by guidance that is issued by the Health and Human Services
00:09:03.860 Secretary. And I don't know exactly what the logic was for the senators who proposed this at the
00:09:12.720 time, but this was in 2021. And I think perhaps they were thinking, well, the person who's
00:09:19.240 currently the Secretary of Health and Human Services, I agree with. So I want him to be
00:09:26.400 defining what is forbidden speech on the internet about health. And I spoke out very strongly
00:09:33.880 against this for a variety of reasons. But one of the things I said is, you know,
00:09:38.900 you're not always going to be in power forever. And there might be someone in that office who
00:09:44.440 you disagree with at some point. And I think that that's the sort of short-sighted approach
00:09:52.440 that they take thinking, well, we're going to control things forever. No, when you tear down
00:09:58.240 safeguards for freedom of speech, it's very hard to put them back up. And I think that's too often
00:10:05.900 lost on politicians. So I think that, even when you're having that consequentialist
00:10:13.620 impulse, sometimes you're looking at the really short-term consequences and not the
00:10:20.000 long-term consequences of what happens when you remove these safeguards for free speech.
00:10:25.820 Indeed. And actually, I'm glad that you mentioned health, quote, misinformation,
00:10:30.160 because I have a whole section in my, so my forthcoming book is called Suicidal Empathy,
00:10:35.440 Dying to be Kind. And I have a whole chapter on forbidden knowledge and settled science and all
00:10:42.580 this kind of stuff. And to hammer the point that what is one man's misinformation turns out to be
00:10:50.360 another man's Nobel Prize, most Nobel Prizes at some point would be considered by the orthodoxy
00:10:59.740 as misinformation. And I mean, that's not hyperbolic. That is literally true, right?
00:11:06.400 So it just absolutely befuddles me when I see my colleagues, as I mentioned to you offline, that I've been a very outspoken professor for many decades now against all this nonsense.
00:11:19.660 Most of my colleagues, who you would think would certainly be on board with me, completely swim into the warm waters of the consequentialist pool.
00:11:30.540 So then my next question to you would be: is it just in the genetic makeup
00:11:37.000 of different persons that some of us really can see the value of the deontological principle of
00:11:45.120 freedom of speech, whereas most people don't have the personhood to do so? Or is this something that
00:11:50.420 you could train the consequentialist to come to the warm waters of deontological thinking?
00:11:57.640 What do you think of all this? So I've been trying to do that training for a while, and it's not
00:12:03.920 worked all that well, so perhaps I'm just not doing it right. Or it's genetic. Or it's genetic, and
00:12:10.720 you can't really change people's thinking when it comes to this issue. Yeah, and I'm not going to
00:12:16.760 say why people take that approach, and again, I think many of them don't have any
00:12:23.700 nefarious motives. They just think, you know, this speech is so bad that we have to do something.
00:12:32.200 And this is actually how I got the title for my previous book, Liar in a Crowded Theater,
00:12:40.460 because that book is about why the First Amendment protects most false speech and what the different
00:12:47.240 rationales are for false speech being protected in the United States. And the reason I titled it
00:12:52.880 liar in a crowded theater was, I actually came to that title pretty late in the writing process
00:12:58.660 because as I read all of the court documents in these cases where the government or litigants
00:13:06.440 were trying to punish or suppress what they viewed as false speech, time and again, the
00:13:13.500 justification by the censor was, well, just as you can't yell fire in a crowded theater,
00:13:18.900 you also can't say X, Y, or Z, when in fact, you usually could say X, Y, or Z. And fire in a
00:13:28.260 crowded theater really became a proxy for, well, yeah, we have free speech, but there are some
00:13:36.580 things that are really bad that we've got to be able to suppress. And free speech has limits. And
00:13:43.560 Even in the United States, which really has extraordinary protections for freedom of speech, it's true that the First Amendment is not absolute. That's not a controversial topic. You can't go to court and perjure yourself. If you do that, you can face punishment.
00:14:03.300 If you commit fraud, you could face penalties.
00:14:07.220 But those are narrowly defined, very carefully defined exceptions.
00:14:11.900 And fire in a crowded theater became sort of this placeholder, this wild card for, well, if speech is bad enough, then the government should be able to suppress it.
00:14:25.000 And that's exactly what the Supreme Court has said is not the case.
00:14:28.300 The Supreme Court has very strongly said we don't have an ad hoc balancing test.
00:14:33.300 for freedom of speech and that, yes, we have some narrowly defined exceptions, but that's exactly
00:14:40.020 what they are, exceptions. And other democracies say that. I'm more skeptical of whether that's
00:14:50.560 the case. So for this book, Jacob really has a lot of experience. He's from Denmark. And so he
00:14:59.580 has a lot of experience with the European system. And most of my work has primarily focused on
00:15:03.920 American First Amendment law. So it really was eye-opening for me to learn from Jacob about
00:15:11.020 how, yeah, the United States is actually pretty rare in its protections for freedom of speech,
00:15:18.660 even among Western democracies. Well, even never mind Europe, I'm Canadian.
00:15:24.280 and Canada is of course much closer to the European model than it would be to the American
00:15:30.140 model. So I'm sure you've thought about this. I wonder if you have sort of a definitive answer.
00:15:36.200 What is unique about the American, whether it be the American ethos or the American experience or
00:15:43.100 the American spirit that allowed them to have that extra guardrail called the First Amendment
00:15:49.560 that so many other countries, many of which were the founders, right, sort of the British,
00:15:54.720 we can turn to them as some of the originators of many of these ideas of individual dignity and so
00:16:01.340 on. What is unique about America that it came up with the First Amendment and that hasn't been
00:16:06.920 followed in the rest of the Western tradition? Well, what's really unique has been the judicial
00:16:13.000 interpretation of the First Amendment. We could have had a very different First Amendment outcome
00:16:19.360 in the United States had the Supreme Court not taken this very strong exceptionalist approach
00:16:27.340 to freedom of speech. So, I mean, the First Amendment is more strongly worded than some
00:16:32.340 of the foundational speech protections in other jurisdictions, including Europe. But
00:16:38.600 courts can still read in very broad exceptions and loopholes and balancing tests and so forth.
00:16:46.360 And the United States Supreme Court has really resisted that. There have been temptations and there have been some cases that have been better than others. In our book, we actually go through four cases of really hateful speech that the Supreme Court has protected the speakers.
00:17:10.060 And it's challenging because the speech is really vile. It is racist or anti-Semitic, anti-military. Really, there's not much redeeming value just from a normative sense.
00:17:29.500 But the Supreme Court has said, we're going to take these neutral principles and apply them even to the most difficult cases.
00:17:41.460 And I think the court's willingness to continue to do that is probably the top reason why the American system has greater protections.
00:17:56.940 Now, that can always change. We can have a few shifts on the Supreme Court and a case with really bad facts. And that could lead to a radically different interpretation of the First Amendment. And that's something that people like me really worry about all the time.
00:18:16.600 But I think at least for now, that's what we have.
00:18:20.080 I think the second reason why America has such strong speech protections, and this really applies to the Internet, is a law that I wrote my first book about, which is called Section 230 of the Communications Decency Act, which was passed in 1996.
00:18:34.920 And what it basically says is that, absent a few exceptions, online platforms are not legally responsible in civil actions for the speech of their users.
00:18:48.400 It was not noticed when it was passed, in part because it was 1996, but what that did is it created this ecosystem in the United States for platforms to build their business models around user-generated content rather than content that the platform has created.
00:19:06.520 And you think, OK, well, what did that really do? Well, that made it so that everyday users, consumers, individuals could speak their mind and the platform would allow them to do that because it's not worried about being obliterated in court.
00:19:27.420 So Section 230 says they're not responsible to curate the stuff, right?
00:19:35.440 Well, it says what Section 230 says is that platforms are not legally responsible if they either moderate or don't moderate the content.
00:19:45.620 So they could make the choice if there's some particularly objectionable content, the platform could say, I'm going to leave it up, even if someone's complaining, even if it's defamatory.
00:19:56.820 Even if someone says this is about me and it's false, the platform can keep it up or it could take it down.
00:20:01.880 And either way, the platform is not going to face legal responsibility for that content.
00:20:10.080 Now, there are a few exceptions, but it's a very strong and broad law.
00:20:14.320 So to give you an example of how Section 230 really creates a business model, you think about Yelp.
00:20:22.020 So Yelp has hundreds of millions of reviews of businesses.
00:20:27.200 And Yelp has really, I think, thoughtful policies for when they take down a post.
00:20:36.140 And it's things like if it invades someone's privacy, if it names an individual employee who's not an executive, things that are really thoughtful.
00:20:45.860 But it also says we don't adjudicate factual disputes.
00:20:49.240 So if I go to a mechanic and I think the mechanic charged me double what they quoted me and I write a Yelp review about it, that mechanic can complain to Yelp and say, hey, I never did that.
00:21:06.320 But Yelp will say, we're not getting involved in this.
00:21:10.720 We're just going to keep the review up.
00:21:12.160 Now, because of Section 230, that mechanic cannot successfully sue Yelp. Without Section 230, there's a much greater risk that Yelp could be sued and would have to spend a lot of resources defending itself on the merits of a defamation claim.
00:21:37.040 And you just think, you know, given the hundreds of millions of reviews on Yelp, would that business model even be viable?
00:21:45.660 What I think probably would happen is, and I don't want to speak for Yelp, but they would probably develop a system where it's much easier for businesses to convince the platform to take down that content.
00:21:58.760 Now, you might say, well, so what? But I personally go to Yelp to find the negative reviews. If I'm about to spend money on a business, I want to go and see if it's being blasted by consumers; if it is, I'm not going to do business with it.
00:22:21.440 And without Section 230, Yelp would have far fewer negative reviews because businesses could just exercise a veto by complaining to Yelp.
00:22:31.600 So Section 230, if I summarize it, is more congruent with the First Amendment protections than if that clause were not operative, yes?
00:22:44.380 So Section 230, I think, supplements the First Amendment, because ultimately, I think a lot of the First Amendment provides strong protections in certain defamation cases, just in terms of for public figures and public officials.
00:22:59.160 But what Section 230 does is it provides an additional procedural safeguard so that a platform does not have to defend a defamation case on the merits, which could be very time consuming and costly.
00:23:13.380 But Section 230 really is a statute that codifies free speech values for the Internet.
00:23:19.340 Okay, well, I've got a great, as you were mentioning the Yelp example and defamation, I thought of a personal case.
00:23:25.520 Maybe I shouldn't mention it because it's going to lead to the Streisand effect, but who cares?
00:23:31.780 So you've heard of Rate My Professor?
00:23:34.200 Yes.
00:23:34.900 Okay.
00:23:35.600 So I'm saying this not to toot my horn because it's relevant to the story.
00:23:41.460 I've been a professor for 32 years, an award-winning professor at many universities
00:23:46.740 that I taught at. But suddenly, after October 7th, you may or may not know this, I'm Jewish
00:23:52.300 and a strong Jewish supporter of all things Jewish, and certainly of Israel. After October 7th,
00:23:59.920 you know, I was obviously very vocal in my support of Israel. My regular university
00:24:08.520 is called Concordia University in Montreal, which is the definition of the, it is the exemplar of by
00:24:14.860 far the most Jew-hating university in North America. It literally dwarfs all of the other
00:24:21.460 ones, not even close, different order of magnitude. So I had to take a two-year leave of absence from
00:24:26.860 my university because in the 21st century, it became too dangerous for a Jewish professor to
00:24:32.120 walk on campus. Well, of course, all of the Hamas lovers and the beautiful, peaceful Islamic
00:24:38.940 folks went to Rate My Professor and bombed it, for years now. And it turns out that, you know,
00:24:46.880 I literally gang-rape students in the classroom, I run, you know, I don't know, Epstein
00:24:53.940 Island seminars, and on and on and on. Of course, it's all nonsense. But when I try,
00:25:01.960 very tentatively, I wasn't gonna, you know, I just ignored it. People said sort of the 230 stuff:
00:25:07.600 oh, but you know, Rate My Professor doesn't have to do anything. So please explain it to me
00:25:13.340 like I'm a slow child in the back row. If we have the asterisk that we have freedom
00:25:21.940 of speech but not for defamation, how is it feasible that an organization offers all the
00:25:28.920 conduit for people to post the most astoundingly defamatory things against me, and then go,
00:25:36.920 bruh, 230? Explain it. Yeah, so, did you try to get it taken down from Rate My Professor?
00:25:44.340 I think my assistant went to report something, because, I mean, of course it's all
00:25:52.220 nonsense. But like, it literally is that they say, I taught such and such course, and the number is
00:25:59.760 officially wrong. Or they say that it's been the last two years that I'm gang raping all the
00:26:05.060 students. I haven't been at the university. I'm on leave. So they are very, like, we don't even
00:26:10.660 have to debate the intricacies of the astounding defamation. It's so laughable, but nothing
00:26:19.240 happens. And so then I went to the Rate My Professors thing, and it says, you know, we
00:26:25.660 don't take off anything and you can't sue us, or something. So I said, you know, at some point I should
00:26:30.520 find a lawyer, and I'm not trying to solicit your employment here, but at some point someone
00:26:36.340 has to stand up and say you can't engage in this kind of defamation and completely get away
00:26:41.860 scot-free? Well, so there are a few things. First, I'm not familiar with Rate My Professor's
00:26:48.920 moderation. A lot of the platforms still have methods to get the content taken down despite
00:26:55.900 Section 230, because Section 230 doesn't just protect platforms' decision to keep content up,
00:27:02.360 but it also protects their ability to take it down. So if Rate My Professor took down
00:27:08.300 those reviews, they couldn't face some bogus lawsuit from the people who posted it. It gives
00:27:14.100 them that breathing space. There are some platforms that have an absolutely no moderation
00:27:19.580 policy, where they won't take anything down, ever. And part of that is determined by the marketplace:
00:27:27.260 they lose credibility. And that's what we've seen over time. The other thing I would say is
00:27:34.000 that this doesn't prevent a defamation lawsuit. It prevents a defamation lawsuit against Rate
00:27:39.900 My Professor. You can still sue the people who posted the content. Now, you might say, well,
00:27:47.960 I don't know who they are because they're posting anonymously, but it's often pretty easy to
00:27:56.400 unmask people. I wrote a book about anonymous online speech after my Section 230 book for
00:28:02.660 this very reason. And it requires a few subpoenas. Now, people can try to mask their identities.
00:28:08.580 They might use Tor. They might use a neighbor's Wi-Fi. But over the past 30 years, there have been
00:28:14.900 incredibly successful attempts to use subpoenas. You have to subpoena them. The way that it would
00:28:23.540 work is you would issue a subpoena to rate my professor saying, I want all the information
00:28:29.060 about the person who posted this comment,
00:28:31.740 they would give you back,
00:28:33.140 I don't know if they require registration,
00:28:35.520 but they would, at minimum, they log IP addresses.
00:28:39.100 So they would have to give you the IP address.
00:28:42.280 After the IP address,
00:28:43.700 you would then figure out what ISP it's associated with,
00:28:46.760 and you'd send a second subpoena to the ISP
00:28:49.940 and say, I want the subscriber name
00:28:52.420 who's associated with this IP address.
00:28:54.320 And then you would get a name,
00:28:55.420 And that would be a pretty good indicator, often, of who's doing it. And that, I think, places the responsibility more squarely on the person who is committing the defamation.
00:29:12.000 Right. Got you. All right. Let me go back to the book. Here it is. For those of you who are watching or listening, please go and pre-order. It's out in mid-April, yes? Something like that?
00:29:22.500 Yeah, April 7th.
00:29:23.460 Right. What are some things in your research for this book that maximally surprised you?
00:29:30.840 Oh, geez, I could have, you know, I went and checked something about European free speech protections.
00:29:37.800 And boy, was I shocked by that. Or give us some surprising things.
00:29:41.880 So one thing that really surprised me was Germany.
00:29:45.880 They have laws against insult and particularly against insulting politicians and public officials.
00:29:52.840 And just the frequency of the penalties, usually it's a fine, sometimes it could be more than that, for people who just post something obnoxious online about a politician.
00:30:07.660 And to see that really spreading, those sorts of penalties for what could be deemed insults or even broadly hate speech throughout Europe, the United Kingdom.
00:30:24.920 When I say I was surprised about Europe, that was really it: first, how thin-skinned the
00:30:34.320 politicians are there, and how frequently they resort to legal penalties for their critics. Now,
00:30:43.120 you mentioned briefly, when you were talking about Rate My Professor, the Streisand effect.
00:30:47.400 That was, I assume your listeners know what the Streisand effect is.
00:30:53.440 Go ahead. Thank you for saying that, because maybe I'm assuming wrong. So describe it for us.
00:30:58.700 So this is actually my friend Mike Masnick, who runs the Techdirt blog. He wrote this, it must have been in 2005, I believe. It might have been slightly earlier.
00:31:08.740 He was writing about Barbra Streisand. Someone had taken an aerial photo of her home and she filed a lawsuit to get it taken down.
00:31:22.680 And they posted it on a website that got very few views.
00:31:27.320 And because of the lawsuit she was filing, it massively increased the number of people who saw these photos.
00:31:36.860 They never would have seen it had it not been for what Streisand did.
00:31:43.480 And so that's become, Mike called it the Streisand effect, which has really taken hold.
00:31:48.300 And I think it's one of the most brilliant free speech concepts because what it says is that, you know, trying to penalize speech will often lead to more dire consequences than just kind of letting it be.
00:32:03.680 And when I see what the public officials in Europe are doing, I immediately think about the Streisand effect, because once they bring a criminal defamation charge or an insult charge or anything like that, that draws so much more attention to the speech, because it's one idiot on social media.
00:32:25.600 Who cares? You should just be able to let them say what they want to say and move on. But all of a sudden, if you're bringing a case against them, there's going to be news coverage. There's going to be more posts about it. And suddenly you're drawing a lot more attention to that speech.
00:32:40.760 You know, I'm currently reading this book, which you might appreciate why it's sort of one of the grand ideas. This is Galileo against the church. So when I usually have some sort of super libertarian politician say, you know, we need to get rid of tenure. It's just dead wood.
00:33:02.760 And I always say, look, I really do understand the reflex that, you know, the frustration that oftentimes you have professors who, you know, the second after they get tenure, they never do anything again for the next 40 years and are truly deadwood.
00:33:18.080 But that's not the case for many professors.
00:33:20.100 And certainly I could have not lasted for 15 seconds in academia were I not protected by tenure.
00:33:27.360 I know for a fact my university would have loved to solve the Gad Saad problem if I weren't protected.
00:33:34.220 Do you feel that as a sort of subspecies of the greater sort of freedom of inquiry, freedom of speech, do you feel that the concept of tenure itself is increasingly under attack?
00:33:48.600 Or is this, and I'm not asking this for personal reasons, but what's your sense about how American politicians view the protection afforded to tenure?
00:33:58.400 Yeah, I think that's a great question. And you're seeing it at the state level in various states, where there are proposals to either abolish tenure or make some really substantial changes to it.
00:34:12.200 And I think that's really dangerous. Now, I don't think that it's this sacrosanct system that can never be touched.
00:34:20.440 I think that there might be reasonable changes to just ensure that professors are doing their job or having an environment where, I mean, they're not harassing students or anything like that, which is already the case under tenure systems.
00:34:45.280 Exactly.
00:34:45.520 But ensuring that, I mean, you always hear the horror story about a professor, a tenured professor who doesn't show up to class or doesn't want to teach or all of those sorts of things.
00:34:59.260 I think there is a fiduciary responsibility that all professors, especially at public institutions, have to make sure they're doing their job. But tenure is absolutely vital to ensuring that professors are able to research freely and speak freely.
00:35:17.880 I think, you know, oftentimes we place these sort of constraints on a concept as if it were an intractable problem that can't be solved.
00:35:29.080 You can, you know, I can walk and chew gum at the same time. So you can keep the tenure mechanism exactly as it is, but set up some minimal standards of productivity that you must surpass in order for the tenure protection to be operative, right?
00:35:49.200 So, for example, and it could be unbelievably minimal, right?
00:35:52.240 It could be as long as you publish one peer-reviewed paper every three years, which is not particularly difficult.
00:35:59.960 And as long as you are showing up to your classes and you're getting at least, you know, higher than one out of five on your teaching evaluations, then that at least ensures that the most egregious manifestations of deadwood behavior post-tenure are dealt with.
00:36:17.280 But the idea of getting rid of literally the most fundamental mechanism that allows unorthodox, irreverent thinkers to say and do the things that they do, to get rid of that because there's some idiot who hasn't published something for 40 years, seems to be slightly short-sighted.
00:36:35.280 What do you think?
00:36:36.200 I fully agree.
00:36:37.400 You said it perfectly.
00:36:38.560 I mean, I think my one concern is in setting those standards.
00:36:45.280 I think you're right.
00:36:45.840 I think if it's something like one peer-reviewed article every so often or something like that, I think that's fine.
00:36:52.080 I think the danger is if the standards start to be a pretext to go after someone with controversial views, to say we're going to set these standards in a way that this person or these people can't meet for whatever reason.
00:37:10.600 You just don't want to get to that.
00:37:12.240 Right.
00:37:12.780 No, that makes sense.
00:37:14.120 I'm not sure if you were ready to answer the following question, but that's part of the organic nature of The Saad Truth.
00:37:21.840 So, you know, I'm someone who, you may or may not know, has had a very difficult childhood growing up in Lebanon as part of the last remaining Jewish community in Lebanon, in the Middle East.
00:37:36.880 So I'm intimately familiar with the beautiful and noble peace purveyors of Islam.
00:37:43.380 And so I've been warning for many decades now that while individual Muslims might be perfectly lovely, and as a matter of fact, I can attest to that because I know more Muslims by virtue of being from Lebanon than most people will ever meet in their lives, that doesn't mean that the central tenets of Islam are in any way compatible in the most obvious ways to anything that we would call Western liberties and freedoms.
00:38:09.540 Usually the blowback I get from, supposedly, you know, the intelligentsia is, yeah, Gad, but you
00:38:19.220 know, freedom of religion. Which, to me, again, baffles me, because what you're saying is that
00:38:27.000 there is this inviolable, to our earlier discussion, thing called freedom of religion, which now could
00:38:34.280 be used to ensure that you commit what I call in my forthcoming book civilizational seppuku,
00:38:41.240 right, the disembowelment of your society, because I'm utterly impotent to recognize that two
00:38:48.840 religions might carry very, very different consequences to our liberties and freedoms.
00:38:55.140 Could you ever, and I'm asking, you're a lawyer by training, yes, you're a friend, yes.
00:38:59.880 So that's why I think asking you is particularly relevant. Do you think that there is some legal trajectory by which, whether it be Islam or any other ideology or belief system, that it could be construed as no longer protected under the cloak of freedom of religion, if it can be argued that its foundational tenets are an incitement to violence,
00:39:28.300 would result in the end of our freedoms and therefore is seditious and banned? Or is there
00:39:35.420 no conceivable legal mechanism under American jurisprudence where that is possible? And if it
00:39:42.620 is possible, do you think that that would be a good idea or not? So this is getting a bit outside
00:39:50.460 of my area of expertise. So I do freedom of speech, internet law. Freedom of religion is a robust
00:39:57.100 body of law that I feel like I'm probably not the most qualified to speak on. I like to say when I am
00:40:04.580 qualified to talk about something and when I'm not. So in terms of what the outer bounds are
00:40:10.040 of freedom of religion, and also, I just don't have enough familiarity with exactly what would
00:40:17.960 be pushing those outer bounds to really be able to give an informed perspective on that
00:40:24.300 question. I fully respect that if you are truly honest in saying that you're exhibiting
00:40:31.700 epistemological humility, and not because, oh, no, I dare not speak about the unmentionable.
00:40:38.500 It's absolutely humility. So I've learned long ago to speak what I know. And so I have very
00:40:46.780 strong opinions about what I'm knowledgeable about. And I might have opinions about other
things, but they have to be informed, I think. Actually, this is a great opportunity to actually
00:40:59.080 support what you just said, because I've repeatedly said that one of the reasons why it is uniquely
00:41:06.320 difficult to cancel me as arguably the most irreverent professor is because I'm incredibly
00:41:14.760 well calibrated about what I know and don't know. So that when I go on shows that have millions of
00:41:21.880 views, where I know that all sorts of haters and detractors are going to, you know, rip apart every
00:41:28.640 syllable that I've enunciated, I'm able to defend every single position that I've taken. But the
00:41:35.500 reason why you could never catch me metaphorically with my pants down is because exactly to your
point, if you ask me, so what do you think about the laws to, you know, legalize
00:41:48.060 marijuana, you know, throughout Canada, I say, well, you know, that's a great question.
00:41:52.940 It's above my pay grade. I don't know enough about it. So I never wing it. I never fake it.
00:41:57.600 And so I never get caught having said something I can't defend, because I only speak about what I'm really confident in.
00:42:02.980 So I respect that. All right. So anything else you want to cover in this book that we may not
00:42:08.960 have discussed so far that you think our listeners and viewers should be aware of?
00:42:14.320 So I think one thing we haven't really talked about is that all hope is not lost for free speech. So we believe we're in a free speech recession, where we have governments, both democracies and authoritarian governments, really trying to find new ways to restrict speech.
00:42:32.020 But there are models around the world where we're seeing some resistance to that.
00:42:41.080 So we talk about New Zealand, for example, which after the Christchurch shooting, there were real movements to restrict hate speech online and really tighten up the speech ecosystem.
00:43:00.620 And there was a lot of pressure and New Zealand resisted that.
00:43:06.080 And it was principled political leaders who said, we're not going to do that.
00:43:11.460 I think another model that we focus a lot on in the book is Taiwan.
00:43:16.220 So Taiwan, there were a lot of concerns about foreign election misinformation and also misinformation about COVID.
00:43:24.980 And the government, there were some who said, well, maybe there should be more laws about misinformation. And Taiwan didn't take that approach. They actually said, we're going to do something called radical transparency, where we are going to be as transparent as we possibly can with the public and let them see the government warts and all.
00:43:49.080 And we can explain it to them and we will give them as much information as we can.
00:43:57.940 And that's been fairly effective both for public health and for elections where people really have the information they need to make informed decisions.
00:44:12.360 It might go with the government, it might go against it, but the key is that they're not being told what to think.
00:44:18.180 They're given the information, they're given the tools to come to their own decisions. And I think, unfortunately, that's not the norm right now. But I think those are just two models that we look at where we could have some hope and a roadmap for other governments.
00:44:41.740 Well, I'm always glad to end a show on an optimistic, hopeful note. So thank you for that. Thank you so much for coming on, Jeff. Say hello to Jacob on my behalf. Stay on the line so we can say goodbye and come back whenever you have another worthy topic to discuss. Thank you so much.
00:45:00.060 Thank you so much.
00:45:00.940 Cheers.