#368 — Freedom & Censorship
Episode Stats
Words per Minute
172.4
Summary
In this episode, I speak with Greg Lukianoff, the President of the Foundation for Individual Rights and Expression (FIRE), about the origins of political correctness, free speech and its boundaries, the First Amendment, technology and the marketplace of ideas, social media and cancel culture, comparisons to McCarthyism, self-censorship by professors, the Hunter Biden laptop story, how to deal with Trump in the media, the deplorable state of higher education in America, and other topics. Greg and I also discuss the origins and history of FIRE, how it came to be founded, and how it addresses problems such as political correctness and censorship on campus. We don't run ads on the podcast, so it's made possible entirely through the support of our subscribers. If you're not currently a subscriber, you'll only be hearing the first part of this conversation. To access full episodes of the Making Sense Podcast, you'll need to subscribe at samharris.org, where you'll also find our scholarship program, which offers free accounts to anyone who can't afford one.
Transcript
00:00:00.000
Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if
00:00:11.640
you're hearing this, you're not currently on our subscriber feed, and will only be
00:00:15.580
hearing the first part of this conversation. In order to access full episodes of the Making
00:00:19.840
Sense Podcast, you'll need to subscribe at samharris.org. There you'll also find our
00:00:24.960
scholarship program, where we offer free accounts to anyone who can't afford one.
00:00:28.340
We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:32.860
of our subscribers. So if you enjoy what we're doing here, please consider becoming one.
00:00:44.760
Today I'm speaking with Greg Lukianoff. Greg is the president of the Foundation for Individual
00:00:50.880
Rights and Expression. The acronym is FIRE. He earned his undergraduate degree from American
00:00:57.180
University and his law degree from Stanford. And he worked for the ACLU of Northern California
00:01:03.620
and other organizations before joining FIRE in 2001. And he's one of America's most passionate
00:01:10.500
defenders of free speech. He has written about the issue in the New York Times, the Wall Street
00:01:16.300
Journal, the Washington Post. He has produced documentaries on the subject. He also wrote,
00:01:22.400
along with Jonathan Haidt, The Coddling of the American Mind. And most recently, he wrote,
00:01:28.840
along with Ricky Schlott, The Canceling of the American Mind. Cancel culture undermines trust,
00:01:34.500
destroys institutions, and threatens us all. But there is a solution. And that is the topic of
00:01:40.720
today's conversation. Greg and I discuss the origins of political correctness, free speech and its
00:01:46.820
boundaries, the bedrock principle of the First Amendment, technology and the marketplace of ideas,
00:01:53.520
epistemic anarchy, social media and cancel culture, comparisons to McCarthyism, self-censorship by
00:02:01.180
professors, cancellations from the left and the right and how they differ, justified cancellations,
00:02:08.200
the Hunter Biden laptop story, how to deal with Trump in the media, the deplorable state of higher
00:02:14.760
education in America, and other topics. And now I bring you Greg Lukianoff.
00:02:27.460
I am here with Greg Lukianoff. Greg, thanks for joining me.
00:02:32.060
So we've got a lot to talk about. The world is on fire. And you happen to run an organization
00:02:38.940
called FIRE, which addresses some of these problems. I will have introduced you properly
00:02:43.740
in my housekeeping. But what is FIRE, and how do you come to be running it?
00:02:49.180
FIRE is the Foundation for Individual Rights and Expression. And we're actually celebrating our 25th
00:02:54.460
anniversary this year. We were founded back in 1999 by Harvey Silverglate, who is a liberal-leaning
00:03:01.780
libertarian, and Alan Charles Kors, who's a more conservative-leaning libertarian, who is one of the,
00:03:08.300
you know, world's foremost experts on the Enlightenment, especially Voltaire. And they founded
00:03:14.800
it because even back then, they realized that, and Alan was a professor at Penn, that students
00:03:21.940
were increasingly getting in trouble for what they said, not just for what they did. And they
00:03:27.660
wrote a book called The Shadow University, trying to blow the lid on what was going on, that came out
00:03:33.100
in 1998. And Harvey always likes to say that he thought, you know, that would solve the problem.
00:03:38.240
But instead, they got emails and letters from people all over the country asking, particularly
00:03:43.700
professors, asking for help in their free speech and academic freedom cases. So in order to deal with
00:03:50.140
the onslaught of interest, they founded FIRE in 1999. And I joined as the first legal director in 2001.
00:03:58.660
And I'm the weird law student who actually went to law school specifically to do First Amendment law.
00:04:05.480
I took every class that Stanford offered on First Amendment. And then when I ran out, I did six
00:04:12.020
credits on censorship during the Tudor dynasty. And I externed, a weird word for internship,
00:04:18.720
at the ACLU of Northern California. So when Harvey went to Kathleen Sullivan, who was then the dean of
00:04:25.540
the law school to ask for who they would recommend to be the first legal director of FIRE, it remains
00:04:31.160
the greatest compliment I ever received, is she recommended me by name.
00:04:38.820
Right. Actually, we overlapped a little bit there. But I was going back to finish my undergraduate.
00:04:44.160
But I remember that the first time I was there, there were some intimations of the coming
00:04:49.480
censorship apocalypse. I remember that the, the great books approach to, I forget what it was called
00:04:57.360
in the freshman years, like the freshman kind of literature requirement was a kind of a great books
00:05:03.320
seminar. And, you know, this was much maligned as just teaching the products of dead white men.
00:05:12.220
And there, there had been a march, I think it was, might've been a year before I got there as a
00:05:16.580
freshman, but, you know, Jesse Jackson had led a 500 person march on the Stanford campus that got a
00:05:21.980
lot of coverage. And there was certainly a movement afoot on that particular campus at that point to
00:05:29.260
reset everything with respect to how we talk about ideas. It's, I don't quite remember what was
00:05:35.880
happening in 1999, but I can only imagine the pot just continued to boil from there.
00:05:40.640
Yeah. It was interesting that, you know, I started there in 97 and nobody really talked about the
00:05:47.080
fact that just two years earlier, Stanford law had lost a court case defending a speech code that it
00:05:55.140
had come up with, with professor Tom Gray, who was a professor who was there when I was there to limit
00:06:00.480
nominally racist and sexist speech, but really going after offensive speech to a vast degree. And even
00:06:07.520
though it was a private school, the legislature passed a special law called, it's now called the
00:06:13.160
Leonard Law that actually made sure that non-sectarian schools in the state of California actually had to
00:06:18.840
abide by first amendment standards. So they actually lost in court in 95. And this was like a dirty little
00:06:24.520
secret by 97 because it was actually kind of a sweet spot to go to school. Because what I missed
00:06:31.620
was what I call in my latest book, the first great age of political correctness, which is essentially
00:06:37.980
85 to 95, which is when speech codes came into vogue. There was, I remember actually the chant was famous
00:06:45.140
by Jesse Jackson. It was, Hey, hey, ho, ho, Western Civ has got to go.
00:06:50.200
And this was a lot of what Harvey and Alan were writing about in the shadow university.
00:06:54.800
But by say 95, 96, political correctness had become even a joke on the left. So there was a
00:07:01.380
sense that the speech codes were defeated. The professoriate kind of fell out of love with
00:07:06.240
enlightened censorship. The students, you know, were more the kids of the boomers who were actually
00:07:10.820
quite good on free speech. And so this all, there was a sense that this all kind of died off.
00:07:16.000
What I learned in working with Jonathan Haidt, when I was doing the research for coddling the
00:07:20.500
American mind was that interestingly, even though this kind of fell off of the public radar, because
00:07:26.480
there was a sense of political correctness was like this fever that had broken sometime in the
00:07:30.420
mid nineties, but that the viewpoint diversity of professor hirings
00:07:36.860
plummeted in the late nineties. It got much, much worse, even as people had taken their eye off the
00:07:42.180
ball. And when I started at FIRE in 2001, I was pretty shocked at how easy it was already to get
00:07:49.060
in trouble for what you said on campus. And I also say when, when I got to Stanford, it was a little
00:07:53.840
bit of a culture shock to meet so many wealthy kids who were pretty Victorian. Uh, I would almost
00:08:00.460
describe it in, in their ideas of what acceptable speech constituted. It seemed like it was very easy
00:08:06.440
already to get in trouble for saying something in an unapproved way, but it was a cultural thing.
00:08:12.460
It wasn't, um, that people were actually literally getting punished for it. It was more that it just
00:08:17.600
seemed kind of, forgive the expression, uptight.
00:08:20.500
Well, I should say much of what we're going to talk about, you cover in great depth in your most
00:08:27.060
recent book, the canceling of the American mind, which you wrote with Ricky Schlott. And where, you know,
00:08:33.920
there's just a ton of material in there and many case studies of which we are not going to get to
00:08:38.740
cover, which are, we may cover some, but- Let's cover all of them.
00:08:42.080
Yeah. Feel free to introduce any, any that you want, but there's just so much detail there where
00:08:47.080
it's just amazing. The, the, it's certainly amazing for anyone who thought that cancel culture wasn't
00:08:53.000
a thing. I mean, anyone who can be found having said that in the last 10 years really needs to read
00:08:58.820
your book. But, um, let's start with the concept of free speech and its boundaries. How do you think
00:09:07.120
about free speech and when, and what precisely do we want to defend here?
00:09:11.380
I have a pretty, um, expansive view of what the term free speech means. And I think of it as the big
00:09:17.880
Boolean sphere that includes the first amendment, but is much, much larger than that. Um, and I
00:09:23.880
definitely, you know, I think of it as not just being about the limits of the law of what you're
00:09:29.240
allowed to say, but also about the cultural tolerance for the acceptance of people with
00:09:35.420
any kind of opinion. I sometimes explain there's no such thing as a free speech absolutist, or at
00:09:40.540
least I haven't met a serious person who is because we all believe that some speech is and should be
00:09:46.300
unprotected, but I am an opinion absolutist. And what I mean by that is that if you're merely
00:09:52.000
expressing your opinion, even if it's repugnant, that is protected. It has to be something more
00:09:56.880
than the mere expression of opinion. Uh, it has to be something like discriminatory harassment,
00:10:01.360
which is a pattern of severe persistent and pervasive behavior that targets someone on the
00:10:07.680
basis of a protected category that a reasonable person would also find offensive. And by the way,
00:10:12.820
we actually have seen genuine discriminatory harassment on campus since October 7th in a number of
00:10:18.540
cases. So it's not an unmeetable standard. Intimidation also, uh, also known as true threats,
00:10:24.260
which is when you would make a reasonable person fear that they're in
00:10:28.420
danger of bodily harm or death that is not protected, nor should it be. Defamation is not
00:10:33.860
protected, which is basically making a factual allegation that someone, for example, is guilty of
00:10:39.780
an odious crime. You know, I always gave the example when I was at the student newspaper
00:10:43.620
of, you know, groundlessly asserting that, you know, that someone is a pedophile, for example,
00:10:48.940
is a classic example, because I could ruin someone's life if, if they believed you, uh,
00:10:53.800
if they believed you. So nobody really believes that there, there's no way to use words in a way
00:10:59.240
that isn't protected. I mean, there's also, of course, you know, you use words to commit extortion,
00:11:03.900
use words to, uh, that are incidental parts of existing crimes, use words to commit conspiracies,
00:11:10.700
for example. But I think that the, the most clearly protected kind of freedom of speech is
00:11:17.520
when you're merely expressing a viewpoint. And under that, they're really, the law I think is
00:11:23.280
sensibly set up in the United States to have what we call the bedrock principle, which is,
00:11:29.480
it comes from a case called Texas v. Johnson from 1989, which is about whether or not you can burn an
00:11:35.180
American flag or whether or not an anti-flag-burning law would be constitutional. Uh, and it's a great
00:11:41.080
opinion, by the way, like I really recommend reading it because when I used to teach first
00:11:44.860
amendment, it does this wonderful kind of overview of the entire history of the first amendment and
00:11:49.800
a free speech decision. So it's a, it's a great one to just one stop shopping for, for learning about
00:11:54.880
the field. But it states the bedrock principle of the First Amendment, which is that you cannot in
00:12:01.220
the United States ban speech simply because it's offensive. And I always make the point that this
00:12:07.360
is a great rule for a genuinely multicultural society because what people find offensive and I,
00:12:14.080
my dad's Russian, my mother's British. I grew up in a neighborhood with a lot of kids from Vietnam and
00:12:18.860
Puerto Rico and Korea and from Peru, like all people who had very different ideas of like what offensive
00:12:26.660
speech looked like. And it was such a perfect example for why free speech had to be the rule
00:12:31.840
because we couldn't, you know, my British mom thought all sorts of things were offensive. My
00:12:37.340
Russian father thought practically nothing was, but Danny Nguyen's, you know, parents had different
00:12:41.800
ideas of what was offensive and same thing with Nelson Bolito's. And so I think that the bedrock
00:12:47.180
principle is a very sensible rule for a free multicultural diverse society, but it is one of the major
00:12:55.000
things that makes America so very different from Europe, for example.
00:13:00.260
Hmm. But let's linger on that point because I'm often guilty of just reflexively seeing everything
00:13:06.660
through an American lens. And then I'm reminded that even our closest cousins over in the UK or
00:13:14.680
elsewhere in the English speaking world don't enjoy the same kinds of protections that we do.
00:13:21.940
How is it different in Europe and Canada and Australia and New Zealand and elsewhere? And
00:13:27.240
I think you and I are both going to agree that we have it closer to right in America than exists
00:13:34.940
anywhere else, at least that I'm aware of. What do we have that other nations don't around free speech?
00:13:40.760
I was sort of laughing right there because I violate a major convention among constitutional
00:13:48.220
lawyers when I travel to Europe. It's generally considered, and it's partially because a lot of
00:13:53.020
other constitutional lawyers are highly sympathetic to greater limitations to free speech. So it's
00:13:58.500
partially also just agreement. But even when they're not, there's a tendency to go over there and be very
00:14:04.200
deprecating of our own system and very complimentary towards the way things work in Europe. And I am not
00:14:11.120
that lawyer. I go over and I'm rude about it. I'm sorry. I actually think I'm quite polite about it,
00:14:16.500
but I violate the rule of saying, you guys have it all right and we have it all wrong. Because I'm like,
00:14:21.840
I don't think we do. Particularly since I attend a lot of these hate speech conferences and watching
00:14:27.060
someone with a German accent explain why this cartoon landed someone in jail, but this cartoon
00:14:34.140
didn't. And you're like, this doesn't make any sense. These are both arguably equally offensive
00:14:40.280
and you're just being arbitrary about it. So you're not really persuading me that this is the way to go.
00:14:45.940
And that's partially because they don't have a bedrock principle that you can't ban speech simply
00:14:51.260
because it's offensive. But I think that comes from an interesting place. And I think it's
00:14:55.380
essentially, oh God, I find myself thinking about the Treaty of Westphalia, 1648, the nation state
00:15:01.620
model. Essentially, I think that in my mother's Britain, there is an idea that there's sort of a
00:15:07.800
modal Brit and that nation can be kind of almost analogized to a single person who has just like
00:15:15.960
any person a list of things that are offensive and a list of things that aren't a John Bull kind
00:15:21.880
of character. And I think that that's something that you'll see in sort of like the national
00:15:26.980
character of Germany and France. And there is an idea that there is like a modal version of that
00:15:32.380
person. And if you think of your country that way, it can make sense that you can call balls and
00:15:37.920
strikes in terms of what's offensive and what's not, because you think that there's some way that
00:15:42.100
most people in your country would. I think this has gotten much harder to argue as all of these
00:15:47.300
countries have become much more diverse, much more multicultural. And I think it was also a little
00:15:51.360
bit asinine to argue in the first place, particularly in a lot of these places, because of profound class
00:15:56.340
differences within these same cultures. And so I think that's one of the reasons why they
00:16:02.300
can be more comfortable thinking that not having a bedrock principle makes sense.
00:16:07.040
But I can see that the strain on the system is really happening in places like Scotland and
00:16:12.780
Ireland and Germany, for that matter, in terms of how do you enforce this fairly? And the answer is
00:16:17.720
it's incredibly difficult. Meanwhile, in the United States, we've always had an understanding
00:16:22.720
that Boston is not like Richmond, is not like Georgia, is not like, and then later, California,
00:16:29.640
not like Texas. So we've never really thought of ourselves, despite the image of Uncle Sam,
00:16:35.020
we've always been a country that we know full well that different parts of us are simply not going to
00:16:40.600
agree. And I think, frankly, we've been at the genuinely multicultural, generally multi-ethnic
00:16:46.700
society thing longer than a lot of the countries in Europe. And so I think we actually did get this
00:16:52.880
one right. And I think that what you see going on currently in Europe, particularly around issues
00:16:58.680
like immigration, I mean, I think about that, the polling that you presented on Bill Maher's show
00:17:05.500
about some conservative Muslim attitudes about a variety of issues. That struck me as something
00:17:12.940
that is absolutely protected in the United States. But I could imagine that even some factual assertions
00:17:17.940
could be treated in some countries in Europe as if they're actual offenses now.
00:17:21.080
Yeah. Well, the modal Brit or the modal German or the modal Frenchman is increasingly going to be
00:17:29.080
guilty of Islamophobia in that context. So let's return to the American view of things. I think we
00:17:37.820
both agree that as well-intended as they might be, prescriptions against offensive speech in other
00:17:45.840
countries like the UK or Germany, Holocaust denial laws, for instance, in places like Germany, and I
00:17:52.480
think Austria has it, I'm not sure where else, you know, they can, you know, literally land you in
00:17:57.760
prison for denying the Holocaust. You know, as much as I am a student of the Holocaust and am alert to
00:18:04.600
the, even to the forward-looking prospect of a future Holocaust, I think laws of that sort are just
00:18:11.220
wrong-headed. I mean, they don't accomplish what you would hope they would accomplish, which is to get
00:18:16.580
everyone to understand that, you know, in this case, the Holocaust really did happen and it was horrific.
00:18:21.600
And it's just the wrong algorithm to be running socially so as to purify people's thinking and to get
00:18:29.420
people to converge on a fact-based and well-intentioned discussion about reality, right? I mean, what I think we
00:18:35.760
want, and I'm now putting words in your mouth and feel free to take them out, but what I think we want is the
00:18:40.720
the proper error-correcting mechanism of it being legal to more or less talk about anything,
00:18:49.100
leaving aside those cases where speech isn't quite just speech, where it is, you know, I think, as you
00:18:56.200
said, a provocation to imminent lawless action or, you know, obvious harassment, where it's more,
00:19:02.580
you know, you're, it's a kind of behavior, you know, you could be defrauding someone with speech,
00:19:07.380
et cetera. But if you're, if you're just expressing opinions and even, you know, extraordinarily odious
00:19:12.820
ones and, and even in principle dangerous ones, you know, dangerous if acted upon, I think we want
00:19:20.100
the, generally speaking in the public sphere, which is to say, you know, that the sphere that the
00:19:26.280
government coercion casts a shadow over, I think we want just the best idea to win in collision with
00:19:35.040
all other ideas. Is that how you, do you agree with what I've said on that?
00:19:38.700
Well, the marketplace of ideas theory of freedom of speech, I think is, I think it's a great idea.
00:19:45.560
And I think it's particularly relevant, um, in higher education where it is supposed to be a
00:19:50.400
clashing of ideas and the use of disconfirmation to, you know, chip ever closer to the truth by chipping
00:19:57.120
away at falsity. But I, I have a little bit of a more idiosyncratic theory on freedom of speech
00:20:02.780
that unsurprisingly is fairly expansive. And I just call it simply the pure informational theory
00:20:08.000
of free speech, which is that there is, if the goal of human knowledge is to know the world as it
00:20:13.500
is, then you can't really know the world as it is without knowing what people actually think. And if
00:20:21.320
you create a situation in which people are too terrified to say what they actually think for fear
00:20:25.720
of punishment, you are depriving yourself of really important knowledge about the world. And I, and I
00:20:30.600
extend this also to conspiracy theories. So like, let's take a real one, the protocols of the elders
00:20:35.600
of Zion. The idea that that wasn't a relevant and important thing to know about its existence
00:20:42.340
in history is just delusional. I think it's very important that people understand that there was
00:20:48.540
this massive conspiracy theory at the early part of the 20th century, that that was based on very
00:20:53.340
shoddy writing that basically was saying that Jews controlled the world. Is it true? No. Is it worth
00:20:59.680
knowing? Of course it is. And so when it comes to conspiracy theories, when I'm being a little more
00:21:05.100
lighthearted about it, the way I explain it is, listen, lizard people who live under the Denver
00:21:09.860
airport do not control the world. But knowing that your girlfriend or uncle believes that lizard people
00:21:17.000
living under the Denver airport control the world is very important information to have.
00:21:21.960
And so knowing what people really think and why is a key part to understanding the world.
00:21:27.640
And when it comes to conspiracy theories, I mean, here's one thing that I just have to say over
00:21:32.260
and over again. If you're battling someone who believes that there's a conspiracy to shut them
00:21:37.600
up, do absolutely nothing that looks like a conspiracy to shut them up because they're just going to
00:21:44.300
take that as like, well, I must be onto something. And then the last part of like why you have to let
00:21:49.760
these bad ideas out is something that I call Mill's Trident, from On
00:21:56.840
Liberty, which is that in a truth-seeking discussion, which is actually not most discussions, but in a
00:22:02.500
discussion to try to get towards truth, there's only three possibilities. You're either completely
00:22:08.060
wrong, you're partially wrong or partially right, or you're completely right. And now that middle
00:22:14.540
category is the one that's going to be the most common. It's well populated. Yeah, exactly. But in
00:22:19.840
the case that you're completely wrong, of course, free speech is useful because people can point out
00:22:25.140
how and why you're wrong in the middle one. There's really no other way to get at what's the wrong
00:22:31.780
parts of your opinion, uh, without usually a somewhat long process of, of, of debate and discussion.
00:22:38.100
But even under that final one, if you're completely right, that obviously the Holocaust really happened,
00:22:44.580
but you never actually get tested on that. People tend to hold their beliefs like they hold
00:22:49.160
prejudices. They know that something's true or they believe that something's true, but they don't know
00:22:54.500
why. And that's a very brittle state of affairs because if basically, you know, that it's illegal
00:23:00.940
to say that the Holocaust didn't happen, but you're never actually challenged on whether or not it
00:23:05.540
happened, you may not even end up being aware of how overwhelming the evidence that it absolutely
00:23:10.840
completely happened is. And so I think that a lot of these anti-conspiracy laws backfire in
00:23:17.940
multiple ways. And one thing that is, I think, interesting is when you look at the prevalence of
00:23:24.040
Holocaust denial, but also of things like antisemitism in Europe versus the United States,
00:23:29.960
Europe passed a lot of laws that, particularly places like France, laws that were designed to
00:23:35.800
go after antisemitic speech. There was also a push, of course, on college campuses to do the same thing
00:23:40.900
both in the nineties, but they actually got passed in France and they only got passed on
00:23:45.300
campuses in the United States. And for the most part, they were defeated in court.
00:23:49.980
But when you look at sort of like the rates of antisemitism in Europe, they're much,
00:23:56.200
much higher than they are in the United States. Although unfortunately things have been getting
00:23:59.840
worse in the past couple of years. And I think that's at least in part underestimating what can
00:24:05.080
happen if you don't actually create a situation in which people are actually finding out
00:24:10.980
what people really think and why. I think we should introduce a few more distinctions here,
00:24:15.380
because I think, as much as we seem to agree, we may disagree on a few points,
00:24:21.300
and it'll be interesting to have you defrag my hard drive if it needs it. So there's obviously
00:24:28.460
a distinction between laws and norms, and there's a distinction between public and private spaces or,
00:24:35.960
you know, platforms. And a failure to notice these distinctions, in my view, does kind of confound
00:24:42.400
a lot of the discussion around censorship and misinformation and, you know, big tech and
00:24:47.480
what should have happened on Twitter during COVID, et cetera. And some of which you cover in your
00:24:52.540
book. So as I just said, I fully agree with you that nobody is a free speech
00:24:58.580
absolutist. That's just a cartoon of a cartoon. And especially when you move into private space and
00:25:05.680
private platforms and businesses that have brands to be maintained, right? Your Meta or Instagram or
00:25:13.120
your Twitter now X, right? You're, you're a business that has to make money at some point,
00:25:19.200
presumably. And then the question is, what, what sort of information do you want on your platform?
00:25:24.960
You're, you're in a position to make decisions, you know, curatorial, you know, moderating decisions.
00:25:30.380
You have to make those decisions unless you're going to become the next 4chan or 8chan or some other
00:25:37.460
digital sewer that nobody or a few, a few of us want to spend much time in. And certainly no one
00:25:43.940
wants to advertise in, right? And yet any effort to clean up the sewage is, you know, routinely described,
00:25:53.300
you know, mostly right of center now as censorship, right? And this has always just struck me as pure
00:26:01.100
confusion. And especially in a political context where you're, you're, you're seeing the, the
00:26:07.160
algorithmic amplification of obvious misinformation and disinformation, some of which obviously is put
00:26:14.580
there maliciously by, you know, Russian troll farms, right? Although that's not the bulk of it, that
00:26:19.520
certainly happens. But much of it is just generated by maniacs, whether it's Alex Jones or someone you've
00:26:25.300
never heard of. And the way it's different than ordinary speech is once again, it is algorithmically
00:26:31.700
amplified based on the dynamics of these hallucination machines that we have built. And it's just a much
00:26:39.420
more complicated picture where, again, there are distinctions between laws and norms, public and
00:26:44.380
private, normal speech versus algorithmically boosted speech, speech whose terrible effects you can see
00:26:51.160
play out hour by hour. And in certain cases, you know, many of us worry that the misinformation
00:26:56.880
component of this, if left unchecked, could render us just ungovernable politically, right? I mean,
00:27:05.960
people are so confused. When you look at what's happening on college campuses now, I look at it as just a,
00:27:11.500
it's a manufactured hysteria based on almost pure misinformation. I mean, it's a lot of stupidity,
00:27:18.700
you know, greasing the wheels there and moral blindness, but there's a tremendous amount of
00:27:23.620
misinformation. And so the, you know, the happy talk I uttered a few minutes ago about how, you know,
00:27:30.800
the best idea should win in collision with all other ideas, we have created private and public and,
00:27:37.100
you know, semi-private, semi-public, you know, contexts, whether you think of each of these things as
00:27:42.280
publishers or platforms, you know, depending on how you squint your eyes, you can make a case for either.
00:27:47.100
We've created a situation where we appear to be driving ourselves fairly crazy and it's worth
00:27:54.300
worrying about this. And so I'm just wondering, I mean, take any side of that grotesque object I just
00:28:00.740
put in view you want, but how does your commitment to free speech in a very, you know, deep way help you
00:28:13.780
Yeah. No, this is, and I wish I could, I could have a short answer to this, but I don't think
00:28:21.180
I really can because I think that we have to take a big step back to where we are in 2024. And I talk
00:28:29.140
about this a lot within my organization. Probably people are sick and tired of hearing the printing
00:28:34.140
press, but I mentioned that I did censorship during the Tudor dynasty. But the reason why I was
00:28:39.540
studying censorship during the Tudor dynasty is that was all directed at the printing press as being
00:28:45.300
this infernal device that spread misinformation that led to all of these social ills. And I always
00:28:51.800
want to be clear here, it did. It really did in a lot of ways because, and this is a point that I
00:28:59.000
sometimes, you know, take a somewhat different approach than my much beloved friend, Jon Haidt,
00:29:03.900
is that I'm, you know, I'm more of a civil libertarian, but also I come from, you know,
00:29:07.900
a little bit of a different sort of academic background. And I definitely, you know, tend to
00:29:13.940
see solutions as more bottom up than top down, because I look at what happened in England around
00:29:20.880
the printing press, starting with Henry and then also going all the way through Elizabeth, actually all
00:29:25.400
the way through the Glorious Revolution and a little bit beyond, about the printing press. And
00:29:30.920
essentially what happened there was you suddenly had millions of additional people in a faster paced
00:29:39.120
global discussion about what is true and what is not. And in the short run, it was devastating.
00:29:46.960
It meant an increase in the witch trials. It meant religious unrest. It meant political unrest.
00:29:54.720
It meant genuine bloodshed. And in some ways, this disorder led to the biggest war in European
00:30:01.400
history before World War I, the Thirty Years' War, which was just an absolute calamity.
00:30:08.240
So Henry's and Elizabeth's and Mary's and Edward's, you know, concerns about the printing press, you know,
00:30:14.840
honestly should not be so easily dismissed because it really was a device that led to all this
00:30:20.340
destruction. But that's going to happen anytime that you have this many additional people brought
00:30:25.560
into a conversation. Over the long term, it actually turns out that this many additional
00:30:29.900
eyeballs and minds on problems was a tremendous boon for the production of thought and for human
00:30:36.200
progress, because it allowed for things like the scientific revolution. It allowed for
00:30:40.740
decentralized disconfirmation of information, which is, you know, one of the great things
00:30:46.260
that we're still benefiting from to this day. Social media has added billions of people to an
00:30:54.800
instantaneous and fast-paced global conversation. So there's literally no way to avoid that being an
00:31:03.140
anarchical, epistemically anarchical period. So I feel like we're unavoidably in a crazy period right now.
00:31:10.940
And I sometimes get more concerned about ways to fix the problem if they're heavy-handed and top-down.
00:31:16.980
But another thing that happened when you have this many new minds and voices brought into a
00:31:22.260
conversation is you realize that the old authorities probably didn't necessarily deserve to be authorities
00:31:28.860
in many ways, because with this many minds and eyes on a problem, you realize, you know, the thinness
00:31:35.940
of what, you know, some of the ancient voices were actually telling us, particularly about things like
00:31:39.680
science. And so the old authority gets destroyed. And there's always a period before new, new authority
00:31:46.860
is able to establish itself. And it can be a messy, ugly process. I feel like we're in the epistemic
00:31:53.300
anarchy period still, but I do think that people are kind of raising their hands to say, actually,
00:31:59.900
and institutions and papers and publications as well, to say, you know, I will be
00:32:07.340
straight. I will actually be someone who will be a reliable source of expertise and
00:32:12.680
information. But the thing is, one of the reasons why trust is in the title of The Canceling of the American
00:32:19.380
Mind is because I think cancel culture, and the calamities we're seeing on campus, have been a
00:32:26.440
well-deserved disaster for the credibility of the academic class and for the, for the expert class
00:32:33.260
overall, because after, you know, a professor loses their job, for example, for saying
00:32:38.640
biological sex is real, which really happened, nobody's going to listen to you again
00:32:43.460
when you try to claim actually it exists on a spectrum, because they realize you can't be
00:32:46.660
objective in an environment where your job would depend on giving the quote unquote right answer on
00:32:51.560
that. So I think that right now we're seeing a much bigger disruption because nobody knows who to
00:32:59.040
trust. And a lot of the voices that we were supposed to be able to trust, whether it's mainstream media,
00:33:04.660
whether it's academia, whether it's the expert class, have shown themselves to not be that
00:33:09.180
trustworthy. And I, and unfortunately I, I actually agree with that to a large degree. I actually think
00:33:14.460
that the expert class has really shot itself in the foot and in a lot of ways, but we haven't yet
00:33:19.540
reached the, the stage where we know which experts are actually reliable. And unfortunately the,
00:33:25.240
a lot of those same experts, you know, that we, that we would hope would be, are the same people
00:33:29.640
who created sort of the, the misinformation on campus. So there's a big problem there. And I've
00:33:35.600
been thinking a lot about how to create social media specifically for truth, for truth identification.
00:33:44.740
And I think it's possible. I talked on Lex Fridman's podcast about the idea of, um,
00:33:49.880
being able to even create a truth-seeking stream within X. Now the rules have to be different
00:33:55.240
in order to create that. But if you do it top down,
00:33:59.740
people aren't going to trust it, and it would also largely be based on the expertise of existing authorities
00:34:04.540
who've proven themselves not that trustworthy. So it has to be a process. It has to be relatively
00:34:09.120
transparent, but we're not quite doing it yet. I think that some of the authorities that you should
00:34:15.040
have some faith in have raised their hands, for example, on Substack. Um, I think the New York
00:34:20.400
Times is trying to improve its trustworthiness to, to the larger public, at least to some degree.
00:34:26.040
But I think we're unavoidably in a messy period where we haven't figured out a way to make our
00:34:31.640
systems for chipping away at falsity, you know, for, for establishing what is true by getting rid of
00:34:36.700
what isn't. I think we need to figure out a way to make that whole knowledge system more reliable.
00:34:41.860
Yeah. I, I, I take your point about the printing press and, and I, I have a, uh, printed copy of
00:34:47.300
the Malleus Maleficarum somewhere in the house. Oh, wow. So, uh, not, not an old one. I mean,
00:34:52.360
a new one. So, but, uh, it's just, I understand the chaos wrought of suddenly being able to manufacture
00:34:59.380
crazy documents and- Has it helped you find some witches?
00:35:02.540
Yes. Yeah. I'm still looking, but, uh, it's a good read. But the, I, I, I worry about drawing
00:35:10.740
a false analogy here or being misled by an analogy that doesn't quite capture the dynamics of what
00:35:18.500
we're suffering here, because it's not just a matter of more people talking, right? Because it's,
00:35:25.820
you know, if you're going to talk about just literacy and its consequences, well, we, we almost
00:35:30.760
got maxed out before the internet, right? I mean, not, you know, leaving aside like perhaps great
00:35:36.360
swaths of illiterate people, but when you're talking about the developed world, you know,
00:35:41.680
we had something like, you know, 90 plus percent literacy across it. And we were talking to each
00:35:48.440
other about everything under the sun. And what's happened now is we're talking much faster and the
00:35:55.760
usual gatekeepers no longer really, you know, really can man the gates because, because we go
00:36:02.300
around them and above them and below them on social media. And it's introduced some fairly perverse
00:36:09.920
incentives that have undermined the business model of journalism in particular.
00:36:16.040
It's made, you know, academia brittle in the face of public opinion online, you know, perhaps
00:36:22.820
quite unnecessarily, but nonetheless, they respond to the, the various mobbings we have seen on Twitter
00:36:29.260
in, in ways that have proven totally dysfunctional, you know, firing a professor for claiming that
00:36:35.300
there are only two biological sexes is the perfect example. We've, so we've created a machinery that is,
00:36:41.600
it's not just more voices. It is a, again, it's algorithmic amplification that preferentially
00:36:49.240
spreads misinformation over the truth. It biases us toward outrage. I mean, all of these, these effects
00:36:57.620
have been, you know, at this point, endlessly described by me and others, but, and, and I'm not
00:37:03.760
saying there's, there, there, there are no benefits to, you know, every piece of this machine. I mean,
00:37:08.580
it's, it's not that social media has been bad for us across the board, but it does have this effect of,
00:37:15.040
I mean, now we have, you know, people in Congress who are there, it seems just to talk to their social
00:37:21.760
media followers, right? I mean, they're not there to govern, you know, you know, somebody like Marjorie
00:37:25.780
Taylor Greene, it's just, she's building a lunatic brand on social media from Congress. That seems to be
00:37:33.480
the project. And I worry that it's a different situation. And like, you know, so for instance,
00:37:40.980
like one thing you said before we hit this topic, which you seem to take as a, a fairly ironclad
00:37:47.680
heuristic, which is, you know, if you, in the face of a conspiracy theorist who claims that he,
00:37:53.000
he's being deprived of speech or a community that claims as much by no means do anything to deprive
00:38:00.360
such people of speech, because then you're just, you know, then you have, you know, you've,
00:38:05.400
it's like the, um, I guess the David Koresh principle, you know, that he holed himself up in a
00:38:09.880
bunker claiming the government was going to come and take his guns. And the government came to try
00:38:13.900
to take his guns and all hell broke loose. But given the dynamics of the situation, I think a
00:38:20.920
very strong case could be made for canceling certain people that is depriving them of speech
00:38:26.740
on these private platforms. And I think Alex, I think the defenestration of Alex Jones has always
00:38:32.000
made a lot of sense given what he was doing on Twitter. Yeah. And I was actually kind of shocked
00:38:38.120
that he was invited back to Twitter because, well, I mean, it's also kind of personal. I grew up in
00:38:45.240
Danbury, Connecticut and Newtown is kind of a suburb of Danbury. And one of the guys who used to come to
00:38:51.080
parties at my house lost a little girl at Sandy Hook. And so I knew from friends and family, you know,
00:38:59.140
that some of the Alex Jones insanity was leading to, you know, people showing up and targeting parents
00:39:07.240
who lost their kids. But here's the thing, that was defamation. Like, like, like, and that was
00:39:13.320
textbook defamation. And the idea of actually saying that these particular parents are engaged,
00:39:19.980
they're part of a conspiracy, they're, they didn't even lose their kids, you know, and leading to a
00:39:25.640
situation in which they were actually being targeted. That's something that, that I actually
00:39:31.280
thought that him being devastated in court was completely appropriate. And I think once you have a
00:39:36.840
finding like that against you, it would have been perfectly justifiable because that's unprotected
00:39:41.080
speech, that he not be invited back to X. But Greg, I'm sure there's a version of it where he,
00:39:46.340
let's say he hadn't named the parents specifically, but he's still talking about it in such a way
00:39:52.240
on, again, this media platform that facilitates this kind of mob-like convergence on untruth.
00:40:01.160
He's propagating his lies and, you know, his, quote, opinions in a way that doesn't meet the
00:40:07.400
standard of defamation. No one could sue him over, no one would have the standing to sue him over it.
00:40:13.180
It's a little bit like, you know, mini Holocaust denial, which we, you and I think should be legal.
00:40:18.840
And I think, to be clear, I think everything that I'm aware of, defamation aside, most of what
00:40:24.640
Alex Jones has said, I think he should be free, legally free to say, but I think any platform
00:40:30.980
should be free to say, we want nothing to do with this. And we certainly don't want to help this
00:40:35.840
maniac ruin the lives of people who have already suffered the worst, worst thing on earth. So yeah,
00:40:43.360
well, he's, he's gone, right? We've effectively buried him in a digital hole out in the desert.
00:40:48.360
I think that's totally fair game. And if I was running one of those platforms, I would have
00:40:53.260
kicked him off in the first five minutes of my tenure there. And many people beyond him who are
00:40:59.980
participating in the same kind of destruction of our, of our information wilderness, you know,
00:41:06.260
in many cases, consciously and maliciously, but in many other cases, they're just insane, right? So
00:41:11.720
the way I tend to think about this is if you shrink the scale of it, if you just imagine,
00:41:16.420
you know, if I own a restaurant with a bar, and I find out that the bartender is talking to
00:41:20.900
customers about how the, the Holocaust never happened or, you know, the Jews deserved it on
00:41:26.100
October 7th, right? Well, you know, I'm totally within my rights to fire that bartender, right?
00:41:31.180
It's just, he's destroying the brand of the restaurant, right? He doesn't get to do that as
00:41:35.760
my employee. And so it would be with a patron to the restaurant. If somebody is actually just making
00:41:41.080
it an unpleasant place to be, I should be able to kick them out, right? So how does that
00:41:46.400
change when it scales to the size of a, quote, platform like X or Meta, or does it change
00:41:54.780
in your view? Yeah. I mean, the primary concern with misinformation and disinformation as a
00:42:01.600
rationale for, for kicking people, well, certainly like, you know, as a rationale for punishing people
00:42:06.900
in a legal sense, it scares me a little bit. If you'd like to continue listening to this conversation,
00:42:12.100
you'll need to subscribe at samharris.org. Once you do, you'll get access to all full-length
00:42:18.000
episodes of the Making Sense podcast. The podcast is available to everyone through our scholarship
00:42:22.740
program. So if you can't afford a subscription, please request a free account on the website.
00:42:28.500
The Making Sense podcast is ad-free and relies entirely on listener support.