Making Sense - Sam Harris - January 23, 2024


#350 — Sharing Reality


Episode Stats

Length

42 minutes

Words per Minute

167.3

Word Count

7,093

Sentence Count

396

Hate Speech Sentences

3
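
The words-per-minute figure above is presumably just the transcript word count divided by the episode's running time. As a rough, hypothetical sketch (the actual tooling behind these stats is not documented here, and the hate-speech count would come from a separate classifier not shown), figures like these could be recomputed from a timestamped transcript such as the one below:

import re

def compute_stats(transcript: str) -> dict:
    # Collect timestamps, word counts, and a crude sentence count from lines
    # that start with an HH:MM:SS.mmm marker, as in the Transcript section.
    timestamps = []
    words = 0
    sentences = 0
    for line in transcript.splitlines():
        match = re.match(r"(\d{2}):(\d{2}):(\d{2})\.(\d{3})\s+(.*)", line)
        if not match:
            continue
        h, m, s, ms, text = match.groups()
        timestamps.append(int(h) * 3600 + int(m) * 60 + int(s) + int(ms) / 1000)
        words += len(text.split())
        sentences += len(re.findall(r"[.!?]+", text))
    duration_min = (timestamps[-1] - timestamps[0]) / 60 if timestamps else 0.0
    return {
        "minutes": round(duration_min),
        "word_count": words,
        "sentence_count": sentences,
        "words_per_minute": round(words / duration_min, 1) if duration_min else 0.0,
    }

# Example usage (hypothetical file name):
# print(compute_stats(open("episode_350_transcript.txt").read()))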


Summary

Sam Harris speaks with Jonathan Rauch and Josh Zeps about Rauch's book, The Constitution of Knowledge: A Defense of Truth, both as a defense of truth in its own right and as a lens through which to look at some current events that worry all of us. They discuss the fragmentation of society, the state of the mainstream media, diversity of viewpoints, the threatened reality-based community, what the COVID pandemic did to our information landscape, the unique challenge of Trump and Trumpism, the dangers of a second Trump term, the problem of immigration and controlling the southern border of the U.S., and other topics. Jonathan Rauch is a senior fellow at the Brookings Institution in Washington, D.C., the author of eight books and many articles on public policy, culture, and government, a contributing writer for The Atlantic, and a recipient of the 2005 National Magazine Award, the magazine industry's equivalent of the Pulitzer Prize. Josh Zeps is an independent journalist who was a founding host of HuffPost Live, hosted a national morning television show in Australia and a radio show on ABC Radio, and is now full-time on his own platform, the podcast Uncomfortable Conversations with Josh Zeps; he joins Sam as co-host of this interview. The Making Sense Podcast does not run ads and is made possible entirely through the support of subscribers, so if you enjoy what we're doing here, please consider becoming one; free accounts are available through the scholarship program for anyone who can't afford a subscription.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if
00:00:11.640 you're hearing this, you're not currently on our subscriber feed, and will only be
00:00:15.580 hearing the first part of this conversation. In order to access full episodes of the Making
00:00:19.840 Sense Podcast, you'll need to subscribe at samharris.org. There you'll also find our
00:00:24.960 scholarship program, where we offer free accounts to anyone who can't afford one.
00:00:28.340 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:32.860 of our subscribers. So if you enjoy what we're doing here, please consider becoming one.
00:00:45.100 Welcome to the Making Sense Podcast. This is Sam Harris. I'm fighting my way through a
00:00:51.180 respiratory virus here, so I will keep this short. Today I'm speaking with Jonathan Rauch
00:00:56.520 and Josh Zeps. Jonathan is a senior fellow at the Brookings Institution in Washington.
00:01:02.960 He is the author of eight books and many articles on public policy, culture, and government. He is a
00:01:10.380 contributing writer for The Atlantic and a recipient of the 2005 National Magazine Award, which is the
00:01:17.120 magazine industry's equivalent of the Pulitzer Prize. His latest book is The Constitution of Knowledge,
00:01:22.840 A Defense of Truth. And that is one of the focuses of our conversation today.
00:01:28.980 Josh Zeps is an independent journalist. He was a founding host of HuffPost Live. He also hosted a
00:01:35.260 national morning television show in Australia and a radio show on ABC Radio. And now he's full-time
00:01:41.480 on his own platform, the wonderful podcast Uncomfortable Conversations with Josh Zeps.
00:01:46.660 Josh and I have collaborated in the past. We did some live events in Australia. And I wanted to
00:01:53.700 bring him on to co-pilot this interview with me, with Jonathan. As I said, we talk about Jonathan's
00:01:59.920 book, The Constitution of Knowledge. We talk about the fragmentation of society, the state of the
00:02:06.180 mainstream media, diversity of viewpoints, the threatened reality-based community, what the COVID
00:02:13.020 pandemic did to our information landscape, the unique challenge of Trump and Trumpism, the dangers of a
00:02:19.540 second Trump term, the problem of immigration and controlling the southern border of the U.S., and other
00:02:26.300 topics. And now I bring you Jonathan Rauch and Josh Zeps.
00:02:31.740 I am here with Jonathan Rauch and Josh Zeps. Jonathan, Josh, thanks for joining me.
00:02:44.540 Thank you. Thanks so much.
00:02:46.620 So let me explain the structure here. I've invited Josh, who I think most of my audience will already
00:02:53.760 know, to co-pilot this interview with me. This is just for the fun of it and also to get the most out
00:03:00.820 of you, Jonathan. But let me just have you both just kind of introduce yourselves here. Jonathan,
00:03:07.960 how do you describe what you do? We're going to talk mostly about your book, The Constitution of
00:03:13.580 Knowledge, A Defense of Truth, which was a fantastic defense of truth as advertised and also use it as
00:03:21.360 a lens through which to look at some current events that I think worry all of us. How do you
00:03:27.260 summarize your career as a writer and journalist? Oh, well, that's exactly how I summarize it. I'm a
00:03:32.880 writer and journalist. I'm sometimes mistaken for an academic and called doctor. But my highest degree
00:03:39.680 is a bachelor's degree in history. I started out actually in a newspaper, which is how people did
00:03:45.560 start out back in the day and have done magazine work and written books on a lot of subjects. But
00:03:52.080 unlike some journalists, I'm comfortable in the world of theory. And so I do a lot of that.
00:03:58.440 Yeah, you should have gotten a PhD for this book. Your discussion of the foundation of knowledge and
00:04:04.580 just the way you marry the principles that safeguard our scientific epistemology and political
00:04:12.220 liberalism is just fantastic. So congratulations there.
00:04:16.260 Well, coming from you, it means a lot. Your work has been an inspiration and indeed is quoted
00:04:22.600 multiple times in the book.
00:04:25.020 Nice, nice. Well, Josh, remind people who you are.
00:04:28.120 I hereby bestow upon Jonathan an honorary doctorate from the School of Uncomfortable
00:04:32.260 Conversations. There you go, Jonathan. I'm an independent journalist.
00:04:37.300 Yeah, I just won't ask you to perform any cardiology if we're on a plane and someone calls out for a
00:04:42.280 doctor.
00:04:42.540 Just don't make me write a speech.
00:04:44.180 I'm an independent journalist. I have just left the legacy media where I was hosting a daily talkback
00:04:51.120 radio show on Australia's national public broadcaster for the past couple of years.
00:04:56.400 Prior to that, I anchored a sort of morning television show in Australia and I spent most
00:05:02.500 of my professional life in New York City. You may detect from my accent that I'm not from there.
00:05:07.940 I'm Australian. And in New York, I was a founding host and a producer of HuffPost Live,
00:05:13.820 which was this sort of big experiment to try to produce thousands of hours of live streaming,
00:05:20.020 talk, television. And so I hosted thousands of hours of content with interesting people there and
00:05:26.220 then decamped back to Australia when I had kids and got married a few years ago. And now I've gone
00:05:32.320 independent after finding it sort of intolerable. I guess I was somewhat pushed out of legacy media in a
00:05:39.260 story that's kind of parallel to Bari Weiss's or a story that has been replicated in many instances.
00:05:45.540 People who don't feel like the legacy media is doing a terribly good job of encapsulating the
00:05:50.820 full rambunctiousness and excitingness of all the conversations that we could be having find
00:05:56.880 themselves excluded from the party. And at some point I just said, you know, stuff this. So
00:06:01.460 Uncomfortable Conversations is now my gig.
00:06:03.180 Nice. Well, it's great to have you here. And you've got the truly professional radio voice that
00:06:08.820 will keep us on the straight and narrow here. I've got a respiratory virus going on, so I
00:06:13.920 happily will not catch my book.
00:06:15.440 Yeah, I definitely don't have that. If I may, I forgot to mention two institutional affiliations.
00:06:21.720 One is I'm a senior fellow at the Brookings Institution in Washington, and the second is I'm
00:06:27.660 a contributing writer of The Atlantic.
00:06:30.020 Yeah. Yeah. Okay. Well, so, you know, this is, I don't tend to do straightforward interviews. These
00:06:36.240 really are conversations. And so, you know, Josh and I can be expected to take up a fair amount of
00:06:41.940 time, but really we're trying to get the most out of you, Jonathan. Just to kind of open with my
00:06:46.540 concerns here, I think many of us sense that the moral intelligence of the West appears to be
00:06:53.500 somewhat exhausted. And we can see this both on the right and the left. I mean, the fact that
00:06:59.580 on the populist right, people can't seem to see that we have any stake in, I'm speaking from more
00:07:06.220 or less an American perspective here, people just can't seem to see that we have any stake in reducing
00:07:10.900 the danger and the chaos that is happening outside our borders. They seem to think that we should
00:07:15.740 become a nuclear-armed Switzerland of some kind. And even a phrase like the rules-based international
00:07:22.680 order is now sneered at as a piece of neoconservative or neoliberal cant. And on
00:07:29.960 the populist left, we have people who can't seem to distinguish between civilization and barbarism,
00:07:37.020 as we witnessed after October 7th. I mean, and they show no inclination or capacity to defend the
00:07:42.640 former. And this relates to the topics you deal with in your book. You know, these, this kind of
00:07:48.400 unraveling relates to the foundations of our knowledge, our ability to have a, anything like
00:07:53.960 a shared consensus about what's happening in the world. And it relates to the hope that we
00:07:58.480 one day may live in a world where people everywhere can agree about the basic norms and values that
00:08:05.160 allow for a truly open-ended form of cooperation among 8 billion strangers. So I thought we could start
00:08:12.460 with your book and there's two phrases in your book that do a tremendous amount of work. So I'd like you
00:08:18.340 to explain both of them and how they are connected. And the first is the title, The Constitution of
00:08:25.060 Knowledge. And the second is The Reality-Based Community. What do you mean by those phrases and
00:08:31.180 how are they connected?
00:08:32.540 So the constitution of knowledge is the system of norms and institutions that we in liberal societies
00:08:44.280 rely on to keep us anchored to some common version of reality. They don't require us to agree on
00:08:51.300 everything. In fact, they require us to disagree because disagreement is where knowledge comes from.
00:08:57.280 That's what allows us to check each other's errors, those different perspectives. But we do have to
00:09:02.360 have a set of rules. What I'm pushing back against there is the view that I started with 30 years
00:09:09.880 ago in an earlier book, which is called Kindly Inquisitors, The New Attacks on Free Thought. And
00:09:14.760 it's a good book, but it kind of starts and ends where most people start and end, which is
00:09:20.980 the marketplace of ideas and John Stuart Mill, which is all you need is a free environment and public
00:09:28.000 criticism where people are allowed to say things to each other and correct each other and knowledge
00:09:33.660 will appear as if by magic. That's not a terrible model. I love the marketplace of ideas metaphor. I
00:09:42.360 use it and I'm a fan of John Stuart Mill. But the part we forgot, partly because that system was so
00:09:50.000 successful for so long, is it doesn't just happen. You need a lot of rules and a lot of structure
00:09:56.280 to get people to disagree in ways that are productive. And that requires a lot of rules.
00:10:04.720 It turns out they look very, very much like the rules for the U.S. Constitution. For example,
00:10:11.360 the U.S. Constitution pits ambition against ambition, as we know from the Federalist. Constitution of
00:10:19.080 Knowledge pits bias against bias. They're both open-ended systems, which never allow a final say
00:10:27.180 or a final destination, and so forth. So that's the Constitution of Knowledge. So the Constitution of
00:10:35.040 Knowledge doesn't govern everything we do in life. It doesn't decide what we can say at the dinner table,
00:10:41.720 Thanksgiving. It doesn't apply in church. As you have tirelessly pointed out, most of the beliefs of
00:10:47.900 most major religions would flunk the Constitution of Knowledge because they're not replicable.
00:10:52.560 The areas, the fields that do adhere to the Constitution of Knowledge are what I call the
00:10:58.380 reality-based community. These are the spheres, mostly professionals, that do the work every day of
00:11:06.280 developing what we think of as objective knowledge. And the big four there are academia, science,
00:11:13.360 research, all of that. That's number one. Second is media, mainstream media, reality-based
00:11:20.060 media. That's my world. I think probably that's your world, certainly Josh's world. The third is law.
00:11:28.280 People forget that the idea of a fact comes not from science. It predates that. It comes from law
00:11:33.700 because courts needed a count of the facts that people could agree on in order to settle
00:11:38.700 cases. So they came up with these adversarial systems of fact-finding, evidence-based.
00:11:45.420 And the fourth is government. Our government, all liberal governments, are just shot through
00:11:51.900 with institutions and rules that keep them tethered to reality. Everything from the Administrative
00:11:58.640 Procedures Act to the many research agencies, the inspectors general, the Justice Department,
00:12:04.400 which, for example, has to be fact-based. If a government stops being fact-based, it becomes
00:12:09.180 tyrannical. It's just that simple.
00:12:11.920 It's fascinating, Jonathan. I mean, one thing that I pick up from what you're saying is that there is a
00:12:17.760 gulf in our experience of how knowledge gets accumulated if we haven't worked in one of those
00:12:25.840 environments that you just pointed to. When I'm talking to friends who have never worked in a
00:12:31.360 newsroom or a science lab or an academic institution or a court of law or the public service,
00:12:39.140 for example, where they're not familiar with the processes that are in place to sort fact from
00:12:45.620 nonsense, then I think there can be an assumption that the reason why we're losing our way and why
00:12:52.380 there's so much bullshit, pardon the French, floating around at the moment is because bad actors just
00:12:58.300 aren't doing a good enough job of being honest. The problem, as you articulated, is actually
00:13:05.280 thornier than that. It's not that there are bad people who are being dishonest. It's that we have
00:13:10.400 systems in place that are ignorant of the countervailing systems that are required in order
00:13:15.340 to filter the best ideas. If you've never worked in a newsroom and never had an editor come to you
00:13:21.740 and say, you don't have it yet, you don't have that story yet, you've got the shape of the story,
00:13:27.200 but I need two more sources and we need someone on the record about this, then it's hard to understand
00:13:32.720 the kind of framework of truth seeking that supports and buttresses ourselves. I mean,
00:13:39.360 I speak from personal experience here because I'm intentionally stepping out of that environment and
00:13:42.660 I'm requiring myself to erect this edifice of truth seeking on my own and kind of build the plane
00:13:48.940 while it's flying. And Sam has done essentially the same thing of holding oneself accountable.
00:13:55.100 But what do you want to say to people who've never been in that environment and who just think,
00:13:58.880 you know, well, the problem is that people are being nasty and lying?
00:14:03.660 This is a bad subject for me because I get defensive. I am old media incarnate. I work for
00:14:11.780 the Atlantic, which is what, 1857. I used to work for The Economist, 1843. And I am a believer
00:14:20.840 that there is a reason for all of those rules and norms, those layers of editing and copy editing and
00:14:28.280 fact-checking that go into a traditional media establishment. It's very expensive. It's very
00:14:34.040 exacting. I just went through two days of fact-checking at the Atlantic. It's an exhausting process.
00:14:42.540 And what's frustrating for me, and maybe for you guys too, I don't know, is people out there in
00:14:49.160 the world understandably think it's just easy. You know, why don't we just write what's true and not
00:14:54.120 write what's false? Why are we so biased? Why do we have all these blind spots? Why did we miss,
00:14:58.520 you know, the distress that was leading to the election of Donald Trump? Why are we too far left?
00:15:03.220 Why are we too far right? And there are certainly valid criticisms. There needs to be, I think,
00:15:09.880 more diversity, ideological in newsrooms. But what we are tasked to do every day is really hard. Come
00:15:19.280 up with some coherent, accurate account of reality in a complicated world on very, very tight deadlines.
00:15:26.020 You know, scientists get two years to do what we have to do in two hours. So I guess I'd be whining
00:15:31.640 to say the rest of the world should be more appreciative. But the truth is, that's how I feel.
00:15:35.580 Well, as you pointed out, I think the phrase that you use in the book is, the turbulence is a source
00:15:42.060 of stability, both politically and epistemically, right? So it's the fact that political factions
00:15:48.160 can oppose one another and government is divided. No one has all the power. That fractiousness is what
00:15:57.020 keeps the plane flying. And epistemically, the fact that scientists are in the business and journalists
00:16:02.780 are in the business of proving one another wrong based on their own biases. I mean, all of that
00:16:07.420 works to the advantage of truth and liberty on the political side in the end, except when it doesn't,
00:16:16.360 except, I mean, there's only so much turbulence that the system can use to its advantage. And I think what
00:16:22.420 many of us are now worried about, and you certainly appear worried in your book, is that the current
00:16:28.100 state of media, and journalism in particular, and social media, and the way in which the layer of
00:16:33.740 social media is interacting, and putting pressure on media, it's just made this, the information
00:16:40.740 landscape a kind of hallucination machine, right? It's no longer tracking truth, or it's no, I think
00:16:48.220 your phrase is something like, there's no longer a positive epistemic valence to all this chaos,
00:16:54.480 or it seems certainly reasonable to worry that there's not. And, you know, in the book, you talk
00:17:00.720 about how fear of new media is really quite old. I mean, it's as old as writing, and it's certainly
00:17:06.100 as old as the printing press, which, you know, quite infamously stoked the fires of the Inquisition
00:17:12.640 and the wars of religion. The Malleus Maleficarum, which was that great witch-finding manual of the
00:17:17.980 15th century, was one of the first bestsellers. So I'm not sure at what point our nostalgia for
00:17:24.880 the past should be focused. I mean, I think you also go into some detail in the book that
00:17:29.760 the early days of American journalism were pretty ugly, and fake news was really the standard of the
00:17:36.000 time. But at a certain point, that changed. And most of us, as you just suggested, are nostalgic
00:17:43.460 for some moment in the past when journalism seemed to be run by at least a sufficient number of adults
00:17:52.480 in the room. Do you think that was always an illusion? And if not, how do you judge the current
00:17:59.780 state of journalism? Perhaps we should take social media as a separate piece.
00:18:03.040 Yeah, I was going to say one has to disaggregate. So I'd be very interested in Josh's view. He's closer,
00:18:10.000 actually, to the newsroom these days than I am. I'd say you have to disaggregate, and that the core
00:18:16.900 values of mainstream media, places like the network newscasts, the Wall Street Journal, Washington
00:18:22.980 Post, New York Times, LA Times, those places still have their values intact, and they're struggling to
00:18:29.920 defend them economically. And there's a huge crisis, and this doesn't need any embellishment on this show,
00:18:37.520 but there's a huge problem with the business model, which is checking reality is extremely
00:18:43.380 expensive. People think, you know, Hunter Biden's laptop, that came in 10 days before the election,
00:18:49.580 why wasn't it checked? Well, it took a team of Washington reporters the better part of a year
00:18:54.780 to check a handful of the material. It's just very expensive and time-consuming to do this work.
00:19:02.220 But there, I think that's the issue. I don't think it's really, we've seen the kind of corruption of
00:19:06.500 values in mainstream media on anything like the scale that we've seen it in parts of academia,
00:19:11.540 which really has become very politicized. Others may disagree. They may say that I'm,
00:19:16.180 you know, kind of whitewashing mainstream media, so we can have that conversation.
00:19:20.380 Did you read Bennet's Economist piece on the post-mortem from his firing from the New York
00:19:27.160 Times? Yes, I did. Yeah, okay. Yeah, I just want to make sure that those facts were in evidence.
00:19:32.460 John, I can hear the new media listener, the younger listener, perhaps, saying to you mentally,
00:19:41.780 I mean, okay, so maybe it takes a year to absolutely fact-check every aspect of the
00:19:46.260 Hunter Biden laptop story, but why is that the bar for me as a voter to find out about it?
00:19:51.420 Like, why can't we just have different categories of truth claims? And okay, if the New York Times
00:19:56.900 doesn't want to publish something until they've absolutely got something where they can take it to
00:20:01.140 the bank, that's fine. But I want to live in a media universe where I have access to information
00:20:05.660 that might be true and that might be relevant. And so there has to be a mechanism by which I can
00:20:10.200 know about the Hunter Biden laptop story without it being hidden from me because my grand poobah
00:20:16.000 overlords say that I'm too stupid to be able to sort fact from fiction. And we don't necessarily want to
00:20:22.300 yield that space to the Tim Pooles and Alex Joneses and Mike Cernoviches of the world. There needs to be
00:20:28.620 some other mechanism by which we can ascertain things that may be true without requiring them
00:20:34.020 to meet the standard of traditional media. So I'll give you the traditional answer to that.
00:20:39.260 I'm not sure how younger people will like it, but this is not a new problem. Journalists have been
00:20:47.020 wrestling for over a hundred years since the age of the yellow press and the gutter media with,
00:20:53.000 what do you do with salacious gossip? And the old rule was, well, you just print it because why not?
00:20:58.620 It's fun. People read it. They eat it up. And then we got a different kind of system that emerged
00:21:05.140 and we got defamation lawsuits and we got schools of journalism and lots of rules that said, if it
00:21:11.140 isn't true, don't print it. That's your responsibility. And we wound up with kind of a multi-tiered system
00:21:16.120 where you had highbrow journalists, places like the New York Times. And if you saw it there, it was very
00:21:21.100 likely to be checked and true. You know, they got stuff wrong, of course, but pretty, pretty darn
00:21:26.680 responsible. And then you had the tabloid media and the gutter press and the gossip mags and the
00:21:32.660 gossip columnists. And that's where the other stuff circulated. So people knew about it, but there
00:21:38.200 were these different sort of levels of gatekeeping and credentialing. And that seemed to work pretty
00:21:43.540 stably for a while. The problem today, of course, is who makes those decisions and why. You know,
00:21:51.020 it's really tough. For example, what would you do with the Steele dossier? So this is a pile of
00:21:58.880 unverified gossip. And that's all it even claims to be. Some guy goes out and collects a lot of
00:22:05.840 gossip and writes it down because that's what he's been hired to do. He's not even saying it's true.
00:22:09.680 He's just saying he's heard it. And then this circulates and it seems like everyone around
00:22:13.940 Washington has seen it or read it. I didn't, but apparently a lot of people did. And then
00:22:19.620 no one's publishing it because it's against the policy of the New York Times and the Washington
00:22:24.880 Post to publish salacious, unverified, possibly completely false gossip about anybody, including
00:22:30.840 Donald Trump. So they're trying to do the responsible thing. But then BuzzFeed says, well,
00:22:37.460 to heck with that. Everyone's reading this. The public should be able to read it too. They published it.
00:22:41.920 And I think that was the wrong decision. Maybe that's old fashioned of me, but a lot of people
00:22:47.760 disagree. Ben Smith clearly disagrees. And I don't think we'll ever have a pat answer to this question.
00:22:54.920 But I will tell you that I think that not thinking that this is a difficult question and that absolutely
00:23:00.260 everything alleged by anyone should be immediately published and transmitted around the world is a
00:23:04.860 good answer. I don't think that's a good answer. But I don't know. What do you think? I mean,
00:23:09.620 you worked at ABC. I mean, the challenge is that once the BuzzFeed publishes the piece,
00:23:15.600 then the New York Times and the Washington Post and the Wall Street Journal have to report on the
00:23:19.500 existence of the controversy about the publication of the piece. So there's this meta story about the
00:23:24.020 story, which it sort of would be derelict not to talk about because everybody's talking about the
00:23:28.540 fact that BuzzFeed has published this thing. So then you get this weird situation where readers of
00:23:33.120 the legacy media are saying, hang on, they're talking about how this other place has published this
00:23:37.960 other thing, but I don't even know what the underlying story actually is. I mean, it's tricky
00:23:43.680 and it complicates. You know, one thing that I'd love you to talk about, and Sam, forgive me if I'm
00:23:49.500 sort of hijacking this in a sense, but I'm interested in, you talk about this funnel of knowledge that
00:23:56.560 journalism is up to and academia and the other institutions that you're talking about where
00:24:00.200 you want as large as possible a mouth of the funnel where there is total free speech and everyone
00:24:06.200 can say whatever they want. And that goes into the funnel. Then the funnel starts cranking away and
00:24:09.600 doing its job. And at the bottom of the funnel, you have these pearls of wisdom. You have these
00:24:14.000 Willy Wonka everlasting gobstoppers of truth spitting out the bottom of the machine. The problem at the
00:24:19.460 moment, as I have seen it working in the legacy media, is that there are constraints on what is
00:24:27.240 going into the top of the funnel that the people who are imposing the constraints aren't even aware of
00:24:31.580 as constraints. They don't identify their worldview as being a worldview. They don't identify their
00:24:38.580 opinions as being opinions. You know, I had a run in with management at one organization because we
00:24:44.640 had a difference of opinion about gay pride. And John, you and I are both gay. We have our own
00:24:50.700 differences of opinion, probably not with each other, but with the rest of the gay universe.
00:24:54.540 The extent to which, you know, oiled up muscled men sitting astride giant inflatable penises going
00:25:02.620 along on a march is constructive or useful to young people who are trying to sort out their sexuality.
00:25:08.140 I was trying to articulate that point. I'm glad you got that phrase out. I use that phrase once every
00:25:12.940 podcast. So I'm glad we got it at the top here. And my employer was fully 100% pro pride. In fact,
00:25:23.680 was the official broadcaster of pride. And I got into a run in where they weren't allowing me to
00:25:29.460 articulate this point of view because they said that hosts, you know, on air talent are not allowed
00:25:34.380 to express opinions about controversial social or political or cultural events. Now, everybody else
00:25:40.080 on the air was expressing their support for pride, but that doesn't land as an opinion for them.
00:25:47.160 That's just common sense. That's just being on the right side of history. That's just not being nasty.
00:25:51.000 So the funnel ends up being curtailed in ways that they don't even notice it being curtailed. They're
00:25:56.240 like, Josh isn't allowed to have his crazy opinion because that's a crazy opinion, but our beliefs
00:26:00.640 aren't opinions. Our beliefs are just the truth. Well, so you need three things to make the
00:26:05.660 constitution of knowledge work, and you need all three. And the first and most obvious is free
00:26:10.400 thought, free inquiry, marketplace of ideas, enough said. The second is you need the discipline of fact.
00:26:16.560 You need a lot of people who are willing to follow a lot of very difficult rules governing
00:26:22.060 who is allowed and not allowed to claim this or that as fact, and under what circumstances. When is
00:26:29.500 an experiment considered replicated? When does a newsroom go with the story? Under what circumstances
00:26:35.340 does it correct it? And all that discipline of fact is super hard and requires years of training.
00:26:39.620 But then there's a third, and that's diversity of viewpoint. If everyone in a room is coming from
00:26:47.540 the same place ideologically and sharing the same assumptions epistemically, then no learning will
00:26:55.600 take place because these people will not be able to see each other's mistakes. The whole system works
00:27:01.980 because diversity of viewpoint allows me to see your biases and you to see mine because we can't see our
00:27:08.640 own. And yes, one of the problems that I worry about, and I know you worry about in journalism, but
00:27:18.120 especially in sectors of academia like the social sciences and humanities, is the lack of viewpoint
00:27:25.500 diversity. And the first symptom of that is when everyone agrees that something which is in fact quite
00:27:32.000 contentious, for example, that human sexes are on a spectrum, not binary, for example, when they see
00:27:40.060 that as not even contestable, that's telling you there probably aren't enough voices in the room.
00:27:47.200 But that's a solvable problem, right? That's not inherent to the model of journalism. That means that
00:27:52.920 you alluded to your newsroom earlier, I guess maybe I shouldn't call out any particular
00:27:57.680 outlet. But the implication is that those people you were working with need to hire some people from
00:28:04.920 different educational backgrounds with different ideological priors for the sake just of professionalism,
00:28:11.560 just of doing the job correctly. And yeah, we've fallen down on that. There has been an effort
00:28:16.080 in American newsrooms to diversify intellectually. But we've got a long way to go.
00:28:22.220 It's funny that you use the word diversify, because diversity is the lodestar, the goal of all of this.
00:28:27.460 But the diversity that they're looking for is a diversity of skin color and genitals,
00:28:31.660 not a diversity of thought. Or class or economic background. An editor at a
00:28:35.240 major magazine you've all heard of told me a couple of years ago that the resumes, of which there are
00:28:42.280 many, come to him through a funnel of some sort. And he said he gets 25 or 50 versions of the same
00:28:48.780 resume. And that's hard to change. Isn't there some explanation for a weird class filter in
00:28:57.760 journalism in particular? Because to become a journalist, often the first rung on the ladder
00:29:04.460 is sort of the unpaid or underpaid internship at some wonderful institution. And the only people who
00:29:11.560 can do that are the people who are taking their summers between their years in an Ivy League
00:29:16.880 institution and it's all funded by their rich families and et cetera?
00:29:21.440 Well, you're cutting close to the bone, Sam, because in 1981, I started my journalism career
00:29:26.920 as a summer intern and was unpaid for that summer here in Washington, D.C. And I could afford to do
00:29:33.860 that. And yes, shame on unpaid internships. You know, I guess I'd be curious for your views on that.
00:29:41.920 In some ways, we're better in that respect because there's so many more paths in now.
00:29:49.020 You know, there are all these 20-somethings that have substacks and podcasts and get noticed through
00:29:55.760 these alternative channels that don't require you to be well-heeled. I think the problem has more to do
00:30:04.220 with the kinds of people who are attracted to journalism and for that matter, you know, anthropology
00:30:09.480 and some of the self-selection that's going on there. It's going to take positive effort to go
00:30:16.320 out and look at state schools that you've never hired from and where you're not getting referrals
00:30:22.240 from professors and saying, okay, who here could be a journalist? Maybe someone with an unconventional
00:30:28.280 background. When I entered journalism in—my first job was in North Carolina now 40 years ago—there
00:30:35.280 were still the last remnants of what we thought of as blue-collar journalism. I don't know if
00:30:39.800 you've heard that phrase. But journalists, reporters were not always people with Ivy League
00:30:44.820 degrees or, you know, Swarthmore humanities backgrounds. They were just good writers who
00:30:51.300 showed up and did the work. And there was this wonderful old reporter named Jesse Poindexter
00:30:56.600 who covered the courts. He'd been covering the courts for like a generation. And he knew where
00:31:02.420 everybody in the county was buried. And he knew what was going on with every corrupt cop and judge.
00:31:07.500 And he wrote like a dream. But I can tell you, he was not the product of Yale. And we were better for
00:31:15.340 that. It seems to me there's a tension between this call for diversity and another point, which I think
00:31:23.900 you make in your book, which is that, at least at some layer, the liberal epistemic order relies on elite
00:31:33.960 consensus, right? We need elites who are qualified to judge the truth claims in, you know, in their
00:31:43.220 area of specialization. We need institutions that create the norms that allow that machinery of truth
00:31:50.300 testing and fallibilism to operate, you know, intergenerationally. And we need a population
00:31:57.080 of, you know, by definition, non-elites with respect to any specific area of specialization
00:32:04.320 that trusts those institutions, to trust their products. It's not to say that they're not capable
00:32:10.360 of error, but the error correction within physics is going to come from physicists, you know, and people
00:32:16.860 who have taken the time for, you know, on the basis of whatever advantages they've had, but, you know,
00:32:22.900 intellectually above all, quantitatively above all, to actually play that language game to the point of
00:32:29.340 being able to produce some work product that the rest of us can rely upon. Again, to a first
00:32:35.720 approximation, all the while knowing that, you know, again, you make this point beautifully in the book
00:32:41.180 at some point where, you know, truth is not a destination, it's a direction. It's like a north
00:32:46.000 on the compass. It's not that you arrive at the North Pole and you're done. It's just you have to
00:32:51.680 navigate. So we need people who are adequate based on their expertise to provide a conversation about
00:32:59.460 reality that is directionally correct. And you have another sentence in the book, which I underlined as
00:33:06.300 really, it sums up more or less everything that concerns me intellectually, ethically, politically,
00:33:14.340 even spiritually. And it's such a simple sentence, but it's when I hit upon it, I just, you should
00:33:21.000 have seen my face. And the sentence is, there is only one reality-based community, right? And that is
00:33:28.040 such a deep insight. It says everything about the situation we are in and the degree to which it's
00:33:35.340 misconstrued in so many fashionable disciplines. And it says everything about the unity of knowledge
00:33:41.640 and the possibility of consilience between disciplines, however disparate. And yet, I think
00:33:47.460 we're now living in a world where, you know, based on the algorithmic derangement of more or less
00:33:55.020 everybody, we're losing our connection to that even as an ideal. I mean, we're certainly losing our
00:34:02.840 grip on a shared civic reality. And largely, I mean, I think social media is to blame, but I think
00:34:07.880 alternative media is largely to blame. And what I continually call podcastistan, it's not functioning
00:34:14.900 by the same norms and scruples of traditional journalism. I mean, people are just freewheeling
00:34:21.340 in front of the microphone and platforming anyone who has an opinion. However, you know, and they're just,
00:34:27.320 they're in no position to debunk the confabulation of maniacs in real time. And therefore, this stuff
00:34:35.180 just gets believed at scale. And so it does seem like a new moment to me where you have,
00:34:39.880 and we can talk about it through the lens of any specific problem. I think I'd like to talk about
00:34:43.940 how you both view what COVID has done to us so we can get to that. And I think we should cover
00:34:50.420 politics as well. But, you know, feel free to react to what I just said. Well, to which part of what you
00:34:57.540 just said? The two of you are at least as well positioned as I am to assess kind of how we're
00:35:07.260 doing at scale on the big question of society's attachment to reality. When I look at it, I see a
00:35:15.040 landscape in which you still have large, large parts of the reality-based community, the legal
00:35:22.080 professions, lots of academics who are not corrupted, lots of mainstream journalists, lots
00:35:27.920 of lawyers, as we saw in the Trump administration, who really are hanging on and trying to defend the
00:35:34.140 norms of the constitution of knowledge. Simple stuff, but important stuff, like you don't go into
00:35:38.740 court and lie and you get sanctioned if you do. Ask Sidney Powell. So there's still a lot of
00:35:44.380 institutional integrity that is trying to defend itself. It often doesn't know how to defend
00:35:51.100 itself. As exhibit A, I would submit three university presidents who bombed in Congress who said the right
00:35:59.600 thing about what their policies were, which is it depends on context to know if the speech is allowable,
00:36:06.020 but did not know how to make the case behind that statement. So that's on us. It's on us liberals
00:36:13.380 to do a better job of defending these principles and understanding these principles. And that's why I
00:36:18.920 wrote the book. But then you have all that other stuff out there, which you allude to, Sam. And that's
00:36:25.300 the big, bewildering, blooming, buzzing, chaotic, anarchic world of social media and blogs and the fact
00:36:34.300 that anyone on Twitter can project a voice. And there you have a problem which is not new. It's
00:36:42.460 very old, but it's been amplified by these technologies, which there are a lot of ways to
00:36:48.220 manipulate humans cognitively. Even, by the way, very smart humans, such as the three of us. We can
00:36:56.060 be manipulated in all sorts of ways that our attention can be hijacked. That's what trolling does.
00:37:01.640 You know, if I say enough terrible things about Sam Harris, he's going to have to respond or his
00:37:05.960 reputation will be damaged. Or things like just repeating lies and so forth. Firehose of falsehood
00:37:13.560 disinformation campaigns. There are just all kinds of things you can do in an environment that's
00:37:19.260 completely unregulated by institutional norms. And people are doing them. It's not the first time this
00:37:25.300 has happened. It takes a while for institutions to figure out rules of the road and how to re-norm.
00:37:32.660 And I'm not completely sure how that happens this time or whether it happens this time. But I agree
00:37:40.500 with you that the environment in which those committed to the constitution of knowledge are swimming
00:37:46.580 is, it's very challenging right now. I find it somewhat terrifying, the media landscape
00:37:55.220 that I'm entering, I suppose. Because, John, it's not just that, it's not, the problem is not just the
00:38:01.620 way that you articulate it, which is that there is, you know, we're in an environment that's unregulated
00:38:07.040 by institutional norms, by the kinds of productive fact-checking that you talk about in your book.
00:38:12.940 It's not an unregulated environment. It's an environment regulated by precisely the wrong
00:38:17.980 incentives. Algorithms are encouraging us to produce content that maximizes people's time spent
00:38:25.240 on apps. That means that they want us to engage. That means that they want us to like posts,
00:38:31.720 share posts, comment on posts. And that is agnostic as to whether or not we're doing those things
00:38:38.140 because they reinforce what we already believe, or because they caricature and demonize things that
00:38:43.400 we don't believe. But it has to be one or the other. If you're in the nuance game, if you're in
00:38:48.220 the game of truth is not a destination, it is a process, then you are leaving a lot of listeners
00:38:54.660 on the table who you could be getting. I mean, Sam and I could both have bigger audiences. I don't know
00:39:01.420 how you get a bigger audience than Sam's. But we could have a bigger audience if we decided to
00:39:05.780 try to poke at elite consensus a bit more and be really edgy and follow these dissenting voices who
00:39:12.260 are sticking it to the man and raising things that nobody else is brave enough to talk about.
00:39:17.420 I mean, I think there are legitimate conversations, and clearly Sam does as well, that we need to be
00:39:21.320 brave enough to talk about. You just alluded to there being two sexes, John, which is going to
00:39:25.440 get us all fired and canceled, obviously. But, you know, so there is a space in which there are blind
00:39:30.760 spots that the legacy media has that we can step in and constructively fill. But the majority of
00:39:36.100 what's going on online is being driven by, like, if you're a young journalist starting out, just to
00:39:41.560 come back to the question of an unpaid internship, for example, if you're starting out on YouTube or
00:39:46.000 podcastistan, the easiest thing to do is to try to point out how shady and suspicious elite
00:39:53.020 institutions are, including the mainstream media, and to have on a bunch of people who are conspiracy
00:39:59.360 theory adjacent. I mean, the last time of one of the few times that I've accidentally blown up the
00:40:05.540 internet was I've done Joe Rogan's show seven times. And the last time I was on here, and I got into a
00:40:10.560 spat about vaccines and about whether vaccines cause myocarditis in certain populations at a rate that
00:40:18.000 is higher than COVID does. And it was one of these just arcane moments that momentarily everybody
00:40:23.820 looks at because there's a conflict about a subject where there hasn't been an honest reckoning in
00:40:29.520 podcastistan. There's like these two rival points of view. One is the mainstream media and elite
00:40:35.680 institutions are shutting us up, which to some extent is true because legacy media, when it feels
00:40:40.920 threatened by alternative narratives, like lockdowns are going too far, or there are problems with
00:40:46.420 vaccines that aren't being properly articulated by public health bureaucrats, then the legacy media
00:40:52.600 responds by circling the wagons and closing ranks and trying to insist that it has the one truth,
00:40:57.260 which just makes people more suspicious and makes them go to non-legacy independent media outlets that
00:41:02.520 are then cashing in people's curiosity and desire for salacious conspiracy theories by feeding them
00:41:08.700 nonsense. So you've got the constitution of knowledge funneling people towards their everlasting
00:41:14.460 gobstopper of truth. And then you've got this reverse funnel, which is pushing people into a climate of
00:41:20.820 bullshit. And the in-between space is one that we have to foster and water and tend and grow.
00:41:27.580 And that garden is withering. And I'm not sure how to empower it without completely changing the
00:41:32.660 economics of social media. Well, so Josh, why do you do it the way you're doing it if you could get
00:41:39.620 more followers and make more money and be more famous by, I don't know, trolling Sam Harris and me?
00:41:46.380 Because I'm not a whore. And why not? I mean, there must be some incentives that are driving you to try
00:41:54.320 to, to stick to norms out there. If you'd like to continue listening to this conversation,
00:41:59.600 you'll need to subscribe at samharris.org. Once you do, you'll get access to all full-length
00:42:05.520 episodes of the Making Sense podcast. The podcast is available to everyone through our scholarship
00:42:10.260 program. So if you can't afford a subscription, please request a free account on the website.
00:42:16.020 The Making Sense podcast is ad-free and relies entirely on listener support.
00:42:20.100 And you can subscribe now at samharris.org.