Making Sense - Sam Harris - December 22, 2025


#449 — Dogma, Tribe, and Truth


Episode Stats

Length

23 minutes

Words per Minute

166

Word Count

3,868

Sentence Count

138

Hate Speech Sentences

5


Summary

In this episode, Ross Douthat joins me to talk about his new book, "Believe," and why he thinks religion is a necessary part of a healthy culture. We talk about the role of religion in shaping our culture, and how it intersects with the challenges we face as a society.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if you're
00:00:11.740 hearing this, you're not currently on our subscriber feed, and will only be hearing
00:00:15.720 the first part of this conversation. In order to access full episodes of the Making Sense
00:00:20.060 Podcast, you'll need to subscribe at samharris.org. We don't run ads on the podcast, and therefore
00:00:26.240 it's made possible entirely through the support of our subscribers. So if you enjoy what we're
00:00:30.200 doing here, please consider becoming one.
00:00:36.800 I am here with Ross Douthat. Ross, thanks for joining me.
00:00:40.200 Sam, thanks for having me. It's a pleasure.
00:00:42.260 So, we've never met. Am I right in thinking that?
00:00:46.000 We have never met.
00:00:47.540 No, this is the closest I've come to your physically embodied presence.
00:00:51.460 All right. Well, let's see if the internet keeps us together here for the requisite hour
00:00:56.620 or hour or two. Well, so I've been reading your book this week, Believe, which, when did
00:01:02.940 this come out? It's pretty recent, right?
00:01:04.720 Yeah, it came out, I think, February of this year, of 2025.
00:01:08.960 Yeah.
00:01:09.580 And this is where you make your case for the rationality and even necessity of religion,
00:01:16.740 which I think we're going to get to because I think you and I have a—we share a sense
00:01:22.760 that our culture is ailing, but I think we probably diverge, at least on several key points
00:01:30.240 as to whether religion is part of the cure or part of the disease. But we don't—you
00:01:35.680 know, I don't think—I mean, I think our core concerns are so held in common so fully that
00:01:44.640 I think we—I don't know. I'm interested to see where this conversation goes because it's not—I
00:01:49.500 don't want to have and I don't think we will produce a conventional debate about—between
00:01:56.900 an atheist and a believer about the rationality of faith, although I think some of those points
00:02:02.000 are going to be unavoidable.
00:02:04.160 Yeah, we'll have a dynamic interaction that ends with your conversion.
00:02:08.600 Yeah, well, that's how you sold this to me, right?
00:02:11.900 On both, you know, if you're right, we're both hoping for that.
00:02:16.760 That's right.
00:02:17.240 Okay, so I think we should probably start with the problem. What most worries you at this
00:02:21.600 moment when you look at—because you spend a lot of time thinking about politics, as
00:02:24.700 I do. You probably spend more—you're over at the New York Times commenting on culture
00:02:29.840 regularly. What most concerns you at this moment?
00:02:32.520 I guess big picture, a little bit separate from the turbulence of politics right now,
00:02:38.480 is I'm worried about a kind of sense of human obsolescence in the 21st century that I think
00:02:45.440 has been partially forged by the experience of digital culture and disembodied ways of living
00:02:52.200 and is visible in a lot of different trends, including political polarization, but especially
00:02:58.360 in sort of general unhappiness, anxiety, issues of mental illness, and so on, that are in turn
00:03:05.400 connected to people not getting married, not having kids, and effectively not perpetuating
00:03:12.020 human culture.
00:03:12.820 And I think we're in the shadow right now of trends in artificial intelligence that,
00:03:18.140 however far they go, are likely to kind of ramp up the pressure on human beings as human
00:03:24.380 beings. And I wrote an essay, I think around the time, actually, that my book on religion
00:03:30.500 came out, where I suggested that we were in this kind of bottleneck, almost, this kind of
00:03:35.760 evolutionary bottleneck, which is maybe more a cultural evolutionary bottleneck, but where
00:03:40.260 there was just going to be all this pressure on human nations, human cultures, human families,
00:03:45.280 human individuals to sort of figure out how to live under this, you know, very, very novel
00:03:52.480 technological dispensation. And that a lot of institutions, people, whole countries might
00:03:58.160 not make it through that, you know, with extreme examples being, you know, nations in East Asia,
00:04:04.500 like South Korea and Taiwan that have incredibly low birth rates to the point where it's unclear
00:04:08.980 how these nations will survive the next 50 or 60 years. And yeah, so I think that's, maybe
00:04:14.940 that's a slightly unusual way of putting it, but I think this is a widely shared concern
00:04:20.680 that shadows a lot of, again, a lot of more immediate political debates, like in a lot
00:04:26.520 of the kind of new polarization of our era, the, you know, reactionary and far left politics
00:04:32.580 and so on.
00:04:33.040 And I think you can see people sort of searching for a form of politics that's adequate to
00:04:38.000 the 21st century challenge. And I don't think people have found it at all. And I think our
00:04:42.520 politics is a mess because of that. But I think people are sensing that we're in a pretty unique
00:04:47.660 squeeze on human cultures as we've known them, and we need to figure out how to get through.
00:04:54.000 Yeah. It's interesting that, just take the AI piece as a first facet of this ghastly object.
00:05:00.340 It's interesting that even in success, I mean, even in perfect success, if AI amounts to exactly
00:05:08.140 the drudgery-canceling all-purpose technology that we hope for without any of the terrifying
00:05:14.600 downsides, many people are still worried that this could be something like an extinction-level
00:05:22.720 event for human purpose, human solidarity, human culture. I mean, just people are terrified
00:05:29.980 that without the necessity of work, to take just one piece here, we, not all of us certainly,
00:05:37.280 but most people will find life much harder to live. I actually don't share that concern.
00:05:44.320 If you do, maybe you can prop up that fear because I think there's reasons to think that's
00:05:48.960 actually a mirage.
00:05:49.980 I mean, I think it depends on the human being and the human culture. Do I think that the human
00:05:55.040 race will be able to survive and find ways to flourish and thrive and do amazing new things
00:06:01.200 under optimal AI conditions, even if that means that lots of jobs go away? In the long run,
00:06:07.600 yes, I'm in that sense a long-range optimist about the future of the human species. I think
00:06:15.080 anything short of the total dystopian AI scenarios is a scenario in which human beings are going to be
00:06:22.520 able to survive and thrive. But I think there's going to be a lot of turbulence, angst, difficulty,
00:06:29.960 and sort of disappearance along the way, that there are going to be all these forms of life and ways
00:06:37.280 of living that are just not adapted to, again, even the world we live in now with kind of this level of
00:06:45.220 digital existence, sort of people separated from physical reality, from meeting other people in
00:06:50.780 reality. Like there's already a lot of strain on very basic things like having friends, getting
00:06:57.620 married, having kids. And when you add in, let's say, lots and lots of jobs disappearing and a kind
00:07:04.620 of, you know, existential metaphysical anxiety about AI being able to sort of substitute for things that
00:07:11.040 we thought of as human distinctives, plus, you know, whatever other weirder forces come in. Yeah,
00:07:16.800 I think it is a very difficult situation that people need to be prepared for, I guess is how I
00:07:22.600 would put it. It doesn't mean that we're doomed at all, but some things are going to be doomed some
00:07:27.180 places and people are going to be doomed. And you want to start thinking now, really, you want to start
00:07:32.120 thinking, you know, 25 years ago about how you, right, like you as a, you know, a person with
00:07:39.420 relationships and friendships and you as someone who's involved in culture and politics, what you're
00:07:45.600 doing that is sort of making your humanity resilient, I think, against these forces.
00:07:51.640 Yeah. So if you take the job piece, which is really the first point of concern for people,
00:07:56.600 imagine a world where something like UBI was the necessary response to all of the abundance that
00:08:03.500 AI has created. So people don't have to work. Everything has become like chess, which is to say the
00:08:08.620 computers are better at everything, or virtually everything, that humans used to do for work.
00:08:13.600 Clearly, we need to figure out some new ethic and economics and politics around the non-necessity of
00:08:20.280 human labor and figure out how to spread the wealth around. And so at that point, it would be true to
00:08:26.740 say that, you know, something like UBI, I don't know if UBI is the actual right framing, but let's say
00:08:31.260 that was the case. Many people, certainly most people commenting on this issue seem to think that
00:08:37.080 most of their neighbors, if not themselves, need to spend eight hours a day doing something they
00:08:41.820 might not want to do in order to feel like they have a purpose in life. But it seems to me we have
00:08:47.100 a kind of ready sample of people, a fairly large population if you look at it historically, who
00:08:54.040 haven't had to work and figured out how to live reasonably or at least recognizably happy lives under
00:09:00.840 those conditions. And those are, we call them rich people, right? Or aristocracies of one flavor or
00:09:06.840 another, right? People who really didn't have to figure out what to do that others would pay them
00:09:14.840 for. Or if they did that early in their lives, they got to a point where they didn't have to do it any
00:09:21.000 longer. And then they had to figure out what to do with leisure. And it would seem very surprising to
00:09:29.120 me if in the presence of unlimited leisure, we as a species and as a culture couldn't figure out how
00:09:37.800 to enjoy it. I mean, there might be some painful bottleneck where, you know, all the people who were
00:09:44.560 totally dependent on drudgery to find some structure in their lives, you know, spin out of
00:09:51.200 control. But it just seems like this would be a problem of education and culture and a new kind of
00:09:57.600 ethical and political conversation rather than some kind of insuperable obstacle that we couldn't clear.
00:10:04.380 Yeah, I mean, I'll be a little bit more pessimistic. I mean, first, I would say that my most
00:10:10.140 optimistic scenario for AI is a kind of middle-ground scenario, which I also think
00:10:17.680 is fairly likely in terms of the capacities of AI to just sort of replace human labor. I think the people
00:10:24.700 who think it's more likely to be a complement to various kinds of human labor, that you'll still have
00:10:29.720 lots of people, you know, working jobs and doing things in the world that earn money. I'm hopeful that that is
00:10:36.180 still the most likely pathway. In the pathway, even in the limit, I mean, you actually think you're hopeful
00:10:42.020 that even with 100 years of progress, that's likely to be how we organize ourselves?
00:10:48.600 Again, one of the striking things about AI as a journalist who tries to write about it and like you
00:10:55.240 interview people about it is that, you know, even the people who see deeply into the technology
00:11:01.940 struggle to sort of form any kind of consensus predictions about just how far it would go.
00:11:07.760 My basic view tends to be that I'm skeptical of true superintelligence theories. I'm skeptical
00:11:15.180 that you get to a point where they're embodied in the world in ways that are completely substitutionary
00:11:22.760 for what human beings can do. Some of that probably does have to do with some views I have about
00:11:28.600 the human mind that are connected to religious ideas and assumptions, maybe. But let's just for
00:11:34.240 the sake of the conversation, let's say that you're right and or not that you're right per se, but
00:11:38.300 you know, that this this world is is the one we head into where there is some kind of guaranteed basic
00:11:44.380 income derived from the productivity of robots and so on. I think you have to work very hard,
00:11:51.500 very hard, given human nature as we have it, to prevent that from being a world where lots and lots
00:11:57.740 of people lead fundamentally debased lives. I think you said enjoy, you know, enjoyment,
00:12:03.300 right, or pleasure and so on, right? Like, yes, people with leisure,
00:12:09.040 seeking pleasure. It's very easy. And we see this again, right now, I think, in societies around
00:12:16.340 the world: it's very easy to default to a kind of round-the-clock entertainment cycle. This is before
00:12:24.800 you even get into issues related to, you know, drug use and so on. It's just that, you know,
00:12:31.320 the experiments we have with UBI, while not entirely depressing, are not super optimism-inducing. And just
00:12:38.280 to take your example of the aristocracy. So the historical aristocracy in the Western world, one,
00:12:45.540 lots of aristocrats did have to work because they were managing large estates or engaged in politics to
00:12:51.340 protect those large estates. Lots of them fought in wars and got killed in large numbers and found
00:12:56.300 purpose and meaning in that form. And then there was also just this constant struggle within
00:13:02.740 aristocratic groups to prevent decadence and debasement. You know, if you think about the
00:13:08.560 stereotypes of the third generation Rockefeller or Vanderbilt, as opposed to the first one,
00:13:14.380 it's not like, oh, you know, these people are all sitting in an English garden, reading Plutarch's
00:13:19.140 lives and, you know, painting watercolors. They're out sort of wasting their inheritance
00:13:23.940 and squandering it and being, you know, yeah, being sort of debased and decadent. And I think
00:13:31.360 such a society, the society you're envisioning, it could create, you know... I'm not going to say you can't
00:13:38.120 create a world where the mass of human beings gets to have like a Montesquieu style aristocratic
00:13:45.420 reverie experience, but it would be incredibly hard and require constant
00:13:53.620 reinvention, constant effort, in ways that we have never seen in human society to date.
00:13:58.320 So I think you just have to be aware of the magnitude of the challenge if you're
00:14:02.560 talking about a purely leisure based society. You know, one interesting model is the vision of
00:14:07.680 something like Star Trek, right, where you have this kind of utopian vision where apparently some form
00:14:13.640 of, you know, AI and things related to AI have relieved scarcity and want and so on. And that's
00:14:20.620 a show, a story that's all about human daring and mastery and accomplishment. But it does focus on
00:14:27.940 people who are choosing to be explorers, who have set up a system where they're still in charge of
00:14:33.440 the ship, even if the computer could make better decisions, right? There's never a moment where
00:14:37.340 Kirk or Picard says to the computer, OK, you decide how to handle the Romulans, right? And then you
00:14:43.580 also don't know, like, what is actually going on on Earth? Are people just, you know, people are
00:14:47.600 just sort of hanging out? Like, everyone isn't doing space exploration. I think the question of
00:14:53.700 what the average citizen of the Federation is doing with their life under Star Trek conditions is sort
00:14:59.700 of the question that sci-fi speculative thought needs to reckon with in the sort of AI abundance
00:15:06.700 scenario. Well, I didn't think we were going to talk about AI, Ross, but I think it's an interesting
00:15:11.620 window onto some of these concerns because, well, first let me just add a little fine print so that
00:15:16.660 you understand the abundance I think is conceivable. You know, I'm not by nature an optimist and I'm not
00:15:24.580 optimistic really that we're going to escape some of the real downsides of AI. But if we were, I think
00:15:31.700 it would be surprising to find that the only thing keeping us sane was that most people spend most of
00:15:39.760 their lives doing kind of fairly arbitrary things to earn a living that they, at least they imagine
00:15:46.260 they'd rather not do, and only to get to the weekend where they're free to debauch themselves.
00:15:52.240 And, yeah, that just basically having that kind of opportunity cost thrown up
00:15:59.300 against a life of pure leisure was the thing keeping us relatively sane. I think we just have the same
00:16:05.440 problem. But do you think that... I don't know if that's a fair description, exactly, of the way
00:16:13.040 human beings think about work. Historically, yes, it obviously does have elements of, you know, arbitrary
00:16:21.040 forced productivity and toil. I think of the broad achievement of modern civilization, though,
00:16:28.020 as one that has partially, not completely, but partially liberated people from the purely arbitrary
00:16:34.880 and punitive nature of work that's allowed lots and lots of people, not just a narrow elite,
00:16:40.660 to have jobs that they take some kind of genuine satisfaction in that are themselves sources of
00:16:48.260 community. I think one of the lessons of the COVID era and the work from home era is that not everyone,
00:16:54.600 but lots and lots of people did find a form of sort of community and solidarity and so on in
00:17:00.740 the workplace, in the kinds of collective action that employment offers, even when it's not the
00:17:08.040 most exciting thing in the world. And then it's also, yeah, I mean, it's not like historically people
00:17:14.480 who are working, I mean, a historical model in the United States of America, right, for long periods
00:17:20.120 of time has been communities and situations that are oriented around family, where you are working for
00:17:27.460 your family, you're working to support them in agrarian societies, you're working collectively
00:17:33.300 with your family. And again, I don't want to say from a 21st century perspective, like, you know,
00:17:39.120 the dignity of the toiling serf or something like that; obviously, there are incredible impositions involved
00:17:45.940 in work and child rearing and all these things in most of human history. But I think it's too dismissive
00:17:51.140 to say, you know, oh, we're just liberating people from something that is inherently forced upon them
00:17:56.820 that they don't really want. I think people are working creatures, they're communal creatures,
00:18:00.760 they like doing things together, they like having a sense of mission, they like doing things to help
00:18:05.400 the people closest to them. So you are taking something away if you're saying, oh, no, here's
00:18:09.940 your, you know, here's your UBI and just decide what to do with yourself.
00:18:14.200 But Ross, any part of a job that maps onto what people actually like doing, that they would do for
00:18:23.760 free? Well, then that's precisely the kind of thing they would presumably do if they
00:18:28.340 could do anything they wanted, right? If they were given 24 hours in the day to spend however they
00:18:33.940 wanted with their friends and family and with other collaborators they meet, all of this, you know,
00:18:40.380 highly potentiated by access to unlimited intelligence and wealth. And again, this is the
00:18:47.140 utopian version of AI we're talking about.
00:18:49.180 Then, you know, if they want to become, you know, Christian contemplatives or build houses that are
00:18:58.720 bespoke for people who want their houses built by human artisans, or whatever. I mean, it could
00:19:05.900 be Burning Man for half the people and, you know, Meister Eckhart for the other half.
00:19:12.260 There would just be no impediment to just using your attention the way you want to use it.
00:19:18.860 And that's, it's just, we're living in that condition anyway, except it's just framed by
00:19:24.140 periods of time where most people have to do things they, at least within their own minds,
00:19:30.520 they think, well, I wish I were free to do less of this and more of the thing I really want to do.
00:19:34.900 And I think you and I both fear that, you know, most people are capable of wanting to do
00:19:40.060 the wrong things, right? I mean, our attention gets captured by, you know, mere
00:19:45.360 entertainment, say, more than in retrospect seems good for us. And so I think
00:19:50.860 you're worried and I'm also worried, but we're already in that culture now. It's just, people
00:19:54.380 just don't have unlimited time to explore it. We're worried about a kind of a digital entanglement
00:20:00.360 with things not worth paying attention to on some level.
00:20:04.860 Yes. And I agree with you. I think we are in some version of that culture now. And so far,
00:20:10.540 the results are that large numbers of people given a profound degree of freedom, but also confronted
00:20:18.880 with incredibly addictive devices, substances, and entertainments to varying degrees, lose themselves
00:20:27.400 in those things and don't do Burning Man or Meister Eckhart. Don't, again,
00:20:33.660 don't get married, don't have sex, which is sort of, you know, the interesting endpoint for
00:20:40.800 now of the liberated 21st-century society is not more sex, but less sex. Again, I think
00:20:49.520 it's not that you can't imagine a society that has perfect abundance and also does not fall
00:20:57.560 prey to these kinds of snares, but you have to imagine entirely novel forms of essentially
00:21:07.140 communal and political self-restraint imposed on those human tendencies. Or maybe, I mean, look, I,
00:21:13.620 you know, maybe there are people who would say, well, no, what you need are pharmaceutical
00:21:18.140 interventions, right? Instead of Soma from Aldous Huxley's Brave New World to sort of, you know,
00:21:24.040 take the edge off everything, you need, you know, a perfected Ozempic that, you know,
00:21:29.720 cures original sin, right? Cures certain kinds of temptations. I mean, I think
00:21:35.420 there's a lot of different narratives you could offer. All I would say is, I think you need a pretty
00:21:40.300 dramatic, pretty substantial change in either human nature or human societies to prevent the perfect
00:21:49.400 abundance future from looking like Brave New World, looking like the, you know, sort of the spacecraft
00:21:55.460 in WALL-E, or a world that has maybe a kind of,
00:22:02.820 you know, digital-age aristocracy that does seem to be doing pretty well, but that doesn't have a large
00:22:10.920 share of human beings, again, sort of debased by the experience of addictive conditions and
00:22:18.300 plenty. Well, it seems like we need some radical changes, even in our current situation,
00:22:23.560 with respect to our culture. What's your view of what's happening right of center? I mean, I know
00:22:29.540 you consider yourself right of center, and I don't know how far right, but... Depends. Yeah, perhaps,
00:22:35.580 perhaps you can... Depends on the month. Give me the potted... The potted version? The bio of your
00:22:40.740 political persuasion, but then let's talk about what's happening in the Republican Party in America.
00:22:46.820 The potted bio is that I'm some kind of religious conservative who, generally, in the
00:22:53.480 past, has been more focused on religious and cultural issues.
00:22:59.140 If you'd like to continue listening to this conversation, you'll need to subscribe at
00:23:03.500 samharris.org. Once you do, you'll get access to all full-length episodes of the Making Sense podcast.
00:23:10.040 The Making Sense podcast is ad-free and relies entirely on listener support,
00:23:14.240 and you can subscribe now at samharris.org.