#450 — More From Sam: Resolutions, Conspiracies, Demonology, and the Fate of the World
Episode Stats
Words per Minute
186.6
Summary
In this episode of the Making Sense Podcast, Sam Harris takes questions on his resolution to live as though it were his last year, the case for meditation and the Waking Up app, his conversation with Ross Douthat, the risks of the AI arms race, and Tucker Carlson's account of being clawed by demons.
Transcript
00:00:00.000
Welcome to the Making Sense Podcast. This is Sam Harris. Just a note to say that if you're
00:00:11.740
hearing this, you're not currently on our subscriber feed, and will only be hearing
00:00:15.720
the first part of this conversation. In order to access full episodes of the Making Sense
00:00:20.060
Podcast, you'll need to subscribe at samharris.org. We don't run ads on the podcast, and therefore
00:00:26.240
it's made possible entirely through the support of our subscribers. So if you enjoy what we're
00:00:30.200
doing here, please consider becoming one. Okay, we're back with another episode of More
00:00:38.600
from Sam, where we get more from you, Sam. Yes, good luck. Thank you. How are you? I'm good. How
00:00:45.160
are you doing? I'm good. Good to see you. Before we get into things, I just want to quickly remind
00:00:49.320
everyone that you will be giving talks in a number of cities in 2026. Los Angeles, Dallas, Austin,
00:00:55.600
Portland, Vancouver, Palm Beach, Toronto, Washington, D.C., and New York City. Last time we warned
00:01:02.860
Portland to start buying tickets or we were going to cancel, and that seemed to work. So
00:01:07.400
good news, the show is still happening. If you want to see Sam live, this is the time to
00:01:11.300
do it, and it's a really great talk. Republicans and Democrats alike will be thrilled.
00:01:15.800
Both hate it. Yeah. And annoyed, just like this podcast. Okay, on to our first topic. Last
00:01:21.620
year, you made a New Year's resolution where you essentially planned to live like you were
00:01:25.080
dying. I want to know how that went for you, and if you're making any adjustments to that plan for
00:01:29.800
2026, or is that just the plan for the rest of your life? No, no. I think our phrase is this,
00:01:35.840
live as though it were my last year. Yeah. I mean, it was a great frame to put over the year. The year
00:01:40.900
didn't quite become what I expected because seven days into it, half the city burned down, and I had to
00:01:48.580
flee my house, which didn't burn, happily, but we still haven't been back. So the year got jiggered
00:01:55.060
around by real estate concerns way more than I was anticipating, and I wouldn't expect to spend the
00:02:00.120
last year of my life on that. Did that help you actually sharpen up your goal a little bit more?
00:02:04.740
It played a little havoc with it. I mean, in terms of actually the content of what I was paying
00:02:08.620
attention to, it was more terrestrial and practical than I would have hoped for. But it was, I mean,
00:02:18.080
I would give myself maybe a B on this aspiration. I mean, I think I did have my priorities pretty
00:02:25.460
straight. I mean, the filter for me with respect to the podcast is, if I could really do this,
00:02:33.340
I think there would be less politics, or, I mean,
00:02:37.280
the cut would be higher. I mean, it would really be sort of emergency politics more
00:02:43.440
than just, okay, here's another thing in the news that I can't help but respond to.
00:02:49.080
Yeah, exactly. It's going to be, this is probably not going to be a moment of me walking my talk,
00:02:55.900
but no, it was good. I would do the same. I think I'll do the same year after year because it's
00:03:01.540
really, you know, obviously one doesn't know how many one has left. And yeah, I mean, no regrets
00:03:07.880
that that was my resolution. All right, good. I think we're all addicted to social media and our
00:03:12.060
phones at this point, and our attention has become so fragmented that it's almost impossible to fully
00:03:16.720
be anywhere anymore. Is there a quick pitch for the Waking Up app you can throw in here?
00:03:21.040
Well, I mean, you know, we often demonize the smartphone as the locus of all of our fragmentation
00:03:27.120
and collective derangement. I think that's true, but obviously there are different uses of a
00:03:32.880
smartphone, and this is certainly one that I can stand behind as being just categorically different
00:03:37.320
from the other stuff that's driving us crazy. And I view most, certainly most of the audio I consume
00:03:45.020
on a phone, which is a lot, as some version of, you know, kind of very productive and unifying of my
00:03:52.900
attention. I mean, you know, I listen to good books and good conversations and, you know, obviously,
00:03:57.440
you know, listening to guided meditations and meditation instruction is, I think, kind of the
00:04:01.740
ultimate example of a good use of attention on this device, which is just not at all analogous to
00:04:07.480
having your attention, you know, shattered on the regular by social media engagement and kind of the
00:04:14.040
dopaminergic checking in to the response, to the response, to the response. So yeah, I think Waking Up
00:04:19.560
is, I mean, I love what we've created over there, and I think it's a great use of the device, you
00:04:25.680
know. Can you give us some examples of what types of results one could expect once they develop this
00:04:30.720
skill that could, I think, honestly take somebody about a week to really begin to see something,
00:04:36.740
but, you know, what kind of results can one expect? Yeah, I love these questions. These are,
00:04:40.800
this is exactly the way I come at it. Hope the irony was detectable there. I mean, so there are kind of
00:04:46.400
two sides to this topic, right? There's the conventional mainstream pitch for mindfulness
00:04:52.500
that, you know, here's a list of benefits you can expect from the practice, and most of what is
00:04:59.220
claimed there, I think, is true. I mean, some of it is, you know, poorly researched, or at least the
00:05:04.440
research is kind of thin, and, you know, some of it might wash out, but the thing that is obviously
00:05:08.820
true is that virtually everyone, until they learn to practice mindfulness in some form, is spending
00:05:15.840
their life perpetually distracted, and they're so distracted they're not even aware of that,
00:05:20.840
right? They're thinking every moment of their lives, their moment-to-moment experience of being a self
00:05:25.760
in the world is being filtered through this discursive, conceptually framed conversation
00:05:34.220
they're having with themselves. And again, the conversation is so incessant and so loud that it
00:05:39.700
achieves this kind of white noise status where people aren't even aware that they're distracted.
00:05:45.040
Half the people hearing me say this who have never tried to meditate will be thinking,
00:05:50.040
what the hell is he talking about, right? But it's that voice in the mind, what the hell is he
00:05:54.220
talking about that feels like you? That's the endless conversation that is defining your experience
00:06:00.220
moment-to-moment. It's the medium on which all of your dissatisfaction and frustration and regret
00:06:06.280
and annoyance and, you know, everything that makes you an asshole in the world, it is the medium that
00:06:11.260
transmits that and makes it actionable emotionally and behaviorally moment-to-moment. It's the capture
00:06:17.580
by thought unwittingly, right? The impulse that you can't see creep up from behind that just becomes
00:06:24.640
you, that becomes your, you know, the next thing you say, the next thing you reach for, the next thing
00:06:29.820
you aspire to become. I mean, it's just, again, we're living in a dreamscape and virtually nobody
00:06:36.120
notices. So when you try to practice meditation for the first time, really any form of meditation,
00:06:41.660
but, you know, mindfulness in particular, initially you're given this very basic exercise of just try
00:06:47.220
to pay attention to the breath, you know, try to pay attention to sounds, you know, moment by moment,
00:06:51.340
and every time you notice you're lost in thought, come back to the feeling of breathing or the sounds
00:06:56.180
in your environment or the sense of your body resting in space or just some sensory experience that
00:07:01.500
you can use to try to build some concentration on. And it is, you know, it's, if you persist in it long
00:07:07.660
enough to notice how hard that is and how vulnerable your attention is in every present moment to the
00:07:14.960
next thought arising unrecognized and just seeming to become you, it is a kind of revelation. I mean,
00:07:21.600
albeit a negative one. I mean, what you recognize in yourself is this
00:07:26.280
pervasive incapacity to pay attention to anything for more than a few moments at a time without being
00:07:31.300
distracted. And it's very hard, but whatever you think about meditation from that moment forward,
00:07:36.800
maybe you think it's just too much of a hassle, you're, you know, you're too restless, it's too
00:07:40.200
hard, you don't have a talent for it, you know, you're going to move on to other things, and you
00:07:43.320
kind of bounce off the project. Even if you fall into that condition, I think it should
00:07:48.900
be very hard to argue to yourself that this is somehow psychologically optimal, to not be able
00:07:54.860
to pay attention to something for more than a few moments at a time and to be helplessly buffeted by
00:08:00.080
the winds of your own distraction. I had a good thought actually about that. I wanted to run by
00:08:04.740
you. There's this idea of sort of like a dealer dealing cards like thoughts. And imagine sitting
00:08:10.260
at the card table, blackjack table, just every time, just waiting for the cards and just, no,
00:08:16.240
don't want to play that card. No, just, you can wait for a blackjack every time. If you know how to
00:08:20.440
wait for the cards, you don't have to play the card that's dealt. You can say, I don't want that one.
00:08:24.660
Don't want that one. And everyone at the table is looking at you like, this guy's cheating and it's
00:08:27.940
kind of like cheating, but if you have that skill, it's kind of cool.
00:08:31.300
Oh yeah. The cards are thoughts, right? Thoughts taken seriously. So, you know, the voice in
00:08:37.380
your mind says, I can't believe how you fucked that up. Right. And so, okay. So how much, so what does
00:08:43.080
that, what does that convey, right? Like how much shame or regret or what do you do with that? You
00:08:49.380
don't have to do anything with it. You tell the dealer to give you another card.
00:08:52.220
Yeah. I mean, there's this image in Tibetan Buddhism of, you know, ultimately thoughts are
00:08:57.140
like thieves entering an empty house, right? There's nothing for them to steal, right? So
00:09:01.340
that like, just imagine what that's like. Imagine, just imagine the, the scene of, you know, thieves
00:09:06.460
come, you know, storming into a house that has nothing in it, right? I mean, there's no implication
00:09:11.720
of their presence there, right? That there's nothing for them to do. Thoughts recognized are just
00:09:18.180
these mysterious mental objects, right? I mean, they really don't, you know, it is just a bit
00:09:24.020
of language or a bit of imagery. And it is, it is genuinely mysterious that, you know, this next
00:09:29.800
thought can so fully commandeer your physiology and your, your whole sense of being in the world.
00:09:37.220
I mean, your, the next thought taken seriously could define the next decade of your life if you
00:09:42.200
can't see some reason not to take it seriously. So meditation on some level is a way of relaxing
00:09:48.080
the hold that thoughts automatically have on us. Yeah. As we move into 2026, I hope people will
00:09:54.620
develop a practice or give this a look, check out Waking Up. And if not Waking Up, there are plenty
00:09:59.700
of other apps where you can learn to meditate because it really is a basic skill that you should,
00:10:03.680
you should just, if you don't go super deep, that's fine, but you should have a basic understanding
00:10:08.500
and just explore that for yourself. And it's life-changing. So if anyone's focused on nutrition
00:10:14.260
and physical fitness and sleep, you're missing this aspect. If you care about those three and
00:10:20.220
you're not caring about your mind, you're missing this. All right. On to our next topic. I really
00:10:26.180
liked your podcast with Ross Douthat, that recent one. He's very likable and very smart. I think it
00:10:31.940
was the first time I liked an argument for why God, a perfect God, would put a bad idea like slavery
00:10:38.760
in the Bible, where he basically had said that he knew it was bad, but wanted to allow room for
00:10:44.600
Christians to evolve and to sort of put their fingerprints on it over time. Now, he says-
00:10:50.360
No, he said, you know, if that's not good for you. No, I wasn't persuaded by it, but I actually
00:10:55.820
liked it. I thought, oh, that's a well-made argument that he's explaining that God wanted to
00:11:03.400
Well, it's well-made in the sense that it's totally unfalsifiable and it could absorb anything,
00:11:08.420
any possible contents of the Bible, you know, even mathematical errors, right? I mean, you know,
00:11:13.620
I think pi is calculated somewhere in the Bible as three, you know, full stop. I think I have that
00:11:19.540
right. I mean, I think it just says that, you know, the circumference of a circle with a
00:11:24.340
diameter of one is three, or something like that. But, okay, so God can't do math while he's waiting for us
00:11:30.780
to get better at math. I mean, that's, he's leaving room for our own genius to explore that topic. It's just idiotic.
00:11:38.420
I mean, I guess maybe I'm in an uncharitable mood around this, but, I mean, at a certain point,
00:11:43.000
we have to run out of our patience with these dodges. I just don't know how they, I don't know
00:11:47.500
how you spend your life circling that specific attractor again and again, year after year. I mean,
00:11:53.060
it's just so obviously wrong and foolish. I mean, we'd have no patience for it if this wasn't
00:11:57.660
grandfathered in by this tradition. I mean, again, you just look at the crazy, you know, it's,
00:12:02.080
it's Scientology. I mean, you look at, you watch Alex Gibney's documentary on Scientology and look
00:12:08.420
at that whole project and how embarrassing it was and look at those exit interviews, right? And it's
00:12:14.680
on some level more sophisticated than what we're talking about when you talk about, you know, any,
00:12:21.340
Yeah. And you guys also talked about AI. And for some reason, I find myself much less scared now
00:12:26.740
than I was earlier this year, simply because I believe the problem is going to be so big that
00:12:31.080
it will get addressed quickly. Where are you on that now? How are you feeling? I mean, I know at
00:12:35.080
some point earlier we were both thinking, how is everybody not talking about this every second of
00:12:39.400
the day? You still there? Well, I continue to be impressed by how hard it is to maintain one's
00:12:48.360
concern, even when one hasn't found a rational argument that should give you comfort, right? I mean,
00:12:54.680
it's, it is a sort of unique threat. I mean, this was the, at least the starting point of the talk I
00:12:59.820
gave on it now nearly 10 years ago, the TED talk in 2016, I think. But it's just, there's something
00:13:05.840
entertaining about it. It's fun to think about the downside. I mean, it's, there's something kind
00:13:10.480
of sexy and, and interesting about it that is, it's not like a coming plague or, you know, an asteroid
00:13:18.740
impact or something that you like, that was just, that's just scary and depressing, right? So even if you
00:13:24.120
think the risk is, is undiminished, I mean, you can't figure out how to, to be more comfortable
00:13:28.780
with the probabilities or the possible negative outcomes, it's just, there's something very
00:13:33.800
elusive about it. I mean, part of it is that the upside is also very compelling, right? I mean,
00:13:39.840
unlike the threat of nuclear war or anything else, that is, any other technological self-imposed risk,
00:13:45.840
I guess synthetic biology is slightly different. It's a little bit more like AI because there's
00:13:49.280
obviously some real upside to our breakthroughs in, in biology. But I mean, it's just, there's
00:13:55.080
something so, you know, in success, it looks like it could be amazing, you know, leaving aside
00:14:01.420
total unemployment and, and this, the social challenge of trying to grapple with that. But yeah,
00:14:07.320
I don't, I don't, you know, I think it's frankly terrifying when you look at the arms race condition
00:14:12.820
we're in, and the, um, the moral quality or lack thereof of the people who
00:14:20.700
are making the decisions for us. I mean, it's like, you can count on two hands the number of people,
00:14:25.400
totally unregulated by a kleptocratic government at this point, to speak only of the United States.
00:14:31.340
Don't you believe we have to somewhat be unregulated at this point in order to win this arms race?
00:14:35.560
Well, I, I just think that if we had a morally sane and competent government, I think we would be
00:14:42.900
forcing some sort of global approach to this global problem, right? I mean, we, that, everything
00:14:49.040
would be bent toward that. I'm not talking about stopping development of AI. I don't think that's in
00:14:54.140
the cards under any regime, but clearly we need, we need to get out of this arms race condition. I mean,
00:15:00.900
we're, we're just merely in an arms race and that's it. I don't, I don't consider myself very close to
00:15:06.740
the, the behind the scene details here. So I just, I don't know how scary it actually is, but you know,
00:15:13.140
from what people say in public, it should be alarming that the people who are doing this work
00:15:17.460
and people who are closest to it, when asked, you know, what probability they give to our destroying
00:15:24.100
ourselves with this technology, they, I think to a man, they all say something terrifying. I mean,
00:15:30.060
they're all like, Oh, maybe 20%, you know? I mean, it's just not, it's like, you're not hearing
00:15:34.180
people like Sam Altman say, Oh no, no, no, no. We've got this totally in hand and we're being,
00:15:38.720
you know, really safe and scrupulous. And yet people just have misunderstood this technology.
00:15:43.180
There's no way, there's no self-improving thing on the horizon that could conceivably get away from us.
00:15:48.980
No, no, no. Like this has just been way overblown. I would put the risk at, you know, one in a million,
00:15:53.860
virtually. I mean, maybe, maybe Yann LeCun is still somebody who talks that way, but virtually nobody
00:15:58.620
in a position to make decisions says anything like that. So just imagine if all the people
00:16:04.340
developing nuclear technology at the time were saying, okay, we're probably, we're running a
00:16:09.240
20% chance of destroying everything. And yet we can't stop, you know, we're doing this as fast as
00:16:14.740
possible. And we're in, now you're going to witness a multi-trillion dollar build-out that is
00:16:19.360
going to basically subsume every other economic concern for us and environmental concern. I mean,
00:16:25.300
where did, where did climate change go? You know, you got, you got all these people building these,
00:16:29.480
the most resource-intensive technology anyone ever dreamed of. And, you know, even some of the
00:16:35.200
real climate-change-focused people, like Elon in the past, are, you know,
00:16:40.140
there's none of that, right? It's just, you know, we're going to use all the water and all the
00:16:43.880
electricity and, and let's go. Well, what's the alternative? Weren't we given a 10% chance
00:16:47.900
of just total destruction with nuclear war when we created the bomb?
00:16:51.140
I mean, there was a, a very small chance that the actual, you know, the Manhattan Project
00:16:55.520
principals thought that we might, uh, I mean, they, they did, they did some final calculations
00:17:01.300
and put it at, I think, a truly small chance, like, you know, one in 10,000
00:17:05.460
or something, it was not 10%, but they were still placing bets on whether we would ignite
00:17:09.780
the atmosphere and destroy everything. And that wasn't 10%? I thought that was 10%. No,
00:17:14.000
no, no. I mean, they didn't think it was 10% at the time, but imagine if they had,
00:17:17.560
imagine if they had been willing to pull the trigger at Alamogordo on the, on the Trinity test,
00:17:23.720
thinking there was a 10% chance that they were going to ignite the atmosphere and kill all life
00:17:30.140
on earth. That would have been irresponsible. That would have been pretty shocking if they had done
00:17:35.720
that. The fact that they thought that any of them thought it was within the realm of possibility
00:17:39.840
is still a little alarming. And that's always been held out as a moment where scientists showed
00:17:46.000
their capacity to, whatever the real probabilities were, you can hold that aside, within their own
00:17:53.160
minds, they showed their capacity to roll the dice with the future of the species in a way that
00:17:58.420
should have kind of shocked everyone. But here we're in a completely different game, right? I mean,
00:18:05.400
again, whatever the actual probabilities are aside, there's no way to know that. You have the people
00:18:10.900
who are doing this work, funding this work, making the decisions on a daily basis, the closest to the
00:18:16.440
engineering understanding that should govern one's sense of the probabilities here. They're telling us,
00:18:22.180
yeah, this is, we're sort of in coin toss land, you know, or at least dice roll land. You know,
00:18:28.220
a single die may come up, you know, six and we cancel the future, right? That's insane. And yet somehow
00:18:36.400
we're not even in a position to have an emotional response to it. I think, but what's the alternative?
00:18:41.180
Don't you want Sam Altman or the U.S. to get whatever that is first? No, I think we do need
00:18:47.740
to navigate this growing, you know, superpower contest with China. We need whatever, you know,
00:18:54.360
economic levers and military levers we can get in hand. But, you know, there should be some
00:19:02.920
impossible version of a carrot, which is we figure out how to solve truly global problems jointly,
00:19:09.160
right? I mean, so we need an American president who could have conceivably unified the world's
00:19:17.080
democracies on this and other points, right? Like all of our, all of our European allies and Australia
00:19:23.100
and Canada could be on the same page vis-a-vis Russia and the war in Ukraine, vis-a-vis China and
00:19:29.800
the AI arms race. We need leverage, right? We need to be able to hit the stop button somehow. We need to,
00:19:37.260
we need a system of alliances where we can actually credibly threaten China with a plunge back into
00:19:43.760
poverty when we take back all of our supply chain, you know, and it can't just be America, right? It
00:19:49.800
has to be, you know, every other democracy that cares about the fate of civilization. We just don't,
00:19:55.180
we, we're not in a political environment where we can collaborate globally in, in those kinds of
00:20:00.820
ways. And the reason why we're not is, it's not exclusively Trump, but it's, you know, it's Trump to
00:20:06.440
an extraordinary degree. We're not in a position to absorb this kind of shock to our system. You know,
00:20:13.560
it's like you're going to have Tucker Carlson talking about the rise of the machines. Is that
00:20:17.080
really the information diet that half the country gets: Joe Rogan and Tucker Carlson
00:20:21.080
chopping it up for four hours and making sense of this? We're not ready for the political and
00:20:27.380
economic pivot that will be required in success, right? In perfect success, again, without any
00:20:35.140
downside risk. I mean, if all of the catastrophic concerns are pure fiction, the alignment problem
00:20:41.000
was never a problem, and the malicious use of this powerful technology never becomes significant,
00:20:47.460
right? You don't have cyber terrorism. It's AI, you know, weaponized, you know, China doesn't turn
00:20:53.220
out the lights on us, et cetera, right? None of that happens, right? It's all just the good use of
00:20:58.340
this technology, just curing cancer all day long, you know, you know, curing cancer, playing chess,
00:21:02.740
making things, you got AI, you know, Hollywood, Netflix gets to make movies that cost $14 to make,
00:21:09.740
but they're the perfect, you know, wide release summer confections, right? That everyone wants to see
00:21:14.760
because they've been algorithmically tested on a billion brains. And, uh, you know, the next Tom
00:21:19.680
Cruise looks 32 for the rest of his life and it's all, you know, it's perfect. And yet where's the
00:21:26.580
economy around all of that? Who edited it? Who shot it? Who, you know, who catered it? None of that
00:21:33.440
happened. It was made for $14 and, you know, on a laptop. You don't think government intervenes
00:21:38.640
quickly and says, okay, wait, we got to reset. I mean, it would be amazing if we had the foresight
00:21:45.480
to make the changes politically and economically that we would have to make to spread the wealth
00:21:52.500
around. That would be amazing, right? But I just see nothing in our history or in our, in the present
00:21:59.160
that makes it seem like we're capable of doing that, right? I mean, you, what you would need,
00:22:04.060
you, you, you would need people to be able to, um, agree about the questions of fundamental value,
00:22:10.680
right? Like what is good for people? What kind of lives do we want to live, right? We can't even
00:22:15.740
agree about the ethics of universal basic income. Like the moment you, you've raised, I mean, there's,
00:22:20.080
I think there's some question as to whether UBI is the right remedy for a situation like this.
00:22:26.080
Um, and there are certainly debatable points. I think the research on UBI, on the actual practical
00:22:30.960
and psychological and social effects of UBI, that research is somewhat ambiguous still.
00:22:36.040
Although I would point out it's being conducted in a context where these changes haven't been forced
00:22:41.360
on, you know, culture across the board, right? But we can't agree about it. So, you know, you,
00:22:46.800
the moment you mentioned UBI, half the people will say, you know, this is obviously what we're going
00:22:52.160
to need, some version of this, you know, and it's, it has no, it should have no moral stigma attached to
00:22:57.040
it. And then the other half will say, no, this is a catastrophe because people need to work.
00:23:01.820
People derive their meaning from work and they should derive their meaning from work. And I can't
00:23:06.000
imagine any other system of norms or expectations where you wouldn't get your meaning from work.
00:23:12.180
And there are, you know, good Christian anchors to a lot of that thinking. If we can't even have that
00:23:17.980
conversation and agree about what we should do, again, under conditions of perfect success, where
00:23:24.000
it's analogous to the creator of the universe, just handing us the ultimate labor-saving device
00:23:28.820
right here, just become as wealthy as you want to be. Here's the hardware and here's the software
00:23:33.400
that cancels the need for human drudgery until the end of the world and, and will produce
00:23:39.340
every scientific insight that nature admits is possible, right? Here, here's kind of the,
00:23:46.640
here is an intelligence explosion in your hands, you know, go have at it. In our current condition,
00:23:52.520
we seem totally incapable of absorbing that, you know? I mean, that would be
00:23:58.320
kind of the final irony. It's like, give us the best thing that could ever be invented and we will
00:24:04.600
turn it into the worst thing that has ever been invented, right? It's like, you know, the,
00:24:09.140
the 'Chinese would nuke us if they knew we had it now' kind of thing. I get that; that's how politically
00:24:14.700
combustible we are as a species. Like if we knew that China had it, right? If China had the,
00:24:21.180
the AI that was, could cancel all of our efforts because it's perfect, right? And, and, and it's
00:24:27.660
perfectly, is again, perfectly aligned to their use, right? Like it's not going to get away from
00:24:32.260
them, but they can decide to just enjoy a winner take all spoils situation. They've got there first
00:24:39.560
and they've proven it. What would we do? Given the level of antagonism in our world, given how zero-
00:24:46.820
sum we are, what would we do? I mean, would we just bomb them? I mean, I think that's, I think
00:24:52.120
I would put the chances at 50-50, right? I mean, like we're so far from even having a conversation
00:24:58.520
about how to have a global civilization that works because, I mean, this, again,
00:25:05.040
comes back to our own domestic politics and how it is so much worse than the
00:25:11.300
alternatives. I mean, say what you want about Biden and, and what could have been true under,
00:25:15.620
you know, a Kamala Harris presidency. Lots of awful things I could also whinge about, but the
00:25:22.300
one thing we wouldn't have is this plunge into America-first Know-Nothingism, this retreat from
00:25:30.760
the world, the sense that even our allies are contemptible, right? I mean, we don't, we don't even
00:25:38.040
like our allies, you know, and, and we sort of like our enemies. If we can get our enemies to
00:25:42.180
pay up, we kind of like them more than our allies because they have our ethics, you know? It's like
00:25:47.000
that, like we don't want to hold ourselves to any standard of decency globally. So we actually are
00:25:54.060
more comfortable doing deals with other countries that don't hold themselves to any standard
00:25:59.620
of decency, right? Like, yes, of course we can do a deal with the UAE or Qatar, you know, that's just,
00:26:06.600
it's just baksheesh, right? I mean, like, so that change, the stepping back from alliances and
00:26:12.400
imagining that we can go it alone on some level and, and the fact that half of our, of our society
00:26:16.900
is celebrating that and is just trying to figure out, you know, whether they can suffer the, the
00:26:21.960
Jews in their midst as they become more and more selfish and more and more oblivious to what's
00:26:27.740
happening in the rest of the world. I mean, that, that is far worse than the alternative, right? I mean,
00:26:34.160
again, Kamala Harris was not a good candidate. The wokeness is as terrible as everyone on the
00:26:40.160
right has said it is. I'm not, I would never minimize any of that, but we would have continued
00:26:45.960
being a country that is looking to solve global problems in a sane way under President Harris.
00:26:51.900
There's no question about that. Well, we know you'll have job security because you're going to
00:26:55.420
have to help us think through all this continually on the podcast. And then maybe on the other side,
00:27:01.220
you're going to have to help us figure out what to do with all this time with the, with the
00:27:05.020
meditation app. So let's, let's shift gears now to Tucker and his demons.
00:27:13.360
Yeah, no, I just thought, I think this is kind of funny. I saw him tell this story. I know he's
00:27:17.360
told it before about him being clawed by demons while asleep in bed with his wife and four dogs.
00:27:23.100
If you'd like to continue listening to this conversation, you'll need to subscribe at
00:27:29.640
samharris.org. Once you do, you'll get access to all full length episodes of the Making Sense
00:27:34.920
podcast. The Making Sense podcast is ad free and relies entirely on listener support. And you