Making Sense - Sam Harris - April 12, 2023


Making Sense of Existential Threat and Nuclear War | Episode 7 of The Essential Sam Harris


Episode Stats

Length

49 minutes

Words per Minute

152.4

Word Count

7,620

Sentence Count

352



Summary

The goal of this series is to organize, compile, and juxtapose conversations hosted by Sam Harris into specific areas of interest, in order to construct a coherent overview of Sam's perspectives and arguments, the various explorations and approaches to the topic, the relevant agreements and disagreements, and the pushbacks and evolving thoughts which his guests have advanced. The purpose of these compilations is not to provide a complete picture of any issue, but to entice you to go deeper into these subjects. Along the way, we'll point you to the full episodes with each featured guest, and at the conclusion, we'll offer some reading, listening, and watching suggestions which range from fun and light to densely academic. This is The Essential Sam Harris: Making Sense of Existential Threat and Nuclear War, part of a series of episodes hosted by the philosopher and writer Sam Harris exploring a broad array of ideas and concepts related to existential threat and nuclear war. In this episode, you'll hear the natural overlap with theories of ethics, violence, and pacifism, and more. We don't run ads on the podcast, and therefore it's made possible entirely through the support of our subscribers. So if you enjoy what we're doing here, please consider becoming a supporter of the podcast by subscribing at samharris.org.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:08.820 This is Sam Harris.
00:00:10.880 Just a note to say that if you're hearing this, you are not currently on our subscriber
00:00:14.680 feed and will only be hearing the first part of this conversation.
00:00:18.420 In order to access full episodes of the Making Sense Podcast, you'll need to subscribe at
00:00:22.720 samharris.org.
00:00:24.060 There you'll find our private RSS feed to add to your favorite podcatcher, along with
00:00:28.360 other subscriber-only content.
00:00:30.520 We don't run ads on the podcast, and therefore it's made possible entirely through the support
00:00:34.640 of our subscribers.
00:00:35.900 So if you enjoy what we're doing here, please consider becoming one.
00:00:47.200 Welcome to The Essential Sam Harris.
00:00:50.240 This is Making Sense of Existential Threat and Nuclear War.
00:00:54.360 The goal of this series is to organize, compile, and juxtapose conversations hosted by Sam
00:01:01.560 Harris into specific areas of interest.
00:01:05.080 This is an ongoing effort to construct a coherent overview of Sam's perspectives and arguments,
00:01:10.600 the various explorations and approaches to the topic, the relevant agreements and disagreements,
00:01:16.940 and the pushbacks and evolving thoughts which his guests have advanced.
00:01:20.420 The purpose of these compilations is not to provide a complete picture of any issue, but
00:01:27.500 to entice you to go deeper into these subjects.
00:01:30.740 Along the way, we'll point you to the full episodes with each featured guest, and at the
00:01:35.860 conclusion, we'll offer some reading, listening, and watching suggestions, which range from
00:01:41.140 fun and light to densely academic.
00:01:43.320 One note to keep in mind for this series, Sam has long argued for a unity of knowledge where
00:01:50.680 the barriers between fields of study are viewed as largely unhelpful artifacts of unnecessarily
00:01:56.160 partitioned thought.
00:01:57.940 The pursuit of wisdom and reason in one area of study naturally bleeds into, and greatly
00:02:03.300 affects, others.
00:02:05.280 You'll hear plenty of crossover into other topics as these dives into the archives unfold.
00:02:09.960 And your thinking about a particular topic may shift as you realize its contingent relationships
00:02:15.860 with others.
00:02:17.000 In this topic, you'll hear the natural overlap with theories of ethics, violence and pacifism,
00:02:22.380 and more.
00:02:23.580 So, get ready.
00:02:25.700 Let's make sense of existential threat and nuclear war.
00:02:29.000 In 1961, the astronomer Frank Drake jotted down a fairly simple, back-of-the-napkin formula
00:02:39.960 to calculate just how many technologically advanced civilizations we should expect to be out there
00:02:45.340 in the cosmos right now.
00:02:47.780 It came to be known as the Drake Equation.
00:02:50.180 The equation starts with an extremely large number, the estimate of the total number of
00:02:57.640 stars in the universe.
00:02:59.440 Then we narrow that number down to how many of those stars have planets orbiting them.
00:03:04.700 Then we narrow that number down to how many of those planets are likely to be suitable
00:03:09.500 for the evolution of life.
00:03:11.800 Then we narrow that down to the number of those life-suitable planets that have actually had
00:03:16.780 life emerge.
00:03:17.460 Then we narrow that down to how many of those life-forms are intelligent.
00:03:23.400 And then, finally, we narrow that down to how many of those intelligent life-forms advanced
00:03:30.020 to the stage of a technological civilization.
00:03:33.460 Even if we're quite conservative with our estimate at each step of the narrowing process,
00:03:38.900 maybe we guess that only one in every 100,000 life-suitable planets actually did achieve even
00:03:45.040 basic microbial life.
00:03:46.400 Or that only one in every 1,000,000 forms of intelligent life became technologically
00:03:52.600 advanced.
00:03:54.340 Even if we apply these stringent factors, the results of the equation and our remaining
00:03:59.600 number suggest that there still ought to be between 1,000 and 100 million advanced civilizations
00:04:06.120 just in the Milky Way galaxy alone.
00:04:08.860 And there are, of course, billions of galaxies just like ours.
00:04:14.440 So even if the correct number is just in the hundreds in our Milky Way, when you look out
00:04:19.400 in the cosmos, there should be millions of civilizations out there.
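[Editor's note: The step-by-step narrowing described above is just a product of factors. Here is a minimal sketch in Python; apart from the one-in-100,000 and one-in-1,000,000 figures mentioned in the narration, every parameter value is an illustrative placeholder rather than a number from the episode, and this simplified product is not the canonical Drake Equation, which is framed in terms of star formation rates and civilization lifetimes.]

```python
# Illustrative sketch of the "narrowing" estimate described above.
# Except for the one-in-100,000 and one-in-1,000,000 factors from the narration,
# all values are placeholder assumptions chosen only for demonstration.

def drake_style_estimate(
    stars_in_observable_universe=1e22,   # rough order-of-magnitude assumption
    frac_stars_with_planets=0.5,         # assumption
    life_suitable_planets_per_star=0.2,  # assumption
    frac_that_develop_life=1e-5,         # "one in every 100,000" from the narration
    frac_life_that_is_intelligent=0.01,  # assumption
    frac_intelligent_that_go_tech=1e-6,  # "one in every 1,000,000" from the narration
):
    """Multiply the narrowing factors together, mirroring the narration."""
    return (stars_in_observable_universe
            * frac_stars_with_planets
            * life_suitable_planets_per_star
            * frac_that_develop_life
            * frac_life_that_is_intelligent
            * frac_intelligent_that_go_tech)

if __name__ == "__main__":
    # With these placeholder values the product comes out around 10^8, which is why
    # even harsh filters still leave an enormous expected number of civilizations.
    print(f"{drake_style_estimate():,.0f} advanced civilizations expected")
```

[Plugging in more or less generous values shifts the total by orders of magnitude, but the structure of the argument, a huge starting number multiplied by a chain of small fractions, stays the same.]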
00:04:22.900 A physicist named Enrico Fermi asked the simple question,
00:04:30.000 if this is true, where is everybody?
00:04:33.600 How come, when we look out into the cosmos, we don't see or hear obvious evidence of a plethora
00:06:40.040 of advanced life-forms zipping about in their ships, systematically terraforming entire galaxies
00:04:45.900 into power plants, or what have you?
00:04:49.100 This question became known as Fermi's paradox.
00:04:53.220 There is no shortage of hypotheses to address Fermi's question, but just about all of the
00:04:58.560 responses can be categorized under three general answer types.
00:05:03.000 One answer is that we're just early.
00:05:05.220 Perhaps all of Drake's math was right, and everybody will show up, but we just happen
00:05:10.960 to be amongst the first to the party.
00:05:13.680 The cosmos itself may have just recently reached a state of habitability, after the chaos from
00:05:19.020 the initial inflation and the Big Bang sent heat and debris flying about in every direction.
00:05:24.880 Maybe it just recently settled down and allowed life like ours to flourish, and we humans are
00:05:30.420 just an early riser.
00:05:32.200 Another answer is that we're very rare.
00:05:34.340 Maybe Drake's numbers were not nearly conservative enough, and life such as ours is just an exceedingly
00:05:40.440 unlikely cosmic event.
00:05:42.620 Perhaps there are only a small handful of civilizations out there, and given the vastness of the cosmos,
00:05:48.760 it's no surprise that we wouldn't have had any close neighbors who happen to be advanced
00:05:52.520 enough to say hello.
00:05:54.580 Maybe the neighborhood is just very quiet.
00:05:57.700 Or perhaps the most disturbing answer, the one we're going to be dealing with in this compilation,
00:06:02.780 is this one.
00:06:05.300 Maybe there is a great filter.
00:06:08.280 What if there is a certain unavoidable technological phase that every intelligent life's advancement
00:06:14.060 must confront?
00:06:15.780 A technological phase that is just so hard to get through that almost no civilization successfully
00:06:22.260 crosses the threshold.
00:06:23.440 And that explains why it appears that no one is out there.
00:06:27.820 It may be that we humans are on a typical trajectory, and are destined to be erased.
00:06:33.160 And soon.
00:06:35.660 But even if there is a filter, and even if just the tiniest percentage of civilizations have
00:06:41.300 been able to get through it and continue advancing without tripping over themselves, pretty soon
00:06:46.620 they'd have the knowledge of how to do monumentally big engineering projects, if they so choose.
00:06:53.040 We should see evidence of their continued existence, right?
00:06:57.720 So let's make sure we're imagining this filter analogy correctly.
00:07:02.180 Maybe a single filter isn't quite right.
00:07:05.760 Maybe we should be picturing thicker and thicker filter layers stacked one on top of the other.
00:07:10.680 Maybe there would be a moment when you really do leave them all behind.
00:07:16.200 That point of permanent safety would be when a civilization achieves a kind of knowledge so
00:07:21.360 powerful that it understands how to survive and avoid its own self-destruction perpetually,
00:07:26.620 and really does get through all of those filters.
00:07:30.520 But there does seem to be a kind of natural sequential order of the types of knowledge that
00:07:35.300 a civilization is likely to discover.
00:07:37.000 For example, it is difficult to imagine discovering how to build flying machines before building
00:07:42.340 wheelbarrows, though even that is not a guarantee.
00:07:46.440 Is our human order of scientific discovery typical, or an outlier?
00:07:51.060 It seems that harnessing energy is key to both creative and destructive power, and that they
00:07:57.380 must go hand in hand.
00:07:59.320 You could imagine the kind of knowledge it would take to pull off a huge engineering project,
00:08:03.620 like building a device that could siphon all of the energy from a black hole at the center
00:08:08.300 of a galaxy, for example.
00:08:10.360 And you can recognize that this same knowledge would presumably also contain the power to destroy
00:08:15.720 the civilization which discovered it, either maliciously or accidentally.
00:08:20.400 And the odds of avoiding that fate trend towards impossible over a short amount of time.
00:08:26.040 No one makes it through.
00:08:27.140 This is the great filter answer to Enrico Fermi, that there are countless civilizations out there
00:08:34.220 that blip out of existence almost as quickly as they achieve the technical prowess to harness
00:08:39.620 even a small percentage of the potential energy available to them.
00:08:43.920 Is this what happens out there?
00:08:46.260 Does this answer Fermi?
00:08:47.780 How many filters are there?
00:08:51.080 We humans are a relatively young species, and already we seem to be discovering a few technologies
00:08:56.580 that have some filter potential.
00:08:59.460 If we get through our current challenges, are we bound to just discover another, even more
00:09:04.600 difficult technology to survive alongside?
00:09:07.520 Is this tenable?
00:09:09.960 This compilation is going to be a tour of Sam's engagement with, and a close look at, the strongest
00:09:16.000 weapon of war we've created so far.
00:09:18.700 A weapon that might be a candidate for this great filter, or at least a very difficult
00:09:22.900 one.
00:09:24.040 Nuclear war.
00:09:27.180 The complete erasure and annihilation of civilization was a talent once thought to be reserved only
00:09:33.080 for the gods.
00:09:34.800 As a reminder of just how stark the moment was when we realized we may have that power in
00:09:39.640 our own hands, perhaps for the first time sensing that great filter on our horizon.
00:09:44.660 It's worth playing a haunting and now very famous audio clip which lays the realization
00:09:50.880 bare.
00:09:52.800 Upon witnessing a successful test detonation of a nuclear bomb south of Los Alamos, Robert
00:09:58.320 Oppenheimer, the physicist leading the Manhattan Project, recalls the scene and his thoughts.
00:10:03.940 We knew the world would not be the same.
00:10:10.160 Few people laughed.
00:10:14.240 Few people cried.
00:10:16.760 Most people were silent.
00:10:18.140 And I remembered the line from the Hindu scripture, the Bhagavad Gita.
00:10:29.960 Vishnu is trying to persuade the prince that he should do his duty.
00:10:39.100 And to impress him, takes on his multi-armed form and says, now I am become death, the destroyer
00:10:51.280 of worlds.
00:10:54.560 I suppose we all thought that one way or another.
00:10:57.300 Making sense of nuclear war and its existential threat is not the happiest of subjects.
00:11:06.460 And perhaps that's why most of us don't often look closely at the precariousness of
00:11:10.200 the situation we're in.
00:11:12.180 We experience a kind of cognitive dissonance that can act as a psychological barrier when
00:11:17.120 direct engagement with a known threat is just too destabilizing.
00:11:20.900 And more importantly, when the threat seems to defy a readily available remedy.
00:11:26.940 If there is a great filter out there, what good would it do to worry about it?
00:11:31.900 Who would want to think about this stuff?
00:11:34.780 Well, Sam Harris is one of those people who forces himself to.
00:11:38.960 Though that wasn't always the case.
00:11:41.000 Before we get to the guests and conversations that Sam has hosted on Making Sense, we should
00:11:45.560 remind ourselves of the analogy that we're using to approach this subject.
00:11:48.680 A filter is not a wall.
00:11:51.860 A filter, no matter how dense, does permit some things to get through.
00:11:57.040 So even if the odds are stacked against us, the only game in town appears to be trying
00:12:02.080 to improve our chances of getting to the other side.
00:12:06.500 We're going to start with Sam himself as he describes his re-engagement with this threat.
00:12:11.740 It's his attempt to shake us out of our collective moral slumber, to help us notice our circumstances
00:12:17.400 when it comes to the nuclear question.
00:12:20.360 He reads here from a particular book which was instrumental to his paying close attention
00:12:24.800 to this subject.
00:12:26.440 Sam is speaking in July of 2020, in the introduction of episode 210.
00:12:30.740 We're coming up on the 75th anniversary of the atomic bomb in about a week.
00:12:39.760 July 16th is the 75th anniversary of Trinity, the explosion of the first atomic bomb at the
00:12:47.300 Trinity test site in Alamogordo, New Mexico.
00:12:49.440 Whatever the merits or necessity of our building the bomb, and even using it to end the war with
00:12:57.020 Japan, that can certainly be debated.
00:12:59.720 But what is absolutely clear to anyone who studies the ensuing 75 years, is that these were 75 years
00:13:09.080 of folly, nearly suicidal folly.
00:13:14.760 And this has been a chapter in human history of such reckless stupidity, that it's been a kind of
00:13:23.560 moral oblivion, and there's no end in sight.
00:13:29.120 Rather, we have simply forgotten about it.
00:13:32.260 We have forgotten about the situation we are in every day of our lives.
00:13:37.880 This is really difficult to think about, much less understand.
00:13:44.120 The enormity of our error here is stupefying, in some basic sense.
00:13:51.600 It's like we were convinced 75 years ago to rig all of our homes and buildings to explode.
00:14:01.620 And then we just got distracted by other things, right?
00:14:05.820 And most of us live each day totally unaware that the status quo is as precarious as it
00:14:13.740 in fact is.
00:14:14.940 So when the history of this period is written, our descendants will surely ask, what the hell
00:14:21.840 were they thinking?
00:14:23.160 And we are the people of whom that question will be asked.
00:14:28.020 That is, if we don't annihilate ourselves in the meantime, what the hell are we thinking?
00:14:36.180 What are our leaders thinking?
00:14:39.080 We have been stuck for nearly three generations in a posture of defending civilization, or imagining
00:14:48.220 that we are, by threatening to destroy it at any moment.
00:14:53.660 And given our capacity to make mistakes, given the increasing threat of cyber attack, the status
00:15:02.460 quo grows less tenable by the day.
00:15:05.920 The first book I ever read about the prospect of nuclear war was Jonathan Schell's The Fate
00:15:11.240 of the Earth, which originally came out in the New Yorker in 1982.
00:15:17.260 If you haven't read it, it's a beautifully written and amazingly sustained exercise in
00:15:23.960 thinking about the unthinkable.
00:15:26.900 And I'd like to read you a few passages to give you a sense of it.
00:15:30.800 This is from the beginning, starting a few sentences in.
00:15:33.920 These bombs were built as weapons for war, but their significance greatly transcends war
00:15:40.060 and all its causes and outcomes.
00:15:42.360 They grew out of history, yet they threatened to end history.
00:15:46.580 They were made by men, yet they threatened to annihilate man.
00:15:51.460 They are a pit into which the whole world can fall, a nemesis of all human intentions, actions,
00:15:58.080 and hopes.
00:15:58.520 Only life itself, which they threatened to swallow up, can give the measure of their
00:16:04.660 significance.
00:16:05.980 Yet in spite of the immeasurable importance of nuclear weapons, the world has declined,
00:16:10.880 on the whole, to think about them very much.
00:16:13.320 We have thus far failed to fashion, or even to discover within ourselves, an emotional or
00:16:18.940 intellectual or political response to them.
00:16:21.560 This peculiar failure of response, in which hundreds of millions of people acknowledge the
00:16:26.840 presence of an immediate, unremitting threat to their existence, and to the existence of
00:16:31.420 the world they live in, but do nothing about it, a failure in which both self-interest and
00:16:37.160 fellow-feeling seem to have died, has itself been such a striking phenomenon that it has to
00:16:43.560 be regarded as an extremely important part of the nuclear predicament, as this has existed
00:16:47.920 so far.
00:16:49.700 End quote.
00:16:50.180 So there, Shell gets at the strangeness of the status quo, where the monster is in the
00:16:57.220 room, and yet we have managed to divert our attention from it.
00:17:02.940 And I love this point he makes.
00:17:04.560 It's a violation both of self-interest and fellow-feeling.
00:17:09.800 Our capacity to ignore this problem somehow seems psychologically impossible.
00:17:14.680 It's a subversion of, really, all of our priorities, both personal and with respect to our ethical
00:17:22.260 commitments to others.
00:17:24.180 A little bit later on, he talks about this state of mind a little more.
00:17:28.940 Because denial is a form of self-protection, if only against anguishing thoughts and feelings,
00:17:35.120 and because it contains something useful, and perhaps even, in its way, necessary to life,
00:17:40.240 anyone who invites people to draw aside the veil and look at the peril face-to-face is
00:17:46.380 at risk of trespassing on inhibitions that are part of our humanity, I hope in these reflections
00:17:52.180 to proceed with the utmost possible respect for all forms of refusal to accept the unnatural
00:17:57.800 and horrifying prospect of a nuclear holocaust.
00:18:02.020 So there, Shell is being more tactful than I'm being here, admitting that this denial is
00:18:07.880 on some level necessary to get on with life, but it is nonetheless crazy.
00:18:14.340 Year after year after year, we are running the risk of mishap here.
00:18:20.480 And whatever the risk, you can't keep just rolling the dice.
00:18:27.380 And so it seems time to ask, when is this going to end?
00:18:31.860 To begin the exploration of clips, we're going to hear from a philosopher and author who spends
00:18:41.260 a lot of time looking at existential risk, Nick Bostrom.
00:18:46.140 Bostrom has a talent for painting colorful analogies to prime our thinking about these
00:18:50.600 difficult topics.
00:18:52.220 One of his analogies that brings the great filter hypothesis into vivid clarity goes like
00:18:57.420 this.
00:18:57.820 Imagine a giant urn filled with marbles, which are mostly white in color, but range in shades
00:19:05.920 of gray.
00:19:07.460 Each of these marbles represents a kind of knowledge that we can pluck from nature and apply
00:19:11.940 technologically.
00:19:13.780 Picture reaching in and pulling out the knowledge of how to make a hairdryer, or the automobile,
00:19:19.440 or a toaster oven, or even something more abstract, like the knowledge of how to alter the genome
00:19:24.520 to choose eye color or some other aesthetic purpose.
00:19:28.300 Reaching into this urn, rummaging around and pulling out a marble, is the act of scientific
00:19:33.420 exploration and achievement.
00:19:36.020 Now, white marbles represent the kinds of knowledge that carry with them very little existential
00:19:41.120 threat.
00:19:42.400 Maybe pulling a marble like this would be gaining knowledge of how to manufacture glass.
00:19:46.760 That's a marble that we pulled out of the urn around 3500 BCE in Egypt.
00:19:53.080 That little bit of knowledge mostly improves life on Earth for humans and has all kinds
00:19:57.820 of lovely applications for food preservation, artistic expression, window manufacturing, eyesight
00:20:04.460 correction, and much more.
00:20:06.940 It likely carries with it some kind of minor threat as well, though it's difficult to imagine
00:20:12.180 how that specific advancement would inherently threaten the existence of the species.
00:20:16.780 You can imagine thousands of white marbles that feel as benign, positive, and generally
00:20:21.760 harmless as this one.
00:20:23.940 But Bostrom asks us to consider what a black marble would be.
00:20:28.840 Is there some kind of knowledge that, when plucked out of nature, is just so powerful that
00:20:33.980 every civilization is eradicated shortly after pulling it from the urn?
00:20:38.240 Are there several of these black marbles hiding in the urn somewhere?
00:20:41.320 Are we bound to grab one eventually?
00:20:45.140 Sam points out that it has generally been the attitude of science to just pull out as
00:20:49.660 many marbles as fast as we possibly can and let everyone know about it the moment you
00:20:54.120 have a good grip.
00:20:55.480 And we operate as if the black marbles aren't in the urn, as if they simply don't exist.
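[Editor's note: To see why drawing from the urn as fast as possible is so unnerving, here is a toy simulation of the metaphor. The per-draw probability of a black marble and the draw counts are arbitrary assumptions chosen only to illustrate how a small, fixed risk compounds.]

```python
import random

# Toy model of Bostrom's urn: each technological discovery is one draw, and with
# some small probability p the marble drawn is "black" (civilization-ending).
# Both p and the draw counts below are arbitrary assumptions for illustration.

def simulated_survival(num_draws: int, p_black: float, trials: int = 10_000) -> float:
    """Fraction of simulated civilizations that never draw a black marble."""
    rng = random.Random(0)
    survivors = 0
    for _ in range(trials):
        if all(rng.random() >= p_black for _ in range(num_draws)):
            survivors += 1
    return survivors / trials

if __name__ == "__main__":
    p = 0.005  # assumed chance that any given draw is a black marble
    for draws in (10, 100, 1000):
        sim = simulated_survival(draws, p)
        exact = (1 - p) ** draws  # closed form the simulation approximates
        print(f"{draws:>4} draws: simulated {sim:.1%}, exact {exact:.1%}")
```

[The simulation just confirms the closed form (1 - p)^n: any fixed, nonzero chance per discovery drives long-run survival toward zero, which is the intuition behind treating some technologies as near-black marbles.]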
00:21:01.620 What shade of gray was the marble that represented the moment we obtained the knowledge of how to
00:21:06.880 split the nucleus of a uranium-235 atom and trigger and target its fission chain reaction
00:21:12.620 in a warhead?
00:21:14.160 Was that a black marble?
00:21:16.380 That will be a question we consider throughout this episode, as well as the specific political
00:21:21.340 entanglements which relate to this problem, and the alliances and personalities which affected
00:21:26.040 it in the recent past.
00:21:27.220 So, let's start out with Nick Bostrom and Sam engaging on the topic of existential threat
00:21:33.200 in general as we move towards the nuclear question.
00:21:36.940 Here, you'll hear Bostrom lay out his vulnerable world hypothesis and draw out the metaphor that
00:21:42.180 we introduced.
00:21:43.680 This is from episode 151, Will We Destroy the Future?
00:21:47.580 Let's start with the vulnerable world hypothesis.
00:21:53.300 What do you mean by that phrase?
00:21:56.100 Well, the hypothesis is, roughly speaking, that there is some level of technological development
00:22:01.840 at which the world gets destroyed by default, as it were.
00:22:06.800 So then, what does it mean to get destroyed by default?
00:22:10.120 I define something I call the semi-anarchic default condition, which is a condition in which there
00:22:16.340 are a wide range of different actors with a wide range of different human recognizable motives.
00:22:22.900 But then, more importantly, two conditions hold.
00:22:26.200 One is that there is no very reliable way of resolving global coordination problems, and the other is that
00:22:31.980 we don't have a very extremely reliable way of preventing individuals from committing actions
00:22:39.240 that are extremely strongly disapproved of by a great majority of other people.
00:22:43.080 So, maybe it's better to come at it through a metaphor.
00:22:47.260 Yeah, the urn.
00:22:48.680 The urn metaphor.
00:22:49.740 So, what if, in this urn, there is a black ball in there somewhere?
00:22:55.280 Like, is there some possible technology that could be such that whichever civilization discovers
00:23:01.500 it, invariably gets destroyed?
00:23:04.840 And what if there is such a black ball in the urn, though?
00:23:08.020 I mean, we can ask about how likely that is to be the case.
00:23:11.640 We can also look at what is our current strategy with respect to this possibility.
00:23:16.480 And it seems to me that currently our strategy, with respect to the possibility that the urn
00:23:22.520 might contain a black ball, is simply to hope that it doesn't.
00:23:25.700 And so, we keep extracting balls as fast as we can.
00:23:28.500 We have become quite good at that, but we have no ability to put balls back into the urn.
00:23:33.880 We cannot uninvent our inventions.
00:23:35.640 So, the first part of this paper tries to identify what are the types of ways in which the world
00:23:46.080 could be vulnerable, the types of ways in which there could be some possible black ball technology
00:23:50.720 that we might invent.
00:23:52.460 And the first and most obvious type of way the world could be vulnerable is if there is
00:23:57.680 some technology that greatly empowers individuals to cause sufficiently large quantities of destruction.
00:24:05.300 So, motivate this with a, or illustrate it by means of a historical counterfactual.
00:24:12.160 We, in the last century, discovered how to split the atom and release the energy that
00:24:18.220 is contained within, some of the energy that's contained within the nucleus.
00:24:24.040 And it turned out that this is quite difficult to do.
00:24:28.320 You need special materials.
00:24:29.680 You need plutonium or highly enriched uranium.
00:24:31.380 So, really, only states can do this kind of stuff to produce nuclear weapons.
00:24:37.120 But what if it had turned out that there had been an easier way to release the energy of
00:24:41.440 the atom?
00:24:41.840 What if you could have made a nuclear bomb by baking sand in the microwave oven or something
00:24:47.040 like that?
00:24:48.520 So, then that might well have been the end of human civilization in that it's hard to see
00:24:54.260 how you could have cities, let's say, if anybody who wanted to could destroy millions of people.
00:25:01.020 So, maybe we were just lucky.
00:25:02.680 Now, we know, of course, that it is physically impossible to create an atomic detonation by
00:25:09.380 baking sand in the microwave oven.
00:25:10.740 But before you actually did the relevant nuclear physics, how could you possibly have known how
00:25:14.620 it would turn out?
00:25:15.220 Well, let's just spell out that because I want to conserve everyone's intuitions as we
00:25:21.000 go on this harrowing ride to your terminus here because the punchline of this paper is
00:25:27.420 fairly startling when you get to what the remedies are.
00:25:31.420 So, why is it that civilization could not endure the prospect of what you call easy nukes?
00:25:41.580 If it were that easy to create a Hiroshima-level blast or beyond, why is it just a foregone conclusion
00:25:50.440 that that would mean the end of cities and perhaps the end of most things we recognize?
00:25:57.100 I think foregone conclusion is maybe a little too strong.
00:25:59.160 It depends a little bit on the exact parameters we plug in.
00:26:03.400 I mean, the intuition is that in a large enough population of people, like amongst every population
00:26:10.240 with millions of people, there will always be a few people who, for whatever reason, would
00:26:15.840 like to kill a million people or more if they could.
00:26:19.600 Whether they are just crazy or evil or they have some weird ideological doctrine or they're
00:26:27.260 trying to extort other people or threaten other people, that just humans are very diverse
00:26:33.420 and in a large enough set of people that will, for practically any desire, you can specify
00:26:39.160 there will be somebody in there that has that.
00:26:40.940 So, if each of those destructively inclined people would be able to cause a sufficient amount
00:26:45.880 of destruction, then everything would get destroyed.
00:26:48.740 Now, if one imagines this actually playing out in history, then to tell whether all of civilization
00:26:58.300 really would get destroyed or some horrible catastrophe short of that would happen instead
00:27:02.340 would depend on various things.
00:27:04.100 Like just what kind of nuclear weapon would it be like a small kind of Hiroshima type of
00:27:09.280 thing or a thermonuclear bomb?
00:27:10.860 How easy would it be?
00:27:12.020 Could literally anybody do it like in five minutes?
00:27:14.260 Or would it take some engineer working for half a year?
00:27:18.940 And so, depending on exactly what values you pick for those and some other variables, you
00:27:24.380 might get scenarios ranging from very bad to kind of existential catastrophe.
00:27:31.220 But the point is just to illustrate that there historically have been these technological transitions
00:27:38.320 where we have been lucky in that destructive capability we discovered were hard to wield.
00:27:46.660 You know, and maybe a plausible way in which this kind of very highly destructive capability
00:27:52.520 could become easy to wield in the future would be through developments in biotechnology that
00:27:57.640 maybe makes it easy to create designer viruses and so forth that don't require high amounts
00:28:04.100 of energy or special difficult materials and so forth.
00:28:07.740 And there you might have an even stronger case.
00:28:09.960 Like so with a nuclear weapon, like one nuclear weapon can only destroy one city, right?
00:28:14.760 Where the viruses and stuff potentially can spread.
00:28:18.700 So...
00:28:19.220 Yeah.
00:28:19.520 And we should remind people that we're in an environment now where people talk with some
00:28:26.460 degree of flippancy about the prospect of every household one day having something like
00:28:33.880 a desktop printer that can print DNA sequences, right?
00:28:37.840 That everyone becomes their own bespoke molecular biologist and you can just print your own medicine
00:28:44.900 at home or your own genetic intervention at home.
00:28:48.360 And this stuff really is, you know, the recipe under those conditions, the recipe to weaponize
00:28:54.560 the 1918 flu could just be sent to you like a PDF.
00:28:59.580 It's not beyond the bounds of plausible sci-fi that we could be in a condition where it really
00:29:06.320 would be within the power of one nihilistic or, you know, otherwise ideological person to
00:29:12.700 destroy the lives of millions and even billions in the wrong case.
00:29:16.360 Yeah, or sent as a PDF, or you could just download it from the internet.
00:29:20.640 So the full genomes of a number of highly virulent organisms are in the public domain
00:29:26.420 and just ready to download.
00:29:29.140 So, yeah.
00:29:29.920 So, I mean, we could talk more about that.
00:29:32.500 I think that I would rather see a future where DNA synthesis was a service provided by a few
00:29:37.560 places in the world where it would be able, if the need arose, to exert some control, some
00:29:42.900 screening rather than something that every lab needs to have its own separate little machine.
00:29:48.180 Yeah.
00:29:48.360 So that's, these are examples of type one vulnerability, like where the problem really
00:29:54.000 arises from individuals becoming too empowered in their ability to create massive amounts of
00:30:00.380 harm.
00:30:01.580 Now, so there are other ways in which the world could be vulnerable that are slightly more
00:30:06.420 subtle, but I think also worth bearing in mind.
00:30:09.380 So these have to do more about the way that technological developments could change the
00:30:14.620 incentives that different actors face.
00:30:16.960 We can again return to the nuclear history case for an illustration of this.
00:30:23.120 And actually, this is maybe the closest to a black ball we've gotten so far with thermonuclear
00:30:29.040 weapons and the big arms race during the Cold War led to something like 70,000 warheads
00:30:36.660 being on hair trigger alert.
00:30:37.820 So it looks like when we can see some of the archives of this history that have recently
00:30:45.020 opened up, that there were a number of close calls.
00:30:48.300 The world actually came quite close to the brink on several occasions, and we might have
00:30:52.740 been quite lucky to get through.
00:30:54.100 It might not have been that we were in such a stable situation, which rather might have
00:30:59.380 been that this was a kind of slightly black ball-ish technology and we just had enough luck
00:31:04.580 to get through.
00:31:05.060 But you could imagine it could have been worse.
00:31:07.880 You could imagine properties of this technology that would have created stronger incentives,
00:31:11.700 say, for a first strike so that you would have crisis instability.
00:31:16.540 If it had been easier, let us say, in a first strike to take out all the adversary's nuclear
00:31:21.260 weapons, then it might not have taken a lot in a crisis situation to just have enough fear
00:31:30.880 that you would have to strike first for fear that the adversary otherwise would do the
00:31:34.820 same to you.
00:31:35.700 Yeah.
00:31:36.400 Remind people that in the aftermath of the Cuban Missile Crisis, the people who were closest
00:31:41.420 to the action felt that the odds of an exchange had been something like a coin toss.
00:31:47.040 It was something like 30 to 50 percent.
00:31:49.220 And what you're envisioning is a situation where what you describe as safe first strike, which
00:31:55.340 is there's just no reasonable fear that you're not going to be able to annihilate your enemy
00:32:00.500 provided you strike first.
00:32:02.220 That would be a far less stable situation.
00:32:05.040 And it's also it's also forgotten that the status quo of mutually assured destruction was
00:32:11.160 actually a step towards stability.
00:32:13.340 I mean, it was before the Russians had or the Soviets had their own arsenals.
00:32:18.920 There was a greater game theoretic concern that we would be more tempted to use ours because
00:32:26.240 nuclear deterrence wasn't a thing yet.
00:32:29.280 Yeah.
00:32:29.460 So some degree of stabilizing influence, although, of course, maybe at the expense of the outcome
00:32:33.920 being even worse, if both sides were destroyed, then the safe first strike might just be one
00:32:39.200 side being destroyed.
00:32:40.800 Right.
00:32:40.980 Yeah.
00:32:41.820 And so if it had been possible, say, with one nuclear warhead to wipe out the enemy's nuclear
00:32:47.620 warheads within a wider radius, then that's one such case.
00:32:51.160 Or if it had been easier to detect nuclear submarines so that you could be more confident
00:32:57.760 that you had actually been able to target all of the other side's nuclear capability,
00:33:03.920 that could have resulted in a more unstable arms race, one that would, with a sort of
00:33:11.140 higher degree of certainty, result in the weapons being used.
00:33:16.240 And you can consider other possible future ways in which, say, the world might find itself
00:33:20.840 locked into arms race dynamics.
00:33:22.740 Or it's not that anybody wants to destroy the world, but it might just be very hard to come
00:33:28.760 to an agreement that avoids the arms being built up and then used in a crisis.
00:33:35.320 Nuclear weapon reduction treaties, you know, there are concerns about verification.
00:33:41.000 But in principle, you can kind of have, like, nuclear weapons are quite big and they use very
00:33:45.380 special materials.
00:33:46.440 There might be other military technologies where, even if both sides agree that they
00:33:50.200 wanted to just ban this military technology, it might just, the nature of the technology
00:33:55.860 might be such that it would be very difficult or impossible to enforce.
00:34:01.160 In that exchange, you heard Bostrom mention how lucky we may have gotten, in that it turns
00:34:06.420 out, nuclear weapons are not very easy to create.
00:34:09.740 So, even if this technology turns out to be a nearly black ball, and perhaps the darkest
00:34:16.340 one we've yet pulled out of the urn, we can examine our treatment of them as a dress
00:34:21.040 rehearsal with incredibly high stakes.
00:34:24.360 Bostrom also mentioned something in passing that's worth keeping in mind as we look closer
00:34:28.920 at the nuclear weapon question.
00:34:31.040 What he referred to as global coordination problems.
00:34:34.380 This is a concept sometimes used in economics and game theory, and it describes a situation
00:34:40.480 that would be best solved by everyone simultaneously moving in the same direction.
00:34:45.660 But of course, people can't be sure what's in anyone else's mind, and humans are famously
00:34:51.240 difficult to coordinate and synchronize in any case.
00:34:54.480 So often, these types of problems entrench themselves and worsen, even if most people agree that they
00:35:00.620 are incredibly harmful.
00:35:01.740 Another relevant feature of a coordination problem is that there's usually a strong disincentive
00:35:07.960 for first movers.
00:35:09.700 This can be applied to climate change, political revolutions, or even something like a great
00:35:14.760 number of people secretly desiring to quit social media, but not wanting to lose connections
00:35:19.680 or marketing opportunities.
00:35:21.920 Laying the global coordination problem framework onto disarmament of nuclear weapons is an easy
00:35:27.340 fit.
00:35:27.660 The first mover who dismantles their bombs may be at a huge disadvantage, even if everyone
00:35:33.720 privately agrees that we all ought to disarm.
00:35:36.720 In fact, as you also heard Bostrom point out, when thinking about nuclear war strategy, the
00:35:42.960 first strike is often aimed at decapitating the opponent's ability to strike back.
00:35:47.660 Of course, if your opponent has already willingly disarmed, say, in accordance with the mutual treaty,
00:35:53.180 while you have retained your weapons and only pretended to disarm, the effect is just as devastating.
00:35:59.520 So the coordination problem tends to persist.
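[Editor's note: As a rough sketch of the first-mover disincentive just described, here is a toy two-player disarmament game. The payoff numbers are invented for illustration and are not from the episode; they simply encode the idea that mutual disarmament is the best joint outcome while unilateral disarmament is the worst outcome for the disarmer.]

```python
# Toy disarmament dilemma: each side chooses to keep or dismantle its arsenal.
# Payoffs are (row player, column player); all numbers are invented for illustration.

KEEP, DISARM = "keep", "disarm"
PAYOFFS = {
    (KEEP,   KEEP):   (1, 1),  # costly, precarious standoff
    (KEEP,   DISARM): (4, 0),  # the side that kept its weapons dominates
    (DISARM, KEEP):   (0, 4),  # the first mover is left exposed
    (DISARM, DISARM): (3, 3),  # best joint outcome, but hard to reach
}

def best_response(opponent_choice: str) -> str:
    """Action that maximizes the row player's payoff against a fixed opponent choice."""
    return max((KEEP, DISARM), key=lambda action: PAYOFFS[(action, opponent_choice)][0])

if __name__ == "__main__":
    for opponent in (KEEP, DISARM):
        print(f"If the other side plays {opponent!r}, the best response is {best_response(opponent)!r}")
    # Both lines print 'keep': keeping weapons is dominant under these payoffs,
    # so the standoff persists even though mutual disarmament would leave both better off.
```

[Under these assumed payoffs, keeping weapons is a dominant strategy for each side, so even though both prefer mutual disarmament to the standoff, neither wants to move first. That is the coordination problem in miniature.]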
00:36:02.740 Now that we've laid some of the foundation to think about existential risk in general,
00:36:07.140 let's move to a conversation Sam had with a guest who looks very closely at the prospect
00:36:11.640 of nuclear war.
00:36:12.520 The guest is Fred Kaplan, and when Sam spoke with him, Kaplan had just published a book called
00:36:19.420 The Bomb.
00:36:21.180 But before we get to Kaplan, let's first listen to some of Sam's introduction to the conversation,
00:36:26.260 and let him do the work of trying to drag our attention to the unnerving reality of this
00:36:30.620 situation again.
00:36:32.660 He's going to bring us back to 1983, at a moment when the only thing standing between us
00:36:38.480 and nuclear Armageddon may have been a single person's intuition.
00:36:44.420 The doomsday clock was just advanced closer to midnight than it has been at any point in the last 75 years.
00:36:52.460 It now reads 100 seconds to midnight.
00:36:56.120 Now, whether you put much significance in that warning,
00:37:01.100 just take a moment to consider that the people who focus on this problem
00:37:05.380 are as worried now as they've ever been.
00:37:09.460 But do you think about this?
00:37:11.840 And if I were to ask how long it's been since you worried that you might
00:37:14.780 have some serious illness,
00:37:18.140 or that your kids might,
00:37:20.300 or how long has it been since you've worried about
00:37:22.640 being the victim of crime,
00:37:24.580 or worried about dying in a plane crash,
00:37:27.860 it probably hasn't been that long.
00:37:31.160 It might have happened last week, even.
00:37:33.720 But I would wager that very few people listening to this podcast
00:37:38.580 have spent any significant time feeling the implications of what is manifestly true.
00:37:44.580 All of us are living under a system of self-annihilation
00:37:50.720 that is so diabolically unstable
00:37:53.560 that we might stumble into a nuclear war based solely on false information.
00:37:58.600 In fact, this has almost happened on more than one occasion.
00:38:04.340 Do you know the name Stanislav Petrov?
00:38:07.020 He should be one of the most famous people in human history.
00:38:10.920 And yet he's basically unknown.
00:38:12.720 He was a lieutenant colonel in the Soviet Air Defense Forces
00:38:16.660 who is widely believed to be almost entirely responsible
00:38:21.840 for the fact that we didn't have World War III in the year 1983.
00:38:27.280 This was at the height of the Cold War
00:38:30.460 and the Soviet Union had just mistaken a Korean passenger jet,
00:38:36.540 Flight 007, for a spy plane
00:38:39.120 and shot it down after it strayed into Siberian airspace.
00:38:45.400 And the U.S. and our allies were outraged over this
00:38:50.260 and on high alert.
00:38:52.220 In fact, both the U.S. and the Soviet Union
00:38:55.100 had performed multiple nuclear tests that month.
00:38:59.800 And so it was in this context
00:39:01.120 in which Soviet radar reported
00:39:03.500 that the U.S. had launched five ICBMs
00:39:06.700 at targets within the Soviet Union.
00:39:10.780 And the data were checked and rechecked
00:39:14.060 and there was apparently no sign that they were in error.
00:39:18.320 And Stanislav Petrov stood at the helm.
00:39:22.220 Now, he didn't have the authority
00:39:23.640 to launch a retaliatory strike himself.
00:39:26.400 His responsibility was to pass the information
00:39:28.720 up the chain of command.
00:39:30.800 But given the protocols in place,
00:39:34.220 it's widely believed that had he passed that information along,
00:39:38.020 a massive retaliatory strike against the United States
00:39:41.020 would have been more or less guaranteed.
00:39:43.440 And of course, upon seeing those incoming missiles,
00:39:47.660 of which there would likely have been hundreds,
00:39:49.420 if not thousands,
00:39:51.480 we would have launched a retaliatory strike of our own.
00:39:55.520 And that would have been game over.
00:39:57.820 Hundreds of millions of people
00:39:59.340 would have died more or less immediately.
00:40:02.680 Now, happily, Petrov declined
00:40:04.840 to pass the information along.
00:40:07.660 And his decision boiled down to mere intuition.
00:40:11.880 The protocol demanded that he pass the information along
00:40:16.300 because it showed every sign of being a real attack.
00:40:20.340 But Petrov reasoned that if the United States
00:40:23.160 were really going to launch a nuclear first strike,
00:40:25.960 they would do it with more than five missiles.
00:40:29.060 Five missiles doesn't make a lot of sense.
00:40:32.120 But it's also believed that any of the other people
00:40:34.300 who could have been on duty that night,
00:40:36.660 instead of Petrov,
00:40:38.120 would have surely passed this information
00:40:40.460 up the chain of command.
00:40:42.500 And killing a few hundred million people
00:40:44.580 and thereby wiping out the United States
00:40:47.280 and Russia,
00:40:49.720 as you'll soon hear,
00:40:51.220 our retaliatory strike protocol
00:40:52.920 entailed wiping out
00:40:54.780 Eastern Europe and China
00:40:56.400 for good measure.
00:40:58.040 This could have well ended human civilization.
00:41:01.440 So when you think about
00:41:03.040 human fallibility
00:41:04.400 and errors of judgment
00:41:06.080 and realize that this ability
00:41:08.140 to destroy the species
00:41:09.760 is at all times,
00:41:12.620 every minute of the day,
00:41:14.580 in the hands of utterly imperfect people.
00:41:17.540 And in certain cases,
00:41:19.260 abjectly imperfect people.
00:41:21.460 It should make the hair
00:41:22.480 stand up on the back of your neck.
00:41:24.500 And the infrastructure
00:41:25.920 that is maintaining
00:41:27.720 all of these systems
00:41:29.320 on hair trigger alert
00:41:30.920 is aging.
00:41:32.340 And in many cases,
00:41:33.580 run on computers so old
00:41:35.160 that any self-respecting business
00:41:37.280 would be embarrassed
00:41:38.480 to own them.
00:41:40.320 And yet,
00:41:40.680 for some reason,
00:41:42.120 almost no one is thinking
00:41:43.260 about this problem.
00:41:46.820 At the end of this compilation,
00:41:48.680 we'll offer some recommended reading
00:41:50.300 and viewing,
00:41:51.560 including a documentary
00:41:52.480 which focuses on that perilous moment
00:41:54.460 with Petrov.
00:41:56.440 Sam goes on in that introduction
00:41:58.120 to outline a few more
00:41:59.620 absurd instances
00:42:00.600 of close calls
00:42:01.660 involving accidental war game codes
00:42:03.800 being loaded into computers
00:42:05.040 or misinterpreted radar signals
00:42:07.280 which nearly sent the bombs flying.
00:42:09.760 So,
00:42:10.400 now let's hear more
00:42:11.380 from that episode.
00:42:12.860 We're going to hear
00:42:13.560 Kaplan and Sam
00:42:14.540 discuss Kaplan's writing
00:42:15.820 about the Cuban Missile Crisis
00:42:17.200 of 1962,
00:42:19.320 arguably the first whiff
00:42:20.800 that humanity got
00:42:21.780 of the genuine prospect
00:42:23.080 of nuclear war.
00:42:25.020 If you need a history refresher
00:42:26.800 on the events of 1962,
00:42:28.900 we'll recommend a documentary
00:42:30.100 in the outro
00:42:30.820 of this compilation.
00:42:32.300 And,
00:42:32.780 of course,
00:42:33.260 Kaplan's book as well.
00:42:36.000 For this clip,
00:42:37.080 you'll just need to recall
00:42:38.220 that at the tensest moment
00:42:39.500 of the standoff,
00:42:40.840 there were hundreds
00:42:41.600 of Soviet nuclear warheads
00:42:43.320 pointed at the U.S.
00:42:44.800 on launch pads
00:42:45.700 stationed in Fidel Castro's Cuba,
00:42:48.080 just 90 miles
00:42:49.060 off of the United States coast.
00:42:51.060 And the United States
00:42:52.040 had a far greater number
00:42:53.200 of missiles
00:42:53.760 fixed on Soviet targets.
00:42:56.160 Secret negotiations
00:42:57.200 were underway
00:42:57.880 by the leaders
00:42:58.540 of all three nations involved
00:43:00.120 to try to avert
00:43:01.340 World War III
00:43:02.140 and save face
00:43:03.200 in front of their
00:43:03.640 own populations.
00:43:06.160 So,
00:43:06.880 here is Sam
00:43:07.600 with Fred Kaplan
00:43:08.480 from episode 186,
00:43:10.900 an episode simply titled
00:43:12.420 The Bomb.
00:43:13.260 In your book,
00:43:17.280 you report facts
00:43:18.820 about the Cuban Missile Crisis
00:43:20.000 that were not widely known
00:43:21.960 and were actually
00:43:22.900 systematically concealed
00:43:25.260 to some effect.
00:43:27.380 I may perhaps go into that
00:43:28.300 for a second
00:43:28.700 because it gave us a sense
00:43:30.300 that bluffing
00:43:31.860 on the brink
00:43:32.900 of nuclear war
00:43:33.800 was a successful strategy
00:43:36.460 because people thought
00:43:37.240 that that's what had happened,
00:43:38.380 that he just basically
00:43:39.160 stared Khrushchev down
00:43:41.120 and, you know,
00:43:42.560 Khrushchev blinked,
00:43:43.840 but that's not quite
00:43:44.540 what happened.
00:43:45.400 That's not what happened.
00:43:46.420 Most of us do know now
00:43:47.980 because it was revealed
00:43:48.940 20 years after the fact
00:43:50.420 that, in fact,
00:43:52.140 on the final day
00:43:53.060 of the crisis,
00:43:54.940 Khrushchev proposed
00:43:56.320 a deal,
00:43:57.520 a secret deal.
00:43:58.860 I will take out
00:43:59.860 my missiles from Cuba
00:44:01.060 if you,
00:44:02.580 United States,
00:44:03.860 take out your
00:44:04.640 very similar missiles
00:44:05.700 from Turkey.
00:44:07.080 And Kennedy took the deal.
00:44:08.460 Now, what isn't generally known,
00:44:10.020 and I don't know
00:44:10.540 why it isn't known
00:44:11.400 because you can listen
00:44:12.260 to this whole exchange
00:44:13.420 on tapes
00:44:14.160 that were declassified
00:44:15.720 20 years ago
00:44:16.620 but that you will read about
00:44:18.240 in maybe two or three
00:44:20.040 other books
00:44:20.740 if that many,
00:44:22.160 but Kennedy reads
00:44:24.100 the proposal
00:44:25.000 and he says,
00:44:26.300 and, you know,
00:44:26.580 he secretly tape-recorded
00:44:28.080 all of this,
00:44:28.500 he goes,
00:44:29.200 well, this seems like
00:44:29.980 a pretty fair deal.
00:44:31.540 And everybody around
00:44:33.120 the table,
00:44:33.780 all of his advisors,
00:44:34.920 not just the generals,
00:44:36.360 but the civilians too,
00:44:37.520 Bobby Kennedy,
00:44:38.720 Robert McNamara,
00:44:40.300 McGeorge Bundy,
00:44:41.840 all these paragons
00:44:43.020 of good sense and reason,
00:44:44.780 feverishly opposed this deal.
00:44:47.540 NATO will be destroyed,
00:44:48.960 the Turks will be humiliated,
00:44:50.740 our credibility
00:44:51.420 will be lost forever.
00:44:53.900 And, you know,
00:44:54.700 Kennedy let them talk
00:44:55.860 and then, you know,
00:44:57.120 he said,
00:44:57.520 well, you know,
00:44:58.000 this was on a Saturday.
00:44:59.660 The following Monday,
00:45:01.180 they were,
00:45:01.660 the United States,
00:45:02.480 the military was scheduled
00:45:03.580 to start in the attack.
00:45:05.240 There were going to be
00:45:05.840 500 air sorties a day
00:45:07.720 against the missile silos,
00:45:09.760 missile sites in Cuba,
00:45:11.940 followed four days later
00:45:13.100 by an invasion.
00:45:14.780 And Kennedy took the secret deal.
00:45:17.620 He only told six people
00:45:19.120 about this, though.
00:45:20.700 And in fact,
00:45:21.160 he put out the myth
00:45:22.140 that there was no deal
00:45:23.240 because this was the height
00:45:25.000 of the Cold War.
00:45:25.720 It would look like appeasement.
00:45:27.240 One of the six people
00:45:28.240 that he did not tell
00:45:29.400 was his vice president,
00:45:31.160 Lyndon Johnson,
00:45:31.860 who therefore went
00:45:33.240 into the Vietnam War
00:45:34.440 convinced by the lesson
00:45:35.980 of Cuba,
00:45:36.620 the false lesson of Cuba,
00:45:38.580 that you don't negotiate.
00:45:40.560 You stare them down.
00:45:42.140 But here's what's even scarier.
00:45:44.460 We later learned,
00:45:45.780 this was not known
00:45:46.600 at the time,
00:45:47.760 that some of those missiles
00:45:49.100 already had nuclear warheads
00:45:50.680 loaded on them.
00:45:52.160 So, you know,
00:45:52.600 they could have been
00:45:53.100 launched on warning.
00:45:54.660 Another thing we didn't know
00:45:55.820 until much later
00:45:56.820 is that the Soviets
00:45:58.040 had secretly deployed
00:45:59.480 40,000 troops
00:46:01.020 on the island of Cuba,
00:46:03.160 some of them armed
00:46:04.400 with tactical nuclear weapons
00:46:05.920 to stave off
00:46:07.220 an anticipated
00:46:08.040 American invasion.
00:46:09.560 Therefore,
00:46:10.720 if anybody else
00:46:11.820 around that table
00:46:12.760 except John Kennedy
00:46:13.780 had been president,
00:46:15.100 or if he had said,
00:46:16.440 yeah, you're right,
00:46:17.500 this is a bad deal,
00:46:18.560 let's proceed with the plan,
00:46:20.380 then there would have been
00:46:21.820 a war with the Soviet Union
00:46:23.780 without any question.
00:46:25.660 Yeah, it's amazing.
00:46:26.920 And so in your book,
00:46:28.100 you report on the details
00:46:30.480 of these encounters
00:46:31.520 between each U.S. administration
00:46:34.440 and the war planners,
00:46:35.920 which are generally
00:46:36.620 the Air Force and the Navy,
00:46:38.660 and each incoming president,
00:46:41.000 you know,
00:46:41.160 whether we're talking about,
00:46:42.260 you know,
00:46:42.580 Kennedy and, you know,
00:46:43.680 his team with McNamara
00:46:44.920 or Nixon and Kissinger
00:46:46.460 or Clinton and Obama
00:46:48.580 and their teams,
00:46:49.640 each president
00:46:50.360 comes into these meetings
00:46:51.840 and for the first time
00:46:53.600 is told what our first strike
00:46:55.580 and second strike policies are.
00:46:57.440 And each one,
00:46:59.820 it sounds like,
00:47:00.600 comes away absolutely appalled
00:47:03.440 by what the doctrine actually is
00:47:05.660 and committed from that day
00:47:08.140 to changing it.
00:47:09.520 And yet,
00:47:10.440 each has found himself
00:47:11.820 more or less unable
00:47:13.320 to change it
00:47:13.980 in ways that fundamentally alter
00:47:16.160 the game-theoretic logic here.
00:47:19.060 I mean,
00:47:19.280 and these discussions
00:47:20.080 are like really out
00:47:21.740 of Dr. Strangelove.
00:47:22.720 The most preposterous scenes
00:47:24.900 in Dr. Strangelove
00:47:26.660 are no more comedic
00:47:28.660 than some of these exchanges
00:47:30.120 because these are plans
00:47:31.480 that call for
00:47:32.620 the annihilation
00:47:34.060 of hundreds of millions
00:47:35.220 of people
00:47:35.740 on both sides.
00:47:37.680 I mean,
00:47:37.820 ever since Kennedy,
00:47:38.540 we've been past the point
00:47:39.500 where a first strike
00:47:40.800 prevented the possibility
00:47:42.720 of a retaliatory strike
00:47:44.620 from the Soviet Union.
00:47:45.660 And so we're talking
00:47:46.520 about protocols
00:47:47.540 that are synonymous
00:47:49.220 with killing 150,
00:47:51.740 200 million people
00:47:52.760 on their side
00:47:53.940 and losing that many
00:47:56.060 on our side.
00:47:57.040 And for the longest time,
00:47:58.820 the protocol was
00:47:59.560 to annihilate China
00:48:01.260 and Eastern Europe,
00:48:02.660 whether they were even part
00:48:04.080 of the initial skirmish
00:48:05.340 with the Soviet Union.
00:48:06.220 Right.
00:48:07.220 The U.S. policy
00:48:08.660 throughout the 1950s
00:48:10.400 and into some of the 60s,
00:48:13.300 the policy,
00:48:14.340 and this wasn't just
00:48:15.040 the Strategic Air Command,
00:48:16.540 this was signed off on
00:48:18.400 by President Eisenhower
00:48:20.180 and the Joint Chiefs of Staff,
00:48:22.920 it was that
00:48:23.780 if the Soviet Union
00:48:25.200 attacked West Germany
00:48:26.840 or took over West Berlin,
00:48:30.280 and, you know,
00:48:30.680 this was at a time
00:48:31.480 in the late 50s,
00:48:32.480 early 60s
00:48:33.140 when we really didn't have
00:48:34.060 any conventional armies
00:48:35.280 in Europe,
00:48:36.220 but the plan
00:48:37.640 was that
00:48:38.960 at the outset
00:48:39.780 of the conflict
00:48:40.660 to unleash
00:48:42.800 our entire nuclear arsenal
00:48:44.600 at every target
00:48:46.920 in the Soviet Union,
00:48:47.980 the satellite nations
00:48:48.820 of Eastern Europe,
00:48:49.680 and as you point out,
00:48:50.960 China,
00:48:51.380 even if China
00:48:52.000 wasn't involved in the war,
00:48:53.800 and it was inquired,
00:48:55.440 well, how many people
00:48:56.140 is this going to kill
00:48:56.940 and the estimate was
00:48:57.860 about 285 million.
00:48:59.860 If you'd like to continue
00:49:00.960 listening to this conversation,
00:49:02.520 you'll need to subscribe
00:49:03.460 at samharris.org.
00:49:05.280 Once you do,
00:49:05.920 you'll get access
00:49:06.500 to all full-length episodes
00:49:07.780 of the Making Sense podcast,
00:49:09.420 along with other
00:49:10.020 subscriber-only content,
00:49:11.780 including bonus episodes
00:49:13.080 and AMAs
00:49:14.140 and the conversations
00:49:15.180 I've been having
00:49:15.760 on the Waking Up app.
00:49:17.300 The Making Sense podcast
00:49:18.220 is ad-free
00:49:19.080 and relies entirely
00:49:20.480 on listener support,
00:49:21.660 and you can subscribe now
00:49:23.140 at samharris.org.