Making Sense - Sam Harris - August 16, 2023


#330 — The Doomsday Machine


Episode Stats

Length

1 hour and 35 minutes

Words per Minute

148.3

Word Count

14,183

Sentence Count

642

Misogynist Sentences

4

Hate Speech Sentences

35


Summary

In this episode, I speak with Carl Robichaud, who co-leads Longview Philanthropy's program on nuclear weapons policy and co-manages its Nuclear Weapons Policy Fund. We discuss the new film Oppenheimer and its connection to the history of nuclear weapons, proliferation, and failures of containment. We also discuss the legacy of the Manhattan Project and the impact the first nuclear detonation, the 1945 Trinity test, had on the people who lived near the test site. And we talk about the ethics of dropping the atomic bombs on Hiroshima and Nagasaki, the Cuban Missile Crisis, the logic of deterrence, the war in Ukraine, growing tensions between the U.S. and China, and the role private citizens can play in mitigating nuclear risk. This episode is a PSA, which means there is no paywall. If you'd like to support the podcast, you can subscribe at samharris.org. The Waking Up Foundation will soon be giving to Longview's Nuclear Weapons Policy Fund, and if you'd like to support it as well, you can find the relevant link in the show notes.


Transcript

00:00:00.000 Welcome to the Making Sense Podcast.
00:00:23.720 This is Sam Harris.
00:00:26.440 Okay, today's episode is a PSA.
00:00:29.200 So, no paywall.
00:00:31.820 As always, if you want to support the podcast, you can subscribe at samharris.org.
00:00:38.960 Today I'm speaking with Carl Robichaud.
00:00:42.160 Carl co-leads Longview Philanthropy's program on nuclear weapons policy.
00:00:47.440 And he co-manages their Nuclear Weapons Policy Fund.
00:00:51.020 This is a fund to which the Waking Up Foundation will soon be giving a fair amount of money.
00:00:55.680 If you'd like to support it along with us, you can find the relevant link in the show notes in your podcast player.
00:01:02.900 For more than a decade, Carl led grant-making and nuclear security at the Carnegie Corporation of New York.
00:01:08.900 He also previously worked with the Century Foundation and the Global Security Institute,
00:01:14.080 where he focused on arms control, international security policy, and non-proliferation.
00:01:19.100 And the topic of this conversation is the ongoing threat of nuclear war.
00:01:25.120 We discussed the new film Oppenheimer, which I must say really is a masterpiece.
00:01:30.360 If you haven't seen it in a theater, and it's still playing in a theater near you,
00:01:34.940 I highly recommend that you see it.
00:01:37.120 This really is a film that benefits from the big screen.
00:01:40.020 We discussed the ethics of dropping the atomic bombs on Hiroshima and Nagasaki,
00:01:45.260 the Cuban Missile Crisis, and some of the false lessons we learned there,
00:01:49.580 the history and the future of nuclear proliferation,
00:01:52.340 the logic of deterrence,
00:01:54.320 our vulnerabilities to cyber attack,
00:01:57.280 the history of de-escalation,
00:01:59.740 the war in Ukraine,
00:02:01.620 war games,
00:02:03.280 the taboo around using nuclear weapons,
00:02:05.960 growing tensions between the U.S. and China,
00:02:08.000 artificial intelligence,
00:02:10.720 getting to nuclear zero,
00:02:12.560 the role of private citizens in mitigating nuclear risk,
00:02:15.840 and finally Longview Philanthropy's Nuclear Weapons Policy Fund,
00:02:19.940 which again I encourage everyone to support.
00:02:22.580 Unfortunately, this remains one of the biggest problems of our time,
00:02:26.280 one which we do not talk about or think about nearly enough.
00:02:30.560 So I hope you find this conversation useful.
00:02:33.620 I now bring you Carl Robichaud.
00:02:38.000 I am here with Carl Robichaud.
00:02:43.700 Carl, thanks for joining me.
00:02:45.480 Yeah, my pleasure.
00:02:46.420 I'm a big fan of the work you're doing.
00:02:48.580 Nice.
00:02:49.000 Well, I'm a big fan of the work you're doing,
00:02:50.840 although I've only just encountered it.
00:02:52.620 But you are an expert on,
00:02:55.040 it seems,
00:02:56.140 much that ails us
00:02:57.660 with respect to nuclear weapons
00:02:59.880 and proliferation
00:03:00.880 and failures of containment
00:03:02.940 and all of that.
00:03:04.920 So we're having this conversation
00:03:06.760 24 hours after I watched Oppenheimer
00:03:10.260 in an IMAX theater,
00:03:12.040 which I highly recommend to people.
00:03:14.020 I recommend people go to the movie theater
00:03:15.480 to see this movie.
00:03:16.860 Have you seen it?
00:03:17.960 Yeah, I did.
00:03:18.880 I've seen it twice now,
00:03:20.160 and I'm going to go back a third time.
00:03:22.260 It's really,
00:03:22.920 it's some kind of masterpiece.
00:03:24.800 Perhaps you can alert me to anything
00:03:27.680 that it gets wrong with respect to the history,
00:03:29.560 but I'm not noticing any errors.
00:03:33.440 It's quite amazing.
00:03:34.460 I mean, it's just everything
00:03:35.140 from the performances
00:03:36.060 to the writing
00:03:37.000 to the sound design.
00:03:38.240 It's just,
00:03:38.660 it's really worth it.
00:03:39.860 Yeah.
00:03:40.300 Again, in a theater,
00:03:41.620 it's really required to appreciate it.
00:03:44.320 I found it very moving.
00:03:46.080 There's a high fidelity
00:03:47.020 to the source material.
00:03:48.840 And it's based on this book,
00:03:50.540 American Prometheus,
00:03:51.700 by Kai Bird and Marty Sherwin.
00:03:54.260 And if you've read that book,
00:03:55.640 you'll see that many of the quotes
00:03:57.040 in the scenes
00:03:57.760 are lifted directly from the book
00:03:59.620 and from the historical record.
00:04:01.100 So it takes its source material
00:04:02.960 very seriously,
00:04:04.300 which I appreciate.
00:04:06.260 And I think it also,
00:04:07.320 it's just an incredibly relevant film today,
00:04:10.400 because just as in the period
00:04:13.580 covered in the film,
00:04:14.540 as in 1945 and 1955,
00:04:17.940 we are now facing
00:04:19.160 this new nuclear arms race.
00:04:21.600 And it's the central question,
00:04:23.660 can we head off a new competition
00:04:25.420 that threatens to make us worse off?
00:04:28.520 So I think it's a phenomenal film.
00:04:31.060 But because of the way
00:04:33.400 the story is told,
00:04:34.440 it leaves out some important details.
00:04:38.140 And one of those is the effect
00:04:41.400 on the downwinders,
00:04:43.240 as they're called.
00:04:44.260 These are the people
00:04:45.080 who lived in the proximity
00:04:46.980 of the Trinity test.
00:04:49.380 And the New York Times
00:04:50.100 ran an article the other day
00:04:51.360 and talks about how there were
00:04:52.700 500,000 people who lived within
00:04:54.940 150 miles of that Trinity test site.
00:04:57.940 And none of them were informed
00:04:59.880 before the test
00:05:00.800 or after the test.
00:05:02.600 And a lot of them continue
00:05:04.040 to suffer from health consequences
00:05:06.340 related to that initial test.
00:05:09.540 And there's another whole part
00:05:10.920 of the story that's excluded,
00:05:12.200 which is Los Alamos
00:05:13.800 was only part of the Manhattan Project.
00:05:16.320 In fact, 90% of the budget
00:05:18.640 for the Manhattan Project
00:05:19.760 was spent on producing
00:05:21.320 the fissile material,
00:05:23.520 the enriched uranium
00:05:24.500 and the plutonium.
00:05:26.360 And Leslie Groves
00:05:27.120 oversaw this project.
00:05:28.220 It was an enormous
00:05:29.080 engineering feat.
00:05:31.260 And that work was done
00:05:32.920 primarily in Oak Ridge, Tennessee
00:05:35.500 and Hanford, Washington,
00:05:37.380 and has had various
00:05:39.360 health and environmental effects
00:05:41.600 that have lasted for generations.
00:05:43.080 We're still paying
00:05:43.840 some of the cleanup costs there.
00:05:45.280 So, there are victims
00:05:48.200 of this nuclear age
00:05:49.760 that are not depicted
00:05:51.720 in this film.
00:05:52.920 Both the victims
00:05:55.020 of the nuclear production
00:05:57.200 in the United States
00:05:58.840 and the victims
00:06:00.020 of nuclear use in Japan,
00:06:01.700 which are never really
00:06:02.940 depicted in the film.
00:06:04.920 And I think part of that
00:06:05.800 is that this is told
00:06:06.600 from Oppenheimer's perspective.
00:06:09.040 And you see him looking away.
00:06:12.180 You see him averting his gaze
00:06:13.840 from this part of the history.
00:06:16.080 And I think that's really clever,
00:06:18.740 the way the film portrays
00:06:20.520 Oppenheimer being unwilling
00:06:22.600 or unable to look at the destruction
00:06:24.580 that his work has created.
00:06:27.720 And the film itself
00:06:29.100 is looking away
00:06:29.900 from these second
00:06:30.560 and third order effects.
00:06:32.000 And I think it just reflects
00:06:33.200 a collective failure
00:06:34.740 of imagination
00:06:35.400 that we have
00:06:36.220 around nuclear weapons.
00:06:38.060 And these weapons
00:06:38.920 still have a legacy
00:06:39.760 that we live with today.
00:06:42.180 Yeah, I mean,
00:06:42.560 it could be that I'm so aware
00:06:44.120 of the second order effects
00:06:46.220 that I felt that the film
00:06:48.420 sort of properly invoked them
00:06:51.500 by ignoring them.
00:06:52.920 But yeah, I mean,
00:06:53.460 as you say,
00:06:54.140 this is very much
00:06:54.960 from Oppenheimer's eye view
00:06:57.420 of the situation
00:06:58.200 and what he averts his eyes from
00:07:00.320 and the stuff
00:07:01.540 that sort of invades
00:07:02.660 his consciousness
00:07:03.640 as he's trying
00:07:04.640 to give a speech
00:07:06.260 and I thought
00:07:07.320 it was just
00:07:07.860 very effective
00:07:09.200 at portraying
00:07:11.060 the cognitive dissonance
00:07:12.720 and the conflict.
00:07:14.060 It's pretty brilliant.
00:07:15.300 Especially the sound design
00:07:16.420 that happens there.
00:07:17.240 Yeah, oh my God.
00:07:17.940 That sound
00:07:18.520 and the auditorium scene
00:07:20.660 is just shaking.
00:07:22.520 And, you know,
00:07:23.240 the test itself
00:07:24.580 and the way you realize
00:07:25.860 that the flash
00:07:26.560 comes before the sound
00:07:27.780 and then it just
00:07:28.380 washes over you.
00:07:29.900 Yeah.
00:07:30.620 I mean,
00:07:31.200 I think it does a brilliant job
00:07:32.280 telling the story
00:07:33.060 that it tells
00:07:33.740 and I think
00:07:34.740 it's also our job
00:07:35.960 to tell the parts
00:07:36.880 of the story
00:07:37.340 that are not in the film
00:07:38.280 and as a complement
00:07:39.260 to the film.
00:07:40.600 Yeah.
00:07:41.100 Yeah.
00:07:41.340 So perhaps you can summarize
00:07:42.840 your background
00:07:43.700 in this area.
00:07:45.080 How have you come
00:07:46.140 to these topics?
00:07:48.260 So I first discovered
00:07:50.640 nuclear weapons
00:07:51.640 in a course in college
00:07:53.440 and it was with
00:07:54.700 Jonathan Schell
00:07:55.520 who is someone
00:07:56.540 who you've spoken
00:07:57.400 about before.
00:07:58.440 Wow.
00:07:58.740 Where were you in school?
00:08:00.280 I was at Wesleyan University
00:08:01.540 and I had never thought
00:08:03.180 especially about
00:08:04.320 nuclear weapons
00:08:05.040 but I was interested
00:08:05.860 in writing
00:08:06.380 and someone told me
00:08:07.940 you've got to take a class
00:08:08.800 with Jonathan Schell.
00:08:09.900 He was a writer
00:08:10.760 at the New Yorker
00:08:11.620 and an editor there
00:08:12.640 and they said
00:08:13.500 he's one of the best people
00:08:15.280 if you want to learn
00:08:16.540 to write well.
00:08:17.700 So I signed up
00:08:18.260 for his class
00:08:18.920 which happened to be
00:08:19.820 on thinking
00:08:20.700 the unthinkable.
00:08:22.580 So I showed up
00:08:23.520 in the class.
00:08:24.440 I was the 13th person
00:08:26.420 in a 12 person class
00:08:27.940 and I went to him afterwards
00:08:30.080 because the lecture
00:08:31.020 was amazing
00:08:31.600 and I said
00:08:31.960 I really want to be
00:08:33.140 in this course
00:08:33.720 and he let me in
00:08:34.940 and that changed
00:08:36.500 the course of my life
00:08:38.060 because I sort of
00:08:40.360 pulled the curtain back
00:08:41.720 on this hidden world
00:08:43.440 of nuclear weapons
00:08:44.460 that shape so much
00:08:46.260 of what we take
00:08:47.080 for granted
00:08:47.600 in the modern world
00:08:48.980 and he agreed
00:08:50.440 to advise me
00:08:51.540 on my senior thesis
00:08:52.840 which I wrote
00:08:53.480 about nuclear weapons
00:08:54.400 and I've done
00:08:55.740 other things since
00:08:56.580 and worked on other aspects
00:08:57.820 of international security policy
00:08:59.760 but I always keep
00:09:00.440 coming back
00:09:01.060 to this question.
00:09:03.200 Yeah, well he was
00:09:03.840 an amazing writer.
00:09:05.360 What year was that
00:09:06.200 that you studied with him?
00:09:07.660 It was 1998.
00:09:09.560 Well, The Fate of the Earth
00:09:10.860 was really instrumental
00:09:12.420 in my becoming aware
00:09:14.340 of this issue
00:09:15.460 and so well written.
00:09:17.960 Actually, I did a book report
00:09:19.560 on it I think
00:09:20.300 when I was 13.
00:09:22.800 Wow.
00:09:23.040 So I came to this
00:09:23.980 pretty early.
00:09:24.600 I don't know how the book
00:09:25.520 got into my hands
00:09:26.320 but yeah
00:09:27.360 and so really
00:09:28.840 for my entire life
00:09:30.880 longer
00:09:31.940 I mean for nearly 75 years
00:09:33.780 we've lived under
00:09:35.360 the shadow
00:09:35.880 of nuclear risk.
00:09:37.860 The Soviets
00:09:38.500 got the bomb
00:09:39.600 in 1949
00:09:40.840 which was earlier
00:09:42.660 than we were expecting
00:09:44.120 and as everyone knows
00:09:47.540 we're the only country
00:09:49.620 to have used it
00:09:50.520 in 1945
00:09:52.340 on August 6th
00:09:54.380 on Hiroshima
00:09:55.220 and August 9th
00:09:56.620 on Nagasaki.
00:09:57.940 Do you have a sense
00:09:59.580 of the ethics
00:10:01.260 or your beliefs
00:10:02.260 about the ethics
00:10:03.440 there
00:10:03.820 in our first
00:10:05.560 and only use
00:10:06.720 of these weapons?
00:10:08.420 They're treated
00:10:09.400 somewhat in the film
00:10:11.220 and I could be wrong
00:10:14.240 about this.
00:10:14.720 This could be a
00:10:15.420 this is a piece of history
00:10:16.720 I thought they were
00:10:18.060 getting wrong
00:10:18.740 but I could be wrong
00:10:20.140 about it.
00:10:20.500 I think the film
00:10:23.340 embraces an older
00:10:24.780 version of the history
00:10:26.240 and there's
00:10:27.540 a more recent
00:10:28.620 historiography
00:10:29.420 that has access
00:10:31.040 to all of the
00:10:31.740 declassified documents
00:10:33.020 and shows that
00:10:35.400 in many ways
00:10:36.260 we were sold
00:10:37.140 a false narrative
00:10:37.940 when it comes
00:10:38.960 to the necessity
00:10:39.720 of the use
00:10:40.360 of these weapons
00:10:41.020 and the Truman
00:10:42.980 administration
00:10:43.460 after the war
00:10:44.280 was really keen
00:10:45.200 to shape
00:10:46.380 the perception
00:10:48.180 of these weapons
00:10:49.320 and they framed it up
00:10:51.300 as if there had been
00:10:52.180 this debate
00:10:52.820 where Truman
00:10:53.960 considered all the options
00:10:55.380 carefully
00:10:55.940 and with a heavy heart
00:10:57.660 decided that
00:10:58.660 nuclear weapons
00:10:59.560 would save
00:11:00.440 American lives
00:11:01.440 would save
00:11:01.880 Japanese lives
00:11:02.680 and went ahead
00:11:03.780 and in many ways
00:11:05.800 that's a piece
00:11:06.900 of post-war
00:11:07.940 propaganda
00:11:08.620 because the debate
00:11:09.920 at that time
00:11:10.860 was not over
00:11:11.660 whether to use
00:11:12.620 the bomb
00:11:13.080 or to invade
00:11:14.320 they were
00:11:15.680 planning to use
00:11:16.640 the bomb
00:11:17.080 and to invade
00:11:18.060 and they didn't
00:11:19.440 know what the
00:11:20.420 future would be
00:11:21.120 and so
00:11:22.740 they
00:11:23.260 actually
00:11:24.520 sent both
00:11:25.580 weapons
00:11:26.080 Hiroshima
00:11:27.120 and Nagasaki
00:11:27.820 out to the local
00:11:28.820 commanders
00:11:29.380 in the Pacific
00:11:30.100 this is a time
00:11:31.500 where communications
00:11:32.180 took longer as well
00:11:33.220 and
00:11:34.400 there were
00:11:35.440 preparations
00:11:36.000 to use both
00:11:36.780 of them
00:11:37.120 when the weather
00:11:38.200 conditions
00:11:38.860 permitted
00:11:39.380 and
00:11:40.640 especially
00:11:41.560 with the use
00:11:42.180 of the bomb
00:11:42.660 on Nagasaki
00:11:43.560 there was not
00:11:44.320 a well-considered
00:11:46.220 strategic decision
00:11:47.300 to use that
00:11:48.460 the bomb
00:11:49.300 was used
00:11:49.680 just three days
00:11:50.560 after the initial
00:11:51.540 bombing of Hiroshima
00:11:52.760 so
00:11:53.580 yeah
00:11:53.900 that's always
00:11:55.380 seemed
00:11:56.360 inexplicable to me
00:11:57.760 that we felt
00:11:58.200 that we needed
00:11:58.700 I guess the
00:11:59.520 rationale
00:12:00.660 was that
00:12:01.580 to drop
00:12:03.340 a second bomb
00:12:04.280 is to indicate
00:12:05.660 in this case
00:12:06.540 falsely
00:12:07.160 that we've got
00:12:08.440 a whole arsenal
00:12:09.920 of these weapons
00:12:10.660 to spare
00:12:11.260 yeah
00:12:11.580 right
00:12:12.160 yeah
00:12:12.380 but at the time
00:12:14.000 the weapon
00:12:14.420 was dropped
00:12:14.980 the Japanese
00:12:15.820 were still
00:12:16.340 making sense
00:12:17.140 of what had
00:12:17.720 happened
00:12:18.120 with the first
00:12:19.040 weapon
00:12:19.400 yeah
00:12:19.720 and were still
00:12:20.480 processing that
00:12:21.280 and as we know
00:12:22.200 now the Soviet Union
00:12:23.320 was preparing
00:12:24.260 to enter the war
00:12:25.320 Truman knew this
00:12:27.280 at the time
00:12:27.720 and he knew
00:12:28.180 when the Soviet Union
00:12:29.660 entered
00:12:30.100 it would be
00:12:31.140 the end
00:12:31.500 for Japan
00:12:32.040 and he wrote
00:12:33.020 that in his memoirs
00:12:34.280 and his communications
00:12:35.260 but there was
00:12:37.160 a real interest
00:12:38.780 in demonstrating
00:12:39.980 the power
00:12:40.780 of the bomb
00:12:41.280 to the Soviet Union
00:12:42.300 and in shaping
00:12:43.360 the post-war
00:12:45.200 balance
00:12:46.500 the U.S.
00:12:48.480 had demanded
00:12:49.440 unconditional
00:12:50.240 surrender
00:12:51.040 from Japan
00:12:51.920 so the use
00:12:53.820 of the bomb
00:12:54.240 in some ways
00:12:54.940 was to ensure
00:12:56.540 an unconditional
00:12:58.100 surrender
00:12:58.580 without the invasion
00:12:59.920 of the home islands
00:13:01.260 and nobody
00:13:02.820 knows exactly
00:13:03.560 what would have
00:13:04.180 happened
00:13:04.540 if those weapons
00:13:05.720 were not used
00:13:06.800 but the consequences
00:13:08.220 were just
00:13:09.020 devastating
00:13:09.840 why do you think
00:13:11.040 we didn't
00:13:12.000 drop the first bomb
00:13:13.600 off the coast
00:13:14.800 in the ocean
00:13:15.320 as a
00:13:15.680 just a demonstration
00:13:16.760 of its power
00:13:18.040 as opposed to
00:13:18.660 dropping it on
00:13:19.780 civilians
00:13:20.580 so this was
00:13:22.160 briefly considered
00:13:23.160 and one of the
00:13:25.280 one of the
00:13:25.760 considerations
00:13:26.320 was that
00:13:27.780 a demonstration
00:13:29.080 in a harbor
00:13:30.360 or off the coast
00:13:31.360 may not show
00:13:32.560 the full magnitude
00:13:33.520 of the weapon
00:13:34.740 and would not
00:13:36.100 impress upon
00:13:36.820 the Japanese
00:13:37.620 and the Soviets
00:13:39.980 the effect
00:13:41.340 of this weapon
00:13:41.980 there was also
00:13:43.460 a concern
00:13:43.980 what if it didn't
00:13:44.700 go off
00:13:45.300 and then now
00:13:46.400 you have
00:13:47.020 a device
00:13:48.480 that is
00:13:49.800 in the water
00:13:50.520 and could be
00:13:51.420 retrievable
00:13:51.960 by the enemy
00:13:52.620 there was
00:13:53.940 some talk
00:13:54.900 of inviting
00:13:55.520 the Japanese
00:13:56.440 to see
00:13:57.040 a demonstration
00:13:57.800 you know
00:13:59.380 at the Trinity
00:14:00.240 test site
00:14:00.920 but again
00:14:02.200 there was
00:14:02.840 concern
00:14:03.320 what if
00:14:03.880 the demonstration
00:14:04.900 doesn't go
00:14:05.720 as expected
00:14:06.420 and we'd be
00:14:07.820 tipping our
00:14:08.800 hand as to
00:14:09.480 this device
00:14:10.440 that we have
00:14:11.040 so ultimately
00:14:11.960 they decided
00:14:12.700 to drop it
00:14:13.660 with very
00:14:14.680 little notification
00:14:15.620 and
00:14:17.080 you know
00:14:18.020 they considered
00:14:19.060 maybe saying
00:14:19.980 you know
00:14:20.380 evacuate the city
00:14:21.380 we'll drop it
00:14:22.120 but then they
00:14:23.100 were concerned
00:14:23.620 that the Japanese
00:14:24.280 would shoot
00:14:24.740 down the plane
00:14:25.380 so this debate
00:14:26.580 is treated
00:14:27.180 very quickly
00:14:27.880 in the Oppenheimer
00:14:29.280 film
00:14:29.660 but it's not
00:14:32.620 exactly how
00:14:33.320 it played out
00:14:34.080 and again
00:14:34.960 I could be
00:14:35.560 mistaken about this
00:14:36.220 but I had
00:14:37.000 thought that
00:14:37.880 the rationale
00:14:38.620 that
00:14:39.900 maybe you
00:14:41.280 just indicated
00:14:41.820 I'm not mistaken
00:14:42.440 about this
00:14:42.820 I thought that
00:14:43.700 the rationale
00:14:44.180 that dropping
00:14:45.900 a bomb
00:14:46.440 at least on
00:14:47.180 Hiroshima
00:14:47.900 was justified
00:14:48.760 because it
00:14:50.420 saved something
00:14:51.060 like a million
00:14:51.840 lives of
00:14:53.260 infantry that
00:14:54.240 didn't have
00:14:54.820 to invade
00:14:55.640 that that
00:14:56.600 was a very
00:14:57.380 post hoc
00:14:58.300 epiphany
00:14:59.440 that was not
00:15:01.180 thought at the
00:15:01.800 time
00:15:02.200 right
00:15:02.600 yeah
00:15:03.300 so they had
00:15:04.180 estimates of
00:15:05.080 what it would
00:15:05.560 take for the
00:15:06.140 invasion
00:15:06.560 and that
00:15:07.780 nobody knew
00:15:08.720 of course
00:15:09.120 but I think
00:15:09.740 the median
00:15:10.200 estimates were
00:15:10.880 something like
00:15:11.420 a hundred thousand
00:15:12.360 U.S. troops
00:15:13.620 dead in the
00:15:14.480 invasion of the
00:15:15.180 islands
00:15:15.460 which obviously
00:15:16.660 would have been
00:15:17.240 awful
00:15:17.880 and any
00:15:18.820 national
00:15:19.120 any president
00:15:19.720 has to
00:15:20.240 think first
00:15:21.280 and foremost
00:15:21.820 of the lives
00:15:22.440 of those
00:15:22.920 U.S.
00:15:23.600 service members
00:15:24.260 but it
00:15:25.240 wasn't the
00:15:26.020 number that
00:15:26.940 was cited
00:15:27.420 in the
00:15:27.880 post-war
00:15:28.260 propaganda
00:15:28.900 and you
00:15:30.040 know I
00:15:30.180 just want
00:15:30.560 to acknowledge
00:15:31.100 that we're
00:15:31.820 having this
00:15:32.300 conversation
00:15:32.960 around the
00:15:33.680 time of
00:15:34.580 the bombings
00:15:35.920 of Hiroshima
00:15:36.620 and Nagasaki
00:15:37.440 and it's
00:15:38.860 striking to me
00:15:39.480 that there
00:15:39.840 are just
00:15:40.180 there are
00:15:40.580 people who
00:15:41.880 are alive
00:15:42.340 today who
00:15:43.280 live through
00:15:44.000 that and the
00:15:44.560 other day I
00:15:45.080 listened to an
00:15:45.960 interview with
00:15:46.700 Setsuko Thurlow
00:15:47.780 and she's
00:15:49.740 a survivor
00:15:50.900 of Hiroshima
00:15:51.520 and it's
00:15:53.080 this incredibly
00:15:53.760 courageous act
00:15:54.740 that she
00:15:55.660 continues to
00:15:56.580 bear witness
00:15:57.560 in a recognition
00:15:58.440 of this suffering
00:15:59.300 that can last
00:16:00.660 generations
00:16:01.520 and I just
00:16:03.480 want to
00:16:03.880 acknowledge that
00:16:04.440 we should be
00:16:05.220 listening to
00:16:05.680 those voices
00:16:06.200 as well
00:16:06.800 I would just
00:16:08.200 add anyone
00:16:08.920 who feels
00:16:09.740 that they
00:16:10.060 haven't
00:16:10.680 fully imbibed
00:16:11.640 the details
00:16:12.700 of what
00:16:13.420 happened at
00:16:14.120 Hiroshima
00:16:15.220 John Hersey's
00:16:16.720 small book
00:16:17.480 that's based
00:16:17.880 on his
00:16:18.300 New Yorker
00:16:19.040 articles
00:16:19.460 is well worth
00:16:20.560 reading
00:16:20.880 that's an
00:16:21.780 incredible book
00:16:22.800 and the story
00:16:24.060 behind it's
00:16:24.700 pretty remarkable
00:16:25.420 because you
00:17:26.200 have Hersey
00:16:26.920 who is this
00:16:27.860 really he's a
00:16:29.340 recognized war
00:16:30.180 reporter
00:16:30.640 he's in Tokyo
00:16:32.420 and at this
00:16:33.200 time Tokyo
00:16:33.880 is under
00:16:35.080 the occupation
00:16:36.980 the U.S.
00:16:38.200 occupation
00:16:38.620 and General
00:16:39.240 MacArthur's
00:16:39.840 the administrator
00:17:40.460 and Hersey
00:16:41.820 actually slips
00:16:42.880 out he
00:16:43.280 pretends to
00:16:43.880 have a stomach
00:16:44.480 bug and he
00:16:45.940 goes and records
00:16:46.760 the story
00:16:47.400 of these
00:16:47.980 six survivors
00:16:48.880 of the
00:16:49.600 Hiroshima
00:16:50.020 bomb
00:16:50.440 and he
00:16:51.700 tells a
00:16:52.100 different story
00:16:52.900 than the
00:16:53.540 official one
00:16:54.140 the official
00:16:54.940 story is
00:16:55.540 focused very
00:16:56.380 much on the
00:16:57.040 size of the
00:16:57.660 explosion
00:16:58.120 and that's
00:17:00.160 where the
00:17:00.500 emphasis is
00:17:01.180 and he
00:17:01.460 tells the
00:17:01.860 human story
00:17:02.460 of these
00:17:02.860 survivors
00:17:03.400 and also
00:17:04.360 for the
00:17:04.720 first time
00:17:05.180 reveals that
00:17:06.460 there was
00:17:06.880 this radiation
00:17:07.700 sickness
00:17:08.280 that affected
00:17:09.600 people really
00:17:11.060 terribly
00:17:11.440 and I think
00:17:12.180 this changes
00:17:12.880 the way
00:17:13.480 the weapon
00:17:14.620 is viewed
00:17:15.240 it ends
00:17:16.220 up being
00:17:16.540 a 30,000
00:17:17.360 word piece
00:17:18.000 that's released
00:17:18.720 in the
00:17:19.260 New Yorker
00:17:19.780 it actually
00:17:20.600 is like a
00:17:21.340 full issue
00:17:21.980 of the
00:17:22.260 New Yorker
00:17:22.940 this one
00:17:23.720 story
00:17:24.300 and when
00:17:25.440 it hits
00:17:25.760 the newsstands
00:17:26.700 it's all
00:17:27.160 anyone's
00:17:27.720 talking about
00:17:28.360 and I think
00:17:28.940 that's an
00:17:29.760 example
00:17:30.200 of a
00:17:31.740 reporter
00:17:32.140 someone in
00:17:33.420 civil society
00:17:34.080 not in
00:17:34.540 government
00:17:34.820 who had a
00:17:35.400 really powerful
00:17:36.500 effect on
00:17:37.860 the nuclear
00:17:38.340 age
00:17:38.820 because our
00:17:41.100 relationship to
00:17:42.620 the bomb
00:17:42.980 changed once
00:17:43.760 we understood
00:17:44.400 its full
00:17:44.920 consequences
00:17:45.500 and I
00:17:47.000 you know I
00:17:47.720 also want to
00:17:48.580 say that the
00:17:49.120 decision to
00:17:50.220 drop the bomb
00:17:51.240 on Hiroshima
00:17:52.000 and Nagasaki
00:17:52.780 is it's not
00:17:54.100 taken in
00:17:54.800 isolation and
00:17:55.920 it's sort of
00:17:56.340 this culmination
00:17:57.500 of a series
00:17:58.720 of atrocities
00:17:59.720 and it's
00:18:01.400 something of a
00:18:02.260 coincidence that
00:18:03.280 the ability
00:18:04.940 to create
00:18:05.560 nuclear weapons
00:18:06.440 emerged during
00:18:07.880 World War
00:18:08.380 II
00:18:08.620 it didn't
00:18:10.000 need to be
00:18:10.720 that way
00:18:11.120 there's this
00:18:11.620 world of
00:18:12.480 physics
00:18:12.960 and sort of
00:18:14.520 the breakthroughs
00:18:15.420 in atomic
00:18:16.060 physics in the
00:18:16.820 1920s and
00:18:17.920 30s
00:18:18.580 is just an
00:18:19.940 incredibly
00:18:20.540 exciting period
00:18:21.600 of discovery
00:18:22.380 and by
00:18:24.240 coincidence
00:18:24.800 they realize
00:18:26.940 the potential
00:18:27.900 for building
00:18:28.520 a bomb
00:18:28.960 at the exact
00:18:29.700 time that
00:18:30.580 Europe is
00:18:31.140 descending
00:18:31.640 into war
00:18:32.440 and there's
00:18:34.000 not just any
00:18:34.920 war but this
00:18:35.500 war in which
00:18:36.100 atrocities are
00:18:36.960 being committed
00:18:37.680 on all sides
00:18:39.020 the genocide
00:18:40.860 of the Jews
00:18:41.700 and later the
00:18:42.540 firebombing of
00:18:43.500 the German
00:18:44.480 cities and the
00:18:45.320 Japanese cities
00:18:46.100 and so this
00:18:46.680 new weapon
00:18:47.420 enters the
00:18:48.620 world at a
00:18:50.000 time when
00:18:51.100 all of the
00:18:52.380 constraints
00:18:53.560 on humanitarian
00:18:55.420 behavior have
00:18:56.840 already been
00:18:57.620 washed away
00:18:58.400 and I think
00:19:00.600 we need to
00:19:04.060 put that into
00:19:05.220 context when
00:19:06.020 we think about
00:19:06.700 the decision
00:19:07.380 to use the
00:19:08.300 weapon it
00:19:09.020 didn't seem
00:19:09.920 like using a
00:19:11.420 weapon of the
00:19:12.020 sort against
00:19:12.540 civilians was
00:19:13.380 anything different
00:19:15.020 than what had
00:19:16.140 already been going
00:19:17.000 on for months
00:19:18.040 within the war
00:19:18.940 and you'd have
00:19:20.120 firebombings of
00:19:21.220 Tokyo where they
00:19:21.920 would try to
00:19:22.440 create these
00:19:22.880 conflagrations
00:19:23.800 that would kill
00:19:24.880 10,000 people
00:19:25.960 in a night
00:19:26.460 and you had
00:19:27.080 similar atrocities
00:19:28.200 by the Japanese
00:19:29.060 in China
00:19:30.660 and it's just
00:19:32.580 a you know if
00:19:33.660 you think about
00:19:34.160 even the origins
00:19:35.140 of nuclear
00:19:36.140 weapons right
00:19:37.140 the nuclear
00:19:38.780 weapons were not
00:19:39.740 born from the
00:19:40.520 Manhattan project
00:19:41.480 they started in
00:19:42.880 Nazi Germany
00:19:43.640 there was a
00:19:44.280 nuclear weapons
00:19:45.140 program and the
00:19:46.200 US only decided
00:19:47.300 to pursue nuclear
00:19:48.360 weapons because of
00:19:49.360 a fear that Nazi
00:19:50.380 Germany might get
00:19:51.280 there first and
00:19:53.100 so these weapons
00:19:54.060 enter the world
00:19:55.200 in the hands of
00:19:56.720 this victorious
00:19:57.460 democratic nation
00:19:58.580 and part of the
00:19:59.880 arsenal of
00:20:00.500 democracy but if
00:20:02.720 Nazi Germany had
00:20:03.700 taken a different
00:20:04.420 path with their
00:20:05.140 technology these
00:20:06.320 nuclear weapons
00:20:07.080 could very much
00:20:07.840 have entered the
00:20:09.380 world in the hands
00:20:10.140 of Nazi Germany
00:20:11.020 and in some ways
00:20:12.300 that would have
00:20:12.740 sort of revealed
00:20:13.380 the mask of what
00:20:15.000 they're capable of
00:20:15.760 doing.
00:20:16.920 Yeah that's a
00:20:17.620 very important
00:20:19.660 piece of context
00:20:20.380 because it
00:20:21.280 makes all of
00:20:23.400 the ethical
00:20:24.700 risks we ran
00:20:27.340 and ignored
00:20:29.160 seem totally
00:20:31.300 understandable
00:20:31.960 given the
00:20:32.640 context.
00:20:33.400 I mean you've
00:20:34.600 just pointed out
00:20:35.700 two very
00:20:37.540 important pieces
00:20:38.140 of context.
00:20:38.740 One is we were
00:20:39.360 already committing
00:20:40.140 similar genocides
00:20:42.400 of civilians
00:20:43.940 by firebombing
00:20:45.440 cities and
00:20:46.260 killing tens of
00:20:47.240 thousands of
00:20:47.700 people a day
00:20:49.020 and not
00:20:51.260 really I mean
00:20:52.340 in the aftermath
00:20:53.140 we second
00:20:54.380 guess that a
00:20:55.000 little bit but
00:20:55.540 it just seemed
00:20:56.540 like we were
00:20:57.640 especially in the
00:20:59.040 case of Nazi
00:20:59.540 Germany we had
00:21:00.260 an adversary
00:21:00.760 that was so
00:21:02.320 obviously in the
00:21:03.340 wrong and
00:21:04.580 evil and
00:21:06.180 aspiring to
00:21:07.940 create catastrophic
00:21:10.040 harm globally
00:21:11.200 that we sort of
00:21:13.200 had to throw out
00:21:14.000 the rule book
00:21:14.920 and our
00:21:16.520 scruples with
00:21:17.400 it.
00:21:18.040 And I think
00:21:18.340 right after the
00:21:19.060 war there was
00:21:19.700 an attempt to
00:21:20.920 pull back and
00:21:22.540 to return to
00:21:24.340 a different
00:21:25.320 approach and
00:21:26.720 that's part of
00:21:27.500 the debate that
00:21:28.020 plays out is
00:21:28.640 should we go on
00:21:29.700 and develop this
00:21:30.960 thermonuclear weapon
00:21:32.380 that's even
00:21:33.180 larger and
00:21:34.680 capable of a
00:21:36.720 thousand times
00:21:37.600 more destructive
00:21:38.740 power like a
00:21:40.040 true city
00:21:41.980 city busting
00:21:43.100 weapon and
00:21:44.900 that's the
00:21:45.880 that's the
00:21:46.820 debate and the
00:21:47.320 U.S. of course
00:21:47.980 does proceed with
00:21:48.960 this weapon in
00:21:50.300 part out of fear
00:21:51.160 that if they
00:21:51.660 don't the
00:21:52.260 Soviet Union
00:21:52.940 will.
00:21:54.020 Oppenheimer is
00:21:55.040 opposed to the
00:21:55.820 use of nuclear
00:21:56.440 weapons in that
00:21:57.100 way and that's
00:21:57.720 why he is
00:21:58.420 politically
00:21:58.860 sidelined by
00:22:00.000 his adversaries.
00:22:02.000 But even in
00:22:02.800 the movie you
00:22:03.360 can see the
00:22:05.040 emergence of
00:22:06.000 these two
00:22:07.320 new technologies
00:22:09.240 that are really
00:22:09.900 going to shape
00:22:10.400 the nuclear
00:22:10.880 age and one
00:22:12.080 is the H-bomb
00:22:13.060 the thermonuclear
00:22:13.780 weapon but the
00:22:14.460 other is the
00:22:15.340 intercontinental
00:22:16.240 missile and in
00:22:17.640 some of those
00:22:18.120 visions you can
00:22:19.020 see the terror
00:22:21.260 that a weapon of
00:22:22.320 that sort would
00:22:23.640 inspire because
00:22:24.660 they move 20
00:22:25.920 times the speed
00:22:26.620 of sound there's
00:22:27.520 no defense against
00:22:28.380 them and that
00:22:30.060 these are the
00:22:30.700 weapons that
00:22:31.200 really compress
00:22:32.100 the decision
00:22:33.080 making time and
00:22:35.060 put us right on
00:22:36.480 the brink and
00:22:38.080 so it's the
00:22:38.680 marriage of
00:22:39.880 miniaturized
00:22:41.240 hydrogen bombs
00:22:42.380 and intercontinental
00:22:43.860 ballistic missiles
00:22:44.540 that represent a
00:22:45.440 step change in the
00:22:47.000 level of danger to
00:22:48.580 humanity.
00:22:50.080 So now I realize I
00:22:51.300 derailed you in
00:22:52.260 giving us your
00:22:53.320 bona fides on this
00:22:54.860 topic.
00:22:55.220 I last left you with
00:22:56.040 Jonathan Schell
00:22:56.820 learning to write
00:22:58.220 so then what
00:22:59.600 happened to you?
00:23:00.780 Well I went and
00:23:01.620 worked at a couple
00:23:02.300 different think tanks
00:23:03.340 and I got a
00:23:04.640 fellowship first to
00:23:05.720 study internationally
00:23:06.660 I got this Watson
00:23:07.400 fellowship where I
00:23:08.180 could travel and
00:23:08.860 study internationally
00:23:09.800 and then came back
00:23:11.520 and did some work
00:23:12.420 at the Stimson
00:23:13.020 Center and the
00:23:13.860 Council on Foreign
00:23:14.600 Relations and went
00:23:16.280 back to graduate
00:23:16.900 school because that's
00:23:17.800 one of the things
00:23:18.340 that Jonathan told
00:23:19.160 me is you know if
00:23:19.900 you want to have
00:23:20.620 credibility on this
00:23:21.440 issue you got to
00:23:22.060 know the details and
00:23:23.080 I got a master's
00:23:24.540 degree at Princeton
00:23:26.820 University and then
00:23:29.220 went on to work at
00:23:30.480 the Century Foundation
00:23:31.480 where I was involved
00:23:35.040 in editing some
00:23:35.940 volumes the big
00:23:36.740 debate at that
00:23:37.380 time was over
00:23:38.840 counterterrorism in
00:23:41.360 Afghanistan as well
00:23:42.700 as Iran and Iran's
00:23:45.000 nuclear program so I
00:23:46.420 helped edit some
00:23:47.440 volumes and prepare
00:23:48.740 some events on those.
00:23:50.440 What is the Century
00:23:51.640 Foundation?
00:23:52.440 What do they do?
00:23:53.540 Yeah so they started
00:23:54.540 as the 20th Century
00:23:55.820 Fund, and they
00:23:58.440 published books and
00:23:59.760 supported scholarship
00:24:00.620 and some really
00:24:01.520 important books came
00:24:02.460 out through their
00:24:03.660 publishing house and
00:24:04.900 then at the end of
00:24:05.600 the 20th century they
00:24:06.840 decided that they
00:24:08.280 wanted to continue.
00:24:09.380 They were meant to
00:24:10.120 sunset but they
00:24:11.280 decided that they had
00:24:12.960 important work to
00:24:13.760 continue and so they
00:24:14.820 became the Century
00:24:15.620 Foundation.
00:24:16.260 They're based in New
00:24:16.820 York, they're a tiny
00:24:18.480 think tank and I
00:24:20.660 think they continue to
00:24:22.100 do good work.
00:24:23.120 And you also worked
00:24:23.960 at the Carnegie
00:24:25.220 Corporation which I
00:24:26.240 also realize I am
00:24:28.460 confused about.
00:24:29.560 I have heard their
00:24:30.160 name I think in
00:24:31.720 sponsorship of PBS
00:24:33.540 or NPR a bunch but
00:24:35.240 what do they do?
00:24:36.940 So they were
00:24:37.260 established by Andrew
00:24:38.440 Carnegie to continue
00:24:40.640 his philanthropic
00:24:41.720 legacy.
00:24:42.540 So he was making
00:24:44.100 money faster than he
00:24:45.320 could responsibly give
00:24:46.420 it away so he decided
00:24:47.420 to found a bunch of
00:24:48.740 institutions.
00:24:49.440 He was incredibly
00:24:50.080 prolific.
00:24:51.140 He founded the
00:24:51.820 Carnegie Endowment for
00:24:52.680 International Peace and
00:24:53.780 the Council on Ethics and
00:24:55.000 a Foundation on
00:24:55.660 Teaching but the main
00:24:57.240 continuation of his
00:24:58.660 philanthropic vision was
00:25:00.220 to be housed at the
00:25:01.180 Carnegie Corporation of
00:25:02.360 New York and it's a
00:25:03.300 grant-making foundation
00:25:04.460 that's made a couple
00:25:06.000 billion dollars of
00:25:07.000 grants over the years
00:25:08.040 and has an endowment
00:25:09.340 that they continue to
00:25:10.680 allocate for education,
00:25:12.760 for peace, and for
00:25:13.640 citizenship which were
00:25:14.840 Andrew Carnegie's main
00:25:16.340 passions in life.
00:25:18.000 Okay, so let's talk
00:25:19.260 about nuclear risk and
00:25:21.940 just how it has waxed and
00:25:25.140 waned over the years.
00:25:27.080 Most people still put the
00:25:29.560 absolute peak of risk at
00:25:31.180 the Cuban Missile Crisis
00:25:32.340 in 1962.
00:25:33.720 Isn't that the case?
00:25:35.100 Yeah, and I think
00:25:35.960 that's right.
00:25:36.920 And actually, we
00:25:38.600 recently learned that it
00:25:41.240 was quite a bit riskier
00:25:42.220 than we even thought.
00:25:44.440 Yeah.
00:25:44.680 Perhaps you want to
00:25:45.740 review some of that
00:25:46.760 history because it
00:25:47.580 really was what we were
00:25:49.700 unaware of.
00:25:50.540 I'm thinking, of course,
00:25:51.780 of the tactical nukes
00:25:53.440 that the Soviets already
00:25:54.340 had in place that we
00:25:55.300 were apparently unaware
00:25:56.620 of, it's very easy to
00:25:59.080 see how things could
00:25:59.760 have spiraled out of
00:26:00.740 control had we invaded
00:26:01.980 as I think was
00:26:03.220 recommended by the
00:26:04.280 National Security Council
00:26:05.480 and it was really just
00:26:06.700 JFK who decided, no,
00:26:08.800 we're not going to do
00:26:09.320 that.
00:26:10.180 Yeah.
00:26:10.300 What do we know about
00:26:11.060 what was happening
00:26:11.660 there?
00:26:12.100 Well, so this is an
00:26:13.960 interesting story because
00:26:15.360 for many years, we took
00:26:17.220 away the wrong lessons
00:26:18.800 from the Cuban Missile
00:26:19.840 Crisis, I believe.
00:26:21.660 So you have this crisis
00:26:23.500 that stretched over 13
00:26:25.020 days and it was this
00:26:26.720 high-stakes brinksmanship
00:26:28.440 and there were a lot of
00:26:29.200 opportunities for both
00:26:30.220 human and technical
00:26:31.160 error, but the crux of
00:26:33.720 it comes down to the
00:26:35.140 27th of October in
00:26:36.820 1962, which is known as
00:26:39.560 Black Saturday.
00:26:40.440 And in my view, this is
00:26:41.340 the closest the world
00:26:42.480 has ever come to
00:26:43.300 nuclear catastrophe.
00:26:44.660 It's also my wife's
00:26:45.960 birthday, so I made the
00:26:47.960 mistake once of pointing
00:26:48.980 that out and don't do
00:26:51.180 that anymore.
00:26:51.720 But I also think of it
00:26:53.160 as the day that we
00:26:54.480 survived nuclear
00:26:55.420 catastrophe, so you
00:26:57.220 could celebrate that
00:26:57.980 every year.
00:26:59.200 On that day, you have
00:27:01.080 this incredible series of
00:27:02.460 events.
00:27:02.840 I mean, the day starts
00:27:03.840 with Castro writing to
00:27:07.400 Khrushchev and encouraging
00:27:08.920 him to use nuclear
00:27:10.060 weapons against the
00:27:11.220 United States.
00:27:12.780 And it ends with the
00:27:13.920 Kennedy brothers
00:27:14.760 negotiating for the
00:27:16.880 removal of the missiles
00:27:18.160 in Cuba in return
00:27:20.320 secretly for the
00:27:22.000 removal of similar
00:27:23.760 missiles that were in
00:27:25.480 Turkey, right?
00:27:26.960 But in between, you
00:27:29.040 have three or four
00:27:30.420 different events, each
00:27:31.560 of which could have
00:27:32.780 led to a nuclear
00:27:34.500 exchange.
00:27:35.780 So you have the
00:27:37.160 shooting down of a U-2
00:27:40.480 surveillance plane in
00:27:42.680 Cuba.
00:27:43.320 You have another U-2
00:27:44.960 surveillance plane that
00:27:46.040 wanders into Soviet
00:27:47.240 airspace and triggers a
00:27:49.420 response there.
00:27:50.040 You have local
00:27:52.100 anti-aircraft batteries
00:27:54.340 in Cuba that are firing
00:27:56.360 on U.S.
00:27:58.640 planes.
00:27:59.460 And this would have, if
00:28:00.280 they had shot one of
00:28:00.920 those down, this is a
00:28:01.660 red line that Kennedy
00:28:03.440 had drawn.
00:28:05.000 And it was actually the
00:28:06.380 local commanders who
00:28:07.480 were doing this,
00:28:08.620 unbeknownst to Khrushchev
00:28:10.480 and to Castro.
00:28:11.760 You also have the U.S.
00:28:14.480 drawing up its final war
00:28:16.640 plans for an invasion of
00:28:18.580 Cuba, which was going to
00:28:20.140 happen on Monday.
00:28:21.560 So you're here on
00:28:22.160 Saturday, they're
00:28:23.040 preparing for an
00:28:24.220 invasion on Monday.
00:28:25.980 The Soviet forces are
00:28:27.800 preparing to use these
00:28:29.320 tactical nuclear weapons
00:28:30.920 if they have to.
00:28:31.740 As you've described,
00:28:32.520 there were secretly
00:28:33.040 these tactical nuclear
00:28:34.100 weapons that were on the
00:28:34.960 ground in the hands of
00:28:35.780 the local commanders.
00:28:37.420 And then amidst all of
00:28:38.640 this, you have this
00:28:39.620 incident with a nuclear
00:28:42.000 armed submarine, a
00:28:43.960 Soviet submarine that is
00:28:45.280 accompanying the ships
00:28:47.840 and the U.S.
00:28:51.080 Navy is dropping depth
00:28:52.740 charges to force this
00:28:54.380 submarine to surface.
00:28:56.780 And the guys on the
00:28:58.840 submarine, it's like 130
00:29:01.220 degrees down there.
00:29:02.880 They don't know if the war
00:29:03.840 started or not.
00:29:05.020 These depth charges are
00:29:06.000 going off.
00:29:06.660 It's like being in a tin
00:29:07.980 can that's being pounded
00:29:09.340 on, right?
00:29:10.880 And the captain of that
00:29:12.760 ship actually authorized
00:29:14.020 the use of their special
00:29:16.140 weapon, which was a
00:29:17.220 nuclear torpedo against
00:29:18.660 the U.S.
00:29:19.480 forces.
00:29:20.520 And we're fortunate that
00:29:21.580 they also had the
00:29:22.480 commodore of the fleet
00:29:23.860 who outranked the
00:29:25.520 captain.
00:29:26.140 And he basically said,
00:29:27.220 let's wait, let's see
00:29:28.260 what happens.
00:29:29.860 This is Vasili
00:29:31.200 Arkhipov?
00:29:32.080 Arkhipov, exactly.
00:29:33.540 Yeah.
00:29:34.180 So there were three
00:29:35.200 officers that would have
00:29:36.200 needed to authorize the
00:29:38.140 use of this weapon.
00:29:38.860 Two of them authorized it.
00:29:40.120 The third did not.
00:29:41.660 This story comes to us
00:29:43.300 through the memoirs of
00:29:44.540 these people and through
00:29:45.940 some archival material.
00:29:47.120 And it's always hard to
00:29:48.080 make sense of these
00:29:50.340 close call stories and
00:29:51.480 how close we really came.
00:29:52.780 But I think this is just,
00:29:54.280 you know, if you add up
00:29:55.420 those three or four
00:29:56.360 different things that
00:29:57.680 were all happening on
00:29:58.860 that day and any of them
00:30:00.080 go wrong and you get
00:30:01.320 nuclear war.
00:30:02.660 And as you said,
00:30:04.000 Kennedy was the one
00:30:05.980 person in that room who
00:30:07.960 was willing to accept
00:30:10.240 Khrushchev's offer.
00:31:12.200 I think Adlai Stevenson
00:30:13.260 was also favorably
00:30:14.340 inclined towards it.
00:30:15.420 But all of the other
00:30:16.220 advisors, both civilian
00:30:17.280 and military, were
00:30:18.420 basically saying, don't
00:30:19.560 take this deal.
00:30:21.440 You don't want to betray
00:30:22.980 Turkey and sell them out
00:30:25.160 by trading off these
00:30:26.260 missiles.
00:30:27.100 We're ready to go in on
00:30:28.260 Monday with our invasion.
00:30:29.780 We have more nuclear
00:30:30.620 forces.
00:30:31.180 We're in a better
00:30:31.680 position.
00:30:32.160 And so they were ready
00:30:35.040 to go.
00:30:36.140 And the story that for
00:30:38.100 many years people took
00:30:39.480 from the Cuban missile
00:30:40.380 crisis is that you need
00:30:42.560 to demonstrate resolve
00:30:44.140 at all costs.
00:30:46.340 So Kennedy looked Khrushchev
00:30:47.800 in the eye.
00:30:48.880 Khrushchev blinked.
00:30:50.160 The U.S.
00:30:50.680 won.
00:30:51.420 That's the story that
00:30:52.880 people knew because the
00:30:54.480 deal to remove the
00:30:55.600 missiles from Turkey was
00:30:57.360 secret and was only
00:30:58.540 revealed 30 years later.
00:30:59.960 And only six people
00:31:02.180 knew about that deal.
00:31:03.780 So what actually saved
00:31:05.420 us in that crisis was
00:31:07.140 not fierce brinksmanship,
00:31:09.580 but the fact that both
00:31:10.900 men, both Kennedy and
00:31:12.260 Khrushchev, acknowledged
00:31:14.240 their vulnerability and
00:31:15.740 their fear.
00:31:17.040 And they could see that
00:31:17.980 this was a shared
00:31:20.120 problem that could take
00:31:21.480 down both their nations.
00:31:23.200 And so both men blinked.
00:31:24.820 And that's why we
00:31:27.360 avoided nuclear use.
00:31:28.480 Well, I want to return
00:31:31.060 to that logic of
00:31:33.040 brinksmanship and just
00:31:34.680 the game theory there
00:31:36.040 because obviously it's
00:31:36.900 relevant to our current
00:31:37.940 moment as we watch
00:31:40.040 the war in Ukraine
00:31:41.600 unfold and the
00:31:43.360 concern about first
00:31:45.160 use of nuclear weapons
00:31:46.260 has suddenly become
00:31:47.380 more relevant to
00:31:48.760 everyone.
00:31:49.260 Yeah.
00:31:49.760 But before we get
00:31:50.760 there, it's worth
00:31:52.740 focusing on this
00:31:54.380 feature of the
00:31:55.320 problem, which is
00:31:56.260 I mean, certainly
00:31:57.200 it's not talked about
00:31:58.200 enough, which is that
00:31:59.000 there's so many moments
00:32:00.260 where we have come
00:32:01.020 close to nuclear
00:32:03.100 catastrophe.
00:32:04.660 And the reason why we
00:32:06.100 haven't has come down
00:32:07.420 to a decision of a
00:32:10.020 of a single person.
00:32:12.480 You know, in the case of
00:32:13.720 JFK, it's understandable
00:32:14.880 he's the president of the
00:32:15.700 United States.
00:32:16.680 He's the person who
00:32:17.660 should be deciding
00:32:18.440 this.
00:32:19.580 I mean, you know, as
00:32:20.700 crazy as that sounds,
00:32:21.980 I'm not sure we've even
00:32:23.420 thought through the
00:32:24.180 logic and psychology
00:32:26.040 and practicality of
00:32:27.460 having even a president
00:32:28.460 make this decision.
00:32:29.460 But there are multiple
00:32:30.900 cases where you have
00:32:32.900 a low-level commander,
00:32:36.040 you know, on the
00:32:36.680 Soviet side, who's
00:32:38.340 deciding whether or not
00:32:40.580 to start a nuclear war
00:32:42.320 on the basis of some
00:32:43.500 information.
00:32:44.780 The other case was in
00:32:45.800 1983, where you had a
00:32:48.520 I think it was a
00:32:49.780 lieutenant colonel,
00:32:51.200 Stanislav Petrov, who
00:32:52.520 was, you know, got
00:32:53.920 some faulty radar
00:32:56.260 data.
00:32:57.240 He wasn't in a position
00:32:58.200 to decide whether or
00:32:59.360 not to respond with
00:33:00.400 nuclear weapons, but he
00:33:01.920 was in a position to
00:33:02.980 pass this data up the
00:33:04.680 chain, and it seems
00:33:05.960 very likely that a
00:33:07.040 retaliatory response
00:33:08.360 would have been
00:33:08.880 forthcoming.
00:33:10.240 But if memory serves,
00:33:11.960 he saw that it looked
00:33:13.320 like, based on the
00:33:14.460 radar, that the U.S.
00:33:15.800 had launched something
00:33:16.980 like five ICBMs as a
00:33:19.860 first strike, and he
00:33:20.820 reasoned that there's no
00:33:22.260 way they would just
00:33:23.360 launch five missiles.
00:33:25.500 If it's going to be a
00:33:26.080 first strike, they
00:33:26.700 would launch hundreds, so
00:33:28.060 this is probably bad
00:33:29.860 data.
00:33:30.600 But the idea that we
00:33:32.460 have a system where it
00:33:34.320 is falling to some
00:33:35.900 low-level person to
00:33:38.160 decide whether we are
00:33:40.220 on a grease slide into
00:33:42.180 nuclear Armageddon, it's
00:33:44.120 a crazy situation.
00:33:45.260 Yeah, nobody should
00:33:46.780 ever be put in that
00:33:47.940 position.
00:33:48.400 And the fact that we
00:33:51.600 relied on Vasily
00:33:52.860 Arkhipov and Stanislav
00:33:54.840 Petrov to make that
00:33:56.700 call, we need to move
00:33:58.600 away from a system where
00:33:59.900 that's even possible.
00:34:01.800 And people are not
00:34:02.900 equipped to make these
00:34:04.580 kinds of decisions under
00:34:06.940 duress.
00:34:07.720 It's just not something
00:34:09.160 that we're wired for.
00:34:11.500 And even as you say,
00:34:13.360 with political leaders like
00:34:14.400 Kennedy and Khrushchev,
00:34:15.280 yeah, Kennedy was
00:34:17.180 elected and delegated
00:34:19.320 with this level of
00:34:20.700 responsibility, but even
00:34:22.520 then, the pressure to put
00:34:24.840 on a single individual, I
00:34:26.700 think we should reject
00:34:28.260 that and we should move
00:34:29.620 away from systems in
00:34:30.840 which one person is
00:34:32.620 forced to make a
00:34:33.360 decision about the fate of
00:34:34.520 the nation in 15 minutes
00:34:36.820 or less in some cases.
00:34:38.120 You know, we still have
00:34:39.060 these very tight timelines
00:34:41.160 for decision-making, even
00:34:43.620 today.
00:34:44.820 Yeah, I mean, we'll talk
00:34:47.340 about AI.
00:34:48.760 There are reasons to be
00:34:50.620 very concerned about
00:34:51.540 taking this out of human
00:34:53.000 hands, but that suggests
00:34:54.680 that the whole thing is
00:34:56.080 totally untenable.
00:34:57.600 And even the ethics of it,
00:34:59.240 when you think about a
00:35:01.420 retaliation in response to
00:35:03.620 a perceived first strike,
00:35:06.260 that is something I spoke
00:35:08.080 about, I believe, with
00:35:09.420 William Perry when he was
00:35:10.680 on the podcast.
00:35:11.300 I mean, I just, I don't
00:35:13.260 think, it feels like we
00:35:14.540 haven't thought through
00:35:16.000 the psychology of the
00:35:18.140 moment.
00:35:18.540 I mean, imagine you're
00:35:19.420 the president of the
00:35:20.500 United States and you
00:35:22.220 have information that
00:35:24.440 your enemy, let's say it's
00:35:25.880 Russia, has just launched
00:35:27.840 a full, you know, first
00:35:30.260 strike seeking to destroy
00:35:32.720 American society.
00:35:34.760 The idea is that given
00:35:37.020 that information and given,
00:35:38.760 you know, the 15 or 30
00:35:41.180 minutes you have left to
00:35:43.000 respond, that it's the
00:35:45.080 policy of the United
00:35:46.220 States and it's actually
00:35:48.520 possible that someone's
00:35:50.760 going to follow this
00:35:51.500 policy to just unleash our
00:35:54.820 own genocidal retaliation,
00:35:57.620 you know, just get the
00:35:58.760 missiles out of the silos
00:35:59.680 before they get destroyed
00:36:00.860 so that we can kill 100 or
00:36:03.200 200 million people on the
00:36:04.640 other side quite pointlessly,
00:36:07.020 right?
00:36:07.180 There's nothing is
00:36:08.080 accomplished, you know,
00:36:09.520 you have not protected
00:36:10.540 anyone on your side by
00:36:12.160 doing this and yet it's
00:36:13.920 imagined that a U.S.
00:36:16.100 president is going to feel
00:36:17.320 that that is what he or
00:36:19.260 she wants to do in their
00:36:20.380 last minutes of life.
00:36:22.240 It really is out of, you
00:36:24.540 know, Dr.
00:36:25.080 Strangelove that we got into
00:36:26.620 this situation.
00:36:27.480 We've built this incredible
00:36:29.600 doomsday machine and each
00:36:31.920 step along the way there was
00:36:33.360 a rationale for doing what we
00:36:35.640 did and it was driven by
00:36:37.820 this sense of competition.
00:36:40.780 But when you step back and
00:36:43.220 look at the system, it's
00:36:45.220 insane that we continue to
00:36:47.460 live with this.
00:36:48.860 I remember on one of your
00:36:49.700 podcasts you mentioned it
00:36:51.020 was as if we had all wired
00:36:53.360 our homes with dynamite and
00:36:56.060 that that system just existed
00:36:58.100 in the background and then we
00:36:58.940 just all forgot about it,
00:37:00.220 right?
00:37:00.360 Right, and we just go about
00:37:01.320 our lives forgetting that
00:37:03.040 we're under the veil of this
00:37:05.920 nuclear threat and there has
00:37:08.740 been this collective amnesia,
00:37:11.040 I would say, about nuclear
00:37:12.780 weapons and we've just assumed
00:37:14.300 that they've pretty much gone
00:37:16.540 away and if they haven't gone
00:37:18.220 away, they're probably in safe
00:37:19.680 hands.
00:37:20.660 And I think that the invasion
00:37:22.680 of Ukraine by Russia has
00:37:25.660 woken some of us from that
00:37:27.720 slumber and to realize that
00:37:29.400 these weapons are very much
00:37:31.040 still a tool of statecraft and
00:37:33.240 can be used for threat making
00:37:35.260 and coercion and that nuclear
00:37:38.500 weapons remain a part of the
00:37:40.580 world and this collective
00:37:41.820 challenge that we need to find
00:37:43.280 a way to manage.
00:37:44.920 So let's talk about
00:37:45.740 proliferation and why it
00:37:48.820 hasn't proceeded further than it
00:37:50.800 has.
00:37:51.740 So we've got nine countries now
00:37:53.640 that have nuclear weapons.
00:37:55.420 If I'm not mistaken, that's the
00:37:56.900 US, Russia, China, the UK,
00:37:59.400 France, Israel, India, Pakistan,
00:38:02.860 and North Korea.
00:38:04.100 But many others have toyed with
00:38:06.420 developing them and South Africa
00:38:08.520 even had a stockpile at one point
00:38:10.440 and then dismantled it in 89.
00:38:13.720 And then obviously Ukraine and
00:38:15.520 Belarus and Kazakhstan had weapons
00:38:18.340 that were Soviet weapons that they
00:38:19.860 gave back when the Soviet system
00:38:22.720 collapsed.
00:38:23.260 How do you interpret the fact that,
00:38:26.240 I mean, this is not a successful
00:38:29.100 story of total non-proliferation,
00:38:31.660 but at one point it was imagined
00:38:33.800 that many more countries were going
00:38:35.520 to go nuclear very quickly.
00:38:37.420 So what happened?
00:38:38.840 Yeah.
00:38:39.140 So I think, I mean, this kind of
00:38:40.320 takes us back to the film as well
00:38:41.980 because when Oppenheimer leaves the
00:38:44.980 stage, the sense of most technical
00:38:48.840 experts and political experts and
00:38:51.760 military experts is that these
00:38:53.520 weapons will almost inevitably
00:38:55.340 spread.
00:38:56.540 The scientists understand that it's
00:38:58.020 not hard science, it's an
00:39:00.560 engineering problem and that any
00:39:03.540 country that can mobilize enough
00:39:05.300 resources can acquire these weapons.
00:39:08.400 And, you know, during the early
00:39:10.520 1960s, Kennedy famously said there
00:39:14.040 are 15 to 25 countries that might
00:39:16.080 acquire nuclear weapons, and it's an
00:39:19.440 interesting list
00:39:21.780 when you go back and look at it,
00:39:23.480 right?
00:39:24.240 And here we are now with only nine
00:39:28.260 countries that have nuclear weapons
00:39:30.060 and I think this is a success story
00:39:32.700 and it's a story that we should be
00:39:34.520 telling more often because it shows
00:39:36.880 that when there's sufficient will,
00:39:40.040 you can do hard things and we can
00:39:43.000 make ourselves safer.
00:39:44.020 So I think there are really four
00:39:45.860 reasons why you don't see the
00:39:48.120 unfettered spread of nuclear weapons.
00:39:50.520 One of them is that the US and the
00:39:52.920 Soviet Union essentially buy off some
00:39:55.580 of these would-be proliferators with
00:39:58.060 security guarantees and promises to
00:40:01.520 protect them if they don't acquire
00:40:02.760 nuclear weapons or lean on them in
00:40:05.140 ways that make it unlikely that they
00:40:06.980 would continue their pursuit of the
00:40:08.260 bomb.
00:40:09.320 You know, another is that this system
00:40:12.520 of international law and export
00:40:14.920 controls springs up, and that raises
00:40:17.760 the already high costs of
00:40:20.600 pursuing nuclear weapons.
00:40:22.620 There are certainly financial costs,
00:40:25.100 logistical costs, and reputational costs
00:40:27.280 for countries that want to acquire
00:40:29.080 these.
00:40:30.360 And so this system of law and export
00:40:33.160 control raises those costs.
00:40:35.120 You also have a couple cases of
00:40:37.840 counter-proliferation through military
00:40:40.560 action or sanctions that knock out
00:40:43.760 countries' programs that might have
00:40:45.980 become a threat.
00:40:47.400 But I think an underrated part of the
00:40:51.040 story is this sense, this set of norms
00:40:56.900 that emerge against nuclear weapons and
00:41:00.260 against nuclear proliferation.
00:41:01.640 And elites in many countries come to view
00:41:06.320 nuclear weapons as immoral and as
00:41:10.040 unnecessary and come to see them as
00:41:12.680 liabilities rather than assets.
00:41:15.440 And I think that's an underrated part of
00:41:16.940 the story.
00:41:17.480 So it's really a multi-causal story, but
00:41:20.200 where we are now, this is kind of the best
00:41:23.380 case scenario for someone sitting in, you
00:41:26.860 know, 1960 and looking at where this
00:41:30.760 technology might go.
00:41:33.380 And I think we can continue to build on
00:41:35.940 that.
00:41:36.780 Then what do you think about the logic of
00:41:39.680 deterrence here?
00:41:40.760 Because when you look at a country that
00:41:43.680 really has become a global malefactor like
00:41:47.180 North Korea, the reason why North Korea has
00:41:51.240 been immune to, you know, retribution or,
00:41:54.860 you know, outside meddling, apart from its
00:41:58.220 quasi-alliance with China, is the fact that
00:42:00.500 it can now retaliate. I guess in part
00:42:04.040 there's a conventional answer here.
00:42:06.020 It could just blanket South Korea with
00:42:08.280 artillery shells.
00:42:09.360 But the fact that it's nuclear seems to be
00:42:13.540 part of the picture here.
00:42:15.640 And it's just a reason why it's unthinkable to
00:42:18.700 respond to its provocations with force.
00:42:21.960 I guess another example would be Pakistan.
00:42:24.280 Now, it's like as much as we might want to
00:42:28.940 respond to something there, there might have
00:42:31.220 been several moments over the last 20 years
00:42:33.720 where it would have seemed warranted.
00:42:35.960 It's in a different category given the fact
00:42:38.880 that it has nuclear weapons.
00:42:40.140 Why do you think that, and I guess we could
00:42:42.520 speculate that Ukraine, had they ever
00:42:46.220 properly had their own nuclear arsenal and
00:42:49.280 retained it, they would not have been invaded
00:42:52.000 by Russia.
00:42:53.540 So, if we think that's actually true, you know,
00:42:56.480 strategically, why don't you think that has
00:42:58.480 just caused much more of the world to draw
00:43:01.120 the lesson that if you want to maintain your
00:43:03.280 sovereignty as a nation, you want to have at
00:43:07.040 least some nuclear bombs that you can threaten
00:43:10.500 to use?
00:43:11.000 Yeah, well, I think that North Korea and Pakistan
00:43:14.640 drew that lesson, and they live in a tough
00:43:16.860 neighborhood and face some adversaries and
00:43:19.420 decided that the only way they could achieve
00:43:21.940 their security was to acquire nuclear weapons, and
00:43:24.340 they successfully crossed that line.
00:43:27.300 And they are sort of the exception that proves
00:43:30.640 the rule, because a lot of other countries
00:43:32.760 weren't willing to subject themselves to the
00:43:36.600 types of sanctions and economic isolation in order
00:43:41.200 to achieve the bomb.
00:43:42.720 So, both Pakistan and North Korea paid a huge cost to
00:43:45.920 acquire nuclear weapons.
00:43:48.640 And people look at sanctions and say, well, they didn't
00:43:51.740 work here.
00:43:53.120 And to some extent, that's true, but I think those
00:43:55.780 sanctions also had a deterrent effect for other
00:43:58.620 countries that might have wanted to go in that
00:44:01.520 direction.
00:44:01.920 And most countries have signed the Nuclear Non-Proliferation
00:44:05.860 Treaty and have adhered to it because they realize that
00:44:10.600 while they probably could get a nuclear weapon, that would be
00:44:14.540 very expensive economically, politically, et cetera, and would
00:44:18.780 result in their isolation.
00:44:20.100 Who else do you think is poised to go nuclear now beyond
00:44:24.400 the obvious case of Iran?
00:44:27.460 Yeah, I think it's a really short list.
00:44:29.060 And I think that that's evidence of the success of this
00:44:32.460 international system that we've built over the years.
00:44:35.820 I think Iran is the only credible country that's on the
00:44:39.420 verge.
00:44:40.240 Now, if Iran acquires nuclear weapons, this could result in a
00:44:44.640 new wave of interest from countries like Saudi Arabia, for
00:44:48.920 example.
00:44:49.300 You could also imagine a world in which the US backs off of some
00:44:54.540 of its alliance commitments and basically signals that it's
00:44:58.800 not willing to defend Japan or South Korea.
00:45:03.500 And you could imagine governments in those countries proceeding
00:45:06.160 with a nuclear weapons program.
00:45:08.280 They both have access to the technology and the fissile material
00:45:12.500 if they wanted to launch a crash program to acquire the bomb.
00:45:16.300 So in some ways, these US security assurances are a key part of the
00:45:22.660 non-proliferation regime.
00:45:25.600 There's also Taiwan, right?
00:45:27.260 Which had a nuclear weapons program, beginning in the 1960s, and gave
00:45:33.900 that program up under pressure from the United States.
00:45:37.400 So nuclear weapons are out there.
00:45:39.800 They're not that hard to build.
00:45:40.900 These are 1940s technology, right?
00:45:43.620 They entered the world at the same time as microwave ovens and jet engines and things
00:45:48.140 that we take for granted as having spread everywhere, right?
00:45:51.520 So it's really this system of assurances and controls and norms that have kept these weapons
00:45:58.060 from going everywhere.
00:46:00.080 But we're only 80 years into the nuclear story, right?
00:46:04.340 That's the crazy thing is there's still people who are alive who survived Hiroshima and Nagasaki.
00:46:11.620 And that's one human lifetime.
00:46:14.700 And we don't know what's going to come next.
00:46:17.940 And what is the story that we're going to be writing 80 years from now if we can survive that long
00:46:23.420 looking back at this period?
00:46:26.380 Will we say this was a period of relative safety or this was a time where we turned the corner
00:46:33.200 and went down a dark path?
00:46:35.140 Or is this a time when we decided once and for all that these weapons are too dangerous to live with
00:46:41.840 and we push them to the side and stop relying on them as heavily?
00:46:47.460 I think the most likely scenario is the status quo where these things continue to hum along
00:46:53.580 in the background and we all pretend that they don't really exist.
00:46:58.580 But every year we're running some non-zero risk.
00:47:02.980 You keep rolling those dice year after year and the chance for human miscalculation,
00:47:11.120 for technical accident, for deliberate use, every year you're taking a risk.
00:47:17.460 Yeah, that's the most sobering part of it, the idea that we're rolling those dice year after year
00:47:24.980 and as a matter of probability, it's compounding.
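(A worked illustration of that compounding, with an assumed annual risk rather than anything quoted in this conversation: if the probability of a nuclear catastrophe in any given year is p, the chance of getting through N years unscathed is (1 − p)^N, so the cumulative risk is 1 − (1 − p)^N. At a purely illustrative p of 1% per year, that cumulative risk is roughly 26% over 30 years and about 55% over 80 years, which is why even a small annual probability becomes hard to dismiss over the length of the nuclear age.)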
00:47:29.080 And it's all being maintained by an aging infrastructure, which I guess in some of the...
00:47:38.460 We'll talk about the dangers of things like cyber attacks, et cetera.
00:47:42.400 But maybe there are some ways in which the antiquity of this system has a silver lining
00:47:48.760 because presumably it's not as...
00:47:51.240 Maybe it's not as hackable as it would be if it was all being run on the latest operating system.
00:47:56.840 Yeah, they've upgraded it now.
00:47:59.280 So we're now on digital systems with nuclear command and control.
00:48:04.680 And I think that enhances reliability.
00:48:07.840 But as you mentioned, it creates certain cyber vulnerabilities.
00:48:12.200 And nobody knows what those cyber vulnerabilities are in every country.
00:48:17.400 There are some people who believe they know a lot about their own country's vulnerabilities.
00:48:21.880 But as you say, there are nine nuclear weapon states and they all have different systems
00:48:26.220 for managing nuclear weapons.
00:48:28.720 And there's the possibility that one side will attack a nuclear arsenal in a way that leads
00:48:38.200 to nuclear escalation.
00:48:40.400 That's an additional terrifying variable here, which is that really we're at the mercy of the
00:48:47.320 weakest link in that chain.
00:48:49.320 I mean, we might completely lock down our system in the United States and feel that it's really
00:48:57.800 perfect.
00:48:59.340 You know, such that the chance that we're going to do something by accident is zero.
00:49:05.260 Now, of course, we could never achieve that.
00:49:08.320 Right.
00:49:08.560 But, you know, even if we did, in the best possible case, we're at the mercy of whatever China
00:49:15.740 and Russia and other possible adversaries have.
00:49:18.080 And North Korea.
00:49:18.700 Yeah.
00:49:19.440 You know, how good are their systems?
00:49:20.800 I don't want to be at the mercy of North Korea's systems.
00:49:24.480 It's incredible that we're in this situation.
00:49:26.620 And then you read, I'm sure you've read Eric Schlosser's book, Command and Control.
00:49:33.040 Yeah, it's a masterpiece.
00:49:34.040 You read about the preparations we have made for, you know, the continuity of government.
00:49:40.160 And it is a dark comedy.
00:49:42.620 Yeah.
00:49:42.800 The steps we've had to take to figure out what to do in the event of a full-scale nuclear
00:49:49.500 exchange.
00:49:50.220 You know, it's so deeply impractical and insane.
00:49:55.960 And I mean, again, it's easy to see how we have escalated ourselves into this untenable
00:50:02.440 situation, but, you know, you've got this perverse ratchet that just keeps turning in one direction.
00:50:08.360 But that we got there and we're left with the machinations that we imagine are going to
00:50:17.320 safeguard, you know, our survival, it's just, it's bonkers.
00:50:20.900 I think it's worth looking at a few of the moments where we actually released tension from
00:50:26.760 that ratchet because it hasn't always been inevitably increasing.
00:50:31.940 One of them is in 1986 when Reagan and Gorbachev meet and they agree that a nuclear war can never
00:50:41.140 be won and must never be fought.
00:50:43.320 And they fell short of some of the deep cuts that were discussed at the Reykjavik summit,
00:50:48.640 but they left with a shared understanding and Gorbachev went back believing that the U.S.
00:50:55.040 would not launch a nuclear attack on the Soviet Union.
00:50:59.020 They'd previously been very afraid that the U.S. was preparing to do that.
00:51:02.220 So that sense of shared understanding allowed for the intermediate range nuclear forces agreement,
00:51:09.680 which limited some of the most destabilizing weapons in Europe.
00:51:13.100 So that's one example.
00:51:13.900 Another is in 1991, when, unilaterally, President George H.W. Bush just takes all of the U.S. tactical
00:51:23.920 nuclear weapons and he takes them off alert and off of the surface ships, et cetera.
00:51:30.740 And this is just a recognition of a change in the security environment after the fall of
00:51:35.960 the Soviet Union.
00:51:37.140 And he didn't need to negotiate an extensive treaty, but I think rather courageously just said,
00:51:43.160 we can move first and had this presidential nuclear initiative that was then reciprocated
00:51:50.200 by Russia.
00:51:50.900 And so that's one of the cases where you have this ratchet going in the other direction.
00:51:56.360 And so there are things that we have done in the past to take a little pressure out of
00:52:01.960 the system.
00:52:02.620 Unfortunately, where we are now is going in the wrong direction.
00:52:06.820 We've gotten spoiled.
00:52:07.780 The past 30 years or so has been a period of relatively low nuclear risk.
00:52:14.500 And with Russia's invasion of Ukraine, I feel like we've entered a new period of escalating
00:52:21.480 nuclear risk.
00:52:22.680 And this is something that people have been talking about for some time, but you can see
00:52:27.540 it really manifesting itself.
00:52:29.060 You're fighting a conventional war in the nuclear shadow, in which Vladimir Putin has made references
00:52:37.560 to nuclear weapons and issued threats.
00:52:40.120 And then he's occasionally walked them back, but some other spokespeople have gone forward
00:52:48.060 and made those threats again.
00:52:49.580 So we have this period of heightened risk.
00:52:52.660 And in the background is a new relationship with China and their nuclear arsenal.
00:52:59.840 So China for many years has had a small recessed nuclear arsenal, and they are in the process
00:53:06.520 of doubling or tripling that arsenal.
00:53:08.740 They could have as many as 1,500 nuclear weapons by the 2030s.
00:53:13.540 And that is going to reshape this competition because we've never had a three-way nuclear
00:53:22.200 standoff in the way that we soon will.
00:53:25.840 So let's take those as separate cases.
00:53:29.560 Let's talk about Russia and Ukraine first.
00:53:33.300 The threats we've heard from Putin and other spokespeople in Russia, have those all been with
00:53:41.560 respect to the use of tactical weapons in the theater of conflict in Ukraine?
00:53:46.920 Usually it's not specified.
00:53:48.980 Usually they're making some reference.
00:53:51.460 So, you know, for example, in February, Putin said, if Ukraine attempts to take back Crimea,
00:53:59.020 European countries will be in conflict with Russia, which is a leading nuclear power superior
00:54:03.920 to many NATO countries in terms of nuclear force.
00:54:06.760 In that case, it's a vague threat, but it's referencing nuclear forces that could be used.
00:54:13.560 And then later, Putin mentions that they are raising the alert of their nuclear forces.
00:54:21.340 It turns out that appears to have been bogus, and the U.S. intelligence community mentions
00:54:26.380 that they don't see any difference in the operational patterns of Russia's forces.
00:54:33.140 But it's clear that he's trying to manipulate risk and to raise the prospects that nuclear
00:54:40.880 weapons would be used.
00:54:42.620 And presumably, it would be a tactical or battlefield nuclear weapon rather than a strategic nuclear
00:54:50.340 weapon.
00:54:51.200 But we just don't know.
00:54:52.700 We know that Russian nuclear doctrine says that they would only use nuclear weapons if the
00:54:58.720 existence of the state is threatened.
00:55:01.840 But at various points, Putin and other officials have made statements that seem to signal a broader
00:55:07.680 interpretation of that in a way that I think we need to take seriously, even if we recognize
00:55:13.480 that they have some desire to manipulate that risk.
00:55:16.680 So, when this war started and the obvious threat of nuclear escalation was first discussed, many people
00:55:29.520 immediately drew the lesson, seemingly the wrong lesson from the Cuban Missile Crisis, which is that you just
00:55:37.760 can't blink, right?
00:55:38.620 You can't give in to nuclear blackmail.
00:55:40.760 We don't want to set that precedent.
00:55:41.740 For one, it's terrible because it means that anyone who has nuclear weapons can basically
00:55:47.420 do whatever they want conventionally, you know, as long as they purport to be suicidal.
00:55:54.220 And I guess I'm wondering what you think about what we've done so far and whether you think
00:56:02.140 we have been, we, the U.S., I guess, and NATO, have been impeccable in how we have not caved in to
00:56:11.740 Russian demands or whatever.
00:56:13.420 Yeah, I think it's been a well-calibrated response overall.
00:56:19.180 And you could see there is not a rush to invoke nuclear weapons as a response.
00:56:25.820 There is a seriousness and a cautiousness with which the Biden administration has approached
00:56:32.540 this issue while continuing to support Ukraine's righteous defense of its territory.
00:56:41.100 And I think it's a really hard line to walk because it's not clear where the lines are.
00:56:50.460 What do you think we would do if Russia used tactical nukes on Ukraine?
00:56:55.080 I don't think anyone knows for sure, but I suspect the U.S. would strike with conventional
00:57:01.720 forces the units that launched the attack and would also strike other forces that are of great value
00:57:10.760 to Russia. For example, sinking some warships in the Black Sea or striking other targets and indicate
00:57:19.240 that this represents an escalation in the war, but without expanding in a way that could lead
00:57:26.680 to all-out nuclear war. I think that would be the attempt. But who knows?
00:57:30.840 Yeah. There are not that many stages beyond that, right?
00:57:33.960 Right.
00:57:34.520 That seems completely sensible to me. But then when you imagine what happens next,
00:57:41.560 there's just not that many stops on the way to the end of everything.
00:57:44.920 It's interesting. I saw you had Fred Kaplan on the podcast. He's an amazing, he's a national
00:57:50.280 treasure. That book's a great book. And he describes in it a set of war games and exercises
00:57:55.960 that were conducted during the Obama administration over a fictitious scenario, a war game in the
00:58:02.680 Baltics in which Russia had invaded and occupied the Baltics and had used nuclear weapons. And they
00:58:09.160 played the simulation or war game out twice, once with the principals, so the Secretary of Defense,
00:58:16.840 the Secretary of State, et cetera, and once with the deputies, the deputy secretaries, et cetera.
00:58:21.400 And the outcome was different in each case. The principals responded with a nuclear weapon and
00:58:27.560 the deputies did not. So a lot of it depends on who's at the table and who's advocating for what.
00:58:33.640 Mm-hmm.
00:58:34.280 Now, with any of these war game scenarios, they're different than what someone would be encountering
00:58:40.600 when really making a decision. I think they're really useful to try to help prepare ourselves to
00:58:46.440 think the unthinkable, to think about what we would do when sitting in that chair, but they can also
00:58:54.040 mislead in various ways too. I think one of the interesting questions we might ask is why hasn't
00:59:01.480 Russia used nuclear weapons yet, right? Because we know they see this conflict as being essential to
00:59:08.600 their security. It's sometimes described as existential. They have nuclear weapons, including
00:59:14.920 relatively low-yield tactical weapons that they could use on the battlefield to try to
00:59:20.360 achieve a tactical goal, but they haven't. And I think there are a few reasons. I mean,
00:59:27.000 one, we don't know how this ends and maybe they're not desperate enough and maybe that's why they
00:59:31.640 haven't used them. There's also a deterrence element from NATO and from Ukraine. But I think that there's
00:59:39.560 another piece of the puzzle too, which is that even for Russia and Vladimir Putin,
00:59:45.720 these weapons are seen as a line that he is reluctant to cross. And that's in part a result
00:59:53.000 of this history of 78 years of non-use of nuclear weapons. The Soviet Union had this major rhetorical
01:00:02.200 talking point throughout the Cold War that we weren't the ones who used nuclear weapons,
01:00:05.880 it was the US that used these terrible weapons. And there's been this distinction that we've drawn
01:00:13.000 over the years, it wasn't always like this, but that nuclear weapons are something different.
01:00:17.560 So if Russia were to cross that line, they would be paying a price in doing that reputationally.
01:00:23.800 You know, three quarters of the people in the world live in countries that haven't really taken sides
01:00:29.000 in this conflict. And we've heard that China and India have indicated to Russia that Russia should
01:00:37.480 not use nuclear weapons in this conflict. And so there are considerations that are other than military.
01:00:45.400 Now, one of my fears is that if a country does use nuclear weapons, and especially uses a small,
01:00:54.280 relatively small battlefield weapon, there will not be the sorts of massive deaths and casualties that
01:01:01.400 we saw from Hiroshima and Nagasaki. And a lot of people are going to look around and say,
01:01:06.040 that's it? What's the big deal? And you could imagine that leading to a new wave of interest in nuclear
01:01:15.640 weapons and a new wave of proliferation. It also could lead to a rejection of nuclear weapons, to people
01:01:23.240 saying, we should never use these things again. And so I think whatever happens immediately in the aftermath
01:01:31.160 of the next use of nuclear weapons, if there is one, could shape our relationship with these weapons
01:01:38.280 for the future. And this nuclear taboo that we've had for the past 78 years is something that benefits
01:01:46.200 us all. And we should really work to preserve that. Yeah, well, it's somewhat analogous to the taboos
01:01:52.200 around chemical weapons and biological weapons. And I'd heard recently, I don't know if this is common
01:01:59.320 knowledge and I just missed it. But I'd heard that at one point, we realized we could create laser
01:02:06.360 weapons that would just permanently blind soldiers on the battlefield. And we just didn't go down that
01:02:11.400 path at all, because it just seemed so ghastly to ever put that into use. Which is interesting,
01:02:18.120 because on some level, it's not nearly as bad as the other things we have developed.
01:02:21.640 I don't know why it was so obviously unethical to the people who saw that this technology was in
01:02:29.640 reach. But there is just something horrible about the idea of effortlessly blinding people en masse
01:02:37.240 as a way of winning a war. Yet, we're willing to blow them up, riddle them with shrapnel,
01:02:43.720 etc. And yet, silently blinding everybody is just, we're not going to go there. Do you have any
01:02:50.200 intuitions about why that struck us as just totally untenable, ethically?
01:02:54.040 Yeah, I'm not sure. But you have, at various times, an effort to make war more humane and to limit
01:03:02.200 the types of activities you would engage in. Even in World War I, there was an effort before the war
01:03:08.280 started to limit the use of poison gas. But then, once one side used poison gas, and initially it
01:03:17.240 wasn't the type that killed you, it was a less deadly form of gas, all of a sudden that line was
01:03:24.120 crossed and it became commonplace to do this horrible thing. And so these norms, I think, can be
01:03:30.920 really valuable, but they can be fragile as well. And I don't know exactly what to make of it.
01:03:35.400 You see an effort to ban landmines, and cluster munitions, and these other devices that are
01:03:44.840 disproportionate in their humanitarian consequence, right? They're just really awful weapons that harm
01:03:51.720 civilians. And then, we have these weapons, nuclear weapons, that are inherently inhumane in just about
01:03:59.880 every circumstance you could imagine them being used, right? We plan to conduct mass murder on a
01:04:07.560 scale that is hard to comprehend in the service of national security. So even as you're preventing
01:04:15.880 blinding lasers and landmines, you still have plans on the books to incinerate cities,
01:04:24.040 or incinerate military bases that are adjacent to cities, which would have resulted in massive
01:04:30.920 fallout and death. It's one of the great contradictions. And I think, you know, to go back to
01:04:37.320 the film Oppenheimer, part of what's captured is the decision to develop the H-bomb, which
01:04:45.400 is about what the role of these weapons is going to be in society and in warfare going forward.
01:04:54.280 And there were a group of people who felt nuclear weapons were like any other weapon,
01:05:01.640 and that we ought to develop them and put them in the hands of the military. And Truman eventually
01:05:09.080 pushed back against that and took control back and put these in the hands of civilians. And that's
01:05:17.000 where it's been in the US and in other nuclear countries as well, that these weapons are different
01:05:24.200 than just military devices that can be sent out to the local commanders. But we have a really imperfect
01:05:32.920 history there about how they've been used and practiced.
01:05:36.760 What do you think about the growing tensions between the US and China, specifically around
01:05:43.640 our somewhat equivocal commitment to protecting Taiwan?
01:05:49.320 Yeah, I think if there is a hot war between the US and China, it will be over Taiwan. I think that's
01:05:58.600 the only issue that approaches the stakes. And the US has become less equivocal under the Biden
01:06:06.680 administration about its willingness to defend Taiwan. And...
01:06:10.760 Were those moments essentially gaffes on his part, where he basically said we would
01:06:15.160 defend Taiwan even though our official doctrine is strategic ambiguity or something like that?
01:06:21.960 Yeah, I don't think so. I don't think they were. I think it reflects an increased willingness to
01:06:29.560 stand up to China or to try to stand up to China in this case. And I am deeply concerned about the path
01:06:39.080 that we're on because it seems like we are on a collision course with China. And nobody really
01:06:44.600 knows what the right approach is to avoid war with China. Because there are risks and costs to both approaches.
01:06:53.320 So...
01:06:54.680 Well, what's the risk of... So we're strangely, and we, as I think the entire world is, strangely dependent
01:07:03.560 on Taiwanese manufacturing of semiconductors. But if we on-shored all of that supply chain,
01:07:12.040 and were no longer dependent on them, can you imagine that we would suddenly decide,
01:07:16.680 they're not a critical US interest anymore, and we don't need to have a policy that we're going to
01:07:22.440 come to their rescue? Or does that then make Japan and South Korea suddenly worried that we're not the
01:07:30.680 ally we claim to be, and then they go nuclear? Yeah, I think that's the central debate that we're
01:07:36.520 going to have in the coming years, as the US becomes less dependent on Taiwan for its technology,
01:07:43.720 and as China becomes more powerful relative to the US. And China has been building up its military
01:07:51.080 in order to assert its dominance in the Western Pacific. And it's not clear how long the US can
01:08:00.040 preserve its advantage. And a US president is going to have to make a hard choice at some point.
01:08:07.800 There's a fair amount of talk about the coming demographic collapse in China, and that they're
01:08:15.080 really just not going to be what we feared going forward. I don't know if you have followed the
01:08:22.840 work of Peter Zeihan, or anyone else who's been hitting this topic of late. But yeah, I haven't
01:08:30.280 been following it that closely, but it does sound that the narrative on China has shifted a little bit.
01:08:36.040 Yeah, yeah. Although I don't know if that could lead them to do something more reckless rather than
01:08:43.000 less reckless in the meantime. They may feel like they have a closing window to resolve this problem.
01:08:49.560 Right. And Xi Jinping has said that he does not want to pass the Taiwan issue on. He wants to deal
01:08:56.680 with it during his tenure. I'm sure he'd like to. I don't know if he's committed to doing that.
01:09:02.200 Hmm. So, given these background concerns that we have collectively built a doomsday device,
01:09:13.160 and it's on, to one degree or another, a hair trigger, or many hair triggers, or triggers that,
01:09:20.440 the integrity of which we can't assess. And now we have this growing concern about misinformation and
01:09:27.160 disinformation and cyber attacks and deep fakes. We have this digital layer of culture that is proving
01:09:36.680 to be a kind of a hallucination machine. How are you thinking about the advent of these new digital
01:09:45.000 problems? And if we throw in generative AI and AI control of our actual nuclear infrastructure,
01:09:53.560 ultimately, how are you thinking about recent developments in tech in light of everything
01:10:00.600 we've just talked about? Yeah.
01:10:01.560 Well, I think it's really concerning. And there's a couple of reasons for concern. You've mentioned
01:10:06.040 one of them, which is just: do leaders and decision makers understand the context in which they're making
01:10:15.240 decisions? And there's an opportunity to create disinformation about a particular conflict or
01:10:22.760 crisis, right? And then at a more granular level, there is a set of systems that enable nuclear use,
01:10:32.600 command and control, communications. And these systems rely upon a digital infrastructure,
01:10:41.880 and they need to be executed perfectly every time and with great speed. So you have a network of early
01:10:51.960 warning satellites and radars, and you have communications nodes, and you have decision makers
01:10:59.800 who then receive the information from these various sensors and have to make sense of it.
01:11:05.480 And I think in many countries, there's going to be a strong incentive to use AI to synthesize
01:11:15.640 that data and provide decision-making support to the relevant decision makers as quickly and accurately as
01:11:25.960 possible. And to some extent, this is just software, right? This is what military planners do.
01:11:35.000 They take state-of-the-art software, and they integrate it into their systems. And so we will be
01:11:41.240 relying increasingly on this processing of the information by something that you could consider
01:11:48.040 as AI, right? Now, there's a strong commitment by the US military and by US decision-makers to never
01:11:55.800 let an AI agent make a decision. There always needs to be a human in the loop and a human making the
01:12:02.920 decision to use a nuclear weapon system. My concern is that all of the processing of the information and
01:12:12.600 the interpretation of the information could be done by an AI system in a way that leaves humans essentially
01:12:19.880 as button pushers. Are you really going to reject the conclusions of a system that has proved 99%
01:12:31.560 reliable and that's built on state-of-the-art software and hardware? And it just really seems to be the best way to
01:12:41.880 support your decisions. And that's, I think, the slippery slope we might go on. And there are some efforts in
01:12:50.200 Congress to limit that. I think that, you know, as with other command and control issues, we are only as
01:12:57.880 safe and secure as the weakest link in the chain. And so we need to be getting together now with Russia,
01:13:05.720 China, other countries to figure out how can we avoid this slippery slope in which we are essentially
01:13:15.240 delegating nuclear decisions to an algorithm. Because that's a really scary world.
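(A rough numerical sketch of why a "99% reliable" system can still mislead, using assumed figures rather than anything cited in this conversation: suppose a warning and decision-support system is right 99% of the time, and suppose a genuine surprise attack underlies only 1 in 100,000 of the situations that trigger an alert. By Bayes' rule, the chance that a given alarm reflects a real attack is 0.99 × (1/100,000) ÷ [0.99 × (1/100,000) + 0.01 × (99,999/100,000)], which comes out to roughly 0.1%. Under those assumptions, nearly every alarm from even an excellent system would be a false one, while the time pressure described above pushes the human toward deferring to the machine.)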
01:13:20.520 Yeah, it is. Except if you imagine that you have AI that you are wise to trust, right? Because again,
01:13:31.560 we're talking about situations where you don't have a lot of time, right? If you've got 15 minutes to make
01:13:37.560 a decision and you either have an ape who doesn't have time to consult with other apes, or you have some
01:13:45.960 AI system that you have put in place that you really think is analogous to a chess engine that's just
01:13:52.120 better at chess than people are, right? Yeah. I mean, I think you've put your finger on it,
01:13:57.400 which is that these digital systems and these human systems are prone to different modes of failure.
01:14:05.080 And the problem, fundamentally, is making high-stakes decisions under incredible time pressure.
01:14:15.560 That's the fundamental problem. And that's what I think we need to move back from. We need to devise
01:14:21.800 a system that allows us to be safe and secure without relying on a decision in minutes that
01:14:30.040 could imperil the world. Because whether you're delegating that decision to machines or to people,
01:14:36.440 there are these failure modes. And I don't know which is better, right? I just reject the premise that
01:14:44.360 we need to accept that. Is there a path back to zero here? I mean,
01:14:49.880 has anyone articulated a plausible path whereby we would just recognize that the only way to win
01:14:56.760 this game is not to play it at all? I mean, it seems really implausible at this particular moment,
01:15:02.760 given the height of tensions with Russia, China. We haven't even talked about India or Pakistan
01:15:09.640 or Israel's reliance on nuclear weapons, North Korea. There are a lot of countries that possess these
01:15:15.400 weapons and have a strong desire and incentive to keep them, right? So I think it needs to be,
01:15:24.760 if we ever move in this direction, it needs to be a joint project in which collectively,
01:15:30.280 we recognize that these weapons pose an unacceptable risk to humanity and to our nations,
01:15:37.800 and that systematically, step by step, in a safe way, we're going to pull back from the brink.
01:15:47.080 Because there are certainly risks to moving too quickly and to leaving vulnerabilities. But I think
01:15:54.440 the first thing we need to do is to recognize that we've got a problem and that fundamentally,
01:16:01.880 we've wired all our homes with dynamite, right? We haven't even acknowledged that, right?
01:16:06.920 And once we acknowledge that there can be a better way to resolve our differences without resort to
01:16:15.160 nuclear threats, then we can start moving in the right direction. The Obama administration put forward
01:16:21.880 this plan, a graduated approach towards a world free of nuclear weapons, and it was rejected by Russia,
01:16:30.040 in part because they saw it as a ploy. And so the world we live in now, you can't just take nuclear
01:16:37.480 weapons out of that world and expect that to be a safe world. It's naive and unrealistic. But we need
01:16:43.960 to work towards greater mechanisms of collective security in which we reach the point that there's
01:16:49.960 no conflict that's worth fighting that we would consider annihilating each other's cities for.
01:16:54.920 Well, on that point, do you think that the current status quo of mutually assured destruction has
01:17:02.120 kept us over the last 75 years from fighting the conventional version of World War III?
01:17:09.480 It's interesting that you say mutually assured destruction because this phrase is often evoked.
01:17:15.000 This is not a deliberate strategy so much as a condition that people had to accept, right?
01:17:22.040 And there was always a desire, especially within the US, to escape from this condition of mutual assured
01:17:28.920 destruction. Because if deterrence is stable at the nuclear level, it allows for potentially
01:17:35.640 conventional aggression below the nuclear level, right? This is that stability-instability paradox.
01:17:43.160 And so there was always a desire to maintain some nuclear superiority. The world that
01:17:51.320 we are confronted with is a world of anxiety and fear. And you can have nuclear stability for a while,
01:18:01.480 but then something comes along to challenge that nuclear stability. I think that if you look at the
01:18:07.720 way leaders thought about nuclear weapons throughout the Cold War, it did have a dampening effect on
01:18:15.560 their goals and aspirations and their willingness to engage in war, especially between the great powers,
01:18:23.880 right? But it pushed that conflict elsewhere. So instead of fighting a conventional war in Europe,
01:18:32.360 there were these proxy wars that were fought in Korea and in Vietnam and in Afghanistan. And the Cold War was,
01:18:42.440 it was a relatively peaceful time if you lived in the United States, but it was not a peaceful time for
01:18:51.560 the populations that were affected by these proxy wars. There were just some really awful, brutal conflicts
01:18:59.000 that were a result of this rivalry. And so I think nuclear deterrence has certainly had some
01:19:10.040 benefits, but it has come at the cost of these various close calls and at the cost of pushing conflict elsewhere.
01:19:19.000 Well, I know we all await the wisdom of governments in figuring out how to mitigate this threat, but
01:19:27.720 what is the role or opportunities for philanthropy here? Because I know you're currently at Longview
01:19:37.000 Philanthropy and leading their program on nuclear weapons and existential risk. And Longview has been
01:19:44.120 advising me and the Waking Up Foundation on how we give out money each year.
01:19:48.600 Philanthropically, what can private citizens do to help?
01:19:53.880 Yeah. So I think from the start of the nuclear age, scientists and activists and non-governmental
01:20:01.320 experts have played a really key role in auditing government activities and putting pressure and changing
01:20:09.160 the incentives for what government actors wanted to do. In general, these weapons are the domain of
01:20:15.240 governments. They're in the hands of government and military leaders. And that is as it should be. But
01:20:22.840 the voices of citizens are really important too in setting the tone and the voices of experts as well.
01:20:29.960 So I think you could see that in the role of academic experts in understanding nuclear deterrence and
01:20:38.440 shaping the field of arms control. You can see that today in the work of many NGOs who work really
01:20:45.080 hard to make information publicly accessible, and in the role of media organizations that report on these
01:20:52.280 things. But this is a contracting field. The largest funder in the space, the MacArthur
01:20:59.000 Foundation, chose in 2020 to exit the field. And so there are a lot of these non-governmental
01:21:06.120 organizations that are essentially starved for cash.
01:21:09.640 And what happened there? Why did MacArthur get out of the saving the future game?
01:21:15.960 They were reorganizing their portfolio and they had placed a big bet on nuclear weapons. And they did an
01:21:24.600 assessment of that and determined that while the grantees were making great contributions and informing
01:21:30.760 official policy and informing the public, they didn't see a line of sight to achieving their big
01:21:36.040 bet goal. And so the board ultimately decided that they didn't want to do this anymore. And I don't
01:21:44.360 think that's the right choice. But at the same time, I think the MacArthur Foundation should be applauded
01:21:50.200 for their many years of investment in this because there are lots of other foundations who haven't done
01:21:55.720 anything in this space. And when I look at that, I just think about how large and consequential an
01:22:02.840 issue this is and how important it is to have non-governmental voices. And the amount of money
01:22:10.120 that is going into the sector is tiny in comparison.
01:22:14.760 What is it? Can you estimate what the funding is?
01:22:18.360 Yeah. So the Peace and Security Funders Group seeks to estimate the total non-governmental
01:22:25.480 spending in this space. And I think that we don't have the numbers for this year,
01:22:29.240 but it'll be somewhere around $30 million.
01:22:31.240 Oh my God. That really is paltry given what we're talking about.
01:22:35.720 Yeah.
01:22:36.680 Wow. Is that all the organizations that are in the space? I mean,
01:22:41.400 something like the Plowshares Fund and you're including all of those?
01:22:45.160 We're including the grants that Plowshares makes. Yeah. In that total.
01:22:51.640 Man. Okay. Well, this is an appeal to audience members. This is a game that we obviously need to
01:22:59.320 win. And it's astonishing to me that we're talking about this level of funding for a problem of this
01:23:08.680 sort. When you look at what gets funded and at what scale, there are startups that no one's ever
01:23:17.560 heard of and will never hear of that have raised 10 times that amount of money and then they evaporate.
01:23:25.160 It's just, this is all upside down. So I am going to be giving money to this. I've already given money
01:23:34.040 to Plowshares and others, but this is going to be a top priority going forward. And I would just
01:23:40.840 welcome that all of you get involved to the degree that you can.
01:23:45.640 I know, Carl, Longview is opening a nuclear weapons policy fund, right? Can you say something about that?
01:23:54.760 So we see this as a really neglected problem that just affects all of us alive today. And we need
01:24:03.560 non-governmental voices, the voices of scholars and scientists and activists in order to help shape
01:24:11.320 these policies. And I think from the start of the nuclear age, these voices have been essential.
01:24:16.680 So we're putting together this fund to try to raise money. None of it goes to Longview Philanthropy,
01:24:22.920 100% goes directly to the beneficiaries. And so what types of groups are we likely to fund?
01:24:30.280 Well, for example, the Carnegie Endowment for International Peace is working on this issue of
01:24:37.480 inadvertent nuclear escalation and looking at the ways that technological entanglement of conventional
01:24:45.800 and nuclear systems could lead to the inadvertent use of nuclear weapons. You have a group called
01:24:52.280 the Council on Strategic Risks, which is looking at some of the most dangerous nuclear systems that are
01:24:59.400 in development. For example, the sea-launched cruise missile, which the US administration did a
01:25:06.440 review of, decided it didn't need, but Congress then put the money back in for it. And this weapon is
01:25:12.120 escalatory because it has target and payload ambiguity. So when it's launched, you don't
01:25:18.760 know exactly where it's going and you don't know whether it carries a nuclear or a conventional warhead.
01:25:25.400 So these are the types of interventions that we think are really important at the moment. We need,
01:25:31.480 broadly, a civil society effort to elevate this issue and return it to a position of concern within
01:25:39.880 society. And I think there are just so many ways to contribute to nuclear risk reduction. And one
01:25:46.600 of them is financially, if you're in a position to do that. But I think this is an issue for everyone.
01:25:53.080 And I think that we should all add nuclear weapons to our portfolio of concern. And I know that's a big
01:26:00.440 ask because there are just so many things to worry about these days, but we're not going to get better
01:26:05.400 policies unless people remember the threat that these weapons pose and support political space
01:26:13.080 for the US, if you're in the US, to negotiate with Russia and China to reduce these shared risks.
01:26:20.680 And if you're not in a position to give financially, you still have a political voice
01:26:26.120 and you can talk about these issues with your friends and amplify helpful messages on social media.
01:26:31.800 And if you are in a position to give financial support, there are so many good, dedicated people
01:26:39.720 who have spent their lives preparing to try to contribute. And they're struggling right now because
01:26:46.680 the space has contracted and a little bit of money goes a really long way here. And our job at Longview
01:26:53.960 Philanthropy is to try to find the best, highest impact projects and then to put that money to use.
01:27:00.680 So we have a great team and we can go out and investigate and find groups that we think are
01:27:08.120 doing work that is the most effective. And then we can network them together and help them be more
01:27:15.320 effective than they would be operating in isolation. So by all means, if you already know of a group
01:27:21.640 working on nuclear weapons risk reduction, you can always support them directly. But if you're not sure
01:27:26.920 what to do, we want to make it really easy for people to make a difference here.
01:27:32.920 Well, that's great. And we will put a link to the foundation page when this podcast goes out and
01:27:39.960 will be on my blog and in the show notes and in the associated email. Lastly, Carl, imagine we have
01:27:49.400 some in our audience who are just going to college now or they're midstream in their
01:27:56.200 undergraduate years and they are suddenly struck by the opportunity to live a truly meaningful life
01:28:04.920 by trying to grapple with this particular problem. I imagine there are many paths through a university
01:28:12.040 and perhaps through a graduate program that could equip somebody to meaningfully put their shoulder
01:28:17.320 to the wheel here. But what are a couple of paths that strike you as especially promising?
01:28:23.400 Well, I have incredible respect for the government officials who grapple with these problems and
01:28:29.000 they're not easy and they're operating under a lot of constraints. So we need really good people
01:28:36.360 in government working on these issues. So I think a career in government is excellent,
01:28:41.960 an excellent path, both in the short term, where you can contribute, and in the longer term, where you're developing
01:28:47.080 skills, connections, and perspectives that will be helpful. There are a lot of graduate programs that
01:28:52.680 prepare you both in terms of science and policy to have a high impact career in this space. But beyond that,
01:28:59.560 I think we need people with a variety of skills. So if you are an artist or a graphic designer, you can
01:29:09.960 contribute in that way. If you do social media, we need people who can tell great human stories about the
01:29:17.880 way nuclear weapons have affected us and the risks we continue to run. And I think there's a really
01:29:25.800 important role for civil society and for citizens and for outside experts to provide support for
01:29:34.680 government efforts, but also to critique them and audit them and to hold people to account because
01:29:40.680 there are large bureaucracies that are at work, that are chugging away, producing these outcomes that
01:29:47.320 are inimical to our collective security. And so you need people who are willing to call that out.
01:29:54.440 One example is this guy, Bruce Blair, who passed away a few years ago, but is just a hero to me. He's
01:30:01.720 this veteran nuclear launch officer, and he became a deep expert in nuclear command and control and a
01:30:08.440 really dedicated truth teller to expose the dangers that are inherent in this whole enterprise. And someone
01:30:15.880 like that, he knew the generals and the admirals, and he knew people in the Russian enterprise as well,
01:30:23.880 and he spoke with great clarity and conviction. But he was able to provide a counterpoint to some of the
01:30:30.920 official narratives in a way that I think is really healthy. And then you also have people who work in
01:30:37.880 and out of government and develop the expertise and the connections they need outside of government,
01:30:44.200 and then bring that in. So a good example of this is Rose Gottemoeller, who worked in government early in
01:30:49.080 her career. And then she went to work at the Carnegie Moscow Center. And the expertise that she built up
01:30:55.960 was really helpful when she was appointed as the chief negotiator for the New START Treaty. And she
01:31:02.200 describes in her book how that was a really important part of getting that treaty done, and then the role of
01:31:10.840 civil society in getting that treaty passed through the Senate, because you need a two-thirds majority
01:31:17.000 for treaty ratification. So providing political space for cooperation is essential, because it's really
01:31:26.040 hard these days to talk about cooperating with Russia and China. And I get it, right? These are countries
01:31:33.320 that are, in some cases, they're doing really awful things. But we have a shared threat that we need to
01:31:38.040 manage. And I think that's one of the roles of civil society is opening doors for work in that area.
01:31:45.800 Yeah. And that point brings us full circle to what Christopher Nolan has just accomplished with
01:31:52.360 his film. I mean, it's just, you know, it's a work of art, but perhaps more than anything in recent
01:31:57.240 memory, it's made this problem unignorable for so many millions of people. So it's, I mean, props to him.
01:32:04.360 Yeah. I mean, there's just so many important themes. Yeah. Like, in terms of the way it deals
01:32:08.280 with the role of scientists in society, and we just see echoes of this today in the way scientific
01:32:14.680 expertise is sidelined in the public sphere, from vaccines to climate change to AI. And, you know,
01:32:23.640 it's capturing this Prometheus moment. And nuclear weapons were really the first time we confronted the
01:32:29.720 fact that our power has outstripped our wisdom. And we unleashed these elemental forces, you know,
01:32:35.960 the very forces that power the sun, we bring them down to earth, right? And we had to grapple with
01:32:41.400 that then. But in some ways we're doing it again with biotechnology and with artificial intelligence.
01:32:48.360 And so the story is about nuclear weapons, but this idea of creating something that you're not sure you
01:32:55.560 can control. It has real resonance in this moment. And I, you know, there's this scene in the movie,
01:33:03.560 without spoiling it, where Oppenheimer is talking to Einstein. And I think the scene is fabricated,
01:33:09.560 but it is based on the sentiment that he might have had at the time. As they're embarking on the
01:33:15.560 Manhattan Project, they are wondering whether the first Trinity test could result in the ignition
01:33:22.280 of the atmosphere and lead to a chain reaction, which destroys all of humanity. And they run the
01:33:28.440 calculations and they run them again, and they realize that this possibility is vanishingly small.
01:33:33.480 It's essentially zero. So Oppenheimer's talking to Einstein and he says, when I came to you with
01:33:39.160 these calculations, we thought we might start a chain reaction that might destroy the entire world.
01:33:46.440 He turns to Einstein and he says, I believe we did. And the question is, what did we set in motion
01:33:54.520 with that first Trinity test? Did we start this arms race inexorably, which would lead us to where we are
01:34:02.120 today with 12,000 weapons, many of them on high alert in this system in which we are all vulnerable
01:34:10.520 forever? I don't think we did. If you look at the past 80 years, we've come right up to the brink.
01:34:18.600 But then each time we've gained a little bit of wisdom and we've built these systems
01:34:23.240 of governance. And you look at the nuclear non-proliferation regime to prevent the spread
01:34:28.120 and these various arms control treaties that have helped manage competition and hotlines that allowed
01:34:36.120 for communications between adversaries. All of these are imperfect ways of managing this technology and we
01:34:43.800 need to do better. But I think Oppenheimer looking at where we are today, if he could see where we're
01:34:50.360 at, he'd be terrified by the number of weapons we've built. But I think he'd be also impressed at the
01:34:58.600 international systems we've built to regulate these weapons. And the International Atomic Energy
01:35:05.800 Agency in some ways reflects his vision of international control over the peaceful uses
01:35:11.960 of nuclear energy. So it's really a mixed story. Yeah. Well, Carl, thank you for your time and thank
01:35:19.080 you for the work you're doing. I will continue to follow it with interest.
01:35:22.760 Thank you. Thank you. I appreciate all you're doing.