#330 — The Doomsday Machine
Episode Stats
Length
1 hour and 35 minutes
Words per Minute
148.3
Summary
In this episode, I speak with Carl Robichaud, who co-leads Longview Philanthropy's program on nuclear weapons policy and co-manages its Nuclear Weapons Policy Fund. We discuss Christopher Nolan's new film, Oppenheimer, and its connection to the history of nuclear weapons and the ongoing effort to contain their spread. We also discuss the legacy of the Manhattan Project, the impact the first nuclear detonation at the Trinity test site in 1945 had on the people living nearby, the role private citizens can play in mitigating nuclear risk and preventing proliferation, and Longview's new Nuclear Weapons Policy Fund. This episode is a PSA, which means there is no paywall. If you'd like to support the podcast, you can subscribe at samharris.org.
Transcript
00:00:31.820
As always, if you want to support the podcast, you can subscribe at samharris.org.
00:00:42.160
Carl co-leads Longview Philanthropy's program on nuclear weapons policy.
00:00:47.440
And he co-manages their Nuclear Weapons Policy Fund.
00:00:51.020
This is a fund to which the Waking Up Foundation will soon be giving a fair amount of money.
00:00:55.680
If you'd like to support it along with us, you can find the relevant link in the show notes in your podcast player.
00:01:02.900
For more than a decade, Carl led grant-making and nuclear security at the Carnegie Corporation of New York.
00:01:08.900
He also previously worked with the Century Foundation and the Global Security Institute,
00:01:14.080
where he focused on arms control, international security policy, and non-proliferation.
00:01:19.100
And the topic of this conversation is the ongoing threat of nuclear war.
00:01:25.120
We discussed the new film Oppenheimer, which I must say really is a masterpiece.
00:01:30.360
If you haven't seen it in a theater, and it's still playing in a theater near you,
00:01:37.120
This really is a film that benefits from the big screen.
00:01:40.020
We discussed the ethics of dropping the atomic bombs on Hiroshima and Nagasaki,
00:01:45.260
the Cuban Missile Crisis, and some of the false lessons we learned there,
00:01:49.580
the history and the future of nuclear proliferation,
00:02:12.560
the role of private citizens in mitigating nuclear risk,
00:02:15.840
and finally Longview Philanthropy's Nuclear Weapons Policy Fund.
00:02:22.580
Unfortunately, this remains one of the biggest problems of our time,
00:02:26.280
one which we do not talk about or think about nearly enough.
00:42:53.540
So, if we think that's actually true, you know,
00:43:11.000
Yeah, well, I think that North Korea and Pakistan
00:43:21.940
their security was to acquire nuclear weapons, and
00:43:36.600
types of sanctions and economic isolation in order
00:43:42.720
So, both Pakistan and North Korea paid a huge cost to
00:43:48.640
And people look at sanctions and say, well, they didn't
00:43:53.120
And to some extent, that's true, but I think those
00:43:55.780
sanctions also had a deterrent effect for other
00:44:01.920
And most countries have signed the Nuclear Non-Proliferation
00:44:05.860
Treaty and have adhered to it because they realize that
00:44:10.600
while they probably could get a nuclear weapon, that would be
00:44:14.540
very expensive economically, politically, et cetera, and would
00:44:20.100
Who else do you think is poised to go nuclear now beyond
00:44:29.060
And I think that that's evidence of the success of this
00:44:32.460
international system that we've built over the years.
00:44:35.820
I think Iran is the only credible country that's on the
00:44:40.240
Now, if Iran acquires nuclear weapons, this could result in a
00:44:44.640
new wave of interest from countries like Saudi Arabia, for
00:44:49.300
You could also imagine a world in which the US backs off of some
00:44:54.540
of its alliance commitments and basically signals that it's
00:45:03.500
And you could imagine governments in those countries proceeding
00:45:08.280
They both have access to the technology and the fissile material
00:45:12.500
if they wanted to launch a crash program to acquire the bomb.
00:45:16.300
So in some ways, these US security assurances are a key part of the
00:45:27.260
Which had a nuclear weapons program until the 1960s or in the 1960s and gave
00:45:33.900
that program up under pressure from the United States.
00:45:43.620
They entered the world at the same time as microwave ovens and jet engines and things
00:45:48.140
that we take for granted as having spread everywhere, right?
00:45:51.520
So it's really this system of assurances and controls and norms that have kept these weapons
00:46:00.080
But we're only 80 years into the nuclear story, right?
00:46:04.340
That's the crazy thing is there's still people who are alive who survived Hiroshima and Nagasaki.
00:46:17.940
And what is the story that we're going to be writing 80 years from now if we can survive that long
00:46:26.380
Will we say this was a period of relative safety or this was a time where we turned the corner
00:46:35.140
Or is this a time when we decided once and for all that these weapons are too dangerous to live with
00:46:41.840
and we push them to the side and stop relying on them as heavily?
00:46:47.460
I think the most likely scenario is the status quo where these things continue to hum along
00:46:53.580
in the background and we all pretend that they don't really exist.
00:46:58.580
But every year we're running some non-zero risk.
00:47:02.980
You keep rolling those dice year after year and the chance for human miscalculation,
00:47:11.120
for technical accident, for deliberate use, every year you're taking a risk.
00:47:17.460
Yeah, that's the most sobering part of it, the idea that we're rolling those dice year after year
00:47:24.980
and as a matter of probability, it's compounding.
00:47:29.080
And it's all being maintained by an aging infrastructure, which I guess in some of the...
00:47:38.460
We'll talk about the dangers of things like cyber attacks, et cetera.
00:47:42.400
But maybe there are some ways in which the antiquity of this system has a silver lining
00:47:51.240
Maybe it's not as hackable as it would be if it was all being run on the latest operating system.
00:47:59.280
So we're now on digital systems with nuclear command and control.
00:48:07.840
But as you mentioned, it creates certain cyber vulnerabilities.
00:48:12.200
And nobody knows what those cyber vulnerabilities are in every country.
00:48:17.400
There are some people who believe they know a lot about their own country's vulnerabilities.
00:48:21.880
But as you say, there are nine nuclear weapon states and they all have different systems
00:48:28.720
And there's the possibility that one side will attack a nuclear arsenal in a way that leads
00:48:40.400
That's an additional terrifying variable here, which is that really we're at the mercy of the
00:48:49.320
I mean, we might completely lock down our system in the United States and feel that it's really
00:48:59.340
You know, there's just... the chance that we're going to do something by accident is zero.
00:49:08.560
But, you know, even if we did, the best possible case, we're at the mercy of whatever China
00:49:20.800
I don't want to be at the mercy of North Korea's systems.
00:49:26.620
And then you read, I'm sure you've read Eric Schlosser's book, Command and Control.
00:49:34.040
You read about the preparations we have made for, you know, the continuity of government.
00:49:42.800
The steps we've had to take to figure out what to do in the event of a full-scale nuclear
00:49:50.220
You know, it's so deeply impractical and insane.
00:49:55.960
And I mean, again, it's easy to see how we have escalated ourselves into this untenable
00:50:02.440
situation, but, you know, you've got this perverse ratchet that just keeps turning in one direction.
00:50:08.360
But that we got there and we're left with the machinations that we imagine are going to
00:50:17.320
safeguard, you know, our survival, it's just, it's bonkers.
00:50:20.900
I think it's worth looking at a few of the moments where we actually released tension from
00:50:26.760
that ratchet because it hasn't always been inevitably increasing.
00:50:31.940
One of them is in 1986 when Reagan and Gorbachev meet and they agree that a nuclear war can never
00:50:43.320
And they fell short of some of the deep cuts that were discussed at the Reykjavik summit,
00:50:48.640
but they left with a shared understanding and Gorbachev went back believing that the U.S.
00:50:55.040
would not launch a nuclear attack on the Soviet Union.
00:50:59.020
They'd previously been very afraid that the U.S. was preparing to do that.
00:51:02.220
So that sense of shared understanding allowed for the intermediate range nuclear forces agreement,
00:51:09.680
which limited some of the most destabilizing weapons in Europe.
00:51:13.900
Another is in 1991 where unilaterally, President H.W. Bush just takes all of the U.S. tactical
00:51:23.920
nuclear weapons and he takes them off alert and off of the surface ships, et cetera.
00:51:30.740
And this is just a recognition of a change in the security environment after the fall of
00:51:37.140
And he didn't need to negotiate an extensive treaty, but I think rather courageously just said,
00:51:43.160
we can move first and had this presidential nuclear initiative that was then reciprocated
00:51:50.900
And so that's one of the cases where you have this ratchet going in the other direction.
00:51:56.360
And so there are things that we have done in the past to take a little pressure out of
00:52:02.620
Unfortunately, where we are now is going in the wrong direction.
00:52:07.780
The past 30 years or so has been a period of relatively low nuclear risk.
00:52:14.500
And with Russia's invasion of Ukraine, I feel like we've entered a new period of escalating
00:52:22.680
And this is something that people have been talking about for some time, but you can see
00:52:29.060
You're fighting a conventional war in the nuclear shadow in which Vladimir Putin has made references
00:52:40.120
And then he's occasionally walked them back, but some other spokespeople have gone forward
00:52:52.660
And in the background is a new relationship with China and their nuclear arsenal.
00:52:59.840
So China for many years has had a small recessed nuclear arsenal, and they are in the process
00:53:08.740
They could have as many as 1,500 nuclear weapons by the 2030s.
00:53:13.540
And that is going to reshape this competition because we've never had a three-way nuclear
00:53:33.300
The threats we've heard from Putin and other spokespeople in Russia, have those all been with
00:53:41.560
respect to the use of tactical weapons in the theater of conflict in Ukraine?
00:53:51.460
So, you know, for example, in February, Putin said, if Ukraine attempts to take back Crimea,
00:53:59.020
European countries will be in conflict with Russia, which is a leading nuclear power superior
00:54:03.920
to many NATO countries in terms of nuclear force.
00:54:06.760
In that case, it's a vague threat, but it's referencing nuclear forces that could be used.
00:54:13.560
And then later, Putin mentions that they are raising the alert of their nuclear forces.
00:54:21.340
It turns out that appears to have been bogus, and the U.S. intelligence community mentions
00:54:26.380
that they don't see any difference in the operational patterns of Russia's forces.
00:54:33.140
But it's clear that he's trying to manipulate risk and to raise the prospects that nuclear
00:54:42.620
And presumably, it would be a tactical or battlefield nuclear weapon rather than a strategic nuclear
00:54:52.700
We know that Russian nuclear doctrine says that they would only use nuclear weapons if the
00:55:01.840
But at various points, Putin and other officials have made statements that seem to signal a broader
00:55:07.680
interpretation of that in a way that I think we need to take seriously, even if we recognize
00:55:13.480
that they have some desire to manipulate that risk.
00:55:16.680
So, when this war started and the obvious threat of nuclear escalation was first discussed, many people
00:55:29.520
immediately drew the lesson, seemingly the wrong lesson from the Cuban Missile Crisis, which is that you just
00:55:41.740
One, it's a terrible precedent because it means that anyone who has nuclear weapons can basically
00:55:47.420
do whatever they want conventionally, you know, as long as they purport to be suicidal.
00:55:54.220
And I guess I'm wondering what you think about what we've done so far and whether you think
00:56:02.140
we have been, we, the U.S., I guess, and NATO, have been impeccable in how we have not caved in to
00:56:13.420
Yeah, I think it's been a well-calibrated response overall.
00:56:19.180
And you could see there is not a rush to invoke nuclear weapons as a response.
00:56:25.820
There is a seriousness and a cautiousness through which the Biden administration has approached
00:56:32.540
this issue while continuing to support Ukraine's righteous defense of its territory.
00:56:41.100
And I think it's a really hard line to walk because it's not clear where the lines are.
00:56:50.460
What do you think we would do if Russia used tactical nukes on Ukraine?
00:56:55.080
I don't think anyone knows for sure, but I suspect the U.S. would strike with conventional
00:57:01.720
forces the units that launched the attack and would also strike other forces that are of great value
00:57:10.760
to Russia. For example, sinking some warships in the Black Sea or striking other targets and indicate
00:57:19.240
that this represents an escalation in the war, but without expanding in a way that could lead
00:57:26.680
to all-out nuclear war. I think that would be the attempt. But who knows?
00:57:30.840
Yeah. There are not that many stages beyond that, right?
00:57:34.520
That seems completely sensible to me. But then when you imagine what happens next,
00:57:41.560
there's just not that many stops on the way to the end of everything.
00:57:44.920
It's interesting. I saw you had Fred Kaplan on the podcast. He's an amazing, he's a national
00:57:50.280
treasure. That book's a great book. And he describes in it a set of war games and exercises
00:57:55.960
that were conducted during the Obama administration over a fictitious scenario, a war game in the
00:58:02.680
Baltics in which Russia had invaded and occupied the Baltics and had used nuclear weapons. And they
00:58:09.160
played the simulation or war game out twice, once with the principals, so the Secretary of Defense,
00:58:16.840
the Secretary of State, et cetera, and once with the deputies, the deputy secretaries, et cetera.
00:58:21.400
And the outcome was different in each case. The principals responded with a nuclear weapon and
00:58:27.560
the deputies did not. So a lot of it depends on who's at the table and who's advocating for what.
00:58:34.280
Now, with any of these war game scenarios, they're different than what someone would be encountering
00:58:40.600
when really making a decision. I think they're really useful to try to help prepare ourselves to
00:58:46.440
think the unthinkable, to think about what we would do when sitting in that chair, but they can also
00:58:54.040
mislead in various ways too. I think one of the interesting questions we might ask is why hasn't
00:59:01.480
Russia used nuclear weapons yet, right? Because we know they see this conflict as being essential to
00:59:08.600
their security. It's sometimes described as existential. They have nuclear weapons, including
00:59:14.920
relatively low-yield tactical weapons that they could use on the battlefield to try to
00:59:20.360
achieve a tactical goal, but they haven't. And I think there are a few reasons. I mean,
00:59:27.000
one, we don't know how this ends and maybe they're not desperate enough and maybe that's why they
00:59:31.640
haven't used them. There's also a deterrence element from NATO and from Ukraine. But I think that there's
00:59:39.560
another piece of the puzzle too, which is that even for Russia and Vladimir Putin,
00:59:45.720
these weapons are seen as a line that he is reluctant to cross. And that's in part a result
00:59:53.000
of this history of 78 years of non-use of nuclear weapons. The Soviet Union had this major rhetorical
01:00:02.200
talking point throughout the Cold War that we weren't the ones who used nuclear weapons,
01:00:05.880
it was the US that used these terrible weapons. And there's been this distinction that we've drawn
01:00:13.000
over the years, it wasn't always like this, but that nuclear weapons are something different.
01:00:17.560
So if Russia were to cross that line, they would be paying a price in doing that reputationally.
01:00:23.800
You know, three quarters of the people in the world live in countries that haven't really taken sides
01:00:29.000
in this conflict. And we've heard that China and India have indicated to Russia that Russia should
01:00:37.480
not use nuclear weapons in this conflict. And so there are considerations that are other than military.
01:00:45.400
Now, one of my fears is that if a country does use nuclear weapons, and especially uses a small,
01:00:54.280
relatively small battlefield weapon, there will not be the sorts of massive deaths and casualties that
01:01:01.400
we saw from Hiroshima and Nagasaki. And a lot of people are going to look around and say,
01:01:06.040
that's it? What's the big deal? And you could imagine that leading to a new wave of interest in nuclear
01:01:15.640
weapons and a new wave of proliferation. It also could lead to a rejection of nuclear weapons and to
01:01:23.240
say, we should never use these things again. And so I think whatever happens immediately in the aftermath
01:01:31.160
of the next use of nuclear weapons, if there is one, could shape our relationship with these weapons
01:01:38.280
for the future. And this nuclear taboo that we've had for the past 78 years is something that benefits
01:01:46.200
us all. And we should really work to preserve that. Yeah, well, it's somewhat analogous to the taboos
01:01:52.200
around chemical weapons and biological weapons. And I'd heard recently, I don't know if this is common
01:01:59.320
knowledge and I just missed it. But I'd heard that at one point, we realized we could create laser
01:02:06.360
weapons that would just permanently blind soldiers on the battlefield. And we just didn't go down that
01:02:11.400
path at all, because it just seemed so ghastly to ever put that into use. Which is interesting,
01:02:18.120
because on some level, it's not nearly as bad as the other things we have developed.
01:02:21.640
I don't know why it was so obviously unethical to the people who saw that this technology was in
01:02:29.640
reach. But there is just something horrible about the idea of effortlessly blinding people en masse
01:02:37.240
as a way of winning a war. Yet, we're willing to blow them up, riddle them with shrapnel,
01:02:43.720
etc. And yet, silently blinding everybody is just, we're not going to go there. Do you have any
01:02:50.200
intuitions about why that struck us as just totally untenable, ethically?
01:02:54.040
Yeah, I'm not sure. But you have, at various times, an effort to make war more humane and to limit
01:03:02.200
the types of activities you would engage in. Even in World War I, there was an effort before the war
01:03:08.280
started to limit the use of poison gas. But then, once one side used poison gas, and initially it
01:03:17.240
wasn't the type that killed you, it was a less deadly form of gas, all of a sudden that line was
01:03:24.120
crossed and it became commonplace to do this horrible thing. And so these norms, I think, can be
01:03:30.920
really valuable, but they can be fragile as well. And I don't know exactly what to make of it.
01:03:35.400
You see an effort to ban landmines, and cluster munitions, and these other devices that are
01:03:44.840
disproportionate in their humanitarian consequence, right? They're just really awful weapons that harm
01:03:51.720
civilians. And then, we have these weapons, nuclear weapons, that are inherently inhumane in just about
01:03:59.880
every circumstance you could imagine them being used, right? We plan to conduct mass murder on this
01:04:07.560
scale that is hard to comprehend in the service of national security. So even as you're preventing
01:04:15.880
blinding lasers and landmines, you still have plans on the books to incinerate cities,
01:04:24.040
or incinerate military bases that are adjacent to cities, which would have resulted in massive
01:04:30.920
fallout and death. It's one of the great contradictions. And I think, you know, to go back to
01:04:37.320
the film Oppenheimer, this is part of what's captured, is the decision to develop the H-bomb
01:04:45.400
is about what is the role of these weapons going to be in society and in warfare going forward.
01:04:54.280
And there were a group of people who felt nuclear weapons were like any other weapon,
01:05:01.640
and that we ought to develop them and put them in the hands of the military. And Truman eventually
01:05:09.080
pushed back against that and took control back and put these in the hands of civilians. And that's
01:05:17.000
where it's been in the US and in other nuclear countries as well, that these weapons are different
01:05:24.200
than just military devices that can be sent out to the local commanders. But we have a really imperfect
01:05:32.920
history there about how they've been used and practiced.
01:05:36.760
What do you think about the growing tensions between the US and China, specifically around
01:05:43.640
our somewhat equivocal commitment to protecting Taiwan?
01:05:49.320
Yeah, I think if there is a hot war between the US and China, it will be over Taiwan. I think that's
01:05:58.600
the only issue that approaches the stakes. And the US has become less equivocal under the Biden
01:06:06.680
administration about its willingness to defend Taiwan. And...
01:06:10.760
Were those moments essentially gaffes on his part, where he basically said we would
01:06:15.160
defend Taiwan even though our official doctrine is strategic ambiguity or something like that?
01:06:21.960
Yeah, I don't think so. I don't think they were. I think it reflects an increased willingness to
01:06:29.560
stand up to China or to try to stand up to China in this case. And I am deeply concerned about the path
01:06:39.080
that we're on because it seems like we are on a collision course with China. And nobody really
01:06:44.600
knows what the right approach is to avoid war with China. Because there are risks and costs to both approaches.
01:06:54.680
Well, what's the risk of... So we're strangely, and we as I think the entire world is strangely dependent on
01:07:03.560
Taiwanese manufacturing of semiconductors. But if we on-shored all of that supply chain,
01:07:12.040
and we're no longer dependent on them, can you imagine that we would suddenly decide,
01:07:16.680
they're not a critical US interest anymore, and we don't need to have a policy that we're going to
01:07:22.440
come to their rescue? Or does that then make Japan and South Korea suddenly worried that we're not the
01:07:30.680
ally we claim to be, and then they go nuclear? Yeah, I think that's the central debate that we're
01:07:36.520
going to have in the coming years, as the US becomes less dependent on Taiwan for its technology,
01:07:43.720
and as China becomes more powerful relative to the US. And China has been building up its military
01:07:51.080
in order to assert its dominance in the Western Pacific. And it's not clear how long the US can
01:08:00.040
preserve its advantage. And a US president is going to have to make a hard choice at some point.
01:08:07.800
There's a fair amount of talk about the coming demographic collapse in China, and that they're
01:08:15.080
really just not going to be what we feared going forward. I don't know if you have followed the
01:08:22.840
work of Peter Zeihan, or anyone else who's been hitting this topic of late. But yeah, I haven't
01:08:30.280
been following it that closely, but it does sound that the narrative on China has shifted a little bit.
01:08:36.040
Yeah, yeah. Although I don't know if that could lead them to do something more reckless rather than
01:08:43.000
less reckless in the meantime. They may feel like they have a closing window to resolve this problem.
01:08:49.560
Right. And Xi Jinping has said that he does not want to pass the Taiwan issue on. He wants to deal
01:08:56.680
with it during his tenure. I'm sure he'd like to. I don't know if he's committed to doing that.
01:09:02.200
Hmm. So, given these background concerns that we have collectively built a doomsday device,
01:09:13.160
and it's on, to one degree or another, a hair trigger, or many hair triggers, or triggers that,
01:09:20.440
the integrity of which we can't assess. And now we have this growing concern about misinformation and
01:09:27.160
disinformation and cyber attacks and deep fakes. We have this digital layer of culture that is proving
01:09:36.680
to be a kind of a hallucination machine. How are you thinking about the advent of these new digital
01:09:45.000
problems? And if we throw generative AI and AI control of our actual nuclear infrastructure,
01:09:53.560
ultimately, how are you thinking about recent developments in tech in light of everything
01:10:01.560
Well, I think it's really concerning. And there's a couple of reasons for concern. And you've mentioned
01:10:06.040
one of them is just do leaders and decision makers understand the context in which they're making
01:10:15.240
decisions. And there's an opportunity to create disinformation about a particular conflict or
01:10:22.760
crisis, right? And then at a more granular level, there is a set of systems that enable nuclear use,
01:10:32.600
command and control, communications. And these systems rely upon a digital infrastructure,
01:10:41.880
and they need to be executed perfectly every time and with great speed. So you have a network of early
01:10:51.960
warning satellites and radars, and you have communications nodes, and you have decision makers
01:10:59.800
who then receive the information from these various sensors and have to make sense of it.
01:11:05.480
And I think in many countries, there's going to be a strong incentive to use AI to synthesize
01:11:15.640
that data and provide decision-making support to the relevant decision makers as quickly and accurately as
01:11:25.960
possible. And to some extent, this is just software, right? This is what military planners do.
01:11:35.000
They take state-of-the-art software, and they integrate it into their systems. And so we will be
01:11:41.240
relying increasingly on this processing of the information by something that you could consider
01:11:48.040
as AI, right? Now, there's a strong commitment by the US military and by US decision-makers to never
01:11:55.800
let an AI agent make a decision. There always needs to be a human in the loop and a human making the
01:12:02.920
decision to use a nuclear weapon system. My concern is that all of the processing of the information and
01:12:12.600
the interpretation of the information could be done by an AI system in a way that leaves humans essentially
01:12:19.880
as button pushers. Are you really going to reject the conclusions of a system that has proved 99%
01:12:31.560
reliable and that's built on state-of-the-art software and hardware? And it just really seems to be the best way to
01:12:41.880
support your decisions. And that's, I think, the slippery slope we might go on. And there are some efforts in
01:12:50.200
Congress to limit that. I think that, you know, as with other command and control issues, we are only as
01:12:57.880
safe and secure as the weakest link in the chain. And so we need to be getting together now with Russia,
01:13:05.720
China, other countries to figure out how can we avoid this slippery slope in which we are essentially
01:13:15.240
delegating nuclear decisions to an algorithm. Because that's a really scary world.
01:13:20.520
Yeah, it is. Except if you imagine that you have AI that you are wise to trust, right? Because again,
01:13:31.560
we're talking about situations where you don't have a lot of time, right? If you've got 15 minutes to make
01:13:37.560
a decision and you either have an ape who doesn't have time to consult with other apes, or you have some
01:13:45.960
AI system that you have put in place that you really think is analogous to a chess engine that's just
01:13:52.120
better at chess than people are, right? Yeah. I mean, I think you've put your finger on it,
01:13:57.400
which is that these digital systems and these human systems are prone to different modes of failure.
01:14:05.080
And the problem, fundamentally, is making high-stakes decisions under incredible time pressure.
01:14:15.560
That's the fundamental problem. And that's what I think we need to move back from. We need to devise
01:14:21.800
a system that allows us to be safe and secure without relying on a decision in minutes that
01:14:30.040
could imperil the world. Because whether you're delegating that decision to machines or to people,
01:14:36.440
there are these failure modes. And I don't know which is better, right? I just reject the premise that
01:14:44.360
we need to accept that. Is there a path back to zero here? I mean,
01:14:49.880
has anyone articulated a plausible path whereby we would just recognize that the only way to win
01:14:56.760
this game is not to play it at all? I mean, it seems really implausible at this particular moment,
01:15:02.760
given the height of tensions with Russia, China. We haven't even talked about India or Pakistan
01:15:09.640
or Israel's reliance on nuclear weapons, North Korea. There are a lot of countries that possess these
01:15:15.400
weapons and have a strong desire and incentive to keep them, right? So I think it needs to be,
01:15:24.760
if we ever move in this direction, it needs to be a joint project in which collectively,
01:15:30.280
we recognize that these weapons pose an unacceptable risk to humanity and to our nations,
01:15:37.800
and that systematically, step by step, in a safe way, we're going to pull back from the brink.
01:15:47.080
Because there are certainly risks to moving too quickly and to leaving vulnerabilities. But I think
01:15:54.440
the first thing we need to do is to recognize that we've got a problem and that fundamentally,
01:16:01.880
we've wired all our homes with dynamite, right? We haven't even acknowledged that, right?
01:16:06.920
And once we acknowledge that there can be a better way to resolve our differences without resort to
01:16:15.160
nuclear threats, then we can start moving in the right direction. The Obama administration put forward
01:16:21.880
this plan, a graduated approach towards a world free of nuclear weapons, and it was rejected by Russia,
01:16:30.040
in part because they saw it as a ploy. And so the world we live in now, you can't just take nuclear
01:16:37.480
weapons out of that world and expect that to be a safe world. It's naive and unrealistic. But we need
01:16:43.960
to work towards greater mechanisms of collective security in which we reach the point that there's
01:16:49.960
no conflict that's worth fighting that we would consider annihilating each other's cities for.
01:16:54.920
Well, on that point, do you think that the current status quo of mutually assured destruction has
01:17:02.120
kept us over the last 75 years from fighting the conventional version of World War III?
01:17:09.480
It's interesting that you say mutually assured destruction because this phrase is often evoked.
01:17:15.000
This is not a deliberate strategy so much as a condition that people had to accept, right?
01:17:22.040
And there was always a desire, especially within the US, to escape from this condition of mutual assured
01:17:28.920
destruction. Because if deterrence is stable at the nuclear level, it allows for potentially
01:17:35.640
conventional aggression below the nuclear level, right? This is that stability-instability paradox.
01:17:43.160
And so there was always a desire to maintain some nuclear superiority. The world that
01:17:51.320
we are confronted with is a world of anxiety and fear. And you can have nuclear stability for a while,
01:18:01.480
but then something comes along to challenge that nuclear stability. I think that if you look at the
01:18:07.720
way leaders thought about nuclear weapons throughout the Cold War, it did play a dampening effect on their
01:18:15.560
goals and aspirations and their willingness to engage in war, especially between the great powers,
01:18:23.880
right? But it pushed that conflict elsewhere. So instead of fighting a conventional war in Europe,
01:18:32.360
there were these proxy wars that were fought in Korea and in Vietnam and in Afghanistan. And the Cold War was,
01:18:42.440
it was a relatively peaceful time if you lived in the United States, but it was not a peaceful time for
01:18:51.560
the populations that were affected by these proxy wars. There were just some really awful, brutal conflicts
01:18:59.000
that were a result of this rivalry. And so I think nuclear deterrence has certainly had some
01:19:10.040
benefits, but it has come at the cost of these various close calls and at the cost of pushing conflict elsewhere.
01:19:19.000
Well, I know we all await the wisdom of governments in figuring out how to mitigate this threat, but
01:19:27.720
what is the role or opportunities for philanthropy here? Because I know you're currently at Longview
01:19:37.000
Philanthropy and leading their program on nuclear weapons and existential risk. And Longview has been
01:19:44.120
advising me and the Waking Up Foundation on how we give out money each year.
01:19:48.600
Philanthropically, what can private citizens do to help?
01:19:53.880
Yeah. So I think from the start of the nuclear age, scientists and activists and non-governmental
01:20:01.320
experts have played a really key role in auditing government activities and putting pressure and changing
01:20:09.160
the incentives for what government actors wanted to do. In general, these weapons are the domain of
01:20:15.240
governments. They're in the hands of government and military leaders. And that is as it should be. But
01:20:22.840
the voices of citizens are really important too in setting the tone and the voices of experts as well.
01:20:29.960
So I think you could see that in the role of academic experts in understanding nuclear deterrence and
01:20:38.440
shaping the field of arms control. You can see that today in the work of many NGOs who work really
01:20:45.080
hard to make information publicly accessible, and in the role of media organizations that report on these
01:20:52.280
things. But this is a contracting field. You have the largest funder in the space, the MacArthur
01:20:59.000
Foundation, which chose in 2020 to exit the field. And so there are a lot of these non-governmental
01:21:06.120
organizations that are essentially starved for cash.
01:21:09.640
And what happened there? Why did MacArthur get out of the saving the future game?
01:21:15.960
They were reorganizing their portfolio and they had placed a big bet on nuclear weapons. And they did an
01:21:24.600
assessment of that and determined that while the grantees were making great contributions and informing
01:21:30.760
official policy and informing the public, they didn't see a line of sight to achieving their big
01:21:36.040
bet goal. And so the board ultimately decided that they didn't want to do this anymore. And I don't
01:21:44.360
think that's the right choice. But at the same time, I think the MacArthur Foundation should be applauded
01:21:50.200
for their many years of investment in this because there are lots of other foundations who haven't done
01:21:55.720
anything in this space. And when I look at that, I just think about how large and consequential an
01:22:02.840
issue this is and how important it is to have non-governmental voices. And the amount of money
01:22:10.120
that is going into the sector is tiny in comparison.
01:22:14.760
What is it? Can you estimate what the funding is?
01:22:18.360
Yeah. So the Peace and Security Funders Group seeks to estimate the total non-governmental
01:22:25.480
spending in this space. And I think that we don't have the numbers for this year,
01:22:31.240
Oh my God. That really is paltry given what we're talking about.
01:22:36.680
Wow. Is that all the organizations that are in the space? I mean,
01:22:41.400
something like the Plowshares Fund and you're including all of those?
01:22:45.160
We're including the grants that Plowshares makes. Yeah. In that total.
01:22:51.640
Man. Okay. Well, this is an appeal to audience members. This is a game that we obviously need to
01:22:59.320
win. And it's astonishing to me that we're talking about this level of funding for a problem of this
01:23:08.680
sort. When you look at what gets funded and at what scale, there are startups that no one's ever
01:23:17.560
heard of and will never hear of that have raised 10 times that amount of money and then they evaporate.
01:23:25.160
It's just, this is all upside down. So I am going to be giving money to this. I've already given money
01:23:34.040
to Plowshares and others, but this is going to be a top priority going forward. And I would just
01:23:40.840
welcome that all of you get involved to the degree that you can.
01:23:45.640
I know, Carl, Longview is opening a nuclear weapons policy fund, right? Can you say something about that?
01:23:54.760
So we see this as a really neglected problem that just affects all of us alive today. And we need
01:24:03.560
non-governmental voices, the voices of scholars and scientists and activists in order to help shape
01:24:11.320
these policies. And I think from the start of the nuclear age, these voices have been essential.
01:24:16.680
So we're putting together this fund to try to raise money. None of it goes to Longview Philanthropy,
01:24:22.920
100% goes directly to the beneficiaries. And so what types of groups are we likely to fund?
01:24:30.280
Well, for example, the Carnegie Endowment for International Peace is working on this issue of
01:24:37.480
inadvertent nuclear escalation and looking at the ways that technological entanglement of conventional
01:24:45.800
and nuclear systems could lead to the inadvertent use of nuclear weapons. You have a group called
01:24:52.280
the Council on Strategic Risks, which is looking at some of the most dangerous nuclear systems that are
01:24:59.400
in development. For example, the sea-launched cruise missile, which the US administration did a
01:25:06.440
review of, decided it didn't need, but Congress then put the money back in for it. And this weapon is
01:25:12.120
escalatory because it has target and payload ambiguity. So when it's launched, you don't
01:25:18.760
know exactly where it's going and you don't know whether it carries a nuclear or a conventional warhead.
01:25:25.400
So these are the types of interventions that we think are really important at the moment. We need,
01:25:31.480
broadly, a civil society effort to elevate this issue and return it to a position of concern within
01:25:39.880
society. And I think there are just so many ways to contribute to nuclear risk reduction. And one
01:25:46.600
of them is financially, if you're in a position to do that. But I think this is an issue for everyone.
01:25:53.080
And I think that we should all add nuclear weapons to our portfolio of concern. And I know that's a big
01:26:00.440
ask because there are just so many things to worry about these days, but we're not going to get better
01:26:05.400
policies unless people remember the threat that these weapons pose. And support political space
01:26:13.080
for the US, if you're in the US, to negotiate with Russia and China to reduce these shared risks.
01:26:20.680
And if you're not in a position to give financially, you still have a political voice
01:26:26.120
and you can talk about these issues with your friends and amplify helpful messages on social media.
01:26:31.800
And if you are in a position to give financial support, there are so many good, dedicated people
01:26:39.720
who have spent their lives preparing to try to contribute. And they're struggling right now because
01:26:46.680
the space has contracted and a little bit of money goes a really long way here. And our job at Longview
01:26:53.960
Philanthropy is to try to find the best, highest impact projects and then to put that money to use.
01:27:00.680
So we have a great team and we can go out and investigate and find groups that we think are
01:27:08.120
doing work that is the most effective. And then we can network them together and help them be more
01:27:15.320
effective than they would be operating in isolation. So by all means, if you already know of a group
01:27:21.640
working on nuclear weapons risk reduction, you can always support them directly. But if you're not sure
01:27:26.920
what to do, we want to make it really easy for people to make a difference here.
01:27:32.920
Well, that's great. And we will put a link to the foundation page when this podcast goes out and
01:27:39.960
will be on my blog and in the show notes and in the associated email. Lastly, Carl, imagine we have
01:27:49.400
some in our audience who are just going to college now or they're midstream in their
01:27:56.200
undergraduate years and they are suddenly struck by the opportunity to live a truly meaningful life
01:28:04.920
by trying to grapple with this particular problem. I imagine there are many paths through a university
01:28:12.040
and perhaps through a graduate program that could equip somebody to meaningfully put their shoulder
01:28:17.320
to the wheel here. But what strikes you as a couple that seem especially promising?
01:28:23.400
Well, I have incredible respect for the government officials who grapple with these problems and
01:28:29.000
they're not easy and they're operating under a lot of constraints. So we need really good people
01:28:36.360
in government working on these issues. So I think a career in government is excellent,
01:28:41.960
an excellent path, both in the short term you can contribute, but longer term you're developing
01:28:47.080
skills, connections, and perspectives that will be helpful. There are a lot of graduate programs that
01:28:52.680
prepare you both in terms of science and policy to have a high impact career in this space. But beyond that,
01:28:59.560
I think we need people with a variety of skills. So if you are an artist or a graphic designer, you can
01:29:09.960
contribute in that way. If you do social media, we need people who can tell great human stories about the
01:29:17.880
way nuclear weapons have affected us and the risks we continue to run. And I think there's a really
01:29:25.800
important role for civil society and for citizens and for outside experts to provide support for
01:29:34.680
government efforts, but also to critique them and audit them and to hold people to account because
01:29:40.680
there are large bureaucracies that are at work, that are chugging away, producing these outcomes that
01:29:47.320
are inimical to our collective security. And so you need people who are willing to call that out.
01:29:54.440
One example is this guy, Bruce Blair, who passed away a few years ago, but is just a hero to me. He's
01:30:01.720
this veteran nuclear launch officer, and he became a deep expert in nuclear command and control and a
01:30:08.440
really dedicated truth teller to expose the dangers that are inherent in this whole enterprise. And someone
01:30:15.880
like that, he knew the generals and the admirals, and he knew people in the Russian enterprise as well,
01:30:23.880
and he spoke with great clarity and conviction. But he was able to provide a counterpoint to some of the
01:30:30.920
official narratives in a way that I think is really healthy. And then you also have people who work in
01:30:37.880
and out of government and develop the expertise and the connections they need outside of government,
01:30:44.200
and then bring that in. So a good example of this is Rose Gottemoeller, who worked in government early in
01:30:49.080
her career. And then she went to work at the Carnegie Moscow Center. And the expertise that she built up
01:30:55.960
was really helpful when she was appointed as the chief negotiator for the New START Treaty. And she
01:31:02.200
describes in her book how that was a really important part of getting that treaty done, and then the role of
01:31:10.840
civil society in getting that treaty passed through Congress, because you need a two-thirds majority
01:31:17.000
for treaty ratification. So providing political space for cooperation is essential, because it's really
01:31:26.040
hard these days to talk about cooperating with Russia and China. And I get it, right? These are countries
01:31:33.320
that are, in some cases, they're doing really awful things. But we have a shared threat that we need to
01:31:38.040
manage. And I think that's one of the roles of civil society is opening doors for work in that area.
01:31:45.800
Yeah. And that point, that brings us full circle to what Christopher Nolan has just accomplished with
01:31:52.360
his film. I mean, it's just, you know, it's a work of art, but perhaps more than anything in recent
01:31:57.240
memory, it's made this problem unignorable for so many millions of people. So it's, I mean, props to him.
01:32:04.360
Yeah. I mean, there's just so many important themes. Yeah. Like, in terms of the way it deals
01:32:08.280
with the role of scientists and society, and we just see echoes of this today in the way scientific
01:32:14.680
expertise is sidelined in the public sphere, from vaccines to climate change to AI. And, you know,
01:32:23.640
it's capturing this Prometheus moment. And nuclear weapons were really the first time we confronted the
01:32:29.720
fact that our power has outstripped our wisdom. And we unleashed these elemental forces, you know,
01:32:35.960
the very forces that power the sun, we bring them down to earth, right? And we had to grapple with
01:32:41.400
that then. But in some ways we're doing it again with biotechnology and with artificial intelligence.
01:32:48.360
And so the story is about nuclear weapons, but this idea of creating something that you're not sure you
01:32:55.560
can control. It has real resonance in this moment. And I, you know, there's this scene in the movie,
01:33:03.560
without spoiling it, where Oppenheimer is talking to Einstein. And I think the scene is fabricated,
01:33:09.560
but it is based on the sentiment that he might have had at the time. As they're embarking on the
01:33:15.560
Manhattan Project, they are wondering whether the first Trinity test could result in the ignition
01:33:22.280
of the atmosphere and lead to a chain reaction, which destroys all of humanity. And they run the
01:33:28.440
calculations and they run them again, and they realize that this possibility is vanishingly small.
01:33:33.480
It's essentially zero. So Oppenheimer's talking to Einstein and he says, when I came to you with
01:33:39.160
these calculations, we thought we might start a chain reaction that might destroy the entire world.
01:33:46.440
He turns to Einstein and he says, I believe we did. And the question is, what did we set in motion
01:33:54.520
with that first Trinity test? Did we start this arms race inexorably, which would lead us to where we are
01:34:02.120
today with 12,000 weapons, many of them on high alert in this system in which we are all vulnerable
01:34:10.520
forever? I don't think we did. If you look at the past 80 years, we've come right up to the brink.
01:34:18.600
But then each time we've gained a little bit of wisdom and we've built these systems
01:34:23.240
of governance. And you look at the nuclear non-proliferation regime to prevent the spread
01:34:28.120
and these various arms control treaties that have helped manage competition and hotlines that allowed
01:34:36.120
for communications between adversaries. All of these are imperfect ways of managing this technology and we
01:34:43.800
need to do better. But I think Oppenheimer looking at where we are today, if he could see where we're
01:34:50.360
at, he'd be terrified by the number of weapons we've built. But I think he'd be also impressed at the
01:34:58.600
international systems we've built to regulate these weapons. And the International Atomic Energy
01:35:05.800
Agency in some ways reflects his vision of international control over the peaceful uses
01:35:11.960
of nuclear energy. So it's really a mixed story. Yeah. Well, Carl, thank you for your time and thank
01:35:19.080
you for the work you're doing. I will continue to follow it with interest.
01:35:22.760
Thank you. Thank you. I appreciate all you're doing.