WarRoom Battleground EP 344: Arms Race Of The 21st Century; Rebuttal To The Existential Threat To Humanity
Episode Stats
Words per Minute
153.95
Summary
In this episode, we discuss an article from Axios that lays out the three major existential threats to humanity: climate change, artificial intelligence, and nuclear weapons. We talk about what they are, why they are a threat to humanity, and what we need to do about them.
Transcript
00:00:20.400
If you continue to look the other way and shut up,
00:00:23.040
then the oppressors, the authoritarians, get total control and total power.
00:00:31.440
It's another element that backs them into a corner.
00:00:36.540
This is why this audience is going to have to get engaged.
00:00:40.700
All this nonsense, all this spin, they can't handle the truth.
00:00:51.900
We committed this morning on the show to take this article that was in Axios.
00:00:58.080
that's kind of the inside baseball for the corporate media.
00:01:02.200
It's Mike Allen's kind of inside baseball newsletter for the lobbyists,
00:01:07.280
the political operatives, the Uniparty, Wall Street, all of it.
00:01:13.180
It gives you a great summary of what's going on.
00:01:15.540
Today, particularly, they had this article called
00:01:17.820
The Three Major Existential Threats to Humankind
00:01:21.240
and talked about these great forces that kind of are underpinning
00:01:26.820
the issues in the world today and all are converging
00:01:30.620
kind of in a place and time as a major threat to humanity.
00:01:33.980
And, of course, the buried lead, they had down about the 10th paragraph
00:01:37.960
and said, this is exactly why you need really smart people now to run governments.
00:01:43.000
This being their theory of the case because they're the college-educated elites.
00:01:51.160
They're bending over backwards and trying to rewrite the Constitution
00:01:54.000
to allow the deadbeats that got all these worthless college degrees
00:01:58.880
to kind of walk away from it and leave working-class America to pay for it.
00:02:02.200
But the whole implication is that they're the college-educated,
00:02:06.860
they have the advanced degrees, that MAGA and the populist movement
00:02:15.720
And here's why they're dangerous, because the world's getting so complicated
00:02:18.620
and these major threats, these existential threats to humankind
00:02:25.140
You need the elect of the college-educated and the post-graduate degrees
00:02:30.940
and the doctoral work and all that to guide us,
00:02:38.440
So what we've done is assembled our contributors, the experts in each of them.
00:02:44.060
Axios laid out that it was climate change, artificial intelligence,
00:02:47.860
and nuclear weapons that were the three big currents going to overwhelm humanity.
00:02:52.760
So we've brought in our contributors to kind of go and deconstruct that
00:03:00.860
And I really appreciate them pulling this together,
00:03:03.180
because Axios just printed the thing early this morning.
00:03:08.160
I'm going to start with artificial intelligence, go to climate change,
00:03:10.640
then finish with our experts on nuclear weapons.
00:03:13.100
Let's start. We've got a cold open for the one and only Joe Allen.
00:03:18.040
AI, as we all know, is the study of how to make machines intelligent.
00:03:24.200
Its stated goal is general-purpose artificial intelligence,
00:03:27.940
sometimes called AGI or artificial general intelligence,
00:03:32.260
machines that match or exceed human capabilities in every relevant dimension.
00:03:37.300
The last 80 years have seen a lot of progress towards that goal.
00:03:43.240
For most of that time, we created systems whose internal operations we understood,
00:03:49.000
drawing on centuries of work in mathematics, statistics, philosophy, and operations research.
00:03:59.080
Beginning with vision and speech recognition and now with language,
00:04:01.960
the dominant approach has been end-to-end training of circuits
00:04:05.340
with billions or trillions of adjustable parameters.
00:04:12.140
but their internal principles of operation remain a mystery.
00:04:16.820
This is particularly true for the large language models, or LLMs, such as ChatGPT.
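The end-to-end training Russell describes, adjusting parameters to fit data rather than hand-coding rules, can be sketched at toy scale. This is an illustrative two-parameter model, not anything from the testimony; real LLMs differ mainly in having billions of parameters and far richer circuits:

```python
# Toy illustration of end-to-end training: adjust parameters to
# minimize prediction error on data, with no hand-coded rules.
# Modern systems do the same with billions of parameters.

data = [(x, 2 * x + 1) for x in range(-5, 6)]  # target: y = 2x + 1

w, b = 0.0, 0.0          # the "adjustable parameters"
lr = 0.01                # learning rate

for step in range(2000):
    grad_w = grad_b = 0.0
    for x, y in data:
        err = (w * x + b) - y              # prediction error
        grad_w += 2 * err * x / len(data)  # mean-squared-error gradients
        grad_b += 2 * err / len(data)
    w -= lr * grad_w                       # gradient-descent update
    b -= lr * grad_b

print(round(w, 2), round(b, 2))            # approaches 2.0 and 1.0
```

The point Russell makes follows from this setup: the fitted parameters predict well, but nothing in the procedure explains *why* any particular parameter value works, which is why internal principles of operation remain opaque at scale.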
00:04:28.720
In my view, LLMs do not constitute AGI, but they are a piece of the puzzle.
00:04:34.460
We're not sure what shape the piece is yet or how it fits into the puzzle,
00:04:38.880
but the field is working hard on those questions and progress is rapid.
00:04:43.300
Alan Turing, the founder of computer science, warned in 1951
00:04:51.900
we should have to expect the machines to take control.
00:04:56.480
We have pretty much completely ignored this warning.
00:05:01.480
This committee has discussed ideas such as third-party testing, licensing,
00:05:06.380
national agency, and international coordinating body,
00:05:12.580
Here are some more ways to, as it said, move fast and fix things.
00:05:18.740
First, an absolute right to know if one is interacting with a person or a machine.
00:05:24.360
Second, no algorithms that can decide to kill human beings,
00:05:35.880
if systems break into other computers or replicate themselves.
00:05:41.180
Fourth, go beyond the voluntary steps announced last Friday.
00:05:44.860
Systems that break the rules must be recalled from the market
00:05:52.460
to helping terrorists build biological weapons.
00:06:02.460
That's from congressional testimony from, what, a week or so ago.
00:06:05.100
Walk me through who that was, what he said, why is it important?
00:06:13.280
He is involved with the Future of Life Institute,
00:06:19.620
They were actually the organization that drafted the open letter
00:06:24.820
requesting a halt on artificial intelligence development above the level of GPT-4.
00:06:30.360
They also drafted an open letter back in 2014-15
00:06:37.020
in which they urged a complete ban on artificial intelligence-based lethal autonomous weapons,
00:06:47.340
so drone swarms that can decide to attack and kill on their own.
00:06:51.260
Something really important to remember about the Future of Life Institute
00:06:54.380
is that they are, by and large, composed of transhumanists
00:06:58.200
or quasi-transhumanists who, as you heard Stuart Russell talking about,
00:07:03.160
hope to create a superhuman artificial intelligence god.
00:07:07.740
And what you heard, this hearing was actually quite a bit more interesting in many ways than the last,
00:07:16.360
and also much more substantial proposals on how to deal with something like this.
00:07:25.460
Josh Hawley really stood out as being very informed
00:07:28.860
and was very open to recognizing the potential dystopian elements of all this.
00:07:34.780
That being said, the problems that they're highlighting,
00:07:38.900
there's really three that are of real import to my mind.
00:07:45.900
the notion that you don't know whether you're interacting with a human being
00:07:52.200
I think that's going to be a real problem going forward.
00:07:56.740
The artificial intelligence is definitely good enough
00:07:59.560
to fool somebody with, say, an IQ of 110 or below.
00:08:08.020
if they're talking to a person on social media.
00:08:10.600
If it's that good, how would you know without some kind of verification?
00:08:14.160
And that's where verification systems like the WorldCoin that we covered yesterday
00:08:19.520
and other verification systems, like something that would be in the same vein,
00:08:24.280
biometric identification, tying your body to a digital identity to prove that you're human.
00:08:30.820
So, trying to solve this problem could actually cause as many problems as it fixes,
00:08:35.300
such as having some sort of mandated biometric ID
00:08:38.080
in order to get onto the Internet or anything else.
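The mechanism being described, tying a biometric to a digital identity to prove personhood, can be sketched in a few lines. Everything here is hypothetical: the `enroll` and `verify` names are invented for illustration, and real systems such as WorldCoin rely on dedicated iris-scanning hardware and zero-knowledge proofs rather than a bare hash:

```python
import hashlib

# Hypothetical sketch: bind a biometric template to a digital identity
# by storing only a hash of it, then "prove you are human" by matching
# a fresh presentation against that stored credential.

def enroll(biometric_template: bytes) -> str:
    """Return the credential the verifier stores (never the raw scan)."""
    return hashlib.sha256(biometric_template).hexdigest()

def verify(biometric_template: bytes, credential: str) -> bool:
    """Check a presented template against the stored credential."""
    return hashlib.sha256(biometric_template).hexdigest() == credential

cred = enroll(b"iris-scan-of-alice")
print(verify(b"iris-scan-of-alice", cred))   # True
print(verify(b"iris-scan-of-bot", cred))     # False
```

In practice biometric readings are noisy, so exact hash matching would reject a legitimate rescan; deployed schemes use fuzzy extractors or on-device template matching instead. The sketch only shows the binding idea, and the surveillance concern raised above, that the credential becomes a gate on Internet access, applies to any variant of it.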
00:08:41.180
The second one, though, Steve, you know, that he also mentioned,
00:08:44.600
that lethal autonomous weapons, you have people like ex-Google CEO Eric Schmidt,
00:08:49.620
who chaired the National Security Commission on Artificial Intelligence until 2021.
00:08:56.160
And he is really, really pushing for using lethal autonomous weapons.
00:09:02.440
He believes that as other nations develop them, it will be imperative for the U.S.
00:09:07.560
to keep up that we have weapons that can decide to kill
00:09:11.040
because they would be able to react much faster than any human being would.
00:09:16.220
And just to generalize, he believes that human beings would not win a war
00:09:22.160
in which artificial intelligence was behind the kill switch, so to speak,
00:09:26.600
or the red button to either launch a missile or a drone attack or anything like that.
00:09:35.520
It's a big decision that's going to have to be made.
00:09:37.440
Do you allow machines to make the decision to kill or not?
00:09:41.160
We heard Marc Andreessen arguing that that should be the only system able to kill
00:09:47.060
because human beings are so faulty and so slow.
00:09:49.780
And he is a very respected intellectual in the realm of technology.
00:09:53.840
It wouldn't surprise me if he showed up at one of these hearings to testify.
00:09:57.400
So, to me, I think that's a very, very dangerous move to even have them at all,
00:10:02.400
let alone to have an attitude that human beings should not be responsible
00:10:06.040
for life or death decisions on the battlefield and that machines should.
00:10:09.920
And last but not least, of course, the idea of artificial general intelligence,
00:10:14.060
sort of AI god that the prophets of transhumanism are predicting.
00:10:19.960
And that's one that there's literally no way to know whether the technology has capped out
00:10:26.300
basically right here or if the most aggressive predictions like Elon Musk
00:10:30.440
and some of the others who spoke there at the hearing that artificial general intelligence,
00:10:35.180
the superhuman general purpose AGI could be literally right around the corner next five, seven years.
00:10:42.060
And you also don't know; you could easily have a corporation that claims that they have AGI
00:10:47.940
or something like AGI and be able to wield enormous amounts of influence and power
00:10:54.880
I think that this is probably the most shattering, both on just a philosophical level,
00:10:59.020
political level, economic level, and, of course, a theological level.
00:11:02.320
If you have a company like Google or OpenAI who claims to have brought into existence
00:11:08.320
either a conscious or at least a hyper-intelligent being that is not human,
00:11:13.880
a sort of alien life form, that really does change the game
00:11:17.260
as far as how we talk about what it means to be human
00:11:20.080
and what it means to really have rights in a society.
00:11:23.380
You have all these people like Dalton Ispon, Martine Rothblatt,
00:11:30.080
So I think, Steve, listening to Josh Hawley
00:11:34.300
and even Richard Blumenthal, Amy Klobuchar, I think that they have definitely shown
00:11:38.780
that they've done their homework and they're on top of this
00:11:41.740
as much as one could be on top of something so unpredictable.
00:11:44.900
At the same time, we have a lot of chaos on the horizon,
00:11:51.640
if these deepfakes really do proliferate, as they say.
00:11:59.140
But, RZ, the deepfakes, don't get me wrong, are going to be bad,
00:12:02.540
and this is one of the whole things about machines in elections
00:12:05.280
and connectivity to the net about artificial intelligence
00:12:11.500
But back to this, you know, Mike Allen had this piece,
00:12:16.720
Axios had this piece, and it's kind of ironic in that
00:12:19.760
their corporate clients, the uniparty, the corporatist uniparty
00:12:24.120
that runs our nation, has had no problem in this race
00:12:29.240
to develop AI with kind of a devil take the hindmost
00:12:33.620
or no controls whatsoever, kind of this bizarrely libertarian approach.
00:12:38.320
Esper the other day, and I think it was Semafor
00:12:41.020
that had the piece about his interview with the New York Times,
00:12:46.600
Esper, the former Secretary of Defense, you know,
00:12:48.620
the one with President Trump that I actually said
00:12:56.480
that he's sitting there going, there's an outright arms race,
00:12:58.920
that the 21st century is going to be defined by the arms race
00:13:02.320
that is more intense than the arms race you see in the movie Oppenheimer
00:13:06.840
about who's going to get the nuclear weapons first,
00:13:10.400
the Nazis or ourselves, and then the Russians or ourselves
00:13:13.500
with the hydrogen bomb, that you've got this arms race right now,
00:13:17.660
that will define the 21st century, and there's nothing that can stop that,
00:13:22.900
What struck me about these hearings, even as good as Josh Hawley
00:13:25.880
and these guys were, you're talking about some pretty modest things,
00:13:31.500
and nobody even came close to talking about a kill switch
00:13:35.220
that actually could knock the whole thing down,
00:13:36.780
or that you would have, how would you actually get some agreement,
00:13:41.380
some treaty or some agreement that you wouldn't,
00:13:46.440
into the machine itself where you could kill humans?
00:14:05.980
Yeah, I've long said, and I still stand behind it for now
00:14:18.940
to halt the development of artificial intelligence
00:14:27.880
that they would go overseas and develop them there,
00:14:32.020
to hold the sort of fabled ring of power, so to speak.
00:14:35.640
You could also have an agency that checked for safety,
00:14:41.860
Josh Hawley really pushed the concept of data privacy,
00:14:45.060
and that's something that's really needed to be put in place
00:14:49.920
It means that AI is not raking over your personality,
00:14:53.760
nor are just the general algorithms that are used by Google,
00:14:59.340
And so that would be an excellent move forward.
00:15:04.260
I mean, you know, one thing that I cover in my book
00:15:06.200
and what we've covered here for the last two and a half years
00:15:08.860
is that this is an enormous field of technological systems
00:15:13.400
on top of technological systems on top of systems,
00:15:22.640
or social disruption from a sort of distributed array of actors.
00:15:26.760
And so the chaos of the ongoing technological revolution
00:15:31.860
that we're living in, the so-called fourth industrial revolution,
00:15:35.320
I don't see any solid political solutions for it.
00:15:43.800
educating themselves about what the possibilities are,
00:15:56.480
what Artie and Polo calls the cyborg theocracy.
00:16:13.480
especially the non-invasive brain-computer interfaces
00:16:39.020
in a direction where you will have the possibility
00:16:44.620
So, I think that even if in the U.S. we say no on everything,
00:16:53.960
then obviously we're going to be at the center of that
00:16:56.120
because we have the best technology in the U.S.,
00:17:20.580
but I also think that there's the real possibility
00:17:22.940
that something like mandated biometric identification
00:17:28.980
really opens the doors to a lot of much worse problems,
00:17:34.600
of not knowing if you're speaking to a bot online.
00:18:09.860
this book is out on the 29th, I think, of August.
00:19:21.600
Maybe your interest is in the evolutionary underpinning
00:47:05.220
growth rate of electrification across the next set
00:47:32.320
spending a trillion dollars to add close to net
00:47:53.600
we'll need about two times more electrification
00:48:03.700
all and then half for the EV conversion and for
00:48:17.380
growing electrical capacity by about four and a
00:48:19.620
half, five percent where it needs to grow 20 percent
00:48:30.240
resulting in the de-electrification of the West
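Walsh's numbers can be put through rough arithmetic. This is only a sketch that treats the rates quoted on air as compound annual growth, an assumption the fragmentary transcript doesn't confirm:

```python
import math

# Rates quoted on air; treating them as compound annual growth is an
# assumption made here purely for illustration.
current_growth = 0.045   # capacity actually growing ~4.5 percent
needed_growth = 0.20     # rate said to be needed

print(f"shortfall: {needed_growth - current_growth:.1%}")  # 15.5%

# "Two times more electrification" means doubling capacity; at a
# compound rate r, doubling takes ln(2) / ln(1 + r) years.
years_current = math.log(2) / math.log(1 + current_growth)
years_needed = math.log(2) / math.log(1 + needed_growth)
print(f"doubling at 4.5%/yr: about {years_current:.0f} years")
print(f"doubling at 20%/yr:  about {years_needed:.0f} years")
```

On those assumptions, the build-out rate quoted would take roughly four times as long to double capacity as the rate said to be required, which is the gap the segment is pointing at.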
00:48:46.800
energy policy only environmental yielding to the
00:48:52.800
how energy is created and that less up means less
00:48:56.460
I find it, um, this is why I wanted to do the Axios piece
00:49:03.840
for today, because Axios, in pulling out climate change,
00:49:07.900
artificial intelligence, and nuclear weapons, said
00:49:11.700
these are the three existential threats to humankind.
00:49:16.500
Axios and their corporate partners are the ones that have been
00:49:20.580
driving this. That's the thing that's so outrageous. Dave, we've got to
00:49:24.700
bounce. What's your social media? Oh, we just lost Dave
00:49:28.560
Walsh. That's okay. Okay, Dave, what's your social media? How
00:49:32.040
do people get to you, brother? On Gettr, at Dave Walsh
00:49:35.240
Energy, and Truth Social the same. Thank you, Steve. You've
00:49:39.300
been spectacular. We're going to have you back to drill down
00:49:41.180
even more on this, sir, no pun intended. We're going to leave
00:49:45.260
you with The Weavers and their song, This Land Is Your
00:49:50.200
Land, a beautiful and powerful song about the United States
00:49:54.940
of America: This Land Is Your Land, from The Weavers.
00:49:56.940
i went walking that ribbon of highway i saw above me that endless skyway i saw below me
00:50:10.740
this land was made for you and me this land is your land this land is my land from california to the new york island from the redwood forest to the gulf stream waters
00:50:36.740
this land was made for you and me i roamed and i rambled and i followed my footsteps to the sparkling sands of her diamond deserts while all around me a voice was sounding saying this land was made for you and me
00:51:02.740
the sun came shining and i was strolling and the wheat fields waving and the dust clouds rolling as the fog was lifting a voice was chanting
00:51:32.720
this land was made for you and me this land was made for you and me this land is your land this land is my land from california to the new york island from the redwood forest to the gulf stream waters this land was made for you and me
00:52:07.720
Veterans, you know we have been all over the supply chain issue
00:52:23.720
with China and medications and the active pharmaceutical ingredients. China has a stranglehold on us. Here's a way to break that:
00:52:33.640
Jase Medical. I got an emergency medication kit from them.
00:52:37.960
The FDA just declared a global shortage of medication and warned that critical antibiotics are in extreme short supply across the United States. But you know that
00:52:47.200
because you're a viewer or listener of the show. Now here's the action you can take to correct it.
00:52:53.840
Do yourself and your family a favor and get your Jase Case right now. It's a pack of five prescription antibiotics
00:53:01.040
you'll have on hand for common emergencies. Just visit JaseMedical.com, that's Jase, JaseMedical.com.
00:53:10.200
Take a few minutes and fill out the form. Your information will be reviewed by a board-certified physician, and your medication
00:53:17.620
will be dispensed by a licensed pharmacy at a fraction of the regular cost.
00:53:22.820
You'll be glad you have the Jase Case. Go to Jase Medical, that's one word, J-A-S-E medical.com,
00:53:30.060
and enter code BANNON at checkout for a discount on your order.
00:53:33.940
That's promo code BANNON at Jase, J-A-S-E, medical.com.
00:53:38.720
You know what the problem is because you've watched the show.
00:53:41.740
You can take action and break that problem
00:53:45.620
by going to Jase Medical and getting your Jase Case today.
00:53:52.180
Folks, let me tell you about Soltea. It's a company that makes a soft-gel supplement rich in antioxidants
00:53:58.780
to help people like you and me keep a healthy heart.
00:54:02.440
While COVID gets all the headlines, it's important to realize
00:54:05.780
that heart disease kills nearly 700,000 Americans every year. Yes, heart disease is the number one killer
00:54:12.540
every year, year in and year out. Heart disease builds over time:
00:54:16.220
hypertension, high blood pressure, bad cholesterol, diabetes, all of it
00:54:23.280
to being energetic as we get older. It is never too early
00:54:27.740
to take care of your heart. You see, heart disease sneaks up on us.
00:54:32.420
It can start in your 30s, and when this happens, you're at serious risk by the time you turn 60.
00:54:38.760
and those you care about, please go to WarRoomHealth.com
00:54:47.980
Use the code WARROOM at checkout to save 67 percent off your first shipment.
00:54:52.560
That's code WARROOM at checkout to save 67 percent.
00:54:56.020
And do it again: WarRoomHealth, all one word, WarRoomHealth.com.
00:55:00.080
Go there today. If you're going to be part of the posse, you need a strong heart.