Episode 4788: If Anyone Builds It, Everyone Dies
Summary
At Erica's request, a speaker recounts Charlie Kirk's final moments: he was hit and killed instantly, and his security team rushed him by car to the hospital trying to save him. This is a powerful and moving story.
Transcript
00:00:00.000
We have time before we get our new guest up, or can we play the Charlie Kirk Doctor?
00:00:18.660
Now, here's what Erica wants me to relate on Sunday.
00:00:26.040
This is going to be the hard part, but maybe also the comforting part.
00:00:37.400
My three sons are a little bit older than Charlie.
00:00:42.720
So when he was hit, if your son got hit, what would you do?
00:00:54.060
Because if there was any way I could save him, I had to do something.
00:00:57.880
And I couldn't just, we're going to just take him.
00:01:12.000
And I'm on one side, and there's actually some video of this.
00:01:17.640
I'm on one side of the car, the right side, and they're getting Charlie in.
00:01:27.260
I pop the top, the back gate open, and I jump in the back.
00:01:58.260
And Charlie's laid out in front, just right in front of me.
00:02:02.260
And Charlie's so tall, we can't close the door.
00:02:07.000
We drove four miles, I don't know, it's four something miles, all the way to the hospital
00:02:12.960
To this day, I don't know how Brian stayed in the car.
00:02:18.000
We're, you know, we're trying to do, we're trying to stop the bleeding.
00:02:24.220
And I'm yelling, come on, Charlie, come on, come on.
00:02:29.620
My son and daughter-in-law are hearing this whole thing.
00:02:32.260
And his security team, again, Justin, Dan, Brian, and Rick, they love Charlie.
00:02:39.140
But they're much cooler than I, I mean, they're just carrying out, they're calmly, but they're
00:02:43.460
swiftly doing exactly what they were trained to do.
00:02:49.300
We're yelling, come on, let's go, let's go, let's go.
00:02:57.900
And we're, we're doing the best we can to navigate traffic.
00:03:03.640
And suddenly there's an ambulance coming toward us.
00:03:12.380
The doctor later said that was the right thing to do.
00:03:15.820
We're still heading to the hospital trying to get there.
00:03:20.780
At one point, somebody says, let's get there in one piece because we're just, we're cutting
00:03:24.560
through intersections, you know, we're just beeping the horn.
00:03:58.360
He was killed instantly and felt absolutely no pain.
00:04:11.360
And by the way, there was just nothing, nothing any of us could do about it.
00:04:16.160
We were giving him CPR, but nothing was happening.
00:04:19.060
It wasn't like if we had better first aid or we had better medical facilities or we were
00:04:23.600
faster to the hospital, we could have saved him.
00:04:26.220
So if that's any comfort at all, Charlie didn't suffer.
00:04:32.540
He was with Jesus absent from the body present with the Lord.
00:04:42.260
When we got to the hospital and they started working on him right away, they did get a pulse
00:04:50.100
And so Rick and I were just, everyone's praying.
00:05:00.220
And the doctor later said that we got a pulse because Charlie was a very healthy man, but
00:05:13.680
So 20 or 30 minutes later, the surgeon came out and said he was dead.
00:05:19.200
Thursday, 18 September, Year of the Lord, 2025.
00:05:33.700
We had a break for the press conference at Chequers, and Dr.
00:05:38.400
Peter Navarro had to take a meeting over in the West Wing.
00:05:41.860
He's going to rejoin us approximately 1130 to go through that.
00:05:46.520
Charlie Kirk, this is why: huge announcement overnight, the President of the United States is going
00:05:52.560
to designate Antifa and affiliated organizations a major terrorist organization.
00:05:58.040
And this will now expand from what the authorities out in Utah were doing, which is a murder, just
00:06:05.500
a typical murder case to something that's going to really get to the bottom of it.
00:06:09.320
And I think you see, and you know my opinion on this ridiculous set of text messages that
00:06:14.580
they're trying to foist on us, which is absurd.
00:06:22.540
I think he's over at Turning Point right now who joins us this afternoon.
00:06:24.800
I want to bring on, because of the, what happened at this press conference about the, there was
00:06:31.440
this massive deal, excuse me, before the press conference earlier today, some of the audience
00:06:35.200
might not have caught it, but there was a huge announcement of this transaction for nuclear
00:06:39.460
power and, you know, $400 billion and all of this technology we're going to provide.
00:06:44.580
And it all comes down to artificial intelligence.
00:06:48.060
I want to bring on the authors of a book that I think is a must read for every person in this country.
00:06:57.680
It's about artificial intelligence and two of the individuals that have been there from
00:07:01.660
the beginning and know both the benefits and huge upside of artificial intelligence, but
00:07:10.940
Eliezer Yudkowsky and Nate Soares, Yud and Nate, thank you for joining us here in the war room.
00:07:21.880
Weren't you the guy, just like a year or so ago, and Joe Allen, I was going through some
00:07:28.200
Weren't you the guy that said, hey, if this thing gets so out of control, I will be the
00:07:38.380
I was saying that we needed an international treaty and was being plain that the international
00:07:43.280
treaty needed to be willing to be backed up by force, including on non-signatory nations.
00:07:47.600
This wasn't about individuals trying to stop a data center.
00:07:51.960
I don't think that's going to be all that effective.
00:07:56.940
You ban it in your own country, it moves to others.
00:08:01.940
You know, it's not, it's not the kind of product which just kills the voluntary customers or
00:08:06.720
even people standing next to the voluntary customers.
00:08:09.180
This is something that endangers people on the other side of the planet.
00:08:13.980
This is the sort of treaty that needs to be backed up by force.
00:08:19.400
So, yeah, that is a, it's kind of a brilliant concept, right?
00:08:23.860
And you're saying that even non-signatories would have to take out.
00:08:29.200
What is it about this technology that potentially could be so dangerous to humanity?
00:08:34.060
Because all we hear, all we get, we get glazed every day.
00:08:36.600
We get glazed at how this is so wonderful and this is so tremendous and you got, you've
00:08:41.240
got Larry Fink, you got Steve Schwarzman, you have the head of Apple.
00:08:46.680
They're all over there cheering this on because of artificial intelligence.
00:08:50.560
You know, Oracle's now on fire because of artificial intelligence.
00:08:56.100
Why are you, as somebody who knows this and has been there since the beginning, so concerned
00:08:59.980
about it, you think there has to be a treaty that potentially nations of the world
00:09:04.240
would have to take it upon themselves or in unison to go take out the data centers, sir?
00:09:10.280
There's a limit to how far you can push this technology, how much benefit you can get out
00:09:20.400
We're already starting to see the artificial intelligences that are being built today
00:09:24.100
doing minor amounts of damage that their builders, the people who don't, you know,
00:09:40.040
At some point, it gets smarter than you, able to invent new technologies we don't have.
00:09:45.920
And at that point, I think that the current minor incidents of loss of control are going
00:09:49.980
to turn into catastrophic end-of-the-world type incidents.
00:09:56.880
I think it's actually going to happen that way.
00:10:03.460
I want to make sure the audience understands this.
00:10:11.840
Just explain to our audience why this is fundamentally different than computer programs and neural networks
00:10:16.960
and other things that have been crafted in the past.
00:10:19.820
Yeah, so when an AI cheats on a programming problem, which we're starting to see a little bit of that these days,
00:10:29.380
this is a case where the AI has, in some sense, been trained to succeed at certain types of tasks
00:10:42.080
and does it in ways – what am I trying to say?
00:10:53.120
There's no line in the code that says, make this AI cheat on the programming problems.
00:10:58.180
When AIs do things nobody wants, nobody asks for, there's no line in the code that a programmer can go in and fix
00:11:04.420
These AIs, we gather a huge amount of computing power and we gather a huge amount of data
00:11:09.260
and we shape the computing power to be better at predicting the data.
00:11:12.840
Humans understand the process that does the shaping.
00:11:15.540
Humans don't understand what comes out of the shaping process.
00:11:19.500
These things are – and the result, the result that comes out of the shaping process,
00:11:23.680
it does all sorts of things that we weren't asking for these days.
00:11:32.580
These are small now, they're cute now, but they're indications that we are starting to get AIs
00:11:39.440
that have something a little bit like drives, something a little bit like goals
00:11:47.880
We keep making them smarter and they have goals we didn't want.
00:11:55.220
Right now, the drive – and correct me if I'm wrong – the four horsemen or five horsemen of the apocalypse,
00:12:00.980
the companies that are driving this, all of them are committed to artificial general intelligence.
00:12:06.100
And correct me if I'm wrong, I believe you guys' thesis is that artificial –
00:12:10.780
since we don't really understand artificial intelligence that well,
00:12:14.420
hurtling towards superintelligence or AGI will be uncontrollable and lead to a catastrophe
00:12:23.400
Is that – is basically that your central argument?
00:12:35.040
There is no established science for how to make smarter-than-human machines that aren't dangerous.
00:12:40.440
If we keep pushing on making AI smarter and smarter while not having any ability to direct them to do good things,
00:12:51.000
the default outcome is just these AIs get to the point where they can invent their own technology,
00:12:56.660
to the point where they can build their own infrastructure, and then we die as a side effect.
00:13:08.620
We're trying to get many of them organized and get a bigger platform.
00:13:12.500
But if you look at the business press, if you look at just the general news,
00:13:16.140
if you look at what's coming out of Washington, it's all this super cheerleading.
00:13:22.040
I mean, I don't know if you saw the earlier announcement about the nuclear power plants,
00:13:26.940
but it was all about AI, AI, AI, and Starmer just sitting there with his pom-poms out.
00:13:35.600
Why is it people like you guys who are very involved and know this and have been there since the beginning,
00:13:42.420
why are these voices not getting a bigger platform right now?
00:13:47.480
So I think a lot of people don't understand the difference between chatbots that we have today
00:13:54.820
The explicitly stated goal of these companies, as you said, is to create smarter-than-human AI,
00:14:00.820
to create AI that can outperform humans at any mental task.
00:14:08.940
They, you know, five years ago, the computers could not hold a conversation.
00:14:13.300
I think a lot of people in Washington think that that's all AI can do and all it will be able to do
00:14:18.160
just because, you know, they haven't seen what's coming down the line.
00:14:21.980
And it can be hard to anticipate what's coming down the line.
00:14:24.140
I am hopeful that as people notice that we keep making these machines smarter,
00:14:29.800
as they notice what these companies are racing towards,
00:14:32.340
I'm hopeful that they'll realize we need to stop rushing towards this cliff edge.
00:14:37.420
Do you think, we've got about 60 seconds here, Nate,
00:14:40.640
do you think that that's being presented on Capitol Hill or anywhere in the media right now,
00:14:45.140
that what you're saying needs to be done is being done?
00:14:48.060
Not very well, but I'm very glad we're having this conversation.
00:14:52.720
And I'm hoping that the book makes a big splash because a lot of people,
00:14:56.460
you know, a lot more people are worried about these dangers than you might think.
00:15:00.520
And we've spoken to some people who say they're worried and say they can't talk about it
00:15:06.900
And then we see polls saying that lots of the population is worried.
00:15:09.820
So, I think this is a message that has its moment to break out.
00:15:14.600
And I'm hoping the world wakes up to this danger.
00:15:23.880
Their book, If Anyone Builds It, Everyone Dies.
00:15:29.020
You have two people that are experts and would have benefited economically,
00:15:34.200
There is something to unite this country, and you can unite this country around
00:15:39.280
questions about the oligarchs and big tech and exactly what's going down here.
00:15:51.940
And is it putting the country's interest and human interest
00:15:55.300
before corporate interest and the making of money?
00:16:11.280
This July, there is a global summit of BRICS nations in Rio de Janeiro.
00:16:16.260
The bloc of emerging superpowers, including China, Russia, India, and Persia,
00:16:21.900
are meeting with the goal of displacing the United States dollar as the global currency.
00:16:30.120
As BRICS nations push forward with their plans, global demand for U.S. dollars will decrease,
00:16:35.200
bringing down the value of the dollar in your savings.
00:16:38.460
While this transition won't happen overnight, trust me, it's going to start in Rio.
00:16:44.320
The Rio Reset in July marks a pivotal moment when BRICS objectives move decisively from a theoretical possibility towards an inevitable reality.
00:16:55.160
Learn if diversifying your savings into gold is right for you.
00:17:00.020
Birch Gold Group can help you move your hard-earned savings into a tax-sheltered IRA and precious metals.
00:17:06.600
Claim your free info kit on gold by texting my name, Bannon, that's B-A-N-N-O-N, to 989898.
00:17:13.600
With an A-plus rating with the Better Business Bureau and tens of thousands of happy customers,
00:17:19.240
let Birch Gold arm you with a free, no-obligation info kit on owning gold before July.
00:17:44.760
And it's where all the biggest voices in conservative media are speaking out.
00:17:51.840
It's where I put up exclusively all of my content.
00:18:14.820
It is a warning and a deeply thought through warning to humanity and to the citizens of
00:18:22.060
Guys, we've been accused a lot on the war room of being Luddites, that we just don't
00:18:33.680
But I keep telling people, I say, hey, in this regard, all I'm doing is putting out the
00:18:37.860
information of some of the smartest guys that were there at the beginning of the building
00:18:44.240
Now, can folks make the argument against you guys that you've just become proto-Luddites
00:18:51.380
You know, there are many technologies that I am quite bullish on.
00:18:58.480
I personally think America should be building more nuclear power.
00:19:01.840
I personally think we should be building supersonic jets.
00:19:04.560
It's different when a technology risks the lives of everybody on the planet, and this
00:19:13.080
You have the Nobel Prize-winning founder of the field saying he thinks this
00:19:19.500
You have people inside these labs saying, yes, it's very dangerous, but we're going to
00:19:23.220
rush ahead anyway, even at the risk of the entire world.
00:19:27.300
If a bridge was going to collapse with very high probability, we wouldn't say, well, we
00:19:35.120
need to leave it up for the sake of technology.
00:19:38.020
You know, when NASA launches rockets into space, it accepts a 1 in 270 risk of that rocket
00:19:45.920
blowing up, and those are volunteer astronauts on the crew.
00:19:49.200
That's the sort of risk they're willing to accept.
00:19:51.960
With AI, we are dealing with much, much higher dangers than that.
00:19:55.280
We're dealing with a technology that just kills you.
00:19:58.320
We're trying to build smarter-than-human machines while having no idea of how to point them in
00:20:06.340
To imagine that this is Luddites, we're not saying, oh, this is going to have some bad
00:20:19.440
You know, AIs may have those effects, but the chatbots may have those effects, but a
00:20:32.020
But that's just the default course this technology takes.
00:20:35.340
It's not that humanity cannot progress technologically.
00:20:38.560
It's that we shouldn't race over a cliff edge in the name of technology.
00:20:42.160
We need to find some saner way forward towards the higher tech future.
00:20:46.580
Okay, so yeah, to Nate's point, you know, we had this thing in England today where they're
00:20:53.900
full-on, all nuclear powers because we need it.
00:20:56.980
You know, all the climate change guys, forget that.
00:20:59.180
We need this because we've got to have artificial intelligence.
00:21:01.440
You have this concept of a treaty, like maybe the Geneva Treaty after the wars, to have nations
00:21:10.440
However, all you hear, we're the leader of the anti-CCP movement in this country, have
00:21:18.000
I'm sanctioned fully by the Chinese Communist Party.
00:21:20.380
I can't have any association with any Chinese company or people because we represent Lao
00:21:26.840
What they're holding up to us is that if we follow Yud and Nate and what they want to
00:21:31.020
do in the war room, that the CCP, particularly after DeepSeek, what they call the Sputnik moment,
00:21:36.060
that the Chinese Communist Party is going to build this technology, and then we're going
00:21:39.820
to have the most evil dictatorship in the world, have control of the most evil technology ever
00:21:44.960
created, and we're going to be at their beck and call.
00:21:50.860
We have not advocated that the United States or the United Kingdom, for that matter, try
00:21:55.960
to unilaterally relinquish artificial intelligence.
00:21:59.820
Our position is that you need an international treaty.
00:22:06.500
The Chinese government has at least overtly seemed on some occasions to present a posture
00:22:12.320
of being willing to acknowledge that AI is a worldwide matter and that there might be
00:22:18.400
cause for coordination among nations to prevent the Earth itself from being wiped out.
00:22:27.160
We have had countries that hated each other work together to prevent global thermonuclear war,
00:22:34.820
You look back at history, in the 1950s, a lot of people thought humanity wasn't going to
00:22:38.820
make it, or at least civilization wasn't going to make it, that we were going to have a nuclear
00:22:43.320
And that wasn't them enjoying being pessimistic.
00:22:49.900
They thought that what was going to happen is that every country was going to build its own
00:22:52.780
nuclear fleet, and eventually there would be a spark, and the world would go up in flames.
00:22:58.200
And the reason it didn't happen is because of some pretty serious efforts put forth by
00:23:01.860
countries that in many cases hated each other's guts to at least work together on not all
00:23:12.900
Nate, before I let you guys bounce, can you explain the title of the book?
00:23:22.540
I mean, what we mean is that if humanity builds smarter than human AI using anything
00:23:29.080
remotely like the current technology and anything remotely like the current lack of
00:23:32.540
understanding, then every human being will die.
00:23:37.620
Superintelligence can transform the world more than that.
00:23:40.300
And this title also goes back to the point about China.
00:23:43.240
It's not that great dictators would be able to control this great power if they made it.
00:23:48.580
If you make a superintelligence, you don't have that superintelligence.
00:23:53.740
You've just created an entity that has the planet.
00:23:56.820
Artificial intelligence would be able to radically transform the world.
00:24:00.260
And if we don't know how to make it do good things, we shouldn't do it at all.
00:24:05.220
And we are nowhere close to being able to point superintelligence in a good direction.
00:24:08.940
So humanity needs to back off from this challenge.
00:24:15.860
All I can say is the forces of capital, the forces of politics, human avarice and greed,
00:24:23.640
and also the need for power makes this, you know, we fight big pharma.
00:24:29.220
The fights we have every day are massive against long odds.
00:24:35.640
But I tell people, this is the hardest one I've ever seen because of what's happened.
00:24:39.640
And I said at that time at Davos, when we had the thing at Davos, when ChatGPT came out,
00:24:44.700
I said, you wait till venture capital and Wall Street gets involved.
00:24:51.020
You know, Steve, it would be a very different thing if Eliezer Yudkowsky and Nate Soares
00:24:57.640
were making these accusations or, at the very least, issuing these warnings in a vacuum.
00:25:04.860
If the tech companies, for instance, were just simply saying, we're building tools,
00:25:10.520
these guys are accusing us of building gods, they're crazy, it'd be a very different situation.
00:25:17.760
Every one of them, even the most moderate, like Demis Hassabis, but certainly Sam Altman,
00:25:24.360
Elon Musk, even Dario Amodei, they're all talking about the creation of artificial general
00:25:34.000
And so when we first started covering, when you first brought me on four and a half years ago,
00:25:38.720
we hit a lot of the points that Yudkowsky was making.
00:25:42.700
We would show videos and try to explain to the audience.
00:25:45.000
And they, by and large, didn't really grasp the reality of it
00:25:49.700
because it wasn't as much of a reality four and a half years ago.
00:25:53.620
In just that short amount of time, we've seen the creation of systems
00:26:03.160
We saw at the very beginning, GPT was not supposed to be online.
00:26:09.200
Basically, all the warnings that Yudkowsky gave early on when AI was hitting the headlines,
00:26:18.680
My question for Yudkowsky would be this, and for Soares.
00:26:24.180
You live in and among the most techno-saturated culture in the country, San Francisco.
00:26:30.800
Can you give us some insight into the mentality of the people who are willing to barrel ahead,
00:26:40.040
no matter what, and create these systems, even if that means the end of the human race?
00:26:45.860
So, some of them have just come out and said, you know,
00:26:52.920
I had a choice between being a bystander and being a participant,
00:26:58.180
That is, in some sense, enough to explain why these people are barreling ahead,
00:27:03.200
although I think in real life there's a bunch of other explanations, too,
00:27:06.220
like the people who started these companies back in 2015
00:27:09.520
are the sort of people who are able to convince themselves it would be okay
00:27:13.120
to gamble with the whole civilization like this.
00:27:16.020
You know, we've seen comments like back in 2015, I believe,
00:27:22.360
AI might kill us all, but there'll be good companies along the way.
00:27:27.400
artificial general intelligence will probably kill us all,
00:27:29.960
but there will be great companies made along the way.
00:27:31.880
I don't know the exact quote, but that mentality,
00:27:36.340
it's not someone taking seriously what they're doing.
00:27:38.980
It's not someone treating with gravity what they're doing.
00:27:41.760
This wouldn't be an issue if they couldn't also make greater and greater intelligences.
00:27:47.720
But in this world where we're just growing intelligences,
00:27:51.400
and the most optimistic people, who were foolish enough to start the companies,
00:27:54.900
can just grow these AIs to be smarter and smarter,
00:28:04.480
You know, when Geoffrey Hinton, now Nobel laureate Geoffrey Hinton,
00:28:13.940
sort of woke up and noticed that it was starting to be real,
00:28:17.780
he quit Google and started speaking out about these issues more openly.
00:28:22.080
Now, who knows how much money he was turning down by doing that,
00:28:26.680
And, you know, that's one kind of person that you have on the playing field.
00:28:29.940
And then you've also got the, you know, the people who were selected and filtered
00:28:34.300
for being the sort of people who would, you know,
00:28:37.540
back when OpenAI started, go over to Elon Musk and say, you know,
00:28:41.700
you know how we can solve the problem of these things we can't control?
00:28:51.460
This was, you know, this was always kind of moon logic,
00:28:56.020
but they sure got Elon's money and then, you know, took it and ran off.
00:28:59.620
And that's just the kind of people we're dealing with here.
00:29:06.940
I just want to hold you through the break because I want to give people access
00:29:09.600
to how to get this book, where to get it, your writing, social media, all of it.
00:29:47.220
a delicious glass of Field of Greens daily is like nutritional armor for your body.
00:29:53.340
Each fruit and each vegetable was doctor selected for a specific health benefit.
00:29:59.380
There's a heart health group, lungs and kidney groups, metabolism, even healthy weight.
00:30:04.540
I love the energy boost I get with Field of Greens.
00:30:07.760
But most of all, I love the confidence that even if I have a cheat day or, wait for it,
00:30:13.660
a burger, I can enjoy it guilt-free because of Field of Greens.
00:30:20.140
And only Field of Greens makes you this better health promise.
00:30:24.420
Your doctor will notice your improved health or your money back.
00:30:28.540
Your doctor will notice your improved health or your money back.
00:30:32.040
Let me get you started with my special discount.
00:30:37.960
Just use code Bannon, B-A-N-N-O-N, at fieldofgreens.com.
00:30:47.080
And if your doctor doesn't know how healthy you look and feel,
00:31:05.300
By the way, just before I let you guys go, give your coordinates and tell people how to buy this amazing book,
00:31:10.580
purchase of this book, which we're going to break down and spend more time on, folks, in the days ahead.
00:31:15.880
There's a movie called Mountainhead, which basically has actors playing Elon Musk, Steve Jobs, I think Zuckerberg, and Altman.
00:31:26.060
And it's actually kind of dark to begin with, but it turns very dark when one becomes – they identify one as a decelerationist.
00:31:40.540
I would decelerate any technology that could wipe us all out and prevent us from learning from mistakes.
00:31:45.760
Every other technology, I think we need to go full steam ahead and are sometimes hobbling ourselves.
00:31:50.940
But AI in particular, any technology that kills everybody and leaves no survivors, you can't rush ahead on that.
00:32:04.060
If a product only kills the voluntary customers and maybe the person who sold it, that's kind of between them.
00:32:11.340
I might have sympathy, but not to the point where I try to take over their lives about it.
00:32:15.320
If a product kills people standing next to the customer, it's a regional matter.
00:32:19.620
Different cities, different states can make different rules about it.
00:32:22.500
If a product kills people on the other side of the planet, that's everybody's problem.
00:32:26.340
And, you know, yeah, you don't have to agree with me to want humanity to not die here about this part.
00:32:35.760
But I would happen to, you know, go full steam ahead on nuclear power.
00:32:43.740
Artificial intelligence, you know, gain-of-function research on viruses might be another thing.
00:32:49.600
But, you know, it does actually differ by the technology.
00:32:53.720
You don't have – there's not this one switch that's set to accel or decel.
00:33:02.120
I might add we were the first show, in January of 2020, to say that what the University of North Carolina was doing on gain-of-function was a danger to humanity.
00:33:12.180
And we were laughed at by the mainstream media as being conspiracy theorists.
00:33:16.240
Yud, what is your – what are your coordinates?
00:33:18.860
How do people follow you, your thinking, your writing?
00:33:39.180
Look forward to having you back and break down this book even more.
00:33:48.720
Thank you, guys, for joining us in the war room.
00:34:00.000
First off, you're one of the ambassadors of Turning Point.
00:34:02.480
Give me your thoughts of what's evolved this week.
00:34:05.520
And, you know, we've got Charlie's funeral or celebration of life on Sunday.
00:34:09.440
Give me your – you knew him extremely well, and you do the faith show here on Real America's Voice.
00:34:24.620
It's been a turning point for not just America, but for the world.
00:34:33.820
And for this to happen to him is just devastating to so many people.
00:34:38.320
And so many people are wondering, what do we do?
00:34:43.540
And I say silence is not an option at this point.
00:34:52.860
So that's what I'm trying to do with my Faith and Freedom show right here on Real America's Voice.
00:34:58.860
I think I am the oldest Turning Point ambassador.
00:35:07.040
That's just biological – that's chronological age.
00:35:21.260
Hey, I just – I'm just calling it – I'm just calling balls and strikes here.
00:35:32.880
He's young at heart, or it's like parenting a young child.
00:35:40.660
Steve, that's a whole other podcast, okay, about Ted and trying to stay young.
00:35:48.800
There is a study recently about epigenetics, which is the science that shows your DNA is not your destiny.
00:35:58.020
None of us eat right, so we all take supplements, right?
00:36:02.340
But there are so many different fruit and vegetable supplements on the market, and if you study their ingredients, which I have, I'm a label reader, it's just common produce with limited nutritional value.
00:36:13.840
There's a product called Field of Greens, and it's different.
00:36:16.120
And they wanted to prove that it was different by doing a university study where each fruit and vegetable in Field of Greens is medically selected for health benefits.
00:36:25.660
There's heart health, lungs, kidney, liver, healthy metabolism, and healthy weight.
00:36:31.780
And in this study, Steve, they wanted to see how diet, exercise, and lifestyle changed your real age.
00:36:40.960
And this is fascinating to me, but some of the participants, they ate their normal diets, including fast food, they didn't exercise more, and they didn't stop drinking.
00:36:52.800
All they did is add Field of Greens to their daily routine, and the results were remarkable.
00:37:01.220
60% of participants showed a measurable reduction in their biological aging markers after just 30 days.
00:37:10.100
One scoop of Field of Greens slows the body's aging process at the cellular level.
00:37:15.280
And I think this was what helped me, because I've worked out all my life.
00:37:21.360
I'll be honest, I don't eat right all the time.
00:37:24.340
So just by taking one scoop of Field of Greens, I can see that aging slow down.
00:37:33.860
Fieldofgreens.com, promo code BANDY, get 20% off.
00:37:40.340
Not just all that about this Texas A&M study, but also I get an energy boost, Shemane, just every day.
00:37:49.700
I want to thank you for all you do, and particularly being an ambassador and helping the folks over at Turning Point, particularly in this very difficult phase for the movement, for the company, for Erica, the kids, everybody.
00:38:02.560
So I really want to thank you for joining us today.
00:38:08.000
If we remain silent, relief and deliverance is going to come from someplace else.
00:38:35.240
You've been with the president now for 10 years.
00:38:38.000
But tell me about – you wrote this piece that's pretty moving, and I think it's so tied to your book about you went to prison so that we don't have to.
00:38:51.460
Steve, I really want people to understand the legacy of Charlie Kirk historically.
00:39:00.880
He certainly would have been a governor, but already at 31 years old, he's the greatest political organizer in the last 50 years.
00:39:11.100
And if you compare him to the two who were there at the top before Charlie Kirk, Ralph Reed on the right, David Axelrod on the left.
00:39:20.020
What Ralph Reed did with the Christian coalition is mobilize the Christian right to get out and actually vote.
00:39:27.500
He was responsible for the Gingrich revolution in 1994 as well as the Bush win in 2000.
00:39:34.960
And then Axelrod on the left, he was able to mobilize natural Democrat constituencies, blacks, Hispanics, and young people, used micro-targeting, some advanced kind of techniques at the time, and basically won the race for Obama in 2008.
00:39:53.700
The reason why Charlie is head and shoulders above each one of them is he had a much heavier lift, Steve.
00:40:01.060
To mobilize the youth in support of MAGA and Trump and MAGA candidates in Congress, he had to first bring them over to our side.
00:40:16.000
When I first met him, Charlie, back in 2016, young kid, thinking he was going to go out there and change the viewpoint of the youth of America, I thought he was Don Quixote.
00:40:33.220
And people need to understand father, husband, patriot, just a wonderful human being that's here.
00:40:44.820
But in terms of pure historical significance, he will go down in history as the greatest political organizer in the last 50 years.
00:40:53.380
And I don't think anybody's ever going to do, again, what he did because it's relatively easy to mobilize.
00:41:02.000
It's very difficult to persuade people over to your side and then mobilize, Steve.
00:41:11.660
I want to, because you got your PhD at Harvard, then you went back, you taught in the university system.
00:41:17.340
When I first saw Charlie, I think Breitbart's the first guy to give him a paid gig.
00:41:21.320
But some of the people around Breitbart were what financed him at the very beginning when he was going after student governments.
00:41:26.960
And I think many people who thought Charlie was just a ball of fire thought it was the longest odds.
00:41:33.840
And you thought so, too, because the universities, as we know now, are based around this kind of radical philosophy.
00:41:42.380
And the kids are formed all the way from kindergarten all the way up.
00:41:45.500
So that is, to me, the greatness of Charlie Kirk, that he was able to go in and just do this when so many people said, hey, look, this guy's great.
00:42:01.260
And you knew it better than anybody because you were inside the belly of the beast.
00:42:07.260
Well, I spent 25 years at the University of California, Irvine.
00:42:12.260
And if there's a system that ever was woke, that certainly is it.
00:42:19.240
But what Charlie understood, he didn't start at Harvard and Cornell.
00:42:23.680
He understood that most of the universities in this country are in flyover country.
00:42:32.060
He went out there and Socratically, I mean, when I taught in the classroom, I was a big fan of the Socratic method.
00:42:41.800
You've got to have them come to their own conclusions.
00:42:45.820
And that's how Charlie was able to bring people home to MAGA.
00:42:53.620
I mean, fast forward, it's like when I got out of prison, you know, the day I got out of prison,
00:43:00.860
July 17th, I went to the Republican National Committee, gave the speech.
00:43:08.920
The title of the book is actually a tagline from the speech.
00:43:13.940
But I mentioned this in the context of Charlie because I didn't even know this.
00:43:26.640
And he was giving a speech on my birthday, July 15th, two days before me.
00:43:31.740
And he said, I visited college campuses so you won't have to.
00:43:37.300
And it, I mean, the way I just, somehow it struck a warm chord in me.
00:43:47.640
And, yeah, I'm on the campaign trail getting out of prison with the boss.
00:43:51.860
My fiancée Pixie in the book, I call her Bonnie.
00:44:07.580
And then during the transition, he was essentially Sergio Gor's co-pilot there, putting all the personnel together in the administration.
00:44:20.040
And, look, the boss, Charlie was like a son to Donald John Trump, as well as a key advisor, one of his most trusted advisors.
00:44:35.260
I'm going to try to hitch a ride out on the Korean Air Force One on Sunday and be there.
00:44:45.540
I want to hold you through the break because I want to talk about the book for a second.
00:44:58.820
If you're a homeowner, you need to listen to this.
00:45:00.960
In today's AI and cyber world, scammers are stealing home titles with more ease than ever, and your equity is the target.
00:45:11.660
Criminals forge your signature on one document.
00:45:16.060
Pay a small fee with your county, and boom, your home title has been transferred out of your name.
00:45:22.580
Then they take out loans using your equity or even sell your property.
00:45:26.640
You won't even know it's happened until you get a collection or foreclosure notice.
00:45:33.860
So let me ask you, when was the last time you personally checked your home title?
00:45:39.660
If you're like me, the answer is never, and that's exactly what scammers are counting on.
00:45:48.060
Use promo code Steve at HomeTitleLock.com to make sure your title is still in your name.
00:45:55.620
You'll also get a free title history report plus a free 14-day trial of their million-dollar triple lock protection.
00:46:05.580
Urgent alerts to any changes, and if fraud should happen, they'll spend up to $1 million to fix it.
00:46:32.200
It's the process of how you get to the value of gold.
00:46:34.560
Make sure you take your phone out right now and text Bannon, B-A-N-N-O-N, to 989898.
00:46:40.640
To get the ultimate guide, which happens to be free, investing in gold and precious metals in the age of Trump.
00:46:48.520
We had a rate cut last night, only 25 basis points.
00:46:53.920
Stephen Miran, the Council of Economic Advisers chair, is now, I guess, the interim governor.
00:47:03.140
Go find out why gold has been a hedge for times of financial turbulence in mankind's history.
00:47:12.840
You can't be on tonight because you're going to be at one of the conferences.
00:47:15.060
I'm going to get you back on hopefully tomorrow.
00:47:17.160
We had a historic interview today on the book, If Anyone Builds It, Everyone Dies.
00:47:25.720
If the audience wants to hear the in-depth interview I did with Nate Soares a couple of weeks ago,
00:47:33.140
it's right at the top of my social media, at J-O-E-B-O-T-X-Y-Z.
00:47:39.240
Also an article about the hearing two days ago with Josh Hawley and the parents of children
00:47:48.960
That's also up at the top of my social media, at J-O-E-B-O-T-X-Y-Z, or joebot.xyz.
00:47:55.360
We are backed up on a lot of stuff because the Charlie Kirk situation was obviously a priority,
00:48:00.720
including designating Antifa a terrorist group so we can get to the bottom of all of it
00:48:05.360
and not just have this dealt with by Utah officials as a single murder.
00:48:09.700
It's much deeper than that, the assassination of Charlie Kirk.
00:48:14.840
Josh Hawley was supposed to be here today, but of course we had the press conference.
00:48:18.160
And right there on the screen you see the President of the United States getting ready to leave
00:48:21.920
to go to the airfield to take Air Force One back.
00:48:29.140
Peter Navarro, one of President Trump's closest advisors,
00:48:34.920
I think he's the only one that's been there from the very, very, very beginning that's still there.
00:48:39.080
Why should, in a world of all this information and everything going on,
00:48:42.640
as big a hero as you are to this movement, as highly respected as you are by President Trump,
00:48:49.640
because you're kind of the architect with him of the reorganization of the world's trading patterns,
00:48:55.540
why should people buy a book about you and your days in prison, sir?
00:49:03.040
And I would ask the posse to go right now to Amazon.
00:49:08.720
The book is really the best analysis of how the left is going after all of this.
00:49:16.940
If they can come for me, if they can come for Steve Bannon, put him in prison,
00:49:20.900
if they can try to put Trump in prison now, they shot Charlie Kirk.
00:49:27.360
And I'll take you into prison, and I went to prison so you won't have to.
00:49:31.720
But the broad scope here, Steve, is really an analysis about the asymmetry,
00:49:37.680
the disturbing asymmetry between how the left is waging war on this.
00:49:43.520
You mentioned everybody I serve with, Steve, including you, has been a target of the left.
00:49:52.220
At a minimum, they've spent millions of dollars in legal fees,
00:49:55.720
whether it's Mike Flynn or America's Mayor Rudy Giuliani.
00:50:01.620
They take the bar cards of Jeff Clark, John Eastman.
00:50:05.000
And on the other end, Steve, you and I went to prison.
00:50:07.960
And everybody who put us there, every single person was a Democrat except Liz Cheney.
00:50:16.440
How can the Democrats seize power and use that to put us in prison and call us fascists?
00:50:30.620
It's a story about how we must wake up to what's happening.
00:50:37.460
But look, if you want to find out what it's like to go into prison for a misdemeanor
00:50:42.260
and wind up spending four months with 200 felons, this is the book.
00:50:49.240
And you know, Bonnie, my fiancée, it's also a story about how we were able to cope and deal with that.
00:50:59.020
The message there is simply that she did the time with me.
00:51:02.200
And that's what happens when people are unjustly targeted by the left.
00:51:12.880
But when they go after our families, that's where you draw the line.
00:51:21.540
If they steal the election in 2028, trust me, this audience, they're going to be coming for you.
00:51:26.020
You see now, we're in a different place than we were even back then.
00:51:29.060
This is getting more and more intense every day.
00:51:31.580
That's why Navarro's book has got to be written because it's actually for you and about you.
00:51:36.860
It ain't about Navarro, not about me, not about President Trump.
00:51:43.780
And understand, like I said, there's nothing, there's no compromise here.
00:51:49.000
One side's going to win and one side's going to lose.
00:51:50.800
And if they steal it in 2028, they're coming for you.
00:51:53.460
Peter, where do people go to get your writings?
00:51:55.460
Where do they go to get, in particular, your great piece on Charlie Kirk?
00:52:00.580
The book, I Went to Prison So You Won't Have To.
00:52:06.580
Please drive this thing up to bestsellers so we get the message out.
00:52:12.880
I Went to Prison So You Won't Have To on Amazon.
00:52:15.820
The piece about Charlie is very close to my heart.
00:52:25.680
It will be up on my sub-stack on Sunday as we celebrate Charlie on that sacred Sunday that we're about to have.
00:52:35.340
And you can always go to my Substack, peternavarro.substack.com.
00:52:38.480
But, Steve, I really appreciate what the War Room does.
00:52:41.660
I appreciate being able to come talk about I went to prison so you won't have to.
00:52:47.660
And C-SPAN is running an hour-long interview on Sunday at 8 p.m.
00:52:59.900
You're going to be on Saturday, and also you're going to be on Sunday.
00:53:02.460
We're doing wall-to-wall coverage live from the stadium.
00:53:07.620
Mike Lindell, it's been a long, tough morning for the War Room.
00:53:16.580
I'm sitting here back in Minnesota at my factory.
00:53:28.140
And then we have the pillows, the Giza Dream pillows, the Giza Covers.
00:53:41.760
You guys go to mypillow.com forward slash war room.
00:53:45.940
And then you're going to see all of the big-ticket items at the website, free shipping on the beds, the mattress toppers, 100% made in the USA.
00:53:58.240
And people, remember, we have a 10-year warranty and a 60-day money-back guarantee, everybody.
00:54:06.320
We toss to the Charlie Kirk Show, two hours of populist nationalism, hosted today by Megyn Kelly.
00:54:13.840
We'll see you back here at 5 p.m. Eastern Daylight Time.
00:54:26.840
The IRS doesn't mess around, and they're applying pressure like we haven't seen in years.
00:54:31.340
So if you haven't filed in a while, even if you can't pay, don't wait.
00:54:40.080
You need the trusted experts by your side, Tax Network USA.
00:54:44.200
Tax Network USA isn't like other tax relief companies.
00:54:47.820
They have an edge, a preferred direct line to the IRS.
00:54:51.300
They know which agents to talk to and which ones to avoid.
00:54:54.780
They use smart, aggressive strategies to settle your tax problems quickly and in your favor.
00:54:59.940
Remember, whether you owe $10,000 or $10 million, Tax Network USA has helped resolve over $1 billion in tax debt, and they can help you, too.
00:55:17.300
Talk with one of their strategists and put your IRS troubles behind you.
00:55:31.060
Or visit Tax Network USA, TNUSA.com slash Bannon.