Bannon's War Room


Episode 4788: If Anyone Builds It, Everyone Dies


Episode Stats

Misogynist Sentences

5

Hate Speech Sentences

4


Summary

In a clip played at the top of the show, a close family friend relates, at Erica's request, how Charlie Kirk, who he says was like a fourth son to him, was shot, and how he and Charlie's security team rushed him by SUV to the hospital, though Charlie was killed instantly and felt no pain. This is a powerful and moving story.


Transcript

00:00:00.000 We have time before we get our new guest up, or can we play the Charlie Kirk Doctor?
00:00:05.000 I want to play.
00:00:07.660 This was very moving.
00:00:09.000 It came out last night.
00:00:09.920 It was going to be the end of our cold open.
00:00:12.240 But as you saw, we cut in the cold open.
00:00:14.900 So let's go ahead and play this.
00:00:16.140 I want everybody to hear this.
00:00:17.560 Let's go ahead and play it.
00:00:18.660 Now, here's what Erica wants me to relate on Sunday.
00:00:26.040 This is going to be the hard part, but maybe also the comforting part.
00:00:31.000 Charlie Kirk was literally like a son to me.
00:00:34.240 I have three sons.
00:00:35.200 He was like my fourth son.
00:00:37.400 My three sons are a little bit older than Charlie.
00:00:39.520 He was like my fourth son.
00:00:42.720 So when he was hit, if your son got hit, what would you do?
00:00:49.000 What would you do?
00:00:50.040 I got in the car.
00:00:54.060 Because if there was any way I could save him, I had to do something.
00:00:57.880 And I couldn't just, we're going to just take him.
00:01:02.660 You guys got it.
00:01:04.440 So they got him into the side of the car.
00:01:07.700 It was an SUV.
00:01:09.300 It was the SUV we took over.
00:01:12.000 And I'm on one side, and there's actually some video of this.
00:01:16.620 Somebody was taking video of this.
00:01:17.640 I'm on one side of the car, the right side, and they're getting Charlie in.
00:01:21.260 So I run over to the other side.
00:01:22.460 But the guy was dragging him in.
00:01:23.980 They're now blocking that entrance.
00:01:25.440 So at that point, I run around to the back.
00:01:27.260 I pop the top, the back gate open, and I jump in the back.
00:01:31.740 The car lurches forward.
00:01:33.400 Apparently, somebody jumped in the car.
00:01:35.080 So the car lurches forward.
00:01:36.000 So I almost fall out of the car, the SUV.
00:01:39.920 Then I grab the thing and close it.
00:01:43.560 And there's five of us in the car now.
00:01:47.520 Justin is driving.
00:01:49.000 Dan is up front with the GPS.
00:01:52.340 Rick has got him.
00:01:53.160 Rick's on my left.
00:01:54.580 And Brian is there.
00:01:55.820 And I'm coming over the back seat.
00:01:58.260 And Charlie's laid out in front, just right in front of me.
00:02:02.260 And Charlie's so tall, we can't close the door.
00:02:07.000 We drove four miles, I don't know, it's four something miles, all the way to the hospital
00:02:11.940 with the door open.
00:02:12.960 To this day, I don't know how Brian stayed in the car.
00:02:15.100 Because we're just, go, go, go, go, go.
00:02:18.000 We're, you know, we're trying to do, we're trying to stop the bleeding.
00:02:21.360 You saw it.
00:02:24.220 And I'm yelling, come on, Charlie, come on, come on.
00:02:27.140 Meanwhile, my phone is still on.
00:02:29.620 My son and daughter-in-law are hearing this whole thing.
00:02:32.260 And his security team, again, Justin, Dan, Brian, and Rick, they love Charlie.
00:02:39.140 But they're much cooler than I, I mean, they're just carrying out, they're calmly, but they're
00:02:43.460 swiftly doing exactly what they were trained to do.
00:02:46.560 Rick starts praying out loud.
00:02:47.980 I'm praying out loud.
00:02:49.300 We're yelling, come on, let's go, let's go, let's go.
00:02:54.900 My son's hearing all this.
00:02:57.900 And we're, we're doing the best we can to navigate traffic.
00:03:00.980 It's not a highway.
00:03:01.860 We're on surface streets.
00:03:03.640 And suddenly there's an ambulance coming toward us.
00:03:07.460 And there was conversation in the car.
00:03:09.500 Should we stop?
00:03:10.060 We're like, no, no, just keep going.
00:03:11.840 Just keep going.
00:03:12.380 The doctor later said that was the right thing to do.
00:03:14.480 Ambulance goes by us.
00:03:15.820 We're still heading to the hospital trying to get there.
00:03:20.780 At one point, somebody says, let's get there in one piece because we're just, we're cutting
00:03:24.560 through intersections, you know, we're just beeping the horn.
00:03:26.940 This is not an emergency vehicle.
00:03:29.200 There's no, there's no lights.
00:03:30.700 There's none of this.
00:03:31.680 And I go, we got to start CPR.
00:03:33.480 So I try and start that now.
00:03:48.740 Charlie wasn't there.
00:03:50.160 His eyes were fixed.
00:03:52.620 He wasn't looking at me.
00:03:54.220 He was looking past me right into eternity.
00:03:56.820 He was with Jesus already.
00:03:58.360 He was killed instantly and felt absolutely no pain.
00:04:05.420 That's what I was told later.
00:04:08.660 But of course we had to try.
00:04:11.360 And by the way, there was just nothing, nothing any of us could do about it.
00:04:16.160 We were giving him CPR, but nothing was happening.
00:04:19.060 It wasn't like if we had better first aid or we had better medical facilities or we were
00:04:23.600 faster to the hospital, we could have saved him.
00:04:25.680 We couldn't.
00:04:26.220 So if that's any comfort at all, Charlie didn't suffer.
00:04:31.540 He was gone.
00:04:32.540 He was with Jesus absent from the body present with the Lord.
00:04:38.720 That's where he was.
00:04:41.180 Now it is true.
00:04:42.260 When we got to the hospital and they started working on him right away, they did get a pulse
00:04:48.900 back.
00:04:50.100 And so Rick and I were just, everyone's praying.
00:04:53.460 We're just praying for a miracle.
00:04:55.020 We had a small sliver of hope.
00:05:00.220 And the doctor later said that we got a pulse because Charlie was a very healthy man, but
00:05:11.760 the shot was catastrophic.
00:05:13.680 So 20 or 30 minutes later, the surgeon came out and said he was dead.
00:05:19.200 Thursday, 18 September, Year of the Lord, 2025.
00:05:33.700 We had a break for the press conference at Chequers, and Dr.
00:05:38.400 Peter Navarro had to take a meeting over in the West Wing.
00:05:41.860 He's going to rejoin us approximately 1130 to go through that.
00:05:46.520 Charlie Kirk, this is why, huge announcement overnight, the President of the United States is going
00:05:52.560 to designate Antifa and affiliated organizations a major terrorist organization.
00:05:58.040 And this will now expand from what the authorities out in Utah were doing, which is a murder, just
00:06:05.500 a typical murder case to something that's going to really get to the bottom of it.
00:06:09.320 And I think you see, and you know my opinion on this ridiculous set of text messages that
00:06:14.580 they're trying to foist on us, which is absurd to do.
00:06:17.940 We'll break that more down this afternoon.
00:06:20.200 Alex Bruesewitz is going to join us.
00:06:22.540 I think he's over at Turning Point right now who joins us this afternoon.
00:06:24.800 I want to bring on, because of the, what happened at this press conference about the, there was
00:06:31.440 this massive deal, excuse me, before the press conference earlier today, some of the audience
00:06:35.200 might not have caught it, but there was a huge announcement of this transaction for nuclear
00:06:39.460 power and, you know, $400 billion and all of this technology we're going to provide.
00:06:44.580 And it all comes down to artificial intelligence.
00:06:48.060 I want to bring on the authors of a book that I think is a must read for every person in this
00:06:54.300 nation.
00:06:55.120 If anyone builds it, everyone dies.
00:06:57.680 It's about artificial intelligence and two of the individuals that have been there from
00:07:01.660 the beginning and know both the benefits and huge upside of artificial intelligence, but
00:07:06.740 also the potential downside.
00:07:08.760 And is there enough risk mitigation?
00:07:10.940 Eliezer Yudkowsky and Nate Soares, Yud and Nate, thank you for joining us here in the war room.
00:07:17.000 I appreciate it.
00:07:18.060 First question.
00:07:18.840 Yud, I think, I think it was you.
00:07:21.880 Weren't you the guy, just like a year or so ago, and Joe Allen, I was going through some
00:07:27.080 stuff with Joe Allen.
00:07:28.200 Weren't you the guy that said, hey, if this thing gets so out of control, I will be the
00:07:31.800 first to go bomb the data centers?
00:07:34.700 Was that you?
00:07:36.920 No.
00:07:38.380 I was saying that we needed an international treaty and was being plain that the international
00:07:43.280 treaty needed to be willing to be backed up by force, including on non-signatory nations.
00:07:47.600 This wasn't about individuals trying to stop a data center.
00:07:51.960 I don't think that's going to be all that effective.
00:07:54.780 You take out one, there's others.
00:07:56.940 You ban it in your own country, it moves to others.
00:07:59.320 This is an international treaty situation.
00:08:01.940 You know, it's not, it's not the kind of product which just kills the voluntary customers or
00:08:06.720 even people standing next to the voluntary customers.
00:08:09.180 This is something that endangers people on the other side of the planet.
00:08:12.240 So I was trying to be plain.
00:08:13.980 This is the sort of treaty that needs to be backed up by force.
00:08:16.840 It has to be.
00:08:19.400 So, yeah, that is a, it's kind of a brilliant concept, right?
00:08:23.860 And you're saying that even non-signatories would have to take out.
00:08:27.520 What is it?
00:08:28.520 Make your case.
00:08:29.200 What is it about this technology that potentially could be so dangerous to humanity?
00:08:34.060 Because all we hear, all we get, we get glazed every day.
00:08:36.600 We get glazed at how this is so wonderful and this is so tremendous and you got, you've
00:08:41.240 got Larry Fink, you got Steve Schwarzman, you have the head of Apple.
00:08:46.680 They're all over there cheering this on because of artificial intelligence.
00:08:50.560 You know, Oracle's now on fire because of artificial intelligence.
00:08:54.440 All we hear is the upside.
00:08:56.100 Why are you, as somebody who knows this and have been there since the beginning, so concerned
00:08:59.980 about it, you think there has to be a treaty that potentially nations of the world
00:09:04.240 would have to take it upon themselves or in unison to go take out the data centers, sir?
00:09:10.280 There's a limit to how far you can push this technology, how much benefit you can get out
00:09:15.920 of it before it stops being a tool.
00:09:18.340 We're already starting to verge on that.
00:09:20.400 We're already starting to see the artificial intelligences that are being built today
00:09:24.100 doing minor amounts of damage that their builders, the people who don't, you know,
00:09:29.960 they don't craft them.
00:09:30.800 They don't like put them together bit by bit.
00:09:32.300 They grow them.
00:09:32.880 It's like a farmer raising an AI.
00:09:36.280 And at some point, it stops being a tool.
00:09:40.040 At some point, it gets smarter than you, able to invent new technologies we don't have.
00:09:45.920 And at that point, I think that the current minor incidents of loss of control are going
00:09:49.980 to turn into catastrophic end-of-the-world type incidents.
00:09:54.400 It's not a new idea, really.
00:09:56.880 I think it's actually going to happen that way.
00:09:58.860 Explain to me.
00:10:03.460 I want to make sure the audience understands this.
00:10:05.100 You guys say AI is grown, not crafted.
00:10:07.860 It's not like things we've known before.
00:10:09.680 Nate, maybe one of you guys take that on.
00:10:11.840 Just explain to our audience why this is fundamentally different than computer programs and neural networks
00:10:16.960 and other things that have been crafted in the past.
00:10:19.820 Yeah, so when an AI cheats on a programming problem, which we're starting to see a little bit of that these days,
00:10:29.380 is this is a case where the AI has, in some sense, it's been trained to succeed at certain types of tasks
00:10:42.080 and does it in ways – what am I trying to say?
00:10:49.200 Sorry.
00:10:53.120 There's no line in the code that says, make this AI cheat on the programming problems.
00:10:58.180 When AIs do things nobody wants, nobody asks for, there's no line in the code that a programmer can go in and fix
00:11:02.700 and say, whoops, we did that wrong.
00:11:04.420 These AIs, we gather a huge amount of computing power and we gather a huge amount of data
00:11:09.260 and we shape the computing power to be better at predicting the data.
00:11:12.840 Humans understand the process that does the shaping.
00:11:15.540 Humans don't understand what comes out of the shaping process.
00:11:19.500 These things are – and the result, the result that comes out of the shaping process,
00:11:23.680 it does all sorts of things that we weren't asking for these days.
00:11:27.060 You know, we've seen them threaten reporters.
00:11:30.660 We've seen them cheat on programming problems.
00:11:32.580 These are small now, they're cute now, but they're indications that we are starting to get AIs
00:11:39.440 that have something a little bit like drives, something a little bit like goals
00:11:44.960 that we didn't intend and didn't ask for.
00:11:47.880 We keep making them smarter and they have goals we didn't want.
00:11:50.740 That's going to end really quite poorly.
00:11:55.220 Right now, the drive – and correct me if I'm wrong – the four horsemen or five horsemen of the apocalypse,
00:12:00.980 the companies that are driving this, all of them are committed to artificial general intelligence.
00:12:06.100 And correct me if I'm wrong, I believe your guy's thesis is that artificial –
00:12:10.780 since we don't really understand artificial intelligence that well,
00:12:14.420 hurtling towards superintelligence or AGI will be uncontrollable and lead to a catastrophe
00:12:21.180 and potentially the end of humankind.
00:12:23.400 Is that – is basically that your central argument?
00:12:26.100 That's right.
00:12:28.660 It's not that the AI will hate us.
00:12:30.280 It's not that it'll have malice.
00:12:32.440 It's that we're just growing these things.
00:12:35.040 There is no established science for how to make smarter-than-human machines that aren't dangerous.
00:12:40.440 If we keep pushing on making AI smarter and smarter while not having any ability to direct them to do good things,
00:12:51.000 the default outcome is just these AIs get to the point where they can invent their own technology,
00:12:56.660 to the point where they can build their own infrastructure, and then we die as a side effect.
00:13:01.680 How are you guys being received?
00:13:07.140 I hear these voices now.
00:13:08.620 We're trying to get many of them organized and get a bigger platform.
00:13:12.500 But if you look at the business press, if you look at just the general news,
00:13:16.140 if you look at what's coming out of Washington, it's all this super cheerleading.
00:13:19.780 You just saw it today at Chequers.
00:13:22.040 I mean, I don't know if you saw the earlier announcement about the nuclear power plants,
00:13:26.940 but it was all about AI, AI, AI, and Starmer just sitting there with his pom-poms out.
00:13:31.900 We've got a couple of minutes in this segment.
00:13:33.520 I'd like to hold you through it.
00:13:35.600 Why is it people like you guys who are very involved and know this and have been there since the beginning,
00:13:42.420 why are these voices not getting a bigger platform right now?
00:13:47.480 So I think a lot of people don't understand the difference between chatbots that we have today
00:13:53.120 and where the technology is going.
00:13:54.820 The explicitly stated goal of these companies, as you said, is to create smarter-than-human AI,
00:14:00.820 to create AI that can outperform humans at any mental task.
00:14:05.000 Chatbots are not what they set out to make.
00:14:07.240 They are a stepping stone.
00:14:08.940 They, you know, five years ago, the computers could not hold a conversation.
00:14:12.360 Today, they can.
00:14:13.300 I think a lot of people in Washington think that that's all AI can do and all it will be able to do
00:14:18.160 just because, you know, they haven't seen what's coming down the line.
00:14:21.980 And it can be hard to anticipate what's coming down the line.
00:14:24.140 I am hopeful that as people notice that we keep making these machines smarter,
00:14:29.800 as they notice what these companies are racing towards,
00:14:32.340 I'm hopeful that they'll realize we need to stop rushing towards this cliff edge.
00:14:37.420 Do you think, we've got about 60 seconds here, Nate,
00:14:40.640 do you think that that's being presented on Capitol Hill or anywhere in the media right now,
00:14:45.140 that what you're saying needs to be done is being done?
00:14:48.060 Not very well, but I'm very glad we're having this conversation.
00:14:52.720 And I'm hoping that the book makes a big splash because a lot of people,
00:14:56.460 you know, a lot more people are worried about these dangers than you might think.
00:15:00.520 And we've spoken to some people who say they're worried and say they can't talk about it
00:15:04.680 because they would sound a little too crazy.
00:15:06.900 And then we see polls saying that lots of the population is worried.
00:15:09.820 So, I think this is a message that has its moment to break out.
00:15:14.600 And I'm hoping the world wakes up to this danger.
00:15:19.100 Guys, can you hang on for one second?
00:15:21.020 We'll just take a short commercial break.
00:15:22.500 Nate and Yud are with us.
00:15:23.880 Their book, If Anyone Builds It, Everyone Dies.
00:15:26.700 It is an absolute must-read.
00:15:29.020 You have two people that are experts and would have benefited economically,
00:15:33.080 as some of these folks are.
00:15:34.200 There is something to unite this country, and you can unite this country around
00:15:39.280 questions about the oligarchs and big tech and exactly what's going down here.
00:15:46.120 Who benefits from it?
00:15:47.420 Who's driving it?
00:15:48.260 And does it have enough regulation?
00:15:50.220 Does it have enough control?
00:15:51.940 And is it putting the country's interest and human interest
00:15:55.300 before corporate interest and the making of money?
00:16:00.680 Short commercial break.
00:16:02.300 Back with Nate and Yud in a moment.
00:16:04.200 I got American Bae in America's heart.
00:16:11.280 This July, there is a global summit of BRICS nations in Rio de Janeiro.
00:16:16.260 The bloc of emerging superpowers, including China, Russia, India, and Persia,
00:16:21.900 are meeting with the goal of displacing the United States dollar as the global currency.
00:16:27.180 They're calling this the Rio Reset.
00:16:30.120 As BRICS nations push forward with their plans, global demand for U.S. dollars will decrease,
00:16:35.200 bringing down the value of the dollar in your savings.
00:16:38.460 While this transition won't happen overnight, trust me, it's going to start in Rio.
00:16:44.320 The Rio Reset in July marks a pivotal moment when BRICS objectives move decisively from a theoretical possibility towards an inevitable reality.
00:16:55.160 Learn if diversifying your savings into gold is right for you.
00:17:00.020 Birch Gold Group can help you move your hard-earned savings into a tax-sheltered IRA and precious metals.
00:17:06.600 Claim your free info kit on gold by texting my name, Bannon, that's B-A-N-N-O-N, to 989898.
00:17:13.600 With an A-plus rating with the Better Business Bureau and tens of thousands of happy customers,
00:17:19.240 let Birch Gold arm you with a free, no-obligation info kit on owning gold before July.
00:17:25.060 And the Rio Reset.
00:17:27.600 Text Bannon, B-A-N-N-O-N, to 989898.
00:17:32.020 Do it today.
00:17:33.260 That's the Rio Reset.
00:17:35.080 Text Bannon at 989898 and do it today.
00:17:38.980 Because Voice Family, are you on Getter yet?
00:17:41.260 No.
00:17:41.760 What are you waiting for?
00:17:42.820 It's free.
00:17:43.720 It's uncensored.
00:17:44.760 And it's where all the biggest voices in conservative media are speaking out.
00:17:49.300 Download the Getter app right now.
00:17:51.140 It's totally free.
00:17:51.840 It's where I put up exclusively all of my content.
00:17:54.700 24 hours a day.
00:17:55.620 You want to know what Steve Bannon's thinking?
00:17:57.360 Go to Getter.
00:17:57.980 That's right.
00:17:58.760 You can follow all of your favorites.
00:18:00.540 Steve Bannon, Charlie Kirk, Jack Posobiec.
00:18:02.820 And so many more.
00:18:04.420 Download the Getter app now.
00:18:05.780 Sign up for free and be part of the movement.
00:18:09.940 So guys, Yud and Nate join us.
00:18:12.360 The book, If Anyone Builds It, Everyone Dies.
00:18:14.820 It is a warning and a deeply thought through warning to humanity and to the citizens of
00:18:21.240 this republic.
00:18:22.060 Guys, we've been accused a lot on the war room of being Luddites, that we just don't
00:18:26.640 like technology.
00:18:27.620 We don't like the oligarchs.
00:18:28.700 We don't like the tech bros.
00:18:29.700 We're populist nationalists.
00:18:30.760 We just don't like them, right?
00:18:32.220 And so we want to stop them.
00:18:33.680 But I keep telling people, I say, hey, in this regard, all I'm doing is putting out the
00:18:37.860 information of some of the smartest guys that were there at the beginning of the building
00:18:43.160 of artificial intelligence.
00:18:44.240 Now, can folks make the argument against you guys that you just have become proto-Luddites
00:18:50.000 because of your journey?
00:18:51.380 You know, there are many technologies that I am quite bullish on.
00:18:58.480 I personally think America should be building more nuclear power.
00:19:01.840 I personally think we should be building supersonic jets.
00:19:04.560 It's different when a technology risks the lives of everybody on the planet, and this
00:19:11.400 is not a very controversial point these days.
00:19:13.080 You have the Nobel laureate, prize-winning founder of the field saying he thinks this
00:19:18.120 is very dangerous.
00:19:19.500 You have people inside these labs saying, yes, it's very dangerous, but we're going to
00:19:23.220 rush ahead anyway, even at the risk of the entire world.
00:19:26.120 This is a crazy situation.
00:19:27.300 If a bridge was going to collapse with very high probability, we wouldn't say, well, we
00:19:35.120 need to leave it up for the sake of technology.
00:19:38.020 You know, when NASA launches rockets into space, it accepts a 1 in 270 risk of that rocket
00:19:45.920 blowing up, and those are volunteer astronauts on the crew.
00:19:49.200 That's the sort of risk they're willing to accept.
00:19:51.960 With AI, we are dealing with much, much higher dangers than that.
00:19:55.280 We're dealing with a technology that just kills you.
00:19:58.320 We're trying to build smarter-than-human machines while having no idea of how to point them in
00:20:04.940 a good direction.
00:20:06.340 To imagine that this is Luddites, we're not saying, oh, this is going to have some bad
00:20:14.420 effect on the everyday worker in losing a job.
00:20:19.440 You know, AIs may have those effects, but the chatbots may have those effects, but a
00:20:26.180 superintelligence, it kills everybody.
00:20:29.660 Then you'll have no jobs.
00:20:30.920 You'll also have full employment.
00:20:32.020 But that's just the default course this technology takes.
00:20:35.340 It's not that humanity cannot progress technologically.
00:20:38.560 It's that we shouldn't race over a cliff edge in the name of technology.
00:20:42.160 We need to find some saner way forward towards the higher tech future.
00:20:46.580 Okay, so yeah, to Nate's point, you know, we had this thing in England today where they're
00:20:53.900 full-on, all nuclear powers because we need it.
00:20:56.980 You know, all the climate change guys, forget that.
00:20:59.180 We need this because we've got to have artificial intelligence.
00:21:01.440 You have this concept of a treaty, like maybe the Geneva Treaty after the wars, to have nations
00:21:07.560 of the world be prepared to take action here.
00:21:10.440 However, all you hear, we're the leader of the anti-CCP movement in this country, have
00:21:17.320 been for years.
00:21:18.000 I'm sanctioned fully by the Chinese Communist Party.
00:21:20.380 I can't have any association with any Chinese company or people because we represent Lao
00:21:24.640 Baixing, the little guy in China.
00:21:26.840 What they're holding up to us is that if we follow Yud and Nate and what they want to
00:21:31.020 do in the war room, that the CCP, particularly after DeepSeek, what they call the Sputnik moment,
00:21:36.060 that the Chinese Communist Party is going to build this technology, and then we're going
00:21:39.820 to have the most evil dictatorship in the world, have control of the most evil technology ever
00:21:44.960 created, and we're going to be at their beck and call.
00:21:48.200 Your response, sir?
00:21:50.860 We have not advocated that the United States or the United Kingdom, for that matter, try
00:21:55.960 to unilaterally relinquish artificial intelligence.
00:21:59.820 Our position is that you need an international treaty.
00:22:03.200 You need verification.
00:22:06.500 The Chinese government has at least overtly seemed on some occasions to present a posture
00:22:12.320 of being willing to acknowledge that AI is a worldwide matter and that there might be
00:22:18.400 cause for coordination among nations to prevent the Earth itself from being wiped out.
00:22:24.160 And this is not a new situation in politics.
00:22:27.160 We have had countries that hated each other work together to prevent global thermonuclear war,
00:22:32.460 which is not in the interest of any country.
00:22:34.820 You look back at history, in the 1950s, a lot of people thought humanity wasn't going to
00:22:38.820 make it, or at least civilization wasn't going to make it, that we were going to have a nuclear
00:22:42.760 war.
00:22:43.320 And that wasn't them enjoying being pessimistic.
00:22:45.100 There was this immense historical momentum.
00:22:46.980 They'd seen World War I, World War II.
00:22:49.900 They thought that what was going to happen is that every country was going to build its own
00:22:52.780 nuclear fleet, and eventually there would be a spark, and the world would go up in flames.
00:22:56.160 And that didn't happen.
00:22:58.200 And the reason it didn't happen is because of some pretty serious efforts put forth by
00:23:01.860 countries that in many cases hated each other's guts to at least work together on not all
00:23:06.680 dying in a fire.
00:23:07.960 And that's what we need to reproduce today.
00:23:12.900 Nate, before I let you guys bounce, can you explain the title of the book?
00:23:16.920 It's pretty grabbing, but it's scary.
00:23:19.000 If anyone builds it, everyone dies.
00:23:21.380 What do you guys mean by that?
00:23:22.540 I mean, what we mean is that if humanity builds smarter than human AI using anything
00:23:29.080 remotely like the current technology and anything remotely like the current lack of
00:23:32.540 understanding, then every human being will die.
00:23:35.760 Doesn't matter if you're in a bunker.
00:23:37.620 Superintelligence can transform the world more than that.
00:23:40.300 And this title also goes back to the point about China.
00:23:43.240 It's not that great dictators would be able to control this great power if they made it.
00:23:48.580 If you make a superintelligence, you don't have that superintelligence.
00:23:53.740 You've just created an entity that has the planet.
00:23:56.820 Artificial intelligence would be able to radically transform the world.
00:24:00.260 And if we don't know how to make it do good things, we shouldn't do it at all.
00:24:05.220 And we are nowhere near close to being able to point superintelligence in a good direction.
00:24:08.940 So humanity needs to back off from this challenge.
00:24:14.180 Joe Allen, any comments?
00:24:15.860 All I can say is the forces of capital, the forces of politics, human avarice and greed,
00:24:23.640 and also the need for power makes this, you know, we fight big pharma.
00:24:29.220 The fights we have every day are massive against long odds.
00:24:34.300 We've won more than we've lost.
00:24:35.640 But I tell people, this is the hardest one I've ever seen because of what's happened.
00:24:39.640 And I said at that time at Davos, when we had the thing at Davos, when ChatGPT came out,
00:24:44.700 I said, you wait till venture capital and Wall Street gets involved.
00:24:47.640 Any thoughts, Joe Allen?
00:24:51.020 You know, Steve, it would be a very different thing if Eliezer Yudkowsky and Nate Soares
00:24:57.640 were making these accusations or, at the very least, issuing these warnings in a vacuum.
00:25:04.860 If the tech companies, for instance, were just simply saying, we're building tools,
00:25:10.520 these guys are accusing us of building gods, they're crazy, it'd be a very different situation.
00:25:16.420 But that's not the situation.
00:25:17.760 Every one of them, even the most moderate, like Demis Hassabis, but certainly Sam Altman,
00:25:24.360 Elon Musk, even Dario Amodei, they all are talking about the creation of artificial general
00:25:31.380 and artificial superintelligence.
00:25:34.000 And so when we first started covering, when you first brought me on four and a half years ago,
00:25:38.720 we hit a lot of the points that Yudkowsky was making.
00:25:42.700 We would show videos and try to explain to the audience.
00:25:45.000 And they, by and large, didn't really grasp the reality of it
00:25:49.700 because it wasn't as much of a reality four and a half years ago.
00:25:53.620 In just that short amount of time, we've seen the creation of systems
00:25:59.580 that can competently produce computer code.
00:26:03.160 We saw at the very beginning, GPT was not supposed to be online.
00:26:07.640 Very quickly, that ended.
00:26:09.200 Basically, all the warnings that Yudkowsky gave early on when AI was hitting the headlines,
00:26:16.820 those are coming to pass.
00:26:18.680 My question for Yudkowsky would be this, and for Soares.
00:26:24.180 You live in and among the most techno-saturated culture in the country, San Francisco.
00:26:30.800 Can you give us some insight into the mentality of the people who are willing to barrel ahead,
00:26:40.040 no matter what, and create these systems, even if that means the end of the human race?
00:26:45.860 So, some of them have just come out and said, you know,
00:26:52.920 I had a choice between being a bystander and being a participant,
00:26:56.280 and I preferred being a participant.
00:26:58.180 That is, in some sense, enough to explain why these people are barreling ahead,
00:27:03.200 although I think in real life there's a bunch of other explanations, too,
00:27:06.220 like the people who started these companies back in 2015
00:27:09.520 are the sort of people who are able to convince themselves it would be okay
00:27:13.120 to gamble with the whole civilization like this.
00:27:16.020 You know, we've seen comments like back in 2015, I believe,
00:27:20.260 Sam Altman said something like,
00:27:22.360 AI might kill us all, but there'll be good companies along the way.
00:27:25.800 Or I think he maybe even said,
00:27:27.400 artificial general intelligence will probably kill us all,
00:27:29.960 but there will be great companies made along the way.
00:27:31.880 I don't know the exact quote, but that mentality,
00:27:36.340 it's not someone taking seriously what they're doing.
00:27:38.980 It's not someone treating with gravity what they're doing.
00:27:41.760 This wouldn't be an issue if they couldn't also make greater and greater intelligences.
00:27:47.720 But in this world where we're just growing intelligences,
00:27:50.160 where people who don't know what they're doing
00:27:51.400 and are the most optimistic people that were foolish enough to start the companies
00:27:54.900 can just grow these AIs to be smarter and smarter,
00:27:57.520 that doesn't lead anywhere good.
00:28:02.720 Yud, any comments?
00:28:04.480 You know, when Jeffrey Hinton, now Nobel laureate Jeffrey Hinton,
00:28:13.940 sort of woke up and noticed that it was starting to be real,
00:28:17.780 he quit Google and started speaking out about these issues more openly.
00:28:22.080 Now, who knows how much money he was turning down by doing that,
00:28:24.880 but that's what he did.
00:28:26.680 And, you know, that's one kind of person that you have on the playing field.
00:28:29.940 And then you've also got the, you know, the people who were selected and filtered
00:28:34.300 for being the sort of people who would, you know,
00:28:37.540 back when OpenAI started, go over to Elon Musk and say, you know,
00:28:41.700 you know how we can solve the problem of these things we can't control?
00:28:45.440 We can, like, put them in everyone's house.
00:28:47.280 We can give everyone their own copy.
00:28:48.700 And this was never valid reasoning.
00:28:51.460 This was, you know, this was always kind of moon logic,
00:28:56.020 but they sure got Elon's money and then, you know, took it and ran off.
00:28:59.620 And that's just the kind of people we're dealing with here.
00:29:05.500 Guys, can you hang on for one second?
00:29:06.940 I just want to hold you through the break because I want to give people access
00:29:09.600 to how to get this book, where to get it, your writing, social media, all of it.
00:29:14.560 Yud and Nate, heroes.
00:29:18.700 Very hard what they're doing.
00:29:21.440 Very, very, very, very hard.
00:29:25.120 Short commercial break.
00:29:26.100 We're back in a moment.
00:29:38.740 Hey, we're human.
00:29:40.180 All too human.
00:29:42.160 I don't always eat healthy.
00:29:43.780 You don't always eat healthy.
00:29:45.360 That's why doctors created Field of Greens,
00:29:47.220 a delicious glass of Field of Greens daily is like nutritional armor for your body.
00:29:53.340 Each fruit and each vegetable was doctor selected for a specific health benefit.
00:29:59.380 There's a heart health group, lungs and kidney groups, metabolism, even healthy weight.
00:30:04.540 I love the energy boost I get with Field of Greens.
00:30:07.760 But most of all, I love the confidence that even if I have a cheat day or, wait for it,
00:30:13.660 a burger, I can enjoy it guilt-free because of Field of Greens.
00:30:17.040 It's the nutrition my body needs daily.
00:30:20.140 And only Field of Greens makes you this better health promise.
00:30:24.420 Your doctor will notice your improved health or your money back.
00:30:27.460 Let me repeat that.
00:30:28.540 Your doctor will notice your improved health or your money back.
00:30:32.040 Let me get you started with my special discount.
00:30:35.040 I got you 20% off your first order.
00:30:37.960 Just use code Bannon, B-A-N-N-O-N, at fieldofgreens.com.
00:30:42.400 That's code Bannon at fieldofgreens.com.
00:30:45.400 20% off.
00:30:47.080 And if your doctor doesn't notice how healthy you look and feel,
00:30:51.160 you get a full money back guarantee.
00:30:53.880 Fieldofgreens.com.
00:30:55.860 Code Bannon.
00:30:56.940 Do it today.
00:30:57.980 Here's your host, Stephen K.
00:31:00.980 Bannon.
00:31:04.340 Nate and Yud.
00:31:05.300 By the way, just before I let you guys go, give your coordinates and tell people how to buy this amazing book,
00:31:10.580 purchase of this book, which we're going to break down and spend more time on, folks, in the days ahead.
00:31:15.880 There's a movie called Mountainhead, which basically has actors playing Elon Musk, Steve Jobs, I think Zuckerberg, and Altman.
00:31:26.060 And it's actually kind of dark to begin with, but it turns very dark when one becomes – they identify one as a decelerationist.
00:31:34.000 Are you guys – Nate, you first.
00:31:35.480 Are you a decelerationist about this?
00:31:37.360 I would decelerate AI.
00:31:40.540 I would decelerate any technology that could wipe us all out and prevent us from learning from mistakes.
00:31:45.760 Every other technology, I think we need to go full steam ahead and are sometimes hobbling ourselves.
00:31:50.940 But AI in particular, any technology that kills everybody and leaves no survivors, you can't rush ahead on that.
00:31:59.100 Yud, are you a decelerationist?
00:32:02.100 I've got libertarian sympathies.
00:32:04.060 If a product only kills the voluntary customers and maybe the person who sold it, that's kind of between them.
00:32:11.340 I might have sympathy, but not to the point where I try to take over their lives about it.
00:32:15.320 If a product kills people standing next to the customer, it's a regional matter.
00:32:19.620 Different cities, different states can make different rules about it.
00:32:22.500 If a product kills people on the other side of the planet, that's everybody's problem.
00:32:26.340 And, you know, yeah, you don't have to agree with me to want humanity to not die here about this part.
00:32:35.760 But I would happen to, you know, go full steam ahead on nuclear power.
00:32:39.920 Yeah, it's just the special case here.
00:32:43.740 Artificial intelligence, you know, gain-of-function research on viruses might be another thing.
00:32:49.600 But, you know, it does actually differ by the technology.
00:32:53.720 You don't have – there's not this one switch that's set to accel or decel.
00:32:59.700 Yud, by the way, what's your social media?
00:33:02.120 I might add we were the first show in January of 2020 to say that what the University of North Carolina was doing on gain-of-function was a danger to humanity.
00:33:12.180 And we're laughed at by the mainstream media as being conspiracy theorists.
00:33:16.240 Yud, what is your – what are your coordinates?
00:33:18.080 What's your social media?
00:33:18.860 How do people follow you, your thinking, your writing?
00:33:22.800 ESYudkowsky on Twitter.
00:33:27.640 Thank you, sir.
00:33:28.660 Nate, where do people go to get you?
00:33:31.400 Yeah, I'm S-O-8-R-E-S on Twitter.
00:33:36.800 Thank you, guys.
00:33:37.680 Actually, we're very honored to have you on.
00:33:39.180 Look forward to having you back and break down this book even more.
00:33:41.320 Everybody ought to go get it.
00:33:43.020 If anyone builds it, everyone dies.
00:33:46.120 A landmark book everyone should get and read.
00:33:48.720 Thank you, guys, for joining us in the war room.
00:33:54.220 Joe Allen is going to stick with us.
00:33:55.520 I'm going to go back to Joe in a moment.
00:33:57.520 Shemane, you're an ambassador.
00:34:00.000 First off, you're one of the ambassadors of Turning Point.
00:34:02.480 Give me your thoughts of what's evolved this week.
00:34:05.520 And, you know, we've got Charlie's funeral or celebration of life on Sunday.
00:34:09.440 Give me your – you knew him extremely well, and you do the faith show here on Real America's Voice.
00:34:17.080 But you're an ambassador at Turning Point.
00:34:18.840 Your thoughts about this tragic week?
00:34:20.620 It's been horrific for so many people.
00:34:24.620 It's been a turning point for not just America, but for the world.
00:34:29.160 And we saw Charlie as the last good guy.
00:34:33.820 And for this to happen to him is just devastating to so many people.
00:34:38.320 And so many people are wondering, what do we do?
00:34:40.260 What do we do with all this anger and sadness?
00:34:43.540 And I say silence is not an option at this point.
00:34:47.360 We must move forward.
00:34:49.360 We must carry that torch that Charlie gave us.
00:34:52.860 So that's what I'm trying to do with my Faith and Freedom show right here on Real America's Voice.
00:34:57.480 And I appreciate that.
00:34:58.860 I think I am the oldest Turning Point ambassador.
00:35:02.080 I have to be.
00:35:02.860 I don't know anybody older.
00:35:03.880 But hang on.
00:35:07.040 That's just biological – that's chronological age.
00:35:10.520 It's certainly not biological age.
00:35:11.900 You've got more energy.
00:35:12.920 You've got more energy than 20 young people.
00:35:15.980 How do you do it?
00:35:17.500 I mean –
00:35:17.840 I see where you're going with this.
00:35:19.000 You've got the show.
00:35:19.560 You're an ambassador.
00:35:21.260 Hey, I just – I'm just calling it – I'm just calling balls and strikes here.
00:35:26.580 Well, it's true.
00:35:27.460 Tell me about it.
00:35:28.020 Tell me what keeps you so young.
00:35:30.620 Well, it's true.
00:35:31.240 Besides your husband, which I know is young.
00:35:32.880 He's young at heart, or it's like parenting a young child.
00:35:37.340 Besides Ted, what keeps you young?
00:35:40.660 Steve, that's a whole other podcast, okay, about Ted and trying to stay young.
00:35:47.560 But I think you're right.
00:35:48.800 There is a study recently about epigenetics, which is the science that shows your DNA is not your destiny.
00:35:58.020 None of us eat right, so we all take supplements, right?
00:36:01.560 Most of us do.
00:36:02.340 But there are so many different fruit and vegetable supplements on the market, and if you study their ingredients, which I have, I'm a label reader, it's just common produce with limited nutritional value.
00:36:13.840 There's a product called Field of Greens, and it's different.
00:36:16.120 And they wanted to prove that it was different by doing a university study where each fruit and vegetable in Field of Greens is medically selected for health benefits.
00:36:25.660 There's heart health, lungs, kidney, liver, healthy metabolism, and healthy weight.
00:36:31.780 And in this study, Steve, they wanted to see how diet, exercise, and lifestyle changed your real age.
00:36:40.960 And this is fascinating to me, but some of the participants, they ate their normal diets, including fast food, they didn't exercise anymore, and they didn't stop drinking.
00:36:52.800 All they did is add Field of Greens to their daily routine, and the results were remarkable.
00:37:01.220 60% of participants showed a measurable reduction in their biological aging markers after just 30 days.
00:37:10.100 One scoop of Field of Greens slows the body's aging process at the cellular level.
00:37:15.280 And I think this was what helped me, because I've worked out all my life.
00:37:21.360 I'll be honest, I don't eat right all the time.
00:37:24.340 So just by taking one scoop of Field of Greens, I can see that aging slow down.
00:37:33.860 Fieldofgreens.com, promo code Bannon, get 20% off.
00:37:36.880 It'll ship out today.
00:37:38.280 We hit it every day here at the War Room.
00:37:40.340 Not us that have all the – about this Texas A&M study, but also I get an energy boost, Shemane, just every day.
00:37:48.240 So that's where we take it.
00:37:49.700 I want to thank you for all you do, and particularly being an ambassador and helping the folks over at Turning Point, particularly in this very difficult phase for the movement, for the company, for Erica, the kids, everybody.
00:38:02.560 So I really want to thank you for joining us today.
00:38:04.740 Well, we have to.
00:38:06.160 It's an Esther 4:14 moment.
00:38:08.000 If we remain silent, relief and deliverance is going to come from someplace else.
00:38:14.300 We were born, Steve, for such a time as this.
00:38:19.640 Shemane Nugent, thank you very much.
00:38:21.960 Wisdom and energy all in one.
00:38:24.340 Thank you, ma'am.
00:38:25.180 Appreciate you.
00:38:25.760 God bless.
00:38:28.500 Dr. Navarro, you were our co-host.
00:38:32.100 You were the contributor.
00:38:34.260 You were the president.
00:38:35.240 You've been with the president now for 10 years.
00:38:36.780 You were his economic advisor.
00:38:38.000 But tell me about – you wrote this piece that's pretty moving, and I think it's so tied to your book about you went to prison so that we don't have to.
00:38:47.620 Talk to me about Charlie Kirk.
00:38:51.460 Steve, I really want people to understand the legacy of Charlie Kirk historically.
00:38:58.780 He could have been president.
00:39:00.880 He certainly would have been a governor, but already at 31 years old, he's the greatest political organizer in the last 50 years.
00:39:11.100 And if you compare him to the two who were there at the top before Charlie Kirk, Ralph Reed on the right, David Axelrod on the left.
00:39:20.020 What Ralph Reed did with the Christian coalition is mobilize the Christian right to get out and actually vote.
00:39:27.500 He was responsible for the Gingrich revolution in 1994 as well as the Bush win in 2000.
00:39:34.960 And then Axelrod on the left, he was able to mobilize a natural Democrat constituency, blacks, Hispanics, and young people, used micro-targeting, some advanced kind of techniques at the time, and basically won the race for Obama in 2008.
00:39:53.700 The reason why Charlie is head and shoulders above each one of them is he had a much heavier lift, Steve.
00:40:01.060 To mobilize the youth in support of MAGA and Trump and MAGA candidates in Congress, he had to first bring them over to our side.
00:40:13.100 And that was a heavy lift.
00:40:15.100 And he did it.
00:40:16.000 When I first met him, Charlie, back in 2016, young kid, thinking he was going to go out there and change the viewpoint of the youth of America, I thought he was Don Quixote.
00:40:28.240 I'll be honest with you, Steve.
00:40:29.920 He proved me wrong.
00:40:31.220 He proved the world wrong.
00:40:33.220 And people need to understand father, husband, patriot, just a wonderful human being that's here.
00:40:44.820 But in terms of pure historical significance, he will go down in history as the greatest political organizer in the last 50 years.
00:40:53.380 And I don't think anybody's ever going to do, again, what he did because it's relatively easy to mobilize.
00:41:02.000 It's very difficult to persuade people over to your side and then mobilize, Steve.
00:41:07.760 Hang on for a second.
00:41:11.660 I want to, because you got your PhD at Harvard, then you went back, you taught in the university system.
00:41:17.340 When I first saw Charlie, I think Breitbart's the first guy to give him a paid gig.
00:41:21.320 But some of the people around Breitbart were what financed him at the very beginning when he was going after student governments.
00:41:26.960 And I think many people who thought Charlie was just a ball of fire thought it was the longest odds.
00:41:33.840 And you thought so, too, because the universities, as we know now, it is based around this kind of radical philosophy.
00:41:42.380 And the kids are formed all the way from kindergarten all the way up.
00:41:45.500 So that is, to me, the greatness of Charlie Kirk, that he was able to go in and just do this when so many people said, hey, look, this guy's great.
00:41:56.240 He's fantastic.
00:41:56.920 But this is Don Quixote.
00:41:58.240 You're tilting at windmills here.
00:41:59.840 It just can't happen.
00:42:01.260 And you knew it better than anybody because you were inside the belly of the beast.
00:42:06.700 Brutal.
00:42:07.260 Well, I spent 25 years at the University of California in Irvine.
00:42:12.260 And if there's a system that ever was woke, that certainly is it.
00:42:19.240 But what Charlie understood, he didn't start at Harvard and Cornell.
00:42:23.680 He understood that most of the universities in this country are in flyover country.
00:42:29.060 And he just rolled his sleeves up.
00:42:31.220 He was tireless.
00:42:32.060 He went out there and Socratically, I mean, when I taught in the classroom, I was a big fan of the Socratic method.
00:42:39.120 You can't tell people things.
00:42:41.800 You've got to have them come to their own conclusions.
00:42:45.820 And that's how Charlie was able to bring people home to MAGA.
00:42:50.940 And very keen intellect.
00:42:53.620 I mean, fast forward, it's like when I got out of prison, you know, the day I got out of prison,
00:43:00.860 July 17th, I went to the Republican National Committee, gave the speech.
00:43:06.820 I went to prison so you won't have to.
00:43:08.920 The title of the book is actually a tagline from the speech.
00:43:12.180 It means like a wake up call.
00:43:13.940 But I mentioned this in the context of Charlie because I didn't even know this.
00:43:18.960 But two days earlier, I saw a clip was shown.
00:43:22.620 I was on the set shortly after he got killed.
00:43:26.640 And he was giving a speech on my birthday, July 15th, two days before me.
00:43:31.740 And he said, I visited college campuses so you won't have to.
00:43:37.300 And it, I mean, the way I just, somehow it struck a warm chord in me.
00:43:45.000 And I felt like we were fellow travelers.
00:43:47.640 And, yeah, I'm on the campaign trail getting out of prison with the boss.
00:43:51.860 My fiancee Pixie in the book, I call her Bonnie.
00:43:55.820 You know her well, Steve.
00:43:57.100 She's been on your show.
00:43:59.920 We'd see Charlie everywhere, right?
00:44:01.760 Everywhere we'd go.
00:44:02.760 We went to Georgia and North Carolina.
00:44:04.480 We were in Pennsylvania.
00:44:05.240 He was always there.
00:44:07.580 And then during the transition, he was essentially Sergio Gor's co-pilot there, putting all the personnel together in the administration.
00:44:20.040 And, look, the boss, Charlie was like a son to Donald John Trump, as well as a key advisor, one of his most trusted advisors.
00:44:32.660 So, he'll be missed.
00:44:35.260 I'm going to try to hitch a ride out on the Korean Air Force One on Sunday and be there.
00:44:40.300 I'm sure you'll be covering this.
00:44:40.980 Hang on one second.
00:44:42.220 Yeah.
00:44:43.300 Hang on a second.
00:44:44.040 We're doing wall-to-wall coverage of it.
00:44:45.540 I want to hold you through the break because I want to talk about the book for a second.
00:44:49.040 Back in a moment.
00:44:51.060 We will fight till they're all gone.
00:44:53.600 We rejoice when there's no more.
00:44:55.460 Let's take down the CCP.
00:44:58.820 If you're a homeowner, you need to listen to this.
00:45:00.960 In today's AI and cyber world, scammers are stealing home titles with more ease than ever, and your equity is the target.
00:45:10.500 Here's how it works.
00:45:11.660 Criminals forge your signature on one document.
00:45:14.240 Use a fake notary stamp.
00:45:16.060 Pay a small fee with your county, and boom, your home title has been transferred out of your name.
00:45:22.580 Then they take out loans using your equity or even sell your property.
00:45:26.640 You won't even know it's happened until you get a collection or foreclosure notice.
00:45:33.860 So let me ask you, when was the last time you personally checked your home title?
00:45:39.660 If you're like me, the answer is never, and that's exactly what scammers are counting on.
00:45:45.520 That's why I trust Home Title Lock.
00:45:48.060 Use promo code Steve at HomeTitleLock.com to make sure your title is still in your name.
00:45:55.620 You'll also get a free title history report plus a free 14-day trial of their million-dollar triple lock protection.
00:46:02.700 That's 24-7 monitoring of your title.
00:46:05.580 Urgent alerts to any changes, and if fraud should happen, they'll spend up to $1 million to fix it.
00:46:12.260 Go to HomeTitleLock.com now.
00:46:15.060 Use promo code Steve.
00:46:16.540 That's HomeTitleLock.com, promo code Steve.
00:46:19.360 Do it today.
00:46:21.020 Here's your host, Stephen K. Bannon.
00:46:26.600 Okay.
00:46:28.320 By the way, gold, retract a little bit today.
00:46:31.160 It's not the price of gold.
00:46:32.200 It's the process of how you get to the value of gold.
00:46:34.560 Make sure you take your phone out right now and text Bannon, B-A-N-N-O-N, 989898.
00:46:40.640 To get the ultimate guide, which happens to be free, investing in gold and precious metals in the age of Trump.
00:46:47.780 So go check it out.
00:46:48.520 We had a rate cut last night, only 25 basis points.
00:46:52.340 President Trump wants more.
00:46:53.920 Stephen Miran, the Council of Economic Advisers chair, is now, I guess, the interim governor.
00:46:59.760 He voted against it, wanted a 50 basis point cut.
00:47:03.140 Go find out why gold has been a hedge for times of financial turbulence in mankind's history.
00:47:07.660 Joe Allen, I know you've got to bolt.
00:47:11.220 Just give your coordinates.
00:47:12.840 You can't be on tonight because you're going to be at one of the conferences.
00:47:15.060 I'm going to get you back on hopefully tomorrow.
00:47:17.160 We had a historic interview today on the book, If Anyone Builds It, Everyone Dies.
00:47:22.540 Very uplifting.
00:47:24.160 Where do people go to get your writing, sir?
00:47:25.720 If the audience wants to hear the in-depth interview I did with Nate Soares a couple of weeks ago,
00:47:33.140 it's right at the top of my social media, at J-O-E-B-O-T-X-Y-Z.
00:47:39.240 Also an article about the hearing two days ago with Josh Hawley and the parents of children
00:47:45.120 who were lured into suicide by AI.
00:47:48.960 That's also up at the top of my social media, at J-O-E-B-O-T-X-Y-Z, or joebot.xyz.
00:47:55.360 We are backed up on a lot of stuff because of the Charlie Kirk situation was obviously a priority,
00:48:00.720 including designating Antifa a terrorist group so we can get to the bottom of all of it
00:48:05.360 and not just have this dealt with by Utah officials as a single murder.
00:48:09.700 It's much deeper than that, the assassination of Charlie Kirk.
00:48:13.620 Joe Allen, thank you so much.
00:48:14.840 Josh Hawley was supposed to be here today, but of course we had the press conference.
00:48:18.160 And right there on the screen you see the President of the United States getting ready to leave
00:48:21.920 to go to the airfield to take Air Force One back.
00:48:26.240 He'll arrive later tonight.
00:48:27.360 Of course we'll be covering all of that.
00:48:29.140 Peter Navarro, one of President Trump's closest advisors,
00:48:32.100 and I think arguably the longest advisor.
00:48:34.920 I think he's the only one that's been there from the very, very, very beginning that's still there.
00:48:39.080 Why should, in a world of all this information and everything going on,
00:48:42.640 as big a hero as you are to this movement, as highly respected as you are by President Trump,
00:48:49.640 because you're kind of the architect with him of the reorganization of the world's trading patterns,
00:48:55.540 why should people buy a book about you and your days in prison, sir?
00:49:01.280 It's not about me, Steve.
00:49:03.040 And I would ask the posse to go right now to Amazon.
00:49:05.920 I went to prison so you won't have to.
00:49:08.720 The book is really the best analysis of how the left is going after all of this.
00:49:16.940 If they can come for me, if they can come for Steve Bannon, put him in prison,
00:49:20.900 if they can try to put Trump in prison now, they shot Charlie Kirk.
00:49:25.260 They can do this to you.
00:49:27.360 And I'll take you into prison, and I went to prison so you won't have to.
00:49:31.720 But the broad scope here, Steve, is really an analysis about the asymmetry,
00:49:37.680 the disturbing asymmetry between how the left is waging war on this.
00:49:43.520 You mentioned everybody I serve with, Steve, including you, has been a target of the left.
00:49:52.220 At a minimum, they've spent millions of dollars in legal fees,
00:49:55.720 whether it's Mike Flynn or America's Mayor Rudy Giuliani.
00:50:01.620 They take the bar cards of Jeff Clark, John Eastman.
00:50:05.000 And on the other end, Steve, you and I went to prison.
00:50:07.960 And everybody who put us there, every single person was a Democrat except Liz Cheney.
00:50:13.720 And that's the exception that proves the rule.
00:50:15.580 I mean, think about that.
00:50:16.440 How can the Democrats seize power and use that to put us in prison and call us fascists?
00:50:24.560 How dare they, Steve?
00:50:26.200 So I went to prison so you won't have to.
00:50:30.620 It's a story about how we must wake up to what's happening.
00:50:35.180 But it's all a lawfare story.
00:50:37.460 But look, if you want to find out what it's like to go into prison for a misdemeanor
00:50:42.260 and wind up spending four months with 200 felons, this is the book.
00:50:49.240 And you know, Bonnie, my fiancé, it's also a story about how we were able to cope and deal with that.
00:50:59.020 The message there is simply that she did the time with me.
00:51:02.200 And that's what happens when people are unjustly targeted by the left.
00:51:08.440 It's not just you.
00:51:10.060 We can take it, Steve.
00:51:11.120 You and I are soldiers.
00:51:12.040 We can take it.
00:51:12.880 But when they go after our families, that's where you draw the line.
00:51:16.720 So it's a big book, a big story.
00:51:19.240 Go ahead.
00:51:21.540 If they steal the election in 2028, trust me, this audience, they're going to be coming for you.
00:51:26.020 You see now, we're in a different place than we were even back then.
00:51:29.060 This is getting more and more intense every day.
00:51:31.580 That's why Navarro's book has got to be written because it's actually for you and about you.
00:51:36.860 It ain't about Navarro, not about me, not about President Trump.
00:51:40.160 There are different characters in this story.
00:51:42.320 But the book is about you.
00:51:43.780 And understand, like I said, there's nothing, there's no compromise here.
00:51:48.040 There's no unity here.
00:51:49.000 One side's going to win and one side's going to lose.
00:51:50.800 And if they steal it in 2028, they're coming for you.
00:51:53.460 Peter, where do people go to get your writings?
00:51:55.460 Where do they go to get, in particular, your great piece on Charlie Kirk?
00:51:58.200 Where do they go to get your book?
00:52:00.140 Sure.
00:52:00.580 The book, I Went to Prison So You Won't Have To.
00:52:03.200 It's on Amazon.
00:52:04.440 It came out two days ago.
00:52:06.580 Please drive this thing up to bestsellers so we get the message out.
00:52:10.340 It's our best defense about them targeting us.
00:52:12.880 I Went to Prison So You Won't Have To on Amazon.
00:52:15.820 The piece about Charlie is very close to my heart.
00:52:19.500 It's on the Washington Times op-ed site.
00:52:22.580 Today, I put it out on X at Real P Navarro.
00:52:25.680 It will be up on my sub-stack on Sunday as we celebrate Charlie on that sacred Sunday that we're about to have.
00:52:35.340 And you can always go to my sub-stack, peternavarro.substack.com.
00:52:38.480 But, Steve, I really appreciate what the War Room does.
00:52:41.660 I appreciate being able to come talk about I went to prison so you won't have to.
00:52:47.660 And C-SPAN is running a long, hour-long interview on Sunday at 8 p.m.
00:52:55.640 We'll talk about it on Sunday night.
00:52:56.820 Yeah, we'll talk about that later.
00:52:57.740 Pete is going to be with us.
00:52:58.900 Yeah.
00:52:59.900 You're going to be on Saturday, and also you're going to be on Sunday.
00:53:02.460 We're doing wall-to-wall coverage live from the stadium.
00:53:05.600 We'll give more details later.
00:53:06.560 Peter Navarro, thank you.
00:53:07.620 Mike Lindell, it's been a long, tough morning for the War Room.
00:53:10.880 They need a deal, sir.
00:53:11.960 What do you got for us?
00:53:13.420 Best deal, everybody.
00:53:14.820 We've got the three-in-one special.
00:53:16.580 I'm sitting here back in Minnesota at my factory.
00:53:19.260 All the towels came in.
00:53:20.540 Remember, they actually work.
00:53:21.860 Like the six-piece towel sets, $39.98.
00:53:25.440 They're normally $69.98.
00:53:28.140 And then we have the pillows, the Giza Dream pillows, the Giza Covers.
00:53:34.320 All of the sheets are on sale, $29.88.
00:53:38.360 Once they're gone, they're gone.
00:53:39.880 These are the percale sheets.
00:53:41.760 You guys go to mypillow.com forward slash war room.
00:53:45.940 And then you're going to see all of the big-ticket items at the website, free shipping on the beds, the mattress toppers, 100% made in the USA.
00:53:58.240 And people, remember, we have a 10-year warranty and a 60-day money-back guarantee, everybody.
00:54:03.760 Mypillow.com, promo code war room.
00:54:05.500 See you back here at 5.
00:54:06.320 We toss now to the Charlie Kirk Show, two hours of populist nationalism, hosted today by Megyn Kelly.
00:54:13.840 We'll see you back here at 5 p.m. Eastern Daylight Time.
00:54:20.460 You missed the IRS tax deadline.
00:54:23.620 You think it's just going to go away?
00:54:24.900 Well, think again.
00:54:26.840 The IRS doesn't mess around, and they're applying pressure like we haven't seen in years.
00:54:31.340 So if you haven't filed in a while, even if you can't pay, don't wait.
00:54:36.740 And don't face the IRS alone.
00:54:40.080 You need the trusted experts by your side, Tax Network USA.
00:54:44.200 Tax Network USA isn't like other tax relief companies.
00:54:47.820 They have an edge, a preferred direct line to the IRS.
00:54:51.300 They know which agents to talk to and which ones to avoid.
00:54:54.780 They use smart, aggressive strategies to settle your tax problems quickly and in your favor.
00:54:59.940 Remember, whether you owe $10,000 or $10 million, Tax Network USA has helped resolve over $1 billion in tax debt, and they can help you, too.
00:55:11.560 Don't wait on this.
00:55:12.840 It's only going to get worse.
00:55:14.000 Call Tax Network USA right now.
00:55:16.360 It's free.
00:55:17.300 Talk with one of their strategists and put your IRS troubles behind you.
00:55:21.000 Put it behind you today.
00:55:22.220 Call Tax Network USA at 1-800-958-1000.
00:55:27.620 That's 800-958-1000.
00:55:31.060 Or visit Tax Network USA, TNUSA.com slash Bannon.
00:55:35.800 Do it today.
00:55:36.780 Do not let this thing get ahead of you.
00:55:39.880 Do it today.