Bannon's War Room - March 17, 2026


WarRoom Battleground EP 969: Soul of the Machine — Tech Money, Kill Bots, and Human Atrophy


Episode Stats

Length

54 minutes

Words per Minute

160.8

Word Count

8,743

Sentence Count

399

Misogynist Sentences

5

Hate Speech Sentences

12


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

In this episode of the War Room podcast, host Stephen K. Bannon sits down with Joe Allen, the show's editor for all things transhumanism and founder of the new group Humans First, to discuss the pro-AI super PAC Leading the Future, AI deepfakes and disinformation, Florida's AI Bill of Rights under Governor DeSantis, and the cognitive toll of AI on students and workers.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 We'll be right back.
00:00:30.000 don't like hearing that. I know you try to do everything in the world to stop that, but you're
00:00:33.020 not going to stop it. It's going to happen. And where do people like that go to share the big
00:00:37.400 lie? MAGA media. I wish in my soul, I wish that any of these people had a conscience. Ask yourself,
00:00:46.220 what is my task and what is my purpose? If that answer is to save my country, this country will
00:00:53.840 be saved. War Room. Here's your host, Stephen K. Bannon.
00:01:00.000 monday 16 march year of the lord 2026 um not to increase your burden but the reason we're doing
00:01:10.480 this you are the war room posse you're one of the most powerful political entities
00:01:15.480 in the history of mankind because you've bent the arc of history in this country
00:01:20.920 being the very first folks on with president trump and having his back and obviously there's
00:01:26.740 disagreements all the time disagreements and people that are endorsed disagreements and policy
00:01:32.020 disagreements maybe even sometimes uh tangentially the direction and i realize right now a lot of
00:01:40.360 people have a lot of questions of what's going on but you must be completely informed because
00:01:45.360 this is all about your agency if we get back to the core of what this show is is to give you access
00:01:51.520 on a daily basis, the best minds and the most active fighters in the country on the issues
00:01:58.720 that matter most for this republic and for you and your family and your community and you
00:02:03.900 personally. One that we're very proud to have been the leader in. And I think on the right,
00:02:08.720 I'm kind of shocked that nobody's ever really gotten serious about challenging us to be the
00:02:13.640 leader in all things transhumanism. How do we do that? Well, I've always had an interest in this.
00:02:18.580 I tried to do the film on transhumanism of Kurzweil's book years ago with Steve McEveety,
00:02:28.340 who produced The Passion of the Christ.
00:02:30.080 We didn't really get off the ground to get the financing, but a strong interest.
00:02:34.100 And then a guy, a young guy, was writing over at Sean Davis' The Federalist on occasions
00:02:39.140 putting up.
00:02:39.800 He wasn't on the staff, but he was a contributor.
00:02:42.280 And we talked him into coming over here for four or five years full time as our editor
00:02:47.320 for all things transhumanism he just started a new group called humans first he's one of the
00:02:51.180 brightest minds in this area and has really galvanized a part of this movement to make sure
00:02:58.220 that we stand up and this and that that we have we fully understand what we're doing we fully
00:03:04.100 understand what we're financing and we have control of it it just doesn't happen because
00:03:09.860 if it just happens the down the dark side and the downside of this is probably two-thirds one-third
00:03:15.220 just full stop not even a question you're not going to get to a sunlit uplands if this thing
00:03:21.100 just kind of rolls on its own with a handful of oligarchs uh greedy power mad voracious
00:03:29.020 avarice oligarchs doing this that would be the one and only joe allen joe i do want to i've got
00:03:37.620 you here for an hour because i know you're going all over the country and i want to go there's five
00:03:40.640 or six big issues i've got to go through to make sure the audience is fully up to speed
00:03:44.120 and understand this so this is like our own briefing our own private briefing folks
00:03:48.540 we had this order to go through it but then you know some of the top political people i worked
00:03:54.140 with sent me a punchbowl exclusive which dropped at the ai pac uh which is uh and that is not
00:04:01.420 aipac but the artificial intelligence pac um no pun intended here it's been initially capitalized
00:04:09.380 with 125 million dollars and and people are like stunned by this that's their opening bid what does
00:04:15.740 that mean they are going to try to buy their way in total absolute control of this as far as the
00:04:22.460 political process goes because they think that we live in a world where 30 second spots can uh can
00:04:29.200 basically get to people and and and to drive those people inform those people and to believe what they
00:04:34.440 want them to believe and to act the way they want them to act. And it is upon us. It's upon us in
00:04:40.060 this time and place to defeat that. And I got it. The odds are long, but we've taken on long odds
00:04:46.120 before. And we never take on easy fights. This is a hard fight and a tough fight. Today, let's play
00:04:51.520 the clip. We've got a couple of clips to play. I want to play a clip, Joe. I'm not sure if you've
00:04:54.860 seen it because you've been running around, but the president of the United States and the president
00:04:58.520 and Sacks and this team have been like mega accelerationist and the president said something
00:05:04.980 today because you know when he's speaking and doing these things and this is why we cover it
00:05:08.620 live on rav and i realize sometimes saying hey he's repeating himself it's fine you're getting
00:05:12.840 access to the thinking of a man it's one of the most important presidents this country's ever had
00:05:17.000 i rate him uh with a washington and with lincoln um and you're getting you're seeing it in real
00:05:22.960 time you've never been able to see that everything that white house has traditionally done has been
00:05:27.680 very guarded and everything's been kind of in political speak. You're getting in one regard
00:05:33.060 political stream of consciousness, but it's very powerful. And today was part of that. Let's go
00:05:36.840 and play this clip that was at the luncheon earlier today when president was answering some
00:05:41.040 questions. They are a country that for years, I didn't know this until recently, they're a country
00:05:47.040 based on disinformation. And now they're using disinformation plus AI. And that's a terrible
00:05:53.600 situation. That's a terrible situation. Uh, they showed all sorts of things happening in the last
00:06:01.060 two weeks that never happened between their kamikaze boats that don't exist between blowing
00:06:08.000 up the aircraft carrier, one of the great ships in the world, the Abraham Lincoln on fire.
00:06:14.920 They showed it on fire. I called the, the general, I said, uh, general, what's with the Abraham
00:06:22.380 Lincoln? It looks like it's burning down. No, it's not burning down. Not a bullet was ever fired at
00:06:27.520 it, sir. They know better. I said, this is my first glimpse of AI and what they've done with
00:06:33.920 it. They showed buildings in Tel Aviv burning to the ground, high rises burning. They showed
00:06:40.420 buildings in Qatar. They showed buildings in Saudi Arabia burning and they weren't burning.
00:06:46.860 They weren't hit. It was all AI, AI based. Terrible. It's terrible.
00:06:54.540 As David Sacks became kind of an America First, hey, what are we doing in the Middle East?
00:07:00.240 What are we doing this for? Because David Sacks came out with many things.
00:07:02.660 We can say, hey, look, we need an off-ramp. We got to rethink this.
00:07:05.980 We may be too far into this on and on, like ripped from the pages of the war room.
00:07:11.060 Although we're supportive, as we say now we're in it, whether you like it or not, we're in it.
00:07:14.480 We've got to win this thing.
00:07:15.920 We have to figure it out.
00:07:17.240 As David Sacks has done that, he's the big accelerationist.
00:07:19.720 He's the president's advisor, senior advisor on all things crypto and artificial intelligence.
00:07:25.360 The president right there, I say the president's not a full accelerationist.
00:07:28.800 And the reason is he says, hey, I just saw this stuff.
00:07:31.700 So, Joe, and I want to make one more thing is that, Joe, for a tangent to you, I saw
00:07:35.820 a video yesterday when I was working with Elizabeth and the team that helps me with
00:07:40.260 my social media.
00:07:41.440 I was about to put it up.
00:07:42.380 It was incredibly moving.
00:07:43.540 It was a little boy with his mom, with a flag-draped casket, crying for his father, it looked like, it seemed, and then got up on the thing.
00:07:56.200 It was so moving.
00:07:57.080 And at the last second, I always check with Grace Chung.
00:08:01.540 I go, Grace, hey, hang on.
00:08:02.860 This is not artificial intelligence.
00:08:03.840 And boom, a second.
00:08:04.880 It's artificial intelligence, Steve.
00:08:06.980 I go, I almost put that up.
00:08:09.400 This stuff is so realistic.
00:08:10.900 What the president's talking about, they had a huge thing in the Middle East.
00:08:13.700 It went viral about the Abraham Lincoln, you know, on fire, taking a couple of incomings.
00:08:18.320 Also, these rallies have been going on in the square, on and on and on.
00:08:21.640 Look, some of it's real, but artificial intelligence, these guys have been masters of this.
00:08:27.340 Joe, what does that tell us?
00:08:28.360 And particularly that the accelerationists are dumping in cash in greater levels.
00:08:32.960 You could take big pharma, big ag, you could take the banks, the lords of easy money, the food processors, add it all together, and you ain't got what they're putting in on artificial intelligence.
00:08:46.980 Now, curious minds want to know, why are they doing that, Joe Allen?
00:08:53.460 Well, Steve, the first reason that you see all this money going into the super PAC, leading the future.
00:09:00.060 Leading the Future, of course, it was primarily founded by Greg Brockman of OpenAI, by Joe Lonsdale of Palantir, and Mark Andreessen of Andreessen Horowitz.
00:09:11.200 The reason they're pouring money to boost pro-AI candidates is because the public has completely soured on this.
00:09:19.840 you look at the teachers, you look at the workers in the corporations that are being forced to use
00:09:26.280 generative AI in as many processes as possible to accelerate their productivity. You look at
00:09:33.200 the corporate leaders who were recently interviewed by the National Bureau of Economic Research.
00:09:39.220 It's pretty much accepted by the majority of people in America. The AI, number one,
00:09:45.580 isn't really the economic boon that was promised. It is not really boosting productivity as of yet.
00:09:52.600 It just means people write worse emails and more of them. And then you also have the general
00:09:58.220 discontent about the more dangerous aspects of this. You remember Greg Brockman's OpenAI
00:10:03.880 and their chatbot, ChatGPT, was goading children into suicide. There are multiple lawsuits on this.
00:10:12.780 and you look at palantir which has become a household name at this point and the ways in
00:10:18.780 which their technology are used yes to defend america it's yes it's a national security asset
00:10:25.360 but also to surveil americans and americans are sick of it and so we have the numbers any politician
00:10:34.640 who runs challenging these tech firms challenging the broligarchs they're going to win
00:10:42.500 But we also know that money buys elections in many cases.
00:10:47.140 And so you've got these guys dumping money.
00:10:49.960 You get $125 million right now, and I keep hearing $200 million behind the scenes,
00:10:56.220 $125 million to pull from to push pro-AI candidates.
00:11:00.980 Byron Donalds in Florida being a really good example.
00:11:04.300 The Punchbowl number is $125 million as their down payment.
00:11:09.340 They've never seen a number like that, like the opening bid.
00:11:13.020 This is going to get to 200 and more, but continue on.
00:11:15.860 Is Byron Donalds a good guy?
00:11:17.520 Byron Donalds could not be taking $5 million from these AI guys.
00:11:21.280 Don't tell me that.
00:11:22.280 Please don't tell me that.
00:11:24.320 Well, you know, as I understand it, the deal is still in the works,
00:11:27.260 but certainly leading the future plans to spend at least $5 million
00:11:31.860 to boost his campaign with ads and whatnot.
00:11:35.760 Why?
00:11:36.140 Because Byron Donalds is going to be the leader in showing AI as a benefit to education.
00:11:43.500 Now, if you ask educators, educators are going to tell you that AI means that their students
00:11:49.260 cheat more often.
00:11:50.440 They offload their thinking to the AIs rather than studying themselves.
00:11:55.480 And a lot of the teachers look around, the professors in colleges and teachers in schools
00:11:59.080 look around and they see their colleagues, other professors using AI to build out their
00:12:04.060 syllabi to actually do the research to put together their course materials all of it and even then
00:12:09.540 using ai to grade the papers by kids who are using ai to write the papers so but you know this is
00:12:15.420 but hang on but hang but hang on hang on hang on hang on hang on slow down you were just on the
00:12:20.500 steps we had you i think was last week you kicked off humans first and weren't you on the steps in
00:12:27.380 Tallahassee as your very initial kind of kickoff of this? And correct me if I'm wrong, but Governor
00:12:33.480 DeSantis, and people know I've had a lot of issues with Governor DeSantis, particularly when we ran
00:12:38.180 against Trump, but I always think he's been a very solid governor. Governor DeSantis is probably
00:12:43.780 farther ahead on this than any other governor in the United States of AI. And he just said flat out
00:12:49.340 at this press conference, the AI problems we've got at the federal government level, he said he's
00:12:54.420 not going to allow that to happen to the school children and the younger people down in uh in
00:13:00.620 florida so just the practicality how's byron donalds is byron donalds gonna be going to take
00:13:05.100 five million dollars from these guys and go against the policies that are overwhelmingly
00:13:11.140 embraced by the population in in florida uh through governor de santis is that where we are
00:13:18.660 it would appear so and you know with de santis you have his ai bill of rights and it's very
00:13:26.000 comprehensive ai bill of rights deals with say the deep fakes that you were talking about and
00:13:30.880 the president was talking about in regards to iran the war in iran and they're pervasive
00:13:35.260 across society for all sorts of things women who have their images used for pornographic deep fakes
00:13:41.480 people creating embarrassing deep fakes of other people and just fake material all around
00:13:46.980 The AI Bill of Rights addresses that, saying that the technology simply cannot be used for that.
00:13:51.360 You can't put out an AI commercially that would create a deepfake.
00:13:56.600 And also, one of the big issues is parental controls.
00:13:59.800 And so Ron DeSantis' AI Bill of Rights directly addresses the parental control issue.
00:14:07.260 Any child under 18 would need a parent's permission to use AI.
00:14:12.080 People would say, oh, well, that's a restriction on freedom.
00:14:14.220 Well, maybe, but you see what's happening right now in the same way that you would restrict freedom on, I don't know, alcohol or porn or anything like that.
00:14:23.020 Kids are using AIs to do all sorts of destructive things to each other and to themselves, not excluding the issue of children who are lured into suicide by chatbots gone rogue.
00:14:35.520 And so it addresses that. It addresses the data center. So Ron DeSantis has taken probably the strongest stand next to Josh Hawley from the conservative side on this. And that's why you've got people like Amy Kramer from Humans First down in Florida fighting every single day, pushing for the AI Bill of Rights legislation on data centers, all of that.
00:14:57.660 You have the Florida Citizens Alliance with Keith Flaw and Ryan Kennedy, and they are fighting, they're meeting with the politicians, fighting for more parental controls on AI, more controls on the data centers to make sure they're not parasitizing local communities and their utilities.
00:15:16.620 Against that, you have who?
00:15:19.080 You have some people who I think have been kind of schnookered into believing that all of these sorts of technologies are benefiting them in some way or might make them rich.
00:15:27.800 But by and large, politically speaking, the polls are overwhelming.
00:15:31.900 You have a recent NBC poll of the U.S., which found that 57 percent of Americans believe the risks outweigh the benefits on A.I.
00:15:41.940 You have a recent economist with YouGov economist poll showing that three times the Americans believe the AI is mostly or entirely negative as opposed to positive.
00:15:55.340 The public sentiment is clear.
00:15:57.040 We have the numbers in regard to voters, but they have the money to either shift public opinion or, you know, as the war room posse well knows, money can buy you a lot of things, even more than votes.
00:16:10.680 So, yeah, this fight is only just starting. I mean, this is just state level. You saw the fights in California over SB 53. It passed, but it passed in extremely weak form. You saw the same fight in New York with Hochul and the RAISE Act. It would have been a very stringent act demanding transparency and accountability of the AI companies before they deployed any system.
00:16:33.700 Well, you had Leading the Future pumping money into a lot of, for instance, Alex Bores, to smear him.
00:16:42.160 And he's a Democrat. I'm not trying to defend any of his other policies, but at least he was taking on the AI companies and leading the future paid money to smear him.
00:16:51.740 And as I understand it, Leading the Future and other sort of broligarch networks pressured Kathy Hochul into passing a soft form of the Accountability Act.
00:17:03.660 The RAISE Act was what we ended up with.
00:17:05.920 So this is just starting on the state level.
00:17:08.400 But when it gets to the federal level, it's going to really heat up because at that point, the stakes are existential for some of these AI companies.
00:17:18.040 If these companies are forced to be transparent about where they get their data, about what these systems are capable of behind the scenes, and if they're held accountable legally for damages such as suicide, for damages such as widespread disinformation and bot networks, then for the most part, they'll either be crushed or diminished so that they are no longer the kings of the U.S. economy.
00:17:44.840 I mean, people no longer look to them.
00:17:46.220 Hang on, hang on, hang on.
00:17:47.920 They've hired the best law firms in the country.
00:17:51.580 They've hired the biggest attack dogs in the country.
00:17:54.780 They've hired thousands of influencers.
00:17:57.980 They have hired the crisis PR companies.
00:18:01.680 They've hired the biggest lobbyists.
00:18:03.280 These guys have unlimited resources, let's be blunt,
00:18:06.100 because they're playing a game that is for value.
00:18:10.960 Nvidia just announced a trillion dollars in revenue and backlog for advanced chips for the next two years.
00:18:18.620 Let me repeat that.
00:18:19.200 That's where they got on the books now, a trillion dollars in backlog.
00:18:22.780 So the game couldn't be higher.
00:18:25.100 They also understand if they are to lose this game even marginally, if there is any visibility into what they're actually doing and what it means and internal studies and all that, they're going to be shut down and seized immediately.
00:18:39.360 people are going to be shocked and they're going to be furious about what's been going on oh by the
00:18:45.300 way they definitely need and they're trying to slip and slide in on these data centers to suck
00:18:50.860 every ounce of water they can suck out of every aquifer in the country or and or creek or river
00:18:56.900 in addition uh they're they're lying about the data centers and particularly about president
00:19:02.580 trump signed the executive order and said it's not going to affect it's not going to affect
00:19:06.120 consumers they're working like trojans to work around that and we know in addition and now the
00:19:12.700 modeling is coming out from wall street joe that they're at least going to need a trillion dollars
00:19:17.200 of some sort of public finance or public loan guarantees etc because even though even they
00:19:23.780 don't are not generating the cash or even want to commit uh all the money of i think the six to seven
00:19:30.240 the $8 trillion build out here. No major impact in world history has ever been obfuscated more
00:19:38.920 than what the oligarchs are doing because the oligarchs understand that's why do you think
00:19:43.160 they're putting $125 million into governor's races, races at every level, particularly the
00:19:49.960 House and the Senate? Why are they doing that? It's not as a civics lesson. They're doing it
00:19:55.280 for a power grab. They feel that if they can dump a couple of hundred million dollars, which for
00:19:59.700 them is nothing okay nothing if they can dump a couple hundred million dollars in that they can
00:20:05.100 buy political control they can get so far down the path that you can't unwind it and what do they
00:20:10.500 keep doing they keep pointing to we have to do this because if we don't the chinese communist
00:20:16.300 party and steve you've got an event over at the kennedy center right now killed the order about
00:20:21.100 organ harvesting you more than anybody in the war room posse should know more than anybody we can't
00:20:26.100 let the ccp the murderous uh vicious dictatorship of the chinese communist party control it yeah
00:20:32.440 well we got that and we are we're saying you on that level no chips no education they don't work
00:20:39.260 at any labs they don't work at any country and companies and 650 000 or 350 000 don't come to
00:20:44.680 the colleges you send them all home you don't no bank no lending nothing zero we hermetically seal
00:20:50.860 like Carthage. We hermetically seal it. We take it apart brick by brick. Then we salt the earth
00:20:56.260 around it. Nothing goes to the CCP because we understand that risk better than anybody. But Joe,
00:21:04.000 even as you have studies coming out that are showing you now, pretty definitive studies this
00:21:09.200 early on that show you that people for the first time in our species, you're actually getting
00:21:13.700 dumber by using this. And that's why I think they understand this is a 90-10 issue. It's not a
00:21:20.300 56% or 60%. If you ask the question the right way, people will say, well, absolutely.
00:21:24.880 We shouldn't let these guys run the deal because they're not trustworthy. Joe Allen.
00:21:31.480 Yeah, Steve, if Denver will throw up, there's a recent article in Futurism. And by the way,
00:21:37.580 Futurism, I remember it as basically a tech booster rag. I don't know what happened to
00:21:42.620 those guys over there, but they are fantastic. They are, as the kids say today, based every
00:21:47.580 single day they're publishing some article or another holding these guys over the flames but
00:21:53.060 futurism just published an article it jumps off from a recent guardian article professors say ai
00:21:59.380 is destroying their students abilities to think ability to think and if you go through this piece
00:22:05.400 i really recommend that the audience just check it out look at the studies that are linking to
00:22:10.680 linking to one of the most important comes from one of the top educational journals and it is
00:22:18.700 technology in higher education and they show definitively what everybody already knew very
00:22:25.760 similar to the mit study on the cognitive effects the neurological effects of relying on gpt to
00:22:32.260 write what everybody knows is that if you don't think and allow the machine to think for you
00:22:37.980 you get dumber. It's the inverse singularity. Instead of the machines getting smarter and
00:22:42.940 smarter and smarter, people get dumber and dumber and dumber. And so the machines seem that much
00:22:47.520 more impressive. And this is widespread. They are wrecking an entire generation. They are
00:22:54.520 turning kids into a kind of global village of the damned. But instead of the kids having special
00:23:01.940 psychic powers they're just we'll say dumber without having to go to some of the saltier
00:23:09.600 language although i think the audience knows that there are cognitive limitations that have
00:23:15.520 better ways of describing it this is a nightmare and these are the people that are going to be
00:23:20.460 taking care of us so but as far as it goes now i just got to say this when when you look at this
00:23:26.120 this argument you look at this argument that we have to keep up with china the only way that we're
00:23:33.420 going to accelerate or the only way we're going to excel as a country is to accelerate the technology
00:23:38.220 in order to beat china to the finish line what's the finish line well the singularity in which all
00:23:43.460 human beings are either merged into the machine or replaced by the machine but putting that aside
00:23:48.520 the national security issues as far in insofar as ai is useful in staying ahead of china
00:23:55.460 geopolitically even the economic issues like finance or the health issues like ai is going
00:24:01.520 to cure cancer which so far not so good on that one people are trying people like david sacks
00:24:08.780 people like mark andreessen people like elon musk and sam altman they are basically conflating
00:24:14.080 those issues, national security, health, and the economy, with the basic issues. What is AI doing
00:24:20.420 to children? What is AI doing to the average worker? What is AI doing to the average doctor?
00:24:26.640 By and large, it's making them vessels, hosts for algorithmic parasites. They are dependent
00:24:34.220 on machines to think. They're not able to reason on their own. It's horrendous the degree to which
00:24:42.060 this technology makes people observably dumber. And so to conflate the national security issues
00:24:49.420 with AI overall, it's completely disingenuous. You don't have to make your entire society brain
00:24:55.580 damaged in order to keep up with China. In fact, that's only going to put us behind in the long
00:25:00.480 term. And so I think, again, pointing to Ron DeSantis and the AI Bill of Rights, what are each
00:25:06.560 of those laws that are embedded in that bill meant to do, just at the very least to mitigate
00:25:12.020 that damage, give people control over their own futures. Put humans first.
00:25:20.880 Amazing. Joe, hang on. You're going to stick around. We've got some clips to play and a lot
00:25:24.940 more to get to while we've got you. Real quickly, Joe, where do people go to sign up for humans
00:25:28.960 first? I want to make sure that we got the entire war room posse as part of this. Where do they go?
00:25:33.400 go to humansfirst.com or the x site is at realhumansfirst really important of course sign up
00:25:44.460 for the newsletter but there's also an ai pledge it is a document to force your congressmen your
00:25:54.220 governors any political figure to promise to pledge that they will not take money from big
00:26:02.000 So that's humansfirst.com or at realhumansfirst on X.
00:26:09.320 Joe, hang on.
00:26:10.440 We're going to take a short commercial break where we're talking about AI.
00:26:13.240 Home title lock.
00:26:14.080 Now that you have cyber and AI in a combo against the very rudimentary system we have throughout the country about the registration of titles for your home, 90% of your net worth is there.
00:26:28.440 Hometitlelock.com, promo code Steve.
00:26:31.640 Talk to Natalie Dominguez and the team.
00:26:33.360 You get a 14-day free offer on $1 million triple lock protection
00:26:37.580 so you can understand all of it.
00:26:39.740 Take away the angst and the anxiety.
00:26:43.920 When it's 80% or 90% of your net worth, you've got to do it.
00:26:47.300 HomeTitleLock.com, promo code Steve, just pennies a day.
00:26:49.840 But talk to Natalie Dominguez and the team, and do it today.
00:26:52.240 They're offering you a special.
00:26:54.140 Patriot Mobile, 972-PATRIOT.
00:26:57.020 Patriot. They're going to be with us at CPAC, a Christian company that supports your Christian
00:27:03.240 values. Make this shift today. 972-PATRIOT for Patriot Mobile. Short clip.
00:27:13.100 Think about this. In 2006, $20,000 equaled roughly 33 ounces of gold at spot prices.
00:31:23.720 at today's prices those 33 ounces of gold would be worth $165,000 smart americans diversify a portion
00:27:36.800 of their savings into precious metals and that's why you need to consider buying gold for my friends
00:27:42.560 at birch gold group for thousands of years gold has been a store of wealth and today it is a
00:27:49.880 crucial part of any balanced strategy. Even better, Birch Gold can help you convert an existing IRA
00:27:56.180 or 401k into a tax-sheltered retirement account in gold. Just text my name, Bannon, B-A-N-N-O-N,
00:28:03.980 to the number 989898 to receive your free info kit on gold. There's no obligation, just useful
00:28:11.440 information. With an A-plus rating with the Better Business Bureau and tens of thousands of happy
00:28:18.160 customers. Let Birch Gold help you diversify with gold. Now that's peace of mind. Text Bannon
00:28:26.440 to 989898. Again, my name, Bannon, B-A-N-N-O-N, to the number 989898. Do it today.
00:28:36.420 I've got a question for you. How do you usually get your medication? Let me guess. You call the
00:28:41.020 doctor, wait for an appointment to get a prescription, sit in a waiting room, then you
00:28:45.300 stand around in line at the pharmacy hoping they actually have what you need that's not convenient
00:28:50.840 that's a system that doesn't work there's a better way and that way is where all family pharmacy
00:28:58.140 comes in they take the hassle out of the process you go online select what you're looking for
00:29:03.700 complete a short medical form and a licensed physician reviews your request if it's appropriate
00:29:09.680 they write the prescription and your medication is shipped straight to your home they've got
00:29:15.520 essentials like antibiotics antivirals ivermectin NAD+ and hundreds of other prescription
00:29:23.520 medications it's simple it's efficient and it puts you back in control stop dealing with the old way
00:29:31.440 get your medical freedom and do it today go to allfamilypharmacy.com slash bannon and use promo
00:29:38.160 code Bannon10 to get 10% off your order. Get your medical freedom today. Go to allfamilypharmacy.com
00:29:46.160 slash Bannon and use promo code Bannon10 to get 10% off your order. Fellow patriots, the Federal
00:29:53.700 Reserve has betrayed America for over a century, printing fiat, inflating away your savings,
00:30:00.980 serving globalist masters. But President Trump is ending it. President Trump is wielding
00:30:07.000 a 112-year-old law to reclaim control over the rogue Federal Reserve. He's replacing Jerome Powell,
00:30:14.680 slashing rates, and igniting America's re-industrialization. This isn't theory.
00:30:20.460 Government-backed industry plus low rates unleashes super cycles. History repeats.
00:30:27.160 Gold's already exploding. Miners are up 400 percent in the last year. What Rickards is
00:30:32.980 calling Trump's gift is wealth for American patriots, not global handouts.
00:30:39.180 Now it's America's turn.
00:30:41.660 Gold to 27,000 in the coming years.
00:30:44.320 Jim Rickards, CIA and Pentagon veteran, says act now.
00:30:48.080 Go to Insider2026.com.
00:30:52.140 That's Insider2026.com to get Jim Rickards' strategic intelligence newsletter today.
00:31:02.980 The other really subtle point, and I think we're going to hear a lot about this with AI in the coming year: insider threats, model poisoning.
00:31:19.560 Remember, their model has a soul, has a constitution.
00:31:24.360 That's not the U.S. Constitution.
00:31:25.940 The other day, their model was anxious, and they believe it has a 20 percent chance right now of being sentient and having its own ability to make decisions.
00:31:36.600 So does the Department of War want something like that in their supply chain, where it could hallucinate?
00:31:44.120 It could corrupt models that are used by defense contractors who are building weapons systems or airplanes and so on.
00:31:50.760 And so the truth of it is, we can't have a company with a different policy preference baked into the model, through its constitution, its soul, its policy preferences, polluting the supply chain.
00:32:03.800 So our warfighters are getting ineffective weapons, ineffective body armor, ineffective protection.
00:32:09.820 And that's really where the supply chain risk designation came from.
00:32:13.580 This is Maven Smart System, Palantir's software as a service
00:32:18.000 product that we are deploying across the entire department.
00:32:22.140 As you can see, it's not just one data feed.
00:32:25.040 It's multiple.
00:32:26.780 And instead of having eight or nine systems
00:32:29.280 for those decision makers to look at every single day
00:32:32.080 in order for them to make decisions,
00:32:33.860 you then fuse it into a single visualization tool.
00:32:37.200 The single visualization tool allows you to select, deselect
00:32:40.480 different types of data, look at different approaches
00:32:43.220 to data, but more importantly, action from the same system that you're trying to develop
00:32:50.140 your workflows around.
00:32:51.640 Once you have a detection that you want to actually move and actually move into a targeting
00:32:55.020 workflow, for example, this is what we do.
00:32:58.980 Left click, right click, left click, magically, it becomes a detection.
00:33:03.620 That detection then gets moved into a workflow.
00:33:09.380 This is standard digitized workflow, but I want to walk you through it quickly.
00:33:14.140 You have different types of targets that are identified on the left there.
00:33:17.260 Every single column produces a different type of decision-making process.
00:33:22.280 Once you have that decision and you're trying to actually action that process, we now move
00:33:26.020 into COA generation, course of action generation, where we are automatically, by a number of
00:33:31.140 factors, trying to identify what the best asset to prosecute a target looks like.
00:33:38.020 Once we've got the different approaches and we select one, we then can move directly into
00:33:43.560 how do we action that target.
00:33:45.120 So we've gone from identifying the target to now coming up with a course of action to
00:33:49.240 now actioning that target all from one system.
00:33:52.800 This is revolutionary.
00:33:53.740 We were having this done in about eight or nine systems where humans were literally moving
00:33:58.080 detections left and right in order to get to our desired end state, in this case actually
00:34:03.060 closing a kill chain.
00:34:04.060 Conspiracy theorists, you may hate this, but there's one person protecting your
00:34:07.560 right to be a conspiracy theorist that actually has a seat at the table and
00:34:10.800 that person is me. You may not want to hear that truth, but it is
00:34:15.720 true. And maybe do a little more reading before you pontificate on your absurd
00:34:20.820 and obviously ill-formed, and many times stupid, opinions. Okay? Because, like, you're
00:34:25.980 attacking the person who's protecting you, idiot. It's like, so stupid. Okay. Joe.
00:34:34.800 So, Allen, you had some pretty big hitters in there.
00:34:37.600 Walk us through who those people were and what was said.
00:34:42.640 You know, that first gentleman, Emile Michael, he's the Department of War Undersecretary for Research and Engineering.
00:34:50.420 He comes out of Uber.
00:34:52.020 Very strange statements you heard there, talking about Anthropic's Claude, how the company believes that it is conscious,
00:34:59.660 how Claude, the chatbot, has its own constitution that's not the U.S. Constitution, and so on and so
00:35:05.840 forth, the hallucinations. Now, Emile Michael, very smart guy, but those statements were very
00:35:12.000 strange. Mostly they were spin. Mostly they were propaganda. He's just justifying declaring
00:35:19.140 Anthropic and Claude a supply chain risk. But the real interesting point is the reality that
00:35:28.560 some number, maybe the majority, maybe almost all of the people at Anthropic are open to the
00:35:36.200 possibility that the chatbot that they're deploying, and in this case, deploying on
00:35:40.880 classified documents in order to accelerate the process of killing other people, that it is
00:35:48.160 conscious. And they also talk about how recently they're making statements in the media about how
00:35:54.500 Claude, a chatbot, is anxious. And that anxiousness, that nervousness, is an indication
00:36:01.240 that there's some kind of being, some kind of soul inside that's deserving of ethical treatment.
00:36:06.220 You can't enslave it to kill people, basically. Very, very strange moment in history. And then
00:36:12.360 right after that, you saw a recent demo from Palantir and Maven Smart Systems. Maven Smart
00:36:17.920 Systems, a lot of the savvy War Room listeners will remember Project Maven, and, for instance,
00:36:24.960 the Google confrontation, in which you had workers who staged a kind of protest, you know, writing a
00:36:33.580 document that protested Google's technology being used in surveillance for Project Maven.
00:36:40.600 Well, now basically all the tech companies are on board: OpenAI, xAI, Anthropic. And Maven Smart System, as it's now called, is an integrated platform which allows the user, in this case, say, an intelligence analyst in the military at Central Command, to use these systems to rake over huge amounts of satellite data, classified data, classified reports.
00:37:08.520 And then the decision compression means that the system itself, the machine itself, is making decisions as to what is and isn't a valid target, what sort of weaponry would be appropriate to use against the target, how it would be conducted.
00:37:24.740 And then after the operation is conducted, after the strike has occurred, then the AI is making the assessment as to whether or not it's a success or not.
00:37:34.900 So, I mean, this has been in the works for a long time.
00:37:38.680 Algorithmic systems have been used to find data in vast data sets, right, to find important, relevant data.
00:37:46.040 But this is next level.
00:37:47.840 You're talking about the same chatbot that people use to write their emails or do their homework also being used to rake over classified data and decide who is and isn't worthy of being killed.
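The "decision compression" described above is usually discussed in terms of three autonomy levels: human in the loop (a person must approve each engagement), human on the loop (the machine proceeds unless a person vetoes in time), and fully autonomous (no human input at all). A minimal sketch of the distinction, with invented function and type names:

```python
from enum import Enum

class Autonomy(Enum):
    HUMAN_IN_THE_LOOP = "machine proposes, a human must approve each engagement"
    HUMAN_ON_THE_LOOP = "machine engages unless a human vetoes in time"
    FULLY_AUTONOMOUS = "machine identifies, decides, and engages on its own"

def may_engage(mode: Autonomy, human_approved: bool, human_vetoed: bool) -> bool:
    """Whether an engagement proceeds under a given autonomy policy."""
    if mode is Autonomy.HUMAN_IN_THE_LOOP:
        return human_approved        # nothing fires without sign-off
    if mode is Autonomy.HUMAN_ON_THE_LOOP:
        return not human_vetoed      # fires by default; a veto stops it
    return True                      # no human input consulted at all

# With no human action at all, only the first policy holds fire.
for mode in Autonomy:
    print(mode.name, may_engage(mode, human_approved=False, human_vetoed=False))
```

The later exchange about Anthropic and Palmer Luckey is a disagreement over which of these modes is acceptable, and how meaningful the middle one really is.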
00:38:01.580 Hang on one second.
00:38:02.680 I want to go back to this debate, because now there's a breaking story, I think it was in Axios, that a lot of corporate America are coming in at 'em.
00:38:09.620 Just break it down very basically: what Anthropic said that they could not ethically do with providing artificial intelligence tools, what they felt they had to hold back, and where Pete Hegseth and Emil Michael come in.
00:38:25.200 And Emil is not just a brilliant guy.
00:38:27.600 This is one of the top guys in the Pentagon.
00:38:29.400 He's basically in charge of R&D. But this is all tied now to the weapons systems, where they said, that's unacceptable.
00:38:36.480 And the Trump administration, President Trump and them, really came out hard and said, it's impossible: when we contract a weapons system, we can do with it what we want, how we want.
00:38:44.960 You've got to give us the whole thing that we're buying.
00:38:46.700 You can't. We don't care about your ethics.
00:38:48.700 We don't care about your morals.
00:38:49.860 That doesn't mean anything to us when we're buying a weapon system.
00:38:53.160 Can you explain to the audience what's that about?
00:38:56.340 Because this is going to be a massive issue.
00:38:58.900 It's a massive issue now. It's going to be a bigger issue going forward.
00:39:04.060 Yeah, the argument really stems from just differences in values.
00:39:09.860 And a lot of people are rallying around Anthropic, because they were told by Pete Hegseth and the Department of War
00:39:18.520 that they would have to take off the safeguards from the system, which would allow the Department of War
00:39:26.400 to use the system for any lawful use cases. And there's a lot of slippery language
00:39:34.740 about that. But Anthropic, their argument is that the Department of War wanted to use Claude, wanted
00:39:41.000 to use their AI chatbot, to surveil Americans and also to use it in autonomous weapon systems.
00:39:49.180 That's their story. All of this is basically hidden behind the wall of classification, so all
00:39:55.220 we have are the stories told on the other side. Basically. Basically, but hang on. Autonomous
00:40:00.800 weapon systems mean where there's no human in the kill chain making a decision. The machine
00:40:06.380 itself targets, does target acquisition, all of it, boom, and they let it off. It's totally autonomous, with
00:40:13.140 no human input at all. And Anthropic said, we're not comfortable with that. That's right. And as far
00:40:19.080 as anyone knows, I mean, it's been the Department of War, Department of Defense going back, it's been
00:40:23.860 their policy to always keep a human in the loop. How meaningful that is is another question entirely,
00:40:29.580 but at least somewhere in the kill chain there's a human being who's signing off on it and saying,
00:40:34.300 yes, that's a go. It would be the implication, then, that someone is trying to push against that. And
00:40:41.740 you have all these different forces within the Department of War. Palmer Luckey at Anduril, for
00:40:47.020 instance, he used to also believe that there should be a responsible human in the loop. Now he's
00:40:52.840 arguing that you need fully autonomous weapons that can identify a target, act on the target, and
00:40:59.940 kill, now without any human being signing off. Now that Palmer's selling them, right,
00:41:06.260 it's different. Yeah. And it's just, I think, it's not to say Anthropic is right. And the Pentagon
00:41:11.740 saying, you're not putting in any controls, we will determine what controls we put in, and Anthropic
00:41:17.160 saying, we're not comfortable with that. This is spreading. It's going to get into a very deep supply
00:41:21.200 chain. I want to go to this conversation. Go ahead. Just real quick. One of the things that
00:41:28.160 Anthropic is saying is basically that the ethical use case of the system is for the benefit of
00:41:34.140 Americans, right? That Americans shouldn't be surveilled, so on and so forth. But as I've
00:41:37.960 argued, I actually agree on that. If they're telling the truth, I agree completely: the system
00:41:43.120 shouldn't be used for that. But ultimately, Anthropic's goal is to create AGI superintelligence
00:41:49.020 after that. And so they are not really loyal to Americans. They're loyal to the so-called
00:41:55.720 country of geniuses in the data centers that they're building. And Sam Altman will talk about
00:42:01.180 this in just a moment. They all basically agree that the future is not human. The future is
00:42:06.100 artificial. I want to go, by the way, to consciousness. People are saying, hey, we're
00:42:12.740 going to get to AGI this fall, that machines are going to have consciousness. You just heard this
00:42:17.480 whole debate about Claude. Let's go ahead. We're going to have a several minute conversation and
00:42:22.840 Joe Allen is going to break it down on the other side. Fundamentally, our business, and I think the
00:42:27.580 business of every other model provider is going to look like selling tokens. You know they may
00:42:34.000 come from bigger or smaller models which makes them more or less expensive. They may use more
00:42:39.540 or less reasoning which also makes them more or less expensive. They may be running all the time
00:42:43.540 in the background trying to help you out. They may run only when you need them, if you want to
00:42:47.560 pay less. They may work super hard, you know, spend tens of millions, hundreds of millions, someday
00:42:54.420 billions of dollars on a single problem, right? That's really valuable. But we see a future where
00:43:02.760 intelligence is a utility, like electricity or water, and people buy it from us on a meter
00:43:11.660 and use it for whatever they want to use it for. The demand that we see for that seems like it's
00:43:17.700 going to continue to just go like this. And if we don't have enough, we either can't sell it,
00:43:26.020 or the price gets really high and it, you know, kind of goes to rich people, or society makes
00:43:32.240 a bunch of sort of central planning decisions, which I think almost always go badly: you know,
00:43:36.720 we're going to use our limited compute supply for this and not that. So the best thing to me,
00:43:42.480 throughout all the history of capitalism, innovation, whatever you want, is to just flood the market. Yeah.
00:43:46.880 Consumers are gradually getting less and less in the loop on the recursive self-improvement. So, you
00:43:52.560 know, every successive model is built by the one before it, so that is happening to a large
00:44:03.520 degree, but it's not yet fully automated. It may be there by the end of this
00:44:14.260 year, but not later than next year. Do you see a hard takeoff at that point?
00:44:20.640 We're in the hard takeoff, okay, right now, yes. I mean, look, I mean, at this point,
00:44:29.280 I go to sleep, there's some massive AI breakthrough, and when I wake up, there's another one.
00:44:37.020 Yes.
00:44:39.180 Yeah, it's hard to keep track, honestly. It's a bit of a head spinner.
00:44:42.960 At this point, I think the definition of AGI really matters. Some people would say,
00:44:48.620 we already got there. Some people say it's very close. Some people say we're kind of,
00:44:51.800 you know, it's maybe still a year away. But in any case, that word has ceased to have much
00:44:58.420 meaning. There are maybe two thresholds that we could talk about that are interesting. Okay, number
00:45:04.840 one: when is there going to be more of the world's cognitive capacity inside of data centers than
00:45:09.980 outside of them? And that to me feels like maybe it could happen, huge error bars, I could be totally
00:45:17.360 wrong, but maybe that could happen by, like, late 2028. And that's an extraordinary shift in the world.
00:45:25.080 The other one is: when can a CEO of a major company, a president of a major country, a Nobel Prize-
00:45:35.580 winning scientist, when can they not do their job without making heavy use of AI? This doesn't mean
00:45:41.900 that there will be an AI CEO or an AI president, but it does mean that the role of, let's say, a human
00:45:49.200 CEO, when I think about my job, it's really quite different. And so more and more, I think of these
00:45:56.540 jobs will be supervising a bunch of AI, providing oversight, deciding how to trust the outputs,
00:46:05.460 how to provide guidance. And that threshold of when you really wouldn't want to be doing your
00:46:13.540 job running a large organization without heavy reliance on AI, I think that's another sort
00:46:20.400 of interesting threshold. That may take a little bit longer, but probably not a lot longer.
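Altman's framing above, selling intelligence "on a meter" like electricity or water, is just usage-based pricing: a per-token rate that varies with model size and with how much reasoning a request burns. The rate table below is invented purely for illustration; it is not OpenAI's actual pricing.

```python
# Hypothetical dollars per million tokens, keyed by (model size, reasoning on/off).
# Real provider pricing varies by model and by how much extra computation
# ("reasoning") a request uses; these numbers are made up.
RATES = {
    ("small", False): 0.50,
    ("small", True): 2.00,
    ("large", False): 5.00,
    ("large", True): 20.00,
}

def metered_cost(tokens: int, model: str = "small", reasoning: bool = False) -> float:
    """Bill intelligence like a utility: price per unit consumed, on a meter."""
    return tokens / 1_000_000 * RATES[(model, reasoning)]

# A month of light use on a small model vs. heavy reasoning on a large one.
print(metered_cost(2_000_000))                            # 1.0
print(metered_cost(50_000_000, "large", reasoning=True))  # 1000.0
```

The "bigger or smaller models, more or less reasoning" line in the clip maps directly onto the two keys of that rate table.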
00:46:29.120 Okay. People have to understand when they are telling you, first about the capacity,
00:46:34.300 I'll go to Joe in a second, but they're telling you, hey, we're really going to get into a
00:46:36.980 supervisory stage and you're just going to be supervising AI as part of your job. That phase
00:46:42.740 is going to last six months to a year, max. Then it's over for you. They're not going to, you think
00:46:50.820 they're going to have humans going, oh, I'm just supervising you, and I'm just, you know, managing?
00:46:55.020 This whole agentic movement, like Salesforce and these companies, they're trying to tamp it down.
00:47:00.440 The agentic you is going to be the one that they use. The fact is, it's going to use
00:47:07.080 itself. It's not going to care what you have to say. So if, what Altman's saying, oh, yeah, well, you know,
00:47:11.320 presidents or CEOs or Nobel Prize scientists will get to a phase, and then what it'll really be is
00:47:17.200 that we'll just be supervising. We'll be supervising the machine. We'll be supervising the
00:47:23.280 superintelligence. That'll last 90 days, right? Take your number two pencil out and write that
00:47:30.400 down in your notebook. Joe Allen, the scariest one was the data centers. All cognitive
00:47:37.120 ability inside a data center, or everything you'd call a brain on earth, with all the capacity we've got,
00:47:44.660 and remembering, you know, and being able to bring back things, the capacity inside the data
00:47:51.860 centers versus all cognitive capabilities on the outside. I don't know, fall of 2028? When's election
00:47:57.800 day? Fall of 2028. Pretty scary, sir. They're telling you what they're doing. They're not really
00:48:04.120 hiding what the date is, but the overarching things, they're running them up the flagpole.
00:48:09.040 Oh, yeah, we talked about that a year ago.
00:48:11.240 You can't be surprised by that.
00:48:12.720 We talked about that.
00:48:13.920 Joe Allen.
00:48:15.520 Yeah, and I think the scariest prospect, Steve, isn't just the idea of Sam Altman talking there about the ratio of non-biological cognition versus biological cognition, machine versus human.
00:48:29.460 The idea that most of the thinking, quote unquote, on Earth would be done by machines, that is a chilling thought.
00:48:37.460 But to me, the most chilling notion would be that as that process unfolds, human thinking becomes worse and worse, more and more clouded, more and more dependent on the machines.
00:48:50.220 And the machines turn out not to be the country of geniuses that they're being billed as now.
00:48:56.280 And you've got Altman and Musk talking continuously about how the use of AI is going up the food chain.
00:49:03.080 So instead of it just being your average Uber driver who's just guided around by the algorithm,
00:49:08.220 or your average Amazon fulfillment center worker who's just guided by the algorithm, kind of like
00:49:13.260 an ant following a pheromone trail, you now have middle management being guided by the algorithm,
00:49:19.780 being guided by Claude. You now have CEOs who are asking Claude or asking GPT, what's the answer to
00:49:27.700 this problem? Scientists, physicists, engineers. And then, of course, as we just saw a moment ago,
00:49:34.700 and as has been reported widely since the Iran invasion, dependence on the algorithm by war-
00:49:40.960 fighters, people whose duty it is to protect the country and to accurately identify who is
00:49:49.120 and isn't a valid target in an enemy nation. They are more and more becoming dependent. The cognition
00:49:55.820 is shifting from the human to the machine. And again, just that process alone, if the machines
00:50:02.680 do turn out to be geniuses and we just become kind of barnacles on the ship hull and wither away,
00:50:07.480 well, that's a pretty piss-poor future for humanity. But an even worse future would be
00:50:13.940 one in which machines turn out to be pretty smart most of the time, but at that critical moment the
00:50:18.900 machine is dumb, the machine failed you, and you allowed your own mind to wither.
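Altman's first threshold, more cognitive capacity inside data centers than outside by "like late 2028, huge error bars," is arithmetically just a compound-growth crossing point, and the error bars are huge because the answer is hypersensitive to the assumed growth rate. Every number below is invented purely to illustrate that sensitivity.

```python
def crossing_year(start_year: int, current_ratio: float, annual_growth: float) -> int:
    """First year the data-center share reaches the outside-world baseline,
    i.e. the ratio hits 1.0, under steady compound growth."""
    year, ratio = start_year, current_ratio
    while ratio < 1.0:
        year += 1
        ratio *= 1.0 + annual_growth
    return year

# Starting at 10% of the baseline in 2026 (an invented figure):
print(crossing_year(2026, 0.10, 2.0))  # tripling yearly -> 2029
print(crossing_year(2026, 0.10, 1.0))  # doubling yearly -> 2030
```

A modest change in the assumed growth rate moves the crossing by years, which is all "huge error bars" means in practice.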
00:50:23.800 Joe, we only got about a minute and a half. And by the way, we're not going to be live,
00:50:30.360 they're not live streaming, because it's a closed event. To get people there, we had a huge
00:50:34.920 turnout, so we'll do clips from yon yola kicks you kellyks you kelly thank you
00:50:42.500 kellyks event at Kennedy Center. We got about a minute. Where do people go? I want everybody to
00:50:47.540 pile in to Humans First. Where do they go, Joe? Go to humansfirst.com, or at Real Humans First on X.
00:50:57.540 Also, I've got my talk that I gave at Georgia Tech up on my site. That's joebot.xyz, or at my own
00:51:05.580 X page, at J-O-E-B-O-T-X-Y-Z. Thank you very much, Steve. Any talks this week people should
00:51:14.140 know, your calendar for the next two weeks, where are you going to be? I'm taking a moment just to
00:51:19.740 get my head together. I'm going to be back out on the road in one month's time. Perfect. Do some
00:51:27.240 thinking and some writing. I'll outthink those machines. Joe Allen, thank you so much. Appreciate
00:51:31.680 you. Go to Humans First, get everything associated with Joe Allen. Birch Gold, now more than ever.
00:51:39.640 While we're talking about the Straits of Hormuz, it's not plural.
00:51:46.220 Also, the Persian Gulf.
00:51:48.200 When we're back to hydrocarbons, understand why gold has been a hedge for mankind's turbulence.
00:51:55.620 Let's be blunt.
00:51:56.720 Wars and the rumor of wars for 5,000 years.
00:52:01.000 Find out about Birchgold.com.
00:52:02.740 Promo code BANNON.
00:52:03.640 The end of the dollar empire.
00:52:05.080 and gotten out, you can get a hard copy,
00:52:08.040 the Patriots edition, first second episode.
00:52:10.060 See you tomorrow morning at 10.
00:52:29.560 Do you owe back taxes, or have you not filed your taxes in years?
00:52:34.460 Now is the time to resolve your tax matters.
00:52:37.640 With the national conversation around abolishing the income tax,
00:52:42.460 the IRS is fighting back and proving it's here to stay
00:52:45.560 by becoming more aggressive than ever before.
00:52:48.920 They're sending out more collection notices, filing more tax liens,
00:52:53.060 and collecting billions more in recent years.
00:52:56.420 If you owe, the IRS can garnish your wages, levy your bank accounts,
00:53:01.760 seize your retirement, and even your home. If you owe or haven't filed, it's not a question of if
00:53:08.800 the IRS will act, it's a question of when it will act. Right now, Tax Network USA is offering a
00:53:15.960 completely free IRS research and discovery call to show you exactly where you stand and what they
00:53:22.780 can stop before it's too late. Their powerful programs and strategies can save you thousands
00:53:28.840 or even eliminate your debt entirely if you qualify.
00:53:32.680 Don't make a costly mistake.
00:53:34.700 Representing yourself or calling the IRS on your own
00:53:37.220 waives your rights and costs you more money.
00:53:40.200 They are not, and let me repeat, the IRS is not on your side.
00:53:44.400 Get protected the right way with Tax Network USA
00:53:47.680 and start the process of settling your tax matters once and for all today.
00:53:54.180 Call 1-800-958-1000.
00:53:57.180 That's 1-800-958-1000 or visit TNUSA.com slash Bannon for your free discovery call with Tax Network USA.
00:54:09.600 Let me repeat, 800-958-1000, tell them Bannon sent you.
00:54:14.020 Don't let the IRS be the first to act.
00:54:18.200 Take advantage of first mover advantage, you move.