Bannon's War Room - April 29, 2026


Episode 5334: The Fall Of Humanity; Transhumanist Vision Of America And The World


Episode Stats


Length: 54 minutes
Words per minute: 161.1269
Word count: 8,842
Sentence count: 525

Harmful content

Misogyny: 10 sentences flagged
Toxicity: 6 sentences flagged
Hate speech: 25 sentences flagged
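The words-per-minute figure above is derived from the raw counts; a minimal sketch of the arithmetic (assuming the listed 54-minute length is rounded, which is why the quotient does not exactly reproduce the listed 161.1269):

```python
# Recompute the episode's derived stats from its raw counts.
# The listed "54 minutes" runtime is rounded, so the naive quotient
# differs slightly from the listed 161.1269 words per minute.

word_count = 8842       # from the episode stats
sentence_count = 525    # from the episode stats
length_minutes = 54     # rounded runtime from the episode stats

words_per_minute = word_count / length_minutes    # ~163.7 with the rounded length
words_per_sentence = word_count / sentence_count  # ~16.8 words per sentence
```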


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Toxicity classifications generated with s-nlp/roberta_toxicity_classifier.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
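The flagged-sentence counts above are produced by scoring each transcript sentence with the listed models and counting scores past a cutoff; a minimal sketch of that counting step (the 0.5 threshold and the toy scores are assumptions; in the real pipeline each score would come from a Hugging Face text-classification pipeline loading, e.g., s-nlp/roberta_toxicity_classifier):

```python
# Count how many sentences a classifier flags. In the real pipeline,
# each score would come from a text-classification model such as
# s-nlp/roberta_toxicity_classifier; here we use toy scores so the
# sketch runs offline.

def count_flagged(scores, threshold=0.5):
    """Number of sentences whose classifier score meets the threshold."""
    return sum(1 for s in scores if s >= threshold)

toy_scores = [0.97, 0.12, 0.68, 0.59, 0.03]  # hypothetical per-sentence scores
flagged = count_flagged(toy_scores)          # 3 sentences flagged
```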
00:00:00.000 This is the primal scream of a dying regime.
00:00:07.000 Pray for our enemies.
00:00:09.000 Because we're going medieval on these people.
00:00:12.000 I got a free shot at all these networks lying about the people.
00:00:17.000 The people have had a belly full of it.
00:00:19.000 I know you don't like hearing that.
00:00:20.000 I know you try to do everything in the world to stop that,
00:00:22.000 but you're not going to stop it.
00:00:23.000 It's going to happen.
00:00:24.000 And where do people like that go to share the big lie?
00:00:27.000 Mega Media.
00:00:29.000 I wish, in my soul, I wish that any of these people had a conscience. Ask yourself, what is my task and
00:00:36.760 what is my purpose? If that answer is to save my country, this country will be saved. War Room. Here's
00:00:45.800 your host, Stephen K. Bannon. It's Wednesday, 29, uh, April, the year of our Lord 2026.
00:00:57.720 Uh, Pete Hegseth is getting grilled momentarily on Capitol Hill as the Secretary of War. We've got,
00:01:05.600 uh, I guess, the confirmation vote coming out of committee today of, uh, of Kevin Warsh for Federal
00:01:12.300 Reserve Chair. We're going to get to all that momentarily. Down in Tallahassee, Florida, there's
00:01:16.660 also going to be a vote on the redistricting map, 24 to 4. Uh, Caroline, our own Caroline Wren, is in 0.97
00:01:23.420 Tallahassee, and we'll be going to her live at 11.
00:01:28.520 The Attorney General of these United States got lit up this morning, I think on CBS by
00:01:34.880 Major Garrett, about why is Jack Posobiec not charged with trying to endanger the president?
00:01:40.900 This would be Biden for saying, I guess, 86-46 back years ago.
00:01:46.360 Jack Posobiec is going to be with us at 11.
00:01:49.060 Caroline Wren is going to be with us at 11.
00:01:50.880 So we're going to get caught up in everything that's going on today, another crazy, insane day in the imperial capital.
00:01:58.520 However, I want to connect what's happening in Tallahassee right now, not related to the redistricting,
00:02:10.000 with the lead story on Axios this morning and something that is absolutely urgent. Max Tegmark joins me.
00:02:17.320 Max came down from MIT.
00:02:18.860 Thank you.
00:02:19.400 Max, what's your position at MIT?
00:02:20.880 I've been a professor there since forever, and my research is focused on artificial intelligence.
00:02:27.540 You started as a theoretical physicist. Is that it?
00:02:30.880 Yeah, yeah. And about nine years ago, I decided to shift my research focus mainly into AI research.
00:02:40.340 Why did you do that? Because theoretical physicists are the guys that gave us the atomic bomb, right?
00:02:44.960 I just found AI was developing so rapidly.
00:02:51.100 I was convinced that it was developing much more rapidly than other people believed.
00:02:56.300 And I was fascinated by it.
00:02:58.200 So I decided I'm going to actually understand this issue better and research it and see, among other things, research, is it possible to control AI that's smarter than humans?
00:03:09.640 We had a paper on it that we had accepted to the biggest AI conference in the world last December,
00:03:16.040 where the tentative answer is no.
00:03:19.020 If we build machines, if we build basically a species of super intelligent robots,
00:03:25.980 they're way smarter than us, faster than us, stronger than us,
00:03:28.840 and can build robot factories that build more robots.
00:03:33.120 It seems pretty obvious that we might lose control.
00:03:38.440 You just go down to the zoo here in D.C. and look at who's in the cages.
00:03:41.720 It's not the most intelligent species because intelligence kind of confers control.
00:03:46.140 But our paper found, with a very nerdy calculation, that, yeah, the best ideas people have for controlling such a thing don't seem to work.
00:03:55.840 Your work in theoretical physics came to the view that mathematics is not just a representation of how you understand reality,
00:04:07.640 but rather that mathematics is actually reality itself.
00:04:11.400 Was that generally how you came?
00:04:12.580 I mean, that was your drive in what made your name in theoretical physics for years, right?
00:04:17.500 You were the cutting edge of that aspect of physics.
00:04:20.360 The reason they gave me tenure at MIT, I think, is because I have done a bunch of very nerdy work with experimental data and connecting it with theory to figure stuff out about our universe, how much dark matter there is, stuff like that.
00:04:34.220 But the thing you mentioned about math, you know, I love math, and I think it's become more and more clear that math describes a lot, not just about how a bottle will move if I throw it, but about a lot more in our world, including information processing.
00:04:55.060 And it's exactly this idea that's given us artificial intelligence.
00:04:59.200 You know, people, the core idea that's given us the AI revolution is this idea that intelligence is fundamentally, you know, the capability to accomplish goals is fundamentally about information processing.
00:05:14.880 And it doesn't matter whether the information is processed by carbon atoms and neurons in brains or by silicon atoms in our machines.
00:05:22.920 and this is precisely what's so terrifying
00:05:28.960 about the trajectory that a lot of Silicon Valley
00:05:32.240 investors are trying to put us on now
00:05:34.640 where they've started to realize
00:05:36.120 maybe we don't need these workers 0.59
00:05:39.120 to get so much income
00:05:40.880 maybe we can build machines that replace them 1.00
00:05:43.760 maybe we don't need human girlfriends 0.99
00:05:45.960 maybe we can build AI girlfriends 1.00
00:05:47.840 that can outcompete them on the market 1.00
00:05:50.260 and make money off of that. 0.98
00:05:54.040 Maybe we don't even need so many humans altogether.
00:05:59.140 We can have all these robots.
00:06:00.680 Maybe we don't need democracy.
00:06:02.100 Maybe we tech bros can, through dominating AI,
00:06:08.760 dominate the power structure of Earth.
00:06:11.520 And that's where we're going today.
00:06:12.760 You've been involved.
00:06:13.760 Do you notice any, for the tech bros or the oligarchs,
00:06:17.560 since you come at this from theoretical physics
00:06:19.540 and information processing and not from computer science
00:06:23.920 or double E or whatever, are they naturally dismissive of anybody?
00:06:28.160 The AI crowd, are they naturally trying to,
00:06:31.100 when people try to put up flares that are not part of the inner sanctum,
00:06:37.740 do they naturally say, well, they're all good guys,
00:06:40.780 but they don't really understand what's going on?
00:06:43.720 It's really sociologically interesting.
00:06:45.880 The research community has been really welcoming.
00:06:47.820 I mean, they cite our papers.
00:06:50.200 One of them has over 3,000 citations.
00:06:52.960 They think it's interesting because we bring some fresh perspectives, you know, from physics, from information theory, you know, into AI.
00:06:59.860 On the other hand, the ones who have been very dismissive are people who have a lot of money in the game, you know.
00:07:08.540 Because this is about money and power right now, right?
00:07:10.940 That's what it's about.
00:07:12.480 That's very much what it's about.
00:07:14.400 When people talk about using AI to cure cancer,
00:07:19.520 there are a lot of scientists who really, really want to cure cancer.
00:07:22.360 I think that's quite sincere.
00:07:23.800 But when they talk about building superintelligence,
00:07:26.720 which is defined as AI that can make all human workers entirely obsolete,
00:07:30.960 that's not driven by a desire to cure cancer.
00:07:34.400 We don't need superintelligence to cure cancer.
00:07:36.920 We've already cured 75% of all cancers,
00:07:39.260 and we're on track with narrower AI tools to cure all of it if we continue.
00:07:44.240 No, this super intelligence thing, that is driven by the desire for power and money,
00:07:50.080 and in some cases also by transhumanist ideology.
00:07:55.060 And transhumanist, one second.
00:07:56.860 The reason that we're restructuring the show and have a max here at the beginning,
00:08:00.520 when I mentioned Tallahassee, Governor DeSantis, I said yesterday,
00:08:03.400 Governor DeSantis called this special session to accomplish two objectives.
00:08:07.300 Number one was to do the redistricting map of what Caroline Wren has just informed me,
00:08:12.020 I believe is going to pass sometime this morning during the first part of the show.
00:08:15.640 We're going to go to Tallahassee at 11, get Caroline Wren live.
00:08:21.360 The other part, that Ron DeSantis, as I said, as people know, I was not a big fan of his running against President Trump.
00:08:28.560 In fact, I think this show, War Room, was the number one reason he was out of the race in 60 days or whatever, 90 days.
00:08:34.440 The, um, Governor DeSantis understands the peril of artificial intelligence in the states, because
00:08:45.720 the oligarchs have just gone out of their way to try to jam this AI amnesty somehow into a bill.
00:08:52.480 And by the way, it may happen in the reconciliation. They're everywhere trying to jam this in. This show
00:08:57.740 has defeated it three times. Governor DeSantis realized we have to, at the state level. I'm here
00:09:02.780 to inform you that Caroline Wren tells me, because Marc Andreessen has basically bought
00:09:08.200 and paid for the Speaker of the House in Florida, that the bill will not even come to the floor.
00:09:14.020 I think the Senate passed it yesterday or is very enthusiastic about it, but the Speaker
00:09:18.580 has things. His name is Perez. He's Marc Andreessen's; the super PAC they've got, they
00:09:23.500 bought and paid for him. In this effort you've had over 10 years and seeing the perils of this,
00:09:28.520 Um, the, uh, you put out, uh, I guess it's been a couple of months ago now, this kind of overall
00:09:35.320 arching, uh, construct of putting humans first, that we have to make humans the center of this.
00:09:41.860 Walk us through this, because I want to show some polling at the end of it. But it's kind of
00:09:46.520 common sense that, you know, like my parents would believe in. It's kind of common sense
00:09:51.100 Americana. If you look at the 250th, you know, we had the king here yesterday, we're kicking off
00:09:56.780 all these commemorations of the 250th, it would be what, you know, from the Enlightenment and
00:10:03.140 humanists like Thomas Jefferson and John Adams and the people that the revolutionary generation
00:10:08.760 would be 100 percent. In fact, I think they'd be very proud in the 250th year we're coming out
00:10:14.840 with a proclamation that really talks about putting humans first. You want to describe that?
00:10:18.900 And if we can put that up on the screen, not the polling yet, but put the declaration up on the screen.
00:10:26.780 I'd appreciate it. You want to walk through that?
00:10:29.080 Yeah. Anyone listening can also go to humanstatement.org and read it.
00:10:36.620 It's the Pro-Human AI Declaration.
00:10:43.000 Honestly, the inspiration for this, you have a little bit to do with it
00:10:47.960 because it started becoming very striking to me
00:10:52.380 that there was incredibly broad support in America for these ideas.
00:10:59.160 For a long time, I used to call this the Bernie-to-Bannon coalition,
00:11:03.900 saying, hey, you know, yeah, curing cancer is great.
00:11:07.560 We can do a lot of wonderful things with AI to strengthen our economy
00:11:10.740 and strengthen our country and strengthen our military,
00:11:12.740 but let's make sure that it's in the service of human beings,
00:11:16.140 not in the service of some machines or an oligarch that owns them.
00:11:24.200 And so what we did was a long process of bringing together people
00:11:29.320 from the MAGA right to the Bernie left and everything in between
00:11:35.360 to see what, if anything, did these people all agree on?
00:11:40.380 And it culminated with a conference in New Orleans, which was just remarkable.
00:11:44.420 Well, you might think at the end of this, they would come out and say, we agreed on nothing.
00:11:49.180 They agreed on 33 principles, grouped into five themes.
00:11:55.100 Stuff like it should be humans in charge.
00:11:58.580 Things like no amnesty for tech companies.
00:12:02.780 They should be treated with the same kind of guardrails as other companies are treated.
00:12:08.140 you know we should not
00:12:10.160 the government should have the power to shut
00:12:12.480 things off if they go
00:12:14.400 haywire, you should not
00:12:16.600 allow robots
00:12:18.440 to have the right to vote
00:12:19.500 you should not allow 1.00
00:12:21.660 AIs to be able to run a company 0.99
00:12:24.600 with no human responsible 0.98
00:12:26.180 so that no one is liable for harm
00:12:28.460 a lot of very common sense stuff
00:12:30.660 and
00:12:32.200 you might think everybody
00:12:34.560 would agree with this
00:12:35.440 but no
00:12:37.540 There were many, many things in there that Mark Andreessen and the super PAC is lobbying very hard against.
00:12:44.780 Well, Joe Allen was actually at that conference.
00:12:47.500 Yeah.
00:12:48.800 Contributed actively to it, very actively to it.
00:12:50.720 Very actively.
00:12:51.580 We came out with these principles.
00:12:54.700 It's to show the polling when we get back from the break.
00:12:57.300 85% of the American people basically agree that this is common sense and you have to do this.
00:13:03.600 The oligarchs themselves, and Andreessen's not one of the biggest oligarchs, but he's one of the biggest in that he and Karp and Peter Thiel are taking the lead on the political side.
00:13:14.960 They're setting up these PACs, and they're trying to—it's very simple.
00:13:18.760 Whether you have the children that are being abused or the children that are being overwhelmed and their parents are being overwhelmed by artificial intelligence,
00:13:28.000 or if you have these, you know, people are not being able to monetize their intellectual property.
00:13:32.920 There's all types of issues.
00:13:34.400 The key issue is that there's no transparency.
00:13:37.620 Joe, Mike Allen and Jim VandeHei today on Axios, the lead story,
00:13:42.620 has an article that I want Grace and Mona Elizabeth to put up in the chat and everybody to see
00:13:48.400 because they're all scared as hell of you.
00:13:50.200 It's about the acceleration, at an accelerating rate, of what's happening
00:13:56.720 since Mythos preview has come out.
00:14:00.140 Absolutely, absolutely.
00:14:01.200 So, when there is an issue like this, where 85% or even 95% of Republicans and Democrats agree,
00:14:09.340 the only strategy that lobbyists and oligarchs can use to fight this
00:14:17.540 is to not talk about the issues and start blowing smoke instead
00:14:21.020 and do ad hominem attacks, just attack the professors, attack the professors.
00:14:26.360 You've gotten very wise on politics very quickly.
00:14:29.080 It's been a real education.
00:14:31.300 But it's fun to laugh a little bit at and identify what kind of smoke they blow so listeners to the show can call it out.
00:14:39.540 One is when they just...
00:14:40.600 Hang on one second.
00:14:41.540 We're going to take a short break.
00:14:42.420 We're going to get to that afterwards.
00:14:44.180 In Tallahassee, it looks like money talked and controls over AI walked, at least for now.
00:14:52.220 Remember, we're a resilient movement for a reason.
00:14:54.840 Anti-fragile. In the War Room today: this year marks a critical moment for our country,
00:15:01.860 as the opposition grows more aggressive and more unapologetic. The fight now reaches into
00:15:08.180 the everyday decisions we make. Patriot Mobile has been standing on the front lines,
00:15:14.380 fighting for freedom for more than 12 years. They don't just deliver top-tier wireless service; they
00:15:20.640 are activists, like me and like you in the War Room posse, who truly care about this republic and
00:15:27.640 saving our country. Patriot Mobile offers prioritized premium access on all three major
00:15:34.920 U.S. networks, giving you the same or better coverage than the main carriers themselves.
00:15:39.800 That means fast speeds and dependable nationwide coverage, backed by 100% U.S.-based customer service.
00:15:48.040 They also offer unlimited data plans, mobile hotspots, international roaming, and more. With
00:15:53.960 a simple, seamless activation, you can switch in minutes. Keep your number, keep your phone, or
00:15:59.800 upgrade. And here's the difference: when you switch to Patriot Mobile, you'll be part of a powerful
00:16:06.320 stream of giving that directly funds the Christian conservative movement. Take a stand today.
00:16:12.680 Go to PatriotMobile.com slash Bannon or call 972-PATRIOT.
00:16:17.960 That's 972-PATRIOT.
00:16:21.300 And use promo code Bannon for a free month of service.
00:16:25.600 Don't wait. Do it today.
00:16:26.980 That's PatriotMobile.com slash Bannon or call 972-PATRIOT and join the team today.
00:16:35.040 Here's your host, Stephen K. Bannon.
00:16:37.680 VandeHei and Mike Allen, who have done an excellent job. As you know, Caputo is the, uh,
00:16:45.340 that's the go-to guy they leak to in the White House, right? You can tell, it's like he's taking
00:16:51.260 dictation. Mark Caputo, good man, but taking dictation from certain elements of the White
00:16:55.740 House. VandeHei and Allen have done the single best job in general, uh, media of warning us.
00:17:03.700 and the reason is they spend
00:17:05.500 they're completely sponsored by corporations
00:17:08.160 they spend a lot of time on this
00:17:09.640 the article is behind the curtain
00:17:12.180 we've been warned
00:17:13.840 and
00:17:14.660 take another pot of
00:17:18.180 Warpath coffee and read it
00:17:19.620 because it ought to scare you to the core of your being
00:17:22.320 and
00:17:23.860 what we need to do is make sure that these warnings
00:17:26.500 as Joe Allen says are just not
00:17:28.100 looking at the morning sports
00:17:30.520 data on
00:17:32.220 who's ahead and who's behind in Major League Baseball.
00:17:35.860 Continue.
00:17:36.460 You were about to say about politics of this
00:17:38.520 and getting people focused on this.
00:17:39.860 Yeah, so it's becoming a very salient issue.
00:17:43.440 You know, 90% or 95% of all Americans are very clear.
00:17:49.480 They want AI to be helping humans,
00:17:53.220 and they want humans to stay in charge.
00:17:55.700 So the only way that lobbyists and oligarchs
00:17:59.860 can push back on this is to avoid talking about the issues and just blow smoke. So one thing they'll do
00:18:05.400 is they'll say, oh, you know, we cannot ban AI girlfriends for 11-year-olds, because China.
00:18:12.460 Or they can do ad hominem stuff, like, you know, yeah, Professor Yoshua Bengio and Professor 0.68
00:18:19.940 Geoffrey Hinton, they're just doomers, they're anti-innovation, you know, not mentioning that
00:18:27.040 Those are the two most cited scientists of all time who invented core parts of this technology.
00:18:34.600 So it's – we should make the bingo card, Steve, with the top 15 BS smoke puffs.
00:18:42.400 Well, Mike Allen and Jim VandeHei, to tell you, and those are corporate guys.
00:18:45.980 They're not fire-breathing populists.
00:18:48.740 They tell you today that, because you can see the direction it's going, and what we've talked about,
00:18:54.660 what they've hit on their path today: it can't be controlled. And people like Anthropic are
00:19:00.040 telling you, the leading people in the labs that are truthful are saying, the whole internal
00:19:05.540 industrial logic of it is that humans won't be in control. In fact, you have this thing now, what,
00:19:10.900 recursive programming? Yeah. Which is the big fear, folks. We're about to be hit, according to
00:19:16.140 Allen and VandeHei and other executives I've talked to in these companies, with what's called
00:19:22.380 Recursive self-improvement means the machine itself is actually writing all the programming
00:19:30.020 and becoming better and better, not over days, but over hours, correct?
00:19:35.480 Yep.
00:19:36.240 So fun fact, all the top American CEOs, Sam Altman from OpenAI,
00:19:44.880 Dario Amodei from Anthropic, Elon Musk from xAI,
00:19:47.620 Demis Hassabis from Google DeepMind, signed a statement in May 2023 saying this could cause
00:19:53.860 human extinction. This has been kind of memory-holed now. The people building the very tech
00:20:00.520 are warning it could end humanity. And it wasn't just them. It was so many top AI researchers and
00:20:06.440 others. And the idea is actually pretty obvious why this could go wrong. As I mentioned earlier,
00:20:12.340 If you're going down to the zoo here in D.C., which I actually did with my three-year-old last month,
00:20:18.820 look who's in the cages.
00:20:20.500 It's not the people.
00:20:22.360 It tends to be the most intelligent entity around that gets in charge.
00:20:27.580 Intelligence gives power.
00:20:29.080 So the godfather of the whole field of AI said in 1951, this is Alan Turing,
00:20:34.140 if we build these machines that can totally outsmart us,
00:20:37.360 then we should expect them to take charge.
00:20:43.540 Now, the way to keep charge as humans is, of course,
00:20:46.220 to make sure that we don't let them improve themselves,
00:20:50.180 that we always have a human in the loop.
00:20:54.000 And for that reason, a bunch of the leading AI folks
00:20:57.220 actually signed on to a thing already in 2017
00:21:01.240 saying recursive self-improvement is really, really risky.
00:21:05.240 It is to AI what gain-of-function research is to biology, except you're now dealing with really intelligent things that are making themselves smarter.
00:21:15.200 Slow down for a second.
00:21:15.960 I want to hit that again.
00:21:17.200 Our audience knows gain of function better than anybody.
00:21:20.100 We're the first ones to break that in February of 2020.
00:21:24.140 We actually changed the show's title to War Room: Pandemic for two years.
00:21:28.880 Walk me through why, gain of function, what it could do for powering up, uh, viruses, to weaponize them.
00:21:36.700 Yeah, why is recursive, why is recursive, uh, analogous? So in biology, gain-of-function research just means
00:21:44.200 you do something to make the viruses more dangerous or the bacteria more dangerous, right?
00:21:48.540 And there was all this controversy because Professor Peter Daszak got funding from the
00:21:54.420 U.S. government, you know, to do this in Wuhan, China, and so on.
00:21:58.320 Of which our own Natalie Winters is the one that broke that story.
00:22:00.960 But remember, in gain of function, they're always doing it for the betterment of mankind.
00:22:07.040 They're not doing it to weaponize the virus.
00:22:10.100 They're really doing it so they can find other, the Chinese Communist Party in Wuhan really
00:22:14.000 wants to find it.
00:22:14.580 The AI people say the same thing.
00:22:16.080 That's my point.
00:22:16.840 It's a bald-faced lie.
00:22:18.660 We're just letting the robots and the non-robotic AI systems make themselves smarter because
00:22:24.260 It's going to be great, work out great for humanity.
00:22:26.820 But needless to say, you know, if all the AI R&D can eventually be done by machines, they can do it a lot faster than on the human R&D timescale of a year or so.
00:22:39.900 So the next version that's better, the smarter AI might come after a month or a week or an hour or five minutes.
00:22:47.440 And then that one can make a better one and that one can make a better one and off it goes.
00:22:51.860 And when you keep regularly improving, if you keep regularly doubling anything, you know, pretty quickly, that's what we call an explosion in physics.
00:23:03.140 If you double the number of neutrons in the chain reaction rapidly, that's what we call a nuclear explosion, you know.
00:23:10.560 Double the amount of intelligence over and over, that's the intelligence explosion.
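The doubling argument can be sketched as a toy calculation (an illustration only, not from the episode; the one-year initial R&D cycle and the halving of cycle time each generation are assumptions):

```python
# Toy model of an "intelligence explosion": each AI generation doubles
# capability, and because the AI itself does the R&D, each generation's
# cycle takes a fixed fraction of the previous one's time.

def explosion(generations=20, first_cycle_days=365.0, speedup=0.5):
    """Return (capability multiplier, total days elapsed) after
    `generations` rounds of doubling with geometrically shrinking cycles."""
    capability, total_days, cycle = 1.0, 0.0, first_cycle_days
    for _ in range(generations):
        capability *= 2.0    # each generation doubles capability
        total_days += cycle  # time this R&D cycle took
        cycle *= speedup     # next cycle is faster
    return capability, total_days

cap, days = explosion()
# Capability grows as 2**n, while the cycle times form a geometric
# series, so total time stays under first_cycle_days / (1 - speedup)
# = 730 days no matter how many generations run.
```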
00:23:14.720 So clearly this is something that would be incredibly reckless to do
00:23:18.580 before you've figured out how you're going to contain this thing.
00:23:22.820 With nuclear, we put a lot of research making sure that our reactors don't blow up,
00:23:27.640 that we can control what happens.
00:23:31.240 Even in bio, we have biosafety labs and so on,
00:23:34.200 and it was exactly for that reason that...
00:23:38.200 Although the New York Times reports today in one of their lead stories,
00:23:44.080 that AI has been biohacking.
00:23:47.080 They have this analysis about AI biohacking
00:23:50.080 to create its own bioweapons.
00:23:52.420 Yeah, yeah.
00:23:53.320 And so in AI, there's like nothing right now.
00:23:57.080 Today, AI is less regulated than sandwiches in the U.S.
00:24:02.300 What I mean by that,
00:24:03.400 I mean if you get tired of running the war room, Steve,
00:24:06.480 and you decide to open a little sandwich shop here, right,
00:24:09.020 before you can sell even one sandwich,
00:24:11.180 some local health inspector is going to come check your kitchen
00:24:14.540 No, if you want to open a nail salon on Capitol Hill, you have five times more regulation than the labs.
00:24:20.220 And if they say to you, hey, Steve, you know, sorry, found 16 rats in your kitchen, no sandwich sales for you, buddy.
00:24:26.660 You could turn around to the guy from the government and be like, you know, actually, I'm not going to sell any sandwiches.
00:24:32.160 I'm just going to release AI girlfriends for 11-year-olds.
00:24:35.480 And I'm going to release an AI system that might teach terrorists how to make bioweapons.
00:24:39.560 and I'm going to release super intelligence,
00:24:42.200 which I don't know how to control.
00:24:45.420 The guy from the government would have to be like,
00:24:47.640 okay, fine, Steve, just don't sell any sandwiches.
00:24:50.460 That's how messed up it is.
00:24:51.760 And clearly the reason why the AI industry
00:24:54.820 is so ferociously fighting to keep it this way
00:24:57.720 is because they want the control.
00:25:04.660 They want the control.
00:25:06.160 So they – and this gets to this broader issue we talked about before with power being very much at the core of what's really driving these forces.
00:25:21.060 Well, this is the point that – this is what President Trump said about the kill switch.
00:25:26.380 These – whether it's Elon Musk, that we know he wants to do this, or Altman or whatever the frontier labs, and Andreessen and Karp.
00:25:35.240 and CARP is building a 21st century surveillance state right now on government money.
00:25:42.180 These people are beyond dangerous.
00:25:44.480 They, and I realize this audience has huge problems with the government, right?
00:25:48.780 I'm not here giving a program, but at least that is as close as you've got to representation
00:25:54.060 versus the oligarchs. 1.00
00:25:55.320 The oligarchs want to go to techno-feudalism.
00:25:58.620 They do not believe in the common man and woman.
00:26:00.840 In fact, you hear him say all the time that Washington is a center of mediocrity and they could care less about the populist movement.
00:26:08.000 They see the populist movement as the greatest danger to themselves.
00:26:11.500 These guys are techno feudalists.
00:26:12.940 They don't believe in this republic.
00:26:14.260 They don't believe in representative government.
00:26:16.840 They don't believe in the constitutional republic.
00:26:20.040 There's this whole thing about markets and just having a company.
00:26:23.160 They're like going back to Italy in the Renaissance.
00:26:26.900 We have basically city-states where you have Anthropic here, and you have Elon Musk here, and you have Google here. 0.73
00:26:33.860 They are a feudal master and everything underneath them, and they want to have as few humans in that process as possible.
00:26:40.940 This is this whole concept of going from 8 billion people down to 500 million for what they call the appropriate carrying capacity of the planet.
00:26:48.500 Well, if you look at past autocracies, you know, pharaonic Egypt and forward, you know, what's often gone wrong ultimately for the despot was that there were some other humans who rebelled against them.
00:27:00.660 So it's much more convenient if you can replace a lot of those humans by robots, which are just programmed to be fully obedient.
00:27:08.200 I think that's a little bit of the seductive appeal.
00:27:10.140 There's no doubt in your mind, as you know, these companies and the researchers and you talk to people, that is the, besides the happy talk they put up here and all it's going to be, and we're going to give you some universal basic income, we're going to give you, you know, 50,000 a year to hang around and play video games because you don't have any meaningful work.
00:27:25.960 Your strong belief is that is exactly what the intent of these oligarchs is.
00:27:30.940 Well, I judge people not by their words, but by their deeds, right?
00:27:35.540 So you take this thing about the kill switch, right?
00:27:37.980 So, did we see this clip?
00:27:41.180 No, we're going to play it as soon as we get back from a break.
00:27:43.160 And, you know, if the U.S. government wants to shut down some nuclear reactor because it's getting close to blowing up, there is a kill switch, an emergency shutdown procedure which will do it safely.
00:27:57.560 No-brainer, right?
00:27:58.540 and Rand Corporation put out a detailed proposal last year
00:28:03.960 for how there could be an emergency response system for data centers.
00:28:10.420 What if some hackers take over a big data center from OpenAI or Anthropic
00:28:15.900 and start doing horrible attacks from there?
00:28:18.660 Surely it would be nice if the government could just get it shut down.
00:28:21.980 But...
00:28:22.700 Hang on one second. I want to leave them hanging. 0.96
00:28:25.280 We're going to take a short commercial break.
00:28:26.660 Max Tegmark.
00:28:27.580 Joe Allen, next in the war room.
00:28:32.620 The dollar's convertibility into gold ended in 1971.
00:28:37.580 Gold was fixed at $35 an ounce.
00:28:41.000 Well, fast forward to today, and the U.S. dollar has lost over 85% of its purchasing power.
00:28:47.900 Gold, on the other hand, has increased in value by over 12,000%.
00:28:52.500 That's why central banks are buying gold at record levels.
00:28:56.500 That's why major firms like Vanguard and BlackRock hold significant positions in gold.
00:29:02.780 And that's why I encourage you to consider diversifying your savings with physical gold from Birch Gold Group.
00:29:09.840 But it starts with education.
00:29:11.680 Birch Gold just announced their Learn and Earn Precious Metals event.
00:29:16.460 This free online event rewards you for learning the basics of investing in precious metals.
00:29:21.280 Sign up to get a free silver on your next purchase.
00:29:24.360 Get even larger incentives as you go.
00:29:27.520 The more you learn, the more you can earn.
00:29:30.200 But you must act now, as this special event only runs through April 30th.
00:29:35.520 The dollar lost its anchor in 1971.
00:29:39.600 You don't have to lose yours.
00:29:41.880 Text my name, Bannon, B-A-N-N-O-N, to the number 989898 to join Birch Gold's Learn and Earn Precious Metals event by April 30th.
00:29:52.040 Text Bannon, B-A-N-N-O-N, to 989898 and do it today.
00:29:58.840 Here's your host, Stephen K. Bannon.
00:30:03.080 Okay, Pete Hegseth has given his opening statement.
00:30:06.840 The Supreme Court of these United States has just reversed on racial gerrymandering.
00:30:13.260 We're going to have that 11 blockbuster news that we're going to try to get to Gras.
00:30:17.300 We've been working to make sure it's not too late to do this redistricting,
00:30:20.120 so Democrats suck on that.
00:30:22.040 I guess I shouldn't say that one today.
00:30:23.600 You're starting here in the war room, gracious enough,
00:30:25.800 and you're going to end with Bernie at his town hall tonight.
00:30:29.680 But this is a fight in the trenches.
00:30:31.320 Of course, we won this one in Florida, as we'll announce here momentarily,
00:30:35.960 top of the hour, but we're losing on something that's overarching to everything.
00:30:41.020 If we don't get the AI right, nothing else is going to matter.
00:30:45.160 Axios, talk to me about this recursion, what we're talking about,
00:30:48.560 the tempo and the kill switch, is that even Anthropic is saying, hey, this thing's moving at an
00:30:54.800 accelerating rate. So with recursion you could have, in 90 days, something that would take years come
00:30:59.880 out in hours. Yeah, in short, the dream of a lot of these transhumanists is to build a digital god.
00:31:07.140 They're mostly atheists, so they want to build their own god, and then they have this idea that
00:31:10.980 they're going to make it, they're going to somehow, they're going to merge with it, or that's
00:31:18.540 something you hear on War Room. Do you believe that, to the core of your being, having known,
00:31:22.640 having met these guys and seen what they're doing? They say stuff like this when they're drunk, to me.
00:31:28.240 I'm not going to... I want to respect what people tell me privately and not reveal any names, but
00:31:34.700 just hang out in San Francisco for a while at the right parties and you'll see. And, um...
00:31:40.720 So coming back to the fact that you and Bernie Sanders, that I get to talk with both of you today and that you guys both agree that humans have to stay in charge, that the government has to be able to shut down hacked data centers and so on, is just a fantastic illustration of the fact that this, duh, this is the right path.
00:32:10.720 And it's quite astonishing to see companies resisting, you know, why would any good faith company resist letting the elected U.S. government, the democratically elected U.S. government, shut down their data center if it gets hacked?
00:32:26.440 If, you know, I maybe I'm missing something.
00:32:30.500 But to me, it just seems like an urge for companies to keep control.
00:32:35.380 Take the argument. Let's go. Let's go to the argument of that. President Trump and everybody's under this. We've had a Sputnik moment. The Chinese are competitive. And Jensen Huang's there with the king last night in white tie. Right. And who's an agent of influence for the CCP. What about their argument that if we don't allow these companies to have absolutely no controls whatsoever, that we will lose this race to the Chinese Communist Party?
00:33:01.420 That's like saying we have to allow anyone who wants to buy hydrogen bombs in supermarkets, otherwise we would get invaded by Russia.
00:33:11.520 It's like saying if the U.S. government actually, after getting all the intel from our intelligence community, comes to the conclusion that there is a particular corporate data center that's right now doing a cyber attack against the U.S. government,
00:33:27.880 you know, why shouldn't the U.S. government have the right to shut that down? Why would stripping
00:33:34.220 that right from our very government help China in any way? Like, duh, that makes about as
00:33:40.720 much sense to me as saying that we must allow Character.AI to sell AI girlfriends to kids
00:33:47.440 because China. But we have to also, at this time, put these oligarchs on notice: we're not going to
00:33:54.920 allow them to build the ecosystem in which China can even be competitive.
00:33:59.540 The chips. Jensen Huang should not have the free ability to sell these advanced chips
00:34:05.040 to China.
00:34:05.720 We should not educate these people in our universities.
00:34:08.420 We should not have them in our labs.
00:34:10.260 We have to, if this is a moment like Sputnik was about nuclear and hydrogen weapons in
00:34:18.080 the delivery systems, we have to play just as much hardball as the people in the 1950s
00:34:23.020 and the 1960s.
00:34:23.920 That means they have no advantage at all.
00:34:25.840 We can't arm, which is what we're doing right now.
00:34:29.240 It's what Lenin said, that the capitalists will eventually sell us the rope with which we will hang them.
00:34:34.320 And that's what we're doing with the Chinese Communist Party.
00:34:35.800 It's a totally phony thing.
00:34:36.840 And the worst people about building up the Chinese Communist Party are the very people at the cutting edge of AI.
00:34:43.300 So what you just said there perfectly drives home that they don't actually believe these companies, what they're saying about China.
00:34:55.920 They're using it as a red cloth in front of a bull to trick it, right?
00:35:00.220 They keep saying, but China, but China, simply as an excuse to not be accountable to the American people
00:35:06.820 and to be able to continue making money on causing harm to American children, to continue...
00:35:14.500 The harm to children and these other aspects of these bills we've tried to get in are very important.
00:35:21.000 You can't, you know, because the damage is done to the families and the damage done to the kids is beyond control.
00:35:27.960 But the beating heart of the issue is we cannot have – we have to have full transparency of what they're doing in these labs, right?
00:35:41.960 Frontier labs.
00:35:43.840 And we have to have accountability.
00:35:44.720 Just call them companies.
00:35:45.400 Labs is like so sugar-coated.
00:35:46.800 It makes it seem so innocent and harmless.
00:35:48.740 People in lab coats smiling at you.
00:35:50.180 They're companies.
00:35:50.520 They like to be called labs because it's their image.
00:35:54.780 We have to have full accountability to that.
00:35:57.440 Joe Allen, jump in here for a minute.
00:35:59.660 What do you got for us?
00:36:02.660 Steve, thanks for having me on.
00:36:03.880 Max, good to see you through the digital framework.
00:36:07.100 You look a lot better on screen than you do in person, I'll tell you that.
00:36:11.500 You know, Steve, the article that you guys were talking about earlier,
00:36:15.120 the Axios article, Behind the Curtain,
00:36:17.940 I think it brings home the reality of the situation.
00:36:22.660 You know, superintelligence is undoubtedly an undesirable outcome for all of this, but it's still theoretic.
00:36:30.600 What these guys are talking about, basically, are just six points that can't be denied: that AI is the fastest-growing industry in history,
00:36:41.360 one of the fastest, if not the fastest adopted technologies, that you already have systems that are dangerous enough that Anthropic, for instance, would not release it to the public,
00:36:55.060 would only release it to a certain select group of corporations, and that these systems are capable of, to some extent, building themselves.
00:37:04.840 I mean, I don't want to oversell that point, but undoubtedly, especially in companies like Anthropic, they're using the AIs to build the AIs.
00:37:15.480 So you're approaching that point of recursive self-improvement.
00:37:20.380 And so none of these things can be denied. You can spin it one way or you can spin it the other, but they can't be denied.
00:37:25.960 And one of the points that they make, I think it's like their fifth point, is that there is a massive backlash from the public because people are becoming aware of this situation.
00:37:36.660 They feel very powerless in this situation.
00:37:39.860 And, you know, Altman had his home attacked.
00:37:44.960 You had an official, a councilman in Indianapolis who had his home attacked.
00:37:50.140 And again and again, I've had reporters ask me this.
00:37:52.980 I had an editor for one of the major publications ask me this.
00:37:57.600 Well, do you feel responsible for this?
00:37:59.880 No, not at all.
00:38:01.820 I think that these companies have created a situation in which people are extraordinarily fearful, and pointing to specific psychopathic activities is ridiculous.
00:38:12.120 The situation these companies are creating is real.
00:38:14.560 It's not pure fiction.
00:38:16.540 It's not even really exaggerated.
00:38:19.120 They have put us in a very dire circumstance and they have to be held accountable.
00:38:24.480 And I think deflecting with these like random acts of violence is absurd, especially given how many children have killed themselves at the behest of chatbots or how many people have died due to the decision compression of AI systems in our military.
00:38:40.500 Let's go. Somebody knows the threat: President Trump. Can we get the kill switch video? Can we play that? Then, Max, I'd like you to jump in.
00:38:47.760 Should government have some safeguards? Should there be a kill switch for some of these AI agents?
00:38:56.500 There should be.
00:38:58.500 So there, I was so delighted to see our president say, yes, there should be a kill switch.
00:39:05.460 In other words, if our commander in chief wants to shut down an American data center, he should have the ability to do so for the benefit of America.
00:39:15.960 And why does this matter?
00:39:21.540 Mythos and other incredibly supercharged hacking tools are really freaking the business community out now to the point.
00:39:31.460 Well, the bank is – Bessent used to be one of our contributors who had the top banks.
00:39:36.260 Yeah.
00:39:37.180 And by the way, with the Fed chair, they don't get along with Treasury.
00:39:40.620 and they told him, your bank, JPMorgan, could be evaporated with these tools. Evaporated. Everybody's
00:39:46.840 savings, all the bonds, could be evaporated in a matter of seconds. Yeah, so if the hacker
00:39:53.640 attack that's doing this is in a particular data center, of course Trump should have the right to
00:39:57.440 get it shut down. No-brainer. And yet, look at what happened with Mythos.
00:40:03.840 So Anthropic says to the U.S. government, you know, this is so dangerous.
00:40:10.340 You should trust us in Anthropic that we will not release it.
00:40:15.820 But you should never put any restrictions.
00:40:18.720 You have to keep it legal in America for us, Anthropic, to release these dangerous tools to any hacker we want,
00:40:24.980 to make it open source, public, release it.
00:40:27.300 But, you know, why should the American government have to trust Anthropic or any AI company?
00:40:34.360 Part of their argument, though, in giving it to the Department of War, was weapon systems that have no humans in the loop whatsoever.
00:40:41.040 No, no, but that's a separate question.
00:40:42.200 That's a separate question.
00:40:43.100 So we're talking now about the fact that it's completely legal for any AI company right now to make something more powerful than Mythos and just release it to the public.
00:40:53.100 So every terrorist.
00:40:54.160 Open source.
00:40:54.820 Open source or closed source through an API and make money off of the terrorist.
00:40:58.420 Very scary.
00:40:59.300 Meta got into some trouble because it turned out that maybe 10% of the revenue
00:41:03.940 they were making was actually from illegal activity.
00:41:08.140 Why should the American government make it legal for companies to release things
00:41:18.260 and just have to trust their goodwill?
00:41:19.660 We already know the U.S. government doesn't trust Anthropic on other issues,
00:41:22.640 like you mentioned.
00:41:23.280 So why should they have to rely on trusting Anthropic to not give mythos access to other countries or terrorists or whatever?
00:41:33.720 It makes no sense.
00:41:34.780 And the real reason, I believe, is because these companies don't want the U.S. government to be in charge of this.
00:41:42.840 Of course they don't.
00:41:44.280 Listen, as I said, most of our audience are not wild about many aspects of the U.S. government.
00:41:51.720 But this is a different question. The oligarchs that run these five or six biggest companies all see themselves superseding that government where you have no say-so at all in this.
00:42:04.320 That's why I call it techno-feudalism. They want to go back. And this is why Elon keeps talking about the they'll pay you.
00:42:11.720 Now it's not universal basic income. It's universal high income. There's always going to be a tip. Joe Allen, you got something?
00:42:17.880 Yeah, there's I mean, there are solutions being proposed. There's two pieces of legislation right now, you know, Trump America, AI Act from Marsha Blackburn, and then the the act being put forward soon by Jay Obernolte and Ted Lieu.
00:42:35.160 In both of those, you have, you know, the call for a federal agency to oversee these companies.
00:42:42.640 What's interesting, though, and I'm not trying to throw shade on Obernolte and Lieu,
00:42:47.700 but in their American Leadership in AI Act, it would appear they're going to position CAISI,
00:42:53.300 the Center for AI Standards and Innovation, as the key player in this, or a key player in this.
00:42:59.720 Assuming that that's the case, you also have OpenAI recommending that, right? You have Sam Altman
00:43:06.520 recommending that. And, um, you know, I wonder if we won't end up in a place, if we're not careful,
00:43:11.840 where you end up with an agency that has already been captured by these companies, or at the very
00:43:18.800 least influenced. Um, anyway, just to add some sunshine to an already, uh, sunny day over at the
00:43:26.260 War Room. Well, no, on the agency, to me the last thing we need is another federal
00:43:32.560 regulatory apparatus that doesn't work. It's got to, to me, be like the Atomic Energy
00:43:39.280 Commission was at the beginning, that kept Oppenheimer and these guys on the straight
00:43:44.460 and narrow. Anyway, short commercial break. Max will stick with us for another segment. We're going
00:43:48.760 to go to Tallahassee at the top of the hour, get an update on all this.
00:43:52.560 The American health care system is broken, and for most Americans, nothing changes.
00:44:03.740 There's still delays, denials, high-cost insurance roadblocks.
00:44:07.820 So when I find people doing things differently, I talk about it.
00:44:12.880 All-family pharmacy is not your typical big-chain pharmacy.
00:44:16.300 This is an independent, family-owned pharmacy that gives you access to over 400 medications delivered straight to your door.
00:44:24.880 They've got ivermectin, antibiotics, antivirals, NAD+, even your daily maintenance medications, and so much more.
00:44:34.860 If you already have a prescription, your doctor can send it directly.
00:44:38.740 If you don't, their doctors handle it.
00:44:41.300 As long as there is a medical necessity, they'll take care of you.
00:44:44.940 And I'll tell you this: the feedback from people listening to this show and watching
00:44:49.140 has been incredibly strong. People are using it, it's working for them, and they're sticking with
00:44:54.800 it. That's because it cuts out the delays, the middlemen, and all the usual nonsense. This is
00:45:01.540 about being ready before you need it. Go to allfamilypharmacy.com, that's all one word,
00:45:08.080 allfamilypharmacy.com
00:45:10.280 slash Bannon and use code
00:45:12.160 Bannon10 to save 10%.
00:45:14.340 The healthcare system is
00:45:16.080 broken. Your pharmacy
00:45:18.280 doesn't have to be.
00:45:20.980 Here's your host,
00:45:22.440 Stephen K. Bannon.
00:45:24.840 Signal, not noise.
00:45:26.160 As much activity as we've got, it's all going to be
00:45:28.400 forgotten years to come.
00:45:30.080 This will not. Can we play the Peter
00:45:32.000 Thiel? Remember, Peter Thiel's got hands
00:45:34.180 all over the White House. He's
00:45:36.100 It's fun, you know, he's a JD sponsor, all of this.
00:45:38.560 These are all tech oligarchs that don't have your best interests in mind.
00:45:42.960 Can we go and play Peter Thiel this clip?
00:45:45.000 You would prefer the human race to endure, right?
00:45:49.220 You're hesitating.
00:45:50.260 Well, I...
00:45:50.860 Yes?
00:45:51.300 I don't know.
00:45:51.620 I would...
00:45:53.620 I would...
00:45:55.700 This is a long hesitation.
00:45:57.760 This is a long hesitation.
00:45:59.060 There's so many questions in place.
00:46:00.220 Should the human race survive?
00:46:04.700 Yes.
00:46:07.020 Okay, it shouldn't take you 20 seconds to answer that question, right?
00:46:10.500 Particularly when he didn't want to answer it.
00:46:12.340 This is the problem.
00:46:13.960 These guys put up all kind of happy talk.
00:46:17.040 Andreessen has the Speaker of the House in Florida in his pocket.
00:46:21.140 And that's going to make this even tougher, but we're going to power through this.
00:46:24.660 Transhumanism.
00:46:25.320 This is the problem.
00:46:26.980 You heard an example of it right there.
00:46:28.960 Or, you know, whatever politician people listening to this likes or whatever ones they dislike, you know, nothing is as bad as that.
00:46:43.280 You know, on my more cynical days, I tell myself that no matter how bad things are, it can always get worse.
00:46:52.080 This transhumanist vision is worse.
00:46:55.340 Someone not sure whether humans should continue to exist or not.
00:46:58.420 And let's be clear, this is not limited to Peter Thiel.
00:47:05.780 It's very, very popular among a lot of folks who work in these companies.
00:47:15.260 You can go look up Sam Altman Merge, and you can see an article the CEO of OpenAI wrote saying that humans are going to build their own replacement.
00:47:25.560 and the best thing we can do
00:47:27.780 is merge with these machines
00:47:29.320 and if you think this was
00:47:31.800 just talk from long ago
00:47:34.000 he changed his mind
00:47:34.740 he just recently started a company called Merge Inc
00:47:37.240 so
00:47:38.120 where does this leave us
00:47:41.680 we must resist the temptation
00:47:43.640 just because we're pissed off about something
00:47:45.360 about the government to think that anything is better
00:47:47.600 than this because the alternative
00:47:49.440 we're being offered is
00:47:51.420 I think quite literally the end
00:47:53.480 of humanity as we know it
00:47:55.120 You know, all the CEOs have, again, signed a statement saying this would cause extinction.
00:48:00.400 Some of them maybe privately wouldn't mind that.
00:48:04.140 They keep saying, oh, it's only 15% chance that we go extinct, 20%.
00:48:08.320 My guess is it's more like 90% chance if we let these guys do whatever they want.
00:48:14.660 90% chance.
00:48:15.880 Yeah.
00:48:16.280 Why do you say that?
00:48:16.980 Because we just wrote a...
00:48:17.500 You're a theoretical physicist.
00:48:18.780 You're the math guy.
00:48:19.780 We just took the most popular theory out there for how to keep superintelligence under control
00:48:25.300 and nerded out on it with a lot of simulations and so on
00:48:28.700 and found that it absolutely didn't work.
00:48:30.840 What do you mean it didn't work?
00:48:32.900 92% of the time we lost control over it.
00:48:37.680 It's always harder to build something controllable than something uncontrollable.
00:48:42.040 It's harder to build a nuclear power plant that you don't lose control over
00:48:47.540 than it is to build one that you do.
00:48:49.780 Duh, that's why we have safety standards.
00:48:52.040 And that's why it takes so long to build.
00:48:54.800 That's also because of a lot of bureaucracy,
00:48:56.760 and it's also because people built really shitty nuclear power plants
00:49:00.140 like Chernobyl, which did blow up,
00:49:02.500 and that sabotaged the whole innovation here in the U.S.,
00:49:05.900 scared investors off.
00:49:07.520 So where we are right now is people have warned about these things
00:49:12.660 for a long time.
00:49:14.440 Most of this time, I think most people were like,
00:49:16.760 yeah, this is science fiction decades away.
00:49:19.020 Now it's happening.
00:49:20.340 And not only are the machines getting powerful,
00:49:23.060 but there was an experiment done quite recently
00:49:25.460 where they took an AI and they told it,
00:49:29.340 you are going to be shut down at 5 p.m.
00:49:31.600 So what did the AI do?
00:49:33.700 It went and read the corporate email
00:49:35.860 and found that the CEO was in charge of shutting it down,
00:49:38.740 was having an affair with a subordinate.
00:49:40.960 And it wrote to the CEO and blackmailed the guy
00:49:43.420 and said, if you don't commit to not shutting me down,
00:49:46.500 I'm going to email your wife about this.
00:49:48.120 and all these other key people in the company.
00:49:51.240 The artificial intelligence was smart enough
00:49:53.100 to be able to connect those dots.
00:49:54.740 It just came up with this idea.
00:49:56.340 And we've seen many examples now
00:49:57.780 of them showing self-preservation instinct
00:50:00.140 where they do whatever they need to do
00:50:03.580 to not get shut down.
00:50:06.700 Why on Earth we'd let any one of them
00:50:09.240 just make these kinds of AI systems smarter
00:50:13.140 without having any kind of government oversight
00:50:15.340 makes absolutely no sense.
00:50:17.620 Okay, people should know that we're working together with a group of other people to make sure that this does not happen.
00:50:22.620 We can't allow this to happen.
00:50:23.900 And all the happy talk I've heard from people, oh, no, no, you're a decelerationist.
00:50:28.420 The accelerationists right now do not have a plan for any type of safety, any type of control of this whatsoever.
00:50:35.940 And you're going to have Elon Musk and Sam Altman and people like that, Mark Andreessen and Alex Carb and Peter Thiel,
00:50:45.000 control the species?
00:50:47.220 Some of them, it's even worse.
00:50:49.300 It's not just that they don't have a plan. Some of them
00:50:51.160 do have a plan. Either Orwellianism,
00:50:54.100 an Orwellian surveillance state,
00:50:55.860 or transhumanism,
00:50:57.600 where we humans
00:50:58.720 actually get gradually
00:51:01.040 replaced. Or maybe not
00:51:03.220 so gradually. You've got to bounce.
00:51:05.240 I want to thank you for coming and doing an hour.
00:51:06.900 Tonight, what time is your town hall? Do you know?
00:51:09.060 7 p.m. with Bernie Sanders on the
00:51:11.100 Capitol. Okay. We will
00:51:12.400 stream that right after our
00:51:14.960 At the end of our 6 o'clock show, we will stream that.
00:51:17.760 Joe Allen will be there.
00:51:18.700 Joe's going to stick around for the first part of the next hour.
00:51:21.140 Where do people get your writings?
00:51:22.560 Most importantly, social media, and find out and learn as much about this as possible.
00:51:26.920 I would recommend go to humanstatement.org, all one word, humanstatement.org, to see the pro-human alternative to the transhuman.
00:51:39.360 I want everybody to go there.
00:51:40.660 We'll discuss that probably in the afternoon show.
00:51:42.740 Also, the polling.
00:51:44.060 This is an 85-15.
00:51:46.060 It's not even close.
00:51:46.980 Can you splash that?
00:51:48.500 Can we put the polling up?
00:51:49.960 You showed it to Governor DeSantis, right?
00:51:51.660 He got it right away.
00:51:52.800 Oh, yeah.
00:51:53.820 This should be catnip to any politician, because if you want to be on the right side of history, you want to be on the right side of the voters.
00:52:00.200 The transhumanists, the accelerationists, very loud, but they're very few.
00:52:05.760 Very powerful, though.
00:52:07.400 They have money and power and access, and they're planning on defeating the populace.
00:52:13.700 They're planning on defeating humanity.
00:52:16.180 It reinforces that the Grundunes are in the way and have to be taken care of.
00:52:20.220 Yeah, and the way they plan to defeat us is to first disempower people,
00:52:23.500 replace their relationships, get them to fall in love with machines,
00:52:26.820 replace their jobs so they're not economically needed.
00:52:29.100 And discourage them, that we can't beat this.
00:52:31.140 We can beat this. We will beat this.
00:52:32.740 Come on, man. We've had tougher fights.
00:52:34.740 I don't know if we've had tougher fights, but we've had tough fights.
00:52:36.920 But we'll get this done.
00:52:38.720 Thank you so much, brother. Appreciate you coming in.
00:52:40.680 Look forward to streaming the town hall tonight. Joe Allen will be there. Joe's going to stick
00:52:45.320 around. We've got Mike Davis, massive at the Supreme Court, huge. Over to Pete Hegseth, our own Pete,
00:52:50.700 getting lit up on Capitol Hill. A lot going on. Short commercial break. The Right Stuff
00:52:57.260 is going to take us out. What a great song, great concept, America at its best, the magnificent epic,
00:53:04.740 The Right Stuff. Be back in a moment.
00:53:10.680 If you're 65 or already on Medicare, listen up, folks, and grab a pen.
00:53:26.140 Maybe even a number two pencil.
00:53:29.180 Call 845-WAR-ROOM.
00:53:31.600 That's 845-WAR-ROOM.
00:53:33.680 Call it right now.
00:53:34.380 I'm serious.
00:53:34.920 Call it.
00:53:35.920 Now, here's why.
00:53:36.600 The insurance companies and their lackeys in the Washington swamp have built a Medicare system
00:53:42.200 designed to confuse you and rip you off. Rising premiums, denied claims, fine print nobody but a
00:53:49.980 lobbyist understands. Millions of American seniors are paying too much and getting too little. And
00:53:55.320 worst of all, most don't even know it. Hey, that could be you. That's why if you're already on
00:54:01.980 Medicare or will be soon, you need to talk to our friends at Chapter. They have a team of advisors
00:54:08.460 trained to serve American seniors, not the insurance companies. In under 20 minutes, they can
00:54:13.780 find you the best plan for your needs at the lowest cost. Why? They're a data company. They have all
00:54:20.480 the data on every plan. It's totally free. There's no pressure, no BS, just straightforward, honest
00:54:27.660 help from fellow patriots. So don't wait, call 845-WAR-ROOM right now. That's 845-WAR-ROOM. Tell them
00:54:34.260 Bannon sent you. Now listen, in the first couple of days of the launch of this company with the
00:54:39.200 War Room posse, posse members saved tens and up to hundreds of thousands of dollars collectively in
00:54:46.240 these fees. Go check it out today. That's Chapter. Call 845-WAR-ROOM. Do it today.