Bannon's War Room - April 15, 2026


Episode 5302: Rise Of The Digital Mind


Episode Stats

Length: 55 minutes
Words per minute: 157.8
Word count: 8,740
Sentence count: 460
Harmful content: hate speech, 2 sentences flagged

Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
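For readers curious how per-sentence flags like the "2 sentences flagged" count above are typically produced, here is a minimal sketch using the Hugging Face `transformers` pipeline with the model named above. This is an assumption-laden illustration: the page's actual pipeline, sentence splitter, and flagging threshold are not shown and may differ.

```python
# Minimal sketch: score transcript sentences with the hate-speech classifier
# named above. Assumes the `transformers` library is installed; the page's
# actual pipeline, thresholds, and sentence splitting are not shown.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="facebook/roberta-hate-speech-dynabench-r4-target",
)

sentences = [
    "Pray for our enemies.",
    "This is a kind of brain for the world.",
]

for sent in sentences:
    result = classifier(sent)[0]      # e.g. {'label': ..., 'score': ...}
    if result["label"] == "hate":     # this model's labels: 'hate' / 'nothate'
        print(f"FLAGGED ({result['score']:.2f}): {sent}")
```

A flagged-sentence count like the one in the stats block would then just be the number of sentences whose top label is "hate".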
00:00:00.000 This is the primal scream of a dying regime. Pray for our enemies, because we're going medieval on
00:00:10.800 these people. Here's not got a free shot, all these networks lying about the people. The people have
00:00:17.400 had a belly full of it. I know you don't like hearing that. I know you try to do everything in
00:00:21.240 the world to stop that, but you're not gonna stop it. It's going to happen. And where do people like
00:00:24.900 that go to share the big lie? MAGA Media. I wish in my soul, I wish that any of these people
00:00:32.560 had a conscience. Ask yourself, what is my task and what is my purpose? If that answer is to save
00:00:40.220 my country, this country will be saved. War Room. Here's your host, Stephen K. Bannon.
00:00:47.600 We are building something profound. This is a kind of brain for the world.
00:00:56.480 It'll be personal, adaptable, it'll be easy to use. It'll give people incredible superpowers that were
00:01:01.760 sort of science fiction only a couple of years ago. Okay, so we believe as an industry that in the
00:01:08.240 next one year, the vast majority of programmers will be replaced by AI programmers. It probably
00:01:15.840 would look like what you might consider a very attractive extraterrestrial.
00:01:22.760 What I think an even more meaningful impact in our lives is going to come from everyone
00:01:26.740 having a personal superintelligence that helps you achieve your goals, create what you want
00:01:31.560 to see in the world, be a better friend, and grow to become the person that you aspire
00:01:35.700 to be.
00:01:36.740 Insects or slimy reptilians or eyes so big that you could fall into them.
00:01:42.600 One way to say this is that within three to five years, we'll have what is called General Intelligence, AGI, which can be defined as a system that is as smart as the smartest mathematician, physicist, you know, artist, writer, thinker, politician.
00:01:59.380 The people of OpenAI, Google, Meta and countless startups are proving once again that America is impossible. You are just impossible to beat.
00:02:12.120 I want to talk about our new effort, Meta Superintelligence Labs, and our vision to build personal superintelligence for everyone.
00:02:19.720 The only real extraterrestrials have a body similar to human body.
00:02:25.120 So now we're starting to look ahead to superintelligence, and even more than before, our focus must be on wide and fair access.
00:02:32.360 What happens when every single one of us has the equivalent of the smartest human on every problem in our pocket?
00:02:40.200 I think that personal devices like glasses, that can see what we see, hear what we hear,
00:02:45.700 and interact with us throughout the day, are going to become our main computing devices.
00:02:50.820 In the next year or two, this foundation is being locked in, and we're not going to stop
00:02:56.960 it.
00:02:57.960 It gets much more interesting after that, because remember, the computers are now doing
00:03:02.440 self-improvement.
00:03:04.160 They're learning how to plan, and they don't have to listen to us anymore.
00:03:09.260 We call that superintelligence, or ASI, artificial superintelligence.
00:03:13.340 And this is the theory that there will be computers that are smarter than the sum of humans.
00:03:19.900 The San Francisco consensus is this occurs within six years.
00:03:24.460 When the end of that occurs, then they have reached a condition of having overcome human behavior,
00:03:30.040 human thinking, human desires, desiring only to be in the kingdom of heaven,
00:03:36.300 in the evolutionary level above human.
00:03:38.240 This is a kind of brain for the world.
00:03:40.780 It'll be personal, adaptable, it'll be easy to use.
00:03:43.160 It'll give people incredible superpowers that were sort of science fiction only a couple of years ago.
00:03:52.540 Good morning. It is Wednesday, April 15th in the year of our Lord, 2026.
00:04:04.620 I am Joe Allen sitting in for Stephen K. Bannon.
00:04:08.240 I've been going over a lot of the polling in the recent weeks, polling from Gallup, polling from Pew, and the Stanford AI Index.
00:04:20.040 What they are finding, to no one's surprise, is that the public is not only not excited about the AI revolution,
00:04:30.720 a revolution, as we have just heard, intended to first replicate the human mind
00:04:37.000 and then replace as many human beings as possible with these non-human minds.
00:04:42.680 The public, according to polling from Pew, 50 percent of people, a mere 50 percent, are in any way excited about AI.
00:04:57.080 Now, this is, apologies, apologies, a mere 24%.
00:05:01.440 50%, on the other hand, are quite concerned about artificial intelligence.
00:05:07.040 And 50% are convinced, and I think rightly so, that artificial intelligence will in fact
00:05:14.040 worsen creativity in children, in adults.
00:05:18.340 It will in fact worsen human relationships.
00:05:23.280 Insofar as the younger generation is concerned, a recent Gallup poll found that the number
00:05:31.260 of Zoomers, Gen Z, kids from the age 16 to 29, there were 27% last year who thought
00:05:42.800 that AI would bring about a better world.
00:05:45.580 They were hopeful about AI.
00:05:47.460 That is down to 17% now.
00:05:50.380 It's oftentimes said that boomers, the reason that the older generation isn't in any way excited about AI and is quite a bit concerned is because they don't really use it.
00:06:01.860 They don't have experience with it.
00:06:03.040 Well, Gen Z is more than immersed in AI.
00:06:06.400 They have, for most of their conscious or early adult life, used AI continuously.
00:06:13.560 It's been around them.
00:06:14.280 The Internet was something that was taken for granted their entire lives, even though or perhaps because of their use of artificial intelligence, they're coming away completely dismal about the possibilities for the future.
00:06:28.980 Why? Why would Gen Z, why would anyone look to artificial intelligence with a suspicious eye?
00:06:36.320 I think first and foremost, it comes down to the tech CEOs themselves and what they have said. They've stated explicitly what their mission is, which is to, again, create non-human minds that are built from data extracted from the public that will replicate every economically viable activity that human beings can do:
00:07:03.420 coding, white-collar work, and eventually, with the advancement of robotics, blue-collar work.
00:07:09.820 Their intention is to replace you. In the meantime, we're seeing stock prices soar,
00:07:18.000 we're seeing data centers imposed on communities around the country, and we're seeing the culture
00:07:24.080 watered down with AI slop. The data center backlash is among the most heartening
00:07:31.900 sorts of movements that we're seeing right now against AI. In Wisconsin, in Port Washington,
00:07:40.460 the city, with a referendum, has decided to pause all construction on data centers
00:07:48.280 that extract more than 20 gigawatts. In Festus, Missouri, they fired half of their city
00:07:59.640 council for trying to build data centers in their small town. And in Maine, we have a moratorium,
00:08:10.740 again, for a year and a half on the building of any data centers above 20 gigawatts. This is just
00:08:20.200 the tip of the iceberg insofar as the local resistance goes to not only the cultural effects,
00:08:28.560 the economic effects of artificial intelligence, but the on the ground imposition of massive,
00:08:35.960 noisy data centers that strain the infrastructure, that use up the electricity, that use up in many
00:08:43.100 cases, the water. And people are becoming more and more conscious of what the ultimate goal
00:08:49.460 of these data centers is. Now, this backlash has led to some unfortunate instances of violence.
00:08:57.100 No one has been harmed up to this point. But we had in Indianapolis, a city councilman who had pushed for a data center to be built in the local municipality. His home was shot up and a manifesto talking about the sort of downsides of AI was found.
00:09:17.540 Sam Altman's home was hit with a Molotov cocktail and then a few days later shot up.
00:09:23.480 This is not the norm, nor is it anything that is being advocated for by the people pushing against the AI industry.
00:09:34.940 We're not calling for violence.
00:09:37.560 You see violence associated with any movement, whether it's racial grievance or economic downturns.
00:09:46.360 You see violence associated with police resistance.
00:09:50.740 You see violence everywhere.
00:09:52.420 It is not characterizing the AI movement, and we can't allow it to characterize the anti-AI movement.
00:10:01.600 What we're about is having people on the ground, on the local level, being able to determine whether or not their town is defined by a loud data center that's sucking up all of the electricity.
00:10:14.180 What we are about are people at the state level being able to protect their children from the predatory bots that have been unleashed in schools, in their homes, and on the Internet in general.
00:10:28.360 What we are about is the federal level control of not only the damage being done, accountability to these companies and demands for transparency as to what's going on inside the labs, but also the institution of some outside agency that will watch over these companies and determine whether or not the technology they're producing is destructive, whether it is dangerous.
00:10:57.880 That may be the Department of Energy. That may be the Center for AI Standards and Innovation.
00:11:05.140 That may be another agency such as DHS, or it may be a combination of all of them.
00:11:11.140 But at some point, we're going to have to look at artificial intelligence the same way we've looked at nuclear weapons or on a more mundane level, the same way we've looked at junk food, the same way we've looked at medicines.
00:11:24.220 No industry goes unregulated, and the push to leave the future of artificial intelligence solely up to the companies that are building it and deploying it recklessly would be just as insane as to allow Pfizer to regulate themselves with vaccines or pain medication or psychiatric meds.
00:11:48.400 This is a minimal ask. We simply want, first and foremost, people to be empowered, and second, for the government to be responsive to the demands of their constituencies.
00:12:03.560 You are not alone in this resistance. This is nationwide. This is worldwide. And as we push on the local level, on the state level, and eventually the federal level, you're going to have to keep your backbones.
00:12:21.340 you're going to have to stay salty. You're going to have to look to the future, not a future of
00:12:27.200 doom in which AI has taken over everything. There are no jobs and we are simply pets that are being
00:12:33.260 fed by robots. We are not looking towards a future in which robots are kicking in your door
00:12:39.380 and dragging you out to become biofuel for the next data center. We're looking for a future that
00:12:45.100 continues our traditions. We're looking for a future in which everything that our parents,
00:12:50.500 our grandparents, and all the generations that have brought us up to this point are
00:12:55.880 vindicated in their efforts, that all of their sacrifices were not for nothing, that this
00:13:01.820 movement of human life from the primordial origins up to this point and onward, that
00:13:09.480 this movement remains human, that we are not handing ourselves over to the machines, and
00:13:15.820 most importantly, that we're not handing ourselves over to the people who are building and deploying
00:13:22.160 these machines. Because if there's one thing that you really have to keep in mind is that part of
00:13:29.020 the anti-AI backlash is being steered towards looking at AI as some sort of abstract entity,
00:13:36.440 as if AI itself, as if this non-human digital mind was coming down from the heavens or perhaps
00:13:43.640 coming up from the hells of its own accord and imposing itself on you and on your children
00:13:50.620 and destroying the culture that we have around us.
00:13:54.200 No, this is not happening because of some abstract non-human mind.
00:13:58.820 This is happening because people like Sam Altman of OpenAI,
00:14:03.060 Demis Hassabis of Google, Elon Musk of xAI, and Dario Amodei of Anthropic,
00:14:09.780 and we'll even throw in Mark Zuckerberg of Meta.
00:14:12.420 It's because human beings have made the choice to build replacements for other human beings
00:14:18.580 and impose their technologies on us.
00:14:21.860 We're going to resist it at the local level.
00:14:23.960 We're going to resist it at the state level.
00:14:26.260 We're going to resist it at the federal level.
00:14:28.540 And most importantly, we're going to reject it in our own homes, in our own lives.
00:14:34.640 Stay tuned, War Room Posse.
00:14:35.780 We'll be right back with Daniel Cochran of the Institute for Family Studies.
00:14:42.420 What about Laken Riley?
00:14:46.020 Is Charlie too hard to say?
00:14:49.660 What about when you took your shot at the one you called a king?
00:14:55.720 Where is this song for justice?
00:14:59.360 The dollar's convertibility into gold ended in 1971.
00:15:04.300 Gold was fixed at $35 an ounce.
00:15:07.740 Well, fast forward to today, and the U.S. dollar has lost over 85%
00:15:12.060 of its purchasing power. Gold, on the other hand, has increased in value by over 12,000%.
00:15:19.260 That's why central banks are buying gold at record levels. That's why major firms like Vanguard and
00:15:25.140 BlackRock hold significant positions in gold. And that's why I encourage you to consider
00:15:31.440 diversifying your savings with physical gold from Birch Gold Group. But it starts with education.
00:15:38.440 Birch Gold just announced their Learn and Earn Precious Metals event.
00:15:43.220 This free online event rewards you for learning the basics of investing in precious metals.
00:15:48.000 Sign up to get free silver on your next purchase.
00:15:51.620 Get even larger incentives as you go.
00:15:54.300 The more you learn, the more you can earn.
00:15:56.960 But you must act now, as this special event only runs through April 30th.
00:16:02.280 The dollar lost its anchor in 1971.
00:16:05.280 You don't have to lose yours.
00:16:08.440 Text my name, Bannon, B-A-N-N-O-N, to the number 989898 to join Birch Gold's Learn and Earn Precious Metals event by April 30th.
00:16:18.980 Text Bannon, B-A-N-N-O-N, to 989898 and do it today.
00:16:25.080 War Room. Here's your host, Stephen K. Bannon.
00:16:31.500 Welcome back, War Room Posse.
00:16:33.080 I am proud to introduce Daniel Cochran, a senior fellow with the Family First Tech Initiative at the Institute for Family Studies.
00:16:43.980 Daniel comes from a long and storied background in policy with a number of think tanks and institutions.
00:16:52.820 Daniel, thank you very much for joining me.
00:16:56.020 Daniel, if you would, just tell me a little bit about how you got into AI policy and what you see the future of AI policy being in America.
00:17:07.540 Well, great to be here, Joe. Thanks for having me.
00:17:09.660 To begin with, I think that, you know, look at the social media era, because I think here, in talking about AI, past is prologue. At the beginning, we were told social media would radically transform the world.
00:17:22.860 And in one way it did. It did indeed transform the world, but not in the ways we were promised. We were promised greater democracy, more self-government, more decentralization. Social media was promised as sort of the savior of the common man from the big institutions.
00:17:40.160 Think the big media, right? All the things that this show has long talked about and critiqued.
00:17:45.880 But instead, it created a new type of information gatekeepers.
00:17:50.960 And those are the companies we're familiar with today.
00:17:53.400 Meta, TikTok, all of these big companies, Google, which owns YouTube and other social media platforms.
00:18:00.080 These platforms became new information gatekeepers.
00:18:02.960 And they now have the power not just to influence our elections and public discourse,
00:18:07.260 but as we've learned very tragically, to literally rewire the minds of an entire generation.
00:18:13.040 So when I say past is prologue, I think we need to apply that lesson of history now to AI. What
00:18:18.400 are the AI guys telling us? What is Sam Altman telling us? What is Mark Zuckerberg telling us?
00:18:22.180 What is Marc Andreessen telling us? They're all telling us the same thing. This is going to be
00:18:26.320 the savior of humanity, literally in some cases. This is going to decentralize. That's Marc Andreessen's
00:18:31.860 big point. He talks over and over again, and others as well, about how AI is going to create
00:18:37.680 an opportunity for smaller enterprises to compete with large businesses, break the media monopoly,
00:18:43.560 et cetera. But what are we seeing in reality? Well, as you noted in the opening segment,
00:18:48.080 we're seeing these big tech companies, many of which were involved in the social media era,
00:18:53.700 like Meta, reconsolidate control over AI, and in fact, use their existing distribution networks,
00:19:00.260 specifically like Meta's Instagram or TikTok or others
00:19:05.660 to not only peddle their AI products,
00:19:08.300 but to create a new generation addicted to digital narcotics.
00:19:13.720 And this time, it's not just an algorithm that addicts you
00:19:16.820 or that pulls you in.
00:19:18.060 It's now an algorithm that you believe has a relationship with you,
00:19:22.460 that you believe loves you, right?
00:19:24.120 AI companions, AI lovers, et cetera.
00:19:26.320 So I think what we ought to do is,
00:19:27.500 I think we have to have a healthy degree of cynicism, but not to the extent desiring to
00:19:34.020 destroy the tech companies. That's not what we should desire. I mean, we should desire to destroy
00:19:38.300 the evil that's there, but we should desire to have a vision. We should chart a vision for how
00:19:44.080 AI can serve the American public. And I know we're here to talk about some of the important
00:19:48.500 legislative efforts and so forth, but I think it's important to lead with skepticism, but then
00:19:53.360 provide the vision. And that's what I think Republicans and conservatives around the nation
00:19:58.000 need to do. Yeah, it's very important to look at that direct connection between social media
00:20:04.260 and the digital milieu in general and AI. As you noted, people have been primed to communicate
00:20:12.080 via these networks. And especially with social media, there's a culture now in which people are
00:20:17.860 basically uploading their souls and have been doing so since the early 2000s. You're taking
00:20:23.580 your personality, you're putting it in this digital form, you're interacting with other people whose
00:20:28.680 personalities are put into a digital form. You literally have this kind of simulacra that becomes
00:20:35.060 in many ways, at least socially speaking, more real than real life, at least for some people.
00:20:40.620 We all know those people whose social media persona is more important to them than the actual
00:20:47.560 flesh-and-blood relationships around them. And AI steps into that pre-existing digital culture,
00:20:53.520 and instead of interacting with digitized humans, you're interacting
00:20:59.700 with a digital mind that is built from previously digitized humans, and it has a kind of life of its
00:21:07.380 own. I'm curious, how do you see the lack of regulation with social media playing out as we
00:21:16.100 move forward into most likely some pretty harsh regulation with artificial intelligence, or at
00:21:22.420 least I hope so. Well, look, I think you're seeing the backlash already around the country. I mean,
00:21:27.480 even Gen Z is saying, you know, and polling is showing this more and more. Gen Z is rejecting
00:21:32.900 to some extent social media and smartphones. You know, you talk about these clubs on
00:21:37.580 college campuses. Very recently, I met a young man who started a club at a public university,
00:21:44.260 and the whole point of this club is low-tech, no-tech digital living
00:21:51.400 combined with policy advocacy to get a handle on this stuff. So that's really interesting. It's
00:21:57.260 not just about, you know, get rid of the smartphone in your own life. It's about arresting the powers
00:22:02.640 that have kind of taken over our homes and our lives from the grassroots, from the state level,
00:22:07.460 local level, federal level, which you sort of alluded to earlier. So you have a paper coming
00:22:12.420 out or your institute has a paper coming out addressing Marsha Blackburn's Trump America AI
00:22:18.020 Act. If you would, could you just break down what your position is, maybe even give us a little bit
00:22:24.520 of a review on what Marsha Blackburn's bill is actually calling for? Certainly. And I think
00:22:31.400 it's important to connect this with the president's AI framework, the legislative recommendations the
00:22:37.320 White House recently made. And in there, I think it's useful to say, look, those were high-level
00:22:43.360 principles, but the document delegates most of, frankly, the hard work to Congress, which is the
00:22:48.680 way it should be. Congress needs to legislate on this issue. They've needed to for years,
00:22:52.760 they promised to for years, and they failed to for years. Now, when we talk about artificial
00:22:57.700 intelligence and making, ensuring, creating a national framework, where I think we often get
00:23:03.760 off track is we lose sight of what we're actually trying to do. And the first principle of an
00:23:09.920 effective national AI framework must be guardrails, strong guardrails that actually have teeth. And
00:23:16.620 Marsha Blackburn's Trump America AI Act would do just that. It contains a lot of provisions that
00:23:22.080 are very common sense and widely bipartisan, like the Kids Online Safety Act, the GUARD Act
00:23:26.360 introduced by Senator Hawley. It would create tracking mechanisms for AI labor disruptions
00:23:32.500 and many other things that are just kind of common sense.
00:23:35.400 It's not perfect like any piece of legislation, but it's an important first step.
00:23:39.320 One of the more interesting facets to me is the notion of existential risk with artificial
00:23:45.740 intelligence or any kind of catastrophic risk, this hasn't really been much of a discussion
00:23:51.040 other than the kind of sensationalist claims or predictions, which may in fact prove to
00:23:56.500 be true.
00:23:56.940 No one's saying that they're not, but I don't really focus on them all that much.
00:24:00.580 On the other hand, when you have these tech execs, Elon Musk is probably the most notable, but we've also had many statements from Dario Amodei, for sure, and Sam Altman, that artificial intelligence poses catastrophic risk for the public, either because of loss of control or because you have rogue actors that use AI to create bioweapons or any other kind of improvised weapon.
00:24:24.220 Um, Marsha Blackburn's Trump America AI Act addresses that head on.
00:24:30.320 The proposition seems to be that some combination of federal agencies would oversee these companies.
00:24:38.920 She specifically points to the Department of Homeland Security in the case of rogue actors. For the more general threat,
00:24:46.840 that responsibility would fall on the Department of Energy, which, of course,
00:24:51.120 has a long history of surveilling and regulating nuclear materials.
00:24:56.320 So I'm curious, have you thought much about that aspect, or do you find existential risk to be
00:25:04.600 a persuasive argument that we actually should be concerned that AI could go rogue or that rogue
00:25:11.320 actors could use it to inflict horrible damage on society? Well, look, I think devil's in the
00:25:16.160 details on how you define it. But let's look at an example from just this past week with
00:25:20.540 Claude, you know, Anthropic's Claude, being used to find critical security vulnerabilities in a
00:25:27.000 whole variety of critical infrastructure and also operating systems that we didn't know existed for
00:25:33.120 years. So that in one sense isn't a kind of existential risk because at its base, what is AI?
00:25:39.340 It's pattern recognition and prediction. So it's finding patterns that no human or prior computer
00:25:44.400 systems have found. And that creates a lot of vulnerabilities. I mean, if you think about it,
00:25:49.540 if a hacker had the ability to do what Claude can do right now, right, that would be really bad
00:25:54.840 from a national security standpoint, from an economic security standpoint, from almost any
00:25:59.740 standpoint. So I think it's quite important for bills like the Trump America Act, which you just
00:26:05.160 referred to, to delegate some responsibility to the Department of Homeland Security and other
00:26:11.280 federal agencies to ensure that we're studying this stuff at the very least, and that we
00:26:16.360 create protocols for reporting back to the government, like what are the actual
00:26:21.520 abilities of these models. I think it's important to at least know that. Yeah, the Anthropic response
00:26:27.100 to this, the way they move forward, I have a lot of mixed emotions about it, because on the one
00:26:32.220 hand, they're creating this software, these models, to write better code and a number of other
00:26:38.680 functions. It's dual use. And so by being able to write code and analyze code, it's able to find
00:26:45.800 these vulnerabilities. If they are going to create a monster like that, they have to be responsible
00:26:52.620 for it. And so they are to some extent taking responsibility. They're saying we should be
00:26:58.020 regulated. They did not release the model. They offered the model to Amazon Web Services,
00:27:04.080 Microsoft, a number of corporations so that they could strengthen, they could bolster their own cybersecurity using the model.
00:27:13.200 So it's a mixed bag because on the one hand, they're driving forward with this AI race.
00:27:19.900 They are actively creating something that could wreak havoc.
00:27:23.060 Maybe not this model, maybe not mythos.
00:27:25.360 Maybe it's the next one or the next one could wreak havoc on the digital infrastructure.
00:27:29.400 Knowing that this danger exists, they somehow feel it's worth it so they can write better code.
00:27:36.780 It's a hard sell for me, even. You know, it's as if you had Dr. Frankenstein asking the
00:27:44.000 local sheriff to stand guard to make sure that the monster doesn't go out of control. I think that
00:27:50.300 the problem is really the desire to create such a monster. But you know, Daniel, you've always been
00:27:57.480 much more positive about this than I have in our discussions. You've always had much more optimism
00:28:02.140 and when we get back at the other end of the break, I want to hear more about your optimism
00:28:07.220 and I want you to make me feel good about the artificial intelligence revolution. Back in just
00:28:12.740 a moment. This year marks a critical moment for our country. As the opposition grows
00:28:27.460 more aggressive and more unapologetic, the fight now reaches into the everyday decisions we make.
00:28:34.620 Patriot Mobile has been standing on the front lines fighting for freedom for more than 12 years.
00:28:40.760 They don't just deliver top-tier wireless service. They are activists like me and like you in the
00:28:47.480 War Room Posse who truly care about this republic and saving our country. Patriot Mobile offers
00:28:54.280 prioritized premium access on all three major U.S. networks, giving you the same or better coverage
00:29:01.680 than the main carriers themselves. That means fast speeds and dependable nationwide coverage, backed
00:29:07.620 by 100% U.S.-based customer service. They also offer unlimited data plans, mobile hotspots,
00:29:15.820 international roaming, and more. With a simple, seamless activation, you can switch in minutes,
00:29:21.060 keep your number, keep your phone, or upgrade. And here's the difference: when you switch to Patriot
00:29:27.400 Mobile, you'll be part of a powerful stream of giving that directly funds the Christian conservative
00:29:33.820 movement. Take a stand today. Go to PatriotMobile.com/Bannon or call 972-PATRIOT. That's
00:29:41.860 972-PATRIOT, and use promo code BANNON for a free month of service. Don't wait, do it today. That's
00:29:50.780 PatriotMobile.com slash Bannon, or call 972-PATRIOT and join the team today.
00:29:58.700 Here's your host, Stephen K. Bannon.
00:30:06.720 Welcome back, War Room Posse.
00:30:08.580 When the dollar's convertibility into gold ended in 1971, gold was fixed at $35 an ounce.
00:30:15.640 Fast forward to today, and the U.S. dollar has lost over 85% of its purchasing power.
00:30:22.080 Gold, on the other hand, has increased in value by over 12,000%. That's why central banks are
00:30:29.600 buying gold at record levels. They are not going to stand by as artificial intelligence turns
00:30:37.580 everything into digital muck. They're going to protect themselves. That's why major firms like
00:30:43.500 Vanguard and BlackRock hold significant positions in gold, and that's why I encourage you to consider
00:30:48.860 diversifying your savings with physical gold from Birch Gold Group. That's right, Birch Gold Group.
00:30:54.100 No robots at Birch Gold Group. But it starts with education. Birch Gold just announced their Learn
00:31:00.480 and Earn Precious Metals event. This free online event rewards you for learning the basics of
00:31:06.220 investing in precious metals. Sign up to get free silver on your next purchase. Get even larger
00:31:13.000 incentives as you go. The more you learn, the more you can earn. Don't ask AI, ask Philip Patrick.
00:31:20.880 But you must act now, as this special event only runs through April 30th. The dollar lost its anchor
00:31:26.960 in 1971. You don't have to lose yours. Text BANNON to the number 989898 to join Birch Gold's Learn
00:31:35.660 and Earn Precious Metals event by April 30th.
00:31:39.440 Text BANNON to 989898 today.
00:31:45.520 All right.
00:31:47.640 We are back with Daniel Cochran,
00:31:50.300 senior fellow with the Family First Tech Initiative
00:31:53.160 at the Institute for Family Studies.
00:31:56.240 Daniel, as promised, if you will,
00:31:58.820 make me optimistic about the future of artificial intelligence.
00:32:01.760 Does the government have a handle on this?
00:32:03.580 Does the American populace have a handle on this?
00:32:06.480 Do we have a handle on this?
00:32:09.120 Well, making you optimistic,
00:32:11.180 that may be a bridge too far for me, but I'll try my best in this short segment.
00:32:15.300 Look, I think that it depends upon the decisions we make.
00:32:18.080 Some of the optimism should be, look, so past is prologue. The cement is dry on social media, right?
00:32:24.280 A lot of the effects have already occurred.
00:32:27.020 We're trying to reverse them, but we're playing catch up.
00:32:29.280 the cement on AI is still wet, and we still have an opportunity as a country to make decisions
00:32:34.000 for the direction that technology is going to travel. And the lesson we ought to apply to this
00:32:39.700 round is we need guardrails and regulations. And as we mentioned, we were talking about
00:32:44.660 Senator Blackburn's Trump America AI Act. That's a great step in the right direction. So you ask,
00:32:50.080 well, what would it look like to have guardrails? That's what I think a starting
00:32:54.560 pass looks like. It looks like protecting kids online with the Kids Online Safety Act.
00:33:01.000 It means that we impose regulations on chatbots. We tell companies, look, you're not allowed to
00:33:07.120 sell or make AI companions available to children, especially AI companions that are engaged in
00:33:14.160 sexually promiscuous conversations. And to your point earlier, it ensures that we create some kind
00:33:20.360 of framework to monitor the risks that are emerging from these models, the ability of
00:33:25.320 these models to find vulnerabilities, but also to be able to act in unpredictable ways
00:33:31.180 that are going to harm our society at a massive scale.
00:33:35.780 The way I look at it, you know, if I had my own way about it, I would just simply turn
00:33:39.900 back the clock.
00:33:40.440 I'm not sure to where, probably somewhere in the 90s, the best decade, but obviously
00:33:44.680 no one's asking me for my opinion on how all of society should go.
00:33:48.640 And it's probably best I don't have my way. So we're stuck. These technologies have been created. They will continue to develop. We have to have some way of steering it or kind of the way I look at it to put up blast shields against the worst effects of the technology.
00:34:06.280 And while I'm not a huge fan of government imposition, I'm also not a fan of corporate imposition.
00:34:13.120 And I think in this case, the corporate imposition is far more dangerous than anything the government's going to do.
00:34:18.160 And these guys are talking about, oh, our rights as corporations are being trampled upon.
00:34:23.360 Well, they've trampled on our privacy.
00:34:26.060 They've really trampled on our sanity.
00:34:28.980 and certainly they've trampled on the sense of safety that people can and should have
00:34:35.320 regarding their children so you know i'm all for regulating this if the government is our
00:34:41.360 only real mechanism to get a handle on this you know that sometimes you got to make a deal with
00:34:48.340 the devil to go against lucifer right so lucifer and mammon how about that you know i'll turn to
00:34:55.020 mammon to regulate lucifer i think it's important to recognize too this is why states are critical
00:35:00.260 um in in figuring out how to strike the balance like we have a federalist system here in the
00:35:04.660 united states and one of the benefits of federalism is we get 50 laboratories
00:35:09.980 to figure out what works and what doesn't you know i'm from california california does
00:35:14.240 some really crazy stuff and you look at california like i don't want to do that and other states are
00:35:19.480 like we're going the opposite direction you got states like texas that are doing some really great
00:35:23.440 stuff. On education, on tax policy, Florida's the same. Tennessee's the same. And especially
00:35:29.400 on technology, when it comes to AI, look at what Texas is doing. Look at what Utah's doing.
00:35:34.960 Look at what Tennessee's doing. You know, Tennessee, the home of the country music
00:35:39.140 industry and a lot of creatives, they enacted the Elvis Act. What does that do? It says you
00:35:43.880 can't use AI to distribute digital copies of someone's visual
00:35:50.280 likeness or their voice. Very common sense. And that came not from Congress, not from some
00:35:55.920 technocratic agency here in Washington. That came from a state. Look at Utah. Utah put common
00:36:03.380 sense guardrails on chatbots. They said, look, if you're using a chatbot for mental health therapy,
00:36:07.280 which personally I oppose, but okay, if you're going to do that, the company can't use your data
00:36:11.880 and target you with ads while you're using the chatbot. Very common sense. That came from a state,
00:36:15.940 not from Washington, D.C. Florida, for years, has been on the front lines of trying to address
00:36:20.980 the effects of social media. They passed laws that not only put restraints on social media
00:36:27.120 and guardrails on social media to address some of their harmful effects on kids, like requiring
00:36:32.220 parental consent for social media access, but more recently, they've also enacted a law to push back
00:36:38.260 on big tech censorship. The case that went to the Supreme Court just a couple of years ago,
00:36:42.440 it dealt with laws in Florida and Texas that would have restricted
00:36:48.620 the ability of these companies to censor speech. And I know that's something that War Room has
00:36:53.560 experienced firsthand. And we know, I mean, this is this is stuff that the states are doing. The
00:36:57.840 states are the American people's first and best line of defense against big tech
00:37:03.920 and against big AI. And that's why we got to keep them in the race. Well, you know, I agree. This is
00:37:08.280 a massive experiment and the states and local communities do form sort of a variety of petri
00:37:14.680 dishes in which we are being experimented upon and i recommend to the war room posse or anyone
00:37:20.180 else who is going to listen stay in the control group daniel i really appreciate you coming by
00:37:26.880 if you would let the audience know where can they follow your work how do they find the institute
00:37:31.840 for family studies give them all you got that's great so you can find me at real d cochran on
00:37:37.060 Twitter. And then if you go to instituteforfamilystudies.org, you can read all about our
00:37:42.660 work. And then the Family First Tech Initiative, just type that into Google. If they're not
00:37:47.480 censoring us, you'll find our work there. Thank you very much, sir. Thank you so much. It's great
00:37:51.220 to be here. All right. Pivoting from optimism to pessimism, we have Brendan Steinhauser,
00:37:59.720 head of the Alliance for Secure AI. Brendan, thank you very much for coming on. Look,
00:38:04.600 I've been following JobLoss.ai, your AI labor tracker.
00:38:09.260 I was hoping that you would come and just explain to the audience what you're doing, what you're finding, and how they can keep up.
00:38:18.540 Absolutely.
00:38:19.140 Well, it's great to be back with you, Joe, and great always to be with the War Room Posse.
00:38:24.200 You know, we started, we're sitting around the office one day, and one of the team members had an idea to say, look, we're seeing these reports of job losses due to AI.
00:38:31.940 What if we started to track them in real time?
00:38:33.860 What if we kept it updated on a daily basis, if we kind of scoured news reports and we looked at what's happening in the boardrooms around America?
00:38:41.360 We started to actually count these job losses in going back to January of last year, January of 2025.
00:38:48.680 And so the team put together this great tool, JobLoss.ai, where we can actually track what's going on.
00:38:55.040 It's based on what the companies are saying themselves when they announce job cuts due to AI.
00:39:00.420 It's also based on media reports because sometimes people don't want to admit they're cutting jobs due to AI.
00:39:06.260 And so oftentimes you have a whistleblower in the company that talks to a journalist, and now we know the real story.
00:39:12.820 But that's kind of the methodology there.
00:39:14.900 We try to be aware that some companies might say it's due to AI, but really it might be due to something else.
00:39:20.800 We try to dig in a little bit to figure out, okay, what's the most likely, plausible situation here?
00:39:26.000 It's a moving target, but I think we're really happy with the product as it is, the tool that people can use and track.
00:39:32.420 But the downside, the problem is that this is happening and it's happening really fast.
00:39:37.700 And policymakers are behind the eight ball here and they need to be talking about this more.
00:39:42.180 They need to be thinking about it more. And right now, the American worker is at grave risk.
00:39:46.760 a lot is being said about coding you know coding seems to be the most vulnerable
00:39:55.160 occupation at the moment can you break down a bit what you're finding you know as far as
00:40:01.720 each job the kind of job title or job role that is really under threat where people are being
00:40:07.500 replaced and if you could give us a sense how much of the job loss is the lack of opportunity
00:40:14.500 where, you know, lower level people who would have come in and trained don't have a job to come in
00:40:20.140 and be an apprentice? And how much is just people getting wiped out like the massive layoffs at
00:40:25.880 Oracle? I think that's like 30,000. That's right. Yeah, there's a lot of activity at these big
00:40:32.760 companies, Oracle being one of them, Salesforce, Dell, and a bunch of others, they've announced
00:40:39.540 massive cuts. So a lot of those numbers we're tracking actually are due to real jobs lost,
00:40:45.920 real people who are left unemployed. And so that number is well over 100,000 now in the US
00:40:51.820 and growing. And Goldman Sachs estimates it's going to be something like 15,000 or 16,000 jobs
00:40:56.800 lost per month. So these numbers are, it's a moving target, but they're basically saying
00:41:01.900 that's what they see happening. But right now, yeah, it's a combination of your computer
00:41:07.320 programmers your coders people who ai is essentially replacing in real time so if you're in that
00:41:13.480 software development business you're probably seeing this at your company you're probably
00:41:17.960 seeing this if that's your job description even if you work at a different company so
00:41:21.640 a lot of that impact is happening to those folks but also what we're seeing is kind of your
00:41:26.520 white collar workers your kind of middle managers people that have some experience that
00:41:31.320 aren't necessarily entry level and they're not the most senior person at the company
00:41:35.160 but sort of mid-level management is getting hit really hard and you think about that that's that
00:41:39.640 could be at any type of company those can be tech companies or retail or construction or any company
00:41:45.480 that has that sort of uh that role of you know people managing people especially through the use
00:41:50.360 of computers through the use of spreadsheets and email and phone calls and that sort of thing so
00:41:54.840 what's happening is these ai agents that are coming on the scene are starting to replace
00:41:59.480 those white-collar workers in the workplace and that's why you know one of the big tech CEOs
00:42:05.220 admitted this about six months ago he said there could be a white-collar bloodbath and you
00:42:09.500 could have incredibly high unemployment numbers coupled with hiring freezes and young
00:42:16.240 people not being able to get good jobs. Yeah there are two things that really disturbed me about that
00:42:23.020 I mean, the whole thing is bothersome, but one, the inability of young people to be able to learn a trade or to learn a job.
00:42:32.800 People forget that even if you can replace low-level coders or accountants, or even eventually, with robotics, a carpenter, you then deprive the incoming workers, the upcoming generation, from being able to learn a trade and to master it.
00:42:53.440 And then on the other hand, you also got the prospect that even if you keep your job, you are then forced basically to become symbiotic with an AI.
00:43:04.400 And just imagine you're a carpenter, an HVAC repairman.
00:43:07.460 And yes, you still have a job. Yes, you're still working with your hands.
00:43:10.760 But now you're managed by an AI. Now everything you do is being tracked.
00:43:15.420 you're uploading all of your activities to your phone and your phone is spitting back orders as
00:43:21.840 to where you go what you do what you did wrong all these sorts of things it just seems to me
00:43:26.340 like they're creating a digital hellscape and that would explain a lot of the backlash
00:43:31.260 brendan we've got to go to break i want to hold you over and when we get back i'd like to talk
00:43:36.640 to you about your travels around the country you've been everywhere is this backlash real and
00:43:41.740 If so, what shape is it taking?
00:43:44.020 Back in a moment, War Room Posse.
00:43:56.940 Do you owe back taxes or you haven't filed your taxes in years?
00:44:02.420 Now is the time to resolve your tax matters.
00:44:05.680 With the national conversation around abolishing the income tax,
00:44:09.840 The IRS is fighting back and proving it's here to stay by becoming more aggressive than ever before.
00:44:16.900 They're sending out more collection notices, filing more tax liens, and collecting billions more in recent years.
00:44:24.420 If you owe, the IRS can garnish your wages, levy your bank accounts, seize your retirement, and even your home.
00:44:32.480 If you owe or haven't filed, it's not a question of if the IRS will act.
00:44:38.620 It's a question of when it will act.
00:44:41.520 Right now, Tax Network USA is offering a completely free IRS research and discovery call to show you exactly where you stand and what they can stop before it's too late.
00:44:53.940 Their powerful programs and strategies can save you thousands or even eliminate your debt entirely if you qualify.
00:45:00.660 Don't make a costly mistake.
00:45:02.680 Representing yourself or calling the IRS on your own waives your rights and costs you more money.
00:45:08.180 They are not on your side. And let me repeat, the IRS is not on your side.
00:45:12.360 Get protected the right way with Tax Network USA.
00:45:16.340 And start the process of settling your tax matters once and for all today.
00:45:22.140 Call 1-800-958-1000.
00:45:25.160 That's 1-800-958-1000 or visit TNUSA.com slash Bannon for your free discovery call with Tax Network USA.
00:45:37.580 Let me repeat, 800-958-1000, tell them Bannon sent you.
00:45:41.980 Don't let the IRS be the first to act.
00:45:46.180 Take advantage of First Mover Advantage, you move.
00:45:51.080 Here's your host, Stephen K. Bannon.
00:45:55.160 Welcome back, War Room Posse. We are here with Brendan Steinhauser, head of the Alliance for
00:46:02.060 Secure AI. Brendan, you've been all over the country speaking. You've met a ton of people
00:46:08.540 in the context of this resistance to the artificial intelligence revolution. I think
00:46:14.040 you were just at CPAC a couple of weeks ago. Tell me, what do you see when you're looking
00:46:19.540 into the soul of America? Do we have a chance? Are people fighting? Are they sitting on their
00:46:23.800 hands? Have they given up? What's the story, brother? Yeah, I think it's really heartening.
00:46:29.480 I think we do have a chance. We have agency. People are rising up. They're asking questions
00:46:34.380 about what's going on, what big tech is building. There is literally, you know, I've not met a
00:46:40.000 single person that supports being replaced and given a UBI in my travels. There's been a lot
00:46:45.660 of opposition to building a super intelligence. People are very worried about that. We have
00:46:50.760 traveled all over the country. My team and I have done everything from attending CPAC in Dallas,
00:46:55.600 a huge gathering of conservatives there, to going to the Sundance Film Festival in Utah,
00:47:00.940 or South by Southwest in Austin, traveling down to Palm Beach, Florida, talking to voters,
00:47:07.060 talking to regular people. It is incredible. We have an issue where more than 80% of Americans
00:47:13.340 agree we have to have safeguards on AI. They don't want accelerationism. They don't want
00:47:18.500 runaway AI. And it's really interesting and encouraging because it's kind of a bipartisan
00:47:23.400 thing. You meet people who are pretty far left, pretty far right, people in the middle who hate
00:47:28.700 politics. They all agree on this. And they say, I don't trust what big tech is doing. I'm really
00:47:33.720 concerned about where we're going. And I want to see some safeguards and protections. And so that's
00:47:38.920 heartening to me. And I think that my message to all Americans is that you have agency, you have a
00:47:43.940 voice in this. You get a vote in what happens to us in our future. And I really want to encourage
00:47:48.680 people to continue that fight. Yeah, brother, in an age where all we hear about is AI agents,
00:47:56.640 I think that human agents, human agency, this has got to be the most important thing at the
00:48:03.520 forefront of the conversation, because if you don't have a choice about any of this, then you
00:48:08.860 are a slave to the machine. Brendan, we've got to bounce. If you would just bring us back,
00:48:16.380 jobloss.ai, the Alliance for Secure AI. Your team is fantastic. Where do people go to keep up with
00:48:23.680 the job losses? Where do people go to keep up with you? And where do people go to keep up with
00:48:28.540 the alliance? You can visit our website at secureainow.org, secureainow.org, and then
00:48:37.320 follow us on social media for the daily updates on all of these topics. Our handles are secure
00:48:43.320 AI now across platforms. Thanks to you, Joe, and to the War Room Posse. I think we can
00:48:48.760 win this fight together, and I appreciate your support.
00:48:53.760 Brendan, I really appreciate you. You've been a fighter for many, many years. You've taught
00:48:58.240 me a lot about how to deal with politics, and maybe one day I'll actually learn.
00:49:01.360 all right that's jobloss.ai jobloss.ai i really encourage you to go there and see what the
00:49:12.900 impacts are the real world impacts are of artificial intelligence on the economy and
00:49:17.420 on people's jobs on people's lives all right i'm happy to bring in mo bannon mo how are you and
00:49:25.060 and what is this event i keep hearing about coming up hi joe thank you for having me on this morning
00:49:30.800 It is an event for the grassroots in Virginia.
00:49:35.280 It is a Virginia grassroots rally hosted by War Room and a lot of the members of the grassroots community in Virginia for the election that is occurring on Tuesday.
00:49:46.660 I know early voting ends on Saturday in Virginia.
00:49:51.920 But if you have not early voted, you need to get out on game day on Tuesday.
00:49:57.280 And we want to do this rally as a thank you to everyone that has been involved, door knocking, the phone bank, sitting at the polls for early voting.
00:50:07.260 And in order to go to the event, you have to RSVP.
00:50:10.960 It's a free ticket, but you have to go to warroomvarally.eventbrite.com.
00:50:18.940 Warroomvarally, all one word, .eventbrite.com to reserve your ticket.
00:50:23.760 And it'll be a great event.
00:50:25.680 We have a lot of great speakers talking about Tuesday and then the way ahead in November.
00:50:30.920 But it is in Hanover County, Virginia.
00:50:33.320 We will release the location closer to the event.
00:50:36.080 But it is this Sunday from 4 p.m. to 7 p.m.
00:50:40.940 And, Mo, will there be a link on all the War Room platforms for people to check this out?
00:50:47.500 We are working on getting it up so we can stream it as well.
00:50:54.360 Got you.
00:50:54.960 Thank you very much.
00:50:55.840 So, again, if you would, just let the audience know.
00:50:58.980 It's a bit of a long URL, but I'm sure they're up for it.
00:51:02.280 And besides, they're savvy enough to know where to go.
00:51:05.200 But if you would, just hit the rewind and let them know where to go and what they can expect.
00:51:10.920 You aren't the first person to say that's a long URL, Joe.
00:51:13.900 So I will keep that in mind for the next time.
00:51:15.960 But it is warroomvarally, all one word, .eventbrite.com.
00:51:22.380 And if you go on War Room social media pages, there is a post from last night and we will post it again with the link.
00:51:29.300 So you're able to click right from there as well.
00:51:33.400 Fantastic. You know, Mo, working with you, working with Steve, working with everyone here at the War Room,
00:51:38.880 the thing that really has given me the most hope about American politics is this grassroots effort.
00:51:44.500 You know, we're not some kind of top down organization.
00:51:47.360 This is all about the people, and the War Room has given the people a voice. It's fantastic.
00:51:52.460 So, yeah, I really appreciate it, Mo. See you soon.
00:51:55.420 Thank you again, Joe.
00:52:00.060 Well, War Room Posse, I think that it would be good to end on a bright note.
00:52:05.760 We've talked a lot about Ray Kurzweil, Ray Kurzweil's idea of the singularity, Ray Kurzweil and the merger of man with machine.
00:52:14.940 and Ray Kurzweil and his idea of spiritual machines.
00:52:19.100 So as we go out, let's just watch this doddering old man
00:52:23.000 describe what the future of artificial intelligence really is.
00:52:27.300 I think AIs will be indistinguishable from a conscious being
00:52:34.620 and that will just keep going.
00:52:37.820 And finally, we will accept it.
00:52:40.100 When? When, Ray?
00:52:41.920 Like right now, an AI might say that it's conscious, but people aren't really sure.
00:52:49.560 But eventually, it keeps having all the earmarks of a conscious being, and you will accept it because it would be useless not to have it.
00:53:01.120 And again, you can't say that's going to happen at the same time for everybody.
00:53:06.460 But I think when we're a few years into AI entities acting conscious, we will accept it.
00:53:19.440 And so I don't think it's going to be a very long delay.
00:53:25.280 I mean, today people have AI therapists and sometimes they don't really believe it.
00:53:32.660 But in other times people really believe it.
00:53:35.460 And the AI therapists, if you read the transcripts, they sound very convincing.
00:53:41.280 And that's going to keep going.
00:53:42.880 And people will really accept that they have a therapist that's conscious.
00:53:49.840 And if you're 65 or already on Medicare, listen up, folks, and grab a pen, maybe even a number two pencil.
00:53:59.120 Call 845-WAR-ROOM.
00:54:01.740 That's 845-WAR-ROOM.
00:54:03.640 Call it right now.
00:54:04.560 I'm serious.
00:54:05.080 Call it.
00:54:05.460 Now here's why. The insurance companies and their lackeys in the Washington swamp have built a
00:54:11.560 Medicare system designed to confuse you and rip you off. Rising premiums, denied claims, fine print
00:54:19.360 nobody but a lobbyist understands, millions of American seniors are paying too much and getting
00:54:24.840 too little, and worst of all, most don't even know it. Hey, that could be you. That's why if you're
00:54:31.540 already on medicare or will be soon you need to talk to our friends at chapter they have a team
00:54:38.020 of advisors trained to serve american seniors not the insurance companies in under 20 minutes
00:54:43.200 they can find you the best plan for your needs at the lowest cost why they're a data company
00:54:49.780 they have all the data on every plan it's totally free there's no pressure no bs just straightforward
00:54:57.400 honest help from fellow patriots so don't wait call 845 war room right now that's 845 war room
00:55:03.760 tell them bannon sent you now listen in the first couple of days of the launch of this company with
00:55:09.260 the war room posse posse members saved tens up to hundreds of thousands of dollars collectively
00:55:16.220 in these fees go check it out today that's chapter call 845 war room do it today