WarRoom Battleground EP 778: Big Tech Dominance Crushes Small AI Firms
Episode Stats
Words per Minute
181.5
Summary
In this episode, two of my favorite people, both experts on artificial intelligence, join me in the War Room to talk all things AI: Brian Costello, a practitioner, and Joe Allen, a philosopher and theoretician.
Transcript
00:00:00.000
Mr. Altman, now this is going to, I'm going to count this as a highlight recently, like I know
00:00:05.720
the work that you've done, you're really one of the people that are moving AI, and now it's an
00:00:11.640
opportunity, I was excited to meet you, and now people, you know, people ask me, it's like if
00:00:17.600
you're going to talk about AI, and now I get to ask you, I mean, you know, like the literal, the
00:00:21.940
expert, you know, some people are worried about AI or whatever, and I'm like, you know, what about
00:00:28.100
the singularity, so, you know, the people like that, if you would address that, please.
00:00:32.700
Thank you, Senator, for the kind words and for normalizing hoodies in more spaces, I'd love to
00:00:37.020
see that. I am, I am incredibly excited about the rate of progress, but I also am cautious, and
00:00:49.080
I would say, like, I don't know, I feel small next to it or something, I think this is beyond
00:00:57.060
something that we all fully yet understand where it's going to go, this is, this is, I believe,
00:01:03.000
among the biggest, maybe it'll turn out to be the biggest technological revolutions humanity will
00:01:07.400
have ever produced, and I, I feel privileged to be here, I feel curious and interested in what's
00:01:16.000
going to happen, but I do think things are going to change quite substantially. I, I think humans have
00:01:23.180
a wonderful ability to adapt, and things that seem amazing will become the new normal very
00:01:28.640
quickly. Uh, we will figure out how to use these tools to just do things we could never do before,
00:01:32.900
and I think it will be quite extraordinary, but these are going to be tools that are capable of
00:01:39.300
things that we can't quite wrap our heads around, and some people call that, you know, as these tools
00:01:45.000
start helping us to create next, future iterations, some people call that singularity, some people call
00:01:50.200
that the takeoff, whatever it is, it feels like a sort of new era of human history, and I think it's
00:01:55.320
tremendously exciting that we get to live through that, and we can make it a wonderful thing, but
00:02:00.140
we've got to approach it with humility and some caution.
00:02:06.680
This is the primal scream of a dying regime. Pray for our enemies, because we're going medieval on
00:02:15.000
these people. Here's one time I got a free shot at all these networks lying about the people. The
00:02:21.180
people have had a belly full of it. I know you don't like hearing that. I know you've tried to do
00:02:25.140
everything in the world to stop that, but you're not going to stop it. It's going to happen.
00:02:28.380
And where do people like that go to share the big lie? MAGA Media. I wish in my soul, I wish that any of
00:02:36.320
these people had a conscience. Ask yourself, what is my task and what is my purpose? If that answer
00:02:43.440
is to save my country, this country will be saved. War Room. Here's your host, Stephen K. Bannon.
00:02:57.080
It's Thursday, 29 May, in the year of our Lord 2025. Two of my favorite people and both experts,
00:03:02.880
one a practitioner, the other a theoretician on artificial intelligence, Brian Costello,
00:03:08.860
the practitioner, and Joe Allen, the philosopher theoretician, are joining us. Brian, you're in
00:03:17.620
this day-to-day. Are we in the big bang? Elon Musk is talking about it. Are we in the big bang phase
00:03:24.640
of artificial intelligence now as far as exploding knowledge and understanding of this, at least
00:03:31.660
technologically, to drive technological progress? And since you're the one economic nationalist
00:03:39.100
populist that I think is in this space, there may be others, but you're the most vocal,
00:03:44.840
are we heading down a path that you see that this can actually be done from a populist perspective and
00:03:52.400
avoid a white-collar, job-killing apocalypse, sir?
00:04:00.400
So, yes, I think we are in a big bang. We're seeing these models get more sophisticated.
00:04:05.280
As Joe talked about earlier on the show, you're starting to see these agents come out, which
00:04:10.920
replace a lot of work in terms of the humans. And I think what you generally see here, what's
00:04:17.280
happening with us, Steve, is we're seeing 10, 15 years of innovation be consolidated down to months.
00:04:22.800
What used to take a year from a software development standpoint with, like, a whole big team,
00:04:27.260
machines can do in hours and days better than humans can. So that's certainly a big bang.
00:04:33.440
And there's a lot of changes there. To your second question, we're not headed down the right path
00:04:38.780
right now. There are, I think, 10 companies that control 90% of the AI infrastructure.
00:04:45.660
And we're seeing a lot of announcements in terms of investment into AI, but what you're not seeing
00:04:52.260
in those announcements is jobs, right? You're seeing a lot of announcements into capital and AI
00:04:57.260
factories. And there will certainly be the construction jobs and the jobs in the short term
00:05:00.720
around that build this AI infrastructure out. But you're not seeing a lot of employment commitments
00:05:06.000
towards that. So we really need to step back and we need to look at this and say, like, okay,
00:05:11.020
we're not even asking the right question now, is how do we make AI work for the American people?
00:05:16.300
Right now the question is, how do we do big deals for big businesses? But we're not stepping back and
00:05:21.540
saying, how do we make this work for everybody in America? And I don't think the answer is
00:05:25.700
universal basic income. I think there's other strategies we could take.
00:05:29.860
Okay, let's hear those, because what we're hearing, you talk about big deals,
00:05:34.540
this deal in UAE to build these data centers is pretty shocking that there's going to be another
00:05:39.820
node outside the United States. We have Dave Walsh on here all the time talking about the massive
00:05:43.620
energy needs. You've got deal after deal after deal. And of course, President Trump cut off
00:05:49.260
chips to China, which sounds to me kind of like cutting off oil to Japan in 1940. I fully support it.
00:05:56.480
I'm saying you just got to be ready for the blowback. But nobody but you, Brian Costello,
00:06:01.020
are talking at all. And this is why Costello is so important. You're the first guy that gave me a
00:06:05.900
heads up and say, hey, look, there's two models here. One model is efficiency. And of course,
00:06:10.560
Wall Street loves that because that means cut jobs, higher margins. This happened. This efficiency
00:06:15.360
model goes back way before AI to other investment opportunities, particularly offshoring jobs.
00:06:22.800
And then you've what you call the productivity creative model that they're not pursuing. So so
00:06:28.400
walk us through, because, brother, as you know, to get this pointed in the right direction or to bend
00:06:34.220
the arc here is going to be a Herculean task since the accelerationists are in charge of the Big Bang,
00:06:41.320
brother. Yeah, listen, I'll walk you through it. What I have good insight into, Steve,
00:06:46.960
is what China is doing here. Right. So what China does, they open source their models. There's two
00:06:50.980
companies, DeepSeek and Qwen. So they gave access to AI to everybody. They open-sourced the
00:06:56.660
technology behind robotics in a firm called Unitree. And then they launched a $1 trillion
00:07:02.620
CCP guidance fund to guide all the private capital into the areas and segments they want to
00:07:09.480
win. Right. And so China kind of aligned a strategy around it. And you have to appreciate when they've
00:07:16.440
kind of focused on an industry they wanted to be a leader in, you know, they've done it. Right.
00:07:21.920
What we're doing is we're letting the free market, the biggest tech companies just buy up all the
00:07:27.080
AI infrastructure. And then you're seeing companies like Tesla and Google, Google through Waymo and
00:07:33.420
Tesla through its robo taxi that's launching, you know, go to replace the Uber and Lyfts that employ a lot
00:07:39.280
of the people. So you say, OK, AI is bad. But what we could see and what we need to start
00:07:44.760
seeing is how does somebody create an Uber or Lyft with drivers that AIs the whole back end and puts
00:07:51.040
more money into the driver's pockets. Right. Those are the types of things we need to see actually
00:07:57.340
help. Let's go back. You say free market. You see, Costello, those are my trip
00:08:01.300
wires. You say free market. But in a sense, it's actually not a free market. As you said, 10 companies. And I would
00:08:08.380
actually say probably weighted to the top five or six. Ten companies control right now in your estimate
00:08:13.240
90 percent of the AI infrastructure, correct? Yeah. And they're also they also happen to be
00:08:19.860
30 to 40 percent of the entire market cap of the stock markets. And guess what? They're making all
00:08:26.860
these AI announcements and investments and their employment's not going up. And it's why? Because
00:08:31.760
they're investing in capital and they're investing in AI and they're investing in automating the jobs away.
00:08:36.340
Yeah. So that's the efficiency model, automating the jobs away. Do you see any way what needs to
00:08:43.960
be done to bend to your model, which is the productivity, the productivity, the creative
00:08:49.820
slash productivity model? So we need to have the conversation. Then we need to look at what are the
00:08:55.740
industries where AI can help us that we don't have today that are additive. Right. And I would argue
00:09:03.480
we even need to get down at the county level, Steve, I think we have thirty one hundred counties
00:09:07.200
in the country, and the counties now have abilities. They need to look at where the money's
00:09:12.640
being sucked out of the county, you know, from their people into big tech and how do they
00:09:17.200
execute strategies locally around this? Right. So we need to look at. So the problem is
00:09:23.480
our free market or what we call our free market. Right. Likes to go where there's the least amount of
00:09:29.760
friction. Right. And the least amount of friction in AI right now is replacing jobs. It's not building
00:09:35.220
new industries. It's automating away workers in existing things. So we need to look at how do we
00:09:41.760
align capital and form capital around net new industries, the robotics, the automation, bringing
00:09:48.000
them in. You know, here's the reality. Manufacturing is highly automated now. Right. Like it's gone a lot
00:09:53.740
away from cheap labor. Right. So we're fooling ourselves to think, you know, something somebody
00:09:59.740
up in Boston is then going to be employed in a factory down in, in, in, you know, on the Mexico
00:10:06.500
border in Texas. Like it's just not going to happen. Right. So we need to look at what are the
00:10:11.060
switching. What? But this is why in the big, beautiful bill they slipped in that no
00:10:18.020
state... You talk about the counties. They're powerless because we slipped into the big, beautiful
00:10:22.440
bill that no state can have authorization over federal law to do
00:10:30.780
anything here. So when you talk about the county level, that's kind of meaningless, isn't it?
00:10:35.920
It is. I think the way it's structured right now, we need to have the conversation. We need to look
00:10:39.880
at the counties. The counties need to look at things and say, like,
00:10:43.600
OK, now the cost of building technology is essentially zero. It's the access to the
00:10:48.280
infrastructure. Right. So they could be building their own Uber Eats locally. Right. And keeping
00:10:54.340
the money there instead of sucking all the money out to all these big tech oligarchs. So they could
00:10:59.280
you know, so you have strategies you can execute now at the local and the county level in terms of
00:11:04.520
technology and facilitating things in that environment. Like I'm a big believer, you know, this is
00:11:10.380
controversial, but I put an AI plan in when President Trump asked for it. But I believe we need a fund
00:11:17.040
at the national level that invests locally. It doesn't just put all the money in Silicon Valley.
00:11:22.200
It turns around and it helps the people of Iowa, how do they figure out AI
00:11:26.580
and how do they start things around agriculture in the local regions? Because the reality is
00:11:32.520
everybody. Yeah. You do agree that this efficiency model is all focused on the
00:11:39.980
cutting of jobs, correct? On just taking out the labor part of the whatever their business model for
00:11:45.740
this industry, whatever industry they pick. It's all an efficiency model to basically and this is
00:11:50.600
how you get to the white collar apocalypse. And we're seeing it right now, Steve, we're seeing
00:11:56.640
we're seeing record corporate profitability. We're seeing job postings decrease. Right. And we're seeing
00:12:05.540
the value of AI basically going to the shareholders in these big companies, which is 15%
00:12:12.280
of the market. And candidly, when we have foreign investors owning a lot of the value in AI, the
00:12:17.800
value sucked out of the country will go to other countries. This is what you mean. They
00:12:24.100
were talking the other day that, you know, while we're arguing about 1.7% growth
00:12:29.640
versus two and a half or three, you could have 10% actual GDP growth, but also have 10 to 15%
00:12:35.060
unemployment, particularly among white collar workers. Is that correct? Is that a potential
00:12:39.880
that's a potential that's a potential future for us? Listen, it's the guarantee, right? We have
00:12:47.040
companies now announcing huge capex spending on this idea of AI. You know, NVIDIA announced their
00:12:51.900
earnings last night, it's the biggest player in the space, and announced AI factories, right? What are AI
00:12:57.660
factories? That's, you know, what, when Wall Street cheers the fact that you're going to make a huge
00:13:02.860
capital investment in technology, that means one thing, your employment is going to go down,
00:13:06.380
you're not going to hire the labor anymore. So you're seeing it play out right right now.
00:13:12.800
What else? Before I turn to Joe, what else do you find important in NVIDIA's announcement last night?
00:13:20.000
Listen, I think the growth rate is staggering. I mean,
00:13:27.180
they're talking about being a four or five trillion dollar company. This shows how fast this is
00:13:32.480
growing, right? We've never seen a company add trillions in market cap over its lifetime,
00:13:37.620
nevermind in a couple of years, right? And the infrastructure is incredibly important.
00:13:42.440
And candidly, as a country, we need, you know, and I'm a little biased here, because we're doing
00:13:45.480
something here. We need a diverse, like having one company control all the AI infrastructure in the
00:13:50.180
world is not a good thing. It's not healthy. This is our thing about, this is our thing about going
00:13:55.000
after the oligarchs. How do you, this is what we're the lead on. And we're saying right now,
00:13:59.660
NVIDIA is something that, and the argument, oh, you can't do this because this is what's the
00:14:04.120
bulwark against the Chinese Communist Party. Clearly, NVIDIA is way too powerful right now.
00:14:10.380
No one company can have that type of market power in advanced chips. So what do you do about it?
00:14:18.400
And they're going deeper in Taiwan. And we all know the geopolitical risks, you know, you covered
00:14:22.620
early in the week, the geopolitical risks we have with China and actually in Taiwan, right? And they're
00:14:28.440
going deeper. All NVIDIA's stuff is made by Taiwan Semiconductor, TSMC, right? Who's committing
00:14:36.020
to build here? So we need to be investing more as a country in infrastructure, you know, so we have
00:14:42.380
a variety of infrastructure in different places there.
00:14:44.960
Last night the Trump administration also banned any sale of, and the NVIDIA CEO is
00:14:50.500
bellyaching about it, but correct me if I'm wrong, didn't they ban 100% any advanced chip
00:14:56.160
design to be sold to the Chinese Communist Party?
00:14:58.320
Oh, they banned the software. They told the software companies that helped them do the
00:15:02.080
chip design that they can't, there's three big players that they can't sell into China
00:15:11.060
Is that the equivalent, is that the equivalent of cutting off the oil to Japan in July and
00:15:16.380
August of 1941? Because what's happening in Taiwan right now are not, they're not exercises,
00:15:21.580
they're rehearsals. We've talked about this ad nauseum here, how these are rehearsals for the
00:15:26.640
invasion of Taiwan. Is what President Trump and the team did last night against the, you could tell
00:15:32.080
the NVIDIA and the software companies opposed it. Is this the equivalent of what happened in the
00:15:39.760
Listen, I think they're going to react and they'll probably overreact back. I don't, I don't, I think
00:15:43.740
with AI, you can actually replicate building the software. So the software is not as important as it
00:15:48.060
once, once was, and that they probably already had efforts to do that. I mean, the thing that people
00:15:52.260
need to understand on this is, you know, Xi was up in Russia a few weeks ago when Putin seemed to
00:15:58.660
make his big pivot with Trump. And if I'm Xi, I'm up in Russia and saying, hey, listen, you know,
00:16:03.580
we've got huge efforts in AI, in their biotech and pharmaceutical industry, their chip industry,
00:16:08.920
the automobile industry. We can give Russia everything you need, keep your foot on the gas.
00:16:13.480
You don't need to back down. I mean, we're missing that, you know, we're missing Ukraine.
00:16:17.260
You agree, you agree with me? They were playing footsies like two teenagers on their first date.
00:16:22.820
Yeah, we're forgetting that Xi, listen, Xi is going all in on AI. Like he's
00:16:28.280
out meeting with the AI startups in China, right? He's going up to Russia, basically, you know,
00:16:34.800
on the anniversary of World War II and has his troops marching in a parade, right? He wants him to stay
00:16:40.900
on Ukraine, right? And he's promising what he's promising to Putin. And what we now have that we
00:16:46.800
never had with Russia in the last Cold War is you have another economic power who's
00:16:51.520
promising everything they need, right? The last real big meeting they had, look, we spent two
00:16:57.180
years negotiating a deal with them, Lighthizer did, that took out all the issues with the Chinese
00:17:01.860
Communist Party, integrated them into the Western thing. And in May of 2019, after two years of
00:17:07.580
negotiation, after they had the first One Belt One Road meeting, Putin comes, I think it was in
00:17:13.160
Shanghai. They spit in our face, rip it up and say, hey, we're going to go, we're going to go
00:17:17.080
it alone. They decreed a people's war three weeks later and also decreed they're going to
00:17:21.940
technologically uncouple. We did not uncouple, they're going to uncouple.
00:17:26.540
You come back, what, six years later, they're playing
00:17:31.080
footsie at the 80th commemoration of Victory in Europe Day, at which they got Chinese
00:17:36.520
troops marching. And since that time, people got to wake up to what they've done in
00:17:43.600
Ukraine is... And remember, we're 100% for getting out of the Ukraine war. And we're a big believer in the
00:17:49.380
Russian rapprochement to break them off of the CCP. That's a lot harder today. And I think you're 100%
00:17:54.460
correct. What Xi told him is that, hey, we got this AI thing covered. You keep the pressure on in
00:17:59.100
Ukraine. And quite frankly, don't help the Americans get out of this Persian situation with
00:18:04.560
the Israelis. I want to go bomb. Hang on a second. So Joe Allen, everything you've heard from a
00:18:09.680
practitioner, Brian Costello, who's putting money to work every day in this area. Your thoughts?
00:18:19.260
Steve, Brian, I think it's important for me to say first and foremost, some of my best friends
00:18:24.340
are AI developers and cybersecurity experts. So while I myself am very much a kind of an
00:18:34.140
anti-tech extremist, I think that it's important to remember that there is no one model for how to
00:18:41.420
go forward into the future. And you have the two extremes, right? You have the one extreme all in
00:18:46.840
on this sort of singularitarian, transhumanist, post-humanist mindset. I'm going to become a
00:18:52.840
cyborg. You have something much more on my wavelength of being extremely technical or extremely skeptical
00:19:00.880
of all of these technical solutions for spiritual problems. But there is this huge middle ground.
00:19:07.540
People, you know, I still use phones. I still use cars, right? And nobody knows a good Luddite. You've
00:19:13.060
never heard of them because they live in the woods. So, you know, with Brian's efforts to steer AI and
00:19:19.460
provide tools for people to survive in this environment that's being foisted upon us, I can't do anything but
00:19:27.440
just say I support these efforts. But one of my best friends, Justin Lane at CulturePulse, is trying
00:19:34.700
to do something very, very similar and others whom I can't mention on air. And I think that this is going
00:19:42.020
to be an important part of it. So, you know, I don't mean to suck up all the oxygen with kind of
00:19:48.180
anti-tech skepticism. I think that's going to be one of the most important shields going forward for
00:19:54.520
people morally and spiritually to have these cultural barriers. But the reality is that much
00:20:00.940
like Bitcoin provides some sort of alternative to fiat currency and much like, you know, natural
00:20:09.400
remedies or generic drugs. Stop, stop. Yo, yo, what do you mean you're an extremist? What do you mean
00:20:15.480
you're an extremist? I don't hear anything extreme in that. The extremists are Elon Musk and, hold it,
00:20:21.220
Dario. Look, we've got to get some definitions here for nomenclature. The accelerationists are
00:20:26.260
extremists. You're not an extremist. You're kind of trying to put common sense on this. You're not
00:20:31.180
you're not an AI professional practitioner like Brian Costello putting money to work and managing
00:20:36.840
companies, investing in companies that are at the cutting edge of AI. But why do you say
00:20:42.540
you're an extremist? You're not an extremist. The accelerationists are the extremists, because it takes
00:20:48.760
10 times more regulation on Capitol Hill in the imperial capital to open a nail salon or a
00:20:56.440
hair braiding operation than it takes on all of artificial intelligence. They're the extremists
00:21:02.260
and they got a political class here that's bought off that is essentially stepping back and saying,
00:21:07.160
hey, it's a free market, you know, just let it go, when it ain't the free market, because 90% of
00:21:13.220
this has been underwritten by American taxpayer dollars through the universities and through the
00:21:18.520
weapons labs and the national labs. Joe Allen. Well, you know, everybody has to draw their lines
00:21:25.340
on how far they're going to go on all this. And my lines are drawn as close to the human as possible.
00:21:30.260
As a writer, I am completely, I'm an absolutist. I do not use AI for anything, for any of it. As a
00:21:36.780
reader, a researcher, I don't use AI for anything. In fact, outside of seeing the AI overviews that are
00:21:43.680
foisted on me, I don't use AI for anything. But there are people who will. I think that in the case of
00:21:51.220
the extremists on the accelerationist side, that some of the kind of Luddite impulse is a reaction to
00:21:58.520
that. These people have a totalizing vision for where all of this goes, including putting chips
00:22:04.760
in people's brains so they can survive in this hyper-technological society. But in the case of
00:22:11.160
Brian Costello, we've had Max Tegmark on. Hang on, hang on. It's their totalist, maximalist
00:22:19.000
vision whose arc we're actually heading down. And one of the reasons is they have a totalist,
00:22:24.660
maximalist vision. And you don't. And I don't. And I'm with you. I don't use any AI for any reason.
00:22:30.660
And I've, people I greatly admire have told me over the last month, they said, you're an idiot.
00:22:34.980
You've got to at least use Perplexity or take your pick. You've got to at least get involved here.
00:22:39.440
So you understand it. I'm, I'm an extremist in that right now, because I can see what it's doing
00:22:43.900
to people already. But the maximalist and the accelerationist, they have a vision and they're
00:22:49.960
pretty smart about rolling it out. And you've got Brian Costello, who's like one small boat,
00:22:56.060
right? In a, in a, in a sea of ocean liners and battleships and carriers that are all heading
00:23:01.900
toward a destination. And that destination is the singularity. You heard Altman right there with
00:23:06.380
Fetterman and God bless John Fetterman. He asked the question every senator should have been asking,
00:23:11.340
what's the convergence point on this? What, what leads up to it? And then what comes after it
00:23:17.020
and forced Altman to answer the question, Joe Allen? Yeah. You know, outside of the,
00:23:23.120
just the, the big picture, just on a practical level, and this is what Brian is talking about
00:23:27.840
on a practical level, what do you do? Uh, you know, I think Max Tegmark makes a really good
00:23:33.340
argument on this at the Future of Life Institute. The audience should be well familiar with Max Tegmark
00:23:39.860
at this point. And he says that the key to AI regulation is to leave allowances for tool AI,
00:23:47.220
AI that human beings are in control of: narrow AIs for medicine, narrow AIs for financial analysis,
00:23:54.040
things like this. What should be off the table is any AI that is out of human control. And I think
00:24:01.860
that should include control of the populace. That the populace doesn't have a say in what is
00:24:08.540
deployed is abhorrent. The populace should have a say as to what flies and what does not.
00:24:15.840
Okay, Joe, we got to bounce, uh, right now, where do people go? Social media, all your works,
00:24:24.900
At joebotxyz and joebot.xyz. Thank you very much, Steve. Thank you very much,
00:24:31.380
Brian. Brian, you're a lonely voice out there in the, in the middle of this, uh, massive new,
00:24:38.840
uh, post-industrial revolution. Where do people go? Your Twitter feed is on fire about this,
00:24:43.100
particularly the two models of efficiency versus creativity slash productivity. Where do people
00:24:49.280
go, Brian? Yeah. Thanks, Steve. It's on, uh, on X it's, uh, BP Costello. Uh, keep up the good work.
00:24:57.980
Well, we're going to get back to you maybe tomorrow, the next day to go back more through
00:25:02.160
this. Because the Axios articles, two of them this morning, Joe hit on the coming
00:25:07.800
white collar apocalypse. We ain't talking five years. You're talking six to 12 months.
00:25:13.880
It's upon us. I told you. I mean, you look at Cognizant, you look at IBM, you look at
00:25:18.040
Microsoft, you look at McKinsey laying off 10%. You look across the board at all these
00:25:23.100
earnings announcements where they talk about laying off five to 10% of their workforce.
00:25:27.700
They don't want to say that it's AI-driven, but AI is driving all of it, all of it. Okay.
00:25:32.800
Short commercial break. Make sure you go to birchgold.com. In times of turbulence, find out
00:25:38.420
why gold has been a hedge for 5,000 years of mankind's history. The End of the Dollar Empire.
00:25:43.680
Understand what the dollar is, the prime reserve currency, and the importance it has in your financial
00:25:48.640
life. Check it out. It's all free. Birchgold.com, promo code Bannon. Go check
00:25:55.500
it out. Seventh free installment, the road to Rio. The Rio reset is July 6th. That should be,
00:26:00.240
I don't know, two or three days after the Senate votes on their version of the big, beautiful bill.
00:26:05.160
Check it out. We're going to leave you for this half hour with "The Man Comes Around," the best cover
00:26:10.520
from the Book of Revelation, according to St. John the Evangelist. Next, Andrew Breitbart said
00:26:20.320
culture is upriver from politics. That might have been an homage to Gramsci.
00:26:26.620
He knew the Frankfurt School backwards and forwards. Two of the cultural drivers of the MAGA movement
00:26:33.220
Some are born and some are dying. It's Alpha and Omega's kingdom come. And the whirlwind is in the
00:26:46.280
thorn tree. The virgins are all trimming their wicks. The whirlwind is in the thorn tree. It's hard for
00:26:59.620
thee to kick against the pricks. This July, there is a global summit of BRICS nations in Rio de Janeiro. The bloc of
00:27:05.960
emerging superpowers, including China, Russia, India, and Persia, are meeting with the goal of
00:27:12.460
displacing the United States dollar as the global currency. They're calling this the Rio reset.
00:27:19.380
As BRICS nations push forward with their plans, global demand for U.S. dollars will decrease,
00:27:24.320
bringing down the value of the dollar in your savings. While this transition won't
00:27:29.340
happen overnight, trust me, it's going to start in Rio. The Rio reset in July marks a pivotal
00:27:36.480
moment when BRICS objectives move decisively from a theoretical possibility towards an inevitable
00:27:43.200
reality. Learn if diversifying your savings into gold is right for you. Birch Gold Group can help
00:27:51.000
you move your hard-earned savings into a tax-sheltered IRA in precious metals. Claim your free info kit on
00:27:57.260
gold by texting my name, Bannon, that's B-A-N-N-O-N, to 989898. With an A-plus rating with the Better
00:28:04.900
Business Bureau and tens of thousands of happy customers, let Birch Gold arm you with a free,
00:28:10.440
no-obligation info kit on owning gold before July and the Rio reset. Text Bannon, B-A-N-N-O-N,
00:28:18.520
to 989898. Do it today. That's the Rio reset. Text Bannon at 989898 and do it today.
00:28:28.100
If you're a homeowner, you need to listen to this. In today's AI and cyber world, scammers are stealing
00:28:35.100
home titles with more ease than ever, and your equity is the target. Here's how it works. Criminals
00:28:41.480
forge your signature on one document, use a fake notary stamp, pay a small fee with your county,
00:28:47.220
and boom, your home title has been transferred out of your name. Then they take out loans using
00:28:53.820
your equity or even sell your property. You won't even know it's happened until you get a collection
00:28:59.520
or foreclosure notice. So let me ask you, when was the last time you personally checked your home title?
00:29:09.120
If you're like me, the answer is never. And that's exactly what scammers are counting on.
00:29:14.560
That's why I trust Home Title Lock. Use promo code Steve at HomeTitleLock.com to make sure your
00:29:22.160
title is still in your name. You'll also get a free title history report plus a free 14-day trial
00:29:29.420
of their million-dollar triple lock protection. That's 24-7 monitoring of your title. Urgent alerts
00:29:35.620
to any changes, and if fraud should happen, they'll spend up to $1 million to fix it.
00:29:41.640
Go to HomeTitleLock.com now. Use promo code Steve. That's HomeTitleLock.com, promo code Steve. Do it today.
00:29:50.360
Enjoy a delicious glass of doctor-formulated Field of Greens each day, and you're going to feel
00:29:56.340
amazing. Plus, your doctor will notice your improved health or your money back.
00:30:01.600
How can a fruit and vegetable drink promise better health? Each fruit and vegetable in Field of Greens
00:30:08.160
was doctor-selected to support vital organs like heart, liver, kidneys, metabolism, immune system,
00:30:15.380
and healthy blood cells. Let me get you started with 20% off and free shipping. Visit FieldOfGreens.com
00:30:22.440
and use my code Bannon. That's FieldOfGreens.com, code Bannon. Remember, every day you get the max
00:30:31.040
you need of your fruits and vegetables in this real organic superfood. That's FieldOfGreens.com,
00:30:41.440
Did you get laid yesterday? I didn't. I'm concerned that someone forgot to validate my time or
00:31:02.820
something. Oh, that's funny. Paid, not laid. There's no way he gets elected, Dash. But I'm afraid
00:31:17.320
we can't take that risk. It's like an insurance policy. God, Hillary should win 100 million to zero.
00:31:32.820
Just went to a Southern Virginia Walmart. I could smell the Trump support.
00:31:51.040
You can come hang out in 4012 with me. I have remnants of cuboted chips and salsa.
00:31:56.780
It's going to be a Clinton-Trump race. Unbelievable.
00:32:04.120
What? Exclamation point, question mark, exclamation point, question mark, question mark, question mark.
00:32:17.220
Okay. Welcome back. Two of my favorite people and two of the most creative people in the country,
00:32:43.720
and particularly the MAGA movement, Phelim McAleer and Ann McElhinney. You guys are filmmakers.
00:32:49.580
You also produce plays. You do, of all the stuff you've done, I love all your material.
00:32:54.640
What you did, I think it was on the Mueller report. You did a live, like, stage reading of it.
00:32:59.960
Yeah, and released it as a podcast. We also did FBI Lovebirds. You just saw the trailer there.
00:33:04.480
We took the actual text messages of Strzok and Page and their congressional testimony.
00:33:14.180
But also, it's teen romance meets John le Carré. You know, oh, Lisa, I love you. Come on.
00:33:20.920
Come on. I'll give you a debriefing in the quiet room, you know, and then let's take down Trump tonight.
00:33:25.620
You know, let's take down your panties and take down Trump tonight.
00:33:32.020
Talk to me about this moment as creatives. We just finished, you know, we've been all over this AI.
00:33:38.740
You guys have always been at the cutting edge of filmmaking, particularly being independent filmmakers, always bootstrapping it.
00:33:46.080
Are we going to lose something of the creative talent of individual filmmakers with this AI?
00:33:51.740
Is it we have the efficiency versus the productivity model?
00:33:55.580
There's a lot of people very scared. I mean, you're seeing that from people in Hollywood.
00:33:58.640
They're really worried. Screenwriters are really worried. Actors are worried.
00:34:01.400
I mean, I'm just seeing people like Zachary Levi have written a lot about it.
00:34:04.400
He's been talking a lot about it. People, people are, yeah, people are worried about it.
00:34:08.400
However, I think we're, I think we're still in a good place.
00:34:11.100
I think it's going to hurt bad filmmakers and people who write by rote.
00:34:17.740
You know, it's, it's, and you can see that even.
00:34:23.040
You know, 90% of a lot of these industries, these so-called creative industries, let's say 60%, a lot of it's bad.
00:34:29.660
Like, there's so, like, we used to go to the movies all the time.
00:34:33.240
Because, because, and I say, go to the movies, we'd say, let's go to the movies.
00:34:38.040
I think the latest Tom Cruise edition of Mission: Impossible is awful.
00:34:43.440
But hasn't filmmaking... I screened The Searchers for some people on TCM.
00:34:51.120
We put it up on a bigger screen and they were absolutely stunned by the photography and the color and not just the acting.
00:34:57.800
But I, and I told people, you can't make that film today.
00:35:05.960
Technology has actually destroyed great Hollywood filmmaking.
00:35:10.280
I mean, this AI will wipe out bad filmmakers because there's no real skill there.
00:35:17.740
So it's, it'll make us hopefully all raise our game.
00:35:22.840
I want people to get to know you and also support your work.
00:35:26.800
What are you guys working on? Because this moment in history will be remembered for a hundred years.
00:35:31.720
And obviously we have never really built the creative chops.
00:35:37.360
You guys are, have been always at the tip of the spear.
00:35:41.960
What are you guys working on and what do you need as far as help goes?
00:35:45.320
Well, I just wanted to say we're here because of Andrew Breitbart, as many of us are.
00:35:49.640
Like we met Andrew in a bar, what was it, 15 years ago?
00:35:55.960
I mean, literally it was Andrew Breitbart who encouraged us to come here.
00:36:01.900
Well, I think that, you know, we, we met him in the first time we met him was in Washington,
00:36:06.040
D.C. and I had somebody yesterday describe Washington, D.C. as just a soup of mediocrity.
00:36:11.000
And then in the middle of that, and you can imagine exactly that experience, Andrew walks
00:36:14.740
into this room where everyone was wearing the typical uniform, you know, the, you know,
00:36:20.480
I was like, I was like, don't just bring me with you, you know?
00:36:24.180
And he basically said, yeah, you guys need to come to LA.
00:36:26.780
And we did come to LA and he introduced the first time we went to LA, he introduced us
00:36:33.120
But do you remember the first time we met Andrew?
00:36:35.080
It was in that man's house, which has since burned down.
00:36:40.480
And Andrew was standing in front of his freezer saying, I'm thinking of setting up this website.
00:36:44.880
The question is whether I should call it after my name or not.
00:36:49.560
Because, like, you've got Drudge Report and Huffington Post and Breitbart.
00:36:59.040
I literally said to him, I don't think you should call it Breitbart.
00:37:04.240
Listen, he also got something very deep that nobody in politics got, that culture is
00:37:14.340
It's one of the reasons we've been such a great fight.
00:37:17.360
And President Trump has changed the culture a lot.
00:37:19.780
But when we look at high culture and how that rolls down and pop creative culture, we're
00:37:31.180
By the way, I'm just looking at, you know, the backdrop here.
00:37:33.180
And it makes me think about, you know, we're looking at the image of Jesus there.
00:37:36.660
And I'm thinking, like, at this moment, what's the most evil thing right now that's threatening
00:37:42.460
The most evil thing we think is the trans madness that is destroying children forever, making
00:37:51.860
You know, just, I mean, isn't that a precursor to transhumanism?
00:37:58.960
If you can get people to think they can be anything and change anything right away from
00:38:07.000
That's one of the projects we're going to be working on.
00:38:10.380
We're thinking of doing a documentary with a friend of yours, Miriam Grossman, who's
00:38:15.800
And she basically said, please tell Steve I was asking for him.
00:38:21.480
It's like the hand of God, because we interviewed her last week and then she came to Los Angeles
00:38:24.800
and she said, look, I've been thinking of making this documentary.
00:38:27.120
We're like, OK, we're in, you know, because what a great idea.
00:38:30.980
And it's extraordinary what has happened to the medical establishment, how they've been,
00:38:39.160
And tried to destroy her and the people that stood up.
00:38:41.140
Absolutely, that you're, you know, that suddenly you're not, you lose your voice.
00:38:53.500
What really brought that home to me was the weird thing.
00:38:55.900
When they introduced medical marijuana in California, you had all these pop-up shops
00:39:01.840
where doctors were in there signing prescriptions for people.
00:39:05.280
If they went in and said they had a sore back.
00:39:06.680
And I realized, my God, there's thousands of doctors who will do anything for money.
00:39:14.940
So we're kind of, you know, they're the good ones.
00:39:16.920
And so when you add in ideology and the God complex, you know, I can change people.
00:39:24.600
It's a, it's, you should not trust your doctor, you know, or any doctor or any establishment.
00:39:30.020
If we take your film slate and we take your live theater slate, right, walk us through what you're working on.
00:39:35.120
Because we want people to go to your site, get to know you better.
00:39:37.400
And particularly if they want to participate in funding this or helping produce it or be an executive producer.
00:39:44.640
So anyone who does donate to us, we're a charity and tax deductible.
00:39:50.740
I mean, one of the things that people, maybe a lot of people know us from is the Gosnell movie,
00:39:53.940
which we, you know, we wrote a New York Times bestselling book.
00:39:56.700
And we had a movie with Dean Cain starring in the role of Detective Jim Wood, who found this extraordinary doctor, a Mengele-like character, in West Philadelphia.
00:40:12.160
Because it was so, I mean, it drew you in as a great film.
00:40:14.940
But the guy is so insane in the way he abused those people, that minority community.
00:40:20.880
But remember, that came from a cover up by the mainstream media.
00:40:24.440
Like there would be no Gosnell story if the media had covered it in the first place.
00:40:29.280
You know, and this idea that the cover up of Joe Biden is a shock to the media.
00:40:34.140
They have been, they have been putting their thumb on the scale for decades.
00:40:41.640
They didn't go and cover the Gosnell trial.
00:40:51.500
So you could just take the train from New York.
00:40:53.580
I mean, it wasn't like it's, I was out in the middle of nowhere.
00:40:56.380
When you made that film with top talent and you went to get it distributed, what was the response you got?
00:41:02.360
It was worse than Passion of the Christ, right?
00:41:05.040
And actually we did, we did distribute it in theaters.
00:41:11.500
Like you'd, people, people wrote to us and told us they would arrive at a cinema.
00:41:14.900
They'd have hired a bus and brought up a whole bunch of people to go and see the movie.
00:41:17.900
It wouldn't be up on the marquee, you know, they wouldn't put it on the marquee.
00:41:21.340
The guy, people wrote and said that when they went to buy a ticket, the guy said, do you really want to see that?
00:41:25.240
I think you should go and see this thing instead.
00:41:26.740
But like, unbelievable, like the list of things that happened, you know, we had, the unions came after us, by the way, during the filming to try and shut the whole thing down.
00:41:36.040
It was, at every level of that film, we found opposition.
00:41:43.820
You know, it's not who controls the means of production, certainly in entertainment.
00:41:49.900
You know, because, you know, you know this yourself, Steve, you've been banned from so many social media channels.
00:41:56.380
But when you went to distribute the movie, no distributor at all, although you could see the way to make money in this thing.
00:42:01.480
No, no, no, look, like we, this was a world record crowdfunding, right?
00:42:06.100
This, we raised 2.3 million from, what was it, 70,000, 30,000 people.
00:42:10.480
So we had 30,000 evangelical people who had literally put their money where their mouth was, right?
00:42:18.160
And by the way, imagine if you had funded a movie.
00:42:28.120
And it's like, it's kind of guaranteed opening weekend, a massive box office.
00:42:42.220
And I think the world is changing a little, you know, but it's still, you're right.
00:43:04.880
And you don't have the ability to make a couple of bombs.
00:43:16.020
You know, yes, if we had 40 years of experience making, or 50 years of experience making.
00:43:21.120
And flooded with money from Soros and from Bloomberg and all these guys.
00:43:25.820
Because that's something that I've always, I never got the answer to this.
00:43:28.900
But on the left, they really do believe in the arts.
00:43:35.900
Like, look, I've spoken to donors many times in the past.
00:43:43.980
You know, we've had people basically see the Gosnell movie.
00:43:46.800
I mean, people we actually know who are very pro-abortion.
00:43:50.120
Who said, 90 minutes, they changed their mind about abortion.
00:44:05.040
Particularly film where you sit in a theater and have, you know, it comes at you over it.
00:44:09.300
You have the communal experience of watching Gosnell.
00:44:12.560
And you've got young people coming out saying, I'm never going to let this happen again.
00:44:27.940
The theater is, is, is, is the worst, worst of all the arts, I think.
00:44:34.420
So we took the transcripts of the grand jury investigation into the shooting of Michael Brown,
00:44:43.720
And, and most of the voices are black voices saying this never happened, that he, he attacked
00:44:51.040
And as one lady, one lady who spent months dodging a subpoena, but eventually was caught
00:44:56.660
and she had a black liberation tattoo on her arm.
00:45:01.960
I don't want to, you know, I don't, I hate the cops, but that guy, that guy was going to
00:45:07.080
And so we did the first rehearsal in LA, uh, nine of the actors walked out after the
00:45:14.140
first rehearsal and this, you know, and this is because the truth didn't match what they'd
00:45:22.840
So the, it's an incredibly powerful play, Ferguson, uh, because it's the truth.
00:45:29.540
Where do people go to get access to all your content?
00:45:35.540
And, uh, and they get all the films, they get access to everything.
00:45:42.400
Go through your slate of what you're working on that you want to expose people to say,
00:45:50.200
If, is that also on unreported projects in development?
00:45:53.500
Well, one of the fun things we want to do, by the way, is we want to bring FBI Lovebirds
00:46:00.820
And, you know, we could have some very interesting people in the starring roles.
00:46:04.600
Um, and it could be a lot of fun because not just that you get a media firestorm.
00:46:16.600
We love Bongino, but guys, we got to step it up over there.
00:46:22.060
Look, um, we need to, I know, I know you're not a big fan of, uh, of Elon, but we need
00:46:29.100
We need to scorch the earth, you know, the fair play to Elon Musk.
00:46:34.740
Don't you think there's enough there to start a grand jury investigation just for what you've
00:46:39.280
But the, but the, maybe the statute of limitations has passed, you know, but we need to, I think
00:46:43.860
If we're not doing it, why we're not doing it, you know, this, it's dirty, dirty, dirty
00:46:54.180
And, you know, no, it's, it's, there's a lot of bad apples over there.
00:46:58.520
And I don't know if the whole barrel is rotten at this stage that they need to clean house.
00:47:02.920
And, uh, you know, they need to go in and maybe get rid of 50% of, of, of the operatives.
00:47:09.380
I mean, look at when they were doing all that illegal stuff.
00:47:13.080
And the, and the Russia hoax, both investigating the campaign and investigating the president,
00:47:22.780
And I don't know whether it's people were frightened or maybe they were true believers.
00:47:29.540
So what are the two or three things you're trying to launch?
00:47:35.260
We did stage it, did a staging here in DC, had a really fun few nights.
00:47:43.160
After the October 7 massacres in Israel, we went to Israel, interviewed about 20 people
00:47:48.040
and boiled it down to about 14 of the most dramatic stories of that day.
00:47:55.660
All of them are tales of trauma, heroism, survival, resilience.
00:48:11.980
And we'd like to bring it to Ireland, by the way.
00:48:23.460
And as I said, we are the other thing we're working on.
00:48:34.440
The cultural guys, firstly, they wouldn't let you guys come back.
00:48:37.800
If they told you, if you said, I guarantee you, we ought to announce that.
00:48:48.460
We had a very lengthy back and forth just to rent the place.
00:48:52.880
And by the way, they're desperate for money because all of their money comes from the government.
00:48:58.480
So, you know, rent the place out on a dark night.
00:49:05.560
This play does not fit in with our artistic ambitions, and we haven't got an answer as to what their artistic ambitions are.
00:49:12.620
But we do know what their artistic ambitions were because what was on stage at the time was something made by a Palestinian poet whose ambition was the annihilation of Israel.
00:49:22.820
So we know something about their artistic ambitions, but we weren't acceptable.
00:49:25.860
Could you get a theatre here in Washington, D.C. to stage it?
00:49:31.540
If anyone out there knows a theatre whose ownership...
00:49:41.180
And funny, there are actors around who will act in this.
00:49:46.180
And, you know, some people are very, very passionate about it.
00:49:48.540
And quite high-profile actors as well, by the way, with acting.
00:49:52.100
I really recommend people to go to our website and watch some of the clips from it.
00:49:56.460
And then the trans thing, as I said, we are working on a documentary.
00:50:00.240
You don't pick any controversial topics, please.
00:50:03.740
Well, first off, the protests on Dr. Grossman's play or whatever, and the protests on October 7th would be...
00:50:10.740
There's probably quite a bit of overlap, you know?
00:50:26.960
And we're kind of, you know, we're everywhere on social media.
00:50:37.080
Go to the October 7th The Play Instagram page, October7theplay.com.
00:50:45.360
Look, there's no point in being in this fight if you're not going to fight.
00:50:48.120
If you're not getting hit over the head, you're not doing God's work, you know?
00:51:01.040
What I love is that you had the theater, which is my first love.
00:51:11.760
There's some conservative voices in Hollywood, but not in the theater.
00:51:16.000
And I'd like to know why someday, but it's just...
00:51:18.400
But the night that we had FBI Lovebirds in Washington, D.C., it was like...
00:51:25.300
And coming from Ireland, with the Abbey, all the great Irish playwrights.
00:51:48.900
The IRS doesn't mess around, and they're applying pressure like we haven't seen in years.
00:51:53.400
So if you haven't filed in a while, even if you can't pay, don't wait.
00:52:06.260
Tax Network USA isn't like other tax relief companies.
00:52:09.880
They have an edge, a preferred direct line to the IRS.
00:52:13.340
They know which agents to talk to and which ones to avoid.
00:52:16.460
They use smart, aggressive strategies to settle your tax problems quickly and in your favor.
00:52:23.340
Whether you owe $10,000 or $10 million, Tax Network USA has helped resolve over $1 billion
00:52:39.360
Talk with one of their strategists and put your IRS troubles behind you.
00:52:53.100
Or visit Tax Network USA, TNUSA.com slash Bannon.
00:53:03.180
...to provide American-made natural supplements without all the artificial nonsense.
00:53:09.880
So unfortunately, as many of you know, a lot of these big corporate supplements
00:53:13.860
will include things like preservatives, artificial ingredients, and other additives that really
00:53:20.620
So that's why we created Sacred Human, really trying to fill this gap with quality supplements.
00:53:25.680
And of course, the beef liver being our flagship product.
00:53:28.800
For those who don't know, beef liver is loaded with highly bioavailable ingredients such as
00:53:38.360
And because it is 100% grass-fed and natural, your body is able to absorb these nutrients
00:53:44.220
far better than taking any other synthetic multivitamin or any other synthetic vitamin in general.
00:53:52.820
But if you'd like to check us out, you can go to sacredhumanhealth.com.