Bannon's War Room - August 13, 2025


Episode 4701: Antitrust Slams Largest US Landlord, OpenAI Bombs, and DC Crime Wave Brought to a Grinding Halt


Episode Stats

Length: 54 minutes
Words per Minute: 161.7
Word Count: 8,801
Sentence Count: 714
Misogynist Sentences: 5
Hate Speech Sentences: 5


Summary

The DOJ is suing Greystar, the largest landlord in the United States, for colluding with its competitors in order to drive up rents on Americans. Guest host Joe Allen talks to Gail Slater, Assistant Attorney General for Antitrust at the Department of Justice, about the case. We also hear from NYU professor Gary Marcus, nemesis of OpenAI's Sam Altman, who breaks down the flop that was GPT-5. And we discuss the supposed militarization of Washington, D.C.


Transcript

00:00:00.000 This is the primal scream of a dying regime.
00:00:07.000 Pray for our enemies.
00:00:09.000 Because we're going medieval on these people.
00:00:12.000 I got a free shot at all these networks lying about the people.
00:00:16.000 The people have had a belly full of it.
00:00:18.000 I know you don't like hearing that.
00:00:20.000 I know you try to do everything in the world to stop that,
00:00:22.000 but you're not going to stop it.
00:00:23.000 It's going to happen.
00:00:24.000 And where do people like that go to share the big lie?
00:00:27.000 Mega Media.
00:00:28.000 I wish in my soul, I wish that any of these people had a conscience.
00:00:33.000 Ask yourself, what is my task and what is my purpose?
00:00:37.000 If that answer is to save my country, this country will be saved.
00:00:44.000 War Room. Here's your host, Stephen K. Bannon.
00:00:54.000 Good evening.
00:00:55.000 Good evening.
00:00:56.000 It is Tuesday, August 12th in the year of our Lord, 2025.
00:01:00.000 I am Joe Allen sitting in for Stephen K. Bannon,
00:01:03.000 who against his deepest spirit and genetic constitution is not working tonight.
00:01:10.000 Well, I bet he's probably working somewhere.
00:01:13.000 I don't know anyone who works harder than Steve Bannon, except for me.
00:01:19.000 Tonight, we're going to talk about the antitrust suit against the largest landlord in the United States,
00:01:33.000 Greystar, which has been colluding with its competitors in order to drive up rents on Americans.
00:01:41.000 Americans work damn hard for their pay, and it is atrocious what these corporations are doing to siphon all that money out of their paychecks and into their vast coffers.
00:01:55.000 We're also going to be talking to Gary Marcus, the NYU professor and nemesis of Sam Altman of OpenAI, here to break down the flop that was GPT-5 from a very expert and somewhat biting perspective.
00:02:12.000 Following, we will discuss the supposed militarization of Washington, D.C.
00:02:19.000 I've been here for a few days.
00:02:21.000 Maybe I'm in the wrong place.
00:02:22.000 I need to find out where all of these soldiers are and where they're kicking in doors because it's pretty calm and peaceful.
00:02:30.000 Other than the few straggling homeless, they need to get out to the east end a little bit further.
00:02:35.000 And we will close out with a little talk about AI and prayer, specifically the psychotics who are praying to artificial intelligence as if it were Jesus.
00:02:51.000 So I'd like to bring in Gail Slater, Assistant Attorney General for Antitrust at the U.S. Department of Justice.
00:02:59.000 Gail, thank you very much for being with us this evening.
00:03:02.000 Thank you, Joe.
00:03:03.000 It's a thrill to be here and to finally meet you in person, albeit remotely.
00:03:07.000 Yeah, I wish you could be here with me.
00:03:10.000 But, you know, next time, maybe coffee soon.
00:03:13.000 So, Gail, this suit against Greystar, we're at the settlement phase.
00:03:20.000 If you could just walk the audience through what the offense was, what the likely settlement will be and where we go from here.
00:03:30.000 Can we look forward to better conditions for renters in the U.S.?
00:03:35.000 You bet. You bet.
00:03:37.000 So let's start with why rent is so important.
00:03:42.000 Rental prices are so important to so many Americans.
00:03:46.000 And then we'll get into Greystar.
00:03:46.000 So we know that millennials rent more than they own.
00:03:52.000 They're having a hard time getting onto the property ladder.
00:03:56.000 And we also know that millennial Americans spend about a third of their monthly income on rent alone.
00:04:02.000 So this is a really important issue, not just for young people, but for all renters.
00:04:06.000 Greystar, as you said, is the biggest landlord in the country.
00:04:10.000 And we think they have almost a million units under management.
00:04:13.000 So it's a lot of properties.
00:04:15.000 It's a lot of rental income.
00:04:16.000 And so what the DOJ has done over time is investigated and litigated against not just Greystar, but a bunch of other large landlords, as well as a company named RealPage.
00:04:29.000 And what we discovered during the investigation was that RealPage and other landlords were working together to pull competitively sensitive data.
00:04:38.000 And for them, that's rental prices in local markets throughout the country.
00:04:42.000 And they were running that data through RealPage's software using algorithms.
00:04:48.000 And prices on rental properties were going up.
00:04:52.000 They weren't going down.
00:04:53.000 And we think the prices were certainly higher than they would have been had this conduct not been engaged in.
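The mechanism Slater describes, competitors feeding non-public rents into one shared pricing tool that hands back recommendations, can be made concrete with a small toy model. The Python sketch below is hypothetical; it is not RealPage's actual software or anything alleged in the complaint, and the rents, uplift factor, and adoption behavior are invented purely to show why recommendations anchored to pooled competitor data tend to ratchet prices upward rather than letting them compete downward.

```python
# Hypothetical "hub and spoke" pricing illustration (toy model, not RealPage's algorithm).
# Spokes: competing landlords who share non-public rents with a single hub.
# Hub: recommends a price anchored to the pooled data, which every spoke then adopts.

from statistics import mean

# Roughly what each landlord would charge for a comparable unit if pricing independently.
independent_rents = {
    "landlord_a": 1450,
    "landlord_b": 1500,
    "landlord_c": 1520,
    "landlord_d": 1480,
}

def hub_recommendation(shared_rents, uplift=0.05):
    """Pool everyone's sensitive data and recommend slightly above the pooled average."""
    return round(mean(shared_rents.values()) * (1 + uplift))

def simulate(rounds=3):
    rents = dict(independent_rents)
    for _ in range(rounds):
        rec = hub_recommendation(rents)
        rents = {name: rec for name in rents}   # every participant adopts the recommendation
    return rents

if __name__ == "__main__":
    print("independent pricing:       ", independent_rents)
    print("after shared-algorithm use:", simulate())
```

The point of the toy model is the feedback loop: once every spoke prices off the same pooled signal, the recommendation reflects the group's data rather than head-to-head competition, and each round of adoption pushes the next recommendation higher.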
00:04:58.000 So fast forward to last week, and we entered into a settlement with Greystar.
00:05:04.000 And Greystar has agreed not to use RealPage's software anymore.
00:05:08.000 They've agreed not to engage in what we call digital collusion to set the prices at levels higher than they would be for all Americans who are renting from them, particularly from young millennials, for all the reasons that we care about that for.
00:05:25.000 And we're very, very happy about the settlement.
00:05:27.000 It's a very effective way of settling the litigation without incurring costs for taxpayers and costly litigation that we might eventually lose on appeal five years from now and so on.
00:05:39.000 And so it's a good settlement.
00:05:41.000 They're not going to use the software anymore.
00:05:43.000 And we're hoping to see that pass through in the form of real price competition between these landlords going forward.
00:05:49.000 I'm curious.
00:05:51.000 We hear a lot about BlackRock and other major corporations and investment houses that are buying up all this property and jacking up the rents.
00:06:01.000 How widespread is this algorithmic or digital collusion?
00:06:07.000 Sure.
00:06:08.000 So leaving BlackRock aside, we've scoped out that there's at least more than one landlord involved.
00:06:14.000 And then, of course, RealPage was the sort of hub in the hub and spoke that this data was being fed through.
00:06:22.000 We're still working with some other landlords.
00:06:24.000 That's confidential for now.
00:06:26.000 And there was a smaller settlement earlier this year with a smaller landlord.
00:06:30.000 But this first step with Greystar, this proposed settlement, is, we think, a really concrete step in the right direction and hopefully will get us to a global resolution with everyone involved.
00:06:41.000 We think it's really good government and it's going to give relief to consumers today rather than five years from now.
00:06:47.000 You know, I think one of the really heartening aspects of the Trump administration is this push towards antitrust legislation and enforcement.
00:06:58.000 Josh Hawley has been a strong advocate of this.
00:07:01.000 Many others within that camp.
00:07:04.000 So Google right now is imminent, correct, that the settlement will be announced in their antitrust case.
00:07:13.000 Can you talk about that a little bit?
00:07:15.000 Yeah, for sure.
00:07:16.000 So this is another case where we've had a bipartisan consensus around antitrust, tough antitrust enforcement for quite some time now.
00:07:24.000 This was a case involving Google search engine, which, of course, is a household name.
00:07:29.000 It's a verb.
00:07:30.000 We think that Google's had about 90 percent of the online search market for a couple of decades now.
00:07:35.000 The first Trump administration stepped in and actually sued Google in October of 2020,
00:07:43.000 so right before he left office, under our century-old monopolization laws.
00:07:49.000 And the accusation, the allegation in that complaint was that Google had monopolized online search in a couple of different ways.
00:07:58.000 And we can get into that if you like.
00:08:01.000 That case went to litigation under the last administration.
00:08:03.000 And now we've come out the other side of that litigation.
00:08:06.000 The judge in D.C. found them liable.
00:08:09.000 Indeed, they were a monopoly and they had monopolized online search.
00:08:13.000 And then under my tenure and with the blessing of President Trump, who put me in office and announced at the time of my nomination that it was with an eye to holding the line on big tech antitrust enforcement.
00:08:29.000 So earlier this year, we brought the Google case forward to what we call the remedies phase.
00:08:36.000 And so that involved a multi week trial just to focus on remedies.
00:08:40.000 Given that the judge found Google liable of monopolization last year, what would be an effective remedy to that conduct?
00:08:48.000 And so we put a couple of different options out there for the judge.
00:08:51.000 It's obviously his decision at the end of the day.
00:08:54.000 Google has for many years paid companies like Apple, like the cell phone carriers to be the default search engine on Apple's phones in particular.
00:09:04.000 And that's that's a big chunk of the market.
00:09:07.000 And they paid Apple billions of dollars every year for that contract.
00:09:11.000 We said, you know, these contracts are problematic because they're closing off competitors from reaching consumers in the search market.
00:09:19.000 And we proposed some other remedies around data sharing so that the data that other companies, other search engines need to reach scale in this market could be freed up and used to compete with Google, to innovate Google.
00:09:35.000 All of that good stuff that we love to see in our free markets that was not really taking place for several decades.
00:09:43.000 And we also proposed in our remedies, again, it's the judge's decision that that Google divest its Chrome browser.
00:09:53.000 And that matters because the Chrome browser, for those of you familiar with Google's products, is very tightly integrated with search.
00:10:00.000 And we estimate that about a third of all searches Americans engage in every day run through that browser.
00:10:07.000 And so Chrome browser divestiture was one of the remedies we put forward.
00:10:12.000 The judge in D.C. is, I think, fine tuning his opinion as you and I talk.
00:10:19.000 And we're expecting a decision from him any day now.
00:10:22.000 And we think that matters a lot.
00:10:25.000 As I said, this was a bipartisan case, not only bipartisan at the federal level between the previous Trump administration, the last administration and today.
00:10:35.000 It was also joined and signed on to by 49 different states.
00:10:40.000 That's a really big number, making it a landmark decision in our world.
00:10:45.000 And we think it's going to be a really, really important decision for antitrust enforcement going forward.
00:10:51.000 It will send a strong signal that we are serious about robust antitrust enforcement under President Trump's watch.
00:10:58.000 It's amazing, Gail.
00:11:00.000 Teddy Roosevelt would be very, very proud.
00:11:03.000 And this problem of information monopolization, the search engine is still one of the, if not the most used method to gain information, whether it be news, whether it be research, whether it be just simply where you're going to shop.
00:11:20.000 So I think that this practice of pushing out the competitors and especially when you look at the bias in Google's algorithms and the way in which it's being gamed at this point to push search results up in the order.
00:11:38.000 I really think that this is important for a number of reasons, but the biggest reason is the vast impact that these corporations have on our lives.
00:11:50.000 Now, I won't rope you in to our opinions on the digitization of all culture, but this impact has shaped reality.
00:11:59.000 It shaped politics.
00:12:00.000 It shaped economics.
00:12:02.000 And for any one corporation to have most or all of the control is completely unacceptable.
00:12:07.000 And I think it's really heartening to see actual solutions moving forward to divest them of this power and hopefully bring other companies up.
00:12:16.000 I mean, I'm not a huge fan of Brave or DuckDuckGo, but at least you know that there are options.
00:12:22.000 There's some diversity in the field of information that people can actually have some chance of making their own decisions instead of having just one funnel of reality pushed into their brains and the information pushed in.
00:12:36.000 So before we go, we've only got a couple of minutes left.
00:12:39.000 I'm wondering, are there any other cases moving forward in this push towards antitrust action that we should know about?
00:12:49.000 So we have a bunch of cases.
00:12:52.000 The ones that are most interesting, I think, to the war room posse, I expect, are the Google case.
00:12:59.000 We also are looking at the ways in which we can use the antitrust tools, the scalpel, not the sledgehammer, it's law enforcement, it's not regulation, to foster competition in healthcare markets.
00:13:12.000 And this is another pocketbook issue that President Trump was elected to carry forward and democratize the economy around.
00:13:19.000 And we can play a role in that.
00:13:21.000 We have executive orders from the White House that are tasking us with looking into drug pricing.
00:13:27.000 Also the PBMs, the pharmacy benefit managers.
00:13:33.000 And so we're obviously keeping a close eye on those executive orders and thinking about the ways in which we can comply with them, working with sister agencies across the Trump administration.
00:13:45.000 And it's a very, very important pocketbook market for all Americans, not just for the war room posse, but in particular for working class Americans.
00:13:53.000 Same as with our rental cases.
00:13:56.000 The, you know, our healthcare budget is not quite as big as our rental budget on a monthly basis, but it's a pretty big chunk of it, so.
00:14:05.000 Gail, I got to say from the bottom of my heart, we really appreciate all the fighting you've done.
00:14:11.000 We know you've taken a lot of heat.
00:14:12.000 Where can the war room posse find you?
00:14:15.000 And how can we support you going forward?
00:14:17.000 Because we know this fight is not easy.
00:14:19.000 It's uphill the entire way.
00:14:21.000 We're up against some trillion dollar companies.
00:14:23.000 They never said it was going to be easy.
00:14:25.000 So all the support we can get from the war room posse is much appreciated, as it is from you too, Joe.
00:14:31.000 So I can be found at AAG Slater.
00:14:33.000 That's my official account at the DOJ.
00:14:35.000 Thanks so much for that.
00:14:37.000 And my personal account is Gail A. Slater.
00:14:39.000 But the real work is being done under the official account.
00:14:42.000 And so I encourage everybody to follow us there.
00:14:45.000 And thank you so much again for having me on.
00:14:48.000 Yeah, Gail, thank you very much for being on.
00:14:50.000 And again, thank you very much for your tireless fight.
00:14:54.000 All right, we'll be right back after the break with Gary Marcus.
00:14:59.000 Stay tuned.
00:15:00.000 This July, there is a global summit of BRICS nations in Rio de Janeiro. The bloc of emerging superpowers, including China, Russia, India and Persia, is meeting with the goal of displacing the United States dollar as the global currency.
00:15:15.000 They're calling this the Rio reset as BRICS nations push forward with their plans.
00:15:21.000 Global demand for U.S. dollars will decrease, bringing down the value of the dollars in your savings. While this transition won't happen overnight,
00:15:30.000 trust me, it's going to start in Rio.
00:15:33.000 The Rio reset in July marks a pivotal moment when BRICS objectives move decisively from a theoretical possibility towards an inevitable reality.
00:15:43.000 Learn if diversifying your savings into gold is right for you.
00:15:48.000 Birch Gold Group can help you move your hard-earned savings into a tax-sheltered IRA and precious metals.
00:15:54.000 Claim your free info kit on gold by texting my name, Bannon, that's B-A-N-N-O-N, to 989898.
00:16:02.000 With an A-plus rating with the Better Business Bureau and tens of thousands of happy customers, let Birch Gold arm you with a free, no-obligation info kit on owning gold before July.
00:16:13.000 And the Rio reset.
00:16:15.000 Text Bannon, B-A-N-N-O-N, to 989898.
00:16:20.000 Do it today.
00:16:21.000 That's the Rio reset.
00:16:23.000 Text Bannon at 989898 and do it today.
00:16:27.000 Voice family, are you on Getter yet?
00:16:29.000 No.
00:16:30.000 What are you waiting for?
00:16:31.000 It's free.
00:16:32.000 It's uncensored.
00:16:33.000 And it's where all the biggest voices in conservative media are speaking out.
00:16:37.000 Download the Getter app right now.
00:16:39.000 It's totally free.
00:16:40.000 It's where I put up exclusively all of my content 24 hours a day.
00:16:44.000 You want to know what Steve Bannon's thinking?
00:16:45.000 Go to Getter.
00:16:46.000 That's right.
00:16:47.000 You can follow all of your favorites.
00:16:49.000 Steve Bannon, Charlie Kirk, Jack Posobiec.
00:16:51.000 And so many more.
00:16:52.000 Download the Getter app now.
00:16:54.000 Sign up for free and be part of the movement.
00:16:56.000 I mean, I think OpenAI is probably going to head towards surveillance.
00:17:00.000 You can imagine two business models for OpenAI.
00:17:03.000 One would be if they could actually build AGI soon.
00:17:06.000 Maybe they can make a lot of money with that.
00:17:08.000 Real AGI would be worth trillions of dollars.
00:17:11.000 But the things that they've actually delivered don't work that reliably.
00:17:15.000 And that has limited their commercial utility.
00:17:17.000 They have a lot of private data.
00:17:19.000 People treat it as a therapist.
00:17:21.000 And they now want to build apparently like a necklace or something.
00:17:23.000 They record you 24-7.
00:17:25.000 Like that's like 1984 independent.
00:17:28.000 My kid will never grow up, will never ever be smarter than an AI.
00:17:34.000 That will never happen.
00:17:35.000 You know, kid born a few years ago, they had a brief period of time.
00:17:37.000 My kid never will be smarter.
00:17:38.000 GPT-5 is a major upgrade over GPT-4 and a significant step along our path to AGI.
00:17:44.000 And so where do you think we are on this AGI path then?
00:17:48.000 What's your personal definition of AGI and then I'll answer.
00:17:53.000 Oh, that's a good question.
00:17:54.000 Well, what is your personal definition of AGI?
00:17:58.000 I have many, which is why I think it's not a super useful term.
00:18:05.000 I think the point of all of this is it doesn't really matter.
00:18:08.000 And it's just this continuing exponential of model capability.
00:18:14.000 All right, we're back.
00:18:15.000 That was Sam Altman promising a digital deity for everyone.
00:18:20.000 Everyone has a super genius in their pocket, PhD level.
00:18:25.000 Well, last week GPT-5 launched and the reception was disappointing, to say the least.
00:18:33.000 It may have the capabilities of a PhD, but only if that PhD is incapable of arranging information in a cogent manner
00:18:44.000 and occasionally hallucinates presidents or facts out of nowhere.
00:18:49.000 Now, no one, I think, has called out Sam Altman for his over promising more than the NYU professor of cognitive science, Gary Marcus.
00:19:01.000 Gary has a lot of different views than most of us here at the War Room.
00:19:07.000 He certainly doesn't share our politics, but he is an excellent resource for information and I think an excellent resource for arguments against the use of AI in every aspect of our lives from education to medicine to the military, at least in its current form.
00:19:26.000 If it's an extremely flawed technology, it should not be pushed down the throats of students, doctors and soldiers.
00:19:33.000 Gary Marcus, thank you very much for joining us.
00:19:36.000 We appreciate your fight, sir.
00:19:39.000 Thank you very much for having me.
00:19:41.000 And I couldn't agree more with what you just said.
00:19:44.000 And I'm glad you said the last little piece of it, which is maybe some future technology would merit being used across the board for everything that we do.
00:19:54.000 The problem is the current technology just does not deliver on its promises.
00:19:58.000 And so you don't want, for example, a hallucination machine to be teaching your kids.
00:20:02.000 Yeah, this problem of hallucinations.
00:20:06.000 Now, we'll get to that in a moment.
00:20:08.000 I really want to talk about the quantified aspects of the hallucinations, which seem to be reduced in the new model.
00:20:15.000 But more than anything, I just want to allow you to take your victory lap on this GPT-5 launch.
00:20:22.000 Obviously, they may not have said GPT-5 will be artificial general intelligence.
00:20:30.000 Just for the audience's benefit, they've heard this a million times.
00:20:34.000 But right now, today, the definition of AGI most agree on is an AI system that is able to do anything a human worker could do.
00:20:45.000 And clearly, it was nothing of the sort.
00:20:48.000 But they were teasing it, and they have all these influencers who are constantly pushing this idea that the next model is going to be this artificial general or artificial godlike intelligence.
00:21:00.000 Gary, take your victory lap, sir.
00:21:02.000 You called this from the beginning that it was not going to happen.
00:21:05.000 What is your take on it?
00:21:07.000 I did, and it's true. I called it.
00:21:10.000 I warned about these hallucinations, these kind of weird errors, all the way back in the year 2001.
00:21:16.000 And I've consistently said, these people are overpromising what this particular technology can do.
00:21:21.000 And I've really been vilified for it.
00:21:23.000 I mean, I think they have, you know, they have an emoji at OpenAI for me because they dislike me so much.
00:21:30.000 And they kept spouting that, you know, these systems were going to do all this amazing stuff.
00:21:35.000 And they basically did like a three year long marketing campaign to try to convince people that this thing was going to be amazing.
00:21:42.000 And I kept saying, no, it's not going to be amazing. It'll be cool, but it's not going to be so much better than these other models.
00:21:48.000 This idea of scaling that you can just make the models better and better by adding more data is not going to get us where they want to go.
00:21:56.000 And, you know, nobody really took me very seriously until last week.
00:21:59.000 And then when they dropped it, even in the first few minutes of the live stream, I think, you know, Sam said it's PhD level and people were ready to be shocked and amazed.
00:22:09.000 There is a Manifold prediction market you can see, and everybody was thinking OpenAI has got this.
00:22:13.000 They're going to be dominant.
00:22:15.000 They're going to have, you know, the best models.
00:22:17.000 And you look at it and like over the course of the hour, everybody kind of realized it's not really what they're promising, is it?
00:22:24.000 And then people went home for the next several days and tried the system.
00:22:29.000 And as you say, the dominant reaction was disappointment.
00:22:32.000 And there are people that really dislike what I've had to say posting things on Twitter.
00:22:37.000 Like, I really, really hate to say this, but Gary Marcus was right.
00:22:41.000 It went so far that people called it Gary Marcus Day.
00:22:44.000 Like, this is not a good outcome for OpenAI when they do their big, splashy presentation of what's supposed to be their most amazing thing.
00:22:52.000 And then people end up calling that Gary Marcus Day when I'm their big nemesis.
00:22:56.000 I mean, so, yes, I think I can take a victory lap if we want to be honest about it.
00:23:02.000 Well, so this issue of hallucinations, this is a real problem.
00:23:06.000 It's not a problem if you're playing with AI.
00:23:09.000 It's not a problem if you want to go back and check everything that this machine is feeding to you, which in some ways makes the entire process kind of comical to begin with.
00:23:18.000 But if you have students who come to rely on this for all the answers to the truths of the world, if you have doctors who come to rely on this for diagnostics or therapies, it's a major, major problem.
00:23:32.000 But the internal benchmarks at OpenAI showed a drastic reduction in hallucinations, something like 5X for some parameters, something like 10X for others.
00:23:44.000 I'm curious what your opinion is on that.
00:23:48.000 Is the problem of hallucinations being dealt with by these companies, even 1% hallucination, is this a problem that really makes the technology not worth incorporating into major institutions in the U.S.?
00:24:04.000 Well, I mean, it does depend on the context.
00:24:07.000 But even 1%, as you're kind of pointing out there, can be pretty serious.
00:24:11.000 I mean, imagine using a technology like that in driverless cars.
00:24:15.000 If 1% of the time it makes up a vehicle that's not there or something like that, then that could be fatal.
00:24:20.000 So it really depends on what you're looking at.
00:24:24.000 But for many domains, including education, 1% hallucination can still be pretty bad.
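To make the "even 1%" point concrete, here is a quick back-of-the-envelope calculation. The 1% rate and the usage numbers are illustrative assumptions for the sake of the arithmetic, not measured properties of any particular model.

```python
# Back-of-the-envelope: how often a "1% hallucination rate" bites in practice.
# The rate and query counts are illustrative assumptions, not benchmark results.

error_rate = 0.01        # 1% of answers contain a fabrication
queries_per_day = 50     # e.g., a student leaning on the tool throughout the day

# Chance that at least one answer in a day is fabricated, assuming independent errors.
p_at_least_one = 1 - (1 - error_rate) ** queries_per_day
print(f"P(at least one bad answer per day) = {p_at_least_one:.1%}")              # ~39.5%

# Expected number of fabricated answers over a 180-day school year.
expected_per_year = error_rate * queries_per_day * 180
print(f"Expected fabricated answers per school year = {expected_per_year:.0f}")  # ~90
```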
00:24:29.000 And I'm not sure that we should actually accept that number.
00:24:31.000 I mean, lots of people have already documented hallucinations.
00:24:34.000 There's something we call Goodhart's Law, which is about benchmarks.
00:24:38.000 You have some measurement, and people, after they know about that thing, start to teach to that test.
00:24:43.000 That's what Goodhart's Law means.
00:24:45.000 And so people are training to the benchmarks, trying to make those benchmarks look good.
00:24:49.000 And often what happens in the real world is those benchmarks aren't representative anymore.
00:24:54.000 And so it's one thing for somebody to say, hey, here's this benchmark and they're only 2% wrong or 1% wrong or something like that.
00:25:03.000 And then you have to see in the real world.
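Marcus's Goodhart's Law point can be illustrated with a toy simulation: a system that has effectively been tuned on, or has simply memorized, the public benchmark questions posts a near-perfect benchmark score while performing at its true, lower accuracy on questions the benchmark never covered. All numbers below are invented for illustration.

```python
# Toy illustration of Goodhart's Law / "teaching to the test" with made-up numbers.
import random

random.seed(0)

TRUE_ACCURACY = 0.80                                   # how good the model actually is
benchmark = [f"benchmark_q{i}" for i in range(200)]    # the public, fixed test set
fresh_questions = [f"real_world_q{i}" for i in range(200)]

memorized = set(benchmark)                             # tuned directly on the benchmark

def answers_correctly(question):
    if question in memorized:                          # seen during tuning, so always right
        return True
    return random.random() < TRUE_ACCURACY             # otherwise, true underlying skill

def accuracy(questions):
    return sum(answers_correctly(q) for q in questions) / len(questions)

print(f"benchmark score:  {accuracy(benchmark):.0%}")        # 100%
print(f"real-world score: {accuracy(fresh_questions):.0%}")  # roughly 80%
```

The benchmark number looks great, but it has stopped measuring what it was built to measure, which is exactly the failure mode described above.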
00:25:05.000 And also when it's 1%, something insidious happens, which is people stop paying attention.
00:25:10.000 So somebody sent me – well, first of all, you mentioned the presidents, right?
00:25:15.000 So there's something circulating where AI was inventing presidents and what years they were born, misspelling their names.
00:25:23.000 And then I replicated it last night with Canadian prime ministers.
00:25:28.000 Somebody got mad at me.
00:25:29.000 They're like, oh, they're not very good at generating images.
00:25:32.000 You haven't done the right parameters.
00:25:35.000 And so they probably did a better version of U.S. presidents from – I think it was GPT-5 with thinking mode turned on.
00:25:42.000 And they're like, look, I nailed it.
00:25:44.000 They didn't actually use those words, but obviously they were trying to kind of show that they were smarter than me because they were able to get the system to perform right.
00:25:51.000 And then I looked at it carefully and it said that Bill Clinton was president from – I think it was 1981 to 1999.
00:25:59.000 So that had him overlapping at the same time as Reagan and George H.W. Bush.
00:26:06.000 That can't happen in the real world.
00:26:08.000 If the system were actually smart,
00:26:09.000 it would just look up the information on Wikipedia and not make it up, not hallucinate.
00:26:13.000 I don't think that Clinton was in office in 1981.
00:26:15.000 He was probably too young even to be president then.
00:26:18.000 And, of course, Reagan was in office.
00:26:20.000 So, you know, this guy looked at his own thing and didn't realize that it was still problematic.
00:26:26.000 So these hallucinations can be really subtle and insidious.
00:26:29.000 And if you have, you know, in a teaching context where the students themselves don't know the right answer, they're going to miss a bunch of that.
00:26:36.000 And people start to have what I call the looks good to me reaction, which is what the guy I'm describing now had, which is he looked at it, seemed good.
00:26:45.000 He thought it was better.
00:26:46.000 I mean, it was better than the first thing.
00:26:48.000 And he just assumed it was right because it was right about some things.
00:26:51.000 But a bunch of errors slipped in.
00:26:53.000 And that's what we've seen over and over and over with these large language models is they get a bunch of stuff right and a bunch of stuff wrong and you never know what.
00:27:00.000 You can't really trust them on their own.
00:27:02.000 It's not changed in years and years of working on this.
00:27:06.000 You know, I am oftentimes asked what I fear most about these technologies, whether it's AI or genetic engineering, whatnot.
00:27:14.000 A lot of people are really concerned about the singularity, right?
00:27:17.000 This exponential increase in capabilities, this explosion of intelligence that could take over humans.
00:27:23.000 Certainly something to keep an eye on without a doubt.
00:27:25.000 But I worry that the inverse singularity is our real problem where human intelligence decreases and decreases until it's finally a precipitous drop.
00:27:35.000 And all these kind of flawed, wonky technologies appear dazzling to our stupefied eyes.
00:27:41.000 But, Gary, we've got to go to a break.
00:27:44.000 We'll bring you back on the other side.
00:27:46.000 War Room Posse, stick around.
00:27:48.000 We're going to be talking about the federal moves here in Washington, D.C. against crime.
00:27:56.000 And when we come back, I want to ask Gary about his long term vision of where AI goes.
00:28:02.000 This is something that I think people really need to focus on even more than these launches of products or any sort of bizarre story you hear about an AI blackmailing an engineer.
00:28:16.000 But we'll get to that when we come back.
00:28:18.000 Stay tuned.
00:28:19.000 If you're a homeowner, you need to listen to this.
00:28:29.000 In today's AI and cyber world, scammers are stealing home titles with more ease than ever.
00:28:36.000 And your equity is the target.
00:28:38.000 Here's how it works.
00:28:39.000 Criminals forge your signature on one document.
00:28:42.000 Use a fake notary stamp.
00:28:44.000 Pay small fee with your county and boom.
00:28:47.000 Your home title has been transferred out of your name.
00:28:50.000 Then they take out loans using your equity or even sell your property.
00:28:55.000 You won't even know it's happened until you get a collection or foreclosure notice.
00:29:01.000 So let me ask you, when was the last time you personally checked your home title?
00:29:07.000 If you're like me, the answer is never.
00:29:10.000 And that's exactly what scammers are counting on.
00:29:13.000 That's why I trust Home Title Lock.
00:29:16.000 Use promo code Steve at HomeTitleLock.com to make sure your title is still in your name.
00:29:24.000 You'll also get a free title history report plus a free 14-day trial of their million dollar triple lock protection.
00:29:31.000 That's 24-7 monitoring of your title.
00:29:34.000 Urgent alerts to any changes.
00:29:35.000 And if fraud should happen, they'll spend up to $1 million to fix it.
00:29:41.000 Go to HomeTitleLock.com now.
00:29:43.000 Use promo code Steve.
00:29:45.000 That's HomeTitleLock.com.
00:29:47.000 Promo code Steve.
00:29:48.000 Do it today.
00:29:49.000 Here's your host, Stephen K. Bannon.
00:29:51.000 And we're back with Gary Marcus.
00:30:02.000 Gary, I want to get to the big picture of all of this.
00:30:06.000 We can look at the current state of GPT-5 and say this is a silly machine.
00:30:14.000 It has a lot of important and useful capabilities, but ultimately it's flawed and certainly it's not artificial general intelligence.
00:30:24.000 I'm wondering, from your perspective, you talk a lot about the potential of neurosymbolic AI.
00:30:30.000 We don't have to go into the technicalities of it quite yet, but other methods, other approaches besides just scraping vast amounts of literature, of language data, and then shoving it into a massive algorithmic process, the large language model.
00:30:47.000 You believe that that approach is basically at a dead end, at a wall, but there are possibilities for other approaches.
00:30:55.000 So big picture, is your skepticism that AGI, artificial general intelligence, or even superintelligence aren't possible, won't happen, or is your perspective that current methods are not going to get us there and maybe others will and maybe others should?
00:31:13.000 I think current methods really are at a kind of impasse.
00:31:18.000 They're making progress in some ways, so the graphics always get better and stuff like that, but they're reaching the same obstacles over and over again.
00:31:27.000 So they continue to have problems with hallucinations and so forth.
00:31:30.000 I don't think that's a logical problem.
00:31:32.000 I think it's a problem with how we're going about things.
00:31:34.000 Because sometimes in the history of science, scientists are just wrong.
00:31:38.000 And eventually there's self-correction, the field as a whole realizes that they're stuck and they try something else.
00:31:44.000 So everybody thought that genes were made of proteins early in the 20th century, and they were wrong.
00:31:49.000 And eventually they figured out that they were made of this weird sticky acid that we now know as DNA.
00:31:54.000 So, you know, for 20 years people pursued the wrong path.
00:31:58.000 And I think to some extent that's true now.
00:32:00.000 I don't really think large language models will disappear, but we will come up with much better techniques.
00:32:05.000 We need some major innovations and rethinking, going back to the drawing board.
00:32:10.000 We'll keep these current tools, but we will invent other tools.
00:32:13.000 We'll need a broader set of tools.
00:32:15.000 Eventually, yeah, I do think we'll get to artificial general intelligence.
00:32:19.000 You know, humans are a kind of general purpose intelligence, biologically built.
00:32:24.000 Someday we'll have machines like that.
00:32:26.000 We have to get past, I think, these kind of fixed ideas that people are stuck with right now.
00:32:32.000 But we will.
00:32:33.000 I don't know if it will take 10 years, 20 years, 50 years, but I think it will probably happen this century.
00:32:39.000 But this is a really tough topic because, granted, if we have 10 years or 50 years to prepare for a system that would be able to replace all intellectual work and assuming that robotics keep pace, all blue collar work, all physical labor.
00:33:01.000 Gary, what will we do?
00:33:04.000 I don't know.
00:33:05.000 I mean, a century from now, I really don't know what life will be like.
00:33:09.000 I think that, you know, robots are not that good right now.
00:33:12.000 So blue collar work like carpentry and plumbing and things like that.
00:33:17.000 Those are in no danger whatsoever in the next 20 years.
00:33:20.000 Maybe in the next 100, they are.
00:33:22.000 You know, robots will get better.
00:33:24.000 When robots can do all our plumbing and carpentry, that'd be a major advance compared to where we are now.
00:33:32.000 We might actually have a kind of life of leisure then.
00:33:35.000 We have a lot of economic questions about, you know, who gets the wealth and how is it distributed?
00:33:39.000 Prices might come down, but people might have very meager amounts of money because there's not a lot that they can do that actually commands income.
00:33:47.000 Maybe some arts and things like that.
00:33:49.000 We don't really know.
00:33:52.000 We do need to start preparing for changes in society.
00:33:55.000 Maybe not quite as fast as Silicon Valley would lead you to believe.
00:33:58.000 I think Silicon Valley hypes these things so they can drive up valuations and, you know, get more money for the things that they're doing.
00:34:05.000 They try to instill a sense of fear that's maybe not realistic compared to now.
00:34:10.000 But sure, 50 years from now, things will be pretty different.
00:34:14.000 You know, from my own perspective, just the desire, especially the immediate desire that we are going to replace all of you and your labor will be worth nothing.
00:34:24.000 You'll have no negotiating power whatsoever in five years, in 10 years.
00:34:29.000 Maybe your biological form will be irrelevant and should be either cast aside or perhaps uploaded and preserved in a data center.
00:34:37.000 These are the sorts of thoughts that if I told you that this was my fantasy, you would say, Joe, you are a psychopath.
00:34:44.000 You should be locked up.
00:34:46.000 But for these guys, it simply, as you say, drives up investment.
00:34:49.000 Knowing that mentality.
00:34:51.000 I mean, I think Elon Musk all but said.
00:34:51.000 I mean, I think Elon Musk all but said the other day, basically...
00:35:00.000 That, you know, we might have an economy that's a thousand times bigger, but people might not be part of the picture.
00:35:07.000 You know, I don't think you and I, even though our politics may differ, really want that kind of world where there's a bigger economy, but there's no human beings left, you know, having meaningful lives or maybe even having lives at all.
00:35:20.000 It was really scary when Peter Thiel, you know, hesitated when he was asked, but they'll still be people.
00:35:26.000 People are still important.
00:35:27.000 Right.
00:35:28.000 And he kind of hesitated.
00:35:29.000 And I don't want that world.
00:35:31.000 I want a world where humans still have an important place in how things go.
00:35:39.000 Yeah.
00:35:40.000 The Peter Thiel issue is very, very important because so many people on the right look up to him as an icon, not only as an entrepreneur, but a philosopher.
00:35:50.000 That hesitation, while granted, Peter Thiel seems to be hesitating and thinking up what he's going to say on the fly with almost anything that he says.
00:35:59.000 But a ready answer would have just been, yeah, I want humans to go on.
00:36:04.000 Yeah, I think humans are better than robots.
00:36:07.000 And it didn't seem too sincere.
00:36:10.000 Before we go, I just want you to let the audience know where can they follow you?
00:36:15.000 And I'm very curious, too, what are you going to be doing in the near future?
00:36:20.000 What projects should we be looking out for from Gary Marcus?
00:36:24.000 Well, in the short term, you can follow me on my Substack, Marcus on AI.
00:36:30.000 The piece that just came out had 150,000 readers.
00:36:35.000 You can follow me on Twitter.
00:36:37.000 And there's my recent book, Taming Silicon Valley.
00:36:40.000 So those are all ways to follow me.
00:36:42.000 And there might be an exciting new project, but I can't talk about it just yet.
00:36:47.000 All right. Well, keep us posted.
00:36:49.000 Thank you very much, Gary, for coming on again.
00:36:51.000 I think that these sorts of conversations should be happening all over the place.
00:36:55.000 People from the political left, the political right, people from the futurist camp, people from the Luddite camp.
00:37:01.000 If we can't talk about it, we're going to fight about it.
00:37:04.000 And if we're arguing about it, who knows how bad it could get?
00:37:06.000 It could come to blows.
00:37:07.000 Not you and me, of course.
00:37:08.000 If I can just say one more thing.
00:37:10.000 Thank you very much, Gary.
00:37:11.000 Our politics.
00:37:12.000 Yes, sir.
00:37:13.000 Our politics differ, but I think that on these issues, we very much are aligned.
00:37:17.000 Thanks a lot.
00:37:18.000 Yes, sir.
00:37:19.000 Thank you.
00:37:20.000 All right, Denver, if you could throw in that cold open for Mike Howell.
00:37:26.000 Today, we're formally declaring a public safety emergency.
00:37:30.000 This is an emergency.
00:37:31.000 This is a tragic emergency.
00:37:34.000 And it's embarrassing for me to be up here.
00:37:37.000 You know, I'm going to see Putin.
00:37:38.000 I'm going to Russia on Friday.
00:37:41.000 I don't like being up here talking about how unsafe and how dirty and disgusting this once-beautiful capital was.
00:37:49.000 The murder rate in Washington today is higher than that of Bogota, Colombia, Mexico City, some of the places that you hear about as being the worst places on Earth.
00:38:02.000 Much higher.
00:38:03.000 This is much higher.
00:38:04.000 The number of car thefts has doubled over the past five years.
00:38:09.000 And the number of carjackings has more than tripled.
00:38:12.000 At your direction this morning, we've mobilized the D.C. National Guard.
00:38:16.000 It'll be operationalized by the Secretary of the Army, Dan Driscoll, through the D.C. Guard.
00:38:20.000 You will see them flowing into the streets of Washington in the coming week.
00:38:24.000 At your direction as well, sir, there are other units we are prepared to bring in.
00:38:28.000 Other National Guard units, other specialized units, they will be strong.
00:38:35.000 Wild times, War Room Posse.
00:38:37.000 I want to bring in Mike Howell, president of the Oversight Project.
00:38:41.000 Mike, can you tell us a little bit about what's happening here in D.C.?
00:38:45.000 I'm walking around the streets.
00:38:47.000 Everything seems basically normal, but a lot is moving.
00:38:50.000 What's going on, man?
00:38:52.000 So President Trump has basically invoked his authority under the D.C. Home Rule Act.
00:38:57.000 The Home Rule Act, which allows him to take over the police for a period of up to 30 days.
00:39:03.000 And so he's begun that process.
00:39:05.000 I will say on the ground things, you're right, aren't much different.
00:39:09.000 He's called up the National Guard, which I think amounts to some 800 of the total 2,700 D.C. National Guardsmen,
00:39:17.000 of which 100 to 200 will be on the streets sometimes.
00:39:20.000 So a relatively small number.
00:39:22.000 I mean, D.C. has over 3,000 cops, so you do the math.
00:39:25.000 A very small federal presence as of now.
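Taking the figures Howell cites at face value (his estimates from the interview, not official tallies), the "do the math" comparison works out roughly like this:

```python
# Rough comparison using the figures cited above (estimates from the interview, not official numbers).

dc_guard_total = 2_700           # total D.C. National Guardsmen
guard_activated = 800            # roughly how many were called up
guard_on_street = (100, 200)     # estimated number visible on the streets at a given time
mpd_officers = 3_000             # "D.C. has over 3,000 cops"

print(f"activated share of the Guard: {guard_activated / dc_guard_total:.0%}")         # ~30%
shares = [n / (n + mpd_officers) for n in guard_on_street]
print(f"Guard share of combined street presence: {shares[0]:.0%} to {shares[1]:.0%}")  # ~3% to 6%
```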
00:39:28.000 We're going to need a much larger federal presence moving forward.
00:39:31.000 And so I hope that by him opening this door, the administration fully walks through it and does what needs to be done in Washington, D.C.
00:39:39.000 Furthermore, when that 30 days expires, Congress can extend it.
00:39:43.000 And so we should be planning for a long and sustained, very visible, very large presence in Washington, D.C.
00:39:49.000 Ultimately, it will be up to the Trump administration of whether this is just a rhetorical type announcement for political purposes or they're going to see it through.
00:39:57.000 And I hope they see it through for the long haul.
00:39:59.000 So D.C. has always had a major crime problem.
00:40:04.000 And I'm curious, then, from your perspective, is this really necessary?
00:40:08.000 Is the federalization of the police force really going to be required to get this under control?
00:40:15.000 Absolutely.
00:40:16.000 I mean, the optimal thing would be overturning or passing a new thing to override the Home Rule Act, which completely takes away any ability of, quote unquote, self-government in D.C.
00:40:27.000 The practical reality is D.C. isn't ready or capable of self-government.
00:40:32.000 Right now, the way it operates is basically through moneyed interest vying for control of the city council or the mayor's office, where they then compete with the political machinery from the Democrats on the other end.
00:40:44.000 So we aren't seeing a political process play out in Washington, D.C. over the decades where Home Rule has been in place.
00:40:50.000 But it clearly does not work.
00:40:52.000 It's a level of acceptable corruption and then deteriorating just conditions elsewhere.
00:40:58.000 It's getting people killed.
00:40:59.000 It is an ugly city now.
00:41:01.000 You have just vagrancy and quality of life kind of crimes throughout.
00:41:05.000 And particularly after Black Lives Matter, where the policy aim was basically to legalize criminal activity amongst, you know, preferred demographic groups.
00:41:14.000 The things have gotten just worse in D.C. where just everyday crime is allowed.
00:41:19.000 And particularly with some of the youth violence, predominantly African-American males, almost exclusively African-American young males, there's been a rampant rise in carjacking because people under that age cap of 18 can basically get away with it and be re-released.
00:41:34.000 So it's a lot of left wing fever dreams turning into an apocalyptic situation in D.C. over the last few years.
00:41:41.000 Yeah, I've known a number of people to be mugged here.
00:41:46.000 I like I say, as I walk the streets, it looks like regular old D.C.
00:41:50.000 And that is a beautiful city.
00:41:52.000 But clearly, a lot of this is not under control and there's no effort being made to keep it under control.
00:42:00.000 Mike, if you could, please let us know where we can follow you.
00:42:04.000 Tell us where we can find your organization.
00:42:07.000 Absolutely.
00:42:08.000 Absolutely. So I'm on X at ML tweets, president of the Oversight Project, which is at ItsYourGov.
00:42:14.000 And we're going to stay on top. D.C. is, you know, right where we work out of.
00:42:18.000 If we need to litigate, we'll litigate, investigate, we'll investigate.
00:42:21.000 We want to make sure the Trump administration has enough room to actually see through a full scale return to safety and order in Washington, D.C.
00:42:31.000 Mike Howell, thank you very much.
00:42:34.000 Thanks for having me.
00:42:35.000 All right.
00:42:37.000 So when you're thinking about these apocalyptic scenarios, War Room Posse, My Patriot Supply is your go-to to hunker down and survive the crime apocalypse.
00:42:50.000 MyPatriotSupply.com/Bannon: buy the three-month emergency food kit and get a free four-week kit.
00:43:00.000 Originally $944,
00:43:03.000 but you get it for $247 off that price, and you can get the three-month kit for $697.
00:43:12.000 That is MyPatriotSupply.com/Bannon. And then Tax Network USA, TNUSA.com/Bannon.
00:43:26.000 These guys can make sure the IRS doesn't suck out the last of your resources and leave you high and dry.
00:43:33.000 That's TNUSA.com/Bannon, or call 1-800-958-1000 for a free consultation to protect you from the tax man. Back in three.
00:43:48.000 Hey, we're human, all too human.
00:43:51.000 I don't always eat healthy.
00:43:53.000 You don't always eat healthy.
00:43:55.000 That's why doctors create Field of Greens.
00:43:57.000 A delicious glass of Field of Greens daily is like nutritional armor for your body.
00:44:03.000 Each fruit and each vegetable was doctor selected for a specific health benefit.
00:44:09.000 There's a heart health group, lungs and kidney groups, metabolism, even healthy weight.
00:44:15.000 I love the energy boost I get with Field of Greens, but most of all, I love the confidence that even if I have a cheat day or wait for it, a burger,
00:44:24.000 I can enjoy it guilt free because of Field of Greens.
00:44:27.000 It's the nutrition my body needs daily and only Field of Greens makes you this better health promise.
00:44:34.000 Your doctor will notice your improved health or your money back.
00:44:37.000 Let me repeat that.
00:44:38.000 Your doctor will notice your improved health or your money back.
00:44:42.000 Let me get you started with my special discount.
00:44:44.000 I got you 20 percent off your first order.
00:44:47.000 Just use code Bannon, B-A-N-N-O-N at Field of Greens dot com.
00:44:52.000 That's code Bannon at Field of Greens dot com.
00:44:55.000 20 percent off.
00:44:56.000 And if your doctor doesn't know how healthy you look and feel, you get a full money back guarantee.
00:45:03.000 Field of Greens dot com.
00:45:05.000 Code Bannon.
00:45:06.000 Do it today.
00:45:07.000 War Room.
00:45:10.000 War Room.
00:45:11.000 Here's your host, Stephen K. Bannon.
00:45:22.000 All right.
00:45:23.000 Welcome back.
00:45:24.000 War Room Posse.
00:45:25.000 Be sure to go to Birch Gold.
00:45:28.000 Birch Gold.
00:45:29.000 Sorry, Philip Kirkpatrick.
00:45:32.000 Birch Gold dot com slash Bannon or text Bannon to nine eight nine eight nine eight.
00:45:42.000 All right.
00:45:58.000 I want to bring in Wade Miller, Senior Advisor at the Center for Renewing America.
00:46:15.000 He's got a great article up at the site, a policy brief on the Census Bureau defrauding
00:46:22.000 American voters.
00:46:24.000 Wade Miller, thank you very much for joining us.
00:46:26.000 Thanks for having me on.
00:46:28.000 Tell us about this differential privacy algorithm and what the Census Bureau is doing to keep
00:46:36.000 Americans from hanging on to our birthright.
00:46:40.000 Sure.
00:46:41.000 So in an ideal situation, the census collects the data and then you have that raw data and
00:46:48.000 then that's what informs all the decision making.
00:46:51.000 But in 2020, a whole bunch of Obama bureaucrats that were still left over at the Census Bureau
00:46:57.000 created this differential privacy algorithm.
00:47:00.000 And so their excuse for it is that it protects privacy.
00:47:03.000 And I understand this in some applications when it comes to some characteristic data.
00:47:07.000 We can have a conversation about how to protect some of this information.
00:47:11.000 But when it comes to population, when it comes to citizenship status, those are two key elements
00:47:16.000 to understanding apportionment fairly and understanding redistricting fairly.
00:47:20.000 And unfortunately, what happens is, they run this algorithm.
00:47:24.000 And so you have this clean data at the state level and then all the way down to the block level.
00:47:28.000 And then you do differential privacy.
00:47:30.000 And what it does is the state level data remains accurate if the counting was accurate.
00:47:35.000 And that's a whole nother topic that we had historic levels of miscounts in the 2020 census process.
00:47:42.000 But let's just assume that the counting is correct.
00:47:44.000 At the state level, the data stays the same.
00:47:46.000 But at every other level, you have movement of the data.
00:47:49.000 And this includes population data.
00:47:51.000 It includes citizenship data.
00:47:53.000 And this is a really important thing because even if the Trump administration is successful
00:47:58.000 in asking the citizenship data or the citizenship question on the census,
00:48:03.000 if you don't figure out or fix or reform the differential privacy algorithm, it can move that data around in a manner that makes it so that when it comes to the redistricting process,
00:48:15.000 you're not getting as much utility out of it because you don't actually know where they're at to account for them.
00:48:21.000 And so then, therefore, you can't create districts that have equal amounts of citizens in them.
00:48:27.000 You'll have some districts that have a lot of illegals and other districts that have very few illegals in them.
00:48:33.000 Even if you know the total number in that state, you can't fix it in the redistricting as easily.
00:48:39.000 So it's a big problem.
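The Census Bureau's actual 2020 disclosure-avoidance system (the "TopDown" algorithm) is far more elaborate than anything shown here, but the behavior Miller describes, noise injected into small-area counts while the state total is held fixed as an invariant, can be sketched in a few lines of Python. The block populations and noise scale below are made-up numbers used only to show how block-level figures drift while the state-level figure stays exact.

```python
# Minimal sketch of block-level noise with a fixed state-level invariant.
# NOT the Census Bureau's TopDown algorithm; counts and noise scale are invented.

import random

random.seed(2020)

true_blocks = [132, 87, 254, 19, 460, 305, 76, 198]   # hypothetical block populations
state_total = sum(true_blocks)                        # invariant to preserve exactly

def add_laplace_noise(count, scale=25.0):
    # The difference of two exponential draws is Laplace-distributed noise.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return max(0.0, count + noise)

noisy = [add_laplace_noise(c) for c in true_blocks]

# Post-process so the published block counts still sum to the exact state total.
factor = state_total / sum(noisy)
published = [round(c * factor) for c in noisy]
published[0] += state_total - sum(published)          # absorb rounding residual

for true, pub in zip(true_blocks, published):
    print(f"true block pop: {true:4d}   published: {pub:4d}   shift: {pub - true:+d}")
print(f"state total: published {sum(published)} vs true {state_total}")
```

The state figure matches perfectly, but individual blocks gain or lose people, which is exactly the mismatch that matters when those block counts feed redistricting.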
00:48:41.000 And there's a lot of other problems with the census.
00:48:43.000 An important note, the Census Bureau admits that they did a really bad job in 2020.
00:48:50.000 And that's not even getting into the differential privacy problem.
00:48:54.000 And this is another key point.
00:48:56.000 Only a handful of select bureaucrats at the census have clearance and access to what's known as the TIGER file, which is the raw data.
00:49:06.000 No one else in the federal government, other federal agencies don't have it.
00:49:09.000 And there's been a lot of independent studies that show that this movement of the population, the scrambling of the data,
00:49:15.000 is depriving some areas of population and overestimating the population of many other areas.
00:49:21.000 And it just so happens to be negatively impacting mostly rural areas and positively impacting mostly urban areas.
00:49:28.000 So what does this mean?
00:49:30.000 Well, when it comes to maps, rural areas, just as a nonpartisan objective fact, rural areas tend to vote more for Republicans and cities tend to vote more for Democrats.
00:49:40.000 Well, if the cities have more population because of differential privacy and then also counting illegals on top of that, then you basically have a situation where these cities have outsized amounts of political power that then gives them more representation than they're supposed to have and deprives rural areas of their power.
00:50:00.000 If you fix all this, not as a partisan issue, just as an objective, neutral observation, all of this will help the right and it will not help the left.
00:50:11.000 And by the way, all of the errors in the 2020 census, they all just so happened to help the left.
00:50:16.000 None of them broke in our direction in terms of allowing the right or conservatives to have more representation in Congress.
00:50:23.000 All of the mistakes helped the left and that's just from the counting.
00:50:27.000 Now we have differential privacy, counting of illegal aliens in the redistricting.
00:50:32.000 All of these things happen in 2020.
00:50:34.000 If you do that entire process over fairly, which I think that the Trump administration is going to do by republishing the 2020 census, then I think that you're going to see a seismic shift in the potentially the electoral college and certainly in the redistricting process that's going to benefit the right versus the left.
00:50:51.000 But it will be a fair representation of actual voters and citizens.
00:50:58.000 Fantastic work.
00:50:59.000 Wade, can you tell us where can we find the Center for Renewing America?
00:51:04.000 Where can we follow you and where should people go to understand the details of this?
00:51:10.000 Direct them to the article, please, sir.
00:51:12.000 Sure.
00:51:13.000 So at Center for Renewing America, our website is americarenewing.com.
00:51:17.000 It's americarenewing.com.
00:51:19.000 And you can just search for a census in that little search tab at the top.
00:51:24.000 And then my account is Wade Miller underscore USMC.
00:51:28.000 Thank you very much, sir.
00:51:31.000 I appreciate you coming on.
00:51:32.000 Mike Lindell, I'm worn out.
00:51:37.000 This has been a long, long grinding journey, and I need to lay my head down on something soft.
00:51:44.000 Where can I get the best night's sleep in the whole wide world?
00:51:49.000 Well, you guys all know where to get it.
00:51:52.000 You get it at mypillow.com forward slash war room.
00:51:56.000 You guys, this last week, two sales collided, and we're going to keep it going.
00:52:01.000 This is two sales.
00:52:03.000 One is free shipping on your entire order.
00:52:06.000 And the other one is the employee pricing special.
00:52:09.000 There it is.
00:52:10.000 Two sales collided.
00:52:12.000 Never happened before in my pillow.
00:52:14.000 Probably anywhere.
00:52:15.000 You guys get the best of both worlds.
00:52:17.000 Free shipping on your entire order.
00:52:19.000 And you get a lot of the stuff we're getting low on.
00:52:22.000 So, like, get those bath towels, those oversized bath towels.
00:52:25.000 They're called bath sheets.
00:52:27.000 We've got the bath mats, the bath towels.
00:52:29.000 They're all on sale.
00:52:30.000 Then you got the MyPillow Premium.
00:52:32.000 That sale is on.
00:52:34.000 If employees pay for it, you guys call 800-873-1062.
00:52:40.000 All my operators are standing by.
00:52:42.000 They love hearing from the War Room Posse.
00:52:45.000 And get the best promo code, the most famous promo code, promo code WARROOM.
00:52:49.000 You guys have backed MyPillow and my employees.
00:52:52.000 We have your back with this double sale.
00:52:56.000 Thank you very much, Mike.
00:52:58.000 Hey, Mike, can I get a free pillow and some slippers?
00:53:01.000 I'm worn out, dude.
00:53:02.000 Yeah.
00:53:03.000 It's bad.
00:53:04.000 Right on.
00:53:05.000 All right.
00:53:06.000 Send them on over.
00:53:07.000 We'll be here at the studio.
00:53:08.000 What if you had the brightest mind in the War Room delivering critical financial research
00:53:13.000 every month?
00:53:14.000 Steve Bannon here.
00:53:15.000 War Room listeners know Jim Rickards.
00:53:17.000 I love this guy.
00:53:18.000 He's our wise man, a former CIA, Pentagon, and White House advisor with an unmatched grasp
00:53:24.000 of geopolitics and capital markets.
00:53:26.000 Jim predicted Trump's Electoral College victory exactly 312 to 226, down to the actual number
00:53:35.000 itself.
00:53:36.000 Now he's issuing a dire warning about April 11th, a moment that could define Trump's presidency
00:53:41.000 and your financial future.
00:53:43.000 His latest book, Money GPT, exposes how AI is setting the stage for financial chaos, bank
00:53:50.000 runs at lightning speeds, algorithm-driven crashes, and even threats to national security.
00:53:55.000 Right now, War Room members get a free copy of Money GPT when they sign up for Strategic
00:54:01.000 Intelligence.
00:54:02.000 It's Jim's flagship financial newsletter, Strategic Intelligence.
00:54:07.000 I read it.
00:54:08.000 You should read it.
00:54:09.000 Time is running out.
00:54:10.000 Go to RickardsWarRoom.com.
00:54:11.000 That's all one word, Rickards War Room.
00:54:13.000 Rickards with an S.
00:54:15.000 Go now and claim your free book.
00:54:17.000 That's RickardsWarRoom.com.
00:54:19.000 Do it today.