Bannon's War Room - April 08, 2025


WarRoom Battleground EP 741: "Replacement AI" Could Wreck Humanity


Episode Stats

Length

55 minutes

Words per Minute

163.7

Word Count

9,058

Sentence Count

630

Misogynist Sentences

2

Hate Speech Sentences

7


Summary

The trade war with China continues to escalate, and a new round of tariffs is being levied on China. Steve Bannon has a special guest, Rosemary Gibson, to talk about the dangers of cheap generic drugs coming into the United States from China.


Transcript

00:00:00.000 This is the primal scream of a dying regime.
00:00:18.600 Pray for our enemies, because we're going medieval on these people.
00:00:23.920 I got a free shot at all these networks lying about the people.
00:00:28.180 The people have had a belly full of them.
00:00:30.000 I know you don't like hearing that.
00:00:31.580 I know you try to do everything in the world to stop that, but you're not going to stop it.
00:00:34.220 It's going to happen.
00:00:35.480 And where do people like that go to share the big lie?
00:00:38.880 MAGA Media.
00:00:40.240 I wish in my soul, I wish that any of these people had a conscience.
00:00:45.640 Ask yourself, what is my task and what is my purpose?
00:00:49.440 If that answer is to save my country, this country will be saved.
00:00:55.620 War Room. Here's your host, Stephen K. Bannon.
00:01:00.000 Monday, 7 April, Year of our Lord 2025.
00:01:07.640 Of course, it didn't turn out to be Black Monday today.
00:01:11.100 Not because Jim Cramer and the insane people over at MSNBC, New York Times, didn't try to make it happen, but it didn't.
00:01:19.320 Also, futures trading is already up.
00:01:21.760 Asian markets about ready to open.
00:01:23.320 And we'll get on top of all that here momentarily as we go through another day of this evolving trade war against the Chinese Communist Party.
00:01:31.400 It's quite evident what is happening here.
00:01:33.680 And we'll break it all down for you a little later in the show.
00:01:36.560 I want to bring back Rosemary Gibson.
00:01:37.960 So, Rosemary, in the middle of this trade war, you're one of the first investigative reporters to come out now seven years ago with, hey, look, folks, I don't know if anybody realizes this, but all the generic drugs that people take,
00:01:51.200 which is a huge percentage of drugs that people take, and 100 percent of the active pharmaceutical ingredients are all manufactured in China, and we have no control of that supply chain.
00:02:02.420 Now we're losing the economics of it, but more importantly, it's a strategic asset.
00:02:07.200 And we warn people about this all the time because the CCP is having their back up against the wall right now with President Trump.
00:02:12.340 Now, the tariff rate is 104 percent, I think, and President Trump's not backing off.
00:02:17.760 This is going to be – it was very clear when Scott Bessent came out early and said, hey, we're setting up a process to talk – and just a while ago on Larry Kudlow, we've got that clip and we'll play it later.
00:02:27.940 Scott Bessent said there's 75 major trading partners that are now trying to get on his schedule to come in and talk about the trade relationships and whether the reciprocity thing is too high or we miscalculated or we didn't calculate correctly the non-tariff trade barriers or currency manipulation, all that.
00:02:49.800 But China's not one of them.
00:02:51.460 In fact, President Trump's canceled any further discussions with China given the fact that they tried to retaliate.
00:02:56.340 And he told them flat out, you try to retaliate, I'm going nuclear.
00:03:00.260 They tried to retaliate, and he went nuclear.
00:03:03.340 Your claims now are actually – I think actually – if that's not disturbing enough, more disturbing.
00:03:08.440 You're saying that generic drugs coming through here – because either the FDA is overwhelmed or they do not have the levels of competence – that these generic drugs are essentially contaminated.
00:03:20.740 And a pretty high percentage of them are contaminated by different things.
00:03:24.520 I don't want to put words in your mouth, but is that what your investigation is showing?
00:03:29.300 Well, this is what the Department of Defense testing program that started in November of 2023 has found so far, Steve.
00:03:39.900 And they launched this generic testing program.
00:03:42.520 Kaiser Health Plan was the first one to start a testing program.
00:03:45.960 They have 13 million people.
00:03:47.740 So they knew that something was not right, that our regulatory folks at FDA just – it's too much for them.
00:03:55.420 They don't have the enforcement power against the corporate control by Congress and the Wall Street folks.
00:04:03.980 So Kaiser Permanente started testing drugs, and then the DOD followed suit, again, because they realized the FDA – and FDA has said this – that they do not have information on the quality and safety of the U.S. medicine supply.
00:04:19.640 That was written in 2015.
00:04:22.060 It's in black and white.
00:04:23.240 And so the DOD started this generic drug testing program for the medicines that are important to them, and according to that independent testing, 13% of what they've tested so far is not as pure as it should be.
00:04:41.280 They have carcinogens.
00:04:42.880 They have arsenic in some of the samples.
00:04:46.040 They have lead.
00:04:46.740 And plus, they weren't made right, so they don't work right, and so you don't get the protection or the therapeutic value.
00:04:54.640 And this is, you know, so far from Six Sigma quality.
00:04:57.980 You know, any manufacturer with, you know, a 13% defect rate is just really bad.
00:05:03.420 And by the way, it's not just China, Steve.
00:05:06.100 It's India and other trading partners.
00:05:09.720 And, you know, we can put the onus on China or India, FDA not doing its job.
00:05:17.240 Steve, what we need to look at is this.
00:05:19.700 There are six companies, U.S. companies, and they source 90% of the generic drugs from around the world.
00:05:27.220 They're the ones that bring them into U.S. commerce.
00:05:29.360 Just as RFK convened the leaders and CEOs of food industry businesses, we need to have a meeting at the White House with these six CEOs, no complaining, no hand-wringing.
00:05:43.900 What are you going to do about it?
00:05:45.980 They've outsourced their quality role to the FDA, but they can't do it.
00:05:50.900 So this needs to change.
00:05:52.420 And we have to implement the executive order that President Trump signed in August 2020 to bring back domestic manufacturing, direct the DOD, HHS, to give priority to domestic manufacturers of the finished drug, the active ingredients, and the components to make them.
00:06:14.760 And those components to make the APIs, that's where China has the global chokehold.
00:06:19.140 And we absolutely must ramp up independent testing.
00:06:24.200 Let's find a way to brief the new DOD leadership.
00:06:27.380 They've got to, because they supply the White House pharmacy, they supply the crash cart.
00:06:33.680 We have to have those products tested.
00:06:35.920 There could be some retaliation in lots of different ways, and we have to protect our people.
00:06:40.560 But hang on, hang on, hang on.
00:06:43.100 At 0.13%, which would be extraordinarily high – if it was 0.0001%, I could say, hey, you know, maybe we're better.
00:06:56.140 At 13%, what are you talking about?
00:06:59.020 This stuff is poison.
00:07:00.400 They're just shipping in poison.
00:07:01.700 13% of what they're sending has carcinogens, arsenic, or lead in it.
00:07:08.580 But it just shows you we're kidding ourselves.
00:07:12.860 What does it take, because I know you and Peter worked on the situation, I think, south of my hometown, I think it was in Petersburg or Hopewell, to try to have the initial kind of test facility to do this.
00:07:23.760 I think on the executive order – has that been successful?
00:07:26.880 Do we even have the possibility of saying, hey, look, guess what?
00:07:30.200 We're not going to take it from India.
00:07:31.580 We're not going to take it from South Africa.
00:07:32.980 We want to take it from Europe.
00:07:34.080 We're definitely not going to take it from China.
00:07:35.680 We're going to give you one year, and we want all 100% of generics and API manufactured in the United States of America by people called Americans.
00:07:46.140 Ma'am.
00:07:47.440 It's going to take a lot more than a year, Steve, but you know what?
00:07:50.700 We have to get started.
00:07:52.760 We need 50 of these places around the country, but you know the most important thing that we need, Steve, is customers.
00:07:58.740 Instead of the DOD sending our taxpayer money to China or India to make our drugs, let's keep that money at home.
00:08:07.620 And by the way, some of the lower quality generics, the DOD was paying more money for them than generic drugs that actually performed better and had no problems.
00:08:18.700 So the cost and quality are not correlated at all.
00:08:21.940 So those who want to say, oh, it's going to break the bank may not be true.
00:08:26.240 So I think we have an opportunity here.
00:08:28.560 We have a burning platform because it's not just a supply chain issue.
00:08:32.640 This is about the quality and safety.
00:08:34.760 And if we don't intervene now, Steve, if you have chaos and disorder and you do nothing, what happens?
00:08:40.600 It gets worse and worse and worse.
00:08:42.040 So we have to, the present Trump leadership, get in here now and get this going.
00:08:46.360 Rosemary, I know we're going to follow up on this.
00:08:51.880 Where do people go in the interim, social media or your website or whatever?
00:08:56.540 Hopefully you have another book in you about this because this is pretty stunning.
00:09:01.200 And I know we're going to push this out and make some news today on this.
00:09:03.520 But where do people go to follow up before we have you back on the show?
00:09:07.980 I have to put my head back together, then have you back on in a couple of days.
00:09:11.160 This is mind-blowing.
00:09:12.900 It's a long story.
00:09:13.840 We're trying to compress it in a real short period of time.
00:09:16.440 But on X, Rosemary 100.
00:09:19.520 A lot of smart people out there, Steve.
00:09:21.220 They know their drugs are not working right.
00:09:23.380 And now we're getting the data to show it.
00:09:25.520 There you go.
00:09:29.040 Okay.
00:09:29.580 We'll follow up with you, Rosemary.
00:09:30.840 Thank you for coming back on.
00:09:32.160 And you changed the world last time in the early days of the pandemic.
00:09:35.620 Let's change it again.
00:09:37.720 Let's do it.
00:09:41.300 Wow.
00:09:41.700 Remember, Rosemary Gibson came on.
00:09:44.420 Shocked the world.
00:09:46.120 Got President Trump's attention.
00:09:48.020 Eventually signed that executive order.
00:09:49.480 I think we're going to dust that baby off.
00:09:51.760 Get Navarro on this.
00:09:53.100 This is part of the whole trade war.
00:09:54.500 You know, I've always been very uncomfortable when the Chinese Communist Party is in charge of anything.
00:09:58.940 Not that they would do it on purpose.
00:10:02.860 At 13 percent?
00:10:04.220 Are you kidding me?
00:10:05.360 At .00013, it's too high.
00:10:09.140 Okay.
00:10:09.500 Let's talk about it.
00:10:10.120 Can we put the – do we have a cut on Netanyahu?
00:10:14.380 Let's go back and do we – can we play the president's cut again?
00:10:17.380 I want to play the president's cut as I look for this – the – yeah, the – I've got this.
00:10:27.700 So it was a – I think a big disappointment for Bibi today.
00:10:31.200 Let's go ahead and play it.
00:10:31.920 Let's go ahead and play what happened in the Oval, and then I'll try to describe it to you.
00:10:35.740 Let's go ahead and let it rip.
00:10:36.500 Thank you, Mr. President.
00:10:44.920 I want to ask you about Iran, because this is the first time we hear that the U.S. is having a direct contact with the Iranians.
00:10:51.560 Is it possible to give us some more information at what level if the U.S. is represented?
00:10:56.880 High level. Very high level.
00:10:57.680 And –
00:10:57.880 We're dealing with the Iranians.
00:11:00.180 We have a very big meeting on Saturday, and we're dealing with them directly.
00:11:04.160 You know, a lot of people say, oh, maybe you're going through surrogates, or you're not dealing directly.
00:11:08.840 You're dealing through other countries.
00:11:10.480 No, we're dealing with them directly, and maybe a deal is going to be made.
00:11:14.200 That would be great.
00:11:15.360 That would be – it would be really great for Iran, I can tell you that.
00:11:19.660 But hopefully we're not going to have to get into that.
00:11:22.300 We are meeting, very importantly, on Saturday at almost the highest level, and we'll see how it works out.
00:11:29.920 Please.
00:11:30.140 Okay, given what's going on today and kind of the historic nature of this day on the geoeconomic and geopolitical side, I think this has kind of gotten buried.
00:11:43.120 And there was a reason that they called off the press conference in the East Room.
00:11:49.100 And they said, oh, it's not.
00:11:50.180 We're just going to have a press availability.
00:11:51.440 There's a big difference in a press availability and having a formal press conference.
00:11:58.300 Blacktivist and Ryan Grim put out later that he agreed with this.
00:12:02.500 I just want to go through because I think it's very important for everybody in the audience to understand this.
00:12:06.900 The meeting with Netanyahu and Trump was a major disappointment from Netanyahu's perspective, based on this guy's reading of Netanyahu's brief statement in the Oval Office.
00:12:16.900 Here's what transpired.
00:12:17.940 Number one, Netanyahu's attempt to have the tariffs lifted was unsuccessful.
00:12:23.280 Number two, his efforts to prevent the U.S. from selling the F-35 fighter jets to Turkey failed.
00:12:30.100 Additionally, he was unable to convince Trump to pressure Erdogan into abetting his goal of building military bases in Syria.
00:12:37.560 Now, I think that's a big defeat for all of us because I think there's no way we should be arming Turkey, given Erdogan and his overall strategy for the new caliphate.
00:12:46.760 But be that as it may, that's Netanyahu's opinion of what happened with President Trump.
00:12:51.080 Number three, Netanyahu was also painfully unsuccessful in getting Trump to reveal his plans regarding Iran and Persia.
00:12:57.920 On Gaza, Trump wants the war to end and is no longer advocating for the ethnic cleansing of the region.
00:13:03.020 This guy calls it ethnic cleansing.
00:13:04.360 We would call it continuing the fight to conclusion.
00:13:06.020 It's a surprisingly bad day.
00:13:07.280 I have to agree with that.
00:13:09.120 I think the core of this – they said it was because of tariffs.
00:13:11.600 The core of this is Netanyahu wanted to come once more, just like a month ago, and pitch a military plan for Persia.
00:13:17.920 That just cannot happen.
00:13:18.900 We can't get into another military conflict in the Middle East.
00:13:22.220 And there are ways with the leverage we have, and we have tons of leverage to bring economic warfare, just like we're bringing it to the Chinese Communist Party right now, to bring even more to bear against the Persians.
00:13:34.700 And that's what needs to take place.
00:13:36.200 I think it was kind of a – maybe not a surprise, but a surprise President Trump was saying it publicly that at the very high level, we're very far down the road with direct dealings with the Persians.
00:13:46.060 And there's a meeting on Saturday at a very high level, and he went out of his way and said it was a direct Persia to the United States meeting.
00:13:54.680 Obviously, we've been a big advocate here in the war room that this is one of the things that will get taken care of under the Russian rapprochement.
00:14:04.340 So let's see if that happens.
00:14:06.320 President Trump went out of his way and said, hey, no other countries were involved here.
00:14:10.300 No intermediaries – I think he meant to say Israel and Russia – and this was going directly.
00:14:15.320 So we'll see what transpires out of that.
00:14:17.920 Maybe I would hold off on bombing the Houthis and let the Brits, French, and Italians get down there, and they keep the sea lanes open to the Suez Canal for a change.
00:14:29.040 Okay, we have a – now, do we have the short version or a long version?
00:14:31.700 We have a short version or a long version of this?
00:14:33.380 Bessent?
00:14:33.860 Yeah, no, not Bessent.
00:14:34.800 We're going to play – we're going to go right now to artificial intelligence.
00:14:37.180 So let's go ahead and play the version we got up.
00:14:38.720 We're going to have Max Tegmark from MIT and our own Joe Allen join us in a second.
00:14:45.580 Let's go ahead and play it.
00:14:46.660 OpenAI will do great work.
00:14:48.280 We are trying to sort of solve general intelligence.
00:14:51.420 We're trying to build the algorithm for a program that is truly smarter than a human in every way and then figure out how to have that maximally benefit humanity.
00:15:00.080 And that's why we're a nonprofit.
00:15:01.280 We don't ever want to be making decisions to benefit shareholders.
00:15:03.840 You know, the only people we want to be accountable to is sort of humanity as a whole and doing this thing right.
00:15:10.080 Elon Musk is asking a federal court to block OpenAI's effort to convert into a fully for-profit business.
00:15:15.980 It's the latest escalation in an ongoing legal feud between Musk and the company.
00:15:20.400 You know, I think I've often said that, you know, my chance that something goes, you know, really quite catastrophically wrong on the scale of, you know, human civilization, you know, it might be somewhere between 10 and 25%.
00:15:35.580 In 10 years, how is life going to be different because of AI for just a normal person?
00:15:41.440 I think in 10 years, based on the current rate of improvement, AI will be smarter than the smartest human.
00:15:48.400 There will be ultimately billions of humanoid robots.
00:15:52.300 All cars will be self-driving.
00:15:54.180 Now, if AI will be smarter than any person, how many jobs go away because of that?
00:16:00.920 Goods and services will become close to free.
00:16:05.240 The challenge will be fulfillment.
00:16:07.920 How do you derive fulfillment and meaning in life?
00:16:11.440 How real is the prospect of killer robots annihilating humanity?
00:16:17.300 20% likely.
00:16:19.220 Maybe 10%.
00:16:20.140 On what time frame?
00:16:22.660 Five to 10 years.
00:16:24.220 I mean, you can look at it like the glass is 80-90% full, meaning like 80% likely we'll have extreme prosperity for all.
00:16:36.080 Okay.
00:16:36.900 Max and Joe Allen join us.
00:16:39.400 Max, you've been one of the leaders in trying to alert people to the fact that we're heading down a path of artificial intelligence that is, quite frankly, not only not regulated, but really not looked at by any outside sources.
00:16:54.860 We're kind of taking everybody's word for it.
00:16:57.020 In the clip we just played right there, when you have people saying, I don't know, 20% that robots could kill humanity, I mean, these numbers are astronomical.
00:17:06.840 And we've got so much going on.
00:17:08.560 We've got, you know, President Trump just came out and endorsed the big, beautiful Senate bill, which I think has got some issues to it.
00:17:14.920 And I don't think it's going to get support in the House.
00:17:16.740 That's a whole other fight we've got to get into tomorrow.
00:17:18.860 You've got the trade war going on.
00:17:20.580 You have all this geopolitical activity going on throughout the world right now.
00:17:25.000 I think, unfortunately, this has been put a little bit on the back burner.
00:17:27.980 And I keep saying this ought to be not just on the front burner.
00:17:31.360 This ought to be on the front burner with the heat turned all the way up.
00:17:35.100 So take a second and describe to the audience exactly where we stand with this whole race on artificial intelligence, the couple or three people that are really making this happen, and how little oversight there is, sir.
00:17:48.360 Yeah, it's truly insane what's going on right now.
00:17:52.140 And so easy to miss the forest for all the trees because of all the other things going on.
00:17:56.900 What's basically happened is scientists have always been curious about how stuff works, and we figured out how muscles work, and we built machines that are much stronger than us, which gave us the industrial revolution.
00:18:10.020 People then shifted to working more with our brains.
00:18:13.280 Now, the people are trying to do the ultimate replacement where you replace not just the muscles but also our brain work with machines that can just outthink us in every way.
00:18:23.260 And if that happens, we have nowhere to go.
00:18:28.560 And first of all, I think it's very naive to just trust that some oligarchs are going to be very compassionate about taking care of people when they don't need them anymore, if we can't get jobs.
00:18:44.820 Second, there's this even bigger risk that it's not clear that anyone is able to control machines that are just vastly smarter than us.
00:18:54.400 You know, just walk down to the zoo and ask who's in the cages?
00:18:59.140 Is it the tigers or the humans?
00:19:01.720 And ask yourself why.
00:19:03.760 You know, it's because the smarter species tends to control.
00:19:08.040 And the sad fact is that we're much closer now to building smarter than human machines.
00:19:16.240 Many of the leaders of the companies and many of the top scientists think it's going to happen within one to five years, depending on who you ask.
00:19:25.660 I had drinks with one of the leaders the other week.
00:19:28.140 He thinks it's going to be in March next year.
00:19:29.920 And we're much closer to building this than we are to figuring out how to control it.
00:19:36.140 And it's crazy.
00:19:40.720 Okay. Hang on one second. Hang on one second.
00:19:44.400 There's one thing to talk about the cultural or socioeconomic impact this can have of, you know, cutting through and laying off all the low-level programmers, administrative, clerical, managerial.
00:19:59.960 And, you know, is that good or bad?
00:20:01.960 And how do we maximize it?
00:20:03.480 And are we innovative or just looking for efficiencies?
00:20:05.740 That's a whole line of argument.
00:20:08.600 I want to go back to the one that's in front of us that to me should be the most important we've got to talk about.
00:20:14.420 And that's the control issue.
00:20:16.200 When you say that – explain to people when you say, I guess it's artificial general intelligence or where a computer or where these thinking machines are actually smarter than a human brain.
00:20:29.500 And you say it's one to five years.
00:20:31.320 You know, a couple of years ago people were telling us it was 10 years.
00:20:33.600 It was 20 years.
00:20:34.480 It was 25 years.
00:20:36.060 Now you've met with – you've met informally with one of the leaders in the industry.
00:20:40.140 He says, yeah, I think it will be March of next year.
00:20:42.160 That's the last time I looked, 11 or 12 months, right?
00:20:47.280 So – and that would mean a red flag of like what are we talking about?
00:20:50.940 So specifically, what are you talking about when you say there's a problem with they could actually be smarter than humans and there would be a problem with they would be the smarter species?
00:21:01.640 What do you mean by that?
00:21:02.500 Yeah, so if you look back a bit here, you know, AI has been seriously overhyped from the 1950s until about four years ago, falling far behind its promises.
00:21:17.600 And people kind of got used to that and didn't take it seriously.
00:21:20.640 But as recently as five, six years ago, most of my colleagues also still predicted, therefore, that we were decades and decades away from building something that could master language and common knowledge at the level of ChatGPT.
00:21:35.460 And they were all wrong because there's been some enormous breakthroughs in the last five years.
00:21:42.100 And things are now going much faster than we thought.
00:21:44.460 But – so the kind of AI that people called AGI, Artificial General Intelligence, I think we should actually call it replacement AI because the real purpose of the investors and companies that are building it is to replace all humans on the job market.
00:22:01.960 Some of the companies even admit this on their websites.
00:22:05.680 And then, as you said, if that happens, there's a separate question of whether they'll also just replace us altogether on the planet and get rid of us.
00:22:11.920 And I think it's – we're on track for this technology coming.
00:22:19.780 It's not here now, but it's coming during Trump's presidency.
00:22:22.660 So if someone stops it, it's going to be Trump and his administration.
00:22:26.600 And the reason that this is so insane is because we're effectively treating the AI industry differently from any other powerful industry.
00:22:41.820 AI is the only industry that can do whatever it wants without any safety standards.
00:22:46.720 You talked earlier on the show here about the FDA and how we have to have safety standards before you can sell medicines to make sure they don't have 13 percent toxins in them, right?
00:22:57.320 And we have that even for sandwiches.
00:22:59.960 If there's a sandwich shop in San Francisco across the street from one of these companies, they can't even sell one sandwich until the health inspector has checked out their kitchen, right?
00:23:09.160 Yet, it's completely legal now, if some company wants to build smarter-than-human machines that they have no idea how to control, to just sell them.
00:23:18.180 That needs to change.
00:23:19.620 So the good news with this is although the problem is very serious, and in my professional opinion as someone who's worked on this for many years, the most serious challenge we've ever faced in the history of our species, it's also super easy to solve.
00:23:33.260 Just stop dilly-dallying and making special exceptions for this particular industry and their lobbyists and just treat them like anyone else.
00:23:44.120 There should be some sort of FDA for AI, and if they can't convince independent experts who don't have a conflict of interest that this stuff can be controlled, come back when you can, buddy.
00:23:56.060 Well, it's not controlled.
00:23:59.860 It's not regulated for a reason.
00:24:01.220 They don't want to regulate it.
00:24:02.020 But I want to just go back to make sure the audience – because I love replacing AGI with replacement AI because that's what they're trying to do.
00:24:09.780 Just go back for – something has been overhyped for decade after decade after decade.
00:24:15.140 You said the language model was the differentiation.
00:24:18.460 That's what was released, I guess, with ChatGPT at Davos two years ago, I think it was, two or three years ago.
00:24:27.380 Why did that catch people by surprise, and why was that such a huge leap that all of a sudden folks go, wow, this thing could be – this thing really could replace humans?
00:24:36.500 Why was that such a big leap for the industry for how you actually build these, and what exactly happened?
00:24:43.140 You know, traditionally AI systems, the ones that always underperformed, have the intelligence programmed into them by humans; like the machine that beat Garry Kasparov at chess, you know, back when I was a kid, was programmed by people who knew how to play chess.
00:24:57.040 The modern approach is to instead make machines that just grow intelligence by gobbling up lots of data.
00:25:04.160 And the reason people underestimated how well that was going to work, I think, is because of a simple mistake.
00:25:14.380 You know, imagine if we had this conversation in the year 1900.
00:25:18.200 How long will it be until we can have flying machines?
00:25:21.840 And you said to me, you know, hey, Max, we're clearly decades away because we don't even understand quite how birds fly and can't build mechanical birds.
00:25:29.900 That would have been wrong, because there turned out to be a much easier way to fly – airplanes.
00:25:36.620 And I think a lot of my colleagues, even very smart ones, similarly thought we would never figure out how to make thinking machines that could outthink us until after we figured out how our brain works.
00:25:48.000 And we're nowhere near figuring out how the brain works.
00:25:50.340 Turns out there is an easier way to build thinking machines, and that's what the industry is doing.
00:25:56.780 Hang on one second for a second, Max.
00:26:00.940 Joe Allen and Max are with us.
00:26:03.620 We're going to take a short commercial break.
00:26:05.900 Make sure that you go – financial turbulence is going to continue tomorrow and the next day and the next day and the next day.
00:26:12.740 President Trump is on a monumental historic reordering of the global trading patterns in the world.
00:26:19.360 This is going to cause a lot of what we call perturbations.
00:26:22.660 So, make sure that you check out Birch Gold.
00:26:26.940 Take your phone out right now.
00:26:28.040 Bannon, B-A-N-N-O-N.
00:28:29.620 Text it to 989898.
00:26:31.940 Get the ultimate guide for investing in gold and precious metals in the era of Trump.
00:26:37.860 What Birch Gold is trying to do is trying to teach you the patterns of what causes gold to both rise and fall.
00:26:45.240 Go check it out today.
00:28:46.140 Make sure you make a personal contact and relationship with Philip Patrick and the team over there.
00:26:50.480 Short commercial break.
00:26:51.920 We're going to be back with Thinking Machines in just a moment.
00:26:55.200 But I'm American, baby.
00:26:58.720 I got American power.
00:27:02.680 I got American, baby.
00:27:07.080 In America's heart.
00:27:10.660 Go on, rain.
00:27:12.640 You don't go out and buy a life jacket when the boat is already sinking.
00:27:16.660 You don't buy gold when the economy has already collapsed.
00:27:20.480 Clearly, others are heeding this advice as gold hit an all-time high the first part of 2025, multiple times.
00:27:28.000 It's not too late for you.
00:27:30.080 The company I trust to help you diversify into physical gold is Birch Gold, the company I buy my gold from.
00:27:36.300 Birch Gold specializes in helping you convert an existing IRA or 401K into a tax-sheltered IRA in physical gold for no money out of pocket.
00:27:48.320 Just listen to this five-star review.
00:27:50.480 Quote, knowledgeable, helpful, non-pressure.
00:27:52.840 End quote.
00:27:54.220 That's what you get with Birch Gold, and that's why I've endorsed them for so long.
00:27:58.440 Get your free info kit on gold by texting the word Bannon, B-A-N-N-O-N, to 989898.
00:28:05.260 There's no obligation.
00:28:06.560 Just useful information with an A-plus rating from the Better Business Bureau and countless five-star reviews.
00:28:13.640 Text Bannon, B-A-N-N-O-N, to 989898.
00:28:17.040 And let the experts at Birch Gold help you secure your financial future today with gold.
00:28:25.420 How well do you sleep at night?
00:28:27.600 Real peace of mind comes from knowing your family is prepared for anything.
00:28:32.640 My Patriot Supply, America's most trusted name in emergency preparedness, is offering a time-limited discount on their best-selling three-month emergency food kit.
00:28:43.440 Now, this is the basic, the three-month emergency food kit.
00:28:47.280 This isn't your typical survival food.
00:28:49.440 Each kit provides over 2,000 calories and 100% of the recommended value of 12 essential vitamins and minerals daily for 90 days.
00:28:58.820 With delicious meals and extras, including chicken, beef, fruits, and veggies.
00:29:03.960 We're talking quality meals that last up to 25 years in storage and still taste like home-cooked.
00:29:09.560 The pandemic taught us how quickly store shelves can empty.
00:29:13.440 Don't wait for the next crisis.
00:29:15.160 Your three-month emergency food kit includes free shipping, a disaster replacement warranty, and 24-7 U.S.-based customer support.
00:29:23.840 This special discount is available for a limited time only.
00:29:26.940 Visit MyPatriotSupply.com now to secure your family's future with a $100 discount.
00:29:33.380 If you order by 3 p.m., they'll ship your order within the same day so you have peace of mind right away.
00:29:39.080 Get yours by going to MyPatriotSupply.com.
00:29:46.580 Do it.
00:29:47.840 Do it today.
00:29:49.360 Use your agency.
00:29:50.640 Action.
00:29:51.340 Action.
00:29:51.880 Action.
00:29:52.120 You wish you could have invested in the stock market last year when investors scored the highest profits in decades.
00:29:59.240 But between that mountain of bills and credit card debt, you had nothing, and I mean nothing, left over.
00:30:05.240 It's time to stop letting debt hold you back.
00:30:09.020 Let me tell you how Done With Debt can help.
00:30:11.480 They have a brilliant new strategy designed to tackle your debt and put cash back into your hands so you can save and invest and build the life you've been wanting.
00:30:23.460 Done With Debt goes head-to-head with credit card and loan companies.
00:30:27.520 Their team of negotiators and legal experts work to significantly reduce your bills, eliminate interest, and erase penalties.
00:30:35.600 This frees up cash to invest while the stock market is still hot.
00:30:40.280 The bottom line is this.
00:30:42.800 Done With Debt helps turn crushing debt into financial freedom.
00:30:46.740 That's crushing debt into financial freedom.
00:30:50.460 But some of their strategies are time-sensitive, so don't wait.
00:30:54.360 Start building the life you deserve.
00:30:56.980 Visit donewithdebt.com and talk with one of their strategists.
00:31:00.820 It's free.
00:31:02.340 Go to donewithdebt.com.
00:31:04.200 That's donewithdebt.com.
00:31:10.280 Hello, War Room Posse.
00:31:20.460 Today we're going to have another huge War Room exclusive at wholesale prices.
00:31:25.220 We're going to start with our My Towels with that proprietary technology.
00:31:29.500 The bath towels that came in.
00:31:30.900 The big bath sheets are in.
00:31:32.280 The kitchen towels are in with all the accessories as low as $9.99.
00:31:37.520 So go to mypillow.com.
00:31:39.560 Scroll down until you see Steve.
00:31:41.440 Give them a click.
00:31:42.620 And there it is.
00:31:43.700 The kitchen towel $9.99 sale.
00:31:46.720 Right next to that, you have the spring sheet sale where you save up to 50%.
00:31:51.020 And there's the My Crosses, the most requested product, I think, ever for MyPillow.
00:31:57.300 You save 30%.
00:31:58.580 And there's the premium MyPillows, $18.98 for the queens and $19.98 for the kings.
00:32:06.220 That's a War Room exclusive.
00:32:08.560 Help keep my employees going, you guys, and help yourself get the best sleep ever and the
00:32:13.700 best products ever or call 800-873-1062, 800-873-1062, promo code WARROOM.
00:32:25.060 The most sought after promo code ever.
00:32:30.620 Folks, I hate to say that I called this one earlier when Jonathan Karl, early this morning
00:32:34.900 on Good Morning America with George Stephanopoulos, said that there was a senior conservative member
00:32:41.740 of the legal community that was going to place a challenge to President Trump.
00:32:46.380 It's just been filed.
00:32:48.420 Headline in the Guardian, right-wing group backed by Charles Koch, and wait for it, Leonard Leo,
00:32:55.260 sues to stop Trump's tariffs.
00:32:57.760 New Civil Liberties Alliance says President's invocation of emergency powers to impose tariffs
00:33:02.540 is unlawful.
00:33:03.940 First paragraph, a libertarian group backed by Leonard Leo and Charles Koch has mounted
00:33:08.280 a legal challenge against Donald Trump's tariff regime.
00:33:11.740 In a sign of spreading right-wing opposition to a policy that has sent international markets
00:33:18.620 plummeting.
00:33:19.380 This will be another fight.
00:33:20.560 We told you they're going to go all the way to federal court to try to slow down
00:33:23.480 President Trump.
00:33:25.100 The Kochs and the libertarians taking the same tactics as the progressive left-wing neo-Marxists.
00:33:30.900 So we'll get on to that tonight.
00:33:32.760 And sure, tomorrow morning, we'll have a big breakdown in the morning show.
00:33:36.740 Joe Allen joins us, our editor for all things Singularity.
00:33:41.840 Joe, I think Max is on to something here big time.
00:33:45.240 This is not getting enough coverage.
00:33:46.800 You've got these four guys who all have messiah complexes.
00:33:51.380 They're running wild.
00:33:52.720 Your thoughts, sir?
00:33:55.280 Well, first off, Steve, I'd just like to really extend my gratitude to Max for coming on.
00:34:01.420 A lot of people would maybe be nervous in his position to appear on a conservative show.
00:34:11.220 But for the audience, just to jog memories, people who have been listening to The War Room
00:34:17.220 for the last four years will remember in the first year, I recommended Max Tegmark's book
00:34:24.620 many times, Life 3.0.
00:34:28.240 And that's important because it really does give a profound framework for understanding
00:34:34.320 what the impact and the import of artificial intelligence is.
00:34:39.260 It is, in many ways, even if you just think of it as an imaginative exercise,
00:34:45.060 an invasion of a new species.
00:34:47.760 It's very similar to the sort of replacement labor and replacement populations of mass immigration
00:34:57.240 in the sense that it threatens jobs.
00:34:59.760 It threatens national identity.
00:35:02.160 I mean, what do you do when the digital space is 50% bots?
00:35:07.400 It's earth shattering.
00:35:09.380 And I also want to remind the audience that they'll probably remember, some will, two years
00:35:14.920 ago or so, the Future of Life Institute, of which Max is a co-founder, published an open
00:35:24.400 letter to pause AI.
00:35:26.420 They wanted to stop development of AI roughly at the level of GPT-4.
00:35:32.420 Of course, we've now blown past that.
00:35:34.320 Even many of the signatories, including Elon Musk, are attempting to blow past that.
00:35:39.260 But it's really important, too, to look at what they're doing at the Future of Life Institute
00:35:44.980 and what Max Tegmark is doing individually, is trying to find some convincing way to persuade
00:35:53.760 politicians that this is, in fact, a real danger.
00:35:58.200 And that if we don't at least begin the conversation on how to regulate this, on how to minimize the
00:36:06.700 negative impacts, even if it doesn't become superhuman artificial intelligence, then we're
00:36:12.560 going to be caught with our pants down.
00:36:14.800 And the kind of nightmarish scenarios will absolutely be much more likely to unfold without
00:36:23.580 any sort of action from either the public at large or preferably the government itself.
00:36:30.760 And I'd really like to hear more from Max about the kinds of concrete proposals that
00:36:36.540 he has for governmental policy.
00:36:39.240 I spoke to him before, and he makes a very strong case for a kind of limited regulation.
00:36:47.900 Yeah.
00:36:49.300 Well, no, let me turn it over to Max.
00:36:51.100 First off, give us an update.
00:36:53.180 Since you sent this letter and you're kind of a leader in this field, you sent this warning
00:36:57.680 shot.
00:36:58.620 What's been the result of just that?
00:37:00.480 Did that really get people's attention?
00:37:02.160 Did people dismiss it?
00:37:03.240 Did people in the industry say, hey, don't ever mention Max's name again because he's
00:37:07.380 kind of, you know, gone off the deep end?
00:37:09.560 We got to pursue this because the Chinese are pursuing it for whatever reason they give.
00:37:13.340 What's the status since you put this, since you put the warning shot across people's bow?
00:37:18.660 It had a huge effect.
00:37:20.400 There was a strong pent-up anxiety across society, where people felt afraid of voicing their
00:37:29.780 concerns with this for fear of being branded clueless or crazy.
00:37:34.300 And when people saw that Professor Yoshua Bengio, who's now the most cited AI researcher in
00:37:41.180 the world, and many others had signed this, it made it socially safe for everyone to really
00:37:46.780 speak up.
00:37:47.840 And that led in very short order to a statement, which was actually signed by all the top
00:37:52.400 CEOs of the companies, even saying, hey, you know, AI could drive us all extinct.
00:37:57.280 And then there started being political discussion.
00:38:00.480 Unfortunately, what's happened since then is the lobbyists from big tech have mobilized
00:38:05.040 very successfully.
00:38:06.240 There are more lobbyists in Washington, D.C. now from big tech than from any other industry.
00:38:11.980 And they've more or less memory-holed the fact that the CEOs warned of the very technology that
00:38:19.280 they're now rushing to build.
00:38:21.080 So switching gears to the solution here, which is actually quite straightforward, it's important
00:38:26.500 to remember, of course, we want to figure out how to cure cancer and reduce road deaths
00:38:33.980 et cetera, et cetera, if we can use AI for things like this.
00:38:38.120 And we can have all of that without building replacement AI.
00:38:44.680 There is a massive amount of research, actually, on how you can build AI systems that
00:38:50.680 are tools, which I like to define as things that we can control, that work for us, right?
00:38:55.000 You don't want an uncontrollable car, for example.
00:38:58.100 That's why your cars today are tools.
00:39:00.200 And how we can have amazing tools that help us cure diseases and make us more prosperous and
00:39:07.000 strong, et cetera, et cetera, without ever taking that extra step and making
00:39:14.220 things that can just replace all humans and that we don't know how to control.
00:39:18.580 So that's where you want to draw the line.
00:39:20.200 If you simply treat AI like we would treat car manufacturers, airplane manufacturers, drug
00:39:24.720 manufacturers, and everywhere else saying, hey, you know, before you sell your stuff, show us
00:39:28.740 why this is not something we could lose control over that would just wholesale replace us, then what
00:39:37.000 we'll get is a golden age of innovation to build all these tools.
00:39:41.160 And we will not be in this crazy rush where we worry whether people are going to
00:39:48.640 figure out how to control them in time.
00:39:51.180 And it's – it's total scientific consensus, even though it's very little known
00:39:56.320 among the broad public, that we're much closer to –
00:39:58.880 Okay.
00:39:59.880 But, but, but, but, but hang on for a second.
00:40:02.820 Hang on for a second.
00:40:04.080 This industry's had two, just in the last couple of years, it's had two shocking moments
00:40:09.880 given all the tens of billions of dollars in smart people on the finance side and venture
00:40:16.060 capital side and obviously all the smart individuals on the scientific and technological side and
00:40:21.600 programming side and things like the weapons labs, the national labs – the ChatGPT moment
00:40:27.320 at Davos shocked the world and kind of shocked an industry.
00:40:32.080 Then you, bro, you just had one a couple of months ago that was orders of magnitude bigger
00:40:38.700 than that shock, which was DeepSeek, which came from the Chinese.
00:40:42.880 So just given the nature and structure of this industry, that continues to have things
00:40:49.040 that come out of nowhere to the people who are already the best in the world, and the
00:40:53.260 people that are putting their money in to make sure, you know,
00:40:57.000 they get return on capital.
00:40:58.780 Given that that's the landscape – in the last three years, you've had two massive order of magnitude
00:41:04.920 out of nowhere moments.
00:41:07.420 How could we ever have anything that just had some sort of light,
00:41:10.260 "oh, can you please tell us what you're working on" thing?
00:41:13.960 You would have to have, particularly if it's on replacement and if replacement can have
00:41:18.120 some sort of probability that in replacing human beings, maybe they make human beings extinct.
00:41:24.460 Don't you – the reality is you just can't have some sort of light hand on regulation.
00:41:29.540 You actually have to drill down into these companies.
00:41:32.140 So what the hell is going on?
00:41:33.140 Because if history shows us anything, this does not move incrementally.
00:41:37.020 It moves by massive leaps and bounds.
00:41:39.880 Am I wrong in that?
00:41:42.100 No, you're, you're exactly right.
00:41:43.840 Of course.
00:41:44.500 But you know, if AI were treated like other industries, like pharma,
00:41:51.480 you know, there'll be different tiers.
00:41:52.940 If someone is launching a new lollipop, very light touch regulation.
00:41:57.120 If someone's trying to make a new kind of fentanyl, of course, there'll be very, very
00:42:01.620 serious scrutiny of that.
00:42:03.620 There should be.
00:42:05.000 And in the same way, if someone is just launching a slightly better self-driving car, you know,
00:42:09.320 I think that should be very easy to get approved, not a lot of red tape, so we can save lives
00:42:14.780 on the road.
00:42:15.760 But if someone says, I want to build artificial super intelligence that might become as much
00:42:22.120 smarter than humans as humans are smarter than snails, which is something these companies
00:42:27.460 are basically saying that they're going to do.
00:42:30.100 That's even more serious than if someone wants to make new opioids.
00:42:34.100 Of course, the government should come in and say, hey, buddies, what are you doing here?
00:42:37.680 You know, and the default is it's your responsibility to convince us in the government that this is
00:42:44.080 safe, not the other way around.
00:42:50.060 On the probability – when you met with this person socially and they
00:42:55.000 said, oh, I think we'll get to replacement AI as soon as next March – is your best guess
00:43:01.420 that it's sometime in the next year or two that this will actually be dropped on us and
00:43:05.880 have to be dealt with, if we don't deal with it now?
00:43:08.880 I'm very humble about forecasting.
00:43:11.060 You know, it's really hard.
00:43:13.120 But the fact is, indisputably, that the leading players in all the companies are
00:43:19.900 all making predictions between one year and five years from now.
00:43:23.100 And you might think they're just overhyping it for their investors.
00:43:25.680 But the whistleblowers who have left these companies in disgust are also saying very similar
00:43:31.780 things.
00:43:32.280 And so are academic colleagues.
00:43:34.540 And when I look at the research myself as well, you know, it's clear that cracking
00:43:41.460 language, getting things like ChatGPT – that was an enormous hurdle that people thought
00:43:50.220 might have taken 50 years.
00:43:51.380 From here on out, it's mostly engineering, in my opinion.
00:43:55.260 And five years or one year, it doesn't really matter.
00:43:58.960 Either way, government has to step up.
00:44:01.560 The good news is that the current administration is actually working out a new action plan that's
00:44:06.940 supposed to come out the middle of this year.
00:44:09.520 And I think that's a really great opportunity to step up and say, you know, say to all these
00:44:18.140 companies in the Bay Area that, you know, we are not going to treat you somehow with any less
00:44:29.580 scrutiny than we treat all of the other companies.
00:44:32.040 And Trump himself has even spoken about how there's this real risk that we could lose control
00:44:37.580 over things.
00:44:38.120 We do not want our government to create a company.
00:44:43.520 I want to go back.
00:44:44.620 In the time we've got, I want to go back.
00:44:47.240 A lot of people saw Oppenheimer.
00:44:49.940 Go back.
00:44:51.340 You said the ChatGPT –
00:44:53.200 and when we talked over the weekend, you said it was a Fermi moment.
00:44:55.820 Explain who Fermi was, when his experiment below Soldier Field in Chicago took place, what it showed
00:45:04.020 the world – and before then, the question of could you build an atomic weapon
00:45:08.940 was all theoretical.
00:45:10.200 From that moment on, you said, hey, it just became the engineering, the mechanics of it.
00:45:14.980 You believe in AI we're at the same point.
00:45:17.400 Walk us through what Fermi did and why did that bring everything together?
00:45:21.640 That you actually could build a bomb.
00:45:22.920 You just had to do the physical nature of it.
00:45:26.060 Exactly.
00:45:26.860 The metaphor you mentioned is very good.
00:45:28.860 You know, for a long time, the world's best physicists thought maybe it'll take 100 years
00:45:34.580 or 50 years or whatever to figure out how to get nuclear energy out of the atoms.
00:45:40.280 And then Enrico Fermi built the first ever self-sustaining nuclear reactor in Chicago under a football
00:45:48.360 stadium around 1942.
00:45:50.660 And most people at that point, if they even heard about it, totally dismissed it.
00:45:57.020 But the physicists totally freaked out and realized that from now, from here on out to
00:46:02.380 the bomb, it's just engineering.
00:46:05.360 Maybe it'll take one year.
00:46:06.340 Maybe it'll take five years.
00:46:07.320 In fact, it took three years.
00:46:08.380 And it's very analogous, Alan Turing, who's the intellectual godfather of the field of
00:46:14.180 AI, he said in 1951 that, you know, if we ever build machines that can outthink us in
00:46:21.960 every way, then we have robots, basically, that are much smarter than us, that can build
00:46:27.360 robot factories, that can build more robots in the billions.
00:46:30.580 They will quickly take control of Earth.
00:46:32.840 But he said, don't worry about that because it's far, far away.
00:46:35.920 But I'll give you a canary in the coal mine so you know when to pay attention.
00:46:41.500 The Turing test said when machines can master language and knowledge at human level, that's
00:46:48.620 when you're close.
00:46:49.720 That's when it's all engineering from then on.
00:46:51.940 And that's when you have to pay attention.
00:46:53.020 And this is the moment that we've had now with ChatGPT and sequels.
00:46:59.220 So we've been waiting for it.
00:47:01.620 I wish, for the sake of my children, that this would have taken us a lot longer to get
00:47:06.500 to this point.
00:47:07.020 But here we are.
00:47:08.560 And now is the time to act.
00:47:12.820 Max, hang on for a second.
00:47:14.020 Joe, we're going to wrap up here with that.
00:47:16.640 We're going to have Max back on.
00:47:17.700 It's been amazing.
00:47:18.560 Any closing thoughts on this?
00:47:21.680 Once again, just extending gratitude to Max for coming on and imparting his wisdom to
00:47:28.040 the audience.
00:47:28.800 I believe that many are well prepared to hear it.
00:47:32.160 Just my real parting thought, though, is that what Max Tegmark here is describing is a nightmare
00:47:40.180 scenario in which machines become more intelligent than human beings.
00:47:43.440 And we could argue all day long about what intelligence is, what real thinking is, whether
00:47:48.760 these machines could truly replace every facet of humanity.
00:47:52.620 I would like to make a statement that is perhaps even darker.
00:47:58.340 Even if AI doesn't reach that level, not for the next 10 years, 20 years, where it's at
00:48:06.440 right now and where these companies are positioned to deploy it all over the economy, in education,
00:48:14.380 in medicine.
00:48:14.920 I think that there is already a real threat that AI is, A, going to replace some jobs, but
00:48:22.020 B, make a lot of jobs kind of uninhabitable for people who don't want to be dehumanized – teachers
00:48:29.900 who have to use chatbots to teach their students, doctors who constantly have to rely on the benchmarks
00:48:35.140 of machines.
00:48:36.040 So I just want to emphasize that while AGI is still just in the realm of possibility,
00:48:43.920 the narrow AIs that we have right now, I think, could do real damage before any kind of
00:48:49.460 regulation could actually be pushed through.
00:48:52.440 And so just to reiterate what we've been saying for a long time, it's going to be up to those
00:48:57.200 people listening, to regular citizens, to make informed choices as to whether or not they
00:49:02.480 want to turn their children over to chatbots for education or turn their own bodies over
00:49:07.500 to doctors who are really reliant on AI as their means of diagnosing disease, so on and
00:49:15.380 so forth.
00:49:15.740 But anyway, thank you very much, Max, and thank you, Steve.
00:49:19.540 Yeah, no, we're going to get organized in this.
00:49:21.500 Hang on one second, Joe.
00:49:22.440 Max, we've had the Turing moment, as you said.
00:49:26.820 Where do people get you?
00:49:28.020 I want to get access to your writings, access to your book, access to your social media.
00:49:32.860 Where do they go, Max?
00:49:34.500 I'm on Twitter.
00:49:37.720 Just Tegmark is my Twitter handle.
00:49:40.820 I often piss off tech bros by things I post there.
00:49:45.500 And if someone's interested in reaching out to me personally, my email is in plain sight
00:49:50.980 also.
00:49:51.440 It's just tegmark@mit.edu.
00:49:53.880 And I just want to maybe end with a little appeal here.
00:49:57.040 You know, there are so many different things that people argue about and fight about right
00:50:02.120 now on this planet.
00:50:03.100 But in this battle, as was eloquently said there by Joe, you know, it's really team human
00:50:10.900 versus team machine.
00:50:11.860 And we have to all ask ourselves, which side are we on?
00:50:15.160 And I would encourage anyone who listens to this, if someone comes up and tells them,
00:50:21.380 oh, I want to go work for this tech company and build AGI, you know, give them a hard time.
00:50:26.960 Ask them what team are they on, actually?
00:50:29.360 Why is this supposed to be good for team human, you know, that we just build our own replacements?
00:50:34.640 We are in charge of this planet.
00:50:35.900 And let's use that fact to build tools that improve the lives for all of us, not to just
00:50:44.420 throw away the keys to the planet that we're supposed to be stewards over.
00:50:52.140 Max, honored to have you on here.
00:50:53.980 We'll check in after the show and then look forward to having you back, brother.
00:50:57.260 Fantastic discussion.
00:50:58.660 Appreciate you.
00:51:02.180 Incredible.
00:51:02.860 Joe Allen, where do people get you?
00:51:04.520 I know you're on special assignment for us.
00:51:06.480 People miss you coming on every couple of days.
00:51:08.320 Where do folks go?
00:51:10.600 You can find my writings at joebot.xyz, social media at J-O-E-B-O-T-X-Y-Z.
00:51:18.400 Thank you very much, Steve.
00:51:21.960 Joe Allen, on a canceled Black Monday, great way to end the show.
00:51:29.140 Hey, look, folks, if not you, who?
00:51:32.380 This is why people come on the war room.
00:51:34.820 They want access to a group of fighters.
00:51:37.700 Are you on Team Human?
00:51:38.800 Are you not?
00:51:39.300 Are we going to be taken over by thinking machines?
00:51:41.500 Are we going to build our own replacements?
00:51:44.020 I don't think so.
00:51:45.920 But it's not just going to happen.
00:51:47.560 We have to make it happen.
00:51:48.680 We're going to lead with the right stuff.
00:51:49.940 You know why?
00:51:50.940 You got it.
00:51:52.400 President Trump has it.
00:51:53.580 The MAGA movement has it.
00:51:55.120 We're going to see you at 10 a.m.
00:51:57.620 Eastern Daylight Time tomorrow morning when you will be back in the war room.
00:52:02.520 Are you a yo-yo dieter?
00:52:25.720 Later, you diet, lose weight, but gain it all back, plus a few extra pounds for the effort.
00:52:32.080 Then later, you lose it again and regain it again and on and on and on.
00:52:36.400 I think I resemble this.
00:52:38.720 It's dangerous.
00:52:39.580 Studies show that you can increase your risk of heart attack, stroke, type 2 diabetes, and other health problems.
00:52:45.960 Breaking free of your yo-yo diet pattern is a main reason doctors created Lean.
00:52:51.160 Lean is a supplement, not an injection, and you don't need a prescription.
00:52:56.320 The science behind Lean is impressive.
00:52:59.020 Its studied natural ingredients target weight loss in three powerful ways.
00:53:03.540 Lean helps maintain healthy blood sugar.
00:53:06.060 It helps control appetite and cravings.
00:53:08.160 And it helps burn fat by converting fat into energy.
00:53:12.680 Listen, if you're tired of losing weight and gaining it back,
00:53:15.500 if you want to lose meaningful weight at a healthy pace, Lean was created for you.
00:53:20.040 Let me get you started with 20% off when you enter Bannon20, that's B-A-N-N-O-N-20, at TakeLean.com.
00:53:28.020 That's code Bannon20 at TakeLean.com.
00:53:32.660 Bannon20, that's 2-0, at TakeLean.com.
00:53:36.740 Lose it and keep it off.
00:53:39.200 I'm really trying to fill this gap of quality supplements,
00:53:43.100 with, of course, the beef liver being our flagship product.
00:53:45.900 For those who don't know, beef liver is loaded with highly bioavailable ingredients
00:53:50.960 such as vitamin A, B12, zinc, CoQ10, etc.
00:53:56.040 And because it is 100% grass-fed and natural,
00:53:59.960 your body is able to absorb these nutrients far better than taking any other synthetic multivitamin
00:54:05.200 or any other synthetic vitamin in general.
00:54:08.560 So we have some other amazing products,
00:54:10.500 but if you'd like to check us out, you can go to SacredHumanHealth.com
00:54:13.800 and cheers to your health.
00:54:16.040 700,000 Americans every year.
00:54:18.940 Yes, heart disease is the number one killer every year, year in and year out.
00:54:22.760 Heart disease builds over time.
00:54:24.840 Hypertension, high blood pressure, bad cholesterol, diabetes, all of it affects our heart.
00:54:30.060 A healthy heart is key to being energetic as we get older.
00:54:33.860 It is never too early to take care of your heart.
00:54:38.760 You see, heart disease sneaks up on us.
00:54:40.800 It can start in your 30s, and when this happens, you're at serious risk by the time you turn 60.
00:54:44.480 If you want to take care of your heart and those you care about,
00:54:48.580 please go to WarRoomHealth.com.
00:54:51.120 That's WarRoomHealth.com.
00:54:53.660 All one word, WarRoomHealth.com.
00:54:56.220 Use the code WARROOM at checkout to save 67% off your first shipment.
00:55:00.500 That's code WARROOM at checkout to save 67% off.
00:55:04.000 Do it again.
00:55:05.460 WarRoomHealth, all one word, WarRoomHealth.com.
00:55:08.800 Go there today.
00:55:10.460 If you're going to be part of the posse, you need a strong heart.
00:55:13.340 You need a lion's heart.
00:55:15.100 How we're going to do that is with Salty.
00:55:17.400 Go there.
00:55:18.040 Do it today.
00:55:18.660 Check it out.