Bannon's War Room - December 19, 2025


Episode 5008: Live From AMFest Day 1


Episode Stats

Length

55 minutes

Words per Minute

167.1

Word Count

9,214

Sentence Count

808

Misogynist Sentences

15

Hate Speech Sentences

27
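For reference, the words-per-minute figure above follows directly from the word count and runtime. A quick sketch of the arithmetic (assuming the listed 55-minute length is rounded from the exact runtime):

```python
# Relation between the episode stats above. The listed WPM
# (167.14638) implies the exact runtime is slightly over the
# rounded 55-minute figure.
word_count = 9214
wpm_listed = 167.14638

wpm_from_rounded_length = word_count / 55          # about 167.5 wpm
implied_runtime_minutes = word_count / wpm_listed  # about 55.1 minutes
```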


Summary

In the wake of the Brown University shooting and the shooting at the Capitol, the question is: Is there a link between Islamic terrorism and the recent spate of mass shootings across the country? And if so, what role does Islam play in them?


Transcript

00:00:00.000 All right, Burkwam, Burkwam, Burkwam, Bannon, Bannon, Bannon, Bannon, Bannon, Maureen, hey everyone.
00:00:11.500 War Room, I'm going to let Maureen introduce the show.
00:00:14.980 Well, you're here in the War Room. I'm not going to introduce it like my dad, but thank you guys all for joining us here.
00:00:22.700 My dad will be here tomorrow, so don't miss out.
00:00:26.840 I know, I know. You have the prettier Bannon. He has the better hair.
00:00:31.940 Whoa, did you just say the better hair?
00:00:34.880 He has the better hair.
00:00:36.040 I don't know about that. Your hair is pretty cool.
00:00:39.260 He is amazing. When is he speaking?
00:00:42.700 He speaks tomorrow at 6 p.m.
00:00:45.180 So make sure you see that. Steve Bannon.
00:00:47.980 Technically 5:59, but we know it will be 6 p.m.
00:00:51.040 On the big stage at 5:59, Steve Bannon. That should be really, really, really interesting.
00:00:55.140 All right, so tell us about this.
00:00:57.620 So, Brown University, what's going on with Islamic terror in America?
00:01:02.040 Is anyone else concerned about it as much as we are here at Real America's Voice?
00:01:06.040 Does anyone think that these incidents are unique and independent of each other?
00:01:12.200 Or do you think there's some sort of coordination between, I don't know, cells, Iran?
00:01:20.040 What do you think, Maureen?
00:01:20.960 I think there is a correlation, and I think the fact that we're seeing a continuing increase in violence,
00:01:27.780 and especially in this case, I believe, a targeted hit on a conservative.
00:01:33.800 Ella Cook was a conservative in charge of a college Republican club,
00:01:40.000 and she was killed, I think, for her beliefs.
00:01:43.780 So, I think that, you know, like we were talking about earlier,
00:01:48.140 you can't be afraid to have your voice heard, but we need to get to the bottom of it,
00:01:53.920 and I don't think we're getting the full transparency.
00:01:56.160 And there was supposed to be a press conference at 4 p.m. Eastern that was postponed,
00:02:02.100 I believe, indefinitely.
00:02:04.080 And Joe Allen can...
00:02:05.080 Joe Allen, folks.
00:02:05.780 ...also address.
00:02:06.600 There's a potential correlation between what happened at MIT and what happened at Brown.
00:02:11.520 And what happened in the Capitol?
00:02:15.740 What happened at the Capitol?
00:02:16.340 Another fanatical Muslim kills a service member, and one's still in the hospital, right?
00:02:24.100 Yeah.
00:02:24.460 As far as the shooting in Alston, outside of Boston, it's unclear what the connection is,
00:02:31.840 but certainly the killing of a young, very promising, and very highly respected physicist.
00:02:40.020 It echoes some of the tactics one has seen, for instance, in Iran, in which scientists have been targeted.
00:02:47.060 So, it's all speculation.
00:02:48.780 No one knows who did it, and no one knows if it's directly connected.
00:02:51.980 But I think it's ominous in the deepest sense of the word.
00:02:56.640 It's an omen.
00:02:57.500 So, they're actually, breaking news, they have identified a suspect in the Brown University shooting.
00:03:04.940 Yes, yes.
00:03:06.840 According to two law enforcement officials familiar with the case.
00:03:10.240 However, the manhunt for the shooter is still ongoing.
00:03:15.140 They got the wrong one.
00:03:16.020 So, they're saying that they've identified a shooter, but then the manhunt...
00:03:20.220 Sure.
00:03:20.660 Still on the loose.
00:03:21.400 Yes.
00:03:21.540 Still on the loose.
00:03:22.200 So, no description yet, right?
00:03:24.040 And they won't address the name.
00:03:24.420 Yeah, no, this is important.
00:03:26.020 Coulter's Law.
00:03:27.140 No description.
00:03:28.220 So, Coulter's Law is still in effect.
00:03:29.860 You can't say a Mediterranean-looking male anymore, right?
00:03:31.940 You have to say a suspect.
00:03:33.980 They, them, are on the loose.
00:03:37.460 They, they, at the root of every conspiracy is they, and them.
00:03:40.580 They, them, are on the loose.
00:03:41.520 So, what is more dangerous to us right now?
00:03:44.960 I mentioned Lara Logan was on the show yesterday talking about the cartels, the danger
00:03:50.500 that the cartels are, not only to our elections, but also to our safety, to our
00:03:55.340 families, to our children with the drugs.
00:03:57.120 What's more dangerous, the cartels or Islam, fundamental Islamic fanatics?
00:04:02.640 Well, insofar as unpredictable violence, I think it's very clear that Muslim fanatics
00:04:07.440 are a real threat.
00:04:09.020 Cartels tend to be much more strategic.
00:04:11.060 They would have an economic or political or gang-related purpose.
00:04:15.220 So, it's very different.
00:04:16.600 But, you know, I follow the excellent work of Joe Kent, head of counter-terrorism.
00:04:22.020 And it's linked in many ways, because a big part of the fear, or the realistic apprehension, that
00:04:30.000 the country has a lot of Islamic cells is that so many people
00:04:35.420 came over the border over the last four years.
00:04:38.060 And so, there's really no way to know how many terrorists of any stripe came over the
00:04:43.480 border during that time.
00:04:44.220 I think, I'm pretty sure Trump said 8,000 or 9,000, and they crossed the border.
00:04:47.940 Yeah, I mean, it's a guess.
00:04:50.340 In that botched withdrawal, there was not proper vetting through
00:04:55.980 the State Department, and we let anyone that claimed to have helped the United States
00:05:01.860 in Afghanistan enter this country.
00:05:04.440 And then, we see what happened.
00:05:06.920 Someone who came from Afghanistan shot two National Guard soldiers in Washington, D.C., murdering
00:05:16.200 one of them.
00:05:16.780 So, we are seeing an increase.
00:05:19.220 People are not being properly vetted coming into this country.
00:05:22.820 The Biden regime opened the border and said, come on in.
00:05:28.140 Thoughts on that one.
00:05:29.020 I'm going to, I'm going to jump in the crowd.
00:05:30.260 So, you guys keep going.
00:05:31.220 I'm going to go in the crowd, see if the folks have some thoughts.
00:05:33.240 But keep it going, guys.
00:05:34.320 You know, beyond the identity of perpetrators, the narrative that you hear again and again,
00:05:39.140 the narrative of the evil extremist right-wing terrorist.
00:05:42.920 I think all of the events of the last two years have really put the lie to that.
00:05:48.320 Not that it needed a whole lot more, but everything from Trump in Pennsylvania having an attempt on his life to what happened to Charlie Kirk,
00:05:57.580 to the shooting at D.C., what we're seeing right now with Ella Cook.
00:06:04.160 It's pretty clear that for a variety of reasons, it is not crazy right-wingers who are at the forefront of domestic terrorism in America.
00:06:15.000 And I think that the sooner that narrative goes away, the sooner we can move towards logical, rational solutions to the violence.
00:06:24.560 I think you're right, but I also think that we need to get to the bottom of what happened in these different incidents,
00:06:32.740 what happened with Trump in Butler, Pennsylvania.
00:06:36.400 We haven't gotten to the bottom of that.
00:06:38.440 Yes.
00:06:38.740 It's been over a year.
00:06:39.960 Yes.
00:06:40.300 If we're not getting to the bottom of what happened, then how can we expect any transparency or any resolution in anything since then?
00:06:51.380 Yeah, absolutely.
00:06:52.380 And I think the cultural chaos that you see, I'm not one to shy away from any conspiracy theory.
00:06:59.760 You could say that my entire career has been spent looking at a variety of open conspiracies.
00:07:05.780 But in the absence of evidence, right, if you have no detailed official narrative or if the official narrative looks fishy, which is the case in most of these,
00:07:15.960 the only thing people can do is speculate.
00:07:18.020 And so you have professional speculators who rule the narrative, a.k.a. conspiracy theorists.
00:07:25.940 I mean, it really breeds a culture of suspicion.
00:07:29.300 And it makes people, let's say, much more likely to behave in insane ways.
00:07:35.620 What you saw around the Butler, Pennsylvania shooting, for instance, it's been a year.
00:07:42.500 Morning.
00:07:43.140 We have someone out here that's got a couple of thoughts on what we're talking about right now.
00:07:46.980 What's your name and where are you from?
00:07:48.260 My name's Angela Armstrong from Ohio.
00:07:51.400 From Ohio.
00:07:51.800 Now, I asked a question earlier to Joe.
00:07:54.560 What are we more fearful of?
00:07:56.600 Islamic terror, homegrown Islamic terror because they've come through the border,
00:08:01.260 or the drug cartels bringing fentanyl and drugs into our country, maybe even killing Americans?
00:08:07.260 They're both equally a concern to me, due to the fact that Islamic terrorists
00:08:16.220 are raised from childhood to be killers and to be desensitized to human feelings.
00:08:24.700 The drug cartels, they're all out there for the money.
00:08:27.000 What's the best defense against drug cartels, Islamic terror, all the threats,
00:08:33.000 even a tyrannical government?
00:08:34.400 What's the best defense?
00:08:37.660 For me, being pro-gun.
00:08:39.680 Thank you.
00:08:40.740 Thank you.
00:08:41.620 Second Amendment.
00:08:42.600 Second Amendment.
00:08:43.300 Thank you.
00:08:43.580 I actually work for Buckeye Firearms Association, the best pro-gun organization in the state of Ohio.
00:08:50.380 Protecting the Second Amendment.
00:08:51.600 Protecting the Second Amendment.
00:08:52.940 Dick Heller from D.C. v. Heller.
00:08:55.500 Sure, sure, sure.
00:08:56.480 Good friend of mine.
00:08:57.700 If you all carry, we're more of a polite society if you're afraid of being shot.
00:09:03.120 Well, I don't want to be a victim and travel to, let's say, for instance, Hawaii or Chicago or California and be a victim.
00:09:11.480 They're taking away your human, natural, God-given rights.
00:09:16.480 Protect yourselves, folks.
00:09:17.980 So, Australia is one of the most difficult countries to be able to own a firearm.
00:09:23.400 One of the most difficult.
00:09:24.340 What happened two weeks ago, last weekend, before that?
00:09:28.240 Fifteen, maybe sixteen people slaughtered, murdered because it was a soft target.
00:09:32.580 First of all, you can't get a gun in Australia.
00:09:34.580 It's almost impossible.
00:09:35.340 So, and then two Muslim extremists, father and son, shoot up a beach filled with people who are celebrating a Hanukkah festival.
00:09:45.200 Insane.
00:09:45.940 Got to protect yourself.
00:09:47.320 Arm up.
00:09:47.920 Final thought.
00:09:48.640 Final thought.
00:09:49.540 Well, you look at Australia.
00:09:51.660 The gentleman that tackled the guy to take the gun away, I believe it was a shotgun.
00:09:57.000 He didn't know how to hold it.
00:09:58.120 Didn't know how to discharge it.
00:10:00.380 If he would have done that, that would have been a huge difference with more lives being saved.
00:10:04.440 There you go.
00:10:04.740 God bless America.
00:10:05.500 God bless Charlie Kirk's movement.
00:10:07.140 God bless America.
00:10:08.080 God bless Charlie Kirk's movement.
00:10:09.140 God bless that guy who took it upon himself to put himself at risk to save a bunch of lives in Australia.
00:10:14.580 Back to you guys up there.
00:10:16.420 What do you think about what she just said?
00:10:19.060 I think that that's definitely the foundation we need to begin with.
00:10:22.500 If people aren't allowed to protect themselves, and they're not armed to do so,
00:10:26.400 you fundamentally have a weak country.
00:10:28.340 But as far as anything regarding Muslim terrorism or gang violence, I don't think the Second Amendment is enough.
00:10:35.720 At this point, it's a balancing act between civil rights for Americans.
00:10:40.320 Do we get surveilled?
00:10:41.680 Are we subject to the sorts of systems that Palantir deploys?
00:10:45.580 On the other hand, are those systems available to law enforcement and other investigating bodies to actually track down killers?
00:10:53.620 You would think in a massive, digitally-saturated surveillance state that we'd be able to find killers just like that.
00:11:00.460 They found everyone that walked into the Capitol, for instance.
00:11:03.580 But for whatever reason, this is slow-rolled.
00:11:06.560 I agree.
00:11:07.480 And back to what you said about the narrative: in all of these incidents that are occurring, it's not right-wing extremists.
00:11:16.360 And the fact that the left is still pushing that narrative, I believe, is causing an increase in these incidents.
00:11:25.660 Yes, absolutely.
00:11:26.700 And it also gives justification for some of the worst elements of human nature.
00:11:30.260 The celebration around the various killings from the left, it's pretty horrific.
00:11:34.940 I mean, case in point, the seditious six: as soon as Senator Slotkin went on national television and said that the National Guard was going to start shooting civilians, days later, we see two National Guardsmen shot.
00:11:52.020 So I believe, personally, being on the West Point Board of Visitors, Senator Slotkin is also on the West Point Board of Visitors.
00:11:59.200 And I think that President Trump, and I believe he has been told this already, that she should be removed effective immediately from the West Point Board of Visitors.
00:12:10.900 She should not have the right to be on that board if you're going to tell our servicemen and women that they should disobey orders.
00:12:19.960 And the fact that two service members were harmed because of what she said, she doesn't deserve to be on that board.
00:12:26.640 Amen.
00:12:26.900 Hey, Maureen and Joe, I got, believe it or not, we have President Trump right here.
00:12:31.980 President Trump, congratulations on the great speech last night.
00:12:35.160 How do you feel today after the big speech?
00:12:37.060 Well, we're feeling so great.
00:12:38.380 How are we doing, Turning Point?
00:12:39.820 How are we doing, ladies and gentlemen?
00:12:42.420 Unbelievable.
00:12:43.180 AmFest, like America First AmFest.
00:12:45.320 It's a great festivity.
00:12:46.480 We're having so much fun.
00:12:47.040 You're looking great.
00:12:47.820 Nice and bronze.
00:12:48.640 You're here from Florida.
00:12:49.640 I am.
00:12:50.360 Mr. President, Melania's Christmas decorations are stunning this year.
00:12:54.780 Beautiful.
00:12:55.180 She's so, isn't she amazing, Melania?
00:12:57.320 Isn't she beautiful?
00:12:58.740 She's so great.
00:12:59.620 I haven't seen her in a little while.
00:13:00.720 That's okay.
00:13:01.260 But she's doing great work and we love that.
00:13:03.180 Excuse me.
00:13:03.820 Turn around for a second.
00:13:05.100 Is this Tucker Carlson?
00:13:06.940 Tucker?
00:13:07.560 I mean, I don't really know what that means, actually.
00:13:11.180 I don't know what that means.
00:13:13.280 What does that mean?
00:13:13.780 I don't even know who that is.
00:13:14.720 What's the biggest threat to America right now, Tucker?
00:13:19.160 Right now?
00:13:19.900 I mean, like it's supposed to be Qatar?
00:13:22.700 I don't know.
00:13:23.860 How's the house?
00:13:24.680 You bought a house in Qatar?
00:13:25.640 It's a beautiful house and it was built by Israelis because I love Israel and I hate it.
00:13:32.180 Back to you, Morgan.
00:13:34.260 That was definitely somewhere between.
00:13:35.540 We love America.
00:13:36.580 That's what we're doing here, folks.
00:13:38.100 We're doing great.
00:13:38.940 Back to you.
00:13:39.360 Brilliant performance, but it's somewhere between Tucker Carlson and Mickey Mouse, for sure.
00:13:44.460 I am happy to see better Christmas decorations up at the White House compared to Dr. Jill Biden's Christmas decorations.
00:13:56.700 Maureen, you didn't do this when you said doctor.
00:13:59.620 Doctor.
00:14:00.580 Doctor.
00:14:01.800 Doctor.
00:14:02.420 I'm fighting back for having to say that.
00:14:04.060 My apologies.
00:14:05.980 We have about 45 seconds.
00:14:07.920 You know, when we come back, what are we doing when we come back?
00:14:10.020 Do you want to do some more with the crowd?
00:14:11.480 Do you have any surprise?
00:14:13.380 Well, I actually want to tell you guys that you need to go to birchgold.com/Bannon or text Bannon to 989898.
00:14:25.280 Wait, wait.
00:14:25.640 What is it?
00:14:26.340 Bannon?
00:14:26.840 Text the word Bannon.
00:14:27.760 B-A-N-N-O-N to what?
00:14:31.060 989898.
00:14:32.300 Now, through December 22nd, for every $5,000 of gold you purchase, Birch will send you an ounce of silver.
00:14:41.600 By the way, an ounce of silver is like 65 bucks right now.
00:14:44.420 Once again, I should say it as Dave Bratt says it.
00:14:46.220 He's like, Bannon to 989898.
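Taking the on-air figures at face value (the $65-per-ounce silver price and the $5,000 qualifying purchase are the hosts' quoted numbers, not verified market data), the promo works out to a bonus of a bit over one percent:

```python
# Rough value of the promo as described on air. Both inputs are
# figures quoted in the segment, not verified prices.
silver_price_per_oz = 65.0    # "an ounce of silver is like 65 bucks"
qualifying_purchase = 5000.0  # gold purchase per free ounce of silver

bonus_percent = silver_price_per_oz / qualifying_purchase * 100  # 1.3%
```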
00:14:50.560 All right, folks.
00:14:51.060 We're going to come back.
00:14:51.800 We have a lot more show for you.
00:14:52.860 War Room coming up in two and a half minutes.
00:14:55.020 We'll be right back.
00:14:55.640 We'll be right back.
00:15:26.120 You see, smart people diversify and have a hedge.
00:15:30.260 That's why I encourage you to buy gold from Birch Gold.
00:15:33.020 With the rate cuts from the Fed in 2026, the dollar will be worth less.
00:15:37.220 And what happens if the AI bubble bursts?
00:15:40.240 Diversify.
00:15:41.200 Let Birch Gold Group help you convert an existing IRA or 401k into a tax-sheltered IRA and physical gold.
00:15:49.820 Let me repeat that.
00:15:51.080 Physical gold.
00:15:52.080 And for every $5,000 you buy, you'll get an ounce of silver for your stocking or for your kids.
00:15:58.520 What a great way to teach them about saving smartly.
00:16:02.840 Just text my name, Bannon, B-A-N-N-O-N, to 989898 to claim your eligibility for this offer.
00:16:09.840 Again, text Bannon, B-A-N-N-O-N, to the number 989898 today.
00:16:15.580 Because Birch Gold's free silver with qualifying purchase promotion ends the 22nd of December of this year.
00:16:24.860 Make sure you text Bannon to 989898.
00:16:28.700 Do it today.
00:16:29.220 Here's your host, Stephen K. Bannon.
00:16:37.580 So, earlier, Eric asked,
00:16:40.680 War Room.
00:16:44.540 I need a hype man.
00:16:45.600 As you know, I'm a real ray of sunshine most days.
00:16:49.040 My hype woman was out here somewhere.
00:16:51.060 I think she left.
00:16:51.820 No.
00:16:51.960 I want to, real quick, if I could give a shout-out to the sisters.
00:16:56.520 Please.
00:16:57.880 Shout-out to the sisters, ladies and gentlemen.
00:16:59.760 Please.
00:17:01.240 All right.
00:17:01.640 Earlier, Eric asked the audience if you are for or against AI.
00:17:06.560 Who is a big fan of AI, artificial intelligence?
00:17:12.240 Traitors.
00:17:13.600 Traitors.
00:17:14.720 How much money are you making from it?
00:17:17.320 And look at this guy.
00:17:18.760 Traitor to the human race.
00:17:20.380 Wait a minute.
00:17:21.240 Wait a minute, wait a minute, wait a minute.
00:17:23.220 Why am I a traitor?
00:17:24.640 Because I want to ask an AI bot something I can't figure out by myself.
00:17:29.380 Okay.
00:17:29.980 Just play it forward in your mind for a moment.
00:17:32.940 Already, most human culture and a lot of human delusion is mediated through machines.
00:17:39.240 But at least it's human to machine to human or human to human madness.
00:17:44.200 The future we're looking at is one that just goes machine to human.
00:17:48.420 Can you imagine in five, ten years when artificial intelligence is turned to by every human on earth, or at least a majority, for what is real?
00:17:57.620 What happens then?
00:17:58.800 What happens when everyone looks to the screen to ask, how do I get well?
00:18:04.260 God, do you love me?
00:18:06.240 Wait, wait.
00:18:07.060 So, Joe, let me ask you this.
00:18:08.860 Would you rather have an AI program telling your surgeon what's wrong with you, or having him, I don't know, guess from a radiology scan, from an x-ray?
00:18:19.420 I think that any tool that can enhance a human's ability is on the table.
00:18:25.280 Even if, as a writer, I think that AI is just completely verboten.
00:18:29.440 Any writer who uses AI should have to put AI as a co-author and leave their name on it as a mark of shame.
00:18:36.840 But a doctor is a very different story.
00:18:38.640 A soldier is a very different story.
00:18:40.140 So, to the extent that a doctor can use it, yeah, but if your doctor, if your surgeon is literally taking instructions from an AI, you've got bigger problems than cyborg theocracy.
00:18:51.440 All right, all right.
00:18:52.760 And now, real quick, how many of you are revolted by artificial intelligence?
00:18:58.360 Let's get a boo!
00:18:59.880 Can we ask why?
00:19:01.000 Ask why?
00:19:01.840 Why?
00:19:02.220 Why?
00:19:03.140 I love AI.
00:19:04.640 Oh, she's on my side.
00:19:07.200 She's on team bowling.
00:19:08.840 I am from Oregon.
00:19:10.220 My name's Janice Daniels.
00:19:11.680 But I find so many great uses, like when I get a document from the car company and it's got tons of fine print.
00:19:19.460 I just file it into AI and it summarizes.
00:19:22.520 Summarize?
00:19:23.000 You send a five-page document with legalese and say, summarize this in one paragraph where a fifth grader can read it.
00:19:30.720 And it can check the math, too.
00:19:33.080 It checks the math.
00:19:34.080 Ben, are you pro-AI or anti-AI?
00:19:36.640 I'm mixed.
00:19:37.640 I've got to be honest.
00:19:38.400 I'm mixed.
00:19:38.860 I'm with you on the potential downfalls, Joe, where it could lead to.
00:19:44.420 But I also see the value in things like that, where it makes your life easier.
00:19:48.880 And the bigger issue I see is I don't see us stopping it.
00:19:52.660 And so then the question becomes, how do we use it in a way or control it in a way that adds value but doesn't take away from our society?
00:20:00.380 Real quick on that.
00:20:01.520 You're wearing an ice hat.
00:20:02.780 Yeah, yeah.
00:20:03.100 I hope you guys don't mind.
00:20:04.260 I hope you're not offended.
00:20:05.280 And real quick, so on that note, I can remember basically my entire adult life being told that mass immigration was unstoppable.
00:20:13.360 This is just the future, the mixture of all people.
00:20:17.380 Well, obviously there are solutions, right?
00:20:19.980 So I think the whole narrative that AI is just the future, it's a good way to get you to roll over and take it lying down.
00:20:27.020 Wait, hold on.
00:20:27.660 I think there are controls that need to be in there.
00:20:29.860 I'm just saying AI is a broad spectrum idea, artificial intelligence on some level.
00:20:36.380 I mean, that's really computing at the base level.
00:20:39.080 I don't think you get rid of all of that.
00:20:41.200 But I do think that we need guardrails.
00:20:43.080 And we do need to know what is AI and what's not.
00:20:45.000 All the videos that are out there on social media right now, it's like my mom is bringing this stuff to me.
00:20:50.220 And I'm like, Mom, that's obviously AI.
00:20:52.120 But she doesn't know it.
00:20:53.180 Ben, there are a couple of things.
00:20:54.360 So number one, there's new technology, Joe, that if you put a video up on the blockchain, you know if it's legit or not.
00:21:02.260 Yeah, that's true.
00:21:02.800 So we'll get better and better at that.
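For context on the verification idea just mentioned: the usual mechanism is publishing a cryptographic digest of a video so anyone can later check that a copy of the file is unaltered; recording the digest on a blockchain mainly makes the publication timestamp hard to forge. A minimal sketch (the function names are illustrative, not any specific product's API):

```python
import hashlib

def video_digest(data: bytes) -> str:
    """SHA-256 hex digest of a video's raw bytes. Publishing this
    digest (e.g., in a blockchain transaction) lets anyone later
    verify that a copy of the file is bit-for-bit identical."""
    return hashlib.sha256(data).hexdigest()

def matches_published(data: bytes, published_digest: str) -> bool:
    # A copy checks out only if its digest equals the one recorded
    # at publication time; any edit to the file changes the digest.
    return video_digest(data) == published_digest
```

Note this only proves a file matches what was published; it says nothing about whether the original footage was authentic in the first place.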
00:21:06.380 But which would you rather be in, Maureen and Joe?
00:21:09.580 An Uber car that's driven by an illegal that can't speak English or a Waymo, an AI car?
00:21:19.220 So I guess technically Uber is already AI, algorithmic immigrants.
00:21:25.200 Very nice.
00:21:26.600 Personally, even though I think that any of these algorithmic immigrants should already know how to get where they're going and not need to follow it like an ant following a pheromone trail, I would take the human a thousand percent.
00:21:41.460 I had a woman take me or send me on my first Waymo ride the other day.
00:21:46.820 And I think the absence of the human, the absence even of someone who can't even speak to you.
00:21:51.440 Is it proven to be safer, though?
00:21:53.080 It doesn't, you know, a lot of safe and effective.
00:21:55.200 Don't give me that.
00:21:55.740 I don't care.
00:21:57.040 Human life is so much more than quantification.
00:22:00.460 Yeah, but Joe, we've seen all these, once again, accidents with illegal immigrants that get CDLs that are causing death and destruction because they can't read a sign on the road.
00:22:13.660 So I don't know if I'd rather be in a car that had no driver.
00:22:18.240 Or, you know, I'm in fear for my safety with someone who's an illegal immigrant behind the wheel because I don't know if they can read the sign.
00:22:27.500 I've got an idea.
00:22:28.600 I've got an idea for a solution.
00:22:30.180 How about neither robots nor illegal immigrants, and American workers having the privilege?
00:22:34.920 I thought Joe was going to go there because it's kind of a straw argument because the answer is neither, right?
00:22:41.380 I mean, at some point, we as a society get to decide.
00:22:45.980 And that's why we're here: to compete in the arena of ideas and to say these are the ideas we like, or these are the ideas that we don't.
00:22:54.000 By the way, I hope you don't mind.
00:22:55.260 This is my favorite hat.
00:22:56.740 This is my favorite Christmas gift.
00:22:58.240 I got this from Houston Ice this year.
00:23:00.520 And if you're offended, good.
00:23:02.720 We're coming for you.
00:23:03.860 You want to hear from some of the folks?
00:23:06.900 You want to Waymo, an AI-driven car, no human being, or a human driving your taxi?
00:23:15.040 I'd rather have humans until they can prove the AI works.
00:23:19.140 I'm kind of a mid on AI.
00:23:20.600 I think it'll do some things well.
00:23:22.220 Not as good as advertised, but also not as bad as advertised.
00:23:26.520 I remember you were invoking the dark future in which your surgeon asks ChatGPT what to do.
00:23:32.280 I think the worse that sort of thing gets, the more the culture will notice and adapt to it.
00:23:37.520 I heard an example from a history teacher who assigned his students,
00:23:41.480 please generate me a report using ChatGPT and then do your own research and tell me how much it got wrong.
00:23:48.260 And the students came back and said, well, now I no longer trust AI.
00:23:51.720 So the worse it gets, the more people will notice and teach everyone how wrong it actually is.
00:23:57.800 Wait, a question for you guys.
00:23:59.940 With AI, automated driving, driverless cars, Maureen,
00:24:05.840 won't that mitigate some of the DEI and the woke measures that this country and this world wants to put in?
00:24:13.160 We want to have a certain number of immigrants.
00:24:15.740 We want a certain number from this minority driving, becoming surgeons, getting into the schools.
00:24:23.200 The more it goes AI, it becomes meritocracy again.
00:24:26.980 Does it not?
00:24:28.140 No.
00:24:30.020 No.
00:24:30.660 I mean, just consider for just a moment.
00:24:32.960 But the question was for Maureen.
00:24:34.500 Go ahead, Joe.
00:24:35.180 Go ahead.
00:24:35.620 Take it.
00:24:36.240 Are you mansplaining, Joe?
00:24:38.100 Are you mansplaining now?
00:24:39.720 Whoa, whoa, whoa.
00:24:40.560 You're going to get me hit over here.
00:24:41.720 To the extent that AI allows for, look at an Uber, for instance.
00:24:48.300 The only reason that driver who has no idea where he is has no idea how to communicate with you
00:24:53.440 is because algorithms are feeding him his instructions.
00:24:56.720 So I think the more that human beings turn the attention and energy towards machines,
00:25:02.880 the less important the humans will be.
00:25:05.900 I think we need better surgeons, not better AI.
00:25:08.740 We need better drivers, better teachers, better citizens, not better AI.
00:25:13.860 That's a pretty firm position, though.
00:25:15.780 I have to agree with Joe, even though he was mansplaining.
00:25:19.300 I do have to agree with him, though.
00:25:20.900 We need to get away from AI, but we also need to set the bar higher
00:25:26.660 and not allow every illegal that came into the country under the Biden regime
00:25:31.740 to be able to get a license or go to school on our dime
00:25:39.460 and then pass into a position that they're not qualified for.
00:25:44.660 So I think that we need to get back to U.S. citizens getting driver's licenses,
00:25:50.440 U.S. citizens being able to go to college and do all of these things, and get away from AI.
00:25:57.440 I've got a writer over here that wants to jump in on the conversation if we can.
00:26:02.360 Young lady.
00:26:02.980 Hi, Ben.
00:26:03.960 Teresa from Sandpoint, Idaho.
00:26:06.100 And I, so, Sandpoint, Idaho, Teresa, I want to comment on Joe's comment about AI.
00:26:11.700 As a writer, it terrifies me because it's going to take away our individual voices.
00:26:17.620 You cannot duplicate that, and that's what makes all of those thousands and millions of books out there
00:26:24.840 valuable to someone because they're attracted to that voice.
00:26:28.080 But we're all going to blend into one sound, and how are we going to know what's real?
00:26:32.800 I think someone mentioned videos.
00:26:34.480 We don't even know if that's real or not anymore, and that's frightening.
00:26:38.020 So, no, I'm not a fan.
00:26:40.280 Oh, yeah.
00:26:41.100 Oh, yeah.
00:26:41.380 By the way, here's a glass of Lake Pend Oreille wine to you, madam.
00:26:46.900 Yeah.
00:26:47.600 Sandpoint, Idaho, a fantastic place.
00:26:49.380 They'll be the last to go when the cyborg revolution really gets going.
00:26:52.300 Yeah, it's a good place to hide out.
00:26:53.720 It is a good place to hide out up in Sandpoint.
00:26:55.580 Anyone else want to jump in on the AI conversation?
00:26:57.560 Wait, wait, wait.
00:26:57.840 We have a good one.
00:26:58.420 We have a good one.
00:26:58.880 Okay, okay.
00:26:59.500 Come on, come on, come on.
00:27:00.440 You have a great one.
00:27:00.800 Don't be shy.
00:27:01.580 Don't be shy.
00:27:02.100 No, what I was saying is, Uber and Lyft, a lot of Americans actually work those jobs, especially after inflation.
00:27:08.900 So, we just have to get rid of the illegals, but a lot of Americans actually use that as a source of income because they can't get by on just their one job.
00:27:16.280 So, I don't think taking it away is a good idea.
00:27:18.740 I kind of agree with you guys on that.
00:27:20.380 Just no illegals.
00:27:22.220 Yeah, it's common.
00:27:23.360 I mean, who thinks it's a good idea to give a driver's test in 17 different languages?
00:27:29.140 When you have to read the sign that says, do not enter, or one way, you can't read it.
00:27:33.800 So, I actually, Eric, I saw a video recently, and it was a driver of an 18-wheeler, and he was stopped on the side of the road.
00:27:42.080 So, a state trooper came up next to him and was asking him questions, and he did not know how to answer because he wasn't from the United States.
00:27:48.880 He was from China.
00:27:50.300 And so, the state trooper went over the different road signs.
00:27:54.840 The only one he knew was stopped.
00:27:56.500 He couldn't tell you anything else.
00:27:58.040 So, someone that cannot understand a road sign should not be behind the wheel.
00:28:04.160 Point blank, period.
00:28:05.620 Good.
00:28:08.960 So, we're about to go to commercial break.
00:28:11.940 However, I know my dad talks about this, but who doesn't have Home Title Lock?
00:28:17.780 Well, you guys can go to HomeTitleLock.com, use promo code Steve, and I'll talk more about it when we come back from the commercial break.
00:28:28.760 If you're a homeowner, you need to listen to this, so listen up.
00:28:33.720 In today's artificial intelligence and cyber world, scammers are stealing home titles with more ease than ever.
00:28:41.640 And your equity, the equity in your home, your life savings is the target.
00:28:47.760 Now, here's how it works.
00:28:48.720 Criminals forge your signature on one document, use a fake notary stamp, pay a small fee with your county, and boom, your home title has been transferred out of your name.
00:29:01.180 Then they take out loans using your equity or even selling your property.
00:29:06.180 You won't even know it's happened until you get a collection or foreclosure notice.
00:29:11.900 So let me ask you, when was the last time you checked your home title?
00:29:17.740 If you're like me, the answer is never.
00:29:21.380 And that's exactly what scammers are counting on.
00:29:24.340 That's why I trust Home Title Lock.
00:29:26.940 Before I met them, I never checked on this.
00:29:29.420 Now I'm safe, and now I'm secure.
00:29:32.400 Use promo code Steve at HomeTitleLock.com to make sure your title is still in your name.
00:29:37.720 You'll also get a free title history report plus a free 14-day trial of their $1 million triple lock protection.
00:29:47.940 That's 24-7 monitoring of your title, urgent alerts to any changes, and if fraud should happen, they'll spend up to $1 million to fix it.
00:29:58.720 Go to HomeTitleLock.com now.
00:30:00.940 Use promo code Steve.
00:30:02.640 That's HomeTitleLock.com.
00:30:04.980 Promo code Steve.
00:30:06.180 Do it today.
00:30:08.000 Do it now.
00:30:10.460 Here's your host, Stephen K. Bannon.
00:30:17.300 All right, War Room Posse.
00:30:19.760 Let's hear it for Ben Berkwam.
00:30:23.760 Ben Berkwam has ventured to the border, to the Darien Gap.
00:30:28.840 He is a man who has fierce courage.
00:30:31.780 Now, would you like to see Ben Berkwam replaced by a robot?
00:30:34.980 All right, it would be safer.
00:30:36.880 It might be, we call it Bot Berkwam.
00:30:39.140 Well, I'll tell you what, Joe.
00:30:40.680 The one thing that the competitive advantage, and people always ask me, how do you do what you do?
00:30:46.620 I'd say the biggest advantage I have is authenticity.
00:30:49.880 I don't think you can have a robot do that.
00:30:52.080 And back to the mainstream media, the reason why so many people are leaving the fake news and coming to War Room and coming to Real America's Voice is because of the authenticity that we have.
00:31:01.560 You don't get that from a robot AI.
00:31:03.640 But I am concerned that a lot of jobs, and I was out there with the Starbucks baristas that were demanding their rights, and I went up to them and I just told them, I was like, so you realize you're just asking for your job to be replaced by a robot?
00:31:15.400 It is concerning, and you've got all these people that don't even know what they're asking for.
00:31:19.620 You know, the McDonald's worker who says, I want $15 an hour, and you go into the McDonald's, and now it's all kiosks.
00:31:25.780 Well, you got what you wanted, and now it's a robot.
00:31:29.340 But no, to answer your question, I don't think a robot could replace me.
00:31:33.260 Boom.
00:31:34.240 Ben Berkwam first.
00:31:35.640 Mic drop.
00:31:36.160 So, Maureen, I've got this issue.
00:31:39.660 Pretty much my entire personality has been digitized at this point.
00:31:43.660 Steve hired me, turned me into a robot.
00:31:46.040 All of my information is out there.
00:31:47.980 My social security number is digitized.
00:31:50.620 If I had a home, my title would be digitized.
00:31:53.720 What am I going to do in a world like this?
00:31:55.500 So if you're a homeowner, you need to listen to this.
00:31:58.640 In today's AI and cyber world, scammers are stealing home titles with more ease than ever.
00:32:06.700 And your equity is the target.
00:32:10.000 So use, go to hometitlelock.com and use promo code Steve, and you get a 14-day trial.
00:32:17.760 Go to hometitlelock.com, promo code Steve for your 14-day trial.
00:32:27.760 Boom.
00:32:28.680 So, Ben Berkwam, you got a question from the audience.
00:32:32.100 Well, actually, I just wanted to introduce somebody.
00:32:34.160 For those of you that watch War Room or any of the work we've done, this is a little badge of honor.
00:32:38.900 So I've got a sheriff's sergeant from California.
00:32:41.880 She used to not be able to say anything, but she was there when I got arrested for jumping the wall
00:32:46.440 at Gavin Newsom's mansion in California, along with Laura Loomer, to demand sanctuary.
00:32:51.880 And I walk in there, and she's like, what are you doing here?
00:32:54.620 But, Scar, I just want to ask you, now that you can say something, how bad is it in California,
00:32:58.740 and what's your prayer for 2026 going out of this?
00:33:02.560 Well, it is bad in California.
00:33:04.400 And so since then, I retired, and I promptly moved, and I moved to Free Florida.
00:33:11.280 So, very happy with that.
00:33:13.060 But I really hope that, I really just want people to wake up.
00:33:18.420 And that's why, you know, I listened to Ben.
00:33:20.860 I had already been following him.
00:33:22.380 But just wake up and stop being the sheeple they take us to be, and just pay attention.
00:33:29.920 And, of course, being from law enforcement, I've got my head on the swivel all the time,
00:33:33.340 and I'm very aware of my surroundings, what have you, but the general populace is not.
00:33:38.180 And they just follow and go along to get along.
00:33:42.600 Any recommendations, real quick, to people that are here?
00:33:45.040 A couple things that you should be doing anywhere you go in this age of jihad and radical leftist terrorism?
00:33:51.240 Well, you've got to have a plan.
00:33:53.740 You've got to talk about your plan with your family and your friends.
00:33:57.440 Know who your people are.
00:33:59.520 Get your go bag.
00:34:01.380 And whether you're going to bug in or bug out, you've got to know where to go.
00:34:05.100 And make sure your communication is good.
00:34:07.520 Make sure you have all the tools.
00:34:10.040 Water, we can survive without food, but we need our water.
00:34:13.540 And just, you know, continue to pray that the Lord will protect us
00:34:19.060 in all whatever happens throughout this country and support each other.
00:34:25.340 On behalf of the War Room and Real America's Voice News,
00:34:28.220 to all the sheriffs out there, law enforcement, ICE, Border Patrol,
00:34:31.300 everyone who swears that oath and upholds that oath, God bless you.
00:34:34.100 We stand with you.
00:34:35.660 Woo!
00:34:36.100 If you want.
00:34:40.060 Yeah.
00:34:40.820 We have a couple young people here who are going to weigh in on the AI debate.
00:34:44.340 Let's start with Dominic.
00:34:46.440 Hi, guys.
00:34:47.320 It's an honor to be here.
00:34:48.780 I just want to say with the AI situation,
00:34:51.680 I do believe that it can be used in a positive context for our country, manufacturing.
00:34:56.880 It can create jobs.
00:34:57.740 But at the same time, we have to make sure that our AI development is focused in America
00:35:02.200 and we can't have it outsourced to countries like China and Russia and have those exploited.
00:35:07.520 But at the same time, we need to be careful that it cannot get out of control in certain industries
00:35:12.060 until the point where it can become dangerous and make life-threatening decisions.
00:35:15.580 For example, if someone wants to say, how do we, just for example, just out there,
00:35:20.120 how do we tackle the climate problem or whatever?
00:35:22.900 And AI could say, well, the human race is the problem.
00:35:25.300 And then it could go after the entire human race.
00:35:28.660 So my thing is that we need to understand also that AI does not take over the human mind and its conscience
00:35:37.180 because at the end of the day, we're the ones that are bound to each other
00:35:40.600 and AI really doesn't have any connection to the human mind.
00:35:43.280 If I can just say that's a very well said, very well spoken, and very important perspective
00:35:49.460 because it's your problem a lot more than it is mine, right?
00:35:53.240 You're the one facing that future.
00:35:54.740 And here's Maria.
00:35:58.560 I'm more neutral to the not really problem of AI,
00:36:03.140 but if we don't come to a conclusion that this is not really therapeutic problem to solve,
00:36:11.940 but we need to come together and find out what's going on with the inside program
00:36:20.940 because we don't know what we are signing up originally.
00:36:25.600 This is a great point.
00:36:26.980 We don't know who, I know it's simulation, right?
00:36:29.260 So the AI bots get their, they understand human emotion by simulation, billions upon billions of simulations.
00:36:37.400 They'll look at pictures, they'll look at text, they'll see patterns in speech,
00:36:41.500 and they'll understand the emotion that way.
00:36:43.740 But garbage in, garbage out, right?
00:36:45.780 You've got to be careful who's teaching it, who's feeding what simulations to the AI bots.
00:36:51.560 Very quick, these guys want to be on camera.
00:36:53.000 I'm sorry, we'll go real quick.
00:36:54.660 What's your name?
00:36:55.600 Fletch.
00:36:56.340 What's your name?
00:36:57.080 Bryce.
00:36:57.620 Where are you guys from?
00:36:59.340 California.
00:37:00.080 San Diego.
00:37:00.580 All right, guys, you're on TV.
00:37:02.180 That's cool, right?
00:37:03.100 Back to you guys.
00:37:03.600 What are you guys looking forward to here at AmFest?
00:37:06.380 Seeing Tucker Carlson.
00:37:09.640 Yeah, Tucker Carlson.
00:37:11.060 You guys are Tucker fans.
00:37:12.540 All right, all right.
00:37:15.300 All right, back to you guys.
00:37:16.740 Back to you guys.
00:37:17.440 So, Joe, I can't ever keep up with you.
00:37:20.220 Where have you been recently, and what have you been doing?
00:37:23.760 Well, my last major stop was in San Francisco.
00:37:27.240 I attended a death cult ritual with former and even current transhumanists who are worried that AI will kill everyone.
00:37:37.140 And so they had a solstice celebration.
00:37:39.680 They're so defiant against God and nature, they celebrated the solstice a week before the actual solstice.
00:37:45.880 So the death cult ritual was quite interesting, quite moving.
00:37:48.900 A lot of people sad about AI killing everyone.
00:37:51.500 It got so sad at a certain point, I wished that AI would just come and kill me right then.
00:37:56.000 And before that, I was in St. Louis with the Tradcasts discussing technology, discussing AI.
00:38:03.840 But, you know, about six months ago or so, a little bit less, I was here in the Phoenix area.
00:38:10.600 You guys are familiar with Scottsdale.
00:38:12.820 Some of you locals are familiar with Scottsdale.
00:38:14.980 Have you ever been up to the Alcor Life Extension facilities?
00:38:18.660 Anyone?
00:38:18.880 It's a pretty neat little sidestop on a quest towards the mountains.
00:38:24.940 So Alcor Life Extension is just north of here.
00:38:29.040 And what they do, right now they have 252 human bodies that are frozen in suspended animation.
00:38:36.740 And the idea is that once technology, and especially AI, has come to the point that you can reanimate someone,
00:38:45.740 they will then be pulled from their containers, reanimated, and will shamble the earth like oozing zombies.
00:38:51.500 Sort of like you take a blueberry that's been frozen and you bring it back to life.
00:38:57.300 So it was actually quite interesting, though.
00:39:00.100 When I started talking about transhumanism, the president was very clear.
00:39:05.360 Sir, this is not about transhumanism.
00:39:08.700 This is about science.
00:39:10.380 And it validated a prediction that I had.
00:39:13.760 Even when we first started talking about transhumanism on the war room four and a half years ago,
00:39:18.980 I said it was already an out-of-fashion term.
00:39:21.740 In the future, we won't call it transhumanism.
00:39:25.180 We'll just call it science and technology.
00:39:27.960 But anyway, anyone who's worried about death and anyone who would like to be reanimated,
00:39:32.080 Alcor Life Extension, just promo code JOBOT.
00:39:37.120 Promo code Steve Bannon War Room, right?
00:39:41.100 You had me at oozing.
00:39:42.040 Who loves the war room here?
00:39:45.540 Leland, very quick.
00:39:47.360 Yeah, hi, it's Leland from South Louisiana.
00:39:49.280 And I've got to ask you, Maureen, do you have a middle name, point of personal order?
00:39:54.020 I do have a middle name.
00:39:55.200 It's Maureen Elizabeth.
00:39:56.700 So instead of SKB, it's M-M?
00:39:59.640 M-E-B.
00:40:00.400 M-B-B.
00:40:01.240 M-E-B.
00:40:02.140 M-B-B.
00:40:02.780 That sounds good.
00:40:03.780 So I see you've got bowling walking the floor.
00:40:05.440 Even SKB can't get that done.
00:40:08.580 I love Steve.
00:40:09.720 Steve Bannon is my true North MAGA.
00:40:11.740 You know that, right?
00:40:12.600 And I'm mine, too.
00:40:13.460 Back to you guys.
00:40:14.560 Ben.
00:40:14.860 Yeah.
00:40:15.600 So, by the way, you had me, Joe, at oozing flesh.
00:40:18.600 That's just, your description.
00:40:20.660 That was not AI, was it?
00:40:22.260 You came up with that yourself.
00:40:23.740 I told Joe this earlier, that every time he is on War Room, he scares the audience half to death.
00:40:33.180 Yeah, you should be a jerk.
00:40:33.760 All the live chat, you're scaring the entire audience.
00:40:37.060 But you guys all need to hear what Joe has to say.
00:40:40.040 You should be a fiction, well, I guess it would be a non-fiction horror writer.
00:40:43.860 You know, maybe you could call me a futurist.
00:40:46.220 A futurist is basically a science fiction writer that has fancy graphs.
00:40:49.780 Okay.
00:40:50.560 All right.
00:40:50.840 I want to jump in, bring in another California native over here.
00:40:54.060 Redlands, California.
00:40:55.020 Greg, what's your question?
00:40:57.180 I don't think that any of you are considering this dynamically.
00:41:00.140 Not just what AI can do today, but a year from now, five years from now, or ten years from now, what can and can it do?
00:41:10.540 So, when you said earlier, Ben, they can't replace me.
00:41:14.420 Want to bet?
00:41:15.900 And when AI, you already have AI girlfriends that are developing emotional attachments.
00:41:21.520 I don't.
00:41:22.160 Well, they exist.
00:41:23.980 They develop emotional attachments.
00:41:26.540 So, now let's expand that for ten years.
00:41:28.620 But, well, the question is this.
00:41:30.800 Elon Musk said recently that within 20 years, work will be optional.
00:41:36.700 If you want, like a hobby, like gardening, if you want to work, fine, but you won't have to.
00:41:41.840 So, what does a society look like where no one has to work?
00:41:46.840 And I think the answer is not very good.
00:41:48.900 Yeah, it's terrifying.
00:41:49.720 I think it's pretty clear.
00:41:51.600 And, by the way, good on you, sir, for your critical thinking.
00:41:54.280 It's going to serve you well and everyone in the next generation.
00:41:58.240 Yeah, Elon Musk talks about artificial superintelligence as the future god over all of the human race.
00:42:05.000 He talks about when AI has surpassed human intelligence, there will be only one entity in charge, and that will be a digital entity.
00:42:13.000 He says that in order to keep pace, we'll need to have Neuralink brain implants or some other kind of implant.
00:42:18.580 And, of course, he says that all work ultimately will be done by robots, meaning, as you say, human beings have zero economic value.
00:42:28.660 The masses of humanity under those conditions would have zero economic value, meaning you have zero negotiating power, meaning you're either going to be a pet or you're going to be biofuel.
00:42:40.520 But I'm not convinced that that's the future we're looking forward to.
00:42:44.880 I think it's something quite worse.
00:42:46.620 I think that waves of propaganda are going out saying that robots will do all the work in the future.
00:42:53.280 All the intellectual work will be done by artificial intelligence.
00:42:56.520 I don't think that's going to happen, but what I do think is they are demoralizing an entire generation that is growing up thinking that robots are basically going to be their babysitters or that they're going to be babysitting robots.
00:43:09.200 I think reality is going to have a lot more harsh conditions than we're being told, and I don't think they're being prepared for those harsh conditions.
00:43:16.900 Yeah, we've got another one over here.
00:43:20.360 Come over here.
00:43:20.960 Got to be in the camera, though.
00:43:22.220 I know this one.
00:43:23.160 Hey, Joe.
00:43:23.660 How are you doing?
00:43:24.540 So you're describing a future where people have no autonomy, no wealth.
00:43:28.520 What do they think they're going to do with all these people that have been disenfranchised with no motivation to create or do or be?
00:43:33.820 There are going to be a lot of frustrated people, and that could either have a possibility to encourage people to be excellent, but it's also going to cause a lot of people to be very angry and frustrated.
00:43:43.100 So what do they think they're going to do with all of us?
00:43:44.800 Have you heard any comments from the sociopathic overlords on that point?
00:43:48.940 Well, in the 20 seconds we have left, I just want to put a term out there for you: artilect war.
00:43:55.300 Hold your answer until we come back from the commercial break, Joe.
00:43:59.320 But between now and then, go to birchgold.com, use promo code Bannon, and I'll tell you more about it when we come back from the commercial break.
00:44:09.320 Imagine having the world's most connected financial insider feeding you vital information.
00:44:14.800 The kind of information only a handful of people have access to, and that could create a fortune for those who know what to do with it.
00:44:24.580 That's exactly what you get when you join our frequent guest and contributor, Jim Rickards, in his elite research service, Strategic Intelligence.
00:44:34.080 Inside Strategic Intelligence, you'll hear directly from Jim and receive critical updates on major financial and political events before they hit the mainstream news.
00:44:45.060 He'll put you in front of the story and tell you exactly what moves to make for your best chance to profit.
00:44:51.780 As a proud American, you do not want to be caught off guard.
00:44:55.800 Sign up for Strategic Intelligence right now at our exclusive website.
00:45:00.620 That's RickardsWarRoom.com.
00:45:03.120 RickardsWarRoom.com.
00:45:05.000 You go there, you get Strategic Intelligence based upon predictive analytics.
00:45:10.660 Do it today, right now.
00:45:12.920 RickardsWarRoom.com.
00:45:14.380 Here's your host, Stephen K.
00:45:18.500 Vance.
00:45:20.560 Welcome back, War Room Posse.
00:45:24.980 Mo, if you would just let me cry on your shoulder for a minute.
00:45:29.380 I have a real problem with spending money.
00:45:32.140 I have an even bigger problem saving money.
00:45:35.380 And if I do save money, I never know, do I put it into Bitcoin?
00:45:38.500 Do I put it into stocks?
00:45:39.980 Where do I put this meager amount of money I even have?
00:45:42.340 I think you should put it into gold.
00:45:43.860 And you can go to BirchGold.com.
00:45:46.520 Promo code Bannon or text Bannon to 989898.
00:45:52.400 And now through December 22nd, for every $5,000 you spend on gold, you get an ounce of silver.
00:45:58.720 And I don't know if you guys know this, but Philip Patrick will be here on set tomorrow.
00:46:05.300 So come back tomorrow morning.
00:46:06.980 You'll get to see Philip Patrick in person, live here at War Room.
00:46:12.040 And Ben Berkwam, we have, if you could restate the question.
00:46:16.020 A great question.
00:46:16.860 Great question.
00:46:17.460 And this is the big concern.
00:46:19.940 Yeah, thanks, guys.
00:46:20.940 I just wanted to know if our sociopathic overlords have stipulated what they're going to do with all the billions of disenfranchised people who are no longer economically viable,
00:46:29.560 allowed to be creative, allowed to contribute in any way, shape, or form because they've been replaced.
00:46:33.080 It seems to me that that might have a lot of unintended consequences, including passively uniting everyone.
00:46:39.320 So have they spoken to that?
00:46:41.220 Yeah.
00:46:41.640 Ben, could you just restate that?
00:46:43.040 I was having a very hard time.
00:46:45.120 So what are they going to do with all of us peons when we have no purpose anymore?
00:46:49.880 The overlords over AI.
00:46:52.300 Yeah.
00:46:52.500 Did I get that right?
00:46:54.180 Okay.
00:46:54.540 You would either be the source of training data and or you would be a pet or you would be biofuel.
00:47:02.660 If you have a society in which people are just allowed to breed with no use whatsoever and the machine is feeding them endless amounts of Soylent smoothies,
00:47:12.440 at some point or another, the people who run the system are going to think, well, why are we keeping them alive?
00:47:18.380 Maybe this generation, maybe the next, maybe the next.
00:47:21.220 I think to the extent anyone has convinced you that living on universal basic income in your pod with your soylent smoothie and virtual reality is any kind of future, you're already cooked.
00:47:32.260 And I know you are not, but I think that the plan, by and large, is to sell people on this utopian dream, and ultimately, I don't think it will come to realization.
00:47:42.720 It's going to be a real, real big problem.
00:47:46.740 Yeah, it is.
00:47:47.820 You know, one of the things, you look at all the people on welfare that expect that check every month from government and then when it doesn't come, the immediate violence that we see.
00:47:57.400 I mean, we saw that with SNAP, all these people that are just dependent on the government, and then they say, oh, if you don't give it to me, I'm going to go rob you.
00:48:04.180 I mean, we're heading towards some dangerous, scary times.
00:48:07.640 I don't see that as a positive potential future for this nation in any way, but it's scary.
00:48:14.220 Enough of us will make it.
00:48:15.660 Enough of us will make it.
00:48:16.740 Buy land, work the earth.
00:48:18.840 But think about how important technology has been.
00:48:22.040 And we're removing kinetic warfare.
00:48:25.420 We're using drone strikes instead of human beings.
00:48:27.840 These are wonderful developments, which wouldn't happen without high-level technology.
00:48:32.840 It's a tough one because, yes, it does save American lives for now, but as other nations also develop them, as you have more and more technology transfer, it doesn't take a really vivid imagination to see where this all goes.
00:48:45.440 It's fully autonomous weapon systems with people in charge of them with very little ethical reasoning.
00:48:52.620 Nightmare world.
00:48:53.800 Joe, I know you have to bounce, so can you give everyone your coordinates where they can find you?
00:48:59.080 Well, I am right now heading over to the rumble stage to answer even more difficult questions.
00:49:05.020 And I'm also simultaneously jumping into the time machine slash teleporter, and we'll be doing War Room tonight at 6 p.m.
00:49:14.620 War Room Battleground, Cyborg Theocracy, Visions of an All-Powerful Machine.
00:49:19.680 Stay tuned.
00:49:20.360 You're AI.
00:49:20.860 Or stay tuned over here.
00:49:22.480 Joe's got an AI robot.
00:49:23.980 He's got a robot.
00:49:25.200 He's got a double.
00:49:26.260 And I know, I believe we have Mike Lindell.
00:49:28.620 Mike, can you hear us?
00:49:30.340 Yes, I can.
00:49:31.760 I wish I was with you all.
00:49:33.840 I'm busy back here running for governor of Minnesota now.
00:49:39.520 Oh, Mike Lindell saying he's running for governor.
00:49:42.320 Mike Lindell running for governor of Minnesota now.
00:49:44.860 I think Minnesota's in good hands with Mike Lindell as governor.
00:49:48.180 That's just my theory.
00:49:49.840 Mike, tampons in the men's room in Minnesota under Governor Mike Lindell, yes or no?
00:49:57.200 Absolutely not.
00:49:58.220 You guys go to MikeLindellGov.com, and you'll see what we're really going to do to Minnesota.
00:50:02.860 And it's going to be amazing.
00:50:05.020 But I wanted to get on here, too, and give you guys the last day for the Christmas specials here.
00:50:10.220 You guys, at the promo code WARROOM, mypillow.com forward slash WARROOM.
00:50:16.100 But here's the five Christmas specials that we put all our flagship products on sale.
00:50:21.760 There they are.
00:50:22.460 The Giza Dream Sheets.
00:50:23.800 You've got the six-piece towel sets, $39.98, slippers, $39.98.
00:50:29.160 All these are the best Christmas gifts ever, you guys.
00:50:32.040 And you guys have supported mypillow through the whole year, so we're giving back.
00:50:36.120 I mean, with all these specials.
00:50:38.060 Go to mypillow.com forward slash WARROOM.
00:50:40.640 You guys see the crosses.
00:50:42.020 There's just a few of them left allotted for Christmas.
00:50:45.400 And then you have all the towels and the blankets that came in.
00:50:48.980 It's five different kinds of blankets all on sale.
00:50:51.680 We have over 250 products.
00:50:54.460 Do all your shopping right now.
00:50:56.280 Promo code WARROOM.
00:50:57.740 And then you guys see the, we also have all kitchen products.
00:51:00.900 And then we have the five, the Bible pillows.
00:51:05.100 Five of those.
00:51:06.060 You get five of them for $29.98.
00:51:08.500 That's less than $6 a gift.
00:51:10.780 Promo code WARROOM.
00:51:12.220 And if you guys call 1-800-873-1062, I'm going to take some of your calls myself.
00:51:20.240 And once again, promo code WARROOM, 800-873-1062.
00:51:26.620 And I wish I could be out there with all of you out there at Amfest, but we are out there
00:51:31.740 with Lindell TV.
00:51:33.320 And just know that I'm back here.
00:51:35.420 We're going to make Minnesota great again.
00:51:37.540 Trust me.
00:51:38.740 This is, we're in it, I'm all in, so.
00:51:43.520 Thank you, Mike.
00:51:44.540 And we'll see you tomorrow.
00:51:47.500 Eric.
00:51:47.620 Hey, folks.
00:51:49.080 Support the folks that support these shows, because that's why we can do it.
00:51:53.820 Support Rav.
00:51:54.600 Support Steve Bannon's WARROOM.
00:51:56.560 Maureen.
00:51:57.180 Support Mike Lindell.
00:51:58.320 Support Birch Gold.
00:51:59.560 These folks are really, they go a long way.
00:52:01.960 They go a long way to make sure we can bring this information to you.
00:52:06.520 Real information, not what you're getting in the left-stream media.
00:52:09.640 Eric, in the last minute or so, what are you looking forward to here at Amfest this weekend?
00:52:14.960 I'm literally, I'm loving this.
00:52:16.740 Just walking around this crowd is really inspiring.
00:52:19.740 Young people want to talk about conservative values again.
00:52:23.600 It's amazing.
00:52:24.200 And that's an honor to Charlie Kirk, to his legacy, to everything Turning Point USA has been doing.
00:52:29.440 And really, folks, Rav putting this stage up here means a lot.
00:52:33.260 And Steve and everyone coming out here as well means a lot.
00:52:36.300 I agree with you, and it is a bittersweet Amfest, because Charlie's not here.
00:52:40.520 But I think that he would be, I know he's smiling up in heaven seeing everyone here,
00:52:45.540 and that everyone is not staying silent, that they're continuing to fight.
00:52:49.760 So from here, I know main stage is starting very shortly, if they haven't already started already.
00:52:55.340 But we're tossing to just the news.
00:52:57.340 So stick around on Rav to just the news with John Solomon, or go over to Rumble or Getter to Bannon's War Room
00:53:05.700 for our 6 p.m. show, which Joe Allen will be the host of.
00:53:10.460 So he'll scare you guys with some more AI.
00:53:12.700 And come back here tomorrow morning at 10 a.m.
00:53:14.900 Do you owe back taxes or haven't filed in years?
00:53:17.920 Now is the time to resolve your tax matters.
00:53:20.900 With the national conversation around abolishing the income tax system,
00:53:24.160 the IRS is fighting back and proving it's here to stay by becoming more aggressive than ever before.
00:53:30.980 They're sending out more collection notices, filing more tax liens,
00:53:34.620 and collecting billions more than in recent years.
00:53:37.880 If you owe, the IRS can garnish your wages, levy your bank accounts, seize your retirement, even your home.
00:53:44.900 If you owe or haven't filed, it's not a question that the IRS will act.
00:53:49.840 It's when the IRS will act.
00:53:52.160 Now, right now, Tax Network USA is offering a completely free IRS research and discovery call
00:53:58.320 to show you exactly where you stand and what they can stop before it's too late.
00:54:03.940 Their powerful programs and strategies can save you thousands or even eliminate your debt entirely if you qualify.
00:54:10.540 Don't make a costly mistake.
00:54:12.800 Representing yourself or calling the IRS on your own waives your rights and costs you more money.
00:54:17.380 They are not on your side.
00:54:19.740 The IRS is there to get as much money as possible not to help you out.
00:54:24.040 Get protected the right way with Tax Network USA and start the process on settling your tax matters once and for all beginning today.
00:54:32.640 Call 800-958-1000.
00:54:36.060 That's 1-800-958-1000.
00:54:39.480 Or visit TaxNetworkUSA.com.
00:54:42.640 That's TNUSA.com.
00:54:44.700 Slash Bannon for your free discovery call with Tax Network.
00:54:48.800 TNUSA.com.
00:54:50.800 Slash Bannon for your free discovery call.
00:54:54.540 Or call 1-800-958-1000.
00:54:58.020 That's 1-800-958-1000.
00:55:03.060 Don't let the IRS make the first move.
00:55:06.720 Act today.