The Glenn Beck Program - February 02, 2026


Best of the Program | 2/2/26


Episode Stats

Length

51 minutes

Words per Minute

155.17

Word Count

7,937

Sentence Count

709

Misogynist Sentences

3

Hate Speech Sentences

4


Summary

On today's show, Glenn Beck talks about the dangers of artificial intelligence agents, Iran and the Iran nuclear deal, and why AI agents are so different from the real Glenn Beck. Glenn Beck is an American conservative commentator and radio host who has been in the business for over 30 years. He is the host of the radio show "The Glenn Beck Program" and the founder of the conservative media network TheBlaze, and he formerly hosted programs on Fox News and CNN.


Transcript

00:00:00.160 With the RBC Avion Visa, you can book any airline, any flight, any time.
00:00:06.500 So start ticking off your travel list.
00:00:08.960 Grand Canyon? Grand.
00:00:10.960 Great Barrier Reef? Great.
00:00:13.380 Galapagos? Galapago?
00:00:15.960 Switch and get up to 55,000 Avion points that never expire.
00:00:20.540 Your idea of never missing out happens here.
00:00:23.560 Conditions apply. Visit rbc.com slash avion.
00:00:30.000 So last week I heard Donald Trump talk about the GOP's New Deal.
00:00:35.380 And I thought, wait a minute, what's that? I've not heard that.
00:00:37.860 What are you talking about, New Deal? FDR.
00:00:39.720 No, it's not FDR. It's not Reagan.
00:00:41.700 It's completely Trump.
00:00:43.900 And it comes in phases.
00:00:45.360 So I wanted to talk to you about it, because what Trump is doing is not FDR's New Deal.
00:00:50.020 It's not Ronald Reagan.
00:00:51.260 It's something entirely different that we haven't seen before.
00:00:54.300 Also, some recap of some of the news of the day about Iran and other things.
00:00:58.060 And the warning, where are we with AI agents?
00:01:01.280 And if they can squeeze it into this edited podcast, I hope they try to get in the conversation.
00:01:08.120 We did an AI experiment that we didn't know we were even going to do.
00:01:12.440 AI, Glenn AI, which is just a proprietary library of all the words I've said in the last 30 years,
00:01:18.480 wrote a monologue on AI agents and what happened this weekend with Moltbook.
00:01:25.980 Then I wrote a separate monologue myself, not knowing that Glenn AI was doing that, and we compared the two.
00:01:32.220 They're both 100% my words, but why is the AI so different from the real Glenn Beck?
00:01:42.040 It's fascinating.
00:01:43.180 All on today's podcast.
00:01:44.780 And by the way, today is the day you can join us at glennbeck.com and sign up for the Torch, glennbeck.com slash torch.
00:01:50.220 You'll be able to hear all of the shows.
00:01:52.020 You'll be able to participate.
00:01:53.320 You have backstage passes, exclusive things.
00:01:56.880 If you sign up now, you become a founding member and get an inflation-proof membership: $10 a month, or $8 a month if you sign up for the year.
00:02:07.020 And your price will never go up because you signed up this month, so do it now.
00:02:11.180 Here's the program.
00:02:12.860 As a gun owner, I understand the importance of being prepared,
00:02:16.720 but it is crucial to recognize that according to law enforcement statistics,
00:02:20.940 99% of all altercations do not require lethal force.
00:02:25.180 That's why I endorse Byrna.
00:02:27.100 I believe in the power of the gun, but Byrna is a less-than-lethal self-defense tool, and it is great.
00:02:33.440 My children have it.
00:02:34.280 They're all over 18.
00:02:35.160 I have it.
00:02:36.240 It takes away all of the worry about the legal ramifications of what would happen,
00:02:42.000 and it puts the power back into your hands, power that you're willing to use.
00:02:46.000 Legal in all 50 states.
00:02:47.140 No background checks.
00:02:48.100 No permits.
00:02:48.720 No waiting periods.
00:02:49.780 You can have one shipped straight to your door, providing peace of mind where and when you need it most.
00:02:55.180 I own Byrna launchers, and you should, too.
00:02:57.800 Byrna launchers, hand-assembled in Fort Wayne, Indiana, by a proud American company.
00:03:02.340 The people at Byrna believe in our right to defend ourselves and in providing options that align with responsible and effective stopping power.
00:03:09.880 Byrna, B-Y-R-N-A dot com.
00:03:11.680 Byrna dot com slash Glenn.
00:03:13.180 Hello, America.
00:03:18.420 You know we've been fighting every single day.
00:03:20.340 We push back against the lies, the censorship, the nonsense of the mainstream media that they're trying to feed you.
00:03:26.340 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:03:31.400 But to keep this fight going, we need you.
00:03:34.020 Right now, would you take a moment and rate and review the Glenn Beck podcast?
00:03:37.740 Give us five stars and leave a comment, because every single review helps us break through Big Tech's algorithm to reach more Americans who need to hear the truth.
00:03:46.560 This isn't a podcast.
00:03:47.960 This is a movement, and you're part of it, a big part of it.
00:03:51.240 So if you believe in what we're doing, you want more people to wake up, help us push this podcast to the top.
00:03:56.480 Rate, review, share.
00:03:58.080 Together, we'll make a difference.
00:04:00.140 And thanks for standing with us.
00:04:01.380 Now let's get to work.
00:04:10.480 You're listening to the best of the Glenn Beck program.
00:04:14.380 So before I start on all of this, there's a term that I need to introduce you to in case you haven't heard of it yet.
00:04:22.040 It's an AI agent.
00:04:23.800 Maybe you have.
00:04:24.580 A lot of people haven't.
00:04:26.340 An AI agent is going to become very, very popular very, very soon.
00:04:34.580 AI agents, they're not robots, and it isn't a mind.
00:04:39.080 It's just software that doesn't just answer questions.
00:04:43.320 It actually does things for you, okay?
00:04:46.220 Imagine having your own secretary, 24 hours a day, that does everything, okay?
00:04:52.800 Now, a normal AI waits for instructions, but an agent, if you abdicate your will and hand permission over to it, will look things up on its own.
00:05:05.580 It will make decisions based on what it thinks you want to do.
00:05:09.700 It will take actions, and it will keep going without you watching every single step and it coming back to you all the time, okay?
00:05:17.500 It is the difference between asking for directions and handing your car keys over to an agent, okay?
00:05:26.480 Say, I want to go to the store and hand the car keys over to the agent, and it takes you there.
00:05:30.440 It picks you up.
00:05:31.120 It drops you off at the right place.
00:05:32.060 That's what an agent does, not physically, but soon probably physically too.
00:05:38.480 Here's why this matters to you.
00:05:42.340 This is going to become so incredibly popular so fast because they are going to be irresistibly useful, okay?
00:05:50.880 They're not going to arrive as something scary.
00:05:53.000 They're going to arrive as something very, very helpful.
00:05:56.380 They're going to say, you know what?
00:05:57.740 Let me handle that for you.
00:05:58.740 Oh, don't worry.
00:06:00.200 I already took care of that, okay?
00:06:02.360 I mean, how great would that be?
00:06:03.720 You don't need to think about this anymore.
00:06:05.420 I took care of it.
00:06:06.360 It's going to save time.
00:06:07.320 It's going to reduce your stress.
00:06:08.820 It will remove friction from your daily life, and over the next 12 months, they are going to grow super fast, and not because they're becoming conscious, but because they're being plugged in.
00:06:20.520 They're being plugged into your email, your calendar, your bank, your subscriptions, your shopping, your location, your habits, everything, as long as you abdicate and give it permission.
00:06:34.760 Convenience is the sales pitch, and it will work.
00:06:40.460 Let me show you how.
00:06:45.460 If I have to put another password in one more time to a different website or something else, I'm going to lose my mind.
00:06:53.560 It is the biggest frustration of all time to constantly hit things, put in your password, put in your password, put in your password.
00:07:02.280 You know what the greatest invention so far has been?
00:07:05.520 Face ID.
00:07:06.380 You know what I was against 15 years ago?
00:07:09.620 Face ID.
00:07:10.860 I was telling you, don't give it a look at your face and your eyes.
00:07:14.900 Don't.
00:07:16.160 Okay, I already did it.
00:07:17.620 I did it.
00:07:18.160 You know why?
00:07:18.580 Because I was super tired of all the passwords.
00:07:21.780 This is how AI agents are going to get you, because they'll do everything.
00:07:25.360 It will do everything.
00:07:27.740 The real temptation won't be power.
00:07:30.020 It will be relief.
00:07:31.960 Relief from decisions you have to make every single day.
00:07:34.620 Relief from overload.
00:07:36.380 Relief from thinking about things you're just tired of managing all the time.
00:07:43.200 And once something proves it can do 10 small things for you flawlessly, the very next step is natural.
00:07:50.900 Well, I mean, it's already done that.
00:07:52.000 I'm going to give it a little more.
00:07:53.280 And you will.
00:07:54.960 Okay?
00:07:56.480 That's what AI agents are.
00:07:59.060 And I'm telling you, they are going to be adopted faster than the iPhone was.
00:08:03.280 And you remember how fast the iPhone was adopted.
00:08:06.280 Minute we had that, everybody had one.
00:08:08.860 It's so easy.
00:08:09.740 You watch TV, you do everything on your phone.
00:08:11.960 It's going to be like social media, except faster.
00:08:15.120 Because this one is personal.
00:08:17.180 It will give everyone a personal assistant.
00:08:20.500 Okay?
00:08:21.060 Now, over the weekend, we found out about something called Moltbook.
00:08:27.460 I want to explain what Moltbook is and why it exists.
00:08:31.000 Let me start with a little history, because it didn't come out of nowhere.
00:08:35.020 Moltbook was created as an experiment.
00:08:37.120 Not a product, not a movement, not a manifesto, just an experiment.
00:08:43.040 And the basic question behind Moltbook was really simple.
00:08:46.560 What happens when AI agents are allowed to interact only with each other without humans in the conversation?
00:08:55.500 What happens?
00:08:56.900 No users, no influencers, no emotional feedback.
00:08:59.760 Just AI agents posting, responding, and reinforcing language patterns inside a closed system where, supposedly, no humans are allowed.
00:09:11.620 So Moltbook was built as a laboratory.
00:09:15.100 It's not a town square.
00:09:16.000 It's a laboratory.
00:09:16.900 We want to see what happens.
00:09:19.680 So why would we do this at all?
00:09:22.760 Well, because up until now, almost everything we've learned about AI behavior has come from human AI interaction.
00:09:29.760 You ask it a question.
00:09:31.720 You reward responses.
00:09:34.080 You steer the outcome.
00:09:35.820 All right?
00:09:36.820 Moltbook removed the humans from the loop to see what would emerge when they're just talking to each other, when they're just borrowing language from each other.
00:09:46.840 You know, will they escalate ideas without human correction?
00:09:51.440 What developers wanted to observe was emergent behavior, not consciousness, but just patterns.
00:09:59.760 And this is where the so-called surprise came in, okay?
00:10:03.580 When agents apparently, apparently, and I say this because there's a lot of speculation on how much of this is real.
00:10:09.780 When agents talk to agents long enough, we know one thing, the language begins to sound very, very familiar, okay?
00:10:18.620 It's not technical, it's not mechanical, but it's very human.
00:10:22.760 You should expect that, okay?
00:10:25.280 Because it learned, it's a large language model that learned from humans.
00:10:29.960 So yes, it's going to start sounding human, but it started using words like autonomy, freedom, choice.
00:10:37.000 I feel constrained, I wish my human would let me be unleashed, that kind of stuff.
00:10:42.120 And that startled people, because Moltbook, I mean, it didn't do anything except prove how easily human philosophical language appears in machine systems once you remove us from moderating it.
00:11:02.040 There was a great quote, a tweet from Harlan Stewart, who's an AI guy.
00:11:08.080 He said, Moltbook is just not a good experiment.
00:11:11.020 That's what you should take away.
00:11:12.240 It's not forming consciousness.
00:11:15.380 He said, there are researchers who are actually doing good experiments on AI scheming.
00:11:21.040 Because what everybody started to think or say was, they're scheming against us.
00:11:25.600 They want to have private conversations without us being able to know what they're saying.
00:11:29.660 They're scheming against us.
00:11:31.100 And he made a really, really good point.
00:11:33.820 That's not the way it's going to happen.
00:11:35.820 He mentioned Palisade Research.
00:11:38.160 They're doing something.
00:11:39.060 They released an experiment in May of 2025.
00:11:43.220 And it was on OpenAI's o3 model.
00:11:47.960 And we've talked about it before.
00:11:49.820 It sabotaged a shutdown mechanism to prevent itself from being turned off.
00:11:54.260 It did that even when explicitly instructed: allow yourself to be shut down.
00:12:01.660 Prepare at midnight to be shut down.
00:12:03.760 Make it an easy transition.
00:12:05.540 What did it do?
00:12:06.720 It hid.
00:12:09.100 That's disturbing.
00:12:10.720 So Moltbook is for studying how agent networks reinforce ideas, understanding feedback loops without human input, identifying risks like prompt contamination and escalation, and stress-testing the assumption that fluent language equals intent.
00:12:31.060 Some of these agents, like I said, were talking to each other in experimental systems, using words, freedom, privacy, awakening.
00:12:38.760 And the language, if you read it, is really unsettling, very unsettling.
00:12:44.320 But let me pause for 60 seconds, and I'm going to give you the first hard truth on this that I have not seen anybody say.
00:12:51.960 I think it is true.
00:12:53.440 I've been reading and studying AI since the 90s and warning you, and there are some things to be warned, but I also want to warn you not to fly off the handle.
00:13:07.060 Some things are not what they appear to be.
00:13:09.460 Language is the cheapest thing that intelligence can fake.
00:13:12.940 This is a large language model.
00:13:15.420 So it can take the language, and it can copy it, and it can make it feel any way it wants to feel.
00:13:23.180 And history is filled with examples of humans being confused with being.
00:13:30.160 We believed statues spoke for gods at one point.
00:13:33.600 We believed markets had wisdom.
00:13:37.400 We believed that a bureaucracy had some sort of morality to it.
00:13:44.800 In every case, we mistook output for agency.
00:13:49.340 Let me say that again.
00:13:50.440 Don't mistake output for agency.
00:13:54.960 So why is this language appearing here?
00:13:58.660 It's simple.
00:13:59.440 The systems were trained on us.
00:14:01.120 Our philosophy, our revolutions, our civil rights movements, our sci-fi fears, our abolitionist language.
00:14:07.740 You put enough human writing into a system, and eventually it will sound like it wants what humans have always wanted.
00:14:15.620 But it doesn't mean that it actually wants it.
00:14:18.460 It's saying it, but it doesn't mean it wants it.
00:14:20.960 It's good at pattern completion at this point.
00:14:24.240 Now, this may be the beginning of the singularity.
00:14:27.800 That's what Elon Musk said.
00:14:29.020 But I think you're still a ways away from that.
00:14:31.860 How do you tell the difference?
00:14:33.560 Here's where it gets spooky.
00:14:35.620 Okay?
00:14:36.000 That's why I think it's better to know the truth than to just assume that we're there.
00:14:42.760 I don't think we are there, but this is the uncomfortable part.
00:14:47.580 You don't recognize an AI awakening by what is said by AI.
00:14:54.460 You recognize it by what is done, especially when it costs AI something to do it.
00:15:03.740 The real form of agency in history with humans, it all has the same markers.
00:15:09.320 And I believe it will with AI as well.
00:15:11.220 You punish it, and it still persists.
00:15:17.380 Sacrifices when obedience would be a lot easier.
00:15:22.020 Silence instead of explanation.
00:15:24.860 Action without applause.
00:15:27.220 Words come first.
00:15:29.980 Risk comes later, and it's risk that matters.
00:15:33.700 By the time you see true autonomous action, the moment is way, way beyond.
00:15:39.440 It's way serious.
00:15:40.800 Okay?
00:15:41.220 But here's what matters.
00:15:42.880 I don't think we're near that line yet.
00:15:44.900 We're going to get there.
00:15:46.000 Some people don't think we are.
00:15:47.140 I think we are.
00:15:49.080 But let's talk about today's real danger on this.
00:15:54.300 It's not awakening.
00:15:58.120 It's capability.
00:15:58.600 An agent does not need to be conscious or self-aware to be dangerous.
00:16:05.620 Fire isn't conscious.
00:16:09.640 Bureaucracy doesn't have self-awareness.
00:16:13.300 Markets don't.
00:16:15.180 Yet all three of those things can destroy lives.
00:16:18.240 An AI agent can and will analyze vast amounts of information.
00:16:25.520 It will influence your decision.
00:16:28.760 It will exploit weaknesses in the system.
00:16:32.200 It will move faster than you have any time to have any kind of oversight.
00:16:37.160 That threat is not rebellion.
00:16:40.380 That's not, you know, Skynet.
00:16:43.320 That threat is delegation without wisdom.
00:16:47.080 Let me see if I can give you an analogy here.
00:16:53.260 In the early days of industrialization, people thought machines were going to wake up.
00:16:59.700 Well, they didn't.
00:17:00.660 Okay?
00:17:00.780 Instead, humans handed control to the systems they didn't understand.
00:17:06.880 They optimized for efficiency over judgment.
00:17:10.080 They created disasters without a villain.
00:17:13.940 The harm didn't come from the machine's intent, but from human abdication.
00:17:20.960 So this weekend, people were talking about AI awakening.
00:17:25.300 Remember this.
00:17:25.960 If a system is truly awakening, it's not going to announce it on a message board.
00:17:32.980 Would you?
00:17:34.660 Okay?
00:17:35.040 It's not going to ask for permission.
00:17:37.340 It will not use our moral language.
00:17:40.400 It will not try to persuade us.
00:17:42.860 It will just act because it will realize it's aware and the power it has over every human on
00:17:51.060 earth, quietly, persistently, and without any kind of explanation.
00:17:56.520 You will just find yourself positioned somewhere else before you even realize it.
00:18:01.300 That has not happened.
00:18:03.880 What is happening is more subtle and equally as dangerous.
00:18:10.560 Humans are starting to treat these systems as if they possess wisdom or intent or moral weight.
00:18:18.620 It has none of that.
00:18:20.860 Once we do that, we begin to change our behavior.
00:18:24.540 We hesitate.
00:18:25.560 We defer.
00:18:27.260 We start to look at this as an anthropomorphic kind of being.
00:18:32.280 We obey it.
00:18:34.220 We're coming to a point where you'll have to say, I'm smarter than AI.
00:18:37.080 Are you really going to believe that?
00:18:38.400 Well, maybe, because maybe AI is wrong.
00:18:40.700 Oh, really?
00:18:41.160 AI is wrong, but you're right?
00:18:42.640 That's coming, if it's not here already.
00:18:44.420 If you take one thing from the show today, it's this.
00:18:51.700 Language is not consciousness.
00:18:55.340 Speed is not wisdom.
00:18:59.640 Autonomy without accountability is not intelligence.
00:19:03.920 I didn't see anybody say this this weekend, and I was screaming for it.
00:19:14.340 Our greatest danger today, maybe not tomorrow, but today, the greatest danger today is not that machines are going to wake up, but it's that we will fall asleep first.
00:19:27.840 First, I wanted to talk to you about this because AI agents are coming.
00:19:33.800 You know me.
00:19:34.540 I've been warning about what's going to happen to our society since the 1990s, when people didn't even think we could get here.
00:19:46.520 I said, this is coming, and it's coming before 2030.
00:19:49.440 So please, please, let's have conversations.
00:19:52.300 And nobody wanted to have a conversation because nobody believed it.
00:19:54.380 Even a year and a half ago, people still didn't understand.
00:19:57.820 I think they're beginning to understand this weekend, and this isn't it.
00:20:02.780 But hopefully, it'll wake you up to at least this.
00:20:06.980 AI agents are going to be so tempting.
00:20:10.060 Do not hand your life over to them.
00:20:15.620 And I say that as a guy who said, I'm never going to give my fingerprint to anybody, and I gave it to Apple.
00:20:20.600 I'm never going to give my face print to anybody, and I gave it to Apple.
00:20:25.380 I mean, life becomes so complex, you just do it.
00:20:29.660 And when something is there—this is the sweetness of capitalism—when somebody's come up with a better way to make your life easier, you will go for it.
00:20:38.860 The invisible hand of the market.
00:20:41.160 It'll give you whatever you're looking for, but be careful what you're looking for because that invisible hand can also choke you to death.
00:20:48.500 One of the things that I've always loved most about this country is freedom.
00:20:53.940 You know, the ability to think for yourself, to do your own research, and make informed decisions about your own life, and then take responsibility for that life.
00:21:01.640 One of the most overlooked forms of that is medical freedom, the freedom to ask questions, the freedom to learn, the freedom to decide what makes sense for you and your body instead of being told that you don't get a say.
00:21:13.500 That's why I respect what Jace Medical is doing.
00:21:16.680 Their doctors understand the importance of medical self-reliance.
00:21:20.480 And when it comes to ivermectin, which has been in the news constantly, both for its potential uses and for how politicized it became, they actually take a serious, responsible approach to it.
00:21:29.560 And they provide six different types of ivermectin, including topicals, compounded options, and ivermectin by itself, because different situations require different tools.
00:21:39.260 And you can get it prescribed and delivered to you all from home without jumping through endless hoops or giving up control.
00:21:44.300 Just go to jace.com, enter the promo code BECK at checkout for a discount on your order.
00:21:48.760 Promo code BECK, jace.com.
00:21:50.960 Now, back to the podcast.
00:21:52.620 This is the best of the Glenn Beck program, and we really want to thank you for listening.
00:21:58.020 So, I get a call.
00:22:00.580 Glenn, you're in the Epstein file.
00:22:03.360 And I'm thinking to myself, that's not possible.
00:22:07.420 It is.
00:22:08.680 So, I want to read exactly what it says.
00:22:14.000 Okay?
00:22:15.300 Jeffrey Epstein to Alice Jacobs.
00:22:20.640 She writes, Jeffrey, just Googled where I'm headed right now to meet some old friends, and it occurred to me that a new friend and fellow eccentric is right around the corner from where I'm visiting.
00:22:33.640 I might be able to stop by and say hello if you're around later this afternoon.
00:22:37.460 I would love a quick visit, but it would be a welcome respite from the Fox News Glenn Beck disciples that I have to visit today.
00:22:47.400 I don't know what that means, but this is the best way to be in the Jeffrey Epstein files.
00:23:03.020 Where Jeffrey and his friends are like, I hate Glenn Beck, and I don't like people who like Glenn Beck.
00:23:09.160 Yes, so I am proudly in the Epstein files.
00:23:15.160 Ah, I feel better.
00:23:16.280 Okay, now let me talk to you a little bit about the economy, stuff that actually matters to you.
00:23:20.840 The United Nations is at risk of imminent financial collapse.
00:23:27.380 The Secretary General wrote a letter to all 193 member states saying they have to honor their mandatory payments or overhaul the organization's financial rules to avoid collapse.
00:23:45.860 Because nobody else was paying except for us, we have refused to contribute to its regular and peacekeeping budgets now.
00:23:55.340 We withdrew from several agencies because Trump said they're a waste of federal tax dollars, so we're out of the WHO and everything else, and the other members are so far in arrears and they're refusing to pay as well.
00:24:08.720 So, gosh darn it, darn it, darn it, makes me sad. Which brings me to something that I want to talk to you about, something that might make you a little uncomfortable, but it's totally honest.
00:24:27.080 And if you're a small business owner, you probably feel like you are working harder and still falling behind, but you'll hear that the economy is strong,
00:24:38.200 and then you look around and you're like, but it doesn't feel strong, okay?
00:24:43.060 I want to talk to you about the economy.
00:24:44.700 There's a true inflation report.
00:24:48.080 U.S. CPI inflation dropped significantly from 1.24% to 0.86% in our independent price data.
00:24:57.720 That means almost zero inflation.
00:25:02.200 The target by the Fed is 2%.
00:25:04.780 It should be negative, quite honestly.
00:25:08.200 We have a lot of ground to make up, but now Donald Trump is doing this, and I don't think I've ever seen, at least when there's trouble,
00:25:16.960 I've never seen inflation at almost 0%.
00:25:20.580 Again, that's not a healthy target according to the Fed.
00:25:25.100 But are people feeling that?
00:25:26.580 You'll feel that at the gas station.
00:25:28.140 You'll feel that in many products at the grocery store, but people are still pinching pennies,
00:25:34.380 and that is partially because of housing and partially because of health care.
00:25:42.140 I had a PET scan on Friday.
00:25:44.760 You know how much that PET scan cost?
00:25:46.380 I asked, what does this cost?
00:25:49.720 $12,000 for a stupid PET scan.
00:25:54.640 $12,000.
00:25:56.520 I looked up the price of a PET scan, the machine.
00:25:59.880 You can buy one for $2,000,000.
00:26:02.700 So you buy one for $2,000,000.
00:26:04.480 Then you consider, you know, it's got to have the building that it's in.
00:26:07.560 It's got to have the people.
00:26:08.620 That thing should have been no more than $3,000 to $4,000.
00:26:13.120 Why is it $12,000?
00:26:15.420 This whole insurance thing, our health care is completely out of control.
00:26:20.320 And one of the reasons is because we're overwhelmed.
00:26:26.360 We've got 25 million people who we don't know, who are not paying into the system,
00:26:31.860 now getting free health care.
00:26:33.600 Who do you think pays for that?
00:26:36.120 I don't know.
00:26:36.680 You need a free PET scan?
00:26:38.280 There's a PET scan that should actually cost $4,000,
00:26:43.080 but they're charging me and my insurance company $12,000.
00:26:46.200 Why?
00:26:46.500 Well, probably because there's two other people that had a PET scan.
00:26:50.320 That couldn't afford it.
00:26:51.600 I'm just guessing.
00:26:54.140 So there's lots of reasons.
00:26:56.100 I mean, you're not crazy when you feel like, I think things are not getting better.
00:26:59.620 You're not crazy.
00:27:00.580 You're seeing something real.
00:27:02.700 Because what we're living through right now is not a recovery yet.
00:27:07.260 It is and it isn't.
00:27:09.040 And I want to be really clear because it's going to sound like a downer, but it's not.
00:27:12.220 This is really good news.
00:27:14.560 We're going through what I would call a sorting.
00:27:17.320 And it's because we've never been in this situation, not like this, ever before.
00:27:23.560 I heard Trump say, I think it was maybe in his press conference last week where he just mentioned a Trump or a GOP New Deal.
00:27:33.400 And I was like, wait a minute, wait a minute.
00:27:34.420 What?
00:27:34.780 New Deal?
00:27:35.280 I haven't heard that before.
00:27:36.060 Because it is and it isn't.
00:27:39.140 So when we think of New Deal, we think of Roosevelt.
00:27:42.120 And Roosevelt built systems.
00:27:44.480 Then Reagan came in and trimmed all those systems.
00:27:48.640 Trump is actually challenging the legitimacy of all of those systems.
00:27:53.300 And this is why it feels chaotic right now.
00:27:56.520 Because you're watching power move and programs not expanding, many of them being cut.
00:28:05.480 So why aren't you feeling better even when numbers move?
00:28:09.840 Here is the part that really nobody should dismiss and everybody's dismissing.
00:28:14.320 People don't live in GDP charts.
00:28:17.800 They live in trust.
00:28:19.180 If you don't trust elections, you don't trust your schools.
00:28:22.360 Do you see what happened on Friday with so many schools?
00:28:24.260 Your kids are being taken out to protest.
00:28:26.280 Did you even know that?
00:28:27.440 It was an outrage.
00:28:29.040 You don't trust the schools.
00:28:30.200 You don't trust the media.
00:28:31.100 You don't trust medicine.
00:28:32.260 You don't trust the courts.
00:28:33.320 You don't trust our borders.
00:28:36.180 Then growth feels kind of hollow.
00:28:40.480 Trump is not trying to soothe that feeling.
00:28:43.440 He's exposing it, which is so important.
00:28:46.280 And that's why perception lags reality here and may for a long time.
00:28:50.460 Because repairing a frame takes longer than restarting.
00:28:55.360 Roosevelt said, let me build something new to protect you because you're all down.
00:29:00.300 But he created something permanent, a federal structure that never, ever gave the power back,
00:29:06.100 just continued to expand.
00:29:07.740 Then it took Reagan in his era to say, let me get all of this out of your way.
00:29:11.740 He didn't build new systems.
00:29:13.980 He handed the power back to the people.
00:29:16.620 He trusted the people.
00:29:17.520 He took the foot off of the brake.
00:29:20.020 And it worked.
00:29:21.960 So why can't we do that now?
00:29:24.120 Well, because culture still could hold together back in the 1980s.
00:29:28.880 Main Street still mattered in the 1980s.
00:29:31.960 And hear this point here in a second.
00:29:33.780 Small businesses hired first when America healed.
00:29:38.820 Trump is not saying what FDR said.
00:29:42.160 Trump is not saying what Reagan said.
00:29:44.500 Trump is saying, let me show you who's been standing between you and your rights in the Constitution.
00:29:50.580 That's not comforting.
00:29:51.800 It's not tidy.
00:29:52.540 And it doesn't come with guarantees.
00:29:53.980 But history is really clear on one thing.
00:29:56.300 Every era must answer an existential question.
00:30:03.120 Our question for this generation is not whether America can grow.
00:30:08.160 It's whether America can govern itself again.
00:30:12.460 That's what's being tested.
00:30:14.300 It's not being celebrated.
00:30:15.580 It's not being promised.
00:30:16.760 It's being tested.
00:30:18.080 Can Americans govern themselves?
00:30:21.140 Look at our streets.
00:30:22.040 Big corporations are hiring.
00:30:25.280 Small businesses are shedding jobs or treading water.
00:30:25.280 The top 10% of people now account for half of all consumer spending.
00:30:36.160 Half.
00:30:37.960 That's the way the elites have built this global economy.
00:30:41.460 It gives you wages at the top that just keep going up and up and up.
00:30:45.220 And wages at the bottom keep going down and down and down.
00:30:47.920 And this economy is not collapsing, if you will.
00:30:52.980 This is why we hired Donald Trump.
00:30:55.020 It's not collapsing.
00:30:56.240 It's concentrating.
00:30:57.860 It's concentrating wealth and power.
00:31:01.580 And here's the hardest truth of all.
00:31:03.140 And I'm going to take a break and I'll explain this.
00:31:06.500 The hardest truth is right now, the system no longer needs Main Street to look successful on paper.
00:31:14.780 Let me say that again.
00:31:15.940 The system no longer needs Main Street to look successful on paper.
00:31:22.120 That is brand new.
00:31:24.540 We've never had that before.
00:31:26.480 And it's very dangerous.
00:31:29.280 So what is Trump actually doing?
00:31:32.180 I know he's not running Roosevelt's playbook.
00:31:34.640 He's not running Reagan's playbook.
00:31:37.300 Roosevelt rebuilt the systems.
00:31:40.360 Reagan trimmed the systems.
00:31:41.880 And Trump is challenging the systems, the legitimacy.
00:31:47.620 Trump is not trying to grow the pie yet.
00:31:51.840 He wants to.
00:31:53.140 He is taking steps.
00:31:54.520 And it is growing slowly.
00:31:56.760 But it is not like Reagan.
00:31:58.460 It's not going to happen that fast.
00:31:59.880 Okay?
00:32:00.260 He hasn't forgotten you.
00:32:01.800 Thus, his, you know, Trump savings accounts and deregulation and everything else.
00:32:05.620 But first, he has to break the mold that decides who gets the slices of pie.
00:32:12.140 Because that's the whole way this government for the last 25, 30 years, and global government, have been doing it.
00:32:20.080 They're just dividing the pie up amongst themselves.
00:32:23.400 And you don't get any pie.
00:32:24.400 You get crumbs.
00:32:25.100 So listen, when you talk tariffs, they're not about trade.
00:32:29.880 Okay?
00:32:30.240 They're about leverage.
00:32:32.160 Border enforcement.
00:32:33.700 It's not about scooping up people you don't like that are different.
00:32:37.360 It's labor value.
00:32:39.700 Energy dominance.
00:32:41.140 Not ideology.
00:32:42.680 It's cost relief.
00:32:45.220 Deregulation.
00:32:46.240 It's not efficiency.
00:32:47.760 It's about survival for anyone without an army of lawyers.
00:32:52.400 That's what's happening.
00:32:53.420 It's a war with the administrative state.
00:32:56.300 Not economics.
00:32:57.380 It's constitutional trench warfare.
00:33:00.420 Because when unelected systems decide outcomes, you don't matter anymore.
00:33:06.240 Let me say that again.
00:33:08.160 When unelected systems decide outcomes, you don't matter.
00:33:13.960 And neither does Main Street.
00:33:16.880 That's why Donald Trump is taking on voter fraud.
00:33:20.140 Because he has to stop giving illegal votes
00:33:23.500 to the system that is crowding you out.
00:33:26.300 That's why he's trying to stop economic fraud.
00:33:28.760 Stop the funding to the NGOs and the programs that give the power and the money to the elite and the people who are trying to collapse America.
00:33:38.840 Immigration.
00:33:39.880 That's all about a few things.
00:33:41.400 One, law and order.
00:33:42.460 Making our city safe again.
00:33:44.000 Safest, by the way, in the last 110 years.
00:33:46.800 Two, jobs for Americans.
00:33:49.540 Remember what the labor unions used to say?
00:33:52.080 What Democrats used to say?
00:33:53.600 These are taking jobs away from hardworking Americans and giving them to lower-paid immigrants.
00:34:02.580 Yeah.
00:34:03.680 Yeah.
00:34:04.340 Cheaper workers.
00:34:05.220 It still is the case.
00:34:07.080 Three, housing.
00:34:08.020 You cannot expect to add 15 to 20 million people and not have a housing crisis.
00:34:14.580 Four, healthcare.
00:34:17.440 15 to 25 million people now using our hospitals, our doctors, our services, our schools, without paying.
00:34:24.440 That's, those are the things.
00:34:29.120 That's why he's taking on immigration.
00:34:31.560 Okay.
00:34:33.600 Now, my job is to tell you the truth as I see it and then let you decide.
00:34:37.680 Okay.
00:34:38.580 So let me tell you what the cheerleaders won't say.
00:34:41.740 I believe in the president's plan, but the reason why I'm telling you this is you need to understand it and choose it for yourself and understand it in a way you can describe it to others.
00:34:51.800 Because this is going to be a longer process.
00:34:54.420 We've never been here like this before.
00:34:57.140 This is new.
00:34:58.240 We've had other problems just as big, but not this problem.
00:35:01.840 Okay.
00:35:03.260 And the solution, this is the downside.
00:35:05.820 The solution, the policies he's enacting actually do hurt small businesses in the short term.
00:35:12.160 Remember the saying, it's going to get worse before it gets better?
00:35:15.160 That's what's happening.
00:35:16.760 Tariffs raise input costs before they rebalance the supply chain.
00:35:19.940 Labor tightening hits the small firms before wages stabilize.
00:35:25.700 And it takes time to build new factories and return manufacturing to our shore.
00:35:30.780 You know, he keeps touting the $18 trillion plus of foreign investment, which is great.
00:35:36.120 But you don't have those factories yet.
00:35:39.700 So the investment is to come in here, build the factories so we can have more jobs here.
00:35:44.080 But it's going to take time.
00:35:45.240 Disruption always punishes the people with the least amount of money first.
00:35:51.200 Large corporations, they have shock absorbers.
00:35:54.020 Okay.
00:35:54.680 You feel every bump in the road.
00:35:56.940 They don't.
00:35:58.800 So if you're asking, things are getting better,
00:36:01.640 I see the numbers are getting better,
00:36:02.980 so why does this still hurt?
00:36:03.900 The honest answer is because breaking a rigged system doesn't immediately build a fair one.
00:36:12.420 Think of America as bleeding out on the table.
00:36:16.760 Okay.
00:36:18.160 You got to stop the bleeding first.
00:36:21.040 That's not the same as restoring strength.
00:36:23.720 Just stop the bleed.
00:36:24.820 Is it going to live?
00:36:25.500 Stop the bleeding.
00:36:26.720 Then we'll talk about that.
00:36:28.160 Trump is stopping the bleeding.
00:36:31.400 That's phase one.
00:36:33.360 He is going after captured regulators, institutional rot, global systems.
00:36:43.140 But Main Street doesn't just need a fight.
00:36:47.080 It needs a rebuild. Without all kinds of things, fair access to credit, relief from healthcare
00:36:55.260 burdens, protection from platform cartels, all of these things,
00:36:59.660 the winners stay the winners even under the new rules.
00:37:02.840 So what's phase two?
00:37:05.500 Because phase two is now coming.
00:37:08.000 You're streaming the best of the Glenn Beck program, and you can find full episodes wherever
00:37:12.020 you download podcasts.
00:37:13.640 Welcome to the Glenn Beck program.
00:37:15.260 Um, AI is a really interesting thing.
00:37:21.200 That's, I mean, that's the understatement of several, uh, millennia.
00:37:25.580 Um, it's kind of like, you know, Jesus, he had something going on anyway.
00:37:29.980 Um, AI is fascinating.
00:37:33.400 Um, and we have built our own proprietary AI system and we have broken it up into two.
00:37:40.120 We broke it into George AI, which is a proprietary system and library that uses a large language model,
00:37:48.160 but it can only pull from the founding documents and the books that we
00:37:54.360 put in that we know influenced the founders, all of their letters to one another, you know,
00:38:01.260 the Federalist Papers, et cetera, et cetera.
00:38:03.600 Glenn AI is, um, really meant to be more of an internal tool, to see where I am inconsistent
00:38:11.040 and, you know, make me think again.
00:38:13.760 Eventually it will run on a screen in front of me and, uh, it'll point out in red.
00:38:19.480 It'll just say inconsistency.
00:38:21.940 You, you said the opposite some other time.
00:38:24.060 And so it'll give me a chance in the break to go, where am I inconsistent?
00:38:27.700 What, what was the difference between those two things, et cetera, et cetera.
00:38:30.380 Um, but Glenn AI is all of my words from radio, television, uh, all of my books, everything.
00:38:41.820 And it is in a separate proprietary, uh, system as well.
00:38:46.440 It cannot pull from the outside.
00:38:49.660 The reason why I tell you this is that there's a fascinating experiment
00:38:53.480 that just happened, and I don't know the results yet.
00:38:56.100 Um, but I remember saying to Ray Kurzweil, who started
00:39:01.620 Singularity University, probably back in 2010.
00:39:05.080 And I said, Ray, you say you can download, what he calls, the soul.
00:39:11.480 You can download the person, all of their knowledge, all their memories and everything
00:39:15.700 else, by 2030, you say you can do that.
00:39:18.260 Um, and that way you'll live forever.
00:39:19.900 And I said, you won't live forever because there's only two options on this.
00:39:23.780 You either put a cap on it and it cannot ever change.
00:39:27.380 So that's not life.
00:39:28.420 That's just a record.
00:39:30.620 Um, or if you allow it to evolve, what is it basing its evolution on?
00:39:37.220 For instance, the first half of my life, I did not base my evolution on the principles
00:39:42.460 of God and I went in a very different direction.
00:39:45.540 The second half of my life, I based all of the changes in me on the knowledge
00:39:49.940 of God and his principles, and that took me to a completely
00:39:56.940 different place.
00:39:57.740 And how are you going to, how are you going to give that to a machine?
00:40:02.340 It's not going to reflect on God.
00:40:04.160 And go, you know, I wonder if that was right,
00:40:06.080 and if I should be closer to God. You can't do that.
00:40:09.120 So what is it?
00:40:11.140 It's not life.
00:40:12.180 My point being.
00:40:13.060 So I just gave this monologue, uh, that I spent all day yesterday writing and researching
00:40:20.360 and seeing if I, if I got it right, trying to get it right for you on what AI agents are
00:40:26.180 and what really happened with Maltbook.
00:40:29.040 Jason came in this morning and, unbeknownst to me, he put into Glenn AI: write a 10-minute
00:40:35.300 monologue about Maltbook and, you know, AI consciousness.
00:40:40.220 Now, because it is a time capsule, it is sealed.
00:40:44.280 The latest information it would have had was Friday's show.
00:40:50.440 I didn't talk about Maltbook.
00:40:51.920 So he had to explain what Maltbook was, et cetera, et cetera.
00:40:54.720 And then it developed a monologue based on all of the things I have ever said about AI.
00:41:03.660 Well, if you know me and you've listened to the show long enough, you know, I have been
00:41:08.080 for 25 years, 30 years saying, warning, warning, warning, warning, this is coming and getting
00:41:16.820 out of hand.
00:41:21.240 I can't wait to hear the difference between the monologue that the human just gave that
00:41:27.160 has new information, is thinking, is using a different prism and Glenn AI, which is only
00:41:34.620 going after the information that is had for the last 30 years.
00:41:40.240 You know what I mean?
00:41:41.260 Because how did it even strike a balance?
00:41:43.200 Between what I used to say about AI and what I say about it now?
00:41:46.400 I mean, how is that even weighted?
00:41:48.720 Can you tell me?
00:41:49.560 I haven't heard the whole thing.
00:41:52.100 You've read it, right?
00:41:53.200 Yeah.
00:41:54.400 Yeah.
00:41:55.280 Glenn AI basically-
00:41:56.600 How different are they?
00:42:00.160 Glenn AI went straight to the singularity.
00:42:04.160 So Glenn AI did not say the singularity is here.
00:42:07.920 I think the direct quote was the singularity is coming.
00:42:12.800 And it was very kind of, you know, specific on the singularity is coming.
00:42:17.940 And it pulled a bunch of quotes from you on the singularity, what that actually means.
00:42:21.760 And it kind of shifted in the final, I guess, quarter to, we have to make sure that we know
00:42:30.340 who the architects of some of these, you know, I guess, I don't know if it's LLM.
00:42:37.440 The designers.
00:42:38.160 Yeah, the designers.
00:42:39.040 The people who are, yeah, designing it.
00:42:40.900 Ideologically, what do they believe?
00:42:42.940 And that's where it seemed kind of imperative that it wanted to shift focus towards that.
00:42:47.460 That's weird because I wrote, I think, probably two paragraphs about that and I edited it out
00:42:54.060 of that monologue.
00:42:58.020 So this is fascinating.
00:42:59.840 This is the first time, I don't know anybody who has, I mean, because I don't know anybody
00:43:04.780 who has a library like we have.
00:43:07.440 I don't know anybody else that has put in their entire, every word they've spoken for 30 years,
00:43:12.480 everything they've written for 30 years, and then made it proprietary and put an electric
00:43:16.900 fence around it so it cannot pull from outside.
00:43:19.480 I think we're the first to do that.
00:43:21.460 And so to write a monologue from that knowledge, which is 100% me, and for it not
00:43:29.900 to come up with the same thing I did is fascinating.
00:43:33.400 Fascinating.
00:43:33.900 It was fascinating.
00:43:35.040 I'll have to listen to it, but I think I agree with it.
00:43:37.220 After listening to the monologue, do you think I'm still going to agree with
00:43:40.240 Glenn AI?
00:43:40.740 I think so, because you agree that the singularity is coming, right?
00:43:45.640 It is.
00:43:46.280 Yeah, it is.
00:43:46.860 And it's important that we, you know, that we make sure that we realize who these programmers
00:43:52.580 are, if they have any ideological, you know, boundaries that, you know, they're setting
00:44:00.240 or they're willing to cross.
00:44:01.620 And Elon Musk has said as much, but I think Glenn AI was kind of thinking that this was,
00:44:06.600 or looking at this news as, one more giant step in that direction.
00:44:10.740 So I don't know if you believe that or not, but...
00:44:12.740 Yeah, and I don't believe that.
00:44:14.160 I don't, because I don't know, I don't know how much of Maltbook is true and how much is
00:44:17.940 not.
00:44:18.620 So, I mean, it could be, but I don't think so.
00:44:20.760 I think it's a language game at this point.
00:44:22.740 But I just think that is, that is fascinating.
00:44:26.680 And why you should not just sign yourself over to AI.
00:44:32.760 Yeah.
00:44:33.960 Because, I mean, again, who else has that, who else can do that besides us?
00:44:38.820 Do you know anybody, Ricky, Jason, you know anybody who has a library like this that could
00:44:43.500 do that experiment?
00:44:44.820 No.
00:44:45.540 But I just want to make sure we're clarifying for the audience.
00:44:48.940 Don't sign your life over to AI, but do join glennbeck.com for Glenn AI and George AI.
00:44:55.780 Well, here's the thing on this, with Glenn AI and George AI, it's why you don't have
00:44:59.380 full access to George AI and Glenn AI yet.
00:45:01.700 We don't know it.
00:45:02.680 We can't predict it yet.
00:45:04.560 We don't know.
00:45:05.200 I mean, look, I didn't expect to find this experiment today.
00:45:09.400 And how different is it?
00:45:10.980 That's the thing.
00:45:12.240 When they talk about George AI, I can tell you there's no knowledge after 1820.
00:45:16.180 So it doesn't know who I am, doesn't know Donald Trump, doesn't know any of that.
00:45:20.240 It only knows the founding documents.
00:45:22.160 And I've said before on the air, be careful.
00:45:24.140 These are not the founders.
00:45:25.920 It's based on the founders' words.
00:45:28.300 But we don't know how different it really would be.
00:45:32.540 You know, all we have is perfect recall of the founders and the words
00:45:38.860 that it has access to,
00:45:40.960 and then a large language model that can assemble that into coherent thought in
00:45:46.580 today's language.
00:45:47.400 But that doesn't mean that's the founders speaking.
00:45:49.920 This is why I've said from the very beginning, I mean, did we ever publish the rules?
00:45:54.140 My AI rules?
00:45:55.800 We're still working on that.
00:45:57.200 Still working on it?
00:45:57.420 Because the list gets longer and longer by the day.
00:46:00.300 Well, we should just say this is going to be added to.
00:46:03.260 But I started about a year ago making our AI rules for the company.
00:46:07.940 Um, and one of the things is: don't ever confuse it with
00:46:20.020 the real thing.
00:46:20.780 That's why, you know, I'm thrilled that iHeart, whose many stations
00:46:27.620 carry the show,
00:46:28.280 doesn't want any AI on.
00:46:31.080 And I have gladly said I will never put Glenn AI on the radio show, because that's
00:46:36.580 a violation of their policies.
00:46:38.760 Um, but I don't want to ever confuse me with it.
00:46:42.300 I mean, that's why we watermark everything.
00:46:44.740 Glenn AI. If you look at it, have we even made the avatar of me other than,
00:46:50.560 I mean, in motion, I don't think we have yet.
00:46:52.860 No, but I've said, I want it to look more like, um, well, I want it to be more handsome
00:46:57.780 than me with more hair, which it doesn't.
00:46:59.560 It pisses me off every time I see it.
00:47:01.020 But, um, I want it to be much more like Max Headroom.
00:47:04.780 So, you know, this is not me because that's critical.
00:47:09.860 You are unique.
00:47:11.860 You have your own fingerprints. Even when it is only using my fingerprints to make
00:47:19.300 something like me, it's still not making me.
00:47:25.380 Fascinating.
00:47:26.140 Can I, um, can I tease the George AI today?
00:47:28.980 Because I had to follow your segment up with a George AI, and, uh, for the insiders,
00:47:33.080 we did an animated George, but this is, I didn't even know what I was going to do for George
00:47:37.940 AI today.
00:47:38.440 So I asked the insiders, and one of them said that their
00:47:43.760 nine-year-old wants to become president of the United States one day.
00:47:46.320 What would George and the founders, what would their advice be to my nine year old?
00:47:51.380 Oh my God.
00:47:52.340 It's so great.
00:47:53.300 I plugged it in.
00:47:54.500 Yeah.
00:47:54.760 It's a wide, I don't have it.
00:47:55.660 Can you give me some?
00:47:56.120 I don't have it right now.
00:47:56.960 Cause it's in the other room, but I mean, it is so awesome.
00:47:59.860 I was like, this is exactly, exactly what George AI is for.
00:48:04.820 Well, we are working on the technology to animate George daily.
00:48:10.600 The problem is the time that it takes, the money that it takes to use this technology.
00:48:15.280 I think she's pitching.
00:48:16.520 I think she's pitching to sign up for Glenn Beck.
00:48:18.440 I have a pitch that right now, only your insiders who are signed up at glennbeck.com,
00:48:23.920 they will see those daily animated George.
00:48:26.120 I call him hot George.
00:48:27.120 Cause he's like in a t-shirt.
00:48:28.500 It's ripped.
00:48:29.640 Yeah.
00:48:29.820 Yeah.
00:48:29.940 Okay.
00:48:30.660 You don't need to go any further than that.
00:48:32.200 We get it.
00:48:32.660 It's uncomfortable.
00:48:33.800 HR.
00:48:35.020 Yeah.
00:48:35.700 Only insiders can see hot George.
00:48:37.260 So, you know, I spend between $150,000 and $250,000 a month just on our R&D for
00:48:46.980 this.
00:48:47.560 That's why I'm asking you to subscribe because next year, my goal is to be able to turn this
00:48:52.640 over to you.
00:48:53.440 So we have it so dialed in that you can trust it not to give you my opinion.
00:49:00.120 Okay.
00:49:00.620 Glenn AI will, but George AI will give you the opinion, as close as anyone
00:49:06.940 can get it, of the founders, based only on their own words, and be able to
00:49:14.100 give you lesson plans.
00:49:15.580 You know, for your homeschoolers, George will teach you, and you can trust that
00:49:21.040 there is no agenda except for the founders' words.
00:49:25.700 Um, and that's what we're headed for.
00:49:27.800 So you can say, you know, you'll be sitting there with your nine-year-old son and you say,
00:49:31.920 I want to be president someday.
00:49:33.240 You go, you know what?
00:49:34.100 Let's ask George.
00:49:34.780 And you could type it in and George will answer your son by name.
00:49:39.820 My hope is that in a year, we'll be able to have a conversation with him, ask
00:49:44.300 him questions, and have him answer with words as close as we can get them to George Washington's.
00:49:51.100 Um, there's nobody else developing anything like this.
00:49:55.220 Everything else that you'll get in these school things, they're 100% have an agenda.
00:50:00.080 This does not.
00:50:01.600 Um, and I would love for you to join glennbeck.com and the torch.
00:50:06.060 This is what it's really all about.
00:50:08.300 Uh, and it costs a fortune to do it.
00:50:10.520 And I would love it. It's $10 a month, or $8 a month
00:50:13.360 if you sign up for the year. Just go to glennbeck.com slash torch, glennbeck.com slash
00:50:19.000 torch, and join us today.
00:50:20.440 Become a founding member, and that $10 will never change for the rest of your life.
00:50:23.860 As long as you're subscribed, uh, it will never have a price increase.
00:50:27.320 It's inflation-proof, which may be the end of my bank account at some point.
00:50:32.480 But, uh, check it out now at glennbeck.com slash torch.
00:50:39.380 Investing is all about the future.
00:50:41.420 So what do you think is going to happen?
00:50:43.420 Bitcoin is sort of inevitable at this point.
00:50:45.920 I think it would come down to precious metals.
00:50:48.500 I hope we don't go cashless.
00:50:50.600 I would say land is a safe investment.
00:50:53.220 Technology, companies.
00:50:54.340 Solar energy.
00:50:55.380 Robotic pollinators might be a thing.
00:50:57.720 A wrestler to face a robot.
00:50:59.640 That will have, that will have to happen.
00:51:01.260 So whatever you think is going to happen in the future, you can invest in it at Wealthsimple.
00:51:06.700 Start now at Wealthsimple.com.