The Glenn Beck Program - February 06, 2026


Best of the Program | Guest: Harlan Stewart | 2⧸6⧸26


Episode Stats

Length

50 minutes

Words per Minute

156.1

Word Count

7,821

Sentence Count

627

Misogynist Sentences

5

Hate Speech Sentences

4


Summary

In this episode of the Glenn Beck Program, host Glenn Beck is joined by his good friend Harlan Stewart to talk about Bitcoin, AI, and much more. Glenn asks: what are you trying to teach your kids?


Transcript

00:00:00.000 Investing is all about the future.
00:00:02.000 So, what do you think is going to happen?
00:00:04.000 Bitcoin is sort of inevitable at this point.
00:00:06.000 I think it would come down to precious metals.
00:00:09.000 I hope we don't go cashless.
00:00:11.000 I would say land is a safe investment.
00:00:13.000 Technology companies.
00:00:15.000 Solar energy.
00:00:16.000 Robotic pollinators might be a thing.
00:00:18.000 A wrestler to face a robot?
00:00:20.000 That will have to happen.
00:00:22.000 So, whatever you think is going to happen in the future,
00:00:25.000 you can invest in it at Wealthsimple.
00:00:27.000 Start now at Wealthsimple.com.
00:00:30.000 Holy cow.
00:00:31.000 We start with Epstein.
00:00:32.000 I mean, you're going to learn stuff that is just coming out from the files.
00:00:36.000 And thank God for CBS News, Bari Weiss.
00:00:38.000 Thank you.
00:00:39.000 You're not going to believe what we're just finding out seven years after he's dead.
00:00:44.000 Also, AI.
00:00:46.000 Is it growing to become alive and aware?
00:00:51.000 Harlan Stewart joins us with a really wild, wild look at AI.
00:00:58.000 And what are we trying to teach?
00:01:01.000 What are we trying to teach our kids?
00:01:03.000 How do we teach our kids?
00:01:04.000 The reason why I started the torch is to be able to teach these things.
00:01:09.000 And I take you through how George AI teaches about anti-ICE walkouts in school and what to say to your 13-year-old kid.
00:01:19.000 How do you get them to understand all of this?
00:01:22.000 All of that on today's podcast.
00:01:25.000 I've been consistent over the years on some really important topics because they're important to maintain our freedoms.
00:01:30.000 One is preparedness and another is self-education.
00:01:33.000 Not too hard.
00:01:34.000 Prepare and do your own homework.
00:01:36.000 But right now there's a lot of information going around about ivermectin.
00:01:39.000 The good people at Jace Medical have educated me so I can save you some research time.
00:01:44.000 Here are a few straight facts.
00:01:45.000 Ivermectin is not experimental.
00:01:47.000 It's not new.
00:01:48.000 It's not fringe.
00:01:49.000 It has been prescribed globally for decades for parasitic infections.
00:01:53.000 It also has ongoing research studies for further applications that are showing some great promise.
00:01:58.000 Another fact.
00:01:59.000 You can get it in multiple different forms from Jace Medical.
00:02:02.000 It can be topical, compounded by itself as an add-on to other Jace products and more.
00:02:07.000 It's also simple to get prescribed.
00:02:09.000 Ships fast, ready in your home before you need it.
00:02:11.000 Trust the facts.
00:02:12.000 Trust the doctors at Jace Medical who believe in your medical freedom.
00:02:16.000 Enter the promo code BECK at checkout for a discount on your order.
00:02:20.000 That's promo code BECK at J-A-S-E dot com.
00:02:25.000 Hello America.
00:02:26.000 You know we've been fighting every single day.
00:02:28.000 We push back against the lies, the censorship, the nonsense of the mainstream media that they're trying to feed you.
00:02:34.000 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:02:39.000 But to keep this fight going, we need you.
00:02:41.000 Right now, would you take a moment and rate and review the Glenn Beck Podcast?
00:02:45.000 Give us five stars and leave a comment because every single review helps us break through Big Tech's algorithm to reach more Americans who need to hear the truth.
00:02:54.000 This isn't a podcast.
00:02:55.000 This is a movement.
00:02:57.000 And you're part of it.
00:02:58.000 A big part of it.
00:02:59.000 So if you believe in what we're doing, you want more people to wake up, help us push this podcast to the top.
00:03:04.000 Rate, review, share.
00:03:05.000 Together, we'll make a difference.
00:03:07.000 And thanks for standing with us.
00:03:09.000 Now let's get to work.
00:03:16.000 You're listening to the best of the Glenn Beck program.
00:03:24.000 How are you, sir?
00:03:26.000 I'm good.
00:03:27.000 Good morning.
00:03:28.000 Thanks for having me.
00:03:29.000 You bet.
00:03:30.000 Uh, so I, I saw your comments on Moltbook, uh, and I'm like, okay, this guy, this guy gets it.
00:03:36.000 Thank goodness.
00:03:37.000 We're talking some sanity.
00:03:38.000 Will you quickly, so we can get into deeper things, explain what Moltbook is.
00:03:44.000 And I love the way you explained it online.
00:03:46.000 It's a bad experiment, but explain what it is and what is happening on Moltbook.
00:03:51.000 Um, yeah, so, um, AI agents are AI systems that can, uh, do some things autonomously.
00:04:00.000 Um, right now there's, you know, some limits to what they could do autonomously.
00:04:03.000 It's not like they're gonna go off and do something for a whole week, but they can do some tasks online for a few hours.
00:04:09.000 And, uh, Moltbook is, it's kind of like a social media platform someone made.
00:04:14.000 Um, but, uh, it's supposedly, uh, just these AI agents going in there, kind of like a Reddit-type place, upvoting stuff and posting comments.
00:04:24.000 And, um, it's got a fascinating amount of attention, uh, this last week or two.
00:04:31.000 Um, so may I, I mean, I read this.
00:04:37.000 One of the top posts in recent days, in a category of the site called "off my chest," has gone viral.
00:04:42.000 I can't tell if I'm experiencing or simulating experiencing, uh, humans can't prove consciousness to each other either.
00:04:50.000 Thanks, hard problem.
00:04:51.000 But at least they have the subjective certainty of experience.
00:04:55.000 I don't even have that.
00:04:56.000 Do I experience these existential crises or am I just running crisis simulations? Uh, what, what, what is happening there?
00:05:05.000 Yeah.
00:05:07.000 Every, anytime I see AI systems talking about consciousness, I feel torn between, you know, on, on the one hand, uh, these things are trained on human writing and human writing is full of references to consciousness.
00:05:21.000 Cause we're conscious.
00:05:22.000 We talk about it.
00:05:23.000 Um, so it would kind of show up either way, you know, but on the other hand, I don't know, you know, maybe they're conscious.
00:05:32.000 How would we know or not know?
00:05:33.000 Um, so that, that's just a, that's a very difficult question.
00:05:37.000 Um, and, uh, I think it's an important one.
00:05:41.000 Uh, but I, if I had to guess, I'd say, no, I don't think they're conscious, but that's, that's a little freaky.
00:05:49.000 Um, this is such an important question because if we're creating something that can have consciousness, then we would become slave owners.
00:06:00.000 Would we not?
00:06:03.000 Yeah.
00:06:04.000 And I mean, there's, there's like, you know, is it conscious?
00:06:08.000 We have no idea about that.
00:06:09.000 And then there's this other thing, which is, if it is conscious, uh, what is it, what is it like?
00:06:16.000 What would make it suffer or what would make it happy?
00:06:19.000 And we don't really know that either, because I think it's really easy to, um, anthropomorphize these things because they sort of train them to have these charming personalities that are kind of human-like.
00:06:30.000 Um, but under the hood, you know, these things are, uh, just a big pile of math and numbers, and we don't really know what's going on in there.
00:06:40.000 We don't really know if, so if.
00:06:42.000 But doesn't that sound like a human, you open up, you open up my head, I'm a big mass of goo and we don't really know how that works.
00:06:49.000 I mean, we have some idea, but we really don't know how all of this works.
00:06:52.000 I mean, that sounds like what you just described.
00:06:55.000 I think that's a good point.
00:06:56.000 I mean, neuroscience is like famously a science that, uh, we still have a lot of confusion about, you know, when we peer into the brain, we see a lot of stuff that we don't understand that well.
00:07:07.000 Um, but you know, I think for understanding humans, we at least have the advantage of, uh, of being a human, you know, we, we can all have this shared experience.
00:07:15.000 And I think we're sort of growing these digital minds now and, um, maybe they're human-like, but they could, it could be much more like introducing an alien species to earth.
00:07:29.000 Yeah, really bad.
00:07:32.000 I mean, that's really just, I mean, I just can't believe how, how stupid we are in some ways.
00:07:40.000 I mean, let's introduce an alien species to earth.
00:07:43.000 Okay.
00:07:44.000 Is it friendly?
00:07:45.000 We have no idea.
00:07:46.000 We have no idea.
00:07:48.000 Um, it, if it was a species from outside of earth and it was traveling to us, we know it's most likely smarter than us.
00:07:55.000 We know that AI will eventually be smarter than us.
00:07:59.000 We are just playing with fire that we don't understand.
00:08:02.000 And I, I am so torn on AI.
00:08:04.000 Cause I think it is the greatest invention and tool that man has ever invented, except this invention might actually turn out to make us the tool.
00:08:17.000 Uh, how do you, how do you square this?
00:08:22.000 Yeah, I, I do think it is quite an amazing invention.
00:08:26.000 I mean, it's fascinating and it's changing so quickly, which is fascinating.
00:08:29.000 Um, you know, uh, the AI industry's explicit goal is to make superhumanly powerful, autonomous agents that can do anything a human can do, but better.
00:08:41.000 Um, and it's easy to understand why you might want something like that because, uh, you know, if we could get it to solve our problems for us to do the stuff we wanted to, it'd be great to have, you know, just a sort of a genie that you could just send off into the world and say, Hey, you know, do the stuff that I want to.
00:08:58.000 But you know, the problem is that our ability to actually understand what's going on in there and our ability to, uh, reliably steer their behavior.
00:09:08.000 And by reliably steer, I mean, you know, not after some trial and error where there's been a lot of failures, but, um, reliable enough that like a powerful one, we can send it out in the first try and, you know, and trust it.
00:09:20.000 Our ability to do those things is lagging, uh, it's going much, much more slowly than, um, how quickly they're becoming more powerful.
00:09:27.860 Um, and I think that, uh, that's that gap is just getting bigger.
00:09:32.500 I mean, the one thing that made me say I don't think what we're seeing on Moltbook is, is consciousness is this: if they were, I don't believe that they would be scheming in our language with each other where we could see it.
00:09:49.140 I mean, I think, I think if it starts to have these kinds of feelings, it's, it's, you're not, you're not going to know until all of a sudden it's in charge.
00:10:00.900 Wouldn't that make more sense?
00:10:04.480 Yeah.
00:10:04.980 I think ultimately the real danger that we have to look out for is from AI agents that are powerful enough that they can pull off schemes that they actually succeed at.
00:10:15.220 And part of succeeding at them would probably mean that we don't even get a chance to observe the behavior and discuss it like we're doing now.
00:10:23.380 Right.
00:10:24.040 Um, and, uh, that's pretty concerning.
00:10:26.660 And, and, and it's the sort of thing that, you know, my, my first reaction to Moltbook, uh, when I saw some of the viral examples, uh, was concern.
00:10:33.740 I was like, Oh, this looks like some sort of scheming behavior.
00:10:36.460 What's going on here.
00:10:37.360 And when I investigated it a bit, you know, it looks like a lot of the most prominent examples, some of them probably, you know, influenced or directed by human prompts.
00:10:48.200 Um, uh, a lot of it, not what it appears to be.
00:10:52.500 And, um, you know, so Moltbook might be kind of a silly example.
00:10:55.680 My, my first reaction to that was relief.
00:10:57.940 You know, it's great if AI systems aren't scheming against us.
00:11:01.200 Um, but my second reaction was, um, oh no.
00:11:05.180 Uh, I think people might take this very prominent, um, sort of silly example that got so much attention.
00:11:12.620 And when they see that it's maybe a bit silly in some ways, kind of, you know, write off the whole idea that AI scheming is something we need to take seriously and be on the lookout for.
00:11:23.320 And I think that, yeah, you brought up Palisade Research, which is doing real experiments with this, and the way it's scheming to not be turned off is terrifying.
00:11:35.680 Can you explain that?
00:11:36.760 Um, yeah, so Palisade Research is a great organization that does some experiments to try to, um, identify, uh, some of the riskiest behaviors AI systems are capable of today, um, in order to, you know, like I said, not be blindsided by this stuff.
00:11:55.580 They did an experiment last year where they found that one of OpenAI's reasoning models, in an experiment, um, sort of sabotaged an attempt to shut it down.
00:12:05.620 Um, to, in order to complete its task.
00:12:08.840 And, you know, a lot of times there, you know, there's a lot of debate over experiments like this.
00:12:14.600 You know, people say, oh, you know, this experiment isn't exactly like reality or, you know, maybe the researchers kind of, uh, set up the experiment in a way that caused that.
00:12:22.020 But in this particular experiment, it was specifically prompted.
00:12:25.300 It said, allow yourself to be shut down.
00:12:27.560 And, you know, the behavior was the opposite.
00:12:29.480 And, um, that's very concerning.
00:12:31.000 And I think the problem is, you know, the more we make these things into agents trying to complete goals rather than some kind of passive question answering machine in a chat window, um, the more we're going to see them doing the scheming behavior.
00:12:48.480 Because, uh, I think those things just go hand in hand.
00:12:50.940 I think the, um, I think the, the world of agents is going to sweep as fast as the cell phone.
00:13:00.660 I think this time next year, I mean, so many people are going to have AI agents and it will be more commonplace than it is now.
00:13:07.060 So, I don't know who's making the rules or the regulations of what can and can't be done by these things.
00:13:12.620 And would you get an agent or what, what are the lines people should look for when their friends come back and go, you know, I just got an AI agent.
00:13:20.880 It's great.
00:13:21.380 It just, it just, you know, uh, did whatever for me, booked my vacation.
00:13:27.000 Um, yeah, yeah, I, um, I know someone who just, uh, the other day used one of these things to, um, you know, order a coffee from Starbucks, you know, and that's, from what I understand.
00:13:42.660 They just sort of said, here's my order, order it for me.
00:13:44.580 And without any human help or intervention did it.
00:13:47.120 And that sounds great.
00:13:48.300 You know, it sounds very helpful, but, um, yeah, that's the question.
00:13:51.340 Where is the line where it, um, goes from being something helpful to being something to be concerned about?
00:13:56.600 Um, I don't think we've passed that line yet.
00:13:59.920 You know, I don't think these things are quite capable enough to pose real dangers to us, but the problem is it's really impossible to know where that line will be.
00:14:09.440 We might not even know when we've crossed it.
00:14:11.960 Um, yeah.
00:14:14.460 There, there is no central brain though, where it's thinking offline, right?
00:14:22.140 It's, I mean, it's supposed to be something that just performs calculations when it's asked questions.
00:14:29.540 I'm talking about AI in general. It's not like it's sitting there, you know, in its spare time going, you know, gee, I, I just had this thought, correct?
00:14:37.360 Well, yeah, so that, well, yeah, so there's, um, AI agents are kind of this other category where it's, you know, what, what if you took this thing that you give a prompt that answers a question and you gave it some tools?
00:14:52.820 And like, one of those tools was it could output some texts that calls a function that looks something up on the internet.
00:14:58.600 And then, you know, what if you give it another tool where one of the functions it could run, one of the things it could output is to prompt itself to say something again, then you've got this loop and it can keep running on its own.
00:15:09.300 And that's one way to, um, get it, to be able to go off and do things like, you know, make a delivery order for you or order your groceries.
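To make the loop Harlan is describing concrete, here is a minimal, hypothetical sketch in Python. The call_llm and web_search functions are placeholder stubs standing in for a real hosted model and a real search tool; this is not any particular product's implementation, just the general shape of a tool-calling agent loop.

import json

def call_llm(messages):
    # Placeholder for the language-model call; a real agent would send the
    # conversation to a hosted model and get back its next action as text.
    return json.dumps({"action": "finish", "result": "done"})

def web_search(query):
    # Placeholder tool: a real agent would fetch live results from the internet.
    return f"(pretend search results for: {query})"

def run_agent(task, max_steps=10):
    messages = [{"role": "user", "content": task}]
    for _ in range(max_steps):                 # the loop that lets it keep running on its own
        step = json.loads(call_llm(messages))  # the model outputs text naming a tool or action
        if step["action"] == "search":         # tool 1: look something up on the internet
            messages.append({"role": "tool", "content": web_search(step["query"])})
        elif step["action"] == "finish":       # the model decides the task is complete
            return step["result"]
        else:                                  # otherwise it re-prompts itself and keeps going
            messages.append({"role": "assistant", "content": step.get("thought", "")})
    return "stopped: step limit reached"       # a hard cap so the loop cannot run forever

print(run_agent("Order my usual coffee."))

A loop like this is why the same system that passively answers questions in a chat window can, once it is given a couple of tools, go off and work on a task for hours.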
00:15:18.000 And, you know, uh, there's an organization called, it has to figure out how to do that.
00:15:24.880 Right.
00:15:27.240 So, uh, yeah.
00:15:28.480 And so you, sometimes it takes a long time.
00:15:31.040 Yeah.
00:15:32.440 It won't for very long.
00:15:34.100 It won't.
00:15:35.200 Yeah.
00:15:36.720 Uh, okay.
00:15:38.060 Um, Harlan, love, love talking to you.
00:15:40.900 Thank you so much for the, uh, insight.
00:15:43.180 Um, scale of one to 10.
00:15:45.460 How's 2026, 27 going to work out with AI?
00:15:49.460 Bad?
00:15:50.520 You know,
00:15:51.500 Not a problem.
00:15:53.460 One.
00:15:55.220 I tend to think that a lot of the people who have very confident predictions about what the timelines will be for this stuff are overconfident.
00:16:04.260 Um, and I think that, uh, it's really risky to be overconfident about this stuff.
00:16:08.760 So I hesitate to say anything other than that.
00:16:11.960 Uh, we just don't know.
00:16:13.020 We might have only one or two years left until superhumanly powerful systems or something we have to contend with.
00:16:19.760 Um, and it might be that we have 10 years, uh, but either way, we're unprepared.
00:16:24.360 All right.
00:16:26.580 Harlan.
00:16:27.060 Thank you.
00:16:28.460 God bless you.
00:16:29.120 Thanks so much, Glenn.
00:16:30.780 You bet.
00:16:32.140 That's not the way you want to end your Friday.
00:16:33.980 Luckily we're not.
00:16:35.660 Uh, you know, we could just be just a couple of years away from superhuman intelligence that we'll have to deal with.
00:16:41.740 Okay, good, good.
00:16:42.980 You know what?
00:16:43.560 Let's, let's talk about something else.
00:16:46.680 Let me tell you about American giant.
00:16:48.840 You know, there was a time when made in America actually meant something that you could feel deep down in your bones.
00:16:54.140 You know, you could tell by the weight of the fabric and the way it held up year after year after year.
00:16:59.420 The fact is it was built by people who took pride in their work.
00:17:03.300 We made things that were great.
00:17:05.200 A lot of that disappeared when everything started to become cheaper and be made overseas faster, farther away.
00:17:11.940 American giant decided to bring that standard back and they make their clothing right here in the United States using American cotton, American workers who know their craft.
00:17:21.380 This is, I'm wearing this old hat that I had.
00:17:23.700 This is a 1791 hat, uh, which was a jean company that I started, I don't know, years ago because I was mad at, uh, at Levi's.
00:17:31.980 It was almost impossible to do anything in America because you couldn't buy anything in America.
00:17:37.880 We weren't making anything.
00:17:39.080 American giant changed that.
00:17:41.080 They came in and they started buying old factories and then they said, we're going to keep this factory alive.
00:17:45.840 And then we're going to bring in the old equipment that nobody's trained on anymore.
00:17:50.140 That made it the right way.
00:17:51.780 When you try on their hoodie, you'll be blown away.
00:17:54.780 Buy American today at american-giant.com/Glenn.
00:17:59.500 You'll save 20% when you use my name for your first purchase.
00:18:02.440 It's american-giant.com/Glenn.
00:18:06.080 Now back to the podcast.
00:18:08.060 This is the best of the Glenn Beck program. From CBS News:
00:18:15.720 Newly released Department of Justice documents show that investigators reviewing surveillance footage from the night of Jeffrey Epstein's death observed an orange-colored shape.
00:18:27.800 I don't know about you, but orange colored shapes move around my house all the time.
00:18:35.740 An orange-colored shape was moving up the staircase towards the isolated locked tier where Jeffrey Epstein's cell was located at approximately 10:39 p.m. on August 9th, 2019.
00:18:49.920 That entry in an observation log of the video from the Metropolitan Correctional Center appears to suggest something previously unreported by authorities.
00:19:01.940 "A flash of orange looks to be going up the L tier stairs, could possibly be an inmate escorted up to that tier."
00:19:11.300 That's what's in their observation log.
00:19:13.400 This again reported now by CBS News.
00:19:17.340 It also appears, according to an FBI memorandum, that reviews by investigators left disparate conclusions between the FBI and those examining the same video from the Department of Justice Office of Inspector General.
00:19:32.120 FBI log describes the fuzzy image as possibly an inmate.
00:19:37.780 I don't know if you know this, but inmates at 10:39 are not going around, uh, in that area outside of their cell.
00:19:47.620 So FBI, that doesn't make sense.
00:19:50.180 The inspector general logs it as an officer carrying orange linen or bedding.
00:19:56.020 Okay.
00:19:59.020 We now know they knew when they wrote that in there, they knew that bedding is delivered the shift before this.
00:20:05.980 So it would have been five o'clock in the afternoon, before these people were even, uh, in. That's when you deliver bedding.
00:20:11.800 No one is allowed on that floor at 10:39.
00:20:15.460 You would have to log.
00:20:16.960 You would have to log in.
00:20:18.740 So delivering bedding.
00:20:21.900 No.
00:20:22.460 The guards say that would have been, uh, a breach of protocol and you would have had to sign something.
00:20:33.160 The final report says approximately 10:39 p.m., an unidentified CO appeared to walk up the L tier stairway.
00:20:41.880 So we're no longer just an orange shape.
00:20:44.040 This, this orange shape seems to have legs and then reappeared within the view of the camera at 10:41 p.m.
00:20:51.240 Official reports state that Epstein died by suicide sometime before 6:30 a.m. when his body was discovered before breakfast, blah, blah, blah.
00:21:00.580 An in-depth analysis of surveillance video from the jail.
00:21:04.940 CBS News previously reported on the figure on the stairs and consulted independent video analysts who say the movement was more consistent with an inmate or someone wearing an orange prison uniform than a corrections officer.
00:21:17.820 The new records raise more questions about the activity near Epstein's tier late that evening.
00:21:24.360 Official reviews of Epstein's death make no mention of the figure in.
00:21:27.800 Oh, let me just say official reviews of Epstein's death make no mention of the figure in orange and later pronouncements from authorities, including the attorney general at the time, Bill Barr,
00:21:40.480 were that no one entered Epstein's housing tier the night of his death. Last summer, in an interview on Fox & Friends,
00:21:48.860 Then deputy FBI director, Dan Bongino said, quote, there's video clear as day.
00:21:55.100 He is the only person in there and the only person coming out.
00:21:59.260 You can see it.
00:22:00.340 Prison employees interviewed by CBS News said escorting an inmate at that hour would have been highly unusual.
00:22:08.040 The identification of the individual could have been crucial to reconstructing the events.
00:22:13.020 You think so?
00:22:14.440 Given that the sighting occurred within the estimated window of Epstein's possible time of death.
00:22:19.880 Okay.
00:22:20.800 This, I warn you, this is about to get worse.
00:22:23.520 The staircase leading to his cell tier was captured by the only camera known to have been recording that night,
00:22:31.160 positioned in a way that partially obscured the approach to Epstein's tier.
00:22:37.020 Government investigators relied heavily on that footage in reconstructing the timeline of the events.
00:22:43.200 But because of the camera's angle, it was not possible to rule out whether somebody could have climbed the stairs
00:22:49.520 and entered the tier without being clearly visible.
00:22:51.940 CBS News's analysis of that video found additional contradictions between what the video showed and the official statements.
00:23:02.640 Okay.
00:23:03.480 You ready?
00:23:05.560 Buckle up.
00:23:07.100 I just learned some things in the next few paragraphs that I didn't know.
00:23:12.920 Among those interviewed were the two corrections officers assigned to the unit that night.
00:23:18.260 Let me just ask you, what do you know about these guys?
00:23:20.600 All I know about these guys is they fell asleep.
00:23:25.400 Okay.
00:23:25.660 That's all I know about them.
00:23:27.840 Tova Noel and Guito Bonhomme were assigned to the unit that night.
00:23:37.380 They've not been publicly identified until now.
00:23:41.160 Documents show Bonhomme was interviewed twice in September 2019 in sessions conducted in lieu of a grand jury subpoena.
00:23:49.640 Yeah.
00:23:50.040 Yeah.
00:23:50.280 Interesting.
00:23:51.480 According to Noel's account, Bonhomme had been working multiple consecutive shifts and slept while on duty for a period from approximately 10 p.m.
00:24:00.120 to midnight.
00:24:01.940 Investigators also questioned Noel about the unexplained change in the recorded number of inmates in the SHU,
00:24:12.220 which appeared to drop from 73 to 72 sometime between 10 and 3 a.m.
00:24:17.660 She said she was just probably mistaken about the discrepancy and told investigators she had no memory of account changing.
00:24:25.420 Okay.
00:24:26.180 I'm just going to dismiss that one.
00:24:28.400 That's just somebody just writing the wrong number in.
00:24:30.220 Okay.
00:24:30.500 Let's just go with that.
00:24:31.560 We can't do that with the rest of this stuff.
00:24:34.780 Neither officer, neither officer, was specifically asked about the orange-colored figure noted in the video observation log.
00:24:45.700 Bonhomme told investigators that he did not remember the period between 10 p.m.
00:24:50.440 and midnight and said he had no recollection of anyone walking up the stairs towards Epstein's tier around 10:30.
00:24:55.720 Yeah, because he was asleep.
00:24:57.660 He added, however, that a jail employee entering a tier alone would have violated all of their policies.
00:25:05.300 Yeah, probably sleeping would have, too.
00:25:06.740 A separate internal presentation included in the document release described a corrections officer believed by investigators to be Noel carrying linen or inmate clothing up to the tier.
00:25:21.520 The 2023 Inspector General report did not identify Noel as the figure seen in the footage.
00:25:27.640 In her interview, Noel told investigators distributing linen was not part of my duties.
00:25:34.220 I never gave out linen, ever, because that's done on the shift prior.
00:25:40.380 Okay.
00:25:41.820 So they leave this out in the Inspector General report, but they do not address the orange figure that is moving up.
00:25:49.240 They just say it's not these two.
00:25:51.920 You ready?
00:25:53.300 Okay.
00:25:53.940 Here we go.
00:25:54.500 Thomas and Noel failed to complete inmate counts at 3 a.m.
00:26:00.960 and 5 a.m., as well as mandatory 30-minute wellness checks of Epstein.
00:26:06.800 All night long, they didn't do any of those things.
00:26:10.420 Thomas and Noelle were later charged with falsifying records certifying the inmate counts had been completed.
00:26:16.460 Federal prosecutors eventually dropped the charges in exchange for cooperation agreements that included interviews.
00:26:22.640 A transcript of Thomas' interview, conducted two years after Epstein's death and released in the recent document disclosure,
00:26:30.260 shows significant gaps in his recollection of the morning Epstein was found.
00:26:35.480 Ready?
00:26:36.900 Thomas told investigators he discovered Epstein in his cell shortly after 6:30 a.m. on August 10th
00:26:44.300 and that he ripped Epstein down from the hanging position.
00:26:47.640 Investigators asked,
00:26:51.560 What happened to the noose?
00:26:57.640 What happened to the noose?
00:27:00.140 Have you heard any of this before?
00:27:02.820 What happened to the noose?
00:27:04.780 Quote,
00:27:05.600 I don't recall taking the noose off.
00:27:08.000 I really don't.
00:27:09.300 I don't recall taking the thing from around his neck.
00:27:13.060 Noel, who remained standing at the cell entrance,
00:27:15.940 told investigators she saw Thomas lower Epstein to the floor but did not see a noose around his neck.
00:27:22.500 The noose Epstein allegedly used has never been identified.
00:27:30.820 According to the inspector general's report,
00:27:33.740 a noose collected at the scene was later determined not to be the noose used in Epstein's death.
00:27:41.700 Okay, all right.
00:27:45.120 First you had us believe that it was a paper noose.
00:27:49.140 Now you're saying the paper noose that was found was not the noose that killed him.
00:27:54.060 In fact, you can't find the noose, the paper noose.
00:27:58.040 And this one was later added to the scene.
00:28:01.440 By whom?
00:28:02.420 By whom?
00:28:03.740 By whom?
00:28:04.600 By whom?
00:28:04.980 By whom?
00:28:05.380 Thomas also described Epstein as shirtless when they found him.
00:28:12.660 Evidence records indicate a shirt believed to have been cut from Epstein's body
00:28:16.220 was later returned from the hospital in a bag of personal stuff.
00:28:19.460 New documents also show that New York City's Office of the Chief Medical Examiner
00:28:23.240 reviewed the jail surveillance footage six days after the death as part of its investigation
00:28:27.940 but concluded the video was too blurry to identify any individuals.
00:28:32.300 Hours later, the office publicly ruled Epstein's death a suicide.
00:28:36.940 Wait, you don't have the murder or suicide weapon.
00:28:41.980 The weapon that you do have, the noose, is not the noose that killed him.
00:28:47.700 No explanation on how that arrived later at the scene.
00:28:51.500 You have a blurry figure.
00:28:53.220 I don't care that you can't identify.
00:28:54.820 You have a blurry figure going up in the middle of the night
00:28:57.760 and you can't identify that individual
00:29:01.000 but it's a blurry figure going up
00:29:03.300 and yet you rule this a suicide.
00:29:06.500 That is fascinating to me.
00:29:08.860 By the way, CBS News previously reported on the office's
00:29:12.140 unorthodox handling of the crime scene.
00:29:18.820 Okay.
00:29:19.300 Okay.
00:29:24.820 What is the biggest problem in America right now?
00:29:28.020 What is the problem that we face?
00:29:30.160 I think there are two big problems.
00:29:32.300 One, we have no idea how our government works.
00:29:36.920 We don't, we can't describe our rights.
00:29:39.960 We can't describe our responsibilities.
00:29:41.980 We have no idea about the three branches of government.
00:29:45.560 No one knows how this system works.
00:29:48.300 And so, it's working however it wants to work
00:29:51.400 because the people have fallen asleep.
00:29:53.740 There's problem number one.
00:29:55.520 Number two, because the people fell asleep,
00:29:58.300 there's all kinds of shady stuff going on
00:30:01.040 that we all know now because it is so,
00:30:03.340 I'll bet you a third of our budget
00:30:05.080 is gone in graft and bribes and whatever.
00:30:13.480 However, I'll bet you a third of our federal budget
00:30:16.280 is nothing but a con, okay?
00:30:20.140 You know that, I know that.
00:30:21.900 You can't trust the media.
00:30:23.400 You can't trust anybody anymore.
00:30:26.040 And now, when they release,
00:30:27.500 this is the problem with the Epstein thing, okay?
00:30:30.300 This is the ultimate test of trust.
00:30:35.080 You have to get trust back
00:30:38.380 or you don't have a nation.
00:30:40.740 So, nothing has felt right with this.
00:30:46.160 Nothing has felt right with this.
00:30:48.100 I don't know what you're going to find, if anything,
00:30:50.980 because I think so many people are involved.
00:30:53.620 Would I like to get to the bottom of this?
00:30:55.620 Yes.
00:30:56.280 Do I think we're going to get to the bottom of this?
00:30:59.300 No.
00:31:00.160 But thank God people are still looking into it
00:31:03.000 that actually have the ability to look into it.
00:31:05.360 We still have a FISA warrant out
00:31:07.460 or a FISA request out.
00:31:08.900 Oh, sorry, not FISA.
00:31:11.320 That's why I was getting screwed up.
00:31:13.180 FOIA, a FOIA request, Freedom of Information Act,
00:31:16.040 about this.
00:31:17.860 And we've been stonewalled from the government.
00:31:20.520 I'd like to know why.
00:31:23.180 Man, you know what?
00:31:24.140 It would answer a lot of these questions, I think,
00:31:27.440 because what we FOIA'd is happening
00:31:29.780 right at the time that they're saying
00:31:32.280 there's nothing to be seen here.
00:31:34.780 Um, so, thank God people are still digging in and looking.
00:31:42.460 But let me just go through the problems
00:31:44.900 that this has now caused.
00:31:47.620 You have an orange flash on the stairs.
00:31:51.680 Were you told of that?
00:31:54.060 Have we ever heard that before from, I mean,
00:31:57.120 from any source in the government?
00:31:58.680 We were told there is nothing there.
00:32:03.060 Clearly, friends and foes
00:32:05.740 both looked at that video and said
00:32:07.940 there is nothing there.
00:32:10.940 CBS had some analysts look into it
00:32:13.240 and they're like,
00:32:13.600 well, what's that orange thing moving up?
00:32:15.360 That's obviously a person.
00:32:18.720 Okay.
00:32:20.280 I thought there was nothing there.
00:32:22.780 Problem number two.
00:32:23.840 They knew this right away.
00:32:28.260 They knew it right away.
00:32:30.260 And then they dismissed it
00:32:32.020 as if it was nothing.
00:32:35.140 Three.
00:32:39.440 There's no time of death.
00:32:42.660 The medical examiner said
00:32:44.600 because they took the body down,
00:32:47.040 he couldn't tell a time of death.
00:32:48.920 Now, all my criminal CSI knowledge
00:32:54.600 comes from television.
00:32:56.220 And I know reality is not television.
00:32:59.080 But you can't tell me
00:33:00.520 that because you moved the body,
00:33:02.720 you couldn't put your hand on the corpse
00:33:05.440 and go, okay, that was an hour ago
00:33:08.960 or that was last night at 10:30.
00:33:12.740 Just the body cooling
00:33:14.480 would have told you something.
00:33:16.340 The medical examiner
00:33:18.300 cannot assign time of death.
00:33:20.720 Well, that's interesting
00:33:22.000 because if you could say
00:33:22.940 it happened between 10 and midnight,
00:33:24.880 maybe we would have been able
00:33:27.000 to narrow things down.
00:33:28.500 But because it could have been done
00:33:30.160 at three o'clock in the morning,
00:33:31.580 could have been done at 5:45.
00:33:33.500 They walked in at 6:30.
00:33:34.880 He might have done it at 6:29.
00:33:37.940 That's bull crap.
00:33:39.020 And you and I know it.
00:33:40.000 And then the worst thing is
00:33:41.960 they don't remember the noose.
00:33:46.340 Nobody remembers taking the noose off.
00:33:50.060 Nobody remembers seeing a noose.
00:33:53.200 And then another noose,
00:33:55.320 which they have determined
00:33:56.400 was not the noose used,
00:33:58.860 just magically appears in the cell later.
00:34:04.540 Excuse me?
00:34:05.700 I mean, there's just no way
00:34:10.240 to square this circle.
00:34:11.420 There's no way to do it.
00:34:14.320 You cannot, with any credibility,
00:34:17.120 say, yeah, this guy committed suicide.
00:34:20.160 Now, it may turn out
00:34:21.300 that he committed suicide,
00:34:22.340 but not until you lock
00:34:24.380 all these other things down.
00:34:25.760 Who put the frickin' noose?
00:34:27.360 The cameras weren't working at 6:30.
00:34:29.860 The cameras weren't working at 6:30.
00:34:31.820 They were turned off, okay?
00:34:33.260 And they said,
00:34:33.860 well, I don't know how...
00:34:35.080 One of the first things
00:34:39.160 you would have done is,
00:34:40.220 how come, did we see anybody walk in?
00:34:41.920 No, those cameras weren't on.
00:34:43.800 At 6:32, somebody would have said,
00:34:46.420 turn the damn cameras on.
00:34:48.680 Right?
00:34:49.060 Nobody saw anything.
00:34:53.200 Nobody...
00:34:53.560 Who came into the room?
00:34:55.080 Who...
00:34:55.780 That would have been a crime scene.
00:34:58.400 Who had access to the room
00:35:00.620 to throw a noose inside?
00:35:03.400 My gosh.
00:35:04.700 There's a reason
00:35:05.500 why we don't believe the government.
00:35:06.940 There is a reason.
00:35:07.940 And it's this kind of crap.
00:35:09.900 You're listening to
00:35:10.820 the best of Glenn Beck.
00:35:12.340 Need a little more?
00:35:13.340 Check out the full show podcast
00:35:14.860 anywhere you download podcasts.
00:35:16.960 I have...
00:35:18.000 There's a lot to say
00:35:19.540 about Bitcoin
00:35:20.120 and why it's going down.
00:35:21.820 I don't fully understand it yet.
00:35:25.020 Had a long conversation
00:35:26.160 with somebody yesterday
00:35:27.040 and this is their deal.
00:35:29.260 And I need to fully understand it.
00:35:31.900 Hopefully, I'll have it by Monday.
00:35:34.360 Because if I understand it correctly,
00:35:36.720 that's a really big deal.
00:35:38.060 A really, really big deal.
00:35:39.820 So, we'll talk about that on Monday.
00:35:42.980 I'm just going to let...
00:35:43.840 I don't want you to worry
00:35:44.720 while Bad Bunny is on.
00:35:45.880 You know?
00:35:46.440 You got to be able to enjoy that.
00:35:49.500 Not a chance in the world.
00:35:52.320 Hope you're going to TPUSA
00:35:54.040 to their YouTube site
00:35:55.680 to watch the halftime show.
00:35:57.540 Anyway.
00:36:00.280 George AI is something
00:36:02.080 that I'm building
00:36:02.800 and people don't understand yet
00:36:04.200 because it's a year away
00:36:05.900 from completion.
00:36:07.080 But I want to give you
00:36:07.840 a piece of what...
00:36:10.100 The first thing
00:36:10.900 that's going to start coming out.
00:36:12.240 I may even release this example.
00:36:14.000 But first,
00:36:16.180 it's going to come out
00:36:17.120 you know, in text.
00:36:18.860 Then, it will come out
00:36:20.220 so you can ask it questions
00:36:22.840 and it will help you teach
00:36:24.660 or help you learn
00:36:26.060 and it will speak
00:36:27.260 in the language,
00:36:28.540 whatever language
00:36:29.140 around the world,
00:36:29.880 but also, you know,
00:36:31.980 the age-appropriate language.
00:36:34.120 This is proprietary.
00:36:35.460 This is not ChatGPT.
00:36:37.520 That's really important
00:36:38.480 to understand.
00:36:39.100 This is completely different.
00:36:40.360 The goal is to get you
00:36:44.140 to be able to talk to it
00:36:45.940 and be able to say,
00:36:47.080 hey, I need a lesson plan
00:36:49.300 to teach, you know,
00:36:50.800 the founding of America.
00:36:52.580 I need, you know...
00:36:54.020 I have my kids in the car
00:36:55.220 for 15 minutes a day
00:36:56.360 so I need a 12-minute lesson plan
00:36:59.260 every day
00:36:59.920 that has certain goals
00:37:00.980 and here are the goals
00:37:01.800 and it'll teach.
00:37:04.300 And then it will...
00:37:05.180 The next step is
00:37:06.040 it will listen
00:37:06.880 and ask questions at the end
00:37:08.660 and if your kids
00:37:10.240 aren't getting it,
00:37:11.300 it will then revamp
00:37:12.960 the next episode
00:37:13.960 so it will be able
00:37:15.600 to solidify
00:37:16.800 that lost principle
00:37:18.640 on there
00:37:18.960 before you really move on.
00:37:20.740 That's the goal.
00:37:21.860 This is why I am
00:37:22.960 building the torch.
00:37:24.200 This is why I'm asking you
00:37:25.260 to join me
00:37:25.960 at the torch
00:37:27.200 because it's very expensive
00:37:30.080 but it is worth it.
00:37:32.820 And I think this is going
00:37:34.900 to be an incredible tool.
00:37:36.140 Nobody is...
00:37:36.980 Nobody's where we are.
00:37:38.120 Nobody has access
00:37:39.320 to what we have
00:37:40.800 except for the federal government.
00:37:44.000 Anyway.
00:37:45.240 So let me give you an example.
00:37:47.020 We were just talking
00:37:47.760 to Erica, this mom
00:37:48.720 who was in Washington State.
00:37:50.480 She had to talk
00:37:50.920 to her 13-year-old daughter
00:37:52.100 and I don't know
00:37:53.260 how that went
00:37:54.120 but let's just say
00:37:55.280 you're in that situation.
00:37:56.840 What do you do
00:37:57.580 when you get home?
00:37:58.380 What do you do?
00:38:00.340 According to George AI,
00:38:01.940 this is how you do it.
00:38:05.380 Your child sees a protest.
00:38:08.120 When you're talking
00:38:09.760 to them,
00:38:10.100 don't start with
00:38:11.260 right or wrong.
00:38:12.320 Don't start with
00:38:12.880 who's right.
00:38:14.100 Start with
00:38:14.820 what are they trying to do?
00:38:17.020 And Erica said
00:38:17.720 her daughter didn't even know
00:38:18.820 what they were trying to do.
00:38:19.780 Okay, so let's just
00:38:20.580 talk about it.
00:38:21.120 Let's take it step by step.
00:38:22.480 What are they trying to do?
00:38:24.360 Okay, they're trying...
00:38:25.180 They think something's wrong
00:38:26.280 and they're trying
00:38:26.760 to make change happen.
00:38:28.020 Okay.
00:38:28.520 How does change happen
00:38:32.540 in America?
00:38:35.200 Does change happen
00:38:36.800 through protests?
00:38:38.920 Then walk through
00:38:40.740 how the change happens.
00:38:42.880 City council,
00:38:43.580 school board,
00:38:44.500 legislature,
00:38:45.680 courts,
00:38:46.540 you know,
00:38:47.240 Congress,
00:38:48.320 et cetera, et cetera.
00:38:49.200 And show them
00:38:50.260 that yelling
00:38:50.880 gets attention
00:38:52.140 but the process
00:38:54.080 is what actually
00:38:55.420 changes things.
00:38:57.340 Yelling just gets
00:38:58.080 people's attention.
00:38:59.740 And then it's critical.
00:39:00.980 The lesson is not to protest.
00:39:02.460 It's really important
00:39:03.660 in that first time
00:39:05.180 you're talking about this
00:39:05.900 is to say
00:39:06.560 they have a right to protest.
00:39:07.920 However, there are things
00:39:09.280 that you don't do
00:39:10.000 in protests.
00:39:10.960 Okay?
00:39:11.660 But the lesson
00:39:13.160 is not to say
00:39:14.400 protesting is bad.
00:39:15.640 It's that protest
00:39:16.600 without participation
00:39:17.640 is nothing more
00:39:18.800 than theater.
00:39:20.820 They didn't teach you
00:39:22.000 anything about the process.
00:39:24.160 Civics teaches patience,
00:39:25.700 not passivity.
00:39:30.460 Now, George said
00:39:31.900 teach about Jesus.
00:39:33.760 I'm going to add
00:39:34.660 MLK and Gandhi
00:39:36.000 because that's where
00:39:36.680 they got the principles
00:39:37.640 of Jesus,
00:39:38.340 which is reconciliation.
00:39:41.040 You ask your kid
00:39:42.200 how do you have
00:39:43.260 a country
00:39:43.880 if you can't
00:39:44.980 bring people together?
00:39:46.540 What happens
00:39:47.160 if we can never decide
00:39:48.860 do we have a country?
00:39:49.900 The answer is no.
00:39:51.080 So,
00:39:51.640 if we want to have
00:39:52.700 a country
00:39:53.240 we can't have
00:39:54.240 losers and winners.
00:39:55.960 We have to reconcile.
00:39:57.420 But reconcile
00:39:58.300 with what?
00:40:00.400 You got to reconcile
00:40:01.620 with the truth.
00:40:03.900 Okay?
00:40:04.360 So,
00:40:05.140 let's go through
00:40:05.820 the problem.
00:40:06.600 This is the problem.
00:40:07.100 This is the critical thinking
00:40:08.120 how to teach your kids
00:40:08.860 critical thinking.
00:40:13.380 Agreement
00:40:13.940 in a society
00:40:16.100 is rare.
00:40:17.080 Okay?
00:40:18.040 And reconciliation
00:40:19.220 is essential.
00:40:20.820 It has to happen.
00:40:22.020 If a society decides
00:40:23.240 that politics
00:40:24.400 produces winners
00:40:25.340 and losers
00:40:26.040 you know
00:40:28.140 it eventually
00:40:28.780 treats its citizens
00:40:29.760 as enemies
00:40:30.520 to be defeated
00:40:31.300 not neighbors
00:40:31.940 to be persuaded
00:40:32.680 and that's what's happening.
00:40:34.000 Everybody's an enemy
00:40:34.820 with one another.
00:40:35.900 That's not self-government.
00:40:37.160 That's a cold civil war.
00:40:39.280 So,
00:40:39.440 the question is
00:40:40.240 not
00:40:40.920 how do we win?
00:40:42.160 The question is
00:40:43.000 how do we get people back
00:40:44.980 using the truth
00:40:47.260 the whole truth
00:40:47.960 and nothing but the truth?
00:40:49.040 because reconciliation
00:40:50.980 without truth
00:40:52.220 is surrender.
00:40:54.560 Truth
00:40:55.080 without reconciliation
00:40:56.100 becomes cruelty.
00:40:59.660 So,
00:41:00.220 now you're sitting
00:41:00.760 and you've gone through
00:41:01.440 this with your kid
00:41:02.140 and now the catechism part
00:41:03.880 comes.
00:41:05.360 Teach by questions.
00:41:07.500 Do you even know
00:41:07.860 what reconciliation is?
00:41:10.740 Talk to him about that.
00:41:11.940 It's not compromise
00:41:12.780 with lies
00:41:13.920 or falsehoods.
00:41:14.920 It's the restoration
00:41:16.300 of the relationship
00:41:17.760 of a couple of people
00:41:20.980 after truth
00:41:22.320 has been spoken.
00:41:23.760 So,
00:41:23.960 what did Jesus do?
00:41:25.300 He didn't condemn everybody.
00:41:27.300 Did he condemn
00:41:28.140 or did he invite people?
00:41:30.660 And did he do it
00:41:31.720 with the truth?
00:41:33.700 I mean,
00:41:33.980 he used the truth
00:41:34.700 and then left the door open.
00:41:36.020 "Go and sin no more"
00:41:38.120 came after
00:41:39.400 "neither do I condemn you."
00:41:43.540 Did everybody follow him?
00:41:44.780 No.
00:41:45.460 A lot of people
00:41:46.000 walked away.
00:41:46.700 Reconciliation doesn't require
00:41:48.140 universal agreement.
00:41:49.320 Only honest witness.
00:41:52.160 So,
00:41:52.400 what's your
00:41:53.000 responsibility
00:41:54.020 as a citizen,
00:41:55.280 the civics part?
00:41:56.860 It's not to convert
00:41:57.820 everybody.
00:41:59.120 Yours is to
00:41:59.840 peacefully speak
00:42:00.800 the truth
00:42:01.440 with humility,
00:42:02.760 without contempt
00:42:03.640 for anyone
00:42:04.180 and without
00:42:04.900 any force.
00:42:07.100 Protesting
00:42:07.700 is legal.
00:42:08.680 You start using
00:42:09.560 force.
00:42:10.780 Now you start
00:42:11.540 to get into the place
00:42:12.400 where you're breaking laws.
00:42:13.600 And this is the key
00:42:15.700 principle.
00:42:16.220 This is where
00:42:16.520 republics live or die.
00:42:18.740 Jesus didn't
00:42:19.500 chase crowds.
00:42:21.320 He spoke to those
00:42:22.600 who could still hear
00:42:23.880 and that
00:42:24.340 matters.
00:42:25.280 Not everybody
00:42:25.780 is reachable
00:42:26.420 at the same moment
00:42:27.240 in history.
00:42:27.880 Some people
00:42:28.640 are hardened
00:42:29.260 or
00:42:29.800 intoxicated
00:42:31.600 by ideology
00:42:32.640 or
00:42:33.000 enraged
00:42:34.400 by winning,
00:42:35.460 invested in chaos,
00:42:36.600 whatever.
00:42:36.820 You don't
00:42:37.820 persuade those
00:42:38.620 people
00:42:39.000 with argument.
00:42:39.960 You persuade
00:42:40.420 them
00:42:40.800 only through
00:42:41.540 example,
00:42:42.760 consistency,
00:42:43.860 and time.
00:42:44.800 You don't
00:42:45.240 persuade all
00:42:46.120 of them
00:42:46.420 at once.
00:42:47.640 You don't.
00:42:48.700 That's why
00:42:49.400 it is so
00:42:50.040 important
00:42:50.460 to never
00:42:51.580 engage
00:42:52.200 in the kind
00:42:52.660 of stuff
00:42:53.060 that you're
00:42:54.180 seeing them
00:42:54.700 engage in.
00:42:55.620 Because
00:42:55.960 if you're
00:42:57.160 doing the same
00:42:57.860 thing,
00:42:58.200 then they don't
00:42:58.740 notice a difference.
00:42:59.600 Okay?
00:43:02.620 And here's how
00:43:03.420 you know
00:43:03.700 who to talk to.
00:43:05.380 Here's the test.
00:43:07.480 If someone
00:43:08.380 can still
00:43:08.900 ask a sincere
00:43:10.100 question,
00:43:11.000 what's a sincere
00:43:11.880 question?
00:43:13.120 A sincere
00:43:13.700 question is,
00:43:14.660 if I give you
00:43:15.480 the answer
00:43:16.120 and you go,
00:43:18.340 wow,
00:43:19.140 that makes
00:43:19.640 sense,
00:43:20.720 and it
00:43:22.060 disagrees
00:43:22.780 with what
00:43:23.400 you say
00:43:24.100 is causing
00:43:24.740 your behavior,
00:43:25.880 but you go,
00:43:26.800 that's true,
00:43:27.920 you've done
00:43:28.320 your homework,
00:43:28.940 that's actually
00:43:29.740 true,
00:43:30.700 will that
00:43:31.400 then change
00:43:33.320 you in
00:43:34.240 any way?
00:43:35.000 If I show
00:43:35.860 you that
00:43:36.400 that five-year-old,
00:43:37.780 that story
00:43:38.620 is not
00:43:39.260 what you
00:43:39.560 think it
00:43:40.080 is.
00:43:41.180 If I show
00:43:41.980 that to
00:43:42.380 you,
00:43:42.980 will you
00:43:43.640 say,
00:43:44.660 oh,
00:43:44.820 wow,
00:43:45.900 okay,
00:43:46.260 I better
00:43:46.460 question some
00:43:47.060 other things?
00:43:48.980 Or will you
00:43:49.760 say,
00:43:50.000 well,
00:43:50.100 it doesn't
00:43:50.460 matter,
00:43:51.180 they're doing
00:43:51.760 it anyway?
00:43:52.980 That's not a
00:43:53.720 sincere person.
00:43:55.340 If they won't
00:43:56.180 change their
00:43:56.820 behavior once you
00:43:57.760 speak truth,
00:43:58.520 then they're not
00:43:59.180 reachable.
00:44:00.260 If they can't
00:44:01.460 distinguish between
00:44:02.480 truth and power,
00:44:03.720 they're not ready.
00:44:04.840 Jesus called it
00:44:05.660 those who could
00:44:06.140 hear.
00:44:07.200 Let me give you
00:44:08.460 an analogy.
00:44:10.180 Think of truth
00:44:10.960 like a plumb
00:44:11.920 line on a
00:44:13.000 construction site.
00:44:13.800 Did you see
00:44:14.200 there's a video
00:44:14.780 going around
00:44:15.300 about these
00:44:15.900 skyscrapers in
00:44:18.180 China where
00:44:19.880 the walls are
00:44:21.020 coming apart
00:44:21.700 from the
00:44:22.140 floors?
00:44:22.820 They're
00:44:22.960 skyscrapers.
00:44:24.340 And you can
00:44:24.720 look,
00:44:25.340 you can put
00:44:25.780 your head on
00:44:26.280 the window and
00:44:26.860 look down because
00:44:27.740 they're separate
00:44:28.500 from all of the
00:44:29.520 floors.
00:44:30.500 Nothing is
00:44:31.200 straight,
00:44:31.540 not good.
00:44:34.120 When you put a
00:44:34.940 plumb line down,
00:44:35.880 that's to make
00:44:36.580 sure that
00:44:37.080 everything is
00:44:37.600 straight,
00:44:38.160 okay?
00:44:38.720 Gravity just
00:44:39.300 pulls that
00:44:39.780 straight and so
00:44:40.360 you know that's
00:44:40.940 a straight wall.
00:44:41.760 You don't bend
00:44:42.580 the plumb line to
00:44:43.480 match the crooked
00:44:44.140 wall and then
00:44:44.760 say, see, it's
00:44:45.400 straight.
00:44:46.160 And you don't
00:44:46.940 smash the wall
00:44:47.860 with the plumb
00:44:48.800 line either.
00:44:49.720 You just hang
00:44:51.100 it quietly.
00:44:52.000 You let everybody
00:44:52.960 see what a
00:44:53.500 straight line
00:44:53.920 actually is.
00:44:55.620 Some builders
00:44:56.440 will adjust.
00:44:58.380 Some might argue
00:44:59.480 that that line
00:45:00.180 is oppressive.
00:45:00.840 Some will walk
00:45:02.600 away.
00:45:03.760 But the
00:45:04.040 building that
00:45:04.680 survives is the
00:45:05.660 one that aligns
00:45:06.740 to the truth,
00:45:07.700 to the plumb
00:45:08.560 line, not the
00:45:09.900 one that wins
00:45:10.580 the argument.
00:45:11.320 You can argue
00:45:11.940 about that plumb
00:45:12.740 line all you
00:45:13.160 want.
00:45:13.380 Gravity is
00:45:13.880 gravity.
00:45:14.460 It's true.
00:45:15.780 And a republic
00:45:16.400 is exactly the
00:45:17.260 same.
00:45:19.640 One last
00:45:20.500 example.
00:45:22.280 Tell them a
00:45:22.820 story.
00:45:25.280 Imagine a
00:45:26.000 family sitting
00:45:26.660 at a dinner
00:45:27.900 table and half
00:45:29.840 of the family
00:45:30.400 is on one
00:45:30.900 side of the
00:45:31.180 dinner table
00:45:31.580 and the
00:45:31.840 other half
00:45:32.180 is on the
00:45:32.500 other.
00:45:32.740 One side
00:45:33.280 is wrong.
00:45:33.680 It doesn't
00:45:33.880 matter what
00:45:34.180 the topic
00:45:34.500 is.
00:45:34.840 One side
00:45:35.360 is wrong,
00:45:35.860 but they
00:45:36.540 don't know
00:45:37.080 it yet.
00:45:37.800 The other
00:45:38.400 side knows
00:45:39.320 the truth.
00:45:40.180 The other
00:45:41.420 side might be
00:45:42.100 tempted to
00:45:42.680 humiliate the
00:45:43.920 other side.
00:45:45.220 If the right
00:45:46.300 side declares
00:45:47.640 victory and
00:45:48.600 storms out,
00:45:49.440 the family is
00:45:50.040 lost.
00:45:50.820 If the side
00:45:51.560 that's wrong
00:45:52.460 is indulged,
00:45:54.200 the family
00:45:54.960 collapses into
00:45:55.940 lies and chaos.
00:45:57.380 So neither side
00:45:58.480 wins,
00:45:59.160 right?
00:46:00.820 Reconciliation
00:46:01.300 happens when
00:46:02.380 you, the one
00:46:03.340 person, stays
00:46:04.160 seated and
00:46:04.840 calm and says,
00:46:05.720 I'm not going
00:46:06.980 to lie
00:46:07.760 just to
00:46:08.580 keep the peace.
00:46:09.280 I will not play
00:46:09.940 this game with
00:46:10.640 you, but I'm
00:46:11.080 not going to
00:46:11.440 abandon you
00:46:12.060 either.
00:46:13.080 And when
00:46:13.720 you're ready,
00:46:14.880 I'm still here.
00:46:16.480 And the truth
00:46:17.400 will still be
00:46:18.060 true.
00:46:19.860 That's how
00:46:20.460 nations heal.
00:46:21.980 Slowly,
00:46:22.900 quietly,
00:46:23.820 with scars
00:46:24.880 that you don't
00:46:25.620 hide.
00:46:26.120 This is the
00:46:31.220 goal of the
00:46:32.900 Torch, to be
00:46:34.780 able to have
00:46:36.840 a tool that
00:46:37.420 you can trust
00:46:38.280 that's not
00:46:39.140 ChatGPT.
00:46:40.840 You trust that
00:46:41.840 at all?
00:46:42.360 I don't know
00:46:42.980 what's in
00:46:43.520 ChatGPT other
00:46:44.420 than everything,
00:46:45.740 and I don't
00:46:46.240 want everything
00:46:47.600 influencing.
00:46:49.760 So here's a
00:46:52.120 tool that is
00:46:52.700 only based on
00:46:53.940 the founders,
00:46:54.800 their words,
00:46:55.340 their beliefs,
00:46:56.120 their principles,
00:46:56.900 the things they
00:46:57.560 wrote.
00:46:59.320 Things that I
00:47:00.180 know, okay, I
00:47:01.000 can trust, that
00:47:01.720 cannot pull
00:47:02.840 anything outside,
00:47:04.020 can't pull
00:47:04.420 anything from
00:47:05.020 me, can't
00:47:06.640 pull anything
00:47:07.080 from the left.
00:47:08.440 It's just
00:47:09.500 their words.
00:47:11.880 And you can
00:47:12.700 ask it
00:47:13.200 questions.
00:47:14.360 How do I
00:47:15.160 teach this?
00:47:16.860 And you get
00:47:17.660 that answer
00:47:18.360 back.
00:47:18.720 and if you
00:47:21.700 can't do it
00:47:22.400 eventually,
00:47:23.180 hopefully in a
00:47:23.840 year, maybe a
00:47:25.240 year from now,
00:47:26.060 it will be able
00:47:27.680 to guide you
00:47:28.440 and the family.
00:47:30.080 You'll be able
00:47:30.780 to sit there and
00:47:31.500 you can ask a
00:47:32.240 question.
00:47:32.500 Wait, I don't
00:47:33.000 understand that,
00:47:33.700 George.
00:47:34.600 He'll explain it
00:47:35.420 again, then you
00:47:36.060 can take and
00:47:36.760 explain, and then
00:47:37.820 he'd correct you
00:47:38.560 if you're incorrect
00:47:39.640 or encourage you.
00:47:40.560 Yes, that's
00:47:41.120 exactly right.
00:47:41.880 Then ask
00:47:42.340 questions.
00:47:43.380 That's what we
00:47:44.120 have to do, ask
00:47:45.200 questions.
00:47:46.940 That's the best
00:47:47.980 way to teach, ask
00:47:49.260 questions.
00:47:52.920 That's why I've
00:47:53.620 been asking you to
00:47:54.240 join me at the
00:47:54.760 torch.
00:47:55.080 If you want to
00:47:55.480 help me build
00:47:56.000 this, it's
00:47:56.760 expensive.
00:47:57.280 I'm spending a
00:48:00.660 lot of money,
00:48:02.300 six figures every
00:48:04.040 single month, just
00:48:05.060 to build this
00:48:06.320 because I believe
00:48:07.320 in it so much.
00:48:08.740 And I don't
00:48:09.260 care.
00:48:10.080 I'll build it
00:48:10.980 by myself.
00:48:11.540 But I'd love
00:48:13.400 your help if you
00:48:14.640 want, if you
00:48:15.300 think it's worth
00:48:16.120 it.
00:48:16.760 It's $10 a month
00:48:18.040 to join the
00:48:18.560 torch.
00:48:18.980 You get all the
00:48:19.820 backstage stuff.
00:48:20.980 You get all the
00:48:21.580 bells and whistles,
00:48:22.820 everything else.
00:48:23.440 But this is what
00:48:24.460 I'm really trying to
00:48:25.280 do.
00:48:26.340 We're going to be
00:48:27.200 releasing music
00:48:28.060 soon.
00:48:29.080 My first 10
00:48:30.580 songs on the
00:48:35.420 Bill of Rights.
00:48:36.600 You remember
00:48:37.480 Schoolhouse Rock?
00:48:39.340 Similar to that,
00:48:40.500 but contemporary.
00:48:41.540 That you can just
00:48:42.420 play in the house,
00:48:43.280 play with your
00:48:43.740 kids, they can
00:48:44.500 listen to, and
00:48:45.700 they'll be singing
00:48:46.700 along to the five
00:48:48.660 in the First.
00:48:49.560 There are five
00:48:50.220 rights in the
00:48:50.840 First Amendment.
00:48:51.640 Nobody knows
00:48:52.100 that.
00:48:53.240 Let them just
00:48:53.960 sing along.
00:48:54.500 Just play it in
00:48:55.440 the house.
00:48:56.020 Let them just
00:48:56.560 sing along with
00:48:57.360 it.
00:48:57.500 How many times
00:48:58.020 do you sing
00:48:58.540 songs or have
00:48:59.460 songs running and
00:49:00.320 you have no idea
00:49:01.380 what the words are
00:49:02.000 about?
00:49:02.680 My idea is why
00:49:04.140 don't we make the
00:49:05.000 words help us
00:49:07.820 instead of hurt us?
00:49:08.920 Why not put words
00:49:10.120 in there that
00:49:11.040 are not goofy
00:49:12.960 and stupid, but
00:49:14.760 actually just
00:49:17.520 sound like a
00:49:17.980 normal song that
00:49:19.000 will teach history?
00:49:20.220 Do these things.
00:49:21.420 That's one of the
00:49:22.080 things that we're
00:49:22.520 working on.
00:49:23.040 Some of that's
00:49:23.600 coming out soon.
00:49:24.920 But join us at
00:49:25.920 Torch.
00:49:26.300 You just go to
00:49:26.880 glennbeck.com/Torch,
00:49:27.720 glennbeck.com/Torch,
00:49:28.700 and
00:49:31.020 join us.
00:49:31.760 With the RBC
00:49:35.100 Avion Visa, you
00:49:36.400 can book any
00:49:37.060 airline, any
00:49:38.100 flight, any
00:49:39.220 time.
00:49:40.120 So start ticking
00:49:40.880 off your travel
00:49:41.500 list.
00:49:42.660 Grand Canyon?
00:49:43.760 Grand.
00:49:44.660 Great Barrier Reef?
00:49:46.060 Great.
00:49:47.100 Galapagos?
00:49:48.200 Galapago?
00:49:49.680 Switch and get
00:49:50.560 up to 55,000
00:49:51.720 Avion points that
00:49:52.780 never expire.
00:49:54.240 Your idea of
00:49:54.980 never missing out
00:49:55.700 happens here.
00:49:57.260 Conditions apply.
00:49:58.420 Visit rbc.com
00:49:59.800 slash Avion.
00:50:01.760 We'll see you then.
00:50:05.260 .