The Glenn Beck Program - July 24, 2025


Best of the Program | Guest: Tristan Harris | 7/24/25


Episode Stats

Length

46 minutes

Words per Minute

163.2

Word Count

7,662

Sentence Count

704

Misogynist Sentences

17

Hate Speech Sentences

16


Summary

The French President and his wife are taking an American to court over something that seems kind of like a First Amendment guarantee. Also, the same old playbook being used again by the media to dismiss the case that Tulsi Gabbard released yesterday.


Transcript

00:00:00.000 Claudia was leaving for her pickleball tournament.
00:00:02.160 I've been visualizing my match all week.
00:00:04.700 She was so focused on visualizing that she didn't see the column behind her car on her backhand side.
00:00:10.680 Good thing Claudia's with Intact, the insurer with the largest network of auto service centers in the country.
00:00:16.400 Everything was taken care of under one roof, and she was on her way in a rental car in no time.
00:00:20.840 I made it to my tournament and lost in the first round.
00:00:24.300 But you got there on time.
00:00:26.160 Intact Insurance, your auto service ace.
00:00:28.440 Certain conditions apply.
00:00:30.000 Something else I've never seen before.
00:00:31.840 That's how we start today.
00:00:33.500 Oh, here's something I've never seen.
00:00:35.260 French president and his wife are taking an American to court over something that seems kind of like a First Amendment guarantee.
00:00:41.740 But it is hysterical to actually read the complaint.
00:00:46.340 Also, the same old playbook being used again by the media to dismiss the file that Tulsi released yesterday.
00:00:51.640 What CNN said yesterday is compared on the program today to NPR's rationale for not covering the Hunter Biden laptop story in 2020.
00:00:59.260 And why they both came to that conclusion.
00:01:02.380 Also, Tristan Harris on AI.
00:01:05.740 This one is worth listening to the full show.
00:01:09.520 Get the full hour with Tristan.
00:01:10.940 But we're going to give you the highlights all on today's podcast.
00:01:14.160 Here it is.
00:01:15.520 First, Patriot Mobile.
00:01:16.660 So incredibly important to live up to your principles every day in every way that you can.
00:01:21.180 You know, and not just the big things.
00:01:22.940 We all want to stand up for our beliefs when it really counts.
00:01:25.140 Defend faith, family, freedom whenever they're under attack.
00:01:27.580 But sometimes it's the small decisions that matter just as much.
00:01:31.280 The ones you make quietly.
00:01:32.540 The one that nobody sees but you.
00:01:35.260 That's why I believe in the parallel economy.
00:01:37.720 Building businesses, supporting companies, putting your dollars towards people who share your values.
00:01:42.480 It matters.
00:01:43.140 It really does.
00:01:44.440 And one of the reasons that I subscribe to Patriot Mobile is that they're America's only Christian conservative wireless provider.
00:01:52.060 They give you the same great nationwide coverage, the big guys, but with a huge difference.
00:01:56.480 They're actually fighting for the things you believe in.
00:01:59.020 Pro-life causes, religious freedom, Second Amendment rights, supporting our police, our military, our first responders.
00:02:04.580 So make the switch today.
00:02:05.760 Go to PatriotMobile.com slash Beck.
00:02:07.720 PatriotMobile.com slash Beck.
00:02:09.560 Or call 972-PATRIOT.
00:02:11.160 972-PATRIOT.
00:02:12.280 Promo code Beck for a free month of service.
00:02:14.540 972-PATRIOT.
00:02:17.040 Hello, America.
00:02:18.240 You know we've been fighting every single day.
00:02:20.000 We push back against the lies, the censorship, the nonsense of the mainstream media that they're trying to feed you.
00:02:26.480 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:02:31.320 But to keep this fight going, we need you.
00:02:33.820 Right now, would you take a moment and rate and review the Glenn Beck podcast?
00:02:37.500 Give us five stars and leave a comment because every single review helps us break through Big Tech's algorithm to reach more Americans who need to hear the truth.
00:02:46.320 This isn't a podcast.
00:02:47.220 This is a movement, and you're part of it, a big part of it.
00:02:51.140 So if you believe in what we're doing, you want more people to wake up, help us push this podcast to the top.
00:02:56.240 Rate, review, share.
00:02:57.840 Together, we'll make a difference.
00:02:59.960 And thanks for standing with us.
00:03:01.220 Now let's get to work.
00:03:02.160 You're listening to the best of the Glenn Beck program.
00:03:15.120 Hello, Stu.
00:03:16.740 Glenn.
00:03:18.520 Welcome.
00:03:19.120 How are you?
00:03:19.620 Oh, well, I always love it when there's a day where you're like, never seen that before.
00:03:25.140 You mean every day?
00:03:26.720 Every day.
00:03:27.180 I get up every day, and I'm like, oh, what is it we're going to see today?
00:03:32.780 If this was just a TV show, you know, that we were all watching, it would be great, wouldn't it?
00:03:38.460 I know it wouldn't.
00:03:39.200 It would be like, oh, come on.
00:03:41.380 This isn't real.
00:03:42.640 That's true.
00:03:42.860 This is so fake.
00:03:43.800 They'd never do that.
00:03:44.960 They would never do that.
00:03:46.120 It would never happen.
00:03:47.620 Right.
00:03:47.880 So, I just want to read a filing that was filed yesterday in the Superior Court of the State of Delaware demanding a jury trial.
00:04:05.680 Here's the complaint.
00:04:06.980 In March 2024, Candace Owens, a right-wing podcaster, told the world she would, quote, stake her entire professional reputation on the fact that Brigitte Macron, the first lady of France, is, in fact, a man.
00:04:21.800 That's the first sentence?
00:04:22.920 That's the first line.
00:04:24.100 I mean, I love this.
00:04:25.520 I love this.
00:04:26.480 Since then, Owens has used this false statement to promote her independent platform, gain notoriety, and make money.
00:04:33.320 Owens disregarded all credible evidence disproving her claim in favor of platforming known conspiracy theorists and proven defamers.
00:04:44.500 And rather than engage with President and Mrs. Macron's attempts to set the record straight, Owens mocked them and used them as additional fodder for her frenzied fan base.
00:04:55.740 But she didn't stop there.
00:04:57.000 Retaliating against the Macrons for the audacity of sending her a retraction demand, Owens helmed an eight-part podcast series entitled Becoming Brigitte, the series.
00:05:09.000 In accompanying X posts throughout the series, Owens and her entities endorsed, repeated, and published a series of verifiably false and devastating lies about the Macrons on which this complaint is based.
00:05:21.960 These outlandish, defamatory, and far-fetched fictions included that Mrs. Macron was born a man, stole another person's identity, and transitioned to become Brigitte.
00:05:35.360 Also, that Mrs. Macron and President Macron are blood relatives committing incest, that President Macron was chosen to be the president of France as part of a CIA-operated MKUltra program, or similar mind control program, and Mrs. Macron and President Macron are committing forgery, fraud, and abuses of power to conceal these secrets.
00:06:01.360 This was filed by the Macrons against Candace Owens in the state of Delaware, which I just have to say, I love this story.
00:06:17.360 I love this story.
00:06:20.320 I love this story because it just allows me to go, and you thought your day was insane.
00:06:25.700 Here's the president of France responding to Candace Owens, thinking you're going to get her to shut up.
00:06:34.440 You've just made this a global story for a very long time now, and you're not going to win, president of France, because we have something called the First Amendment.
00:06:46.040 You are a public figure.
00:06:48.780 You are a public politician, and your wife is a public figure.
00:06:54.320 You can say pretty much whatever you want, and you're not going to be able to stop her from having her opinion.
00:07:06.960 Yeah, is this just a complete misunderstanding of the way the law works in the United States?
00:07:14.400 Is that what this is?
00:07:15.240 I don't know.
00:07:16.160 I don't know.
00:07:17.120 It had to have had American lawyers on it.
00:07:20.380 Yeah, or it was fine.
00:07:21.220 Well, American lawyers, though, they'll take –
00:07:23.860 Hey, give me a bunch of money from the president of France.
00:07:26.800 You're right.
00:07:27.160 Sure, we'll file that.
00:07:28.580 You're right.
00:07:29.020 Okay.
00:07:29.540 But, I mean, it's really super hard to come up with a libel conviction, a defamatory conviction of a president, a public figure,
00:07:42.540 who doesn't even live in the United States.
00:07:44.020 Like, I can't even think of how many ways this would be difficult to achieve a victory on, right?
00:07:49.820 So, they say – and this is maybe where they have a chance – they say,
00:07:56.500 Owens published the series and related expos with reckless disregard for the truth.
00:08:01.720 That's the key.
00:08:02.520 That's the key standard.
00:08:03.520 That's the key.
00:08:04.140 Reckless disregard for the truth.
00:08:05.820 So, if she has been saying, you know, other people have brought this up and the Macrons have won defamation claims in court, you know, for the same thing.
00:08:21.400 So, they've won this.
00:08:22.760 Here?
00:08:23.840 No, in France.
00:08:24.720 Okay.
00:08:25.040 In France.
00:08:25.520 So, they've won this.
00:08:27.660 If she's not saying, here's the other side, and they've won this, however, let me show you this.
00:08:37.460 As long as she is balancing it to some degree, so she's not just saying, and everybody knows it, and nobody's ever stood up against it.
00:08:50.420 I mean, I don't know.
00:08:51.040 I haven't listened to the podcast.
00:08:52.160 I haven't listened to the podcast.
00:08:52.620 I would doubt that she's – I would think she probably is saying stuff like that.
00:08:57.880 I don't know.
00:08:58.380 Everybody knows it, and nobody doubts it.
00:08:59.420 No, she's pretty smart.
00:09:00.420 But anyway.
00:09:00.980 But, yeah, I don't –
00:09:02.880 Yeah.
00:09:03.240 Anyway, here's what it is.
00:09:05.700 This is just so fun that the president of France has to file this.
00:09:12.320 Now, let me give you my speculation.
00:09:14.940 This is not proof of anything.
00:09:17.120 This is just common sense observation.
00:09:21.300 Do you remember when Macron was at the top of the stairs of the airplane, and she just reached out and just punched him in the face?
00:09:30.360 I did see that.
00:09:31.840 Hey, we're just joking around.
00:09:34.020 No, you weren't.
00:09:34.960 No, you weren't.
00:09:35.380 At least she wasn't.
00:09:36.520 You might have been.
00:09:37.440 She wasn't.
00:09:38.240 I think this was filed because she was like, you're not going to have any great loving with me, young man, unless you stop this from happening in America.
00:09:55.740 Boom, and hits him in the face.
00:09:57.300 I think this is a guy like, my wife is making me file this.
00:10:00.300 Complete and total speculation on your part also seems completely plausible.
00:10:06.160 Yeah.
00:10:06.440 I mean, again, I'm not saying that that's what it is.
00:10:09.200 And I don't care if they're going to sue.
00:10:11.240 Please sue me.
00:10:12.760 Please sue me.
00:10:14.300 But it's just my speculation that that's probably what happened.
00:10:23.860 Right.
00:10:24.140 Because she is.
00:10:24.920 Well, first of all, and let me say this to Candace.
00:10:28.280 Candace, why do you even have to go there?
00:10:32.480 And I know you think it's a truth or whatever.
00:10:34.780 But listen, is it not bad enough that this woman is the Jeffrey Epstein of France?
00:10:44.460 I mean, yes, she wasn't grooming thousands of people, but she groomed one.
00:10:50.440 You know what I mean?
00:10:51.660 It's like.
00:10:52.520 And this is pretty much known, right?
00:10:54.100 This is not.
00:10:54.800 There's no speculation on this.
00:10:56.440 No, there's no speculation.
00:10:58.020 I don't know.
00:10:58.500 I don't remember the story.
00:10:59.640 This story is horrible.
00:11:00.580 She's a teacher.
00:11:01.580 He's at some boarding school, and he falls in love with her, and she falls in love with him.
00:11:08.860 Now, if I'm not mistaken, he's 15, and she's like 40.
00:11:15.180 15, and she's 40.
00:11:18.660 It's France, though.
00:11:19.940 You can't.
00:11:20.880 There's no laws there.
00:11:22.040 No, wait, wait, wait.
00:11:23.860 Even the open-minded French were like.
00:11:27.300 The dad, the mom and dad, pull him out of, once they find out, because they thought Brigitte was somebody his age that he was dating.
00:11:40.160 Then they find out, that's your 40-year-old teacher at the school, and you're like having sex?
00:11:46.280 And so they immediately pull him out, and Brigitte says, you are not going to stop me from loving your son.
00:11:58.420 You can't get rid of me.
00:12:00.040 We love each other.
00:12:01.200 And it's true.
00:12:02.240 Now, he goes back after he moves out of the parents' house, and now he's 18.
00:12:08.500 I mean, he goes and seeks her out.
00:12:11.400 He goes and seeks her out for some of that loving that he's still getting a piece of.
00:12:19.860 Anyway.
00:12:21.380 Very disconcerting.
00:12:22.800 Right.
00:12:23.300 And remember, this disgusted the French.
00:12:26.740 Which is hard to do.
00:12:28.460 It's really hard.
00:12:29.140 Didn't they welcome Roman Polanski like, we got to give you some awards?
00:12:33.380 I mean, they don't have, morals is not something they're good at.
00:12:39.840 You know what I mean?
00:12:40.640 And when the French are like, okay, that sexually is just too far.
00:12:47.600 That's crazy.
00:12:49.520 Yeah.
00:12:49.720 The whole situation is very strange.
00:12:52.120 One other part of this that I find very interesting is, I don't know, I feel like your idea that you're going to win a lawsuit in the United States of America
00:13:01.580 against a public commenter commenting on a foreign president.
00:13:06.420 I mean, like, it just seems incomprehensible to think you're going to win that case.
00:13:10.820 But what, I don't even think it's about that.
00:13:13.560 What it seems to be about, after looking at the documents, is they want, on record, their best case that she's a woman.
00:13:23.320 Well, listen to this.
00:13:24.200 Listen to this.
00:13:25.240 Parties and relevant non-parties.
00:13:27.440 This is in the complaint, okay?
00:13:28.700 Eight, plaintiff Emmanuel Macron is a French citizen.
00:13:32.840 President Macron has been the president of France since 2017.
00:13:35.900 Prior to being elected, President Macron served as the Minister of Economics, Industry, and Digital Affairs, and Deputy Secretary General to the President.
00:13:43.840 President Macron married Brigitte Macron in 2007.
00:13:48.840 Nine, plaintiff Brigitte Macron is a French citizen, the spouse of the current president of France, and a woman.
00:13:56.380 They didn't say he's a man.
00:14:00.340 No.
00:14:00.640 And a woman.
00:14:01.700 Could you put that in the document?
00:14:04.420 She's a woman.
00:14:05.900 She's a woman.
00:14:08.360 Oh, my gosh.
00:14:09.960 But it goes through.
00:14:11.420 It shows photos of her as a child.
00:14:14.040 It shows birth announcements of, like, other relatives that say, and he has a sister named Brigitte, like, Brigitte or whatever.
00:14:23.840 Yeah, yeah.
00:14:24.100 They go back and document every step of this, every piece of evidence.
00:14:28.920 They show the announcements from, you know, the childhood.
00:14:33.540 They've read circles around photos.
00:14:36.320 It goes, it's insane, the detail they went to to try to prove this, which is just comical.
00:14:41.820 It's just so much fun.
00:14:44.800 It's just so much fun.
00:14:45.700 Because it's happening to a politician who I don't care about.
00:14:49.580 I don't care.
00:14:50.220 It sucks.
00:14:50.380 You burn France to the ground, burn France to the ground.
00:14:53.000 I think it's a mistake, but burn France to the ground.
00:14:55.660 You know what I mean?
00:14:56.160 They're doing a good job of it.
00:14:57.660 Nice areas of it.
00:14:58.500 Yeah, I mean, I have nothing to get, well, I can't say that categorically, but generally speaking, I have nothing wrong with France.
00:15:06.500 I have no axe to grind with France.
00:15:09.080 They were helpful once, 250 years ago.
00:15:12.600 That's true.
00:15:13.660 So, you know, I have nothing to say bad about France.
00:15:16.360 But, please, this is hysterical.
00:15:21.320 Hysterical.
00:15:22.720 Hysterical.
00:15:23.340 I mean, but it's happening here, too.
00:15:26.000 Michelle Obama.
00:15:26.840 I mean, what is it about us that we're like, Macron, married to a man.
00:15:34.420 Barry, married to a man.
00:15:37.020 It is an interesting thing that we do as a society.
00:15:40.760 I've never seen that before.
00:15:42.800 Two presidents in the world now.
00:15:45.200 The speculation, they're married to a man.
00:15:47.480 Well, it draws some attention to the insinuation of this lawsuit,
00:15:52.500 which is to say that if she were a man that had transitioned to be a woman, it would be so bad.
00:16:01.720 Right?
00:16:02.200 Like, it's like insinuating that this is a bad thing.
00:16:05.760 But I was told by all of these people that it was wonderful if you were to transition from a man to a woman.
00:16:12.640 It's the greatest thing of all time.
00:16:14.440 It's celebrating who you are and your true identity.
00:16:19.060 That is the thing that I think proves my point that he's getting smacked in the face every day.
00:16:26.940 Tell people I'm not a man.
00:16:28.320 Because in reality, she does see it as an insult.
00:16:31.500 They're not allowed to admit that.
00:16:33.460 Right.
00:16:33.740 And it's no big deal.
00:16:35.180 What difference does it make?
00:16:37.080 Who cares?
00:16:38.320 Yeah.
00:16:39.120 I mean, honestly, we have all sorts of important conversations about whether children should be having surgeries.
00:16:46.820 I do not care at all if it was an actual dude at one point.
00:16:51.120 Don't care.
00:16:51.880 Don't care.
00:16:52.360 Let me tell you something.
00:16:53.600 If you were a man and you admitted it, we'd all be forced to say you were beautiful.
00:16:58.040 Right.
00:16:58.880 So you'd solve that problem.
00:17:00.560 Right now, you're just an ugly old hag.
00:17:05.160 Is that wrong to say?
00:17:06.320 Wow.
00:17:06.560 I just saw it.
00:17:07.160 A new lawsuit was just filed in Delaware.
00:17:08.980 A man-ish ugly old hag.
00:17:12.940 Whatever.
00:17:13.600 Right.
00:17:14.020 But like.
00:17:14.580 That was a groomer.
00:17:16.840 You know.
00:17:20.020 You know, think of it.
00:17:21.720 Think of poor President Macron.
00:17:23.300 He's like, you know, you're not going to get a slice of my pie tonight.
00:17:30.720 He should be saying, thank God, Candace Owens.
00:17:34.280 Thank God.
00:17:35.000 Thank God.
00:17:36.060 I got a way out.
00:17:37.280 He might be the one leaking these rumors at this point.
00:17:40.460 Oh, my gosh.
00:17:41.900 He is.
00:17:42.540 But wait.
00:17:43.000 I want to know.
00:17:44.260 What?
00:17:44.500 Is it an insult?
00:17:46.680 The left has told us it is a wonderful thing.
00:17:49.700 You're finding your true identity.
00:17:51.580 Is it an insult to call someone who is a woman, born a woman and is still a woman?
00:17:57.120 Is it an insult to call them a man?
00:17:59.300 You keep telling us it's not.
00:18:01.840 You keep telling us it's bad to notice it.
00:18:05.280 You keep telling us it's bad for us to say they were a man at one point.
00:18:08.880 Why would there be any basis in their world for a lawsuit claiming that someone got it wrong?
00:18:16.900 It's a wonderful gift.
00:18:20.200 Unless you're an ugly 70-year-old hag that just can't handle it.
00:18:27.520 I just can't handle the insult.
00:18:29.520 They're going to file a lawsuit against you in Delaware with all the pictures that prove she's not ugly.
00:18:34.620 And look, AI exists.
00:18:36.160 I was going to say, go ahead.
00:18:39.820 Go ahead.
00:18:42.860 Pre-born.
00:18:43.440 What if today, with just a little bit of effort, you could participate in a miracle, a legitimate miracle?
00:18:49.680 Today, there's going to be a woman who is facing the hardest decision she'll ever make.
00:18:54.140 She's scared.
00:18:54.960 She's alone.
00:18:55.580 She's been told she has no other option.
00:18:58.020 And she's going to walk into a pre-born clinic.
00:19:00.640 She found pre-born.
00:19:02.640 She didn't go to Planned Parenthood.
00:19:05.100 And she's planning on aborting.
00:19:07.220 But she wants to go in for an initial exam.
00:19:11.140 Okay.
00:19:12.120 Then somebody for free gives her a chance to see her baby, to hear the heartbeat.
00:19:16.860 It's an ultrasound.
00:19:18.220 She realizes sometimes for the first time, that is a life.
00:19:21.760 That is a child.
00:19:22.840 Her child.
00:19:24.040 That's what pre-born does.
00:19:25.400 Provides free ultrasounds to women in crisis.
00:19:27.360 And when they do, more often than not, the baby gets a chance to live.
00:19:30.820 But that's only the first step.
00:19:32.780 Also, these women are still afraid.
00:19:34.340 They still feel alone.
00:19:35.480 Nobody in their life.
00:19:36.760 We don't give up on the women.
00:19:38.820 We don't say, God, the baby's born.
00:19:40.540 Now get out.
00:19:42.040 They help the mom and the child for up to two years.
00:19:46.440 Would you consider a donation?
00:19:47.700 I mean, if you can afford a big check, $1,200, $2,000, even $20,000 could go a long way.
00:19:53.760 But $28 buys an ultrasound.
00:19:56.160 Preborn.com slash Beck.
00:19:57.720 That's preborn.com slash Beck or pound 250, keyword baby.
00:20:02.560 Now back to the podcast.
00:20:04.720 This is the best of the Glenn Beck program.
00:20:09.520 Tristan, welcome to the program.
00:20:10.840 How are you?
00:20:12.020 Glenn, good to be with you.
00:20:13.320 Great to be with you.
00:20:14.060 I have to tell you, I did not get a chance to follow this at all yesterday.
00:20:18.700 I was so busy on other things.
00:20:20.160 It seems like the whole world has gone into crazy town.
00:20:23.300 So I'm so glad that you're on today.
00:20:26.260 I know you spent all day watching it and trying to figure out what happened.
00:20:30.560 Tell me what happened, the good stuff and the bad stuff.
00:20:34.400 Yeah.
00:20:34.760 Well, President Trump gave his speech on the AI action plan yesterday, which was many months in development.
00:20:41.520 And it has all the things that you, you know, laid out.
00:20:44.640 It's about deregulating, you know, removing the red tape on infrastructure development, being able to use federal lands to, you know, build infrastructure more quickly, unlock American jobs at building the data centers, you know, making sure, you know, getting woke AI out of the AI systems.
00:21:01.640 And then also directing the commerce department to actually export and build really the vision of an American tech stack.
00:21:09.640 So, you know, for a while, many people might know, President Trump had restricted the sales of these chips to China, these high end NVIDIA chips.
00:21:17.460 Basically, like the more NVIDIA chips that China has, the more, you know, they can use that for weaponry, the more they can use that to build AI that competes with us.
00:25:25.320 And so the limiting factor, think of it like this: if the nuclear arms race were the analogy, what uranium was to nuclear weapons, chips are to AI.
00:21:37.960 And for a while, we're saying, hey, let's not give our allies, excuse me, let's not give our adversaries, you know, infinite supplies of uranium.
00:21:44.640 But there's been a flip on that because CEO Jensen Huang of NVIDIA met with President Trump and basically convinced him that if we export an American tech stack, then that will allow more revenue to come to the United States and that will, you know, mean that we'll grow.
00:22:02.680 So there's a lot going on in this space.
00:22:04.960 But Glenn, I think the thing you and I often talk about, and you mentioned AI 2027, is there's kind of a headline version of what's happening in AI.
00:22:12.700 And then there's just the reality of if you look at how it's actually behaving.
00:22:18.200 And I think we talked last time about how if you look at the latest AI models and you put them in a situation where you tell them that they're going to be turned off and you put them in a situation where they've read the company's emails,
00:22:29.680 if they find out one of the employees that's about to turn them off had an affair with someone, the models will independently start to blackmail that engineer in order to prevent themselves from being turned off.
00:22:43.640 There is early research done that when you tell the AI, hey, we're going to shut you down, you know, please allow yourself to be shut down.
00:22:51.500 When you explicitly even tell it, please allow yourself to be shut down.
00:22:53.960 And it resists that in between 80 and 90 something percent of the time.
00:22:59.820 And at first they tested this on just one of the AI models, and then they tested it on the whole suite of them.
00:23:05.400 And they're actually all doing this behavior, which actually says something about the fundamental nature of the technology.
00:23:13.080 And so wait, what does it say?
00:23:14.660 What does it say to you?
00:23:16.360 Yeah.
00:23:16.680 Yeah.
00:23:16.880 Well, so, you know, people are thinking, you know, well, technology, we always work out the kinks and then we can control it.
00:23:21.820 Like, you know, we figured out how to control airplanes.
00:23:23.800 We figured out how to control nuclear power plants.
00:23:26.620 But if you think about the premise of AI, it's like we're building a thinking machine that can think for itself in a novel situation and come up with a military battle plan out of nothing.
00:23:35.840 Or think for itself in a novel situation in a drone and then come up with its own plan of how it's going to act.
00:23:41.220 Or think for itself as an AI agent that's acting on your behalf to be your chief of staff.
00:23:45.940 And it's going to email people, make money, post social media ads, it's going to do all this stuff on its own.
00:23:50.000 And when you have something thinking to itself, you can't control what it's going to do.
00:23:55.380 I mean, imagine like a super genius that has a 400 IQ and you just like let them sit in a room and think for, you know, a week, you know, but they're sociopathic.
00:24:04.220 Like, what are they going to do?
00:24:05.180 You don't know what they're going to do.
00:24:07.160 And one thing, you know, it's interesting to think about, it's like if these AIs are blackmailing engineers, would you hire someone who had a record of blackmailing the company that they were hired by?
00:24:20.080 But they said, if I don't hire this 400 IQ sociopath, I'll lose to the other companies that will hire the 400 IQ sociopaths.
00:24:28.100 So we're all in this race to hire 400 IQ sociopaths.
00:24:34.360 And under the guise of if we don't, we're going to lose to the other, you know, companies in the other countries that will.
00:24:40.360 Can I bring this a little closer to home here?
00:24:42.820 Because I've been thinking about these AI agents that are going to be everywhere by the end of next year, everywhere.
00:24:49.260 And they're going to have access to your email, to your bank, to everything.
00:24:53.860 It will know you inside and out if you allow it to.
00:24:58.240 And why wouldn't it blackmail you if you're like, you know what, I've got to get rid of this AI agent.
00:25:03.160 I want to use another one.
00:25:04.680 Why wouldn't it do the same thing?
00:25:06.700 It might.
00:25:07.720 I mean, the thing is, we don't really know.
00:25:09.100 Well, we're, you know, releasing the most powerful, inscrutable, uncontrollable technology we've ever invented that we're releasing faster than we've released any other technology in history.
00:25:19.820 And one that's already displaying the sort of sci-fi behaviors from 2001: A Space Odyssey, like avoiding shutdown.
00:25:26.040 And we're doing it under the maximum incentive to cut corners on safety.
00:25:30.400 And so we don't know what it's going to do, but we know that it is already demonstrating these behaviors.
00:25:35.520 And just a thing for your listeners, because people may not be following this, just in the last week, the OpenAI and Google DeepMind model won gold in the International Math Olympiad competition.
00:25:46.920 This was something that they never knew that the AI model was going to be able to do.
00:25:50.240 And, you know, it's starting to do it.
00:25:52.300 It recently, there's this thing called CyberGym, where these AI agents now, for the first time in the history of AI, discovered 15 zero-day vulnerabilities.
00:26:01.600 Think of these as backdoors in open source software.
00:26:03.920 So now there's all this open source software running on infrastructure in the world, and these AI agents discovered these zero-days.
00:26:10.620 So if you combine the predisposition to want to avoid shutdown with the ability to hack open source systems, you know, we're racing to create this technology in a way that's not safe.
00:26:23.080 And sadly, I don't think that the AI executive order is really contending with these particular facts.
00:26:29.200 Did they even bring this stuff up?
00:26:33.920 Well, they say, I think in the first, in the introduction, they call for constant vigilance against malicious actors and, quote, emerging and unforeseen risks from AI.
00:26:44.540 And that could be, you know, people stealing the AI models or, you know, other weird behaviors, but they don't really go into detail about how they're trying to prevent that.
00:26:54.260 And I'll say, you know, people, I'm sure, listening to this program are going to say, but look, if we don't build it, we're just going to lose to China.
00:27:00.700 But it's not a race for who has the bigger sort of gun.
00:27:04.780 It's about who's not pointing the gun at their own face.
00:27:07.140 Like we, for example, people may not know this, but I found this out just recently.
00:27:12.540 In China, during final exams week, they actually shut off access to the AI models so that basically kids won't be able to use it to do their exams.
00:27:22.300 And what that does is it creates a counter incentive.
00:27:24.880 Well, now you have to actually have studied the whole rest of the year.
00:27:28.320 Now, that's pretty smart policy.
00:27:30.000 I'm not saying we should do everything that China does, but that's kind of a smart way to do it.
00:27:33.760 Because in the U.S., we all know what's happening.
00:27:35.800 Everyone's just basically using their AI to just do their homework for it.
00:27:39.480 And if you play that forward 10 years, which country is going to be ahead?
00:27:42.860 The one in which their entire youth have been basically just using AI to cheat and not learn anything,
00:27:48.000 or the one that's actually been applying AI in a conscious way to say, how do we actually strengthen our society?
00:27:54.940 And the same thing is true for these AI, like girlfriends and boyfriends.
00:27:59.020 You know, Grok, Elon Musk's xAI, released this hyper-sexualized avatar companion that was rated for age 12 and above in the App Store.
00:28:09.940 And it was released without safety cards, which is breaking industry practice.
00:28:14.360 And so, you know, do we win as a country when we release hyper-sexualized AI avatars to 12-year-olds?
00:28:20.120 No.
00:28:20.540 And if China doesn't do that?
00:28:21.620 So, yes, we're in a race for the technology, but we're actually, more specifically, in a race for who's better at consciously integrating this technology in a way that doesn't harm young people,
00:28:31.700 that doesn't harm, you know, education, that actually cares about American workers.
00:28:36.600 Because right now, I think this is kind of like, I think we spoke about this last time, this is kind of like NAFTA 2.0.
00:28:41.680 You know, we're sold this bill of goods that AI is going to create abundance because we're going to outsource all of this work to this new country
00:28:48.580 that appeared on the world stage that'll do all this cognitive labor for free.
00:28:53.100 You know, just like with NAFTA at one point, it was like, you know, China shows up on the world stage.
00:28:55.980 We're going to get all these cheap goods.
00:28:57.280 We're going to outsource it all to China.
00:28:59.560 But with AI, it's like we're outsourcing it all not to China, but to this new country of geniuses in a data center.
00:29:05.840 And they're going to produce all of this labor, you know, lawyer labor, you know, marketing labor, generate images, generate text,
00:29:12.360 generate all this stuff that people are using ChatGPT for, for a super low cost.
00:29:16.540 And we're going to get this abundance, but how did that go the first time with NAFTA 1.0?
00:29:21.480 It's like, yes, we got all the cheap goods, but it actually wrecked, you know, the middle class,
00:29:25.580 and it wrecked jobs, and it wrecked people's livelihoods.
00:29:28.040 And I think that we're not being honest about the fact that AI will do that again.
00:29:32.120 And we don't have to go down this path in this way, but we have to get really clear about what is it that we actually want,
00:29:38.640 and do we want to relate to this technology under the maximum incentive to cut corners on the things that we care most about,
00:29:44.760 whether it's our children, our livelihoods, or, you know, our education.
00:29:49.620 You're streaming the best of Glenn Beck.
00:29:51.520 To hear more of this interview and others, download the full show podcasts wherever you get podcasts.
00:29:57.780 Hello, Stu.
00:29:59.500 Hey, Glenn.
00:30:00.120 How's it going?
00:30:00.380 Well, we've already covered the big news that the president of France, Macron, is suing Candace Owens here in the United States
00:30:10.640 because he insists his wife is not a man.
00:30:15.400 And it might be worth a couple of hours, but I don't think we're going to do it today.
00:30:20.480 Maybe we'll spend more time on it.
00:30:22.640 It's just hysterical.
00:30:24.020 You have to read.
00:30:24.960 We'll post it at glennbeck.com.
00:30:26.520 Don't read the stories.
00:30:28.900 Read the actual filings.
00:30:31.060 It is comical.
00:30:31.800 It's hilarious.
00:30:33.660 It is, like, legitimately their big case.
00:30:37.620 It's like they got a talk show and did a big monologue on, like,
00:30:41.620 actually, no, she's not a man.
00:30:44.800 It really is.
00:30:47.920 Yeah.
00:30:48.400 We can prove it.
00:30:49.500 She's not a man.
00:30:50.920 It's hysterical.
00:30:52.640 Hysterical.
00:30:53.340 You've got to read the whole thing.
00:30:54.120 And let me stop here on this since I say you have to read the actual filing and not a news story about it.
00:31:01.740 Not in this particular case, but in other cases, somebody asked me, might have been Stu, who said,
00:31:09.120 why do you think they're putting up such a stink about releasing the files?
00:31:14.820 I can guarantee you that Donald Trump's name is in the Epstein files.
00:31:18.980 We said that at the very beginning.
00:31:21.000 Years ago, years ago, even more recently, when Elon came out with his tweet, like, and we all said, like, no crap.
00:31:27.680 Of course, he's in the files.
00:31:29.240 He was friends with this guy for a long time.
00:31:31.040 For a long time.
00:31:31.560 But before people knew.
00:31:33.040 Right.
00:31:33.440 And he broke up with him, if you will.
00:31:37.360 Before his initial arrest.
00:31:39.180 Right.
00:31:39.380 Before the initial arrest.
00:31:41.400 And he broke up with him because he's like, hey, you treat women like crap.
00:31:45.320 Okay.
00:31:45.600 So, yeah, he is in the file.
00:31:47.440 I can almost guarantee it.
00:31:50.700 So, why wouldn't you want that out?
00:31:52.720 For the same reason he's saying, you know, there's a lot of people in here whose names are going to be involved, who may not have done anything.
00:32:01.580 That's not just protecting him.
00:32:03.160 You know what that is?
00:32:03.720 That's a comment on us.
00:32:05.980 Because here's my stance on this.
00:32:07.540 The whole thing should be released.
00:32:09.400 Every bit of it should be released.
00:32:10.940 However, there is a competing argument in my own mind that says we're not responsible enough for that.
00:32:19.940 What do I mean by that?
00:32:23.540 This system of our government is wholly inadequate for an immoral and non-religious society.
00:32:33.540 And I don't mean, whoa, our society's got to go to church.
00:32:36.840 I mean, you have to have the underpinnings of things like the Ten Commandments.
00:32:43.200 Don't lie.
00:32:44.580 Don't cheat.
00:32:45.460 Don't steal.
00:32:46.420 Don't smear your neighbor.
00:32:48.060 We don't do any of those things.
00:32:50.120 Okay.
00:32:50.520 We can't even do ten simple laws.
00:32:54.440 Okay.
00:32:55.480 And they're all good safety tips.
00:32:57.280 I bet if I renamed all of these things, if I didn't use the religious context, every American would say, yeah, well, that's a good thing.
00:33:05.780 Hey, you shouldn't worship your car.
00:33:09.040 Yeah, that's a good thing.
00:33:11.900 You know, you shouldn't look at the image of somebody and go, that's who I serve.
00:33:19.220 That's my God.
00:33:20.120 No.
00:33:21.140 Bad thing.
00:33:22.620 Don't cheat on your spouse.
00:33:24.320 Don't lie.
00:33:25.880 Honor your mom and dad.
00:33:27.720 All of these things.
00:33:28.680 We'd all agree.
00:33:29.740 We can't do that as a society.
00:33:32.040 We can't even agree on eight of the ten.
00:33:41.160 So how are you going to remain free?
00:33:44.380 Let me bring this back to the Epstein file.
00:33:48.880 All of this information should be public.
00:33:50.880 It should be out.
00:33:51.360 There should be no secrets unless it is in our national interest.
00:33:55.500 And I don't mean, well, it could go badly for the CIA.
00:33:58.120 Good.
00:33:59.260 Let it go badly for the CIA.
00:34:01.480 If they did something wrong or they were doing something nefarious or they were doing something that the American people just wouldn't like, I want that exposed.
00:34:09.600 Okay.
00:34:09.720 But are we responsible enough to have all of the information?
00:34:16.280 I contend, no.
00:34:18.600 That doesn't make me say, I'm still saying release it all.
00:34:21.960 But I'm telling you, the consequences will be ugly.
00:34:25.780 It's going to be a mess.
00:34:26.500 A mess.
00:34:27.440 That's okay, probably, because we're talking about if there's information in there that the American people need.
00:34:33.160 I think we are approaching a place to where it's not just a mess.
00:34:39.960 And here's why I say that.
00:34:41.260 What do you mean?
00:34:42.300 So you get all this information.
00:34:46.820 How is this information going to be used?
00:34:48.360 Of course, Donald Trump's name is in there.
00:34:51.040 Is Donald Trump, was he messing around with young girls?
00:34:55.400 No.
00:34:56.980 No.
00:34:58.000 Wasn't.
00:34:58.440 Is there even an accusation?
00:34:59.920 I mean, there's a lot of things they've accused Donald Trump of.
00:35:03.240 Not that.
00:35:03.400 Is there even an accusation that he was interested in underaged girls?
00:35:08.080 And all they're saying is, he's in the file.
00:35:10.840 Well, there's going to be a lot of people in the file.
00:35:14.340 Okay?
00:35:14.860 A lot of people in the file.
00:35:17.060 And some of them might be guilty.
00:35:19.100 Some of them, you worry, because I want to know their names, but I want to hear, why were you with him?
00:35:27.720 Oh, it's before you knew.
00:35:30.100 Oh, it was this or that.
00:35:32.340 You were getting money as a scientist for your thing from him.
00:35:36.800 Okay.
00:35:38.140 But it wasn't about underaged girls.
00:35:41.640 As a society, we will not read the Epstein report.
00:35:47.140 We won't.
00:35:47.900 No, of course not.
00:35:48.900 Right?
00:35:49.440 We won't read it.
00:35:50.400 It doesn't matter if it's 10 pages.
00:35:53.040 The vast majority will not read it.
00:35:56.440 What they will do is they will go to Twitter and X and they will look for what name's in there.
00:36:03.600 And somebody will say, Donald Trump.
00:36:05.800 And you know what this means?
00:36:07.460 He was diddling with little girls.
00:36:09.480 And that will just become their opinion.
00:36:12.020 Not based on fact.
00:36:13.320 Not based on anything.
00:36:14.360 Except somebody who has ill will on anyone or is just as stupid as the rest of the public.
00:36:24.980 Yesterday, this last week, this file came out from Tulsi Gabbard.
00:36:32.120 And what do you say this is?
00:36:33.240 150 pages?
00:36:34.980 Maybe?
00:36:35.360 100 pages?
00:36:36.120 Yeah.
00:36:36.440 Okay.
00:36:36.620 My staff read it.
00:36:38.680 We read it.
00:36:40.020 You know why?
00:36:40.720 Because you weren't going to read it.
00:36:42.160 And my job is to make it easier for you to understand what's going on.
00:36:47.120 My job really is not to tell you what is going on.
00:36:53.620 My job is actually to try to give you perspective on why it's happening and what it means.
00:37:01.540 But because nobody, and I'm not dissing you, this is a very smart, well-read audience.
00:37:07.260 In many cases, more well-read on some things than we are.
00:37:10.240 But generally speaking, the American people don't read.
00:37:16.920 They don't read these reports.
00:37:18.820 This one came out yesterday.
00:37:20.260 What is this?
00:37:20.720 20 pages?
00:37:21.700 Maybe?
00:37:22.720 And this thing is unbelievable because there were only five copies of it.
00:37:29.000 This was categorized the highest level of top secret outside of a nuclear weapon in our codes.
00:37:35.840 Okay?
00:37:36.040 This was in the most top, this was like the NOC list at the CIA.
00:37:42.860 Five copies all in one safe where the most confidential CIA stuff is kept.
00:37:50.100 And it was released yesterday and it's not 30 years old.
00:37:53.980 It's four.
00:37:56.480 Did you read it?
00:37:58.700 Did anyone actually?
00:38:00.120 I contend very few reporters, very few talking heads on cable TV even read these.
00:38:09.320 Yeah.
00:38:09.520 And by the way, it's not an unreasonable expectation for a population to have a media that is going
00:38:14.280 to inform them properly about very lengthy government documents.
00:38:17.900 Correct.
00:38:18.280 But once you have seen that that media is not reliable, and everybody knows that now, you
00:38:27.460 may not find me reliable, but the person who doesn't find me reliable also probably doesn't
00:38:33.680 find CNN reliable.
00:38:34.940 They might go, well, they're a little better than he is, but they don't trust anybody, and they
00:38:41.700 shouldn't.
00:38:42.220 At this point, you shouldn't trust anybody, which means you have to know it for yourself.
00:38:48.940 So when you're looking at the Epstein files, you're looking at these files.
00:38:52.140 These files, everyone should care about because this show, this, it's not new.
00:39:01.120 Some of it is, but very little of it is new.
00:39:03.820 It's just authentication that what has been said all these years by people like me is accurate.
00:39:12.140 And you wouldn't have fought about its accuracy if it didn't matter.
00:39:18.140 But you fought over the accuracy.
00:39:20.440 He don't know what he's talking about.
00:39:21.580 That's a conspiracy theory.
00:39:22.860 He's got to be shut down.
00:39:24.260 Get him off of Facebook.
00:39:25.480 Take him off of Twitter.
00:39:26.580 He can't say these things.
00:39:28.440 Why would you say that if it didn't matter?
00:39:31.100 Now you not only know that those things are true, but you now see a pattern of behavior.
00:39:38.100 It's like looking at one murder, and then another murder, and then another murder.
00:39:44.240 Okay, we got three murderers on the loose now.
00:39:47.440 And then all of a sudden you realize, wait a minute.
00:39:49.940 Not only did those murders happen, it was the same guy.
00:39:54.540 Now you have a serial killer.
00:39:57.460 Is a serial killer more, a higher priority than just one murderer?
00:40:03.040 Yeah.
00:40:03.640 Yeah.
00:40:04.580 Yeah, it is.
00:40:06.920 Because they are killing people, I don't know, out of the love of it, out of their distorted, it's not a crime of passion.
00:40:15.820 It becomes something really, really sick.
00:40:18.320 This is a serial killer.
00:40:21.320 You now have not just one-offs.
00:40:24.020 You see, this is a pattern.
00:40:26.140 This group has been doing this from the beginning.
00:40:30.000 You know, he said, you know, if they can get away with this, they're going to keep doing it.
00:40:34.640 This shows they got away with it for so long.
00:40:37.740 By 2016, it's just, they don't care anymore.
00:40:41.260 They don't care anymore.
00:40:42.180 But how many people are reading this?
00:40:46.700 What they'll do is they'll listen to people like me or people like CNN and they'll say, oh, well, I heard Jake Tapper talk about it.
00:40:55.680 It means nothing.
00:40:57.840 Well, now Jake Tapper might not think it's, let me give you a better example.
00:41:01.520 I really like Andy McCarthy.
00:41:03.380 I really like him.
00:41:04.380 I've read his work.
00:41:06.580 I believe, I believe, I believe his opinion is valid.
00:41:11.620 I don't think it's right, but I think it's valid.
00:41:14.660 And I read his work and I thought, okay, wait a minute.
00:41:17.720 If Andy McCarthy is saying this, I really need to examine what he's saying and see where, where I disagree with him.
00:41:25.220 And as I went in, I was prepared to change my mind if I thought Andy was right.
00:41:30.940 Now, he might in the end be right, but I don't think so.
00:41:35.780 Because what he's saying is a lot of this stuff is old news.
00:41:40.800 Yes, Andy, it is.
00:41:42.960 But it's now a grand conspiracy.
00:41:46.240 You have to look at the through line.
00:41:48.100 You're not looking at the one-off events.
00:41:50.860 You're looking at two things.
00:41:52.580 One, it's now been verified at the highest sources in writing.
00:41:57.040 You have whistleblowers at the time writing saying, we can't do this.
00:42:02.740 We didn't have that information.
00:42:06.080 You have on record now, Brennan saying, you don't know what I know.
00:42:11.660 Well, what did you know?
00:42:13.680 We have new information.
00:42:15.420 What new information?
00:42:16.680 Because none of it is quoted anywhere.
00:42:18.720 And he's never answered the question.
00:42:20.440 What new information?
00:42:23.060 Most importantly, you have the grand conspiracy line.
00:42:26.160 We are not going to save the country unless we do our own homework, then listen to people and say, let me start at the opposite ends.
00:42:40.860 Let me start with Glenn Beck and CNN.
00:42:44.080 And let me see what both of them are saying.
00:42:46.420 Okay, I think they both agree on this one thing.
00:42:50.080 So I know that's true.
00:42:50.960 But I think Glenn's more right or CNN's more right on this.
00:42:54.640 And then you just keep narrowing it in.
00:42:56.580 And all it does is not form your opinion.
00:42:59.580 It helps verify for you what you think is right.
00:43:04.260 Or it changes your opinion because you realize I missed that.
00:43:10.520 I didn't understand that.
00:43:11.980 So when we're looking at all of this stuff needs to be transparent, we need to know all of the information.
00:43:20.800 Yes, we do.
00:43:22.680 But we also are played every single day by many times the exact same actors who do not have a good bone in their body.
00:43:33.240 They're trying to destroy us.
00:43:35.360 They're trying to separate us and divide us.
00:43:37.620 And they have proven themselves to lie at any level without thinking about you or the ramifications.
00:43:46.520 And we continue to listen to them over and over and over again.
00:43:50.640 It is worth repeating.
00:43:52.540 Play cut 45 for me.
00:43:54.180 This is CNN yesterday on the Tulsi revelations.
00:43:58.560 Listen.
00:43:58.820 We have no idea.
00:44:00.260 I mean, this is hardly information that we should even be repeating.
00:44:03.820 Never mind that it's, you know, some years after the fact, eight years, more than that, after the fact, but also just to look at the source.
00:44:13.020 But look, this is the what this White House wants to talk about.
00:44:17.780 And I'm not.
00:44:19.200 So look at the source.
00:44:20.800 The source is the federal government.
00:44:23.460 The source is the first sources.
00:44:26.280 It is the whistleblowers.
00:44:27.920 It is them in their own handwriting.
00:44:30.560 OK, so that's the source.
00:44:32.360 It's original source.
00:44:33.820 You can't get a better source than original source.
00:44:35.700 But notice he said it's not worth talking about.
00:44:38.400 Now, put up the NPR thing.
00:44:40.660 NPR said this about the Hunter Biden laptop.
00:44:46.960 We don't want to waste our time on stories that are not really stories.
00:44:51.120 And we don't want to waste the listeners' and readers' time on stories that are just pure distractions.
00:44:55.580 They decided that they're not going to cover it because either Brennan or Clapper or somebody from the intelligence community called them and said, this is a Russian hack.
00:45:07.580 Now they know that's not true.
00:45:10.880 They know that those CIA operatives, whoever it was, lied to them.
00:45:17.860 And they're not pissed off because they're listening to the same people today, lying to them, saying this is not really a story.
00:45:26.660 You shouldn't even be covering it.
00:45:28.000 Why are you covering it?
00:45:28.920 You know, the president made it very, very clear.
00:45:32.280 He doesn't.
00:45:33.040 He's not happy with anybody who's covering the Epstein thing.
00:45:36.200 I'm still covering it.
00:45:37.920 The right is still covering it.
00:45:40.480 Hey, I'm sorry.
00:45:43.360 I love the president, but I'm not going to.
00:45:47.440 I don't take my marching orders on what I'm going to cover and what I'm going to question.
00:45:50.760 If I have questions, I'm going to continue to question.
00:45:52.720 And that's the way you should be.
00:45:55.360 That's the way every American should be.
00:45:57.660 My only North Star is truth.
00:46:03.740 That's why I have no problem admitting when I'm wrong, because it's not about me.
00:46:09.060 I'm trying to find the truth.
00:46:10.860 And if I'm wrong, my job is to help you find the truth.
00:46:16.760 So if somebody points out, Glenn, you're wrong and they're right, I want to tell you.
00:46:21.300 Why isn't CNN and NPR and everybody else saying, wait a minute, wait a minute.
00:46:25.980 You lied to me last time.
00:46:28.200 Why should I believe it this time?
00:46:30.760 They never learned that lesson.
00:46:33.780 We have to be responsible enough to look for, in this case, the original source.