The Glenn Beck Program - March 27, 2026


Best of the Program | Guests: Gene Hamilton & Tristan Harris | 3/27/26


Episode Stats

Length

44 minutes

Words per Minute

165.04

Word Count

7,356

Sentence Count

316

Misogynist Sentences

7

Hate Speech Sentences

14


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.640 Now streaming on Paramount+.
00:00:03.600 My center, my soul is gone.
00:00:06.240 From Academy Award nominee Taylor Sheridan.
00:00:09.680 Mine is not a family designed to withstand tragedy.
00:00:13.440 Starring Academy Award nominee Michelle Pfeiffer
00:00:16.720 and Golden Globe nominee Kurt Russell.
00:00:19.360 The worry is what you do next.
00:00:21.840 You will have as much life to live as you allow yourself.
00:00:25.040 The Madison.
00:00:26.480 New series now streaming only on Paramount+.
00:00:30.000 Hey, it's the Friday podcast. You're not going to believe Gene Hamilton is on with us, and he's
00:00:34.540 talking about something that came out a couple of weeks ago, but most people just missed. And
00:00:37.460 I really think we need to pay attention to this. The CIA under Biden was labeling, had a full
00:00:44.480 program labeling mothers and those who believed in, you know, being a mom and having more
00:00:52.600 children, as terrorists. It's an amazing thing. And, you know, I wonder how much is still left
00:01:00.480 that we don't know about. But thank God this program has been exposed and is out, uh, thanks
00:01:06.780 to Tulsi Gabbard and the president. Um, also, um, this week, 10 years ago, I answered a
00:01:13.640 question from somebody: what will the world look like in 10 years? How close was I? It's pretty
00:01:20.740 shocking, I think. And a review of a movie with Tristan Harris. It is called the AI movie,
00:01:30.600 the AI doc, actually. You need to see Hail Mary. It's a great fun-for-everybody-in-the-family
00:01:37.040 kind of movie. This one, you need to bring everybody you know to. And believe me, you go
00:01:42.920 with a pack of friends to this, The AI Doc, you are going to have the most fascinating conversation
00:01:49.000 at dinner after the movie. It opens this weekend, The AI Doc. You must see it. At least catch the
00:01:55.480 podcast here with Tristan Harris telling you all about it. Here it is. Let me tell you about Relief
00:02:02.020 Factor. If you've ever had one of those days where the, you know, pain isn't sharp or dramatic, but it's
00:02:07.740 just there, not enough to stop you, but enough that you feel it in everything. And you're like, oh, geez,
00:02:12.180 you know, sitting down, standing up, walking across the room, not walking across the room. You know,
00:02:17.720 your body is just a step behind the whole time it feels like.
00:02:20.320 And so you adjust and you change how you move,
00:02:22.420 how you sit,
00:02:23.020 how you plan, and you move little by little every day.
00:02:25.680 And it just becomes normal.
00:02:27.060 And that is the part where it's easy to miss.
00:02:29.600 When your pain becomes normal,
00:02:31.060 you stop asking whether it has to be normal.
00:02:34.020 Relief factor was created for that kind of pain.
00:02:36.540 It's a daily drug-free supplement that goes after the inflammation,
00:02:39.760 which causes most of our pain and most of our disease in our body.
00:02:42.900 And here's what matters.
00:02:44.180 Over a million people have tried Relief Factor, and two-thirds of them have
00:02:47.060 gone on to take it month after month, and I'm one of them. If you've been listening to me talk about
00:02:51.600 Relief Factor for a while, maybe this is the time. It's your moment. Take it. Take it. You're going to
00:02:57.380 get the advanced formula, designed to work even smarter now. Start with a three-week QuickStart
00:03:01.740 for $19.95 at relieffactor.com. Three-week QuickStart, $19.95. Try it for three weeks. See if it
00:03:08.900 doesn't make a difference for you. 800-4-RELIEF. relieffactor.com. Hello, America. You know, we've been
00:03:15.600 fighting every single day. We push back against the lies, the censorship, the nonsense of the
00:03:20.800 mainstream media that they're trying to feed you. We work tirelessly to bring you the unfiltered
00:03:25.860 truth because you deserve it. But to keep this fight going, we need you right now. Would you
00:03:31.260 take a moment and rate and review the Glenn Beck podcast? Give us five stars and lead a comment
00:03:36.060 because every single review helps us break through Big Tech's algorithm to reach more Americans who
00:03:42.000 need to hear the truth. This isn't a podcast. This is a movement and you're part of it, a big part
00:03:47.460 of it. So if you believe in what we're doing, you want more people to wake up, help us push this
00:03:51.560 podcast to the top. Rate, review, share. Together, we'll make a difference. And thanks for standing
00:03:57.360 with us. Now, let's get to work. You're listening to The Best of the Glenn Beck Program. This is insane.
00:04:13.040 We found out, I think it was last week, um, there was a war on motherhood by the CIA.
00:04:22.660 Newly released CIA documents reveal the Biden administration identified motherhood and homemaking as an indicator of white, racially and ethnically motivated violent extremism.
00:04:40.460 The intelligence assessment reveals top to bottom bias at the CIA under Biden.
00:04:48.220 This is absolutely terrifying.
00:04:52.660 Terrifying. How did we get to this place? And why does it seem no one in America cares?
00:05:04.220 Gene Hamilton, he cares. He's with America First Legal. He's the president and co-founder.
00:05:10.640 Gene, can you tell me, and take us through some of these documents, and what is being done?
00:05:16.140 Well, thanks for having me on. You raise very important points. It is absolutely insane that the United States government, in particular, the intelligence community, would issue any kind of report that remotely touches anything like motherhood or traditional roles of marriage and homemaking as being indicators of propensities to engage in any kind of
00:05:46.140 um, terroristic or violent conduct, or extremism of any kind. But what you have to do, Glenn, is you
00:05:53.500 have to go back and you have to think. All right, now, actually, this is a moment of clarity here. Of course,
00:05:59.580 thank you to the Trump administration for withdrawing this product. It was an actual CIA
00:06:04.780 document that they published and that they have since withdrawn because it didn't meet their
00:06:09.660 standards. But then you have to think, well, now, how is it that something like this comes to be?
00:06:16.060 And I think that you and I and your audience at home all know that it's not all patriots working
00:06:23.820 in the intelligence community. Many of them are. Most of them are, I would suspect. But there are
00:06:28.980 a lot of people who get hired out of Ivy League schools, out of elite worldviews, and who think
00:06:38.020 that Netflix movies are reality. And if you look at this product, or if you look at some of the other
00:06:43.740 products that they've published, they seem to espouse viewpoints that you would see on some
00:06:49.820 crazy Netflix-produced movie that views white people, or views traditional roles of marriage
00:06:57.820 or motherhood, as indicators of some kind of threat to our national security. And that, in fact,
00:07:04.360 is insane. So I look at this and I think, it is so insane. I have, I have no, you know, I have no
00:07:15.520 inside knowledge of what everybody is tracking and things, but I, I'm, you know, not a casual observer
00:07:22.440 of society and life in America. And the only thing I can think of as a motive is: she who rocks the
00:07:31.100 cradle rules the world. You've got to destroy the family. That is one of the stated, from the left,
00:07:37.480 one of the stated goals: destroy the traditional family. So you mark moms. You also believe in global
00:07:44.660 warming? Mark moms: stop having babies. I mean, this is so insidious. But to be fair, is there anything
00:07:52.580 that you can find in this document that shows, well, that was, I could see how they read it this
00:07:58.880 way? I can see how they took this information and said, maybe that's leading to extremism. I'm being
00:08:05.180 way over-gracious, but I have to ask. Anything? You have to ask. You have to ask. I don't think so. I
00:08:12.440 mean, and especially when you consider this in the context of what other documents were produced
00:08:16.960 during the Biden administration, in public documents and, of course, documents that
00:08:21.540 were classified and have subsequently been declassified. But remember, they issued, in the
00:08:27.380 Biden administration a domestic security strategy that identified conservative Americans, identified
00:08:33.500 all kinds of traditional values as being threats to our domestic security. The Biden administration's
00:08:40.540 entire team, their apparatus, viewed traditional Americans, white Americans, or conservative
00:08:49.120 Americans, military veterans, as the greatest potential threats to the security of the United
00:08:54.880 States. I mean, it was the, the, the absolute ludicrousness of this just can't be, can't be
00:09:01.360 overstated. I mean, it's, it's, it's, it's wild, but that was their worldview and they created
00:09:06.380 products and from those products derive action. And from that action comes investigations and,
00:09:13.340 uh, retribution campaigns, um, that played out all across the country against, uh, various
00:09:18.440 Americans. And so it's, this is not just, uh, something that the American people should
00:09:24.380 shrug their shoulders at and say, well, you know, it's kind of, that sounds a little silly.
00:09:29.060 Silly Democrats, uh, shouldn't do that again. This is real dangerous stuff, and they have to realize:
00:09:35.760 this is what is going to happen again. If they get in power again, they will do this type of product
00:09:43.080 again, and we will all suffer the consequences. I have to tell you, I am, I am, I am convinced
00:09:50.320 that our security apparatus is a danger to the republic. It's absolutely a danger. Um, and it should,
00:09:57.780 you should never feel that way. You should be skeptical, but you should never feel like, I think
00:10:02.060 they're a danger to the republic. Um, and the reason why I say that is, you can take this, what you have
00:10:08.660 been calling a product, you can take it out, you can remove it, but what about the people who
00:10:13.300 produced it? What about the people inside the agencies that didn't raise the alarm and say,
00:10:19.560 what the hell are we doing? Right, right. Oh, yeah, absolutely. I mean, there has to be a top-to-bottom
00:10:26.980 review of the people who are involved in producing these products, contributing to these products,
00:10:33.760 and there needs to be consequences for them. Now, of course, it's very difficult for us as average
00:10:39.900 Americans to keep tabs on the employment actions within the intelligence community. I am confident,
00:10:46.680 at least under this administration, that they're doing everything in their power to address it.
00:10:51.700 But as you say, Glenn, though, it is not just this one product. You have to wonder,
00:10:59.200 well, who else was in the agency at the time who was just okay with this happening? Who went along
00:11:06.140 with it? Will they speak up in the future? Next time that there's some radical in the administration
00:11:12.900 who's wanting to push a ridiculous product that has no basis in reality?
00:11:18.060 Will they raise their voice?
00:11:19.440 Will they speak up?
00:11:20.460 Because if they didn't before, chances are they're not going to do it again.
00:11:24.080 And, again, the main point being here, this is extraordinarily dangerous.
00:11:31.040 The consequences are significant, and we all have to be aware of this
00:11:35.240 as we go forward and we make decisions about things in the fall in particular.
00:11:41.540 Listen to this.
00:11:42.900 Um, the purpose, uh, of this document is to focus on, uh, women who are supporters of or sympathetic to transnational white REMVEs, R-E-M-V-E. You know what a REMVE is?
00:11:59.160 Go ahead. Yeah, no, I mean, it's just, you know, it's, look, it, this, this has ties, it has connections
00:12:07.300 also with this, some of this trans ideology has ties with some of the folks who are engaging in,
00:12:13.260 in just a modern, weaponized war on womanhood. People who, uh, don't appreciate and value and
00:12:23.040 support women as God created, uh, individuals, uh, who are distinct and unique in their capabilities,
00:12:30.760 and want to distort and change all concepts of gender in our society to, to, again, you have to
00:12:38.940 look, Glenn, you mentioned earlier, the destruction of the family. The destruction of the family is
00:12:44.300 the one thing that can help liberals and Marxists obtain power, because they're, they, it would,
00:12:50.800 It eliminates that support network. It eliminates so much. And they will take products like this. They will engage in other activities. They will do all kinds of things to destruct and destroy the family because that's another way for them to obtain power in the long run.
00:13:06.600 And they're talking about this over in Europe, and, you know, that's the way the CIA gets away with it, but you can guarantee it has come here.
00:13:14.740 REMVE is racially and ethnically motivated violent extremist.
00:13:18.820 So a racially and ethnically motivated violent extremist.
00:13:23.720 I would say that there are those who are in BLM that fit that.
00:13:27.500 I would say that there are Islamists.
00:13:32.500 The Islamist movement absolutely fits that.
00:13:35.780 But then they say the mothers believe that their perception of an idealized white European ethnic identity is under attack from people who embody and support multiculturalism and globalism.
00:13:51.260 It is under attack. I mean, this is, this is, I mean, this is incredible. They do not actually identify anybody who is a violent extremist who is attacking those kinds of, you know, visions of the future and visions of the past, if you will.
00:14:14.020 They're not going after them, just the people who are saying, wait a minute, I like my country.
00:14:19.680 I like the culture that we have right now.
00:14:22.060 I don't want to become that culture.
00:14:25.420 Why are they the only ones that are in the crosshairs?
00:14:29.100 Well, I mean, look, there's all kinds of reasons that we could get into.
00:14:32.520 And I think one thing that we can just make clear for everybody, everyone listening here would rightly condemn anyone who wants to engage in any kind of violent extremism on any basis.
00:14:43.020 There's just no place for it. But to, to connect, to insinuate that just because somebody wants France to remain France, they are somehow going to be a violent extremist, is, is, is a threat, is ridiculous.
00:15:01.820 Because, look, I mean, what are they what are they saying? Should Japan not remain Japan? Should it not have a Japanese character? Should Nigeria not remain Nigeria?
00:15:12.360 And is there a problem with a person desiring for countries to have their own individually unique cultures?
00:15:17.960 Um, it, again, it comes from this worldview, from these elite institutions, and some of the folks
00:15:24.360 who, um, get behind the creation of some of these products, is that they, they, they think that it is,
00:15:30.840 in fact, dangerous for someone to think that, uh, America, uh, you know, if you are a person who, uh,
00:15:39.240 views the world based on merit and appreciates individual humans regardless of where they came
00:15:45.160 from, but understands that it was founded on Judeo-Christian values by individuals from
00:15:50.780 Western Europe principally, then the notion that that was actually our history, and that's
00:16:00.960 who we are, and it's a fact, and there's nothing wrong with that, continuing fact to maintain
00:16:07.580 our identity, is just something that they're just not used to dealing with.
00:16:13.300 And, uh, we're going to see more and more of this, Glenn, unless, unless, uh, we stop these things,
00:16:20.320 you're saying. Yeah, unless we, unless we elect the right people. I mean, I am, I don't know about you,
00:16:27.980 I'm sure you feel the same way, Gene, I am, for the first time, truly terrified of what is coming our
00:16:34.000 way if these elections go and give the power back to the left, because it's not even to the Democrats
00:16:39.860 anymore. The Democrats have been wholly eaten by the left, and they are talking about purges and
00:16:46.280 everything else. It is very, very concerning. When you see this was happening under Biden, good God
00:16:53.980 Almighty, what would happen under a Marxist? Yes, absolutely. Um, Gene, besides the election, is there
00:17:04.420 anything that people should be aware of or do? Look, I think that, again, just with all things
00:17:12.100 in life, Glenn, we all have our own spheres of influence. And one of the most important things
00:17:17.780 is spreading knowledge. This is a product, this particular product we're talking about today
00:17:23.420 was declassified and withdrawn by the CIA, you know, I guess about a month ago, and it didn't
00:17:31.160 get a whole ton of attention. There are a lot of people in your sphere of influence for the folks
00:17:36.820 at home, your friends, your family who have no idea, no knowledge about the existence of this
00:17:42.560 product. And they would, they see things in the headlines. They believe some things. They think
00:17:47.600 that, you know, maybe all the Trump administration's being radical and here they're doing this crazy
00:17:51.500 thing again. And they really need to understand that. No, no. In fact, that's a narrative that's
00:17:57.460 being painted here that's not accurate. And, in fact, on the other hand, look at what these people
00:18:03.220 were doing, the last team that was in here, amongst everything else that everybody already knows. We
00:18:08.020 need to spread knowledge. We need to make sure that people are aware that these types of products
00:18:12.580 exist. They are real. They are going to happen again. And there is no going back to some kind
00:18:19.640 of moderate Democratic administration. Uh, that is, is not a likely thing that is going to happen in the
00:18:25.260 future. Thank you so much, Gene Hamilton. Um, thank you for being on it. Thanks for all of
00:18:31.300 the work that you and all of your colleagues, uh, do every day. Appreciate it. America First Legal
00:18:36.300 president and co-founder. You bet. This is The Best of the Glenn Beck Program. And don't forget:
00:18:42.500 rate us on iTunes. Tristan Harris, Center for Humane Technology, the co-founder,
00:18:49.520 former digital design ethicist.
00:18:52.880 Tristan, how are you, sir?
00:18:55.380 Well, it is always really good to be with you,
00:18:57.560 and thank you so much for encouraging people
00:18:59.120 to go out and see this movie that we're talking about today,
00:19:01.900 which I should also be clear, by the way,
00:19:03.220 I don't make any money when people see this movie,
00:19:05.560 so I want people to hear both your call and mine,
00:19:07.660 as this is a kind of a public service announcement to the world,
much like, as you know, The Day After, that film in 1983
00:19:15.080 about what would happen if the U.S. and Soviet Union went to nuclear war.
00:19:19.520 It's just meant to clarify. And if we have clarity about where we're headed with AI, then we can choose what we want to choose. And the emphasis is on agency, on choice. What do we actually want here? But I'm so grateful to be back with you on the program.
00:19:33.420 Tristan, thank you for saying day after. I was trying to find a comparison. That is it. Ronald Reagan saw that movie, and I think somebody in Hollywood sent it to him thinking we'll change his mind.
00:19:49.220 And his mind was already made up, really, about nuclear weapons.
00:19:52.380 He knew they were bad.
00:19:53.340 That just clarified.
00:19:54.900 And he took that over to Moscow and gave it to Gorbachev and said, watch this movie.
00:20:00.500 And that's what brought both of them to the table.
00:20:03.540 And I'm telling you, you were exactly right.
00:20:06.620 This movie needs to be seen.
00:20:09.260 And, you know, what's so frustrating is this decision is all being made by somebody.
00:20:16.100 At least we understood what nuclear weapons would do. We maybe didn't know it until that movie, we
00:20:22.420 didn't have it visualized for us, but people don't understand what AI even is and what it can do. We're
00:20:30.200 not at AI, really, yet. We're not at AGI. We haven't, you haven't even scratched the surface of what's
00:20:35.340 truly coming. Um, and it's being made by these eggheads and these billionaires, uh, who are,
00:20:43.080 quite honestly, terrifying. Just terrifying. Yeah, yeah. Well, as you said, you know, back in 1945, all
00:20:51.580 of us saw the photos, uh, and later videos, of Hiroshima and Nagasaki, and we saw, we knew
00:20:58.060 intuitively the destructive power of nuclear weapons. Everybody got that evidence because
00:21:02.880 there was a catastrophe. Um, and with AI, most people's experience, let's just tune into it:
00:21:08.360 there you are, you use ChatGPT, your baby's burping in the background, and boom, you have a blinking
00:21:12.180 cursor that tells you exactly what you need to know. So why in the world are we talking about AI
00:21:16.360 as if it's nuclear weapons? This seems bizarre. And the truth is that what this blinking cursor,
00:21:22.340 this friendly face obscures, is that behind that blinking cursor are essentially like five
00:21:28.400 soon-to-be trillionaires who are racing to create, as you just said, artificial general
00:21:34.520 intelligence. They're racing to be able to replace all human economic labor in the economy.
00:21:41.180 everything that a human mind can do, all the marketing jobs, all the sales, all the media
00:21:46.820 jobs, all the financial analyst jobs, every job, mathematicians, physicists, AI is going to be able
00:21:52.060 to do all of those things. And I think something your audience needs to know is, are these companies
00:21:56.140 in a race to support the American worker? Are they in a race to augment and support you?
00:22:01.040 Well, let me just break this down because I think it's so important for people to get.
00:22:04.820 These companies have taken on so much investment. This is literally more money has been invested
00:22:10.060 into this than any other technology in human history.
00:22:13.460 And they can't make up that investment if everybody just paid for a ChatGPT subscription.
00:22:20.240 If everybody paid $20 a month, that's not enough to make back their investment.
00:22:24.500 If they just charge advertising for all the ChatGPT usage, so suddenly you have embedded
00:22:29.260 ads everywhere, that's also not enough to make back the amount of money they've taken
00:22:33.180 on.
00:22:33.760 The only thing that justifies the amount of money that they have raised is if they can
00:22:38.760 replace all economic labor in the economy, the $50 trillion labor economy, that's the incentive.
00:22:45.280 As we said many times in this program with you, Charlie Munger, Warren Buffett's business partner
00:22:50.140 said, if you show me the incentive, I will show you the outcome. How did you and I predict social
00:22:55.880 media back in 2017? If you saw that the incentive was the race for eyeballs and engagement and
00:23:01.720 duration of use and frequency of use, you're going to get a digital brain rot machine,
00:23:05.920 which is what's keeping all of our young people doom scrolling.
00:23:08.900 And so with AI, the way that you can predict which way this is going to go
00:23:11.920 is it's not short-term, it's augmenting workers, it's helping you, it's a blinking cursor.
00:23:15.700 But long-term, these companies, their goal is not to do that.
00:23:18.820 Their goal is to replace you.
00:23:21.340 And I think that means that you won't have, you and me and everybody else who just has a regular job,
00:23:26.140 you won't have political power after the GDP of countries starts to come from AI and data centers
00:23:31.240 and not from people.
00:23:32.500 because now the companies don't need you for their labor
00:23:35.060 and governments don't need you for your tax revenue.
00:23:37.520 This is called the intelligence curse
00:23:39.420 that much like there are countries
00:23:42.720 that discover a natural resource like Congo
00:23:44.780 or Sudan or Venezuela.
00:23:46.960 And when all of their GDP starts to come from that resource,
00:23:50.600 what was a blessing of that resource becomes a curse
00:23:52.900 because now the entire country kind of organizes itself
00:23:55.940 around mining that resource.
00:23:57.920 Well, in the case of intelligence,
00:23:59.800 AI is the new resource.
00:24:01.640 It's the new thing that's going to drive all economic growth.
00:24:04.620 And the reason I'm saying this is not to scare your listeners.
00:24:07.040 It's so that we can get crystal clear that these five companies and the five soon-to-be trillionaires that are running them, they don't have the interests of regular people at heart.
00:24:16.520 And we can talk more about that, but I think it's just so important for your listeners to get that.
00:24:20.680 Tristan, I mean, I wrestle with this all the time because I am engaged in building some AI and using some AI.
00:24:28.000 We're trying to do it ethically, and it's allowing me to hire more people.
00:24:34.300 I will always look at it as a tool, not as a replacement.
00:24:39.320 I know if I wanted to, I could replace so much of my staff right now just by implementing all kinds of AI, but I don't want that.
00:24:50.120 But these big companies, they don't, you know, as soon as you're traded on the stock exchange and everything else, it's all about profit, profit, profit, profit.
00:25:00.180 And how do you stop this?
00:25:04.160 Yeah.
00:25:04.520 Well, I think the way we, I mean, it's not just about stopping, it's steering.
00:25:09.860 You know, I always say, you know, people always say, well, how could we possibly do something different because we're in a race with China, right?
00:25:14.680 Well, yes, we're in a race with China, but also the AI companies have been using a boogeyman of China to say that's why you have to keep funding us and keep accelerating us.
00:25:22.960 We should just also be aware that the companies are using that narrative to drive up their goals and drive up investment into their products so that they can win against their competitors.
00:25:31.440 But one thing we should say about the race is what are we in a race for?
00:25:35.540 Because if we know, maybe we talked about this last time, but the U.S. beat China to developing the technology of social media.
00:25:42.120 We built a psychological bazooka called social media, but then we didn't actually govern it well, so we sort of spun it around and blew our own brain off.
00:25:49.300 Did that make us stronger or weaker as a country?
00:25:51.980 So we're actually in a race for who is better at steering and wielding the technology in a way that strengthens the full-stack economic, military, scientific, and technological health and strength of your society.
00:26:05.360 That's what we're in a race for.
00:26:06.480 But I don't, you know what, I, I just don't, I don't even know who to trust, Tristan, because,
00:26:11.680 you know, like this whole thing between the federal government, the, you know, the Pentagon,
00:26:15.720 and Anthropic. I don't want either of them to have that kind of stuff. I don't want either of them to
00:26:21.820 have it, and I don't know who's better and who's worse, you know. Yeah, yeah. No, really, what you're
00:26:29.080 pointing to is the crisis of trust here, because at the end of the day, it comes down to, this is a
00:26:33.780 super new kind of power. It's a power we've never seen before. When you have Nobel Prize
00:26:38.160 level intelligence capability, like Nobel Prize level physics, math, engineering, coders, super
00:26:43.980 coders, super cyber weapon hackers, like all of that embodied in an AI model that can just do
00:26:50.280 unbelievable new things. It's an unbelievable amount of power. And the question is, who do
00:26:54.040 you trust with that power? And the film, The AI Doc, is really getting into that. It's one of the
00:26:58.260 failure modes, we call it dystopia. Like, how do you know, how do you centralize this power
00:27:02.720 and have it not go badly.
00:27:04.440 One of the things is you can't centralize the power.
00:27:06.020 We need checks and balances on it.
00:27:07.320 We need oversight.
00:27:08.400 We need democratic oversight.
00:27:10.020 And regular people should have a say
00:27:11.500 about how they want this to go.
00:27:13.400 But right now, when we're living in kind of cacophony
00:27:16.080 and confusion and people don't know
00:27:17.520 what's really real about AI,
00:27:19.120 what that means is that the companies win.
00:27:20.820 The default path that's led by the companies
00:27:24.320 and their incentives, that's what wins.
00:27:26.780 And I'll just tell you that the handful of CEOs
00:27:29.160 running these companies,
00:27:30.480 they're not interested in what's good for regular people.
00:27:32.720 because they're only interested in what will enable them to win the race.
00:27:36.080 It's how you get Sam Altman.
00:27:37.540 Just two weeks ago, he was asked at the India AI Summit,
00:27:40.340 the big AI summit happening in India.
00:27:42.320 He was asked, well, doesn't it take a lot of energy to train and to run AI?
00:27:46.660 And you know what his response was?
00:27:48.400 Well, doesn't it take a lot of energy to train a human over 20 years,
00:27:52.360 a lot of energy and resources and food?
00:27:54.220 What he's basically saying is we shouldn't value humans.
00:27:58.640 And this is why you get Peter Thiel stuttering for 17 seconds
00:28:01.740 when he was asked by Ross Douthat in the New York Times, he was asked a simple question,
00:28:06.120 should the human species endure? And he stuttered for 17 seconds before he could give a clear answer.
00:28:12.340 I think both of these things are related. When AI becomes the principal driver of economic growth
00:28:17.960 for countries, and when companies don't need people for their growth or their development,
00:28:24.680 then people ask the question, why should you value people? And so this is the last chance that our
00:28:29.380 political voice as regular people, a human movement of regular people, this is the last
00:28:33.740 chance that our political voice will really matter. And so what I'm deeply hoping is that this film,
00:29:38.980 The AI Doc, will really help people to have clarity about where this is going.
00:28:42.960 While there are many benefits, we're going to get new medicine, we're going to get new materials and
00:28:46.540 new science, and it's going to be very exciting. We're also going to get this mass disempowerment
00:28:50.560 of regular people. And that's an anti-human future. And so we have to get crystal clear on
00:28:56.220 that so that we can basically act together as a community and steer this a different direction.
00:29:01.600 There's a lot of ways people can do that. They can boycott unsafe AI products or AI products that
00:29:05.940 have bad safety ratings or are enabling mass surveillance. People can do small things like
00:29:11.120 grayscale their phone. You had an advertisement for a product that tries to protect your likeness.
00:29:15.600 People can set a secret password with their family. So if we get a phone call and it's like,
00:29:20.660 I don't know, your son, and it sounds like them at least, but you're not sure, you ask
00:29:23.180 for the secret password.
00:29:24.740 There's a lot of things people can do.
00:29:27.240 And I see it not just as an AI topic,
00:29:29.460 but really about technology's encroachment
00:29:31.140 on our humanity writ large.
00:29:33.260 And just like social media encroached on our brains
00:29:36.180 and our children,
00:29:37.380 the most anxious and depressed generation,
00:29:39.240 we can fight back against that.
00:29:40.820 And that's what we have to do with AI too.
00:29:42.240 We have to fight back against the default path.
00:29:44.360 Be pro-technology, but for steering AI,
00:29:47.140 not stopping it.
00:29:48.820 You're listening to the best of the Glenn Beck Podcast.
00:29:51.460 Hear more of this interview and others with the full show podcast, available wherever you get
00:29:56.380 podcasts. So 10 years ago, I sat on a plane with my then chief of staff, and, um, and this all comes
00:30:04.400 from something that we found in a show. Every day in our, uh, in our show prep, I get a, I get a recap
00:30:11.740 of the things I was talking about five years ago, 10 years ago, 15 years ago, 20 years ago. This one
00:30:17.020 came up this week. 10 years ago this week, on the plane, he asked, he said, where are we headed?
00:30:24.300 What, what, what is, what is the world going to look like? What's America going to look like in 10 years?
00:30:29.420 And so I talked about it on the air, and I went back and listened to that show.
00:30:32.960 Uh, it's pretty amazing. I laid out three possibilities. I didn't make a prediction. I
00:30:38.000 said there are three ways this could go, I think. Um, and the first one was slow decay, not a collapse.
00:30:45.740 You wouldn't point to it on the calendar and say, this is when it changed, but just a steady
00:30:49.840 erosion, okay?
00:30:51.580 Corruption, these are my words, corruption would become routine.
00:30:55.380 Violence will become background noise.
00:30:58.240 Currency won't die overnight.
00:31:00.360 It just buys less and less year after year until you just have to adjust your expectations
00:31:05.700 downward.
00:31:06.980 The border will blur.
00:31:09.420 Drugs will flood in.
00:31:11.020 Institutions will continue to weaken, but they won't break.
00:31:13.880 They just stop working the way they once did. That was option number one. I don't know.
00:31:20.980 That's pretty close. The other one was a darker turn. Listen to this one. I said, or we will turn
00:31:31.360 not to chaos, but from chaos, but it will all be about control. A moment will come when the
00:31:40.420 system decides dissent is a real threat when the people who warn, protested, resisted are no longer
00:31:46.820 just wrong, but dangerous. And the label will change from opponent to enemy. And once that word
00:31:53.080 sticks, you know what usually follows. That was option two. I don't know. I think we've seen some
00:32:00.220 of that. And then there was the more hopeful path that citizens would wake up, that grassroots
00:32:07.640 movements, imperfect, but loud, might be a little messy, but they would remind the country who it
00:32:14.060 was supposed to be, and people will look back and say, wow, that was the moment that it really
00:32:18.300 turned around. That was 10 years ago. I think a little bit of all of those things happened.
00:32:25.860 Don't you? I mean, the slow decay, it's not a collapse, but erosion. Corruption is routine.
00:32:32.780 Absolutely. Violence, a background noise. Absolutely. Currency hasn't died, just buys less year after year. We have to adjust our expectations downward. The border has absolutely blurred. The drugs are flooding in, still are. The institutions haven't broken, but they weakened. And things aren't working the way they used to.
00:32:55.000 OK, add to that, I would say we have had the moment during the Biden administration where the system decided dissent was a real threat and they started to silence people, you know, people who warned and protested and resisted.
00:33:10.880 I'm not going to take the covid vaccine. You were all of a sudden dangerous. You weren't wrong. You were dangerous.
00:33:17.420 And that's when we went from opponent to enemy. These people are trying to kill the elderly.
00:33:25.000 and then the hopeful path. Remember, this was 10 years ago. So we're talking 2016.
00:33:33.380 What was happening in 2016? What was happening in 2016? I said the last path is the hopeful one
00:33:42.320 that citizens would wake up, a grassroots movement, imperfect, loud, maybe messy,
00:33:48.260 would remind the country who it was supposed to be, and people would look back and say,
00:33:51.960 that was the moment it turned around. I think all three paths happened at the same time.
00:33:58.740 Bits and pieces. Here's what hasn't happened. We haven't decided which one. So in 10 years from now,
00:34:07.880 what does the world look like? Well, it's not going to be a combination of all three.
00:34:13.080 I can't give you, you know, a dark, desperate path, a really dystopian path, and a hopeful path, and expect them all to come, you know, and do what this did, because we're out of runway.
00:34:31.680 You have to choose.
00:34:33.360 Do we slam on the brakes with this plane right now or do we pull on the yoke and start to fly?
00:34:40.360 We can't make that decision.
00:34:44.020 That's the real problem.
00:34:47.220 Because math is going to make the decision for us.
00:34:51.660 Births are going to make the decision for us.
00:34:54.520 Immigration, what we choose to do with our laws, how we vote, it's going to make the decision.
00:35:00.940 If we remain apathetic and uncommitted, and I know a lot of people say they're committed,
00:35:07.740 but I asked myself today on the drive in, am I really committed? Am I really, truly committed?
00:35:17.100 You know, I struggle with things just like you do, and there are times when I'm like, you know,
00:35:24.680 but if I really was committed, I would do X, Y, and Z. And the reason why I don't is I make excuses
00:35:32.980 for myself, and I don't know if they're excuses or they're valid. Valid? I'm tired. I'm tired, you know.
00:35:40.000 Uh, I, I don't have as much time, it feels like, in my day, or I can't last. I mean, I used to be able to go,
00:35:46.720 easy,
00:35:47.980 midnight. Um, and when that would happen, I'm hearing myself back again, Sarah, and when that, when that
00:35:58.360 would happen. Uh, I got a lot of stuff done, but I can't, my body just won't put up with that
00:36:02.940 anymore. So I'm tired all the time. And I'm also a little cranky and more crankier than I used to
00:36:08.620 be. Um, is that an excuse or is that reality? I don't know. I don't know, but have we done
00:36:17.400 everything we can there? There's more that we can do, but not, it's not going to take a lot
00:36:26.560 Because I've been thinking about this stuff for me.
00:36:29.300 What do we do?
00:36:30.420 Well, one of the things that we can do is we've got to just continue to speak out.
00:36:34.920 We have to, you know, I saw a quote from Jefferson recently.
00:36:40.000 Because I just remembered it being about a well-educated republic.
00:36:43.860 A well-educated people can be entrusted with their republic.
00:36:47.080 That's not what he said.
00:36:48.040 That's not what he said.
00:36:49.320 He said a well-informed public.
00:36:53.780 Well, that's totally different.
00:36:55.040 That takes it away from all of the eggheads, and that just puts it squarely on your shoulders.
00:37:01.460 Are you well informed? You know, I just did a story last hour about trusting what you see in your
00:37:10.300 social media feed. If you missed it, you'll find the story. It's an exclusive story at glennbeck.com
00:37:15.700 today, about how much of the viewpoints of the average Americans now, how much is being shaped
00:37:21.920 by social media. And what is the percentage of the social media, just based on the war alone,
00:37:27.960 how much is that coming from overseas, and how much of that is made to look like it is being
00:37:35.720 recorded here in America? A vast majority of it. This, this, this study I shared last hour
00:37:42.240 is shocking. Find it at glennbeck.com right now and read that story.
00:37:47.600 Once we are well informed, once we know what's true, what's not, then you have, you're able to
00:38:00.000 form the conviction and say, you know what, this is a real problem. And that's why today we released,
00:38:09.080 at stoptheconquest.com, an additional, how many hours? Six hours? Four hours? You remember? Many hours.
00:38:16.840 Many hours of raw interviews that we did in preparation for the special of, you know, stopping the conquest, the Islamist plan to overthrow the United States of America and the Western world.
00:38:30.720 But we also added a booklet.
00:38:33.340 So it's free.
00:38:34.380 You don't have to be a Torch subscriber, but it is a free booklet.
00:38:37.180 You can download it.
00:38:38.220 I urge you, print it, share it with as many people as you can and get all of the information.
00:38:44.380 It does not have my name on it.
00:38:45.660 It's produced by me and my staff, but it doesn't have my name on it, because I don't want that to be an obstacle for people to go, well, it's Glenn Beck.
00:38:52.560 Because it is nothing but the original documents in it.
00:38:56.080 So you can make the case.
00:38:57.440 It lays out the case and then it gives you all of the documents that are all originals.
00:39:01.220 So, I mean, they can argue with the original documents all they want.
00:39:05.580 But we have to be informed.
00:39:07.460 We have to know what's actually happening in our own country.
00:39:10.120 And one of the things that we have to know is there is a real danger, a real and present danger from people who are here illegally.
00:39:23.380 And I'm not talking about the people who are cleaning your house, mowing your lawn.
00:39:30.760 Yeah.
00:39:31.460 Is that a problem?
00:39:32.860 Yeah, it takes jobs from people, et cetera, et cetera.
00:39:35.280 But we're facing an existential threat.
00:39:37.220 We are talking, did you hear the story, the brother and sister who were indicted after authorities say that one of them planted a potentially deadly explosive outside of MacDill Air Force Base in Florida?
00:39:51.100 This just happened?
00:39:55.440 And I thought, brother and sister, what is happening?
00:39:59.460 What do you mean?
00:40:00.660 What?
00:40:05.200 Yep.
00:40:07.220 A brother and sister, the brother, the sister's been arrested.
00:40:11.360 The brother went back home to China.
00:40:17.780 China.
00:40:20.980 Hmm.
00:40:22.600 That doesn't sound like a good thing, does it?
00:40:27.520 Meanwhile, we have our own governors that are aiding and abetting.
00:40:35.520 that is the only way to say it: aiding and abetting. When you have somebody who has been arrested
00:40:41.180 for especially violent crimes, and you won't help, you won't, no, no, not help, you won't follow the law
00:40:50.480 and call ICE and say, hey, by the way, at the back door, we're going to be escorting this guy out.
00:40:55.640 You want him? He's an illegal. He's just, he's just stabbed and raped a woman. You might want to deport
00:41:02.720 them. When they won't even do that, you're aiding and abetting. You're aiding and abetting.
00:41:10.720 I'm going to meet with a father today. I'm dreading this. I have a pit in my stomach today.
00:41:15.840 I am dreading having this conversation. I'm going up to the Chicago area,
00:41:20.560 and I'm talking to a dad who just lost his daughter. You know the story.
00:41:26.960 Governor won't say anything.
00:41:30.000 Nobody will say anything.
00:41:31.600 Nobody will even recognize,
00:41:32.780 and killed by an illegal.
00:41:34.980 It's not the one that just happened.
00:41:38.520 This one happened before that one.
00:41:40.980 Nobody.
00:41:41.800 Nobody knows it.
00:41:43.240 Nobody cares, it seems.
00:41:44.940 At least nobody in power.
00:41:46.480 I'll bring that story to you on Monday.
00:41:51.040 But now the New Jersey
00:41:54.880 Democrat governor
00:41:56.480 is now saying, we're not going to cooperate with ICE.
00:42:03.440 We're not going to cooperate with ICE.
00:42:05.680 Tell me, tell me, how can you be for empathy and justice
00:42:17.700 when you don't have any empathy for the victims
00:42:25.100 and you make the perpetrator the victim? There's no empathy in you. You can claim, oh, I'm empathetic
00:42:32.320 with the illegals. Well, you can be empathetic. I understand empathy for people who just want to
00:42:37.380 make a better life for themselves, and they saw an open border, an America that didn't care about its
00:42:41.480 borders, and came here to make a better life for them and their children. I have no empathy, none
00:42:46.380 whatsoever, for people who are criminals, came from another country, and went, I can be a kingpin in
00:42:52.480 America, because they'll never arrest me, and I'm going to rape and kill whoever I want. I have, who
00:42:57.540 has empathy for that? You're not empathetic. You're a monster. You're a monster. You're not for justice
00:43:05.820 when you have somebody who raped and killed an innocent person, raped and killed them, and you're
00:43:14.020 wanting justice for them and not the one who was just raped and killed. Democrats, you've got to
00:43:22.360 wake up. You gotta wake up. You are, you have trapped yourself inside of an insane asylum,
00:43:28.560 and you just think that it's okay, well, because the other side is so bad. No, no, no, no, gang, no.
00:43:36.460 We are not like that. Are our policies getting your daughter raped and killed on the streets,
00:43:45.220 or are we the ones saying, hey, we should have less chaos, we should make sure that those rapists, I
00:43:52.300 don't care if they're from here or they're homegrown, they should be in jail? Or is, is your
00:43:59.500 side the one letting them out? You're in, you're locked in an insane asylum, and you have absolutely
00:44:04.600 no idea. You are, you're drinking the giggle juice, and you're pointing out the window at the sane
00:44:10.600 people, going, look at these crazy people. You're the one in the nuthouse. You've got to admit that,
00:44:16.780 or you're never going to get out.
00:44:19.780 Some say the bubbles in an Aero truffle piece
00:44:22.080 can take 34 seconds to melt in your mouth.
00:44:24.740 Sometimes the very amount of time you're stuck at the same red light.
00:44:28.220 Rich, creamy, chocolatey Aero truffle.
00:44:31.240 Feel the Aero bubbles melt.
00:44:33.260 It's mind bubbling.