The Glenn Beck Program - August 22, 2025


Best of the Program | 8/22/25


Episode Stats

Length

45 minutes

Words per Minute

152.2

Word Count

6,927

Sentence Count

619

Misogynist Sentences

10

Hate Speech Sentences

15


Summary

On today's show, Glenn Beck is joined by Congresswoman Alexandria Ocasio-Cortez (D-NY) to discuss immigration reform, abortion, AI, and much, much more. Glenn also talks about John Bolton's latest comments and Elon Musk's claim that the new version of AI might be AGI.


Transcript

00:00:00.000 Hey, it was an open phone Friday today where we took phone calls about everything.
00:00:04.240 I didn't get to my important cracker barrel point because people wanted to talk about,
00:00:09.740 you know, really important stuff.
00:00:11.620 John Bolton is in the news.
00:00:13.440 We talked about AI.
00:00:15.640 You know, Elon Musk said when you see the new version of AI, it might be AGI.
00:00:21.940 That's a little frightening.
00:00:24.420 What does that all mean?
00:00:25.420 We talked about the symbols of America and how they're faded.
00:00:31.000 We talked about how do we renew the American dream because people don't understand the American dream anymore.
00:00:38.580 I mean, at least if you're under 30, you haven't lived the American dream.
00:00:42.220 You don't even believe in the American dream because you don't know what the American dream is.
00:00:45.920 We go into that on today's podcast.
00:00:49.060 First, let me talk to you about pre-born.
00:00:50.980 One in four pregnancies ends in abortion right now.
00:00:53.860 That's something we cannot ignore.
00:00:57.700 There's another side of it.
00:00:59.020 Four out of four mothers who make that decision have to live with it for the rest of their lives.
00:01:04.840 The child isn't the only one lost.
00:01:06.320 It's a piece of the mom that is gone as well.
00:01:09.560 Pre-born steps in in that crucial moment before the decision is made.
00:01:12.900 They offer ultrasounds to expectant moms, showing them life inside of them.
00:01:17.260 And when mom sees the heartbeat, when she hears the heartbeat, she sees the baby.
00:01:21.000 Something changes, and statistics show she's most likely going to choose life.
00:01:25.420 Now, every dollar you give helps that process.
00:01:28.600 If you're a business owner, please consider making a larger tax-deductible gift.
00:01:32.880 You know, not just a write-off, but, you know, kind of a...
00:01:35.720 It will make an internal impact in your life.
00:01:40.320 It will.
00:01:41.220 While the government isn't working to save babies, you can.
00:01:44.440 A gift of $15,000 sponsors an ultrasound machine, a powerful tool that saves lives every single day, and every gift matters.
00:01:52.040 So get involved today.
00:01:53.020 Dial pound 250.
00:01:54.020 Say the keyword baby.
00:01:55.440 That's pound 250, keyword baby.
00:01:57.020 Or go to preborn.com slash Beck.
00:01:58.980 Preborn.com slash Beck.
00:02:00.500 Let's rescue a generation, one heartbeat at a time.
00:02:03.620 Hello, America.
00:02:05.360 You know we've been fighting every single day.
00:02:07.280 We push back against the lies, the censorship, the nonsense of the mainstream media that they're trying to feed you.
00:02:13.600 We work tirelessly to bring you the unfiltered truth because you deserve it.
00:02:18.480 But to keep this fight going, we need you.
00:02:20.940 Right now, would you take a moment and rate and review the Glenn Beck podcast?
00:02:24.300 Give us five stars and leave a comment because every single review helps us break through Big Tech's algorithm to reach more Americans who need to hear the truth.
00:02:33.360 This isn't just a podcast, this is a movement, and you're part of it, a big part of it.
00:02:38.340 So if you believe in what we're doing, you want more people to wake up, help us push this podcast to the top.
00:02:43.400 Rate, review, share.
00:02:45.020 Together, we'll make a difference.
00:02:47.060 And thanks for standing with us.
00:02:48.360 Now let's get to work.
00:02:49.320 You're listening to the best of the Glenn Beck program.
00:03:01.620 Welcome to the Glenn Beck program.
00:03:04.540 You know, the Democrats are just losing all kinds of credibility, and people are switching parties like they've never switched away from the Democratic Party before.
00:03:13.220 And here's why.
00:03:15.400 Everybody's favorite, Congresswoman Crockett, on the D.C. crime crackdown.
00:03:21.600 Listen to this.
00:03:22.500 ICE, for the most part, is nothing but a ride.
00:03:24.860 That's all they were supposed to do for the most part, right?
00:03:27.220 It's like, you know what?
00:03:28.420 This person is undocumented, or this person reenters the country illegally, all the things, and then they have an ICE hold.
00:03:36.960 And then ICE gets them so they can then send them out.
00:03:39.360 That's all ICE is supposed to do.
00:03:40.860 Look at them as a fancy Uber driver for immigrants.
00:03:43.540 That's all they're supposed to do.
00:03:44.620 Oh, my gosh.
00:03:45.200 And now they're running into places, doing raids, and they're falling all over each other, injuring each other.
00:03:51.720 Like, we are a joke.
00:03:56.540 She's awesome.
00:03:57.100 No, she's with Gavin Newsom.
00:03:58.520 She's fabulous.
00:03:59.640 Can we go to cut three?
00:04:01.180 Here she is on the crime crackdown.
00:04:03.800 Tell me what it's like in D.C.
00:04:06.140 Tell me what you think this is really all about.
00:04:09.480 So it's very dystopian to see.
00:04:13.180 It's funny because I used to watch, like, The Handmaid's Tale, and I can't, right?
00:04:19.460 I never finished, and I can't watch it because it is too close to reality.
00:04:23.860 Oh, yeah.
00:04:24.400 And so what we're seeing is this militarization, and obviously it started in your state.
00:04:32.420 That was kind of the testing grounds.
00:04:34.920 Going to your state, going to a Black woman mayor's city first.
00:04:40.400 And now we are in yet another Black woman-led city and taking over.
00:04:46.960 That's why he's doing it.
00:04:47.660 And to me, it goes, again, to the level of racism and hate that is constantly spewed out of this administration.
00:04:55.480 Stop, stop, stop, stop.
00:04:56.640 What a stunning twist.
00:04:57.780 That's like M. Night Shyamalan.
00:04:59.300 I would never expect her to go to a racial claim.
00:05:02.500 Whoa, I know.
00:05:04.140 Now let's go to Pritzker, the governor of Illinois, and what he has to say about what's going on.
00:05:10.700 Cut five.
00:05:11.340 Must we?
00:05:12.120 I built a Holocaust museum.
00:05:13.700 And one thing about that experience that I can tell you, and I worked with Holocaust survivors for more than a decade to build this museum.
00:05:23.700 One thing I learned in the process of that is that it doesn't take very long to tear apart a constitutional republic.
00:05:32.020 Indeed, the Nazis did it in 53 days.
00:05:36.020 And our democracy is almost as fragile.
00:05:41.540 Started in 1922.
00:05:42.740 And we're seeing it right now.
00:05:45.460 Yes, we are.
00:05:46.760 Who's been tearing it apart, you fat?
00:05:49.060 Anyway, so now we're pre-Nazi Germany, according to him.
00:05:55.200 Uber drivers, ICE, it's The Handmaid's Tale in Washington, D.C.
00:06:01.120 It's pre-Nazi Germany in Illinois.
00:06:04.700 And here comes Stacey Abrams to help us with more.
00:06:09.020 Cut six.
00:06:09.600 I want to tie this back to the abundance agenda and how you think about blue state power.
00:06:15.920 If it is true that he's a grand Ayatollah, that mystical power extends and can be, you know, he can anoint his, you know, his prophets.
00:06:30.020 And he can remain in.
00:06:32.140 Stop.
00:06:32.580 So we are.
00:06:33.660 So good.
00:06:34.220 We are now, he's now the Ayatollah.
00:06:37.660 He's Hitler in The Handmaid's Tale who is also the mystical Ayatollah who is appointing new prophets.
00:06:47.580 We have said he has a lot of energy, Glenn, and there's a lot of different roles to pull off.
00:06:52.280 He is covering a lot.
00:06:54.200 Now, he's also trying to make peace.
00:06:57.420 But Susan Rice, with all of her deep, deep credibility, has something to say about that.
00:07:03.500 I mean, really, Nicole, it's pathetic.
00:07:05.540 It's been clearly and repeatedly established, including by the bipartisan Senate Intelligence Committee, led by Marco Rubio, that Russia interfered in the 2016 election by disinformation campaigns, by social media efforts, by all sorts of means short of manipulating the actual vote.
00:07:30.220 And that's just a fact.
00:07:32.760 Now, obviously, Donald Trump doesn't like that fact.
00:07:35.120 He doesn't like the fact that the intelligence community and the Senate bipartisan Intelligence Committee assessed that this interference was intended to.
00:07:45.380 Stop, stop, stop, stop, stop, stop, stop, stop, stop.
00:07:47.660 I can't.
00:07:48.880 How is this still happening with no pushback from what was it, ABC or NBC?
00:07:53.880 No pushback from NBC.
00:07:56.200 None.
00:07:56.820 Zero.
00:07:57.940 All of the documentation has come out.
00:07:59.660 It shows she was part of the conspiracy.
00:08:03.460 She was part of it.
00:08:05.040 She was a ringleader in this.
00:08:07.200 It's now showing all the documents, the facts.
00:08:10.720 There's this little fantasy that the deep state has been pushing.
00:08:16.020 And then there are the actual documents written to and by people like Susan Rice showing this was all made up.
00:08:27.240 Wow.
00:08:27.700 It's so, wow.
00:08:29.100 And that strange Hitler mystical Ayatollah and guy who, Handmaid's Tale guy who just wants everybody dressed in red robes cannot get that stopped.
00:08:41.040 That is so very weird.
00:08:43.940 Glenn, you've worked with charities for a long time.
00:08:48.080 You've founded your own and done all this incredible work around the globe.
00:08:52.660 Would you consider potentially putting together a fundraiser for the Democrats to come up with another literary reference than The Handmaid's Tale?
00:09:00.300 Well, like, is it possible we could get them a different book just so they could say that title of it?
00:09:05.680 Now, I know Jasmine Crockett, of course, is so stupid.
00:09:09.680 She couldn't even act like she read the book.
00:09:11.740 She only said she was watching the Hulu show.
00:09:13.660 But still, can we get them some reference other than The Handmaid's Tale?
00:09:20.460 Well, they're already doing it.
00:09:21.140 They're already doing it.
00:09:22.240 When you talk about literary stuff, they're already doing it, Stu.
00:09:25.360 Here we go.
00:09:26.000 You ready?
00:09:26.420 Mm-hmm.
00:09:26.660 They have now, the DNC has now blacklisted terms that they don't want any of their people using.
00:09:35.100 Okay?
00:09:35.640 Oh, okay.
00:09:36.160 Now, tell me what these terms have in common.
00:09:39.800 Blacklisted terms.
00:09:42.200 Privilege.
00:09:44.240 Violence, as in environmental violence.
00:09:48.740 Dialoguing.
00:09:49.980 Triggering.
00:09:51.460 Othering.
00:09:53.420 Microaggression.
00:09:54.620 Holding space.
00:09:55.660 Body shaming.
00:09:57.620 Subverting norms.
00:09:59.220 Systems of oppression.
00:10:01.360 Cultural appropriation.
00:10:03.300 The Overton window.
00:10:05.220 Existential threat to the climate.
00:10:07.320 Existential threat to democracy.
00:10:09.200 Existential threat to the economy.
00:10:11.760 Radical transparency.
00:10:13.840 Stakeholders.
00:10:15.100 The unhoused.
00:10:16.600 Food insecurity.
00:10:18.240 Housing insecurity.
00:10:19.880 People who immigrated.
00:10:21.580 Birthing person.
00:10:22.940 Cisgender.
00:10:24.160 Deadnaming.
00:10:24.780 Heteronormative.
00:10:26.620 Patriarchy.
00:10:28.660 LGBTQIA+.
00:10:30.100 BIPOC.
00:10:31.840 Allyship.
00:10:32.520 Incarcerated people.
00:10:34.300 And involuntary confinement.
00:10:36.900 Those are the words that the Democrats are now telling their people, don't use any of
00:10:42.620 these words.
00:10:43.340 Those are the words that they forced everybody to use.
00:10:47.040 So they are reading from a new book.
00:10:50.320 They're just burning their own book.
00:10:52.760 It is absolutely incredible what is happening right now.
00:10:58.360 Just absolutely nuts.
00:11:00.980 I don't see how they're going to get through conversations without those words.
00:11:03.900 Those are the only words they say.
00:11:04.920 I know, I know, I know, I know, I know.
00:11:07.680 I could just add, I could just add some conjunctions in there and I could make that into a speech.
00:11:14.540 Let me go to, let me go to Eve in Utah.
00:11:17.480 Hello, Eve.
00:11:18.120 Welcome to the Glenn Beck Program.
00:11:19.700 Good morning.
00:11:20.520 Thanks for taking my call.
00:11:21.640 It's a privilege to speak with you, gentlemen.
00:11:24.000 I am calling from the bluest red state in the Union.
00:11:27.080 That'd be Utah.
00:11:28.340 I have been reading the headlines in KSL and how Governor Cox is refusing to send troops to arrest immigrants.
00:11:36.340 I'm calling specifically to say that there are, I think there's a significant problem here.
00:11:41.420 I've seen illegal immigrants.
00:11:43.860 At least I'm assuming.
00:11:44.820 I'm just assuming.
00:11:46.080 But these are people who are not proficient in English.
00:11:48.980 I'm going into 7-Eleven.
00:11:50.400 The school districts are hiring a lot of illegal aliens.
00:11:55.660 I know because the kids that are in the classes are parts of, members of families that are here from Venezuela, from Guatemala, from different countries in Africa.
00:12:08.400 I have concerns about Best Buy hiring a lot of people who I'm not sure if they're documented, immigrants.
00:12:17.580 But I have friends who cannot get hours at their work because they've got supervisors, specifically at Best Buy, one friend in particular, who had to quit because she was not getting the hours that she needed to support herself.
00:12:32.380 She had a Spanish supervisor who was hiring other Hispanic employees.
00:12:37.680 And then if you spoke Spanish, she felt like she was discriminated against because she didn't speak Spanish and she was not getting the hours that she needed.
00:12:45.260 She had to quit.
00:12:45.840 So I see a lot of this in, like, I don't know, in different companies and different businesses where I have friends and neighbors who are applying for jobs and they're not getting them because they don't speak Spanish or because they're not Hispanic.
00:13:00.680 So let me make some very controversial statements here about your phone call.
00:13:08.540 First of all, let's start with KSL.
00:13:10.000 If you're getting your news from KSL, be careful.
00:13:13.420 You know, it's always been a trusted source in Utah, and for many reasons.
00:13:20.580 And they, just like everybody else, have a really hard time hiring journalists who are not woke.
00:13:30.540 It's the same thing as some of the Utah universities, BYU in particular.
00:13:35.440 That has gone woke.
00:13:37.460 How has that gone woke?
00:13:39.420 Because they can't find anybody to fill those jobs that can live the standards and also not be woke.
00:13:47.400 So you have to be really careful of the sources.
00:13:50.960 Don't take anything that you see in some of these sources to be gospel, if you will.
00:13:58.420 Be very careful and be aware.
00:14:00.640 And that's not a universal blanket on anything.
00:14:03.260 That is, just be aware of that.
00:14:05.900 Stop trusting some of those sources because who owns them.
00:14:10.640 Start trusting the sources because they are actually telling the truth.
00:14:14.300 And in some cases, in many cases, that is not happening in the media, no matter who owns it.
00:14:22.500 The second thing is, I'm really concerned about Utah.
00:14:25.700 You hit it right, the nail right on the head.
00:14:28.240 It is the bluest red state.
00:14:31.720 That thing used to be so deeply red.
00:14:35.420 But they have chipped away little by little.
00:14:38.480 Cox is a very, he's a big part of that.
00:14:43.460 But they have chipped away little by little.
00:14:46.160 And I think, personally, I believe in Isaiah.
00:14:54.120 And any place that claims to be, you know, God's people, Rome, you know, the Bible Belt, Utah.
00:15:06.100 Isaiah comes to mind all the time.
00:15:09.800 And I will clean out my own house first.
00:15:14.760 I think these cities that claim to be very religious and have let it go to literal hell because the people have just been arrogant.
00:15:32.480 They just think it will always be this way.
00:15:35.140 I think Utah is coming for a giant reckoning.
00:15:38.700 There's a reason I live in Idaho.
00:15:40.780 There's a couple of reasons.
00:15:41.640 But one reason I live in Idaho and not Utah is because I think Utah is going to pay a very, very heavy price for the things that they have allowed to happen.
00:15:50.460 And every time I go into the big cities in Utah, I am shocked at how bad they are.
00:15:57.080 So you just need to wake your neighbors up.
00:16:01.400 And if your neighbors are awake, you must be active.
00:16:05.160 It's the same story in Texas.
00:16:08.240 Texans are asleep at the switch.
00:16:10.680 Now, I am thrilled to see that Chip Roy is running and is going to be, I think, hopefully, will be elected as our next attorney general.
00:16:22.220 Ken Paxton will go into the Senate.
00:16:24.860 I think those are great things.
00:16:26.600 But locally, the problem many times is locally.
00:16:31.360 And you have to be awake.
00:16:33.580 And Texans are asleep.
00:16:34.720 I think Utahns are absolutely asleep.
00:16:37.660 And you have to wake up on that and do the things locally that will hold your city's feet to the fire and hold your city's feet to the Constitution and to the actual rule of law.
00:16:50.960 Look at those blacklisted terms that the Democrats are now running from.
00:16:54.680 They created all of those terms.
00:16:56.660 They're now running from all of those terms.
00:16:58.900 If you hear anybody in your city using those terms,
00:17:02.600 make sure you let everybody know that's somebody who shouldn't win the next election.
00:17:09.880 And you do something about it.
00:17:11.280 You organize and do something about it.
00:17:14.020 Local, local, local.
00:17:16.920 But I think you're right about what you're seeing.
00:17:20.380 Let me tell you about American financing.
00:17:22.120 Money can be a very funny thing.
00:17:24.080 You know, when you have it, I guess it's a funny thing.
00:17:26.920 When you don't have it, it kind of sucks.
00:17:28.820 If you're not careful, it starts to own you.
00:17:31.540 Every credit card, every rising interest rate, every mortgage payment, it feels like it's just squeezing tighter and tighter and tighter.
00:17:38.200 You know, I just don't like it.
00:17:39.400 I'm starting a new business and just kind of putting, like, all of our chips on the table.
00:17:44.180 And I told the accountant yesterday, I said, I feel like somebody's standing on my chest all the time.
00:17:48.720 And I really don't like that.
00:17:50.500 If you have that feeling, you might want to call somebody that can help.
00:17:54.940 American financing could be that help for you.
00:17:58.020 American financing.
00:17:58.900 Call them now.
00:18:00.380 You can get out of all of the high interest rate.
00:18:02.900 They can do consolidation loans.
00:18:04.460 They'll listen to your story and then find the way to help you.
00:18:08.660 Call American financing now at 800-906-2440.
00:18:11.500 800-906-2440.
00:18:13.120 It's Americanfinancing.net.
00:18:15.460 Now back to the podcast.
00:18:17.000 This is the best of the Glenn Beck program.
00:18:20.340 Let me take Joe in Florida.
00:18:22.120 Hello, Joe.
00:18:22.740 Welcome.
00:18:23.000 Hello.
00:18:25.960 Pardon me if I stutter.
00:18:28.160 I'm super nervous.
00:18:29.260 And I've been listening to you guys since 2007.
00:18:33.720 Don't be nervous, Joe.
00:18:35.180 It's just us and 11 million friends.
00:18:38.940 Exactly.
00:18:39.280 So what I really wanted to talk to you about was AI in capitalism.
00:18:50.660 Okay.
00:18:50.820 So other than GPT-5, having everybody's baseline site profile between threads now, I'm really struggling to picture how the free market and how capitalism works.
00:19:12.160 If AI is going to take so many jobs and so many people are going to be out of work, well, sure, if we make things cheaper, who's going to have money to pay for anything and how do we overcome that?
00:19:32.320 Are we going to tax private businesses for using AI?
00:19:47.560 How do we find balance?
00:19:50.320 Because it's like if you give the private sector too much freedom, we always do the wrong thing, like with interest rates.
00:20:02.320 But too much government doesn't work either.
00:20:05.560 So how do we move forward?
00:20:08.160 How do we find balance?
00:20:10.960 Joe, that is the trillion-dollar question.
00:20:15.920 That is something that I don't think anyone has an answer for yet.
00:20:20.280 But at least thank you, Joe, for thinking about this.
00:20:24.500 Most people have not thought this through.
00:20:26.460 They have not really come to a place yet to where they see what AI is about to do.
00:20:31.780 And it is about to just destroy jobs.
00:20:36.120 Now, there is the hope that new jobs come out.
00:20:40.760 But I'm having a hard time seeing it not destroy millions of jobs around the world.
00:20:50.400 And so people are going to be unemployed.
00:20:53.580 And some of the great minds of the Great Reset, et cetera, including Yuval Noah Harari, have come out and said there are going to be useless people that we just need to keep on drugs.
00:21:06.620 Literally, this is what he says.
00:21:08.540 Need to keep them on drugs and keep them addicted to the Internet.
00:21:11.380 Just keep them busy.
00:21:12.240 That's not sustainable.
00:21:15.100 I think, honestly, you're seeing another solution in Canada.
00:21:18.880 They have just taken their MAID program, which is medical assistance in dying.
00:21:24.760 And it was for just people who were at the end of life.
00:21:27.920 They were in, you know, had a terminal disease.
00:21:30.640 That was just like five, six years ago.
00:21:32.460 Terminal disease only.
00:21:33.420 They have now adopted MAID for newborn children.
00:21:42.880 Newborn children.
00:21:44.280 Now, to give you an idea of how rare that is, the last country that did this was Nazi Germany.
00:21:55.060 The doctors up in Canada cannot keep up with the current requests for medically assisted suicide.
00:22:03.420 And it is in every, every sector, every walk of life.
00:22:09.020 It is the elderly.
00:22:10.040 It is the sick.
00:22:10.860 It is the non-sick.
00:22:12.000 It is those with disabilities.
00:22:14.580 It is those teenagers that are going through depression.
00:22:18.140 Now it's down to babies after you're born.
00:22:22.380 Not only can you kill them before you're born, now you can kill them after.
00:22:25.900 Because I don't know why.
00:22:27.160 Is it an inconvenience?
00:22:28.760 Is it, what is it?
00:22:29.640 And if they say now, oh, no, it's only for the very, very, very, very malformed, well, that's what it was in Germany, too, when they first did it.
00:22:39.040 And you saw what happened in Germany.
00:22:41.200 So you're going to see some of the worst of human beings come out to solve this thing.
00:22:46.740 You're also going to see things like UBI.
00:22:48.900 That's a universal basic income.
00:22:52.180 I personally think there should be a tax.
00:22:54.780 And I haven't, I'm not settled on any of this.
00:22:57.300 I think there should be a tax on those like Zuckerberg and, quite honestly, Elon Musk, the people who are going to be running these things.
00:23:06.040 And there's going to be a handful that are worth trillions and trillions of dollars.
00:23:11.960 I'm sorry, but you used our information, our private selves, and you're still using them to build these things.
00:23:21.320 And then you took our jobs away.
00:23:23.080 I'm sorry, but that is the first time I've ever said that maybe we should share the wealth a little bit.
00:23:29.180 But, hey, Jamie, this battery just went out.
00:23:33.320 So I think, you know, we have a lot of talking to do here.
00:23:37.940 And I'm not sure that any of it that I am suggesting is right.
00:23:42.220 But here's where I would like to go.
00:23:44.980 Yesterday we had a phone call.
00:23:47.580 We had somebody call in.
00:23:48.640 Her name was Angela.
00:23:49.600 She was from Tennessee.
00:23:51.180 And she talked about all of this.
00:23:54.340 She's talked about how all of this stuff is starting to collapse, that she doesn't really believe in anything.
00:23:59.500 And she's wondering whether maybe the government should do more.
00:24:04.000 Well, she talked about how, you know, she had done everything right.
00:24:12.080 She had gone to college.
00:24:14.060 And now she came from nothing, built herself up, and now she could just barely, you know, keep her head above water.
00:24:21.440 So I gave her some advice, but it bothered me all day yesterday.
00:24:25.520 And I want to come back to this because I've been thinking as I'm developing this new venture of mine called The Torch, we've been looking for the imagery for it and everything else.
00:24:38.340 And I saw some things that our team has been producing, and it included the Statue of Liberty and the flag and everything else.
00:24:44.960 And as I'm watching that, I thought, that is so dated.
00:24:47.760 That appeals to me and my generation.
00:24:50.740 But I don't think that appeals to anybody that is in their 20s because it's all empty, you know.
00:24:57.380 There was a time when the American flag meant something, and it didn't need to be explained at all.
00:25:03.260 There was a time when the Statue of Liberty was more than just an old, outdated tourist stop, that when you saw it, sometimes it could move you to tears.
00:25:11.940 It was a promise.
00:25:12.920 There was a time when the courts were considered the halls of justice, not arenas for politics.
00:25:20.000 But for people who are in their 20s and early 30s, I don't know.
00:25:24.980 I think all of that stuff feels hollow now.
00:25:28.140 You know, the flag, it's a banner of somebody else's dream, and Lady Liberty is just a shell,
00:25:33.940 and the courts are just another place where power decides outcomes, not truth.
00:25:37.660 That American dream that I understand because of my generation and my parents is gone because we didn't pass it on to our children,
00:25:48.220 and the schools and the media and everybody else did a horrible job at this.
00:25:53.420 If you're in your 20s or 30s, you were a kid when your parents probably lost everything in 2008,
00:26:02.380 and you saw the big banks, you know, bail every big bank out, and then you saw maybe your mom and dad's business shuttered on Main Street.
00:26:11.500 You watched your parents work hard and have less for it, and then came COVID, and you saw the government do the same thing, bail out all,
00:26:22.320 hey, it's fine to be in Home Depot, but that local Ace Hardware, no, that's the plague.
00:26:27.200 And none of that made sense, and jobs vanished, and schools closed, and freedoms were curtailed,
00:26:34.340 and the divisions in this country just froze like, you know, cracks in a frozen lake.
00:26:40.800 I mean, it was, it's not good, and that's all you've seen your whole life.
00:26:46.040 And then you did the right thing because what was right in the 1950s was still thought to be right today, and it wasn't,
00:26:53.600 but that was go to college.
00:26:55.020 You did everything you were told was right.
00:26:58.800 You chased the degree.
00:27:00.420 You took on debt.
00:27:02.080 And then the jobs you got out, the jobs you were promised weren't there.
00:27:05.360 They never came.
00:27:06.780 Well, they weren't.
00:27:07.760 It's because you were being lied to about that.
00:27:10.420 Nobody could look over the horizon.
00:27:12.060 Oh, I'm sorry.
00:27:13.940 The people who actually have credible voices, or so you thought at the time,
00:27:18.700 would look over the horizon and say, no, it's fine.
00:27:21.440 It's fine.
00:27:22.160 Some of us were saying, don't.
00:27:23.800 Don't do that.
00:27:24.640 That's a lie.
00:27:25.540 It's not going to happen.
00:27:27.000 But we were discredited.
00:27:29.620 And now the house, it feels as distant to you.
00:27:32.280 Buying a house probably feels like, oh, yeah, and I'm going to walk on the moon someday, too.
00:27:37.720 Capitalism, the system that built the abundance that you see around you,
00:27:41.680 now feels like a rigged game because many times it is a rigged game.
00:27:45.660 It feels broken.
00:27:46.640 It feels like it failed you.
00:27:49.440 And quite honestly, it did.
00:27:50.980 I see it.
00:27:52.620 I see it.
00:27:53.500 I hear you.
00:27:54.640 And you are not wrong to feel betrayed.
00:27:59.240 Now the question is, what do you do with that?
00:28:02.380 So the American dream is not what they told you.
00:28:06.800 The lies started long before the banks started bailing everybody out except your parents.
00:28:12.060 The American dream was never about the banks.
00:28:15.700 It was never about politicians.
00:28:17.040 It was not about what the universities say it was all about.
00:28:21.720 It was never about a white picket fence or a two-car garage.
00:28:25.040 You know, that was all a marketing pitch.
00:28:26.760 And I can tell you right where it came from.
00:28:29.440 It came in the 1930s with FDR.
00:28:33.040 Again, it was a marketing pitch.
00:28:36.140 Up until the 1930s, the American dream was just this, freedom.
00:28:40.440 It was you not having to ask for permission to start a business.
00:28:46.100 It was you not being hobbled by heavy taxes and regulations.
00:28:50.840 It was about building and creating, about being you without having to ask, can I be me?
00:28:58.720 It was all about dreaming just audacious dreams and then taking your two hands and putting them to work and trying to make that a reality.
00:29:09.100 What stole that dream is not, it wasn't capitalism.
00:29:15.500 It was control.
00:29:18.180 A hundred years of policies from the progressives where you were taught to wait, to comply, to memorize these because they're going to be on a test, to look to the government, to the experts, to the bureaucrats.
00:29:31.140 Everything requires permission just to live your own life.
00:29:34.560 And if it's not permission, it's a tax or some sort of a form that you have to fill out.
00:29:42.620 The dream wasn't broken.
00:29:45.600 It wasn't broken by the people.
00:29:47.440 It wasn't broken by capitalism.
00:29:49.360 The dream was strangled to death by the system.
00:29:53.280 Okay, now that, if you can understand that, now you have a new set of questions.
00:30:03.040 Symbols can be replaced.
00:30:05.500 When the old symbols lose their power, new ones can rise or you can reinvigorate those symbols by putting new power back into them.
00:30:16.260 The new symbols of the American dream are not going to be marble statues or buildings.
00:30:21.600 The new symbols of the American dream go back to what they were before the progressive era.
00:30:27.360 You, the people, somebody who's working right now in their basement on something and they just think they have something and them being able to keep that idea, enhance people's lives and get rich from it.
00:30:40.920 The craftsman that's turning a side hustle into something real.
00:30:45.080 The entrepreneur that is not going to wait for permission.
00:30:48.760 The communities that stand together when the institutions fail them.
00:30:54.340 That's the American dream.
00:30:57.680 And the danger here is that we are losing our symbols.
00:31:02.520 And one of our symbols is the Declaration of Independence and the Constitution because nobody knows what it really is.
00:31:09.480 So it doesn't have power to the average person.
00:31:12.280 But that is, the Constitution is not a relic.
00:31:15.280 It's not a symbol.
00:31:16.640 It is a root.
00:31:19.160 Now, the good thing is, if our roots are still deep, we can weather what's coming, because we are entering the storm of ages.
00:31:27.340 And when a storm rages, those roots, if they're deep enough, they hold and they survive.
00:31:37.220 If the roots have atrophied, it's all just going to tumble and blow away.
00:31:41.500 But the truth isn't gone.
00:31:44.240 Justice is not dead.
00:31:46.800 Liberty is not just an empty vessel that is standing there as a tourist trap.
00:31:51.800 All of these truths have been buried for decades
00:31:57.960 under noise and lies.
00:31:59.780 And here's a good thing.
00:32:01.560 It's going to be your generation that digs them back up.
00:32:04.500 You're going to find them again.
00:32:06.800 And you're going to find them.
00:32:08.580 Well, you won't find them if you're waiting for rescue.
00:32:10.720 You won't.
00:32:11.860 You'll find them by daring to dream again.
00:32:14.680 By daring to say, I don't care what you tell me.
00:32:17.980 You're not the boss of me.
00:32:19.580 And you don't control my thoughts.
00:32:23.660 I'm sorry.
00:32:24.800 I can either choose thoughts that empower me or I can choose the thoughts that disempower me.
00:32:30.940 And I'm sorry.
00:32:32.200 All the thoughts you're putting into my head make me weak and pathetic.
00:32:37.260 I am not going there.
00:32:39.120 I'm going to change my thoughts and change my life.
00:32:42.740 You know, you don't wait for somebody else.
00:32:48.920 The American dream is truly American because of who we used to be and who we, I think, still are.
00:32:56.380 We just have to find it in ourself.
00:32:58.300 We just have to believe it again.
00:33:00.120 We are the people that went to the moon.
00:33:02.520 We are the people that do these daring things.
00:33:05.200 And that is not dead.
00:33:06.560 It's just changing hands.
00:31:09.400 And the older generation has tried to convince you that it means nothing.
00:33:16.140 You can't do it.
00:33:17.500 It's in your hands now.
00:33:19.120 And when you grasp it, when you live it, not as a slogan, but as a way of life, you'll discover it was never about chasing symbols.
00:33:26.260 It was about becoming one.
00:33:28.720 So in talking about capitalism and the future and especially AI, let's have a deeper conversation on this because, you know, the fear is it's going to take our jobs and you're going to be a useless eater, et cetera, et cetera, because AI will have all of the answers.
00:33:56.040 Correct.
00:33:56.560 But how many times, hang on, hang on, that is correct if you look at it that way.
00:34:03.120 But let me say this, I can have people who are wildly educated on exactly the same facts and they will come to a different conclusion or a different way to look at that, okay?
00:34:18.900 They can agree on all of the same facts, but because they're each unique and AI.
00:34:26.560 AI, AGI or ASI, is not going to be unique, I don't think.
00:34:31.620 This is my understanding of it now.
00:34:33.260 And I've got to do some, I've got to talk to some more people about this that actually know.
00:34:39.080 Because coding is now what AI does, okay?
00:34:42.800 It can develop any software.
00:34:44.560 However, it still requires me to prompt.
00:34:51.300 I think prompting is the new coding.
00:34:55.620 And if you don't know what prompting is, you should learn today what prompting means.
00:35:01.080 It is an art form.
00:35:03.960 It really is.
00:35:04.820 As I have been working with this for almost a year now, learning how to prompt changes everything.
00:35:13.240 And now that AI remembers your conversations and it remembers your prompts, it will give a different answer for you than it will for me.
00:35:26.760 And that's where the uniqueness comes from.
00:35:30.200 And that comes from looking at AI as a tool, not as the answer.
00:35:38.420 So, Stu, if you put in all of the prompts that make you, you, and then I put in a prompt that makes me, me, Donald Trump does it, you know, Gavin Newsom does it.
00:35:51.120 It's going to spit out different things because you're requiring a different framework.
00:35:57.360 Do you understand what I'm saying?
00:35:59.920 Yeah, you can essentially personalize it, right, to you.
00:36:03.340 Correct.
00:36:03.580 It's going to understand the way you think rather than the way just a general person would think.
00:36:08.180 Correct.
00:36:08.880 And if you're just going there and saying, give me the answer, well, then you're going to become a slave.
00:36:14.420 But if you're going and saying, hey, this is what I think, this is what I'm looking for, this is where I'm missing some things, et cetera, et cetera.
00:36:25.300 It will give you a customized answer that is unique to you.
00:36:30.700 And so prompting becomes the place where you're unique.
00:36:35.100 Now, here's the problem with this.
00:36:37.300 This is something that I said to Ray Kurzweil back in 2011, maybe.
00:36:41.860 He was sitting in my studio, and I said, so, Ray, we get all this.
00:36:46.100 It can read our minds.
00:36:47.260 It knows everything about us, knows more about us than anything, than any of us know.
00:36:52.640 How could I possibly ever create something unique?
00:36:57.560 And he said, what do you mean?
00:36:58.500 And I said, well, if I was, let's say I wanted to come up with a competitor for Google.
00:37:04.460 If I'm doing research online and Google is able to watch my every keystroke and it has AI, it's knowing what I'm looking for.
00:37:13.280 It then thinks, what is he trying to put together?
00:37:18.040 And if it figures it out, it will complete it faster than me and give it to the mothership, which has the distribution and the money and everything else.
00:37:28.100 And it will, I won't be able to do it because it will already have done it.
00:37:32.400 And so, you become a serf.
00:37:36.120 The lord of the manor takes your idea and does it because they have control.
00:37:42.020 That's what the free market stopped.
00:37:45.880 And unless we have control of our own thoughts and our own ideas, and we have some safety so that it cannot intrude on those things, some sort of a patent system for unique ideas that you're working on.
00:38:07.880 That AI cannot take what you're working on and share it with the mothership, share it with anybody else.
00:38:14.680 Then it's just a tool of oppression.
00:38:17.540 Do you understand what I'm saying?
00:38:18.660 Yeah.
00:38:19.100 I mean, obviously, these companies would say they're not going to do that.
00:38:23.320 You know what Ray said?
00:38:25.180 Yeah.
00:38:25.700 Ray said, Glenn, we would never do that.
00:38:28.020 And I said, why not?
00:38:29.580 And he said, well, because it's wrong.
00:38:31.020 We just wouldn't do it.
00:38:31.760 I was like, oh, oh, I forgot how moral and, you know, such high standing everybody in Silicon Valley and Google is.
00:38:40.940 And Silicon Valley and Google, I mean,
00:38:44.200 I have far more confidence in their benevolence than I do in China.
00:38:50.980 And Washington.
00:38:52.400 And Washington.
00:38:53.700 Yeah, exactly.
00:38:54.660 The DOD.
00:38:55.580 Yeah.
00:38:56.080 I mean, everyone's going to have these things developed and who knows what they're going to do.
00:39:00.020 I mean, I suppose there will be some eventually that becomes an issue or becomes a risk.
00:39:06.480 There's going to be some solutions to that.
00:39:08.360 Like you could have closed loop systems that don't connect with the mothership.
00:39:11.660 But, like, all that stuff's going to be there. There will be answers to those questions, I'm sure.
00:39:18.820 But, you know, at some level, right, they're using what you're typing in as training for future AIs.
00:39:25.420 Correct.
00:39:25.740 Right.
00:39:26.220 So it all, in a way, kind of has to go to the mothership at some level.
00:39:30.640 And whether they try to take advantage of it in the way you're talking about, I mean, you hope they don't.
00:39:34.660 But I don't trust them.
00:39:35.540 Right now, a year ago, we thought we would use somebody's AI as the churn, as the compute power, because the server farms, everything, is so expensive.
00:39:50.580 But I don't think so now. We've been talking about this at the Torch that our, you know, our dreamers are working on.
00:39:58.100 I'm not sure we're ever going to be able to get the compute power that we need for a large segment of people.
00:40:03.540 Because right now, these companies, now think of this, the world is getting between 1% and 3% of the compute power.
00:40:11.180 So that means 97% to 99% of all of that compute is going directly into the company trying to enhance the next version.
00:40:23.240 Okay.
00:40:24.100 All of that thinking, that's like, that's like something that everybody else thinks is your main focus, and you're only giving it 15 or 20 minutes a day.
00:40:38.520 Okay.
00:40:39.280 You're operating at the highest levels and I'm only going to spend 10 minutes thinking about your problem.
00:40:44.720 All right.
00:40:45.440 And you think that's what I'm really doing is spending all my time on there.
00:40:48.520 And so they're eating up all of the compute for the next generation, and I don't think that's going to stop.
00:40:55.060 And so we're now looking into, can we afford to build our own AI server farm at a lower level that doesn't have to, you know, take on 10 million people, but maybe a million people and keep it disconnected from everything else.
00:41:13.880 If we can do that, I think that's a really important step. People will then be able to go, okay, all right, I can come up with my own, even my own company compute farm, that keeps my secrets, keeps all of the things that I'm thinking, keeps all of this information right here.
00:41:35.280 Hopefully that will happen, but I'm not sure, because I think when they do hit AGI, you're not going to get it.
00:41:44.280 You might have access to AGI, but it will be so expensive because AGI is going to try to get to ASI.
00:41:50.700 So when they get to AGI, when that is there and available, it could be $5,000 a month for an individual.
00:41:59.160 It could, it could be astronomical prices.
00:42:01.380 You're not going to get compute time on a quantum computer.
00:42:07.400 You're just not, it'll be way too expensive because the big boys will be using it.
00:42:12.060 The DOD will be using it.
00:42:13.920 Most of it, you know, Microsoft and Google and everybody else, when they develop theirs,
00:42:19.140 they will be using it themselves to get stronger and better, et cetera, et cetera.
00:42:24.920 So there has to be something for the average person to be able to use this that is not connected to the big boys.
00:42:33.820 I'm still not sure, Glenn, if we're at that point at this time, but, like, just to define these terms: AGI and ASI, artificial general intelligence, artificial super intelligence.
00:42:42.560 And artificial general intelligence is basically, it can be the smartest human, right?
00:42:48.460 Not even that, not even that you would consider this person a super genius.
00:42:54.500 It's general intelligence.
00:42:56.740 You are a general intelligence being, meaning you can think and be good at more than one thing.
00:43:02.420 You can play the piano and be a mathematician and you can be the best at both of those.
00:43:08.000 Okay.
00:43:09.140 What we have right now is, is narrow AI.
00:43:11.620 It's good at one thing.
00:43:13.520 Now we're getting AI to be better at multiple things.
00:43:18.520 Okay.
00:43:19.100 But when you get to general AI, it will be the best human, beyond the best human, in every general topic.
00:43:27.860 So it can do everything.
00:43:29.700 It'll pass every board exam for every walk of life.
00:43:34.700 Okay.
00:43:36.080 Now that's the best human on all topics.
00:43:42.100 And I would call that super intelligence myself, but it's not, that's just general intelligence.
00:43:48.020 Top of the line, better than any human on all subjects.
00:43:52.760 Super intelligence is when it goes so far beyond our understanding.
00:43:57.960 It will create languages and formulas and alloys, and it will think in ways that we cannot possibly even imagine today, because it's almost like an alien life form.
00:44:11.120 You know, when we think all the aliens are going to come down, they're going to be friendly.
00:44:14.300 You don't know that.
00:44:15.200 You don't know how they think.
00:44:16.780 They've created a world where they can travel in space and time in ways we can't.
00:44:22.240 That means they are so far ahead of us that we could, to them, be like cavemen or monkeys.
00:44:29.980 So we don't know how they're going to view us.
00:44:32.180 I mean, look how we view monkeys.
00:44:33.540 Oh, a cute little monkey.
00:44:35.000 Let's put something in its brain and see if it feels electricity in its brain.
00:44:38.900 Okay.
00:44:39.860 We don't know how it's going to think because we aren't there.
00:44:45.580 And that's what we're developing.
00:44:45.580 We're developing an alien life form that cannot be predicted and cannot be something that we can even keep up with.
00:44:54.700 All right.
00:44:55.300 More in just a second.
00:44:56.260 Let me stop.
00:45:07.340 And by the way, I'm working on a constitutional amendment, and I'm partnering with some people on it.
00:45:09.860 I'm going to tell you about it soon, but it's in regard to AI.
00:45:17.620 And we need to define what it means to be human. We quickly need to define that.
00:45:17.620 And then we need to have a constitutional amendment that this is a human, this is not, and only humans have equal rights.
00:45:27.560 We've got to do that right now.