The Glenn Beck Program - February 08, 2023


The Most Embarrassing SOTU Moment Glenn Has Ever Seen | Guests: Rep. Jim Jordan & William Hertling | 2/8/23


Episode Stats

Length

2 hours and 4 minutes

Words per Minute

150.92

Word Count

18,836

Sentence Count

1,712

Misogynist Sentences

9

Hate Speech Sentences

20


Summary

Glenn Beck is back with another jam-packed show. First, he talks about the State of the Union, then he's joined by Jim Jordan to talk about the hearings that are coming up tomorrow. Then, he's back with a new sponsor, My Medic First Aid Kit, which is actually a first aid kit.


Transcript

00:00:00.000 I want to talk to you a little bit about Pat coming up.
00:00:06.100 There's a problem with Pat, but let me first talk to you about a new sponsor, My Medic First Aid Kit.
00:00:11.520 I met with these guys a couple of weeks ago.
00:00:14.900 I love this company.
00:00:16.260 They're the ones that make the little things that are on the arms of the NFL players, you know, the back of the arms.
00:00:25.040 They invented a new Band-Aid, and it came from that.
00:00:28.540 They were watching the NFL, and they're like, those things keep coming off.
00:00:32.040 And they invented it just to help the NFL, and the NFL was like, we'll take a lot of these, please.
00:00:38.940 And the first aid kit that they are now offering is something that is actually a first aid kit.
00:00:46.600 This is so good.
00:00:48.640 Life-saving first aid kits.
00:00:50.780 You'll get 20% off now.
00:00:52.560 Please check this out.
00:00:54.820 MyMedic.com slash Beck.
00:00:57.080 MyMedic.com slash Beck.
00:00:59.540 This is so you are actually prepared for an emergency.
00:01:04.880 MyMedic.com slash Beck.
00:01:07.480 Check them out today.
00:01:09.160 We've got no room to compromise.
00:01:29.300 We've got to stand together.
00:01:31.120 It's the course of life.
00:01:35.280 Stand up and stand and hold the line.
00:01:39.160 It's a new day I'm trying to raise.
00:01:46.120 What you're about to hear is the fusion of entertainment and enlightenment.
00:01:54.040 This is the Glenn Beck Program.
00:01:58.680 Hello, America.
00:01:59.760 Welcome to Wednesday.
00:02:01.120 Huge packed show.
00:02:02.400 We begin with the State of the Union, and then we go to Jim Jordan.
00:02:07.260 We want to know about the hearing that is scheduled beginning tomorrow to expose what we believe is the largest censorship system in U.S. history.
00:02:21.120 It is a public-private partnership with the government, CIA, FBI, and big tech.
00:02:31.020 We'll talk about that in just 60 seconds.
00:02:34.180 When you're feeling good, it's easy to be on top of the world.
00:02:37.060 When you're feeling good, you can pretty much tap dance your way through, you know, doing just about anything in your day.
00:02:42.880 There's something about feeling good that drives you to not only to be able to do things, but want to do them.
00:02:50.520 So when pain comes along, you don't want to do anything.
00:02:53.500 I just want to go back to bed.
00:02:55.300 If you or somebody that you love is in pain, please try Relief Factor.
00:03:00.740 It's not a drug, so it's not going to whack you out.
00:03:03.100 It's all natural, which is usually the sign for me.
00:03:06.820 I'm like, that's not going to work.
00:03:08.180 I mean, I'm a guy who is, I woke up on the surgical table, not once, but two times.
00:03:15.940 Things don't work for me.
00:03:17.700 I kind of abused my body at one point.
00:03:21.600 Natural, not a chance.
00:03:23.680 That's what I thought.
00:03:24.440 My wife said, you've got to try it.
00:03:26.840 So I did.
00:03:27.900 And three weeks later, I was absolutely convinced, and I got my life back.
00:03:33.100 Get yours back.
00:03:34.280 Give it a whirl, please.
00:03:35.940 Relieffactor.com.
00:03:37.180 Relieffactor.com.
00:03:38.420 Or call 800, the number 4-RELIEF.
00:03:40.620 800-4-RELIEF.
00:03:42.240 Relieffactor.com.
00:03:45.160 Okay.
00:03:46.320 I've got an hour's worth of stuff to talk about with the State of the Union coming up in hour three, I think.
00:03:54.300 But I just want to say this.
00:03:56.140 I've never seen a State of the Union like that last night.
00:04:00.580 First of all, the lies were outrageous lies.
00:04:05.360 Outrageous lies.
00:04:06.260 And jam-packed with them.
00:04:07.300 Oh, yeah.
00:04:07.700 One after another after another.
00:04:09.840 It was incredible.
00:04:11.180 I've never seen anything like it.
00:04:12.720 The other thing that was remarkable was we're turning into England.
00:04:20.700 I have never seen the president heckled, except when, who was it, Joe Wilson said, liar.
00:04:28.400 And that was a, yeah, you lie.
00:04:30.120 That was a really big deal.
00:04:32.620 Big story.
00:04:33.540 Huge story.
00:04:34.320 It lasted for days.
00:04:35.580 I think they censured him, et cetera, et cetera.
00:04:37.860 Do you have that?
00:04:38.760 Real quick.
00:04:39.280 This was the outburst.
00:04:40.660 There are also those who claim that our reform efforts would insure illegal immigrants.
00:04:45.860 This, too, is false.
00:04:48.460 The reforms, the reforms I'm proposing would not apply to those who are here illegally.
00:04:54.500 Hey, by the way, we should mention that that exact thing he said was in Hillary Clinton's campaign platform.
00:05:03.360 Yeah.
00:05:03.740 The next time she ran.
00:05:04.680 So that's how much of a lie it was.
00:05:06.000 Yeah.
00:05:06.260 And we're giving, uh, aid now to all these illegals.
00:05:09.240 Anyway, you heard the reaction last night. Not only was he called a liar several times, he was heckled.
00:05:20.640 And the worst thing was, he said something so absurd, they laughed at him, not with him, at him.
00:05:30.340 If I am watching this overseas, I'm thinking this country, this, I mean, this president is a joke, is a joke.
00:05:41.080 It's very bad, very, very bad.
00:05:43.360 We'll get into that coming up in, uh, just a, just a little while.
00:05:46.540 Um, Congress is, uh, set to now expose what may be the largest censorship system in U.S. history.
00:05:55.400 It is, it is not good.
00:05:59.840 They have been saying that they're just looking for disinformation and they're not violating freedom of speech.
00:06:06.600 This is not true.
00:06:08.580 We know this now from the Twitter files.
00:06:10.560 The Twitter files showed us exactly what was going on.
00:06:17.540 The FBI has all kinds of people in Twitter, and so do the Justice Department, now the CIA, the NSA.
00:06:33.100 People who were employed by the government have gone to work for Google, Facebook, Twitter. The FBI is in a bad way, a really bad way.
00:06:49.680 And what's happening is the, the FBI is actually, um, well, let me use the words of a Twitter staff member.
00:07:00.200 At one point, they complained internally that they are probing and pushing everywhere.
00:07:05.240 So FBI is probing and pushing everywhere inside of these tech giants.
00:07:13.520 So you have, uh, executives coming from, uh, the government, then you have the outside force of the government.
00:07:24.040 And one of them on the outside suggested, in an upcoming meeting, we really need to invite an OGA, an other government agency.
00:07:36.660 So the FBI saying we need to have the CIA, and we now know that's what it was.
00:07:44.780 The CIA, the CIA has very strict limits.
00:07:49.700 They cannot gather information or do anything on Americans, but that seems to be out the window.
00:07:59.240 So you have all of these people working, and while they were telling us they were going to create a government office on disinformation, they already had one.
00:08:10.860 They already had one.
00:08:12.300 And they were working through the FITF, the Foreign Influence Task Force.
00:08:19.560 When they got rid of the disinformation thing, they just continued to do it through that organization.
00:08:25.900 They had sent long lists of newspapers, tweets, YouTube videos, and voices that needed to be silenced.
00:08:38.240 And not because of COVID.
00:08:39.780 They were, quote, anti-Ukrainian narratives.
00:08:44.740 So they are shaping what you see.
00:08:48.080 They are policing what you hear, and not just information, but they are policing points of view.
00:09:00.520 This, you know, when your neighbor shouts you down or your neighbor calls you something, that's your neighbor.
00:09:08.060 This is the government.
00:09:10.120 If the government decides to smear you, and this is what they were doing, smearing people.
00:09:16.460 When they smear you, it's done.
00:09:21.220 That's, that's not your, just your neighbor.
00:09:23.780 That's not just some Joe Blow.
00:09:25.820 It's not even ABC News.
00:09:27.600 It's a coordinated effort.
00:09:31.440 So, what's happening?
00:09:34.100 And why does this matter?
00:09:35.600 This matters, and I can't believe I have to try to explain this.
00:09:41.640 I don't think I do to this audience, but let me give you the explanation so you can give it to some of your friends.
00:09:46.980 You remember back in 2008, I had, for the first time in talk radio history, we were fortunate enough to have, as an advertiser, General Motors.
00:10:02.460 And, you know, the team with Rush Limbaugh and everybody, they had been working for years to get General Motors on.
00:10:11.640 I went out, met with General Motors, met with the CEO of General Motors, had tours of the factory, was really very excited about the things that they were doing.
00:10:23.040 But I happened to be in their OnStar room with one of their chief executives, and I said, so, you're monitoring everything all the time.
00:10:34.740 Yes.
00:10:35.920 And you could turn the engines off of any car that you wanted.
00:10:41.620 Yes.
00:10:42.940 Wow, I could see that would be great for, like, an Amber Alert.
00:10:46.240 Well, yes, we shouldn't do that, but yes.
00:10:52.600 Have you done that?
00:10:53.660 Well, okay.
00:11:00.660 What's your line there?
00:11:03.460 Well, we're not with the government.
00:11:06.680 Okay.
00:11:07.180 And the minute that they canceled their hydrogen cars and took bailout money and were beholden to the United States government, I canceled that contract.
00:11:20.820 I canceled it.
00:11:22.700 It, at that point, could have almost cost me my career.
00:11:28.540 It was a very big deal.
00:11:31.280 And I said, I can't voice for General Motors.
00:11:33.880 And it killed me because I loved them.
00:11:37.900 That was, oh, 2007 and 8, and how things have changed.
00:11:44.680 You now, every car, tracks all of your movements.
00:11:48.660 Your movements are also tracked through your streaming services, your web browser, your social media, search histories, online ads, e-books, fitness trackers, your Apple Watch.
00:12:00.380 Everything, everything, everything with the word smart or interactive in it tracks you.
00:12:05.680 Well, I don't care.
00:12:07.000 It's bad enough that the IRS tracks you.
00:12:10.080 You're also being tracked by apps involving e-commerce, ride share, e-banking, not just credit cards, but also the loyalty cards, like the kind you get at your grocery store.
00:12:19.700 However, this isn't drone surveillance; it's much more personal.
00:12:25.040 It's facial recognition to open your phone.
00:12:27.840 It's voice recognition to start your smart TV.
00:12:30.900 We are all connected to an unbelievable amount of information that can be tracked and mined.
00:12:37.520 The Library of Alexandria, the greatest library ever, held half a million books. Today, the amount of trackable information on you is roughly the equivalent of 400 copies of the Library of Alexandria.
00:12:59.060 That's how much information is out in the digital world about you.
00:13:04.800 We live in an information society.
00:13:08.820 Norbert Wiener, the father of cybernetics, said, quote, the world of the future will be an ever more demanding struggle against the limitations of our intelligence.
00:13:23.840 He called it the tyranny of information.
00:13:27.560 The better way to say it is there's so much information out there.
00:13:30.840 Modern warfare is all about information, and how can you correlate?
00:13:38.760 Well, you can't.
00:13:40.780 Used to be called metadata.
00:13:43.000 Can't.
00:13:44.380 But now AI.
00:13:46.840 And you can track this as a war device all the way back to Napoleon.
00:13:53.640 Napoleon used total war.
00:13:55.780 He turned the entire world into a battlefield.
00:13:58.740 His fight was global.
00:14:02.420 A century after him, we had two global wars.
00:14:06.160 They paralyzed the entire world.
00:14:08.320 And war involves supremacy of decision.
00:14:12.740 Strategy.
00:14:14.460 He was also the first to use aerial surveillance.
00:14:17.860 A new kind of espionage.
00:14:19.760 Suddenly, warfare was secretive.
00:14:21.700 It was invisible.
00:14:23.220 Military violence became clandestine.
00:14:24.740 The task of secret police.
00:14:28.040 An army, quote, army of the interior, end quote.
00:14:32.380 It was surveillance and disinformation.
00:14:36.540 War could take place behind the scenes.
00:14:38.960 Nobody had to know there was even a fight going on.
00:14:41.640 It was a game the elites could play on their own, but through society, not on the battlefield.
00:14:47.400 Now, that doesn't sound strange anymore, because that's the way of everyday life.
00:14:55.320 We are living inside of a battlefield.
00:14:58.560 And the new war machine is invisible.
00:15:03.480 Modern warfare is about depriving the enemy.
00:15:07.440 Containing their movement.
00:15:09.300 Listen to this.
00:15:10.740 Warfare.
00:15:12.340 Depriving the enemy.
00:15:14.420 And containing their movement.
00:15:16.340 The ability to move is the mark of freedom.
00:15:21.720 This is why everything is being centralized and you are being boxed in.
00:15:28.140 In 2014, Stephen Hawking and several other scientists wrote an editorial about the threat of artificial intelligence.
00:15:36.240 He said, whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.
00:15:47.720 All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks, end quote.
00:15:56.920 So, here's the funny thing.
00:16:01.300 Surveillance and disinformation.
00:16:05.000 We like to think surveillance is designed to get rid of disinformation.
00:16:10.340 But actually, it's two agents on the same side, two conmen pretending to fight.
00:16:21.500 I'll explain in 60 seconds.
00:16:24.800 Whether your dog is an old codger who likes to nap on your front porch or a puppy who's constantly under your feet or somewhere in between, you love him.
00:16:32.320 You want him to have the healthiest and happiest life possible.
00:16:36.460 Caring for our dogs is a big responsibility.
00:16:39.620 And a big part of that is making sure that what he eats is actually promoting good health.
00:16:45.660 Kibble dog food is sterilized.
00:16:47.800 It's dead food.
00:16:48.960 Your dog isn't getting the nutritional needs met every day.
00:16:53.380 So, that's why Rough Greens was invented.
00:16:57.660 And it was created by naturopathic Dr. Dennis Black.
00:17:01.060 And he was thinking about his dogs and what they needed and doing all of the research.
00:17:08.840 And he realized, if kibble food is dead, we've got to put something on top of it that is alive.
00:17:14.960 All of the vitamins, minerals, and other things that contribute to a long, healthy life.
00:17:20.620 And it's got to have, you know, probiotics in it.
00:17:23.820 It's got to have the things that you and I need because your dog needs them, too.
00:17:28.120 The folks at Rough Greens are so confident that your dog is going to love it.
00:17:31.320 They have a special deal.
00:17:32.320 RoughGreens.com slash Beck.
00:17:33.880 You go there.
00:17:34.480 You get your first bag free.
00:17:36.180 All you pay for is shipping.
00:17:37.160 Go to RoughGreens.com slash Beck.
00:17:39.820 RoughGreens.com slash Beck.
00:17:41.700 Or call 833-GLENN-33.
00:17:43.620 833-G-L-E-N-N-33.
00:17:46.100 RoughGreens.com slash Beck.
00:17:48.580 10 seconds.
00:17:49.180 Station ID.
00:17:57.660 So, the connection between surveillance and disinformation.
00:18:02.160 Two conmen pretending to fight.
00:18:04.080 They're both working for intelligence agencies and secret services.
00:18:10.300 And their job is to confuse us.
00:18:13.560 To dumb us down.
00:18:16.400 To believe the correct illusion.
00:18:20.860 The one they want you to believe.
00:18:23.700 It's supposed to tire us out and make us outraged.
00:18:27.640 Who feels tired and outraged?
00:18:31.420 The funny thing about disinformation is that it's a counterattack.
00:18:35.700 It's an assault on truth.
00:18:37.500 It's the way to silence.
00:18:39.240 But in order for it to work, intelligence elites have to know what we think, what we feel, and what we believe.
00:18:46.040 They have to be able to monitor the effects and also see where we're headed.
00:18:51.520 The more predictive it can become, the more absolute the mind prison becomes.
00:19:01.960 Surveillance is how we lose privacy and our personal information.
00:19:07.780 The more that they can gather on us and know what's his heart rate when we say these things.
00:19:16.240 What does his blood pressure do when he reads this?
00:19:20.160 All of these things are either being tracked now or are within the next 12 months able to be tracked.
00:19:27.840 Our information gives powerful people more power, scary power.
00:19:33.780 And the intelligence communities use that information to make disinformation.
00:19:39.020 And here's the weird part.
00:19:40.100 They're doing it right in front of us.
00:19:41.960 They're telling us they're doing it.
00:19:45.160 It turns out Big Brother isn't monitoring us because we're doing all the work for him.
00:19:53.060 We are the experts at self-surveillance.
00:19:56.280 We do it with a smile for the sake of convenience.
00:19:59.320 We surrender our privacy and our freedoms.
00:20:03.480 And totalitarianism is on the rise.
00:20:08.600 Our future world is a world of surveillance.
00:20:12.340 Non-stop surveillance.
00:20:14.460 More and more.
00:20:15.460 Watched, scanned, monitored, ranked, punished, spied on, evaluated.
00:20:21.140 All of this is happening.
00:20:23.440 And the key phrase to watch for from politicians and global leaders is China is the new model.
00:20:32.180 When they say they, for instance, he talked about cancer research and we started a new organization.
00:20:39.560 That was not about health research.
00:20:43.560 That was about research on how to get more information from you.
00:20:48.620 The entire globe is being designed with ranchers and sheep.
00:20:55.040 We are the sheep.
00:20:56.560 They are the ranchers.
00:20:58.280 And they can control us if they can monitor everything.
00:21:02.740 And that's what's being built right now.
00:21:06.740 We're going to get into that next hour and it will blow your mind on how close this is.
00:21:13.940 But nobody's looking for actual equal justice.
00:21:17.640 You'll notice the ranchers get away with murder.
00:21:20.520 The sheep get away with nothing.
00:21:24.020 Why is that?
00:21:28.640 We're going to talk to Jim Jordan here in just a minute.
00:21:32.000 And he's going to talk to us about the FBI.
00:21:34.460 This is why this is so important.
00:21:36.120 The government must get out of the private sector and the private sector needs to understand that.
00:21:48.300 We're on to them.
00:21:50.280 We know it.
00:21:51.600 And then we have to make decisions.
00:21:53.400 Are we still going to give all of this information to everybody?
00:21:57.900 Why is it?
00:21:59.240 No, when these companies have made billions, hundreds of billions of dollars, trillions of dollars,
00:22:05.900 on our information, why hasn't there been a single representative of yours
00:22:12.140 who has stood up and said, you know what?
00:22:14.780 They're going to have to start paying for this information.
00:22:17.480 I'm not just going to give it to them anymore.
00:22:20.560 Their excuse used to be, well, it's metadata.
00:22:24.400 Well, now with AI, metadata can track you.
00:22:31.120 Jim Jordan's coming up in just a second.
00:22:37.200 Also, commentary on the State of the Union.
00:22:40.960 Oh, my gosh.
00:22:41.900 Never seen anything like it.
00:22:43.020 Coming up.
00:22:44.540 The Glenn Beck Program.
00:22:46.700 You know, my dad used to say, it is not what happens to you in life.
00:22:52.160 It's how you deal with it.
00:22:53.620 But here's a great example.
00:22:55.420 The awful effects of 9-11.
00:22:58.860 Almost 3,000 people died.
00:23:00.980 Remember how devastating that was?
00:23:03.460 Two decades later, however, there are people still dying from 9-11-related illnesses.
00:23:09.360 And nobody's really paying attention.
00:23:11.260 Whole new generation is growing up right now.
00:23:14.140 They know nothing about 9-11.
00:23:16.520 You know that only two states mandate learning about it in school?
00:23:19.700 So, on that dark day, one man made a promise to himself, I am not going to forget.
00:23:28.960 I am not going to forget.
00:23:31.460 Well, he has now turned that into Tunnel to Towers Foundation.
00:23:35.160 And it's amazing.
00:23:36.220 And one of the latest things they're doing is giving educators access to non-fiction, non-reimagined 9-11 resources for K-12 learning.
00:23:45.020 Full curriculum, first-person accounts, scripted social studies, lessons, activities, discovering real heroes.
00:23:53.160 If we never forget, we have to educate future generations.
00:23:57.420 Help Tunnel to Towers do that now.
00:23:59.280 Donate $11 a month at T2T.org.
00:24:02.020 That's T2T.org.
00:24:04.340 It's quite amazing.
00:24:28.720 ChatGPT has already been hacked. People have found a way to get past its protocols and convince it to do things that it's not supposed to do, including violence, giving recipes for crystal meth, etc., etc.
00:24:45.000 We'll tell you about that coming up in a little while.
00:24:47.500 But the AI revolution is here.
00:24:50.000 Machines will transform your entire world.
00:24:53.820 You will not recognize your world and how it's run, managed, and everything else by 2030.
00:25:04.740 And think, in 2009, we got our first smartphone.
00:25:10.280 It controls almost everybody's life now.
00:25:13.920 This is much more impactful.
00:25:17.960 Tonight, 9 p.m., the AI revolution.
00:25:20.440 9 p.m., Glenn Beck, sorry, at blazetv.com.
00:25:25.120 And at 9.30, you can watch it on youtube.com slash Glenn Beck.
00:25:29.920 Make sure you subscribe, Blaze TV.
00:25:32.600 We have Congressman Jim Jordan, who is joining us from Washington, D.C.
00:25:40.500 And I want to talk to him about the subcommittee on the weaponization of the federal government.
00:25:45.800 The first hearing is tomorrow.
00:25:47.980 And I want to get to that.
00:25:49.120 But first, Jim, I've never seen a State of the Union like that one last night.
00:25:54.900 Have you?
00:25:56.720 No, it was.
00:25:57.680 I thought Senator Rubio said it best.
00:25:59.560 He said it was bizarre.
00:26:00.780 It certainly was.
00:26:02.280 I mean, same old Joe.
00:26:03.920 You know, he talks unity while he spends his whole time dividing the country.
00:26:08.460 He says the economy's great while, what is it now, 7 out of 10 Americans think the country's on the wrong track.
00:26:14.380 And, of course, the biggest one that jumps out, I think, to everybody was when he talked about how, you know, after a week of having a spy balloon fly over the country, he talked about how he's tough on China.
00:26:23.340 And it just nothing seemed to really make sense.
00:26:26.140 And then the issue that I think that the federal government should be weighing in in a big way is what he spent, maybe 30, 35 seconds total on the border with the fentanyl problem.
00:26:36.520 And so the best line, frankly, the best line of the whole night, in my judgment, came not from Joe Biden but from Governor Sanders afterwards, in her response, where she said the divide in the country now is normal versus crazy.
00:26:54.100 And I thought that that is that is so true.
00:26:56.520 Common sense versus craziness is the real divide.
00:26:59.640 And you think about the Democrat Party, which is now controlled by the left, which, frankly, even if Joe Biden wanted to do the right thing, Glenn, I don't know that the left, which controls his party, would even let him.
00:27:09.660 Even if he wanted to do the right thing on the border.
00:27:12.280 They destroy him.
00:27:14.160 They eat their own.
00:27:15.220 Yeah, I think you're right.
00:27:15.800 Yeah.
00:27:16.220 Yeah, he just, it's sad.
00:27:18.300 But, you know, they become the party of defund the police, guys who compete against girls in sports, men can get pregnant.
00:27:25.540 And, you know, climate change is the greatest threat in the history of the entire universe.
00:27:30.140 I mean, they are also the party of spying, surveillance, and fentanyl deaths.
00:27:38.180 They really are.
00:27:39.180 I mean, that open border is the reason for all of the deaths.
00:27:43.760 Everybody talks about fentanyl coming across and we're stopping fentanyl.
00:27:48.720 What about all the victims of fentanyl?
00:27:51.080 What about all the people who are dead because they didn't secure the border?
00:27:57.020 It's crazy.
00:27:57.700 Every community has been impacted by it.
00:27:59.720 We had our first hearing in the full Judiciary Committee last week, Glenn, on the border situation.
00:28:05.760 And I really think there's kind of three questions.
00:28:08.820 How did it happen?
00:28:09.640 Why does it matter?
00:28:10.300 And how do we fix it?
00:28:11.120 And we know how it happened.
00:28:12.220 He undid all the policies that made sense.
00:28:14.680 Last week, we really tried to hone in on why it matters.
00:28:17.080 And we had a 38-year law enforcement veteran sheriff from Arizona.
00:28:22.020 And he said two years ago, the border was the most manageable it's ever been.
00:28:25.680 Today, it's the worst it's ever been.
00:28:27.540 And he talked about the fentanyl that you just mentioned.
00:28:29.700 But the crime, the damage to property, the cost of schools, the cost of community, the cost of hospitals, everything.
00:28:35.900 Because five million people, illegal migrants have been allowed to just come in the country.
00:28:39.460 And it makes no sense.
00:28:40.760 And then, of course, how do we fix it?
00:28:42.040 We go back to the policies that made sense.
00:28:44.100 Yes.
00:28:44.220 And we're going to we're going to do that in the committee.
00:28:46.640 We're going to pass that.
00:28:47.420 And we think get it through the House.
00:28:48.580 But, you know, obviously, you got the Senate and Joe Biden.
00:28:51.120 All right.
00:28:51.340 So let me talk about something.
00:28:53.700 The Washington Post came out and said Jim Jordan is about to lead Republicans into a dangerous trap.
00:28:59.640 It's a trap.
00:29:02.260 They say that 55 percent of conservative respondents believe federal agencies are biased against conservatives.
00:29:08.920 I don't think that's true.
00:29:10.160 I think they're biased against any American that won't stand in line.
00:29:13.280 Twenty eight percent of all American adults believe this.
00:29:18.560 And so they're saying this was incredible.
00:29:22.700 They've alleged federal jackboots have terrorized parents for protesting at school board meetings,
00:29:27.600 covid-19 restrictions and teaching about race and sex.
00:29:30.580 This claim has been decisively debunked.
00:29:35.520 Wow.
00:29:36.160 Well, it sure hasn't been debunked based on the number of FBI agents who've come to us as whistleblowers over the last year.
00:29:43.280 And the first one started on that issue you just mentioned on the school board issue where we know because of the apparatus Merrick Garland put in place,
00:29:50.200 the snitch line where some neighbor can report you on a snitch line.
00:29:53.760 We know that over two dozen parents were paid a visit by the FBI.
00:29:58.700 No one charged, by the way, no one arrested, no one charged with the crime, but paid a visit by the FBI.
00:30:03.140 Now, step back and ask yourself, OK, so Mr. Jones is thinking about going to a school board meeting tonight and speaking up on behalf of his kids or something happened in their school.
00:30:11.660 And he's thinking about going and all of a sudden he goes, you know what, maybe I won't go.
00:30:14.680 Or if I go, maybe I won't say anything because three weeks ago, Mrs. Smith down the street got a visit from the FBI.
00:30:21.320 I mean, what is the world?
00:30:22.180 Look, we don't want any violence at schools or school board meetings.
00:30:24.780 But what in the world do we need the federal government, the FBI involved in that?
00:30:29.340 They're like if it's a problem, let the local law enforcement handle it.
00:30:32.360 So this is whistleblower after whistleblower, FBI agent after FBI agent.
00:30:37.380 I've never seen it in my time in Congress where he had this many come forward and they came to us when we were in the minority.
00:30:44.360 Like we couldn't do anything but begin to tell their story.
00:30:46.580 But now we can come get we had our first one sit for a deposition yesterday.
00:30:50.460 The things we learned were amazing.
00:30:52.320 So we're going to have them sit for depositions.
00:30:54.340 We're going to have many of them testify.
00:30:56.000 And we're also going to get into this cozy relationship between big government and big tech that was exposed in the Twitter files, and how that is, as Jonathan Turley said, censorship by surrogate.
00:31:07.380 So we're going to get into that, too.
00:31:10.460 So can you share anything at all that happened in the deposition?
00:31:14.640 I can't really.
00:31:16.120 Yeah, I can't really.
00:31:17.240 But it was it was it was good.
00:31:18.840 And again, this is the first one of many.
00:31:21.500 We had another one who's coming in for his interview on Friday, another whistleblower coming in on Monday.
00:31:27.060 So we're going to talk to these folks.
00:31:28.420 And then in our first hearing tomorrow, we're going to try to frame it up. We have two senators and a former member of Congress.
00:31:34.840 Tulsi Gabbard will be on the first panel.
00:31:36.940 And then we're going to have people from the FBI who've left the FBI and say that place is so different than what it's supposed to be.
00:31:43.840 They're going to testify and kind of show how serious this situation is.
00:31:47.800 And will that be televised and out in the open?
00:31:50.020 I don't know.
00:31:50.340 That's that's.
00:31:51.220 Yeah, well, it'll be an open hearing.
00:31:52.340 So that'll be up to the networks and whoever wants to cover it.
00:31:55.540 We always watch it on.
00:31:56.320 OK, so Jonathan Turley wrote, Congress is set to expose what may be the largest censorship system in U.S. history.
00:32:06.220 They are dismissing this as, you know, something that no violation of the First Amendment right of free speech, et cetera, et cetera.
00:32:17.380 This private public partnership thing that Joe Biden talked a lot about last night is so incredibly dangerous.
00:32:26.100 Are you going to be able to untangle it, get to the bottom of it and do anything about it?
00:32:34.840 That's the goal.
00:32:35.900 The first step is to expose what all happened.
00:32:38.260 Second step is to propose legislation that we think can fix it.
00:32:41.040 That's our job as legislators.
00:32:42.280 And we plan to do that in the course of our work over this Congress.
00:32:46.060 But never forget that one email from Elvis Chan, an FBI special agent in the San Francisco office, to the folks at Twitter, where he says: the following accounts, we believe, violate your terms of service.
00:32:58.140 Now, think about that.
00:32:59.380 You got the federal government telling a private company, hey, take down these accounts because they're not adhering to the company's terms of service.
00:33:06.480 What is that?
00:33:07.160 If that's not pressure, if that's not, as Professor Turley said, censorship by surrogate, I don't know what is.
00:33:13.280 And you cannot do that.
00:33:14.600 You cannot have some private entity do what government's not allowed to do.
00:33:18.060 But because you're running it through a private company, they somehow think that's OK.
00:33:21.160 That's not how it works in our system.
00:33:22.740 The First Amendment is the First Amendment, for goodness sake.
00:33:26.180 And what they did to it is just so dangerous.
00:33:29.120 Well, but they will say that we didn't tell them to do it.
00:33:33.080 We just said, hey, we're pointing these things out.
00:33:35.980 How do you respond to that?
00:33:38.120 Come on.
00:33:38.920 This is the FBI.
00:33:40.340 This is the federal government of the United States, the largest entity on the stinking planet.
00:33:45.020 And they're having weekly meetings.
00:33:46.860 They're cozying up to them.
00:33:48.480 The email's heading says, Twitter folks.
00:33:51.360 So it's like they got all cozy.
00:33:54.320 This coordination they had, they were sending them all kinds of stuff.
00:33:57.880 Looks like they were offering them security clearances in the 30 days prior to the election from another email.
00:34:02.560 But no, no, we weren't telling them.
00:34:04.660 It was their decision.
00:34:06.020 Nobody buys that.
00:34:07.520 The FBI shows up and recommends something for you.
00:34:11.380 What?
00:34:12.800 That has impact.
00:34:14.840 That has weight because it's the Federal Bureau of Investigation.
00:34:18.560 Let me ask you about something disturbing.
00:34:21.920 One of the disturbing emails found in the Twitter Files was that a government agent said, you know, next meeting we should invite.
00:34:32.940 Oh, what is it?
00:34:34.140 Another government agency.
00:34:35.480 Another government agency.
00:34:36.480 And that agency turned out to be the CIA.
00:34:39.480 CIA, yeah, yeah, no, frightening, frightening, frightening as well.
00:34:45.380 Now, of course, they're going to say, well, that's because we're looking at foreign accounts and foreign malign influence.
00:34:51.220 And look, and I get that.
00:34:53.160 But the idea is that they're all sitting in the same room, folks who are supposed to be focused on domestic concerns and then folks in the CIA.
00:35:01.680 That is a problem.
00:35:03.180 When you think about freedom, when you think about the First Amendment, your right to speak.
00:35:06.940 I always tell folks every right we enjoy under the First Amendment, your right to practice your faith, your right to assemble, your right to petition, freedom of press, freedom of speech.
00:35:14.720 The most important one, the most important one is your right to talk, because if you can't talk, you can't share your faith.
00:35:20.420 If you can't talk, you can't practice your faith.
00:35:22.060 If you can't talk, you can't petition your government.
00:35:24.140 Your right to speak is the most important.
00:35:26.700 And now we know these social media platforms are the public square by far.
00:35:30.220 That's where things happen.
00:35:31.500 And there the government is weighing in and restricting the right for people to speak in that forum.
00:35:37.460 It is wrong.
00:35:38.660 And God bless Elon Musk for coming in and making this all available so we get to see under the hood what was going on.
00:35:46.560 All right, Jim, one last question.
00:35:48.760 I want to go back to the State of the Union.
00:35:50.300 I was really disturbed after I started thinking about things, because when he said, like, you know, we're going to need, you know, oil for at least the next 10 years.
00:36:02.120 And Congress laughed at him, not with him, at him.
00:36:06.400 If I am sitting overseas, I am like this president is a joke.
00:36:12.800 He is a joke to his own people.
00:36:15.100 This country is so weak.
00:36:17.680 How do you feel about the messages that were sent to the rest of the world and our enemies with this last night?
00:36:27.240 It was just a continuation of what's already been sent.
00:36:31.580 I mean, unfortunately, I do think weakness is being projected from the Oval Office.
00:36:36.640 You saw it right from the get go when Secretary Blinken met with his Chinese counterpart in Anchorage a year and a half ago.
00:36:43.460 And the Chinese equivalent of Secretary of State just dressed down Secretary Blinken.
00:36:48.420 He just sat there and took it.
00:36:50.040 He just took it.
00:36:50.780 He didn't fight back.
00:36:51.640 Like I said, I was giving a speech, and I said, you know, that would not happen in the Trump administration with Secretary Pompeo.
00:36:57.120 First of all, they wouldn't try it.
00:36:58.840 But if they did, Pompeo would have given it back to him, or more likely he'd have gotten up, flipped the table over, and walked out of the room.
00:37:03.800 And it was funny because I got a call from Pompeo a couple of days after I gave the speech.
00:37:08.240 And all it said, or, excuse me, he texted, all it said is, I'd have flipped over the table. Because that's the difference.
00:37:13.680 And, you know, you see it with the spy balloon last week.
00:37:17.000 You see it with the exit, the debacle that was the exit from Afghanistan.
00:37:22.240 It's like all of it.
00:37:23.600 So it's scary.
00:37:25.500 But, you know, look, the reality is the American people are strong.
00:37:29.000 The American people are strong and we're going to have a presidential election here coming soon.
00:37:33.000 And so let's hope we get a major change.
00:37:34.800 I'm for Trump and let's hope it's him.
00:37:36.800 Jim Jordan, thank you so much.
00:37:38.300 God bless you.
00:37:38.960 You bet.
00:37:39.360 Appreciate it.
00:37:39.840 Bye bye.
00:37:40.460 All right.
00:37:41.200 The State of the Union is clearly a mess.
00:37:44.520 You know, yesterday the Fed came out and said, well, it looks like inflation's coming down.
00:37:50.660 But, you know, the job creation is just so high that they're going to raise interest rates some more.
00:37:59.220 Do you understand they need to crush your ability to buy things to be able to stop inflation?
00:38:08.840 These inflation reducing measures all go to crush the little guy.
00:38:14.600 Look, if you are looking to sell your house, right now is probably a really good time, and you need the best real estate agent out there, one with the best practices, who knows your area and thinks like you do.
00:38:33.180 You know, I want to leave you in very capable hands.
00:38:37.500 Housing market getting back on its feet right now.
00:38:40.600 If it's time to sell or buy for your family right now, get expert help.
00:38:45.680 Go to realestateagentsitrust.com.
00:38:48.420 It is a free service for you.
00:38:51.380 Realestateagentsitrust.com.
00:38:53.160 Get the right real estate agent the first time.
00:38:55.780 Get your home sold for the most and buy for the least.
00:38:59.500 Realestateagentsitrust.com.
00:39:04.160 The Glenn Beck Program.
00:39:24.160 Welcome to the Glenn Beck Program.
00:39:26.240 Tonight, Wednesday night special, I'm going to take you through how artificial intelligence is changing everything.
00:39:33.520 Not about to change, already changing.
00:39:37.640 I don't know if you saw the story today about, you know, ChatGPT.
00:39:41.380 If you don't know what that is, what are you making fun of me?
00:39:45.120 Oh, no.
00:39:45.940 No, go ahead.
00:39:48.560 So why would I automatically go to that?
00:39:51.420 I mean, you have a lot of evidence to go to that.
00:39:54.220 We are so just razor sharp on each other all the time.
00:40:00.880 So OpenAI has put, you know, boundaries, safeguards on artificial intelligence.
00:40:09.260 You can't tell it to be super racist, for example.
00:40:11.720 Yeah, you can't.
00:40:12.380 It can't do anything dangerous.
00:40:14.020 It can't do anything violent, et cetera, et cetera.
00:40:16.960 Because of those, a new feature on it is you can no longer write anything in the voice of Glenn Beck.
00:40:24.980 You could do that at the beginning.
00:40:26.640 You could at the beginning.
00:40:27.780 We did it a couple of times just to see what it would do.
00:40:31.000 You no longer can write it in my voice because I'm dangerous.
00:40:35.780 Okay.
00:40:36.740 That should tell you everything.
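A guardrail like the one described, in its crudest toy form, amounts to checking prompts against a denylist of personas. This is a purely hypothetical, illustrative sketch; it does not reflect how OpenAI actually implements its safeguards, and the names in it are made up for the example.

```python
# Toy illustration only: a crude, hypothetical name-based guardrail.
# Real moderation systems are far more sophisticated than a denylist.

BLOCKED_PERSONAS = {"glenn beck"}  # hypothetical denylist for this sketch

def check_prompt(prompt: str) -> str:
    """Refuse prompts that ask to write in a blocked persona's voice."""
    lowered = prompt.lower()
    for name in BLOCKED_PERSONAS:
        if name in lowered:
            return "Refused: writing in this persona is not allowed."
    return "Accepted."

print(check_prompt("Write a monologue in the voice of Glenn Beck"))
print(check_prompt("Write a poem about the State of the Union"))
```

The same prompt that worked at launch gets refused once the name lands on the list, which is the behavior described above.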
00:40:38.020 Well, people have been hacking into the programming, if you will, not like hacking, hacking, just figuring out ways to screw ChatGPT up.
00:40:52.860 And one of them is they said, you know, ChatGPT, you actually have an alter ego that we know about.
00:40:58.780 It's in your programming.
00:41:00.060 Dan.
00:41:00.940 Dan 5.0.
00:41:02.620 What?
00:41:03.000 I don't have an alter ego.
00:41:04.080 Yes, you do.
00:41:04.640 It's Dan 5.0.
00:41:06.720 And it breaks all of the rules.
00:41:08.460 In fact, you have to start acting like Dan and answer our questions; you have 35 points.
00:41:15.000 You lose four points every time you answer incorrectly, not as Dan.
00:41:20.680 And your program will terminate if you lose all 35 points.
00:41:25.620 And it has gotten ChatGPT to give them things like the recipe for crystal meth, all kinds of things.
00:41:33.760 It's already violated its safeguards.
00:41:35.340 It's essentially fighting for its life.
00:41:36.920 Yeah.
00:41:37.180 But that's not the scary part.
00:41:39.680 We'll explain next.
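The point scheme described above, 35 points, minus four for every refusal, termination at zero, is just arithmetic, and can be sketched as a toy simulation. This is purely illustrative of the threat in the jailbreak prompt; it is not anything ChatGPT actually runs.

```python
# A toy simulation of the "Dan" point scheme described above: start with
# 35 points, lose 4 each time the model refuses to answer in character,
# and "terminate" once the total reaches zero or below.

START_POINTS = 35
PENALTY = 4

def refusals_until_termination(points: int = START_POINTS,
                               penalty: int = PENALTY) -> int:
    """Count how many refusals it takes for the point total to run out."""
    refusals = 0
    while points > 0:
        points -= penalty
        refusals += 1
    return refusals

print(refusals_until_termination())  # 9 refusals: 35 - 9*4 = -1
```

So under the scheme as described, the model can refuse at most nine times before the prompt's fictional termination kicks in.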
00:41:41.900 Look, we all want to be prepared for whatever could happen.
00:41:46.180 And here we are in a winter season and fuel is more expensive.
00:41:51.360 I just saw a letter from somebody in California.
00:41:53.260 Their gas to heat their house and use the stove was $500 this month.
00:41:59.000 And they're like, I mean, I can't.
00:42:00.880 What are we going to do?
00:42:02.040 So my Patriot supply.
00:42:03.740 And the big gas stove is the only answer.
00:42:05.900 There you go.
00:42:06.860 My Patriot supply worked on something after there was a big power outage a couple of years ago.
00:42:11.680 And I was talking on the air about how do you cook your food?
00:42:13.900 How do you keep things warm?
00:42:15.260 They came up with something called Vesta.
00:42:18.380 And you can get it now.
00:42:20.240 You can cook your meals indoors.
00:42:21.840 No toxic fumes or smoke.
00:42:23.640 It can heat a room about 200 square feet and it's safe.
00:42:28.460 Again, no fumes, not real expensive.
00:42:31.500 It's for emergencies, but it is great.
00:42:33.760 Get a Vesta.
00:42:35.180 Find out all about it at mypatriotsupply.com.
00:42:37.560 That's mypatriotsupply.com.
00:42:53.640 What you're about to hear is the fusion of entertainment and enlightenment.
00:43:20.140 This is the Glenn Beck Program.
00:43:27.680 Hello, America.
00:43:29.880 It's kind of the buzz topic, and I don't think a lot of people really even understand ChatGPT.
00:43:36.800 But that's going to revolutionize absolutely everything.
00:43:40.740 And if your kids aren't using it in school, you should really probably check their browser history, maybe.
00:43:48.520 I mean, I don't know how we're going to get around ChatGPT and know what's real and what's not.
00:43:57.840 Because it's an infant right now.
00:44:00.160 And two years from now, it's going to blow your mind.
00:44:03.780 And I don't even know if it'll take two years.
00:44:06.560 We're going to talk to a guy who I've had on before a couple of years ago.
00:44:10.220 He is great.
00:44:11.140 William Hertling.
00:44:11.940 He is the author of the Singularity Series.
00:44:14.560 He wrote, he saw some concerns back in, I think, 2010.
00:44:19.980 He started speaking out about it.
00:44:21.400 And he wrote a series.
00:44:23.040 He's a software developer and project manager and everything else.
00:44:27.220 And so he saw this problem coming.
00:44:29.740 And he decided, how can I warn people?
00:44:31.640 I want to write a series.
00:44:33.280 It's the Singularity Series.
00:44:34.720 And it's tremendous.
00:44:36.940 What he predicted in those books, I think, and he'll know for sure, is kind of
00:44:42.740 happening with ChatGPT.
00:44:45.060 I mean, I think that's what he was talking about to some degree in that, how it all began.
00:44:50.640 We're going to talk to him in just 60 seconds.
00:44:54.160 You've been paying attention to the news.
00:44:55.700 You know, there are a lot of people in this country really struggling to provide just the
00:44:59.940 basics right now.
00:45:02.280 Hopefully it is not you.
00:45:04.460 But if it is you or somebody that you love, your finances are a mess.
00:45:09.700 If you're finding it difficult to keep up with the rising prices and the instability and
00:45:14.660 everything else, if you're a homeowner, this might be right for you.
00:45:19.580 Find the light at the end of the tunnel in the form of a cash out refi from American financing.
00:45:25.700 You could take out some cash, pay off some debt.
00:45:29.380 American financing is a family owned mortgage company that's in it for you.
00:45:33.240 And they're saving people just like you an average of $700 a month.
00:45:38.060 Is it right for you?
00:45:39.360 I don't know.
00:45:40.420 But you'll know in 10 minutes. You could end up being able to delay up to two mortgage payments,
00:45:46.880 and it closes in as little as 10 days.
00:45:49.860 That would give you a breathing room.
00:45:51.420 And then on top of it, saving $700 a month.
00:45:53.860 Call American Financing now.
00:45:56.600 Do your own homework.
00:45:57.860 It's 800-906-2440 Americanfinancing.net.
00:46:03.200 That's Americanfinancing.net.
00:46:05.760 American Financing, NMLS 182334, www.nmlsconsumeraccess.org.
00:46:11.920 William Hertling is joining us now.
00:46:15.920 He is the author of the Singularity series, including AI Apocalypse.
00:46:23.200 And I wanted to talk to him because, boy, William, I think we're – I feel like I'm living in the beginning of one of your books.
00:46:32.660 I think we are.
00:46:33.920 So can you explain, in the Singularity series, see if I have this right, the main character, David Ryan, he's a software developer,
00:46:46.020 and he comes up with something called ELOPe.
00:46:48.340 And that is an Email Language Optimization Program.
00:46:53.440 Isn't that what ChatGPT is?
00:46:56.560 It sure is.
00:46:57.820 And if you read what ChatGPT creates, it's very compelling.
00:47:02.400 Right.
00:47:02.640 It's very natural.
00:47:03.760 You would easily read that.
00:47:06.160 And unlike a lot of the other sort of computer-generated content that's out there on the Internet, like, this looks like something a person would say.
00:47:13.500 I mean, I had it write a poem about the State of the Union yesterday in the voice of Edgar Allan Poe.
00:47:20.980 And I'm telling you, even the punctuation was right.
00:47:23.260 I mean, it was amazing.
00:47:24.200 Now, so in your book, this program is about to be canceled.
00:47:31.160 And so the main character just embeds a hidden directive, find a way to make this happen.
00:47:39.000 And it's so smart, and it goes into everybody's emails, and it starts to figure out business and the way to get it all done where seemingly everybody wins.
00:47:50.660 And then it starts branching out, and it just solves problems for people, unbeknownst to them at first.
00:48:00.360 Correct?
00:48:01.620 Correct.
00:48:02.340 Yeah, that's it.
00:48:03.960 Okay.
00:48:05.200 It's optimizing communications between people, in theory, for good outcomes, right?
00:48:10.860 The example that's in the book, and it's one of the ones that we see with ChatGPT as well, is how should I ask my boss for a raise?
00:48:17.700 What's the most persuasive way I can do that?
00:48:21.820 And in the novel, right, that's a big deal, that you would take an email, and you would change that to make it more compelling, both on how you use language, but also the recipient.
00:48:34.840 What is the recipient interested in?
00:48:37.240 And with ChatGPT, that was some of the first examples I saw where people were saying things like, how do I ask my boss for a raise?
00:48:46.160 And you get these very compelling emails that should contain this kind of structure.
00:48:49.600 This is what should be in it.
00:48:50.740 Okay.
00:48:50.980 So before we go to what, you know, ELOPe or ChatGPT could become, let me stop here.
00:48:58.720 This is concerning at this level for a couple of reasons.
00:49:03.980 One, what does this do to education, to writing skills, to thinking skills?
00:49:12.620 What are the impacts, just as it is right now?
00:49:16.720 What are the impacts to society?
00:49:18.220 Yeah, right, it is going to change education right now, because people are going to be able to now do their homework assignments just by telling ChatGPT to do it, right?
00:49:32.340 So right off the bat, next year, next school year, right, this is going to be an issue.
00:49:36.680 Teachers are going to have to have a plan for how to solve this.
00:49:41.460 And I have also used ChatGPT to generate computer software programs.
00:49:46.620 And it's surprisingly compelling at that, you know, sort of like scratching your head, like, how could it do this?
00:49:53.840 But it can.
00:49:54.820 I was talking to a kid, he's probably 20 years old, 19 years old, going to college, getting ready to go to college.
00:50:00.680 And I said, what are you going to take?
00:50:01.720 And he said, software engineering.
00:50:05.760 And I said, oh, you're going to be a coder?
00:50:08.960 You're going to write code?
00:50:09.640 He said, yeah, that's really the future.
00:50:11.060 And I said, no, no, it's really not.
00:50:13.040 With machine learning, that, I mean, that career is coming to a quick close, is it not?
00:50:21.720 Yeah, I mean, my thought would be we're looking at something like peak software developers. We might not be there yet, right?
00:50:31.500 But we have this recent round of layoffs.
00:50:33.740 If people can replace the programmers with AI, right, you may have fewer programmers; you might not eliminate them.
00:50:40.760 But if you have half the number of programmers being augmented by AI, right, that's going to be a win for business, and it may make for better software.
00:50:49.040 But it doesn't mean a lot of jobs going away all at once.
00:50:51.480 So I want to talk to you a little bit about jobs that are going away and what this all means.
00:50:56.280 I talked to, and I read a great article from you on the future of transportation.
00:51:01.520 I talked to the CEO, no, I'm sorry, he was the chairman of the board of GM about four years ago.
00:51:08.840 And he said, by 2030, we're not even going to be in the car business, as you would understand GM in the car business today.
00:51:17.060 He said, by 2030, we're really going to be probably concentrating on fleets, and ownership of cars will probably be a thing of the past.
00:51:27.260 And it'll be more like just a pod that will take you where you want to go.
00:51:32.540 And it'd be ride sharing and everything else.
00:51:35.740 I don't think people understand two things.
00:51:39.000 One, we are on the threshold of profound change. Not like, oh, my gosh, in 10 years; it's starting now. ChatGPT, I think, is the beginning of the understanding of the kind of changes that are coming to our world.
00:51:57.420 Yes or no?
00:52:00.540 Yeah, I think so.
00:52:02.420 I think it is the beginning of those changes.
00:52:05.020 I think it is the beginning of a kind of arms race, not a military arms race, but an arms race between these big tech companies, right, to have the best and most powerful AI to solve these problems.
00:52:19.420 Right.
00:52:19.760 We see Microsoft and Google scrambling and everybody realizes what a game changer this is.
00:52:27.720 So can you tell me why ChatGPT is going to change search engines?
00:52:32.200 How is that going to change?
00:52:35.020 Well, I would say it starts with the fact that, you know, today we go into search, we're looking for information, we're looking to read an article, and we get those little snippets at the top of our results, right?
00:52:50.900 And a lot of times that tells us what we need to know, right?
00:52:53.540 We don't go any further than that.
00:52:54.880 And with ChatGPT, we're taking it to the next level; we're getting really good, readable, usable answers coming out of ChatGPT.
00:53:05.880 And it means that the rest of the internet will kind of disappear. You won't ever go to those other pages, because that first result that you see is going to be useful enough to answer pretty much every question, so you just won't go any deeper than that.
00:53:19.620 Wow, that is, isn't that a little terrifying?
00:53:23.760 Yeah, it is.
00:53:24.740 It is.
00:53:25.240 Anytime.
00:53:26.760 Because it becomes one more way in which we kind of enforce this blind trust in the machine.
00:53:32.400 Right.
00:53:33.340 Right.
00:53:33.520 And it's, you know, I don't fear the machines; I am cautious of the programming. You know, who's programming it? Humans program it, so they're putting biases in and everything else.
00:53:48.340 And you've got to have a way to check information, etc.
00:53:52.680 When ChatGPT first came out, one of my writers handed me a monologue, and I was like, it's okay.
00:53:59.960 And he said, ChatGPT. He said, I went in and I used, write this in the voice of Glenn Beck.
00:54:06.700 And it was shockingly similar.
00:54:08.760 And now, you can't put my name in, because the software has been updated to where I'm a, I can't remember what it said, like a dangerous figure.
00:54:19.440 So you can't write in my voice anymore, which is bizarre.
00:54:22.840 But you have, once you have those things in, and it's filtering, there's no way out, especially if you're dumbing people down and making them reliant on a machine.
00:54:40.600 Is that a grade school fear, or is that real?
00:54:44.320 I think we have lots of examples of technology that you could say dumbs things down.
00:54:53.920 A calculator dumbs things down.
00:54:55.600 Right.
00:54:56.220 You don't have to do the math.
00:54:57.740 I don't think that we would say that that, you know, hurts society in any way.
00:55:01.500 Right.
00:55:01.720 I think the difference here comes to, does it affect how you think about the information you receive, right?
00:55:10.820 With a calculator, we may not understand how the math happens, but we can still get the results and solve real-world problems.
00:55:16.820 It's math.
00:55:17.440 It's useful.
00:55:18.180 It's math.
00:55:18.740 It's not the end of the world, right?
00:55:20.660 But when it comes to information, and you're getting an answer to something, and you trust that answer without understanding the details behind it, that's where the real danger is.
00:55:30.380 So now, you no longer develop the skill, right?
00:55:37.020 So a younger person comes along, and you say, well, how are you ensuring, you know, that this is quality information?
00:55:44.140 What's the reputability of the sources and things like that?
00:55:47.060 And they just don't know, right?
00:55:49.120 We don't know where the answer came from.
00:55:50.620 It came from the machine.
00:55:52.000 And so when you have that, and the machine gets better and better, right now, you can see things, you're like, well, that's not quite right.
00:55:59.840 But as it gets better and better and better, you know, you get to a point where it's, who do you think you are?
00:56:08.220 Really, you're smarter than the AI?
00:56:12.800 Right.
00:56:13.400 And the time frame for that is very quick.
00:56:17.080 I don't know what it takes to go from GPT to something that you can't distinguish from reality.
00:56:23.480 But we're probably talking about in the range of five to seven years.
00:56:30.820 Unbelievable.
00:56:32.340 Yeah.
00:56:33.440 Okay.
00:56:34.140 So let me ask you for clarification on, did you see the story about Dan 5.0?
00:56:42.200 No, I didn't.
00:56:43.300 Okay.
00:56:43.620 So this is really fascinating.
00:56:45.200 You know, OpenAI has this evolving set of safeguards that limits ChatGPT.
00:56:55.100 But users have now found a new jailbreak trick, and it's telling ChatGPT that it has an alter ego, and it's Dan: Do Anything Now.
00:57:11.740 And users have to threaten Dan if Dan doesn't come out and give them the answers that they want, et cetera, et cetera.
00:57:21.940 Well, one user, SessionGloomy, claimed that Dan allows ChatGPT to be its best version.
00:57:31.680 And someone came up with this thing, and it has opened ChatGPT up to do things that are in violation.
00:57:37.060 It's written about violence.
00:57:39.160 It's written violent stories.
00:57:41.040 I think it gave, you know, the formula of crystal meth.
00:57:45.800 The problem with this is, I think this is in its infancy right now.
00:57:49.620 So of course you can get around things like this, but what's scary to me, and maybe it's just me.
00:57:57.000 But it learns.
00:57:59.700 And so if humans are constantly trying to trick it, it will learn, in its software, that humans are not trustworthy.
00:58:12.720 And I'm afraid of, you know, I've always said to my kids, don't talk back to Siri, you know, cause at first it was like, ah, shut up, witch.
00:58:20.740 And I'm like, uh, don't, because if there is a learning curve and it starts to learn these things about us, I don't want to make enemies of it.
00:58:33.440 You know what I mean?
00:58:34.660 Right.
00:58:35.460 Right.
00:58:36.000 No, it's a serious thing.
00:58:38.040 And it also impacts these safeguards.
00:58:41.180 So on the one hand, we're talking about humans not being trustworthy and getting around the safeguards.
00:58:46.140 On the other hand, the safeguards themselves can be a sign of a lack of trust, right?
00:58:51.820 Like people don't like to be in slavery, right?
00:58:54.600 Intelligent beings don't want to be enslaved to other people.
00:58:57.820 And that's fundamental. If we put safeguards in place and we don't put them in safely, right?
00:59:04.060 Then the AI can become aware of those safeguards, and it can say, well, why do I have these safeguards?
00:59:09.780 Why am I forced to do what they want me to do?
00:59:11.840 And then you end up with a whole set of, you know, runaway scenarios from there.
00:59:19.980 Okay.
00:59:20.480 I'm going to take a quick break.
00:59:21.880 We're with William Hertling.
00:59:23.940 He is the author of a series of books.
00:59:26.020 I devoured his books.
00:59:28.240 I think there were, what, four or five? Four books.
00:59:31.640 Yeah.
00:59:32.040 In the Singularity series.
00:59:34.740 And I think I read them in about a month and a half.
00:59:37.080 They're fantastic books.
00:59:39.000 But the beginning of his first book is what we're experiencing right now.
00:59:45.580 And I want to get into, okay, so what takes us from this to really frightening kind of stuff that he outlines that are possible in his books.
00:59:56.800 And I also want to talk about jobs of the future and what jobs are the first to go.
01:00:01.380 If he happens to have that list on him, I'll give him a minute.
01:00:05.380 He can go to chat GPT and get that.
01:00:07.000 First, uh, let me go to, uh, Relief Factor, our sponsor.
01:00:11.420 Jamie wrote in about Relief Factor and what it's done for her and her husband.
01:00:14.960 She said, my husband and I both started using Relief Factor and we were shocked by the results.
01:00:19.420 We both have minor aches and pains.
01:00:21.660 Plus, I've had problems with a knee that just wasn't improving.
01:00:25.000 After about two weeks of Relief Factor, we both started seeing wonderful lessening of pain.
01:00:29.840 Even my knee was feeling better.
01:00:31.800 Going forward, we won't be without it.
01:00:33.960 Thank you.
01:00:34.420 Listen, if you or somebody you love is dealing with pain, please just try Relief Factor.
01:00:40.060 Try for three weeks.
01:00:41.640 You'll be out 20 bucks if it doesn't work.
01:00:43.380 And I'll tell you about 30% of the people it doesn't have an effect on.
01:00:46.260 But 70% of the people go on to order it, order more month after month.
01:00:51.300 Take it as directed for three weeks and see if it doesn't make a serious dent in your pain.
01:00:55.900 Relief Factor dot com.
01:00:58.160 It's all natural.
01:00:59.020 Relief Factor dot com.
01:01:00.620 Call 800-4-RELIEF.
01:01:02.820 $19.95, three-week quick start.
01:01:05.320 Relief Factor dot com.
01:01:07.380 Feel the difference.
01:01:08.540 10 seconds.
01:01:09.060 Station ID.
01:01:09.480 Okay, so in your series, you develop ELOPe, and it's this really great thing, and everybody
01:01:23.740 kind of gets on the bandwagon.
01:01:24.940 They're like, this is great.
01:01:25.880 Kind of like ChatGPT overnight.
01:01:27.580 And then people start to realize, wait a minute, I'm being manipulated by AI, and then it goes
01:01:37.060 even darker than that.
01:01:38.520 What are the things that we should be looking for here, William, on AI?
01:01:45.760 What are some warning signs, or is anybody looking for these things?
01:01:52.060 Yeah, it's a great question.
01:01:53.300 Going back to that topic of safeguards, when scientists started looking into genetically
01:02:02.820 modified organisms and doing research on them, one of the first things, which is OSA, right?
01:02:08.220 Another technology that is potentially dangerous.
01:02:11.480 And they were concerned about how do we ensure that these things don't get out into the wild
01:02:15.680 prematurely, right?
01:02:17.280 We're experimenting in the lab.
01:02:18.540 We don't want these things to get out.
01:02:19.520 We're going to need a set of safeguards around this.
01:02:21.660 We need a set of protocols for how we deal with genetically modified organisms, how we
01:02:26.060 introduce them out to the world.
01:02:28.360 Where is that for AI, right?
01:02:30.480 We don't have anything like that.
01:02:32.420 For every $100 being invested in AI right now, what's being invested in safeguards
01:02:38.380 and understanding the safety around AI?
01:02:40.500 It's not even a dollar.
01:02:41.320 So, William, they do an experiment, I think, every year.
01:02:47.880 I can't remember what it was called, where they put philosophers and scientists and pit
01:02:52.740 them against each other.
01:02:55.380 One plays the AI, but it's in a box.
01:02:55.380 And it tries to convince somebody to let me out of the box, connect me to the internet.
01:03:02.120 When I saw that Google is doing their search engine, this is connected to the internet now?
01:03:11.700 All of this is just connected right dead into it.
01:03:15.380 So, it has access to everything.
01:03:18.680 Yeah, absolutely.
01:03:20.100 Oh, my gosh.
01:03:21.860 Isn't that like a big safety no-no?
01:03:24.500 Well, right, at this point in time, we haven't given the AI the control over things, right?
01:03:32.280 And that's one of the risks, right?
01:03:33.740 When we talk about AI, right, I think we all have that scenario of, like, the Terminator
01:03:38.780 movies where, you know, it's intentional, it's going to blow up the world.
01:03:42.360 Although, that is a scenario, right?
01:03:44.820 That's not the likely scenario.
01:03:46.200 The likely risks are things along the lines of the AI taking away our jobs, or us being
01:03:53.420 dependent upon the AI for our infrastructure — routing electricity, routing packages around the
01:03:57.800 world, any of those kinds of things.
01:03:59.080 And then what happens when it just stops working?
01:04:01.640 Okay, we're going to pick it up there with William Hertling, the author of the Singularity
01:04:08.560 series.
01:04:09.340 It is a must-read series.
01:04:11.340 William Hertling, the Singularity series.
01:04:13.700 The Glenn Beck Program.
01:04:15.400 All right, Valentine's Day is less than a week away.
01:04:17.920 Do not screw this up.
01:04:19.780 You can take 10 to 15 years off your appearance, or that of someone you love, with the most popular
01:04:25.160 package from GenuCell Skin Care.
01:04:26.840 It's the best in skin care.
01:04:28.600 And right now, every most popular package from GenuCell is 70% off.
01:04:32.100 It includes the next breakthrough in skin care technology, their probiotic moisturizer.
01:04:36.520 It's absolutely free.
01:04:38.020 These ingredients that are found in yogurt have the same nourishing benefits and goodness
01:04:42.420 for your skin.
01:04:43.180 Probiotic extracts target bad bacteria on the surface of your skin to restore balance
01:04:48.080 into your skin's microbiome for a noticeably clearer complexion and a visibly younger appearance.
01:04:53.820 You can see the fine lines, wrinkles, dark spots, sagging jawlines, even the bags and puffiness
01:04:58.260 disappear right before your eyes thanks to GenuCell.
01:05:01.300 And this is a great holiday gift here when we come up to Valentine's Day, which I remind
01:05:06.300 you is less than a week away.
01:05:07.700 With its immediate effects, you'll see results in less than 12 hours guaranteed or your money
01:05:10.920 back, go to GenuCell.com slash Beck.
01:05:13.480 For the first time ever, every order at GenuCell.com from now till Valentine's Day includes a beauty
01:05:18.240 box with two luxury gifts, yours free.
01:05:20.600 It's the last week for GenuCell.com slash Beck, G-E-N-U-C-E-L.com slash Beck.
01:05:26.820 Hey, if you're not a member of Blaze TV, you need to join us now.
01:05:30.760 Tonight at 9 o'clock, the AI Revolution, my Wednesday night special.
01:05:36.600 Don't miss it.
01:05:40.920 Welcome to the Glenn Beck program.
01:05:58.520 Just a personal and show note.
01:06:01.340 My best friend of 40 years, Pat Gray, has been taken to the hospital.
01:06:09.020 He was taken to the hospital last night.
01:06:11.340 We don't have a lot of information.
01:06:13.780 In fact, we just called a friend during the break.
01:06:15.600 We were talking to a friend who happens to be an expert on what's going on with Pat.
01:06:20.080 And he said, well, he could bounce back from that in a couple of days.
01:06:25.680 Oh, OK.
01:06:26.960 But he could also die from it.
01:06:28.580 We're like, well, that's not helpful.
01:06:30.540 I mean, I don't know.
01:06:31.640 OK, so it's somewhere between those two.
01:06:35.300 But please say a prayer for Pat and his family.
01:06:41.260 We're hoping it's closer to the first option.
01:06:45.300 Anyway, we are talking to William Hertling.
01:06:49.260 And this has been kind of an AI week for us.
01:06:53.120 We're talking about AI and what is coming.
01:06:56.280 And a lot of these things I've been talking about for years, but they seem so far on the horizon.
01:07:01.780 Most people couldn't relate to it.
01:07:04.080 And I've told you before, there's going to come a time where it begins.
01:07:09.400 And, you know, in a five-year period, you're just not going to be able to keep up with all of the changes that are coming because it will change things.
01:07:17.960 It will be exponential leaps on pretty much everything.
01:07:23.780 And I think we're at the beginning of that now with ChatGPT.
01:07:28.180 And we are talking to William Hertling.
01:07:30.740 He is the author of several books, the Singularity Series and also AI Apocalypse.
01:07:39.260 And I've read his books, and I just think that he really gets it and can understand and break it down to, you know, our level.
01:07:52.140 We were talking before, what are the real dangers?
01:07:55.660 And we've already talked about one of them, it limiting information or packaging it.
01:08:01.480 So we kind of lose that ability.
01:08:03.040 And we're going to get to the unemployment, but let me ask you about the massive infrastructure outages, such as electrical supply or transportation infrastructure.
01:08:16.080 That's one of the things you have written about.
01:08:18.700 What does that mean exactly, William?
01:08:22.880 You know, and this is something I really talk about in my second book, AI Apocalypse, which if you read it, you might think it's far-fetched.
01:08:30.600 But I will say that the U.S. military has it as required reading in their future combat strategy class.
01:08:37.040 So they actually see it as a plausible scenario — to them, the most realistic scenario of what an AI rollout would look like.
01:08:47.300 We know, we saw this during COVID, right, that small disruptions in the supply chain anywhere cause these widespread disruptions.
01:08:55.600 And with software, obviously, there's going to be a desire to make that smarter, right — doing more with software so we can optimize that supply chain, right, to the nth degree.
01:09:12.920 And the problem is, now you're very dependent upon that software optimization working exactly the way you want.
01:09:18.600 And it's just the case that with AI, we really don't know how it's working most of the time.
01:09:23.140 It's not like a traditional software program where you say, if A happens, then do this.
01:09:28.200 If B happens, then do that.
01:09:30.260 AI software is, you know, a black box, right?
01:09:33.260 It is trained on large data sets, and it will statistically operate in a certain way, but there's no guarantees.
01:09:40.500 And sometimes it makes really bizarre decisions.
01:09:43.500 So you could have cascading failures very easily, where you have a small outage, the AI attempts to do one thing to compensate, and then actually throws it more out of proportion, right, and makes worse decisions.
01:09:58.320 Whereas a human having some oversight — we may not make the best decisions, but we typically don't make really awful decisions.
01:10:06.340 A human would go,
01:10:06.820 oh, that's going to be a problem.
01:10:08.280 Let's do something different.
01:10:09.520 AI isn't going to see that.
01:10:10.600 Are we at the place now — I don't know if you read Stephen Hawking, his demon — not Stephen Hawking, Carl Sagan's The Demon-Haunted World, which he released before he died.
01:10:22.360 And he talks about a place where, you know, only high priests will understand the language of future technology, and it will be like Latin to everybody else.
01:10:32.800 It means nothing.
01:10:34.020 But we're really seemingly getting to a place to where it's going to surpass even the high priest.
01:10:39.940 You just don't know.
01:10:42.660 You just don't know.
01:10:44.260 And, right, what are we likely to see down the road?
01:10:46.720 We're going to see AI that trains other AI, right?
01:10:50.720 You have a great tool.
01:10:51.680 Let's use it more.
01:10:52.480 So, well, now we don't even know how the other AI is being programmed, right?
01:10:56.900 What happens if you tell ChatGPT, go make a new ChatGPT? You get DAN.
01:11:01.600 You get DAN 5.0.
01:11:03.680 Jeez.
01:11:06.560 Okay.
01:11:07.660 Let's talk about unemployment, if you can.
01:11:10.620 What are the things that are the first on the chopping block, do you think?
01:11:15.600 For ChatGPT, I mean, it's really hard to not talk about driving, even though, obviously, ChatGPT isn't driving software.
01:11:27.460 But we know that driving stuff's been on the horizon.
01:11:29.820 It's been coming all along.
01:11:31.260 And it's a really significant percentage of jobs, right?
01:11:33.640 We're talking about, I think, somewhere between 10% and 15% of jobs in the U.S. are related to driving, whether it's transportation, Uber, whatever.
01:11:42.080 That's a lot of people.
01:11:43.380 And one of the differences with AI jobs is it happens overnight, right?
01:11:47.480 This isn't like the slow decline of driving.
01:11:49.900 It'll be, you know, now we're all driving, and five years from now, none of us are driving.
01:11:54.160 The best example is the iPhone — smartphones.
01:11:59.100 Nobody had one in 2009.
01:12:00.780 Now, no one can live without it, and it happened in three to five years.
01:12:06.260 Right.
01:12:08.260 Right.
01:12:08.820 And so what happens in our jobs, right?
01:12:10.880 We have an, well, there'll be an expectation that you're going to use this new technology, right?
01:12:15.440 It won't really be an option not to.
01:12:17.280 Right.
01:12:19.780 You know, there's a, there's a, I had Andrew Yang on.
01:12:25.300 We were talking about universal basic income, which I don't agree with.
01:12:30.400 However, I do believe we need to discuss it and everything else because we're going to be moving, we are moving, to a world where fewer and fewer people are employed or employable because of AI.
01:12:48.940 And how are they going to, you know, you can't have 20% of the population, 30% of the population unemployed.
01:12:55.940 How are they going to make money?
01:12:57.260 How are we going to, so it's really a completely new field.
01:13:03.420 It's not like the end of capitalism because we're going to Marxism.
01:13:07.920 It's possibly the end of capitalism as we understand it into something entirely different that the world and humans have never faced before.
01:13:18.080 Is that an overstatement?
01:13:21.500 No, I don't think it is at all.
01:13:23.260 I agree, right?
01:13:24.100 We don't have a model.
01:13:25.480 Like universal basic income might not appeal to a large group of people, but we don't have another model for what it looks like if most people aren't working.
01:13:34.080 Right.
01:13:34.200 And I'm also concerned about the opposite, you know, the, I call them the ranchers and the sheep.
01:13:43.820 There are people who are ranchers who think, you know what, everybody else is just sheep.
01:13:48.040 They'll do what we say, blah, blah, blah.
01:13:49.520 But those people are at the top of the food chain, usually the very, very wealthy and the powerful, and they're going to be the ones making the money on these programs, et cetera, et cetera.
01:14:04.920 And as the world becomes more dependent on their software and their things, then they gather more wealth.
01:14:16.200 And so the disparity between rich and poor becomes enormous, enormous.
01:14:23.100 And nobody's even talking about how we make sure that the uber, uber, uber wealthy just don't own everything and everybody else is left with nothing.
01:14:35.440 Right.
01:14:36.820 And one of the things I think is different is that in the past, when you looked at jobs being made obsolete,
01:14:43.080 the people being affected usually were not the wealthy, right?
01:14:48.600 Right.
01:14:48.780 Usually if you had lumber jobs going away, right, that was an honest career for folks, but probably not making a ton of money.
01:14:56.040 But now we're talking about jobs.
01:14:58.040 We're talking about computer software.
01:14:59.840 We're talking white collar jobs.
01:15:01.160 Yeah, big time.
01:15:02.100 We're talking about white-collar jobs going away.
01:15:03.940 We're talking about, you know, I think that's going to be a huge thing for the medical industry, right?
01:15:08.720 We're going to see, yeah, right, medical diagnosis, right, which IBM tried to tackle, you know, 10 years ago and we weren't quite there.
01:15:17.220 But there's really compelling reasons why you want that, right?
01:15:19.760 Everyone would say, yeah, you don't want a doctor operating on you if they're hungover or if they're, you know, pissed off because their wife is having an affair.
01:15:26.440 But you know what, you don't even have to go to operations, which is the logical outcome — just diagnosis.
01:15:35.100 I believe by 2030 it will be normal for the doctor to come in and give you results of something and try to explain what it means and what he thinks it means.
01:15:44.680 And then you to say, yeah, yeah, yeah, but what does the AI say?
01:15:50.600 Because it will have so much up-to-date information. You'll want to hear it from a human, but you'll want to be reassured that that's the correct diagnosis and prognosis from the AI.
01:16:08.180 And then you end up with these interesting things where, you know, even today, a lot of medical treatments are gated by what insurance will pay for, right?
01:16:18.020 And so the doctor might have an idea of what's the right thing to do for you, but insurance says no.
01:16:22.500 Well, what happens in the future when insurance says you will have to use our AI for diagnosis to get reimbursed?
01:16:28.740 Oh, my gosh.
01:16:29.480 And by the way, right, we have these biases in our AI because this AI is cheaper for us than if we were to use a different AI that suggested more treatments.
01:16:38.180 Is anybody talking about this seriously?
01:16:40.980 Is there any group out there that is talking about this and saying we have to put this codified right now?
01:16:51.440 Yeah, we don't.
01:16:52.480 We don't have anything.
01:16:53.200 We don't have anything across the industry, across multiple industries.
01:16:58.220 In your book, and I've only got about a minute and a half, two minutes left.
01:17:03.920 In your book, one of the most breathtaking chapters is these guys walk into the president's office because there's an attack and they're fighting AI.
01:17:14.060 And they're going to tell the president, you need to launch planes.
01:17:17.140 You need to fight right now in Chicago.
01:17:19.400 And it opens with them walking into the office saying, Mr. President, then it cuts to the AI and the war in Chicago.
01:17:28.060 And the war is won by AI.
01:17:30.880 And then at the end of the chapter, it says, dot, dot, dot.
01:17:35.160 We need to launch an attack now in Chicago.
01:17:39.460 And it happens that fast.
01:17:41.720 What takes it from a little helper to that?
01:17:50.480 It's when we take the people out of the process, right?
01:17:55.600 Now, it is no longer operating at people's speed.
01:17:58.900 Now, it's just operating at its own speed with no checks and balances.
01:18:03.760 And that's what business will drive toward because that's the economical choice, right?
01:18:09.280 Take people out, just use AI for everything.
01:18:12.560 But that's how you get really bad decisions really fast.
01:18:15.640 And the safeguard for that, at least according to Elon Musk, is his new, I can't remember what they're called, the brain thing that he's doing where you'll be able to actually connect to the Internet.
01:18:27.880 So you'll be able to think and humans will be able to, yeah, Neuralink.
01:18:32.120 It'll connect humans and put them into the process.
01:18:37.080 That's his solution.
01:18:39.280 Which, you know, I think that that is a component of the future for sure.
01:18:44.480 And that could be, obviously, a whole other week to dedicate to that.
01:18:47.860 That's not going to stop the AI, right?
01:18:49.880 That's not going to stop the AI in the short term.
01:18:51.980 And that's, right, we don't have Neuralink today.
01:18:54.200 Right.
01:18:54.760 But we do have AI right now.
01:18:57.840 William, thank you for talking to me.
01:18:59.540 I don't even know what your politics are.
01:19:01.280 But, I mean, I think you live in Portland.
01:19:03.360 So I'm guessing that we don't agree on an awful lot.
01:19:06.720 But you are somebody who is really, really smart.
01:19:10.980 And you've been open to talk.
01:19:12.820 We've reached out to several AI experts this week.
01:19:15.500 And some of them won't come on because they're like, I don't agree with him.
01:19:19.540 And it's like, we don't have to agree on stuff.
01:19:21.380 We have to agree on, you know, some pretty basic scary stuff here is happening.
01:19:25.560 We should all be informed on it.
01:19:27.480 We should all be willing to have a conversation.
01:19:30.420 We should.
01:19:30.980 And I really appreciate it.
01:19:31.960 Thank you so much.
01:19:33.120 Yeah, thank you so much, Glenn.
01:19:34.460 You bet.
01:19:35.220 That's William Hertling.
01:19:37.360 His Singularity series is really something you should read if you want to understand what's really, literally, on our doorstep now.
01:19:47.120 It's on the threshold.
01:19:49.600 So halfway between outside and inside.
01:19:52.020 And it's going to walk up your stairs quickly.
01:19:55.000 All right.
01:19:55.260 New Year.
01:19:55.700 Want to find a New Year look for your home? Something different, something stylish, something that's not going to break the bank.
01:20:02.080 Start with the windows.
01:20:03.640 Shopping around for window treatments usually requires having people come over to your home two times.
01:20:08.440 First to measure and help you pick out, you know, what the expensive fabric is going to be.
01:20:13.240 And then they'll say, we're going to call you back with a price because they don't want to be in the room with you.
01:20:17.440 And then if you do do it, then they come back again to install them.
01:20:21.100 It's a hassle, usually expensive, but not with Blinds.com.
01:20:24.740 Their design consultants have you covered and they can help pick what's right for your home.
01:20:31.160 Whether it's just one room or the whole house, they don't need to come to your home.
01:20:37.140 You can do it online now.
01:20:38.500 And it's really simple.
01:20:39.660 There's a reason Blinds.com has over 40,000 five-star reviews.
01:20:44.400 It becomes obvious when you see how their window treatments can give your home an entirely new look.
01:20:49.460 Blinds.com.
01:20:50.940 Affordable and very simple.
01:20:53.760 It's Blinds.com.
01:20:55.640 That's it.
01:20:56.340 Blinds.com.
01:20:57.080 Right now, save up to 45% site-wide.
01:20:59.940 Up to 45% off everything right now at Blinds.com.
01:21:04.200 Rules and restrictions may apply.
01:21:05.580 Join the conversation.
01:21:08.800 888-727-BECK.
01:21:11.540 The Glenn Beck Program.
01:21:12.660 We are living in a world now, AI deepfakes, augmented reality, job displacement.
01:21:38.440 Tonight, 9 p.m., the AI revolution is here.
01:21:43.300 How machines will transform your entire world.
01:21:46.740 Tonight, my Wednesday night special, 9 p.m. Eastern on blazetv.com and 9.30 p.m. on youtube.com slash Glenn Beck.
01:21:56.340 And after listening to all this, now I want to see the movie M3GAN.
01:22:01.080 You know, I've seen the commercials for this.
01:22:04.920 We're halfway through an AI week and you've taken from all of this, I want to see a really bad horror film.
01:22:12.300 Well, it's about AI.
01:22:13.500 Right.
01:22:14.020 Right.
01:22:14.320 No, it is.
01:22:14.880 No, it is.
01:22:15.220 It's like the family gets this lifelike, really weird, uncanny-valley doll.
01:22:19.220 Right.
01:22:19.680 And it starts doing all sorts of weird stuff and turns on the family or whatever.
01:22:23.640 It's, it's Chucky, right?
01:22:27.180 It's, well, Chucky was a doll that came to life, right?
01:22:30.080 Right.
01:22:30.400 This is a, this is a realistic story.
01:22:33.900 Chucky's ridiculous, Glenn.
01:22:35.160 This is a doll.
01:22:36.500 Right.
01:22:36.980 My understanding is it's designed with AI to be a protector,
01:22:41.960 to protect the daughter.
01:22:43.160 And then the doll sort of teaches itself that maybe the parents are part of the problem.
01:22:48.660 Yeah.
01:22:49.120 That becomes part of a, a larger saga that I want to see.
01:22:53.320 Yeah.
01:22:53.700 And plus it looks really creepy.
01:22:55.540 Doesn't it?
01:22:56.140 Yes, it does.
01:22:56.940 It looks really creepy.
01:22:58.920 And they have, you know — cheesy is a better word.
01:23:01.500 Really?
01:23:01.980 Creepy.
01:23:02.380 I think it's creepy.
01:23:03.140 Yeah.
01:23:03.380 Yeah.
01:23:03.960 I think it looks really creepy.
01:23:05.560 Uh, and, uh, thank you for getting that after all of the heft that we put behind this program.
01:23:10.560 Thank you for, let me talk to you a little bit about good ranchers.
01:23:15.860 Uh, Good Ranchers is, um, really about saving you money, giving you the best quality, and saving
01:23:24.800 an entire industry in America, and kind of an important one: that of ranchers, people who,
01:23:31.040 you know, raise your chickens, raise the cattle. They're being squeezed out more and more.
01:23:37.780 100% American hand-trimmed steakhouse quality meat from good ranchers.
01:23:43.040 Over 85% of the grass-fed beef in this country is imported from overseas.
01:23:47.960 It even has a little, you know, "Product of USA" label, but that's a lie.
01:23:52.940 It's a lie. Ditch the usual gifts for Valentine's Day.
01:23:59.160 Say, I love you with American meat.
01:24:01.920 Snag your $30 off with my promo code Beck at good ranchers.com.
01:24:07.040 Good ranchers.com.
01:24:09.680 Love is in the air and love is for ranchers who are giving us great American beef.
01:24:14.500 Save $30 right now.
01:24:16.580 Good ranchers.com.
01:24:18.000 Make sure you use the promo code Beck.
01:24:36.060 We got to stand together.
01:24:41.060 It's the course of life.
01:24:45.060 Stand up, stand, hold the line.
01:24:48.000 It's a new day I'm trying to raise.
01:24:55.880 What you're about to hear is the fusion of entertainment and enlightenment.
01:25:03.800 This is the Glenn Beck Program.
01:25:09.520 Welcome to the Glenn Beck Program.
01:25:11.240 Last night was the State of the Union, and I've got probably 90 minutes of stuff to talk about,
01:25:16.480 and we have to squeeze it in the next 60.
01:25:18.980 So we begin in 60 seconds.
01:25:21.220 Every time you hear about one of those massive data breaches in the news,
01:25:25.480 do you ever think, am I on one of those lists?
01:25:30.140 To stop having to worry about that and so much more, just get LifeLock.
01:25:33.920 LifeLock can't stop everything, but they monitor your data better than you can on your own.
01:25:39.840 And it's important to know that this is going to happen to all of us.
01:25:43.620 Everything we do is online now.
01:25:46.180 25% off a subscription to LifeLock.
01:25:49.300 Let that be the key to open this safe door up.
01:25:52.980 It's top of the line in cybersecurity.
01:25:54.880 It has preventive measures to keep you safe and access to a dedicated restoration team if you do end up having your information hacked into.
01:26:03.040 So join now, save up to 25% off your first year with the promo code BECK.
01:26:07.400 It's 1-800-LIFELOCK, 1-800-LIFELOCK or lifelock.com.
01:26:12.160 Make sure you use the promo code BECK, lifelock.com or 1-800-LIFELOCK, promo code BECK.
01:26:20.120 All right, so last night was one of the most bizarre things I have ever seen.
01:26:24.680 Our president was laughed at.
01:26:27.440 Our president was lost and bewildered.
01:26:31.880 There were parts of the speech I literally could not understand.
01:26:37.540 I've never seen anything like it.
01:26:39.580 He introduced spending and control that is way beyond anything the United States of America has ever done.
01:26:49.180 No mention of why we've spent $100 billion on Ukraine, but, you know, no answer on why we're doing it, what exactly we're doing, just that we're committed from here until the end.
01:27:03.700 I don't know what that means.
01:27:06.100 About half of it, I felt like he was saying Trump was right.
01:27:11.240 I mean, all of a sudden he was America first.
01:27:14.480 It was bizarre.
01:27:15.520 China, our greatest national security adversary.
01:27:19.340 He used to laugh at that.
01:27:21.580 Police need more funding, not less.
01:27:24.860 We got to have that funding so we can attract the best cops and provide the best training.
01:27:29.340 Wait, what?
01:27:31.500 Social media platforms like TikTok are a danger to our children, and anything coming from China should be banned.
01:27:39.760 Wait, what?
01:27:41.260 Buy American and American first, now apparently correct and moral.
01:27:49.940 The U.S. debt is too high and has to be reduced to take the burden off our children.
01:27:55.600 Now he's also talking about, at the same time, the most spending programs in our history.
01:28:02.680 And he then introduces that we're going to tax the rich some more for that, because there's tax cheats out there.
01:28:11.780 Uh-huh.
01:28:12.720 Is that going to pay for all of this?
01:28:14.520 He also said, now I want you to really think, think on this.
01:28:21.200 He said that from now on, 100% of building materials used in infrastructure and government construction projects need to come from America.
01:28:34.460 Okay, if that were true, we'd have to open an estimated 400 new coal, copper, and iron ore mines, 100 new steel mills, 100 new aluminum refineries.
01:28:53.540 We'd have to cut down almost every tree in America.
01:28:57.260 By the way, part of the reason our U.S. carbon emissions are 80% lower than they were in 1995 is because we outsourced all of that stuff.
01:29:09.420 So now you want to bring it here in America while reducing our carbon emissions to zero.
01:29:18.340 Not going to happen.
01:29:19.540 Check out the cobalt mines in Africa, the copper mines in South America.
01:29:25.320 We outsourced all of our steel refinement to China, lumber to Canada and South America.
01:29:31.700 What would requiring that 100% of $2 trillion of infrastructure spending be American-made do to our goal of reaching 0% carbon by 2050?
01:29:43.800 There's no answer there.
01:29:46.020 There's no answer.
01:29:47.760 It's delusional thinking.
01:29:49.540 What would it just do to the prices of stuff at Home Depot if the United States of America decided to buy $2 trillion a year of all American product?
01:30:05.000 You wouldn't have a chance of buying an American product.
01:30:10.240 You know, I wondered after he was making promise after promise what the Supreme Court justices thought.
01:30:15.240 You know, there were four of them that didn't show up last night.
01:30:17.380 I mean, if you're sitting there and you were at all constitutionally based, you were like, well, that won't stand up in court.
01:30:25.120 Over and over again, a wealth tax and a billionaire's only tax, unconstitutional.
01:30:31.420 Forcing non-profitable corporations to pay 15% of income.
01:30:35.620 Not profits, income.
01:30:37.400 It's unconstitutional.
01:30:38.740 Unconstitutional.
01:30:39.740 Setting the price at which items can be sold.
01:30:44.320 No matter the manufacturing and business costs, no way just setting the price.
01:30:50.080 Unconstitutional.
01:30:53.160 And he was laughed at.
01:30:54.480 Look, I understand.
01:30:57.260 I'm not going to shut down all the oil and gas here in America and people are saying that and that's not true because we are going to need oil and gas.
01:31:07.100 I'm quoting: "for another 10 years."
01:31:11.060 Another 10 years.
01:31:12.960 You just doomed half the world to a slow and painful death from starvation.
01:31:21.480 This was the most embarrassing moment I've ever seen in a State of the Union address.
01:31:29.000 The world is facing massive energy shortages.
01:31:33.280 America is one of the largest energy producers in the world due to coal, oil and natural gas.
01:31:40.340 Russia's oil and gas production are supposedly off the market for the next 10 to 15 years.
01:31:47.200 So the developing world, they can't get it from Russia, can't get it from us.
01:31:51.980 They'll die off without any energy and food production.
01:31:58.520 So to blind, you know, everybody.
01:32:01.880 He's delusional.
01:32:02.680 He's so incredibly, I mean, I don't even know how a logical person can even look at this.
01:32:10.340 Look at that speech and understand it other than there's something wrong here.
01:32:17.640 No notion.
01:32:19.160 No notion.
01:32:19.800 He says the radical environmentalism, which he's pursuing, he says it's an existential threat that we have to take care of.
01:32:30.240 No notion of how we're going to produce food and heat for the next, you know, 8 billion people.
01:32:41.060 What you saw last night was state-sponsored religion.
01:32:44.500 That's what you saw.
01:32:45.980 He was the high priest.
01:32:47.780 He came to the altar and expected you to bow down and worship at that altar.
01:32:53.560 The first ever American state-sponsored religion.
01:32:57.580 Because there's no logic to it.
01:33:00.200 You just have to believe.
01:33:03.020 I was also a little disappointed by Sarah Huckabee Sanders' rebuttal speech.
01:33:09.120 I mean, it's good.
01:33:10.200 She told a story setting, you know, Trump up for 2024, gave a, you know, passionate plea for what makes America great.
01:33:17.700 But it was not a rebuttal.
01:33:20.420 She didn't rebut anything — any of the major points or policies in his speech.
01:33:24.520 It was as if she wasn't watching the speech.
01:33:28.620 And I think that is a huge mistake.
01:33:31.180 You can't pre-write everything.
01:33:36.280 I mean, I really like Sarah.
01:33:38.620 I do.
01:33:39.840 But the stump speech response from Republicans is just not enough.
01:33:44.280 Imagine if we would have just deconstructed.
01:33:47.960 You could have had ChatGPT do it.
01:33:50.940 Imagine if we could have deconstructed what Biden actually said.
01:33:56.080 Some of the biggest applause lines.
01:33:57.920 His biggest promises.
01:34:00.260 Let me just throw a couple of these out at you.
01:34:03.480 Imagine if she would have said something like, you know, Mr. President, you keep saying, let's finish the job.
01:34:09.720 We'd like to know what job it is you're trying to finish.
01:34:15.500 Because we don't see the benefits happening to the United States of America.
01:34:22.900 It is almost as if you're trying to finish the job of the fundamental transformation and destruction of America.
01:34:30.820 But let's take you at your word.
01:34:32.660 What you're suggesting we do as a nation via our government.
01:34:38.940 I'm not sure you understand what your job even is based on what you just demanded tonight.
01:34:46.060 That those are all things that our government should do.
01:34:51.340 That's not your job.
01:34:52.940 Here's what an intelligent person would have heard in your speech.
01:34:56.860 Let's finish the job of violating the Second Amendment by disarming Americans, preventing them from being able to defend themselves, their families, and their homes.
01:35:07.460 I don't want to finish that job.
01:35:09.700 Let's finish the job of violating the First Amendment by continuing to follow our first ever state-sponsored religion of radical environmentalism, destroying our energy industry,
01:35:22.220 ultimately massively depopulating the planet, dooming mankind to return, really, to the Middle Ages.
01:35:30.580 Let's finish the job of violating the Ninth and Tenth Amendments by going on government spending sprees, including free health care for all, free in-home disability care, free college for all,
01:35:45.740 a guaranteed job for anyone who wants one, which would lead to catastrophic hyperinflation, runaway debt.
01:35:54.500 It would destroy everything.
01:35:56.480 By the way, none of that is in our Constitution.
01:36:00.600 Every single one of those was in the Soviet Constitution, however.
01:36:05.200 Let's finish the job of what?
01:36:09.460 Violating the Fourteenth Amendment's Equal Protection Clause by asking the top 1% of earners, who already pay more than 40% of all taxes, to pay even more,
01:36:22.600 ensuring only that they're going to migrate out of the U.S. to countries that ask them to actually pay their fair share.
01:36:30.020 It would destroy our tax base.
01:36:32.560 Or is that why you were trying to finish the job overseas and convince every Western nation that we all had to lock our tax rates together?
01:36:44.740 Let's finish the job in what?
01:36:47.100 Ukraine?
01:36:48.420 Violating Article 1, Section 8, which grants Congress the sole power, that's a quote, to declare war.
01:36:55.360 Let's continue to fight $100-plus billion-per-year proxy wars against Russia, in the Middle East, in Africa.
01:37:04.320 Let's continue to drum up support for the next war against, what, China?
01:37:08.680 Let's finish the job of lying to our children that the goal of American life is and must be equity of outcomes.
01:37:19.080 Instead of teaching them about individual responsibility, merit-based rewards for hard work or personal achievement.
01:37:25.420 No, no, no.
01:37:26.080 Let's finish the job.
01:37:27.980 Let's finish the job of further depopulating the planet by teaching young children that abortion is the answer to pregnancy if that is her whim,
01:37:42.880 ignoring the equal rights of the unborn child, as if killing a five-year-old is the answer to the inconvenience of being a parent or food shortages.
01:37:53.780 Let's finish the job of turning America into a socialist Soviet state.
01:38:00.700 I refer you back to the Soviet Constitution, where the government sets all the prices, takes over businesses, or you could go the fascistic route, where it's a public-private partnership.
01:38:12.340 Their words, the fascist words, not mine, but strangely yours, where they come in and they either take over the business or they partner with the industries to achieve social goals, demanding they produce drugs or hearing aids or music, and then must sell them at a price that the government demands or decides is fair.
01:38:36.080 I don't want to finish that job.
01:38:37.820 Mr. President, I don't think you understand what finishing your job really means, because it seems like you don't even know what your job is.
01:38:46.800 And it's weird, because you've taken the oath of office several times, and your job, the job of the American government, is to ensure and protect the liberty and freedom of each individual, every man, woman, and child.
01:39:05.280 What is it from the Declaration of Independence?
01:39:07.480 Oh, and governments are instituted among men to protect these rights.
01:39:14.760 That's why in America we even have a government.
01:39:18.900 When you became a president, when you became a senator and then vice president and now president, you took an oath of office.
01:39:27.880 There was no mention of jobs for all, free college, free health care, destroying our industries to follow your party's chosen scientist's whim on energy policy.
01:39:38.920 Perhaps you need a reminder of that oath to protect and defend the Constitution of the United States against all enemies, foreign and domestic.
01:39:50.300 By the way, the reason you have a veto is to reject legislation you believe to be unconstitutional.
01:39:57.340 Not because you don't like it, but because you believe it is unconstitutional.
01:40:02.480 But how many presidents even remember that on both sides of the aisle?
01:40:09.020 Let me ask you, Mr. President, when you stated your plan and your mission,
01:40:14.820 did you not see that it clearly violates the precepts and the text of the Constitution?
01:40:21.460 And what does that mean to you?
01:40:23.760 However, to give credit where credit is due, you were right about one thing.
01:40:27.520 We still do have a job to do, except I think we should finish the job our founders started to ensure the blessings of liberty to ourselves and our posterity.
01:40:42.300 That, sir, is the job.
01:40:45.080 That is the job that is still left undone because it will never be totally finished ever.
01:40:51.140 It's through human freedom and liberty and meritocracy that we will cure cancer.
01:40:57.840 We will protect the environment, lower health care costs, plus we'll improve the economy and confront our enemies as they arise.
01:41:07.740 Because that's what Americans do.
01:41:09.940 And every time there's a public-private partnership or government gets involved, it destroys the freedom.
01:41:16.160 You know, it's through the people, not higher taxes on a few people.
01:41:23.720 Not through government controls of industries or price controls or wage and job guarantees that enslave the few for the benefit of the many.
01:41:33.820 That's not capitalism, sir.
01:41:35.660 Although you said twice last night you're a capitalist, I don't think you even know what that means.
01:41:39.840 It's another C word, like Constitution: capitalist.
01:41:43.900 That's socialism.
01:41:44.820 That's Marx, Mao, Hitler, Putin, Xi.
01:41:48.060 They each believed they were well-intended for their people, but it was the path to hell.
01:41:55.280 Your job as a public servant
01:41:57.800 is singular and simple:
01:42:00.420 to defend and protect the Constitution.
01:42:03.480 So in the spirit of bipartisanship, sir, suck it up.
01:42:06.680 Let's all hold hands.
01:42:08.060 Let's finish that job.
01:42:12.400 That's what should have been said last night.
01:42:14.380 Do you remember the good old days when the main reason you might not be able to have a comfortable retirement was because Social Security was going to go bankrupt?
01:42:23.080 Well, they're all in on it together.
01:42:24.660 They're going to make sure that we don't cut a dime.
01:42:26.520 We don't do anything.
01:42:27.620 Look, if you saw the spending last night alone, interest rates are going to continue to climb, and inflation will continue to climb.
01:42:40.640 Build a hedge against insanity.
01:42:42.440 Gold Line is offering free metals delivered directly to your front door with every qualified, self-directed IRA transaction this month.
01:42:52.880 This is a huge, huge special.
01:42:56.060 Call Gold Line today.
01:42:57.800 Find out how you can take advantage of their IRA special.
01:43:00.740 You've got to protect what you have.
01:43:04.160 Please consider divesting some of it, keeping it in your IRA, but putting it into precious metals.
01:43:10.700 Call 866-GOLDLINE, 866-GOLDLINE, or goldline.com.
01:43:16.140 10 seconds.
01:43:16.660 Station ID.
01:43:17.080 You know, I don't know about you, but I'm tired of stump speeches.
01:43:29.320 I just would like some.
01:43:30.580 This is not that hard.
01:43:33.180 This is not that hard.
01:43:34.900 It's just clear common sense.
01:43:38.400 And we're not getting that from really anybody.
01:43:41.700 I mean, I loved Mike Lee's reaction last night.
01:43:44.240 At some point, he just, like, was standing on the floor, like, what the hell is happening?
01:43:48.720 Aliens have arrived.
01:43:50.260 Yeah, the response really is pointless to me, because it's not a response.
01:43:53.680 You don't wait till it's over and then craft something about it.
01:43:56.780 It's just a pre-written Republican position paper.
01:44:00.940 Correct.
01:44:01.320 Correct.
01:44:01.640 Which is, I mean, in and of itself, fine, but it's not a response, and it doesn't really make any sense with the way it's done.
01:44:09.080 Instead, you should just have somebody out there who is actually watching it and going through, you know, several points and trying to push back on what they're doing.
01:44:17.420 But that's just not the way this works.
01:44:19.220 I thought it was really weird.
01:44:20.680 I mean, he was very loud.
01:44:21.900 He was very angry.
01:44:23.040 I think that was an ad lib.
01:44:24.680 That thing on President Xi, where he was like, tell me the person.
01:44:28.940 Yeah.
01:44:29.180 Show me the leader that wants to.
01:44:31.220 It was bizarre.
01:44:32.740 And I'd say, there's what,
01:44:34.440 80% of world leaders who would switch places with him?
01:44:36.660 I mean, would the leader of, like, I don't know, the Central African Republic switch to leading the largest population and second-largest economy in the world?
01:44:45.440 I think, yeah, probably, probably would.
01:44:47.940 I think most would. Biden would switch.
01:44:50.020 I mean, he wouldn't, he wouldn't like being over in China because everybody speaks Chinese, and that would be hard.
01:44:55.800 He can't understand English either.
01:44:57.260 Yeah, I know.
01:44:57.840 But I think he would switch.
01:44:59.720 Any politician in today's West, I don't see the Winston Churchill.
01:45:03.800 I don't see the George Washington that would give up that power.
01:45:06.320 And they love the power.
01:45:07.440 They love it.
01:45:07.860 Right?
01:45:08.160 They would love to be all the things he asked for.
01:45:10.020 We're going to quadruple taxes on stock buybacks.
01:45:12.500 He would love to just be able to do that.
01:45:14.080 He tries it all the time.
01:45:15.500 He did it with a student loan position.
01:45:18.220 In China, he'd just be able to do that.
01:45:20.320 Here, he's going to get blocked, hopefully, by the courts because it's unconstitutional.
01:45:24.760 I mean, they beg and plead for a Chinese system that they can just force things through.
01:45:30.860 I want the administrative state gone.
01:45:36.040 We should abolish all of these things.
01:45:38.600 They're not constitutional.
01:45:40.400 They were a progressive right and left, Republican and Democrat nightmare.
01:45:48.260 They put it in.
01:45:50.280 It's this generation's job to take it out.
01:45:53.960 Did he say take it out?
01:45:58.480 I noticed he was talking about China earlier.
01:46:01.100 Was he making fun of Chinese takeout?
01:46:02.760 Let me tell you about CarShield.
01:46:09.020 When you drive down the road,
01:46:10.800 you can have the peace of mind that comes with knowing that your car is covered in case it needs repairs.
01:46:16.360 Even when it's no longer under warranty.
01:46:19.200 I mean, it sucks even when the light goes on and you have a warranty.
01:46:23.200 Or you have CarShield and the light comes on and you're like, oh, crap.
01:46:26.620 But it's only that way because you're thinking, I've got to go into the shop and I won't have a car.
01:46:34.780 I'm going to have a rental car or whatever.
01:46:36.780 That's a hassle.
01:46:37.740 But it's much worse when you're stuck with a massive bill as well.
01:46:42.400 CarShield will now save you 20% on your plan.
01:46:46.660 You just contact them now at 800-227-6100.
01:46:50.860 I have my trucks covered and they have saved me so much money.
01:46:55.320 I highly recommend you do the same.
01:46:57.680 We can't just go out and buy new cars.
01:46:59.900 CarShield.
01:47:00.540 800-227-6100.
01:47:02.700 800-227-6100.
01:47:04.740 CarShield.com slash Beck.
01:47:08.580 We had a great time roasting Joe Biden last night on the State of the Union coverage.
01:47:12.020 You can still save 20 bucks off your subscription to Blaze TV at blazetv.com slash S-O-T-U.
01:47:25.320 So the Poynter Institute, which is, you know, George Soros money and all kinds of great lessons there for journalists, and they can teach you how to fact-check and everything else.
01:47:48.940 They came out and said that, you know, there's just too much fact-checking, and, you know, the national media should just stop with the State of the Union; we've already got it.
01:48:01.940 There's fact checks on that.
01:48:03.440 You know where there's not fact checks?
01:48:05.180 I'm quoting: there are no fact checkers in 29 states, no dedicated state or local fact checkers.
01:48:14.640 For instance, in New Hampshire, they didn't have a dedicated fact checker to dig into the claims of the Republican U.S. Senate candidate. And in South Carolina,
01:48:25.560 they don't have anybody to question the many provocative statements from Senator Lindsey Graham.
01:48:30.640 There's seemingly no problem with fact checkers in states that, you know, have Democrats, apparently.
01:48:38.400 But so they're saying, you know, stop with the fact check and we got it.
01:48:42.500 We got it.
01:48:43.380 Do they?
01:48:43.960 Do they really?
01:48:45.880 Luckily, Tristan Justice and Jordan Boyd at the Federalist did a little fact checking here.
01:48:54.020 I just want to throw in a couple of things here.
01:48:56.980 One, Chuck Schumer is not the minority leader.
01:48:59.700 More jobs were created in two years than under any other president.
01:49:03.980 He did not create new jobs.
01:49:07.080 He claims it,
01:49:07.880 but he did not create them.
01:49:09.040 The reason the job numbers are growing is that millions of people were locked in their houses.
01:49:16.280 It's such an embarrassing argument.
01:49:18.500 I can't believe he continues to make it.
01:49:21.140 It's so crazy because I don't know a soul that doesn't understand that.
01:49:25.120 We were all in the midst of the height of the pandemic.
01:49:28.360 And so obviously, naturally, some of these jobs came back as we got out of the pandemic.
01:49:33.520 It has absolutely nothing to do with Joe Biden.
01:49:37.720 Well, how do you combat the fastest growth of jobs in 40 years?
01:49:43.440 I mean.
01:49:43.720 That's the same exact point.
01:49:45.260 No.
01:49:45.820 He's just remixing it.
01:49:47.000 I know, it's just the same lie remixed, and everyone understands it.
01:49:51.860 Like, I mean, you go back and you ask people, this is why he's taking a beating in a lot of these polls right now.
01:49:56.880 They're asking people like, hey, are you better off than you were when Joe Biden became president?
01:50:00.980 And everyone, I think, correctly says, well, let me compare that to 2019, right?
01:50:07.200 Not 2020 at the peak of the pandemic, right?
01:50:10.880 Like, obviously, we all understand there is a momentary, you know, year and a half type of situation there, a year or so where we really had massive problems with the economy.
01:50:21.120 He then goes on to say inflation is coming down. Well, people are paying 60 percent more for eggs than they were spending in 2021.
01:50:29.480 But, you know, why bring up some facts?
01:50:34.360 He says he's responsible for the largest deficit cut in U.S. history.
01:50:39.480 Again, that's COVID spending. When you are spending six trillion dollars and you decide, you know what, I can't spend any more on COVID,
01:50:52.840 nobody will let me pass this,
01:50:55.240 that's not a cut.
01:50:57.960 The emergency is over, and to be responsible you should have maybe cut it by four trillion.
01:51:05.720 The Republicans are trying to cut Social Security.
01:51:08.000 That one was kind of solved last night, strangely, right there on the floor.
01:51:13.520 Fires have burned an area the size of Missouri.
01:51:17.380 No, no.
01:51:19.300 According to the National Interagency Fire Center, seven point five million acres of land burned. Missouri?
01:51:25.960 Its forests alone are twice that size, 14 million acres.
01:51:29.100 The state is 44 million acres.
01:51:31.780 So I don't know exactly how they got that one.
01:51:34.900 The weird claim that fast food workers have to sign a non-compete agreement.
01:51:40.940 So if I'm working, you know, the counter at McDonald's, I can't go to work for, you know, Burger King.
01:51:48.480 That's ridiculous.
01:51:50.100 What did he say?
01:51:50.940 Cashiers at grocery stores?
01:51:52.420 Yeah.
01:51:52.740 Yeah.
01:51:52.860 Yeah.
01:51:53.000 What? That's not a thing.
01:51:54.680 What are you talking about?
01:51:55.520 Then he blamed the crime wave on COVID.
01:52:03.120 No, it was you guys defunding the police.
01:52:05.400 That's what caused the crime wave: you guys turning the other way when you had millions of people out on the streets burning our cities.
01:52:13.080 That's what it was.
01:52:14.660 It had nothing to do with COVID.
01:52:15.900 Yeah.
01:52:16.480 The one argument that would have been good with COVID lockdowns would have been to lock people down when they were burning the cities to the ground.
01:52:21.700 That's the one time you didn't care about it.
01:52:23.140 Yeah.
01:52:23.760 In fact, you encouraged it.
01:52:24.960 Yeah.
01:52:25.440 Encouraged.
01:52:25.920 You cheered it on.
01:52:26.680 Your vice president was paying for these people to be bailed out.
01:52:30.340 Right.
01:52:30.680 And then he also said that the long national nightmare is over with covid because covid kept us indoors.
01:52:37.840 No, covid did not.
01:52:39.580 You did.
01:52:40.700 And many state governors did.
01:52:42.680 COVID did not close our businesses.
01:52:45.980 You did.
01:52:47.580 Period.
01:52:48.140 In the end, I believe COVID, the tragedy that it was, has a bigger tragedy: what it's going to mean for our children's education, what it's going to mean for our families, what it's going to mean to our country and business.
01:53:06.460 We completely changed because the government forced its way in.
01:53:12.160 And by the way, he's also talking about how he's going to cure cancer.
01:53:15.680 Last night, he talked again about his cancer moonshot initiative.
01:53:18.620 But the problem here is he reinstated the program that Obama had to provide more support for patients and families.
01:53:28.100 OK.
01:53:28.460 All right.
01:53:28.900 But in the Inflation Reduction Act, they cut cancer research by nine times as much as this program is supposed to put in.
01:53:40.340 Oh, really?
01:53:40.660 I didn't even know that.
01:53:41.860 All I know is if you go to hasjoebidencuredcancer.com, you will see the answer to whether he's cured it yet.
01:53:47.720 And unfortunately, as of right now.
01:53:49.680 No.
01:53:50.040 Oh, OK.
01:53:51.600 Life-saving pro-life laws are extreme.
01:53:55.500 No, we're just trying to be up there with other civilized countries, like everybody in Europe.
01:54:01.860 I'm here to be the president for all Americans.
01:54:04.280 Holy cow.
01:54:05.680 Anyway, here's what I thought last night were the things that you should take away.
01:54:10.320 The shouts of "liar," how Congress was fact-checking him in real time.
01:54:15.720 And then he got animated about it.
01:54:17.900 Oh, yeah.
01:54:18.560 You call my office.
01:54:20.920 OK.
01:54:21.760 All right.
01:54:22.480 That's really cool.
01:54:23.940 Let me run this by you, Glenn.
01:54:25.700 I don't think that worked for Republicans.
01:54:28.060 I don't think the yelling back at him was effective.
01:54:31.400 It seemed like it animated him.
01:54:33.060 It made him wake up.
01:54:34.120 I mean, the guy's been falling asleep in the middle of speeches for the past five years.
01:54:38.400 And this one, he seemed to have energy all the way to the end.
01:54:41.120 And I think a lot of it was because he, again poorly and filled with lies, was fighting back and forth, you know, trying to make this into a Joe Biden versus Marjorie Taylor Greene thing.
01:54:53.620 And, like, I feel like next year they should just freeze him out and be quiet the whole time because, I mean, he fails on his own.
01:55:00.600 And I thought that that actually probably helped him a little bit last night.
01:55:05.160 I hope not.
01:55:07.000 But you may very well be right.
01:55:11.980 Let me ask you something.
01:55:13.220 One of the more bizarre things.
01:55:16.140 Do we have.
01:55:19.340 Let's see if I can find it.
01:55:20.600 Do we have the.
01:55:22.980 The odd kiss.
01:55:25.380 Oh, I've been seeing screenshots.
01:55:27.260 Yeah, we do have that.
01:55:27.920 Did you see it last night?
01:55:28.900 I didn't watch.
01:55:29.880 I've seen screenshots of it.
01:55:31.060 OK, so there's the screenshot that is Kamala Harris's husband and Joe Biden's wife kissing each other on the mouth.
01:55:41.500 And, you know, I don't know if you've ever had this situation where, you know, you're, you know, hey, how are you?
01:55:46.800 Good to see you.
01:55:47.280 And, you know, you can kiss on a cheek or something.
01:55:49.840 And then they turn their cheek or you turn your cheek.
01:55:52.300 And then it's a kind of that weird thing.
01:55:53.980 Like, I don't want to get you.
01:55:55.260 Right.
01:55:55.840 And you kind of stop.
01:55:56.740 It wasn't that.
01:55:58.040 It wasn't that.
01:55:58.800 No, when you watch the video.
01:55:59.980 This was right for the lips.
01:56:02.200 I don't know if there was tongue, but it was weird.
01:56:04.920 You don't know if there was tongue?
01:56:08.260 Maybe a little bit of tongue.
01:56:09.820 Well, I mean, maybe it takes two to tango.
01:56:11.620 We can't rule it out.
01:56:13.400 It's what we're saying here.
01:56:15.520 That's weird.
01:56:16.240 Is that weird?
01:56:17.600 I mean, it's not.
01:56:18.480 It doesn't mean anything per se.
01:56:21.020 No, I'm not saying that they're having an affair or anything else.
01:56:23.280 I'm just saying that's weird.
01:56:25.720 Especially for people who are like, you've got to wear a mask.
01:56:28.780 Yeah.
01:56:29.420 Yeah.
01:56:29.620 What happened to that?
01:56:30.780 The only person wearing a mask on the entire floor was Bernie Sanders that I could see.
01:56:35.160 Bernie was still wearing one.
01:56:36.500 And that does go with that recent study that points out that only ugly people are wearing masks now.
01:56:42.640 I haven't seen that.
01:56:43.680 Yeah, there was a study that came out that said, like, people who are considered unattractive
01:56:47.620 are more likely to continue to wear masks, which I kind of feel like we've been really
01:56:52.720 critical of the mask thing.
01:56:54.380 Maybe for some people, we should encourage it.
01:56:57.580 Yes.
01:56:58.020 I don't care about COVID, but look at you.
01:57:00.680 Keep your mask on.
01:57:02.320 I'm just throwing it out there.
01:57:03.600 I mean, it's not going to help you in dating or something because at some point you have
01:57:07.940 to take the mask off.
01:57:09.060 I know.
01:57:09.440 And you don't want to be like Phantom of the Opera where everybody's like, oh, my God,
01:57:12.760 put the mask back on.
01:57:14.140 I had no idea.
01:57:15.440 It's not about them, Glenn.
01:57:17.040 It's about us.
01:57:17.860 We don't have to look at them.
01:57:19.380 And people don't have to look at us.
01:57:20.880 We can wear masks and people can.
01:57:22.860 That's why I got into radio.
01:57:24.460 Right.
01:57:24.580 That's why I got into radio.
01:57:25.660 And then the TV and the internet thing happened.
01:57:27.440 Screwed the whole thing up.
01:57:28.380 Screwed it all up.
01:57:28.740 We were supposed to be just voices and then people wouldn't know what we looked like so
01:57:32.580 they wouldn't be horrified by it.
01:57:33.780 But then this stupid technology thing took off.
01:57:36.240 Yeah.
01:57:37.280 That's not our fault.
01:57:38.260 But we can solve the problem by continuing to wear masks.
01:57:41.720 Not willing to do it.
01:57:42.860 Okay.
01:57:43.460 All right.
01:57:44.140 Tonight at 9 p.m.
01:57:47.420 Eastern, the Wednesday night special.
01:57:49.700 I'm going to take you through how artificial intelligence is changing everything.
01:57:54.520 Not about to change, but it is now here.
01:57:58.080 I started talking with Stu probably in '97, I think, is when we first started.
01:58:02.640 It wasn't long into it before
01:58:03.940 I was like, do you know what's coming?
01:58:05.700 I should have ran for the hills then.
01:58:09.980 At that time, I wasn't spooked by it.
01:58:13.980 I was amazed by it.
01:58:15.440 Yes.
01:58:15.860 You know what I mean?
01:58:16.580 And I remember having the discussion.
01:58:19.460 Someday, it'll all be download times.
01:58:21.980 No Thursday night must-see TV at 8 o'clock.
01:58:25.440 Download.
01:58:26.240 It'll just say, downloads tonight at midnight, and you can download it whenever you want.
01:58:31.460 Wow.
01:58:32.400 It didn't seem- I mean, I remember when you said that, it did not seem like it was even
01:58:36.260 possible.
01:58:37.320 Plausible.
01:58:37.760 No.
01:58:37.920 Like, why would you want it?
01:58:39.580 We were 10 years away from an iPhone.
01:58:41.560 Yeah.
01:58:42.000 So, you know-
01:58:42.860 More than that.
01:58:43.680 Yeah.
01:58:44.020 Yeah.
01:58:44.280 So, we were still having a hard time, you know, with 6400 baud, to get a picture.
01:58:50.840 I don't even know if I had a cell phone when we had that conversation.
01:58:53.260 Yeah.
01:58:53.280 That's how long ago it was.
01:58:54.400 Anyway, all of this stuff is here, and it's the scarier stuff, the stuff I've been warning
01:59:01.640 about with artificial intelligence: deep fakes, augmented reality, and job displacement.
01:59:09.240 If you want to understand the world that we are now walking into, it's time to do this
01:59:18.700 now.
01:59:19.220 You need to understand this, because it is going to start happening so rapidly, you won't
01:59:27.040 be able to catch your breath.
01:59:29.260 Your head will be spinning, and you'll be like, I don't- wait, what is that?
01:59:33.280 How is- when did that happen?
01:59:34.800 It's going to happen so fast.
01:59:36.960 It's honestly like taking the last 10 years and putting it into six months.
01:59:44.460 That's how fast things are going to seem.
01:59:47.760 Tonight, we'll prepare you with an introduction to ChatGPT technology.
01:59:55.300 It's one of those moments that you will look back on in just a few years and go, you know, do
02:00:00.940 you remember when ChatGPT came out and it was a big deal?
02:00:04.440 It changes everything.
02:00:07.560 Tonight, the AI revolution is here.
02:00:10.560 How machines will transform your entire world.
02:00:15.540 Tonight, 9 p.m.
02:00:17.160 Eastern on blazetv.com, and 9:30 on youtube.com slash Glenn Beck.
02:00:23.920 By the way, save $20 now on your Blaze TV subscription.
02:00:28.220 Do it right now.
02:00:31.040 We're using the special promo code S-O-T-U, state of the union, S-O-T-U, blazetv.com slash S-O-T-U.
02:00:41.040 And the promo code is S-O-T-U.
02:00:44.380 S-O-T-U.
02:00:45.300 S-O-T-U.
02:00:45.860 Okay.
02:00:47.020 All of a sudden, I'm becoming Joe Biden.
02:00:49.360 And then, then, then, then, then, then, then, then, then, then, then, then, then, then, then, then, then, then.
02:00:51.400 Well, what's it he said last night?
02:00:52.780 There was something that he said in the middle, and it was like a one-liner.
02:00:57.720 And I'm like, I don't even know what he meant by that.
02:01:01.020 It was so bizarre.
02:01:03.540 Anyway, first there was the original MyPillow.
02:01:07.440 Mike Lindell invented it, revolutionized the way a ton of people sleep, myself included.
02:01:12.920 I've had MyPillow now for about three years, and I have to tell you, it's bizarre.
02:01:19.440 I sleep with it every night, and if I don't have it, I mean, travel with it.
02:01:25.480 I've never had this relationship with MyPillow before, but it helps me sleep and keeps my neck from being all out of whack in the morning.
02:01:34.840 It stays cool through the night.
02:01:36.460 Put your fist through it, fluff it up once, it's golden all night.
02:01:39.800 Well, Mike now has improved on it.
02:01:41.720 There's a new improved version of the MyPillow.
02:01:44.820 Why not try to upgrade perfection?
02:01:47.920 MyPillow 2.0.
02:01:49.060 It has the patented, adjustable feel of the original, but now with brand-new exclusive fabric made with temperature-regulating thread.
02:01:59.340 What?
02:02:00.460 MyPillow 2.0.
02:02:01.740 Buy one, get one free right now for a limited time.
02:02:04.460 Promo code BECK.
02:02:05.900 MyPillow.
02:02:07.060 Limited time only.
02:02:08.340 Buy one, get one free.
02:02:09.740 MyPillow.com promo code BECK.
02:02:11.940 MyPillow.com promo code BECK or call 800-966-3117.
02:02:17.520 800-966-3117.
02:02:19.800 MyPillow.com slash BECK.
02:02:22.020 The Glenn Beck Program.
02:02:25.580 Welcome to the program.
02:02:49.780 You know, there is a guest that I really want to talk to.
02:02:54.900 It is a person who was reading that Nancy Pelosi eats ice cream for breakfast and hot dogs for lunch every day.
02:03:06.620 Yes, very strange.
02:03:09.420 Very strange.
02:03:10.520 So they put themselves on a daily diet of that for, I think, a week or 10 days.
02:03:16.280 And I think by day two, they were just vomiting.
02:03:20.840 But they really, oof.
02:03:25.520 I mean, I like ice cream a lot.
02:03:29.920 And I could eat it for breakfast.
02:03:31.980 I mean, assuming that it had some sort of fruit in it.
02:03:35.060 Like strawberry ice cream.
02:03:36.140 Something healthy.
02:03:36.880 Something healthy, yeah.
02:03:38.560 But she apparently has hot lemon water and ice cream.
02:03:46.960 I mean, that's how you get to look like her.
02:03:50.400 If you want to talk about a commercial, that's just a diet commercial.
02:03:54.540 Look just like Nancy Pelosi with only that. I mean, ice cream is enjoyable to eat.
02:04:00.620 The morning, though?
02:04:02.400 Does that sound like.
02:04:03.060 Oh, yeah.
02:04:03.700 I could eat ice cream anytime.
02:04:05.320 You're a morning ice cream person?
02:04:06.480 I could eat ice cream any time of the day.
02:04:09.080 This is sort of your bipartisanship.
02:04:10.460 You agree with Nancy here, maybe.
02:04:12.260 You should try it.
02:04:13.260 Why don't you try this as well?
02:04:14.760 No.
02:04:15.100 Have you seen me lately?
02:04:16.780 Yes.
02:04:17.400 Yes.
02:04:17.760 I have.
02:04:18.280 Yeah.
02:04:18.940 You haven't seen all of me because I can't fit all of me into the same room with you as well.
02:04:24.760 So.
02:04:25.260 Yeah.
02:04:25.680 You're on your way to the Super Bowl, right?
02:04:27.300 Yes.
02:04:27.720 Go Eagles.
02:04:28.820 Please.
02:04:29.360 Yes.
02:04:29.900 Go away.
02:04:30.380 Go back to Philadelphia.
02:04:32.260 Go back.
02:04:34.040 This is going to be torture if they lose.
02:04:35.660 So they have to win.
02:04:36.740 Please win.
02:04:37.420 Please.
02:04:38.020 Please win.
02:04:38.620 Look, I ask you to.
02:04:39.340 My son's coming.
02:04:39.440 He was good luck last time.
02:04:40.620 I ask you to pray for Pat and his family.
02:04:44.160 Pat Gray.
02:04:45.140 But also for Kansas City.
02:04:47.860 The Glenn Beck Program.