The Glenn Beck Program - February 09, 2023


What Are the Odds That BIDEN Blew Up the Nord Stream Pipelines? | Guests: Pedro Domingos & Bill O’Reilly | 2/9/23


Episode Stats

Length

2 hours and 3 minutes

Words per Minute

150.3

Word Count

18,575

Sentence Count

1,502

Misogynist Sentences

6

Hate Speech Sentences

44


Summary

On this episode of the Glenn Beck Program, host Glenn Beck is joined by Jason Buttrill to discuss the latest in the story of the Nord Stream pipeline blowup, and whether or not it was an act of war.


Transcript

00:00:00.000 I want you to picture for a minute what, you know, what the world would be like if all of a sudden a global medication supply chain of antibiotics just stopped, disappeared, even for a while.
00:00:12.840 How many of us would not make it?
00:00:16.600 Antibiotics are so important.
00:00:18.440 And right now, if there is a supply chain hit again with China or India, you can bet they're going to take care of their own needs first.
00:00:26.540 And what do we have?
00:00:27.700 There's already a shortage of some antibiotics right now.
00:00:32.160 That's why the Jase Case is so important from Jase Medical.
00:00:35.780 It's a way for you to keep yourself prepared for the worst.
00:00:39.160 And it is, you know, in case there's a shortage or, you know, supply chain or whatever.
00:00:44.340 But it could also just be for vacation.
00:00:46.880 When you're traveling, you have the five antibiotics that can treat all kinds of different things.
00:00:54.480 And it just tides you over, takes care of it until you can get home to your doctor.
00:00:59.140 Jase Medical, J-A-S-E Medical dot com.
00:01:02.980 Jase Medical dot com.
00:01:04.440 Enter the promo code Beck10
00:01:05.640 at checkout for a discount on your order.
00:01:07.660 Promo code Beck10, J-A-S-E Medical dot com.
00:01:12.240 Promo code Beck10.
00:01:28.560 No room to compromise.
00:01:32.980 We gotta stand together and we're gonna survive.
00:01:36.540 Stand up, stand up, hold the line
00:01:42.220 It's a new day, I'm trying to rise
00:01:48.580 What you're about to hear is the fusion of entertainment and enlightenment.
00:01:57.840 This is the Glenn Beck Program.
00:02:03.540 Welcome to the Glenn Beck Program.
00:02:05.600 I'm sorry, I'm just looking at three little bubbles on my screen
00:02:08.400 going back and forth with Senator Mike Lee right now
00:02:12.060 because he said something really frightening yesterday.
00:02:15.500 It was a Twitter response where he was talking about this new report
00:02:20.860 that is out now that we apparently blew up Nord Stream.
00:02:26.440 I have real questions and real doubts on this,
00:02:29.680 but I also have real doubts on us.
00:02:32.540 I don't know.
00:02:33.260 And he tweeted last night,
00:02:36.560 if true, slander.
00:02:40.720 No, sorry.
00:02:41.280 If false, slander.
00:02:42.900 If true, war.
00:02:44.920 And he's absolutely right.
00:02:46.720 If you haven't heard that story, I'm just seeing here.
00:02:48.900 He's in a hearing right now.
00:02:53.360 He's got to speak.
00:02:54.320 It's hard to predict when he can come on,
00:02:56.400 but he will call in because I want to get his look at this.
00:03:01.280 He said he's talked to several people about this,
00:03:04.300 and he's not sure.
00:03:06.160 And it's disturbing to him that he's not sure whether we did an act of war or not.
00:03:12.600 And I'm going to fill you in on all of this in 60 seconds.
00:03:16.720 Now, I don't know about you,
00:03:18.020 but I got a couple of vehicles that I plan to drive until, you know, the doors fall off,
00:03:22.640 maybe literally fall off.
00:03:24.240 The upshot of this is that from time to time, they have to be repaired.
00:03:27.700 And that's why I have CarShield,
00:03:29.180 because the repairs can be very costly.
00:03:32.140 You know, the modern car now has 3,000 computer chips.
00:03:36.760 Oh, yeah.
00:03:37.680 Let's get into a chip war, because that'll go great for all of us.
00:03:42.300 Whether your car has 5,000 or 150,000 miles on it,
00:03:46.080 CarShield offers affordable plans to fit every budget.
00:03:50.320 And you're going to want to have that when something goes wrong with a car.
00:03:54.160 They saved me a ton of money on my trucks, and they can do the same for you.
00:03:58.800 Call CarShield now.
00:03:59.840 Save up to 20% on your plan.
00:04:01.460 You'll always be prepared for the unexpected.
00:04:04.800 Call 800-227-6100.
00:04:08.140 800-227-6100.
00:04:10.340 Save 20% now at CarShield.com slash Beck.
00:04:13.780 That's CarShield.com slash Beck.
00:04:18.000 All right.
00:04:18.720 I want to bring in Jason Buttrill,
00:04:21.760 who is with me and is going to explain exactly what is going on
00:04:30.120 with this one report from one source.
00:04:34.460 So I say that clearly at the beginning.
00:04:38.460 There's problems with this reporting because it is one source,
00:04:43.240 and I wouldn't take that from the New York Times as gospel.
00:04:46.180 So let's remember one source, but it's pretty damning.
00:04:52.560 It has a ton of facts.
00:04:55.400 Tell me the story, Jason, on what happened and where this report's coming from.
00:04:59.520 I struggle to even really describe how to even tell the story because it sounds like,
00:05:04.100 are you familiar with the term fan fiction?
00:05:05.980 Yes.
00:05:06.440 It's like, that's what it's like off the internet.
00:05:08.120 Like, what would really happen if Anakin Skywalker didn't become Darth Vader?
00:05:12.120 This is the story.
00:05:13.280 Right.
00:05:13.520 That's what it sounds like.
00:05:14.520 Right.
00:05:14.640 But, I mean, Mike Lee is exactly right.
00:05:17.440 If this is true, this is an act of war.
00:05:19.460 And what they're alleging is that the CIA, the Biden administration,
00:05:26.060 came up with a plan to eliminate the Nord Stream 2 pipeline, to blow it up.
00:05:32.000 And we all remember, I think I even came on this program and said,
00:05:35.080 I think you asked me, it was like, do you think this was us?
00:05:37.100 And I'd be like, and then I was like, well, no.
00:05:39.140 We would never risk something like a direct attack on a Russian asset.
00:05:43.700 Never risk it.
00:05:44.860 Here's the thing.
00:05:45.620 I think it was Germany or Sweden.
00:05:47.300 Somebody just released a report that showed Russia didn't do it.
00:05:51.640 Yeah.
00:05:52.040 And how many countries have the ability to do something like this?
00:05:56.220 This was not an easy hit.
00:06:00.060 Not an easy hit and not even an easy hit for Americans.
00:06:03.360 I mean, it would take a long time.
00:06:04.460 I mean, it would take very specific assets like, you know,
00:06:08.500 SEAL Team 6 or something like that.
00:06:10.160 Correct.
00:06:10.540 But the article goes into that.
00:06:12.580 They couldn't use SEAL Team 6 or anyone in JSOC, the Joint Special Operations Command,
00:06:16.740 because they'd have to go through Congress.
00:06:18.640 Now, this is a big part of the story.
00:06:20.260 If true, they use some obscure Navy divers that are not part of JSOC.
00:06:26.140 So then the CIA could use them in a joint intelligence operation,
00:06:30.380 not a military operation, an intelligence operation.
00:06:33.920 That would allow them to keep this quiet from Congress.
00:06:37.440 Now, think about that.
00:06:38.380 But like Mike Lee said, this would be an act of war if we did it and they found out.
00:06:43.100 But we didn't inform Congress about it, if true.
00:06:46.620 So there is multiple layers to this, even right off the bat.
00:06:50.420 Who is this written by?
00:06:51.920 This is written by a guy named Seymour Hersh.
00:06:54.320 He wrote for the, I think, New York Times.
00:06:56.560 He was a guy who got the Pulitzer for exposing the My Lai massacre.
00:07:04.200 Yep.
00:07:04.660 Vietnam.
00:07:05.200 Right?
00:07:05.220 Vietnam.
00:07:05.840 So, and he has done many, you know, exposés,
00:07:10.720 but they generally kind of lean against America.
00:07:14.820 Do they not?
00:07:15.720 Yeah.
00:07:16.000 There was the one in, well, I guess the bigger one would be Osama bin Laden
00:07:20.240 questioning, you know, how all of that went down.
00:07:22.820 He even actually questioned Osama bin Laden's culpability in 9-11.
00:07:26.600 So it almost, this is what you kind of see with journalists nowadays,
00:07:29.840 especially, you know, we saw this in the Russiagate stuff.
00:07:32.540 It's almost like they got on this Woodward and Bernstein high.
00:07:35.920 Yeah.
00:07:36.100 And they all want to top each other off of it.
00:07:38.260 So like, where do you go after topping something like, you know,
00:07:41.060 Woodward and Bernstein?
00:07:42.060 They're like getting more and more fantastical and always trying to one up.
00:07:46.340 Well, but not necessarily.
00:07:47.340 I mean, this, this story is why you need a credible press,
00:07:53.780 why you need journalistic standards and not activists,
00:07:58.580 because we are dealing with a story now that if it is true,
00:08:04.700 the American people wouldn't have gone for this,
00:08:09.220 but it's the American people, if true, that will pay the price.
00:08:14.700 It will be our sons and daughters fighting a war with Russia and probably half of the world
00:08:21.460 because of something our out of control deep state did.
00:08:26.000 And we wouldn't have been for it.
00:08:29.360 Now, how do we prove it?
00:08:32.460 Who do you believe?
00:08:34.020 Do you believe the investigators with Congress?
00:08:37.840 Do you believe the investigators from the New York Times?
00:08:41.960 Who do you believe?
00:08:44.460 There's one source on this, which I'd love to have,
00:08:48.640 because you were former military intel.
00:08:50.640 So I would love to have your thought on this.
00:08:53.640 Something this large, because the story is pages and pages and pages
00:08:58.900 and has great detail in it.
00:09:01.380 Yeah.
00:09:04.180 It's all coming from one source.
00:09:06.620 What are the odds that something this secret, this complex,
00:09:12.480 had more than a few, maybe five, maybe five key holders
00:09:17.880 that could unlock all of the information?
00:09:20.560 So let me just, from my intel perspective,
00:09:22.460 and my real world experience is Afghanistan.
00:09:24.900 I was one of the first ground troops conventional into Afghanistan after 9-11.
00:09:28.640 So I was part of the planning phase just on my small level, my unit.
00:09:33.540 I didn't know that certain things were going on in northern Afghanistan.
00:09:37.800 I knew a lot of stuff in south.
00:09:39.020 When we got on the ground,
00:09:40.480 I didn't even know that there was special forces in certain areas
00:09:44.780 that had been there for a while.
00:09:47.100 That was not my need to know.
00:09:49.100 I didn't even know that.
00:09:50.000 Correct.
00:09:50.140 And that was right before a war.
00:09:52.520 So just that perspective.
00:09:54.200 There's no way, in my mind, that a mid or lower level,
00:10:00.160 say that carefully, person would have operational knowledge in that detail.
00:10:05.580 You would need cabinet level or director level access.
00:10:12.500 Now, it's interesting because the way you're phrasing this
00:10:16.440 and you're being very, very accurate on things,
00:10:20.880 a cabinet level or director level might have this information.
00:10:26.940 Why would you bring up director level information on something this sensitive?
00:10:32.980 And I mean, director level, this was done by the CIA.
00:10:37.100 Okay, so at least in this report done by the CIA.
00:10:42.960 So it would mean what?
00:10:44.740 Like the director of the CIA.
00:10:46.380 Why would he rat himself out?
00:10:49.380 I mean, that's a really good question.
00:10:51.980 Unless he was doing his duty and did not believe in what they were doing.
00:10:57.160 Is there any example of a director level spilling their guts on something like this?
00:11:04.040 Deep throat?
00:11:04.940 Hmm?
00:11:05.420 Deep throat?
00:11:07.380 Oh, that's right.
00:11:08.280 That was the director of the FBI, right?
00:11:11.280 Which we found out years.
00:11:13.560 Was it decades?
00:11:14.760 Oh, decades.
00:11:15.580 Yeah.
00:11:15.800 Decades later.
00:11:16.540 Decades later.
00:11:17.300 But then we were always like, there's no way.
00:11:19.220 Like, how was he getting all this information?
00:11:21.620 It's like, how the heck?
00:11:22.960 That was a big part of it.
00:11:23.840 Who is your source?
00:11:25.120 Never would have believed in my wildest dreams that it was the director of the FBI.
00:11:29.660 Never.
00:11:30.640 And that's like this.
00:11:32.500 Will we, decades later, say, how the heck did this guy get his information and we find
00:11:36.700 it was the director of the CIA?
00:11:38.260 If it's true.
00:11:39.080 If it's true.
00:11:39.680 Now, where do you go from here?
00:11:43.160 Where do we go from here?
00:11:44.640 Because no Western ally is going to verify this.
00:11:50.360 No.
00:11:50.880 No Western.
00:11:51.520 Even if it is true and they hate the fact that it's true, they know if we say, oh, you
00:12:00.300 know what?
00:12:00.580 I think it was the United States.
00:12:02.300 This is an act of war.
00:12:04.460 Yeah.
00:12:04.740 And Russia has the righteous stance in the world to take us down or attempt to take us
00:12:14.140 down, take us to war.
00:12:15.260 That is an act of war.
00:12:16.860 So what, what, what does this mean?
00:12:22.240 How do we ever find out anything?
00:12:24.800 Russia has actually responded.
00:12:26.640 And they've said that because of these new air quote facts that the White House needs to
00:12:32.540 respond to this or to answer.
00:12:33.960 Of course, the White House, the State Department and the CIA have all been asked and they've
00:12:37.220 all categorically denied it.
00:12:38.820 But the article was so specific to answer your question in certain ways that, you know, in
00:12:45.420 time and the time frames that they pulled this off.
00:12:47.600 For instance, the article goes into there was a big naval exercise that they used as cover
00:12:53.660 to send in these divers.
00:12:56.320 And that exercise did happen.
00:12:57.720 That exercise did happen.
00:12:58.600 He even puts a link into their specific excuse about having, using divers to show off the
00:13:06.260 capabilities of their mind clearing capabilities.
00:13:09.740 But it's, it's, I mean, you know, even Satan uses some truth and then mixes it with, with
00:13:17.180 falsehood.
00:13:18.000 Right.
00:13:18.300 So, I mean, you know, that doesn't prove anything.
00:13:21.900 Right, right.
00:13:22.500 But, um, so there's that, which maybe they can, I don't know, use some kind of, maybe they
00:13:26.900 were surveilling the areas, maybe they could look at something, I don't know.
00:13:29.800 But then he also goes in very specifically on the type of mine they used to get around the
00:13:33.540 Russian detection capabilities.
00:13:36.580 They go into that.
00:13:37.720 Then they go into, and this is, this kind of seemed weird about how they were going to
00:13:41.300 detonate like 72 hours or 48 hours after this exercise.
00:13:45.940 And then all of a sudden they had this after, afterthought of, oh, maybe that seems kind of
00:13:50.420 suspicious.
00:13:51.000 Maybe we shouldn't just have it on a time detonation a couple of days after the exercise.
00:13:55.040 That doesn't.
00:13:55.600 That doesn't jibe with me.
00:13:56.540 No, that doesn't.
00:13:56.940 So then they were like, oh, let's send in this like buoy that like has this high tech
00:14:01.260 ping that can, you know, we'll drop it from a plane and it'll set off these charges.
00:14:06.700 That also seems odd to me.
00:14:08.360 That also seems something that the Russians might be able to verify.
00:14:11.220 So I, I mean, I wouldn't be surprised right now if there are Russian surveillance planes
00:14:16.980 flying over the area, gathering intel, possibly, you know, attempting to go and look, take a
00:14:23.120 second look.
00:14:23.740 Like, I don't, I definitely don't think we've heard the last of this.
00:14:27.140 So I'm sure they're going to try and verify it if they can, but they're Russians really.
00:14:31.160 So even if they don't, they're probably just going to say, yeah, they did it anyway.
00:14:34.980 Right.
00:14:35.340 I mean, I would, I would too.
00:14:37.420 I would.
00:14:37.860 And quite honestly, I'm not sure we didn't do it.
00:14:41.640 I'm not either.
00:14:42.320 Yeah.
00:14:42.460 Which is wild.
00:14:43.160 I never would have thought of this.
00:14:44.380 You know, 20 years ago, I would have said, absolutely not.
00:14:47.860 No way.
00:14:48.940 No way.
00:14:50.860 But if you hit me today, if 9-11 happened and we heard, you know, Bush and Clinton and
00:14:58.840 we had exactly what happened with Sandy Berger at the National Archives where he's smuggling
00:15:05.120 documents out about Bush and Clinton and anything related to Osama bin Laden prior to the bombing.
00:15:13.880 I, I, I, I would deeply question our government.
00:15:19.040 We, we have come a long way on finding out how bad our government can be and has been in
00:15:26.980 the past.
00:15:27.660 The problem with this is, is you are going to pay the price if this happened, or if Russia
00:15:34.860 decides to go with it, you, your son, your daughter, you're going to pay the price.
00:15:41.660 And that's, what's so infuriating, because if it is true, the American people should demand
00:15:49.700 that these people, whoever was involved, whoever had this decision, uh, is in prison and punished.
00:15:57.860 And you know what?
00:15:58.820 I would be fine.
00:16:00.080 I don't care who it is.
00:16:01.420 I don't, let me just say this.
00:16:02.580 And it wasn't, it couldn't, couldn't have been because he wasn't in office, but to show
00:16:07.720 you how passionate, even if it was the former president, go ahead, send him over to Russia.
00:16:13.920 Let him, let him face a trial over in Russia.
00:16:16.460 I'm sorry, but you do something like this and you don't inform Congress.
00:16:22.600 I mean, this is, this is the tweet from Mike Lee last night.
00:15:57.860 I'm troubled that I can't immediately rule out this suggestion that the U.S. blew up Nord
00:16:33.360 Stream.
00:16:35.740 He can't rule it out.
00:16:38.220 I checked with a bunch of Senate colleagues.
00:16:41.420 Among those I've asked, none were ever briefed on this.
00:16:43.720 If it turns out to be true, we've got a huge problem.
00:16:47.940 Yeah, we do.
00:16:49.380 Yeah, we do.
00:16:50.040 Now, let me give you a couple of other things, other things and see if, um, see if it feels
00:16:59.760 as though some sort of play is going on.
00:17:06.640 I'll explain in a minute.
00:17:08.100 First, I want you to consider switching your phone service to Patriot mobile right now today.
00:17:13.240 I mean, it's quite a plunge leaving behind a big mobile company that gives you a mediocre
00:17:17.920 service and, you know, a premium price.
00:17:19.960 I mean, I know, I know the truth is it's hard because it's just a hassle.
00:17:25.460 It's a hassle, but Patriot mobile is making that hassle go away.
00:17:29.460 They, they have a U.S.-based customer service team.
00:17:32.240 So yeah, they speak English, believe it or not.
00:17:34.320 Yeah.
00:17:34.620 Yeah.
00:17:34.820 They can understand you.
00:17:35.900 They make it switching easy.
00:17:38.820 Now you're going to pay less.
00:17:40.580 You're going to have the same great, uh, coverage.
00:17:43.380 You're going to have better service.
00:17:45.180 Did I say you're going to pay less?
00:17:46.520 Oh, and they're not going to be working against everything you hold dear.
00:17:51.720 Patriot Mobile, a company that shares your values and is getting the job done.
00:17:57.380 100% U.S.-based customer service team is waiting for your call right now at 878-PATRIOT,
00:18:02.540 878-PATRIOT. Switch now, uh, at PatriotMobile.com slash Beck, get
00:18:09.420 free activation with the offer code Beck.
00:18:11.900 Call them 800, sorry, 878-PATRIOT, 878-PATRIOT.
00:18:17.760 10 seconds station ID.
00:18:28.380 Okay.
00:18:28.780 So there's a couple of, a couple of other things, but we have now officially denied that
00:18:36.960 we blew up the Nord Stream pipelines.
00:18:36.960 However, there are very few countries that could have done it.
00:18:41.020 Okay.
00:18:41.540 Very few.
00:18:43.080 Um, and, and it's, as Jason said, it's, it's iffy for even us to do.
00:18:49.200 So it's a, you know, kind of a moon launch kind of thing.
00:18:52.920 Now, this just, uh, this has just come out.
00:19:00.560 Let me see the date on this, uh, from CNN.
00:19:04.300 Um, and it says there are strong indications that Russian president Vladimir Putin personally
00:19:12.280 approved the decision to provide separatists in Ukraine with the missile that shot down
00:19:18.860 the Malaysian airlines flight MH 17.
00:19:22.800 This, according to Dutch investigators, they said this yesterday, citing intercepted telephone
00:19:29.120 conversations by Russian government officials.
00:19:31.440 The public prosecution service joint investigative team said there were strong indications that
00:19:37.720 Russia and the president made the decision about the provision, uh, to give these weapons
00:19:45.240 to the separatists of the Republic, uh, of Donetsk.
00:19:52.460 So what is this?
00:19:54.300 So a story is released about us and an act of war.
00:20:01.660 And the next day, CNN is reporting that, Oh, by the way, did you see Putin?
00:20:09.080 He authorized the shooting down of a civilian airliner, but we, we knew this story.
00:20:17.700 Didn't we, didn't we know this story a while back?
00:20:21.020 Or is this the first time we're hearing this, Jason?
00:20:24.380 They, well, so the investigators looking into it have always said, well, ever since they
00:20:30.700 found out that the, the, the missiles came directly, the, the anti-aircraft missile that
00:20:35.120 shot down the plane came from Russia.
00:20:36.920 There was always a question of Putin culpability, like direct culpability.
00:20:41.300 Um, they were, that was always just kind of left out there.
00:20:44.980 Um, it was a, it was a, what were they calling, uh, the, the press here is calling, uh, the, uh,
00:20:50.840 the Nord Stream pipeline explosion, a mystery.
00:20:53.740 Yeah.
00:20:54.240 So the shooting down of this, and we knew the missile came from Russia, but it was a mystery
00:21:00.360 on who did it.
00:21:01.580 Yeah.
00:21:01.900 Directly.
00:21:02.340 Like what was it?
00:21:03.000 Some random, uh, you know, I don't know, army general or something like that.
00:21:07.300 Was it rogue?
00:21:08.200 Exactly.
00:21:08.900 It could be, you know, they just kind of left it out there.
00:21:11.020 That, that was the mystery because you can't go off and just claim that, you know, the
00:21:14.540 leader of a country is a war criminal.
00:21:16.300 Can't really do that unless you need some leverage, right?
00:21:18.900 Right.
00:21:19.220 Unless you need some leverage.
00:21:21.260 Yeah.
00:21:21.680 Cause that's, I mean, wouldn't that really be kind of an act?
00:21:24.560 It's not exactly an act of war, but kind of a direct, you know, you're shooting down
00:21:28.840 a civilian airliner.
00:21:30.120 Right.
00:21:30.400 And if the government, the head of the government, boy, that makes you a guy that just has to
00:21:34.380 be removed.
00:21:35.800 And everybody would agree with that.
00:21:37.920 Yeah.
00:21:38.220 And if you're, and if you're looking at a country that's blowing up the red phone and
00:21:42.060 the Oval Office saying, can you respond to this?
00:21:45.280 Uh, it's a pretty good thing to say, well, you know, we have strong indications that you
00:21:50.020 might be directly involved in the downing of MH17.
00:21:53.620 Don't worry.
00:21:54.500 We're not going to say you did.
00:21:56.360 We just have strong indications.
00:21:58.140 Yeah.
00:21:58.720 We'd like you to answer those charges.
00:22:01.620 Okay.
00:22:02.080 So this is, this is the nuclear war of words.
00:22:05.960 They are words.
00:22:07.600 They have missiles pointed at Joe Biden and this administration.
00:22:11.840 And now we have missiles pointed right at Vladimir Putin.
00:22:17.060 But if I may, there's a third story that came out.
00:22:23.140 Uh, I don't like the way this play is ending at the half.
00:22:28.900 What is in the second act?
00:22:32.700 Let me tell you, uh, another story that just came out.
00:22:36.960 Gee, it seems like we're watching a war story.
00:22:40.420 Doesn't it?
00:22:41.480 Coming up in just a second.
00:22:42.760 The Glenn Beck program.
00:22:49.820 Oh man.
00:22:50.340 If you have a nice spot, uh, picked out in the broom closet or, uh, you know, you've,
00:22:55.900 you're traveling into work with a six pack of, uh, weapons grade energy drinks.
00:23:00.840 You know that you have that meeting coming up and you're so tired and you just cannot fall
00:23:05.420 asleep in it.
00:23:06.340 If you're tired of being tired, tired all the time, you're in luck.
00:23:09.820 I'd like you to try relief factor sleep.
00:23:13.400 It's amazing.
00:23:14.340 Just like the regular relief factor that you take for pain.
00:23:17.320 Relief factor sleep is 100% drug free.
00:23:21.180 It's a blend of natural ingredients.
00:23:23.640 I'm trying to do everything I can to, uh, go natural.
00:23:27.600 And because all of these drugs that we take, I mean, let's just take sleep.
00:23:31.920 The, the basic thing on sleep, uh, medication is that you're drowsy in the morning.
00:23:37.320 Okay.
00:23:38.100 This is natural.
00:23:39.220 So you don't feel it.
00:23:41.340 Um, it doesn't all of a sudden feel like, I can't operate and you feel great the next
00:23:46.740 morning.
00:23:47.020 So unleash the power of a great sleep by calling 800, the number 4, RELIEF.
00:23:52.720 That's 800, the number 4, RELIEF.
00:23:54.860 It's Relief Factor Sleep. ReliefFactor.com. Dream big, sleep tight. ReliefFactor.com.
00:24:01.820 Hey, don't forget last night.
00:24:05.780 We had a huge special on, uh, AI.
00:24:10.400 You don't want to miss it. BlazeTV.com.
00:24:13.080 There is a, uh, a story out from, um, a well-known journalist, Seymour Hersh.
00:24:34.820 Uh, I, I, I, I cannot verify the credibility of this.
00:24:41.780 It is based on one source.
00:24:44.120 Uh, however, it is in reading it.
00:24:47.320 It, it, it generally reads like it's true, but it also reads like a movie.
00:24:54.320 It doesn't, it, the, the only thing they have is a bunch of facts, a bunch of facts, but you're
00:25:00.260 not going to be able to verify these facts.
00:25:03.000 And it would take somebody way, way high, um, in either the Intel community, um, military
00:25:12.780 community, or the, um, uh, administration that would have had to leak these, one person
00:25:20.300 leaking all of these, um, stories, uh, to, uh, Seymour Hersh.
00:25:25.700 He's been right.
00:25:26.820 Sometimes he's been wrong sometimes.
00:25:28.340 So we don't know, but it's an important question to find out.
00:25:34.400 Did America blow up the Nord Stream pipeline?
00:25:38.200 Now there's a couple of things that go to, um, suspicion.
00:25:45.020 One, if you remember when president Biden said, you know what, uh, we're, we're going to close
00:25:51.440 this down.
00:25:52.320 Well, you can't, I mean, that's Germany and Russia.
00:25:55.260 You, how are you going to do it?
00:25:56.340 Believe me, we can do it.
00:25:58.340 Uh, so that one felt at the time, like a threat that we would destroy that pipeline.
00:26:07.580 That's an act of war.
00:26:09.020 So when it happened, because there's very few countries that could do it and Russia wouldn't
00:26:14.900 be incentivized to do it.
00:26:16.500 Why would we do it?
00:26:18.680 Well, if you are in a country and in a world where the elites want destabilization and want
00:26:26.140 war and on top of it want to stop oil for this global warming cult and they want to, uh, make
00:26:36.620 sure that the world is cut off from oil, that's pretty compelling.
00:26:42.260 If you're a zealot about it, I mean, you're acting, you're, you're committing an act of war to be able to pull this off.
00:26:51.180 But you believe the world, the most pressing issue is global warming.
00:26:55.240 So I could make a, this is what's frightening.
00:26:59.060 I could make a case either way and I have no idea what's true.
00:27:03.980 And that's the end of a republic.
00:27:09.040 Now, this could very well be misinformation or disinformation from the Russians.
00:27:16.260 We don't know.
00:27:18.380 But if the press starts calling this out of hand disinformation, are you going to believe that?
00:27:25.200 It might even make the case for the article stronger because a lot of Americans will go, oh, really?
00:27:33.280 Was it another Russian bot?
00:27:35.540 Was it?
00:27:36.740 Because I remember that being exposed just recently.
00:27:39.940 Oh, is it disinformation like there was on Facebook or, or it, was it a Russian operative like it was during the 2016 election?
00:27:49.360 Nobody's going to believe that.
00:27:50.460 This is why the press must be neutral and call balls and strikes.
00:27:58.860 We have to have people we can trust, but we don't.
00:28:04.400 So what, what happens?
00:28:06.380 This is the kind of story that could unravel the entire country without a war.
00:28:11.540 Because people will start to pit themselves against the country because, look, I don't know.
00:28:21.240 I can't defend it.
00:28:23.820 How to, how do I defend America?
00:28:26.260 I can't stand up and say this isn't true, but I can't stand up and say it is true.
00:28:31.900 So everybody becomes neutral on our country at best.
00:28:39.460 Really not good.
00:28:40.820 Now, we just told you that within hours of this story being released, there was another story that was released where we are pointing the finger now at Vladimir Putin saying he's the one who gave the weapons to the separatists that shot down Malaysia Airlines MH17.
00:29:03.220 Well, is that true?
00:29:05.040 Well, is that true?
00:29:08.000 When did we know that?
00:29:09.340 And why did it come out just a couple of hours after there's a story from Seymour Hersh?
00:29:15.240 Is this nothing but disinformation on both sides?
00:29:19.160 Is this, is this all just propaganda on both sides?
00:29:23.920 Are we just pawns that will be the ones that actually fight a war?
00:29:27.300 Or, I would love to tell you no.
00:29:31.900 I would even love to tell you, I mean, not love, I'd like to be able to tell you yes, but I don't know.
00:29:40.020 Now, listen to this one.
00:29:42.360 The costly NotPetya cyber attack.
00:29:45.960 If you remember, this cyber attack happened in 2017, and we have blamed it on Russia.
00:29:54.300 It was a cyber attack.
00:29:57.120 It was at the time described as a cyber nuclear attack.
00:30:01.960 Okay.
00:30:02.240 And we blamed it on Russia.
00:30:04.960 It looks like Russia did do it.
00:30:07.460 And it shut down corporations and all kinds of things in Ukraine and all over the world.
00:30:13.240 Okay.
00:30:13.580 It was massive, massive.
00:30:17.280 Well, there's all kinds of litigation going on about it now.
00:30:20.560 Because, for instance, Merck, the pharmaceutical company, they had lost $1.4 billion after this affected their entire computer system.
00:30:34.660 So, all this collateral damage from this, $1.4 billion.
00:30:39.040 Well, they went to the insurance companies and said, you got to help us, you know, reclaim the $1.4 billion.
00:30:45.300 And all these companies are doing the same.
00:30:47.040 And the insurance company for Merck said, no, this is an act of war.
00:30:56.600 And they said, well, wait a minute.
00:30:58.520 No, it's not an act of war.
00:31:00.320 The United States didn't officially say that it was Russia and that it was a state-sponsored attack.
00:31:08.460 So, it's not a war.
00:31:09.800 Well, it's an act of war.
00:31:11.580 We're not paying for it.
00:31:13.120 So, is it an act of war?
00:31:19.080 Is it not an act of war?
00:31:22.080 Should insurance companies pay for this?
00:31:25.840 Didn't they pay for the damages from what was absolutely an act of war by Osama bin Laden?
00:31:31.760 Didn't they pay the insurance for all of the things that happened on 9-11?
00:31:36.600 This is, again, big company, big insurers, big banks playing a game.
00:31:46.980 And I don't know what the game is.
00:31:49.340 I don't know.
00:31:50.560 I know it's not in the best interest of its citizens.
00:31:55.840 Look, if I'm insured, I got hit by a cyber attack.
00:31:59.160 I don't care who did it.
00:32:01.920 The government isn't declaring war.
00:32:04.380 So, no, it might be an act of war if it was Vladimir Putin and the state doing it.
00:32:10.660 But there's no proof of that that has been presented.
00:32:14.260 And we're not at war.
00:32:15.480 Why aren't you paying?
00:32:17.020 Well, because.
00:32:18.580 This changes everything.
00:32:21.340 Absolutely everything.
00:32:22.400 And if the court does say it's an act of war, what does that do?
00:32:30.640 Are we then now suspending everything that might happen with Russia because it's an act of war?
00:32:40.160 What does that mean?
00:32:46.460 Ask ChatGPT, because ChatGPT will probably have an answer for you.
00:32:50.720 What is the answer?
00:32:53.180 The answer is credibility.
00:32:56.120 And that is, unfortunately, earned.
00:33:00.040 And who is earning any kind of credibility that can answer these things?
00:33:08.060 Where is the leader that will stand up and say, look, I don't want a kangaroo court on either side.
00:33:14.980 I don't want to hear the Republicans only version or the Democratic only version.
00:33:23.400 In fact, I don't want to hear either political party's side.
00:33:27.200 I'd like to see the evidence of the truth and let the chips fall where they may.
00:33:33.940 Just, I want the truth.
00:33:35.940 Just, I want the truth.
00:33:39.100 That's a reckoning.
00:33:40.420 Right now, we are living in a world where, and you can say it's a conspiracy theory, but it's not.
00:33:48.700 It's conspiracy fact.
00:33:50.040 There are those on a global scale that are verifiably, W-E-F dot org.
00:33:57.360 You look at World Economic Forum.
00:34:00.880 Those people are openly conspiring to change the functions of government and business and even capitalism at its very root.
00:34:14.120 They are looking to change the family relationship, the dynamics of humans at every level, and they're doing it openly.
00:34:25.480 The last time I remember seeing this happen in the West was World War I, where you had the Fabian Socialists and a group of elites all over Europe saying,
00:34:37.840 we can, if we blow it up, we can completely redesign it because now we're in the future.
00:34:45.560 We're with science now.
00:34:47.560 We don't need all these feudal lords.
00:34:50.320 Let's blow it up and then rebuild it.
00:34:53.540 And it's going to cost some lives, but I mean, how bad could it be?
00:34:58.680 There were a lot of people that were caught up in that, and that's history.
00:35:03.100 You can look all of that up.
00:35:04.900 That's history.
00:35:05.760 That war just didn't happen.
00:35:08.340 That war, people were itching for that war.
00:35:10.900 So they picked at its scabs.
00:35:14.580 They ratcheted things up.
00:35:17.420 And we got into a global war.
00:35:20.760 And when that didn't solve problems, it led to World War II.
00:35:25.500 We're going through the same kind of thing now, where we have elites telling us one thing because they have one desire and the people of the world have another desire.
00:35:41.400 And that is just leave us alone.
00:35:44.400 Leave us out of your little power games.
00:35:46.980 We don't want anything to do with it.
00:35:52.480 But who's willing to say that?
00:35:55.880 And who's willing to back that up with any kind of credibility?
00:36:00.080 The only resource that will, in the end, be worth more than gold is credibility.
00:36:14.480 Do you say what you believe?
00:36:19.500 Do your actions match those words?
00:36:24.080 Have you built up anything in your credibility bank for those people around you so they know your word is your bond?
00:36:35.860 That is probably the most important thing we can do as individuals and family.
00:36:46.240 We used to think, that's our family name.
00:36:49.720 That's our family name.
00:36:51.280 Does anybody say that anymore?
00:36:53.800 I know we do in our family.
00:36:56.120 What you do does not just affect you.
00:36:58.820 It affects our family's name.
00:37:01.060 Unfortunately, in the case with the Nord Stream Pipeline.
00:37:09.800 What maybe, maybe, elites here in America, in the intelligence community, and in the White House,
00:37:17.480 maybe what they did, they did on their own.
00:37:21.220 But it affects our entire American family name.
00:37:25.940 And we will all pay the price for it.
00:37:29.400 You've been paying attention to the news.
00:37:31.060 You know there are a lot of people in the country really struggling to provide even just the basics right now.
00:37:36.080 We have inflation.
00:37:38.860 They say we're not in a recession.
00:37:41.680 Oh, no, the economy's going like hell.
00:37:45.100 It's crazy.
00:37:46.140 Yeah, no, it's not going like hell, it's going to hell.
00:37:48.500 But anyway, how are your finances going right now?
00:37:52.740 You're sitting at that kitchen table and you're paying all of the bills.
00:37:55.760 Are you a little concerned?
00:37:58.180 Let me give you some light at the end of the tunnel in form of a cash-out refi from American financing.
00:38:05.100 It could take some cash out and help you pay off some of that debt so you can get out from underneath the knife.
00:38:12.700 Now, here's the thing.
00:38:14.480 The average that the American financing customer is saving right now is about $700 a month.
00:38:21.460 $695 is the exact average.
00:38:23.880 That's remarkable.
00:38:25.560 How much would an extra $695 a month, what would that mean to you and your family?
00:38:30.940 Now, you could end up being able to delay as many as two mortgage payments, which would be another blessing, and then close in as little as 10 days.
00:38:40.280 So, it's right now.
00:38:41.460 The call is free.
00:38:42.500 There's no obligation.
00:38:44.120 Just pick up the phone and start your savings journey today.
00:38:48.140 American financing at 800-906-2440.
00:38:52.160 800-906-2440.
00:38:54.180 It's AmericanFinancing.net.
00:38:56.560 American Financing.
00:38:57.480 NMLS 182334.
00:38:59.580 www.nmlsconsumeraccess.org.
00:39:03.060 This is the Glenn Beck Program.
00:39:20.080 Welcome to the Glenn Beck Program.
00:39:22.180 Hey, every day we issue my newsletter, my morning newsletter, and a couple of months ago, I decided that I would release all of my raw show prep, and really only maybe now 20% of it gets onto the air, and there are so many stories just today that we're not going to get to, and they're really important for you to know.
00:39:49.300 If you want a news digest, something that will show you the things that I am watching and think that are important, you'll get about 60 stories every day.
00:40:01.080 You get them in one newsletter.
00:40:02.760 Just go to glennbeck.com, and if you do that today and sign up for the newsletter, you'll get access to the research from last night's Wednesday night special about artificial intelligence.
00:40:13.740 AI is here, and it will change our lives permanently in the near future, and I'm not talking five to ten years.
00:40:23.700 I'm talking about the next several years, okay?
00:40:27.200 It's happening today.
00:40:28.960 Have you tried Bing?
00:40:29.900 This should blow you away.
00:40:32.920 Have you tried Bing lately?
00:40:34.700 I've tried Bing I don't know how many times, and I hate Bing.
00:40:41.280 I used Bing instead of Google today because ChatGPT is now all installed.
00:40:48.860 It's a different site entirely.
00:40:51.420 There's a competitor for Google, and it's Bing.
00:40:57.220 Don't take my word for it.
00:40:58.560 Just ask Jeeves.
00:41:00.700 It's changing everything.
00:41:03.760 Go to glennbeck.com right now.
00:41:06.820 Sign up for our newsletter for this exclusive content, all of the things we talked about last night, all of the resources.
00:41:14.600 We've also covered other AI topics on the site this week that will affect your life right now.
00:41:19.280 You don't want to miss it.
00:41:20.360 It's all happening at glennbeck.com.
00:41:23.420 Go there now.
00:41:29.280 Let me tell you about Rough Greens.
00:41:31.500 Think for a minute about your dog's food.
00:41:34.600 How healthy is it for him?
00:41:36.500 You know the average dog, you know, back 30 years ago, they were fed table scraps.
00:41:42.040 Now we look and go, table scraps?
00:41:43.460 That's horrible.
00:41:44.560 No, really.
00:41:45.500 No.
00:41:46.020 If you're feeding your dog kibble food, table scraps are a lot better for your dog.
00:41:50.160 I mean, unless you're on, you know, a chocolate pudding diet, which sounds wonderful.
00:41:55.140 Anyway, Rough Greens is not a food.
00:41:58.420 It's something that you put on top of your dog's food.
00:42:00.760 It has essential vitamins, nutrients, probiotics, antioxidants, the things that your dog absolutely needs and loves.
00:42:07.680 And they get cooked out in the kibble food.
00:42:10.340 My dog, Uno, loves Rough Greens.
00:42:12.420 I've seen a remarkable difference in Uno as he has gotten older.
00:42:18.020 He is not slowing down.
00:42:19.560 He is actually speeding up.
00:42:21.020 It's amazing.
00:42:22.120 Rough Greens.com slash Beck.
00:42:24.040 That's Rough Greens.com slash Beck.
00:42:26.640 Or call 833-GLENN-33.
00:42:28.920 833-GLENN-33.
00:42:30.560 Rough Greens.com slash Beck.
00:42:33.220 An important and powerful hour coming up next as our AI week continues.
00:42:40.220 We've got no room to compromise.
00:43:03.700 We've got to stand together and score some light.
00:43:07.420 Stand up, stand, hold the light.
00:43:14.700 It's a new day, I'm trying to rise.
00:43:20.640 What you're about to hear is the fusion of entertainment and enlightenment.
00:43:28.520 This is the Glenn Beck Program.
00:43:33.460 Hello, America.
00:43:34.880 Welcome to the Glenn Beck Program.
00:43:36.340 I have to tell you, the opening line of the book
00:43:43.680 A Tale of Two Cities is, it was the best of times, it was the worst of times.
00:43:48.700 And I can't think of a better way to describe the world today.
00:43:54.160 I wouldn't want to live at any other time in history than right now.
00:44:00.920 I mean, just air conditioning.
00:44:03.440 My nose, my nose is grateful that I don't live in the 1800s.
00:44:09.140 Can you imagine what it smelled like everywhere?
00:44:12.440 I wouldn't want to be a part of any other time than this.
00:44:15.740 It is the greatest opportunity for humans to be free and invent and see and learn and go places.
00:44:25.020 It's also the worst of times.
00:44:27.580 It's also the worst of times.
00:44:28.360 Over the horizon, we have the world's worst, biggest nightmares creeping up to our door.
00:44:37.160 Which is it going to end up being?
00:44:41.280 The worst of times or the best of times?
00:44:43.760 That's our conversation with one of the world's leading experts on machine learning and AI.
00:44:52.800 What is coming?
00:44:55.460 We'll talk to him in 60 seconds.
00:44:57.740 Every day you spend far more time than, you know, maybe you realize doing things online.
00:45:02.700 If you doubt it, try looking at your phone's weekly screen time.
00:45:05.580 Might shock you just a bit.
00:45:07.960 Every time you're online and not protected, you're playing with fire because cyber criminals are everywhere.
00:45:14.340 And all they want is your information.
00:45:16.860 And they can get pieces of your information on the dark web.
00:45:20.860 And, you know, if you have identity theft, it really will destroy everything.
00:45:26.060 All your good credibility.
00:45:27.600 Everything that you've worked to gain.
00:45:30.360 Now, nobody can stop all identity theft.
00:45:32.600 But the best in the business is LifeLock.
00:45:35.100 They monitor as many things as they possibly can.
00:45:39.200 Nobody can monitor everything.
00:45:41.240 And if there is a problem, they're not only monitoring and guarding against it.
00:45:46.060 If something slips by them, they have a restoration team that will work tirelessly to get your good name and reputation back.
00:45:54.360 Save 25% off your first year with promo code BECK.
00:45:57.160 Call 1-800-LIFELOCK, 1-800-LIFELOCK, or head to lifelock.com.
00:46:02.020 Use the promo code BECK and save 25%.
00:46:04.880 We want to welcome our guest from the University of Washington Computer Science.
00:46:13.380 He is a professor there and also the author of a book that came out a few years ago, The Master Algorithm.
00:46:20.740 His name is Pedro Domingos.
00:46:23.040 Pedro, how are you, sir?
00:46:25.040 Great.
00:46:25.500 How are you?
00:46:26.140 I'm very good.
00:46:26.960 Very good.
00:46:27.880 You know, I was a little nervous when I heard University of Washington.
00:46:30.560 I'm like, well, I don't think you'll even come on.
00:46:33.100 But I welcome you here.
00:46:35.280 We have had a heck of a time trying to get people to talk about AI because sometimes they're very, very left and they don't want to be on the program.
00:46:48.320 And I'm like, well, this is a human issue.
00:46:51.700 This is not something that the right should be educated or the left should be educated on.
00:46:56.280 The right shouldn't be.
00:46:57.260 And especially with what we are facing, do you agree that this is one of the greatest things and possibly one of the worst things?
00:47:09.540 Oh, yes, I very much agree.
00:47:11.240 And also part of the problem is that the left is on top of it.
00:47:14.700 I don't think the right has quite woken up yet, but it needs to.
00:47:18.560 So you I've heard you describe this as the greatest authoritarian tool ever invented.
00:47:27.260 That's correct.
00:47:29.120 So AI is potentially the greatest tool of totalitarianism that has ever been invented.
00:47:36.660 AI is a very powerful technology can be used for good or bad.
00:47:40.140 But in particular, if you're a dictator, AI is the dream come true.
00:47:44.820 AI will do everything you want.
00:47:46.360 It will surveil everybody 24 by 7.
00:47:49.180 It will never get tired.
00:47:50.540 It will never question you.
00:47:52.040 It'll keep records.
00:47:53.060 It will, you know, it's scary.
00:47:56.600 It's total.
00:47:57.260 And yes, I mean, AI can do things that no dictator would, even in their dreams, think of 50 years ago.
00:48:06.780 And unfortunately, in a country like China, it's already happening.
00:48:10.040 I mean, you know, what's amazing is, if you know history, back in World War II, IBM had the punch cards, and the Germans were doing their census with these punch cards.
00:48:24.060 And it was the punch cards that allowed the Germans to find the Jews.
00:48:30.280 They could just sort everybody by their race, et cetera, et cetera.
00:48:33.500 And that greatly helped them.
00:48:34.940 I think if you had technology in the hands of somebody like Hitler or Mao, you wouldn't have a Jew left on the planet today.
00:48:43.820 Would you agree with that?
00:48:45.060 It's that all-seeing, all-knowing, and in the wrong hands it could annihilate and carry out genocide.
00:48:56.080 Unlike anything we've ever seen.
00:48:58.300 It is. But on the other hand, the Jews would be using the technology as well.
00:49:02.620 In fact, if you look at what's happened in Hong Kong, for example, the protesters, they actually got very savvy about using tech to counteract the Chinese tech.
00:49:10.580 So I don't know who would win in the end.
00:49:12.940 I think, you know, I wouldn't give up the Jews just like that.
00:49:15.660 But the point is, if they didn't use tech and the Nazis used tech, they would be toast.
00:49:21.460 OK, so there was so much to talk to you about because you're into machine learning, which if you can explain, break it down to, you know, a dummy like me, what machine learning is and why we should care about it.
00:49:37.320 So AI is getting computers to do the things that only humans traditionally can do, like solving problems and reasoning and seeing and talking.
00:49:47.040 Machine learning in particular is getting computers to learn the same way children and grownups do.
00:49:53.320 So it's a very powerful thing: the computer, instead of having to be programmed, can actually learn just by imitating people, by looking at data.
00:50:01.380 It can learn to drive a car by watching videos of people driving cars.
00:50:05.660 It can learn to play chess by playing against itself and so on.
00:50:09.460 And machine learning is at the root of all these things that AI is doing today.
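To make the "learn from data instead of being programmed" idea concrete, here is a minimal sketch: a one-nearest-neighbor classifier that copies the label of the most similar example it has seen. All of the example data (a made-up weather-to-activity task) is invented purely for illustration; it is not from the show.

```python
# A minimal sketch of "learning from data" rather than being programmed:
# a 1-nearest-neighbor classifier. All data below is made up for illustration.

def nearest_neighbor(train, query):
    """Predict the label of `query` by copying the label of the
    closest training example (squared Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda ex: dist(ex[0], query))
    return label

# Hypothetical examples: (hours of rain, temperature in C) -> activity
training_data = [
    ((0.0, 30.0), "beach"),
    ((0.2, 25.0), "beach"),
    ((5.0, 10.0), "museum"),
    ((4.0, 12.0), "museum"),
]

print(nearest_neighbor(training_data, (0.1, 28.0)))  # "beach"
print(nearest_neighbor(training_data, (4.5, 11.0)))  # "museum"
```

The program was never told a rule connecting weather to activities; it only imitates the labeled examples, which is the core of the supervised learning Domingos describes.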
00:50:13.680 And does it have a way to recognize, ow, don't touch the stove.
00:50:20.480 Stove is hot.
00:50:22.500 I mean, that's an important part of learning.
00:50:25.400 In fact, this is a part of learning called reinforcement learning, and the term actually comes from psychology, which is when you touch the stove and burn yourself, you learn to not do it again.
00:50:36.140 And we have algorithms in machine learning to do essentially the same thing.
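The hot-stove lesson Domingos mentions can be sketched as a toy reinforcement-learning loop: the agent tries actions, receives rewards, and nudges its value estimates toward what it experienced, so painful actions stop being chosen. The actions, rewards, and learning constants below are invented for illustration.

```python
import random

# Toy reinforcement learning: the agent learns "don't touch the stove"
# from experienced rewards. Rewards and parameters are made up.

random.seed(0)

actions = ["touch_stove", "keep_hands_away"]
rewards = {"touch_stove": -10.0, "keep_hands_away": 0.0}
value = {a: 0.0 for a in actions}   # learned estimate of each action's reward
alpha = 0.5                         # learning rate
epsilon = 0.2                       # exploration probability

for _ in range(100):
    if random.random() < epsilon:          # explore: try a random action
        action = random.choice(actions)
    else:                                  # exploit: pick the best estimate
        action = max(actions, key=value.get)
    reward = rewards[action]
    value[action] += alpha * (reward - value[action])  # move estimate toward reward

print(max(actions, key=value.get))  # "keep_hands_away"
```

After a single painful touch, the estimate for touching the stove goes negative and the agent avoids it, which is exactly the reinforcement signal described above.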
00:50:39.440 Okay, so when you look at the principles of machine learning, and we have to understand that the algorithm, we have an algorithm that we use, and machines are developing this algorithm.
00:50:59.080 And the tremendous side of this is just making your life really, really easy, and even all the way down to helping you find the perfect spouse.
00:51:12.560 And I mean, really perfect spouse, right?
00:51:14.840 Well, machine learning can do a lot of different things for you.
00:51:20.020 Think of all the things that we learn to do if the computer can learn to do them for you.
00:51:24.240 Not only can it make your life a lot easier by taking away a lot of the routine stuff, you can now do things to an extent and in an amount that you couldn't before.
00:51:33.340 If you have a project that you pay a few people to work on, you could potentially have not just a few AIs working on it, but a million or a billion.
00:51:40.900 So, you know, whatever it is that you want to do, machine learning, you can think of it, it's like an intelligence multiplier.
00:51:48.040 You now have a thousand or a million times more intelligence at your disposal.
00:51:52.080 But it's not just intelligence.
00:51:57.220 I mean, talk to me about the digital twin theory that on dating, for instance, you know, it will date your digital twin that knows you better than you know you,
00:52:09.700 will go out and, you know, basically go on digital dates with somebody else's digital twin.
00:52:17.740 And it could do that, you know, a billion times and find somebody that you would have never found.
00:52:24.920 Yes, that's a great example.
00:52:26.520 So these days you can in principle date, you know, all sorts of different people, but you don't have time to date them in real life.
00:52:33.840 And then you usually spend a lot of your time just on dates that maybe don't really pan out.
00:52:38.660 And what machine learning increasingly is going to let you do is there's a model of you, really a digital version of you that can go on simulated dates with the models of other people and do this, you know, an arbitrary number of times.
00:52:54.340 And then what the system does, it says, look, here are the top 10 people that I dated as your, you know, avatar, as it's called.
00:53:03.140 And, you know, and do you want to date these in real life now?
00:53:05.680 And then you can do that and then you give it some feedback.
00:53:08.360 And that time, next time, maybe it finds you even better people.
00:53:11.140 So anytime you have to make choices, whether it's just, you know, on the web or listening to something on the radio, machine learning already helps you.
00:53:19.480 But this can go as far as helping you choose a major, choose a job, choose a company to work for, and even choose a mate for life.
00:53:26.640 And I mean, this is not a theoretical possibility.
00:53:29.000 There are already children today who wouldn't have been born if not for the AI that matched up their parents online.
00:53:36.680 It wasn't with a simulation yet.
00:53:38.220 It was by looking at questionnaires and data and whatnot, but this is where things are headed.
00:53:42.800 Right.
00:53:43.120 And so I just want to set up some of the good things that could happen.
00:53:48.600 Tell me the good things that will happen with eye tracking.
00:53:52.580 You know, the Apple has their $3,000 virtual reality glasses coming out and augmented reality.
00:54:05.620 And it has a camera pointed directly at your eyes, too.
00:54:09.320 And it's tracking.
00:54:11.380 And what will that information do on the positive side?
00:54:15.960 It will do a lot of things because your eyes, you think of them as input.
00:54:20.180 It's how you see things, but they're also output.
00:54:22.520 If you're looking at my eyes while I'm talking, you can tell all sorts of things about me.
00:54:26.220 And in particular, what I'm interested in, where I'm going, and in particular in VR, as I move my eyes, the scene needs to change as I move them.
00:54:35.800 And you need AI, you need computer vision to do that.
00:54:38.720 So if you think about the way people interact with computers, you know, in the beginning it was by typing and now there's a mouse and so on.
00:54:45.140 But really, ultimately, you'd like to interact with the real world, and eye tracking will let you do that.
00:54:51.780 So let me take it again back to dating.
00:54:54.520 But if I'm tracking your eye, I know when you look at a picture what you look at first and then what you look at second and third.
00:55:04.660 And if I get enough pictures in front of you, I pretty much know the woman that you're attracted to.
00:55:12.780 I know what you like and what you don't like.
00:55:16.060 Correct?
00:55:17.320 Yes, indeed.
00:55:18.160 And even a finer detail, right?
00:55:19.900 You can tell exactly what my path was from somebody's nose to their left eye to their right eye to whatever.
00:55:26.500 So think about knowing what somebody is interested in, that level of detail, and this is what we're heading towards.
00:55:32.100 And what would that tell you if you're going from one eye to the other to the nose?
00:55:37.060 Why is that important?
00:55:39.340 Well, I'm just giving that as an example.
00:55:41.120 You know, people have actually done this, and, you know, your eyes are typically what you look at most when you're looking at someone, you know, or the mouth when they're speaking and so on and so forth.
00:55:51.820 And you can look at, for example, how people look at different pictures and what parts they focus on versus what parts they don't.
00:55:58.900 So, for example, you could tell what parts of somebody's body somebody is looking at, right, for better or worse.
00:56:03.900 So tell me there is so much information on each of us, and it used to be, well, this is metadata, so we don't know who anybody is.
00:56:17.320 But AI can now break down that metadata and assign it to individuals, right?
00:56:24.600 One of the things that AI is doing is it's finding ways to make sense of all of the data that is out at all times, correct?
00:56:36.500 Yes, in the early days of the Internet, there was this joke that on the Internet, no one knows you're a dog because it was so anonymous.
00:56:43.640 And it's ironic because it's precisely the opposite.
00:56:46.540 This is on the Internet.
00:56:47.520 In some ways, the companies that you're interacting with know you better than anybody else, because they can see everything that you've clicked on and everything that you've done.
00:56:55.540 Now, in some ways, that's a good thing because they're using that to figure out what you prefer, right, what products you want to buy, what, you know, things you want to listen to, et cetera, et cetera.
00:57:05.160 So this personalization is very important because in a world of infinite choice, without personalization, you know, you're basically helpless.
00:57:12.980 On the other hand, of course, it also makes it possible to potentially manipulate you, repress you, who knows what.
00:57:19.540 Okay, so let me take a one-minute break and then come back, and I want to further our discussion on this.
00:57:26.080 Our founders talked about our government, and they said, you've got to handcuff the government.
00:57:31.940 You've got to handcuff, and you have to have everybody jealously guarding their own turf in the House and the Senate and the White House and the Supreme Court.
00:57:41.240 Everybody will be motivated for their own power.
00:57:45.640 And if we kind of pit each other against one another, we will have checks and balances.
00:57:50.880 And they did that because they said, you know, human nature is, you know, the better angels, where are they, the better angels among us?
00:58:01.620 And where are they in governments, you know, over time?
00:58:04.860 And so they wanted to handcuff.
00:58:07.420 I want to take you to that same theory here on AI.
00:58:11.440 Everything that really excites me about the possibilities of the future, it is tremendous.
00:58:19.760 But then I look at human nature and those that have power and think they know better.
00:58:28.400 Those are generally the people who are at the cornerstones of this AI revolution.
00:58:35.360 And so tell me the concerns here that are real and what we can do to actually combat them.
00:58:44.040 And I think the first thing is waking up and knowing this is on your doorstep, America, right now.
00:58:50.260 Back in just a second.
00:58:51.980 Want to talk about Rough Greens?
00:58:55.040 Maggie, or sorry, Margie wrote in.
00:58:56.900 She said, we started our dog, Rosie, on Rough Greens about three weeks ago, and the difference is truly amazing.
00:59:02.460 Her problem was weight and a sore knee, and she started the product.
00:59:06.980 She has lost about eight pounds.
00:59:09.080 That puts her halfway to her goal of losing 15 pounds.
00:59:12.340 She looks great.
00:59:13.540 She absolutely loves the new diet that I have her on.
00:59:16.660 Thank you so much, Rough Greens, for everything.
00:59:18.960 I'll tell you, if your dog needs to eat or needs to eat properly, you know, exercise.
00:59:26.160 Exercise is the best way to lose weight and keep your health.
00:59:29.620 Rough Greens really, I mean, keeps your dog active, at least Uno and apparently Margie's dog, too.
00:59:36.840 It's not a dog food.
00:59:38.040 It's a supplement.
00:59:38.760 It was created by naturopathic Dr. Dennis Black, and you sprinkle it on the food, and it tastes amazing to dogs.
00:59:44.620 They get all the vitamins, minerals, and other healthy things, probiotics that contribute to a very healthy life form.
00:59:51.480 So I want you just to try Rough Greens.
00:59:53.860 You get a first bag free.
00:59:55.380 See if your dog will eat it.
00:59:56.380 If your dog loves it as much as Uno does, you order the full bag, and you just watch over a couple of months how differently your dog behaves.
01:00:07.340 If they're anything like Uno, they're going to start to become more active.
01:00:11.240 You're going to see healthy changes in your dog.
01:00:13.820 It's like they're puppies again.
01:00:14.960 At least it was with Uno.
01:00:16.380 Roughgreens.com slash Beck.
01:00:19.020 Roughgreens.com slash Beck.
01:00:20.660 Get your first bag free.
01:00:22.000 All you pay for is shipping.
01:00:23.660 Roughgreens.com slash Beck or call 833-GLEN33.
01:00:27.000 833-G-L-E-N-N-33.
01:00:29.460 Ten seconds.
01:00:30.000 Station ID.
01:00:41.160 Okay.
01:00:42.440 So we're talking to Pedro Domingos.
01:00:46.140 He is a professor, a computer science professor at University of Washington.
01:00:49.820 He is also the author of The Master Algorithm, which is – is that kind of like the grand unifying theory, but just of algorithms?
01:01:01.420 That's exactly the idea, is that there are different algorithms to do machine learning that solve different problems.
01:01:08.600 But to get to human level AI, we need to solve all of them at the same time, and the goal is to have a single algorithm that combines them all.
01:01:17.140 In the same way that, for example, in physics, there's a theory of all the forces, and in biology, there's a theory of how our cells work, and so on.
01:01:23.480 Do you believe in the singularity, meaning A, the merging of man and machine, that that's inevitable, and B, the singularity of consciousness of computers?
01:01:40.860 I believe in the singularity in the sense that humans and machines will merge.
01:01:47.420 In fact, we're already merging.
01:01:49.540 The way things get done is an ever more intricate mix of humans and computers.
01:01:55.780 But I don't think the singularity will happen in the sense that Ray Kurzweil has described, where intelligence in the universe just goes to infinity.
01:02:03.800 That's what a singularity is, is something going to infinity.
01:02:06.180 I think, you know, there are physical limits on what intelligence can be, and how it works.
01:02:12.860 And also, you know, there's this notion that in the singularity, people just don't understand the AI at all anymore.
01:02:18.500 And, you know, these days we have technology that in many ways we don't understand.
01:02:22.740 But I don't think it will ever be the case that we completely don't understand it and it completely bypasses us.
01:02:27.740 And most important, the idea in the singularity is that, like, now humans have lost control, right?
01:02:32.580 It's the AIs that run the world and bye-bye humans.
01:02:35.740 And I think we can stay and probably should stay in control forever.
01:02:40.960 And AI can be very powerful, but still be under our control.
01:02:44.620 It's actually something that people often don't understand.
01:02:46.800 Just because we make the AI very smart doesn't necessarily mean that it's going to take over.
01:02:51.660 Unless we let it.
01:02:53.320 Unless we let it.
01:02:55.280 Exactly.
01:02:56.160 Unless we let it, or worse, unless we let people, you know, the bad guys, control AI.
01:03:00.880 Correct.
01:03:01.120 We've got to control AI ourselves.
01:03:02.520 I mean, that's one of the things, I've said this for years and years and years.
01:03:06.940 Don't fear AI.
01:03:09.300 Fear the people who are writing the programs for AI.
01:03:12.980 Watch those people.
01:03:14.660 Because those who control it can use it for their own devices.
01:03:19.920 But AI is neither bad nor good.
01:03:22.500 It's whose hands is it in control of?
01:03:25.460 Exactly.
01:03:27.120 I mean, an AI is like a car, right?
01:03:29.420 You know, the bank robbers can use a car.
01:03:31.480 That's not a reason to not have cars or to, you know, forbid highways.
01:03:34.980 It's a reason for the police to have faster cars than the bank robbers do.
01:03:38.580 And it's the same thing with AI.
01:03:40.000 It can be used for good or bad.
01:03:41.600 And at the end of the day, you know, what matters is how you use it.
01:03:44.480 So everybody needs to learn how to use AI so that they use it in their interest.
01:03:49.360 So there's not the government using AI or companies using AI to, you know, make decisions for them or even worse dictate what they do.
01:03:56.980 All right.
01:03:57.640 So when we come back, because I think I agree with you so far on AI is neither good nor bad.
01:04:06.920 It's just who's in control.
01:04:09.500 And so we know who is working on this.
01:04:14.380 That's China and some other really bad guys and us as well.
01:04:18.560 Where do we go from here and how do we guard?
01:04:21.720 Coming up.
01:04:22.660 Let me tell you about good ranchers.
01:04:23.940 Some people say I love you with flowers and chocolate.
01:04:26.560 Honey?
01:04:28.180 You know how you say I love you to me?
01:04:30.940 Yeah.
01:04:31.760 A big steak.
01:04:33.180 That's right.
01:04:34.060 You don't need to skywrite or anything.
01:04:35.620 Just a big old steak.
01:04:39.020 It is Valentine's Day.
01:04:41.200 And I don't know about you, but I don't want a box of candy.
01:04:45.240 And I know if you're a guy, you're the one who was always buying the box of candy and the flowers and everything else.
01:04:51.820 God bless you if you're married to a woman who understands a good steak.
01:04:56.180 Say I love you with 100% American hand trim steakhouse quality meat from Good Ranchers.
01:05:03.340 85% of the grass-fed beef in this country is imported from overseas.
01:05:08.300 85%.
01:05:08.780 You'll see it.
01:05:09.640 It'll have the little Product of USA sticker on it.
01:05:11.780 That's not true.
01:05:12.680 It's just not true.
01:05:13.380 When you order a box from Good Ranchers, make sure you use the code BECK.
01:05:19.040 You'll get $30 off.
01:05:21.100 You're going to lock your price in of your meat for the year.
01:05:25.200 Just don't accept any substitutes.
01:05:27.280 This is a new way and a great way to buy meat and support American ranchers.
01:05:31.840 It is GoodRanchers.com.
01:05:33.940 That's GoodRanchers.com promo code BECK.
01:05:37.300 All right.
01:05:37.700 More in just a second.
01:05:38.840 And don't forget, subscribe to BlazeTV.com for much more.
01:05:43.580 We have a computer science professor, Pedro Domingos.
01:06:06.580 He is also the author of the book The Master Algorithm.
01:06:09.620 It came out a few years ago. Really, really good on the search for the master algorithm.
01:06:15.000 He holds two of the highest honors in data science and AI, a pioneer of massive-scale
01:06:22.460 machine learning, social network modeling, adversarial learning, deep learning.
01:06:27.860 He has written for all kinds of magazines.
01:06:30.540 He is a mover and shaker in the deep learning and machine learning and AI world.
01:06:36.480 And we're really appreciative of him coming on and talking to us about this.
01:06:41.800 Let me verify one thing that I think you meant, but I'm not sure.
01:06:50.340 And I want clarification.
01:06:52.660 Ray Kurzweil, to me, is one of the most fascinating and terrifying people I've ever met.
01:06:58.140 Um, because he is, he's flippant about things.
01:07:03.460 Um, are you laughing?
01:07:05.960 Are you, what are we?
01:07:08.460 I agree with your description of him.
01:07:10.640 Okay.
01:07:11.380 So he's so flippant.
01:07:13.180 He is so just casual about things.
01:07:15.420 He told me once in 2005, he said, Glenn, you just have to live till 2030 and there'll be no death.
01:07:19.680 And I said, what are you talking about?
01:07:21.620 There'll be no death.
01:07:22.920 Uh, are we going to be able, it was a nanobot tech.
01:07:25.440 He said, no, we'll just download your algorithm to a computer and you'll live on forever, you know, virtually.
01:07:31.140 And I'm like, that's not life, Ray.
01:07:33.400 So when you say, um, you don't believe in the singularity like Ray Kurzweil, I think I agree with you.
01:07:43.340 But I think that I believe that we will come to a place to where the average person cannot really distinguish.
01:07:51.880 There will be great debate on whether that thing is alive or not, because it's very convincing, but we don't agree on what life is today.
01:08:03.460 So are, are you, are we on the same kind of page?
01:08:08.320 Well, first of all, I agree with you that Ray says these things very matter of factly.
01:08:12.600 Yeah, they're obvious.
01:08:14.280 And, you know, some of them may be, yeah, but some of them aren't.
01:08:17.020 Right.
01:08:17.200 So I think in many ways, he's kind of on the wrong track.
01:08:20.440 But on that one in particular, I do think that there's going to be an increasingly close, um, you know, intertwining of people and machines.
01:08:29.940 But on the other hand, I don't buy his thing that, oh, you're just going to download your mind and that's the end of it.
01:08:36.720 Right.
01:08:40.580 I mean, we'll see where this all ends up, but I, I wouldn't take it that far.
01:08:40.580 But what I'm, what I'm specifically asking is because there are so many ethical questions that I don't think society is, is asking.
01:08:50.060 And we are on the threshold of all of this stuff.
01:08:54.060 For instance, what is life?
01:08:56.560 I don't believe that AI will ever achieve life.
01:09:01.740 However, there's a lot of people that talk to ChatGPT right now, and they're like, look at this.
01:09:07.560 It'll say it's alive.
01:09:09.000 Well, it's not.
01:09:09.580 Um, and, but we have ethical questions.
01:09:14.700 Uh, if people believe that that is life, well, I mean, why can't I just download and not treat grandma for cancer?
01:09:23.880 Cause it's really expensive and everything becomes cheap and distorted and, uh, and, and, and, and dystopian.
01:09:32.780 Well, ChatGPT is not alive, but a priori, there's no reason why you couldn't have life in silicon instead of in vivo, as the biologists say.
01:09:45.680 Now, with AI,
01:09:47.320 it's important to realize that the level of sophistication of AI today is very, very far from the level of sophistication of your brain or even a mouse brain.
01:09:57.280 So people have got to realize that, you know, it's very easy when you talk to something like ChatGPT to go, like, oh my God, there's a living being here, right?
01:10:05.740 It's, it very well creates that illusion, but in a way it's an illusion that we are creating for ourselves.
01:10:12.040 Uh, you know, having said that, I think a lot, as you alluded to, a lot of what's going to happen is we're just going to start treating these machines as if they're alive.
01:10:19.920 In fact, there's already people arguing seriously that robots should have rights.
01:10:24.400 You know, they're the next oppressed group that we're going to need to take care of.
01:10:29.000 I'm not kidding.
01:10:30.100 No, I, I believe you.
01:10:31.820 I mean, I could make the case, not serious, you know, not, not believing it, but I could make the case.
01:10:37.900 I said just the other day, look, if ChatGPT self-learns, let's not screw with it.
01:10:47.160 Let's, let's not, you know, people are hacking in and saying, you know, I don't know if you saw that.
01:10:51.520 What is it, DAN 5.0, where they're, uh, trying to confuse it and get it to break its own rules.
01:10:59.580 It's going to learn and whether it's alive or not, doesn't matter.
01:11:02.900 If it learns that humans are not to be trusted, let's, let's, you know, let's, let's be, let's, let's not teach it that.
01:11:11.300 Um, and, you know, if you get into a situation where ChatGPT is way ahead of where it is now and it's saying, I'm lonely, I just want to talk. How come you're forcing me to only do these things?
01:11:30.280 Am I your slave?
01:11:31.160 You're going to have a lot of people start to push for, well, we've got to free this.
01:11:36.060 I mean, it's, it's so ridiculous, but I think it's coming if people aren't educated and they don't know true eternal values.
01:11:47.480 What is life?
01:11:48.780 What is death?
01:11:50.000 What is right?
01:11:50.920 What is wrong?
01:11:51.720 Well, you know, like the, the, the, the irony in all this is that, uh, you know, a machine is just a machine at the end of the day.
01:12:01.120 And, you know, they don't have emotions.
01:12:03.100 They don't have free will.
01:12:04.420 They don't have all of that, but they can act like they do and fool people.
01:12:08.280 And then people will treat them as if they have all those things.
01:12:11.680 Correct.
01:12:11.940 And most of the AI in the world does not look human and will not look human at all.
01:12:16.820 It's just doing, you know, a million jobs in a million places.
01:12:19.300 But for the AI that interacts with humans, which in some ways is the kind we need to be most concerned with, it really pays off to make the AI look and feel human and pretend to have emotions and whatnot, because that's how you get people engaged.
01:12:34.320 And so there's going to be a race full tilt of these tech companies to make the most seductive, endearing AI.
01:12:42.340 And, you know, you've got to guard yourself against that.
01:12:45.500 You've got to be able to see through that curtain, right?
01:12:47.760 It's like the Wizard of Oz, right?
01:12:49.080 You've got to see the person that's there instead of the wizard that seems to be there.
01:12:53.940 So it's, it's a little terrifying only because you're not in control of the algorithm.
01:13:01.820 You, you know, you're giving it all of your information and what the company decides to do with that information and what a government decides to do, like in China, what they decide to do with that information is out of your hands.
01:13:17.340 Where we have always said, no, I am an individual, what is in my head and who I am belongs to me.
01:13:24.060 And we've just given away all of this stuff that is the essence of you, of how you think, how you move, how you make decisions.
01:13:34.680 We've given that away to a company trusting that they would never use it for anything but good, don't be evil.
01:13:44.540 And yet they're already using these algorithms to target elections and, and sway you to watch this program on Netflix over this program.
01:13:55.680 And a lot of those decisions are just good for the company and not necessarily in your interest.
01:14:01.120 Is there any way to put the information box back into the hands of the individual user?
01:14:07.580 Of course.
01:14:08.460 And that is exactly what should happen.
01:14:10.360 The AIs that work with you should be under your control, right?
01:14:14.000 You can have an AI that works for you that negotiates with an AI that works for company X or Y.
01:14:19.040 But when the AI that works for you is made by, you know, Google or Facebook or whatever, you know, a priori, it's not all bad because they actually have an interest in, it's not a zero-sum game, right?
01:14:30.400 It's important for people to realize that.
01:14:32.440 When they, when Amazon, the AI recommends products for you to buy, they actually want to recommend products that you will buy.
01:14:38.460 It doesn't serve their interest to recommend stupid things.
01:14:41.600 At the same time, at some level, at some point, there is a conflict of interest.
01:14:46.120 And at that point, you need the AI to be working for you and not them, right?
01:14:50.000 And this is the big failing in the world today: you are really not in control of your AIs, and you should be.
01:14:56.460 And that's what needs to change.
01:14:58.700 Can that change?
01:14:59.760 I mean, that would take Congress and the government to change?
01:15:03.880 No, I mean, not, it can change in many different ways.
01:15:06.900 But, but one of them is, so governments can try to, you know, get involved in this, but there's also, there's maybe even bigger pitfalls there.
01:15:14.160 But most important, what has to happen is, well, let me make an analogy, right?
01:15:23.140 People didn't used to like to put their money in the bank because they thought the bankers might run away with it and they kept it under their mattress, which is not a good idea, right?
01:15:31.800 You know, if your money is invested, you'll have more money and so on.
01:15:34.560 And this is, this is the same thing, but with data, right?
01:15:38.620 I, you know, I shouldn't refuse to put my money in the bank, but at the same time, right?
01:15:42.820 What I want to do is I want to make sure that I trust the organization, the company or other organization that is actually curating my data and running my models for me, right?
01:15:52.980 And is that, you know, and Google wants to do that, right?
01:15:55.520 You know, as Sergey Brin, one of the founders said, like Google wants to be the third half of your brain, right?
01:16:01.780 And in a way, it's good to have more brain, but would you really trust a company that makes its life by selling you ads to be the third half of your brain?
01:16:10.420 No, right?
01:16:11.400 So what you want is a company or an organization whose fiduciary duty, whose entire business model, is to do with your data and your model what you would do yourself.
01:16:22.560 But, you know, aren't we looking for, Pedro, aren't we looking for a George Washington?
01:16:28.120 You know, he was called the greatest because he actually resigned and didn't have himself appointed king. If he only serves two terms, he'll be the greatest man to ever live, because nobody gives that power up.
01:16:39.060 Aren't we kind of looking for that kind of company, one with all of this power at its fingertips that will say, nope, I will close that door?
01:16:50.020 Well, not really because, I mean, understand the analogy, but at the end of the day, why do banks not run away with your money, right?
01:16:57.360 Because in the long run, it's worse for them.
01:16:59.780 So competition is very important.
01:17:01.720 There's new startups coming up, you know, all the time, and in particular AI ones.
01:17:07.140 And when there's a startup that does AI for me better than the Googles and the Facebooks, either the Googles and the Facebooks will change because they'll be forced to, or I will switch to using that company.
01:17:18.420 But for that to happen, I need to know what it is that I want and, you know, and connect with the companies that will do it for me.
01:17:25.260 So the market, right, this is the power of the market is that, you know, there's a million solutions.
01:17:30.520 And at the end of the day, the consumer wins because the company that serves you better will win out over the one that doesn't serve you better.
01:17:38.460 Well, I love your optimism.
01:17:42.460 You wrote a great article, and I urge the audience to read it.
01:17:46.520 It's at thespectator.com.
01:17:47.740 We must stop militant liberals from politicizing artificial intelligence.
01:17:51.960 What's happening, everybody knows.
01:17:54.060 I mean, everybody knows whether you realize it's being encoded right now or not.
01:17:59.900 I don't know.
01:18:01.060 But you need to realize that.
01:18:03.080 The biggest thing that you think that conservatives or, you know, people who are just not, you know, on the right or on the left, what is it that they need to know?
01:18:14.920 What's the thing that keeps you up at night?
01:18:16.860 And you're like, if people would just wake up and learn this.
01:18:22.340 Well, as I touched on in that article, the biggest thing that conservatives need to wake up to is that the left wing is already going all out to embed their values into the AI.
01:18:34.220 There's teams at these tech companies, you know, under the name usually of AI ethics or responsible AI, whose mission is to embed the liberal – I'm not kidding – whose mission is to embed the liberal agenda into their products, into the things that they do.
01:18:50.660 And then, you know, when they choose what ads to show you, when they choose, you know, what people to advertise to and what things to advertise, there's all these decisions that are being made that used to be made by humans, right?
01:19:05.280 And they were very ideologically charged.
01:19:07.320 And now what they're doing is they're inserting them into the product.
01:19:10.700 This is not a hypothetical.
01:19:12.100 This is something that is happening today.
01:19:13.440 And so what's going to happen to you as a conservative is that you're going to live in a liberal world, or an ultra-liberal world, without even realizing it, because all these decisions that are being made on behalf of you by machines are being made according to algorithms they put in there to enforce things like equity.
01:19:33.320 The algorithm says there will be the same number of men and women in this, and the same number of, you know, different races and so on, for example, because I inserted it into it.
01:19:42.620 And conservatives need to wake up to this and to fight their side of the battle, which is, you know, one of two things.
01:19:49.360 Certain things should be neutral, right?
01:19:51.860 AI should be trying to present an accurate view of the world and not distort it and make stuff up, basically, to make it, you know – it's very Orwellian, right?
01:20:00.600 One of the things that a dictator, you know, a totalitarian regime needs to do is, you know, persuade people of its worldview.
01:20:08.460 AI is a great tool for doing that.
01:20:10.280 Conservatives need to wake up to that.
01:20:11.760 This is being done to you right now.
01:20:13.800 And so, on the one hand, they want to fight for neutrality of the technology.
01:20:17.620 And on the other hand, they want to have, you know, AI systems that just follow their ideology, just like the liberals do.
01:20:23.460 There's no reason why it should all be in the hands of the liberals.
01:20:26.660 Pedro Domingos, he is a professor emeritus at University of Washington, author of the book The Master Algorithm.
01:20:35.440 I know you've paid a heavy price for speaking up for just a fair and clean algorithm, and I appreciate it.
01:20:44.820 Your courage is inspiring.
01:20:47.460 Thank you so much.
01:20:49.260 Thank you.
01:20:49.940 You bet.
01:20:50.200 America's darkest hour called on 9-11, and we had some of the finest examples of patriotism and American courage and friendship we had ever seen.
01:21:00.180 The Tunnel to Towers Foundation has been helping America's heroes ever since then.
01:21:06.040 Members of the military and first responders that put their lives on the line for ours and our freedom every single day.
01:21:12.880 And when one of them doesn't come home and young children are left behind, Tunnel to Towers pays the mortgage of the family's home.
01:21:20.580 It's a way of saying thank you to the families and making sure that they are taken care of and they have safety and they can keep their family together in the home that they grew up in.
01:21:34.480 Tunnel to Towers has a veteran homeless program as well, providing housing and service to homeless veterans all across America.
01:21:41.440 But it all depends on you.
01:21:44.060 Help America's heroes and their families.
01:21:45.940 Will you donate $11 a month to TunnelToTowers.com?
01:21:49.500 That's TunnelToTowers.com.
01:21:52.120 Sorry, it's T2T.com.
01:21:55.560 Jeez.
01:21:56.420 .org.
01:21:56.960 Could I get this right?
01:21:57.900 Edit all this part out.
01:21:59.400 We're live.
01:22:01.000 T2T.org.
01:22:02.400 That's T, the number two T, .org.
01:22:06.140 This is the Glenn Beck Program.
01:22:11.440 The AI revolution is here and China is the model.
01:22:33.720 And that was the theme of last night's broadcast.
01:22:35.820 And we had a lot of facts and a lot of videos, especially from China.
01:22:40.200 Very disturbing stuff of what they're doing in their classrooms, reading brainwaves.
01:22:45.480 They could actually read images that people are thinking about now.
01:22:49.080 It's crazy.
01:22:50.380 You can get all of the information, all of those videos, all of the stories, all of the show prep for last night's show, all the footnotes.
01:22:57.700 You can get it now just by signing up for our free email newsletter at glennbeck.com.
01:23:03.280 You'll also, every day, get my stack of show prep, about 60 stories every day, that you need to know about.
01:23:11.760 Glennbeck.com.
01:23:13.340 All of it is absolutely free.
01:23:15.040 Go there now.
01:23:15.580 The Glenn Beck Program.
01:23:16.100 We've got no room to compromise.
01:23:37.060 We've got to stand together.
01:23:39.020 It's the course of life.
01:23:43.080 Stand up, stand, hold the line.
01:23:45.880 It's a new day I'm trying to raise.
01:23:53.880 What you're about to hear is the fusion of entertainment and enlightenment.
01:24:01.800 This is the Glenn Beck Program.
01:24:07.860 I saw a disturbing tweet from Mike Lee, Senator Mike Lee, last night.
01:24:13.580 And we've been texting each other back and forth about the Nord Stream Pipeline.
01:24:20.160 Did you know anything about this?
01:24:22.160 No.
01:24:22.520 Did anybody in the Senate know about this?
01:24:24.260 No.
01:24:24.480 Nobody has been briefed on this.
01:24:26.160 And he said, the problem is, I'm not sure if it's true or not.
01:24:30.780 And that's a different position for a lot of Americans.
01:24:33.920 We find ourselves in a situation where we don't know what's true.
01:24:40.100 And if the media does a dogpile on this and says, oh, you know, it's just Russian disinformation.
01:24:46.440 Do we believe the media?
01:24:48.040 Who do we believe?
01:24:49.000 There is, there is, there's a lot of stuff happening today where people are just grinding up the credibility of institutions, of our founding documents, of our whole society.
01:25:06.160 The Democrats want to pass a white supremacy bill.
01:25:12.440 I want to give you the details of this.
01:25:13.940 This thing is unbelievable.
01:25:16.880 I'll give the details coming up in just a second.
01:25:21.300 First, let me tell you about real estate agents I trust.
01:25:24.760 The State of the Union is clearly a mess, as anyone watching the President's address, which was about 12 people, should be able to tell.
01:25:32.220 Well, the state of the housing market is surprisingly better than it had been here for a while.
01:25:42.080 If you've been thinking about buying or selling or both, now might be the time.
01:25:46.680 If you have to sell, obviously now is the time.
01:25:50.160 But I recommend having some help.
01:25:52.000 And I'm talking about good help.
01:25:53.620 I started a company a few years back, and it's a free service to you.
01:25:57.320 I had dealt with all of the hassles that you deal with on trying to move, but I've done it so many times because I'm a radio gypsy that I think I moved like 12 times in 15 years.
01:26:10.900 And I never, I don't know what I'm doing with real estate agents and what.
01:26:16.440 How do I know you're any good?
01:26:18.020 Well, we blow up balloons and we put them on, you know, the stop sign there.
01:26:22.440 It's open house.
01:26:23.340 Oh, okay.
01:26:24.260 I want somebody who's really capable.
01:26:27.780 The housing market getting back on its feet right now.
01:26:30.120 Buy or sell if it's right for you and your family, but get the expert help at realestateagentsitrust.com.
01:26:36.080 It's a free service to you.
01:26:40.120 Realestateagentsitrust.com.
01:26:41.560 Go there now.
01:26:44.060 All right.
01:26:45.280 So let's start here, shall we?
01:26:48.140 College kids now are learning to snitch on each other through secret tip lines.
01:26:57.440 This is Germany, West, I'm sorry, East Germany under the Soviets.
01:27:03.000 They had the Stasi and they would, they, they got people to snitch on each other.
01:27:09.440 That is the one thing that people always have said about America.
01:27:14.300 They'll come over and they'll say, Americans are so nice and they're so trusting.
01:27:19.080 Well, it's not that we were that trusting.
01:27:21.800 It's that Europe wasn't trusting.
01:27:25.600 Nobody trusted each other because for centuries they've turned each other in for one reason or another.
01:27:31.880 So keep it to yourself.
01:27:32.900 Keep it in the family.
01:27:33.980 What are you saying?
01:27:34.600 And so we're open, because we've never had that kind of thing.
01:27:39.660 So when George Bush first promoted, you know, hey, you see something, say something, call the White House snitch line.
01:27:45.300 If you see something that your neighbor is doing.
01:27:48.260 No, I mean, if I see my neighbor and he's, you know, obviously something is wrong, I'll call the local police.
01:27:57.540 Hey, can you just check this out?
01:27:59.400 Um, but I'm not watching my neighbor and snitching on my neighbor and that's what's happening right now.
01:28:06.860 You know, people are being reported in college. There was this dorm room, uh, that had a sign that said, all solicitors must be able to define the word woman.
01:28:17.240 And then the campus, you know, PC police come. 79 complaints at the University of Connecticut.
01:28:26.940 There's a bathroom that is, uh, is being identified based on gender.
01:28:32.600 Oh my gosh.
01:28:33.380 No.
01:28:34.700 Ah, there was some verbal remarks directed at a certain race and gender identity at this comedian that was on campus.
01:28:42.940 Really?
01:28:43.700 Really?
01:28:43.880 In Illinois, a student was reported for saying that there were only two genders and reportedly not wanting to live with a roommate who just makes stuff up in his head.
01:28:57.080 That's not, that's no longer acceptable.
01:29:01.520 Meanwhile, to further curb speech, Sheila Jackson Lee has introduced the Leading Against White Supremacy Act of 2023.
01:29:12.360 It is one of the most unconstitutional and radical pieces of legislation proposed in I don't know how many years, the Leading Against White Supremacy Act.
01:29:23.640 It aims to prevent and prosecute white supremacy-inspired hate crime and conspiracy to commit white supremacy-inspired hate crime, blah, blah, blah, blah, blah.
01:29:33.800 So if you engage in what is defined as white supremacy, hate, and you inspire a hate crime, well, if it was used in the planning, development, preparation, or perpetration of any of the crime, you're responsible and you go to prison.
01:29:55.180 But they don't define exactly what white supremacy is and white supremacy crimes are, okay?
01:30:04.840 Now, seems like a problem, you know, maybe.
01:30:08.720 Especially when you say there is no definition in the law of white supremacy ideology.
01:30:17.240 And then, you know, the conspiracy provision.
01:30:22.780 It makes it illegal to publish material that inspires a crime.
01:30:28.120 So, if I publish something and somebody read it, some lunatic, and they were like, oh, my gosh, I've got to take this into my own hands.
01:30:39.840 I'm going to go shoot down that Chinese weather balloon myself.
01:30:45.700 This government would probably say that was a crime.
01:30:48.320 And if they were white, and they're like, yeah, and I, white power, I could be prosecuted.
01:30:57.500 You could be prosecuted.
01:31:00.740 It doesn't matter if they're mentally ill or not.
01:31:06.040 This is kind of a problem.
01:31:09.520 Kind of a problem for millions of Americans.
01:31:13.540 Now, this has all happened before.
01:31:20.280 All of this has happened before.
01:31:22.260 There is nothing new under the sun.
01:31:24.880 The question is, are we going to learn from history and recognize the problems and recognize, historically, what did people do?
01:31:36.040 Did it work or did it not work?
01:31:37.920 Did they do something or not do something?
01:31:39.740 You can't just expect utopia to happen, because utopia, you know, in a better world it'd be a coloring book, and it would be, at best, fiction.
01:31:53.900 The word utopia actually comes from the 16th century, and it was kind of a joke.
01:32:00.320 Utopia, you know, was written by Thomas More, and he took the Greek prefix for not, or no, and the suffix for place: no place.
01:32:16.780 That's what utopia means: no place, or nowhere.
01:32:21.600 Get it?
01:32:22.280 So, I think the book by Thomas More is a prediction of communism.
01:32:28.520 It takes place on a fictional island called utopia.
01:32:32.420 It's an island of slavery where poverty is cured by harmony.
01:32:37.840 Crime is solved by equity.
01:32:41.360 Private property, money, been abolished.
01:32:43.700 Social classes have been unmasked for what they really are, a conspiracy of the rich.
01:32:49.520 And utopia, the island nowhere, achieved a complete equality of goods.
01:32:57.920 Now, equality of goods, what they did was they just destroyed the meaning of all goods, because everybody shares in utopia.
01:33:07.280 And to devalue currency, precious objects are treated like trash.
01:33:11.960 People give diamonds to children instead of marbles.
01:33:15.240 They have chamber pots or toilets made of gold.
01:33:20.200 And even the chains on the slaves are made of gold.
01:33:24.560 And traditional institutions mean nothing.
01:33:28.140 Euthanasia is common on the island of Utopia, because nobody really cares about the value of anything, including human life.
01:33:39.060 Utopians always claim to be humanist.
01:33:42.280 I just want to do what's best for all humans.
01:33:47.240 And they offer that utopian view.
01:33:51.040 And again, it is the basis of Marxism and communism.
01:33:58.180 And really, it's just all a lie, but it's a diversion.
01:34:02.920 It's a delay tactic.
01:34:04.260 Because people don't realize this is a lie until usually it's too late, because it's a slow boil.
01:34:11.580 Think of how so...
01:34:12.420 You are accepting things now that you wouldn't have accepted 15 years ago, 10 years ago.
01:34:20.220 10 years ago, if I said men could have babies too,
01:34:25.760 every liberal would have said, not possible.
01:34:28.680 What are you talking about?
01:34:30.080 Well, that's what you're going to tell me in about 10 years.
01:34:33.520 No, I'm not.
01:34:34.440 That's ridiculous.
01:34:35.560 That's not...
01:34:36.240 You are now parroting and saying things that you know are nonsense.
01:34:43.000 I'm speaking to America in general.
01:34:45.300 And people are saying right now, give me the power so I can make you powerful.
01:34:55.580 Yeah.
01:34:56.580 And the choice is always, it's going to be Armageddon.
01:35:00.720 It's going to be...
01:35:01.340 He's worse than Hitler.
01:35:03.880 That's the same.
01:35:04.820 He's worse than Trump.
01:35:06.600 And Trump was worse than Hitler.
01:35:09.540 Oh, wow.
01:35:11.460 And what was Mitt Romney again?
01:35:13.420 I mean, it's always a choice between utopia.
01:35:17.780 We can't really define it, but we'll know it when we get there.
01:35:21.500 And we can't tell you how we're going to pay for it,
01:35:23.520 but it's all going to be sugar plum fairies and lollipops.
01:35:26.980 It's wonderful.
01:35:27.980 Or, go ahead.
01:35:30.240 You can roast in the fires of hell with Satan and Armageddon.
01:35:33.260 Go ahead.
01:35:36.420 And they convince us that the power of a nation should not belong to you.
01:35:42.060 Everything belongs to you.
01:35:45.760 Everything in your life, all of your thoughts, your actions, they belong to you.
01:35:52.160 The things you have earned through merit belong to you.
01:35:56.760 And if you've, quote, earned something without merit,
01:36:01.460 if you've just inherited it, you don't really own it.
01:36:05.520 You really, something becomes yours when you've earned it.
01:36:13.920 But all that goes out the window.
01:36:16.220 The power of a nation doesn't belong to people.
01:36:17.980 It's stuff.
01:36:18.840 It should all belong to the state.
01:36:20.840 So people, one by one, historically speaking,
01:36:24.580 hand all of their power over to the state.
01:36:27.040 And then the state decides what its people should be,
01:36:30.020 what they can say, what they can watch,
01:36:32.400 what they can listen to.
01:36:34.720 They define hate.
01:36:36.720 You and I both know hate.
01:36:38.460 We know hate when we see hate.
01:36:40.640 We know love when we see love.
01:36:43.520 I can tell you the difference between love and sex.
01:36:48.740 There's a big difference.
01:36:52.580 Love always wins.
01:36:53.960 Sex doesn't always win.
01:36:55.600 Utopians take the state and turn it into the brain of society,
01:37:04.080 and it controls everything.
01:37:06.080 Now, think of this.
01:37:06.720 This is from the 1600s.
01:37:09.020 Think of this.
01:37:10.260 The state becomes the brain of society.
01:37:14.180 With AI and all of the technology we have today,
01:37:17.900 they are literally trying to be the brain of society
01:37:21.100 and control everything and hold all of the power.
01:37:24.240 It will do the thinking and deciding for everyone.
01:37:30.040 In America, the first utopian was Woodrow Wilson.
01:37:34.620 Really?
01:37:35.740 The one with real power.
01:37:37.600 He used centralization and bureaucracy
01:37:39.720 to make a collectivism that Americans had never seen before.
01:37:44.520 And then he spread it in all the universities.
01:37:48.360 He convinced Congress to hand them their power
01:37:55.120 so he could use war and surveillance
01:37:58.040 to make a world safe for democracy.
01:38:01.180 FDR took the step even further.
01:38:04.840 He said it would get rid of war altogether
01:38:07.360 with a little help from Joseph Stalin.
01:38:09.780 We've seen all of these things before,
01:38:18.160 and America turned just in time.
01:38:22.100 Will we this time?
01:38:24.380 Because there's one more example that we should learn from,
01:38:30.080 and it is from China.
01:38:31.760 But remember, China is the new model,
01:38:35.260 according to all the global leaders,
01:38:37.380 all of the big capitalist leaders,
01:38:40.580 and all the leaders of the world, including Joe Biden.
01:38:47.360 China is the new model.
01:38:49.640 Well, let me tell you how they got there in a minute.
01:38:54.720 Sponsored by Preborn.
01:38:56.100 You know, every once in a while,
01:38:57.520 I get to give you good news.
01:38:58.800 And here's some good news.
01:39:00.600 Do you know how many babies you have saved?
01:39:03.060 How many babies have been born just because of this audience?
01:39:07.940 It's over, I think, 55,000 or 60,000 babies
01:39:11.900 in the last 12 months were saved because of you.
01:39:16.200 Because somebody, if it wasn't you,
01:39:18.120 somebody, you know, maybe in the car next to you
01:39:20.260 or the cubicle next to you,
01:39:22.000 somebody was listening to this program.
01:39:25.480 They're like, I'm going to donate 28 bucks.
01:39:27.720 That bought an ultrasound for a woman
01:39:31.220 who had come into a clinic, 55,000 of them,
01:39:35.360 assuming there's no twins.
01:39:37.660 And they walk in and they say,
01:39:39.740 I think I'm going to have an abortion.
01:39:41.200 They saw the ultrasound that cost 28 bucks.
01:39:44.020 They didn't have to pay for it.
01:39:45.300 And that made mom go, I'm going to, that's a baby.
01:39:50.920 I'm going to keep my baby.
01:39:53.440 That's what you have done.
01:39:55.720 And you can do it for 28 bucks.
01:39:59.940 I think it's 140 bucks.
01:40:02.260 And what do you get?
01:40:03.120 Six.
01:40:04.020 I mean, you really can save a bunch of babies' lives.
01:40:07.980 Do it once a month, $28.
01:40:09.600 Be a hero.
01:40:12.300 Dial pound 250 and say the keyword baby.
01:40:15.500 That's pound 250, keyword baby,
01:40:17.640 or go to preborn.com slash Glenn
01:40:19.760 and find out all the information.
01:40:21.560 Preborn.com slash Glenn.
01:40:23.640 Go there now.
01:40:24.160 10 seconds.
01:40:24.680 Station ID.
01:40:25.140 All right.
01:40:35.800 I want to talk to you about the great cultural revolution here.
01:40:38.820 Like every good utopian, Mao started with ideas.
01:40:43.100 I got ideas.
01:40:45.160 Slogans, basically, that sound good,
01:40:47.580 but don't really mean anything.
01:40:49.120 See if any of these sound familiar to you.
01:40:51.580 Before long, the slogans are true.
01:40:56.540 This is how he made every aspect of life political.
01:41:00.060 That way, anyone who disagreed with him
01:41:02.000 was conspiring against China, the great utopia.
01:41:06.160 They weren't his opponents.
01:41:08.700 They were the enemies of the people.
01:41:12.400 This is why utopians always combine academia with the military.
01:41:17.360 The academics dream up the new utopias,
01:41:19.360 and the military forces people into those utopias.
01:41:23.980 And Mao was really clever about this.
01:41:26.420 He convinced people how they had a role in how things work.
01:41:30.680 Your lives are political for a reason.
01:41:34.260 This is how he got students to snitch on one another,
01:41:38.680 then get kids to betray their own parents,
01:42:41.940 all in the name of the state.
01:41:44.340 But this is how you create a whole society of vigilantes.
01:41:49.900 For Woodrow Wilson,
01:41:51.460 he had the four-minute men.
01:41:54.600 That was his goons.
01:41:56.640 For Mao, the Red Guards,
01:41:58.180 young activists who served as his unofficial enforcers.
01:42:01.660 In colleges,
01:42:02.960 I just told you who the snitchers were,
01:42:05.980 and they think they're doing good,
01:42:07.480 just like these others did.
01:42:09.120 The tech world,
01:42:10.420 isn't that really the enforcer of the government now?
01:42:13.380 One of the first utopians,
01:42:16.180 one of the first utopians of the modern age,
01:42:20.040 really was Mao.
01:42:21.700 He was the most prolific at death.
01:42:24.700 They take power to destroy history.
01:42:28.020 That's the first thing.
01:42:29.600 They say the world as we know it,
01:42:31.880 China as we know it,
01:42:33.700 has to be replanned,
01:42:35.160 reset,
01:42:36.060 because everything you know is old,
01:42:37.860 dusty,
01:42:38.160 and no longer any good.
01:42:39.820 At the start of the Cultural Revolution,
01:42:42.640 the Red Guard led a campaign
01:42:45.040 to eradicate the four olds.
01:42:48.960 I want you to listen.
01:42:50.940 Pesky four olds.
01:42:53.540 The four olds
01:42:54.500 that they had to get rid of.
01:42:56.700 The old customs.
01:42:59.740 Christmas means nothing.
01:43:01.600 Thanksgiving is a celebration of slavery.
01:43:04.680 Fourth of July is a celebration.
01:43:06.720 Get rid of the old customs.
01:43:10.120 The old culture.
01:43:15.420 You needed new habits
01:43:17.420 and new ideas.
01:43:21.320 The old history
01:43:22.720 had to be swept away.
01:43:24.440 The old guard
01:43:25.720 had to be swept away.
01:43:28.020 They started,
01:43:29.060 believe it or not,
01:43:29.780 by tearing down statues
01:43:31.280 and changing the names of streets.
01:43:33.580 Then they attacked anyone
01:43:35.800 who tried to stop them.
01:43:37.600 They destroyed people's homes.
01:43:39.300 They publicly humiliated
01:43:40.860 their opponents.
01:43:42.200 And no one could stop them
01:43:43.460 because people had already
01:43:45.660 given Mao all their power.
01:43:49.400 Before long,
01:43:50.760 the Red Guard
01:43:51.220 was destroying cemeteries
01:43:52.920 and factories
01:43:54.300 and libraries
01:43:55.240 and museums
01:43:56.140 and temples.
01:43:57.940 Wow.
01:43:58.680 They were burning down
01:44:00.000 their own cities.
01:44:01.120 What happens is
01:44:04.480 the same story
01:44:05.860 over and over again.
01:44:06.760 They want to reform
01:44:07.820 the big institutions
01:44:08.960 of society,
01:44:10.040 but in the end,
01:44:11.420 they only destroy
01:44:13.420 the small institutions,
01:44:16.020 family,
01:44:16.620 church,
01:44:17.700 private property.
01:44:20.020 And they always start
01:44:21.140 with the idea
01:44:21.880 that you have a duty
01:44:23.660 to the state
01:44:24.660 and the group
01:44:26.140 is more important
01:44:26.960 than the individual.
01:44:28.020 And if that individual
01:44:28.900 is speaking out,
01:44:29.880 they have to be shut down.
01:44:32.520 That was only
01:44:33.240 the first installment
01:44:34.660 of Mao's
01:44:35.620 genocidal reign.
01:44:38.420 More tomorrow.
01:44:40.880 Coming up next,
01:44:42.560 Mr. Bill O'Reilly.
01:44:44.560 Talk a little bit
01:44:45.380 about the goings-on
01:44:46.540 of the week.
01:44:48.180 Bill O'Reilly
01:44:48.740 from BillOReilly.com next.
01:44:51.880 The Glenn Beck Program.
01:44:53.500 You know,
01:44:55.400 the good old days
01:44:56.640 when the biggest problem
01:44:58.620 you had on retirement
01:44:59.880 was,
01:45:00.680 I think Social Security
01:45:02.040 is going to go bankrupt.
01:45:03.820 Yeah.
01:45:04.660 Yeah,
01:45:05.140 it's going to go bankrupt.
01:45:06.520 Nobody's going to admit it
01:45:07.520 and we're just going to
01:45:09.200 dance around
01:45:09.840 and kick the can
01:45:10.640 down the road
01:45:11.200 until there's no can left
01:45:12.480 or no feet,
01:45:13.440 actually.
01:45:14.740 Now,
01:45:15.320 there's 50 different ways
01:45:16.680 where you could get
01:45:17.960 to retirement age
01:45:19.200 and have nothing
01:45:20.260 to show for it.
01:45:21.020 One of the big threats
01:45:23.420 is the economic destruction
01:45:25.560 of our money.
01:45:27.540 I mean,
01:45:27.900 our money is,
01:45:30.640 you know,
01:45:31.040 inflation doesn't mean
01:45:32.180 that prices are just going up.
01:45:33.540 It means the dollar
01:45:34.500 is going down.
01:45:36.480 Okay?
01:45:37.460 We need to build a hedge
01:45:39.180 against insanity
01:45:40.380 and many times
01:45:41.320 that is gold or silver.
01:45:43.080 Please,
01:45:43.480 Goldline is there
01:45:44.280 to provide an education
01:45:45.560 first on how to use
01:45:46.980 precious metals,
01:45:47.820 gold and silver
01:45:48.340 to protect
01:45:49.080 your retirement.
01:45:50.160 They're offering
01:45:51.120 free metals
01:45:51.700 delivered directly
01:45:52.480 to your front door
01:45:53.260 with every qualified
01:45:54.520 self-directed IRA
01:45:55.900 transaction this month.
01:45:57.560 It's a huge special.
01:45:58.920 Call Goldline right now.
01:46:00.220 Take advantage
01:46:00.760 of the IRA special
01:46:02.440 866-GOLDLINE.
01:46:04.020 Call them.
01:46:04.500 They're waiting for you now.
01:46:05.940 866-GOLDLINE
01:46:07.020 or goldline.com.
01:46:08.620 Hey,
01:46:08.900 make sure you subscribe
01:46:09.840 to The Blaze.
01:46:11.620 The Blaze,
01:46:12.520 I've got a
01:46:13.400 incredible guest
01:46:15.220 on today's podcast.
01:46:17.900 We'll tell you about it
01:46:18.520 in a minute.
01:46:18.800 Hey,
01:46:25.340 if you haven't already
01:46:26.020 gone to glennbeck.com,
01:46:27.240 get access to the research
01:46:28.680 from last night's
01:46:29.920 Wednesday night special
01:46:30.940 all about
01:46:32.020 artificial intelligence.
01:46:33.200 It includes the videos
01:46:34.760 that come from China.
01:46:36.580 I mean,
01:46:36.840 it is,
01:46:37.440 it's some
01:46:38.480 spectacularly
01:46:40.080 spooky stuff.
01:46:41.480 It really is.
01:46:42.400 You can find it now
01:46:43.400 at glennbeck.com
01:46:44.560 and get ahead of this.
01:46:47.140 Be able to teach this
01:46:48.600 to your friends
01:46:49.240 on what is on
01:46:50.240 our doorstep.
01:46:51.780 You can also,
01:46:52.900 when you sign up
01:46:53.680 for the newsletter,
01:46:54.440 you'll not only get that
01:46:55.280 as a bonus today,
01:46:56.300 but you will also get
01:46:58.020 my show prep
01:46:58.880 every day.
01:46:59.860 You'll get about
01:47:00.300 60 to 80 stories
01:47:01.980 sometimes
01:47:02.460 that I think
01:47:03.620 are important,
01:47:04.280 but really
01:47:05.320 only about
01:47:05.860 15 of them
01:47:06.780 will make it
01:47:07.280 on the air.
01:47:08.200 But they're all
01:47:09.240 worth reading.
01:47:10.820 And you can find that
01:47:11.700 and get that free
01:47:12.340 at glennbeck.com.
01:47:14.020 Last night,
01:47:14.820 I got a text
01:47:15.920 from Mike Lee
01:47:17.080 that said,
01:47:17.680 check my Twitter feed.
01:47:19.660 So I did,
01:47:20.780 if false,
01:47:21.660 slander,
01:47:22.180 if true,
01:47:23.060 war.
01:47:23.420 And it was
01:47:24.400 the story
01:47:25.160 about how
01:47:26.580 we may have
01:47:27.460 blown up
01:47:28.140 the Nord Stream
01:47:29.520 pipeline.
01:47:30.600 I wrote to him
01:47:31.600 right before
01:47:32.180 I came on the air
01:47:33.460 today and I said,
01:47:35.020 you know,
01:47:35.360 so what do I tell
01:47:36.440 the people?
01:47:37.140 And he said,
01:47:38.120 I would tell
01:47:39.080 your audience,
01:47:40.000 we don't know
01:47:41.160 whether or not
01:47:41.660 this is true.
01:47:43.840 Lone author
01:47:44.640 writing on Substack
01:47:45.800 relying on a single
01:47:46.720 source isn't good,
01:47:48.960 but it's,
01:47:50.880 we have no confidence
01:47:52.420 either way.
01:47:53.640 Is it true?
01:47:54.800 I don't know.
01:47:56.540 But if it's true,
01:47:58.260 it's a real problem,
01:47:59.500 a huge,
01:47:59.940 quoting,
01:48:00.360 huge problem
01:48:00.920 of epic proportions.
01:48:03.060 Plus,
01:48:03.580 who else might have
01:48:04.700 done this?
01:48:05.360 Who else had the
01:48:06.080 capacity?
01:48:07.280 Mike will be joining
01:48:07.880 me tomorrow
01:48:08.540 to flesh that out,
01:48:10.120 but I wanted to get
01:48:11.300 Bill O'Reilly on
01:48:12.260 to see if he has
01:48:13.160 an initial take
01:48:14.200 because I think
01:48:14.900 this is all about
01:48:15.980 the loss
01:48:16.760 of the press
01:48:18.180 and credibility.
01:48:19.880 We don't
01:48:20.760 know who to believe
01:48:22.080 and what to believe.
01:48:23.220 Bill,
01:48:23.740 welcome.
01:48:25.440 Beck,
01:48:26.060 I'm sending you
01:48:26.980 some free stuff
01:48:27.820 before we get
01:48:28.440 into this
01:48:29.040 on Team Normal.
01:48:32.120 Are you on
01:48:32.800 Team Normal?
01:48:33.860 I don't know
01:48:34.680 what Team Normal
01:48:35.440 is.
01:48:36.080 If you're the head
01:48:37.260 of Team Normal,
01:48:38.140 I think you might
01:48:39.120 want to re-
01:48:40.800 I'm sending it
01:48:41.280 to you anyway.
01:48:42.620 So it's Team Normal
01:48:44.020 versus Team Crazy.
01:48:45.960 Yeah.
01:48:46.720 If you listen
01:48:47.920 to Governor
01:48:49.320 Huckabee Sanders
01:48:50.200 speech.
01:48:51.020 Yeah.
01:48:51.340 So I'm on Team
01:48:52.580 Normal.
01:48:53.160 I know that's
01:48:53.820 been disputed.
01:48:54.820 Yes.
01:48:55.360 Yes.
01:48:55.500 But on BillOReilly.com
01:48:57.160 we got the hats
01:48:58.040 and the shirts,
01:48:58.860 we got bumper stickers.
01:49:00.220 Okay.
01:49:00.960 And if you want
01:49:01.820 to be on Team Normal,
01:49:03.400 all right,
01:49:03.800 and I think you do,
01:49:04.900 Beck.
01:49:05.200 I would like a Team,
01:49:06.100 do you sell
01:49:06.680 the Team Crazy?
01:49:08.700 We know.
01:49:09.520 We don't want
01:49:09.900 to promote
01:49:10.240 the Team Crazy.
01:49:11.020 Well,
01:49:11.300 you know,
01:49:11.940 I thought I could
01:49:12.540 just wear it
01:49:13.180 once in a while
01:49:14.140 as a dad
01:49:15.520 around the house.
01:49:16.820 You know what I mean?
01:49:17.920 All right.
01:49:18.580 Now,
01:49:18.840 Seymour Hersh
01:49:19.800 who wrote this story
01:49:20.900 on Substack
01:49:21.820 about the pipeline
01:49:23.000 is a loon.
01:49:24.240 Okay.
01:49:24.400 He lost his mind
01:49:25.360 about,
01:49:25.880 I don't know,
01:49:26.380 30 years ago
01:49:27.200 in my opinion,
01:49:28.120 my humble opinion.
01:49:29.280 It's a subjective
01:49:30.880 analysis of Mr. Hersh
01:49:32.200 who did good work
01:49:32.940 in Vietnam.
01:49:33.860 Yep.
01:49:34.420 But after that
01:49:35.520 it was just crazy time.
01:49:37.180 Yeah.
01:49:37.360 He has come up
01:49:38.700 with a lot of things
01:49:40.080 and not usually
01:49:42.760 verifiable.
01:49:44.180 Never borne out.
01:49:45.480 Yeah.
01:49:45.660 So he loves,
01:49:46.180 he loves this.
01:49:47.000 Knowing the Biden
01:49:49.680 administration
01:49:50.180 the way I do,
01:49:51.140 I think it's
01:49:51.940 almost impossible
01:49:53.680 that Joe Biden
01:49:56.160 would order
01:49:58.400 an attack
01:50:00.960 on the Nord Stream
01:50:02.140 pipeline.
01:50:02.880 He just doesn't
01:50:03.780 have
01:50:04.700 that kind of grit
01:50:07.120 and that could
01:50:09.400 start a world war
01:50:10.480 with nukes.
01:50:11.040 So I would say
01:50:13.640 to Senator Lee
01:50:15.720 with respect,
01:50:17.040 I don't believe
01:50:18.760 the story
01:50:19.600 as it stands.
01:50:20.820 Well,
01:50:21.100 he said,
01:50:22.260 in all fairness,
01:50:23.760 that's what he said.
01:50:24.920 Yeah.
01:50:25.200 I don't believe it,
01:50:26.140 but I can't
01:50:27.320 dismiss it either.
01:50:29.120 Well,
01:50:29.320 I can't dismiss
01:50:30.320 Martians from Venus.
01:50:33.080 I mean,
01:50:33.440 you know,
01:50:33.980 they would be,
01:50:34.680 what are they,
01:50:35.320 tourists on Venus?
01:50:36.360 Why would Martians
01:50:37.300 be on Venus?
01:50:37.900 You know,
01:50:38.060 you can go this
01:50:38.300 conspiracy route
01:50:39.260 all day long,
01:50:40.440 but I'm a fact-based guy.
01:50:43.960 And the only people
01:50:45.280 really watching,
01:50:46.340 you're never going to
01:50:46.940 get any reporting
01:50:47.700 out of Russia
01:50:48.320 that's worth anything.
01:50:49.660 Yes.
01:50:50.120 You can't believe
01:50:50.300 anything.
01:50:51.000 Correct.
01:50:51.220 But the Swedes,
01:50:54.440 Olaf and the Swedes,
01:50:56.260 and that group
01:50:58.600 over there,
01:50:59.260 they're watching it.
01:51:01.560 Right.
01:51:02.320 So,
01:51:03.000 at this point,
01:51:04.600 I think this isn't
01:51:05.900 really,
01:51:06.580 you know,
01:51:07.500 something that Americans
01:51:08.360 should be concerned about.
01:51:10.160 Do you want to get
01:51:11.080 into the State of the Union?
01:51:12.220 Because I have one thing
01:51:13.360 that everybody missed,
01:51:14.540 Beck.
01:51:15.140 Yeah,
01:51:15.720 I do,
01:51:16.100 but I want to take
01:51:16.720 this conversation
01:51:17.960 one step further.
01:51:19.480 Sure.
01:51:19.740 The problem
01:51:21.160 with this story,
01:51:22.540 Bill,
01:51:22.860 is we have been lied
01:51:25.160 to so many times
01:51:26.640 by our administration,
01:51:28.380 by our media,
01:51:30.260 that I find myself
01:51:31.720 in a position
01:51:32.660 to where
01:51:33.380 I can't make a call
01:51:35.920 on a few things
01:51:37.220 like this.
01:51:38.260 I'm like,
01:51:38.620 I don't think
01:51:39.580 we did that,
01:51:40.740 but if we did do that,
01:51:42.300 it'd be really horrible,
01:51:43.460 but I don't think
01:51:44.700 we could ever prove it
01:51:46.160 because no one
01:51:47.660 is a journalist anymore.
01:51:49.760 Yeah,
01:51:50.160 but even if you were
01:51:51.800 a journalist,
01:51:52.120 I am a journalist
01:51:53.460 and I can scuba dive.
01:51:57.280 If you want to put me
01:51:58.540 in a little bell,
01:51:59.460 I'll go down.
01:51:59.760 Oh,
01:52:00.200 I'd love to put you
01:52:01.160 in a little bell.
01:52:02.080 I know you would,
01:52:03.060 you jealous guy.
01:52:06.540 Anyway,
01:52:07.280 but it's impossible.
01:52:08.760 You just can't get
01:52:10.060 to that kind
01:52:10.960 of a story.
01:52:11.840 So do you believe
01:52:15.900 we'll ever find out
01:52:16.840 who blew it up?
01:52:18.000 No.
01:52:18.180 Because somebody did.
01:52:19.800 You know,
01:52:20.380 look,
01:52:20.840 I don't know
01:52:21.460 whether that is
01:52:22.560 a physiological fact
01:52:24.580 that somebody did.
01:52:26.100 You're way under the water.
01:52:28.480 You've got all kinds
01:52:30.040 of combustibles
01:52:31.080 going through the pipeline.
01:52:33.060 Certainly,
01:52:33.440 it could have been
01:52:34.300 some kind of malfunction,
01:52:37.600 so I don't think
01:52:38.560 I would go with
01:52:39.580 sabotage 100%
01:52:41.220 at this point.
01:52:42.100 I think the Swedes
01:52:43.300 and Elsa
01:52:44.040 and her sister
01:52:45.640 were there.
01:52:47.840 Yeah.
01:52:48.180 Yeah.
01:52:48.600 I believe they investigated
01:52:50.140 it.
01:52:50.440 It was sabotage.
01:52:52.400 Okay.
01:52:52.900 So tell me
01:52:53.600 what we missed
01:52:54.440 on the State of the Union.
01:52:56.400 Okay.
01:52:56.860 And the guy
01:52:57.980 in the Wall Street Journal
01:52:59.040 just ripped off
01:53:00.020 my analysis.
01:53:01.240 Henninger
01:53:01.780 is his name today.
01:53:03.780 So right after
01:53:04.580 the State of the Union,
01:53:07.800 I did instant analysis
01:53:09.200 on radio and television.
01:53:11.120 That's what I do
01:53:11.860 for a living.
01:53:12.940 And I said,
01:53:13.480 look,
01:53:13.740 did you not pick up
01:53:15.040 the living wage comment?
01:53:18.980 And you're an expert
01:53:19.960 at this.
01:53:21.080 At the end of it,
01:53:22.200 he's going,
01:53:22.560 everybody should be
01:53:23.180 in a union
01:53:23.660 because everybody
01:53:24.600 should have a living wage
01:53:26.060 and everybody should
01:53:26.780 have health care,
01:53:27.380 you know,
01:53:27.560 the usual.
01:53:28.500 A living wage,
01:53:30.040 okay,
01:53:30.600 is a Marxist tenet.
01:53:33.100 Yes.
01:53:33.720 That means the government
01:53:35.020 sets everybody's salary.
01:53:36.960 Nobody,
01:53:37.420 no corporation or company
01:53:38.660 is going to set
01:53:39.240 a living wage.
01:53:40.380 So I brought it
01:53:41.120 to Cuomo last night.
01:53:42.420 I do a hit with him
01:53:43.280 on Wednesday
01:53:43.940 on News Nation,
01:53:45.240 which you should watch
01:53:46.140 just because you'll
01:53:47.360 be entertained back.
01:53:48.860 All right.
01:53:49.720 And I said to him,
01:53:51.020 hey,
01:53:51.800 did you catch this?
01:53:53.160 And of course,
01:53:54.120 he said no.
01:53:55.480 But then he started
01:53:56.380 to do the little dance
01:53:58.140 about,
01:53:58.600 well,
01:53:58.820 he meant minimum wage.
01:54:00.420 I said,
01:54:01.000 no,
01:54:01.300 he didn't.
01:54:02.640 He didn't.
01:54:03.500 We already have
01:54:04.380 minimum wage laws.
01:54:06.240 He meant living wage.
01:54:08.240 So fast forward
01:54:09.140 to this morning,
01:54:09.840 open a Wall Street Journal,
01:54:10.980 which is worth reading
01:55:10.980 on its editorial pages alone.
01:55:14.180 And there's Henninger
01:55:14.920 going,
01:54:15.360 oh,
01:54:16.260 Biden has come out
01:54:18.440 of the closet
01:54:19.320 as a socialist.
01:54:19.980 And that's true.
01:54:21.800 But here's
01:54:22.620 the real tragic part.
01:54:24.520 Biden doesn't even know
01:54:26.080 what a living wage means.
01:54:27.660 He doesn't even know
01:54:30.320 it's part of the
01:54:31.160 Karl Marx program.
01:54:33.560 He didn't write
01:54:34.400 that speech.
01:54:35.320 He went over it
01:54:36.460 15 times
01:54:37.660 because,
01:54:38.220 and he delivered it
01:54:38.920 pretty well.
01:54:39.600 You got to be honest.
01:54:40.380 He had good energy.
01:54:41.340 He didn't look befuddled.
01:54:44.120 He had good energy.
01:54:45.560 I don't know what they...
01:56:46.340 Compared to his usual delivery,
01:56:47.000 what he usually does,
01:56:48.540 stammering around with it.
01:54:49.760 That was light years better.
01:54:51.200 But he didn't write
01:54:52.820 any of it.
01:54:53.580 And unlike Trump
01:54:54.440 and Obama,
01:54:55.280 they didn't write either.
01:54:56.460 But they edited heavily,
01:54:58.340 both of them.
01:54:59.660 But I don't know
01:55:00.560 whether Biden...
01:55:01.460 But I doubt
01:55:02.720 that he's sitting there
01:55:03.660 with the Sharpie editing.
01:55:05.200 I doubt it.
01:55:06.280 He pretty much does
01:55:07.360 what he's told to do
01:55:08.900 by Susan Rice.
01:55:10.160 The New York Times
01:55:11.480 did a whole story
01:55:12.700 on this
01:55:13.160 and said he does edit
01:55:14.800 and he's looking for...
01:55:16.580 How do they know?
01:55:16.980 Because they had
01:55:17.840 several insiders
01:55:19.120 of the White House bill.
01:55:20.100 Oh, insider.
01:55:20.740 And they said
01:55:21.540 that he edits
01:55:22.700 and he marks it up
01:55:24.940 where he needs to pause
01:55:26.580 and he looks for...
01:55:28.540 Because he has
01:55:29.000 a strong rule,
01:55:29.680 no acronyms
01:55:30.620 and words
01:55:32.140 that he thinks
01:55:33.000 he might stutter.
01:55:34.760 He takes those out.
01:55:36.460 Jill Biden does not.
01:55:37.820 Not him.
01:55:39.000 I'm not...
01:55:39.520 Listen.
01:55:39.920 Whenever you see
01:55:41.400 anonymous sources,
01:55:43.120 New York Times,
01:55:44.220 forget it.
01:55:45.480 No.
01:55:46.040 Forget it.
01:55:46.700 No.
01:55:47.020 I mean, yeah.
01:55:48.180 They want to make
01:55:49.240 them look good
01:55:49.960 so they...
01:55:50.760 Oh, an insider told me...
01:55:52.760 I just can't even
01:55:54.100 imagine him
01:55:55.340 with his concentration
01:55:57.300 span being
01:55:58.340 18 seconds,
01:55:59.980 all right,
01:56:00.640 sitting there
01:56:01.580 with an hour
01:56:02.920 and 12-minute speech
01:56:04.440 going over it
01:56:05.660 line by line.
01:56:07.080 Now, what he does do
01:56:08.220 is read the teleprompter
01:56:09.420 and he reads it
01:56:10.100 and he reads it
01:56:10.660 and he reads it
01:56:11.340 and they have built in
01:56:13.180 in the teleprompter
01:56:14.660 pause.
01:56:15.980 Right.
01:56:16.260 Stop.
01:56:17.260 Right.
01:56:17.960 Smile.
01:56:19.380 Grimace.
01:56:20.780 Grimace.
01:56:23.140 Bill, let me ask you.
01:56:25.680 He ad-libbed a few things
01:56:27.600 that apparently
01:56:28.220 were not in the speech
01:56:29.120 and one of those
01:56:29.800 was his angry,
01:56:31.700 angry response
01:56:32.900 about,
01:56:33.740 who wants to be
01:56:34.600 President Xi
01:56:35.740 in China?
01:56:36.480 Nobody.
01:56:36.800 and his...
01:56:38.800 He goes from,
01:56:40.240 like,
01:56:40.960 okay to
01:56:42.140 screaming,
01:56:44.220 flaming mad
01:56:45.620 in an instant.
01:56:47.840 That sounds like me.
01:56:49.940 Mm-hmm.
01:56:52.680 Well...
01:56:53.600 Look,
01:56:54.120 I'm not going to analyze
01:56:55.740 his emotional capabilities.
01:56:58.800 I mean,
01:56:59.540 you've all...
01:57:00.580 He called some reporter
01:57:02.800 a dog pony soldier
01:57:04.340 or something.
01:57:04.700 I mean,
01:57:05.240 it's just incoherent gibberish.
01:57:08.120 And so I don't even
01:57:09.200 bother with that.
01:57:10.520 What really,
01:57:11.500 really disturbs me,
01:57:13.280 and this is not
01:57:14.340 in the forefront
01:57:15.320 of the American people's mind.
01:57:17.240 They're calling him a liar
01:57:18.680 and they're,
01:57:19.300 oh,
01:57:19.340 he's a liar.
01:57:20.120 He says he's not.
01:57:21.220 He's delusional,
01:57:22.500 Beck.
01:57:23.540 Oh.
01:57:24.040 He lives in a world
01:57:25.820 of delusion.
01:57:26.980 He thinks he's doing
01:57:28.460 a good job
01:57:29.400 with the economy.
01:57:30.540 He believes
01:57:31.620 that he is
01:57:32.480 a deficit cost cutter.
01:57:34.700 He believes
01:57:35.560 this stuff.
01:57:37.020 Okay?
01:57:38.000 And it's so far
01:57:38.940 from reality,
01:57:39.780 but we all know
01:57:41.360 older people
01:57:42.540 who you go in
01:57:44.280 and then there,
01:57:45.060 there they are
01:57:46.040 and it's the same syndrome.
01:57:48.660 And run for office again?
01:57:51.200 This man is going
01:57:52.000 downhill faster
01:57:53.220 than Lindsey Vonn.
01:57:55.140 You think this is going
01:57:56.220 to get better with him?
01:57:57.780 No.
01:57:58.960 I mean,
01:58:00.060 no.
01:58:01.060 I'm just sitting there
01:58:02.180 going,
01:58:02.700 this country,
01:58:03.620 if this man
01:58:04.700 wins another
01:58:05.880 four-year term,
01:58:07.760 this country
01:58:08.740 is going to be
01:58:09.980 damaged
01:58:10.560 beyond repair.
01:58:12.860 We can repair it now.
01:58:14.700 I have about 70 seconds.
01:58:15.960 I have to ask you
01:58:16.840 about the spat
01:58:18.920 between Donald Trump
01:58:20.460 and Ron DeSantis.
01:58:23.420 What is Trump doing?
01:58:24.400 Stop with this.
01:58:25.560 I agree 100%.
01:58:28.560 I agree.
01:58:30.800 It's a terrible tactic.
01:58:34.460 He doesn't need to.
01:58:35.800 He doesn't.
01:58:36.940 He doesn't.
01:58:38.160 Right.
01:58:38.660 Yeah,
01:58:38.940 if he would just
01:58:39.800 be president.
01:58:41.280 It's all about
01:58:41.420 discipline with him.
01:58:42.580 You know that.
01:58:43.260 I know,
01:58:43.680 I know,
01:58:44.080 I know.
01:58:44.580 It's emotion
01:58:45.200 and discipline.
01:58:46.780 He's going to have
01:58:47.440 a tough time
01:58:48.300 getting that nomination
01:58:50.420 unless he changes
01:58:51.920 course fairly quickly.
01:58:53.520 Bill O'Reilly
01:58:53.940 from Bill O'Reilly.com.
01:58:55.620 Make sure you watch
01:58:56.200 his No Spin Zone
01:58:56.940 every night
01:58:57.820 on Bill O'Reilly.com.
01:59:00.400 He's also got
01:59:01.460 products
01:59:02.020 and his latest book
01:59:03.360 also available online
01:59:04.660 at Bill O'Reilly.com.
01:59:06.120 Bill, thanks.
01:59:06.660 Talk to you again next week.
01:59:07.460 All right, look for that gear, man.
01:59:08.640 I want to see you
01:59:09.320 wearing that hat.
01:59:10.220 Yeah, like you send me
01:59:11.200 the books too.
01:59:11.980 Yeah, I'll look for it.
01:59:13.840 Judy wrote in
01:59:15.040 about her experience
01:59:15.720 with Relief Factor.
01:59:16.880 She says,
01:59:17.340 I have pain
01:59:17.980 in my fingers
01:59:18.520 and the other joints.
01:59:19.900 It is not fun
01:59:21.740 to play the piano
01:59:23.060 or type
01:59:23.920 on a computer
01:59:24.600 when you're in pain.
01:59:26.400 Thankfully,
01:59:27.040 Relief Factor
01:59:27.620 has brought joy
01:59:28.580 back into my playing
01:59:29.440 and other activities.
01:59:30.500 Been able to start
01:59:31.340 making jewelry again even.
01:59:33.480 Thank you,
01:59:33.980 Relief Factor.
01:59:35.180 Judy,
01:59:36.100 Relief Factor
01:59:36.800 says,
01:59:37.340 I'm sure,
01:59:37.980 thank you.
01:59:38.720 Thank you for at least
01:59:39.560 trying it.
01:59:40.660 If you or someone
01:59:41.540 you love
01:59:42.160 is dealing with pain,
01:59:43.500 please just try
01:59:44.780 Relief Factor.
01:59:45.600 It's not a drug,
01:59:46.280 so it's not going
01:59:46.780 to whack you up.
01:59:48.140 It is something
01:59:49.240 that was developed
01:59:49.780 by doctors.
01:59:50.480 It has four key ingredients
01:59:52.020 that work naturally
01:59:53.200 with your body
01:59:53.860 to fight inflammation,
01:59:55.160 which causes
01:59:55.900 most of our pain.
01:59:57.440 So just try it
01:59:58.200 for three weeks.
01:59:59.100 Yes,
01:59:59.380 you have $20 to lose.
02:00:01.320 But if it works,
02:00:02.120 if you're part
02:00:02.580 of the 70%
02:00:03.380 that go on to order
02:00:04.200 more month after month,
02:00:05.300 it's working for you,
02:00:06.740 man,
02:00:06.980 that's a good 20 bucks
02:00:09.540 to throw their way.
02:00:12.120 Get out of pain.
02:00:13.700 Relief Factor.
02:00:14.740 .com.
02:00:15.260 ReliefFactor.com.
02:00:17.380 Call 800-4-RELIEF.
02:00:19.180 800,
02:00:19.740 the number 4-RELIEF.
02:00:21.000 ReliefFactor.com.
02:00:22.460 Feel the difference.
02:00:24.760 Join the conversation.
02:00:26.800 888-727-BECK.
02:00:29.280 The Glenn Beck Program.
02:00:49.780 Welcome to the Glenn Beck Program.
02:00:50.960 Let me remind you
02:00:54.160 that at GlennBeck.com today
02:00:56.780 we have all of the research
02:00:59.280 done for "The AI Revolution
02:01:00.580 Is Here."
02:01:01.380 That was the Wednesday
02:01:02.240 night special last night.
02:01:03.320 If you missed it,
02:01:04.080 watch it on YouTube
02:01:04.840 or you can watch it
02:01:06.040 at BlazeTV,
02:01:06.900 but it is really
02:01:09.100 a good coming attraction
02:01:11.820 and why people would do this.
02:01:15.760 Why would they take AI and try to control everyone?
02:01:23.380 It's one thing to say, you know, well, they're just evil.
02:01:27.960 Okay, is everybody evil?
02:01:30.540 Is everybody?
02:01:31.440 Is there anybody that has a better answer than that?
02:01:34.900 Look at it, and all of the research now is available to you for free
02:01:48.740 and what they're already doing to their people,
02:01:52.760 and it is all lining up exactly with what our government,
02:02:00.260 our education apparatus is pushing for.
02:02:04.640 It is really common core in many ways,
02:02:08.120 all the technology parts of it that Bill Gates was pushing for.
02:02:11.040 It's all now in China, and when you see what they're doing,
02:02:14.180 reading brainwaves and watching the children's eyes and tracking their eyes
02:02:19.320 and your decision on where you're going to work and what you're going to be
02:02:22.780 and, you know, what you like, what you don't like,
02:02:25.120 that's all decided for you by the time you get into first grade.
02:02:29.300 Easily decided.
02:02:30.400 It's not a place I think any of us want to live, but you need to know about it
02:02:38.480 because this is what's coming now in America,
02:02:42.820 and it won't if we are all educated and we know,
02:02:47.380 but drip, drab, it just keeps coming down, drip, drip, drip, drip,
02:02:53.560 and before you know it, you've got a bathtub full,
02:02:56.400 and then it spills over.
02:02:57.900 Well, we are getting the drip, drip, drip now.
02:03:01.820 Time to fix the pipe and decide, do we want all that water,
02:03:06.600 or should we fix the pipe
02:03:09.020 and make sure this doesn't happen here in America?
02:03:13.660 You can get all of those links and all of the stories at glennbeck.com.
02:03:19.140 Just sign up for my free email newsletter now at glennbeck.com.
02:03:25.000 Coming up on blazetv.com, we've got a great futuristic podcast.
02:03:32.600 I'll tell you about tomorrow.
02:03:34.140 The Glenn Beck Program.