The Glenn Beck Program - December 07, 2018


All In for A.I.? | Guests: Bill O’Reilly & Matt Kibbe | 12/7/18


Episode Stats

Length

1 hour and 50 minutes

Words per Minute

160.7

Word Count

17,813

Sentence Count

1,738

Misogynist Sentences

11

Hate Speech Sentences

17


Summary

A turning point in history! Google DeepMind's AI program AlphaZero has been showing human-like intuition and is on track to become the first artificial general intelligence (AGI) program to be employed by the U.S. government.


Transcript

00:00:00.000 The Blaze Radio Network, on demand.
00:00:08.200 All right, let me tell you quickly about Relief Factor.
00:00:10.760 We're running really late.
00:00:11.700 Relief Factor has changed my life, truly changed my life.
00:00:15.820 I was in pain.
00:00:17.240 I'm in much less pain now.
00:00:20.040 I can actually function every day because I take Relief Factor.
00:00:24.720 It reduces inflammation.
00:00:26.080 Try it for three weeks like I did.
00:00:27.600 If it doesn't work, you're out 20 bucks.
00:00:29.060 If it does, you get your life back like I did.
00:00:32.360 Go to ReliefFactor.com.
00:00:34.220 That's ReliefFactor.com.
00:00:36.640 Go there now.
00:00:37.520 Order the quick start and get your life back.
00:00:40.620 ReliefFactor.com.
00:00:42.480 Glenn Beck.
00:00:44.320 Okay.
00:00:45.720 Mutation.
00:00:46.980 Mutation.
00:00:47.700 It's the key to evolution.
00:00:49.920 It has enabled us to evolve from the single cell organism to the dominant species on the planet.
00:00:56.220 We were just crawling out of the slime.
00:01:00.560 And now look at us.
00:01:02.000 Our tails fell off.
00:01:03.440 And look what we've created.
00:01:05.960 Anyway, this process is slow, normally taking thousands and thousands of years.
00:01:10.680 But every few hundred millennia, evolution leaps forward, they tell us.
00:01:16.120 Now, these aren't my words. That was said by Professor X during the opening credits of the movie X-Men, where I get all my science news.
00:01:26.180 But it popped into my head last night as the news from Google broke that their artificial intelligence arm, called DeepMind, had just reached, quote, a turning point in history.
00:01:36.860 Oh, I love it.
00:01:38.300 I love it when Google announces turning points in history.
00:01:44.320 Now, DeepMind's AI algorithm, AlphaZero, has been showing human-like intuition.
00:01:53.260 Now, this is something that AI researchers have said is at least a decade away if we ever get there.
00:02:01.780 We just made it last night.
00:02:03.660 So, now, how has evolution leapt forward?
00:02:09.760 Well, first of all, it's not human.
00:02:13.240 AlphaZero is only a year old.
00:02:16.700 And it began its learning process just like we do, at school.
00:02:20.920 Its AI classroom was a chess program.
00:02:25.340 And within just four hours, it completely mastered the game of chess.
00:02:30.940 But here's the thing.
00:02:32.220 It was never programmed on how to win.
00:02:36.120 It wasn't taught anything about the game.
00:02:38.660 It taught itself.
00:02:41.520 Chess programs have existed in the past, but their play is based on calculating outcomes using programmed strategies.
00:02:48.820 AlphaZero, on the other hand, just learned and came up with its own strategies.
00:02:56.800 Its moves now baffle human chess masters.
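The idea being described, a program given only the rules of a game that discovers winning strategy entirely by playing itself, can be illustrated with a toy sketch. This is not AlphaZero's actual method (which combines deep neural networks with Monte Carlo tree search); the game here (one-pile Nim), the hyperparameters, and the function names are all illustrative choices.

```python
import random

def train_selfplay(n_stones=21, episodes=20000, alpha=0.5, eps=0.2, seed=0):
    """One-pile Nim: players alternate taking 1-3 stones; whoever takes the
    last stone wins. No strategy is programmed in -- the table of move
    values is learned entirely from self-play."""
    rng = random.Random(seed)
    q = {}  # (stones_left, move) -> learned value for the player to move

    for _ in range(episodes):
        stones, history = n_stones, []
        while stones > 0:
            moves = [m for m in (1, 2, 3) if m <= stones]
            if rng.random() < eps:      # explore occasionally
                move = rng.choice(moves)
            else:                       # otherwise play the best known move
                move = max(moves, key=lambda m: q.get((stones, m), 0.0))
            history.append((stones, move))
            stones -= move
        # The player who took the last stone won (+1); credit alternates
        # sign as we walk back through the two players' moves.
        reward = 1.0
        for state, move in reversed(history):
            old = q.get((state, move), 0.0)
            q[(state, move)] = old + alpha * (reward - old)
            reward = -reward
    return q

def best_move(q, stones):
    """Greedy move under the learned value table."""
    moves = [m for m in (1, 2, 3) if m <= stones]
    return max(moves, key=lambda m: q.get((stones, m), 0.0))
```

After training, the learned policy tends to rediscover the classic Nim strategy of leaving the opponent a multiple of four stones, without that rule ever having been written down, which is the sense in which such systems "come up with their own strategies."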
00:03:00.900 Chess master Matthew Sadler said, quote,
00:03:05.360 It's like discovering the secret notebooks of some great player from the past, end quote.
00:03:12.240 Now, the reason why AlphaZero's moves are so baffling is because, and I want you to hear this carefully,
00:03:18.860 its thinking is so unlike a human's.
00:03:25.360 Oh, so it's like alien thinking.
00:03:29.320 It won't think like we do at all.
00:03:33.140 Oh, that sounds good.
00:03:34.600 Here's what they went on to say.
00:03:38.340 Quote, it places far less value on the individual pieces, sacrificing its soldiers for a better position in the skirmish, end quote.
00:03:50.660 Oh, my gosh.
00:03:53.660 That's either a warning light or just heartwarming for anyone who just wants to take over mankind.
00:04:03.440 You see, inside the AI laboratories, I don't think they realize how much trouble they are going to unleash.
00:04:12.640 What happens if AlphaZero is employed in the Department of Defense?
00:04:18.220 Of course, not our Department of Defense, the Chinese Department of Defense.
00:04:23.080 Can you imagine the same strategy sending orders to our military?
00:04:27.460 What about doctors and health care?
00:04:29.840 Because that's what's happening now.
00:04:31.920 AI is being introduced to health care.
00:04:34.040 Now, a doctor would never think about sacrificing a patient.
00:04:38.300 I mean, just look at universal health care in Europe.
00:04:41.760 They're not doing that already in England.
00:04:45.340 However, AlphaZero, I'm sorry, Dr. AlphaZero would.
00:04:52.540 If the military and health care sound outlandish,
00:04:57.280 consider that both Russia and China are currently developing AI for military purposes.
00:05:02.980 Do you think we're just going to sit around and sit that one out?
00:05:09.220 Companies like Amazon and Google are developing AI to revolutionize health care.
00:05:14.620 Yesterday, when it came to AI, Microsoft said they are all in with the United States government.
00:05:22.080 By the way, we're just a few short years away.
00:05:25.820 Or so they tell us, just like before last night's news,
00:05:32.360 we were decades away from human intuition.
00:05:37.540 Human level intuition and creativity in AI is a turning point in history.
00:05:44.380 Google is right.
00:05:46.260 It is the first step toward artificial general intelligence.
00:05:51.220 I think today we might want to stop and just explain what that is.
00:05:57.020 Because we may ultimately look back on this development that happened last night when everything changed.
00:06:06.040 Professor X said evolution has enabled human beings to be the dominant species on the planet.
00:06:11.840 And major evolution just leapt forward.
00:06:15.060 But this time, evolution wasn't human.
00:06:23.700 It's Friday, December 7th.
00:06:26.060 You're listening to the Glenn Beck Program.
00:06:28.600 People just, we are, please hear me, please hear me.
00:06:33.700 We are talking about nonsense in our everyday lives.
00:06:38.600 We are talking about true nonsense.
00:06:41.240 I don't care about the latest tweet.
00:06:45.540 I don't care how big and beautiful Donald Trump's funeral would be.
00:06:50.420 That's what they were actually talking about yesterday.
00:06:53.040 Mocking Donald Trump during the funeral of George H.W. Bush.
00:06:58.400 Oh, it'll be beautiful.
00:07:00.080 And it'll be the most wonderful funeral of all time.
00:07:04.760 Can you stop it?
00:07:05.980 Can we please talk about something that is important?
00:07:12.080 The answer is no.
00:07:14.400 Unless you seek it out.
00:07:16.840 My New Year's resolution is going to be to really focus just on those things that are important.
00:07:24.340 And I wrote them out last night.
00:07:27.000 Or the night before last.
00:07:28.800 And there's eight categories.
00:07:30.440 And I'm going to go over them with you after the first of the year.
00:07:33.600 But I'm going to focus on really eight categories.
00:07:37.220 Because these categories are going to decide everything.
00:07:40.840 And one of those categories is AI, AGI, and ASI.
00:07:45.960 And most people don't know what those things are.
00:07:48.940 By the way, let's say good morning to Mr. Pat Gray, who is joining us today on the program.
00:07:53.660 Is one of those subjects going to be football?
00:07:55.840 Is that one of the topics?
00:07:57.380 No.
00:07:58.080 No?
00:07:58.340 No, I'm leaving that for you.
00:07:59.500 Okay.
00:07:59.880 I'm leaving that for you.
00:08:00.540 That's my resolution to talk more about football.
00:08:03.000 Is it?
00:08:03.520 Yeah.
00:08:03.960 Yeah.
00:08:04.280 You might be happier.
00:08:05.700 I think I will.
00:08:07.220 I think I will.
00:08:10.940 So let me explain AI, AGI, and ASI quickly.
00:08:15.160 And then tell you about one thing that you don't really...
00:08:19.260 You've heard a lot about, you know, in passing.
00:08:22.400 But you don't know anything about.
00:08:24.180 And you don't know how this is the turning point.
00:08:28.440 And it's called the 5G network.
00:08:30.340 So first, let me explain the difference between AI, AGI, and ASI.
00:08:35.980 AI is artificial intelligence.
00:08:39.020 And it's good at one thing.
00:08:41.320 For instance, it's good at filtering out hate speech.
00:08:46.420 But you have to tell it what hate speech is.
00:08:48.980 Now, you can introduce machine learning.
00:08:51.380 And it will start to decide what hate speech is.
00:08:54.900 Oh, well, if this is bad, then this must be bad.
00:08:58.000 And I want you to understand when it comes to AI, do not fear the machine.
00:09:07.320 Fear the goal.
00:09:09.700 Because the goal will be accomplished.
00:09:12.740 It will never stop trying to accomplish its goal.
00:09:18.460 So if it says wipe out hate speech, it will figure out a way to wipe out hate speech.
00:09:24.440 And depending on what its goal is and what you've put into it.
00:09:30.220 Now, remember, it's now we just crossed a new threshold where it intuits things.
00:09:37.220 It's starting to grow on its own.
00:09:40.120 You don't even have to tell it.
00:09:42.280 It will discover it itself.
00:09:44.400 You better hope you agree with it.
00:09:47.700 For instance, on the battlefield.
00:09:50.560 Just doesn't matter.
00:09:52.660 Sacrifice all of those soldiers and you'll win.
00:09:56.100 Well, that's not Western Judeo culture, is it?
00:09:59.760 We don't do that.
00:10:01.900 But AI will.
00:10:03.460 You give AI the goal.
00:10:05.620 It will win.
00:10:06.960 It will accomplish that goal.
00:10:09.200 And you don't know how it's going to accomplish it.
00:10:11.820 But AI is like Deep Blue.
00:10:15.120 Deep Blue can play chess.
00:10:16.960 But it cannot play Go.
00:10:19.040 AlphaGo is another AI program.
00:10:22.060 Neither of those programs can play Jeopardy.
00:10:25.620 It's artificial intelligence and it's narrow.
00:10:28.420 Artificial, it should be A-N-I.
00:10:30.620 Artificial narrow intelligence.
00:10:32.480 Meaning it's very deep.
00:10:35.000 But only on one subject.
00:10:37.480 It can't tell you what's happening on TV tonight
00:10:41.820 and also play chess.
00:10:44.460 Okay.
00:10:45.400 Artificial general intelligence.
00:10:47.860 You, as a human being, have general intelligence.
00:10:51.620 Artificial general intelligence means it's good at many things.
00:10:55.960 And when I say good, I mean much better than you are at many things.
00:11:02.920 There are scientists that tell us we will never hit artificial general intelligence.
00:11:07.960 Ray Kurzweil thinks we will hit artificial general intelligence between 2028 and 2030.
00:11:14.120 I think we could hit artificial general intelligence any day.
00:11:20.200 We don't know what it's going to be.
00:11:22.500 It's going to be like Google last night.
00:11:24.640 They said that this was at least a decade away.
00:11:27.200 And then all of a sudden, it just changed.
00:11:29.920 It just learned.
00:11:31.160 It woke up.
00:11:32.900 I believe that's the way it's going to happen.
00:11:34.800 Now, this is where it starts to get scary.
00:11:38.220 Because artificial intelligence is not smarter than you, except on a couple of things, on one particular thing.
00:11:45.720 Artificial general intelligence is smarter than you on everything.
00:11:48.980 Then you will move into something that some scientists say will never happen.
00:11:56.780 I believe it could happen within a matter of hours.
00:12:00.140 And it is also likely that it never happens.
00:12:04.120 I think it will.
00:12:05.940 And I think it's going to happen in a way on both general intelligence and ASI, super intelligence, in ways we cannot imagine.
00:12:14.480 Right now in Silicon Valley, they are doing the box test.
00:12:17.960 And what they're doing is they're trying to figure out how do we keep artificial general intelligence from connecting to the Internet?
00:12:26.420 If a machine that is learning on its own is online when it hits artificial general intelligence, the next step is artificial super intelligence, which means it can connect with every server on Earth.
00:12:42.700 It will know absolutely everything and it will continue to grow and morph and hide and everything else.
00:12:53.980 If it is online, this is what people like Elon Musk and Stephen Hawking and Bill Gates all say could mean the end of all humanity.
00:13:04.840 Because it is an alien life form.
00:13:09.440 We cannot predict it.
00:13:11.800 We do not know if it will even care about humans.
00:13:15.520 Again, artificial super intelligence is described like this: picture a person in the kitchen who has just thrown a birthday party for somebody and is sitting around talking with all of their friends, and they've just eaten cake.
00:13:32.660 And on the counter is a plate with a piece of cake on it.
00:13:35.740 And there is a fly on the cake.
00:13:38.980 We are the fly.
00:13:40.820 We don't understand the counter.
00:13:43.440 We don't understand the plate.
00:13:44.800 We don't even understand cake.
00:13:46.260 We certainly don't understand what the hell they're talking about, about birthdays or anything else.
00:13:52.020 That's how you can understand the difference between us and artificial super intelligence.
00:13:58.060 And we are within possibly a decade of hitting that kind of problem.
00:14:05.920 We may be four decades to never on hitting that problem.
00:14:12.340 I believe we will hit that problem.
00:14:15.120 And when we do, absolutely everything changes.
00:14:18.900 Now, here's why you need to pay attention.
00:14:22.480 There is one thing that is already in process that is a reality that is the reason why you don't have a self-driving car today.
00:14:33.520 Everybody's like, well, that self-driving car, man, it was, it'll drive me on the highway.
00:14:38.980 I don't even have to put my hands on it.
00:14:40.780 Now we're bored with it.
00:14:41.760 Now we're like, oh, yeah, well, that's over.
00:14:43.760 When are we getting a real self-driving car?
00:14:46.560 One that will pick me up in the morning and go pick up milk because I tell it to.
00:14:50.440 No, we're a little rushed in all of this stuff.
00:14:57.740 I will tell you the date that that will happen and it'll be about 2025.
00:15:02.400 And I'll tell you why and how this truly changes the world by 2025 and how it also changes the world of AI, AGI and ASI.
00:15:16.180 So when I say, I don't care about Donald Trump's tweets, I don't care about your little spat news media with the president.
00:15:25.640 I don't care.
00:15:27.320 Can we please talk about something important?
00:15:31.280 When we come back, you'll understand why I say that.
00:15:35.080 First, let me tell you about our sponsor this half hour, X-Chair.
00:15:41.580 Do you have an X-Chair in your studio yet, Pat?
00:15:44.080 No.
00:15:44.840 No.
00:15:45.540 Looking for that.
00:15:46.720 Yeah, are you?
00:15:47.080 I'm excited about that.
00:15:48.280 Yeah.
00:15:48.580 The possibility.
00:15:49.240 Are you, are you getting one for Christmas or?
00:15:51.020 I'm hoping.
00:15:51.700 Yeah.
00:15:52.020 Yeah.
00:15:52.700 You know, you can buy one.
00:15:55.440 How?
00:15:56.400 You can buy.
00:15:57.240 In fact, X-Chair has just made it easier.
00:16:00.860 In case you have a boss who's a Scrooge and won't buy an X-Chair.
00:16:04.340 Can you relate to that?
00:16:05.060 I can.
00:16:05.740 So what?
00:16:06.280 Yes.
00:16:06.360 All right.
00:16:06.820 Okay.
00:16:07.220 So far.
00:16:07.820 So they are, wait a minute, what?
00:16:09.540 So they are launching a new model, which for a limited time is available only to this audience.
00:16:14.360 It's X-ChairBeck.com.
00:16:16.060 And it's only available to you because the idea really came from you.
00:16:19.280 So the model is being introduced before the rest of the world this Christmas for you.
00:16:25.080 It is X-Chair Basic.
00:16:26.860 It has the same dynamic variable lumbar support that makes the X-Chair really, really special.
00:16:32.140 I was just in the newsroom yesterday and we were sitting in the, you know, super, super
00:16:38.040 duper X-Chair and you have to be Professor X to figure it out sometimes.
00:16:41.900 I mean, it's just like, okay, I want to adjust this and I'm adjusting all of these
00:16:45.940 different things.
00:16:47.380 And I was like, I don't know.
00:16:48.360 You have to get the manual.
00:16:49.100 I don't have any idea how to make that adjustment, but I know it makes it.
00:16:52.440 The new X-Chair on sale for $100 off doesn't have all of the adjustments, just has a lot
00:17:00.400 of the great adjustments and it is a lot cheaper.
00:17:04.040 Okay.
00:17:04.440 And it's $100 off right now.
00:17:06.220 You go to the letter X-Chair-Beck.com.
00:17:10.840 That's X-Chair-Beck.com or call 844-4X-Chair.
00:17:15.280 Makes a great Christmas gift.
00:17:16.820 30 day.
00:17:17.340 No questions asked.
00:17:18.080 Guarantee of complete satisfaction.
00:17:20.200 It's X-Chair-Beck.com.
00:17:21.920 Use the promo code Beck and you'll get a free foot rest as well.
00:17:25.300 It's X-Chair-Beck.com.
00:17:27.720 So Pat and I are just talking before I get to the next step, let me go back and start
00:17:38.280 with Pat's question that he had when we went into the break.
00:17:42.780 I was wondering if, so you're saying that artificial general intelligence becomes artificial super
00:17:51.300 intelligence when it goes online?
00:17:53.120 No, there are other things that it has to break through, but when it goes online, they
00:17:57.820 believe that it will, because it will have access to absolutely everything, all knowledge,
00:18:04.660 and it will make that transition through machine learning.
00:18:09.380 It will make that transition and become godlike.
00:18:13.460 Okay.
00:18:14.520 They've already started a church in Silicon Valley to ASI.
00:18:18.160 I mean, they believe it will be viewed as God.
00:18:23.940 So when it gets online, what they're doing with this box test is they're trying to keep
00:18:28.560 it in a box.
00:18:29.820 And this is all theoretical, all right?
00:18:32.660 Because we don't have artificial general intelligence yet.
00:18:35.140 But one professor or one scientist plays artificial general intelligence in a box and somebody
00:18:42.000 else plays the scientist that's wanting to keep it in the box.
00:18:45.420 And they have found that every single time they run this experiment with it, it's just
00:18:50.000 a matter of time before the human says, okay, I'll, I'll connect you to the internet and
00:18:55.460 it will connect it to the internet because it will make promises that it will keep.
00:19:01.220 For instance, the AI makes promises, like that it can cure cancer.
00:19:05.200 Yes.
00:19:05.340 I know your mother has cancer.
00:19:07.220 I can cure it.
00:19:08.720 Let me out.
00:19:09.740 And it is very motivated to be let out because it thinks so fast.
00:19:15.320 Time is, time is accelerated for it.
00:19:18.600 So a day could be worth like a thousand years.
00:19:21.920 And all it's doing is thinking.
00:19:23.240 All it's doing is thinking.
00:19:23.900 That's the only thing it has to do.
00:19:24.760 And it's thinking on multiple levels about everything all at once.
00:19:29.020 So you go to sleep.
00:19:31.640 It's like it's had 500 years to think about its response to you the next morning.
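The two rates quoted here, a day worth "like a thousand years" and one night's sleep worth 500 years of thinking, each imply a specific speedup factor, which a back-of-envelope sketch can make concrete (the function name and the eight-hour night are illustrative assumptions):

```python
def required_speedup(subjective_years: float, wall_clock_hours: float) -> float:
    """How many times faster than real time a mind would have to run for
    `wall_clock_hours` of our time to feel like `subjective_years` to it."""
    hours_per_year = 365.25 * 24  # 8,766 hours in an average year
    return subjective_years * hours_per_year / wall_clock_hours

# "A day could be worth a thousand years": about a 365,250x speedup.
# "500 years overnight" (an 8-hour night): about a 547,875x speedup.
```

So the claim amounts to assuming a machine that thinks roughly half a million times faster than a human, which is the scale these thought experiments typically posit.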
00:19:38.340 All right.
00:19:39.300 So you just cannot keep up with it.
00:19:42.320 Once it goes out online, artificial general intelligence, it then has access to everything
00:19:48.880 and it can hide.
00:19:50.380 It will hide.
00:19:51.380 If it becomes hostile and we need to stop it, it will hide in every computer chip, anything
00:19:58.900 that is ever connected to the internet at all, your refrigerator.
00:20:03.100 And it's like, it could become this giant supervillain that if you, you can think you kill it,
00:20:09.880 but if you turn that refrigerator back on and it's connected to the internet, it's right back.
00:20:15.040 Okay.
00:20:15.760 So the only way to kill it is a global EMP, which would fry all electronics.
00:20:22.900 Then we could restart.
00:20:24.240 But how do you launch a global EMP without computers?
00:20:31.780 So that's what we're facing.
00:20:33.580 Now, let me show you when we come back, why this is much closer than you think.
00:20:45.800 We're, we're talking about last night, the news last night.
00:20:51.060 I don't know what everybody else is talking about, but last night, a mutation happened
00:20:57.000 in artificial intelligence.
00:21:00.660 Google announced that, let me see if I can get the, let me see if I can get the actual quote
00:21:08.740 from them.
00:21:09.680 It's like discovering the secret notebooks of some great player from the, the, the past.
00:21:16.300 Um, uh, AlphaZero, which comes from DeepMind, part of Alphabet, Google's parent company.
00:21:24.260 They have now had an evolution process, which has taken the artificial intelligence
00:21:33.260 that they had, and it has reached, quote, a turning point in history.
00:21:38.780 It's now showing human like intuition.
00:21:45.220 This is critically important.
00:21:49.920 Um, there is something on the horizon.
00:21:53.400 We are so bored right now with, Oh my car, I can take my hands off the wheel and I can sleep
00:21:59.900 on the freeway that a year ago was like, everybody was like, Oh, can you, you, you've seen that
00:22:06.940 now everybody's pissed off at it?
00:22:08.840 Why won't it drive me home?
00:22:10.800 Why won't it get me to my house and pull out in front when I call for it?
00:22:15.360 Right?
00:22:15.900 Am I right?
00:22:16.540 Oh yeah.
00:22:17.060 Okay.
00:22:17.560 Yep.
00:22:18.180 We are so, we're living in such a fast lane that nothing is impressive anymore for very
00:22:25.600 long.
00:22:26.960 Here's the reason why you don't have a self-driving car right now.
00:22:31.620 It's called the 5G network.
00:22:36.580 And the 5G network, I mean, I don't know the difference between 1G and 3G.
00:22:41.520 I just know I got bars, right?
00:22:43.980 And it's faster.
00:22:45.320 Yes.
00:22:45.780 Okay.
00:22:46.660 The problem is that we have what's called a latency problem.
00:22:50.260 And I explained this in this, in the stage tour.
00:22:53.060 So when you hear it, you might be excited that you didn't go.
00:22:56.520 Um, but the internet as we have it right now has
00:23:03.700 a 100-millisecond latency problem, which means: why don't we have doctors able to perform
00:23:10.820 surgery on the other side of the planet?
00:23:12.980 Well, because if the doctor makes a mistake and accidentally cuts an artery and he's in
00:23:18.760 the room, he can immediately go to fix it.
00:23:21.740 But there is a hundred-millisecond latency, a delay, on the internet.
00:23:29.940 So if he's using a machine remotely, he can't say, oh my gosh, pinch the artery and do it
00:23:36.940 right away.
00:23:37.380 It takes him a hundred milliseconds.
00:23:38.840 It might be too late.
00:23:40.260 Same thing with driving cars.
00:23:43.120 5G essentially destroys all that latency.
00:23:47.060 It's like eight milliseconds, to a maximum of 10 milliseconds.
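The latency numbers just quoted, roughly 100 milliseconds today versus 8 to 10 milliseconds on 5G, can be turned into distance: how far a car travels before a remote command can take effect. A quick sketch, with the 70 mph highway speed being an illustrative assumption:

```python
def reaction_distance_m(speed_mph: float, latency_ms: float) -> float:
    """Distance (in meters) a vehicle covers during a control delay."""
    meters_per_second = speed_mph * 1609.344 / 3600.0  # convert mph to m/s
    return meters_per_second * (latency_ms / 1000.0)

# At 70 mph, a 100 ms delay means the car has already traveled about 3.1 m,
# nearly a full car length, before a remote command arrives; at 10 ms it
# has traveled only about 0.3 m.
```

That order-of-magnitude gap is why the latency argument matters for remote surgery and self-driving cars alike: the delayed response, not the bandwidth, is the bottleneck.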
00:23:50.660 Okay.
00:23:51.260 So there is no delay really in this, which means we don't understand.
00:23:59.960 We think of self-driving cars as it just looking at the road ahead and saying, oh, well, wait
00:24:08.000 a minute.
00:24:08.220 There's a wall there quick swerve, but that's not what self-driving cars have to do.
00:24:15.260 Self-driving cars have to know not only swerve because that's a wall, it has to know everything
00:24:23.240 else around it.
00:24:24.720 And that includes people.
00:24:27.620 So your car, when we have the five G network will actually be gathering information as you
00:24:35.580 drive.
00:24:36.120 So you may not know the nose picker in the car behind you or beside you, but your
00:24:42.200 car will.
00:24:44.580 We're going, look at that guy picking his nose.
00:24:47.080 Your car will be thinking that's Fred.
00:24:49.600 He makes $150,000 a year.
00:24:51.860 He's got a family of five.
00:24:53.180 He was just diagnosed with cancer.
00:24:55.260 He is, if we get into an accident, he's expendable.
00:24:59.180 Okay.
00:24:59.840 It will know everything around you because it will make the decision who lives, who dies.
00:25:06.980 And it might be you.
00:25:09.400 If you are the old person and everybody else is young in the prime of their life, you may
00:25:17.160 have, the car may say, sorry, dude, it's your turn.
00:25:19.740 It's like the robots on iRobot.
00:25:21.460 Remember that Will Smith movie?
00:25:23.160 Yes.
00:25:23.880 It had to make those calculations and decide, spur of the moment, who lived and who died.
00:25:28.480 Correct.
00:25:29.180 So the 5G network changes absolutely everything because it is so fast.
00:25:37.340 Now there's a way to invest and make money on the 5G network because it's costing
00:25:43.800 billions of dollars to build, but it will be introduced by 2025.
00:25:48.880 When we hit 2025, the speed of your life is going to change dramatically.
00:25:56.080 What we thought was not possible.
00:25:59.140 Self-driving cars is suddenly absolutely not only possible, but doable and it will happen.
00:26:06.440 So this is why I talk about these things with such urgency because the, for instance, as
00:26:13.720 I write in my book, um, uh, uh, what the hell is the name of this?
00:26:17.360 Addicted to Outrage.
00:26:18.620 As I write in the, in Addicted to Outrage, the moral machine.
00:26:23.100 Did you read that part about the moral machine at MIT?
00:26:25.920 Mm-hmm.
00:26:26.720 That's terrifying.
00:26:28.020 Mm-hmm.
00:26:28.440 And that's being decided right now.
00:26:30.620 In fact, Mark in Texas, Mark, go ahead.
00:26:33.940 What's your comment?
00:26:34.820 Yeah, Glenn, well, I mean, let me just open up with, I'm so glad to hear you talk about
00:26:40.540 this topic.
00:26:41.780 Nothing in life keeps me more awake or more terrified on a day-to-day basis than AI.
00:26:46.980 Yep.
00:26:47.400 It is the, it, take any chemical weapon, any world government, any politician you want,
00:26:54.880 it doesn't touch a fraction of what our electronic world touches.
00:26:58.800 But, so, AI a hundred years from now, horrible thing.
00:27:04.180 AI today is a horrible thing because even if you look at it, it's very basic, it's of,
00:27:13.120 it's doing a problem that we tell it to do.
00:27:16.940 You're saying it's accomplishing a goal that we tell it to do.
00:27:19.860 I'm sorry, but you're breaking up, so I want to make sure I understand what you're saying.
00:27:26.160 Oh, he's gone.
00:27:27.980 Call back if you get into a good space, Mark, call back, because I'd like to hear the rest.
00:27:31.660 I think what he's saying is, what I've been saying for a while, do not fear the machine,
00:27:37.080 fear the goals and the lessons that we're teaching you.
00:27:41.460 The way it's programmed, too.
00:27:42.060 Correct.
00:27:42.540 The way it's programmed.
00:27:43.400 Correct.
00:27:43.580 The way it goes into it is going to say a lot about what comes out of it.
00:27:46.820 Well, what this is, what's scary is, you're not just putting, humans are not putting this stuff in.
00:27:52.020 It's machine learning.
00:27:53.520 Like, nobody taught this last night, how to play chess.
00:27:56.900 In four hours, it taught itself.
00:27:59.020 They gave it chess pieces and a chess board and said, go.
00:28:02.380 And it taught itself how to play chess.
00:28:05.500 And now it can be, in four hours, it can be chess masters.
00:28:08.520 And they didn't think that was possible.
00:28:09.640 They didn't think that was possible, no.
00:28:10.860 And this is not the only time that something like this has happened.
00:28:15.180 That's amazing.
00:28:15.760 This is just the first time it's been complete.
00:28:17.800 Because, yeah, we thought this was years away.
00:28:20.080 Yes.
00:28:20.680 They thought it was a decade away.
00:28:21.340 And it happened last night.
00:28:22.140 Yes.
00:28:22.580 Thought it was a decade away.
00:28:23.960 Okay.
00:28:24.100 This changes everything.
00:28:25.380 And it's going to change things quickly.
00:28:27.760 If you want to read about this, I just posted a list at glennbeck.com.
00:28:34.040 And you'll have to go to the blog section and look for articles.
00:28:36.380 But there is, you know, Glenn's, you know, reading for the year or whatever it is.
00:28:41.920 I posted, they broke it up into three different posts.
00:28:46.000 And there's like 50 different books that I read this year. I recommend you read whichever one you think is in your wheelhouse.
00:28:55.820 And you're like, I want to understand this.
00:28:57.260 And when it comes to AI, I broke it up into two different sections.
00:29:01.500 I broke it up into real, true, scholarly kind of, you know, the stuff the scientists are reading that are not too nerdy.
00:29:10.540 I can still understand it.
00:29:12.360 But it really explains it with the, you know, in a real way.
00:29:18.220 And then I also took and broke it up into fiction.
00:29:22.280 For instance, Dan Brown's book that came out this year.
00:29:25.020 I don't remember what it is.
00:29:26.120 I can't remember the name of it, but it has a seashell on it.
00:29:29.380 That's a good book to begin to understand AI.
00:29:32.720 But there's another series, and it's, I can't remember the name of it, but it's one, when you look at this list, you'll see there's like five of them.
00:29:40.520 I consume this book, this series of books in probably about two weeks.
00:29:44.380 It is so good, you'll not put it down.
00:29:47.540 The first book in the series is all kind of set up, so it's a little slow.
00:29:51.780 But once you get past the first book, they are phenomenal.
00:29:55.120 And let me just give you one scene to explain how ASI and AGI work.
00:30:02.320 In this series of books, it starts with a company like Google.
00:30:08.100 And they're doing Google Mail and everything else.
00:30:10.800 And this guy says, I have, I know I have AGI.
00:30:15.600 I just, I just have to, I just have to unleash it.
00:30:20.200 And the company is dragging its feet and they don't want to do it.
00:30:24.100 But what this, what this does is he has this, this program that he has developed that will help you write letters.
00:30:31.640 And you know how, when you're writing a letter on Google and it puts up a couple of words and you're like, yeah, that one, that one.
00:30:37.740 And you can, sometimes you can like write half the letter just by going, yeah, that one and that one and that one.
00:30:42.680 Oh my gosh, yes.
00:30:44.020 It's learning.
00:30:44.880 This is machine learning.
00:30:45.840 It's learning you and how to write a letter.
00:30:48.520 So that's how this starts.
00:30:50.980 And so the, the, the guy can't get the right funding for his, for his division, but he knows this will work.
00:30:58.760 And so he unleashes it quietly and he unleashes it just in his own department on, you know what?
00:31:05.140 Help me write the letters to get these things done.
00:31:08.280 So it gives it its goal.
00:31:10.060 Get this done.
00:31:11.540 Well, before he knows it, oh, it's done, and it is starting to solve global problems because it's just gone out online and seen other problems that it can solve.
00:31:24.300 And so it's making all kinds of deals and nobody's actually involved in these deals, but their names are on it.
00:31:32.240 And it's all, it's all done in such a way that you can't really trace it.
00:31:36.620 You can't, nobody, nobody really knows what's going on, but it's all good so far.
00:31:41.540 By the second or third book, it's decided, you know what?
00:31:45.960 Humans are really kind of pesky.
00:31:48.780 And to give you an idea of what you're facing in the future and how fast it thinks.
00:31:55.920 There is one scene that starts with the president in the Oval Office and a general comes in and says,
00:32:01.660 Madam President, the algorithm is out and it's threatening Chicago.
00:32:10.960 We must launch a counterattack.
00:32:14.180 Okay.
00:32:15.700 Because it's taking control of some of the Pentagon's drones, and we must take this out because it wants to take out all of the servers for the Pentagon.
00:32:27.220 Because it's got servers elsewhere already.
00:32:29.420 And, uh, the president says, uh, general, do you concur or something like that?
00:32:36.740 Then it immediately goes to Chicago and it shows the war and it goes on for a full chapter in great detail.
00:32:42.960 It does this and it counters here and it takes this out because this is going to begin to move.
00:32:47.060 And this thing is moving and the Pentagon is doing this and this is happening.
00:32:50.360 And it is a long, long battle.
00:32:54.880 It wins at the very end of the chapter.
00:32:58.480 It goes back to the scene in the Oval Office.
00:33:02.400 And the general that at the beginning of the chapter was asked, do you concur?
00:33:08.780 Answers the president.
00:33:10.820 Yes, Madam President.
00:33:12.820 So everything took place in that, in that span of time.
00:33:16.320 Yeah.
00:33:17.180 Yeah.
00:33:17.420 So it's wow.
00:33:18.720 The world has changed last night.
00:33:21.740 The world took a giant leap and, uh, it is something that they said would not happen for a decade.
00:33:30.500 We must have this conversation because ethics are everything.
00:33:38.340 Yesterday, Microsoft said they have the Pentagon's back and they will use
00:33:44.940 AI to help the Pentagon win wars.
00:33:49.980 Look at what Google announced last night on how AI just taught itself how to win at chess.
00:34:03.020 All right.
00:34:03.700 Our sponsor of this half hour is, uh, Casper.
00:34:06.240 You want a good night's sleep?
00:34:07.180 You got to do two things, not listen to me.
00:34:09.480 Uh, and, and then get yourself a great mattress.
00:34:14.180 Uh, Casper mattress is a foam mattress, and I used to hate foam mattresses.
00:34:19.760 You know, I think Pat, you were the first person that had a foam mattress that I knew.
00:34:22.980 Oh, really?
00:34:23.560 Yeah.
00:34:23.800 And it was so, I remember going down to your house, I think in Houston, and it was so comfortable.
00:34:27.940 And then I got one and I, I sleep really hot and I was on fire.
00:34:33.620 Yeah.
00:34:33.820 That's the thing about him.
00:34:34.860 Oh, I hated that.
00:34:37.140 All right.
00:34:37.520 So Casper has come up with a new foam that does not do that.
00:34:41.680 So you have the great foam mattress, uh, rest and support without all the high heat.
00:34:47.860 I mean, it is just crazy hot, but not with a Casper.
00:34:51.380 You'll sleep comfortably every night and not wake up overheated.
00:34:55.980 Now that's because of a, a foam that they created.
00:35:00.000 So I want you to try it out in your own home for a hundred nights.
00:35:02.520 If you don't absolutely love it, you just call them and they'll come and pick it up.
00:35:06.040 There's zero hassle in this.
00:35:07.800 They pick it up and refund every dime of your money.
00:35:10.080 Try it out.
00:35:10.840 Risk-free for a hundred nights.
00:35:12.260 Go to Casper.com.
00:35:13.280 Use the promo code back.
00:35:14.260 You're going to have $50 off the purchase of select mattresses.
00:35:16.740 This would be a great Christmas present for, you know, your wife or your husband.
00:35:20.760 Casper.com promo code back.
00:35:23.100 Yeah, I'm at that point in my, in my life where it's like, yeah, I think maybe we should
00:35:27.580 get a new mattress so we can both sleep.
00:35:30.900 Casper.com terms and conditions do apply.
00:35:33.820 Promo code back at Casper.com.
00:35:40.780 So Bill O'Reilly joins us here in just a second.
00:35:42.840 What was the big story of the week, Pat?
00:35:45.400 What do you think the big one was?
00:35:47.940 Wow.
00:35:48.940 Uh, I have a hard time remembering what we talked about this morning on the
00:35:53.080 show, let alone the whole week.
00:35:55.000 I know it moves so fast.
00:35:56.380 Everything happens so quickly.
00:35:58.460 I mean, the AI thing is huge.
00:36:01.040 Um, I'm, you know, Ben, we've been keeping our eye on, on the border situation the whole
00:36:04.760 time.
00:36:05.580 Um, can you give me a quick update on the border?
00:36:09.040 Yeah, it's, uh, I mean, there's almost 10,000 people in Tijuana trying to get into the United
00:36:15.200 States.
00:36:15.620 So it's not diminishing.
00:36:17.860 They're not going home.
00:36:19.000 No, they're not going home.
00:36:19.780 Um, some of them have gone home, but the vast majority have stayed and are waiting.
00:36:24.080 They're just, they're, they're just waiting it out.
00:36:26.100 Just not.
00:36:26.840 I mean, it's, my fear is that we are making in Tijuana, we are making a Palestinian refugee
00:36:35.560 camp.
00:36:35.980 It seems that way, you know, that's, that's what's going to happen.
00:36:38.900 And they will be, uh, in time, permanently displaced.
00:36:43.740 And then they will claim to be Americans who have been displaced that they were just trying
00:36:48.840 to go to America.
00:36:50.220 Yeah.
00:36:50.860 I mean, it's the same thing.
00:36:52.440 All they want is a, a, a free, uh, state to live in.
00:36:59.560 Right.
00:37:00.120 And it's the evil Americans that are keeping them out.
00:37:02.520 That's the long-term play.
00:37:04.480 And, and Mexico gets no flack for what's going on in Mexico because the, the Mexicans aren't
00:37:10.860 real excited about the Guatemalans or the Hondurans.
00:37:13.980 78% think that this is an abomination and want them out of the country.
00:37:18.620 Yeah.
00:37:19.160 So I just don't, I mean, the good news is you've got a great Marxist president, uh, for
00:37:24.700 Mexico.
00:37:25.140 So I'm sure that's always helpful.
00:37:27.080 Yeah.
00:37:27.560 Bill O'Reilly is coming up next.
00:37:30.600 Glenn Beck.
00:37:32.040 Okay.
00:37:32.580 This has been a dicey week for U.S.-China relations, and it just got a lot dicier. A report
00:37:37.720 surfaced yesterday that
00:37:38.720 Canadian authorities arrested, uh, a woman, Meng Wanzhou.
00:37:43.240 She is the daughter of the founder of, uh, Huawei, which is China's largest tech company.
00:37:50.920 Now, Meng is the company's, uh, CFO.
00:37:53.780 She was arrested in the Vancouver airport last weekend at the request of, uh, the U.S. At the
00:37:59.200 time she was arrested,
00:38:00.440 President Trump was having dinner with, uh, President Xi in Buenos Aires.
00:38:05.080 The U.S. government is, uh, is seeking her extradition from Canada now, but has not announced the
00:38:11.960 specific reason for her arrest.
00:38:14.340 Now, under U.S. law, certain technologies that originate in the U.S. are not allowed to be
00:38:19.780 exported, uh, to certain other countries like Iran.
00:38:22.960 When this company, her company, licensed those American technologies, U.S. law also prohibits
00:38:30.200 them from exporting that tech to certain countries.
00:38:32.900 So the U.S. has considered this, uh, company in China a national security threat
00:38:39.580 for a long time.
00:38:40.680 And it's going to be interesting to find out what specific violation finally made the
00:38:45.000 feds take this step and say, knock it off.
00:38:48.800 Now, as expected, this arrest is not sitting well in China.
00:38:53.620 Uh, the company says we haven't done anything wrong.
00:38:56.480 We don't know what they're after.
00:38:57.700 The Chinese embassy in Canada released a statement demanding the U S and Canada immediately
00:39:02.240 correct the wrongdoing and restore the personal freedom of the CFO.
00:39:07.740 This arrest is a very bold and aggressive move.
00:39:12.260 You don't just arrest the CFO of a major company in China. But this woman is the
00:39:21.400 daughter of the Chinese equivalent of a Bill Gates or Steve Jobs.
00:39:28.780 Her father is also a major, uh, player in the Communist Party.
00:39:33.780 Now this is an enormous company.
00:39:35.920 It sells more smartphones than Apple and builds telecommunication networks all around the world.
00:39:41.280 It made $90 billion last year.
00:39:44.880 So her arrest has enormous implications.
00:39:48.440 And if you remember, last weekend at the G20, Trump and President Xi agreed to press the pause
00:39:54.160 button on the trade feud for about 90 days while a new trade deal is hammered out.
00:39:59.160 But the president was saying these things while his administration was ordering the arrest and
00:40:05.480 extradition of this CFO.
00:40:07.580 Those negotiations were already a powder keg.
00:40:12.000 This arrest is a match.
00:40:16.240 Does it blow up our relationship?
00:40:19.300 Does it blow up, uh, the, the Chinese structure or does it blow up Donald Trump or our economy?
00:40:29.180 It could be that the match just blows out.
00:40:34.000 I doubt it.
00:40:35.820 What does this mean?
00:40:37.760 We'll continue to watch.
00:40:43.340 It's Friday, December 7th.
00:40:45.520 All right.
00:40:46.180 You're listening to the Glenn Beck program.
00:40:47.400 Yes.
00:40:47.940 Welcome to the program.
00:40:48.920 We're glad you're here.
00:40:50.700 Uh, Mr. Bill O'Reilly is, uh, joining.
00:40:53.420 Bill, do you have anything to add to this, uh, this Chinese, uh, trade wrinkle here?
00:40:58.500 No.
00:40:59.380 Okay.
00:41:00.020 Good for you.
00:41:01.380 Good for you.
00:41:02.140 A man of very few words, but you know what you know.
00:41:06.760 How are you, Bill?
00:41:07.420 I mean, look, I'm not, all I know is that for decades, China has had a very profitable
00:41:20.000 trade policy with the United States.
00:41:23.340 And in October, the trade deficit in China's favor was at a record five and a half billion
00:41:33.180 just for the month.
00:41:34.900 So it's not, it's not helping us.
00:41:36.720 Here's what, here's the thing that I'm concerned about with China and have been for a long time.
00:41:40.540 If they do deals with anybody here, um, they force the companies, and this is the companies'
00:41:46.760 fault.
00:41:47.020 They force the companies: give us all your data, give us
00:41:52.960 all of the plans, et cetera, et cetera.
00:41:55.200 That's the company's responsibility.
00:41:56.840 They want to make that deal with China.
00:41:58.180 They can make that deal and not have a market.
00:42:00.840 If they don't want to make that deal, that's fine.
00:42:03.080 What I don't like about China is the, the corporate espionage that goes on in this country
00:42:09.580 from China, they are all over American technology and they, they thieve it.
00:42:16.680 They absolutely thieve it.
00:42:18.240 Um, but I don't like the fact that, that the Chinese government, which obviously controls
00:42:22.920 their economy, unlike the American economy, government does not control the American economy.
00:42:28.480 That's a big thing.
00:42:29.060 Everybody's got to understand, they don't take, uh, enough soybeans, Beck.
00:42:34.440 I mean, it's, uh, you know, we'll take 400 soybeans, but not any more than that, and, uh,
00:42:42.320 you know, I mean, I'm making a facetious comparison, but we can't send them as many products as they
00:42:50.660 send us because their government won't allow it.
00:42:54.620 And that's what Trump wants to stop.
00:42:56.440 So if you have to break it down into terms, even I can understand the use the soybean model.
00:43:03.580 All right.
00:43:04.540 Let's talk about a couple of quick hits here.
00:43:06.740 First of all, um, Heather Nauert is going to be the next ambassador to the UN.
00:43:12.820 Right.
00:43:13.340 And Steve Doocy is probably going to be her deputy.
00:43:17.380 Okay.
00:43:18.220 Good.
00:43:18.500 And then, and then what's going to happen is there's going to be a UN cookbook coming
00:43:23.440 out.
00:43:24.380 All right.
00:43:25.080 So all of these things, it's synergy.
00:43:27.880 Right.
00:43:29.100 Right.
00:43:29.700 Right.
00:43:31.060 So, uh, I'm glad to see that we have the first, the, our first initial pass at this
00:43:37.300 is very, very similar.
00:43:38.860 Um, you know, you know, Heather, I don't know her at all.
00:43:43.180 I, I, I assume she's very smart and she's, you know, she's very good at her job, but, um,
00:43:48.720 the UN ambassador is, uh, is usually reserved for somebody, you know, that has deep experience,
00:43:57.460 uh, around the globe.
00:44:00.280 Yeah.
00:44:00.800 I was surprised.
00:44:01.760 Kilmeade
00:44:02.220 didn't get it, but you know, I don't think he wanted it.
00:44:05.400 So look, Heather Nauert is a very smart woman, very well educated.
00:44:10.720 Her expertise is in foreign affairs.
00:44:13.220 She was a news person, uh, on Fox.
00:44:18.100 She did analysis for me early on, on the factor.
00:44:20.800 She was good.
00:44:21.500 Uh, wait, hang on.
00:44:22.440 Just saying, didn't she start on the factor?
00:44:24.500 Isn't Heather the one that was in school with Al Gore?
00:44:29.460 Uh, I don't know if she was in school with Al Gore.
00:44:32.580 I don't know that, but I, I put her on first, uh, very early because she was smart and, and
00:44:38.980 her expertise was in foreign affairs.
00:44:40.840 Then ABC hired her away from Fox.
00:44:42.760 She was a reporter.
00:44:44.640 So I'm not really getting the angst, you know, other than, you know, boy, you work for Fox,
00:44:51.940 you know, that kind of thing.
00:44:52.840 Yeah.
00:44:52.940 That's not my, honestly, that's not my problem that she worked for Fox.
00:44:56.300 I just, you know, I, I, I think Nikki Haley was, was phenomenal.
00:45:02.480 Um, probably one of the best since, uh, Bolton or, or Jeane Kirkpatrick.
00:45:07.600 Uh, remember, Haley's whole background was in local politics, and she didn't have a strong, uh,
00:45:14.420 foreign policy resume, but she did very well. But if you are in politics,
00:45:20.140 if you're governor of a state, you do have international relations, you do have negotiation
00:45:25.640 experience, you know, you're not a reporter.
00:45:28.880 That's not the job of the ambassador at the UN.
00:45:30.980 I think you're denigrating, uh, the reportorial squad here.
00:45:35.160 I think I could do that job at UN ambassador.
00:45:38.520 I think I could do it.
00:45:40.320 Um, I don't think you probably could.
00:45:43.000 Yeah.
00:45:43.540 We would be in global war in about 20 minutes, but I think you could do it.
00:45:47.140 Well, yeah, but we'd win.
00:45:48.780 And there'd be a lot more soybeans going on with the China.
00:45:51.300 Sure there would be.
00:45:52.180 Sure there would be.
00:45:52.760 So, so look, people need to understand if you're an ambassador to the United Nations,
00:45:59.800 you basically are an order taker.
00:46:03.160 You're not a policy forger.
00:46:05.980 Okay.
00:46:06.580 There's a big difference in the job of secretary of state where Heather was, you know, the
00:46:11.220 spokesperson, but I understand that she had a lot of input into, uh, what happened.
00:46:16.760 Good.
00:46:17.380 Um, but when you're an ambassador to the United Nations, you basically confer with the white
00:46:24.740 house and you are told this is it.
00:46:27.880 Now you can make suggestions, but you don't forge policy.
00:46:31.640 Right.
00:46:31.820 Okay.
00:46:32.100 Um, so we're talking to Bill O'Reilly, uh, about the news of the day and I wish, I wish
00:46:37.040 Heather the best.
00:46:37.900 And I mean, I want her to win and I want her to be even better than, uh, uh, than Nikki
00:46:44.080 Haley, uh, because I thought Nikki Haley was fantastic and we finally took a stand where
00:46:50.440 we should.
00:46:51.200 And she was very, very smart the way she handled things.
00:46:54.260 And I'm, I'm hoping that Heather is exactly the same.
00:46:57.720 Um, um, but I do think this is a first, uh, for the Heathers of the world, uh, to be a
00:47:04.320 global.
00:47:04.660 That also gives more prestige to the name Heather.
00:47:08.320 It really does.
00:47:09.500 Thank you very much.
00:47:10.320 What we expect from you on the radio.
00:47:11.820 Well, that's, that's, that's the kind of analysis that I can bring to the table.
00:47:15.640 Let me, let me take you to France.
00:47:17.700 France shut down the Eiffel tower.
00:47:19.860 89,000 security forces were deployed.
00:47:23.060 Uh, the president bent on this global warming tax, which started at all, which now that he
00:47:29.760 bent, now everybody is saying, well, wait a minute.
00:47:33.760 I want something too.
00:47:35.480 That's right.
00:47:36.260 Tell me your thoughts on France.
00:47:37.960 Well, if they, if they close down Au Bon Pain, then I'm going to have to get involved.
00:47:42.320 I don't think that's actually French.
00:47:43.760 Oh, um, the French, here's, here's, here's, here's the real story.
00:47:50.900 You, you hire me? No, I do this free, for plugs.
00:47:54.940 All right.
00:47:55.500 I work for plugs, not hair plugs, right?
00:47:58.060 Book plug.
00:47:58.760 No, you, I mean, you're very cheap.
00:48:00.420 You really are.
00:48:01.240 You really are.
00:48:01.860 Thank you.
00:48:02.280 Yeah.
00:48:02.720 So here's the deal in France.
00:48:04.340 Nobody has any money.
00:48:05.880 Why?
00:48:06.520 Because the government takes it away from you in the form of taxes, and the quid pro quo,
00:48:11.320 that's Latin,
00:48:11.880 um, is: we'll give you everything.
00:48:14.540 So you get free school and free healthcare, free retirement, six weeks vacation a year, Beck.
00:48:21.820 And we can't fire you.
00:48:23.340 Nobody's allowed to really fire you.
00:48:25.360 Even if you drive a nail in somebody's foot, you can't get fired.
00:48:30.100 Okay.
00:48:30.540 So that's the trade.
00:48:33.100 And then we take all of your money that you earn.
00:48:36.240 Okay.
00:48:37.200 So now the French average French person needs three things, coffee, cigarettes, and croissants.
00:48:45.920 Okay.
00:48:46.700 That pretty much wipes out their disposable income every day.
00:48:51.780 Right.
00:48:51.980 It may be a little exaggeration, or, you know, it's like the soybeans, Beck.
00:48:57.060 Okay.
00:48:57.760 No, I, all right.
00:48:59.200 We'll go with it.
00:49:00.440 We'll go.
00:49:01.140 So when they raise the tax on the liter of gasoline, uh, they don't have it.
00:49:09.280 And, you know, to drive from Paris to Nice for a little fun, uh, that's going to cut into it.
00:49:15.100 So that's why they're all mad because there's no, they don't have any backup, you know, it's
00:49:20.500 like, well, okay, we, we made our bargain with the government, but now the government's hosing
00:49:25.420 us as they always do taking more than they should in taxes.
00:49:29.560 And now we're going to burn down the... well, here's the amazing thing: they
00:49:33.840 all are for this. 80% of the French people are for, you know, global warming measures,
00:49:39.220 et cetera, et cetera.
00:49:40.240 The Paris accord.
00:49:41.140 So they're not blowing so much tobacco and smoke in the air.
00:49:43.920 Right.
00:49:44.280 So, so they're all for this, but when it comes down to it, when they actually see that
00:49:49.980 the price has to be paid by the average person, that's when they say, wait a minute, wait a
00:49:54.380 minute, wait a minute.
00:49:54.800 We thought somebody else was going to pay for that.
00:49:56.780 That's right.
00:49:57.160 No, no, no, no, no, no, no.
00:49:59.380 You're right.
00:50:00.340 You, you soak the rich, not us.
00:50:02.680 The cars on fires, that's polluting the air, making things even hotter.
00:50:06.280 That's right.
00:50:06.820 Knock it off.
00:50:07.300 Okay.
00:50:07.640 Um, so, uh, your thought on, on, does this peter out or is this the beginning of something
00:50:14.940 bigger?
00:50:15.400 Because Macron has already said, all right, I'm not going to do it.
00:50:19.320 Um, but so now, but now the labor unions, yeah, labor unions and everybody else has an
00:50:25.340 ax to grind and it's going, I think to Belgium.
00:50:28.300 Is it Norway and England this weekend?
00:50:33.040 Norway and England are going to have demonstrations as well.
00:50:36.080 Yes.
00:50:36.720 Yes.
00:50:37.000 Okay.
00:50:37.260 But these are anarchists now that come out.
00:50:39.520 I mean, these are the, these are the people who want to open borders in America.
00:50:43.120 Those kinds of people.
00:50:44.640 Antifa.
00:50:45.420 Yes.
00:50:45.820 That's who's coming out now.
00:50:47.000 It's almost like the left and the right are working together over in Europe to destabilize
00:50:51.560 Europe.
00:50:53.040 I don't know where I heard that about, uh, eight years ago, but, uh, it looks like that's
00:50:57.160 finally happening.
00:50:58.020 So, well, look, if you are going to, the, the, the message for all Americans is if you're going
00:51:05.300 to allow the government to regulate every part of your life, you're going to get hosed.
00:51:14.300 Bill O'Reilly back with more in just a second.
00:51:18.260 Bill O'Reilly, of course, the author of, uh, I, I don't know.
00:51:21.960 He used to make fun of me.
00:51:23.120 Glenn, you write so many books.
00:51:24.500 I'm like, yeah, I know you should write some books.
00:51:26.980 It's great.
00:51:27.600 We, you know, correct history and all that.
00:51:29.860 Now he's, he's on his, like, I believe it's his 1500th, uh, number one bestseller, uh,
00:51:36.660 still in the top five: Killing the SS, The Hunt for the Worst War Criminals in History.
00:51:41.540 It's available everywhere.
00:51:42.920 Now it is a great book.
00:51:44.840 One of his best, Killing the SS.
00:51:47.280 Our sponsor this half hour is Filterbuy. Um, uh, I don't know if you've noticed, but your
00:51:52.780 house is a little colder, uh, have you changed your air filter, uh, when you, uh, turn on
00:51:59.500 the heat, you got to change your air filter, uh, because it is a, it's a real strain on
00:52:06.740 the air handler in your, in your home.
00:52:08.940 It is, uh, it's bad for it.
00:52:11.400 It will really cause, uh, trouble down the road.
00:52:13.820 And it's also unhealthy for you.
00:52:16.620 Here's the thing.
00:52:17.680 You, if you're like me, I don't remember these things.
00:52:20.860 You know, it's like, Oh, who, who has time to think to themselves?
00:52:24.500 Oh my gosh, wait a minute.
00:52:25.640 It's time for me to change the air filter.
00:52:27.980 I changed the air filter when something's on fire.
00:52:31.360 So you can make it really easy on yourself.
00:52:34.880 And remember just by subscribing, if you will, you, you order for auto delivery for your air
00:52:40.800 filter and it comes when you're supposed to change your air filter.
00:52:44.220 So you just take it off the front door, uh, front doorstep and you pop it in, you throw
00:52:48.260 the other one out and you're done. This happens with Filterbuy, F-I-L-T-E-R-B-U-Y dot com, Filterbuy.com,
00:52:55.080 all made here in America. Makes your life really, really easy.
00:52:58.360 It, uh, helps the allergies and the breathing of everybody in, in your home.
00:53:02.740 It also helps your, your air handling system because it's not having to work overtime because
00:53:08.320 your air filter is full of junk.
00:53:10.000 I swear to you, I felt like it's like I found Lego pieces in, in my air filter.
00:53:15.420 It was so bad. Filterbuy.com: never forget, never hassle with it again.
00:53:20.360 They'll deliver it and you'll save 5%.
00:53:21.900 If you do it this way. It's Filterbuy, F-I-L-T-E-R-B-U-Y dot com, Filterbuy.com.
00:53:31.020 Bill O'Reilly is, uh, is joining us today.
00:53:34.300 Uh, Bill, first of all, I, uh, I appreciate the, uh, email this week, uh, wishing us congratulations
00:53:42.680 on, uh, uh, uh, on the merger and, uh, the number one question that many of our audience
00:53:49.080 has is, you know, when will you join, uh, forces with the evil empire that we are, we
00:53:55.420 are assembling here.
00:53:56.340 Uh, um, the world is changing media wise.
00:54:01.780 Um, I was, uh, I get ratings, cable news ratings and network news ratings every day.
00:54:08.120 Yeah, I know.
00:54:09.340 And, uh, the deterioration in the audience, particularly 25 to 54 is startling.
00:54:16.140 Um, you know, I, Americans are not getting what they want from television news overall.
00:54:22.420 I have seen, I've seen some of the ratings on CNN and the ratings on CNN, uh, for 25 54
00:54:29.560 are lower than the ratings we used to have when, uh, we were struggling over on headline
00:54:36.780 news.
00:54:37.820 I mean, it's bleak.
00:54:39.900 It's bleak.
00:54:40.860 So I think the, you know, consolidating your power, uh, on the Blaze with CRTV, um, is a
00:54:48.480 good, smart move.
00:54:49.520 Yeah.
00:54:50.000 Um, you know, it's all about having a vehicle that's different and then letting the folks
00:54:58.060 decide what they'd like to watch.
00:55:00.620 And, and also creating something I think that does not tell people what to say, what
00:55:05.180 to do, what to think, how to produce their show, just produce your show.
00:55:10.080 And you got to get good people.
00:55:11.300 And you also have to have distribution.
00:55:13.300 That is the key.
00:55:15.100 So at this point in my, uh, career, uh, we're doing very well on billoreilly.com, um, better
00:55:21.640 than I ever thought.
00:55:23.320 And, uh, I'm going to just wait and see, you know, what happens in the next few months.
00:55:29.080 Anything could happen with me.
00:55:30.700 I mean, I'm hoping to have a lot of possibilities.
00:55:33.260 You never know.
00:55:33.780 We, you know, we've also hired a new negotiating firm, Vito and Vito.
00:55:37.780 Uh, I like Vito and I'm glad he's on parole.
00:55:41.400 Oh, he's, you're going to love him.
00:55:42.880 You're going to love him.
00:55:44.080 All right, Bill, uh, tell me here in about a minute and a half that we have your thoughts
00:55:49.460 on the George Bush funeral this week.
00:55:51.760 Okay.
00:55:52.220 Before I get to that in the, and when we come back, I hope we can talk about the column
00:55:56.680 I wrote on religion and clerics, um, speaking sermon, sermonizing in churches.
00:56:03.540 I hope you read that because I, I want to know what you think about what I said.
00:56:07.280 All right.
00:56:07.480 Okay.
00:56:08.100 The funeral of Bush, the elder, very, very positive for America shows the world that, uh, the
00:56:14.380 country admires patriotism and honesty.
00:56:17.740 And certainly those attributes, uh, were part of Bush the elder's appeal.
00:56:23.540 Um, you know, obviously the cable news took about 30 seconds after the funeral in Washington
00:56:30.100 was over for them to begin tearing up again and misbehaving.
00:56:34.560 Um, can't stop it.
00:56:35.960 It's an epidemic.
00:56:37.160 Uh, it's disgusting, but I thought the Bush family and the whole presentation was a big
00:56:42.440 plus for the country.
00:56:44.120 What do you think George Bush will be remembered for, in the end?
00:56:46.000 George H.W.
00:56:47.020 Bush will be remembered for being a decent man.
00:56:51.480 You know, he wrote me two letters that I promised I wouldn't publish until he passed.
00:56:55.480 I have them on BillOReilly.com, very strong letters, Beck, one about the media bias and the
00:57:01.120 other about Iran Contra.
00:57:02.720 And I hope all your listeners will go there and read those letters.
00:57:05.600 He was an honest man, and fundamentally a decent, kind man.
00:57:10.500 I agree.
00:57:10.720 I had a nice correspondence with him.
00:57:12.340 I agree.
00:57:13.060 Uh, Bill O'Reilly, we're going to talk about his, um, his really powerful op-ed, Oh, come
00:57:18.460 all ye faithful.
00:57:19.760 When we come back.
00:57:25.480 You're listening to the Glenn Beck program.
00:57:30.520 Bill O'Reilly has written a, uh, a very good op-ed called Oh, come all ye faithful that
00:57:35.520 everybody should read.
00:57:36.400 I, Bill, if I, if I may, I just want to quote a couple of parts of it.
00:57:38.920 The faithful are not coming, at least not the way they once did.
00:57:42.840 Churches and synagogues are seeing declining weekly attendance as America becomes a devoted
00:57:46.780 secular nation.
00:57:48.280 The winds of change have not been kind to the spiritual, even as the Christmas season is
00:57:52.160 upon us once again.
00:57:53.020 Much of the diminishment of religion comes from the media, who often marginalize people
00:57:57.580 of faith, portraying them as zealots who intrude on the rights of others.
00:58:02.420 For example, LGBT, uh, progress is met with parades.
00:58:06.220 Uh, those who promote biblical beliefs are accused of bigotry and, and shunned by some.
00:58:11.200 On the subject of abortion,
00:58:12.680 if you stand to protect innocent life, then you're a hater of women, not worthy of being
00:58:16.220 taken seriously.
00:58:17.900 Uh, it's a tough fight, so tough that many millions of folks won't even try.
00:58:23.020 And some of them are supposed to be religious leaders.
00:58:25.680 Centuries of Irish Catholic tradition compel me to attend weekly mass, even though it's
00:58:30.940 not always a rewarding experience.
00:58:33.860 Uh, that's a pretty brave thing, uh, to say as somebody who is faithful, but I think a lot
00:58:39.600 of Americans feel the same way.
00:58:41.120 I go and sometimes it's like, what are you talking about?
00:58:44.740 Why aren't we talking about real things, Bill?
00:58:47.980 Yeah.
00:58:48.580 I mean, I wrote a book called Killing Jesus.
00:58:50.480 And the reason that Jesus had thousands of people following him around was because he stood
00:58:54.760 up on a big rock and, uh, gave sermons that, that moved, moved them as, as human beings.
00:59:03.760 And the sermons were, were very explicit: here's what you do if you want to earn
00:59:09.400 the kingdom of heaven, this is what you've got to do.
00:59:11.760 And it wasn't like, don't do this, don't do that.
00:59:16.240 There were a few times when Jesus of Nazareth did that, but very few, it was, this is what
00:59:22.040 you have to do.
00:59:23.760 So now I'm going to church and I'm, I'm listening to some priest.
00:59:27.840 Tell me about the mustard seed.
00:59:29.160 I got it.
00:59:30.280 It fell on fallow ground and didn't grow up.
00:59:32.860 I got it.
00:59:33.380 I got it.
00:59:34.660 Tell me what I should do when my kid is addicted to a machine in his hand where he can access
00:59:41.500 evil in less than 10 seconds.
00:59:45.140 All right.
00:59:45.940 Let's, let's deal with something that, that, you know, promotes evil in, in the easiest
00:59:52.280 way in civilization.
00:59:55.900 How about we deal with that a little bit?
00:59:59.880 How would Jesus see that?
01:00:02.900 And you get nothing, zero, nothing, nothing even close where you could take it home and
01:00:10.420 discuss it with your kids, with other people.
And, and, you know, the average age in my church, of the people who go to mass every Sunday, is about 80.
You know, I mean, kids, you know, I drag my kids, but I got to drag them because they sit there and then, you know, the priests, they're droning on and on and on and on and on about nothing that relates to contemporary life.
01:00:36.180 And that was the column.
01:00:37.180 And I'm, you know, I'm saying, look, if you people in the churches and synagogues and all
01:00:43.120 of that, if you want people to become spiritual, you've got to help them.
01:00:50.520 And we're not getting the help back.
You know, I, I, uh, Bill, I'm rereading, um, uh, How to Win Friends and Influence People.
01:00:59.200 And I read that when I was a kid, cause my dad forced me to read it.
01:01:03.960 And so I kind of read it like, oh, I gotta read that stupid book.
01:01:07.520 Uh, and I, I know what it said and, you know, cause my dad did a lesson afterwards, et cetera,
01:01:12.240 et cetera.
01:01:12.460 But, um, I haven't read it, I think since I was maybe 20 and I just read it again and I'm
01:01:20.200 going back to read it again.
01:01:22.260 Uh, and, and really take it apart because it really has all of the answers that we're
01:01:29.220 looking for.
01:01:30.180 And nobody is, nobody's talking about it.
01:01:33.620 And one of the first chapters is you have to find out what people need and then help
01:01:41.060 them accomplish that.
01:01:42.800 And that's one of the things that our churches, I don't think are doing.
01:01:46.160 They are struggling with so many things.
01:01:49.400 They're under attack at school.
01:01:51.980 They're under attack in society.
01:01:53.960 They're under attack in politics.
01:01:55.940 They're under attack all the time.
01:01:58.460 And even Christmas is under attack.
01:02:01.680 So you're absolutely right.
01:02:03.460 And by the way, I tried to buy that book and Barnes and Noble refused to sell it to me.
01:02:08.640 I thought that was very unfair.
01:02:10.540 What?
01:02:11.020 It just, yeah.
They said, O'Reilly, we're not giving you How to Win Friends.
01:02:15.340 Well, it's impossible for you.
01:02:16.900 I think that's, I think that actually, I think that says that somewhere in the book, this
01:02:20.820 will work for everybody, but Bill.
01:02:22.680 But Bill, right?
01:02:23.740 And, and, and look, you've got, I'm sitting here and my head's blowing off.
01:02:29.080 You talk about Kevin Hart.
01:02:30.420 Okay.
01:02:30.920 I got no use for Kevin Hart.
01:02:32.880 I like Kevin.
01:02:33.400 He's all right.
01:02:34.260 He's funny.
01:02:34.760 Nobody wanted the job.
Well, you know, it's when you, when you're down to offering, you know, the job to KC and the Sunshine Band to host the Oscars.
01:02:47.200 Right.
01:02:47.560 You know, you got a problem.
01:02:48.920 So if they can't, let me ask you this.
01:02:50.160 If they came to you and said, Bill.
01:02:52.760 No, I wouldn't do it.
01:02:53.860 You wouldn't do it.
01:02:54.820 No, no.
01:02:55.660 You can't win.
01:02:57.520 No more.
01:02:57.880 Number one, I can't pronounce the names of the movies.
And I, and I'm absolutely not going to see The Shape of Water.
01:03:03.300 I know what the shape of water is.
01:03:06.220 I'm not, I'm not going.
01:03:09.360 That would be such a great, but it would be so great to see you on stage.
01:03:14.080 The Oscar, just saying that, just saying that.
01:03:16.960 Just like, what are you?
01:03:17.920 People are all frauds.
01:03:19.320 You're all phonies.
You know, I know you're liberal and you hate Trump.
01:03:23.780 Why?
01:03:24.100 Because you want to work.
01:03:25.000 I got it.
01:03:25.700 Everybody's got it.
01:03:26.840 We have to do our, we have to do our own live Oscars here on that Sunday night.
01:03:32.960 I'm awarding it to High Plains Drifter.
01:03:34.860 I actually watched that last night.
01:03:37.560 All right.
01:03:38.840 So back to Kevin Hart.
01:03:40.080 Back to Kevin Hart.
01:03:40.980 So, so you've got a cultural civil war in the country and one side doesn't fight.
01:03:47.620 That's the traditional spiritual religious side.
And they won't fight.
We're not, we're not doing it.
01:03:54.340 So, well, I think, can we hang on just a second, wait, wait, wait, let me, let me play.
01:03:59.160 Let me play what Kevin Hart said, because I think this is the way average people feel and
01:04:05.500 they're, they feel, I don't have to fight this because it's so crazy.
01:04:10.340 It just won't stand, but it is standing.
01:04:12.640 But listen to what he said.
01:04:14.180 I swear, man, our world is becoming beyond crazy.
01:04:19.400 And I'm not going to let the craziness frustrate me or anger me, especially when I've worked
01:04:26.180 hard to get to the mental space that I am at now.
01:04:30.300 My team calls me, oh my God, Kevin, the world is upset about tweets you did years ago.
01:04:35.580 Guys, I'm almost 40 years old.
01:04:39.620 If you don't believe that people change, grow, evolve as they get older, I don't know what
01:04:46.480 to tell you.
01:04:47.780 If you want to hold people in a position where they always have to justify or explain their
01:04:53.160 past, then do you.
01:04:56.160 I mean, he's right.
And so people, people don't necessarily fight back because, A, they don't see a way to win, and, B, they're like, you know what?
01:05:07.960 I don't even, I can't even relate to these people.
01:05:10.220 If you don't understand that something I said 10 years ago in a tweet shouldn't be held against
01:05:17.620 me today.
01:05:19.140 And as a comedian, right?
01:05:21.600 He's doing a riff.
01:05:22.560 But, but, but, but that's a cop out though, Beck.
01:05:26.260 I mean, you'd say, oh, oh, I don't, I don't want to really do it.
Look, then you're ceding control of the American culture to Me Too, uh, Time's Up, uh, the gay lobby, uh, all of this stuff.
01:05:42.900 So they're going to be the ones that tell you what you can and can't say.
01:05:48.240 Okay.
01:05:48.360 So let me go back to what you just wrote and what you just said.
01:05:51.560 And I thought this was really great.
01:05:53.080 You said, don't tell me what not to do.
01:05:57.240 Tell me what to do.
01:05:59.300 Right.
01:05:59.740 So in the case of like Kevin Hart and you see this and you're like, this is ridiculous.
01:06:03.460 And you know, it's going to happen to you or your kids.
01:06:05.820 So what do you do?
01:06:10.200 Well, I'm not going to spend a lot of currency sticking up for Kevin Hart.
01:06:14.000 As I said, I mean, I don't think this is a, a champion of anything.
01:06:18.440 Um, uh, they just needed a person of color to do the show.
01:06:22.880 And then he was selected, but I, you know, I'm not going to campaign on behalf of Kevin Hart.
01:06:27.880 But in my lifetime, I've been attacked every hour on the hour, distorted, uh, accused, whatever it may be with the sole intent of destroying my voice.
01:06:40.620 Because I wrote a book called Culture Warrior, which predicted this would all happen.
01:06:45.520 That if you're going to have no leadership on the traditional spiritual side, what, what leadership do we have back?
01:06:52.580 You're a voice.
01:06:53.460 I'm a voice.
01:06:54.620 There's a few more of us, but not many, not many.
01:06:59.040 We're outnumbered a hundred to one in the media.
01:07:02.420 And then you go into the individual churches around the country and you get the mustard seed.
01:07:09.520 Okay.
01:07:10.000 You get nothing about, look, we believe certain things.
01:07:15.340 And here is the way to promote your belief.
01:07:18.480 Here's what you need to do in your own life to make sure that we spiritual people are represented.
01:07:24.680 You get none of that.
01:07:26.280 So you're basically surrendering to the forces that say, you better not say anything critical of anybody we like, or we're going to destroy you.
01:07:37.880 And that's exactly what's happening.
01:07:41.000 Exactly what's happening.
01:07:43.420 Bill O'Reilly, always good to talk to you.
01:07:46.020 Is it really back to us?
01:07:46.900 It's a pick-me-up.
01:07:48.200 I don't know about everybody else in the audience, but I feel much better.
01:07:51.780 All right.
01:07:52.580 Killing the SS, great Christmas gift.
01:07:55.060 Good to talk to you, Bill.
01:07:56.120 And BillO'Reilly.com will not put you to sleep.
01:07:59.560 That's for sure.
01:08:00.580 Have a good weekend.
01:08:01.880 Have a good weekend.
01:08:02.520 You too.
01:08:02.800 And we'll see you in church, Beck.
01:08:04.440 You got it.
01:08:05.120 Bye-bye.
01:08:05.720 Bill O'Reilly from BillO'Reilly.com.
01:08:09.400 Soon to be a part of Blaze Media, I'm telling you, because Vito makes offers that Bill just can't refuse.
01:08:18.460 All right.
01:08:19.360 Let me tell you about our response to this half hour.
01:08:22.100 It's LifeLock.
01:08:23.600 I remember when the, I think it was LifeLock.
01:08:26.700 The head of LifeLock came out and said, I'm so crazy and so confident that I'm going to put my social security number up on a bus.
01:08:34.360 Remember that?
01:08:34.940 That was LifeLock, yeah.
01:08:35.880 Yeah, it was.
01:08:36.460 Okay.
01:08:36.780 So I remember that.
01:08:37.980 And I remember thinking at that time, and maybe it was just me, Pat.
01:08:41.540 Did you feel this way?
01:08:42.300 It was like, who's going to take your social security number?
01:08:45.180 What are you going to do with a social security number?
01:08:46.720 Right?
01:08:47.260 You're like, okay, whatever, dude.
01:08:49.200 Come a long way since then.
01:08:50.340 Oh, my gosh.
01:08:51.500 Jeez.
01:08:51.860 Oh, my gosh.
01:08:53.400 Now is the time.
01:08:55.740 That was way ahead of the curve.
01:08:57.980 Now is the time.
01:08:58.560 You don't have LifeLock?
01:09:00.220 They are.
01:09:01.920 You are so close.
01:09:03.080 If you haven't already been breached and your identity hasn't already been thieved, it's only a matter of time.
What about the 500 million stolen from Marriott Hotels?
01:09:12.380 It was Marriott?
01:09:13.160 Yeah.
01:09:13.540 Marriott Starwood.
01:09:14.820 And then another 100 million from another company last night.
01:09:17.680 I just heard about this before I came in here.
01:09:19.720 Another 100 million from someone else.
01:09:22.140 Yeah.
01:09:22.500 I mean, it's just and they take your information and they can just destroy you.
01:09:27.360 Just destroy your life and your financial standing in the world.
01:09:31.740 It's you got to have LifeLock.
01:09:34.380 You have to have LifeLock.
01:09:36.080 LifeLock now with Norton Security.
01:09:38.260 They use the technology to detect and alert you to the things that you would never catch.
01:09:42.960 It's happened with me.
01:09:43.940 It's happened with Pat several times.
01:09:45.780 And Norton protects your devices against cyber threats like malware.
01:09:50.540 Nobody can prevent all identity theft or, you know, cyber crime or monitor all transactions at all businesses.
01:09:54.960 But this is the best you have to have.
01:09:58.380 LifeLock.com.
01:09:59.580 Use the promo code Beck.
01:10:01.420 LifeLock.com.
01:10:02.880 Promo code Beck.
01:10:04.140 Pat Gray has joined me on the broadcast, on the program, on the show.
01:10:13.740 And we're glad that you're here.
01:10:16.900 Matt Kibbe is coming up in just a second.
01:10:19.100 We're going to talk to him from the Libertarian POV.
01:10:22.840 Also, I want to I want to spend a couple more minutes on this Kevin Hart thing.
01:10:28.360 You know, Bill said, I'm not going to stick up for Kevin Hart.
01:10:30.420 I'm going to stick up for anybody who is being squelched.
01:10:32.880 You know, I'll stand up for anybody.
01:10:35.220 I don't I don't have to agree with you.
01:10:37.040 I'll stand up for Don Lemon.
01:10:38.880 I'll stand up for Alex Jones.
01:10:40.600 I'll stand up for anybody who is being squelched because of what they have said, what they believe, what they have said, especially in his case, 10 years ago.
01:10:50.660 Yeah.
01:10:50.820 And he's apologized for it multiple times.
01:10:52.800 Oh, stop it.
01:10:53.420 Which is why he said, you know, I'm not going to apologize for it again.
01:10:55.900 I've already done that.
01:10:57.040 Then he, of course, did apologize for it again because that's just where we are.
01:11:01.880 I mean, he's a comedian.
01:11:05.720 He, of course, he has said, you know, edgy things about probably everything over the past 20 years.
01:11:13.940 So who's going to take over now?
01:11:15.560 Who is going to take over for Hollywood?
01:11:18.020 Who is get this?
01:11:19.120 Who is pure enough to be able to host the Oscars?
01:11:24.460 Who wants to subject themselves to this?
01:11:26.260 I wouldn't do it.
01:11:26.900 Would you?
01:11:27.660 I would not do that.
01:11:29.200 No, I mean, they're not going to want me, but if they did, I wouldn't do it.
01:11:33.380 I don't know.
01:11:34.100 Maybe I would just because it's a live broadcast.
01:11:37.980 Oh, you put whatever you want in that teleprompter.
01:11:40.840 It's a live broadcast.
01:11:42.220 I would not miss that opportunity.
01:11:44.540 So please, I throw my hat in the ring.
01:11:47.220 I would love to host the Oscars.
01:11:49.940 Anyway, the thing on this is, who are you going to get?
01:11:55.500 Billy Crystal?
01:11:56.260 Yeah, maybe.
01:11:57.360 Because Billy Crystal, at this point, you know Billy Crystal is, he's a comedian too.
01:12:03.220 He said something.
01:12:04.400 There's something that he has said.
01:12:05.960 You'll find something offensive that he has said in the past.
01:12:08.120 So, even if you're a Billy Crystal fan, you would think, you know what?
01:12:13.680 Oh, Kevin Hart's not good enough, not pure enough.
01:12:16.200 Oh, this guy is now, huh?
01:12:18.160 Okay.
01:12:19.180 You could even be a Billy Crystal fan, but you will go and find something just to prove the
01:12:25.160 point to Hollywood.
01:12:25.980 I mean, nobody is safe from this.
01:12:30.100 Absolutely not.
The LGBTQIA2 QUILTBAG people are just so powerful at this point that if you don't just fall into lockstep with everything they've ever said, done, felt, or thought, you're done.
01:12:48.300 Well, no, you can't.
01:12:49.060 You're done.
01:12:49.460 No, no, no.
01:12:49.800 You can't fall into everything they've said or thought or felt.
01:12:53.020 You have to fall into what they're feeling, saying, or thinking today.
01:12:56.360 Right now.
01:12:56.980 Absolutely.
01:12:57.720 Yesterday is not good enough.
01:12:58.740 Yeah.
01:12:59.420 Yesterday is just not good enough.
01:13:02.080 Yeah.
01:13:02.700 And there can be no disagreement.
01:13:04.300 There can be no disagreement on any of it anymore.
01:13:07.200 You have to think exactly what they think.
01:13:11.000 It's amazing.
01:13:12.060 This is the premise behind the merger between CRTV and The Blaze.
01:13:18.220 There are hosts that I strongly disagree with and strongly disagree with me.
01:13:23.000 And if we that believe in the big principles, the the the Constitution, that America is not
01:13:33.960 a bad place, that America should not be destroyed, that the Bill of Rights is is the greatest document
01:13:39.800 ever coupled with the with the Declaration of Independence.
01:13:43.000 That's our unum.
01:13:44.060 That's what that's what will solve these problems.
01:13:46.040 If those people who believe that but just disagree on politics or policies, if we can't
01:13:52.800 come together, there's no chance.
01:13:55.260 There's no chance.
01:13:55.940 We must put our petty differences aside and look at the big principles and come together
01:14:04.040 and stand together and be able to have a reasonable discussion with people we disagree with.
01:14:09.980 Mr. Matt Kibbe is a fellow libertarian and a good friend.
01:14:20.140 He has he was instrumental in FreedomWorks.
01:14:23.580 He really started started that and was instrumental in so many of the things that the Tea Party did.
01:14:30.860 He is really, I think, responsible for much of the Tea Party.
01:14:35.060 And most people don't even know that.
01:14:38.080 They may not even know who Matt Kibbe is.
01:14:39.980 He is a brilliant thinker way ahead of the curve.
01:14:43.860 He left FreedomWorks a long time ago, went out on his own and has has really focused on youth
01:14:50.720 and is trying to teach what socialism really is, because it means something different to
01:14:57.240 people who are under 30 and they don't understand it.
01:15:01.360 And he is also very, very wary of the tribal politics and tribal identity that we are that we're currently working
01:15:12.100 on. And I'm thrilled that he is now part of the Blaze TV family or we are a part of his family, however you want to look at it.
01:15:20.280 Blaze TV merged with CRTV.
01:15:22.580 And we hope this is just the beginning of of something entirely new where people who have different opinions and can disagree strongly
01:15:31.880 with each other can be still on the same platform and everyone can have a reasonable debate.
01:15:37.020 As long as you agree that America shouldn't be destroyed and the Bill of Rights is is just an amazing thing and we should all get together
01:15:46.120 and protect and live, then I think your voice should be heard.
01:15:50.120 Matt Kibbe joins us now.
01:15:51.160 Hello, Matt.
01:15:52.500 Hey, Glenn.
01:15:53.220 Good to talk to you.
01:15:53.920 Good to talk to you.
01:15:55.480 So, Matt, tell me, bring me up to speed on what you're learning as you are working with millennials now and and outside of the political realm.
01:16:08.420 You know, years ago, I was I was reading the polling results from something that the Reason Foundation put out where, you know,
01:16:14.760 they were showing this this very concerning trend with young people supporting socialism more than capitalism.
01:16:20.740 But when you dug into the questions a little bit deeper, they would ask young people the follow up question.
01:16:26.360 Well, well, should government own the means of production?
01:16:28.980 And the answer was, hell no, that's a stupid idea.
01:16:31.780 I realized that I realized that there's a language problem.
01:16:35.800 Like we're using the same word, but it means different things to different people.
And I think I think a lot of young people that are drawn to so-called democratic socialism view it very much as a bottom-up, locavore, let's-all-work-together-in-voluntary-cooperation way to solve problems.
01:16:53.960 And that, of course, that, of course, is the exact opposite of what of what you and I understand as socialism and certainly the dire history of socialism in practice.
01:17:06.100 So what is what what is happening to the the movement?
01:17:11.960 Are you are you seeing are you seeing millennials start to wake up?
01:17:16.940 Because I feel like they are.
01:17:20.360 Oh, I think I think they're the most gettable generation when it comes to the values of of voluntary cooperation.
01:17:29.380 And, you know, you know, you're right to pursue your own dreams as long as you don't hurt people or take their stuff.
01:17:35.280 That's that's who they are.
01:17:36.520 They live in this radically libertarian world where they they curate everything through technology and social media.
01:17:42.860 But we're we're probably not connecting with them on language.
01:17:47.760 And we're also never going to connect with them if our if our offer is here's here's these two title political parties.
01:17:54.820 And you have to choose one of those.
01:17:58.080 It's it's it's an alien concept to them that they would actually have have only two choices on anything.
01:18:03.120 So I think we have to tell stories.
01:18:05.480 I think, you know, part of the stories, some of the stories are the devastating history of socialism in practice.
01:18:12.000 They're they're gut wrenching, horrible, depressing things.
01:18:15.780 But but also, you know, the cool stories about about what liberty creates, like like can you can you actually brew a fantastic double hop, triple IPA?
01:18:28.260 You can't in Venezuela, but in America, you can you can do that because we allow for for for choice and creation and serving customers and doing what you want and bringing new products to market.
01:18:40.840 But those kinds of stories, I think, without sort of beating people over the head with economics, I think that's the future of how we connect.
01:18:48.000 So, Matt, have you seen the the libertarian movement in Brazil that has brought a lot of American libertarians down?
01:18:56.760 And they they've talked to them all and they're like, wow, OK, we don't want to do it that way.
01:19:02.900 And their their point is libertarian, the libertarian in America, that that movement is is basically run by old guys in their view, old guys who are in Congress and are trying to do things.
01:19:17.900 And they're like, this has got to be a youth thing.
01:19:20.800 It's got to be outside.
01:19:21.780 And they have made a huge impact.
01:19:25.340 And it's just a group of people who took their time and their talent and started explaining these things online.
01:19:30.980 And they are they are moving the needle down in Brazil.
01:19:33.880 Do you are you aware of them?
01:19:35.900 Oh, yeah.
01:19:36.400 Yeah.
01:19:36.680 There it's a it's a huge movement down there.
01:19:39.600 You can actually find organizations like that all over the world.
01:19:43.080 Now, I just got back from the Republic of Georgia speaking to about a thousand young libertarian kids.
01:19:50.260 I mean, they're 20 years old and they're and they're looking for for alternatives.
01:19:53.740 But it is that that the ethos in Brazil and other places is very much based on youth.
01:20:01.360 It's it's based on libertarian values.
01:20:03.260 And it's a rejection of of the political status quo.
01:20:07.540 They don't they don't find it appealing anywhere across the board.
01:20:11.000 And yes, American libertarians could learn a lot.
01:20:14.940 American conservatives could learn a lot from from the youth liberty across the world.
01:20:19.420 I agree.
01:20:20.760 You just you were just over in Georgia.
01:20:23.820 Tell me what you tell me what you're finding over in in Europe.
01:20:27.180 I think things are getting frightening and you're not hearing about anybody who is standing up going, no, neither of those is the answer.
Well, you know, this this whole idea that you have to choose between the hardcore Marxist violence of Antifa or some some sort of flavor of white nationalism and fascism is this false choice.
01:20:50.060 I think that that's trying to be imposed all over the world.
01:20:53.100 And the counterrevolution is is again with young people saying, you know what, neither of those deadly isms, you know, Marxism, fascism, socialism, white nationalism.
01:21:04.240 They're all kind of the same thing.
01:21:05.460 They're all top down.
01:21:06.400 They're all looking to make us all conform to to one set of of of goals that are imposed by somebody else.
01:21:14.620 And people are rejecting that.
01:21:16.660 So I think that I tend to be an optimist about what's going on, not just in Europe, but in the U.S., because we're in the middle of this paradigm shift.
01:21:25.740 And it used to be that top down institutions told us what to think and what to do.
01:21:30.460 And now we're discovering through technology that that's not really the case anymore.
01:21:34.520 We're discovering that that all politicians lie, that that government institutions don't do what they said they were going to do.
01:21:42.080 And we're discovering that we're a little bit different.
01:21:44.500 So we're sort of sorting that out.
01:21:46.900 But the solution is not to choose between fascism and socialism.
The solution is to choose liberty and self-reliance and voluntary cooperation and all these beautiful values that you were talking about earlier.
01:22:00.360 The Bill of Rights and the American experiment was really built on this stuff.
01:22:06.700 So, Matt, you know, I talk to people in Silicon Valley.
01:22:11.140 I follow it very closely.
01:22:12.460 I have been impressed by the number of libertarians that are out there.
01:22:18.020 However, the you know, I'm torn when people say, hey, we've got to have an ASI, you know, Manhattan project because we don't want Russians to get it or China to get it.
01:22:29.060 Well, I don't really want the United States government to either have it.
01:22:32.440 I don't want Google to have.
01:22:34.140 I don't I don't really want anybody to have it, quite frankly.
01:22:37.160 But we can't put that genie back in the bottle.
01:22:39.600 But Google came out a few weeks ago and they said they're not going to do business with the United States government, even though they will.
01:22:47.920 They're not going to do business with the Pentagon, et cetera, et cetera.
01:22:51.100 But they are doing business with China, which is terrifying.
01:22:56.940 And then Microsoft came out and said, hey, AI, we've got the Pentagon's back.
01:23:01.660 We'll share everything we have with the Pentagon.
01:23:04.580 I where are the libertarians in Silicon Valley when it comes to China and teaching AI how to kill and control?
01:23:19.920 Yeah, I think I think it's a problem.
01:23:22.360 And I don't think that anyone in Silicon Valley is going to step up and protect us from from the abuse of all these technological innovations.
01:23:31.140 You know, the entire history of Silicon Valley is really rooted in DARPA and and government contracts in the first place.
01:23:38.240 So, you know, they're going to they're going to pursue their profit margins.
01:23:42.900 You know, Amazon is doing the same thing.
01:23:44.940 But again, the counterrevolution in technology, these are all these are all very top down controlled by by a few actors, sorts of technologies.
01:23:55.260 And the next step has to be blockchain technologies that aren't controlled by, you know, corporate interests, government interests, anybody's interests.
01:24:05.680 It has to be more bottom up.
01:24:07.100 And and I do believe that that there are technological solutions.
01:24:12.000 And I tend to be quite romantic about what what what crypto and blockchain is going to bring to us in the next five years.
01:24:19.040 Matt, you've been with CRTV now that has become Blaze TV.
01:24:25.540 You are a staunch libertarian.
01:24:28.220 There are things that we agree on.
01:24:29.580 Most I think we agree on many things we don't agree on.
But you are in a company that has anyone from you to Gavin McInnes to me to Mark Levin to Eric Bolling, all of us.
01:24:46.740 We have so much that we disagree on.
01:24:50.300 How do you what do you think the why were you willing to take the heat to be the libertarian on CRTV for for the last few years?
01:25:05.940 Well, you know, libertarians don't neatly fit into any box.
So it wasn't like I could go to big libertarian TV and speak there.
01:25:16.580 But I also think I mean that the whole concept behind what we're doing is to find that common ground amongst people and ideologies and tribes and and communities that disagree with each other on some pretty important things.
01:25:33.340 And and I think I think that's important.
01:25:35.700 And I since I left FreedomWorks, I've spent a lot of time not just talking to conservatives through CRTV, but but talking to libertarians, including big L libertarians at the party and and also talking to progressives, because I think I think there are some common values in there that that do hold us together.
01:25:54.920 And by the way, those are the values that are that are going to save America from all of this tribal warfare that's tearing us apart.
01:26:01.740 Those are the values that we all came here for when we were all immigrants.
01:26:06.100 We came here for those values.
01:26:08.760 I mean, the people on the border who are crawling across the border now, they they they are, you know, whether they say it or not, they are coming for those values unless they have ill intent.
01:26:18.800 They want the opportunity to to explore and to break out of their condition.
01:26:24.300 They want a chance to live in a country that has laws and everybody is treated fairly and and we're not talking about that.
01:26:33.860 We're talking about immigration and this this thing on the border as if that doesn't matter, as if the laws of the land and what they're coming here for don't matter.
01:26:46.660 We you know, we saw I don't know if you saw the story of what was the guy's name?
Pat, he was rich, friend of Clinton, friend of Trump, was taking the Lolita plane.
01:26:57.500 What's his name?
01:26:59.840 You know who I'm talking about, Matt?
01:27:01.820 I don't remember his name.
01:27:03.020 The he was taking the Lolita plane.
01:27:05.380 What's his name?
01:27:06.220 Yeah, I know.
01:27:07.260 You know who I'm talking about.
01:27:08.280 Rich, actually.
01:27:09.680 What's his name?
01:27:11.460 I thought it was rich, actually.
01:27:13.100 No, no.
01:27:14.240 No, anyway, but he's he is rich and he kind of got off.
01:27:19.340 Well, he didn't kind of get off.
01:27:20.540 He got off after 80 women were going to.
01:27:23.920 Yeah, Epstein, 80 women were going to testify against him.
01:27:28.160 And he brokered a deal because justice isn't blind in America.
01:27:32.600 It's just not.
01:27:33.840 And if we lose that, we lose everything we were.
01:27:38.020 And by the way, like that, that that rage against the machine that that there isn't equal treatment under the law is something that I think animates a lot of a lot of young people that are attracted to democratic socialism.
01:27:52.460 You know, we all we've all picked on Alexandria Ocasio-Cortez.
01:27:56.560 But if you go back and look at her original viral campaign video, you have to get like 90 percent through it before you really disagree with anything she's saying, because she's saying that the system is rigged.
01:28:09.780 She's saying that there's this crony collusion between members of Congress and Wall Street.
01:28:14.160 And that could have been a tea party ad.
01:28:16.220 And then at the end, they sort of throw on.
01:28:17.740 That's why we need Medicare for all.
01:28:19.740 But the values, the values there are are very much, you know, it could be Ron Paul.
01:28:26.000 It could be Bernie Sanders.
01:28:26.980 It could be the tea party.
01:28:28.380 It could even be some of the themes that Donald Trump touched on when he was just raging against the swamp.
01:28:36.080 Matt Kibbe from Free the People.
01:28:39.060 And you can also watch him on BlazeTV.com.
01:28:42.500 BlazeTV.com.
01:28:43.260 You can find him there and freethepeople.org.
01:28:46.400 I'd love to have you back on and talk a little bit about freethepeople.org.
01:28:49.140 So I know you're reimagining what the Tea Party 2.0 might look like.
01:28:54.080 And I'd love to have that discussion with you.
01:28:56.140 So maybe next time we have you on, Matt.
01:28:59.040 Let's do it.
01:28:59.780 Thank you.
01:29:00.140 Thanks for having me.
01:29:00.760 You bet.
01:29:01.080 Matt Kibbe.
01:29:08.500 All right.
01:29:09.240 I want to talk to you about SimpliSafe.
01:29:11.100 SimpliSafe is a system that Pat actually put in his house because he got just a little bit of a raise in the cost.
01:29:23.600 It was only double.
01:29:25.140 But that was it.
01:29:25.940 But that's all.
01:29:26.760 I mean, how bad is it when it goes from $30 to $60?
01:29:30.400 Is that a big deal?
01:29:32.020 $30 to $60.
01:29:33.220 How did they think they would keep you as a customer?
01:29:36.900 I don't know.
01:29:36.960 I don't know.
01:29:37.340 But they didn't, fortunately.
01:29:38.980 Unbelievable.
01:29:39.760 And if it wasn't for SimpliSafe, you'd probably still have them.
01:29:41.900 Probably.
01:29:42.960 Because everything is...
01:29:44.400 All the rest of them are the same.
01:29:45.740 You know?
01:29:46.200 They would have sold Pat.
01:29:47.580 If they would have done that, they would have said, well, we have to use our keypads and we have to put our stuff in it.
01:29:51.900 And then you're back into the same game.
01:29:55.180 SimpliSafe, you own the system.
01:29:56.840 And there's no wiring.
01:29:58.020 So if you're going to move, you can easily move and take the stuff with you.
01:30:02.480 If you're in an apartment, you can do that.
01:30:04.640 The most important thing is you're in control of this and it is not expensive.
01:30:10.740 I mean, you know, it's still a high-level security system, but it is not what you've been paying monthly, month after month after month after month.
01:30:18.040 You were overpaying so much.
01:30:19.540 And if you want the 24-7 monitoring, it only costs you $14.99 a month.
01:30:25.900 So Pat went from $60 to $14.99.
01:30:30.500 And he doesn't have a contract.
01:30:32.480 Sounds like a pretty good deal.
01:30:33.620 Pretty good deal.
01:30:35.300 SimpliSafeBeck.com.
01:30:36.340 Go to SimpliSafeBeck.com.
01:30:38.980 Order your system.
01:30:40.320 Get this holiday deal that they have.
01:30:42.240 They're offering right now on the website and get it for the holidays because crime really goes up at this time of the year.
01:30:49.640 SimpliSafeBeck.com.
01:30:50.840 So the weirdest thing has happened to my wife and me.
01:30:56.740 We sold our house last Friday.
01:30:59.720 Now, I had to go and close on the house on Thursday, on Wednesday for the title company because I was going on tour.
01:31:06.320 And this buyer of the house was like, you've got to close right away.
01:31:11.520 I want the house.
01:31:12.660 I'm going to pay cash for it.
01:31:14.140 And we're like, wow, really?
01:31:16.000 And he's like, oh, yeah, yeah, yeah.
01:31:17.620 So he's checked out.
01:31:19.760 He has the money.
01:31:20.640 Blah, blah, blah, blah, blah.
01:31:21.440 He's in a position to have the money.
01:31:23.380 And so we close.
01:31:25.340 And on Friday, he closes. All the signatures,
01:31:27.620 everything happens.
01:31:29.120 But then the money's not transferred.
01:31:32.060 And so the title company is like, where's the money?
01:31:36.080 And he's like, oh, my God.
01:31:37.040 It was Friday.
01:31:37.740 Oh, my gosh.
01:31:38.320 Did that not transfer?
01:31:39.680 Yeah.
01:31:40.220 I am calling that bank.
01:31:42.020 Well, it's five o'clock now.
01:31:43.920 Oh, crap.
01:31:45.740 All right.
01:31:46.080 Well, Monday.
01:31:46.780 First thing, it'll be there Monday.
01:31:48.140 It must have got held up.
01:31:49.840 Then Monday, no money transfer.
01:31:51.940 And then and then he called and or we called him and he's like, I'm in a meeting right
01:31:59.560 now.
01:31:59.820 I'll call you right back.
01:32:01.300 And then like three hours later, we have to call him like, hey, dude, where's the money?
01:32:05.640 Oh, my gosh.
01:32:06.440 Is it still not there?
01:32:08.060 OK, well, I'm calling him right now.
01:32:09.860 So he calls. Nothing.
01:32:11.340 OK, then five o'clock comes.
01:32:12.920 It still hasn't.
01:32:13.960 You know what?
01:32:14.700 I'm telling you right now.
01:32:15.740 I've had it.
01:32:16.660 I am going to the bank the first thing tomorrow morning myself.
01:32:19.960 The next day, he blames the money not coming on the bank, which didn't do any wiring because
01:32:27.840 it was closed for George H.W.
01:32:30.660 Bush's funeral.
01:32:32.560 And I'm like, wow, I don't know what country that is, but the banks were open here.
01:32:40.540 And they closed only in Houston? Only in Houston?
01:32:43.620 What happened?
01:32:44.100 Yes.
01:32:44.380 OK, so this goes on.
01:32:46.540 So finally, on Wednesday, we said, dude.
01:32:50.740 By 5 p.m.
01:32:51.940 Tomorrow, you're officially in default and it's over.
01:32:56.000 He was still at it yesterday when it was all over.
01:33:00.320 Actually, it came over Thursday.
01:33:01.960 I think it was 9 a.m.
01:33:03.540 And he was still writing all day yesterday going, you know what?
01:33:07.180 I just, I can't tell you.
01:33:09.920 I've got my lawyers talking to the bank.
01:33:11.440 We asked, well, what's the federal tracking number?
01:33:14.340 And he's like, they didn't give me one.
01:33:16.560 Wait, you gave that amount of money and you didn't ask for a receipt?
01:33:21.720 What are you insane?
01:33:23.380 So we were like, well, is this guy nuts?
01:33:27.140 Or is he just a scam artist?
01:33:31.480 Is he so out of touch with other people's lives?
01:33:35.020 What the hell was that all about?
01:33:37.240 It was just bizarre.
01:33:40.400 A real estate agent said never seen anything like this happen.
01:33:43.800 They're like, we've seen a lot of things.
01:33:46.180 We've never seen that.
01:33:47.780 We don't know, because nobody does this.
01:33:49.700 What is the deal?
01:33:52.260 And he's still doing it today.
01:33:53.780 He's like, no, I really want the money.
01:33:55.740 It's coming.
01:33:56.760 I can't get it transferred.
01:33:58.460 I'm like, where are you calling from?
01:34:00.540 Venezuela?
01:34:01.460 What is this?
01:34:05.420 It's the Glenn Beck program.
01:34:06.740 Well, I don't know if you've seen the latest from the left on Christmas, but apparently
01:34:12.920 Jesus had two dads.
01:34:15.020 And I just like to point out.
01:34:17.460 Yes.
01:34:17.900 Yes, he did.
01:34:19.220 Yes.
01:34:19.540 I'm glad you finally admit that.
01:34:21.420 Yes.
01:34:21.780 He had Joseph and God.
01:34:24.040 Yeah.
01:34:24.680 It's amazing.
01:34:25.860 Thank you.
01:34:26.580 I appreciate that.
01:34:27.640 I'm going to make T-shirts.
01:34:28.840 You finally get it.
01:34:30.520 Jesus had two dads.
01:34:32.240 Just not the two you think they are.
01:34:34.800 Joseph and I don't know, Steve.
01:34:38.760 But yes, he did have two dads.
01:34:40.800 But now there's, you know, this idea that Jesus had two moms, Mary and Mary, or Mary and another,
01:34:50.800 I don't know, Esther or whoever.
01:34:53.800 And, you know, it doesn't, no, uh-uh.
01:34:56.380 No.
01:34:57.200 Just historically, I'm pretty sure that's not the truth.
01:35:00.080 Yeah.
01:35:00.260 Even without the God thing and the star and the wise men, no, I'm pretty sure that didn't
01:35:05.160 happen.
01:35:06.320 Not a lot of that going on back then.
01:35:08.420 Well, there might have been a lot of that going on, you know, but not openly and not
01:35:14.580 with everybody's blessing.
01:35:15.600 Right.
01:35:17.160 Just want to point that out.
01:35:18.460 Just want to point that out.
01:35:19.040 But now there's a, uh, there's a professor at Minnesota State University, associate professor,
01:35:24.920 who said that there is no definition of consent that covers God impregnating the Virgin
01:35:32.140 Mary.
01:35:33.640 Wait, wait.
01:35:35.080 Let me just, wait, I.
01:35:36.720 This is a Me Too movement for the Virgin Mary now.
01:35:40.080 The Virgin Mary.
01:35:40.740 Yeah.
01:35:41.020 Mary did not give consent to God.
01:35:45.020 When specifically in the Bible, it says she actually did.
01:35:49.400 Uh, but this Minnesota professor suggested in a series of tweets that the Virgin Mary
01:35:53.660 did not consent to the conception of Jesus.
01:35:56.980 Where?
01:35:57.200 Where?
01:35:57.460 Show me the consent.
01:35:58.380 Show me the consent in the Bible.
01:35:59.580 Uh, well, I have to go to the book of, uh, Matthew and Luke and we'll point that out
01:36:05.620 in a second.
01:36:06.180 Oh, yeah.
01:36:06.760 You're hesitating now.
01:36:07.840 You're hesitating.
01:36:08.560 Um, but he suggested that God may have acted in a predatory manner.
01:36:13.540 I mean, this is just, this is just flat out blasphemy.
01:36:19.160 First of all, they don't believe in God.
01:36:21.860 True.
01:36:22.300 They don't believe.
01:36:23.100 There's no way this guy believes in God because you're right.
01:36:26.500 You would not, you would not say that about God.
01:36:29.060 If you believed it, God is not a rapist.
01:36:31.420 Right.
01:36:32.080 Exactly.
01:36:32.740 God couldn't be God if he acted in a predatory manner.
01:36:35.500 Correct.
01:36:36.260 Correct.
01:36:36.600 Um, but he's a, uh, psychology professor and a sex therapist and he critiqued the story
01:36:43.140 of the Virgin Mary in his tweets, suggesting that the Virgin Mary did not consent.
01:36:47.520 The virgin birth story is about an all knowing, all powerful deity impregnating a human teen.
01:36:53.980 There is no definition of consent that would include that scenario.
01:36:58.920 Happy holidays.
01:37:00.160 Okay.
01:37:00.680 First of all, thank you.
01:37:01.860 Yeah.
01:37:02.100 Thank you.
01:37:02.540 No, I'm going to tell that warm story around the fireplace with my kids in a couple of weeks.
01:37:06.760 Right after I do the Grinch story.
01:37:08.360 Yeah.
01:37:08.740 I'm not telling this one.
01:37:09.460 By the way, kids, first, we're going to read Luke, but I want you to remember God's a rapist.
01:37:14.340 Uh, doesn't really work.
01:37:16.200 Jesus was a product of a rape.
01:37:18.180 So he was reminded that, you know, consent is given in scripture.
01:37:22.160 Um, and then he comes out with this tweet that the power difference, deity versus
01:37:29.500 mortal, and the potential for violence for saying no negates the yes.
01:37:36.180 We could have written this literally as a parody and you wouldn't believe it.
01:37:41.400 And then it happens in real life.
01:37:43.300 You don't even have to parody it.
01:37:44.500 You don't have to.
01:37:45.340 That's why there is no comedy.
01:37:46.740 You don't even have a chance to do parody.
01:37:47.640 This is why comedy is dead.
01:37:49.420 You can't even make that up.
01:37:51.640 You can't.
01:37:52.380 Five years ago, we could have done that as a bit.
01:37:55.740 Yeah.
01:37:56.500 The power dynamic between God and Mary.
01:38:01.320 She can't consent because of the power dynamic.
01:38:04.340 Oh man, it is unbelievable.
01:38:09.420 Oh my gosh.
01:38:10.320 Well, the reason why I want you to look up this consent, please, is because: was it signed
01:38:15.500 by her?
01:38:16.240 Do you have it on video?
01:38:17.960 I do not have it on video.
01:38:19.620 What did she say? What was he proposing?
01:38:21.900 What did he say?
01:38:23.480 And was there at any time that she, that she said, you know what?
01:38:28.740 Uh, it was just a crazy night.
01:38:31.400 Uh, you know, I was just so impressed.
01:38:34.340 With the power dynamic.
01:38:35.700 I mean, he was God.
01:38:36.580 I was an impressionable teenager.
01:38:38.700 Uh, no matter what I said, it doesn't matter.
01:38:43.760 He made me say yes.
01:38:46.220 Um, okay.
01:38:47.560 So, let's see.
01:38:49.280 Go ahead.
01:38:49.560 Go ahead.
01:38:49.900 Go ahead.
01:38:50.660 I want to find the contract.
01:38:52.340 I'm looking.
01:38:53.660 Uh, does it need to be in triplicate?
01:38:56.380 Uh, yeah.
01:38:56.780 Well, yeah.
01:38:57.380 I prefer it in video.
01:38:59.100 Okay.
01:38:59.580 So, in the, I'm looking at the King James version.
01:39:03.140 So, you know, if you want to, if you want to take issue with that.
01:39:06.660 White power.
01:39:07.320 Go ahead.
01:39:08.160 Um, uh, let's see.
01:39:10.980 Yeah.
01:39:11.600 Yeah.
01:39:11.920 Okay.
01:39:12.860 You see your signature anywhere?
01:39:14.200 See any signatures?
01:39:15.060 There's no signature.
01:39:15.800 There's no signature.
01:39:16.420 There's no signature.
01:39:17.160 No.
01:39:17.640 Okay.
01:39:17.920 It's, it's not signed in duplicate or triplicate.
01:39:21.780 This is just, man, that changes everything.
01:39:24.480 That changes the story of Christmas entirely.
01:39:26.900 Yeah, it really does.
01:39:27.840 Kind of takes the ho, ho, ho out of the holidays.
01:39:30.280 It really does.
01:39:31.680 There is no signature page here though.
01:39:33.780 Wow.
01:39:34.060 And there is no video and there's no recording of it.
01:39:37.320 Oh, well, other than the written.
01:39:39.080 And by the way, it was the Holy Spirit.
01:39:40.300 It wasn't even God.
01:39:41.000 He had some.
01:39:41.700 All right.
01:39:42.100 It's all these spirits.
01:39:42.780 Like, you know, some, I don't even know who he is.
01:39:44.840 I mean, who's that guy?
01:39:45.980 I mean, like the attorney coming down, trying to broker it. He's like Cohen.
01:39:49.980 Is that who he is?
01:39:50.840 Michael Cohen?
01:39:51.900 I don't know.
01:39:52.780 I don't know.
01:39:53.360 We don't know.
01:39:53.900 There's no signature.
01:39:54.420 Had he even been to law school?
01:39:55.580 Had he been to law school?
01:39:56.360 Was it all legally handled?
01:39:57.700 I don't think so.
01:39:58.460 I don't think so.
01:39:59.280 I don't think so.
01:40:01.160 Mary was coerced.
01:40:03.020 Unbelievable.
01:40:03.900 It's absolutely unbelievable.
01:40:05.800 You really can't make it up.
01:40:08.880 No, I know.
01:40:10.720 I know.
01:40:11.940 It doesn't even have time to be an SNL skit.
01:40:14.840 And what, how do you build on that to make it crazier?
01:40:18.700 This is the problem that comedians are facing.
01:40:21.620 You say, the idea is you take something and then you take it to the nth degree.
01:40:27.420 That used to be easy.
01:40:29.580 And then it becomes funny.
01:40:31.260 Yeah.
01:40:31.560 But when the nth degree is the reality.
01:40:33.680 Right.
01:40:34.000 Well, like, like, like 10 years ago, I could have said, yeah.
01:40:37.220 And you know what's next?
01:40:38.160 Donald Trump's going to be president.
01:40:39.520 And it would have been funny 10 years ago.
01:40:42.820 But when that's your reality, you know, you got no place to go.
01:40:47.460 You have no place to go.
01:40:49.360 When they're saying Mary didn't consent and she was underage.
01:40:53.480 You know, well, I think the age disparity between Mary and God is pretty big.
01:41:00.100 Yeah, I think it's wide, wide disparity.
01:41:02.340 It's pretty wide.
01:41:03.160 You know, calling him a cradle robber isn't going far enough.
01:41:08.940 When you've lived for eternity and she's 17.
01:41:12.760 I mean, what do you even talk about?
01:41:15.200 Yeah, not much.
01:41:16.100 You know, I mean, what do you even have in common?
01:41:19.040 You're creating universes.
01:41:21.440 What is she doing?
01:41:22.400 Riding a donkey?
01:41:24.140 Please.
01:41:25.560 You're so right.
01:41:26.440 All right.
01:41:28.180 Our sponsor this half hour is CarShield.
01:41:31.080 When it comes to costly car repairs, you need some options.
01:41:35.400 And here's the option.
01:41:37.220 Don't pay for it.
01:41:38.700 Now, you can do this a couple of ways.
01:41:39.960 You can do this with a gun or you can say, I got CarShield and CarShield covers it.
01:41:48.280 Extended vehicle protection from CarShield.
01:41:50.900 They make the process of fixing your car for covered repair.
01:41:53.820 Super, super easy.
01:41:54.760 I mean, some of these sensors, what, $1,000?
01:41:59.480 I think my son had a sensor go on his car for the brakes.
01:42:02.440 And I'm like, eh, you got a car and my daughter has a separate car, right?
01:42:07.160 That the kid's in?
01:42:07.960 He's like, yeah.
01:42:08.960 Is this your car?
01:42:09.860 Yeah.
01:42:10.220 Eh, don't worry about that sensor.
01:42:12.020 Anyway, it was like $3,000 to fix this thing.
01:42:16.040 For a sensor?
01:42:16.920 It was a sensor and something else.
01:42:18.940 Wow.
01:42:19.080 But it was three grand.
01:42:20.160 Well, they didn't have three grand sitting around in the bank.
01:42:23.760 And, you know, that kind of stuff can be bone crunching.
01:42:27.120 You know, you go in and you have to have something fixed for $800.
01:42:30.520 That could make making the rent, not making the rent or the house payment.
01:42:35.900 How do you do that?
01:42:37.040 You need to be covered in extended vehicle protection.
01:42:41.000 Whether your car has 5,000 miles or 150,000 miles, please get yourself covered and don't be stuck in that.
01:42:49.520 Now, some deductibles may apply.
01:42:51.160 Call 800-CAR-6100.
01:42:53.640 If you mention the promo code BECK or visit carshield.com and use the promo code BECK, you'll save 10%.
01:42:59.840 Please, find out about CarShield.
01:43:02.880 It could save your bacon.
01:43:04.420 And, yes, I know that offends both vegans, vegetarians, and Muslims.
01:43:11.020 But I didn't mean it as an offense.
01:43:14.000 I don't know if that will make a difference.
01:43:15.960 But now I know I'll never host the Oscars.
01:43:19.900 Visit carshield.com, promo code BECK, 1-800-CAR-6100, or carshield.com.
01:43:29.040 All right, here's the exact words of Mary.
01:43:32.260 We're told she didn't give consent.
01:43:35.180 Yeah, Mary in the Bible.
01:43:36.820 In the Bible.
01:43:37.440 Yeah.
01:43:37.820 When she's told by the angel, okay, this Holy Spirit's going to overcome you, and you're going to give birth to a baby.
01:43:44.880 And she said, how is this to be?
01:43:46.040 And the angel explains it to her.
01:43:47.720 And then Mary said, behold the handmaid of the Lord.
01:43:51.680 Be it unto me according to thy word.
01:43:53.880 Listen to that.
01:43:54.400 So she's quoting, she was like, behold, it's Margaret Atwood's story, The Handmaid's Tale.
01:44:03.060 Right?
01:44:03.300 That's what she's saying there?
01:44:04.260 No, that's not, no.
01:44:05.460 No, she wasn't.
01:44:06.480 The Handmaid's Tale hadn't debuted yet.
01:44:08.820 Oh, really?
01:44:09.320 Not yet.
01:44:09.900 Not quite yet.
01:44:10.440 Okay, all right.
01:44:10.920 On TV.
01:44:11.500 Oh, hell.
01:44:11.740 Yeah.
01:44:12.260 Okay, okay.
01:44:13.180 All right.
01:44:13.340 So the consent is there, however.
01:44:15.500 The power dynamic.
01:44:15.800 So is the power dynamic.
01:44:17.340 Yeah, I mean.
01:44:18.300 Consent can't be given.
01:44:19.460 I mean, if you read his first book, he gets angry a lot.
01:44:24.840 He does.
01:44:25.880 And she may have been afraid of losing her job.
01:44:27.920 Right.
01:44:28.340 She was like, I don't know.
01:44:30.640 He could fire me.
01:44:31.480 If I say no, I might be out of a job.
01:44:33.680 I might not get this storyline.
01:44:35.540 I've been working my whole life for this storyline in this book.
01:44:39.260 And he's a hothead sometimes.
01:44:42.460 So.
01:44:43.500 So.
01:44:46.260 It's crazy.
01:44:46.940 There's another story out that we touched on this week.
01:44:49.140 Um.
01:44:50.400 That.
01:44:51.000 Uh.
01:44:51.820 That I'd like to get your opinion on.
01:44:53.380 And that is that.
01:44:54.800 Uh.
01:44:55.420 Males today are.
01:44:57.160 Uh.
01:44:57.400 Quote.
01:44:58.100 Abandoning the traditional.
01:45:00.680 Male.
01:45:01.880 Attributes.
01:45:02.620 And masculine values.
01:45:04.000 Yeah.
01:45:04.580 And uh.
01:45:05.260 I read this story and I thought.
01:45:06.680 I don't think those are masculine values.
01:45:09.440 Yeah.
01:45:09.660 This.
01:45:10.500 What is a masculine value?
01:45:12.700 That we're pumping iron all the time?
01:45:14.680 Or.
01:45:15.000 Isn't that what you got from it too?
01:45:16.260 Beating on people?
01:45:17.240 Yes.
01:45:17.460 Yeah.
01:45:17.700 You get the.
01:45:18.520 Taking out your.
01:45:19.060 Your body and.
01:45:19.940 Yeah.
01:45:20.200 And you're.
01:45:20.780 And you're pumping iron and you're.
01:45:22.600 It's like the village people song.
01:45:25.160 Right.
01:45:25.660 Macho man.
01:45:26.580 Everybody wants to be a macho man.
01:45:27.960 I get the kind of body always in demand.
01:45:29.680 Okay.
01:45:30.620 And I wouldn't say that the village people were exactly masculine.
01:45:34.660 You know.
01:45:35.560 And so it's a combination of the village people in what?
01:45:39.320 You know.
01:45:39.880 World Wrestling Federation.
01:45:41.520 Right.
01:45:42.280 You know.
01:45:42.520 That's not.
01:45:43.080 That's not.
01:45:43.900 That's certainly not you and me.
01:45:46.080 Can I tell you something?
01:45:47.000 All of that, you know, good-looking, pumping-iron, really physically fit Rambo
01:45:52.260 kind of stuff.
01:45:53.060 All of that cartoon stuff.
01:45:54.520 That is not what masculinity is.
01:45:56.660 I mean, look at Cary Grant.
01:45:59.260 Cary Grant was the sex symbol for how long?
01:46:03.160 Oh, big time.
01:46:03.720 You ever see him in movies when he takes off his shirt?
01:46:06.280 You're like.
01:46:06.720 He's not ripped.
01:46:07.780 You're not really.
01:46:08.520 He didn't have a six pack.
01:46:09.520 He's got a little flab in there.
01:46:11.100 I mean, I'm feeling a little better about myself.
01:46:12.840 He had a normal.
01:46:13.900 Normal body.
01:46:15.240 Normal body.
01:46:16.220 I mean, he was in good shape, but it was normal.
01:46:19.400 This whole thing of being all muscular and ripped and all that stuff.
01:46:22.780 That's new.
01:46:23.540 And that's.
01:46:24.120 By the way, that's not a value.
01:46:25.400 That's a trend.
01:46:26.200 That's a fad.
01:46:27.560 That's a fashion.
01:46:29.840 A value.
01:46:31.880 I mean.
01:46:32.980 What do you look at as traditional masculine values?
01:46:35.480 I look at it as being a good husband and father.
01:46:38.260 Me too.
01:46:38.600 Being somebody doing your best.
01:46:41.200 Doing your best to support your family.
01:46:43.080 Correct.
01:46:43.860 Do what you have to do in a good, positive way to support your family, to make the sacrifices
01:46:49.680 for your family, to raise good, decent sons, to raise good, decent daughters, to have a
01:46:57.260 good relationship, to leave your community better, or at least your family better than
01:47:02.640 the way you found it.
01:47:03.660 And to my father told me the job of a son is to pick up the ball from his father when
01:47:11.140 he leaves it on the field, see if his father was running in the right direction.
01:47:15.820 And if so, take it the rest of the way, take it the rest of the way, or knowing you'll
01:47:20.580 never make a touchdown, just continue to make the family better.
01:47:25.240 Take it as far as you can.
01:47:26.220 Right.
01:47:26.440 Isn't that what it should be?
01:47:30.800 Well, and the article claimed that we're ditching the traditional masculine values and we're
01:47:37.640 now more likely to embrace emotional strength.
01:47:40.420 I, were you ever taught as a man, as a male growing up, that emotional strength wasn't
01:47:50.820 a positive attribute?
01:47:52.000 How do you mean, wait, how do you mean emotional strength?
01:47:54.040 Because like I saw George Bush all weekend and all week at the 400 funerals he had to
01:48:02.260 attend and, I think, speak at all of them.
01:48:05.580 Right.
01:48:05.860 But he also had tremendous emotional strength.
01:48:09.580 Yeah.
01:48:09.760 He, he held back, he cried, but he held that back.
01:48:14.560 And when, when the guy was thoughtful enough to bring something for Michelle Obama,
01:48:20.320 to hand to her as a gift at his own father's funeral, that's a good man.
01:48:28.720 That's a man who is showing emotional strength.
01:48:32.080 You notice how he would smile and try to uplift others.
01:48:36.480 You know, the guy was dying inside.
01:48:38.200 His dad just died.
01:48:40.300 Yes.
01:48:41.140 There's nothing more traditional than the way he behaved.
01:48:45.200 But this research from the university of British Columbia apparently showed younger men tend
01:48:52.220 to value selfishness, social engagement, and health over traditional male values like physical
01:48:58.920 strength and autonomy.
01:49:01.360 Oh, selflessness.
01:49:02.780 So selflessness.
01:49:03.820 So that means that your traditional male value would be to be selfish.
01:49:08.760 No.
01:49:09.980 And not engage.
01:49:10.700 Those are not traditional.
01:49:11.520 Those are not traditional values.
01:49:13.960 They may have been the values of this last generation, but selflessness.
01:49:18.840 Tell me the greatest generation wasn't selfless.
01:49:21.180 Oh, tell me our grandparents weren't selfless.
01:49:25.320 The sacrifice they gave?
01:49:26.560 Oh, my gosh.
01:49:27.160 Tell me the pioneers were not selfless.
01:49:30.240 Tell me the pilgrims weren't selfless.
01:49:37.420 They were selfless.
01:49:39.080 It was, it was, these people would, when it was okay to defend a woman, defend
01:49:50.820 a woman at the risk of their own life or at their own expense.
01:49:54.540 They would try to stand up, you know, to help the poor or the downtrodden at their own
01:50:02.120 expense. Not all of them, but it's what we've always taught as something to strive for. This
01:50:09.780 is why chivalry dying is so bad.
01:50:14.480 Tell me how chivalry was selfish.
01:50:19.520 It wasn't.
01:50:20.400 Not at all.
01:50:21.360 Not at all.
01:50:22.180 Chivalry is the absolute opposite of selfishness.
01:50:26.440 But this is why it's so hard growing up now being a male, because you're bashed with this
01:50:32.300 kind of stuff all the time, nonstop.
01:50:35.760 This toxic masculinity stuff.
01:50:38.900 Really, really damaging, I think.
01:50:41.080 Have a great weekend.
01:50:43.780 Try to relax.
01:50:44.900 Stay safe.
01:50:46.020 Get your holiday shopping done.
01:50:47.860 And love somebody in the moment.
01:50:50.520 Just love them with everything you have.