The Glenn Beck Program - December 12, 2017


12/12/17 - Lesser of 2 Evils? (Frank Luntz, Brett King & Norm Stamper join Glenn)


Episode Stats

Length

1 hour and 52 minutes

Words per Minute

156.9

Word Count

17,665

Sentence Count

1,417

Misogynist Sentences

20

Hate Speech Sentences

18


Summary

A Chicago judge recently let a woman who was caught on camera laughing throughout a kidnapping and torture walk away with just 200 hours of community service. Last Thursday, a former police officer was acquitted of murder and manslaughter for an on-duty shooting. The same week, a San Francisco jury found a man not guilty of murder and manslaughter after he shot and killed a woman on a pier.


Transcript

00:00:00.000 The Blaze Radio Network, on demand.
00:00:09.500 Love. Courage. Truth. Glenn Beck.
00:00:14.980 He was kidnapped. He was bound with duct tape. Gagged with an old sock.
00:00:21.200 Beaten. He was stabbed. And he was forced to drink toilet water.
00:00:26.280 They gave him death threats, made racist remarks, and it was all caught on camera.
00:00:35.360 Intentionally, of course.
00:00:38.640 Do you remember this attack? It happened in Chicago.
00:00:43.200 Four black teens kidnapped a mentally disabled young man.
00:00:49.020 And then streamed it live on Facebook for the whole world to see.
00:00:53.720 One of the kidnappers, Brittany Herring Covington, she was 19.
00:01:01.600 She was the person that actually recorded the torture on her phone.
00:01:05.900 She's the one who was caught laughing the entire time.
00:01:10.620 Tell him to go further.
00:01:11.720 A Chicago judge recently let her walk away with just 200 hours of community service
00:01:18.400 after she pleaded guilty to a hate crime, aggravated battery, and intimidation charges.
00:01:29.540 Why the leniency?
00:01:30.820 Because the judge said, I think if I send her to jail, I don't know if she's going to come out any better.
00:01:39.240 Oh.
00:01:43.500 Can I ask you?
00:01:48.620 What does the statue of Lady Justice look like?
00:01:54.440 I know she's blindfolded, but I don't think that blindfold's on anymore.
00:02:02.160 She's wearing, she's holding the scales of justice to weigh both sides.
00:02:08.160 But then she's holding a sword.
00:02:10.620 And that sword is for swift justice.
00:02:15.280 Here the judge was thinking about the attacker's best interest.
00:02:25.740 Well, I understand we just don't want to throw people in jail.
00:02:30.400 We want something to work.
00:02:32.600 But we also need some punishment.
00:02:36.240 Somehow or another, this just doesn't seem just.
00:02:40.300 The three other kidnappers, by the way, are still awaiting trial,
00:02:45.060 and they deserve far more than community service, in my humble opinion.
00:02:51.200 I don't understand what's happening to us.
00:02:54.040 Thursday, last week, a former police officer was acquitted of murder and manslaughter
00:03:00.700 for an on-duty shooting.
00:03:02.800 Remember, this is, this happened the same week that a jury in San Francisco
00:03:12.680 found a man not guilty of murder and manslaughter after he killed a woman by shooting her on a pier.
00:03:21.860 Have you seen the body cam footage of the police officer that shows an intoxicated, unarmed man?
00:03:33.120 His name was Daniel Shaver.
00:03:36.400 He's crawling towards the police, crying,
00:03:39.400 Please, please don't shoot me.
00:03:44.300 The officer shot him five times.
00:03:48.180 Killed him.
00:03:48.740 Maybe, maybe, maybe this is consistent with police training.
00:03:55.800 That's what his, that's what his attorney successfully argued.
00:04:01.780 But if you watch that footage, I'm sorry, I can't see how that man posed a threat
00:04:07.860 to the man with the AR.
00:04:12.840 This was an unnecessary death.
00:04:15.900 And.
00:04:18.740 Here's what I just want to ask.
00:04:25.380 Are these three trials signs that we are way over the cliff?
00:04:32.900 These three trials.
00:04:37.420 Kidnapping.
00:04:38.800 And brutalizing a handicapped person.
00:04:42.480 And you get community service?
00:04:45.860 Killing a woman on the pier in San Francisco.
00:04:55.880 Not holding her up.
00:04:57.000 Just firing a gun.
00:04:59.180 I don't mean murder, but manslaughter.
00:05:02.080 And the one that I really hate to judge, because I'm not in their position.
00:05:14.300 It's a police officer shooting a man.
00:05:18.840 Now, I wasn't standing in the hallway.
00:05:20.840 But what the hell has happened to our juries?
00:05:26.480 What is happening to our judges?
00:05:29.980 What has happened to our, our grasp on reality?
00:05:35.520 Have we all forgotten common decency and common sense?
00:05:42.400 It's Tuesday, December 12th.
00:05:53.840 This is the Glenn Beck program.
00:05:58.520 Welcome to the program.
00:05:59.620 You know, I, we have Frank Luntz coming on in just a minute.
00:06:04.400 He's going to tell us what he found in Alabama, because the Roy Moore election is happening today.
00:06:12.220 And let me just save the press an awful lot of time.
00:06:17.420 If Roy Moore wins, here's why.
00:06:22.920 We are now at a point of voting for the lesser of two evils.
00:06:28.760 But that, in and of itself, is the problem.
00:06:36.840 We are looking at two things that most people in America, on both sides, think are evil.
00:06:46.120 One, a guy who is propositioning young girls.
00:06:51.200 A guy who may have behaved wildly inappropriately with these girls.
00:07:00.780 May have threatened one of them.
00:07:03.820 May.
00:07:06.760 And he may have been, when he was in his 30s, hitting on 14-, 15-, 16-year-old girls.
00:07:14.780 I don't know how you square that.
00:07:16.900 That's just, that's, if it's just hitting on the 16 to 18-year-old girls and trying to kiss and date them,
00:07:26.720 but doing it, you know, respectfully, there's no law broken, but I find it creepy.
00:07:34.040 The others, if the others appear to be, you know, something that was unwanted and beyond creepy,
00:07:42.400 possibly, possibly illegal.
00:07:44.540 I don't know of a single person in my life, that if I asked you, if these things were true,
00:07:53.760 do you think that's evil?
00:07:57.160 Well, yes.
00:07:58.740 If those things are, if all of it is true, yes.
00:08:02.720 At best, it's creepy.
00:08:05.680 And I don't want to be around that person.
00:08:09.000 So there's the evil.
00:08:10.980 Evil number one.
00:08:13.020 Who's he running against?
00:08:15.340 He's running against a guy who says abortion on demand.
00:08:20.540 It doesn't matter how old.
00:08:23.460 Abortion, partial birth abortion.
00:08:26.260 Once it leaves the birth canal, this is a quote,
00:08:30.740 then I'll protect the baby.
00:08:32.860 Then I'm pro-life.
00:08:35.060 But until that baby is born,
00:08:37.440 it is the right of the mother to do whatever she wants to do with that baby.
00:08:43.280 Now, 80% plus in America believes that's not just immoral, that's evil.
00:08:53.700 80% of Americans are against that.
00:08:56.620 80% of Americans have enough common sense to know that's a baby, man.
00:09:01.200 That is a baby.
00:09:02.500 You can't just partially birth a baby and then stick scissors in the back of its neck.
00:09:10.500 That's evil.
00:09:11.980 So, if Roy Moore is elected today,
00:09:18.280 that's why.
00:09:21.300 Period.
00:09:24.060 Which is more evil?
00:09:25.220 Well, I don't really want...
00:09:27.280 Well, you've got to choose.
00:09:29.140 Well, I really don't want either.
00:09:30.540 You've got to choose.
00:09:31.780 Well, okay, a guy that maybe, maybe 30 years ago did these things.
00:09:43.260 There's no current allegations.
00:09:46.540 Maybe he's not that way anymore, if he ever was.
00:09:52.180 Or the guy who is currently saying,
00:09:55.180 I'll jam scissors in the back of the neck, into the head,
00:09:59.680 and kill a baby in the birth canal?
00:10:02.420 He's currently saying that?
00:10:04.020 Hmm.
00:10:04.680 Let me think.
00:10:05.520 Which one's more evil?
00:10:06.940 That's why.
00:10:10.400 I don't think people want either of them.
00:10:15.520 But this system has created both of them.
00:10:21.600 I think if you had a third choice,
00:10:24.860 one that was just a normal human being,
00:10:27.980 I think that person might win.
00:10:33.340 But make no mistake,
00:10:35.000 that is why.
00:10:37.020 If he does win,
00:10:39.140 that's why he'll win.
00:10:41.360 So, Stu,
00:10:53.560 the Roy Moore thing,
00:10:56.140 first of all,
00:10:56.540 the polls are everywhere.
00:10:58.320 I've never seen polls like this.
00:11:00.360 They have no idea.
00:11:01.180 They have no idea what's going to happen to that.
00:11:02.620 They have no idea.
00:11:03.300 Some polls show Roy Moore up 10.
00:11:06.540 Some show him down 10.
00:11:08.660 Some show it tied.
00:11:09.740 I mean, that's a 20-point swing.
00:11:11.720 It really is.
00:11:12.340 They don't know.
00:11:13.260 Nobody has any idea.
00:11:14.400 I would be very surprised
00:11:17.860 if Roy Moore does not win that election.
00:11:20.340 Me too.
00:11:20.800 I will be.
00:11:21.280 And in fact,
00:11:21.780 I honestly expect him to win it handily.
00:11:24.040 Six, something like that.
00:11:25.440 Me too.
00:11:25.980 Now, I don't know if that's true.
00:11:27.220 I mean, really,
00:11:28.340 the polling does not give that indication.
00:11:30.420 The polling is very mixed.
00:11:32.040 This shows that Jones has a chance.
00:11:33.480 I think what you just said is really important.
00:11:38.280 And I think Frank Luntz talked about this a little bit as well.
00:11:40.680 He's coming up to talk about the election,
00:11:42.580 but to steal a little bit of his thunder.
00:11:44.680 If Jones was a pro-life Democrat,
00:11:48.000 he's probably up by five or ten.
00:11:49.860 Yes, he would win.
00:11:51.260 But he's not.
00:11:52.580 He's actually a Debbie Wasserman Schultz Democrat
00:11:56.620 when it comes to that issue.
00:11:57.780 And that is not going to play in Alabama at all.
00:12:01.060 Can I ask you,
00:12:01.660 when did we,
00:12:02.780 when did America become these two extremes?
00:12:07.960 When did America become these,
00:12:10.840 these two polar,
00:12:13.940 couldn't be further from each other opposites?
00:12:18.540 When, when,
00:12:19.820 how is this good for any of us?
00:12:22.800 I mean, this is the Facebook world.
00:12:25.200 It really is.
00:12:25.980 You are,
00:12:27.160 you are either for us
00:12:29.980 or you're against us.
00:12:32.100 You are either,
00:12:33.580 I either like you
00:12:34.820 or I hate you
00:12:36.280 and you need to be destroyed.
00:12:39.020 Yeah, no, I mean,
00:12:40.160 that is,
00:12:40.900 that's the issue, right?
00:12:42.000 That's the one we're,
00:12:42.900 we're dealing with every single day.
00:12:44.560 I think you look at,
00:12:46.040 you know,
00:12:46.420 I think you could look at this election honestly,
00:12:48.220 fairly and say,
00:12:48.980 it's probably not
00:12:50.720 all that big of a deal either way.
00:12:53.760 I mean,
00:12:54.060 the Republicans are going to have,
00:12:55.660 a majority either way.
00:12:57.660 This is going to last until 2020,
00:13:00.280 until,
00:13:01.040 until there's another election for the seat.
00:13:03.300 This seat will likely go Republican then anyway.
00:13:06.560 2018 is very,
00:13:08.100 very heavily
00:13:08.780 tipped towards Senate Republicans.
00:13:11.860 So they'll likely hold the Senate anyway.
00:13:14.760 If Doug Jones gets in,
00:13:16.660 he's not going to make abortion more legal.
00:13:19.320 No.
00:13:19.960 Right?
00:13:20.440 Like,
00:13:20.720 Roy Moore is probably not going to start molesting 14 year olds in the Senate.
00:13:24.900 No.
00:13:25.220 Like,
00:13:25.400 almost none of the things that have been talked about,
00:13:28.520 about,
00:13:29.100 with this election,
00:13:29.980 actually affect anything.
00:13:31.920 It's just,
00:13:32.380 I swear,
00:13:32.780 it's just a way for us to entertain ourselves for a couple months until the next story hits.
00:13:35.640 No,
00:13:35.920 it's not even that.
00:13:37.080 It's not even that.
00:13:37.780 It's a way for people to make money by having ratings and sell,
00:13:43.640 you know,
00:13:44.300 newspapers to get clicks.
00:13:47.040 That's all this is about.
00:13:48.040 And for the parties to drive a wedge between us.
00:13:51.920 That's what this is about.
00:13:53.480 It's not going to make any difference.
00:13:55.060 It's not going to make any difference.
00:13:56.460 Not in the short term.
00:13:57.880 I mean,
00:13:58.080 is it possible that one vote will come down to,
00:14:01.260 you know,
00:14:02.500 needing that seat?
00:14:03.840 It's definitely possible that that could happen.
00:14:06.500 It could change a little bit of legislation on the fringes.
00:14:09.360 Maybe there's a perk put in there for,
00:14:12.220 you know,
00:14:12.920 John McCain that wouldn't normally be in there if this seat isn't there.
00:14:16.900 It's possible.
00:14:17.520 I don't think John McCain's going to be around that long.
00:14:19.360 Well,
00:14:19.680 I mean,
00:14:19.980 that's a whole other story.
00:14:20.940 But I mean,
00:14:21.280 the bottom line is,
00:14:22.220 it could have some,
00:14:23.240 it's possible.
00:14:24.440 It could have some short term policy consequences within the next couple of years.
00:14:29.540 It's possible.
00:14:30.360 But there's also possibility of the other side going awry.
00:14:34.160 I mean,
00:14:34.380 having,
00:14:35.040 you know,
00:14:35.240 who knows what Roy Moore is going to do.
00:14:37.220 And,
00:14:37.380 you know,
00:14:38.160 the associations there aren't necessarily universally positive either.
00:14:41.720 In fact,
00:14:42.520 many people have noted the opposite.
00:14:44.440 So there,
00:14:45.600 there's enough unknowns on both sides here.
00:14:48.320 And the downside of either outcome is not all that significant.
00:14:52.860 And that is the,
00:14:54.020 it's not the type of analysis.
00:14:55.620 As you point out,
00:14:56.580 there is something going on here,
00:14:57.700 which is there's an industry built around making these things exciting and intense.
00:15:01.160 And the most important thing that's ever happened to you.
00:15:03.380 Every election is always the most important thing that's ever happened to you.
00:15:05.900 And that we are part of that industry.
00:15:07.360 And I realize it's a terrible idea to come out and say that it's not the most important thing that's ever happened to you.
00:15:12.660 But this election is not the most important thing that's ever happened to you.
00:15:15.980 It isn't.
00:15:17.120 Amen.
00:15:18.280 Frank Luntz coming up in just a second to tell us what he found in Alabama and what he thinks.
00:15:30.120 Also,
00:15:31.100 futurist Brett King is coming up next hour.
00:15:35.160 He has written a book called Augmented in the,
00:15:38.580 in the,
00:15:39.340 I think it's fast lane of life.
00:15:40.860 It is remarkable,
00:15:44.080 remarkable.
00:15:45.380 And we'll be talking to him next hour.
00:15:49.580 All right.
00:15:50.300 Who's our sponsor this half hour?
00:15:52.860 My Patriot Supply.
00:15:56.440 My Patriot Supply right now is offering 102 servings of survival food for only $99.
00:16:03.640 And it is shipped to your home for free.
00:16:06.320 That's breakfast,
00:16:07.220 lunch,
00:16:07.500 and dinner.
00:16:08.740 Do you have,
00:16:09.320 do you,
00:16:09.600 have you done anything,
00:16:10.740 Stu?
00:16:11.040 I love My Patriot Supply.
00:16:12.220 Yeah.
00:16:12.460 I,
00:16:12.660 I'd never done it before My Patriot Supply.
00:16:14.700 I believe is what you're referring to.
00:16:16.080 Yes.
00:16:16.420 The thing about My Patriot Supply is they address,
00:16:20.500 yes,
00:16:20.700 they can address people like you.
00:16:22.660 What does that mean?
00:16:23.320 Well,
00:16:23.680 you know,
00:16:24.080 people who,
00:16:24.800 who are like prepared and think,
00:16:26.720 oh,
00:16:26.780 I need a year's worth of food.
00:16:27.920 And I,
00:16:28.720 you know,
00:16:28.940 you,
00:16:29.540 you're,
00:16:30.020 you've been in this world for a long time.
00:16:32.080 What does that mean?
00:16:33.260 Well,
00:16:33.600 you're,
00:16:33.980 I don't want to say a prepper because I think there's,
00:16:35.920 a negative connotation to that where there really shouldn't be.
00:16:38.340 Right.
00:16:38.520 And being prepared is not a bad thing,
00:16:40.080 but I mean,
00:16:40.620 it's one of those things where you're in that world and you think about those
00:16:43.020 things.
00:16:43.560 You're,
00:16:43.960 as you've said yourself,
00:16:45.020 you're a catastrophist.
00:16:46.420 So you,
00:16:47.440 you are always thinking about the worst case scenario.
00:16:49.720 I grew up in a very,
00:16:50.780 I grew up in Connecticut.
00:16:52.000 I don't think I've ever had a tough day in my life.
00:16:54.420 I don't even have a real job.
00:16:55.520 Look at what I'm sitting here talking on the radio.
00:16:58.060 It's not even a real work.
00:16:59.240 I mean,
00:16:59.480 I,
00:16:59.680 you know,
00:16:59.920 so I have never really thought about that.
00:17:01.520 When I met you,
00:17:02.320 I had a few packets of duck sauce and soy sauce for prep.
00:17:05.920 In case something went wrong,
00:17:07.160 you'd have duck sauce,
00:17:07.940 which has sugars.
00:17:09.240 And I had shotgun shells in case he came to my house.
00:17:11.980 Exactly.
00:17:12.560 Point.
00:17:13.520 That's true.
00:17:13.920 And I,
00:17:14.160 that was my plan to show up.
00:17:15.580 It was.
00:17:15.940 And it was my plan as well.
00:17:17.560 My,
00:17:17.920 My Patriot Supply though,
00:17:19.600 it makes it really easy.
00:17:20.640 So even if you're like this,
00:17:22.080 this deal is perfect.
00:17:22.920 What is it?
00:17:23.500 It's 102 servings for $99.
00:17:25.220 Yeah.
00:17:25.420 So to me,
00:17:26.660 that knocks out 95 to 99% of
00:17:31.960 anything you need.
00:17:33.060 Like it doesn't cover you for zombie apocalypse.
00:17:35.600 No,
00:17:35.820 but pretty much everything else it covers you for.
00:17:38.080 This covers you for anything that
00:17:39.960 could happen in your life
00:17:42.400 that is not just an absolute disaster.
00:17:45.660 And you do it once in 25 years.
00:17:47.160 It's easy to store.
00:17:48.320 It's they make it really simple.
00:17:49.740 And you have it now for 99 bucks.
00:17:51.800 Grab a couple of these 102 servings.
00:17:53.940 You'd have,
00:17:54.440 you know,
00:17:55.040 have 204 servings for 200 bucks.
00:17:57.600 If you bought a couple of them,
00:17:58.520 they're $99 each. Preparewithglenn.com.
00:18:02.680 Call 800-200-7163,
00:18:05.060 800-200-7163. Preparewithglenn.com.
00:18:11.980 Glenn Beck.
00:18:19.340 Glenn Beck.
00:18:21.520 May I ask who is giving Roy Moore advice?
00:18:25.060 Did you see the video that came out where they had like a 12 year old girl do an interview with him?
00:18:33.100 Let's play a little bit of this.
00:18:34.580 Will you, Sarah,
00:18:35.240 please?
00:18:35.840 Not a good idea.
00:18:36.660 So what do you think are the characteristics of a really,
00:18:38.960 really good Senator?
00:18:40.280 Following the constitution,
00:18:41.940 just adhering to principle and not going there to get elected again and not trying to stay in office for 30 or 40 years and building an empire.
00:18:50.440 You're there to serve the people. Serve people. In my position as Alabama senator, that's
00:18:55.880 the people of Alabama,
00:18:56.620 but a lot of the issues that I stand for would be for the good of the country.
00:19:00.460 So being like,
00:19:01.440 what can my country do for me?
00:19:03.020 Instead of doing that,
00:19:03.860 what can I do for my country?
00:19:05.300 Stop. The little girl in the pigtails?
00:19:07.300 Not a good idea.
00:19:09.320 What are you thinking?
00:19:10.660 This is some Trump super PAC or something
00:19:12.580 that set this up, apparently.
00:19:13.660 Okay.
00:19:13.900 Trump super PAC.
00:19:14.560 So then,
00:19:15.900 then his wife takes to the microphone.
00:19:19.540 Now,
00:19:20.060 out of all of the things that I've heard about Roy Moore,
00:19:23.860 anti-Semite is not one of them,
00:19:26.540 but they address that.
00:19:28.480 Listen to this.
00:19:30.700 Fake news would tell you that we don't care for Jews.
00:19:34.660 Wait,
00:19:35.180 what?
00:19:35.280 I tell you all this because I've seen it all.
00:19:37.500 So I just want to set the record straight while they're here.
00:19:39.640 Thank you.
00:19:44.560 One of our attorneys is a Jew.
00:19:51.000 Stop.
00:19:52.200 Stop.
00:19:53.780 Stop.
00:19:55.860 Stop.
00:19:56.800 What are you thinking?
00:19:59.740 I was,
00:20:00.240 I was.
00:20:00.520 One of our maids is black.
00:20:03.360 It's like,
00:20:04.060 that's okay.
00:20:05.180 I,
00:20:05.500 I wasn't going to vote for Roy Moore,
00:20:08.180 largely based on the fact that I did not believe he had a Jewish attorney,
00:20:11.460 but now.
00:20:11.880 Now that I know he's got a Jewish attorney.
00:20:13.880 Who is advising these people?
00:20:18.000 Stop it.
00:20:24.640 Glenn Beck.
00:20:33.500 This is the Glenn Beck program.
00:20:35.460 So if I had to,
00:20:41.480 if I could talk to only one person to try to figure out what America was
00:20:46.960 thinking,
00:20:47.680 that one person would be Frank Luntz.
00:20:50.520 He runs Luntz Global.
00:20:53.360 And you can find out all about it at focuswithfrank.com.
00:20:57.620 But he,
00:20:58.560 he does things for businesses and politicians and,
00:21:03.460 and everything else.
00:21:05.300 When you're really trying to get a beat on what people are feeling,
00:21:09.320 Frank is really good with his focus groups and he's just been in Alabama.
00:21:13.980 Welcome to the program.
00:21:14.760 Frank Luntz.
00:21:15.220 How are you?
00:21:15.580 You are always the kindest person to me on the radio.
00:21:20.860 I don't know if your listeners have ever met you before,
00:21:24.060 but you have always been the kindest guy.
00:21:27.760 And I'm not sure if that's your image.
00:21:31.520 Yes.
00:21:32.100 Yes,
00:21:32.540 you do.
00:21:32.960 Frank,
00:21:33.320 if anybody knows my image,
00:21:34.760 you would know.
00:21:35.480 I thought you knew the people,
00:21:36.760 Frank,
00:21:37.000 you don't know.
00:21:37.380 That's clearly not my image.
00:21:40.500 So,
00:21:42.680 so Frank,
00:21:43.480 tell me what you found in Alabama.
00:21:45.860 So we found a very polarized and extremely excited,
00:21:52.220 intense,
00:21:53.320 passionate electorate that desperately wants to send a message to Washington.
00:21:58.060 And to my greatest surprise,
00:22:00.460 that message is coming just as hard to the Republican establishment as it is to the Democrats.
00:22:06.500 There is as much criticism of the Republican leadership in Congress as there was of their
00:22:12.520 Democratic opponents.
00:22:14.000 And this is among Republicans.
00:22:15.740 And that tells me that Alabama is a symptom of what's happening across the country.
00:22:20.740 And what's happening across the country?
00:22:22.280 I think that people are just as fed up today as they were one year ago.
00:22:27.760 I think that they're disappointed with the rate of change in Washington,
00:22:31.260 that the swamp has not been drained.
00:22:33.520 And I think that they're ready to say,
00:22:35.200 I've had it and I'm going to vote even more people out in the next election.
00:22:41.260 So,
00:22:41.380 Frank,
00:22:42.020 the,
00:22:42.620 the idea that Alabama is,
00:22:46.340 has to vote for somebody who is accused of improprieties and possibly worse,
00:22:53.740 you know,
00:22:55.240 20 years ago,
00:22:56.200 and a guy who is for abortion on demand.
00:23:00.160 It's really,
00:23:01.360 truly the lesser of two evils.
00:23:03.940 And,
00:23:04.280 you know,
00:23:05.140 for God fearing people,
00:23:06.540 you know,
00:23:08.120 abortion is more evil than somebody doing something 20 years ago.
00:23:14.240 Do I have that right or wrong?
00:23:15.940 You have it right,
00:23:16.800 but I'd be careful because that's not,
00:23:20.000 they will not let themselves be caught saying that what they're saying is that it is all evil,
00:23:26.700 that it all needs to change.
00:23:28.580 And that this is the guy Roy Moore in their minds.
00:23:32.340 This is the guy who they think is most likely to shake the hell out of Washington,
00:23:39.400 D.C.
00:23:39.700 So what do they feel about his,
00:23:41.980 the accusations?
00:23:44.300 They don't think they're true.
00:23:47.000 They don't think that they're real.
00:23:49.980 They think that these are women who have been paid by,
00:23:53.780 Gloria Allred or the left or,
00:23:58.220 whoever,
00:23:59.120 or,
00:23:59.400 or even,
00:24:00.240 what's his name?
00:24:00.920 Even Soros,
00:24:02.960 Soros and the Democrats.
00:24:04.680 They think that America is under attack,
00:24:10.140 is under siege,
00:24:11.320 and they desperately want to send a message.
00:24:15.020 Enough is enough.
00:24:16.380 And they want to do it in an emotional way.
00:24:20.840 So what do you think this means,
00:24:22.500 Frank?
00:24:23.660 Assuming that Roy Moore wins,
00:24:26.000 do you think he's going to win?
00:24:27.720 I can't,
00:24:28.620 you know,
00:24:28.860 I've,
00:24:29.140 I've never in my professional life,
00:24:31.240 I've never held back a,
00:24:33.420 a projection.
00:24:35.180 I've always felt that I should speak up,
00:24:38.440 because that's my job as a pollster,
00:24:40.560 is to know what's going to happen.
00:24:41.820 I can't do it this time.
00:24:43.480 Glenn,
00:24:43.840 I just don't know.
00:24:44.980 I don't believe any of the polls.
00:24:46.280 I think someone's going to look really foolish,
00:24:48.340 when the,
00:24:49.100 when the election is over.
00:24:50.840 Yeah,
00:24:50.980 I think,
00:24:51.460 I mean,
00:24:51.780 you know,
00:24:52.100 I've never seen,
00:24:52.700 have you seen a 20 point spread in polls?
00:24:54.780 Never.
00:24:55.380 Yeah.
00:24:55.560 Not even,
00:24:56.020 and there was a spread during Clinton,
00:24:57.740 but the spread during Clinton was a 10 point spread.
00:25:00.560 It means that an awful lot of people are lying,
00:25:02.900 to pollsters right now.
00:25:04.140 And that's because they're afraid of the pressure,
00:25:07.840 this essence of political correctness,
00:25:09.660 which is the thing that I urge you to address.
00:25:12.060 I urge you on your shows going forward to talk about it,
00:25:15.720 because it is poisoning our students' minds.
00:25:18.840 It is poisoning the public debate,
00:25:21.140 that we can no longer say what we truly believe out of fear that it will hurt us professionally or personally.
00:25:27.460 But how,
00:25:28.220 how do you,
00:25:28.960 you know,
00:25:29.140 Frank,
00:25:29.360 I would love to,
00:25:30.140 I'd love to have you on for an extended period of time,
00:25:32.640 because I think you could teach us so much.
00:25:34.760 And I mean,
00:25:35.480 the audience in America,
00:25:36.480 how do you have that conversation when millennials are saying that,
00:25:43.940 you know,
00:25:44.640 there should be safe zones,
00:25:46.180 there should be limits on speech?
00:25:48.300 Right.
00:25:49.180 But those are,
00:25:49.980 by their definitions,
00:25:51.640 safe zones.
00:25:53.000 So that you're not allowed to ask the question,
00:25:55.740 why does a murderer in California,
00:25:57.680 who shouldn't even be in this country,
00:25:59.760 why does that person get let off?
00:26:02.500 You can't have a conversation about border security.
00:26:05.860 But by the same token,
00:26:07.220 Glenn,
00:26:07.420 you also can't say,
00:26:09.100 why is there such negativity in this tweeting?
00:26:12.920 Why can't we treat each other with respect as we are criticizing each other for beliefs that we don't share?
00:26:19.200 I think that our culture has been so,
00:26:23.380 so coarsened by social media that the ability to talk to each other in a tough,
00:26:33.400 but respectful way is gone.
00:26:35.480 It's not that it's going,
00:26:36.620 it's gone.
00:26:37.660 Frank,
00:26:38.160 you and I have seen each other at some really low points.
00:26:40.860 We have seen each other where,
00:26:43.360 where I've come to you and Frank,
00:26:45.460 help me.
00:26:46.200 I have no hope left.
00:26:48.440 Do you,
00:26:49.220 have you found,
00:26:50.640 have you found hope in all of the polling?
00:26:53.920 No,
00:26:54.240 not at all.
00:26:55.640 I'm in the worst place I've ever been in my professional life internally.
00:27:00.080 I don't really want to have this conversation with a million people,
00:27:03.740 but no,
00:27:05.000 I don't because I understand the Trump voter who is,
00:27:10.340 desperate to save his or her country.
00:27:13.940 Yeah.
00:27:14.380 I understand the feeling of African Americans who don't want to go back to the 1950s and 60s,
00:27:21.100 because that was a bad time for them in this country.
00:27:24.440 I understand those who came from other countries legally,
00:27:28.080 but they're being demonized by the illegal population.
00:27:32.240 I get millennials who are nervous about where the country's headed.
00:27:37.760 They see the fires and they see the hurricanes and they see the weather and they wonder what's going on.
00:27:42.560 I hear all of this and I appreciate it.
00:27:46.300 But the truth is,
00:27:47.940 most people don't.
00:27:49.340 They see what they want to see and they disregard the rest.
00:27:52.180 Is there a way in this world of social media?
00:27:55.720 Is there a way to come back together?
00:27:58.400 Is there a message that will bring us together?
00:28:01.440 Because I feel exactly the same way,
00:28:03.800 Frank.
00:28:04.140 I really,
00:28:05.300 truly believe that the vast majority of people feel this way.
00:28:10.200 They're tired of this.
00:28:11.620 They don't want to live like this.
00:28:13.220 They don't want to be at each other's throats.
00:28:15.940 Well,
00:28:16.560 I want that.
00:28:17.280 There are two things.
00:28:18.080 One is,
00:28:19.180 this is a plug,
00:28:20.400 but not really.
00:28:21.880 I want to hear from those people.
00:28:24.460 And if they go to Luntz Global,
00:28:26.720 which is my website,
00:28:27.520 they can sign up for the focus groups that you talk about and that you watch.
00:28:31.700 They can sign up and their voices can be heard and there won't be any shouting and there won't be any disrespect.
00:28:37.080 They'll get a chance to be heard and they'll get a chance to learn from others.
00:28:40.400 But the other thing is,
00:28:41.840 I want them to see this Vice News HBO clip.
00:28:44.940 And all you have to do is go onto YouTube,
00:28:47.300 type in Alabama and my name,
00:28:49.620 and they'll see the entire seven and a half minutes.
00:28:52.700 Some of it should shock you,
00:28:54.780 should shock them by how explicit they are.
00:28:59.960 Tell me about it.
00:29:01.320 I have a simple question.
00:29:03.720 A 14-year-old,
00:29:05.860 one of the people said his grandmother was married when she was 13 and she had two kids by the time she was 15.
00:29:12.360 That there are a lot of people who would be proud that their daughter of that age was dating a district attorney.
00:29:19.640 I don't get that.
00:29:21.640 That doesn't compute to me.
00:29:22.980 And I don't care if that's 2017 or he was referring to 20 or 30 years ago.
00:29:28.160 It ain't right.
00:29:29.460 It just isn't.
00:29:30.260 But, you know, that's the one thing.
00:29:33.040 I mean, I keep coming back, Frank, to, you know, Jerry Lee Lewis.
00:29:36.120 He married his 13-year-old cousin and nobody in the South had a problem with that.
00:29:41.080 Well, they did have a problem with it.
00:29:42.640 You know this.
00:29:43.440 No, they had a problem.
00:29:44.320 No, no, no.
00:29:44.720 They had a problem with it in England, and that's what really tore everything apart.
00:29:49.420 Well, he would have been, I think he would have been as big as Elvis.
00:29:52.540 I do, too.
00:29:53.080 The man was one of the greatest piano players.
00:29:55.540 And by the way, he played here in L.A. about three weeks ago.
00:29:59.120 And even in his 80s, the man is brilliant.
00:30:01.980 But he never had the career that he could have had, because outside his home area, Americans found that too much to take.
00:30:10.780 Correct.
00:30:11.440 Correct.
00:30:11.860 Outside of his home area.
00:30:13.320 But his home area, and this is really kind of the, you know, same kind of area that Roy Moore is from.
00:30:20.120 I mean, it's different, especially back then.
00:30:24.120 But does that make it okay?
00:30:25.760 No.
00:30:26.020 There was segregation back then.
00:30:27.780 No.
00:30:27.920 Does that make it okay?
00:30:28.840 No.
00:30:29.440 So that's the issue that I have.
00:30:31.180 I know we cannot judge.
00:30:33.220 I've been through this with so many people with these conversations.
00:30:35.920 We cannot judge values and morals by today's standards, looking back 40 years ago, because we think differently and we act differently.
00:30:46.760 But that said, I don't feel like we've learned what we should have learned.
00:30:52.120 I don't feel like we have that same commonality that existed in this country years ago.
00:30:58.300 I think there's so much more that divides us than unites us, and we're looking for those divisions.
00:31:03.360 We are seeking to tear ourselves apart, and that's frightening to me.
00:31:07.200 What is the biggest thing we have in common, Frank?
00:31:09.660 Well, the biggest thing is an appreciation for the country.
00:31:12.260 But I will tell you right now that one out of five Americans isn't patriotic anymore.
00:31:16.960 One out of five Americans does not feel that this is the greatest country on the earth, does not feel that our system is the best system.
00:31:24.760 And that's different. That was the one thing that united us 25 years ago.
00:31:30.480 Under Reagan's administration, we all thought that even with our imperfections, we were still the best.
00:31:37.420 That exceptionalism is gone in one out of five Americans.
00:31:40.620 And out of those one out of five Americans, what do they think is the best?
00:31:44.960 Do they just believe anything's better?
00:31:50.360 No, they won't name anything better, but they refuse to accept American exceptionalism.
00:31:55.660 By the way, they do tend to vote Democrat a lot more than voting Republican, but I don't want to bring partisanship into this.
00:32:05.440 When you can't even agree on your country's values, then we're in deep trouble.
00:32:13.480 Have you tested the Bill of Rights?
00:32:16.700 Yes.
00:32:17.420 How are those testing, those principles?
00:32:20.160 It's really weird.
00:32:21.220 It's like, have you tested mom and apple pie?
00:32:25.500 Right, right.
00:32:26.940 Well, the first problem is that Americans don't even know what the Bill of Rights are.
00:32:32.260 They don't know the three branches of government.
00:32:35.860 We have more people in this country who believe that UFOs exist than believe Social Security will exist by the time they retire.
00:32:42.560 We have more people in this country that can name the home of the Simpsons than where Abraham Lincoln was born.
00:32:51.500 More people can name more Kardashians than can name members of the Supreme Court.
00:32:56.400 All of that scared the living hell out of me, because we know our pop culture absolutely to the last detail, and we know nothing about our founding fathers.
00:33:08.080 Frank Luntz.
00:33:09.100 He is the founder and chairman of Luntz Global.
00:33:11.460 I urge you to go to focuswithfrank.com and sign up for some of his testing.
00:33:19.260 He is one of the best listeners.
00:33:23.580 He is truly empathetic and can hear beyond the words.
00:33:31.040 I think he is, quite honestly, I think he is a solution to many of the things that ail us.
00:33:37.200 If more people will speak honestly and more people, like Frank, will listen, please go to focuswithfrank.com and sign up to be part of his focus groups, focuswithfrank.com.
00:33:49.520 Frank Luntz, always a pleasure and a privilege to have you on the program.
00:33:52.720 Thank you, sir.
00:33:53.300 Thank you.
00:33:54.000 He's an amazing man.
00:34:07.200 I, I, I, I, a tortured soul.
00:34:11.800 That's exactly the word I was about to use.
00:34:13.680 He's a tortured soul.
00:34:14.260 He's an interesting dude, man.
00:34:15.220 He just seems tortured by this.
00:34:17.940 He is.
00:34:18.120 And he continues to do it.
00:34:19.500 He loves the country.
00:34:21.000 He's got all the information.
00:34:22.500 He doesn't know what to do.
00:34:24.420 And nobody will listen.
00:34:26.260 And we've, we've had so many conversations, so many conversations.
00:34:31.120 He's, he's a tortured soul, but man, he's.
00:34:34.140 Knows his stuff for sure.
00:34:35.340 He knows his stuff and he's a good, decent guy.
00:34:38.260 All right.
00:34:38.820 Sponsor of this half hour of ours, Blinds.com.
00:34:40.960 Blinds.com makes it really easy for your house to look and feel the way you want it to feel.
00:34:46.780 Not like a house, but your home.
00:34:49.440 Um, they make it easy because, uh, they have, uh, FaceTime and a million different ways to connect with these people who are, um, interior designers and it's free.
00:35:01.500 So I don't know.
00:35:02.840 I mean, what is that going to look like when you put it up there?
00:35:05.440 What they did is we FaceTime and they will, uh, take a picture of the room or your house and they will then, uh, superimpose the look of the drapes or the shutters or the blinds or whatever it is that you want.
00:35:19.000 And they will show you, this is what it's going to look like.
00:35:22.020 Then they'll send you a piece of the fabric or a piece of the blind.
00:35:24.480 So, you know, the color is exactly right.
00:35:27.320 Then all you have to do is measure.
00:35:29.500 And if you mismeasure, they remake the blinds for you for free.
00:35:32.940 Not only will they completely transform the room,
00:35:35.500 they will give you the best customer service you have ever experienced.
00:35:38.040 And at amazing prices. I want you to go to blinds.com right now, blinds.com.
00:35:45.000 Every order gets free shipping.
00:35:46.720 And, um, now through, uh, what is it?
00:35:50.400 Christmas Eve, you can enjoy the savings site-wide plus a guaranteed 20% off when you go to blinds.com and you use the promo code BECK.
00:36:01.100 So if you use the promo code BECK now through Christmas Eve, you will get 20% off anything for your window coverings at blinds.com.
00:36:10.660 Blinds.com, promo code BECK. Rules and restrictions apply.
00:36:16.760 Glenn Beck.
00:36:25.240 Glenn Beck.
00:36:26.440 Hey, if you want to, um, kind of escape the news tonight.
00:36:29.380 Uh, at 7 p.m.
00:36:31.400 Chapter 2 of The Immortal Nicholas, Rafe and I, um, are reading it every night and we ask you to join us with your family and it's commercial free and it's only online at theblaze.com slash TV for premium subscribers.
00:36:42.280 One other thing to add about Alabama is that the people of Alabama just don't trust the people presenting the information about Roy Moore.
00:36:48.740 Correct.
00:36:49.260 And let me give you an example of this.
00:36:50.920 This is about Al Franken and his, uh, accusations, uh, in the, from the New York Times.
00:36:54.480 In the grand cavalcade of sexual assault charges we've been hearing lately, his list, Franken's list, goes from fanny gropes to tongue thrusts.
00:37:01.260 It's appalling but pretty minor league.
00:37:03.720 And the picture of Franken feeling up the well-protected breasts of a sleeping colleague on a tour could have been subtitled portrait of a comedian who does not suspect he'll ever run for senator.
00:37:13.440 Franken was a good politician and many Democrats hoped he might grow into a presidential candidate.
00:37:17.260 But it was his destiny to serve history in a different way.
00:37:22.960 Wow.
00:37:23.560 Eight women came out against him to accuse him of sexual assault, and they could describe it as serving history.
00:37:28.120 He is a hero.
00:37:33.720 Glenn Beck.
00:37:40.540 Love.
00:37:42.100 Courage.
00:37:43.460 Truth.
00:37:44.900 Glenn Beck.
00:37:45.620 Roy Moore or Doug Jones.
00:37:49.060 Alabama.
00:37:49.760 Alabama's going to finally decide today.
00:37:51.880 This morning, most polls have Roy Moore slightly ahead.
00:37:55.080 Fox News has Jones leading by 10 points.
00:37:58.020 It's very close.
00:37:59.280 There's a 20-point swing.
00:38:00.900 Either way, nobody knows.
00:38:02.920 Democrats have driven themselves insane for about a year trying to figure out what happened when it comes to Donald Trump and Hillary Clinton.
00:38:11.320 Now the left can't understand why Doug Jones isn't running away with this race.
00:38:15.240 After the really creepy allegations against Roy Moore.
00:38:19.680 Why would conservatives vote for a creep?
00:38:23.040 Are all conservatives creeps?
00:38:25.140 What does this say about conservatives?
00:38:26.560 Why would Christians support someone like Roy Moore?
00:38:30.120 Does Christianity now condone Roy Moore-like behavior?
00:38:34.420 That's what they're thinking.
00:38:35.600 That's what they're struggling with.
00:38:36.660 They don't understand.
00:38:37.440 They don't understand that you don't trust the media at all.
00:38:43.400 You don't trust them.
00:38:44.980 You don't trust the people that are coming out.
00:38:48.240 And another big factor.
00:38:50.880 For many conservatives in Alabama, this is the same struggle they had in the last presidential election where they held their nose, voted for Trump.
00:38:57.600 Why would they do that?
00:38:59.820 Democrats, press, listen carefully.
00:39:02.720 They'll save you a ton of time and effort because the answer is really simple.
00:39:07.700 One, they don't trust the messengers.
00:39:12.260 You must stop calling them names.
00:39:16.540 You have to start listening to them.
00:39:19.220 The second thing, abortion.
00:39:23.940 Being a single issue voter is not really sophisticated enough to many on the left and the right.
00:39:32.060 But defending life is such a fundamental principle for most conservatives that it will override most other factors.
00:39:39.540 Yes, even Roy Moore-type allegations.
00:39:43.560 Couple that with this guy is a renegade, an outsider.
00:39:48.500 The people in the party don't like him.
00:39:52.140 That's bonus.
00:39:54.300 That's the political reality in America.
00:39:56.920 They not only don't trust the press, they don't trust the GOP or the DNC.
00:40:04.440 What do you trust, America?
00:40:07.740 The deciding factor in this race wasn't the revelation about Roy Moore's dating habits 40 years ago.
00:40:13.620 It was when NBC's Chuck Todd asked Doug Jones his thoughts on abortion 20 weeks into a pregnancy.
00:40:22.040 Now, I'm not in favor of anything that is going to infringe on a woman's right and her freedom to choose.
00:40:28.440 That's just the position that I've had for many years.
00:40:31.320 It's the position I continue to have.
00:40:33.080 But when those people, I want to make sure people understand that once a baby is born, I'm going to be there for that child.
00:40:39.980 That's where I become a right to lifer.
00:40:42.720 For most conservatives in Alabama and throughout the country, that's a little late to be concerned about life.
00:40:48.880 I mean, you know, partial birth abortion is included in that.
00:40:53.480 When the baby is born, well, what about up until it is born?
00:40:59.320 One Republican pollster in Alabama put it this way.
00:41:02.120 If Jones were pro-life, he'd be up 10 points.
00:41:05.160 If Roy Moore wins today, it's not because Alabama voters like him.
00:41:10.140 They might.
00:41:11.520 I think a lot of them are disgusted by him.
00:41:14.420 But here's what they don't like.
00:41:16.940 The press, the parties, and abortion.
00:41:23.480 It's Tuesday, December 12th.
00:41:30.480 This is the Glenn Beck Program.
00:41:33.800 I want to dramatically switch gears here and talk to you about a book that I've been reading.
00:41:42.800 As you know, if you listen to this program, I'm a big fan of Ray Kurzweil.
00:41:47.520 At the same time, he scares the living daylights out of me because I don't hear a lot of talk about ethics.
00:41:54.100 I hear a lot of talk about what can be done and what is coming.
00:41:57.640 And I read a lot of, lately I've been reading a lot of science fiction and a lot of science.
00:42:04.960 And I am very much into the future and what is coming and what life is going to be like for us in 10 years.
00:42:14.660 10 years. It's been almost twice that long since 9/11.
00:42:19.960 It's going to creep up on us fast.
00:42:21.740 And 10 years from now, our lives, our health, our jobs, possibly, hopefully, our politics are going to be completely different.
00:42:32.220 And it's very exciting, but it's also terrifying.
00:42:35.520 And it's only terrifying if you haven't thought of these things before, because they are coming and you can't put the genie back in the bottle.
00:42:45.000 But do we want it?
00:42:48.100 Should we go this direction?
00:42:50.140 How should we handle it when it does come?
00:42:53.920 Brett King is a futurist.
00:42:56.380 He is the founder and host of the podcast Breaking Banks.
00:43:05.780 He has written a great book, Augmented: Life in the Smart Lane.
00:43:12.280 And I've been reading it for a while now and really kind of trying to digest it as we go.
00:43:18.140 And it has been a springboard for so many other books that I have been reading because of this book.
00:43:23.480 And I'm honored to have him on.
00:43:25.120 Brett, welcome.
00:43:26.760 Brett, welcome to the program.
00:43:29.420 That's great to be on.
00:43:30.620 Thanks for having me.
00:43:31.500 Sure.
00:43:31.780 So I don't even know where to begin with you.
00:43:36.000 I really kind of want to try to just introduce America to some of the thoughts that you put together in Augmented of what the world is going to be like coming our way.
00:43:53.980 There's jobs, there's education, there's health, and then we get into stuff like AI and robotics.
00:44:03.740 But let's just start with jobs, education, and health.
00:44:06.900 So there's four disruptive themes I identified in the book.
00:44:12.240 Obviously, the first and most disruptive technology we're going to deal with over the next 10, 20 years is artificial intelligence.
00:44:19.700 But that's going to spur on a whole range of other changes in society.
00:44:24.320 So the first notable impact is we'll be talking to computers.
00:44:29.540 We'll have computers embedded in the world around us that are collecting data and sensors and so forth.
00:44:34.840 And then that flows on to things like health care.
00:44:38.020 For example, you may have seen in the news, AliveCor just got approval from the FDA to launch a device, a band that essentially attaches to the Apple Watch that can do sort of a full EKG, ECG monitoring of your heart rate over time.
00:44:54.980 But when you tie that with an artificial intelligence, they're now expecting within the next 18 months or two years, I'll be able to predict whether you're going to have a heart attack based on that data.
00:45:05.700 So this is where we see the marriage of sort of sensors, sensors and artificial intelligence really changing the way we think about things like health care.
00:45:15.940 Yeah, you talk about these sensors in a way that has made me want to wear my, you know, my Apple smartwatch a little bit more about the way it's going to be able to detect exactly what's happening in our body.
00:45:35.120 I mean, we would much rather go to the ingestibles you're talking about, like, you know, if you're a diabetic, you'll be able to swallow a computer in the future that will monitor your, your blood work.
00:45:51.300 And so look at your sugar levels, and then, you know, it won't be long before we have an internal device to be able to dispense insulin.
00:45:58.880 So, you know, regulate insulin in our body without having to inject it and things like that.
00:46:03.920 And, and, and, you know, if you've got a, if you've got a complaint, you know, we can, we can get you to swallow a camera now and ingest that instead of having invasive surgery.
00:46:12.200 I mean, there's a lot of stuff happening on the sensor stuff on the health front.
00:46:15.440 Are we, are we, are we entering a time where it's possible to say disease goes away?
00:46:25.180 So the, the, the biggest shift in respect to disease won't necessarily be just diagnosis.
00:46:31.140 I think that, you know, what we can do with an imaging AI right now, machine learning is we can give a, a, an algorithm, 3,000, 5,000 medical images with diagnosis data.
00:46:45.440 And it will be able to do a pretty good job of approximating the diagnosis that you would have got from your, your doctor.
00:46:51.680 So diagnostic, diagnostic technology is going to increase exponentially.
00:46:58.180 And essentially we're going to get these computers doing the best diagnosis possible.
00:47:02.900 Yeah.
00:47:03.060 Talk a little bit, combining all this.
00:47:04.900 Talk a little bit about the computer in New York.
00:47:08.140 This is kind of a, an offshoot of Watson.
00:47:11.660 Watson, you know, could beat anybody at chess.
00:47:15.440 Um, they had the idea of, wait a minute.
00:47:18.520 What if we just put all of the medical information into it and all of the different cases and see if it can, if it can diagnose cancer and it's far better than, than human doctors.
00:47:30.600 So right now, uh, IBM Watson gets about a 96, 97% hit rate in terms of its diagnosis for specific types of cancer.
00:47:39.820 Now, when you compare that to the best oncologists in the U.S. who have 20 years of experience, they get it right about 50% of the time, which of course is why, you know, everyone tells you you should always get a second opinion.
00:47:53.020 Um, so that's pretty impressive.
00:47:55.240 Uh, it's, it's obviously fairly new tech, but what would be really good is if we could eliminate cancer altogether.
00:48:01.580 And so what we're working on is technologies like gene editing, and the two major streams of this are CRISPR and TALEN, where essentially we can now sequence your DNA, but the future is actually modifying your DNA.
00:48:15.760 So if you've got a disease, a protein switch that results in, say, leukemia, we'll be able to flick that switch to create antibodies instead of creating leukemia, just by changing the genome.
00:48:28.040 So I don't know if it was yours, uh, Brad, I, I've been reading so much lately, uh, but do you speak about telomeres in your book?
00:48:36.340 Yeah.
00:48:36.720 So, um, you know, telomere length is another element around longevity.
00:48:41.080 Um, and there's a whole lot of new science coming out around longevity now, which is really interesting, but there's the ability to, uh, you know, insert telomerase, which is, uh, sort of the protein that rebuilds the telomeres. At the end of the DNA are these little, um, uh, you know, strands that sort of hold the DNA together, sort of like the aglets you have on your shoelace, and they fray over time.
00:49:05.360 Now, if we can restore them, then it's believed that we can extend life.
00:49:09.560 So, um, you know, this is, there's a lot of work going into longevity because that is, that is, as those begin to fray, that's the aging process.
00:49:18.800 So if we can, yeah, errors creep in and that's how we age exactly.
00:49:23.240 So, you know, I'm, I'm reading your book and, and half of it, I am more, uh, excited about the future and more convinced that, you know, you just have to just hold on to 2030, 2035 and the world's going to be different.
00:49:40.720 Uh, you're going to, I mean, anything that you're dealing with, we're going to be able to take care of.
00:49:46.880 Um, that's kind of the, the optimist, uh, optimistic feeling that I get.
00:49:51.940 Um, however, the other half, uh, of me, you know, you look at, for instance, talk about the climber.
00:50:00.440 Uh, I don't remember his name, the mountain climber that, uh, yeah, tell the story.
00:50:07.940 So Dr. Hugh Herr, this is a really interesting one.
00:50:10.820 He, uh, when he was 17, he lost both of his legs through frostbite in a, in a climbing, uh, incident where he was trapped on Mount Washington.
00:50:19.180 I think it was for a few days.
00:50:21.180 And so he was very inspired to, uh, you know, to, to fix that problem.
00:50:25.640 So he went to Harvard and MIT to learn bio, uh, medicine and robotics.
00:50:30.140 And he basically built himself new legs.
00:50:33.200 And, and today his friends joke that they're going to have to get amputations as well to keep up with him in terms of his ability to climb a mountain now because of his specialist, uh, uh, prosthetics that he's designed to, to climb the mountains, you know.
00:50:47.320 But, um, this does raise the ethical concern is once we get to the point where prosthetics are able to perform at, or at a better level than our own human limbs, what do we do when people start voluntarily, um, having amputations to get prosthetics because they're going to get improved performance?
00:51:08.100 What do you think we do?
00:51:11.220 So obviously we have to have, you know, we have to start thinking about the ethics of things like artificial intelligence.
00:51:17.320 And technology in a pretty structured manner.
00:51:20.660 We can't just let it happen as we have with the iPhone and the internet and so forth, where we just let the, the pure capitalist, uh, approach take over.
00:51:28.960 We, we need a, uh, an ethical structural approach to these technologies.
00:51:32.820 So there are some initiatives coming out, like DeepMind, the Google effort.
00:51:37.760 They've created an ethics, uh, society to sort of put together or codify ethical standards.
00:51:43.960 But, you know, uh, like it's hard to decide on ethics in our society.
00:51:48.920 Uh, you know, we don't agree on things, as you pointed out at the, uh, the start of the show.
00:51:53.280 How do we codify ethics when as humans, we can't necessarily agree on, on a code of ethics amongst ourselves?
00:51:59.680 I, I worry about this because I, you know, you also look at people like Vladimir Putin, who has recently come out and said, whoever, whoever is the first in with AI controls the world.
00:52:10.740 Uh, I don't think Putin cares about, you know, ethics, uh, or at least the same kind of ethics.
00:52:16.800 No, I, I, I agree with you.
00:52:19.600 Um, so I think there's, there's a couple of concerning elements here right now where we don't have obviously full AI.
00:52:26.720 There's sort of three phases of artificial intelligence.
00:52:29.240 Okay.
00:52:29.500 Hang on just a second.
00:52:30.380 I, hang on.
00:52:31.560 I want to, I want to go there.
00:52:32.700 I have to take a, take a break and we'll come back.
00:52:35.200 Um, I want to talk to, um, Brett about, uh, education because we are not ready for the world that is right around the corner.
00:52:46.120 So what do you do to educate yourself?
00:52:49.460 What should your kids be doing right now?
00:52:52.500 Um, and what should they be looking into?
00:52:54.780 Uh, also the ethics of AI and, um, robotics.
00:53:01.860 It is a strange brave new world that we are headed towards.
00:53:07.460 And we'll talk about that coming up in a second.
00:53:09.400 By the way, the name of the book is Augmented: Life in the Smart Lane, a must-read.
00:53:23.240 No matter who you're shopping for this holiday season, Shari's Berries has the perfect gift for everybody on your list right now.
00:53:29.940 Delicious Shari's Berries dipped in white, milk, and dark chocolate.
00:53:34.200 It is, they are so good.
00:53:36.340 They start at $19.99 plus shipping and handling.
00:53:39.480 You can double the berries for just $10 more.
00:53:41.960 And if you do, your, um, your gift's going to include a, uh, keepsake dessert platter, two gifts in one. Shari's Berries,
00:53:48.560 uh, truly unbelievable.
00:53:50.400 If you've ever had chocolate-dipped strawberries, uh, you know, and you order them online and stuff, they're usually crappy, small, unripe strawberries.
00:53:58.080 These are the biggest and the juiciest strawberries you've ever seen.
00:54:02.860 There is no better gift than Shari's Berries.
00:54:06.180 They've also added amazing treats like snowmen, brownie pops, cheesecake Christmas trees, and chocolate truffles.
00:54:14.680 All of it is just so good.
00:54:16.620 And you can choose the delivery date. Customer satisfaction, always number one, or your money back.
00:54:22.100 The gifts arrive packaged in Shari's Berries' signature gift box.
00:54:25.120 No wrapping is required and there's only one way to get it.
00:54:29.020 $19.99 this holiday season at berries, B-E-R-R-I-E-S, berries.com.
00:54:35.100 Click on the microphone in the upper right-hand corner and enter my code name, BECK, at checkout.
00:54:40.100 I'll give you a secret: that's my actual name, too.
00:54:43.340 Remember, double the berries for just $10 more.
00:54:45.720 Your gift will include a free keepsake dessert platter at berries, B-E-R-R-I-E-S, berries.com.
00:54:53.720 Promo code BECK.
00:54:58.220 Glenn Beck.
00:55:08.360 Glenn Beck.
00:55:09.660 So, um, we are with Brett King, futurist, author of the book Augmented: Life in the Smart Lane.
00:55:17.040 It is so well worth reading.
00:55:18.780 Please read this book.
00:55:20.180 You want to know what the future is and what to do, um, and how to prepare for it.
00:55:24.620 Um, uh, Brett, we have about three minutes.
00:55:26.720 Give us, if you can, uh, as much as you can on the three types of AI.
00:55:32.500 So we start out with, right now, we have machine learning, or deep learning, where machines can
00:55:37.340 observe and watch human behavior and sort of learn to mimic that.
00:55:42.020 Um, so a self-driving car, as an example, or a diagnostic algorithm for medicine;
00:55:48.200 then we get to artificial general intelligence where you'll be having a conversation with
00:55:52.600 your AI.
00:55:53.280 So think about Alexa or Siri on steroids, where you wouldn't be able to tell the difference
00:55:57.560 between that and a human.
00:55:59.380 That's the, that's the Turing test point.
00:56:02.740 Yeah, that's exactly right.
00:56:03.880 So a machine that can pass the Turing test can fool us into thinking it's human.
00:56:07.560 And then we get to strong AI, probably around 2045, something around that timeframe, where
00:56:12.980 you have machines that are smarter than humans.
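A concrete way to picture that first phase, machines learning to mimic observed behavior, is the toy Python sketch below: it clones a handful of made-up human braking decisions with a simple nearest-neighbor rule. The scenario, the numbers, and the rule are illustrative assumptions only, not anything from the show or from King's book.

    # Behavioral cloning in miniature: copy the action from the most
    # similar human demonstration. All data here is made up.
    # (distance_to_obstacle_m, speed_mph) -> action a human driver took
    demonstrations = [
        ((5, 30), "brake"),
        ((10, 45), "brake"),
        ((40, 35), "coast"),
        ((80, 55), "accelerate"),
        ((120, 40), "accelerate"),
    ]

    def mimic(state):
        # 1-nearest neighbor: find the demonstrated state closest to ours
        def squared_distance(demo_state):
            return sum((a - b) ** 2 for a, b in zip(demo_state, state))
        _, action = min(demonstrations, key=lambda demo: squared_distance(demo[0]))
        return action

    print(mimic((8, 40)))    # near the braking demos -> "brake"
    print(mimic((100, 50)))  # near the accelerating demos -> "accelerate"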
00:56:14.980 Now, you have a chapter that I have read and listened to about four times, and it
00:56:25.540 is on robots and artificial intelligence and why it's important to give
00:56:34.740 robots emotions.
00:56:37.560 And in it, you give great reasons for robots to have emotions and AI to have emotions.
00:56:43.980 But now for the other, more controversial reason why robots need emotions:
00:56:48.820 so they don't kill us all. That's the concept behind some of the most innovative artificial general
00:56:54.280 intelligence minds today.
00:56:55.520 We need to ensure that robots like us and have empathy for mankind.
00:57:00.200 Asimov's three laws are not sufficient to protect us from the unknowable future of artificial
00:57:04.780 intelligence.
00:57:05.600 Some, like Elon Musk and Stephen Hawking, believe we need to build very basic motivations as the
00:57:10.600 foundation of future AI, one that enforces a basic love of humans and our planet, or planets.
00:57:16.740 The problem of course, is that any safeguards we are able to implement will always be able
00:57:22.620 to be circumvented by any intelligence greater than our own.
00:57:28.720 Yeah, you go on in that.
00:57:31.480 I'll save it for a second.
00:57:32.640 I would like you to talk about this problem that is possibly coming, of artificial
00:57:40.980 intelligence being greater than our own.
00:57:43.900 And as you say, we will be compared to the fly on the plate in the kitchen.
00:57:50.720 We might know that there's food for us, but we have no idea about the rest of the world.
00:57:56.640 Glenn Beck.
00:58:02.640 You're listening to the Glenn Beck Program.
00:58:05.760 Brett King.
00:58:06.560 He's a futurist from Australia.
00:58:08.640 He's the CEO and founder of Moven.
00:58:11.140 He's a podcast host, Breaking Banks, and author of Bank 4.0.
00:58:17.100 It's going to be released early next year.
00:58:19.940 The book that I found him through is Augmented: Life in the Smart Lane.
00:58:27.180 And it is a mind-bender.
00:58:32.620 If you want to know what's happening in the future and what you should be thinking about,
00:58:37.980 even all the way down to how we should start educating our kids, what we prepare them for,
00:58:43.240 read Augmented: Life in the Smart Lane.
00:58:45.980 Brett, let's just start with what I left off on: we have to start being nice to robots and AI,
00:58:58.680 and we better start really learning that quickly, because they're going to be smart enough to best us in anything we do.
00:59:06.940 Well, you know, be kind to robots is a good rule, I think.
00:59:12.020 But, you know, when you hear Elon Musk, who, incidentally, you know, despite all his criticism of AI
00:59:17.440 and the threat to mankind, he's just announced his own AI initiative.
00:59:21.900 So maybe that was a bit of marketing.
00:59:24.920 But, you know, it's not necessarily that machines are going to be, you know, malevolent or benevolent.
00:59:30.940 The one thing we're learning is that artificial intelligence, they don't think like us as humans.
00:59:37.180 So when we attribute a superintelligence and the fact that it's going to take over the world like, you know, T-1000 Terminator,
00:59:44.700 you know, we are thinking in human terms, but it's not necessarily the case that machines are going to act like humans.
00:59:51.840 So I think that's the one saving grace here is that, you know, sufficiently advanced AIs may not really care about us that much.
01:00:00.160 They may have their own agenda, which we have to cope with.
01:00:05.280 But that's, again, where I think empathy is important.
01:00:07.760 I think if they have empathy for their creator, us, I think that that will help us.
01:00:13.160 So I think building empathy and ethics into robotics is sort of key for a safety valve.
01:00:19.720 Well, I mean, you can already see the seeds planted: there's a robotic brothel in Germany now.
01:00:28.360 And people like to go there and, you know, they'll have their way with the robots, and the wives, I guess,
01:00:33.860 wait in the parking lot for the guys because, you know, it's not like really cheating and all this stuff.
01:00:38.820 And you think about how these robots, some of these robots are going to be used and abused by people.
01:00:46.080 If it is AI at some point, as Kurzweil says, an Age of Spiritual Machines, at some point it will say: don't, I'm hurt, I'm lonely.
01:00:56.180 And then, you know, you do have the human emotion.
01:01:00.140 Well, you know, the other element of this, of course, is as these AIs get very, very good at understanding human behavior and learning to adapt to our concerns.
01:01:11.600 If you have a personal AI encoded in your smartphone, for example, you know, it could become your best friend.
01:01:19.980 In fact, you know, people may fall in love with their AI.
01:01:23.340 Oh, I think they will.
01:01:24.200 In their environment around them.
01:01:25.360 You know, I think that's because if you've got someone who reacts to you in a perfect way, responding to your every need, then, you know, that's a great basis for a friendship.
01:01:34.260 I mean, quite honestly, Brett, they don't have to destroy us.
01:01:38.700 They just have to get us to fall in love with them and not procreate.
01:01:43.760 I mean, why would you ever... I've thought about this for a long time.
01:01:46.880 If I could come home and it's the perfect woman, who has every trait that I love, physically, mentally, everything else.
01:01:58.720 I don't have to hear about their day.
01:02:00.500 They only care about me.
01:02:01.980 They're thinking what I'm thinking.
01:02:03.760 They're adding something to the conversation.
01:02:05.900 It's mind blowing sex.
01:02:08.140 And if I decide I can change the way she looks or I can change anything, I want to try something new.
01:02:14.540 Why would you ever go on a date?
01:02:19.360 This may be the resurgence of humanity as well.
01:02:22.880 I think we'll have an approach where we get totally into this technology.
01:02:27.660 It infuses into society and people get carried away with it.
01:02:30.780 But, you know, there may be an authenticity to the human experience that we miss after a time.
01:02:36.040 And I think, you know, that's probably where, as humans, we'll need to differentiate.
01:02:40.720 We'll have to differentiate in our very humanity.
01:02:43.540 So, you know, you talked about employment and education.
01:02:46.520 You know, if you want to be relevant in that future, you're going to have to be extremely adaptable.
01:02:52.220 But I think the skills that will come to the fore are those that are really human, that cater for that real human contact, that human touch, that really authentic humanity.
01:03:03.860 Because of your book, I talked to my 13-year-old son, who is just a really empathetic kid and just loves people and loves children.
01:03:15.560 And I said, you know, have you ever thought about going into nursing and being a nurse practitioner?
01:03:20.360 And we talked about, you know, having your own robotics that you would be watching over several patients, but you would be the one that would be able to come in and kind of telepresence and be able to be there for people and have the actual person to person experience.
01:03:37.600 Nursing is not going to be like it is today.
01:03:40.280 Well, you know, if you look at how AI is going to impact jobs right now, the biggest impact we see, particularly in markets like the U.S. and even China, is in process work, where humans are involved in process: ticking the box, following a checklist, you know, these sorts of things. Accountants, lawyers, you know, bankers, bank tellers.
01:04:03.840 But the thing where we see a lot of demand coming is those human elements, the creative elements, design and counseling. We think counseling and psychology and those sorts of elements, particularly as the role of work in society shifts and we become less defined by what we do and more defined by who we are.
01:04:25.260 You know, there's going to be a huge demand for those sorts of human elements of behavioral psychology and counseling.
01:04:32.840 So, you know, it's really easy to say, I'll never do this.
01:04:37.060 I'll never upgrade.
01:04:38.800 I'll never augment.
01:04:39.920 We'll all augment. And you describe how artificial superintelligence will be so far ahead of us that we won't even be able to understand it.
01:04:54.640 You know, I look at it this way: if I'm an augmented human, I've augmented my brain and I'm connected to the, you know, Borg or whatever it would be.
01:05:05.720 I am looking at the world differently.
01:05:09.060 I have access to knowledge, and I'm talking to a non-augmented human.
01:05:14.380 They can't even follow.
01:05:16.020 Well, you know, Kurzweil and Musk, of course, say that for us to keep up with AI, we're going to have to augment our intelligence.
01:05:30.160 Now, this sounds pretty far-fetched, putting neural implants in so we can do a Google search in our head, for example.
01:05:37.540 But, you know, that's only one step away from where we are today, where we pick up the phone and we, you know, ask Google or Alexa to search for information.
01:05:46.560 You know, my kids will never have to pick an Encyclopedia Britannica off the shelf to learn about, you know, I don't know, how many moons are around Jupiter.
01:05:54.080 They can just ask their computer.
01:05:56.360 So we've already augmented our intelligence.
01:05:58.500 But this is, you know, when we talk about things like robotic prostheses: if you have, you know, shortsightedness or a problem with your vision, would you be prepared to wear an implant that could give you 20/20 or better vision?
01:06:15.600 And we already wear spectacles.
01:06:17.140 So we already have been augmenting our vision for centuries.
01:06:20.260 I will tell you, Brett, that as I read your book... I've always said I won't augment.
01:06:25.560 I won't. I will not put a chip in.
01:06:27.600 However, when you really think about it and if somebody came to me and said, Glenn, I can give you photographic memory.
01:06:33.620 I can give you access to everything.
01:06:35.560 You just implant this. And just based on what I do for a living,
01:06:40.420 it would give me such an advantage that I would really be hard-pressed not to do it, because I would know also that if I don't do it, the other guys are going to do it, and there's no way I'll be able to compete.
01:06:54.440 I mean, it's going to be a really tough choice.
01:06:56.540 And this is where science fiction actually informs us about some of these things, because, you know, we've seen sci fi writers write about this and talk about the fact that you've got natural humans versus augmented humans and the battle between these two ethically.
01:07:12.340 And, you know, I think that that's probably a pretty real thing that we're going to have to deal with.
01:07:16.160 Of course, some of it's a little bit more simple, you know, like, for example, in respect to repairing damage that you might have to your body, you know, prosthesis is there, but we're working on 3D printed organs.
01:07:30.540 So bioprinted kidneys and hearts and things like that.
01:07:33.940 So, you know, if you develop heart disease in the future, we may be able to 3D-print a new heart using your own stem cells, so that you don't need any rejection medicine anymore.
01:07:44.320 And, you know, you could get a new heart and that could extend your life by 20 or 30 years.
01:07:48.740 Who wouldn't do that?
01:07:50.200 Brett, I have a daughter with cerebral palsy.
01:07:52.580 She had strokes at birth.
01:07:54.220 And we have talked about this a lot.
01:07:57.100 And I know now, you know, exoskeletons are being developed that would give her full use of her arm and her hand.
01:08:05.680 And is there a time in her lifetime, she's 29 now, where she would have the fog in the way she thinks lifted?
01:08:18.960 Certainly, I think within the next 20 to 30 years, that's a real possibility.
01:08:25.840 I think, with both gene therapy and with augmentation technologies, the word disability will disappear from our vernacular.
01:08:37.360 Are you concerned at all, especially when it comes to gene manipulation? Iceland has now become the only place that has a zero birth rate of Down syndrome.
01:08:54.460 And it's because of abortion and early detection.
01:08:59.120 I don't know if that's a good thing.
01:09:01.280 I mean, you know, I just don't know if it's a good thing.
01:09:05.880 I've met a lot of people with Down syndrome, and I quite honestly think when I'm with them, I think, and excuse the use of this word, but it's appropriate for us.
01:09:14.340 We're the retarded ones, not them.
01:09:17.960 They have a connection to humanity.
01:09:21.000 Are you worried at all of this world that we're going into that can just make everybody perfect?
01:09:30.120 Designer babies, you know, we've heard talk about it, you know, in vitro manipulation of DNA and so forth.
01:09:35.880 You know, it's obviously, there's a huge ethical minefield in terms of where do you stop, where do you start?
01:09:41.760 Now, if there's a congenital disease that's going to debilitate, you know, that child for the rest of its life and you can fix it, then why wouldn't you?
01:09:51.300 But at the same time, what if I was able to change your hair color or your skin color or your muscle tone, you know,
01:09:58.660 and get you to be more athletic or more mathematically inclined?
01:10:02.780 You know, and this is where it's a slippery slope.
01:10:05.820 Having said that, I think what history teaches us, and this is the inevitability of this and why I choose to be optimistic,
01:10:14.240 is that with all of these technologies, if you look back to the start of the Industrial Revolution,
01:10:21.580 we don't have the self-control to limit humanity's experimentation with these things generally.
01:10:28.780 We rush forward and embrace them and worry about the complications later.
01:10:33.800 Okay, so I've only got about a minute, and I'd love to have you back to talk about banks and everything else.
01:10:39.980 That's a new book, and of course you're...
01:10:41.720 We didn't get to Bitcoin yet.
01:10:43.060 I know, we didn't, and I'm out of time, but I've got to ask you about Bitcoin.
01:10:48.180 Where do you see that?
01:10:51.000 So it is no longer a currency.
01:10:53.480 It's become a crypto asset.
01:10:55.200 It's got a long way to grow.
01:10:56.940 I mean, if you look at the U.S. bond market, $31 trillion, the gold market, you know, $8 trillion.
01:11:02.120 You know, this total cryptocurrency market right now is only about $500 billion.
01:11:07.580 So I think it's still got a long, long way to go, but don't think of it as a currency.
01:11:12.200 Think of it as an alternative asset class that's sort of replacing commodities like oil,
01:11:16.720 which no longer have the returns they used to have.
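To put those figures in perspective, here is the back-of-the-envelope arithmetic on the numbers quoted above, as a small Python sketch; the "growth headroom" framing is an illustration for scale, not investment analysis.

    # Market sizes as quoted on air, December 2017, in U.S. dollars
    us_bond_market = 31e12   # $31 trillion
    gold_market = 8e12       # $8 trillion
    crypto_market = 500e9    # $500 billion

    # How many times over crypto would have to grow to match each market
    print(f"to match gold:       {gold_market / crypto_market:.0f}x")     # 16x
    print(f"to match U.S. bonds: {us_bond_market / crypto_market:.0f}x")  # 62x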
01:11:19.200 And what about things like Litecoin or Ethereum?
01:11:22.520 Is that a currency?
01:11:24.760 Well, so, you know, Ethereum probably has potential to be a currency,
01:11:28.580 and there are alternatives to Bitcoin, like forks that have occurred, like Bitcoin Cash,
01:11:34.180 that could come into mass currency use.
01:11:37.100 I think we'll start to see different methods of value exchange emerge,
01:11:40.680 because we live in a digital world where commerce is digital these days,
01:11:45.040 and the U.S. dollar has no strategic advantage in that digital landscape.
01:11:49.900 Is there a way that countries are going to say,
01:11:52.580 we've got to control this, and they'll come out with their own cryptocurrency?
01:11:57.180 Yeah, absolutely. China's working on that.
01:12:01.080 They have their own blockchain.
01:12:02.540 They will launch their own cryptocurrency as a competitor
01:12:05.280 to try and sort of become the first digital-backed currency.
01:12:10.080 But then Japan and Venezuela and other markets are starting to adopt Bitcoin as official currency,
01:12:16.700 so it could go either way, to be honest.
01:12:18.720 All right. Brett King, futurist, founder of Moven, and author of several books.
01:12:27.940 Bank 4.0 comes out soon.
01:12:29.700 We've been talking to him about Augmented: Life in the Smart Lane.
01:12:33.080 Thank you so much, Brett. I appreciate it.
01:12:35.260 Thanks, Brett. I appreciate it.
01:12:41.460 You've got to read that book. It's a great book.
01:12:44.140 It's intense, man.
01:12:45.060 You will be so optimistic, and at the same time, you will say,
01:12:50.800 no way out, no way out.
01:12:53.560 During your holiday shopping, if you are using your mobile device,
01:12:58.980 retailers expect 54% of the holiday shoppers to go to their website from a mobile device.
01:13:04.600 And scammers see this as an opportunity to steal your credit card and information
01:13:08.100 and other personal data.
01:13:09.200 And they distribute phony retail apps.
01:13:12.540 Be really, really cautious.
01:13:13.880 Download the apps.
01:13:14.860 Only the ones that you know are from the stores, and that store has a real reputation.
01:13:20.040 One in four people have experienced holiday theft,
01:13:22.520 and if you're only monitoring your credit,
01:13:24.940 your identity can be stolen in ways you're not going to detect.
01:13:27.720 Thieves could sell your information on the dark web
01:13:29.660 or get an online payday loan in your name.
01:13:32.180 LifeLock detects a wide range of identity thefts,
01:13:35.780 and if there is a problem, a U.S.-based identity restoration specialist is going to work to fix it.
01:13:41.140 Nobody can prevent all identity theft.
01:13:43.260 Nobody can, you know, monitor all transactions at all businesses.
01:13:46.320 But LifeLock can help you see more threats to your identity.
01:13:49.760 So call them now.
01:13:50.780 800-LIFELOCK.
01:13:52.120 1-800-LIFELOCK.
01:13:54.180 Use the promo code BECK, and you get 10% off your LifeLock membership.
01:13:57.400 LifeLock.com, promo code BECK, LifeLock.com, or 1-800-LIFELOCK, promo code BECK.
01:14:08.280 Glenn Beck.
01:14:20.160 Glenn Beck.
01:14:23.400 Alabama.
01:14:27.400 Alabama, and court cases that make absolutely no sense.
01:14:33.140 The four people who kidnap a handicapped kid off the streets of Chicago
01:14:39.720 and torture him online for days: set free.
01:14:44.840 The guy in California who shoots a woman on a pier: set free.
01:14:49.380 And the cop in Arizona who clearly kills a guy: not a problem.
01:14:56.160 What's happening to us? Next.
01:14:58.220 Glenn Beck.
01:15:04.320 Love.
01:15:05.900 Courage.
01:15:07.440 Truth.
01:15:08.700 Glenn Beck.
01:15:09.500 I have a feeling this Russia investigation is going to end up poorly.
01:15:14.480 I think for us.
01:15:15.960 For us.
01:15:16.360 At this point, the DOJ and the FBI need to be investigating themselves along with the case of Russia.
01:15:26.220 It came out yesterday on Meet the Press that the special counsel was busy trying to piece together what happened inside the White House
01:15:33.680 after senior officials were told that Mike Flynn was susceptible to blackmail.
01:15:38.000 Okay, sounds good.
01:15:39.560 But why is Mueller still bogged down over that?
01:15:44.060 Hand that off and get busy investigating, I don't know, the largest hostile intelligence operation conducted in United States history.
01:15:53.560 You know, the little thing when Russia attempted to interfere with the election and seemed to be influencing and worming its way into both parties.
01:16:03.860 You know, the crime you're supposed to be concentrating on.
01:16:07.920 The last time I checked, the election happened months before Trump and the transition period.
01:16:14.480 If you see obstruction of justice there, great.
01:16:18.580 Well, I mean, not great, but get on that and take it on.
01:16:22.640 But back to the campaign in Russia.
01:16:25.360 We still don't know how a company like Fusion GPS became so influential in this investigation.
01:16:31.980 Fusion GPS, a company that was employed by the Russians, the Clinton campaign, the DNC.
01:16:38.260 Does it sound like the information you'd get from them might be just a little biased?
01:16:42.800 Apparently, that knowledge was either lost on the FBI and DOJ or they didn't care, because we now know that the FBI was trying to pay to have the Fusion GPS operative continue his work.
01:16:58.020 When the House Intelligence Committee began pushing Deputy Attorney General Rosenstein to answer some of these things, the pushback began.
01:17:11.180 Now, what possible motive would he have to stonewall Congress?
01:17:17.000 Well, it came out last week that senior DOJ official Bruce Ohr was demoted after it was found out that he had secret meetings with the head of Fusion GPS.
01:17:26.720 Now, that alone sounds bad, but that was so last week.
01:17:30.500 Yesterday, Fox reported that it was not only Ohr holding secret meetings with Fusion GPS; his wife was also working for them.
01:17:38.420 She was employed by Fusion GPS while the Trump dossier was being compiled.
01:17:48.040 Before working at Fusion GPS, she worked at a Washington think tank, where she was described as a Russia expert.
01:17:54.120 Want to take any guess where Ohr's office is at the DOJ, or was?
01:17:59.700 Lo and behold, just a few doors down from the same Deputy Attorney General.
01:18:04.760 Now, whether something nefarious happened here or not, you could very easily make the case that this is the reason the outlandish Trump dossier was used to possibly get FISA warrants to spy on a presidential candidate.
01:18:18.580 You could also make the case that the bias is so thick that it's also the reason why a special counsel seems so hell-bent on proving obstruction of justice rather than finding out who was involved in possibly the worst foreign intelligence attack in our history.
01:18:36.300 Is the FBI, is the DOJ dirty?
01:18:43.760 They need to get their house in order.
01:18:46.640 We need to be able to trust someone.
01:18:50.100 Then I don't care if you put Clinton and Trump in jail if they belong there.
01:18:55.680 I just want to know the truth.
01:18:58.200 It's getting embarrassing.
01:18:59.600 Someone needs to step up and figure out how we're so easily hacked and infiltrated by the Russians.
01:19:07.460 I don't care if it's Joe Friday or Perry Mason or Jim frickin' Rockford.
01:19:11.860 We need outside investigators.
01:19:15.340 And time is running out because we're approaching the midterm election.
01:19:20.340 It's Tuesday, December 12th.
01:19:28.900 This is the Glenn Beck Program.
01:19:36.580 All right, there are three cases that really bother me.
01:19:41.220 Three cases that I don't even understand.
01:19:44.940 You remember the woman who was part of the group, there were four or five people, they were all adults.
01:19:53.740 They kidnapped this white handicapped kid, mentally handicapped, and they gagged him, bound him, beat him, tortured him for four days.
01:20:06.220 He finally gets away and he's so freaked out.
01:20:10.300 Well, they had put all of this on Facebook.
01:20:12.560 One of the girls, the first one to go to trial, she's just been given community service.
01:20:19.680 Are you kidding me?
01:20:23.400 What is the judge thinking?
01:20:26.520 What is the jury thinking?
01:20:28.780 How is this possible?
01:20:32.200 The other case, in San Francisco.
01:20:35.120 What was the DA thinking, not charging somebody who is here illegally, who, quote, finds a gun, it goes off, and he kills a woman?
01:20:52.300 Now, it was a ricochet, so it's not murder.
01:20:55.800 How was manslaughter not the thing he was charged with?
01:21:00.500 How did the jury say not guilty?
01:21:04.480 And then, over the weekend, I don't know if you saw this brutal, brutal tape of police coming after a guy, he's laying down in the hallway, he's clearly drunk or drugged or something.
01:21:20.940 And the police, you know, have their flak jackets on and they've got an AR trained on him.
01:21:28.100 And the vest cam shows this guy and they're just barking orders at him.
01:21:34.480 Put your hands up.
01:21:35.760 Don't move.
01:21:37.000 Now, put your hands up.
01:21:39.060 Get on your hands and knees and crawl.
01:21:41.020 How can I do both of those?
01:21:42.880 And, by the way, I'm drunk.
01:21:44.040 And if you make a mistake, we're going to shoot you and kill you.
01:21:48.740 It is a horrific scene.
01:21:50.780 I am usually not somebody who wants to second guess the police.
01:21:54.320 But this one, I just don't understand.
01:21:59.520 We reached out to Norm Stamper.
01:22:03.360 He is the author of the book, To Protect and Serve, How to Fix America's Police.
01:22:07.800 He was a police officer for 34 years, the first 28 in San Diego.
01:22:12.960 In the last six, he was the chief of police in Seattle.
01:22:16.620 Norm, how are you, sir?
01:22:18.700 I'm doing very well, Glenn.
01:22:20.140 Thank you.
01:22:21.840 So where do you normally stand on the police?
01:22:25.060 I mean, I know that you're very disillusioned with the police.
01:22:32.480 But the reason why I ask, I'm from Seattle.
01:22:35.160 And I see, you know, a chief of police from Seattle.
01:22:39.620 And I wonder what that even means anymore, because Seattle has gone so far off the rails.
01:22:47.440 Well, and of course, Seattle was one of the cities that was investigated by the Department of Justice
01:22:53.420 and is now in the middle of a series of reforms that seem to be taking quite well.
01:23:00.940 Okay. So you were a police officer, and you said you got into it because you really wanted to do the right thing.
01:23:09.020 And then what happened?
01:23:11.240 Well, that lasted about five minutes.
01:23:13.380 I got sucked into the culture of policing.
01:23:15.940 And, you know, from the very beginning, had an enormous amount of respect for police officers who do their jobs
01:23:22.780 and do them well, oftentimes heroically.
01:23:26.420 And it's painful to see what we saw coming out of Mesa, Arizona, this last incident that you're talking about.
01:23:34.520 That was a wonderful example of a police officer who needed to de-escalate a situation,
01:23:42.360 but did exactly the opposite, to tragic effect.
01:23:46.580 He said, his attorney argued in front of court, and the jury bought it, that he was following procedure,
01:23:53.140 and that's exactly, he followed his training to the letter.
01:23:57.400 Well, here's the problem.
01:23:59.820 When looking at these individual cases, you can look at Laquan McDonald in Chicago,
01:24:04.840 you can look at Tamir Rice in Cleveland, 12-year-old boy,
01:24:08.640 you can look at Walter Scott running away from Officer Michael Slager in North Charleston, South Carolina.
01:24:13.940 In each of these cases, and in too many other cases, those shootings were predictable,
01:24:20.640 for a reason I'll explain in just a second, and also utterly preventable.
01:24:26.120 Nobody needed to die on those days.
01:24:29.340 It's predictable because officers generally are doing what they've been taught to do.
01:24:35.100 I believe that in the Mesa, Arizona case, that was a distortion of what the officers have been trained to do,
01:24:42.060 but I don't know that.
01:24:43.660 What I do know is a lot of the training takes place in the locker room or in the front seat of a police car,
01:24:49.840 and it really gets to the culture of policing, not the formal structure,
01:24:53.860 not what's being taught in the academy,
01:24:55.760 but rather what is being described as reality
01:25:01.920 and the best response to given situations in the real world.
01:25:06.920 So you've got a good number of senior officers who are, dare I say, paranoid,
01:25:15.020 who are so frightened that they can't see straight.
01:25:18.420 Under the influence of fear, we misperceive situations,
01:25:22.380 we misjudge situations, and the consequences can be horrific.
01:25:27.160 So, Norm, if I'm a cop, I'm listening to you right now going,
01:25:35.440 damn right I'm nervous.
01:25:37.880 I have every reason to be nervous.
01:25:39.480 Every time I walk up to a car, I don't know if somebody's going to shoot me.
01:25:42.460 I'm walking into things I don't know every single day,
01:25:45.260 and people are becoming more and more hostile towards us.
01:25:49.460 I think you're absolutely right,
01:25:51.620 and what that argues for is the kind of education and training,
01:25:56.180 good selection process in the first place,
01:25:58.600 but the kind of education and training that equips officers to deal with that reality.
01:26:03.700 We understand that many of the people you're going to interact with as a police officer
01:26:08.760 are not at their best.
01:26:10.680 We understand that there are some radically evil people out there
01:26:16.400 who would just as soon shoot you as look at you.
01:26:16.400 So, if that represents the reality,
01:26:19.760 how are we acculturating and training our police officers
01:26:23.720 so that they can handle confidently and maturely these kinds of situations?
01:26:30.520 What we saw in that last incident was a police officer who was utterly out of control.
01:26:37.600 His voice, his shouts on the tape, on the video,
01:26:42.380 were almost as loud as the five shots that he fired
01:26:46.340 that ended that young, innocent man's life.
01:26:49.900 So, I think it's just vital that we recognize the real world,
01:26:54.740 and we provide the kind of education and training
01:26:57.560 that really equips police officers psychologically,
01:27:01.700 as well as cognitively, to handle these situations.
01:27:05.940 How do we, as a public, how do we judge these things?
01:27:12.280 Because I never want to second-guess the cops,
01:27:16.140 unless it's pretty blatant.
01:27:17.300 I thought this was pretty blatant.
01:27:18.880 Unless it's pretty blatant, because I'm not in that situation.
01:27:23.640 And, you know, to expect people
01:27:26.680 to behave perfectly, exactly like, you know,
01:27:30.280 you would behave sitting on your couch,
01:27:32.340 which, you know, is not reasonable.
01:27:37.100 So, how do we judge these?
01:27:38.320 I couldn't agree more.
01:27:39.640 I think we judge these things
01:27:43.420 through our own filter of education and awareness,
01:27:47.280 so that we really understand better
01:27:49.760 lethal force situations
01:27:53.000 that confront police officers.
01:27:55.480 I mean, let's be real about this,
01:27:57.680 as you're suggesting we do,
01:27:59.540 and recognize that sudden violent death
01:28:02.740 is an occupational hazard for police officers.
01:28:06.300 How do we minimize the risk?
01:28:08.360 How do we equip, train, weaponize our police officers
01:28:12.940 so that they're as safe as they can possibly be,
01:28:16.220 given those circumstances?
01:28:17.980 But we do them no service
01:28:20.180 when we help create an air of paranoia,
01:28:23.440 when we cause police officers
01:28:27.780 to develop a frame of mind,
01:28:29.680 and this officer tragically had that frame of mind.
01:28:33.260 He had written on, you know,
01:28:35.780 engraved on the barrel of his AR-15,
01:28:38.860 you're screwed.
01:28:40.180 And I cleaned that up for you and your audience.
01:28:43.380 Yeah, okay.
01:28:44.880 That's an officer who's got a mentality
01:28:47.640 that has no place in police work.
01:28:49.860 Look, the best cops,
01:28:52.180 I've heard one say this recently,
01:28:54.240 my philosophy is every shift
01:28:56.900 for the rest of my career,
01:28:58.960 I go out there with this truth in mind,
01:29:02.180 nobody dies tonight.
01:29:04.040 Nobody dies on my shift.
01:29:05.580 That means me.
01:29:06.820 I mean, I want to make it home to my family.
01:29:08.800 I'm interested in self-preservation,
01:29:10.820 as is every other, you know,
01:29:13.060 non-suicidal human being.
01:29:15.580 But I also am going to put at the top of my priorities
01:29:21.000 the protection and preservation of human life.
01:29:24.340 And I'm going to get really good
01:29:26.340 at defusing and de-escalating situations.
01:29:30.740 I'm also going to be prepared,
01:29:32.520 should it come to it,
01:29:33.720 to pull the trigger of my gun if I have to.
01:29:36.420 Talking to Norm Stamper, author of
01:29:37.640 To Protect and Serve:
01:29:38.720 How to Fix America's Police.
01:29:40.220 Norm, it seems to me that there's education
01:29:41.960 that really needs to come on both sides of the equation.
01:29:44.260 How we deal with police
01:29:46.820 and understanding the situations they're in.
01:29:49.840 It seems like there's a lot of people
01:29:53.000 who have issues understanding that,
01:29:56.180 you know, these interactions can get messy
01:30:00.400 and can, I think, put people on edge.
01:30:04.380 And by being respectful,
01:30:05.960 by handling these situations as well as possible
01:30:08.900 from the perspective of someone
01:30:12.280 who's being pulled over,
01:30:13.380 who is in one of these situations,
01:30:15.340 would go a long way to solve
01:30:17.000 at least some of these problems,
01:30:18.660 though not in this particular case.
01:30:21.900 I do agree with that.
01:30:23.640 I think that there are citizen academies
01:30:26.460 being run by many jurisdictions across the country.
01:30:29.740 There are efforts to achieve a genuine partnership
01:30:34.420 between community and police
01:30:35.800 so that citizens are actually involved
01:30:38.200 in policymaking and program development,
01:30:41.080 crisis management.
01:30:42.260 This is my agenda, by the way,
01:30:44.220 creating this genuine partnership
01:30:46.620 so that citizens are being informed
01:30:50.500 about police policies and practices
01:30:52.840 and the kinds of real-world situations
01:30:55.040 that our officers confront.
01:30:57.060 The more understanding,
01:30:58.860 the more likely it is
01:31:00.240 that we will see responsible behavior,
01:31:03.720 frankly, on both sides of the equation.
01:31:06.060 Yeah, the officer that you used to have
01:31:08.060 a long time ago
01:31:09.020 that knew the community
01:31:10.560 and the community knew him.
01:31:13.160 You know, I lived in New York for a while.
01:31:14.740 Those people aren't even,
01:31:16.040 they're nowhere near Manhattan
01:31:18.260 and they're not in the community.
01:31:21.900 And it just changes.
01:31:23.660 It just changes things
01:31:24.780 when the community is not connected
01:31:26.440 and the police officer
01:31:28.000 is not connected to the community.
01:31:29.280 Thank you so much.
01:31:30.700 Appreciate it, Norm.
01:31:31.700 You bet.
01:31:32.320 My pleasure.
01:31:41.840 Norm Stamper wrote the book
01:31:43.000 To Protect and Serve:
01:31:43.900 How to Fix America's Police,
01:31:45.040 and also Breaking Rank:
01:31:46.100 A Top Cop's Exposé
01:31:47.300 of the Dark Side of American Policing.
01:31:49.700 And we've talked about this before.
01:31:50.860 You have to have both of these things.
01:31:53.560 You have to make sure
01:31:54.480 the good cops get credit
01:31:56.020 and you have to make sure
01:31:57.080 the bad cops get punished
01:31:58.220 for what they do wrong.
01:31:59.280 I want to talk to you
01:32:01.040 a little bit about gold line.
01:32:02.400 You know, I've talked to you
01:32:03.780 about Bitcoin
01:32:05.080 and Bitcoin is something
01:32:06.280 that comes once in a lifetime.
01:32:08.460 You will never see this stuff
01:32:10.360 in your lifetime.
01:32:12.520 It's pretty insane.
01:32:15.360 Gold is where the world returns.
01:32:19.120 It is the currency
01:32:21.200 that has been around
01:32:22.580 since the Bible.
01:32:24.160 Jesus, gold, frankincense, and myrrh.
01:32:26.340 We don't trade in myrrh anymore.
01:32:28.060 We don't trade in frankincense.
01:32:29.540 We do with gold.
01:32:31.080 Now, Goldline:
01:32:32.800 they've just been purchased
01:32:34.100 by A-Mark.
01:32:35.160 This is one of the largest
01:32:36.080 publicly traded
01:32:36.860 precious metals wholesalers.
01:32:38.380 They are now able to offer you
01:32:40.480 much more efficient ways
01:32:41.900 to buy precious metals.
01:32:43.420 And so gold line
01:32:44.140 is slashing the prices
01:32:45.680 on its most popular products
01:32:47.580 to prices that they have
01:32:48.620 never, ever offered before.
01:32:50.940 I want you to go to Goldline.
01:32:52.500 Even if you've purchased gold before
01:32:54.140 or you've been thinking about it,
01:32:55.600 do it now.
01:32:56.300 I have not seen these prices.
01:32:57.840 They're pretty incredible.
01:32:58.780 Call 866-GOLD-LINE
01:33:00.320 1-866-GOLD-LINE
01:33:02.300 or goldline.com.
01:33:04.600 Read the important risk information
01:33:07.200 and find out if gold or silver
01:33:08.360 is right for you.
01:33:09.200 But it's 866-GOLD-LINE
01:33:11.400 or goldline.com.
01:33:16.360 Glenn Beck.
01:33:22.200 Glenn Beck.
01:33:23.640 Okay, so Netflix released
01:33:27.800 its year-end report.
01:33:29.960 It revealed some concerning
01:33:32.540 streaming habits from some people.
01:33:36.280 There is a user in the United Kingdom
01:33:38.600 who watched the movie,
01:33:41.580 Bee Movie,
01:33:43.060 357 times last year.
01:33:46.580 There is somebody who watched
01:33:49.380 The Curse of the Black Pearl
01:33:51.380 every day for an entire year.
01:33:57.100 People watching the same movie,
01:33:58.920 that is not unusual.
01:34:00.360 Watching the same movie
01:34:02.440 300 times a year,
01:34:03.740 that is unusual.
01:34:05.120 I'm sorry to say that's weird.
01:34:07.960 There is a group of users
01:34:09.500 who are watching
01:34:10.440 the original Christmas movie
01:34:11.300 from Netflix.
01:34:13.360 They've been watching it every day
01:34:14.700 for the last three weeks.
01:34:16.620 Someone in Antarctica
01:34:18.260 is watching Shameless.
01:34:19.920 Mexico has the most members
01:34:22.880 who are logged in
01:34:23.800 to stream content
01:34:24.640 on Netflix
01:34:25.720 every single day.
01:34:27.880 And Netflix
01:34:28.540 watchers have watched
01:34:30.700 more than 140 million hours
01:34:33.420 per day,
01:34:35.100 with the most popular day
01:34:38.900 of viewing being January 1.
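For a sense of scale on that 140-million-hours figure, a quick back-of-the-envelope conversion in Python; the continuous-years framing is ours, not Netflix's.

    hours_per_day = 140e6                      # hours streamed per day, as quoted
    days_of_content = hours_per_day / 24       # ~5.8 million continuous days
    years_of_content = days_of_content / 365   # ~16,000 continuous years
    print(f"{years_of_content:,.0f} years of video watched every single day")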
01:34:40.300 I love this story
01:34:41.020 from Netflix
01:34:41.560 because they had this tweet
01:34:43.020 that came out
01:34:43.900 where they said,
01:34:44.600 you know,
01:34:45.000 someone's watching
01:34:45.540 some Christmas movie.
01:34:46.580 There's been 53 people
01:34:47.660 who watched it every day
01:34:48.960 for the last 18 days
01:34:50.080 and their comment was,
01:34:51.700 who hurt you?
01:34:53.280 Which is that
01:34:53.860 that was very funny.
01:34:54.680 It's also a little creepy.
01:34:56.000 There's something creepy.
01:34:56.860 They can watch everything.
01:34:58.140 They were calling out
01:34:59.920 individual subscribers
01:35:01.880 on Twitter
01:35:02.800 and in a fun way
01:35:05.900 harassing them
01:35:06.660 for their viewing habits.
01:35:07.760 I love this story
01:35:08.940 but it makes me feel
01:35:09.700 a little weird.
01:35:10.540 I want to get into that tomorrow
01:35:11.820 because they have the list
01:35:13.000 of the most popular
01:35:13.860 and I'm not in any of them.
01:35:17.780 Glenn Beck.
01:35:24.240 This is the Glenn Beck Program.
01:35:27.160 No, I haven't.
01:35:28.680 We're just talking about
01:35:29.840 Star Wars.
01:35:31.500 Stu's going Thursday.
01:35:33.440 I'm taking the whole family
01:35:34.740 and I couldn't get tickets
01:35:35.760 until Monday
01:35:36.480 so I can't see it until Monday.
01:35:38.020 When are you seeing it?
01:35:39.660 We're going to wait
01:35:40.280 until our boys get back
01:35:41.080 from college
01:35:41.700 and so probably Monday.
01:35:43.860 I would think.
01:35:45.200 Yeah, I got to go
01:35:45.840 the first night
01:35:46.460 because as soon as it's out,
01:35:47.220 you lose access
01:35:48.600 to the internet.
01:35:49.400 Oh, I know.
01:35:50.120 Because everyone's
01:35:51.040 going to be posting things
01:35:52.000 and it's going to get ruined.
01:35:53.400 I had to shut down
01:35:55.180 to, you know,
01:35:55.980 not go to certain sites
01:35:57.360 and things
01:35:57.840 because it was everywhere.
01:36:01.040 You know,
01:36:01.620 people who saw
01:36:02.340 the initial preview
01:36:05.300 in Hollywood,
01:36:06.500 they were all tweeting
01:36:07.940 and they were all saying,
01:36:09.420 oh my gosh,
01:36:10.160 it's so great
01:36:11.080 and so unexpected
01:36:12.020 and then I just stopped.
01:36:13.280 Stop reading.
01:36:15.240 I don't want to know.
01:36:16.060 I don't want somebody
01:36:16.660 to go,
01:36:17.060 oh, you know,
01:36:18.060 he is the father.
01:36:19.260 Right.
01:36:19.680 You're going to find out.
01:36:20.380 I don't want to know
01:36:21.060 any of it.
01:36:21.600 He sees dead people.
01:36:22.280 You don't want that
01:36:22.760 to happen before you walk in.
01:36:23.940 I don't.
01:36:24.840 You know,
01:36:25.120 Rafe and I have been
01:36:25.800 going through all of the theories
01:36:27.140 on, you know,
01:36:28.040 who's the father,
01:36:29.020 who's the mom,
01:36:29.800 all that stuff
01:36:30.420 and just yesterday
01:36:32.980 somebody came out
01:36:33.760 and said,
01:36:34.100 okay,
01:36:34.420 I think I've got it.
01:36:36.180 I haven't seen the movie
01:36:37.000 but I think I have it
01:36:37.800 and I said,
01:36:38.120 no, no, no,
01:36:38.540 I don't want to watch it.
01:36:39.340 I don't want to watch it.
01:36:40.060 Yeah,
01:36:40.200 I don't want the predictions.
01:36:41.280 I don't.
01:36:42.020 I'm going to the movie
01:36:43.180 to enjoy the freaking movie.
01:36:44.660 Yeah.
01:36:44.900 So when I,
01:36:45.340 I'll watch the movie
01:36:46.060 and you know what
01:36:46.520 they're going to do
01:36:46.920 with the movie?
01:36:47.600 Tell me what happens.
01:36:48.900 And I don't.
01:36:49.340 That's the whole point of it.
01:36:50.300 And I also don't want to hear
01:36:51.380 it's,
01:36:51.680 oh,
01:36:51.840 it's the best one yet
01:36:52.940 because then you go in
01:36:54.220 and you're expecting it
01:36:55.020 to be really great
01:36:55.920 and you're like,
01:36:56.560 ah.
01:36:58.200 If you go in
01:36:58.780 and you're like,
01:36:59.160 it's just a movie
01:36:59.820 because how many times
01:37:01.640 have we been let down
01:37:02.300 by the Star Wars franchise?
01:37:03.860 At least three.
01:37:05.200 I would say four.
01:37:06.240 I would say four.
01:37:07.060 To me, four.
01:37:08.120 Return of the Jedi
01:37:08.880 is not good.
01:37:09.960 I love it.
01:37:10.820 I love Return of the Jedi.
01:37:12.080 I unequivocally love
01:37:14.040 the first three.
01:37:15.540 Now,
01:37:15.960 I know you don't.
01:37:16.640 No,
01:37:16.900 I don't hate Return of the Jedi.
01:37:18.420 The Ewoks were.
01:37:19.580 The Ewoks,
01:37:20.460 yes,
01:37:20.820 a little taint,
01:37:21.620 but it's not,
01:37:22.220 it wasn't Jar Jar Binks level.
01:37:23.720 No,
01:37:24.180 it was not.
01:37:24.740 You know?
01:37:25.060 No,
01:37:25.260 it was not.
01:37:25.560 Or the terrible little actor
01:37:27.000 who was Anakin.
01:37:28.120 The first three were just abysmal.
01:37:30.280 I don't even know if I've seen.
01:37:31.280 Agonizing.
01:37:31.820 I don't even know
01:37:32.340 if I've seen episode three.
01:37:34.220 At least all the way through.
01:37:35.120 What?
01:37:35.340 Really?
01:37:36.020 No.
01:37:36.440 That's the best of the first three.
01:37:37.640 I think the second one
01:37:38.900 is the best of the first three.
01:37:40.180 I think Attack of the Clones
01:37:41.240 is the best of the first three.
01:37:42.600 Dude.
01:37:43.200 Because,
01:37:44.040 unlike the other movies,
01:37:45.820 there is a
01:37:46.640 54 minute movie
01:37:48.840 inside of that movie
01:37:50.640 that's actually pretty good.
01:37:51.800 Mm-hmm.
01:37:52.780 Which one?
01:37:53.560 In Attack of the Clones,
01:37:54.680 if you just delete
01:37:55.720 about 40 minutes out of it,
01:37:57.640 there's a good
01:37:58.260 54 minute movie in there.
01:38:00.080 The problem is,
01:38:01.240 there's also
01:38:02.000 40,
01:38:02.940 50 minutes of
01:38:03.720 really terrible movie.
01:38:05.060 Mm-hmm.
01:38:05.280 So you get kind of like
01:38:05.960 really,
01:38:06.380 it'll be a little good
01:38:07.060 and a little bad.
01:38:07.720 Where the other two prequels
01:38:09.280 were actually,
01:38:10.140 I thought,
01:38:10.940 generally speaking,
01:38:11.840 just bad.
01:38:13.060 You know,
01:38:13.840 and again,
01:38:15.340 Star Wars bad.
01:38:16.020 I still like them
01:38:16.760 and I could still watch them,
01:38:17.820 you know,
01:38:18.060 but they're just not.
01:38:18.680 So is this the one,
01:38:19.320 because all I remember
01:38:20.120 is,
01:38:20.720 what's his name,
01:38:22.180 they go to the clone world
01:38:23.340 and then they're
01:38:24.420 laser fighting
01:38:26.780 on top of the deal
01:38:28.820 and it's raining?
01:38:30.140 Mm-hmm.
01:38:30.340 Is that the one?
01:38:31.020 That's all I...
01:38:31.660 You've described
01:38:32.160 all Star Wars movies.
01:38:33.080 Yeah, I know.
01:38:33.680 But that's the only,
01:38:34.300 that's the only part of that
01:38:35.440 I really remember.
01:38:36.260 I don't remember,
01:38:38.200 like in the trailer
01:38:39.080 this time,
01:38:40.020 it says,
01:38:40.580 I've seen this before
01:38:41.700 and I wasn't afraid
01:38:42.560 and it's,
01:38:43.420 you saw the hand
01:38:44.300 come up out of the lava.
01:38:45.580 Have you seen the trailer
01:38:46.220 for this yet?
01:38:46.940 Mm-hmm.
01:38:47.400 Is that the scene
01:38:48.360 from Vader
01:38:50.880 or coming out?
01:38:52.940 Is that the scene
01:38:53.760 from the third?
01:38:54.680 I don't know what scene
01:38:54.780 you're talking about actually.
01:38:56.140 I don't think I've seen
01:38:57.220 that trailer.
01:38:57.900 I don't remember it at least,
01:38:59.060 but it's a big deal, right?
01:39:00.440 I mean,
01:39:00.600 you could still get tickets
01:39:01.480 at some theaters.
01:39:02.460 It's not impossible.
01:39:03.480 This is the last one
01:39:05.620 of the Luke and Leia
01:39:09.220 and all that, right?
01:39:10.980 We don't know.
01:39:11.800 I mean,
01:39:12.060 it is entitled
01:39:12.880 The Last Jedi,
01:39:13.980 but we don't know
01:39:14.420 what happens in the movie.
01:39:15.160 I don't know that they've said that.
01:39:15.800 No, but I think this is,
01:39:16.920 because this is the ninth, right?
01:39:18.500 No, it's the eighth.
01:39:20.580 Yeah, this is eight.
01:39:21.200 Yeah, so you're thinking,
01:39:22.600 because Rogue One
01:39:23.760 was a spinoff
01:39:24.460 that came off last year.
01:39:25.560 Yeah, okay.
01:39:26.040 So that was at nine.
01:39:26.620 Those don't count.
01:39:27.140 Because I thought that he had said
01:39:28.900 that he had written nine.
01:39:30.900 That was the original,
01:39:31.540 that's what they always said
01:39:32.360 when I was a little kid.
01:39:33.300 I remember thinking
01:39:33.720 there was going to be nine of them.
01:39:34.480 In retrospect,
01:39:35.240 I don't believe.
01:39:36.160 No, I kind of don't either.
01:39:37.320 That's interesting.
01:39:38.500 I don't know
01:39:39.360 if he could have ever said it.
01:39:40.800 Yeah, I know.
01:39:41.320 I think he wrote the first six.
01:39:44.620 I think he had
01:39:45.280 a pretty good idea
01:39:46.060 of the first three.
01:39:47.280 That's it.
01:39:47.640 The middle,
01:39:48.060 or whatever they are,
01:39:48.780 the middle three,
01:39:49.660 the four, five, and six.
01:39:51.100 The originals.
01:39:52.160 How do you watch?
01:39:53.120 You're bringing your kids
01:39:54.660 or your grandkids.
01:39:55.940 How do you watch?
01:39:56.760 Do you start at three
01:39:57.480 or do you start at one?
01:39:58.860 It's funny,
01:39:59.480 Pat and I,
01:40:00.280 when we were doing
01:40:01.200 the Pat and Stu show,
01:40:02.140 we had someone
01:40:02.780 who worked here
01:40:03.700 who had never seen
01:40:04.840 any of the movies.
01:40:05.980 He's in his 30s.
01:40:06.620 That's amazing.
01:40:07.480 And we were like,
01:40:08.400 well,
01:40:09.460 he should be destroyed.
01:40:10.240 This is interesting.
01:40:11.860 You should watch,
01:40:12.580 we should make you watch them.
01:40:14.820 We Star Wars harassed him.
01:40:16.640 Yes, we did.
01:40:17.520 To watch them chronologically
01:38:18.980 instead of the way they were released.
01:40:20.020 Oh, good.
01:40:20.300 Am I going to be sued for that?
01:40:21.420 Probably.
01:40:21.780 Probably.
01:40:22.600 So instead of watching it
01:40:23.820 like with the old school
01:40:25.040 Star Wars,
01:40:25.980 Empire Strikes Back,
01:40:27.480 Return of the Jedi first,
01:40:28.480 we had him watch
01:40:29.160 the prequels first.
01:40:30.500 So he watched the first,
01:40:32.280 the Phantom Menace
01:40:33.440 as the first movie
01:40:34.600 because that's in their world,
01:40:36.060 the chronological order
01:40:37.080 of these movies.
01:40:38.380 He got,
01:40:39.340 didn't even get to number two.
01:40:40.700 Just gave up.
01:40:42.320 He watched the first one.
01:40:43.540 I was like,
01:40:43.720 why do people like this series
01:40:46.040 was essentially his analysis.
01:40:47.140 That's how bad
01:40:47.600 Phantom Menace is.
01:40:48.860 Oh, it's horrible.
01:40:49.780 If you start with A New Hope
01:40:50.540 and you got to go,
01:40:52.000 you have to go
01:40:52.760 to the first three
01:40:54.440 and then you have a friend
01:40:55.780 summarize the first two.
01:40:58.160 You start with the middle three.
01:40:59.920 You have a friend
01:41:01.040 that summarizes the first two.
01:41:03.060 You watch the 55 minutes
01:41:05.040 that's good in episode three.
01:41:07.600 Then you can take off from there.
01:41:09.380 One of the biggest problems
01:41:10.420 in the whole series
01:41:11.420 is the selection of actors.
01:41:13.480 I don't know why they thought
01:41:15.060 these were the best actors.
01:41:16.440 Yeah.
01:41:16.780 But the guy,
01:41:17.480 the little kid who played Anakin
01:41:18.740 was terrible.
01:41:19.640 Yep.
01:41:19.960 And the guy,
01:41:21.060 the adult who played Anakin
01:41:22.080 was terrible.
01:41:23.700 The adult that played Luke
01:41:25.000 is pretty crappy too.
01:41:26.520 I mean,
01:41:26.760 you watch that series back.
01:41:28.240 Mark Hamill is not good at that.
01:41:29.800 Let's try Leia.
01:41:30.920 He's like Laurence Olivier
01:41:31.820 in comparison.
01:41:33.260 Let's try Leia
01:41:35.680 who halfway through
01:41:36.840 the first movie
01:41:37.700 all of a sudden
01:41:38.460 is no longer English.
01:41:40.440 Yeah.
01:41:40.680 Oh, yeah.
01:41:41.020 That's true.
01:41:41.420 That is true.
01:41:42.740 That's true.
01:41:43.340 You will not.
01:41:44.320 You will not terrify me.
01:41:47.140 And then in the next scene
01:41:48.100 she's like,
01:41:48.380 yeah, so what's going on?
01:41:50.040 I don't know.
01:41:50.940 I'm in a garbage dump.
01:41:52.960 Who would have seen
01:41:53.620 this one coming?
01:41:54.920 It's like,
01:41:55.740 what the?
01:41:56.620 No, it's fair.
01:41:58.140 It's funny
01:41:58.680 because you go back.
01:41:59.180 It's called overdub.
01:42:00.700 You could have overdubbed
01:42:02.100 those scenes
01:42:02.680 where you were
01:42:03.220 a different person.
01:42:04.200 They didn't have
01:42:04.580 that technology
01:42:05.120 back then, Glenn.
01:42:06.680 I mean,
01:42:07.160 those movies are really fun
01:42:08.060 and I love them.
01:42:08.720 There's that retro
01:42:09.680 sort of feel to them
01:42:10.840 and there's a part
01:42:12.100 close to my heart
01:42:13.040 but like you watch
01:42:13.860 some of like
01:42:14.660 the late Saber battles
01:42:16.160 and they're just
01:42:16.940 not good.
01:42:18.560 And you go back
01:42:18.920 to the prequels
01:42:19.480 and some of those
01:42:20.060 action scenes
01:42:20.640 are actually really amazing.
01:42:22.220 The movies themselves
01:42:23.340 are not good.
01:42:24.320 Right.
01:42:24.600 The action scenes
01:42:27.020 though in the prequels
01:42:28.460 are almost enough
01:42:30.040 to carry the movie
01:42:30.760 but not quite.
01:42:31.800 Yes.
01:42:32.240 Not quite.
01:42:32.880 But they're that good
01:42:33.740 that they can almost
01:42:34.700 make up for the
01:42:35.420 Jar Jar Binks stuff
01:42:36.420 and all the rest.
01:42:37.260 Yeah, for Attack of the Clones
01:42:38.400 which is the second one
01:42:39.580 you go through
01:42:40.240 and like
01:42:40.560 there's just the scenes
01:42:41.440 of them going through
01:42:42.200 the planet
01:42:42.860 that's the city
01:42:43.560 what's that one called again?
01:42:44.540 Pat, you might know.
01:42:45.440 Pat's wearing
01:42:45.840 a Star Wars shirt.
01:42:46.680 Coruscant or something?
01:42:47.560 Yeah.
01:42:48.380 Pat is wearing
01:42:49.260 a shirt.
01:42:49.720 Well, it's a Star Wars
01:42:50.120 Christmas shirt.
01:42:51.260 It says
01:42:51.980 in green
01:42:52.800 with Yoda on it.
01:42:54.440 An elf?
01:42:55.420 I am not.
01:42:57.820 Yes.
01:42:58.540 Yes.
01:42:59.440 No.
01:43:00.400 Yes.
01:43:01.240 No.
01:43:02.180 But you get through
01:43:03.700 you're right
01:43:04.080 like the action sequences
01:43:05.480 in those first ones
01:43:06.180 are pretty amazing.
01:43:07.960 It's just the rest of it
01:43:08.980 it's just pointless.
01:43:09.740 It's like you're just
01:43:10.180 like they were like
01:43:11.120 what if we made
01:43:12.340 C-SPAN
01:43:13.580 into a Star Wars movie
01:43:14.840 and we all just
01:43:15.880 kind of sat around
01:43:16.480 a Senate
01:43:16.980 and just deliberated
01:43:18.780 bills
01:43:19.500 and legislation.
01:43:20.440 How did
01:43:22.020 George Lucas
01:43:24.080 how did he
01:43:25.700 honestly stumble
01:43:26.720 across this story?
01:43:28.240 It's interesting.
01:43:29.240 I mean it is such
01:43:29.520 a great story.
01:43:30.500 That's a great question
01:43:30.580 because
01:43:30.940 he is horrible.
01:43:32.820 He's not good.
01:43:33.800 He's horrible.
01:43:34.900 Other than this
01:43:35.500 and you can give him
01:43:37.180 you know
01:43:39.860 the Harrison Ford thing
01:43:40.780 with Indiana Jones.
01:43:42.620 So he's got
01:43:42.980 those two things
01:43:43.800 that are absolute classics
01:43:45.060 but everything else
01:43:46.940 No, no, no.
01:43:47.500 Howard the Duck
01:43:48.680 and all the rest
01:43:49.540 of that stuff?
01:43:50.780 No, he also did
01:43:52.180 American Graffiti
01:43:53.680 which got him started
01:43:55.420 in the first place.
01:43:56.060 Yeah, that was
01:43:56.780 you know
01:43:57.160 Happy Days.
01:43:58.080 That started Happy Days.
01:44:00.160 So I mean
01:44:00.480 he had
01:44:01.460 He had something in him
01:44:02.880 but he lost it.
01:44:04.160 Quickly.
01:44:04.820 He lost it.
01:44:05.280 And intermittently.
01:44:06.560 I mean it's like
01:44:07.740 whoa, whoa
01:44:08.560 where is that one
01:44:09.520 coming from?
01:44:10.300 Yeah, very strange.
01:44:11.560 Alright, Pat
01:44:12.700 couple questions.
01:44:13.800 First, Roy Moore
01:44:15.020 who's going to win tonight?
01:44:16.000 Roy Moore.
01:44:17.440 Landslide?
01:44:18.200 Close.
01:44:18.720 I think it's fairly close
01:44:19.860 but I think he wins
01:44:20.980 fairly comfortably
01:44:22.000 four or five points.
01:44:23.800 Yeah, I think five or six points myself.
01:44:26.600 Yeah.
01:44:27.380 And it's not because
01:44:29.200 conservatives or Republicans
01:44:31.380 are excusing
01:44:32.120 sexual harassment.
01:44:33.100 First of all
01:44:33.480 many of them
01:44:33.920 don't believe the charges.
01:44:35.460 Secondly
01:44:35.840 the abortion thing
01:44:38.580 supersedes
01:44:39.360 any sexual harassment
01:44:40.940 that may have occurred
01:44:41.800 I think
01:44:42.480 for Alabama voters.
01:44:44.320 I think if it was rape today, you know, if he was charged with, hey, just last week he was with this girl, or a year ago he was with this girl... but 40 years ago? You don't know who to trust.
01:44:57.240 Right.
01:44:57.500 And he might have
01:44:58.000 just dated a few
01:44:58.840 young girls.
01:44:59.800 So you don't know,
01:45:02.020 in comparison
01:45:03.160 to a guy
01:45:03.820 who says
01:45:04.320 yes
01:45:04.620 I'll kill a baby
01:45:05.400 in the womb
01:45:05.900 tomorrow.
01:45:07.420 That's a
01:45:08.160 you know
01:45:08.820 that's a
01:45:09.660 And he doesn't want
01:45:10.200 to put any restrictions
01:45:11.060 on a woman's
01:45:11.660 right to choose.
01:45:12.900 Right.
01:45:13.280 Okay.
01:45:13.680 Once the baby
01:45:14.580 is born
01:45:15.320 then
01:45:16.500 I'm pro-life.
01:45:17.240 Well it's too late then.
01:45:18.440 Do you agree
01:45:18.760 I think it was Frank Luntz
01:45:19.660 who had this point
01:45:20.360 do you agree
01:45:20.820 if Doug Jones
01:45:22.000 the Democrat
01:45:22.480 were a pro-life
01:45:24.180 Democrat
01:45:24.600 that he'd be winning
01:45:25.400 by five points?
01:45:25.860 I think so yeah.
01:45:26.440 I do too.
01:45:27.040 I think so.
01:45:28.060 Probably true.
01:45:28.980 I mean because
01:45:29.220 you know
01:45:29.860 some of those
01:45:31.000 red states
01:45:31.800 will elect
01:45:33.420 some Democrats
01:45:34.240 when it comes
01:45:34.800 to the Senate.
01:45:35.580 Usually they'll go
01:45:36.100 red for the presidency
01:45:36.980 but they'll
01:45:37.440 Or maybe an occasional
01:45:38.240 governor.
01:45:38.800 Yep governors
01:45:39.280 yep
01:45:39.580 it does happen.
01:45:41.180 So I think you're right.
01:45:41.680 But when you're that big
01:45:42.720 an abortion advocate
01:45:44.420 there's no way
01:45:44.980 you can win.
01:45:45.940 You're not going to win.
01:45:46.540 You can't imagine.
01:45:47.200 I mean look
01:45:47.540 and this is the other thing
01:45:48.860 we have to point out.
01:45:50.080 Donald Trump won Alabama
01:45:52.800 I think by 28 points.
01:45:55.020 So the fact that
01:45:55.560 we're even having
01:45:56.600 this discussion
01:45:57.400 shows how weak
01:45:58.460 of a candidate
01:45:59.000 Roy Moore is
01:45:59.740 like him or not
01:46:00.460 the bottom line
01:46:01.400 is this should not
01:46:02.480 be hard.
01:46:03.500 Yeah.
01:46:03.700 It should be a blowout.
01:46:04.740 Look at the other two
01:46:05.360 Mo Brooks
01:46:05.920 or Luther Strange.
01:46:06.880 I'm not a Strange guy. Maybe a strange guy, but I'm not a Luther Strange guy.
01:46:10.680 but I mean
01:46:11.420 or Mo Brooks
01:46:12.260 either one of them
01:46:13.260 would be winning
01:46:14.000 by 20 points right now.
01:46:15.140 Yes.
01:46:15.520 I mean it would not be
01:46:16.960 yes.
01:46:17.600 You know so I don't know
01:46:18.200 if Alabama wanted
01:46:19.760 Roy Moore as the candidate
01:46:20.900 I think they should have
01:46:21.680 picked Mo Brooks myself
01:46:22.700 but it's up to them. And to the people who say, well, why are you commenting on it? It's up to them.
01:46:27.440 Well I mean
01:46:27.780 that's our job.
01:46:28.400 I don't know what to tell you.
01:46:29.340 This is why you come
01:46:30.160 to the freaking show.
01:46:30.680 Why are you listening?
01:46:31.500 We can go back to Yoda talk.
01:46:32.080 Yeah I know
01:46:32.420 we're happy with that.
01:46:34.740 One last question Pat
01:46:35.920 what is on your mind today?
01:46:38.540 Larry King
01:46:39.140 for one.
01:46:40.680 What?
01:46:41.040 This is one of the more
01:46:43.840 outrageous allegations
01:46:46.160 not the most severe
01:46:47.500 because there have been
01:46:48.520 accusations of
01:46:49.520 really bad assault
01:46:50.680 and rape
01:46:51.100 but this is one of the
01:46:52.580 weirder things
01:46:53.340 where he's taking
01:46:54.440 a photograph
01:46:54.940 with a woman
01:46:55.800 and he slides
01:46:57.940 his hand
01:46:58.820 down her backless dress
01:47:00.500 inside her dress
01:47:02.280 into her butt crack
01:47:03.960 and leaves
01:47:04.780 three fingers there
01:47:06.100 while they're taking
01:47:08.020 a photo
01:47:08.600 for ten seconds
01:47:10.160 she said.
01:47:11.200 What?
01:47:11.960 Okay.
01:47:12.600 Okay.
01:47:13.000 A couple of things.
01:47:13.860 A couple of things.
01:47:14.580 A couple of things.
01:47:17.140 I mean
01:47:17.920 How do you not say anything
01:47:19.600 for one thing?
01:47:20.480 Well let's look at it
01:47:21.300 the whole spectrum here.
01:47:22.920 I can't believe
01:47:23.940 any guy would do that.
01:47:25.160 That is
01:47:25.360 I mean that goes beyond
01:47:26.820 that's craziness.
01:47:28.240 I'm going to leave
01:47:29.000 my fingers
01:47:29.660 in your butt crack
01:47:30.640 for the full hour.
01:47:31.980 That also says something, whether he has thin hands, or she was wearing a dress that was too big, or that was some weird skill.
01:47:40.160 You're dealing
01:47:40.540 with the skeletal
01:47:41.520 remains of Larry
01:47:42.360 Right.
01:47:42.880 Okay.
01:47:42.980 So still, that's on him... and on her.
01:47:47.180 How did you
01:47:48.100 how did you stay
01:47:48.900 there for three seconds?
01:47:49.860 I don't know.
01:47:50.200 It's ten seconds.
01:47:51.180 She said eight to ten seconds
01:47:52.420 while the photo
01:47:53.120 was being taken.
01:47:53.900 That's nuts.
01:47:54.560 I you know
01:47:55.180 you think of any
01:47:55.980 of our wives
01:47:56.840 and this happening
01:47:57.880 he'd get an elbow
01:47:58.820 in the face.
01:48:00.000 Yeah.
01:48:00.580 This happened in 2005.
01:48:01.940 My wife would have gone
01:48:03.520 Oh yeah.
01:48:04.640 Bat crap crazy.
01:48:06.080 Oh yeah.
01:48:06.320 You can't tell me
01:48:06.900 you don't know what to do.
01:48:07.780 You pull his hand out
01:48:09.240 of your dress.
01:48:11.160 Right.
01:48:11.960 I mean especially Larry King.
01:48:13.840 The guy is
01:48:14.340 She didn't even work for the guy.
01:48:15.560 In 2005.
01:48:17.240 Yeah.
01:48:17.560 He was 131 years old.
01:48:19.080 Right.
01:48:19.380 I mean he was Skeletor.
01:48:22.600 Yeah.
01:48:23.060 Unless you thought
01:48:23.920 he had some sort of
01:48:25.020 mystical magic powers
01:48:26.640 that Skeletor the cartoon has
01:48:28.420 you pull his hand out.
01:48:30.160 Yeah.
01:48:30.320 That doesn't make any sense to me.
01:48:31.480 That's really bad.
01:48:32.560 You should have pulled back
01:48:33.580 a bloody stump
01:48:34.500 is what should have happened.
01:48:36.380 You keep a blender by your dress.
01:48:40.860 Larry King comes.
01:48:42.860 Thanks Pat.
01:48:44.020 Pat Gray Unleashed
01:48:44.640 coming up on the
01:48:45.320 Blaze Radio and TV Networks.
01:48:46.300 I will be joining Pat today
01:48:47.440 to talk a little bit
01:48:48.000 about the election.
01:48:48.820 1 p.m. Central
01:48:50.220 so 2 p.m. Eastern
01:48:51.160 on the Blaze Radio
01:48:53.340 and TV Networks.
01:48:55.060 If you've been thinking
01:48:55.960 about your home security
01:48:56.980 there's no better time
01:48:57.800 to get it than right now.
01:48:59.280 I have been singing
01:48:59.940 the praises of SimpliSafe
01:49:00.980 for a while now
01:49:02.020 and that's because
01:49:03.360 it's just the best period.
01:49:06.000 SimpliSafe has put together
01:49:07.180 a massive security arsenal
01:49:08.760 for your home.
01:49:09.720 A special package
01:49:10.700 that has been handpicked
01:49:12.060 just for you.
01:49:13.000 It has entry sensors.
01:49:14.220 It has motion sensors, glass break sensors, everything you need to stop a possible criminal from entering your home.
01:49:23.180 Now these guys
01:49:24.140 are absolutely the best
01:49:26.140 and for the holidays
01:49:27.260 SimpliSafe is giving you
01:49:29.120 an incredible offer.
01:49:30.740 You'll get $200 off
01:49:32.140 this handpicked security package
01:49:33.640 for complete protection
01:49:34.800 for your home.
01:49:36.000 It's the best value
01:49:37.180 in home security.
01:49:38.360 There are no contracts
01:49:39.400 no commitments.
01:49:40.940 Right now
01:49:41.600 and I mean right now
01:49:42.700 get $200 off
01:49:44.560 this holiday special
01:49:45.520 security package.
01:49:46.900 You go to
01:49:47.580 SimpliSafeBeck.com
01:49:49.460 the best deal
01:49:50.720 on home security
01:49:51.560 you're going to see.
01:49:53.440 SimpliSafeBeck.com
01:49:54.700 save $200
01:49:55.900 on SimpliSafe's
01:49:57.800 handpicked home security package.
01:49:59.480 It's
01:49:59.760 SimpliSafeBeck.com
01:50:01.900 Glenn Beck.
01:50:11.180 Glenn Beck.
01:50:16.780 Good sentence.
01:50:18.060 You just get to a point
01:50:21.600 where you're like
01:50:22.020 you just kind of
01:50:24.400 mumble out noises.
01:50:26.100 Is that
01:50:26.280 that's the point
01:50:26.860 you're at now?
01:50:27.700 Which I understand.
01:50:28.740 It's defensible.
01:50:29.700 Did you see the tweet
01:50:31.220 from Trump today?
01:50:33.140 I did.
01:50:33.760 There was quite an exchange
01:50:34.900 that happened
01:50:35.380 on the Twitters today.
01:50:36.560 The Twitters.
01:50:37.340 So Senator
01:50:38.460 Kirsten Gillibrand
01:50:39.740 from New York
01:50:40.460 basically came out
01:50:41.700 and said
01:50:41.980 Trump should step down
01:50:43.120 because of all
01:50:43.600 of his accusers.
01:50:44.660 They've been
01:50:45.020 kind of out in the media
01:50:46.160 again the last couple of days.
01:50:48.120 And so Trump
01:50:49.140 tweets today
01:50:50.380 lightweight
01:50:52.220 Senator Kirsten Gillibrand
01:50:53.820 a total flunky
01:50:54.820 for Chuck Schumer.
01:50:55.920 Oh my gosh.
01:50:56.820 Stop it.
01:50:57.400 I mean that part
01:50:58.560 is definitely true.
01:51:00.300 He's the president.
01:51:01.580 I know.
01:51:02.140 And someone who would
01:51:02.780 come to my office
01:51:03.540 begging
01:51:04.060 for campaign contributions
01:51:07.020 for whatever reason
01:51:08.180 begging is in quotes.
01:51:09.420 I feel like he just
01:51:10.060 He does it every time.
01:51:11.240 He does it every time.
01:51:12.400 He's a failure.
01:51:14.900 He came begging to me
01:51:15.760 for whatever.
01:51:16.580 He does it every time.
01:51:17.920 There's no reason
01:51:18.720 to put that word in quotes.
01:51:20.440 But anyway
01:51:20.740 begging for campaign contributions
01:51:22.320 not so long ago
01:51:23.160 and would do anything
01:51:24.260 for them
01:51:24.860 is now in the ring
01:51:26.120 fighting against Trump
01:51:27.300 very disloyal to Bill and Crooked. USED!
01:51:31.720 I don't know
01:51:32.200 what a lot of that means
01:51:33.240 but I would like to know
01:51:34.720 what it means
01:51:35.440 she'd be
01:51:36.160 she was willing
01:51:36.860 to do anything.
01:51:38.480 Right.
01:51:38.740 So
01:51:38.960 is that in quotes?
01:51:40.560 That one's not, it's in parentheses. So the idea being, potentially, people are accusing him of saying that she was offering sexual favors, which is not what he's necessarily saying, although you could take it that way.
01:51:54.440 Elizabeth Warren responded
01:51:55.240 He's smart enough to know
01:51:56.000 it's going to be
01:51:56.520 looked at that way.
01:51:57.460 Yeah I guess.
01:51:58.540 Elizabeth Warren responds
01:51:59.460 are you trying to bully, intimidate, and slut-shame Senator Gillibrand?
01:52:03.460 Wait
01:52:03.620 but slut shame
01:52:04.740 would insinuate
01:52:05.560 that she's doing
01:52:06.620 something promiscuous
01:52:07.520 and being criticized
01:52:08.080 for it.
01:52:08.780 Is that what
01:52:09.160 Elizabeth Warren is saying?
01:52:10.600 I don't think anyone
01:52:11.400 understands the language
01:52:12.340 anymore.
01:52:13.040 It's like a
01:52:14.020 blender of accusations.
01:52:15.680 Me smoke em wumpum.
01:52:17.080 That's
01:52:17.220 I'm just going.
01:52:18.020 That doesn't make any sense.
01:52:21.460 Glenn Beck.