The Glenn Beck Program - October 03, 2018


Best of the Program with Peter Schweizer | 10/3/18


Episode Stats


Length

44 minutes

Words per minute

152.5

Word count

6,771

Sentence count

609

Harmful content

Misogyny

7 sentences flagged

Hate speech

9 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Glenn Beck goes back to basics and explains the difference between a victim and an accuser in the Brett Kavanaugh and Christine Blasey Ford case. He also talks about Donald Trump's comments about the "victim" in the case.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.200 The Blaze Radio Network, on demand.
00:00:06.000 Hey, welcome to the podcast.
00:00:07.920 It's, what day is it?
00:00:08.880 Please tell me it's Thursday.
00:00:09.800 Thursday, Thursday, Thursday, Thursday, Thursday.
00:00:11.740 Oh, I hate that answer.
00:00:13.880 It is Wednesday's podcast.
00:00:15.900 There's a lot to talk about.
00:00:17.340 In fact, we're going to go back to basics today.
00:00:21.360 A little vocabulary test.
00:00:22.340 A little vocabulary test, yeah.
00:00:23.860 It's Wednesday time for the vocabulary test.
00:00:25.800 Do you know the difference between a victim and an accuser?
00:00:29.580 They're not the same thing?
00:00:30.700 They're not.
00:00:31.880 Huh.
00:00:32.240 They are not.
00:00:33.020 And the media doesn't understand that.
00:00:35.180 And they don't understand the word mocking and relaying evidence.
00:00:40.300 I'm going to go out on a limb and say they do understand both of those things.
00:00:43.140 But they're doing it that way anyway.
00:00:44.740 Well, the American people have to have a brush up on that.
00:00:46.980 And so we'll tell you that.
00:00:48.240 And it's really, I think, really important.
00:00:50.400 Also, we're going to throw in social justice.
00:00:54.300 Yeah.
00:00:54.640 Social justice.
00:00:55.820 Something we should learn.
00:00:57.000 Yeah.
00:00:57.300 It doesn't seem like we've learned those lessons yet.
00:00:59.000 Also, Peter Schweizer's on.
00:01:00.060 He's got a new documentary out.
00:01:02.580 You'll remember him.
00:01:03.160 He did the book Clinton Cash, among many other great ones.
00:01:06.380 But he's got a documentary out about what Google is doing, manipulating search results.
00:01:11.260 And Peter's not a guy who's just, like, making claims.
00:01:13.900 Like, he has backed this up with data.
00:01:15.300 Yeah, he's got a guy from Harvard University, you know, big egghead, that actually is a Clinton supporter and was all for Clinton.
00:01:23.980 He did a lot of this research.
00:01:26.340 And as he was doing the research and speaking out about it, guess who got kicked off of Google?
00:01:32.760 That guy.
00:01:33.700 So, it's something that you really need to pay attention to because there's some really frightening things happening.
00:01:41.500 We'll find out about it in today's podcast.
00:01:43.920 You're listening to The Best of the Glenn Beck Program.
00:01:57.480 It's Wednesday, October 3rd.
00:02:00.100 Glenn Beck.
00:02:01.500 All right, I wanted to start with something.
00:02:05.580 This is very, very complex.
00:02:07.900 I got up this morning and I've been doing a whole bunch of research on the show.
00:02:15.020 The last thing I got to was the Donald Trump thing.
00:02:19.180 I was on our affiliate in Tulsa and I was asked, what about Donald Trump?
00:02:25.600 And I should have just said, I don't know, I haven't seen it yet.
00:02:29.420 But I had read about it and I'm glad I didn't comment on it.
00:02:34.580 All I said was, I'm not going to comment on it because it's just, it's ridiculous to focus everything on Donald Trump.
00:02:44.160 However, it does deserve comment.
00:02:49.020 The media has said, did you hear Donald Trump mocking the victim?
00:02:57.840 I'm going to get into that in a second.
00:02:59.420 No, I hadn't.
00:03:03.600 But in case you have only read about it or you've only seen the headlines, I would like to play the audio.
00:03:11.920 Here is Donald Trump, according to the press, mocking the victim.
00:03:16.920 Listen.
00:03:17.120 I had one beer.
00:03:19.000 Well, do you think it was, nope, it was one beer.
00:03:21.760 Oh, good.
00:03:22.540 How did you get home?
00:03:23.480 I don't remember.
00:03:24.280 How'd you get there?
00:03:24.980 I don't remember.
00:03:25.720 Where is the place?
00:03:26.440 I don't remember.
00:03:27.420 How many years ago was it?
00:03:28.620 I don't know.
00:03:29.800 I don't know.
00:03:32.220 I don't know.
00:03:34.620 I don't know.
00:03:37.000 What neighborhood was it in?
00:03:38.520 I don't know.
00:03:39.000 Where's the house?
00:03:39.780 I don't know.
00:03:40.120 Upstairs, downstairs, where was it?
00:03:43.540 I don't know.
00:03:43.920 But I had one beer.
00:03:44.900 That's the only thing I remember.
00:03:45.980 All right.
00:03:46.560 So this is Donald Trump mocking?
00:03:49.500 No.
00:03:49.900 This is Donald Trump stating the facts in the Kavanaugh case.
00:03:56.660 Period.
00:03:57.520 Facts that MSNBC and everybody else don't seem to care about.
00:04:02.380 Well, they've moved on.
00:04:04.080 I don't know if you've known this.
00:04:05.320 They've moved on now to other things.
00:04:08.300 The lies.
00:04:09.000 The lies that he told.
00:04:11.020 Okay.
00:04:11.400 Well, let's talk about lies, shall we?
00:04:14.140 This is a letter to Grassley's office.
00:04:17.840 The names have been taken out.
00:04:20.460 I, so-and-so, am a current resident of California.
00:04:22.940 I first met Christine Blasey, now Christine Blasey Ford, in 1989, 1990 in California.
00:04:30.100 From 90 to 91, I was just friends with Ford.
00:04:33.560 From approximately 92 to 98, I was in a relationship.
00:04:37.600 So, from 1989 to 1998, nine years, this person knew and was very, very close to her.
00:04:48.400 I found her truthful and maintained no animus toward her.
00:04:53.760 During our time dating, Dr. Ford never brought up anything regarding her experience as a victim of sexual assault, harassment, or misconduct.
00:05:02.100 She never mentioned Brett Kavanaugh.
00:05:04.440 During some of the time we were dating, Dr. Ford lived with Monica L. McClain, who I understood to be her lifelong best friend.
00:05:15.700 During that time, it was my understanding that McClain was interviewing for jobs with the FBI and the U.S. Attorney's Office.
00:05:23.620 I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:05:30.300 Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become familiar and less nervous about the exam.
00:05:43.340 Dr. Ford was able to help because of her background in psychology.
00:05:48.940 Now, this is interesting because I do remember, while she was under oath, a very strange line of questioning that went a little something like this.
00:06:00.300 Have you ever had discussions with anyone, besides your attorneys, on how to take a polygraph?
00:06:11.280 Never.
00:06:11.980 Never.
00:06:12.260 And I don't just mean countermeasures, but I mean just any sort of tips or anything like that.
00:06:20.940 No, I was scared of the test itself.
00:06:24.940 She was scared of the test.
00:06:25.980 But it was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:06:33.940 Okay.
00:06:34.460 I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:06:37.760 It was stressful.
00:06:38.400 Have you ever given tips or advice to somebody who was looking to take a polygraph test?
00:06:44.440 Never.
00:06:45.280 Never.
00:06:46.380 Maybe the FBI.
00:06:47.620 I demand an FBI investigation on Monica L. McClain, who is a lifetime friend of Dr. Ford.
00:06:58.320 Because there is, now I want to use this word carefully, an accuser, we have to define that here in a second.
00:07:07.140 An accuser saying that Dr. Ford and Monica McClain, Monica was interviewing for jobs with the FBI and the U.S. Attorney's Office.
00:07:19.740 I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:07:26.600 Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become familiar and less nervous about the exam.
00:07:37.400 Let me play this audio again of what she said under oath.
00:07:42.220 Have you ever had discussions with anyone besides your attorneys on how to take a polygraph?
00:07:53.000 Never.
00:07:53.620 Never.
00:07:54.020 And I don't just mean countermeasures, but I mean just any sort of tips or anything like that.
00:08:02.640 No.
00:08:03.260 I was scared of the test itself.
00:08:05.820 She was scared.
00:08:06.600 But it was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:08:15.920 All right.
00:08:16.200 I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:08:19.960 Okay.
00:08:20.380 Have you ever given tips or advice to somebody who was looking to take a polygraph test?
00:08:25.500 Anyone.
00:08:26.060 Never.
00:08:26.640 Never.
00:08:27.120 Never.
00:08:27.220 Never.
00:08:28.220 Well, we know somebody's lying here.
00:08:32.540 We know.
00:08:33.800 We know someone is lying.
00:08:37.180 Right?
00:08:37.800 Don't we, Stu?
00:08:38.720 Because we have somebody.
00:08:40.780 We have somebody who has accused her of teaching someone else about a polygraph.
00:08:48.700 Well, she's innocent until proven accused.
00:08:51.800 I think that's important that we keep that standard.
00:08:53.520 She is accused.
00:08:53.920 Now that she's proven accused.
00:08:55.540 She is accused.
00:08:56.320 She's guilty.
00:08:57.020 So if we're going to use the same standard that the left is applying, she is a liar.
00:09:03.680 She has perjured herself.
00:09:06.440 Certainly shouldn't be a professor anymore.
00:09:08.420 She should not be a professor.
00:09:09.860 Can she even be allowed to work at a fast food restaurant?
00:09:12.200 I don't think so.
00:09:12.740 How can you, how can you possibly believe a liar on anything she says?
00:09:19.880 You want this standard?
00:09:21.760 Because this is the standard that's coming.
00:09:24.540 This is the standard that we're now running to embrace.
00:09:28.840 This is the standard that our children, this is the standard that we ran from.
00:09:35.380 This is why we're America.
00:09:37.700 America, because in every other country, this was a new idea.
00:09:42.340 You cannot come into my house and just take me.
00:09:45.720 You can't just throw me in jail.
00:09:48.320 You have to have an accuser.
00:09:51.140 I have to know what the charges are.
00:09:53.920 I have a right to defend myself.
00:09:56.860 I have a right to know who my accuser is and address my accuser.
00:10:02.900 I have a right to be presumed innocent until proven guilty.
00:10:10.180 This is what America was founded on.
00:10:13.940 This is a uniquely American idea.
00:10:18.640 This was the genius of our founders.
00:10:22.300 You want to flush it away?
00:10:25.080 Go ahead.
00:10:26.440 But I will not be part of it.
00:10:31.960 This is the American idea.
00:10:35.520 Now listen, it is so imperative that you understand what this is.
00:10:47.880 If you do not understand what you're fighting, do you think we could have won World War II without naming the Nazis?
00:10:57.020 Do you think we will ever win this war on terror without naming what it is about?
00:11:06.740 What is driving people to the terror?
00:11:10.460 The Islamist ideas.
00:11:13.360 Not Muslim ideas.
00:11:15.380 Islamist ideas.
00:11:16.580 That Sharia law is the prevailing law, and if you're not under Sharia law, you're an infidel, which means I can kill you, I can rape you, I can turn you into a slave.
00:11:29.640 That is what the war on terror is all about.
00:11:34.720 And we will never win it unless we name our enemy.
00:11:39.680 We would have never won World War II if we were fighting the Germans.
00:11:47.540 We were not fighting the Germans.
00:11:50.440 We were fighting the Nazis.
00:11:55.060 We would not have won in the Civil War had we been fighting the South.
00:12:03.600 We were fighting people who didn't believe in the Constitution.
00:12:09.840 We were fighting for the freedom of all men.
00:12:14.740 That's why we won.
00:12:16.840 And by the way, if you don't think that's true, we lost every single battle up until the point that Abraham Lincoln said,
00:12:27.580 this is about slavery.
00:12:30.240 Look it up.
00:12:31.300 We wouldn't have won the American Revolution if it wasn't against tyranny.
00:12:39.320 It wasn't against the king.
00:12:41.080 It was against tyranny.
00:12:42.860 And it was for certain ideas, like the idea that you are innocent until proven guilty.
00:12:53.860 We are fighting post-modernism.
00:12:56.940 And until the American people understand what post-modernism is, you will lose.
00:13:05.040 You will lose every battle because you will only grow frustrated and angry, which will play directly into what they want to happen.
00:13:18.220 They want us at each other's throats.
00:13:20.880 They want us to be irrational.
00:13:23.160 They want us to be angry.
00:13:24.640 They want us just to start swinging in blind rage.
00:13:30.260 That's their plan.
00:13:31.980 And until you understand what they're doing, until you understand that this isn't really about Ford, this isn't about the charges.
00:13:44.260 This isn't about anything.
00:13:47.260 This is all about the patriarchy.
00:13:50.880 This is all about how white men have put together, in this case, a rape culture.
00:13:57.780 And they have kept people down.
00:14:00.360 And it doesn't matter if he really did it, because other white men have.
00:14:05.180 It doesn't matter if she was really a victim, because other women have been victims.
00:14:12.620 This is about collective justice, currently entitled social justice.
00:14:20.280 But make no mistake, this is collective justice.
00:14:25.300 And collective justice, to put it into the terms that a Christian will understand, is anti-Christ.
00:14:38.900 Collective salvation is anti-Christ.
00:14:43.840 Collective justice is anti-Christ.
00:14:48.860 Individual salvation, individual justice.
00:14:55.300 That is Christian.
00:14:59.720 You cannot balance the scales by convicting someone who is not guilty, because someone who looked like them has done it anyway.
00:15:15.900 I don't think America understands.
00:15:20.340 And I think you feel it.
00:15:22.580 I think you feel it.
00:15:23.560 I don't know if your neighbors do, but I think you feel it.
00:15:26.180 We are extraordinarily close to the edge of the abyss.
00:15:34.060 And I am doing what I promised I would do.
00:15:37.180 I promised when it came to that time, and I asked you to do the same, I would stand and say, don't go there.
00:15:51.120 Stop where you are.
00:15:53.860 Turn around.
00:15:55.880 I know that's where the crowd is going.
00:15:58.540 Turn around.
00:15:59.800 Stop.
00:16:00.400 Safety is this direction.
00:16:03.020 Safety is this direction.
00:16:06.320 I have not known how to explain it to you.
00:16:11.920 It has only been my gut.
00:16:13.300 But I know what it is.
00:16:18.220 And I've been explaining it on TV, and I've explained it on radio.
00:16:22.800 On Thursday, we're going to go into it in depth.
00:16:24.640 That's tomorrow.
00:16:26.680 It is in the book.
00:16:28.280 Read it at the library.
00:16:29.440 I don't care if you buy it.
00:16:31.640 Read it at the library.
00:16:32.840 It is not a surrender.
00:16:35.880 It is a desperate plea.
00:16:38.340 Please understand what's happening to us.
00:16:42.200 There is a way to win.
00:16:47.140 But we started this hour with Donald Trump.
00:16:50.020 You'll notice they say he mocked.
00:16:52.060 He didn't mock.
00:16:53.160 He stated facts.
00:16:55.420 People are not going to want to hear the facts.
00:16:58.440 That's okay.
00:16:59.840 State them.
00:17:01.080 State the facts.
00:17:02.160 Calmly, rationally, and relentlessly.
00:17:07.440 The only thing that matters is reason and facts.
00:17:16.800 This is the best of the Glenn Beck Program.
00:17:25.860 Let me go to Al in Texas.
00:17:27.460 Hello, Al.
00:17:29.820 Al, are you there?
00:17:31.780 Yes.
00:17:32.160 Good morning.
00:17:32.700 Can you hear me?
00:17:33.320 Yeah, I can.
00:17:33.840 How are you, sir?
00:17:35.620 Fine.
00:17:36.100 And you?
00:17:36.640 Good.
00:17:37.100 Good.
00:17:37.420 What's up?
00:17:38.760 Listen, Glenn.
00:17:40.180 President Trump did not mock Professor Ford.
00:17:45.560 Yep.
00:17:46.640 You're right.
00:17:46.960 Al, I have to tell you, I don't even think he attacked
00:18:08.640 her testimony.
00:18:09.180 He just stated the facts.
00:18:11.460 That's all he did.
00:18:12.660 He stated the facts.
00:18:14.500 Now, that might look like an attack to some, but it ain't.
00:18:17.780 Al, do you want a copy of the book or the audio book?
00:18:21.060 I'm going to make one out while we're talking here.
00:18:22.400 I'm more like hard copy books.
00:18:27.080 Hard copy.
00:18:27.620 Okay, you got it.
00:18:28.960 Is this a new thing now?
00:18:29.920 We're just giving books away to everyone who actually gets on the air?
00:18:33.400 Yeah, if you get on the air.
00:18:34.280 It's rare.
00:18:34.880 It's rare.
00:18:35.440 We don't take a lot of calls.
00:18:36.080 I figure I've penciled in five books for the rest of the year to Al.
00:18:41.840 Make it to you, Al.
00:18:44.080 Yes, please.
00:18:44.860 All right.
00:18:45.180 Hang on.
00:18:45.540 Could you make it out to my worst enemy?
00:18:47.740 Al, hang on.
00:18:48.380 We're going to get you the book.
00:18:49.700 So put them on hold and we'll get his address.
00:18:53.680 He didn't.
00:18:54.160 He did not mock.
00:18:54.980 No, he didn't mock.
00:18:55.720 He didn't mock.
00:18:56.700 All he did.
00:18:57.400 Play the audio real quick as we go into the bottom of the hour.
00:19:00.620 This is Donald Trump yesterday.
00:19:02.460 Mocking or just stating the facts?
00:19:04.300 Do we have time?
00:19:07.480 I had one beer.
00:19:09.380 Well, do you think it was?
00:19:10.700 Nope.
00:19:11.160 It was one beer.
00:19:12.160 Oh, good.
00:19:12.940 How did you get home?
00:19:13.880 I don't remember.
00:19:14.680 How'd you get there?
00:19:15.360 I don't remember.
00:19:16.120 Where is the place?
00:19:16.880 I don't remember.
00:19:17.860 How many years ago was it?
00:19:19.020 I don't know.
00:19:20.420 I don't know.
00:19:21.200 That's true.
00:19:22.080 He's not mocking.
00:19:24.120 He's not mocking.
00:19:28.400 This is the best of the Glenn Beck Program.
00:19:34.300 Like listening to this podcast?
00:19:38.280 If you're not a subscriber, become one now on iTunes.
00:19:41.520 And while you're there, do us a favor and rate the show.
00:19:44.380 All right.
00:19:44.720 Home Title Lock is our sponsor.
00:19:46.360 Home Title Lock is awesome because they take some real worry off your plate.
00:19:50.820 Like right now, if I were to ask you, is someone else borrowing money against your name?
00:19:54.980 You cannot answer yes or no unless you have Home Title Lock because Home Title Lock prevents
00:20:01.100 this from happening.
00:20:01.980 In fact, they have a $100 search to see if it's already happened to you.
00:20:04.640 You get that for free when you sign up for Home Title Lock and they'll protect you on
00:20:08.240 an ongoing basis.
00:20:09.160 This is not dark web stuff.
00:20:10.620 This isn't Russia stuff.
00:20:11.700 This is just a guy who just got out of prison, learned how to do this.
00:20:14.860 It's really easy to steal your home right from underneath you.
00:20:18.200 The only people that can protect you, Home Title Lock.
00:20:20.820 Do what I did.
00:20:21.420 Do what Glenn has done.
00:20:22.360 Go to HomeTitleLock.com for pennies a day.
00:20:25.440 Home Title Lock puts a barrier around your title and mortgage.
00:20:27.980 Really important.
00:20:28.560 Get the search for free at HomeTitleLock.com.
00:20:31.140 It's HomeTitleLock.com.
00:20:33.340 We are entering a new time and everything's being redesigned right now and people aren't
00:20:38.680 really talking about the issues.
00:20:40.620 People aren't really talking about big fundamental things that are changing.
00:20:43.960 For instance, America was based on life, liberty, and the pursuit of happiness.
00:20:48.800 Nobody's talking about pursuit of happiness right now.
00:20:50.740 Pursuit of happiness is defined by our founders as ownership, that you could own things.
00:20:55.160 You could forge your own way in life.
00:20:58.700 And ownership is a big part of capitalism and a big part of America.
00:21:04.340 However, ownership is quickly going away.
00:21:07.820 When you buy a book on Kindle, do you own the book?
00:21:11.580 When you buy a movie from iTunes, do you own the movie?
00:21:19.580 The answer is no.
00:21:22.980 The end of ownership.
00:21:25.980 Aaron, and I want to get this right, Perez, say it for me, Aaron.
00:21:30.860 Just ask him.
00:21:31.440 He'll tell us.
00:21:31.660 Yeah, tell me how you say his name.
00:21:33.940 It's Perzanowski.
00:21:35.380 Perzanowski.
00:21:35.860 Okay.
00:21:36.320 It was a lot easier than it looks.
00:21:38.520 We can't pronounce easy words, so that was going to be difficult.
00:21:40.540 Yeah, it's got more than one syllable.
00:21:42.920 There's a lot of consonants there.
00:21:44.080 Yeah, I know.
00:21:44.800 How you doing, Aaron?
00:21:46.420 I'm doing well.
00:21:47.180 How are you?
00:21:47.620 Good.
00:21:47.840 I'm really fascinated by how we make the turns in our society for the future, and ownership
00:21:58.060 is a big part of this, because in the future, I don't know how many people will even own
00:22:02.380 cars.
00:22:03.080 I mean, it's just all changing.
00:22:05.680 But do we really own things when we buy them online?
00:22:09.640 So I think there's a real concern here that consumers go into transactions when they're
00:22:17.380 buying things, digital goods, especially digital books, movies, music.
00:22:22.620 They go into those transactions assuming they work the same way as they do in the world of
00:22:27.760 tangible goods, where if you buy a book, you can give it away to a friend, you can lend
00:22:32.720 it to someone, you can leave it in your will in the future and leave your book collection
00:22:38.380 to your loved ones.
00:22:41.300 And the rules that control these digital transactions when you buy something on your Kindle or from
00:22:47.620 iTunes are very different from the rules that we expect in the physical world.
00:22:52.780 And consumers don't really understand that distinction.
00:22:56.880 And I think that causes a real disconnect between what we all expect to happen and what happens
00:23:03.080 in fact.
00:23:03.780 So to give you a quick example, just a couple of weeks ago, a consumer, a customer of the
00:23:14.820 Apple iTunes movie store found that three movies that he had purchased had been deleted from
00:23:22.140 his account.
00:23:23.240 They were no longer accessible.
00:23:25.380 And I think that shocked a lot of people.
00:23:27.000 Um, those of us that have been following these issues closely for years would remember 10
00:23:32.240 years ago when Amazon remotely deleted books off of people's Kindles, including, uh, ironically,
00:23:40.140 George Orwell's 1984.
00:23:42.020 So these issues have been happening for a long time, but I think people are now
00:23:46.660 starting to really, uh, sit up and take notice of it.
00:23:49.320 Okay.
00:23:49.500 So I remember it, because it's easier for me to read everything on Kindle.
00:23:54.060 Um, and I have a large collection, uh, in my library of hardcover books.
00:24:01.480 Uh, and I read so much.
00:24:03.600 I read it all on Kindle, but I have recently really been concerned, not just because I don't
00:24:09.540 actually own it and I can't have it in my library and I can't pass it on, but also because
00:24:14.060 you watch things like it happening in China.
00:24:16.340 If you're a giant, if you're in China, I mean, at first they wouldn't sell the book, but
00:24:20.000 if they did sell the book, the government can just deem that that book is, you don't need
00:24:24.420 to burn books.
00:24:25.500 You could just overnight, just take all of that, every copy of that book out of circulation.
00:24:30.640 If it's only digital, that's really disturbing to me.
00:24:36.180 I think it's a real concern.
00:24:38.020 Um, it's a concern, um, from the perspective of censorship, as you've just described it.
00:24:44.060 It's also a real concern, uh, from the perspective of preservation and sort of archiving our cultural
00:24:51.920 history.
00:24:52.900 If these books are stored on the centralized servers and only in the hands of, you know, the
00:24:59.920 two or three companies, um, that dominate these markets, then there's a real risk that
00:25:07.200 we aren't going to be able to ensure kind of the widespread distribution of copies
00:25:14.060 that will allow us to, um, archive and preserve these works.
00:25:20.160 And, and Aaron, it, with the movie, it wasn't because they found it objectionable or anything
00:25:25.140 else.
00:25:25.380 It's because that particular provider, they lost the rights to that movie, right?
00:25:31.300 And so they, they had to pull it from people's libraries because their rights had expired.
00:25:37.860 So there are a number of ways that this can happen.
00:25:40.380 This most recent example, I don't know that the facts are totally clear on exactly
00:25:43.900 what went on.
00:25:45.040 So one way this can happen is that as you described, um, the deal between the digital retailer,
00:25:52.920 Apple or Amazon, and the copyright holder expires, they no longer have the rights to sell that
00:25:58.800 product.
00:25:59.680 It can also happen when a record label or a movie studio decides that they want to put out
00:26:05.680 the new updated, remastered director's cut edition of a movie.
00:26:11.000 And when they do that, they pull the old version to help drive the sale of the new.
00:26:16.940 Um, Oh my gosh.
00:26:17.700 So they almost force you to, I mean, cause they've always done this where,
00:26:21.900 you know, it's the masterpiece collection and it's, you know, additional footage
00:26:25.980 and, uh, you know, fully restored, but you still had the old copy.
00:26:30.600 Now that's right.
00:26:31.600 I mean, think of this, even just for comparison:
00:26:36.800 if they change something in a movie, remember when George Lucas changed
00:26:40.880 Star Wars?
00:26:42.060 Well, I want to see what it was like when it originally came out.
00:26:46.040 You wouldn't be able to do that.
00:26:47.080 Would you?
00:26:47.560 Unless the movie company decided to allow you to do that.
00:26:51.920 That's right.
00:26:52.420 I mean, and the problem in this most recent case, in part, was that the consumer
00:26:56.200 didn't have a local copy stored on their computer or their device.
00:27:01.440 Um, and, and this is just a practical tip for people.
00:27:04.160 You should always try to store as much as you can locally.
00:27:07.960 Now, these services are often trying to encourage consumers to rely on the
00:27:16.060 company's own sort of cloud storage solution.
00:27:19.260 And sometimes, um, with the Apple TV, for example, uh, the Apple TV doesn't allow you
00:27:26.020 to permanently download a copy of a movie.
00:27:28.680 You have to access it through their cloud servers.
00:27:32.500 Exactly.
00:27:33.280 Um, sorry.
00:27:34.680 I think that makes a big difference in your relationship.
00:27:37.700 If I downloaded something on Kindle, could I download it to another cloud and still be able
00:27:44.500 to read it on Kindle?
00:27:46.060 Uh, so the Kindle allows you to store those files locally on your own device, but
00:27:57.020 because the Kindle is tethered through software and network connections to Amazon, Amazon has
00:28:04.740 the ability as, as they showed 10 years ago, to remove those files from your device.
00:28:10.500 It's unbelievable.
00:28:11.020 Yeah.
00:28:11.260 You talk about, go ahead.
00:28:13.020 Real quick.
00:28:13.800 Apple, Apple has the same sort of control.
00:28:15.820 Well, we saw this several years ago too, in a very different way.
00:28:18.920 I'm sure, um, some of your listeners may remember when they woke up and found a U2 album on their
00:28:24.460 iPhone.
00:28:25.100 Yes.
00:28:26.180 They put it the other way.
00:28:27.280 They forced everybody to have it.
00:28:29.820 Exactly.
00:28:31.280 That's bizarre.
00:28:32.060 You write about this a little bit and it's, it's an interesting change in the way we think
00:28:36.560 about commerce.
00:28:37.380 There is, there is, in the past you had a transaction where you'd go into a store and you'd buy something.
00:28:42.820 With these digital purchases that we're making from iTunes or Amazon, we're actually like
00:28:47.700 entering a ongoing relationship with them.
00:28:50.900 You, it's a, it's sort of an open-ended thing where they're constantly knowing what you do
00:28:56.040 with that product.
00:28:56.800 And you have that ongoing relationship where they can cancel that at any time without your
00:29:02.380 knowledge.
00:29:02.760 Can you talk a bit, a little bit about the change there?
00:29:05.140 Cause that is a, that's a real change.
00:29:06.400 I don't think people have considered.
00:29:08.060 Aaron, you write,
00:29:08.540 uh, that the switch to the digital platform offers convenience, but also makes consumer
00:29:12.500 access more contingent.
00:29:13.840 Unlike a purchase at a brick-and-mortar bookstore, a digital media transaction is continuous, linking buyer
00:29:18.440 and seller and giving the seller a post-transaction power impossible in physical markets.
00:29:23.260 Why is that important?
00:29:25.040 So I think this is important for a number of reasons.
00:29:29.620 It leads to these scenarios that we were talking about earlier, where the seller of the good
00:29:35.380 has the ability not only to sort of reclaim, uh, or recall the good, but they also have some
00:29:43.040 ability to control how and when, and under what circumstances you make use of that product
00:29:49.240 after the sale.
00:29:50.180 So that's just not something that you could do in the tangible world, right?
00:29:55.040 Your local bookstore, put aside the publisher, can't tell you
00:30:00.000 what country you're allowed to read a book in.
00:30:02.920 They can't tell you, um, you know, how many times you get to read it.
00:30:07.520 They can't tell you who you get to lend that book to.
00:30:10.720 And they certainly can't keep records of all of those interactions.
00:30:13.980 And the digital world allows for, uh, that form of control.
00:30:20.760 And importantly, it's not limited just to digital media.
00:30:25.440 Uh, we have all these smart devices, uh, in our homes, on our bodies.
00:30:30.560 Um, you know, we've got our voice assistants and our fitness trackers and, you know, even
00:30:37.480 our home appliances and cars.
00:30:39.640 They all have software, they all have network connections and all of these sort of, uh,
00:30:45.660 problems that I've been describing are going to play out in that space as well, where device
00:30:52.520 makers are not only going to be able to track your behavior, but they're also going to be
00:30:57.680 able to limit the ways in which you can use the products that you think, uh, you have purchased.
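The post-sale control being described here can be sketched as a toy model. Everything below is a hypothetical illustration, not any real store's API: a physical sale transfers the good outright, while a digital "sale" leaves the seller holding a continuing license it can restrict, log, or revoke.

```python
# Toy model of the asymmetry between a physical sale and a digital "sale".
# All classes and names here are hypothetical illustrations, not a real store's API.

class PhysicalBook:
    """Once sold, the seller retains no control."""
    def __init__(self, title):
        self.title = title

    def read(self):
        return f"reading {self.title}"  # works anywhere, forever, for anyone


class Seller:
    """A digital storefront whose relationship with the buyer never ends."""
    def __init__(self):
        self.revoked = set()
        self.access_log = []  # the seller can keep records of every interaction

    def license_valid(self, book, country):
        # Post-sale conditions: region locks, recalls, and so on.
        return book.title not in self.revoked and country == "US"

    def log_access(self, book, country):
        self.access_log.append((book.title, country))

    def recall(self, book):
        # The "reclaim" scenario: the good vanishes from the buyer's library.
        self.revoked.add(book.title)


class DigitalBook:
    """A 'purchase' that is really a continuing, revocable license."""
    def __init__(self, title, seller):
        self.title = title
        self.seller = seller  # the transaction is continuous, linking buyer and seller

    def read(self, country="US"):
        if not self.seller.license_valid(self, country):
            raise PermissionError(f"license for {self.title} revoked or restricted")
        self.seller.log_access(self, country)
        return f"reading {self.title}"


paper = PhysicalBook("Moby-Dick")
paper.read()                     # no one can stop this

store = Seller()
ebook = DigitalBook("1984", store)
ebook.read()                     # fine today, and the access is logged
store.recall(ebook)              # the seller changes its mind after the sale
# ebook.read() would now raise PermissionError
```

The point of the sketch is only that the seller object outlives the transaction; in a physical sale there is no such object left to consult.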
00:31:04.080 So, so let me, so let me interrupt here and just ask you this.
00:31:08.080 I see when I go to iTunes, I see a movie I want to watch.
00:31:11.520 It says rent or own.
00:31:15.240 I'm not owning it.
00:31:16.440 I'm just renting it in a different way.
00:31:19.740 Isn't this false advertising?
00:31:22.540 Uh, so I think there's a really good case to be made here that companies like Amazon and
00:31:27.000 Apple that use language like "own" and "buy," words that have real meaning for people in their
00:31:34.600 everyday lives, are misstating the, the nature of those transactions.
00:31:39.960 So, uh, my, uh, coauthor, Chris Hoofnagle, and I wrote a paper a few years ago, a couple of
00:31:47.180 years ago now, um, called "What We Buy When We Buy Now" that did a survey of about 1,500 consumers
00:31:54.500 to figure out what people think this language means.
00:31:58.300 And it turns out that a significant percentage of consumers incorrectly believe that they
00:32:06.840 do have true ownership rights and they get to keep these goods, that they can lend them,
00:32:11.240 that they can give them away.
00:32:13.080 And we think that there is an opportunity here to, uh, correct this misinformation in the
00:32:20.040 marketplace.
00:32:20.420 But think about the companies that we're talking about, you know, Apple and Amazon are two of
00:32:25.280 the biggest corporations the world has ever seen and getting them to, uh, convincing them
00:32:34.820 to communicate in a, in a more clear and fair way is, is, is a real challenge.
00:32:41.980 Class action lawsuit.
00:32:43.360 So I think there is a possibility for class action litigation here.
00:32:50.560 There, there are a bunch of, uh, legal, uh, and practical hurdles to making that happen.
00:32:56.880 I think it's something worth pursuing.
00:32:58.640 I think the Federal Trade Commission has a role to play here.
00:33:02.660 This is, uh, squarely within their, um, uh, within their area of, of expertise and obligation
00:33:12.740 to police the market to make sure that consumers have accurate information.
00:33:17.440 Aaron, um, go ahead.
00:33:19.360 Yeah.
00:33:20.000 I just want, go ahead.
00:33:22.000 The, the, the way the market works depends on consumers being informed.
00:33:26.820 People can't make rational choices.
00:33:28.940 People can't decide where to spend their money if they're being misled about the products that
00:33:33.940 they're getting.
00:33:34.680 So I think that it's crucial for the functioning of the market, uh, to have that information
00:33:39.540 be correct.
00:33:40.580 Have you done any look into what a society without real ownership, I mean, we're down to, you
00:33:49.020 know, renting clothes and everything else.
00:33:51.200 Uh, and that's only going to get stronger as, as, as we move forward.
00:33:54.580 Have you looked into what that means for a capitalist society and for America in particular, that
00:34:01.080 has always been about ownership?
00:34:04.420 So my biggest concern here is the way this changes kind of our conception of ourselves and
00:34:14.060 the way we think about ourselves as individuals in a society.
00:34:18.620 This is the best of the Glenn Beck program.
00:34:38.480 One of my favorite guys, uh, because he is, he does his own homework.
00:34:42.100 He rolls up his sleeves.
00:34:43.440 He looks and he tells the truth as he finds it.
00:34:46.260 Peter Schweizer is here.
00:34:47.800 He's the president of the Government Accountability Institute and a producer of a new documentary
00:34:52.000 that's out called The Creepy Line.
00:34:54.340 And that is exactly the right name for it.
00:34:58.420 It is.
00:34:59.260 And it actually, the creepy line comes from a speech that Eric Schmidt, the CEO
00:35:03.860 of Google, gave.
00:35:04.780 It was an interview.
00:35:05.400 In fact, where he was asked, how do you make these ethical judgments about how far you're
00:35:09.800 going to go?
00:35:10.440 And the, the interviewer actually asks Schmidt, are you going to implant things in our brain?
00:35:15.460 And Eric Schmidt's response was, well, we like to go right up to the creepy line, but
00:35:20.980 not cross it.
00:35:22.060 And he said, we're not going to implant anything in your brain.
00:35:24.840 At least not yet.
00:35:26.120 Those are actually Eric Schmidt's words.
00:35:28.820 And he's, he's, I find him incredibly frank.
00:35:32.420 Yes.
00:35:32.760 He, he just, he says it like it is.
00:35:35.340 Yes.
00:35:35.600 It's, it's, I've interviewed him a couple of times and it is fascinating because he's
00:35:41.840 just telling you.
00:35:42.820 He doesn't sugarcoat it.
00:35:44.140 And I think it's his background as an engineer and, and he's sort of very direct.
00:35:49.380 I mean, one of the other things we quote him in the film is saying is that Google has and
00:35:54.000 takes very seriously its responsibility to change the values of American people.
00:36:00.060 Uh, you know, Google's mantra has always been, they are more than just a company to make
00:36:05.440 money.
00:36:06.120 Uh, they have a certain ethos, a certain worldview.
00:36:09.160 And part of the reason that they structured the company the way they did in which the founders
00:36:14.140 always have controlling shares is that that sense of social mission is part of it.
00:36:18.300 And Schmidt has been always very direct about saying it.
00:36:20.920 Yes.
00:36:21.160 Part of our mission as a company has been to try to shape and change the values of the
00:36:26.320 United States.
00:36:27.000 And that's sort of one of the premises of this film that it's not just about privacy.
00:36:31.620 It's not that there's taking all this information, Glenn, they're using that information against
00:36:36.680 us to try to nudge us or to move us into directions that we wouldn't ordinarily want to go.
00:36:42.120 Okay.
00:36:42.760 So, um, so let's, can, can you tie this all to Kavanaugh and what we've seen with the Kavanaugh
00:36:50.040 case and how, for instance, you know, there's, there's, um, uh, there is this overwhelming,
00:36:57.200 uh, understanding from half the country that he is absolutely guilty and she is a victim.
00:37:06.060 Right.
00:37:06.660 And there's a lot of information on the other side.
00:37:10.620 In fact, more information on the other side, but you're not really seeing that.
00:37:13.620 Right.
00:37:14.280 Yeah.
00:37:14.660 It's, it's very hard because this is happening in real time right now to sort of monitor what's
00:37:19.700 Google doing, but we can look at the past.
00:37:22.140 Uh, in fact, one of the things we feature in the film is a study done by Robert Epstein.
00:37:27.300 Uh, Epstein's a very interesting guy.
00:37:29.240 He's a Harvard PhD in psychology who studied under B.F. Skinner, uh, was a former editor-in-chief
00:37:34.960 of Psychology Today magazine.
00:37:36.600 And by the way, and this is very relevant, was a Hillary Clinton supporter in 2016.
00:37:41.720 Well, one of the things he did in the 2016 election was he had 2000 people around the
00:37:47.160 country doing Google searches, uh, and they monitored the results that people were getting.
00:37:52.700 This is a very, uh, you know, uh, clear academic study and, and this research was peer reviewed
00:37:58.140 as his other work was.
00:37:59.340 Uh, and what came back was that Google was systematically skewing search results in favor of Hillary Clinton.
00:38:06.320 They were, in other words, they were, uh, suppressing negative stories about Hillary in the algorithm,
00:38:11.860 and they were pushing negative stories about Donald Trump.
00:38:14.480 And Epstein's point was, I actually supported Hillary Clinton thought she was more qualified,
00:38:18.740 but the bottom line is a company should not be doing this.
00:38:22.780 And it's secret.
00:38:23.740 You don't know that it's going on.
00:38:25.940 Nobody's monitoring the results they're getting.
00:38:28.020 They're assuming the results in the list that they're getting are representative of some
00:38:32.580 objective standard.
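The mechanics behind a study like Epstein's are easy to illustrate. The sketch below uses an invented scoring scheme, not Google's actual algorithm: because users overwhelmingly click the top few results, a small hidden re-weighting changes what people see without removing anything.

```python
# Toy illustration of how a small, invisible re-weighting reorders search results.
# The scoring scheme is invented for illustration; it is not any real search algorithm.

def rank(results, boost=None):
    """Sort results by relevance, optionally nudged by a hidden per-slant boost."""
    boost = boost or {}
    return sorted(results,
                  key=lambda r: r["relevance"] + boost.get(r["slant"], 0.0),
                  reverse=True)

results = [
    {"title": "Candidate A scandal", "slant": "anti_A", "relevance": 0.90},
    {"title": "Candidate A profile", "slant": "pro_A",  "relevance": 0.85},
    {"title": "Candidate B scandal", "slant": "anti_B", "relevance": 0.80},
]

neutral = [r["title"] for r in rank(results)]
# A 0.15 demotion of one kind of story pushes it from the top slot to the bottom:
skewed = [r["title"] for r in rank(results, boost={"anti_A": -0.15})]
```

Nothing is removed, so a user inspecting any individual result sees no difference; only the ordering, which drives most clicks, has shifted. That invisibility is why Epstein had to monitor thousands of real searches to detect the skew.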
00:38:33.860 Google is a, Google is a verb now.
00:38:35.820 It's not a noun.
00:38:36.900 It's a, it's a verb.
00:38:38.380 I don't know.
00:38:38.940 Google it.
00:38:39.720 Yes.
00:38:40.380 Well, if you Google it and the, and the algorithm is giving you the answer that is skewed, right?
00:38:46.780 That's like going to a dictionary that will always change the definitions of things as
00:38:53.520 it applies to whatever's happening in the world.
00:38:56.280 Yes.
00:38:56.620 That's a real problem.
00:38:57.920 No, you're, you're exactly right.
00:38:59.320 And so in the, in the context of Kavanaugh, I mean, I don't know exactly because it's occurring
00:39:03.580 in real time, but the bottom line is there is a history here of Google doing this.
00:39:08.160 It was, it was leaked a couple of weeks ago.
00:39:11.200 Tucker Carlson talked about on Fox about these internal emails where you actually had Google
00:39:16.980 engineers saying, Hey, you know what?
00:39:19.520 We don't like, you know, Trump's policy on immigration.
00:39:22.340 So we want to sort of, uh, suppress certain stories.
00:39:26.280 Um, this is a thing and Google does it.
00:39:28.640 And, and here's the, the, the, the point that we try to make Glenn in this film and in general,
00:39:33.300 the whole conversation that Google wants to have is about fake news and this debate about
00:39:38.240 fake news.
00:39:38.880 Here's the, here's the bottom line.
00:39:40.660 Fake news is competitive.
00:39:41.820 If you and I are having a disagreement about something, I put up my fake news story and
00:39:47.080 you say, Oh yeah, I'm going to put up my fake news story.
00:39:49.980 The point is it's out in the open.
00:39:51.840 You have combat.
00:39:52.780 And by the way, fake news doesn't really convince anybody.
00:39:56.020 You know, if you like Hillary Clinton, that fake news ad that the Russians ran of Jesus and,
00:40:01.940 and, and Hillary arm wrestling is probably not going to convince you to vote a different
00:40:06.440 way.
00:40:06.880 That wasn't the, that wasn't a real arm wrestling competition.
00:40:09.640 But you know, the, the, the point is, is that that's not going to convince anybody because
00:40:15.300 of confirmation bias.
00:40:16.440 You know, people tend to look for information they want.
00:40:19.020 What Google's doing is different because we don't know what we don't know.
00:40:23.860 The question that we should be asking people, uh, Google and Facebook is why will you not
00:40:29.800 make your algorithm transparent?
00:40:32.260 Right.
00:40:32.520 I've never believed in, you know, those dystopian movies.
00:40:35.680 I've always made fun of them and said, yeah, this is, this is crazy.
00:40:38.960 You know, the, you know, the corporations out to get you because of their algorithms, because
00:40:44.740 they are so all encompassing, that is the world we're headed towards.
00:40:49.980 What do they tell you when they say algorithms?
00:40:54.440 Oh, no, we have to keep that top secret because.
00:40:57.240 Yeah.
00:40:57.460 They, what they argue is it's, it's for reasons of, of, uh, you know, state secret.
00:41:01.620 Um, and, and, you know, that they need to protect their trade secrets.
00:41:05.320 They need to be, uh, uh, you know, making sure that nobody gets access to it.
00:41:09.040 There's some truth to that, but there are a lot of things that they could do to demonstrate,
00:41:13.200 um, that they're offering a fair product and service to people.
00:41:17.460 And here's the thing, Glenn, they have lied about this before, you know, 10 years ago or
00:41:22.380 so you had other, uh, companies like TripAdvisor and Yelp who were saying that Google was artificially
00:41:29.220 suppressing their rankings in Google in favor of Google owned companies, which, okay, you
00:41:35.220 know, Google has the right to do that.
00:41:36.400 But here's the thing, Google flat out lied and said, absolutely not.
00:41:40.040 We don't do that.
00:41:41.180 Our algorithm is pure.
00:41:42.740 It's true.
00:41:43.360 The best results are going to, are, are organically at the top.
00:41:47.260 Well, here's the problem.
00:41:48.300 The Federal Trade Commission, the European Union, professors at Harvard University looked
00:41:52.640 at this and said, BS, you are fiddling with the algorithm.
00:41:55.700 You are screwing these other competitors and you're lying.
00:41:59.080 So the point is when Google says you can trust the algorithm, you can trust us.
00:42:03.860 They've lied before and they're lying now.
00:42:06.020 And I think the only question that remains really is how are we going to deal with this?
00:42:10.600 Um, you know, there's an old story that Henry Kissinger said when he's on the national security
00:42:14.960 council, you give a president three choices, do nothing, take my solution or thermonuclear
00:42:20.800 war.
00:42:21.260 Those are your three choices.
00:42:22.500 Uh, in this case, it's kind of like that.
00:42:24.560 We can do nothing.
00:42:25.720 We can try to deal with some sort of the regulatory issues related with Google, or we can break
00:42:31.720 up these companies.
00:42:32.680 Those are the three options that we have.
00:42:34.720 And I think we're really at the point of option number three, because this is not a monopoly
00:42:39.080 like Standard Oil.
00:42:41.260 Standard Oil, that's going to dominate the oil market.
00:42:42.760 This is controlling the news flow in the United States.
00:42:46.180 This is in the end.
00:42:48.260 This is in the end, Peter, um, controlling everything.
00:42:52.260 Yes.
00:42:52.900 Google is the most likely company in America, in the free world, uh, to come up with AI.
00:43:02.400 Yes.
00:43:02.660 Whoever, whoever gets to AI first controls everything.
00:43:08.920 There's no way to beat it.
00:43:10.600 Right.
00:43:10.980 Once you have AI.
00:43:12.300 Yes.
00:43:12.820 This company is the most likely in the free world to come up with it.
00:43:17.720 If we don't have them contained in some way or another, when they get to AI, we're toast.
00:43:26.140 Yes.
00:43:26.700 Yes.
00:43:27.160 That's exactly right.
00:43:28.240 And here's the thing.
00:43:29.040 It's not just Google, the company, a lot of people don't realize this.
00:43:33.380 I didn't realize this.
00:43:34.640 If you use Safari on your Apple product, you're actually using the Google algorithm.
00:43:39.500 And that is Google information.
00:43:41.680 Um, if you're using, if you are using, um, Yahoo, you're using Google.
00:43:46.640 The point being Firefox is Google.
00:43:49.580 The, all these entities are using the Google algorithm.
00:43:52.440 So even if you say, I am not going to use Google.com, you're using, unless you are making
00:43:58.060 very specific choices for other options.
00:44:00.740 If you're using any of those others, Google is the one that's dominating it.
00:44:04.080 And by the way, Google pays Apple $9 billion a year.
00:44:09.040 Google actually pays Apple to be the algorithm of choice for Safari.
00:44:13.640 That's how much they value this information and want to dominate this space.
00:44:17.580 The blaze radio network on demand.