The Glenn Beck Program - October 03, 2018


'Reason and Facts Matter' - 10⧸3⧸18


Episode Stats

Length

1 hour and 50 minutes

Words per Minute

156.2084

Word Count

17,327

Sentence Count

1,662

Misogynist Sentences

15

Hate Speech Sentences

13


Summary

Trump mocks the victim. Christine Blasey Ford's ex-boyfriend describes a strange line of questioning during a polygraph. Glenn explains why this is not only disturbing but also a very disturbing thing that happened under oath.


Transcript

00:00:00.000 The Blaze Radio Network, on demand, Glenn Beck.
00:00:09.500 All right, I wanted to start with something.
00:00:12.180 This is very, very complex.
00:00:14.440 I got up this morning and I've been doing a whole bunch of research on the show.
00:00:21.620 The last thing I got to was the Donald Trump thing.
00:00:24.520 I was on our affiliate in Tulsa and I was asked, what about Donald Trump?
00:00:32.340 And I should have just said, I don't know, I haven't seen it yet, but I had read about it.
00:00:38.340 And I'm glad I didn't comment on it.
00:00:41.540 All I said was, I'm not going to comment on it because it's just, it's ridiculous to focus everything on Donald Trump.
00:00:49.980 However, it does deserve comment.
00:00:55.520 The media has said, did you hear Donald Trump mocking the victim?
00:01:04.180 I'm going to get into that in a second.
00:01:07.660 No, I hadn't.
00:01:09.800 But in case you have only read about it or you've only seen the headlines, I would like to play the audio.
00:01:17.500 Here is Donald Trump, according to the press, mocking the victim.
00:01:23.500 Listen.
00:01:23.720 I had one beer.
00:01:25.580 Well, do you think it was, nope, it was one beer.
00:01:28.360 Oh, good.
00:01:29.120 How did you get home?
00:01:30.080 I don't remember.
00:01:30.860 How'd you get there?
00:01:31.560 I don't remember.
00:01:32.320 Where is the place?
00:01:33.080 I don't remember.
00:01:34.060 How many years ago was it?
00:01:35.220 I don't know.
00:01:36.380 I don't know.
00:01:38.800 I don't know.
00:01:41.200 I don't know.
00:01:43.580 What neighborhood was it in?
00:01:45.100 I don't know.
00:01:45.500 Where's the house?
00:01:46.360 I don't know.
00:01:48.300 Upstairs, downstairs, where was it?
00:01:50.140 I don't know.
00:01:50.500 But I had one beer.
00:01:51.480 That's the only thing I remember.
00:01:52.580 All right.
00:01:53.140 So this is Donald Trump mocking?
00:01:56.040 No.
00:01:57.700 This is Donald Trump stating the facts in the Kavanaugh case.
00:02:03.280 Period.
00:02:03.720 It's Wednesday, October 3rd.
00:02:14.200 You're listening to the Glenn Beck Program.
00:02:16.120 I'm going to give you a few other facts here.
00:02:20.260 Facts that MSNBC and everybody else don't seem to care.
00:02:25.260 Well, they've moved on.
00:02:26.920 I don't know if you've known this.
00:02:28.040 They've moved on now to other things.
00:02:31.140 The lies.
00:02:31.880 The lies that he told.
00:02:33.860 Okay.
00:02:34.240 Well, let's talk about lies, shall we?
00:02:36.860 This is a letter to Grassley's office.
00:02:40.680 The names have been taken out.
00:02:42.780 I, so-and-so, am a current resident of California.
00:02:46.060 I first met Christine Blasey, now Christine Blasey Ford, in 1989, 1990 in California.
00:02:52.940 From 90 to 91, I was just friends with Ford.
00:02:56.060 From approximately 92 to 98, I was in a relationship.
00:03:01.060 So, from 1989 to 1998, nine years, this person knew and was very, very close to her.
00:03:11.120 I found her truthful and maintained no animus toward her.
00:03:16.620 During our time dating, Dr. Ford never brought up anything regarding her experience as a victim
00:03:22.560 of sexual assault, harassment, or misconduct.
00:03:24.680 She never mentioned Brett Kavanaugh.
00:03:27.540 During some of the time we were dating, Dr. Ford lived with Monica I.
00:03:32.320 McClain, who I understood to be her lifelong best friend.
00:03:38.380 During that time, it was my understanding that McClain was interviewing for jobs with the FBI
00:03:44.100 and the U.S. Attorney's Office.
00:03:46.380 I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:03:53.100 Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become
00:04:01.860 familiar and less nervous about the exam.
00:04:06.200 Dr. Ford was able to help because of her background in psychology.
00:04:10.660 Now, this is interesting because I do remember, while she was under oath, a very strange line
00:04:18.700 of questioning that went a little something like this.
00:04:23.300 Have you ever had discussions with anyone, besides your attorneys, on how to take a polygraph?
00:04:32.980 Dr. Ford said, no, I was scared of the test itself, but it was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:04:56.700 I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:05:00.860 Stressful.
00:05:01.540 Have you ever given tips or advice to somebody who was looking to take a polygraph test?
00:05:07.300 Never.
00:05:08.060 Never.
00:05:09.220 Maybe the FBI.
00:05:10.460 I demand an FBI investigation on Monica L. McClain, who is a lifetime friend of Dr. Ford, because there is, now I want to use this word carefully, an accuser.
00:05:28.260 We have to define that here in a second.
00:05:30.000 An accuser saying that Dr. Ford and Monica McClain, Monica was interviewing for jobs with the FBI in the U.S. Attorney's Office.
00:05:42.580 I witnessed Dr. Ford help McClain prepare for a potential polygraph exam.
00:05:49.440 Dr. Ford explained in detail what to expect, how polygraphs work, and helped McClain become familiar and less nervous about the exam.
00:06:00.260 Let me play this audio again of what she said under oath.
00:06:05.060 Have you ever had discussions with anyone besides your attorneys on how to take a polygraph?
00:06:15.840 Never.
00:06:16.460 Never.
00:06:16.860 And I don't just mean countermeasures, but I mean just any sort of tips or anything like that.
00:06:25.500 No, I was scared of the test itself, but it was comfortable that I could tell the information and the test would reveal whatever it was going to reveal.
00:06:38.780 I didn't expect it to be as long as it was going to be, so it was a little bit stressful.
00:06:42.300 Have you ever given tips or advice to somebody who was looking to take a polygraph test?
00:06:48.940 Never.
00:06:49.520 Never.
00:06:50.640 Never.
00:06:51.400 Well, we know somebody's lying here.
00:06:54.080 We know, we know someone is lying, right?
00:07:00.640 Don't we, Stu?
00:07:01.620 Because we have somebody, we have somebody who has accused her of teaching someone else about a polygraph.
00:07:12.660 Well, she's innocent until proven accused.
00:07:14.640 I think that's important that we keep that standard.
00:07:16.160 Now that she's proved accused, she's guilty.
00:07:19.920 So if we're going to use the same standard that the left is applying, she is a liar.
00:07:26.480 She has perjured herself.
00:07:29.260 Certainly shouldn't be a professor anymore.
00:07:31.280 She should not be a professor.
00:07:32.720 Should she even be allowed to work at a fast food restaurant?
00:07:35.080 I don't think so.
00:07:35.600 How can you, how can you possibly believe a liar on anything she says?
00:07:42.880 You want this standard?
00:07:44.720 Because this is the standard that's coming.
00:07:47.500 This is the standard that we're now running to embrace.
00:07:51.340 This is the standard that our children, this is the standard that we ran from.
00:07:58.100 This is why we're America.
00:08:00.560 America, because in every other country, this was a new idea.
00:08:05.100 You cannot come into my house and just take me.
00:08:08.560 You can't just throw me in jail.
00:08:11.160 You have to have an accuser.
00:08:13.980 I have to know what the charges are.
00:08:16.740 I have a right to defend myself.
00:08:19.700 I have a right to know who my accuser is and address my accuser.
00:08:25.760 I have a right to be presumed innocent until proven guilty.
00:08:33.040 This is what America was founded on.
00:08:36.800 This is a uniquely American idea.
00:08:41.500 This was the genius of our founders.
00:08:45.280 You want to flush it away?
00:08:48.000 Go ahead.
00:08:50.000 But I will not be part of it.
00:08:52.180 But this is the American idea.
00:09:00.840 Now, listen.
00:09:03.800 It is so imperative that you understand what this is.
00:09:10.720 If you do not understand what you're fighting.
00:09:13.120 Do you think we could have won World War II without naming the Nazis?
00:09:19.880 Do you think we will ever win this war on terror without naming what it is about?
00:09:29.600 What is driving people to the terror?
00:09:33.320 The Islamist ideas.
00:09:36.380 Not Muslim ideas.
00:09:38.220 Islamist ideas.
00:09:39.880 That Sharia law is the prevailing law.
00:09:43.140 And if you're not under Sharia law, you're an infidel.
00:09:47.320 Which means I can kill you.
00:09:48.640 I can rape you.
00:09:49.400 I can turn you into a slave.
00:09:52.380 That is what the war on terror is all about.
00:09:57.240 And we will never win it unless we name our enemy.
00:10:02.540 We would have never won World War II if we were fighting the Germans.
00:10:10.400 We were not fighting the Germans.
00:10:13.280 We were fighting the Nazis.
00:10:17.920 We would not have won in the Civil War had we been fighting the South.
00:10:26.440 We were fighting people who didn't believe in the Constitution.
00:10:32.960 We were fighting for the freedom of all men.
00:10:37.560 That's why we won.
00:10:39.680 And by the way, if you don't think that's true, we lost every single battle up until the point
00:10:46.320 that Abraham Lincoln said, this is about slavery.
00:10:53.080 Look it up.
00:10:54.140 We wouldn't have won the American Revolution if it wasn't against tyranny.
00:11:02.140 It wasn't against the king.
00:11:03.920 It was against tyranny.
00:11:05.720 And it was for certain ideas like the idea that you are innocent until proven guilty.
00:11:16.620 We are fighting post-modernism.
00:11:19.800 And until the American people understand what post-modernism is, you will lose.
00:11:28.760 You will lose every battle because you will only grow frustrated and angry, which will play
00:11:36.600 directly into what they want to happen.
00:11:40.480 They want us at each other's throats.
00:11:43.720 They want us to be irrational.
00:11:46.020 They want us to be angry.
00:11:48.100 They want us just to start swinging in blind rage.
00:11:53.180 That's their plan.
00:11:56.560 And until you understand what they're doing, until you understand that this isn't really
00:12:04.180 about Ford, this isn't about the charges, this isn't about anything, this is all about the
00:12:12.440 patriarchy.
00:12:13.200 This is all about white men have put together, in this case, a rape culture, and they have
00:12:21.400 kept people down.
00:12:23.300 And it doesn't matter if he really did it because other white men have.
00:12:28.200 It doesn't matter if she was really a victim because other women have been victims.
00:12:34.480 This is all about collective justice, currently entitled social justice.
00:12:43.280 But make no mistake, this is collective justice.
00:12:49.340 And collective justice, to put it into the terms that a Christian will understand, is anti-Christ.
00:13:00.760 Collective salvation is anti-Christ.
00:13:06.440 Collective justice is anti-Christ.
00:13:11.700 Individual salvation, individual justice, that is Christian.
00:13:22.600 You cannot balance the scales by convicting someone who is not guilty because they're not
00:13:30.760 because someone who looked like them has done it anyway.
00:13:42.980 I don't think America understands.
00:13:47.600 And I think you feel it.
00:13:49.360 I think you feel it.
00:13:50.440 I don't know if your neighbors do, but I think you feel it.
00:13:52.660 We are extraordinarily close to the edge of the abyss.
00:14:00.620 And I am doing what I promised I would do.
00:14:03.960 I promised when it came to that time, and I asked you to do the same.
00:14:12.400 I would stand and say, don't go there.
00:14:17.920 Stop where you are.
00:14:20.680 Turn around.
00:14:22.660 I know that's where the crowd is going.
00:14:25.320 Turn around.
00:14:26.600 Stop.
00:14:29.160 Safety is this direction.
00:14:31.040 I have not known how to explain it to you.
00:14:38.720 It has only been my gut.
00:14:40.660 But I know what it is.
00:14:44.980 And I've been explaining it on TV, and I've explained it on radio.
00:14:48.840 On Thursday, we're going to go into it on depth.
00:14:51.420 That's tomorrow.
00:14:53.440 It is in the book.
00:14:55.060 Read it at the library.
00:14:56.220 I don't care if you buy it.
00:14:57.420 Read it at the library.
00:15:00.420 It is not a surrender.
00:15:02.740 It is a desperate plea.
00:15:05.200 Please understand what's happening to us.
00:15:09.180 There is a way to win.
00:15:11.300 But we started this hour with Donald Trump.
00:15:16.820 You'll notice they say he mocked.
00:15:18.840 He didn't mock.
00:15:19.940 He stated facts.
00:15:22.320 People are not going to want to hear the facts.
00:15:25.220 That's okay.
00:15:26.580 State them.
00:15:27.860 State the facts calmly, rationally, and relentlessly.
00:15:33.080 The only thing that matters is reason and facts.
00:15:49.400 The book is Addicted to Outrage.
00:15:51.540 Please pick it up.
00:15:53.520 The audio book is really good.
00:15:55.860 Spend 35 hours working on it, reading it.
00:15:58.560 It's 15 hours in total.
00:16:00.720 It's really good.
00:16:03.080 All right.
00:16:03.940 Filter by is our sponsor this half hour.
00:16:07.420 Hey, Stu, wasn't China, aren't they the best?
00:16:10.080 Isn't China the best at, you know, fighting global warming and everything?
00:16:14.100 They're on the cutting edge, as Pat would say, as allegor, when it comes to global warming.
00:16:20.880 They absolutely are.
00:16:22.240 The problem is they're not.
00:16:24.380 The opposite is true.
00:16:25.760 Well, they are on the cutting edge of leading the world in emissions.
00:16:28.640 That they are doing.
00:16:29.860 Okay.
00:16:30.460 I'll give you that.
00:16:31.160 I'll give you that.
00:16:31.720 They've just discovered that air pollution causes a huge reduction in intelligence.
00:16:37.120 Huh.
00:16:38.340 Who would have thunk?
00:16:39.380 Breathing in really bad stuff hurts people.
00:16:42.600 Huh.
00:16:42.940 I got to stop.
00:16:43.760 It looks like they picked the wrong day to start sniffing glue again.
00:16:46.400 That's exactly right.
00:16:47.880 Here's the thing.
00:16:48.680 95% of the global population is breathing unsafe air.
00:16:52.980 I hope that you're in the 5% that's not.
00:16:56.500 Do the smart thing.
00:16:57.760 Change your filter.
00:16:58.940 That's all you have to do.
00:16:59.980 We have HVAC systems.
00:17:01.860 You have to change your filter for a couple of reasons.
00:17:04.440 One, if it's clogged, it's not filtering anything, and it's going to wear down your HVAC system.
00:17:11.860 It's harder for it to suck air through it.
00:17:14.540 The second thing is, you need to actually have filtered air.
00:17:19.680 So, all you have to do is just change it.
00:17:22.360 I don't change it because I forget.
00:17:24.520 That's where FilterBuy comes in.
00:17:25.980 First of all, you don't have to run to a store.
00:17:27.580 They have 600 sizes, custom options.
00:17:30.440 They ship for free within 24 hours.
00:17:33.120 Plus, they're made here in America, and if you're like me and you never remember, you just set up a schedule with them.
00:17:40.100 You say, you know, every six months, every eight months, whatever your HVAC system calls for, and they just automatically send it to you, and you'll save 5% when you do that.
00:17:49.100 FilterBuy.
00:17:49.600 They'll save you time.
00:17:50.360 They'll save you money, and you'll breathe better.
00:17:52.700 Stop procrastinating.
00:17:53.900 Stop forgetting about it.
00:17:54.960 Change your filter.
00:17:55.960 It's FilterBuy.com.
00:17:58.620 FilterBuy.com.
00:18:03.720 Let me go to Al in Texas.
00:18:05.160 Hello, Al.
00:18:07.680 Al, are you there?
00:18:08.480 Yes, good morning.
00:18:10.420 Can you hear me?
00:18:11.200 Yeah, I can.
00:18:11.560 How are you, sir?
00:18:13.300 Fine, and you?
00:18:14.360 Good, good.
00:18:15.140 What's up?
00:18:16.400 Listen, Glenn, President Trump did not mock Professor Ford.
00:18:23.300 Yep.
00:18:24.380 You're right.
00:18:24.780 He merely attacked her testimony, and after all of us have been sifting out the inconsistencies in her testimony,
00:18:33.400 he merely voiced what we're all now thinking but don't want to express because we don't want to seem insensitive to a woman that was sexually assaulted.
00:18:44.120 Al, I have to tell you, I don't even think he attacked her testimony.
00:18:47.300 He just stated the facts.
00:18:49.200 That's all he did.
00:18:50.360 He stated the facts.
00:18:52.260 Now, that might look like an attack to some, but it ain't.
00:18:55.500 Al, do you want a copy of the book or the audio book?
00:18:58.760 I'm going to make one out while we're talking here.
00:19:02.200 I'm more like hard copy books.
00:19:04.800 Hard copy.
00:19:05.400 Okay, you got it.
00:19:06.680 Is this a new thing now?
00:19:07.640 We're just giving books away to everyone who actually gets on the air?
00:19:11.080 Yeah, if you get on the air.
00:19:11.980 It's rare.
00:19:12.580 It's rare.
00:19:13.140 We don't take a lot of calls.
00:19:13.800 I figure I've penciled in five books for the rest of the year to Al.
00:19:19.560 Make it to you, Al?
00:19:21.800 Yes, please.
00:19:22.560 All right.
00:19:22.880 Hang on.
00:19:23.260 Could you make it out to my worst enemy?
00:19:25.440 Al, hang on.
00:19:26.080 We're going to get you the book.
00:19:27.420 So put them on hold and we'll get his address.
00:19:31.380 He did not mock.
00:19:32.620 No, he didn't mock.
00:19:33.440 He didn't mock.
00:19:34.940 Play the audio real quick as we go into the bottom of the hour.
00:19:38.140 This is Donald Trump yesterday.
00:19:40.140 Mocking or just stating the facts?
00:19:42.020 We have time.
00:19:45.140 I had one beer.
00:19:47.080 Well, do you think it was?
00:19:48.420 Nope, it was one beer.
00:19:49.860 Oh, good.
00:19:50.660 How did you get home?
00:19:51.600 I don't remember.
00:19:52.380 How'd you get there?
00:19:53.080 I don't remember.
00:19:53.840 Where is the place?
00:19:54.580 I don't remember.
00:19:55.580 How many years ago was it?
00:19:56.740 I don't know.
00:19:58.140 I don't know.
00:19:58.920 That's true.
00:19:59.740 He's not mocking.
00:20:01.840 He's not mocking.
00:20:03.420 Glenn Beck.
00:20:05.300 Mercury.
00:20:05.700 You're listening to the Glenn Beck Program.
00:20:13.060 Welcome to the program.
00:20:14.060 I'm so glad you're here today.
00:20:16.480 It's vocabulary day.
00:20:18.780 I hope you did your homework.
00:20:20.800 We are just dissecting one sentence.
00:20:23.960 Trump mocked the victim.
00:20:27.920 All right.
00:20:28.580 We all know what Trump means.
00:20:33.080 Mocked.
00:20:34.920 Well, could you look up mock for me?
00:20:36.960 Sure.
00:20:37.240 I think we all know what mocked means.
00:20:39.740 And we've just played the audio.
00:20:42.120 We'll play it again here before we move on.
00:20:44.060 Go ahead.
00:20:44.660 Tell me what mocked means.
00:20:45.720 To tease or laugh in a scornful or contemptuous manner.
00:20:49.980 Okay.
00:20:50.840 Is this mocking the victim?
00:20:54.660 Donald Trump last night.
00:20:55.940 I had one beer.
00:20:58.260 Well, do you think it was?
00:20:59.580 Nope.
00:21:00.060 It was one beer.
00:21:01.040 Oh, good.
00:21:01.800 How did you get home?
00:21:02.760 I don't remember.
00:21:03.540 How'd you get there?
00:21:04.240 I don't remember.
00:21:05.000 Where is the place?
00:21:05.760 I don't remember.
00:21:06.760 How many years ago was it?
00:21:07.900 I don't know.
00:21:09.060 I don't know.
00:21:11.480 I don't know.
00:21:13.880 I don't know.
00:21:16.260 What neighborhood was it in?
00:21:17.800 I don't know.
00:21:18.280 Where's the house?
00:21:19.040 I don't know.
00:21:19.460 Stop.
00:21:20.160 You'll notice that people are not laughing.
00:21:22.260 They are smiling and they are cheering.
00:21:24.320 But they are not laughing.
00:21:25.940 He's not meaning this is a joke.
00:21:27.920 He is instead stating the facts and the absurdity of this case.
00:21:34.700 And people are cheering because finally someone is saying it.
00:21:39.320 In his case, to give President Trump bonus points here, he's saying things that I don't think any other president would have the balls to do.
00:21:47.620 At least no other Republican president.
00:21:49.580 Barack Obama would have done this.
00:21:52.420 But only, only Barack Obama in, you know, in the last recent memory.
00:21:58.940 George Bush wouldn't have done this.
00:22:00.860 Bill Clinton, I don't think, would have done this.
00:22:03.380 He just would have left it alone.
00:22:05.560 Republicans never would have done this because they wouldn't still be standing.
00:22:10.420 The minute Blasey said, I want to testify, the Republican president would have run, run for the hills.
00:22:18.180 I get your point.
00:22:18.840 I would point out Clarence Thomas did go through something very similar.
00:22:21.860 Except for Reagan.
00:22:23.400 And that was Bush.
00:22:24.800 That was 91.
00:22:26.480 Clarence Thomas.
00:22:26.760 That was, wasn't that Reagan?
00:22:28.280 No, Thomas was in 1991, I think.
00:22:30.020 Oh, wow.
00:22:30.420 Okay.
00:22:30.640 I thought that was Reagan.
00:22:31.560 The point is, though, usually.
00:22:33.120 You could easily see this.
00:22:34.120 And it's a different era, right?
00:22:35.520 Everyone seems to fold over these things immediately.
00:22:38.300 You know, you make one bad joke, you lose your job at Guardians of the Galaxy.
00:22:41.480 Right.
00:22:41.580 You know, like this stuff.
00:22:43.360 Correct.
00:22:43.640 It is a different time.
00:22:44.520 Okay.
00:22:45.260 So, he's not mocking.
00:22:47.500 He is stating the facts.
00:22:49.600 That's what's happening.
00:22:51.020 Now, Trump mocked the victim.
00:22:56.400 Hmm.
00:22:57.400 Stu, I've got a few words for you to look up that I think America needs a refresher course
00:23:01.920 on.
00:23:03.080 Could you look up victim?
00:23:05.900 Hmm.
00:23:07.080 Huh.
00:23:07.600 Give me the definition of the word victim.
00:23:10.340 Glenn, victim's a noun.
00:23:11.660 It's a noun.
00:23:11.980 And by the way, we should point out that yesterday around this time, we looked up the word boof.
00:23:15.000 Yes.
00:23:15.240 So, the devil's triangle.
00:23:16.840 So, this is taking a different course.
00:23:19.140 A little bit.
00:23:19.760 Yeah.
00:23:20.280 Victim.
00:23:20.760 That was a 101 class.
00:23:21.900 A person harmed, injured, or killed as a result of a crime, accident, or other event or action.
00:23:28.940 A person harmed, killed, or injured as a result of a crime or an accident.
00:23:36.820 Okay.
00:23:37.640 One interesting thing, as an observation here, Glenn, on the word victim, is if you call her
00:23:42.160 a victim, you are condemning his guilt.
00:23:45.440 Oh.
00:23:46.220 You have already decided the case if you call her a victim.
00:23:49.860 Huh.
00:23:50.220 When you are in a headline, for example, and you say, he mocks victim, what you are saying
00:23:57.220 is...
00:23:58.220 Brett Kavanaugh.
00:23:59.020 Brett Kavanaugh is guilty.
00:24:00.080 Correct.
00:24:00.660 You're saying it without having to say it.
00:24:02.200 What you're saying, what you should say instead, could you look up the word accuser?
00:24:06.880 Accuser, yes.
00:24:07.620 Here we go.
00:24:07.920 Accuser.
00:24:08.140 Okay.
00:24:09.320 Accuser.
00:24:10.020 Accuser.
00:24:10.680 A person who claims...
00:24:12.720 Wait, this is starting to sound accurate.
00:24:14.720 It is.
00:24:15.520 A person who claims that someone has committed an offense or done something wrong.
00:24:19.560 Okay, so an accuser.
00:24:20.400 We can all agree, that's her.
00:24:21.600 An accuser.
00:24:23.200 So, he wasn't mocking.
00:24:25.360 He was stating the facts of the trial or the hearing and condensing the ridiculousness
00:24:35.620 of the accuser's claim.
00:24:38.840 We must stop using the word victim.
00:24:42.760 They are accusers.
00:24:45.340 They may end up being a victim that we can call them.
00:24:49.480 But if we are going to presume innocence, and I know this is an old-fashioned idea, if
00:24:53.720 you're going to presume innocence, you must presume innocence, which means this is the accuser,
00:25:01.980 your honor.
00:25:02.720 This is not the victim.
00:25:04.460 Because once she's a victim, what happens?
00:25:09.260 Well, we're supposed to believe.
00:25:11.100 Could you look up the word believe?
00:25:14.200 Yes, I can do that.
00:25:15.680 We are now being taught believe the victim.
00:25:19.840 Believe.
00:25:20.160 Okay.
00:25:21.020 Two definitions of the word believe.
00:25:24.000 Definition number one.
00:25:25.300 Accept something as true.
00:25:27.100 Feel sure of the truth of.
00:25:30.080 Okay.
00:25:30.500 So, you're sure something is true.
00:25:31.900 So, people now are saying, believe the victim, which is translating, feel sure that this person
00:25:41.900 has been harmed or injured in an event of some sort.
00:25:47.900 Okay?
00:25:48.140 And in this case, a person is responsible.
00:25:50.580 So, she's responsible.
00:25:51.740 Yes.
00:25:52.080 Okay?
00:25:52.560 So, you are being told, when someone says something, I have to believe and assume.
00:26:01.760 Give me the definition again.
00:26:03.320 Accept something as true.
00:26:04.680 I have to accept this as true.
00:26:07.200 Feel sure it's true.
00:26:08.460 And feel sure that it's true that this person has been harmed or went through an act caused
00:26:16.020 by this person.
00:26:18.760 Wow.
00:26:19.300 That's quite a statement.
00:26:20.980 That doesn't seem logical at all.
00:26:24.020 That should only happen after.
00:26:26.080 Well, you know, that's beyond a reasonable doubt, right?
00:26:29.080 Like, that's what the standard is there.
00:26:30.520 What's the second version of believe?
00:26:33.540 Hold something as an opinion.
00:26:36.760 Think or suppose.
00:26:38.960 That seems like the one they're applying here.
00:26:43.520 They're trying to make it seem like, I believe you, I'm sure of the truth.
00:26:46.800 When in reality, it's just their opinion.
00:26:48.760 They're supposing it's true.
00:26:50.700 Mm-hmm.
00:26:51.280 They think it's true.
00:26:52.620 Now, do you want to be judged or have your dad, your brother, your son, your sister, your
00:26:59.540 mother, your friend, have the world assume that something is true because someone stated
00:27:07.160 it without evidence?
00:27:09.860 The answer is no.
00:27:11.660 But we are being taught, the next generation is being taught to believe the victim.
00:27:17.960 Here's the thing.
00:27:19.380 When you present evidence and I see preponderance of evidence, I will believe the victim.
00:27:29.640 Now, that's not even through a court of law.
00:27:31.420 I just want a preponderance of evidence.
00:27:34.520 I want to see it.
00:27:35.740 Now, you'll notice that last week it was looking pretty dicey for Kavanaugh.
00:27:42.020 This week, it looks like there's been some things that the press is not reporting on.
00:27:45.580 Some people coming to the table saying, OK, I flew with her in a small plane in Hawaii
00:27:50.160 and a small plane, a single engine plane.
00:27:53.580 It only had one exit.
00:27:54.740 It was cramped.
00:27:55.940 It was just us.
00:27:57.000 She didn't have a fear of flying.
00:27:58.760 She didn't have it.
00:28:00.240 We went out to go circle the volcano.
00:28:03.320 What are you talking about?
00:28:04.500 A fear of flying.
00:28:05.240 She didn't have a fear of flying.
00:28:08.100 People coming out now saying the reason why she had the second door.
00:28:11.700 I was there.
00:28:14.240 You all you have to do is go look at the permits.
00:28:16.480 She was building an office in her house.
00:28:19.100 It wasn't for escape.
00:28:20.680 It was because she was having clients come to that door and not to the front door.
00:28:26.840 I have two front doors because I'm afraid.
00:28:29.500 Hmm.
00:28:29.980 Not according to the permits.
00:28:32.120 Not according to why.
00:28:33.580 Wait.
00:28:34.040 You put a front door because you were afraid.
00:28:35.940 Why did you put the office there?
00:28:37.920 Why did you put the separate bathroom there?
00:28:39.520 Of all things, would you want random clients coming over your house all the time?
00:28:43.140 Right.
00:28:43.280 Like you'd want to separate that from your business as much as possible.
00:28:45.600 Okay.
00:28:46.100 So I don't know what's true.
00:28:48.360 I don't know what's true.
00:28:49.280 But I certainly do not suppose to believe her.
00:28:54.280 I don't think she's I don't know if she's a victim or not.
00:28:57.620 I don't know.
00:28:58.500 She may be a victim, but he may not be the perpetrator.
00:29:01.740 Right.
00:29:02.120 I mean, the only person who should be calling her a victim is herself.
00:29:07.340 If she let's just say this did happen to her.
00:29:10.200 She can fairly call herself a victim because she knows it happened to her if if it's true.
00:29:14.760 Right.
00:29:15.020 But even if it's true, none of us should be calling her a victim at this point.
00:29:19.800 The victimhood from society identifying you as a victim is when other people have some confirmation of the trial going through.
00:29:27.660 There are enough information to prove it.
00:29:29.480 There is one proven victim between the two of them right now.
00:29:33.700 One proven victim.
00:29:35.240 That is Brett Kavanaugh, who now can't teach up at Harvard because all of the graduate students are saying we're not having a rapist to teach who can't who can't coach his basketball games anymore because they can't have a rapist.
00:29:50.020 He is now a victim.
00:29:51.940 Now, that, of course, assumes that he's innocent.
00:29:54.920 No, no, no.
00:29:55.680 He's had negative consequences.
00:29:56.980 No, he's a victim.
00:29:58.160 Even if he's guilty.
00:29:59.060 He's a victim of mob justice.
00:30:01.640 Right.
00:30:01.860 Because they have not proved the case.
00:30:03.140 Correct.
00:30:03.420 He's a victim of mob justice.
00:30:05.240 I'm not saying that he is he is not rightfully a victim at this point.
00:30:11.740 It sure doesn't look like it.
00:30:13.240 This is mob justice so far.
00:30:15.920 She also is a victim.
00:30:18.160 She's a victim of the Democratic Party and leadership.
00:30:21.560 She has been victimized.
00:30:23.600 She has been dragged out when she said she didn't want to be dragged out.
00:30:27.420 She was lied to when they when the Republican said, we'll go out and meet with you.
00:30:32.800 You don't have to fly.
00:30:33.700 They didn't tell her that she's a victim by the Democratic Party and they don't care.
00:30:40.200 They do not care.
00:30:42.200 That is the real tragedy of this.
00:30:48.700 Two people's lives are forever changed.
00:30:50.120 One.
00:30:51.420 If she is making this up.
00:30:54.740 It's on her.
00:30:55.800 And quite frankly, so is the victimhood of him.
00:31:00.220 It's on her.
00:31:04.300 Look at what's happened.
00:31:05.360 For political purposes only.
00:31:09.860 And if you think the Democrats and I think there are people who vote Democrat, I think there are people who vote for the Republican.
00:31:18.000 I think there are people who vote for Kavanaugh actually feel bad for Christine Ford.
00:31:24.140 I think they actually feel bad for because they don't know she maybe something happened.
00:31:30.220 I was willing to say last week that I think something happened and I don't know.
00:31:36.240 It's becoming less and less clear to me that even something happened because of all of the other testimony now that is coming out that saying, no, I've I was her boyfriend for 10 years, nine years.
00:31:49.160 None of this.
00:31:50.540 None of this happened.
00:31:52.820 I don't know.
00:31:54.880 By the way, I frankly do not believe her by either one of those definitions.
00:31:58.060 I don't suppose it and I don't know it's true.
00:32:00.900 Now, look, I could be wrong on this, but she has not provided like, you know, it can be 100 percent sure that this didn't occur.
00:32:07.460 We weren't there.
00:32:08.080 But that's why we have a system to try to figure out those situations.
00:32:10.780 And we're never there for crimes.
00:32:12.360 OK, we're almost never there for them.
00:32:14.040 So you'll notice that you are presumed innocent.
00:32:16.920 Look up two words, assume and presume.
00:32:18.880 In our standard, you are presumed innocent until proven guilty.
00:32:26.300 You're not assumed you're presumed.
00:32:28.940 What's the difference?
00:32:30.080 Presume is usually used when you suppose something based on probability.
00:32:34.700 So the odds are based on what we know.
00:32:38.060 She doesn't know the house.
00:32:38.940 She doesn't know the time.
00:32:39.680 She doesn't know the month.
00:32:40.440 She doesn't know the year.
00:32:41.720 The people that she said here, they'll testify that they were there.
00:32:45.800 All of them disagree.
00:32:47.020 All of them say, I don't have any recollection of this at all.
00:32:51.120 So she has no witnesses.
00:32:52.900 She the the documents from her doctor conflict with her report.
00:32:58.000 She has changed the time four times in the last four or two months.
00:33:02.740 So and that is sketchy because it appears as though she kept moving it closer because Brett Kavanaugh would have been out of state if it would have been where she originally claimed.
00:33:13.580 So all of those changes.
00:33:16.000 So what's the preponderance of evidence?
00:33:20.040 What are the odds based on the evidence?
00:33:23.220 The odds are he didn't do it.
00:33:25.400 That's the presumption of innocence.
00:33:28.020 What is assumed innocence?
00:33:31.040 Assume is used when you suppose something without any evidence.
00:33:34.860 So no one is asking you to assume he's innocent.
00:33:39.980 We're asking you to look at the facts and then play the odds.
00:33:46.540 Based on those facts, what are the odds that she has it right?
00:33:52.180 So let's stop using the word victim and let's stop saying we're going to believe every victim.
00:34:01.200 We are not.
00:34:02.100 We are going to take seriously the accusation from the accuser.
00:34:10.160 And then we'll look for the facts.
00:34:12.860 All right.
00:34:13.620 Mercury real estate.
00:34:15.720 These are the people are going to sell your house.
00:34:17.260 They're going to sell it on time and for the most amount of money.
00:34:19.780 You need a real estate agent.
00:34:21.100 You're going to buy a house in an area you don't know.
00:34:23.360 Call us.
00:34:24.340 Mercury real estate.
00:34:25.460 Real estate agents.
00:34:26.380 I trust dot com.
00:34:27.820 You call us.
00:34:28.560 We'll give you one of the fifteen hundred agents in your area or the area that you're moving to.
00:34:33.660 And they will be able to help find the right house for you and negotiate the right price.
00:34:38.620 If you're looking to sell your house, we probably have a real estate agent in your in your neighborhood as well that can help sell your house on time and for the most amount of money.
00:34:47.780 These people have been hand ticked handpicked for the team for their knowledge, their skill and their track record.
00:34:53.360 Thousands of families have already put this to the test.
00:34:56.100 It is real estate agents.
00:34:57.460 I trust dot com.
00:34:58.360 Sell your home now fast and for the most amount of money.
00:35:01.560 Real estate agents.
00:35:02.720 I trust dot com.
00:35:03.780 All right, let's go to let's go to Randy.
00:35:12.780 Hello, Randy.
00:35:13.320 You were at the Trump rally last night.
00:35:15.060 What was it like?
00:35:16.240 I was.
00:35:17.440 It was it was great to be there and see him in person.
00:35:20.360 But what I found most interesting there is that it's a pretty large crowd.
00:35:24.920 It looks like it was at capacity, probably 10,000 people or so.
00:35:28.340 But half the crowd were women.
00:35:30.200 Oh, my gosh.
00:35:31.040 So this this thought that the women aren't going to support him, you know, I find that hard to believe.
00:35:36.640 Well, I'll say this, Randy.
00:35:37.240 Those are not the women we need to believe.
00:35:39.060 Yes, those are not the other women.
00:35:40.940 Well, it's in Mississippi.
00:35:42.080 So, you know, the men told them to go.
00:35:44.120 Oh, OK.
00:35:44.600 You know, they don't have minds of their own.
00:35:47.640 So that doesn't work that way at my house.
00:35:50.440 Doesn't work in anybody's house.
00:35:53.280 Randy, thanks a lot, man.
00:35:54.300 I appreciate it.
00:35:54.760 Hang on just a second.
00:35:55.500 I got to get him a book.
00:35:56.220 Oh, yeah, you get a book.
00:35:57.100 You want a book or an audio?
00:35:59.220 A book will be great.
00:36:00.340 A book will be great.
00:36:01.120 Great.
00:36:01.480 All right.
00:36:01.820 Thanks, Randy.
00:36:02.280 Hang on just a second.
00:36:02.900 Do you want Glenn's book or just another book randomly that we pull from the...
00:36:05.580 Let's just...
00:36:06.040 We'll pull random books from other authors.
00:36:07.940 How about Mark Levin's book?
00:36:08.920 Do you want Mark Levin's book?
00:36:11.320 OK.
00:36:11.820 Is that...
00:36:12.140 How long does this go on?
00:36:13.100 If you call in to 888-727-BECK, you get a point that's good enough to get you on the air.
00:36:17.020 You get a free book.
00:36:17.720 I don't know.
00:36:17.980 Your choice, audio or regular?
00:36:19.480 I don't know.
00:36:20.620 Until I run that.
00:36:21.040 Have you cleared this with anyone?
00:36:22.120 No.
00:36:22.580 Good.
00:36:22.800 Good.
00:36:23.120 That's the way it's supposed to work.
00:36:24.480 That's the way it works around here.
00:36:26.600 May have been a very bad idea.
00:36:28.480 Oh, crap.
00:36:29.080 Tomorrow.
00:36:30.300 Tomorrow we're talking about the book.
00:36:31.680 Oh, well, they'll already have the book.
00:36:33.180 Maybe.
00:36:34.420 Because tomorrow I want to take calls from people who have actually read the book.
00:36:38.760 And I want to get into things you disagree with, things you agree with, things you want
00:36:42.600 to know more about.
00:36:43.580 We're going to go through some of the bigger points in the book tomorrow.
00:36:50.200 And if you've read it, I want you to call in on tomorrow's broadcast.
00:36:54.520 Glenn Beck.
00:36:56.200 Mercury.
00:36:57.880 Glenn Beck is coming live to talk about the right path forward and to make fun of the people
00:37:02.340 standing in the way.
00:37:03.460 He might not be able to save the country, but at least we can all go down laughing.
00:37:07.060 Glenn Beck Live.
00:37:08.200 The Addicted to Outrage Tour.
00:37:10.080 On tour this fall.
00:37:11.240 Glenn Beck.
00:37:14.440 All right.
00:37:14.680 If you've already eaten today, you might want to turn the radio down for a minute because
00:37:18.240 this may make you hurl.
00:37:21.620 Keep in mind, Republicans control all three branches of government.
00:37:25.660 Republicans control all three branches of government right now.
00:37:28.820 So here we go.
00:37:30.140 In October, which means the fiscal year has officially ended in October.
00:37:36.100 That's right now.
00:37:38.020 Our debt numbers are in.
00:37:39.720 You ready?
00:37:42.240 Our federal debt has increased by over one point two trillion dollars.
00:37:48.600 Our total federal debt is now over twenty one trillion dollars and rising.
00:37:54.440 I know a lot of people used to care about the debt when it was Obama.
00:37:58.980 I still care about the debt.
00:38:01.900 Total federal debt is now twenty one trillion dollars.
00:38:05.380 It's rising.
00:38:06.500 One point two trillion dollar increase is the sixth largest debt.
00:38:11.280 Oh, well, it's the sixth largest debt increase.
00:38:13.000 It's not even in the top in the entire history of the United States.
00:38:17.440 This debt increase, again, under Republican control is larger than many of the years under the Obama administration.
00:38:25.860 It is nearly the same as during 2011 and 2012.
00:38:30.440 It is larger than 13, 14 and 15.
00:38:34.020 Every single worker in the United States, roughly just over one hundred and fifty five million.
00:38:40.540 If all of us who are getting up to go to work donated one hundred and thirty thousand dollars.
00:38:48.680 It still wouldn't pay off the debt.
00:38:51.080 But even if we did, even if it did play it, pay it down at this rate, we'll be back where we started.
00:38:57.880 In two years.
00:39:01.540 There are only two parties now, apparently, in America because it's binary.
00:39:06.580 I suggest we look elsewhere if you care about the debt.
00:39:11.920 No one is standing for fiscal responsibility.
00:39:15.000 How long are we going to vote for people that clearly do not care about record amounts of debt?
00:39:22.080 We are showing them every day that they can continue to behave in this reckless matter without any consequences whatsoever.
00:39:31.880 Democrats have moved the Overton window so far to the left that the GOP, the former party of fiscal accountability or claimed to be,
00:39:41.320 see no issue at all with an annual trillion plus dollar debt.
00:39:46.120 That's now standard fare.
00:39:47.780 Democrats have slid so far to the left.
00:39:53.720 Now they're talking about things that will cost in their own numbers, 40 trillion dollars in 10 years.
00:40:01.900 Republicans, likewise, refuse to do anything about it.
00:40:05.160 A debt bubble explosion is prime.
00:40:08.940 It is not a question of when this explosion happens.
00:40:15.740 It's going to happen.
00:40:17.780 The only thing we have to concern ourself with is when and if we're going to do anything about it.
00:40:30.620 It's Wednesday, October 3rd.
00:40:33.500 You're listening to the Glenn Beck Program.
00:40:35.300 We are entering a new time and everything has everything's being redesigned right now and people aren't really talking about the issues.
00:40:43.420 People aren't really talking about big fundamental things that are changing.
00:40:47.000 For instance, America was based on life, liberty and the pursuit of happiness.
00:40:51.860 Nobody's talking about pursuit of happiness right now.
00:40:53.880 Pursuit of happiness is defined by our founders as ownership that you could own.
00:40:58.080 You could you could forge your own way in in life and ownership is a big part of capitalism in a big part of America.
00:41:07.080 However, ownership is quickly going away.
00:41:10.840 When you buy a book on Kindle, do you own the book?
00:41:15.560 When you buy a movie from iTunes, do you own the movie?
00:41:19.540 The answer is no.
00:41:26.100 The end of ownership.
00:41:29.020 Aaron, and I want to get this right.
00:41:31.080 Paris.
00:41:33.240 Say it for me.
00:41:33.900 Just ask him.
00:41:34.520 Just tell me how you say his name.
00:41:37.060 It's Perzenoski.
00:41:38.440 Perzenoski.
00:41:38.940 OK.
00:41:39.360 How it was a lot easier than it's than it looks.
00:41:41.640 We can't pronounce easy words.
00:41:42.680 Yeah, that was going to be.
00:41:43.580 Yeah, this is got more than one syllable.
00:41:45.860 There's a lot of consonants there.
00:41:47.160 How are you doing, Aaron?
00:41:49.580 I'm doing well.
00:41:50.240 How are you?
00:41:50.720 Good.
00:41:51.020 I'm I'm I'm I'm really fascinated by how we make the turns in our society for the future.
00:42:00.200 And ownership is a big part of this, because in the future, I don't know how many people will even own cars.
00:42:06.100 I mean, it's just it's just all changing.
00:42:08.580 But do we really own things when we buy them online?
00:42:12.700 So I think there's a real concern here that consumers go into transactions when they're buying things, digital goods, especially digital books, movies, music.
00:42:25.280 They go into those transactions assuming they work the same way they do in the in the world of tangible goods, where if you buy a book, you can give it away to a friend.
00:42:35.320 You can lend it to someone.
00:42:36.480 You can leave it in your will in the future and leave your book collection to your loved ones.
00:42:43.580 And the rules that control these digital transactions when you buy something on your Kindle or from iTunes are very different from the rules that we expect in the physical world.
00:42:55.860 And consumers don't really understand that distinction.
00:42:59.940 And I think that causes a real disconnect between what we all expect to happen and what happens in fact.
00:43:06.700 So to give you a quick example, just a couple of weeks ago, a consumer, a customer of the Apple iTunes movie store found that three movies that he had purchased had been deleted from his account.
00:43:26.300 They were no longer accessible.
00:43:28.500 And I think that shocked a lot of people.
00:43:30.060 Those of us who have been following these issues closely for years would remember 10 years ago when Amazon remotely deleted a book off of people's Kindles, including, ironically, George Orwell's 1984.
00:43:45.060 So these issues have been happening for a long time.
00:43:48.260 But I think people are now starting to really sit up and take notice.
00:43:52.220 OK, so I remember because this it's easier for me to read everything on Kindle, but and I have a large collection in my library of hardcover books and I read so much.
00:44:06.980 I read it all on Kindle, but I have recently really been concerned, not just because I don't actually own it and I can't have it in my library and I can't pass it on, but also because you watch things like it happening in China.
00:44:19.100 If you're in China, if you're in China, I mean, at first they wouldn't sell the book, but if they did sell the book, the government can just deem that that book is you don't need to burn books.
00:44:28.380 You could just overnight just take all of that, every copy of that book out of circulation if it's only digital.
00:44:35.420 That's really disturbing to me.
00:44:39.180 I think it's a real concern.
00:44:41.460 It's a concern from the perspective of censorship, as you've just described it.
00:44:47.120 It's also a real concern from the perspective of preservation and sort of archiving our cultural history.
00:44:55.620 If these books are stored on the centralized servers and only the hands of the two or three companies that dominate these markets, then there's a real risk that we aren't going to be able to ensure kind of the widespread distribution of copies that will allow us to archive and preserve these works.
00:45:23.180 And Aaron, with the movie, it wasn't because they found it objectionable or anything else.
00:45:28.720 It's because that particular provider, they lost the rights to that movie, right?
00:45:34.440 And so they had to pull it from people's libraries because their rights had expired.
00:45:39.940 So there are a number of ways that this can happen.
00:45:43.460 This most recent example, I don't know that the facts are totally clear on exactly what went on.
00:45:48.560 So one way this can happen is that, as you described, the deal between the digital retailer, Apple or Amazon, and the copyright holder expires.
00:45:59.360 They no longer have the rights to sell that product.
00:46:02.320 But it can also happen when a record label or a movie studio decides that they want to put out the new, updated, remastered, director's cut edition of a movie.
00:46:14.480 And when they do that, they pull the old version to help drive the sale of the new.
00:46:18.700 Oh, my gosh.
00:46:20.760 So they almost force you to.
00:46:22.700 I mean, because they've always done this where, you know, it's the masterpiece collection and it's, you know, additional footage and, you know, fully restored.
00:46:30.760 But you still had the old copy.
00:46:33.660 Now, that's right.
00:46:34.860 You can't.
00:46:35.320 I mean, even I mean, think of this.
00:46:37.300 Even just for comparison, you can't if they change something in a movie.
00:46:41.620 Imagine when remember when George Lucas changed Star Wars.
00:46:44.660 Well, I want to see what it was like when it originally came out.
00:46:49.120 You wouldn't be able to do that, would you?
00:46:50.620 Unless the movie company decided to allow you to do that.
00:46:54.920 That's right.
00:46:55.500 I mean, and the problem in this most recent case, in part, was that the consumer didn't have a local copy stored on their computer or their device.
00:47:04.900 And this is just a practical tip for people.
00:47:07.220 You should always try to store as much as you can locally.
00:47:10.220 Now, these services are often trying to encourage consumers to rely on their own, on the company's own sort of cloud storage solution.
00:47:22.580 And sometimes with the Apple TV, for example, the Apple TV doesn't allow you to permanently download a copy of a movie.
00:47:31.760 You have to access it through their cloud servers.
00:47:34.220 So, I think that makes a big difference in your relationship with those goods.
00:47:41.080 If I downloaded something on Kindle, could I download it to another cloud and still be able to read it on Kindle?
00:47:48.560 So, the Kindle allows you to store those files locally on your own device.
00:47:59.840 But because the Kindle is tethered through software and network connections to Amazon, Amazon has the ability, as they showed 10 years ago, to remove those files from your device.
00:48:12.840 It's unbelievable.
00:48:13.560 Real quick, Apple has the same sort of control.
00:48:19.020 We saw this several years ago, too, in a very different way.
00:48:21.980 I'm sure some of your listeners may remember when they woke up and found a U2 album on their iPhone.
00:48:28.080 Yes.
00:48:29.080 They put it the other way.
00:48:30.340 They forced everybody to have it.
00:48:32.840 Exactly.
00:48:34.260 That's bizarre.
00:48:35.120 You write about this a little bit.
00:48:36.300 And it's an interesting change in the way we think about commerce.
00:48:40.480 In the past, you had a transaction where you'd go into a store and you'd buy something.
00:48:46.180 With these digital purchases that we're making from iTunes or Amazon, we're actually entering an ongoing relationship with them.
00:48:54.920 It's sort of an open-ended thing where they're constantly knowing what you do with that product.
00:48:59.840 And you have that ongoing relationship where they can cancel that at any time without your knowledge.
00:49:05.800 Can you talk a little bit about the change there?
00:49:08.180 Because that's a real change I don't think people have considered.
00:49:10.900 And Aaron, before you answer that, we're going to take a quick break and then we'll come back and get you to answer that question.
00:49:15.600 And just the change in capitalism.
00:49:19.040 Change.
00:49:19.420 What does it mean to enter a world where there's really no ownership of anything?
00:49:26.860 For four years now, Relief Factor has been helping my team here in the studio alleviate pain.
00:49:33.960 Late last year, oh, it's Goldline?
00:49:37.560 Well, I want to tell you about Relief Factor anyway.
00:49:39.680 Relief Factor.
00:49:40.440 Take it.
00:49:40.880 It's really good.
00:49:42.520 You can buy Relief Factor with gold.
00:49:44.520 So that's one thing you can do.
00:49:45.640 I don't think you can.
00:49:46.720 Okay.
00:49:47.000 So I want to tell you a little bit about Goldline.
00:49:48.420 Goldline, the new silver maple flex allows you to break off smaller pieces for barter and trade of of silver.
00:49:57.820 There's a really I mean, we just talked about the debt.
00:50:00.960 I don't know what's going to happen to the dollar.
00:50:02.660 Nobody does.
00:50:03.460 Nobody does.
00:50:04.140 Nobody knows what's going to happen.
00:50:05.260 Bitcoin.
00:50:05.840 What's it going to mean there?
00:50:07.380 If there ever is a time of of catastrophic change, there's going to be a time when we're all going to have to kind of work together and figure things out.
00:50:15.860 Because the things aren't going to work the same with with dollars.
00:50:20.680 May I suggest that the world always returns to gold?
00:50:23.440 And for barter, they have the maple flex coin, which is this this bar of of silver that is about the size of a credit card that you can carry it around.
00:50:33.900 It has a maximum flexibility.
00:50:35.900 You just break off pieces and you can have, you know, one tenth, one quarter of an ounce of silver, but they also do it in gold.
00:50:43.860 And it's all made by the Royal Canadian Mint.
00:50:46.340 The only people that carry this is Goldline.
00:50:50.120 Just look at the numbers.
00:50:52.600 Find out for yourself.
00:50:53.660 Is the world hurling towards fiscal sanity or insanity?
00:51:01.480 As soon as the stars start rolling the other direction, I'll stop talking about this, but I don't trust that we have anything that's going to going to bring us back into sanity.
00:51:13.300 Other than some sort of catastrophic event.
00:51:16.860 Goldline.
00:51:17.480 Call them now.
00:51:18.280 Find out if it's right for you.
00:51:19.240 It's not right for everybody.
00:51:20.360 Do your own homework.
00:51:21.500 Don't even take it from me.
00:51:22.580 866-GOLDLINE.
00:51:23.740 1-866-GOLDLINE.
00:51:25.480 Or goldline.com.
00:51:30.120 Glenn Beck.
00:51:31.140 Talking to Aaron Paranofsky.
00:51:32.740 He is a professor of law.
00:51:37.160 And also, you can find him at theendofownership.com.
00:51:41.640 Aaron, you're right.
00:51:42.800 The switch to the digital platform offers convenience, but also makes consumer access more contingent.
00:51:47.740 Unlike a purchase at a bookstore, a digital media transaction is continuous, linking buyer and seller and giving the seller a post-transaction power impossible in physical markets.
00:51:57.160 Why is that important?
00:51:58.920 So, I think this is important for a number of reasons.
00:52:03.400 It leads to these scenarios that we were talking about earlier where the seller of the good has the ability not only to sort of reclaim or recall the good, but they also have some ability to control how and when and under what circumstances you make use of that product after the sale.
00:52:23.980 So, that's just not something that you could do in the tangible world, right?
00:52:28.840 Your local bookstore, put aside the publisher, your local bookstore can't tell you what country you're allowed to read a book in.
00:52:36.900 They can't tell you, you know, how many times you get to read it.
00:52:41.320 They can't tell you who you get to lend that book to.
00:52:44.720 And they certainly can't keep records of all of those interactions.
00:52:47.800 And the digital world allows for that form of control.
00:52:54.680 And importantly, it's not limited just to digital media.
00:52:59.380 We have all these smart devices in our homes, on our bodies.
00:53:05.120 You know, we've got our voice assistants and our fitness trackers and, you know, even our home appliances and cars.
00:53:13.720 They all have software.
00:53:15.700 They all have network connections.
00:53:17.800 And all of these sort of problems that I've been describing are going to play out in that space as well, where device makers are not only going to be able to track your behavior, but they're also going to be able to limit the ways in which you can use the products that you think you have purchased.
00:53:37.880 So, let me interrupt here and just ask you this.
00:53:41.860 I see when I go to iTunes, I see a movie I want to watch.
00:53:45.360 It says rent or own.
00:53:49.040 I'm not owning it.
00:53:50.400 I'm just renting it in a different way.
00:53:53.540 Isn't this false advertising?
00:53:54.940 So, I think there's a really good case to be made here that companies like Amazon and Apple that use language like own and buy, words that have real meaning for people in their everyday lives, are misstating the nature of those transactions.
00:54:13.760 So, my co-author, Chris Hufnagel, and I wrote a paper a few years ago, a couple years ago now, called What We Buy When We Buy Now, that did a survey of about 1,500 consumers to figure out what people think this language means.
00:54:32.100 And it turns out that a significant percentage of consumers incorrectly believe that they do have true ownership rights and they get to keep these goods, that they can lend them, that they can give them away.
00:54:46.960 And we think that there is an opportunity here to correct this misinformation in the marketplace.
00:54:54.500 But think about the company that we're talking about.
00:54:56.720 Apple and Amazon are two of the biggest corporations the world has ever seen.
00:55:02.100 And getting them to, convincing them to communicate in a more clear and fair way is a real challenge.
00:55:15.500 Class action lawsuit?
00:55:18.920 So, I think there is a possibility for class action litigation here.
00:55:24.260 There are a bunch of legal and practical hurdles to making that happen.
00:55:30.600 And I think it's something worth pursuing.
00:55:32.160 I think the Federal Trade Commission has a role to play here.
00:55:36.740 This is squarely within their area of expertise and obligation to police the market to make sure that consumers have accurate information.
00:55:51.640 Aaron, go ahead.
00:55:53.180 Yeah.
00:55:53.840 I just want to go ahead.
00:55:55.240 The way the market works depends on consumers being informed.
00:56:00.900 People can't make rational choices.
00:56:02.980 People can't decide where to spend their money if they're being misled about the products that they're getting.
00:56:08.120 So, I think that it's crucial for the functioning of the market to have that information be correct.
00:56:14.160 Have you done any look into what a society without real ownership?
00:56:21.900 I mean, we're down to, you know, renting clothes and everything else.
00:56:24.980 And that's only going to get stronger as we move forward.
00:56:28.360 Have you looked into what that means for a capitalist society and for America in particular that has always been about ownership?
00:56:36.580 So, my biggest concern here is the way this changes kind of our conception of ourselves and the way we think about ourselves as individuals in a society.
00:56:52.400 Okay, so stop there for a second.
00:56:54.140 If I can hold you for just a couple more minutes after the break, I'd like you to finish that thought.
00:57:00.040 Because I think this is important.
00:57:02.420 The world is being redesigned and it's being redesigned without any of us really understanding it.
00:57:07.820 And we should go in open-eyed.
00:57:10.280 We are a country that is founded on basic individual rights.
00:57:20.520 And some of those rights, property rights, you have a right to own things, right to ownership.
00:57:26.440 You know, possession is nine-tenths of the law.
00:57:28.440 Well, not in the future.
00:57:30.000 In fact, in many cases, not even now.
00:57:32.580 You buy things online.
00:57:35.260 Sometimes you're not actually buying them.
00:57:37.660 You're just renting them.
00:57:38.760 You're entering an ongoing relationship.
00:57:41.040 What does this mean for society?
00:57:45.360 How is this going to change us?
00:57:47.340 Will it even change the way we view things and change some fundamental concepts of what it means here in America of individual rights?
00:57:58.380 We have Aaron Paraznowski on with us, professor of law and the author of the book, What We Buy, When We Buy Now.
00:58:08.220 And you can find more information at TheEndOfOwnership.com.
00:58:11.700 Aaron, so tell me how you've been looking at this.
00:58:16.260 So I think in the short term, what we're likely to see are more changes in the way our commercial interactions occur.
00:58:24.500 In the way that commercial transactions are structured, we're going to start to see people become more and more accustomed to paying for temporary access to resources rather than owning them.
00:58:38.660 And in some ways, I think that makes some degree of sense.
00:58:41.900 There are some people for whom owning a car isn't necessary.
00:58:44.920 They'd rather be able to take a lift or use some sort of car share application.
00:58:49.600 And I think that makes a lot of sense.
00:58:51.560 What I'm worried about is the long term set of implications for a shift away from ownership and towards temporary access, a shift away from independent control of resources to one where we have to rely on permission or the sort of goodwill of the companies that control access.
00:59:17.200 So maybe may I give you an example and see if I'm on the right track.
00:59:22.760 I buy a car and I love this car and I want to keep it.
00:59:26.520 And it's a classic car, but I don't own the software that runs the car.
00:59:31.480 And if at any time the software company says, no, I'm not going to we're not going to support that or we want to discontinue or whatever, I don't I have a heap of junk.
00:59:43.120 I can't do anything with it because I don't own the software that runs it.
00:59:47.920 I think the car is a great example.
00:59:49.900 We see this issue come up in the motor vehicle context, but the way it's come up most most recently and most often is actually not with cars, but with tractors.
01:00:02.100 John Deere, the long running American farm equipment company, makes exactly this argument that they own the software in the tractors that they sell to American farmers.
01:00:15.200 And that means that farmers can only get their tractors repaired by authorized John Deere dealers.
01:00:23.320 They can't do it themselves.
01:00:24.920 They can't go to their local mom and pop, you know, farm repair shop.
01:00:29.600 I think those kinds of changes are really troubling because they go to this sense of independence and this sense of autonomy that we're all independent actors in the world who can make our own decisions, who can decide what's best for us.
01:00:47.020 Do we want to keep this tractor as it is?
01:00:49.280 Do we want to modify it?
01:00:50.420 Do we want to repair it?
01:00:51.440 Those decisions are being taken away from individual consumers and you're being forced to play by a set of rules dictated by the companies who, quote unquote, sell you these products.
01:01:03.600 And doesn't that also doesn't also stop innovation?
01:01:06.740 I mean, sometimes the guy who takes something and then tinkers with it comes up with a better system.
01:01:12.920 But if I'm if I'm locked out of tinkering on my own property, it's it almost creates this this feeling of, oh, well, that's just the way it is.
01:01:25.100 And that's the way it always is going to be.
01:01:26.420 It just runs that way.
01:01:27.560 And it stops innovation, doesn't it?
01:01:30.560 I think it has the real risk of doing that.
01:01:33.080 It discourages people from being creative.
01:01:36.840 It discourages people from from, as you say, tinkering with the things that they own.
01:01:42.300 We have a lot of incredible innovations that have been made over the centuries in this country that didn't come from giant corporate R&D departments.
01:01:51.940 They came from individuals messing around with things that they own in their garage.
01:01:56.440 And there is a risk that we're foreclosing those kinds of opportunities.
01:01:59.900 But even even more broadly than that, if we're discouraged from thinking of ourselves as independent actors in the world, you know, I worry that that creates a sort of complacency in in our in our population, in our country.
01:02:15.520 And, you know, not not not to zoom out to too wide of a level here, but for a democracy to function, people have to feel and they have to be in charge of their own lives.
01:02:27.100 They have to be invested in making informed decisions.
01:02:30.760 And I worry that that, you know, this this lack of control over the everyday decisions might play into a much broader set of problems when it comes to people feeling like active participants in society and democracy.
01:02:47.040 I couldn't agree with you. I couldn't agree with you. I couldn't agree with you more.
01:02:51.760 I I just don't think this is the way society is thinking anymore.
01:02:55.880 Everything is about the collective and very little is about the individual.
01:03:00.480 And, you know, I think you understood you understated the case of tinkerers.
01:03:05.320 I mean, if you look at the inventions in America, a lot of them, a lot of our progress came from what used to be called tinkerers, people who just did things in their own garage.
01:03:17.100 And and and now, whether it's the government or these corporations, everyone is being told that's the way it is.
01:03:25.040 Sit down. Shut up. You can't do anything about it.
01:03:27.920 And I think that's extraordinary. I mean, that that that, you know, in in decades, that's what created China in many ways.
01:03:35.720 That's what they don't think of things the same way that we do.
01:03:40.260 They don't have that that spirit of invention that America has always been known for.
01:03:48.220 So I agree with you that the history of innovation in this country has benefited greatly from individual creators.
01:04:01.140 And we we need to keep an environment in which people have that ability to experiment, to innovate and ultimately to share that progress with with the rest of the country and the rest of the world.
01:04:16.360 So I worry that we're moving in a direction where people aren't able to build those skill sets because they live in a world of sort of locked down digital devices.
01:04:31.260 So let me ask you one more question and I'll let you go. I know you've spent twice the amount of time here that you you probably planned on.
01:04:37.340 But let me ask you one more question. I am really concerned about copyrights, patents, trademarks.
01:04:43.800 We seem to be entering a world where people don't take somebody's intellectual property seriously on the on the other side of this.
01:04:54.260 They just feel that, well, I can download it. I can just take it.
01:04:59.180 And we shouldn't have intellectual property rights.
01:05:02.400 That that that is frightening, because, again, that was the second piece of the American experiment was you have a right to that intellectual property for a period of time so you can make money on it, which encourages other people to come up with their own ideas.
01:05:19.380 Do you see this? Do you see this? Do you see this fading and is this trouble on the horizon as well?
01:05:26.960 So I write and teach about intellectual property and it's something that I take very seriously.
01:05:32.920 And one of the things that I always try to communicate to my students is that intellectual property system functions best when there is a balance between the interests of the public and the interests of creators.
01:05:49.380 And the history of intellectual property, copyright in particular, is a history of a struggle to find and maintain that appropriate balance.
01:05:58.640 And I think we're going through and have been going through kind of since the widespread adoption of the Internet, a period where we're struggling with how to answer some of those questions.
01:06:12.260 There are certainly areas in which copyright holders have legitimate concerns about their works being exploited without compensation.
01:06:24.500 And on the other hand, we live in a culture in which copyrighted works are sort of increasingly being distributed within these environments like Apple and Amazon, for example, where consumers can't do the things that they think they're entitled or should be entitled to do with them.
01:06:47.900 So I think part of the solution here is providing consumers a strong incentive to pay for these works.
01:06:58.240 That's one of the things that streaming services, I think, have gotten right, which is that they offer a really attractive deal to consumers.
01:07:06.760 So people learn that if they're going to access the world's library of music, they have to pay for the privilege of doing that.
01:07:14.900 But figuring out how that money gets distributed and what the right price point is, I think, is one of the sticking points.
01:07:20.800 So it's an important set of questions and one that I probably can't do justice to.
01:07:27.140 You know, with a couple of minutes.
01:07:29.380 Exactly.
01:07:30.200 All right. Aaron, thank you so much.
01:07:31.600 I appreciate it.
01:07:32.360 And I appreciate your thoughtfulness on this.
01:07:36.180 And we'll keep watching for updates.
01:07:39.200 Thank you so much.
01:07:40.560 Aaron Paranowski.
01:07:41.420 I appreciate it.
01:07:41.780 You bet he is.
01:07:43.100 He's found at the website, the end of ownership dot com.
01:07:49.120 I had him on for a couple of reasons.
01:07:51.060 One, you have to know the books that you buy, the videos that you buy, everything you buy says rent or buy.
01:07:58.160 When you buy, you're not really buying it.
01:08:00.540 You don't own it.
01:08:01.780 And in a world where opinions and thoughts and ideas are under siege, you don't have to burn books anymore.
01:08:16.060 All you have to do is get one of the providers or a couple of the providers just to delete them from everyone's library.
01:08:22.980 And they are gone forever.
01:08:28.000 Think about that.
01:08:30.260 Next time you want to download something, what are the books you want to download and what are the books you want to own?
01:08:38.440 American financing, by the way.
01:08:40.160 So, you know, I don't get any more money if I sell a book or I sell a digital download.
01:08:44.640 It's not about that at all.
01:08:46.080 So it is about preserving information.
01:08:50.940 American financing, American financing is there.
01:08:54.280 If you want to buy a house, now might be the time to buy a house.
01:08:57.160 It is a buyer's market right now.
01:09:00.100 The.
01:09:01.600 The trends on homes has gone down here in the last couple of months.
01:09:06.520 And so now you probably have a better chance of negotiating because everybody really wants to, you know, sell their home before the holidays and close before the holidays.
01:09:16.220 So if you are looking for a home, you need to be qualified.
01:09:19.480 American financing gets qualified in just a couple of minutes.
01:09:22.020 Just go to American financing dot net or you call one of the operators at eight hundred nine zero six twenty four forty.
01:09:27.740 And they'll put you in touch with somebody that can help you.
01:09:30.400 Eight hundred nine zero six twenty four forty.
01:09:33.720 They will help you get a loan or refinance.
01:09:38.000 If you if you are looking to consolidate all of your loans and refinance under your mortgage,
01:09:44.160 you don't have to add extra years or anything like that.
01:09:47.100 If you're looking to get out of a variable loan and get into a fixed mortgage, please do that.
01:09:53.660 American financing dot net can help you.
01:09:55.620 Now, these are salary based mortgage consultants.
01:09:58.240 They don't work on commission.
01:09:59.540 So they are looking for you.
01:10:01.820 They listen to you.
01:10:03.260 What do you want?
01:10:04.580 How much do you want to pay?
01:10:06.440 You know, what can you afford?
01:10:07.700 What should you do for down payments, et cetera, et cetera?
01:10:10.880 They have an A plus rating from the Better Business Bureau and they have all the bells and whistles, but they do not work for a bank.
01:10:19.600 That's critical.
01:10:20.900 They're independent and they work for you.
01:10:23.440 American financing dot net.
01:10:25.160 Go to American financing dot net or call eight hundred nine zero six twenty four forty eight hundred nine zero six twenty four forty.
01:10:32.000 American financing dot net American financing corporation and MLS one eight two three three four WWW dot NMLS consumer access dot org.
01:10:43.580 Glad you're here.
01:10:45.300 There is a new documentary.
01:10:46.980 It's called The Creepy Line.
01:10:48.520 And it's all about how Google and Facebook are shaping people and shaping their points of view and and steering you to places.
01:11:00.500 It is a creepy line.
01:11:04.080 We.
01:11:06.140 We have the author of this coming up in just a second, and it is it's staggering.
01:11:11.640 And and and I don't think people what people are looking at are de-platforming and things like that.
01:11:19.420 They are not thinking about the subtle moves.
01:11:23.780 You know, if I controlled the information you had and I controlled what you saw and read first and you had to really dig down to find other things.
01:11:34.440 I could shape your worldview.
01:11:39.100 What's truly frightening is the idea that if a government ever decided to get involved, you could shape.
01:11:49.840 An individual's worldview.
01:11:53.200 You could breed killers.
01:11:57.520 Over time, if I just keep pointing you and directing you to things that are pissing you off.
01:12:04.860 And I keep pointing you to things that are showing you you're the victim of this particular person.
01:12:10.800 And I know who you are.
01:12:12.660 I know you're already an unstable person because I have your whole life in front of me.
01:12:17.920 I see what you're doing anyway.
01:12:19.920 I can go into the public and I can select the unstable and I can wind them up.
01:12:27.540 Now, I am not saying that Google or Facebook is doing that.
01:12:31.160 I don't even do not connect this to them, but that is what they have the ability to do, as well as the governments of the world have the ability to do that.
01:12:44.560 What they are doing is they are shaping us by putting through their algorithms, putting information in front of us that they prefer.
01:12:57.700 Their algorithms are not transparent.
01:13:03.100 I believe these algorithms should be 100% transparent and you should know about it.
01:13:09.820 You should be able to have control of your own algorithm.
01:13:13.080 Imagine if you had control of your own algorithm and you put it at the settings that they have with the default.
01:13:20.260 Now, tilt it to the right.
01:13:23.420 Google search.
01:13:24.720 See what happens.
01:13:26.040 Tilt it to the left.
01:13:27.340 See what happens.
01:13:28.160 Can you imagine if you had the ability to compare, contrast and compare based on what you felt?
01:13:38.240 I wonder what you'd find.
01:13:40.560 The creepy line.
01:13:43.660 In just a second.
01:13:44.520 Also, Keith Ellison, the evidence against him versus the evidence against Kavanaugh.
01:13:52.380 Glenn Beck.
01:13:54.020 Mercury.
01:13:54.660 Glenn Beck is coming live to talk about the right path forward and to make fun of the people standing in the way.
01:14:01.980 He might not be able to save the country, but at least we can all go down laughing.
01:14:05.580 Glenn Beck Live.
01:14:06.720 The Addicted to Outrage Tour.
01:14:08.600 On tour this fall.
01:14:11.680 Glenn Beck.
01:14:13.700 All right.
01:14:14.160 I want to tell you a tale of two people.
01:14:15.980 First of all, we have we have the accusations of not the victim, the accuser.
01:14:22.440 You cannot be a victim unless we have proven that there was a crime that happened.
01:14:27.980 So she is the accuser.
01:14:29.140 And here's what the accuser said.
01:14:30.560 Thirty seven years ago, this guy tried to tear my clothes off.
01:14:34.980 He put his hand over my mouth.
01:14:36.780 She described a horrible, horrible scene.
01:14:39.480 Feel bad for if that's what happened.
01:14:42.460 Now, here's the evidence.
01:14:44.500 She named four people, including one of her best friends.
01:14:48.880 All four denied any knowledge and any memory of this happening at all.
01:14:56.240 She also does not know the time, the place or even the year.
01:15:01.540 In fact, the year has changed four times in the last few months.
01:15:07.880 She has no pictures.
01:15:09.660 She has no letters from the time.
01:15:12.240 She has no evidence from the time.
01:15:14.720 Not a single soul will confirm this.
01:15:19.260 The doctor.
01:15:20.580 She went to a doctor.
01:15:21.900 This is years later.
01:15:23.120 Thirty years later.
01:15:24.360 The doctor recorded that she did talk about this.
01:15:28.820 But the time that she talked about it, the number of boys that were involved in this were different.
01:15:35.040 In fact, the number of boys had doubled at that time.
01:15:39.300 There were four, not two.
01:15:40.860 And Kavanaugh was not named.
01:15:44.260 She is also on record against Trump and on the board of directors of a group that was whose mission was to block Trump's Supreme Court nominees.
01:15:58.140 That's the evidence.
01:15:59.560 Fifty seven percent of Democratic women believe her.
01:16:07.360 On what?
01:16:09.740 Now, let me give you the evidence of another accuser.
01:16:13.440 I do not say victim.
01:16:15.280 I say accuser.
01:16:16.880 Someone who has made the accusation that Keith Ellison was physically abusive in a relationship.
01:16:25.900 Not thirty seven years ago, but in twenty sixteen.
01:16:30.640 The son.
01:16:31.640 Here's the evidence.
01:16:32.580 The son of Ellis Ellison's now ex-girlfriend wrote on Facebook in the middle of twenty seventeen.
01:16:37.600 I clicked on a file.
01:16:39.140 I found over one hundred text and Twitter messages and a video almost two minutes long that showed Keith Ellison dragging my mother off the bed by her feet,
01:16:49.180 screaming and calling her an effing B, telling her to get the F out of the house.
01:16:56.960 The messages I found were mixed with him consistently telling my mom he wanted her back.
01:17:02.440 He missed her.
01:17:03.020 He knew that he had screwed up and he wished he could do things differently.
01:17:07.640 He would victim shame, bully her and threaten her if she went public.
01:17:12.180 I texted him and told him, I know what you did to my mother and a few other things.
01:17:18.940 The woman was forced to come out and say, yes, that is true.
01:17:23.440 It's the most difficult form of abuse to articulate.
01:17:26.440 I didn't want it to come out, but this is a slow, insidious form of abuse.
01:17:30.860 You don't realize it is happening until it's too late.
01:17:33.620 The accuser wrote, four people, including my supervisor at the time, stated that I have come to them after and shared the exact story I shared publicly.
01:17:46.300 I shared multiple texts between me and Keith Ellison, where I discussed the abuse with him and much more.
01:17:54.720 She said, I knew I would not be believed.
01:17:58.380 In 2005, Ellison also faced accusations of domestic abuse for making harassing phone calls in which he threatened to, quote, destroy a woman.
01:18:09.340 She threatened to file a restraining order.
01:18:11.940 The woman wrote in an affidavit that she and Ellison had been in a romantic relationship, that she had pushed, shoved.
01:18:19.220 He had pushed, shoved and verbally abused her and had a lawyer intimidate and threaten her.
01:18:24.780 However, this particular woman in Minnesota, the ex-girlfriend from 2016, did go to a doctor.
01:18:33.800 At the time, the doctor has released the notes.
01:18:39.460 All of these claims are consistent with what she told the doctor at the time.
01:18:45.320 The doctor was treating her for abuse.
01:18:48.540 5% of Democratic women believe this.
01:18:57.840 So please, Democrats, do not start with me.
01:19:05.320 Believe the woman.
01:19:07.820 Believe the victim.
01:19:09.620 Because you don't.
01:19:11.340 You don't.
01:19:12.180 You believe the person that will further your political agenda.
01:19:17.000 It is just that simple.
01:19:19.880 Now, I am not saying that the right doesn't do that as well.
01:19:24.860 I'm just saying, as someone who is standing here watching the world go insane, I'm not going to play either side.
01:19:32.800 I'm not going to jump in or off this cliff with the rest of humanity.
01:19:39.980 I do not believe the victim.
01:19:42.700 I will take seriously the account of the accuser.
01:19:49.140 Then I will look at the facts.
01:19:51.640 If there is a preponderance of evidence, then I will presume that person either innocent or guilty.
01:20:00.420 But after I have seen facts, if this kind of stuff was going on, this should all be in the court of law.
01:20:11.360 If you are a victim, society will do nothing for you because we cannot do anything for you if you haven't gone to the police and reported it.
01:20:23.600 If you believe that we live in a rape culture, then you have a responsibility to go to the police and document everything that has been done.
01:20:37.380 Then we as a society need to do everything we can to make sure that justice is served, not on a collective basis, but judging it by the individual case.
01:20:53.040 That is a just society.
01:20:58.240 That is America.
01:21:00.160 It's Wednesday, October 3rd.
01:21:09.140 You're listening to the Glenn Beck program.
01:21:11.640 One of my favorite guys, because he is he does his own homework.
01:21:15.280 He rolls up his sleeves.
01:21:16.560 He looks and he tells the truth as he finds it.
01:21:19.480 Peter Schweitzer is here.
01:21:20.780 He's the president of Government Accountability Institute and a producer of a new documentary that's out called The Creepy Line.
01:21:27.520 And that is exactly the right name for it.
01:21:31.600 It is.
01:21:32.400 And it actually the creepy line comes from a speech that Eric Schmidt, the CEO of Google, gave.
01:21:37.940 It was an interview, in fact, where he was asked, how do you make these ethical judgments about how far you're going to go?
01:21:43.600 And the interviewer actually asks Schmidt, are you going to implant things in our brain?
01:21:48.840 And Eric Schmidt's response was, well, we like to go right up to the creepy line, but not cross it.
01:21:55.260 And he said, we're not going to implant anything in your brain, at least not yet.
01:21:59.400 Those are actually Eric Schmidt's word.
01:22:02.000 And he's he's I find him incredibly frank.
01:22:05.600 Yes.
01:22:05.920 He he just he says it like it is.
01:22:08.520 Yes.
01:22:08.780 It's it's I've interviewed him a couple of times and it is fascinating.
01:22:13.400 Yes, because he's just telling you he doesn't sugarcoat it.
01:22:17.360 And I think it's his background as an engineer and and he's sort of very direct.
01:22:22.560 I mean, one of the other things we quote him in the film is saying is that Google has and takes very seriously its responsibility to change the values of American people.
01:22:33.220 You know, Google's mantra has always been they are more than just a company to make money.
01:22:39.440 They have a certain ethos, a certain worldview.
01:22:42.320 And part of the reason that they structured the company the way they did, in which the founders always have controlling shares, is that that sense of social mission is part of it.
01:22:51.780 And Schmidt has been always very direct about saying it.
01:22:54.080 Yes.
01:22:54.300 Part of our mission as a company has been to try to shape and change the values of the United States.
01:23:00.240 And that's sort of one of the premises of this film, that it's not just about privacy.
01:23:04.780 It's not that there's taking all this information.
01:23:07.540 Glenn, they're using that information against us to try to nudge us or to move us into directions that we wouldn't ordinarily want to go.
01:23:15.300 OK, so so let's can you tie this all to Kavanaugh and what we've seen with the Kavanaugh case and how, for instance, you know, there's there's there is this overwhelming understanding from half the country that he is absolutely guilty and she is a victim.
01:23:39.060 Right. And there's a lot of information on the other side.
01:23:43.780 In fact, more information on the other side.
01:23:45.440 But you're not really seeing that.
01:23:46.980 Right. Yeah.
01:23:47.840 It's very hard because this is happening in real time right now to sort of monitor what's Google doing.
01:23:53.900 But we can look at the past.
01:23:55.820 In fact, one of the things we feature in the film is a study done by Robert Epstein.
01:24:00.700 Epstein's a very interesting guy.
01:24:02.320 He's a Harvard Ph.D. in psychology studied under BF Skinner, was a former editor in chief of Psychology Today magazine.
01:24:10.440 And by the way, and this is very relevant, was a Hillary Clinton supporter in 2016.
01:24:14.900 Well, one of the things he did in the 2016 election was he had 2000 people around the country doing Google searches and they monitored the results that people were getting.
01:24:25.860 This is a very, you know, clear academic study, and this research was peer reviewed, as his other work was.
01:24:32.820 And what came back was that Google was systematically skewing search results in favor of Hillary Clinton.
01:24:39.940 They were, in other words, they were suppressing negative stories about Hillary and the algorithm, and they were pushing them in favor of Donald Trump.
01:24:47.540 And Epstein's point was, I actually supported Hillary Clinton, thought she was more qualified, but the bottom line is, a company should not be doing this.
01:24:56.040 And it's secret.
01:24:56.900 You don't know that it's going on.
01:24:59.100 Nobody's monitoring the results they're getting.
01:25:01.160 They're assuming the results and the list that they're getting is representative of some objective standard.
01:25:07.040 Google is a verb now.
01:25:09.180 It's not a noun.
01:25:10.080 It's a verb.
01:25:11.540 I don't know.
01:25:12.100 Google it.
01:25:12.820 Yes.
01:25:13.160 Well, if you Google it and the algorithm is giving you the answer that is skewed, that's like going to a dictionary that will always change the definitions of things as it applies to whatever's happening in the world.
01:25:29.420 Yes.
01:25:29.800 That's real a problem.
01:25:31.080 No, you're exactly right.
01:25:32.520 And so in the context of Kavanaugh, I mean, I don't know exactly because it's occurring in real time.
01:25:37.620 But the bottom line is, there is a history here of Google doing this.
01:25:41.340 It was leaked a couple of weeks ago.
01:25:44.360 Tucker Carlson talked about it on Fox about these internal emails where you actually had Google engineers saying, hey, you know what?
01:25:52.780 We don't like Trump's policy on immigration.
01:25:55.500 So we want to sort of suppress certain stories.
01:25:59.720 This is a thing.
01:26:00.900 And Google does it.
01:26:01.960 And here's the point that we try to make, Glenn, in this film and in general.
01:26:06.440 So the whole conversation that Google wants to have is about fake news and this debate about fake news.
01:26:12.060 Here's the here's the bottom line.
01:26:13.800 Fake news is competitive.
01:26:15.300 If you and I are having a disagreement about something, I put up my fake news story and you say, oh, yeah, I'm going to put up my fake news story.
01:26:23.140 The point is, it's out in the open.
01:26:25.000 You have combat.
01:26:25.960 And by the way, fake news doesn't really convince anybody.
01:26:29.180 You know, if you like Hillary Clinton, that fake news ad that the Russians ran of Jesus and Hillary arm wrestling is probably not going to convince you to vote a different way.
01:26:40.060 That wasn't that wasn't a real arm wrestling competition.
01:26:44.080 But, you know, the point is, is that that's not going to convince anybody because of confirmation bias.
01:26:49.820 You know, people tend to look for information they want.
01:26:52.260 What Google's doing is different because we don't know what we don't know.
01:26:57.020 The question that we should be asking people, Google and Facebook, is why will you not make your algorithm transparent?
01:27:05.400 Right.
01:27:05.820 Why will you not?
01:27:07.140 I mean, and let me take a quick break and come back.
01:27:10.120 Do you have an answer?
01:27:11.360 Their answer on why they won't make it transparent?
01:27:15.040 Yes.
01:27:15.380 And it's not very good.
01:27:16.340 Yeah, I bet it's not.
01:27:17.860 I bet it's not.
01:27:19.660 All right.
01:27:20.300 The name of the documentary is The Creepy Line, thecreepyline.com.
01:27:24.020 Peter Schweitzer is with us and we have a lot to discuss because of deplatforming and kind of a roll in from our last conversation about information.
01:27:37.140 How do you know that it's true and will true information, will actual information, will you be allowed to see or keep in the future?
01:27:48.040 First, let me tell you about our sponsor this half hour.
01:27:51.560 It is CarShield.
01:27:52.600 We want to thank CarShield for not only being a sponsor, but also I want to personally thank them for helping me save a buttload of money.
01:27:59.000 I didn't know.
01:27:59.640 I took my truck in to have an oil change.
01:28:02.680 I get there and, you know, the guy, I think the guy would have been a little more, um, I just have to tell you that this is $6,500.
01:28:12.740 Uh, and might've just called, could I, could, uh, hang on, Mr. Beck, could I get some, uh, police back up here or something?
01:28:21.160 He might go crazy.
01:28:22.280 When you go in for an oil change and they tell you, oh, it's $6,500.
01:28:25.700 That's a freak out.
01:28:27.660 Not when you have CarShield, though.
01:28:29.240 Thank you.
01:28:29.900 Not when you have CarShield.
01:28:31.120 They covered it.
01:28:31.940 I didn't even know about it.
01:28:32.700 I know.
01:28:32.940 You get to take the stress out of the situation.
01:28:34.660 Oh, I don't care.
01:28:35.380 Insurance me $50,000.
01:28:36.600 I don't care.
01:28:37.320 Yeah.
01:28:37.540 I mean, that really, that, I mean, I care for CarShield, but, uh, affect the whole insurance kind of program, you know, like, that's not my problem.
01:28:45.440 That's CarShield's problem.
01:28:46.020 That's right.
01:28:46.460 That's right.
01:28:47.120 Screw them.
01:28:48.140 Anyway, the great thing is if you need coverage, if your car has a, you know, 5,000, 150,000 miles, it doesn't matter.
01:28:54.500 You call CarShield, uh, and get your car covered for all of these crazy things.
01:29:01.060 $1,000 for a new sensor, I don't know, CarShield to cover it.
01:29:06.120 You can have anybody do the work because you're not waiting for the check.
01:29:09.760 You don't have to pay the mechanic or the dealership.
01:29:12.140 Doesn't matter.
01:29:13.340 Car 6100.
01:29:15.640 800.
01:29:16.560 Car 6100.
01:29:18.240 That's the number to call.
01:29:19.640 Get yourself protected.
01:29:21.920 CarShield.com.
01:29:23.280 That's CarShield.com.
01:29:24.620 Make sure you use the promo code BECK.
01:29:25.860 You'll save 10%.
01:29:26.720 CarShield.com.
01:29:28.920 Deductible may apply.
01:29:33.200 We, um, uh, we are talking to, um, Larry Schweitzer.
01:29:37.400 He is the president of Government Accountability Institute, the producer of a documentary called The Creepy Line, TheCreepyLine.com.
01:29:44.760 Um, we're talking about Google and, Larry, I've never, I've never believed in, you know, those dystopian movies.
01:29:52.500 I've always made fun of them and said, yeah, this is, this is crazy.
01:29:56.120 You know, the corporation's out to get you.
01:29:58.740 Because of their algorithms, because they are so all-encompassing, that is the world we're headed towards.
01:30:06.780 What do they tell you when they say, algorithms, oh, no, we have to keep that top secret, because?
01:30:14.160 Yeah, they, what they argue is it's, it's for reasons of, of, uh, you know, state secret.
01:30:18.980 Um, and, and, you know, that they need to protect their trade secrets.
01:30:22.300 They need to be, uh, uh, you know, making sure that nobody gets access to it.
01:30:26.140 There's some truth to that, but there are a lot of things that they could do to demonstrate, um, that they're offering a fair product and service to people.
01:30:34.440 And here's the thing, Glenn, they have lied about this before.
01:30:38.080 You know, 10 years ago or so, you had other, uh, companies like TripAdvisor and Yelp who were saying that Google was artificially suppressing their rankings in Google in favor of Google-owned companies.
01:30:51.060 Which, okay, you know, Google has the right to do that.
01:30:53.400 But here's the thing, Google flat out lied and said, absolutely not.
01:30:57.080 We don't do that.
01:30:58.080 Our algorithm is pure.
01:30:59.740 It's true.
01:31:00.460 The best results are, are organically at the top.
01:31:04.260 Well, here's the problem.
01:31:05.280 The Federal Trade Commission, the European Union, professors at Harvard University looked at this and said, BS, you are fiddling with the algorithm.
01:31:12.860 You are screwing these other competitors and you're lying.
01:31:15.860 So the point is when Google says you can trust the algorithm, you can trust us, they've lied before and they're lying now.
01:31:23.100 And I think the only question that remains really is how are we going to deal with this?
01:31:28.040 Um, you know, there's an old story that Henry Kissinger said when he's on the National Security Council.
01:31:32.620 You give a president three choices, do nothing, take my solution, or thermonuclear war.
01:31:38.240 Those are your three choices.
01:31:39.260 In this case, it's kind of like that we can do nothing, we can try to deal with some sort of the regulatory issues related with Google, or we can break up these companies.
01:31:49.720 Those are the three options that we have.
01:31:51.720 And I think we're really at the point of point number three, because this is not a monopoly like standard oil, standard oil that's going to dominate the oil market.
01:31:59.720 This is controlling the news flow in the United States.
01:32:03.160 This is in the end, this is in the end, Peter, um, controlling everything.
01:32:09.400 Yes.
01:32:09.860 Google is the most likely company in the America, in the American world, uh, to come up with AI.
01:32:19.380 Yes.
01:32:19.840 Whoever, whoever gets to AI first controls everything.
01:32:25.880 There's no way to beat it.
01:32:27.560 Right.
01:32:27.940 Once you have AI.
01:32:29.280 Yes.
01:32:29.580 This company is the most likely in the free world to come up with it.
01:32:34.660 If we don't have them contained in some way or another, when they get to AI, we're toast.
01:32:43.120 Yes.
01:32:43.700 Yes.
01:32:44.160 That's exactly right.
01:32:45.220 And here's the thing.
01:32:46.020 It's not just Google the company.
01:32:49.120 A lot of people don't realize this.
01:32:50.340 I didn't realize this.
01:32:51.600 If you use Safari on your Apple product, you're actually using the Google algorithm.
01:32:56.480 And that is Google information.
01:32:58.400 Um, if you're using, if you are using, um, Yahoo, you're using Google.
01:33:03.620 The point being Firefox is Google.
01:33:06.540 The, all these entities are using the Google algorithm.
01:33:09.420 So even if you say, I am not going to use Google.com you're using, unless you are making
01:33:15.040 very specific choices for other options.
01:33:17.720 If you're using any of those others, Google is the one that's dominating it.
01:33:21.040 And by the way, Google pays Apple $9 billion a year.
01:33:25.860 Google actually pays Apple to be the algorithm of choice for Safari.
01:33:30.620 That's how much they value this information and want to dominate this space.
01:33:34.560 So I want to talk to, um, a little bit about, um, what can be done, um, with the algorithm.
01:33:44.240 Uh, I'm concerned about some things that I think are really dangerous and deep platforming
01:33:50.100 and erasing voices, right?
01:33:52.940 Because Google thinks, ah, that's, you know, that's, that's not right.
01:33:57.260 That's eight speech.
01:33:58.300 We'll get to that when we come back.
01:34:04.020 This is the Glenn Beck program.
01:34:06.060 Peter Schweitzer is with us, uh, president of government accountability Institute, producer
01:34:09.640 of a new documentary that is out.
01:34:11.120 You find it at the creepy line.com.
01:34:13.600 Um, the creepy line.
01:34:15.340 And it comes from a, a quote from the guy who's running Google that, you know, we, we're
01:34:21.820 not going to implant things into your head, but we're going to come right up to that creepy
01:34:26.900 line.
01:34:27.700 That's terrifying to hear from somebody who's in charge of Google.
01:34:32.400 Peter, uh, I, I want to, I want to talk to you about de-platforming.
01:34:37.100 I am currently saying to anyone who has a conservative voice, let's all get on a single server.
01:34:46.060 Let's get on to our own platform together.
01:34:50.180 We don't have to join business or anything, but we have to have a protected platform because
01:34:56.660 I believe these companies, uh, are going to de-platform us one by one.
01:35:02.100 Is this what you've been doing Glenn here?
01:35:03.620 Yeah.
01:35:03.940 And I think you're very smart to do that.
01:35:05.720 I think you're very smart to do that because let's step back and first consider the power
01:35:10.820 of Google.
01:35:11.340 And a lot of people don't realize this, but it was widely reported in the guardian elsewhere.
01:35:15.540 In 2009, Google actually shut down the entire internet for two hours on a Saturday morning.
01:35:21.740 They blacklisted the entire internet.
01:35:24.460 That shows you the, the scope and the size that Google has.
01:35:28.100 Wait a minute.
01:35:28.380 It's, it's, it's, it's, it's a, they blacklisted the entire internet.
01:35:32.500 If you go to the guardian, www.
01:35:34.980 Yes.
01:35:35.300 They shut down the entire internet.
01:35:37.140 A couple of years later, they shut down half of Japan's internet.
01:35:41.160 Uh, they said it was sort of an error that they, but, but the point being the size and
01:35:45.320 the scope of this company is enormous.
01:35:47.100 And a lot of people don't realize, even if they're not using a Google product, somebody
01:35:51.980 will say, well, I don't use Gmail.
01:35:53.640 I'm not worried about Gmail.
01:35:54.820 That's fine.
01:35:55.500 But is your company, you know, it may be John Smith at Acme incorporated.com, but is Acme
01:36:01.940 incorporated email server actually a Google product?
01:36:05.380 Because if it is God, Google is monitoring and watching what you're doing and it's part
01:36:11.260 of the data collection they're doing on you.
01:36:13.220 A lot of news organizations in the United States use Google.
01:36:16.760 We, we, we highlight in the film, for example, Robert Epstein, this scholar who has been critical
01:36:21.580 of Google, uh, Clinton supporter, Clinton supporter, uh, uh, you know, Harvard PhD in psychology,
01:36:27.480 um, ran, uh, did some studies and the Washington post ran a piece, um, about could Google swing
01:36:34.600 an election?
01:36:35.120 This is back in 2012.
01:36:36.820 The next day, Robert Epstein was shut out of Google.
01:36:41.420 He could not get on Firefox.
01:36:42.820 He could not get any on, on any of the, uh, the Google products.
01:36:46.760 They had shut him out.
01:36:47.660 Same thing happened to Jordan Peterson, uh, university of Toronto professor who had taken
01:36:52.720 a position about forced speech on general, uh, gender pronouns the next morning when that
01:36:58.900 went public, he was locked out of his Gmail account and Google would not let him back into
01:37:04.580 his Gmail account.
01:37:05.600 All of his emails, his calendars and everything he was blocked from the point being, they have
01:37:10.860 a lot of power and you cannot assume that Google is not going to take these actions.
01:37:16.120 People will say a lot of times, well, they're a company.
01:37:18.580 They want to make their customers happy.
01:37:20.260 They are more than a traditional company.
01:37:22.820 They say so themselves.
01:37:24.440 They view themselves as a company with a mission.
01:37:27.300 So here's what people say when, when, you know, well, I use Gmail.
01:37:31.280 Well, you know, Google is monitoring your mail.
01:37:33.960 They're not reading my mail.
01:37:35.260 Actually, you are both right and wrong.
01:37:38.460 They are not reading your mail.
01:37:40.520 However, they are analyzing your mail because what they're doing is collecting the information
01:37:47.500 on how people relate to one another, how people talk to one another, all for their AI research.
01:37:54.680 So Google isn't free for no reason.
01:37:59.540 That's exactly right.
01:38:00.580 You, you are the product.
01:38:02.060 People always say, what is Google's product?
01:38:03.880 You are the product because they are selling information on Glenn Beck or on Peter Schweitzer
01:38:09.340 or on whoever.
01:38:10.280 And more importantly, I think they are analyzing all of it.
01:38:15.520 Look, there's the, this is just at the beginning.
01:38:18.820 Let me tell you a spooky story.
01:38:21.060 You go to Beijing.
01:38:22.160 There are three concentric circles of security in Beijing.
01:38:25.960 Okay.
01:38:26.460 The center is the main city of Beijing.
01:38:29.420 How many people, you should look it up for me.
01:38:31.020 Will you still?
01:38:31.460 How many millions of people are living in Beijing?
01:38:33.220 In the city proper, that is the inner circle, there is so much monitoring going on.
01:38:40.500 They just did a test.
01:38:41.740 They released a guy.
01:38:43.120 They took a guy, picked somebody, said, go into the center of the city, just go hide.
01:38:49.260 Okay.
01:38:50.380 They had a picture of the guy and the Chinese tested this new brand new system.
01:38:56.040 And then after like an hour or two, they gave the, the system, the picture of the guy and
01:39:03.120 said, find him in eight minutes.
01:39:05.960 He was in the back of a squad car.
01:39:07.500 Unbelievable.
01:39:08.120 In eight minutes.
01:39:09.560 Yes.
01:39:09.900 City of 22 million.
01:39:10.980 Yeah.
01:39:11.460 22 million.
01:39:12.780 It's amazing.
01:39:13.380 In eight minutes.
01:39:14.700 Yeah.
01:39:15.000 So, and this is without AI, right?
01:39:18.080 This is the kind of stuff that is coming that people really need to pay attention to.
01:39:22.820 That's right.
01:39:23.260 And, and Google is at the forefront of that.
01:39:26.020 They, they take a position of pride that they are the forefront of exploiting future technologies
01:39:30.560 real quick though, on Gmail, this is very important for people to realize.
01:39:34.200 It just shows you how far Google goes.
01:39:36.960 I used to have a Gmail account.
01:39:38.640 Now what Gmail does is they scan every email that comes into your Gmail account and every
01:39:44.080 email you send out.
01:39:45.280 You know what else they scan?
01:39:46.900 If you've ever written a draft email, let's say you're mad at your cousin, you know, I'm
01:39:52.300 sick of, you know, cousin Chris and all the crazy stuff he does.
01:39:55.440 And you're late one night, you write a draft and you say, you know what?
01:39:58.820 I don't think I'm going to send that to Chris.
01:40:00.580 That's just mean.
01:40:01.560 Hang on just a second.
01:40:02.520 I want to make sure this is writing a draft, but that's not saving it as a draft.
01:40:07.380 That's correct.
01:40:08.020 It's just, you, it counts the delete, delete, delete, delete, delete, delete.
01:40:13.300 It's not even saving it as a draft, right?
01:40:15.180 That's correct.
01:40:15.820 That's correct.
01:40:16.400 It's scanned by Google.
01:40:18.060 So Google knows more about you than, you know, you can have a very open and completely
01:40:23.240 transparent relationship with anybody in your life, your spouse, your best friend, whatever.
01:40:28.080 Google knows more about you than they do.
01:40:31.120 And they are very aggressively going to retain that information, but they're using it against
01:40:35.660 you.
01:40:35.920 Here's the key thing.
01:40:36.600 It's not just privacy.
01:40:37.740 It's manipulation because they want to take that information and not just sell information
01:40:43.240 advertisers, they, as part of their mission, want to steer and move your life in a certain
01:40:48.260 direction that they deem is the way the direction your life should go.
01:40:52.200 Dr. Peter Schweitzer, the documentary is The Creepy Line.
01:40:55.600 Let me push back on a couple of things.
01:40:56.740 Sure.
01:40:57.240 Or at least the problems I have, because I'm with you on 90% of this.
01:41:00.920 One issue I have is Google is awesome.
01:41:04.100 It's all of the products that they make are better than everybody else's product that is
01:41:08.440 similar.
01:41:08.840 Correct.
01:41:09.180 And it would incredibly inconvenience me to give up Google because all of their products
01:41:14.900 work better than everybody else's products.
01:41:16.560 Jeeves doesn't know anything.
01:41:18.840 Oh, thank you.
01:41:20.700 What do you do?
01:41:21.460 Because I mean, yes, they're everywhere, but they also do a really good job when it comes
01:41:26.720 to designing an easy to use product.
01:41:28.900 Right.
01:41:29.300 No, there's no question.
01:41:30.160 How do you break away?
01:41:30.760 That's a great question.
01:41:32.040 And it's not easy.
01:41:33.100 I mean, there are alternative search systems out there, but none of them are as good as Google.
01:41:38.720 And it's not really the challenge.
01:41:40.300 I think that's a road that people will go down that you shouldn't go down.
01:41:45.600 What we should go down is we should be talking to the people in what?
01:41:49.780 No, no, no.
01:41:50.740 We should be sending people who understand this to Washington, because by the time we
01:41:57.240 talk to Washington, it's already over.
01:41:59.660 We need people who understand the ethics of what's going on with high tech in Washington,
01:42:05.660 because they need to have this conversation right now.
01:42:09.180 It's not about us changing our behavior.
01:42:11.080 Right.
01:42:11.600 It's to the point we must corral Google and break them up or corral them in some way, not
01:42:19.360 with just little litigation, because I mean, not litigation, but not with just little simple
01:42:25.160 laws.
01:42:25.680 Well, we're going to because they'll go around that that fast.
01:42:29.480 Right.
01:42:30.000 OK, let me one other one.
01:42:31.240 Yeah.
01:42:31.520 You said this earlier, and it made me very uncomfortable.
01:42:34.600 And my suspicion is it makes you very uncomfortable, too, which is your solution was you were leaning
01:42:40.240 towards breaking up these companies.
01:42:41.920 And as a capitalist, as a person who does not want the government involved in private
01:42:46.640 business, I mean, how do you how do you how do you bridge that gap?
01:42:51.040 No, I mean, I think that's a great point.
01:42:52.580 And I came reluctantly to this as I looked at all the possible things to do with Google.
01:42:56.920 Here's the problem in this space, the way that technology is moving, how Google exerts
01:43:03.080 its market dominance.
01:43:04.100 It really we really do not have a free market in search.
01:43:07.760 It's really impossible.
01:43:09.160 In fact, one of the guys we interviewed, the vice president of Yelp in our in our film,
01:43:14.600 and the founder of Yelp said he wouldn't start Yelp today because it'd be impossible.
01:43:18.960 Google would smother them.
01:43:19.900 The only reason that Yelp and these other entities have survived is they have built up
01:43:24.000 enough brand equity when they were founded, what, 15, 20 years ago to where they can exist.
01:43:29.100 So we have a situation where there truly is not a functioning monopoly.
01:43:34.380 And by the way, Google's market dominance is partly related to its political alliance
01:43:39.500 in Washington.
01:43:40.100 They pass laws and rules all the time to squelch competition that are to their benefit and
01:43:46.360 to do things like the platforming.
01:43:48.320 Yes, we are not a publisher.
01:43:50.200 We're a platform.
01:43:51.660 Right.
01:43:52.000 Then they go in and they edit.
01:43:54.240 Well, now you're a publisher.
01:43:55.660 You're not a platform.
01:43:56.660 So they play it both ways.
01:43:58.200 And they're they have the money to do it.
01:44:01.440 Yes.
01:44:01.680 And that's a hugely important point, because when they first wanted to set up the regulation
01:44:06.080 for the Internet, it was if you are a publisher, you're treated as a media company.
01:44:10.620 If you are a neutral platform, which is what Google and Facebook claim they are, they say,
01:44:14.980 we're not going to edit any content.
01:44:16.280 They even use the analogy.
01:44:17.620 We're like a telegraph.
01:44:19.080 We're just taking a telegraph message from Peter and we're sending it to Glenn.
01:44:23.300 Nothing else is happening.
01:44:24.440 Well, we all know that's not true now.
01:44:26.040 They're editing.
01:44:26.960 They're censoring.
01:44:28.320 So they are acting as publishers.
01:44:30.080 But unlike other publishers, they don't face the legal or regulatory burdens that they do.
01:44:35.820 So it's unfair competition or even the public scrutiny.
01:44:39.280 Even the public.
01:44:39.880 You sound like you're crazy if you talk against Google or they're editing.
01:44:43.080 Right.
01:44:43.460 No, they are right.
01:44:45.280 The would one solution be to make their algorithms transparent?
01:44:50.460 Yes, that would help.
01:44:52.920 But the algorithm is changing constantly.
01:44:55.580 And I do get I am sympathetic to their argument that this is a trade secret that they've worked
01:45:00.920 very, very hard to develop.
01:45:02.560 And they have, I think, by breaking them up into pieces.
01:45:07.440 And by the way, I don't think it's just two companies.
01:45:09.180 I think it's several different companies.
01:45:11.080 It's an alphabet.
01:45:12.460 Exactly.
01:45:14.100 I think it would be would be extremely helpful in the way to go on this, because otherwise
01:45:18.660 they're going to continue to reexert market dominance.
01:45:21.320 The other thing I hear conservatives talk about, and I find this terrifying, is to make
01:45:26.360 them a utility.
01:45:27.800 Oh, no, no, no.
01:45:29.640 That's terrifying.
01:45:30.840 I don't know what the utilities are like here in Dallas, Texas, Texas.
01:45:35.400 We don't really have it.
01:45:36.220 We have open energy.
01:45:38.060 Yeah.
01:45:38.220 Oh, that's so we have.
01:45:39.160 That's good.
01:45:39.480 Five different energy companies.
01:45:40.760 And yeah, we don't pay anything.
01:45:43.460 Yeah.
01:45:43.960 Like the notion of making Google a public utility, which really means you're making it a government
01:45:50.360 utility now means you've got the government controlling search.
01:45:54.280 No, I have.
01:45:54.920 I have no interest in that whatsoever.
01:45:56.300 And what you are seeing is this merging of Google, which used to be generally out of
01:46:02.440 politics, very little lobbying, very little interaction.
01:46:05.480 When Barack Obama was elected in 2008, you saw this enormous influx of Google people into
01:46:12.280 Washington, D.C., and they've stayed.
01:46:14.560 So it is now one of the most plugged in, if not the most plugged in corporation in America.
01:46:20.100 So this notion that there are kind of these wildcatters out in Silicon Valley that are like
01:46:25.200 independent from government is just totally ridiculous.
01:46:28.160 I could spend two days with you.
01:46:29.940 I've really enjoyed this.
01:46:32.220 He's going to take a hard pass on that.
01:46:33.940 So you're aware.
01:46:35.200 I'm not buying you flowers and drinks.
01:46:38.380 It's Peter Schweitzer.
01:46:39.540 He is the president of Government Accountability Institute.
01:46:42.720 The documentary is called The Creepy Line.
01:46:45.840 How do you see it?
01:46:46.920 It's going to be available shortly on Amazon, iTunes.
01:46:49.720 There's a trailer on the web page, and we love to get feedback from people.
01:46:53.080 And then it won't be on YouTube, I bet.
01:46:57.560 All right.
01:46:58.040 Thank you so much, Peter.
01:46:59.020 Thanks.
01:46:59.580 All right.
01:47:01.660 Our sponsor this half hour is LifeLock.
01:47:05.740 You know, as we look at technology and how people are into everything, LifeLock, let me
01:47:12.920 give you this on Facebook, Facebook, approximately 50 million accounts were accidentally exposed.
01:47:21.020 Whoops.
01:47:21.860 Somebody went in and took 50 million account names and all of the information.
01:47:27.980 Do you know?
01:47:28.620 Is that even you?
01:47:29.400 Do you have any idea?
01:47:32.020 Personal information from a data breach.
01:47:34.840 Criminals use it to open accounts, file tax returns, buy property.
01:47:38.540 There are so many threats to your identity right now.
01:47:42.520 I remember I've been thinking about this for a week.
01:47:44.660 I remember when, remember when LifeLock came out and said, I'll put my security card on buses.
01:47:49.420 Do you remember that?
01:47:50.080 Oh, yeah.
01:47:50.280 My social security number.
01:47:51.340 Social security card.
01:47:51.580 Yeah, yeah.
01:47:51.800 Okay.
01:47:52.800 I remember at that time thinking, well, that's stupid.
01:47:55.880 Yeah.
01:47:56.520 But that's not going to happen.
01:47:57.880 Who's going to get my social security card?
01:48:00.360 We didn't think in those terms.
01:48:02.560 This company was so far ahead.
01:48:04.280 Yeah, they really were.
01:48:05.180 Now, this all matters, and the new LifeLock Identity Theft Protection adds the power of Norton Security to protect you against the threats to your identity and your devices.
01:48:16.420 Now, nobody can stop all cyber threats, prevent all identity theft, or monitor all transactions at all businesses, but the new LifeLock with Norton Security sees the threats that you're going to miss.
01:48:25.560 LifeLock.com or call 1-800-LIFELOCK.
01:48:27.960 Use the promo code BECK, get an extra 10% off your first year, plus a $25 gift certificate with annual enrollment at LifeLock.com.
01:48:37.580 1-800-LIFELOCK.
01:48:39.120 Use the promo code BECK, terms and conditions to apply.
01:48:46.680 Glenn, back.
01:48:48.460 You know, I want to thank Larry for being on it.
01:48:51.480 He is, he's...
01:48:52.820 You're with the...
01:48:53.480 Whoops, Larry?
01:48:54.160 You have never pronounced a name correctly on the air.
01:48:58.460 I mean, starting with me, I'm...
01:48:59.640 Larry, Peter, it's the same.
01:49:01.420 My name is Steve, and you've been calling me Stu for a million years.
01:49:04.720 Larry was great, wasn't I?
01:49:05.660 His name is Peter.
01:49:07.780 What was the other guy?
01:49:08.800 Aaron, uh...
01:49:10.160 Brockovich.
01:49:11.640 That's about how close you came on his name.
01:49:13.460 I know.
01:49:13.640 Well, it's, you know...
01:49:14.860 It is a sickness with you.
01:49:17.100 It is.
01:49:17.520 It's one of those things.
01:49:19.200 It's one of those things.
01:49:19.220 It's one of those psychological illness.
01:49:20.520 Here it is.
01:49:21.260 And I agree with that.
01:49:22.460 I really do.
01:49:24.180 Here's what it is.
01:49:25.400 I have such a fear of mispronouncing names and getting them wrong and everything else
01:49:30.320 that I can know a name and I still won't say it.
01:49:34.160 Oh, my God.
01:49:34.500 You should have...
01:49:35.120 I mean, if you were in this room, America, before the interview with Aaron...
01:49:40.600 Peter.
01:49:41.060 No, with Aaron.
01:49:41.880 Oh.
01:49:42.760 The first interview we did in an hour or two.
01:49:45.480 Literally, before we went on the air, Glenn just repeated his last name correctly 20 times
01:49:51.640 in a row.
01:49:52.260 Just to make sure I had it right.
01:49:53.460 And then in the interview, not even once did he come close to saying it right.
01:49:56.880 No.
01:49:57.680 Nope.
01:49:58.260 But off the air, I can say it absolutely perfectly.
01:50:00.480 It is a sickness.
01:50:01.700 It is.
01:50:02.240 It is.
01:50:02.380 I do have a...
01:50:03.660 I have a real mental block with names.
01:50:06.040 And I really...
01:50:06.680 I feel like I have that too, but you are like...
01:50:08.740 No, it's horrible with me.
01:50:10.080 Anyway.
01:50:10.400 We're deep into it.
01:50:11.780 Okay.
01:50:12.320 So, on tomorrow's program, tomorrow we're going to talk a little in-depth about the book
01:50:16.900 Addicted to Outrage, and I'm looking for people who have read it.
01:50:21.060 Anybody who makes it on the show tomorrow and you've read the book, you get a signed
01:50:24.660 copy I'll send out to you, but I...
01:50:26.420 Or the audiobook too, right?
01:50:27.420 Or the audiobook, whichever you prefer.
01:50:29.140 The audiobook is really good.
01:50:31.120 But I'll send either one, but you have to have read it because I want to get into the
01:50:35.620 things that you disagree with, the things that you don't understand, get into some of
01:50:39.640 the other topics.
01:50:40.480 I'd love to see how people apply the stuff in the book to the Kavanaugh thing.
01:50:43.420 Yes.
01:50:43.600 Because it's so close.
01:50:44.480 Oh, it's...
01:50:45.080 Yeah.
01:50:45.300 It's...
01:50:45.620 I think it's all there.
01:50:47.540 We'll see if people who read the book are connecting the dots.
01:50:50.520 That's tomorrow on radio.
01:50:53.020 Glenn.
01:50:53.880 Back.
01:50:54.240 Mercury.