The Glenn Beck Program - February 02, 2019


Ep 22 | Peter Schweizer | The Glenn Beck Podcast


Episode Stats

Length

1 hour and 14 minutes

Words per Minute

154

Word Count

11,525

Sentence Count

1,032

Misogynist Sentences

6

Hate Speech Sentences

9


Summary

In this episode, Peter and Glenn discuss Peter's new film, "Trust Us." They discuss the dangers of concentrated power in Silicon Valley and how technology is making us less trusting of the people we should be able to trust.


Transcript

00:00:00.000 So, Peter, your film starts out with an amazing clip from not that long ago on the Today
00:00:23.240 Show with Bryant Gumbel and, what's her name, Katie Couric, asking the question, does anybody
00:00:30.620 know what the Internet is?
00:00:32.000 What's the Internet?
00:00:33.480 Yeah.
00:00:34.200 Mid-1990s.
00:00:35.460 I mean, it seems like so long ago, and yet it's 20 years we've had this for wide application,
00:00:41.620 and we're still trying to figure it out.
00:00:43.880 And I feel like what we're going to talk about today, in five years, we'll look like, oh
00:00:52.720 my gosh, look at this, how ridiculous this conversation was.
00:00:57.100 They didn't even know.
00:00:58.360 Yeah.
00:00:58.740 Because things are changing so fast.
00:01:00.840 Right.
00:01:02.220 You have made the film, because why?
00:01:07.760 Because I've always been distrustful of concentrated power.
00:01:11.220 This is why I've investigated corruption in politics and-
00:01:15.720 On both sides.
00:01:16.200 Yeah, on both sides.
00:01:17.100 That's right.
00:01:17.540 Size and scope of government.
00:01:18.820 We've talked about this before, Glenn, and I think we probably agree.
00:01:22.440 You know, corruption is a human problem.
00:01:25.140 It's part of our human nature.
00:01:26.940 Well, that also applies to people that are in the private sector.
00:01:30.780 Now, one of the reasons I like a free market economy is, you know, it tends to self-correct.
00:01:37.460 But the problem is, with this technology, with Silicon Valley, the market's not going
00:01:43.360 to self-correct.
00:01:44.620 So we're dealing with a massive concentration of power in Silicon Valley, I would argue,
00:01:49.600 that is larger than any other corporate titans have had in American history.
00:01:53.440 And it's only going to get worse.
00:01:56.540 So let me understand why you think it won't self-correct.
00:01:59.520 It won't self-correct because of the barriers to entry. It seems like, oh, we can just
00:02:05.880 get together and start a tech company, right?
00:02:07.660 And we'll be up and running.
00:02:08.700 But the reality is, the gatekeepers are the large tech companies.
00:02:13.320 Apple, if you're not connected with Apple in some way, with iTunes, or you're not getting
00:02:17.560 promoted, it's very hard to get your application there.
00:02:19.860 Google, the titans control it, and they restrict the market.
00:02:23.920 They restrict competition.
00:02:25.240 So it's very hard to say, we're going to create a new search engine today, and we're
00:02:30.240 going to compete with Google.
00:02:31.600 It's just not going to happen.
00:02:33.060 And everybody who's honest knows that.
00:02:35.020 And everybody that I know says that, compared to the servers that Google has, the government's
00:02:45.340 servers would fit in a shoebox.
00:02:48.060 Yeah, yeah.
00:02:48.920 That is mind-boggling.
00:02:52.160 Yes.
00:02:52.480 I talked to Ray Kurzweil.
00:02:54.780 You know who Ray Kurzweil is?
00:02:55.920 Oh, yes.
00:02:56.520 So I talked to Ray Kurzweil, and I said to him at one point, we were talking about AI,
00:03:02.760 and we were talking about all of the things he's working on.
00:03:06.520 And I said, Ray, so why wouldn't a Google that is reading everything, knowing everything,
00:03:16.120 and it's only going to get faster, and it's going to be a personal assistant, and the idea
00:03:21.900 of the personal assistant is it's so knowledgeable on you that it can predict what you might think
00:03:30.100 next.
00:03:31.060 So I'm thinking about things.
00:03:33.420 I'm talking to people.
00:03:34.780 And then, you know, there's something in the back of me that says, you know, maybe we go
00:03:37.600 to Hawaii.
00:03:38.180 The next day, the next morning when I wake up, my assistant says, hey, by the way, I
00:03:43.380 found some really good, I've not talked to him about going to Hawaii, but it just can
00:03:48.720 predict me so well.
00:03:49.980 Yes.
00:03:50.220 So what I said was, why wouldn't Google stop anyone from building a replacement for Google?
00:04:03.840 Right.
00:04:04.420 If there was something that was going against the corporation, why would it allow you to
00:04:12.260 do that?
00:04:13.660 His answer was, because we never would.
00:04:18.040 Well, then I'm meeting a unique group of people in the history of the world.
00:04:27.780 Those words, trust us, have been uttered throughout human history by people that rose to power, who
00:04:35.520 were well-intentioned, who did wonderful things, but along the way took a terrible turn.
00:04:41.500 Those words have also been used by people who were terrible from the beginning.
00:04:45.320 The point being, we cannot trust other human beings, given the long span of human history
00:04:52.040 and politics and warfare and technology.
00:04:54.740 We cannot trust other human beings with this much dominant power over our lives.
00:05:00.000 And look, they're not going to come and say, we're going to do terrible things to you.
00:05:03.480 It's always going to be presented as, we're doing this for your benefit.
00:05:08.020 This is a wonderful, good thing.
00:05:09.800 And there are benefits.
00:05:10.820 Huxley was right, not Orwell.
00:05:13.580 Right.
00:05:14.000 You know, I think it turns into Orwell.
00:05:16.240 Right.
00:05:16.560 But first, it's Huxley.
00:05:18.260 We want it.
00:05:18.980 Yes.
00:05:19.500 Brave New World, we'll welcome it.
00:05:21.260 Yes.
00:05:21.680 I wouldn't have given my fingerprint to anyone.
00:05:24.080 I gladly give it to Apple because it reduces, oh, really, I got to remember that.
00:05:28.620 I got to change that.
00:05:29.640 Yeah.
00:05:29.980 You know?
00:05:30.420 Now, facial recognition.
00:05:32.360 Right.
00:05:32.560 So before we start, is your concern the power that they have now and in the next two years
00:05:41.540 or the power that they have and will have in three, five years and with AI coming online?
00:05:51.980 My sense is, Glenn, that right now they have an enormous amount of power and we will reach
00:05:58.180 a point where it won't be reversible.
00:06:01.880 In other words, they will have accumulated so much power that whatever actions are taken,
00:06:07.000 simply it will be too late.
00:06:09.180 That's my concern.
00:06:10.020 They get so embedded into a human life and their ability to steer and manipulate, we become
00:06:16.180 so dependent and conditioned to being dependent upon them that it becomes too late to self-correct.
00:06:21.740 That's my fear.
00:06:22.600 And I think we're approaching that probably in the next five to 10 years.
00:06:25.940 I don't accuse Google or Facebook of doing this at all, but I have often wondered,
00:06:33.660 just with what they have today, if it fell into the wrong hands, could anyone
00:06:41.640 in Washington beat Google?
00:06:44.660 Because they have everything.
00:06:47.940 They can make you look guilty.
00:06:51.060 They can place it.
00:06:52.480 They can do anything to your life.
00:06:55.180 Do they, do they today with what they have?
00:07:01.940 Who's more afraid?
00:07:03.640 Google, Facebook or the government?
00:07:06.600 Oh, I think the government.
00:07:07.460 There's no question about it.
00:07:08.560 And it works a couple of ways as it always does, Glenn.
00:07:10.900 There's certainly absolutely what you're saying.
00:07:13.040 They have the stick that they can wield in lots of ways.
00:07:16.240 They have the search histories of, you know, members of Congress, of senior White House
00:07:20.760 officials.
00:07:21.520 If they're using a Gmail email account, even if it doesn't say Gmail, if they're using a
00:07:26.980 Google-based email server, which a lot of government entities are, they have access to their emails.
00:07:32.560 So they know a lot of secrets about these people.
00:07:36.160 So there is no question there's the stick.
00:07:38.520 Wait, wait.
00:07:39.200 Let me just, let me clarify.
00:07:40.680 So I want to be really careful.
00:07:41.640 Yeah.
00:07:42.160 Because I don't want to, I don't want to assign things that they're not doing.
00:07:47.160 I want to make a difference between what they have, what they can do with what they have,
00:07:52.520 and then what we worry about.
00:07:54.040 Yes.
00:07:54.180 Okay.
00:07:54.440 Very important distinction.
00:07:55.680 So they don't necessarily have all of that information.
00:08:00.920 I shouldn't say that.
00:08:02.080 They have all that information.
00:08:04.420 We are not accusing them of looking or using any of that information at this point.
00:08:08.900 Correct.
00:08:09.380 Correct.
00:08:09.720 But it is a capability.
00:08:11.740 Yes.
00:08:12.120 They have.
00:08:12.600 Yes.
00:08:12.780 It's, you know, the equivalent would be, you know, somebody who has a gun at home for
00:08:18.840 home defense.
00:08:19.400 They may never use that gun, but they have the capacity to use it if they need to, God
00:08:24.300 forbid.
00:08:25.040 Same thing with Google.
00:08:25.940 They have that capacity.
00:08:27.280 And it's the same worry that we have with the United States government, with the NSA servers,
00:08:31.340 which again, the NSA servers are minuscule in comparison to Google.
00:08:38.260 That's right.
00:08:38.920 And the NSA servers, they are not using it, but they have everything.
00:08:45.140 Right.
00:08:45.580 Exactly.
00:08:46.060 And, you know, on top of that, Google has, you know, Google Docs, which again, they have
00:08:51.320 the ability to scan and they do.
00:08:53.200 The Pentagon uses Google Docs.
00:08:55.120 So you've got classified information that potentially, I'm not saying they're doing it,
00:08:59.680 but potentially could be accessed through Google Docs.
00:09:02.620 And you have news organizations from the New York Times to others who use Google email
00:09:07.720 services.
00:09:08.120 So this is a capability that they have.
00:09:10.420 Well, I know five, maybe eight years ago, I was still in New York, so probably eight
00:09:14.240 years ago, one of Google's techs called me and said, you need to know something.
00:09:20.960 He said, our server farms.
00:09:23.580 I don't know why, but the federal government is digging trenches around our farms and they
00:09:31.900 are now protecting us.
00:09:33.300 And they were digging very deep and putting security all around.
00:09:38.720 They were not in bed at the time.
00:09:41.140 Right.
00:09:41.720 But even the federal government knew.
00:09:43.740 Yeah.
00:09:44.860 This is really dangerous what they have.
00:09:47.680 Yes.
00:09:48.100 Yeah.
00:09:48.300 The federal government knows the power.
00:09:50.100 And that's why, you know, one of the big myths we've talked about this before, Glenn,
00:09:53.760 one of the big myths that operates in America.
00:09:55.580 And it's oftentimes perpetuated by the political left, which is that big government and big
00:10:01.560 business hate each other.
00:10:02.760 And the reality is oftentimes, no, the big business and big government like each other.
00:10:06.980 That's my worst fear.
00:10:07.920 Yeah, exactly.
00:10:08.780 Exactly.
00:10:09.380 And I think that is certainly the case with Google.
00:10:11.720 And so we've talked about the stick.
00:10:14.160 There's also the carrot.
00:10:15.680 All right.
00:10:16.040 One of the largest lobbying shops now in Washington, D.C., is Google.
00:10:20.540 They are hiring former government employees.
00:10:23.300 I would not be surprised if we see, you know, additional members of Congress put on the
00:10:28.200 payroll.
00:10:28.780 The point being, hey, you know, we can make life really tough for you, but we can also
00:10:33.380 make life for you personally very lucrative if you join us.
00:10:38.260 So it is really interesting.
00:10:41.040 Just before we sat down, I hung up the phone with one of the editors of Wired magazine.
00:10:48.640 And we were talking for about 40 minutes about some of this stuff.
00:10:52.520 And he said, well, you know, Facebook has poured a lot of money in.
00:11:00.900 I mean, they're one of the number one or number two lobbyists in Washington.
00:11:04.720 And they have, you know, they have conservative lobbyists.
00:11:08.420 And I said, that doesn't make me feel better.
00:11:12.160 I want them out of Washington.
00:11:14.400 That's right.
00:11:14.840 I don't want any relationship with those two.
00:11:17.940 Right.
00:11:18.440 So tell me what the film is about and exposes.
00:11:23.880 And I think it's important to start with this.
00:11:26.740 One of the guys that you feature prominently is a diehard Clinton supporter,
00:11:35.460 a Democrat.
00:11:35.820 This is not political.
00:11:38.820 This has nothing to do with politics.
00:11:41.160 Right.
00:11:41.420 The film is really about the power of Google and Facebook.
00:11:45.180 But in a way that a lot of people have not traditionally thought about them.
00:11:48.760 A lot of people recognize and see the issues of privacy.
00:11:52.060 The fact that they're gobbling up all this information and they take this information, Glenn, and they
00:11:57.340 run those ads about, you know, go to Acapulco because you did a search on Mexico or, you know,
00:12:02.220 buy these shoes because you said you were looking for certain shoes.
00:12:06.720 So that is the concern of privacy.
00:12:09.460 But the problem goes far deeper than that.
00:12:11.980 And this goes back, I think, to a fundamental concept that people recognize everywhere.
00:12:16.860 And that is to the extent that an institution can do something for you, it can do something
00:12:23.880 to you.
00:12:25.080 So Google and Facebook can do a lot of things for us.
00:12:28.180 But that power that they have accumulated that gives them the capacity to know so much
00:12:32.800 about us also gives them the capacity to do things to us.
00:12:36.680 And what they're doing is not just collecting information, the privacy concern.
00:12:41.160 They're also actively working to manipulate us.
00:12:44.700 They're looking to steer us and move us in directions that we don't necessarily want to
00:12:49.240 go.
00:12:49.780 They want to influence our values.
00:12:51.960 They want to influence our worldview.
00:12:55.220 They want to influence the way that we see things.
00:12:57.840 And it's not about them flashing in front of us in a very visible way, Glenn, of saying,
00:13:02.560 hey, have you ever considered this point of view?
00:13:05.500 Which is fine.
00:13:06.280 I think you could do that.
00:13:07.240 It's done in a hidden manner.
00:13:09.560 They do it by manipulating search results.
00:13:12.120 So you may be looking for information on, let's say, Congressman John Smith, but they're
00:13:18.080 only going to give you certain bits of information about Congressman John Smith, depending on what
00:13:23.480 Congressman John Smith's politics are and depending on whether they've decided to manipulate
00:13:28.120 that search.
00:13:29.100 That is a really powerful charge.
00:13:31.800 Yeah, it is.
00:13:33.220 And I think there's evidence to back it up.
00:13:35.340 There's lots of evidence to back it up.
00:13:37.020 The first bit of evidence is we know that they manipulate the algorithm because it's
00:13:43.140 been proven by the Federal Trade Commission, the European Union, and academics at Harvard.
00:13:47.740 And this relates to commercial search.
00:13:50.100 They were charged 10 years ago by Yelp and by TripAdvisor for manipulating the algorithm
00:13:55.460 to the detriment of those companies to benefit companies that Google owned that were competitors.
00:14:02.280 And as far as I'm concerned, Google can do that.
00:14:04.600 But here's the problem.
00:14:05.440 Google insisted and they pounded the table and said, we are not manipulating the algorithm.
00:14:11.160 We would never do that.
00:14:13.300 Guess what?
00:14:13.680 They were lying.
00:14:14.520 They were manipulating the algorithm, and they've pretty much given up the argument that they
00:14:18.600 weren't.
00:14:18.900 So they've done that.
00:14:19.960 And the work of Robert Epstein and others, I think, conclusively demonstrates that they
00:14:25.180 are cooking the books on the algorithm as it relates to political search.
00:14:35.440 So I want to be clear, because you and I are both free market guys.
00:14:48.380 Yes.
00:14:48.600 I don't want to tell Google and Apple.
00:14:50.260 I do not.
00:14:51.040 I will tell you, the conversation I had with Wired was a warning.
00:14:56.260 Yeah.
00:14:56.740 A warning: these companies, if they don't smarten up, one side or the other will say, you're
00:15:04.480 a utility.
00:15:05.300 And then it's government controlled and in bed with the government.
00:15:10.400 That was semi-okay with the Bell System.
00:15:15.040 I mean, it was a lot better when it wasn't.
00:15:16.600 Yeah.
00:15:16.740 But it's not this.
00:15:18.580 This is NSA.
00:15:20.420 Yeah.
00:15:20.680 Okay.
00:15:23.620 So I don't want to tell them how to run.
00:15:26.820 So are you saying to me, if they said, yeah, yeah, we put our stuff first, you'd be okay
00:15:34.440 with it?
00:15:35.280 On the commercial search, yes, I would be.
00:15:37.700 Although the problem is that the internet is essentially, to the extent that it's regulated,
00:15:43.760 is based on the Communications Decency Act of 96.
00:15:47.920 And as we've talked about, Glenn, and as you know very well, if you are a neutral platform,
00:15:53.520 as Google and Facebook insist they are, you are essentially saying you're a telegraph
00:15:58.320 and you're simply relaying information from point A to point B.
00:16:01.580 And the law says, great, do that.
00:16:03.820 You have no regulation whatsoever.
00:16:06.080 And so Google and Facebook have operated under that platform.
00:16:09.060 The other part of the Communications Decency Act is if you are editing content, if you
00:16:15.860 are sifting content, if you are engaging in editorial control, you are now a publisher
00:16:21.620 and you will be treated in some respects as a media company.
00:16:25.840 Exactly, legally.
00:16:26.940 And so my point is Google and Facebook should have to choose.
00:16:31.600 They should not be able to say we're a neutral platform and yet we are going to exert editorial
00:16:37.660 control.
00:16:38.060 And they've admitted that they exert editorial control.
00:16:41.620 And the reason why they are getting it both ways is because of the lobbying money.
00:16:47.260 Yes, that's exactly right.
00:16:48.480 They lobbied one way, then they lobbied the other way, and they got both.
00:16:52.380 Yes, exactly.
00:16:53.300 And it's absolutely wrong.
00:16:54.320 Yeah.
00:16:54.720 And it should not, by the way, give us any comfort.
00:16:56.800 I mean, you were right in the interview with Wired magazine.
00:16:59.700 It should not give us comfort that prominent conservatives or Republicans have decided to
00:17:06.080 take a big paycheck from Google, that that somehow is going to fix everything.
00:17:09.880 That's not going to fix anything.
00:17:11.100 Because as we started, people are the problem.
00:17:15.040 Yeah.
00:17:15.140 So let me go back, and when you say they are shaping us and they're pushing us a certain
00:17:23.560 way, first explain that.
00:17:26.940 How is that?
00:17:27.960 And I want specifically, where are they pushing us and how do you know they're pushing us?
00:17:34.920 Right.
00:17:35.200 Okay.
00:17:35.380 Well, one of the people that we feature in the film is a guy who is a Google ethicist,
00:17:41.320 and that's actually a title at Google.
00:17:43.420 And Tristan talks.
00:17:45.960 Tristan Harris?
00:17:46.700 Yes.
00:17:47.160 Talks very, very openly about the fact that his job was to nudge people towards certain
00:17:53.960 directions in certain areas.
00:17:55.700 And they were trying to figure out how to do that, quote unquote, ethically.
00:17:59.500 Now, you know, my view on this is pretty clear.
00:18:01.920 If you're trying to influence somebody, it should be out in the open.
00:18:06.460 It should be direct.
00:18:07.340 It's the only ethical way.
00:18:08.860 Exactly.
00:18:09.460 And so I don't think there's an ethical way to do it.
00:18:11.860 But he's...
00:18:12.740 In the open.
00:18:14.020 Right.
00:18:14.340 It is.
00:18:14.880 Exactly.
00:18:15.380 Because then I'm choosing, and I know you're coming at me with an angle.
00:18:18.720 Yes.
00:18:18.760 But if you don't announce it, there is no ethical way.
00:18:21.420 Yes.
00:18:21.760 Exactly.
00:18:22.340 And, you know, it's interesting.
00:18:23.800 You go back in history, you remember all the concern that was raised about the subliminal
00:18:29.160 advertising, you know, that you'd see, you know,
00:18:31.740 a Coca-Cola drink and there's actually sex written in the ice cubes or something like
00:18:36.220 that.
00:18:36.760 And, you know, there are all kinds of disputes about how effective that was.
00:18:39.820 Well, that was restricted in the United States because it was deemed to be deceptive.
00:18:44.860 You know, you're trying to accomplish something here that you should be above board about.
00:18:48.840 And that's what I'm saying that Google should do.
00:18:50.800 So, you know, the issue of them nudging or trying to influence us, Tristan Harris has
00:18:55.740 talked about that.
00:18:56.600 Eric Schmidt, the former CEO of Google, has said that part of Google's purpose is to shape the values
00:19:03.140 of Americans.
00:19:05.240 So they don't view themselves just as we're trying to sell advertising and we're trying
00:19:10.000 to make money.
00:19:10.720 They have a much more ambitious agenda that goes along with this.
00:19:15.700 And it's tied up with their worldview.
00:19:17.500 It's tied up with the sort of the ethos of Burning Man.
00:19:20.620 It's tied up with sort of the Silicon Valley ideal of what they view as wrong with America
00:19:25.940 and what they view as wrong with American society.
00:19:28.820 So I don't want to get into, I think for the sake of intellectual dialogue, I want to take
00:19:41.360 the position that it's totally cool to say, don't be evil.
00:19:48.840 Yeah.
00:19:49.520 Totally cool.
00:19:50.240 The problem is when you say they're nudging and they're pushing and their ethos is don't
00:20:00.300 be evil, could we define evil?
00:20:04.260 Could we define evil?
00:20:06.680 You know, when it's hate speech, could we define hate speech, please?
00:20:12.240 Because I've seen speech that is very hateful that others are totally fine with.
00:20:18.240 I have seen things. For instance, let's define evil.
00:20:23.680 Some people would say that it is evil to take away the right to your own body as a woman.
00:20:33.260 And others would say, no, it's evil to kill a child in the womb.
00:20:39.300 Right.
00:20:39.480 Two valid arguments.
00:20:42.240 One, you can choose whichever one you want.
00:20:45.040 I've chosen one.
00:20:46.140 Yeah.
00:20:46.380 But both sides will say you're evil for believing that.
00:20:50.540 Right.
00:20:50.840 So, without the definition, we don't know what it is, and it can always change.
00:20:59.580 Right.
00:21:00.060 And I think you're exactly right.
00:21:02.100 And even to add to that further, here's the other fundamental problem.
00:21:05.620 I'm always very concerned when people throw around words like evil.
00:21:11.840 There is evil in the world, and we do need to define it.
00:21:14.720 But here's the problem.
00:21:16.060 When a company says, don't be evil, or when an activist says, I am fighting evil, if they
00:21:24.240 have not really thought through what they're saying, what does that do, Glenn?
00:21:28.180 It opens up the possibility to do anything.
00:21:31.600 I mean, the ends justify the means.
00:21:33.020 I'm fighting evil.
00:21:34.200 I'm halting evil.
00:21:35.900 So, deception is now okay.
00:21:38.260 Dishonesty is now okay.
00:21:40.020 Harassment is now okay.
00:21:41.460 Manipulation is now okay.
00:21:42.460 Because I'm fighting evil.
00:21:44.280 So, for a company to say, you know, don't be evil, well, first of all, you never defined it.
00:21:50.240 And second of all, it creates, I think, kind of this mindset of, we're safe now.
00:21:55.880 I mean, our corporate model says, don't be evil.
00:21:57.580 So, we're obviously not evil.
00:21:59.220 And we're fighting all these terrible things out here.
00:22:01.620 And we're justified in, you know, and you see that in some of the emails that have come
00:22:06.460 out that have been leaked in these discussions that Google engineers had about, you know,
00:22:11.700 Trump's immigration policy, regardless of what one thinks about that.
00:22:15.220 But these Google engineers are actually saying, we should fiddle with the algorithm because
00:22:20.440 this is horrible and we need to stop it.
00:22:22.940 That's the kind of mindset that gets adopted when you plant yourself on the grounds of,
00:22:28.960 we're not evil, we're fighting evil.
00:22:35.220 The biggest thing that made me successful is the one thing that I have lost and I am so
00:22:44.880 grateful that I have lost it.
00:22:46.480 I want to take you to a controversial meeting I had with Mark Zuckerberg at Facebook.
00:23:06.180 They had invited a whole bunch of voices and I sat right across the table from Mark and others
00:23:14.880 were talking and I watched him and I watched him closely.
00:23:21.020 And I tend, when I go wrong, I tend to go wrong believing the best in somebody.
00:23:28.640 Right.
00:23:28.740 So maybe I'm wrong.
00:23:30.600 But I really watched him and when he spoke many times, perhaps because I was just right
00:23:36.980 across the table, he was speaking to me and he was looking me right in the eye and he said
00:23:43.200 to me, Glenn, why would we want to do this?
00:23:48.300 First of all, we'd wipe out 50%.
00:23:52.540 If I'm just pushing politics, I'm going to wipe out 50% of my base.
00:23:57.800 That's stupid.
00:23:58.960 For us to sit here and say, that's good.
00:24:03.280 That's not.
00:24:04.480 That's hate speech.
00:24:05.580 That's not.
00:24:06.940 He said, in different parts of the world, we don't know what is good and bad.
00:24:13.920 The C word in America is horrible.
00:24:18.940 Yeah.
00:24:19.460 In Great Britain, not so much.
00:24:21.840 Okay.
00:24:22.360 Right.
00:24:22.640 It means something different.
00:24:24.300 Yeah.
00:24:24.760 Yeah.
00:24:24.900 So his point was, we are not the policemen of the world.
00:24:29.380 It's impossible to be policemen of the world.
00:24:32.060 Right.
00:24:32.560 And I believe that in him.
00:24:34.580 I really do.
00:24:35.320 I believe he believes that.
00:24:37.320 Mm-hmm.
00:24:38.100 But I also believe that Silicon Valley, by nature, is not diverse.
00:24:47.560 It may be diverse in color.
00:24:50.540 It may be diverse in sexual lifestyle.
00:24:54.440 But it is not diverse in political thought.
00:24:58.720 Right.
00:24:58.940 And when you're in the bubble, and we've explained the bubble that we're in, where they're sifting
00:25:09.260 and we're getting the stuff we want to read, and we're living in that bubble, that Facebook
00:25:15.120 bubble, well, they're living in a Facebook bubble of their own.
00:25:18.340 So my question to you is, do you believe that Facebook, Google, are intentionally going after
00:25:31.380 and editing?
00:25:33.160 I mean, en masse.
00:25:34.440 I believe there are people that do this.
00:25:36.480 Right.
00:25:36.680 But en masse.
00:25:37.520 Are they intentionally doing this, or are they just so isolated in their own bubble and surrounded
00:25:44.780 by people that are absolutely certain they're on the side of good, and they don't see what
00:25:53.280 they're doing?
00:25:54.880 That's a great question.
00:25:56.200 And any time you talk about intention or trying to look into somebody's, you know, soul, as
00:26:02.040 it were, I mean, it's very, very difficult to do.
00:26:04.100 I think what's striking to me about Facebook and Zuckerberg is, you know, when he sat
00:26:09.160 before the Senate, Ted Cruz and others asked him some very tough questions.
00:26:13.120 He acknowledged that Silicon Valley was a very, very liberal place, but he also didn't really
00:26:18.780 have a good answer.
00:26:20.020 You know, when Ted Cruz asked him and said, look, this has happened to a Republican candidate
00:26:23.880 here and here and here.
00:26:25.460 Do you know of any example where this has happened to a Democrat candidate?
00:26:29.860 And he said, no.
00:26:30.600 And he said, this has happened to this pro-life group, to that pro-life group, to another pro-life
00:26:34.960 group.
00:26:35.480 Do you know of any abortion groups that have had these issues?
00:26:39.480 He did not.
00:26:40.240 So part of it is the proof's in the pudding, right?
00:26:42.760 I mean, why is it that one side seems to overwhelmingly face these issues?
00:26:48.780 You know, I'm not talking about, you know, somebody who's got a blog site who's, you
00:26:52.940 know, kind of out there.
00:26:54.040 I'm talking about very substantial, institutional, you know, good, reasonable, well-thought-out
00:26:59.400 voices.
00:27:00.200 Go to the Harvard expert that did this research on the last presidential campaign.
00:27:05.960 Yes, exactly.
00:27:06.880 He's a Clinton supporter.
00:27:07.980 That's exactly right.
00:27:08.960 Robert Epstein.
00:27:09.600 And what they essentially did, Glenn, is they took, they had 2,000 monitors around the
00:27:15.020 country.
00:27:15.460 They were in red states, blue states, Republicans, Democrats, gay, straight, Catholic, agnostic.
00:27:21.900 I mean, you name it.
00:27:22.980 They had the gamut of it.
00:27:24.180 And what they essentially did was said, we want you to do searches through the six-month
00:27:28.560 period before the 2016 election.
00:27:30.400 And we are going to capture every single search result that you get.
00:27:35.340 We're going to have you search on Bing, and we're going to have you search on Google to
00:27:39.260 see what kind of results you get on political topics.
00:27:41.940 What he found in the research when they cataloged all of it was that in all 10 of the search
00:27:46.920 positions on Google, you saw a clear bias in favor of Hillary Clinton, meaning that they
00:27:52.860 were suppressing negative stories about Hillary Clinton and pushing positive stories by Donald
00:27:58.640 Trump.
00:27:59.040 They did not find that problem with Bing.
00:28:01.480 You mean pushing negative stories?
00:28:02.740 Yeah, sorry.
00:28:03.140 Negative stories about Trump.
00:28:05.240 And they did not run into that problem with Bing, which is a Microsoft product.
00:28:09.720 And they found this stunning, because what you should find, what Google will tell you is
00:28:15.480 the algorithm is sort of, in a sense, conditioned individually for the
00:28:20.940 user.
00:28:21.400 So, you know, Google knows Peter Schweizer very well.
00:28:24.180 When I search in a certain subject, it's going to give me answers or responses that are
00:28:30.400 tailored to my search history and to the best results they see.
00:28:33.820 If you were, I mean, I shouldn't have gotten any results, I didn't like either side, but
00:28:38.880 let's say if you were a Donald Trump supporter, you should have seen in your search
00:28:44.620 results more positives about Donald Trump?
00:28:47.280 Yes, yes, you should have.
00:28:48.560 And you should have seen for news sources that you're used to clicking on.
00:28:52.320 That's part of what they say.
00:28:54.180 So the fact that they found this uniform, consistent bias in favor of Hillary among all 10 search
00:29:00.960 positions among the 2,000 people was quite astonishing.
00:29:05.360 And there's really no disputing that.
00:29:07.520 Google does not dispute Epstein's results.
00:29:11.240 They just say, well, no, this is just organic search.
00:29:14.100 It's sort of a circular argument.
00:29:15.960 We have organic search.
00:29:17.120 How do we know we have organic search?
00:29:18.820 Because we have organic search.
00:29:20.300 I did a show recently on the difference between disinformation and misinformation.
00:29:31.500 Fascinating topic.
00:29:32.580 Right.
00:29:32.940 Yeah.
00:29:33.660 Can you explain the difference between the two?
00:29:35.580 I mean, disinformation is when it is planted into what you think is a credible
00:29:45.300 source.
00:29:46.220 Right.
00:29:46.540 If it comes from Pravda, you're like, oh, it's Pravda.
00:29:50.140 Right.
00:29:50.460 But if it comes from the New York Times.
00:29:52.900 Right.
00:29:53.640 Then it's disinformation.
00:29:55.520 Mm-hmm.
00:29:57.560 In a way, Google and Facebook are engaged, knowingly or not, in a disinformation campaign.
00:30:09.600 Yes.
00:30:09.980 I think that you could certainly classify it that way.
00:30:12.380 And I think the issue becomes why are they doing this and how are they doing
00:30:18.320 this?
00:30:18.540 You asked earlier about Zuckerberg and your interaction with him.
00:30:21.900 I'm not suggesting that Mark Zuckerberg is sitting
00:30:26.040 around saying, how can we deal with conservatives on Facebook?
00:30:29.300 I don't think so either.
00:30:29.860 But the problem is there are lots of people employed by Facebook who are maybe in
00:30:35.940 their mid-twenties.
00:30:37.300 Maybe they were woke, you know, on campus, and they are now involved in the news feed.
00:30:43.620 I mean, we had a, you know, a Facebook employee who was involved with their trending section
00:30:48.600 that came out a couple of years ago who said, oh yeah, we sunk conservative stories and we
00:30:53.020 boosted liberal stories.
00:30:54.240 So I don't think it's a question of the executives of these companies, you know,
00:31:00.740 in kind of a James Bond villain moment saying, here's how we're going to rule the universe.
00:31:05.700 I think it's a question of, they've created these powerful companies and they've created
00:31:10.100 a culture within these companies that for all the talk of tolerance is actually very intolerant
00:31:16.680 and it reflects in the product that they are producing.
00:31:19.840 So did you touch on it at all?
00:31:24.080 I have a document from a meeting with Media Matters on inauguration day.
00:31:31.680 It was a meeting that happened in Florida with far-left donors and Media Matters, not on
00:31:37.880 election day, on inauguration day.
00:31:41.900 And they said, here's where we went wrong and here's what we're going to do.
00:31:47.320 And it talks in that, in that document about how they are going to go to Facebook and Google
00:31:57.500 and they are going to advise.
00:32:00.540 Right.
00:32:01.100 Okay.
00:32:01.400 Right.
00:32:01.980 And in that document, it says, we have already been given access to all of the raw data in
00:32:12.720 real time.
00:32:14.480 Yeah.
00:32:14.760 Peter, I don't think they'd give that to me or any organization that I know of.
00:32:22.040 No, they would not.
00:32:23.300 They would not.
00:32:23.840 And this, and this is the problem.
00:32:25.240 I mean, the problem is the way that they are trying to deal with this is they're like,
00:32:29.320 you know, we're being criticized by conservatives.
00:32:31.760 So we'll go meet with conservatives.
00:32:33.380 I'm not saying that's a bad thing, by the way.
00:32:35.480 I think that's a good thing, but that's not really the issue.
00:32:39.160 The issue is not, you know, uh, saying nice words.
00:32:41.940 The fundamental issue comes down to, you know, what is this company doing?
00:32:46.360 And, and the whole debate now that's arisen about fake news, I think is a huge problem
00:32:52.480 because it's allowing essentially these liberal groups like the Southern Poverty Law Center
00:32:57.380 and Media Matters to essentially say to Facebook and Google, no, no, no.
00:33:01.880 We want you to engage in more censorship.
00:33:04.700 We want you to classify.
00:33:05.820 And Facebook could respond, well, but we have others.
00:33:10.700 We have, I don't remember if it's the Heritage Foundation, but we have others that
00:33:14.620 are doing it on the right.
00:33:16.200 I don't want that from either side.
00:33:18.420 That's exactly right.
00:33:18.940 I don't want either side shut up.
00:33:20.480 Yeah.
00:33:20.700 The problem that develops is that nasty word cronyism, and cronyism is
00:33:27.000 a problem where you give concentrated power or you give special access or favors to certain
00:33:33.000 people, and invariably it's going to be misused.
00:33:35.820 And this is really the question, I guess, Glenn: do Facebook and Google so much
00:33:43.180 distrust the American people that they believe the American people are incapable of looking
00:33:49.420 at a news story and saying, I'm not buying that?
00:33:54.380 They don't have confidence in the American people to do that.
00:33:58.660 They feel like they have to somehow be the arbiters, and they don't.
00:34:04.780 So here, let's be clear.
00:34:07.520 A new study just came out.
00:34:09.880 Goldfish have an attention span of nine seconds.
00:34:17.160 Americans have seven seconds.
00:34:20.800 Okay.
00:34:21.200 So let's be very clear.
00:34:24.000 We're, we're not doing our job.
00:34:26.060 Right.
00:34:26.280 Okay.
00:34:26.660 Right.
00:34:27.000 Um, and that has changed dramatically because of Facebook and all of the interaction that
00:34:32.780 we do.
00:34:33.180 Yeah.
00:34:33.520 However, because I was just asked this question, um, well, don't they have a responsibility?
00:34:40.380 Shouldn't they be?
00:34:41.340 No, they have a responsibility to be transparent and be a platform.
00:34:46.300 Correct.
00:34:46.720 A platform.
00:34:47.420 Correct.
00:34:47.680 I don't believe that you should censor anyone on a platform.
00:34:53.180 Right.
00:34:53.260 It's the battlefield of ideas to say that now, what people will say is, well, that's crazy
00:35:01.300 because there's a lot of crazy people.
00:35:03.200 Yeah, there are, there are.
00:35:04.800 Yeah.
00:35:05.460 Thomas Jefferson said, believe the people, trust the people.
00:35:09.940 Right.
00:35:10.300 Okay.
00:35:11.200 The key to that sentence was the comma:
00:35:15.020 they will usually get it wrong, but eventually they'll get it right. Right, right.
00:35:23.220 Exactly.
00:35:23.760 So we're going through this period right now.
00:35:26.160 The worst thing we can do is put a babysitter on top of us forever.
00:35:31.520 Yes.
00:35:31.900 We have to learn.
00:35:33.860 Fire is hot.
00:35:35.840 Yes.
00:35:36.060 No, you're exactly right.
00:35:37.480 And this is further evidence that I think they don't really understand the dynamics at work in the country today.
00:35:45.320 The dynamic at work in the country today is a rejection of sort of this elite view of how society should be organized.
00:35:53.560 It's one of the reasons why you have in financial markets, conservatives, people on the left don't trust the large banks.
00:36:01.800 They don't trust Wall Street.
00:36:03.040 It's a rejection of that.
00:36:04.340 It's the same reason conservatives, liberals, independents have a distrust of Washington, D.C.
00:36:10.960 It's not because they want tax policy to be slightly different.
00:36:14.280 It's they don't fundamentally trust them to reflect their interests and to look out for them.
00:36:19.980 And they also know that the elites generally look down upon them.
00:36:23.740 So, you know, that my challenge to Silicon Valley is for all their talk of egalitarianism, for all their talk about we love democracy and everybody having a voice.
00:36:34.520 Do you really?
00:36:36.060 Do you really?
00:36:36.860 I mean, the point is, we all remember as a kid, I grew up outside of Seattle, Washington.
00:36:41.160 And I remember going down to a place called Pioneer Square.
00:36:43.940 You probably went there, too.
00:36:45.240 Yeah.
00:36:45.380 There were all kinds of people wandering around saying strange things.
00:36:48.920 Well, those people today may have blog sites and they're going to say some crazy stuff.
00:36:54.100 I didn't pay a lot of attention back then.
00:36:56.420 I'm not paying a lot of attention now.
00:36:58.140 And I have enough trust that most people aren't going to pay a lot of attention to them.
00:37:03.820 And that's, I think, what we have to embrace, because otherwise we are going to have intellectual policemen that are trying to tell people, here's what you should think.
00:37:13.500 Here's what you should not think.
00:37:14.780 Not only that, but please don't even look in this direction.
00:37:18.060 You can't even look in this direction.
00:37:19.540 If you look in this direction, it might somehow infect you.
00:37:22.980 It's ridiculous.
00:37:24.000 The battlefield of ideas is such that the best ideas win.
00:37:27.660 And I happen to believe that the ideas of the American founding were the best ideas.
00:37:32.440 And they are going to win.
00:37:34.220 And we ought to be confident in that.
00:37:35.720 And the kind of monitoring and everything else, honestly, goes against almost every single article in the Bill of Rights.
00:37:47.280 That's right.
00:37:47.700 Almost every single one is violated.
00:37:50.220 Now, it's not violated by the government.
00:37:52.920 Right.
00:37:53.160 But it is the same principle, especially the bigger they get.
00:37:57.500 Yes.
00:37:57.800 And that, by the way, goes on to what you were saying earlier about Huxley and Orwell.
00:38:03.440 You know, the traditional view is the government was going to use technology to control our lives.
00:38:10.440 It's really corporations.
00:38:11.920 I've always made fun of, you know, Blade Runner, the corporation.
00:38:19.760 Please shut up about the corporation.
00:38:21.720 It's the government.
00:38:22.460 No.
00:38:23.260 Right.
00:38:23.840 No.
00:38:24.360 We are now entering the time where the liberal concern about corporations is actually accurate now.
00:38:32.660 Right.
00:38:33.120 You know?
00:38:33.660 Yeah.
00:38:33.880 And it's weird that they're so in love with Apple and Google because these are the guys you've been warning us about.
00:38:40.600 Yeah.
00:38:40.660 You know?
00:38:41.220 Yeah.
00:38:42.240 So let me take you kind of to that Orwell place.
00:38:48.960 But first, explain, Gmail is free.
00:38:56.020 Google searches are free.
00:38:57.820 Right.
00:38:58.180 Right.
00:38:58.640 They are free, but they come at a high price.
00:39:01.460 No, I'm not paying anything.
00:39:03.080 Well, you're not paying anything in terms of monetary.
00:39:05.600 That's true.
00:39:06.240 They're free.
00:39:07.140 But the question is, what is going on?
00:39:09.860 Because all these servers, all this capacity is expensive.
00:39:14.460 So what Google is doing is they have a product here.
00:39:17.640 You're not buying it.
00:39:18.460 You are the product.
00:39:19.840 They're selling you.
00:39:20.600 You're selling you.
00:39:21.480 And they're selling all kinds of secrets about you.
00:39:23.720 And Gmail is a perfect example of this.
00:39:25.540 I used Gmail up until I started on this project.
00:39:28.780 And now I don't use Gmail anymore.
00:39:30.880 And what people have to realize about Gmail is they're scanning every email that comes in.
00:39:36.100 They're scanning it.
00:39:37.040 They know what's in it.
00:39:37.880 They are scanning every email that you send out.
00:39:42.000 And if you draft an email, you know, you're upset with your cousin about something.
00:39:47.900 You had a, you know, debate over Thanksgiving and you thought they were rude.
00:39:51.000 And you said, you know, cousin Chris, I think you're rude and you're terrible and you're this
00:39:54.820 and that.
00:39:55.200 And you say, you know what?
00:39:56.140 That's really kind of nasty.
00:39:57.320 I shouldn't send it.
00:39:58.400 And that draft, they're scanning that draft.
00:40:00.460 I want to make it clear.
00:40:02.220 You're not saying the draft that you save and put into drafts.
00:40:06.180 Correct.
00:40:06.540 It's the keystrokes.
00:40:07.960 It's recording the keystrokes.
00:40:09.700 That's correct.
00:40:10.060 Even if you delete all of it, it's still there.
00:40:13.020 Yeah.
00:40:13.540 All right.
00:40:14.660 What's important here is, again, to distinguish.
00:40:18.300 When you say they're scanning, it doesn't mean they're reading it.
00:40:22.700 Correct.
00:40:23.000 Okay.
00:40:23.740 And why are they scanning it?
00:40:25.700 Well, they're scanning it because let's say you send an email to your friend.
00:40:30.780 Golly, I'm really tired of work.
00:40:32.440 I'd sure love to be on a beach in Mexico right now.
00:40:35.380 They're scanning it because they're scanning.
00:40:37.200 They're looking for beach in Mexico.
00:40:38.660 And you're going to probably see ads on your Google feed for apartments or condos in Mexico.
00:40:46.640 And lo and behold, someday you wake up in the morning and they say, Mr. Schweizer, I've
00:40:54.760 already booked two tickets.
00:40:56.900 That's right.
00:40:57.800 Would you like to go to Mexico today?
00:41:00.660 I know you're tired and you've been thinking about it.
00:41:03.280 Right.
00:41:03.820 That's right.
00:41:04.420 And that's where it's headed.
00:41:05.540 And again, there are certain amazing conveniences that come with this.
00:41:10.120 I mean, you know, you use Google Maps.
00:41:12.360 There are all sorts of great benefits to that, to Google search.
00:41:15.540 The thing that people have to keep in mind, though, is it's not a one-way street.
00:41:20.600 It's not just these wonderful, good things they're doing for you.
00:41:25.340 It's the capacity they are developing to do things to you.
00:41:29.420 So when I say that they're scanning your Gmails, it's not that there's a person sitting in Silicon
00:41:35.360 Valley saying, oh, look what Glenn just sent in Gmail.
00:41:38.380 Correct.
00:41:38.640 But they have the capacity to do that.
00:41:41.420 Yes.
00:41:41.780 And they have the capacity, if they don't like what you're doing, to shut you off from
00:41:46.060 Gmail.
00:41:46.680 And Dr. Jordan Peterson, we highlight him in the film.
00:41:50.240 That's exactly what happened to him.
00:41:52.160 He's a psychology professor at the University of Toronto.
00:41:55.020 And he took a position against compelled speech, where there was a debate in Toronto about an
00:42:01.600 ordinance that would require you to address somebody by their preferred gender.
00:42:06.360 Peterson's position was, I always address people by their preferred gender, but this
00:42:10.680 is compelled speech.
00:42:11.800 You should not force people to do this.
00:42:13.940 He took this public position.
00:42:15.780 The next day, Glenn, he was shut out of his Gmail account.
00:42:18.880 He was shut out of his YouTube account.
00:42:20.800 Everything Google owned was shut down.
00:42:23.000 Now, you would think, why is this going on?
00:42:26.240 I think probably what happened is somebody connected with Google, maybe mid-level, saw
00:43:32.520 this, you know, was maybe in favor of this policy position, and sort of in a juvenile way said,
00:42:38.280 I don't like this guy.
00:42:39.580 We're going to sort of cut him off.
00:42:41.700 But Jordan Peterson lost his Gmail.
00:42:43.820 He lost his Google calendar.
00:42:46.620 The point being, you rely on these products.
00:42:49.160 It's going to give them an enormous capacity over your life.
00:42:53.760 And if they choose to, sometimes in an arbitrary way, they may just shut you out because they don't
00:42:59.620 like a position that you've taken.
00:43:01.180 And the problem is Google does not have a customer service department you can call to say,
00:43:06.500 why did this happen?
00:43:07.340 They have no customer service department.
00:43:09.000 And they make clear, we can choose to do this to you anytime we want.
00:43:12.540 So I've been asking the question of everybody who I think
00:43:38.000 is paying attention to Silicon Valley or is involved in Silicon Valley.
00:43:46.120 The answer comes back exactly the same way every time.
00:43:50.480 Because it's been a plea of mine.
00:43:54.820 I'll say to them, this might sound crazy.
00:44:01.440 However, politicians are politicians.
00:44:05.580 Economies are economies.
00:44:06.920 They usually repeat the same mistakes, okay?
00:44:12.020 Right.
00:44:12.880 We are at a, we're at an economy that I don't care who's in office.
00:44:17.900 At some point, it's going to crash.
00:44:20.340 It always does.
00:44:21.840 Yeah.
00:44:22.140 We are going to feel real pain.
00:44:24.380 And the longer this one goes, the deeper the pain is going to be.
00:44:28.860 We have politicians that tell us, well, I'm going to bring those jobs back.
00:44:34.220 Okay.
00:44:36.080 Well, you have people in Silicon Valley right now that are not celebrating a four point whatever
00:44:43.000 unemployment rate because their entire job is to figure out how do we have a 100% unemployment rate?
00:44:53.920 Yeah.
00:44:54.120 Because that's the world of the future as they see it.
00:44:56.900 Right.
00:44:57.580 But no one is talking to people about this.
00:45:01.540 Bain Capital said about eight months ago that, by 2030, the United States of America will have a permanent 30% unemployment rate.
00:45:13.020 And it will only go up from there because of the things Silicon Valley is doing.
00:45:20.660 So here's the scenario.
00:45:24.580 People start to lose jobs.
00:45:26.200 This starts to kick in around 2020.
00:45:28.780 People start to lose jobs.
00:45:30.300 They're not coming back.
00:45:32.200 The politicians have to blame somebody.
00:45:34.020 We're going to, you know, I'm going to bring those jobs back.
00:45:39.080 I'm going to bring those jobs back.
00:45:40.420 At some point, the people say, no, those jobs aren't coming back.
00:45:44.480 They have to have another story.
00:45:46.400 It's them.
00:45:47.640 It's the people in Silicon Valley that are taking your livelihood away.
00:45:52.460 They have manipulated you.
00:45:57.020 They have, and it is torches to Silicon Valley.
00:46:00.960 Unless the politician says, how about we work together?
00:46:11.020 That's when Orwell happens.
00:46:14.680 Yeah.
00:46:15.260 Everyone I have said, does that sound crazy?
00:46:19.680 Let's see if they respond the way you respond.
00:46:23.320 Does that sound crazy?
00:46:24.300 No, it sounds very realistic.
00:46:25.940 And if I were a titan of Silicon Valley with sort of their worldview, that's precisely what I would do.
00:46:32.760 And if I was part of what I call the permanent political class in Washington, Republicans or Democrats, doesn't make all that much difference.
00:46:39.440 That's exactly what I would do.
00:46:40.000 That's exactly what I would do.
00:46:41.280 And you will then have this, in a sense, unholy alliance between the political leadership and high tech.
00:46:47.440 And, you know, we know who's going to get the short end of the stick in when those two entities get together.
00:46:53.560 And that's going to be the American people.
00:46:55.760 I talked to a guy who was just in Beijing.
00:46:58.980 He is high up in the ladder.
00:47:01.220 And he told me, there are three circles, three rings, okay?
00:47:11.340 The outside ring is kind of like American surveillance.
00:47:16.800 The second ring is British surveillance on steroids.
00:47:22.000 The new Sharp Eyes program is the center, 11 million people, okay?
00:47:29.280 They ran a test.
00:47:32.520 He was there, he saw it.
00:47:35.280 They put a guy out and said, go into the center ring, all right?
00:47:39.420 11 million people, you have two hours, hide.
00:47:43.160 They had him in the back of a squad car in eight minutes.
00:47:48.380 Remarkable.
00:47:49.320 Eight minutes.
00:47:50.380 Remarkable.
00:47:51.960 That's today's technology.
00:47:54.560 Yeah.
00:47:54.660 People have to understand, do not fear high tech.
00:48:01.640 Don't fear it.
00:48:03.400 Fear the goals of high tech.
00:48:06.260 Right.
00:48:06.880 And we don't know the goals, except don't be evil.
00:48:12.360 Right.
00:48:12.860 Right.
00:48:13.740 And you have this strange sense, this fascination that some American elites have with the Chinese model.
00:48:23.340 I mean, remember, it was just a few years ago, Thomas Friedman, the New York Times columnist and, you know, has got a lot of relationships in Silicon Valley, was very frustrated in 2009 and 2010 when Barack Obama was president and actually held up the Chinese model and said, you know, the Chinese at least can deal with these big issues of the day.
00:48:42.720 The concern is the exact same thing that progressives said about Mussolini before things went bad.
00:48:51.120 Yeah.
00:48:51.320 And this is the concern: the sort of traditional American notion, you know, that the Constitution was predicated on the idea that you can change the Constitution, add amendments, but it's going to be tough.
00:49:05.060 We don't want radical shifts and radical change.
00:49:08.020 Is that very American notion that has been so central to America's development?
00:49:14.040 Is that really deeply accepted or embedded by a tech culture in Silicon Valley?
00:49:19.780 I'm sure there are some people that do, but I think a lot of them don't.
00:49:22.720 A lot of them are impatient.
00:49:24.620 They made their money quickly.
00:49:26.300 They were billionaires by the time they were 30 years old.
00:49:29.100 They built these massive corporations over the course of a decade.
00:49:32.960 They are impatient people who don't have a lot of experience outside of the world in which they operate.
00:49:40.960 So they become naturally impatient when you have things like, you know, checks and balances and civil rights.
00:49:47.900 So, you know, when it came out that, you know, Google, for example, had rejected working with the Pentagon on a contract, which is certainly their prerogative.
00:49:56.940 That's fine. But then said, we are going to work with the Chinese government.
00:50:02.140 It's shocking in a lot of senses, but it's not in others, because if you want to develop this sort of new technology this way, do it in a country that doesn't have civil rights.
00:50:15.900 There are very few constraints.
00:50:17.820 The government will let you do what they want you to do.
00:50:20.980 You know, if you do a similar surveillance project in the United States, it's messy.
00:50:24.780 You know, you've got courts to worry about and you've got congressional committees.
00:50:28.400 So this fascination that some tech giants have with China and with the sort of Chinese model and way is very, very disturbing.
00:50:40.160 China is building the technology that Hitler only dreamed of.
00:50:48.640 Yeah.
00:50:50.740 The concentration camps that are being built right now in China are terrifying.
00:50:56.700 The Sharp Eyes program that President Xi has put in, it's like an episode of Black Mirror, where you're graded on your social interaction.
00:51:11.280 Right.
00:51:11.660 Who you follow, what you write, who you call, where you shop, all of it.
00:51:15.540 Yeah.
00:51:16.320 And you will lose your house.
00:51:18.420 Your kids won't be able to go to school.
00:51:20.220 I mean, it is terrifying, and it really said a lot to me. Now, I don't want Google in bed with the government and the Pentagon, please.
00:51:32.080 No.
00:51:32.460 Yeah.
00:51:33.760 But one person spoke out about that.
00:51:37.280 One person.
00:51:38.460 I'm sorry.
00:51:39.120 No, everybody at Google spoke out about that, about working with the government.
00:51:42.860 Only one said, I can't do this.
00:51:46.740 Yeah.
00:51:47.580 One.
00:51:48.260 Yeah.
00:51:48.960 Yeah.
00:51:49.200 No, it's that classic example of, you know, we're going to take this position, this moral position with the Pentagon, which you're certainly entitled to take, which is going to make us feel better.
00:52:03.160 But the moral decision that you should really be making, that we're not going to work with this oppressive police state halfway around the world,
00:52:10.960 you're not prepared to make that decision.
00:52:13.700 And I think that is an example of how as much as they, you know, sort of are clad in their T-shirts and they talk about their sort of progressive values, their willingness to work with repressive regimes really reveals the fact that either, A, they really clearly haven't thought this stuff through.
00:52:35.080 Or, B, their fascination, their sort of moral north star, is the collective.
00:52:42.780 Yes.
00:52:43.280 It's the collective.
00:52:44.060 And it's just the pursuit of technology in and of itself, that we're not going to attach any values, political values or intellectual values, to that.
00:52:55.920 It's simply pursuit of the technology.
00:52:59.160 Everyone who says they care about civil rights and the minorities in America, and how America was so bad,
00:53:05.800 I urge you to search what Han Chinese means.
00:53:13.260 And if you are not Han Chinese, and that's about 10 percent of the population,
00:53:18.680 you're going in the back door.
00:53:21.040 That's right.
00:53:21.320 You've got separate hotels.
00:53:23.520 You can't have certain jobs.
00:53:25.160 It is segregation, 1930s America.
00:53:31.320 It's all Jim Crow laws.
00:53:33.200 Right.
00:53:33.480 They don't seem to care.
00:53:34.580 Right.
00:53:34.900 And it is, you know, in a sense, I mean, a natural fusion between a police state and a system of surveillance and influence and control.
00:53:45.160 I mean, one of the things I talk about in the film is, you know, everybody always sort of brings up the Hitler example.
00:53:51.520 But, you know, Hitler, Mussolini, Mao, name them.
00:53:57.100 Stalin.
00:53:57.540 Stalin.
00:53:58.140 I mean, they would have loved to have the capacity and the power that we have essentially handed Google and Facebook voluntarily.
00:54:06.020 These companies have more information and more access than the KGB ever dreamt of having.
00:54:13.640 That's right.
00:54:14.340 Yeah, that's right.
00:54:15.120 And instead of, you know, 1984, where you'd have the speaker sort of blare the speeches at you.
00:54:21.380 It's sort of this over the top propaganda of Big Brother.
00:54:24.100 Now it's sort of embedded in this search, this wonderful looking search from Google where, you know, it's sort of secretly being manipulated in a manner that a lot of people don't appreciate.
00:54:34.440 That's enormous power.
00:54:35.780 It's hidden power, but it's enormous power.
00:54:39.460 You didn't touch on this in the movie.
00:54:41.580 But I can't think of anybody I'm comfortable with being the first to discover AI.
00:54:51.880 And I would love to hear your thoughts.
00:54:56.080 We need a Manhattan Project.
00:54:59.380 I mean, this is going to make the nuclear bomb seem like a firecracker.
00:55:07.280 This changes all of humanity.
00:55:09.860 Right.
00:55:11.200 Possibly forever.
00:55:12.400 It puts a cage around us.
00:55:14.620 Maybe. And it's not me saying it, it's Stephen Hawking, Bill Gates, Elon Musk.
00:55:21.180 It may kill all life on Earth.
00:55:25.680 It may.
00:55:26.380 Yeah.
00:55:28.420 We know Google's working on it.
00:55:30.420 We know China's working on it.
00:55:31.540 We know Russia's working on it.
00:55:33.520 I don't think our government has a clue as to what they're doing.
00:55:37.660 You know, there was an AI conference up in Cambridge, and the president's person came in and said, hey, well, we'll even let you use our NSA servers.
00:55:50.160 I don't need the NSA servers.
00:55:56.300 You know, that's like you can borrow my 1985 Chevy.
00:55:59.540 Yeah.
00:56:00.020 Yeah.
00:56:00.600 Yeah.
00:56:01.320 I mean, it's crazy.
00:56:02.900 Who do you feel comfortable with?
00:56:08.660 Anybody?
00:56:09.320 I mean, no.
00:56:10.320 I mean, I certainly don't consider myself an expert on AI per se.
00:56:14.160 But here's, I think, what we know from human history, and particularly recent human history.
00:56:20.120 Every great technological development has meant good things and bad things, and technological advances are essentially tools.
00:56:30.560 We think of them as tools because we control them.
00:56:33.540 AI is different by its nature.
00:56:36.060 If you are creating an intelligence system, you want to continue to regard it as a tool.
00:56:42.480 But the question becomes, you know, this tool is learning. And, you know, think about it this way.
00:56:49.820 You have, let's say, you know, a thousand robots that are operating and learning.
00:56:56.320 And let's assume that they only learn one new thing a day.
00:57:00.080 But Glenn, those thousand robots are all communicating.
00:57:03.220 So they're all actually learning a thousand new things a day and they're learning it the day after the day after the day after.
00:57:10.300 What does that compare to in terms of the advance in human knowledge?
00:57:14.980 Who is going to be more knowledgeable and superior going down the road in that kind of scenario?
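(A minimal sketch of the arithmetic being described here, assuming 1,000 robots that each learn one new fact per day and share everything with the fleet; illustrative Python only, not any real system.)

```python
# Illustrative only: compare an isolated learner with a fully sharing fleet.
# Assumptions: 1,000 robots, each learns 1 unique new fact per day, all facts are shared.

DAYS = 365
FLEET_SIZE = 1_000

isolated_knowledge = 0   # facts known by a single robot that never shares
pooled_knowledge = 0     # facts known by EVERY robot in a fleet that shares everything

for _ in range(DAYS):
    isolated_knowledge += 1          # one new fact per day on its own
    pooled_knowledge += FLEET_SIZE   # inherits the 1,000 new facts learned fleet-wide

print(f"After {DAYS} days: isolated robot knows {isolated_knowledge} facts; "
      f"each networked robot knows {pooled_knowledge}.")
# After 365 days: 365 facts versus 365,000 facts.
```

(The gap only widens with time, which is the point of the comparison to the advance of human knowledge.)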
00:57:22.020 And what we know from human history, you know, splitting the atom, great advances in biology:
00:57:29.140 with all of these developments, wonderful things come out that benefit human society, and other consequences that we never intended,
00:57:38.300 or that we thought we could control but that we can't, are part of it.
00:57:43.360 I have had military types tell me, Glenn, Glenn, the drones.
00:57:51.480 Yes, yes, it's facial recognition, but it requires a human to push that final red button.
00:57:59.720 For now.
00:58:00.600 For now.
00:58:01.540 Right.
00:58:02.300 We should not teach intelligent machines to kill.
00:58:07.580 Right.
00:58:08.500 It's a really bad idea.
00:58:11.180 Well, and the question becomes, not only should you not teach them to kill, but will they be able to learn themselves how to kill?
00:58:21.000 Yes, they will.
00:58:21.540 And that's, you know, the challenge here is they will tell you, no, we will put a barrier there.
00:58:27.340 You can't.
00:58:27.580 You can't.
00:58:28.300 So for people who don't understand AI: artificial intelligence is like what we kind of have now, where it can only do one thing.
00:58:40.360 Yes, Deep Blue can play chess and it can beat everybody on Earth at chess, but it's that one thing.
00:58:45.600 The next step is AGI, and that's artificial general intelligence.
00:58:50.500 That's like a human brain that is good at everything, okay, or many things.
00:58:58.100 ASI is artificial superintelligence.
00:59:00.600 Once it becomes AGI, it will quickly become smarter than us, because it doesn't forget, it learns, it's connected to everything, it's absorbing. You think the people at Google have power?
00:59:16.420 When Google is in charge and not the people, it really has power.
00:59:24.300 And people don't understand that.
00:59:27.560 I read a description: people will be to ASI as a fly on a plate in a kitchen is to the conversation the people there are having.
00:59:45.100 It won't care about you.
00:59:49.840 We're assigning it these things, that it should love people.
00:59:53.960 It should care for people.
00:59:55.920 Well, okay, we can teach it that, but it's going to be so smart.
01:00:01.600 We are going to be a nuisance.
01:00:04.280 Right.
01:00:04.720 And the question becomes, Glenn: you can say we're going to program it, or we're going to set up within it a capacity to like human beings.
01:00:14.200 But what's going to determine the ethics of these machines?
01:00:18.900 The ethics of these machines is going to be determined, at least initially, by the humans that are constructing them.
01:00:25.140 Yes.
01:00:25.440 But we don't know where that ultimately leads.
01:00:27.900 Is it going to create its own ethic?
01:00:30.200 You know, looking at the fact that, you know, these human beings are being destructive.
01:00:34.520 So we have a responsibility now to damage or kill those humans.
01:00:39.120 If there was a little teeny fly Moses, would we be listening and obeying the fly Moses Ten Commandments?
01:00:47.860 No.
01:00:48.480 No.
01:00:48.800 I mean, that was cute when I was little.
01:00:50.280 That was cute.
01:00:50.800 Okay, I got it.
01:00:51.460 I got it.
01:00:51.860 I got it.
01:00:52.420 Yeah.
01:00:52.700 All right.
01:00:53.380 Buzz off.
01:00:54.520 Yeah.
01:00:54.720 This is what my goals are.
01:00:58.880 And no one is thinking about what happens between here and there.
01:01:04.200 The earliest estimate, you know, Kurzweil's, which a lot of people think is wrong, is 2030.
01:01:10.280 But even if it doesn't happen by 2030, Bain Capital, who does this for a living, says we're going to have 30% unemployment.
01:01:17.620 Right.
01:01:18.260 So something bad is coming and no one is discussing it.
01:01:23.960 And no one is.
01:01:25.340 I was so pleased to see Tristan, that he was a source of yours.
01:01:30.700 I've talked to him several times.
01:01:32.280 He is one of the rare ones to walk out and say, I'm not having anything to do with this.
01:01:38.960 Right.
01:01:39.480 No, no.
01:01:39.980 It is very rare.
01:01:41.040 Because look, if you're in Silicon Valley, you're living a very comfortable life now.
01:01:45.620 You're making lots of money.
01:01:46.780 You're involved in these creative processes.
01:01:49.460 And I think the challenge with AI is that government is going to be constantly behind.
01:01:58.400 I mean, look, if the prediction is that this happens in 2030, we can expect congressional hearings in 2031.
01:02:05.420 Right.
01:02:05.700 I mean, that's usually how this works.
01:02:07.700 The capacity of government to deal with this is limited.
01:02:11.480 And the pace of change is so great that if we're expecting government to sort of figure this out and manage this, I think we're making a big mistake.
01:02:21.240 And then the question becomes within Silicon Valley, what constraints are there on people that are making these decisions?
01:02:28.540 And there are not very many. You know, there's a reason, I think, sort of deeply embedded in human psychology, that we're concerned about technology.
01:02:39.980 And I'm not talking about Luddites who reject it in general, but there's a reason if you go back to, you know, the Superman comics, that the threat is always sort of the madman who is misusing technology in a way to damage people.
01:02:52.820 I think people understand in terms of wisdom that we need to see technology as a benefit, but also as a potential threat.
01:03:01.940 And my concern is that in Silicon Valley, there seems to be far more interest in intelligence as an issue rather than wisdom.
01:03:10.440 And wisdom teaches us that historically, with these machines and these tools, the smarter they get, they can do something for us, but they can also do something to us.
01:03:20.040 Yeah, but you can't stop that.
01:03:22.020 You're never going to put this back. AI is coming.
01:03:24.380 Right.
01:03:24.720 It's coming.
01:03:25.280 Right.
01:03:25.920 And I am thrilled.
01:03:27.460 I am excited about the next 10 years.
01:03:30.400 I am not excited about, I wish I could watch it as a movie.
01:03:34.440 Yeah.
01:03:34.820 Because society is going to freak out.
01:03:38.040 Yeah.
01:03:38.320 But I'm excited about all of the potentials.
01:03:41.480 Yeah.
01:03:42.920 I'm also very concerned, but you can't put this back in the bottle.
01:03:49.340 Exactly.
01:03:49.660 It's going to happen.
01:03:51.540 And I don't know who I want to be the one who discovers this because I don't trust any.
01:04:00.440 Right.
01:04:01.080 Anybody.
01:04:01.960 Yeah.
01:04:02.100 With this kind of power. I mean, in the movie Frankenstein, he created a powerful, big, strong being.
01:04:14.060 Okay.
01:04:14.720 Yeah.
01:04:15.500 This is creating a God.
01:04:17.260 Mm-hmm.
01:04:18.820 Yeah.
01:04:20.320 Something that's as close to being omnipotent as we can sort of...
01:04:24.040 As we can get.
01:04:25.040 In the, yes.
01:04:26.200 In human creation.
01:04:27.320 No, that, that's exactly right.
01:04:28.500 And the problem is compounded by the fact that this is, in a sense, a global arms race, as it were.
01:04:35.480 Because even assuming, by some miracle, that we in the United States come up with some sort of constructive constraint on how this is going to be developed in a responsible way.
01:04:45.920 This is going on in China.
01:04:46.920 This is going on in China, where really it's about state power and state rule and what government wants.
01:04:52.300 This is going on in Russia. This is a global trend that sort of transcends the United States, so we can be in a situation where, even in the most optimistic view, we somehow create some kind of code to determine how we are going to create AI
01:05:11.880 and what role it's going to play in American society.
01:05:14.000 But, of course, that stops at the border's edge and you can't really enforce it.
01:05:18.420 So it's a very bleak view.
01:05:22.100 And yet, if people are increasingly becoming aware of it, you're talking about it, other people are talking about it, my hope is, I'm an optimist by nature, Glenn, my hope is we will not be taken by surprise.
01:05:37.440 We will at least be able to prepare ourselves in some ways.
01:05:41.460 I don't know what that preparation looks like right now, but I think it's going to be necessary.
01:05:45.900 Let me come out of the deep future here, the 10-year future.
01:05:50.640 Right.
01:05:50.720 And let's, let's end it here.
01:05:56.360 The problem today is this: these companies are still somewhat manageable.
01:06:06.700 Break them up?
01:06:09.400 Regulate them?
01:06:11.200 God forbid, make them a utility?
01:06:13.900 What?
01:06:15.540 It's tough.
01:06:16.660 I don't think you make them a utility.
01:06:18.200 I mean, I've dealt with utilities where I've lived and it would be a nightmare.
01:06:21.300 And by making it a utility, as you and I both know, you're really giving the government or a government body.
01:06:27.360 You're creating that marriage.
01:06:28.560 Exactly.
01:06:29.100 That's a power I don't want the government to have.
01:06:31.560 So I don't consider that an option.
01:06:33.760 Regulation is a great idea in a sense.
01:06:35.640 But on the other hand, regulation never keeps up.
01:06:38.320 And there's also the issue of regulatory capture.
01:06:40.780 What are regulators going to do?
01:06:42.320 They're going to go to Google and say, how should we regulate you?
01:06:45.120 This is the way it's always done.
01:06:47.000 So to me, ultimately, it's about breaking them up, breaking them up into multiple pieces.
01:06:51.900 Because it's the concentration of power and control and market share that gives them such a dominant position.
01:06:59.540 Doesn't that destroy the opportunity for America to create AI?
01:07:07.040 I don't think it does.
01:07:08.160 I think American innovation is such that, that.
01:07:11.280 It takes server farms the size of a government's or Google's.
01:07:17.020 Yes.
01:07:17.540 You know, American innovation, that ain't happening in your backyard.
01:07:22.900 Yes.
01:07:23.340 And that's not with a lack of information.
01:07:25.200 That's with access to all information.
01:07:27.760 Yes.
01:07:28.120 No, that's absolutely right.
01:07:29.180 There's no question that it, in some respects, could put America or these
01:07:34.020 American companies at a competitive disadvantage on these issues.
01:07:38.000 But I just think the stakes are too high.
01:07:39.760 I don't think we can say that, you know, what's good for Google is good for America.
01:07:44.280 You know, to take the old phrase that what's good for General Motors is good for America.
01:07:48.080 I just don't think it's true, particularly because there is this sense that this is a
01:07:53.680 company that wants power, that wants to influence and steer people.
01:07:58.580 If this were a company that said, we are simply providing information, this is what we're
01:08:03.320 going to do.
01:08:04.140 We're going to allow other competitors to rise.
01:08:06.840 We're not going to exercise market dominance.
01:08:08.700 And oh, by the way, we're also going to research AI.
01:08:11.840 I wouldn't have as much problem with it, but that's not where we are today.
01:08:15.580 We have a company that's operating in a monopolistic fashion that is trying to steer and manipulate
01:08:20.920 the flow of information in our country in a way that has never been done in American
01:08:25.880 history, human history.
01:08:27.060 And I just think the stakes between now and the next five years are too high.
01:08:31.780 And if this is not corrected, we will reach the point where Google will become the dominant
01:08:36.980 player in determining the future of American elections and elections around the world where
01:08:43.140 they can sway 10% of the vote in one direction or another simply by manipulating their newsfeed
01:08:49.640 or their search results.
01:08:50.940 And that to me is just not acceptable.
01:08:52.720 Am I on the wrong track or the right track?
01:09:14.640 I don't care what your opinion is, as long as it's within the Bill of Rights, and that's a wide
01:09:25.480 range.
01:09:26.620 Yeah.
01:09:27.380 As long as it's within the Bill of Rights. We need to protect ourselves and be on a separate server.
01:09:37.100 Because I believe the voices of freedom and individual rights are going to be deplatformed.
01:09:45.640 Yes, I think you're exactly right.
01:09:47.600 And when you mentioned to me earlier that you had set up, and are using, your own system.
01:09:52.360 I think that is so profoundly important.
01:09:55.340 We do the same thing at the Government Accountability Institute.
01:09:57.720 We have our own server.
01:09:58.800 We are not connected to the cloud or in any other way that gives people control.
01:10:02.820 It's about autonomy.
01:10:04.280 And, you know, one of the analogies that I use in the film at the end, Glenn, is we're
01:10:10.340 all accustomed to the notion of taking care of our physical selves, right?
01:10:15.340 You got to exercise.
01:10:16.240 You got to eat better.
01:10:17.000 And I'm trying to do that.
01:10:18.140 Other people are trying to do that.
01:10:19.180 I gave up on a year.
01:10:22.360 You know, we try.
01:10:23.880 Well, we have digital selves, right?
01:10:25.900 We have digital selves.
01:10:27.580 We should be taking care of our digital selves because, in a sense, they are close to being
01:10:32.800 as important as ourselves because they're so intimately connected.
01:10:36.820 And that means recognizing these privacy issues, recognizing that you're being manipulated,
01:10:42.140 recognizing that these entities have a capacity to try to influence you, and that they're going
01:10:48.540 to do it.
01:10:49.520 That, I think, is fundamentally important.
01:10:52.220 Whether you're running an operation as large as yours, whether you're running a small mom
01:10:57.740 and pop shop, or whether you are just simply a student somewhere, take control of your digital
01:11:03.920 self and don't be beholden to these entities who want control and want influence over the
01:11:10.640 decisions that you make.
01:11:11.640 And there are steps that you can take.
01:11:12.940 It's not easy, but you can take those steps.
01:11:15.440 It's very, very important because the day is going to come when eventually they decide,
01:11:21.120 you know what?
01:11:21.740 We don't like what Glenn Beck has to say anymore.
01:11:24.600 We want to try to, oh, we can't deplatform him because he's not under our purview.
01:11:30.600 That sort of decision, everybody's going to have to, at some point, consider.
01:11:34.820 I want to end with kind of where we started with the Katie Couric Today Show on the future.
01:11:44.900 In five years, we may look back at this and go, oh my gosh.
01:11:50.020 We might look back at it and people won't even know what it is because it failed.
01:11:54.680 Have you heard of solid?
01:11:57.940 You know what solid is?
01:11:59.140 No.
01:11:59.540 And pods?
01:12:00.600 No.
01:12:00.880 Well, the actual inventor of the World Wide Web, most people don't even know who he is.
01:12:06.240 It was one guy, and he's the guy who came up with www.
01:12:11.640 Okay.
01:12:12.280 And he wanted a very open and free internet.
01:12:16.160 And so he's seen what's happening and he's seeing these companies.
01:12:22.420 And so he has been working on a few things.
01:12:26.920 Solid, which is, I don't even understand how it works.
01:12:30.880 Okay.
01:12:31.500 But it is a new internet.
01:12:34.740 Okay.
01:12:36.220 And your pods, it's personal, personal something, I can't remember, but it's all of your personal information.
01:12:44.820 And he's redesigning things through this new company that is allowing you to go out and
01:12:53.220 buy apps, but you bring them into your space and you control all the information and all
01:13:01.420 of those apps only work in your space, and they can go out and get things, but they don't
01:13:08.580 leave a trail of information.
01:13:11.540 So all your pictures, everything, it's all in your pod.
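(A minimal sketch of the "pod" idea as Glenn describes it, assuming a local personal data store that apps query in place instead of copying data out; hypothetical names only, not Solid's actual API.)

```python
# Hypothetical illustration: a personal data pod that apps come to,
# rather than the data being uploaded to each app's own servers.

class Pod:
    """Holds one person's data locally; apps ask questions, never take copies."""

    def __init__(self):
        self._data = {}  # lives on the owner's own server or device

    def store(self, key, value):
        self._data[key] = value

    def answer(self, key):
        # The app gets only the specific answer it asked for,
        # so no trail of information accumulates elsewhere.
        return self._data.get(key)


pod = Pod()
pod.store("photos/2019-01", ["beach.jpg", "family.jpg"])

# A photo app works "in your space": it queries the pod instead of ingesting the library.
print(pod.answer("photos/2019-01"))
```

(The design point is the inversion of control: the data stays put and the application is the visitor.)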
01:13:15.560 So there is somebody trying.
01:13:18.280 It's wonderful.
01:13:18.960 It's digital sovereignty.
01:13:20.440 It's digital sovereignty.
01:13:22.900 You know, every man's castle and each individual has sovereignty over themselves.
01:13:27.480 This is digital sovereignty.
01:13:29.060 I love this.
01:13:30.000 I think this is fantastic.
01:13:31.080 And look, one of the reasons I'm an optimist is that human nature being what it is, and the American spirit
01:13:37.180 being what it is, we don't just sort of toss our arms up and say, okay, this is our fate.
01:13:42.960 I hope, Glenn, that we talk about this in five years and we chuckle and say, boy, we were
01:13:48.240 so pessimistic.
01:13:49.540 We were so, and here came the pods.
01:13:51.420 Yeah, right.
01:13:51.880 And listen to that stupid explanation.
01:13:54.880 Exactly.
01:13:55.640 Exactly.
01:13:56.240 Everything is great and everything.
01:13:57.680 I hope that conversation happens.
01:14:00.320 One of the reasons that conversation might happen is because people now need to be paying
01:14:05.400 attention to it.
01:14:06.260 This is something that can sneak up on people that could be a real problem unless people
01:14:10.840 are aware of it and insist on taking control of their digital selves.
01:14:14.100 I'm going to put you in the calendar for five years from today.
01:14:17.620 I think that would be a wild thing to come back to this.
01:14:19.940 We'll do it.
01:14:20.660 We'll do it, Glenn.
01:14:21.560 Thank you.
01:14:21.960 Thank you.