Ep 22 | Peter Schweizer | The Glenn Beck Podcast
Episode Stats
Length
1 hour and 14 minutes
Words per Minute
153.97667
Summary
In this episode, Peter and Glenn discuss their new film, "Trust Us." They discuss the dangers of concentrated power in Silicon Valley, and how technology is making us less trusting in the people we should be able to trust.
Transcript
00:00:00.000
So, Peter, your film starts out with an amazing clip from what wasn't long ago on the Today
00:00:23.240
Show with Brian Gumbel and, what's her name, Katie Couric, asking the question, does anybody
00:00:35.460
I mean, it seems like so long ago, and yet it's 20 years we've had this for wide application,
00:00:43.880
And I feel like what we're going to talk about today, in five years, we'll look like, oh
00:00:52.720
my gosh, look at this, how ridiculous this conversation was.
00:01:07.760
Because I've always been distrustful of concentrated power.
00:01:11.220
This is why I've investigated corruption in politics and-
00:01:18.820
We've talked about this before, Glenn, and I think we probably agree.
00:01:26.940
Well, that also applies to people that are in the private sector.
00:01:30.780
Now, one of the reasons I like a free market economy is, you know, it tends to self-correct.
00:01:37.460
But the problem is, with this technology, with Silicon Valley, the market's not going
00:01:44.620
So we're dealing with a massive concentration of power in Silicon Valley, I would argue,
00:01:49.600
that is larger than any other corporate titans have had in American history.
00:01:56.540
So let me understand why you think it won't self-correct.
00:01:59.520
It won't self-correct because the barrier to entries, it seems like, oh, we can just
00:02:08.700
But the reality is, the gatekeepers are the large tech companies.
00:02:13.320
Apple, if you're not connected with Apple in some way, and iTunes, or you're not getting
00:02:17.560
promoted, it's very hard to get application there.
00:02:19.860
Google, the titans control it, and they restrict the market.
00:02:25.240
So it's very hard to say, we're going to create a new search engine today, and we're
00:02:35.020
And everybody that I know says that just the servers that Google has, the government servers
00:02:56.520
So I talked to Ray Kurzweil, and I said to him at one point, we were talking about AI,
00:03:02.760
and we were talking about all of the things he's working on.
00:03:06.520
And I said, Ray, so why wouldn't a Google that is reading everything, knowing everything,
00:03:16.120
and it's only going to get faster, and it's going to be a personal assistant, and the idea
00:03:21.900
of the personal assistant is it's so knowledgeable on you that it can predict what you might think
00:03:34.780
And then, you know, there's something in the back of me that says, you know, maybe we go
00:03:38.180
The next day, the next morning when I wake up, my assistant says, hey, by the way, I
00:03:43.380
found some really good, I've not talked to him about going to Hawaii, but it just can
00:03:50.220
So what I said was, why wouldn't Google stop anyone from building a replacement for Google?
00:04:04.420
If there was something that was going against the corporation, why would it allow you to
00:04:18.040
Well, then I'm meeting a unique group of people in the history of the world.
00:04:27.780
Those words, trust us, have been uttered throughout human history by people that rose to power, who
00:04:35.520
were well-intentioned, who did wonderful things, but along the way took a terrible turn.
00:04:41.500
Those words have also been used by people who were terrible from the beginning.
00:04:45.320
The point being, we cannot trust other human beings, given the long span of human history
00:04:54.740
We cannot trust other human beings with this much dominant power over our lives.
00:05:00.000
And look, they're not going to come and say, we're going to do terrible things to you.
00:05:03.480
It's always going to be presented as, we're doing this for your benefit.
00:05:21.680
I wouldn't have given my fingerprint to anyone.
00:05:24.080
I gladly give it to Apple because it reduces, oh, really, I got to remember that.
00:05:32.560
So before we start, is your concern the power that they have now and in the next two years
00:05:41.540
or the power that they have and will have in three, five years and with AI coming online?
00:05:51.980
My sense is, Glenn, that right now they have an enormous amount of power and we will reach
00:06:01.880
In other words, they will have accumulated so much power that whatever actions are taken,
00:06:10.020
They get so embedded into a human life and their ability to steer and manipulate, we become
00:06:16.180
so dependent and conditioned to being dependent upon them that it becomes too late to self-correct.
00:06:22.600
And I think we're approaching that probably in the next five to 10 years.
00:06:25.940
I don't accuse Google of doing this or Facebook doing this at all, but I have often wondered
00:06:33.660
if it today, just with what they have today, if it fell into the wrong hands, could anyone
00:07:08.560
And it works a couple of ways as it always does, Glenn.
00:07:10.900
There's certainly absolutely what you're saying.
00:07:13.040
They have the stick that they can wield in lots of ways.
00:07:16.240
They have the search histories of, you know, members of Congress, of senior White House
00:07:21.520
If they're using a Gmail email account, even if it doesn't say Gmail, if they're using a
00:07:26.980
Google-based email server, which a lot of government entities are, they have access to their emails.
00:07:32.560
So they know a lot of secrets about these people.
00:07:42.160
Because I don't want to, I don't want to assign things that they're not doing.
00:07:47.160
I want to make a difference between what they have, what they can do with what they have,
00:07:55.680
So they don't necessarily have all of that information.
00:08:04.420
We are not accusing them of looking or using any of that information at this point.
00:08:12.780
It's, you know, the equivalent would be, you know, somebody who has a gun at home for
00:08:19.400
They may never use that gun, but they have the capacity to use it if they need to, God
00:08:27.280
And it's the same worry that we have with the United States government, with the NSA servers,
00:08:31.340
which again, the NSA servers are minuscule in comparison to Google.
00:08:38.920
And the NSA servers, they are not using it, but they have everything.
00:08:46.060
And, you know, on top of that, Google has, you know, Google Docs, which again, they have
00:08:55.120
So you've got classified information that potentially, I'm not saying they're doing it,
00:08:59.680
but potentially could be accessed through Google Docs.
00:09:02.620
And you have news organizations from the New York Times to others who use Google email
00:09:10.420
Well, I know five, maybe eight years ago, I was still in New York, so probably eight
00:09:14.240
years ago, one of Google tech called me and said, you need to know something.
00:09:23.580
I don't know why, but the federal government is digging trenches around our farms and they
00:09:33.300
And they were digging very deep and putting security all around.
00:09:50.100
And that's why, you know, one of the big myths we've talked about this before, Glenn,
00:09:55.580
And it's oftentimes perpetuated by the political left, which is that big government and big
00:10:02.760
And the reality is oftentimes, no, the big business and big government like each other.
00:10:09.380
And I think that is certainly the case with Google.
00:10:16.040
One of the largest lobbying shops now in Washington, D.C., is Google.
00:10:23.300
I would not be surprised if we see, you know, additional members of Congress put on the
00:10:28.780
The point being, hey, you know, we can make life really tough for you, but we can also
00:10:33.380
make life for you personally very lucrative if you join us.
00:10:41.040
I just before we sat down, I just hung up the phone with one of the editors of Wired magazine.
00:10:48.640
And we were talking for about 40 minutes about some of this stuff.
00:10:52.520
And he said, well, you know, Facebook has poured a lot of money.
00:11:00.900
I mean, they're one of the number one, number two lobbyists in Washington.
00:11:04.720
And they have, you know, they have conservative lobbyists.
00:11:26.740
The one of the guys that you you feature prominently is a guy who is a diehard Clinton supporter,
00:11:38.820
This is not this has nothing to do with politics.
00:11:41.420
The film is really about the power of Google and Facebook.
00:11:45.180
But in a way that a lot of people have not traditionally thought about them.
00:11:48.760
A lot of people recognize and see the issues of privacy.
00:11:52.060
The fact that they're gobbling up all this information and they take this information, Glenn, and they
00:11:57.340
run those ads about, you know, go to Acapulco because you did a search on Mexico or, you know,
00:12:02.220
buy these shoes because you said you were looking for certain shoes.
00:12:11.980
And this goes back, I think, to a fundamental concept that people recognize everywhere.
00:12:16.860
And that is to the extent that an institution can do something for you, it can do something
00:12:25.080
So Google and Facebook can do a lot of things for us.
00:12:28.180
But that power that they have accumulated that gives them the capacity to know so much
00:12:32.800
about us also gives them the capacity to do things to us.
00:12:36.680
And what they're doing is not just collecting information, the privacy concern.
00:12:41.160
They're also actively working to manipulate us.
00:12:44.700
They're looking to steer us and move us in directions that we don't necessarily want to
00:12:55.220
They want to influence the way that we see things.
00:12:57.840
And it's not about them flashing in front of us in a very visible way, Glenn, of saying,
00:13:02.560
hey, have you ever considered this point of view?
00:13:12.120
So you may be looking for information on, let's say, Congressman John Smith, but they're
00:13:18.080
only going to give you certain bits of information about Congressman John Smith, depending on what
00:13:23.480
Congressman John Smith's politics are and depending on whether they've decided to manipulate
00:13:37.020
The first bit of evidence is we know that they manipulate the algorithm because it's
00:13:43.140
been proven by the Federal Trade Commission, the European Union, and academics at Harvard.
00:13:50.100
They were charged 10 years ago by Yelp and by TripAdvisor for manipulating the algorithm
00:13:55.460
to the detriment of those companies to benefit companies that Google owned that were competitors.
00:14:02.280
And as far as I'm concerned, Google can do that.
00:14:05.440
Google insisted and they pounded the table and said, we are not manipulating the algorithm.
00:14:14.520
They were manipulating the algorithm and they pretty much given up the argument that they
00:14:19.960
And the work of Robert Epstein and others, I think, conclusively demonstrates that they
00:14:25.180
are cooking the books on the algorithm as it relates to political search.
00:14:35.440
So I want to be clear, because you and I are both free market guys.
00:14:51.040
I will tell you, the conversation I had with Wired was warning.
00:14:56.740
Warning, these companies, if they don't smarten up one side or the other, we'll say, you're
00:15:05.300
And then it's government controlled and in bed with the government.
00:15:26.820
So are you saying to me, if they said, yeah, yeah, we put our stuff first, you'd be okay
00:15:37.700
Although the problem is that the internet is essentially, to the extent that it's regulated,
00:15:43.760
is based on the Communications Decency Act of 96.
00:15:47.920
And as we've talked about, Glenn, and as you know very well, if you are a neutral platform,
00:15:53.520
as Google and Facebook insist they are, you are essentially saying you're a telegraph
00:15:58.320
and you're simply relaying information from point A to point B.
00:16:06.080
And so Google and Facebook have operated under that platform.
00:16:09.060
The other part of the Communications Decency Act is if you are editing content, if you
00:16:15.860
are sifting content, if you are engaging in editorial control, you are now a publisher
00:16:21.620
and you will be treated in some respects as a media company.
00:16:26.940
And so my point is Google and Facebook should have to choose.
00:16:31.600
They should not be able to say we're a neutral platform and yet we are going to exert editorial
00:16:38.060
And they've admitted that they exert editorial control.
00:16:41.620
And the reason why they are getting it both ways is because of the lobbying money.
00:16:48.480
They lobbied one way, then they lobbied the other way, and they got both.
00:16:54.720
And it should not, by the way, give us any comfort.
00:16:56.800
I mean, you were right in the interview with the Wired Magazine newspaper.
00:16:59.700
It should not give us comfort that prominent conservatives or Republicans have decided to
00:17:06.080
take a big paycheck from Google, that that somehow is going to fix everything.
00:17:15.140
So let me go back, and when you say they are shaping us and they're pushing us a certain
00:17:27.960
And I want specifically, where are they pushing us and how do you know they're pushing us?
00:17:35.380
Well, one of the people that we feature in the film is a guy who is a Google ethicist,
00:17:47.160
Talks very, very openly about the fact that his job was to nudge people towards certain
00:17:55.700
And they were trying to figure out how to do that, quote unquote, ethically.
00:17:59.500
Now, you know, my view on this is pretty clear.
00:18:01.920
If you're trying to influence somebody, it should be out in the open.
00:18:09.460
And so I don't think there's an ethical way to do it.
00:18:15.380
Because then I'm choosing, and I know you're coming at me with an angle.
00:18:18.760
But if you don't announce it, there is no ethical way.
00:18:23.800
You go back in history, you remember all the concern that was raised about the subliminal
00:18:29.160
advertising, you know, that you'd see, you know,
00:18:31.740
a Coca-Cola drink and there's actually sex written in the ice cubes or something like
00:18:36.760
And, you know, there are all kinds of disputes about how effective that was.
00:18:39.820
Well, that was restricted in the United States because it was deemed to be deceptive.
00:18:44.860
You know, you're trying to accomplish something here that you should be above board about.
00:18:48.840
And that's what I'm saying that Google should do.
00:18:50.800
So, you know, the issue of them nudging or trying to influence us, Tristan Harris has
00:18:56.600
Eric Schmidt, the CEO of Google, has said that part of Google's purpose is to shape the values
00:19:05.240
So they don't view themselves just as we're trying to sell advertising and we're trying
00:19:10.720
They have a much more ambitious agenda that goes along with this.
00:19:17.500
It's tied up with the sort of the ethos of Burning Man.
00:19:20.620
It's tied up with sort of the Silicon Valley ideal of what they view as wrong with America
00:19:25.940
and what they view wrong with human, American society.
00:19:28.820
So I don't want to get into, I think for the sake of intellectual dialogue, I want to take
00:19:41.360
the position that it's totally cool to say, don't be evil.
00:19:50.240
The problem is when you say they're nudging and they're pushing and their ethos is don't
00:20:06.680
You know, when it's hate speech, could we define hate speech, please?
00:20:12.240
Because I've seen speech that is very hateful that others are totally fine with.
00:20:18.240
I have seen things, for instance, we define evil.
00:20:23.680
Some people would say that it is evil to take the right away for your own body as a woman.
00:20:33.260
And others would say, no, it's evil to kill a child in the womb.
00:20:46.380
But both sides will say you're evil for believing that.
00:20:50.840
So, without the definition, we don't know what it is, and it can always change.
00:21:02.100
And even to add to that further, here's the other fundamental problem.
00:21:05.620
I'm always very concerned when people throw around words like evil.
00:21:11.840
There is evil in the world, and we do need to define it.
00:21:16.060
When a company says, don't be evil, or when an activist says, I am fighting evil, if they
00:21:24.240
have not really thought through what they're saying, what does that do, Glenn?
00:21:44.280
So, for a company to say, you know, don't be evil, yes, you never defined it.
00:21:50.240
And second of all, it creates, I think, kind of this mindset of, we're safe now.
00:21:55.880
I mean, our corporate model says, don't be evil.
00:21:59.220
And we're fighting all these terrible things out here.
00:22:01.620
And we're justified in, you know, and you see that in some of the emails that have come
00:22:06.460
out that have been leaked in these discussions that Google engineers had about, you know,
00:22:11.700
Trump's immigration policy, regardless of what one thinks about that.
00:22:15.220
But these Google engineers are actually saying, we should fiddle with the algorithm because
00:22:22.940
That's the kind of mindset that gets adopted when you plant yourself on the grounds of,
00:22:35.220
The biggest thing that made me successful is the one thing that I have lost and I am so
00:22:46.480
I want to take you to a controversial meeting I had with Mark Zuckerberg at Facebook.
00:23:06.180
They had invited a whole bunch of voices and I sat right across the table from Mark and others
00:23:14.880
were talking and I watched him and I watched him closely.
00:23:21.020
And I tend, when I go wrong, I tend to go wrong believing the best in somebody.
00:23:30.600
But I really watched him and when he spoke many times, perhaps because I was just right
00:23:36.980
across the table, he was speaking to me and he was looking me right in the eye and he said
00:23:52.540
If I'm just pushing politics, I'm going to wipe out 50% of my base.
00:24:06.940
He said, in different parts of the world, we don't know what is good and bad.
00:24:24.900
So his point was, we are not the policemen of the world.
00:24:38.100
But I also believe that Silicon Valley, by nature, is not diverse.
00:24:58.940
And when you're in the bubble, and explaining the bubble that we're in, where they're sifting
00:25:09.260
and we're getting the stuff we want to read, and we're living in that bubble, that Facebook
00:25:18.340
So my question to you is, do you believe that Facebook, Google, are intentionally going after
00:25:37.520
Are they intentionally doing this, or are they just so isolated in their own bubble and surrounded
00:25:44.780
by people that are absolutely certain they're on the side of good, and they don't see what
00:25:56.200
And any time you talk about intention or trying to look into somebody's, you know, soul, as
00:26:02.040
it were, I mean, it's very, very difficult to do.
00:26:04.100
I think what's striking to me about Facebook and Zuckerberg is, you know, when he was sat
00:26:09.160
before the Senate, Ted Cruz and others asked him some very tough questions.
00:26:13.120
He acknowledged that Silicon Valley was a very, very liberal place, but he also didn't really
00:26:20.020
You know, when Ted Cruz asked him and said, look, this has happened to a Republican candidate
00:26:25.460
Do you know of any example where this has happened to a Democrat candidate?
00:26:30.600
And he said, this has happened to this pro-life group, to that pro-life group, to another pro-life
00:26:35.480
Do you know of any abortion groups that have had these issues?
00:26:40.240
So part of it is the proof's in the pudding, right?
00:26:42.760
I mean, why is it that one side seems to overwhelmingly face these issues?
00:26:48.780
You know, I'm not talking about, you know, somebody who's got a blog site who's, you
00:26:54.040
I'm talking about very substantial, institutional, you know, good, reasonable, well-thought-out
00:27:00.200
Go to the Harvard expert that did this research on the last presidential campaign.
00:27:09.600
And what they essentially did, Glenn, is they took, they had 2,000 monitors around the
00:27:15.460
They were in red states, blue states, Republicans, Democrats, gay, straight, Catholic, agnostic.
00:27:24.180
And what they essentially did was said, we want you to do searches through the six-month
00:27:30.400
And we are going to capture every single search result that you get.
00:27:35.340
We're going to have you search on Bing, and we're going to have you search on Google to
00:27:39.260
see what kind of results you get on political topics.
00:27:41.940
What he found in the research when they cataloged all of it was that in all 10 of the search
00:27:46.920
positions on Google, you saw a clear bias in favor of Hillary Clinton, meaning that they
00:27:52.860
were suppressing negative stories about Hillary Clinton and pushing positive stories by Donald
00:28:05.240
And they did not run into that problem with Bing, which is a Microsoft product.
00:28:09.720
And they found this stunning, because what you should find, what Google will tell you is
00:28:15.480
the algorithm is sort of individually, in a sense, it's conditioned individually for the
00:28:21.400
So, you know, Google knows Peter Schweitzer very well.
00:28:24.180
When I search in a certain subject, it's going to give me answers or responses that are
00:28:30.400
tailored to my search history and to the best results they see.
00:28:33.820
If you were, I mean, I shouldn't have gotten any results, I didn't like either side, but
00:28:38.880
if, so let's say if you were a Donald Trump supporter, you should have seen in your search
00:28:48.560
And you should have seen for news sources that you're used to clicking on.
00:28:54.180
So the fact that they found this uniform, consistent bias in favor of Hillary among all 10 search
00:29:00.960
positions among the 2,000 people was quite astonishing.
00:29:11.240
They just say, well, no, this is just organic search.
00:29:20.300
I did a, I did a show recently on the difference between disinformation and misinformation.
00:29:33.660
Can you explain the difference between the two?
00:29:35.580
I mean, one is one, disinformation is when it is, is planted into what you think is a credible
00:29:46.540
If it comes from Pravda, you're like, oh, it's Pravda.
00:29:57.560
In a way, Google is, is engaged and Facebook knowingly or not in a disinformation campaign.
00:30:09.980
I think, I think that you could certainly classify that.
00:30:12.380
And, and, and I think the issue becomes why are they doing this and how are they doing
00:30:18.540
You asked earlier about Zuckerberg and, and your interaction with him.
00:30:21.900
I don't know that Mark Zuckerberg, I'm not suggesting that Mark Zuckerberg is sitting
00:30:26.040
around saying, how can we deal with conservatives on Facebook?
00:30:29.860
But the problem is there are lots of people employed by Facebook who maybe they're in
00:30:37.300
Maybe they were woke, you know, on, on campus and they are now involved in the news feed.
00:30:43.620
I mean, we had a, you know, a Facebook employee who was involved with their trending section
00:30:48.600
that came out a couple of years ago who said, oh yeah, we sunk conservative stories and we
00:30:54.240
Um, so I don't think it's, it's, it's a question of the executives of these companies, you know,
00:31:00.740
in kind of a James Bond villain moment saying, here's how we're going to rule the universe.
00:31:05.700
I think it's a question of, they've created these powerful companies and they've created
00:31:10.100
a culture within these companies that for all the talk of tolerance is actually very intolerant
00:31:16.680
and it reflects in the product that they are producing.
00:31:24.080
I have a document from a meeting with media matters on inauguration day.
00:31:31.680
It was a meeting that happened with far left donors with media matters on the election day
00:31:37.880
in Florida, not on election day, on, on, uh, inauguration day.
00:31:41.900
And they said, here's where we went wrong and here's what we're going to do.
00:31:47.320
And it talks in that, in that document about how they are going to go to Facebook and Google
00:32:01.980
And in that document, it says, we have already been given access to all of the raw data in
00:32:14.760
Peter, I don't think they'd give that to me or any organization that I know of.
00:32:25.240
I mean, the problem is the way that they are trying to deal with this is they're like,
00:32:29.320
you know, we're being criticized by conservatives.
00:32:35.480
I think that's a good thing, but that's not really the issue.
00:32:39.160
The issue is not, you know, uh, saying nice words.
00:32:41.940
The fundamental issue comes down to, you know, what is this company doing?
00:32:46.360
And, and the whole debate now that's arisen about fake news, I think is a huge problem
00:32:52.480
because it's allowing essentially these liberal groups like the Southern Poverty Law Center
00:32:57.380
and Media Matters to essentially say to Facebook and Google, no, no, no.
00:33:05.820
And the, uh, and the Facebook could respond well, but we have others.
00:33:10.700
We have, I don't remember if it's Heritage Foundation, but we have, we have others that
00:33:20.700
It's, it's the, the, the problem that develops is that nasty word cronyism and cronyism is
00:33:27.000
a problem where you give concentrated power or you give special access or favors to certain
00:33:33.000
people, uh, and invariably it's going to be misused.
00:33:35.820
And, and this is really the question, I guess, Glenn is does Facebook and does Google so much
00:33:43.180
distrust the American people that they believe the American people are incapable of looking
00:33:49.420
at a news story and saying, that's totally, yes, I'm not buying that.
00:33:54.380
Um, and, and, and they don't, they don't have confidence in the American people to do that.
00:33:58.660
They feel like they have to somehow be the arbitrators and they don't.
00:34:09.880
Goldfish, goldfish have an attention span of nine seconds.
00:34:27.000
Um, and that has changed dramatically because of Facebook and all of the interaction that
00:34:33.520
However, because I was just asked this question, um, well, don't they have a responsibility?
00:34:41.340
No, they have a responsibility to be transparent and be a platform.
00:34:47.680
I don't believe that you should censor anyone on a platform.
00:34:53.260
It's the battlefield of ideas to say that now, what people will say is, well, that's crazy
00:35:05.460
Thomas Jefferson said, believe the people, trust the people.
00:35:15.020
Uh, they will usually get it wrong, but eventually they'll get it right, right, right.
00:35:26.160
The worst thing we can do is put a babysitter on top of us forever.
00:35:37.480
And this is, is further evidence that I think they don't really understand the dynamics at work in the country today.
00:35:45.320
The dynamic at work in the country today is a rejection of sort of this elite view of how society should be organized.
00:35:53.560
It's one of the reasons why you have in financial markets, conservatives, people on the left don't trust the large banks.
00:36:04.340
It's the same reason conservatives, liberals, independents have a distrust of Washington, D.C.
00:36:10.960
It's not because they want tax policy to be slightly different.
00:36:14.280
It's they don't fundamentally trust them to reflect their interests and to look out for them.
00:36:19.980
And they also know that the elites generally look down upon them.
00:36:23.740
So, you know, that my challenge to Silicon Valley is for all their talk of egalitarianism, for all their talk about we love democracy and everybody having a voice.
00:36:36.860
I mean, the point is, we all remember as a kid, I grew up outside of Seattle, Washington.
00:36:41.160
And I remember going down to a place called Pioneer Square.
00:36:45.380
There were all kinds of people wandering around saying strange things.
00:36:48.920
Well, those people today may have blog sites and they're going to say some crazy stuff.
00:36:58.140
And I have enough trust that most people aren't going to pay a lot of attention to them.
00:37:03.820
And that's, I think, what we have to embrace, because otherwise it's we are going to have intellectual policemen that are trying to tell people, here's what you should think.
00:37:14.780
Not only that, but please don't even look in this direction.
00:37:19.540
If you look in this direction, it might somehow infect you.
00:37:24.000
The battlefield of ideas is such that the best ideas win.
00:37:27.660
And I happen to believe that the ideas of the American founding were the best ideas.
00:37:35.720
And the kind of monitoring and everything else, honestly, goes against almost every single article in the Bill of Rights.
00:37:53.160
But it is the same principle, especially the bigger they get.
00:37:57.800
And that, by the way, goes on to what you were saying earlier about Huxley and Orwell.
00:38:03.440
You know, the traditional view is the government was going to use technology to control our lives.
00:38:11.920
I've always, you know, I've always, always made fun of, you know, Blade Runner, the corporation.
00:38:24.360
We are now entering the time where the liberal concern about corporations is actually accurate now.
00:38:33.880
And it's weird that they're so in love with Apple and Google because these are the guys you've been warning us about.
00:38:42.240
So let me take you kind of to that Orwell place.
00:39:03.080
Well, you're not paying anything in terms of monetary.
00:39:09.860
Because all these servers, all this capacity is expensive.
00:39:14.460
So what Google is doing is they have a product here.
00:39:21.480
And they're selling all kinds of secrets about you.
00:39:25.540
I used Gmail up until I started on this project.
00:39:30.880
And what people have to realize about Gmail is they're scanning every email that comes in.
00:39:37.880
They are scanning every email that you send out.
00:39:42.000
And if you draft an email, you know, you're upset with your cousin about something.
00:39:47.900
You had a, you know, debate over Thanksgiving and you thought they were rude.
00:39:51.000
And you said, you know, cousin Chris, I think you're rude and you're terrible and you're this
00:40:02.220
You're not saying the draft that you save and put into drafts.
00:40:10.060
Even if you delete all of it, it's still there.
00:40:14.660
What's important here is, again, to distinguish.
00:40:18.300
When you say they're scanning, it doesn't mean they're reading it.
00:40:25.700
Well, they're scanning it because let's say you send an email to your friend.
00:40:32.440
I'd sure love to be on a beach in Mexico right now.
00:40:38.660
And you're going to probably see ads on your Google feed for apartments or condos in Mexico.
00:40:46.640
And lo and behold, the next morning, someday you wake up and they say, Mr. Schweitzer, I've
00:41:00.660
I know you're tired and you've been thinking about it.
00:41:05.540
And again, there are certain amazing conveniences that come with this.
00:41:12.360
There are all sorts of great benefits to that, to Google search.
00:41:15.540
The thing that people have to keep in mind, though, is it's not a one-way street.
00:41:20.600
It's not just these wonderful, good things they're doing for you.
00:41:25.340
It's the capacity they are developing to do things to you.
00:41:29.420
So when I say that they're scanning your Gmails, it's not that there's a person sitting in Silicon
00:41:35.360
Valley saying, oh, look what Glenn just sent in Gmail.
00:41:41.780
And they have the capacity, if they don't like what you're doing, to shut you off from
00:41:46.680
And Dr. Jordan Peterson, we highlight him in the film.
00:41:52.160
He's a psychology professor at the University of Toronto.
00:41:55.020
And he took a position against compelled speech, where there was a debate in Toronto about an
00:42:01.600
ordinance that would require you to address somebody by their preferred gender.
00:42:06.360
Peterson's position was, I always address people by their preferred gender, but this
00:42:15.780
The next day, Glenn, he was shut out of his Gmail account.
00:42:26.240
I think probably what happened is somebody connected with Google, maybe mid-level, saw
00:42:32.520
this, you know, is maybe in favor of this policy position and sort of in a juvenile way said,
00:42:49.160
It's going to give them an enormous capacity over your life.
00:42:53.760
And if they choose to, sometimes in an arbitrary way, they may just shut you out because they don't
00:43:01.180
And the problem is Google does not have a customer service department you can call to say,
00:43:09.000
And they make clear, we can choose to do this to you anytime we want.
00:43:12.540
So I've been asking the question of everybody who I think
00:43:38.000
is paying attention to Silicon Valley or is involved in Silicon Valley.
00:43:46.120
The answer comes back exactly the same way every time.
00:44:12.880
We are at a, we're at an economy that I don't care who's in office.
00:44:24.380
And the longer this one goes, the deeper the pain is going to be.
00:44:28.860
We have politicians that tell us, well, I'm going to bring those jobs back.
00:44:36.080
Well, you have people in Silicon Valley right now that are not celebrating a four point whatever
00:44:43.000
unemployment rate because their entire job is to figure out how do we have a 100% unemployment rate?
00:44:54.120
Because that's the world of the future as they see it.
00:45:01.540
Bain Capital said about eight months ago, by 2030, the United States of America will have a 30% unemployment rate permanent.
00:45:13.020
And it will only go up from there because of the things Silicon Valley is doing.
00:45:34.020
We're going to, you know, I'm going to bring those jobs back.
00:45:40.420
At some point, the people say, no, those jobs aren't coming back.
00:45:47.640
It's the people in Silicon Valley that are taking your livelihood away.
00:45:57.020
They have, and it is torches to Silicon Valley.
00:46:00.960
Unless the politician says, how about we work together?
00:46:25.940
And if I were a titan of Silicon Valley with sort of their worldview, that's precisely what I would do.
00:46:32.760
And if I was part of what I call the permanent political class in Washington, Republicans or Democrats, doesn't make all that much difference.
00:46:41.280
And you will then have this, in a sense, unholy alliance between the political leadership and high tech.
00:46:47.440
And, you know, we know who's going to get the short end of the stick in when those two entities get together.
00:47:01.220
And he told me, there are three circles, three rings, okay?
00:47:11.340
The center ring, the outside ring, is kind of like American surveillance.
00:47:16.800
The second ring is British surveillance on steroids.
00:47:22.000
The new Sharp Eyes program is the center, 11 million people, okay?
00:47:35.280
They put a guy out, said, go into the center ring, all right?
00:47:43.160
They had him in the back of a squad car in eight minutes.
00:47:54.660
People have to understand, do not fear high tech.
00:48:06.880
And we don't know the goals, except don't be evil.
00:48:13.740
And you have this strange sense, this fascination that some American elites have with the Chinese model.
00:48:23.340
I mean, remember, it was just a few years ago, Thomas Friedman, the New York Times columnist and, you know, has got a lot of relationships in Silicon Valley, was very frustrated in 2009 and 2010 when Barack Obama was president and actually held up the Chinese model and said, you know, the Chinese at least can deal with these big issues of the day.
00:48:42.720
The concern is the same exact same thing that progressives said about Mussolini before things went bad.
00:48:51.320
And this and this is this is the concern is does the sort of traditional American notion of, you know, the Constitution was predicated on you can change the Constitution, add amendments, but it's going to be tough.
00:49:05.060
We don't want radical shifts and radical change.
00:49:08.020
Is that very American notion that has been so central to America's development?
00:49:14.040
Is that really deeply accepted or embedded by a tech culture in Silicon Valley?
00:49:19.780
I'm sure there are some people that do, but I think a lot of them don't.
00:49:26.300
They were billionaires by the time they were 30 years old.
00:49:29.100
They built the built these massive corporations over the course of a decade.
00:49:32.960
They are impatient people who don't have a lot of experience outside of the world in which they operate.
00:49:40.960
So they become naturally impatient when you have things like, you know, checks and balances and civil rights.
00:49:47.900
So, you know, when it came out that, you know, Google, for example, had rejected working with the Pentagon on a contract, which is certainly their prerogative.
00:49:56.940
That's fine. But then said, we are going to work with the Chinese government.
00:50:02.140
It's shocking in a lot of our senses, but it's not in others, because if you want to develop this sort of new technology this way, do it in a country that that doesn't have civil rights.
00:50:17.820
The government will let you do what they want you to do.
00:50:20.980
You know, if you if you do a similar surveillance project in the United States, it's messy.
00:50:24.780
You know, you've got courts to worry about and you've got congressional committees.
00:50:28.400
So this fascination that some tech giants have with China and with the sort of Chinese model and way is very, very disturbing.
00:50:40.160
China is building the technology that Hitler only dreamed of.
00:50:50.740
The the concentration camps that are being built right now in China are terrifying.
00:50:56.700
The sharp eyes program that President Xi has put in where it's an episode of the Black Mirror where you are you're graded on your on your social interaction.
00:51:11.660
Who you follow, what you write, who you call, where you shop, all of it.
00:51:20.220
I mean, it is terrifying and it really said a lot to me when I don't want Google in bed with the government and the Pentagon, please.
00:51:39.120
No, everybody spoke out about that at Google with the government.
00:51:49.200
No, it's it's that classic example of, you know, we're going to take this position, this moral position with the Pentagon, which you're certainly entitled to take, which is going to make us feel better.
00:52:03.160
But the moral decision that you should really be making that we're not going to work with this oppressive police state halfway around the world.
00:52:10.960
You're not prepared to make that that decision.
00:52:13.700
And I think that is an example of how as much as they, you know, sort of are clad in their T-shirts and they talk about their sort of progressive values, their willingness to work with repressive regimes really reveals the fact that either, A, they really clearly haven't thought this stuff through.
00:52:35.080
Or B, their fascination or B, their fascination or B, their fascination or their sort of moral north star is collective.
00:52:44.060
And it's and it's just the pursuit of technology in and of itself that there's that we're not going to attach any values, political values or intellectual values to that.
00:52:59.160
Everyone who says they they care about civil rights and the minorities in America and America was so bad.
00:53:05.800
I urge you to search what the Han, what Han Chinese means.
00:53:13.260
And if you are not Han Chinese, that's 10 percent of the population.
00:53:34.900
And it is, you know, in a sense, I mean, a natural fusion between a police state and a system of surveillance and influence and control.
00:53:45.160
I mean, one of the things I talk about in the film is, is, you know, everybody always sort of brings up the sort of Hitler example.
00:53:51.520
But, you know, Hitler, Mussolini, Mao, name them.
00:53:58.140
I mean, they would they would have loved to have the capacity and the power that we have essentially handed Google and Facebook voluntarily.
00:54:06.020
We have we have these companies have more information and more access than the KGB ever dreamt of having.
00:54:15.120
And instead of, you know, in 1984, where, you know, you'd have the speaker sort of blare the speeches at you.
00:54:21.380
It's sort of this over the top propaganda of Big Brother.
00:54:24.100
Now it's sort of embedded in this search, this wonderful looking search from Google where, you know, it's sort of secretly being manipulated in a manner that a lot of people don't appreciate.
00:54:41.580
But I can't think of anybody I'm comfortable with being the first to discover AI.
00:54:59.380
I mean, this is this is this is going to make the nuclear bomb seem like a firecracker.
00:55:14.620
May as and it's not me, but it's it's Stephen Hawking, Bill Gates, Elon Musk.
00:55:33.520
I don't think our government has a clue as to what they're doing.
00:55:37.660
You know, there was a there was an AI conference up in Cambridge and and the president's person came in and said, hey, well, we'll even let you use our our NSA servers.
00:55:56.300
You know, that's like you can borrow my 1985 Chevy.
00:56:10.320
I mean, I certainly don't consider myself an expert on AI per se.
00:56:14.160
But but but here's, I think, what we know from human history and particularly recent human history.
00:56:20.120
Every great technological development has been good, has meant good things and bad things and and technological advances are essentially tools.
00:56:30.560
We think of them as tools because we control them.
00:56:36.060
If you are creating an intelligence system, you want to continue to regard it as a tool.
00:56:42.480
But the point the question becomes, you know, this tool is learning and, you know, think about it this way.
00:56:49.820
You have, let's say, you know, a thousand robots that are operating and learning.
00:56:56.320
And let's assume that they only learn one new thing a day.
00:57:00.080
But Glenn, those thousand robots are all communicating.
00:57:03.220
So they're all actually learning a thousand new things a day and they're learning it the day after the day after the day after.
00:57:10.300
Or what does that compare to in terms of the advance in human knowledge?
00:57:14.980
Who is going to be more knowledgeable and more superior going down the road in that kind of scenario?
00:57:22.020
And what we know from human history, you know, splitting the atom, great advances in biology.
00:57:29.140
All of these developments, wonderful things come out that benefit human society and other consequences that we never intended
00:57:38.300
or that we thought we could control but that we can't are part of it.
00:57:43.360
I have had military types tell me, Glenn, Glenn, the drones.
00:57:51.480
Yes, yes, it's facial recognition, but it requires a human to push that final red button.
00:58:02.300
We should not teach intelligent machines to kill.
00:58:11.180
Well, and the question becomes, not only should you not teach them to kill, but will they be able to learn themselves how to kill?
00:58:21.540
And that's, you know, the challenge here is they will tell you, no, we will put a barrier there.
00:58:28.300
So people who don't understand AI, AI, artificial intelligence is like what we kind of have now where it can only do one thing.
00:58:40.360
Yes, Big Blue can play chess and it can beat everybody on Earth at chess, but it's that one thing.
00:58:45.600
The next step is AGI, and that's general intelligence.
00:58:50.500
That's a human brain that is good at everything, okay, or many things.
00:59:00.600
When it becomes AGI, it will quickly become smarter than us because it doesn't forget, it learns, it's connected to everything, it's absorbing, it's, you think the people at Google have power.
00:59:16.420
When Google is in charge and not the people, it really has power.
00:59:27.560
I read a description, we will be, people will be to ASI as a fly is in a kitchen on a plate to the conversation people are having.
00:59:49.840
We're assigning these things that it should love people.
00:59:55.920
Well, okay, we can teach it that, but it's going to be so smart.
01:00:04.720
And the question becomes, Glenn, is you can say we're going to program it or we're going to set up within it a capacity to like human beings.
01:00:14.200
But what's going to determine the ethics of these machines?
01:00:18.900
The ethics of these machines is going to be determined, at least initially, by the humans that are constructing them.
01:00:30.200
You know, looking at the fact that, you know, these human beings are being destructive.
01:00:34.520
So we have a responsibility now to damage or kill those humans.
01:00:39.120
If there was a little teeny fly Moses, would we be listening and obeying the fly Moses Ten Commandments?
01:00:58.880
And no one is thinking about between here and there.
01:01:04.200
Earliest estimate that, you know, Kurzweil says a lot of people think is wrong, but 2030.
01:01:10.280
But if it doesn't happen by 2030, Bain Capital, who does this for a living, says we're going to have 30% unemployment.
01:01:18.260
So something bad is coming and no one is discussing.
01:01:25.340
I was so pleased to see Tristan, that he was a source of yours.
01:01:32.280
He is rare to walk out and say, I'm not having anything to do with this.
01:01:41.040
Because look, if you're in Silicon Valley, you're living a very comfortable life now.
01:01:49.460
And I think the challenge with AI is that you're going to have government is going to be constantly behind.
01:01:58.400
I mean, look, if the prediction is that this happens at 2030, we can expect congressional hearings in 2031.
01:02:07.700
The capacity of government to deal with this is limited.
01:02:11.480
And the pace of change is so great that if we're expecting government to sort of figure this out and manage this, I think we're making a big mistake.
01:02:21.240
And then the question becomes within Silicon Valley, what constraints are there on people that are making these decisions?
01:02:28.540
And there are not very many, you know, there's a reason, I think, sort of deeply embedded in the human psychology that's concerned about technology.
01:02:39.980
And I'm not talking about Luddites who reject it in general, but there's a reason if you go back to, you know, the Superman comics, that the threat is always sort of the madman who is misusing technology in a way to damage people.
01:02:52.820
I think people understand in terms of wisdom that we need to see technology as a benefit, but also as a potential threat.
01:03:01.940
And my concern is that in Silicon Valley, there seems to be far more interest in intelligence as an issue rather than wisdom.
01:03:10.440
And wisdom teaches us that historically, these machines and these tools, the smarter they get, they can do something for us, but that they can do something to us.
01:03:30.400
I am not excited about, I wish I could watch it as a movie.
01:03:42.920
I'm also very concerned, but I don't, I don't, you can't put this back in the bottle.
01:03:51.540
And I don't know who I want to be the one who discovers this because I don't trust any.
01:04:02.100
With this kind of, this is, this is, we're not create, the movie Frankenstein, he created a powerful, big, strong being.
01:04:20.320
Something that's as close to being omnipotent as we can sort of.
01:04:28.500
And the problem is compounded by the fact that this is, in a sense, a global arms race, as it were.
01:04:35.480
Because even if assuming by some miracle that we in the United States come up with sort of a constructive constraint on how this is going to be developed in a responsible way.
01:04:46.920
This is going on in China, where really it's about state power and state rule and what government wants.
01:04:52.300
This is going on in Russia, this is a global trend that sort of transcends the United States so we can be in a situation where even in the most optimistic view, we somehow create some kind of code to determine how we are going to create A.I.
01:05:11.880
and what role it's going to play in American society.
01:05:14.000
But, of course, that stops at the border's edge and you can't really enforce it.
01:05:22.100
And yet, if people are increasingly becoming aware of it, you're talking about it, other people are talking about it, my hope is, I'm an optimist by nature, Glenn, my hope is we will not be taken by surprise.
01:05:37.440
We will at least be able to prepare ourselves in some ways.
01:05:41.460
I don't know what that preparation looks like right now, but I think it's going to be necessary.
01:05:45.900
Let me come out of the deep future here, the 10-year future.
01:05:56.360
The problem today is this, these companies are still somewhat manageable.
01:06:18.200
I mean, I've dealt with utilities where I've lived and it would be a nightmare.
01:06:21.300
And by making it a utility, as you and I both know, you're really giving the government or a government body.
01:06:29.100
That, that's, that's a power I don't want the government to have.
01:06:35.640
But on the other hand, regulation never keeps up.
01:06:38.320
And there's also the issue of regulatory capture.
01:06:42.320
They're going to go to Google and say, how should we regulate you?
01:06:47.000
So to me, ultimately, it's about breaking them up, breaking them up into multiple pieces.
01:06:51.900
Because it's the concentration of power and control and market share that gives them such a dominant position.
01:06:59.540
Doesn't that destroy the opportunity for America to create AI?
01:07:08.160
I think American innovation is such that, that.
01:07:11.280
It takes server farms the size of a government or Google.
01:07:17.540
You know, it's not, that's, that's, you know, American innovation, that ain't in your backyard.
01:07:29.180
There's no question that it, that it, in, in, in some respects could put America or these
01:07:34.020
American companies at a competitive disadvantage on these issues.
01:07:39.760
Uh, I, I don't think we can say that, you know, what's good for Google is good for America.
01:07:44.280
You know, to take the old phrase that what's good for General Motors is good for America.
01:07:48.080
I just don't think it's true, particularly because there is this sense that this is a
01:07:53.680
company, uh, that wants power, that wants to influence and steer people.
01:07:58.580
If this were a company that said, we are simply providing information, this is what we're
01:08:04.140
We're going to allow other competitors to rise.
01:08:08.700
And oh, by the way, we're also going to research AI.
01:08:11.840
I wouldn't have as much problem with it, but that's not where we are today.
01:08:15.580
We have a company that's operating in a monopolistic fashion that is trying to steer and manipulate
01:08:20.920
the flow of information in our country in a way that has never been done in American
01:08:27.060
And I just think the stakes between now and the next five years are too high.
01:08:31.780
And if this is not corrected, we will reach the point where Google will become the dominant
01:08:36.980
player in determining the future of American elections and elections around the world where
01:08:43.140
they can sway 10% of the vote in one direction or another simply by manipulating their newsfeed
01:09:14.640
I don't care what your opinion is, as long as it's a bill of rights, and that's a wide
01:09:27.380
As long as it's a bill of rights, we need to protect ourselves and be on a separate server.
01:09:37.100
Because I believe the voices of freedom and individual rights are going to be deplatformed.
01:09:47.600
And when you mentioned to me earlier that you had set up, you're using your own system.
01:09:55.340
We do the same thing at the Government Accountability Institute.
01:09:58.800
We are not connected in the cloud or any other way that gives people control.
01:10:04.280
And, you know, one of the analogies that I use in the film at the end, Glenn, is we're
01:10:10.340
all accustomed to the notion of taking care of our physical selves, right?
01:10:27.580
We should be taking care of our digital selves because, in a sense, they are close to being
01:10:32.800
as important as ourselves because they're so intimately connected.
01:10:36.820
And that means recognizing these privacy issues, recognizing that you're being manipulated,
01:10:42.140
recognizing that these entities have a capacity to try to influence you, and that they're going
01:10:52.220
Whether you're running an operation as large as yours, whether you're running a small mom
01:10:57.740
and pop shop, or whether you are just simply a student somewhere, take control of your digital
01:11:03.920
self and don't be beholden to these entities who want control and want influence over the
01:11:15.440
It's very, very important because the day is going to come when eventually they decide,
01:11:21.740
We don't like what Glenn Beck has to say anymore.
01:11:24.600
We want to try to, oh, we can't deplatform him because he's not under our purview.
01:11:30.600
That sort of decision, everybody's going to have to, at some point, consider.
01:11:34.820
I want to end with kind of where we started with the Katie Couric Today Show on the future.
01:11:44.900
In five years, we may look back at this and go, oh my gosh.
01:11:50.020
We might look back at it and people don't know what it is because it failed.
01:12:00.880
Well, the actual inventor of the internet, most people don't even know who he is.
01:12:06.240
It was one guy and he's the guy who came up with www.dot.
01:12:16.160
And so he's seen what's happening and he's seeing these companies.
01:12:26.920
Solid, which is a, it's, I don't even understand how it works.
01:12:36.220
And your pods is personal, personal, I can't remember, but it's all of your personal information.
01:12:44.820
And he's redesigning things through this new company that is allowing you to go out and
01:12:53.220
buy apps, but you bring them into your space and you control all the information and all
01:13:01.420
of those apps only work in your space and it can go out and get things, but it doesn't
01:13:11.540
So all your pictures, everything, it's all in your pod.
01:13:22.900
You know, every man's castle and each individual has sovereignty over themselves.
01:13:31.080
And look, one of the reasons I'm an optimist is human nature as such and the American spirit
01:13:37.180
as such is we don't just sort of toss our arms up and say, okay, this is our fate.
01:13:42.960
I hope, Glenn, that we talk about this in five years and we chuckle and say, boy, we were
01:14:00.320
One of the reasons that conversation might happen is because people now need to be paying
01:14:06.260
This is something that can sneak up on people that could be a real problem unless people
01:14:10.840
are aware of it and insist on taking control of their digital selves.
01:14:14.100
I'm going to put you in the calendar for five years from today.
01:14:17.620
I think that would be a wild thing to come back to this.