Ep 52 | Google’s Hidden Dictatorship | Dr. Robert Epstein | The Glenn Beck Podcast
Episode Stats
Length
1 hour and 9 minutes
Words per Minute
146
Summary
Dr. Robert Epstein is a leading expert on what's happening with Google and what is happening in the world of AI. This conversation should terrify you, but it will also leave you with the question of, okay, so what do we do about it? What are you going to do? March? Protest Google? For most Americans, that's the way we think we change things, but that's not it. This involves fighting a corrupt and powerful system, one that has now bled right into Washington, D.C.
Transcript
00:00:00.000
On today's podcast, I want to talk to you about the matrix. I want to talk to you about things
00:00:05.300
that I don't think most Americans even understand. And the time to understand it and do something
00:00:10.760
about it is running very short. Dr. Robert Epstein is a guy who you're going to meet today,
00:00:15.600
who is a leading expert on what's happening with Google and what is happening in the world of AI.
00:00:22.820
And this conversation should terrify you, but it will also leave you with the question of,
00:00:28.060
okay, so what do we do about it? What are you going to do? March? Protest Google? For most
00:00:32.660
Americans, I mean, that's the way we think we change things, but that's not it. This involves
00:00:37.680
fighting a corrupt and powerful system, one that has now bled right into Washington, D.C. It has to be
00:00:44.560
flawless, and it has to be smart, and it has to happen right now. So anyone who stands up against
00:00:51.600
Google has an element of hero inside of them. Today's guest is, I think, a very brave man. He is
00:01:01.040
a senior research psychologist at the American Institute for Behavioral Research and Technology.
00:01:06.760
He's not a conservative. In his own words, he has been left of center his whole life. He voted for and
00:01:13.920
supported Bill and Hillary Clinton, Hillary Clinton most recently in 2016. He loves America, and he feels and
00:01:21.400
felt that everyone on both sides would be interested in his studies. Unfortunately, that is not the case.
00:01:29.560
Too many people are trying to tear him apart. He fears for his life, for what he is looking into,
00:01:35.880
but he will not sit down. But he warns, now is the time. Speak now, or you will forever be forced to
00:01:44.060
hold your peace. Earlier this summer, he spoke in front of the Senate Judiciary Committee to present
00:01:49.080
the research that he has uncovered that proves that leading up to the 2016 election, Google manipulated
00:01:55.260
the search results and news stories with systematic bias in favor of Hillary Clinton. He believes that
00:02:02.200
they could throw the election away from one candidate or the other; in this case, it would be away from Donald Trump.
00:02:08.780
He believes this may not be an honest election, and we must have proof that this is or is not.
00:02:15.380
His list of honors and accolades is impressive. He is charming. He is clever. He is insightful.
00:02:22.460
And he is brave, as you will see. Today, Dr. Robert Epstein.
00:02:32.200
You're not doing any of your research for political purposes. In fact, you voted for Hillary Clinton
00:02:47.840
in 2016 and would have reported on it if it was the other way around. It didn't matter to you, right?
00:02:55.740
I would put it differently. I would say that I put democracy and our country and the free and fair
00:03:04.180
election, I put that ahead of any particular party or candidate. I believe that the average person is
00:03:11.920
like that. I believe the average American just wants it to be fair, wants it to be true, and stop all these
00:03:24.500
games. But we seem to be sliding into a place to where it's become so tribal that you're either
00:03:34.640
for us or against us. And when I say us, I mean that tribe and that tribe, whichever side,
00:03:42.420
is the right one. And the other one is evil and going to get us all enslaved.
00:03:48.460
That's part of the problem. The tribes have always existed. I'm sure you know that Thomas Jefferson,
00:03:59.340
George Washington, they were vehemently opposed to having a two-party system. Jefferson was
00:04:06.080
quite funny, actually, in some of the ways he criticized such a system. But they thought it would
00:04:12.120
lead to strife and it has led to strife. And so we're reliving that now. But there's another factor
00:04:19.760
that's part of the mix that has never existed before. And that is tech and the internet and
00:04:27.280
social media. It's making things happen much faster.
00:04:32.980
I think tech has added the critical element. Before you add speed, it's added fear because
00:04:41.700
it's changing our society so much. It's changing, it's changing how we relate to each other. It's
00:04:48.140
changing people's jobs. You don't know what's right over the horizon. You know, you, you don't
00:04:54.320
know who you can trust, what to trust. And so it's added fear, tribalism plus fear. Now you add speed
00:05:05.920
Right. And you have complete chaos. But there's another side to what's happening in tech.
00:05:12.600
There's, there's, it's a hidden side. That's what I've been studying in my research now for almost
00:05:16.380
seven years. That's where it gets scary. Now you could say, well, hey, it's scary enough what's on
00:05:21.540
the surface. That is true. It is kind of scary, but the hidden part of tech is much, much scarier.
00:05:33.120
Well, there are a couple of things. Number one, this is, this is easy for people to understand.
00:05:38.540
The other thing that's a little harder, but let's start with the easy one. The first thing is
00:05:42.760
censorship. Um, you know, that's a scary word that we remember maybe from the days of the Soviet
00:05:50.220
Union or something, except with tech, what that means is they're, they're showing you news feeds.
00:05:58.660
They're showing you answers to your questions. They're showing you search results. They're showing
00:06:02.980
you all kinds of things constantly. Um, and they have control over what you see and what you don't see.
00:06:12.860
And it turns out that they exercise that control in ways that serve the company or the company's values
00:06:22.580
or the company's politics. And we don't know what we don't know. I said to Ray Kurzweil,
00:06:30.720
you're a friend of Ray's. And I said to Ray Kurzweil, maybe 12 years ago, Ray, let's just think this
00:06:39.060
through. If Google has all the information about me and knows and can predict what I'm thinking,
00:06:47.280
then as a competitor of Google, I could go, you know what? I could make this better. And Google
00:06:53.060
will already know that I'm there. Why would they supply me with the information to help destroy them?
00:07:00.560
Um, and his answer was, oh, we'd never do that. Yes, that's right. That's, that's for sure. But
00:07:08.280
it turns out that they actually do exercise the powers that they have. Now the censorship power
00:07:16.160
is one power. They exercise that power. Facebook exercises that power. Twitter exercises that power.
00:07:22.680
On Twitter, it's called shadow banning. It means you're Ann Coulter, let's say, who I know, and I
00:07:28.240
consider her a friend. And Ann has millions of followers and she sends out a tweet and she doesn't
00:07:34.700
get much response. And she begins to think, Hey, maybe, maybe my tweet isn't reaching all of my
00:07:40.840
followers. And in fact, we know from leaks from Twitter that in fact, that is the case that Twitter
00:07:49.860
in fact throttles information, they throttle messaging. So all of this is, is censorship of
00:07:58.700
one sort or another. And it's censorship you can't see. You can see if YouTube, which is part of
00:08:05.820
Google, uh, you know, shuts down some conservative channel, which they just did again, apparently
00:08:12.060
yesterday. You can see something like that, but most of the censorship that these companies engage in
00:08:18.740
is completely invisible. Now that's just one scary thing, but the most scary thing is the manipulation.
00:08:28.160
These companies have the ability to manipulate us and our kids in ways that have never existed before
00:08:35.960
in human history. And I've been saying this for years and years because I've been showing the power
00:08:42.320
of manipulation that these companies have through controlled experiments. I've been saying this for a long
00:08:48.460
time, just lately, just lately, uh, whistleblowers have come forward. Documents have been leaked,
00:08:56.700
uh, which confirm what I've been saying since 2013. Um, you and I've talked off the air briefly about
00:09:05.640
Bernays, the father of propaganda. Um, a, I think he was the cousin or the nephew of Freud,
00:09:14.540
Freud, that's right. Uh, and, uh, was, was really the first guy to really master, at least in the
00:09:22.740
common era, propaganda and, and sway people. He's the reason we have bacon and eggs for breakfast
00:09:29.280
and everybody thinks it's always been that way. No, no, not until Bernays. Uh, the
00:09:35.480
average, uh, meal was a cup of coffee and a piece of toast for Americans. Um, and your study
00:09:44.400
includes that whole arc. Hitler said the reason why he lost World War One
00:09:54.920
is because we had Bernays. World War Two, a copy of Bernays' book was on Goebbels' desk when they
00:10:02.180
finally entered it. He, he learned the lesson: what Hitler wanted to do mechanically at the time
00:10:12.100
could not have been done. All of these things now can be done. And in some cases you say they are being
00:10:23.440
done. So, uh, and at really rapid speed. So how much time do we have before a door is closed and
00:10:35.240
you're in a cage? I think, um, the door has closed. I think we're, we're in the cage, uh, now, um, you
00:10:46.020
know, there's control and there's control. There's control, uh, like a stoplight that you can see,
00:10:52.260
right? And then there's other kinds of controls that you can't see, but can still impact, uh, which,
00:11:01.080
which way you're going to turn. Um, and there's a wonderful quote that comes from a book called the
00:11:08.000
Hidden Persuaders, published back in the 1950s, uh, which is more or less as follows: an unseen
00:11:15.400
dictatorship is possible still using the trappings of democracy.
00:11:23.360
Isn't this the difference between 1984 and Brave New World?
00:11:27.080
No, this goes beyond both of those visions. Um, this is,
00:11:33.840
Well, sure. Because it goes beyond in a dark way, in one sense, in a very dark way,
00:11:41.180
because, um, we're talking about techniques that have simply never existed before ever in human
00:11:49.340
history that are made possible by new tech and made possible because a lot of the tech is controlled
00:11:57.340
by monopolies. Uh, that was never part of the plan. That wasn't part of, uh,
00:12:04.400
you know, the, the concept that, um, you know, um, the, the, the originators of the internet had in
00:12:10.440
mind. They didn't, no one envisioned the opposite. Yeah. The internet was supposed to be the great
00:12:15.220
leveler. There weren't supposed to be these massive entities, Google being the largest, uh, that could
00:12:21.380
control so much of what happens on the internet. So, so there's a very dark side here in the sense that
00:12:28.920
these, these, these new methods of controlling people are in the hands of a very small number of
00:12:34.940
people. And those methods are being used. And we don't know it. Well, I know it at least.
00:12:42.640
Yeah, but the average, like, so this one I've, when, when people are banned on the internet,
00:12:47.120
I, I just want to shake people and say, Oh, how was the book burning today?
00:12:53.500
Hmm. In Germany, when they burned the books, everyone knew what that looked like. They knew
00:12:59.180
what it was. Now, when they just disappear, when you're shadow banned, when you're silenced,
00:13:05.440
when you're de-personed, there's no rally, there's no uniform, there's no bright light in
00:13:11.020
the sky. There's no, there's nothing to say to you, "This looks wrong." You can so easily just
00:13:20.820
move on and not even notice, but aren't we doing the same thing without knowing it?
00:13:28.580
Let me, let me explain one technique. For example, this will scare, scare the pants off you. Okay.
00:13:34.060
Um, I don't know. You're coming to a professional.
00:13:36.280
Uh, June, 2016, um, a small news service released a video and they posted it on YouTube,
00:13:45.400
by the way. And they were claiming that, uh, that, that Google made it very easy to get,
00:13:54.720
uh, when, when you started to type in a search term related to, let's say, uh, Donald Trump or a lot
00:14:03.200
of other people, uh, they would, they would flash suggestions at you. So those are suggestions.
00:14:08.480
You haven't even seen search results yet. They'd flash suggestions at you of all sorts,
00:14:12.420
including negative ones. But according to this news report, if you typed in anything related to
00:14:19.080
Hillary Clinton, you couldn't get anything negative. And I thought, well, why would they do that?
00:14:25.840
Why would they do that? That, that just, things like that bother me as a scientist.
00:14:32.000
Ah, it's more than a nudge. It turns out it's, it's, it's something amazing. What they're doing
00:14:36.820
is amazing. Uh, so first of all, with my staff, we started checking this out, see if this was true.
00:14:42.800
So we would type in "Hillary Clinton is" on Yahoo and Bing, and we'd get a long list of very negative
00:14:50.520
search suggestions: "Hillary Clinton is the devil," "Hillary Clinton is sick," et cetera.
00:14:54.400
And it turns out that's what people were actually searching for. That's what Bing and Yahoo were
00:14:59.880
showing you in their suggestions. We typed "Hillary Clinton is" on Google and we got "Hillary Clinton
00:15:05.860
is awesome," "Hillary Clinton is winning," and that's all. Okay. So again, the question is, why would
00:15:15.080
they do that? Why would they show you things, first of all, that no one is searching for, and why...
00:15:19.960
Can I attempt an answer and see if you see how far off I am? Yeah. I always thought, years ago,
00:15:26.980
I always thought that that's what people were actually searching for. And so I feel like I'm
00:15:33.360
out of touch. Oh gosh, everybody thinks she's this. And so I feel, you know, I feel separate,
00:15:42.120
which I don't want to feel separate, but I feel separate and apart. Like, Oh, everybody believes
00:15:46.860
this. I, maybe I'm wrong. Is that anywhere close to why they're doing it?
00:15:53.040
Um, maybe at one point, you know, it's worse than that. Yes, yes, it is. It is actually worse
00:15:59.060
than that. Uh, in fact, I think when those suggestions were first started, I think it was
00:16:03.140
just, uh, it was just for convenience, you know, to help you. Yeah. I think it was nothing
00:16:08.000
more than that. And then it turned into, yeah, autofill exactly. And then it, then it turned
00:16:12.500
into something else and it turned into something else. And what it's morphed into now, uh, is
00:16:17.880
something really, really scary because, um, I ended up conducting a long series of controlled
00:16:23.880
experiments on search suggestions, autocomplete it's called. And I'm pretty sure based on the
00:16:31.620
experiments that Google is manipulating people's searches from the very first character they
00:16:38.720
type into the search box. Now, if you kind of doubt that, then I suggest that you pull
00:16:46.180
out your phone and you type the letter a into the Google search box, because there's a pretty
00:16:53.280
good chance that you're going to see Amazon either in the first position, second or third
00:17:01.620
or maybe all three. Why is Google trying to send you to Amazon? Well, it turns out Amazon is
00:17:09.320
Google's largest advertiser. And it turns out that Google sends more traffic to Amazon than any other
00:17:17.660
traffic source. These are business partners and Google wants you to go there. But the experiments also
00:17:25.740
showed that just by manipulating search suggestions, I could turn a 50/50 split among undecided voters
00:17:34.340
into a 90/10 split. Wait, from the dropdown box or from the results? No, no. Just, just with those,
00:17:43.640
just the dropdown box as you're typing it in. Well, because those suggestions, it's very likely
00:17:49.800
people are going to click on one of those suggestions. And here's the key to it. The key to it is this.
00:17:55.220
That if you allow a negative to come onto that list, then this process, which social scientists
00:18:03.020
call negativity bias kicks in, it's also called the cockroach in the salad phenomenon. You've got
00:18:08.700
this big, beautiful salad. There's a little cockroach in the middle and all of your attention
00:18:13.040
is drawn to the cockroach and you won't eat the salad. If they allow even one negative to come into
00:18:19.660
that list, negativity bias occurs. And that negative suggestion draws 10 to 15 times as many clicks
00:18:28.460
as neutral or positive items. So if they allow a negative to come in there, they know it's going
00:18:34.700
to draw a lot of clicks. That's going to generate certain kinds of search results. It's going to bring
00:18:39.900
you to all kinds of negative information about that candidate or whatever that guitar, whatever
00:18:47.580
it is you've typed in. One of the simplest ways for Google to support a candidate is to suppress
00:18:55.340
negative search suggestions for that candidate, but to allow negatives to occur for the other candidates.
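A minimal sketch of the negativity-bias arithmetic described above, in Python, with purely hypothetical weights (not Epstein's data): if a single negative suggestion is assumed to draw roughly 10 to 15 times the clicks of a neutral or positive one, letting just one negative onto the list concentrates most of the clicks on it, while an all-positive list spreads clicks evenly.

# Minimal sketch: negativity bias in autocomplete, with hypothetical weights.
# A negative suggestion is assumed to attract ~12x the clicks of a neutral one,
# in line with the 10-to-15x range described in the interview.

def click_shares(suggestions, negativity_multiplier=12.0):
    # suggestions: list of (text, is_negative) pairs
    weights = [negativity_multiplier if is_negative else 1.0
               for _, is_negative in suggestions]
    total = sum(weights)
    return [(text, w / total) for (text, _), w in zip(suggestions, weights)]

# All neutral/positive suggestions: clicks spread evenly (25% each).
clean_list = [("candidate is winning", False),
              ("candidate is awesome", False),
              ("candidate is speaking today", False),
              ("candidate is on tv", False)]

# Allow one negative suggestion in: it captures roughly 80% of the clicks.
mixed_list = clean_list[:3] + [("candidate is a liar", True)]

for label, lst in (("no negatives", clean_list), ("one negative allowed", mixed_list)):
    print(label)
    for text, share in click_shares(lst):
        print(f"  {text:30s} {share:.0%}")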
00:19:02.060
It's interesting. Donald Trump kicked off his reelection over the summer and he was doing these massive
00:19:10.940
rallies. And, and I was going to do a monologue the next day about how Donald Trump and
00:19:20.860
Barack Obama had the same kind of momentum on their first election. They both had these big stadiums.
00:19:27.960
Obama was the first one to ever do that, and now Donald Trump is doing it, with the same kind
00:19:33.580
of passionate crowd. However, when Obama started to run a second time, I remember clearly his stadiums
00:19:43.300
were half to a quarter, uh, full and 75% of it was empty and they quickly had to change. But I remember
00:19:52.360
those images clearly. So I, I clicked on to find images of Donald Trump's rally. And then I tried to
00:20:00.420
find pictures of Barack Obama in 2012 running for reelection in stadiums. I spent two hours. I couldn't
00:20:09.820
find it. I gave it to one of my researchers who's a whiz on the internet. It took him from eight o'clock at
00:20:17.060
night till three in the morning to find it. And the only reason why he kept looking was because I know
00:20:23.780
they exist. I know they exist. What, how would you possibly explain that? Just, they just disappeared?
00:20:35.180
Well, if you'd asked me this question a couple of months ago, I could have speculated, but now I can
00:20:41.500
actually tell you because a lot of documents have recently been leaked from Google. And the main thing
00:20:49.320
I say when people ask me about them is I told you so because I've been telling people for years about
00:20:56.820
blacklists at the company. And it turns out the documents support every single thing I've been
00:21:04.180
saying. And among these documents, there are reports, memos, a manual explaining how within Google
00:21:16.180
we can re-rank, de-rank, and fringe rank. In other words, Google has developed tools for taking information
00:21:25.920
that they don't want people to see, and they can easily suppress it or boost it. So in other words,
00:21:36.400
whatever the normal process is, there's a lot of manual intervention. Now, I'm going to tell you now,
00:21:43.680
I'm telling you, I guess, on the record that I have now been approached by a potential whistleblower.
00:21:49.660
So I'm not going to give much detail here, but I have been approached by someone. This is within
00:21:55.340
the past week. And this person is confirming what these documents say, confirming that there is an
00:22:05.620
enormous amount of manual intervention occurring at Google.
00:22:10.460
Is this a manual? There's three schools of thought. One, they wrote algorithms, and that's just the way
00:22:16.820
the algorithms were written, and it's human bias gone into the algorithm, but not an ill intent.
00:22:23.880
It just is. They see the world a different way. Two, they're sitting around, and they're conspiring.
00:22:30.740
Yes. And they're all in on it. The third one, I think, is the most realistic, and that is,
00:22:36.340
yes, there's inherent bias, because if a bunch of conservatives wrote it, you just see the world
00:22:41.400
differently. And so there's that inherent bias. But there are also those who are like, you know what?
00:22:49.180
And that doesn't go all the way up the chain of command or anything else. That's just, there's a lot of
00:22:54.160
humans involved in that. So which is it? Is it a group of people that are sitting around having
00:23:01.200
meetings, passing memos, saying, do this? Or is it a small group of people?
00:23:05.280
Well, as an old psychology professor of mine used to say, if you can imagine it, it's probably
00:23:10.320
true. So in this case, it would be all of the above. We're seeing more and more evidence that
00:23:18.040
Google's algorithms are inherently biased. And there's a lot of research showing that
00:23:24.580
when someone programs an algorithm, his or her biases get written in. That shouldn't surprise
00:23:30.100
anyone. And there's also probably some people at the top who have certain values and certain
00:23:37.720
agendas, and they want those expressed in their products and services. And there are also individuals,
00:23:48.640
just employees, who have either the password authority or who have the hacking skills to just
00:23:58.160
make magic happen. The most famous case at Google is a programmer named Marius Milner.
00:24:06.140
And when a few years ago, someone like me, a professor like me, found out that Google's
00:24:13.320
Street View vehicles were driving up and down the streets of more than 30 countries for four years,
00:24:19.460
not only taking pictures of our houses, but also sucking up all of our Wi-Fi data.
00:24:25.880
Incredible. When this happened, Google blamed the entire operation on one employee,
00:24:33.880
Marius Milner. And of course, what happened to Marius Milner? He's on the streets now,
00:24:39.420
right, with a can in his hand. No, he's a hero at Google. He still works there. And on LinkedIn,
00:24:45.520
he lists his profession as hacker. So all of the above. And this is when you think of the fact that
00:24:56.180
this company can impact the thinking and behavior and purchases and attitudes and beliefs and votes
00:25:02.980
of more than two and a half billion people around the world. And all of those kinds of manipulations can be
00:25:13.040
made. Uh, the algorithm itself can cause people's views to change.
00:25:20.740
I'm writing a fiction book and, um, I need a Manchurian candidate. I need somebody to go and do
00:25:49.060
something. If I have all of their information and I control the way they see that information,
00:26:00.020
can't I find a fairly unstable individual? Let's say I need a Reichstag fire. Can't I find easily
00:26:10.220
individuals just keep funneling them without any fingerprints at all? And pretty much have them
00:26:22.880
Well, as it happens, a senior software engineer at Google, whose name is Shumeet Baluja,
00:26:29.180
he wrote a novel that was all about Google, except it was a fictional version of Google,
00:26:36.220
called The Silicon Jungle. Um, you'll probably recognize that title because he was basing it on
00:26:43.000
an early 1900s book, a magnificent book called The Jungle, about Chicago in those days. But The
00:26:50.680
Silicon Jungle is about a fictional version of Google in which all kinds of really bad things happen.
00:26:58.440
Sometimes it's because some outside agency comes in and targets an employee and says to that employee,
00:27:07.720
Hey, can you help us with this? And the next thing you know, um, terrorist groups in the Mideast,
00:27:15.860
in the Middle East now have a list of 5,000 potential recruits for their efforts. And all that information
00:27:24.180
is coming out of Google's data. So the answer is not only is that possible, but here is someone who's
00:27:31.000
been with Google a very, very long time actually telling you in fictional form, um, you know, this is
00:27:39.600
reality. So I read something about Amazon that really disturbed me, um, that Amazon does not see
00:27:48.560
themselves in the future as a sales company. You know, this, they're a shipping company and,
00:27:56.400
and that's how they want to be viewed in the future. That's their goal. But to become a shipping
00:28:04.200
company, they have to be able to, um, predict you at least 95% correctly every time. So instead of going
00:28:14.960
to Amazon and saying, Oh, geez, we need milk, we need this. And you know, I, I really liked those pants I saw
00:28:20.220
the other day. All those things would arrive at your doorstep and you would then just go, Oh, wow. Okay.
00:28:28.660
Wow. I was thinking about this. I was thinking about this. I was thinking about that. That's great. I don't know
00:28:32.640
about that one. I'll send that one back. As long as they don't have more than 5% returns,
00:28:38.180
that's their legitimate stated goal to do that.
00:28:47.980
Now, how much are they getting from me and how much are they saying, Hey,
00:28:56.020
have you seen these pants? Have you seen this? You want one of these? These are great. It's kind
00:29:00.900
of like when you're looking for a car, all of a sudden you start to see all these... you're looking
00:29:04.340
for a white car, and all of a sudden you notice all these cars are white. I mean, that is normal.
00:29:10.420
This is possibly not normal. It would be juiced for you. So, A: how accurate is that to reality?
00:29:24.380
Oh, quite accurate. Yes. Okay. Um, B, uh, where, where, where is free will? How do, how do we know
00:29:32.640
what we're doing and how do we know when we're being controlled or not? Well, there's, there's
00:29:38.320
free will in the sense that, that, that the, the, the illusion of free will is maintained because
00:29:44.220
you're making the choice. I want this. I don't want that. So the illusion of free will, that's
00:29:48.560
important. But in fact, there is no more free will because what's happening behind the scenes. And
00:29:54.000
this is not just at Amazon. This is a Google and Facebook as well. What's happening behind the
00:29:59.060
scenes is they're building digital models of you. And Google has been pretty open about this. I mean,
00:30:07.200
they want to be able to predict your needs and wants before you're even aware of them. I mean,
00:30:11.800
Google has been investing in DNA repositories for a long time and our DNA information has been
00:30:17.780
incorporated into our profiles because that way they can predict what diseases you're susceptible to.
00:30:23.520
And they also know, by the way, which fathers are not the real fathers. Um, so yeah,
00:30:29.760
that's a big part of that hidden world is the digital modeling. Uh, and you know, if you're,
00:30:36.300
if you're a techie, then it's just fascinating. But if you're someone who's interested in concepts
00:30:42.280
like human autonomy and free speech and free will, this is the end. This is the end. And,
00:30:49.400
and I'm saying we're in some sense, well, I'm sorry. I'm sorry. No, I know. I just want you to,
00:30:55.180
I can't tell you. I have, uh, I read a lot of futurists. I've been a fan of Ray Kurzweil's for 30
00:31:06.040
years. Um, and I've been saying this for so long and it has seemed like science fiction to everybody.
00:31:17.960
And I keep saying, as you must feel the same way: guys, no, look at the watch, and we're
00:31:25.960
close. We're close. And now for people to start to just be going now, wait a minute. It's like,
00:31:36.240
when did this happen to us? Oh, I don't know. 10 years ago, 10 years ago. So I'm sorry. I'm
00:31:42.760
frustrated. But when you say... I want you to repeat: you said free will, this is the end. Explain that.
00:31:58.760
Google has the equivalent of about, it's hard to even say this because people don't. Okay.
00:32:06.020
Google has the equivalent of about 3 million pages of information about you.
00:32:17.920
They know much more about you than you know about yourself. And they're using that information,
00:32:25.340
not just to make sure that their vendors can sell you products. They're using that information
00:32:31.740
to build a digital you so that they can predict your needs and wants and behavior moment to moment
00:32:41.740
in time every single day, because they know they can monetize that number one, but number two,
00:32:48.480
they know more than that. They know they can use that to create a better humankind. Okay. So
00:32:56.100
how could I possibly say something that absurd? Well, because one of the things that leaked from
00:33:02.280
Google about a year and a half ago is an eight-minute video called The Selfish Ledger.
00:33:09.180
It was meant for internal use only. I recommend strongly that everyone find this online and watch it
00:33:19.580
because this is about the possibility of Google using its information and its tools to re-engineer
00:33:28.900
humanity. And it even mentions according to company values. Oh, dear God. I, I made a transcript of
00:33:39.420
this video and I'm happy to post that online for, for people to look at because it makes it. What website?
00:33:44.700
Uh, I'll, I'll, I'll get it to your people and we'll, we'll put it up. But you know, when you,
00:33:51.000
when you realize the kind of thinking that's going on in a, in a place like this, where they have
00:33:56.000
this much information and this much power to control what people see and don't see, and then all these
00:34:04.380
new manipulative tools, they know, they understand the power that they have and they are openly,
00:34:13.640
I mean, within, within those closed doors, they're openly talking about ways, spectacular ways that
00:34:21.020
they can use these powers that they have powers that no, uh, dictator in, in history has ever had.
00:34:32.680
We'd all be speaking German if Hitler had this power.
00:34:36.280
Well, this, but this is, but this is beyond Hitler because this is almost crazy stuff. Now,
00:34:43.680
for example, I, I, no, seriously. Well, I'm saying something. I mean, well, for example, I, I have,
00:34:50.000
um, and my family doesn't approve of it. I have some friends who are conservatives, for example. Oh my gosh.
00:34:56.340
Yeah. Um, maybe you'll be one of them. Who knows? Um, and my conservative friends are just
00:35:04.060
laser focused on the fact that companies like Google want to make sure Trump is not reelected
00:35:10.700
and that's it. They're just focused. And they see Google as some bunch of crazy liberals, you know,
00:35:16.740
left-wing communist types. And that's, that's all they can see. Some of my conservative friends.
00:35:22.800
And I try to explain to them, no, no, no, no. You don't understand. The problem is much bigger
00:35:28.000
than that. If you look at how Google functions in different countries, because they're in every
00:35:33.620
country in the world, except China and North Korea. If you look at how they function in different
00:35:39.340
countries, they're not necessarily acting like a bunch of liberals. And there are countries in which
00:35:44.840
they promote the right and suppress the left. They go country by country doing whatever
00:35:52.480
they feel like doing, doing whatever serves the needs of the company, serves their financial
00:35:58.860
needs. Google had plans that were, that were squashed by their own
00:36:05.300
employees and by members of Congress to go back into China. In fact, they would have been there by now,
00:36:11.820
or they would have been there early next year at the latest in a project called Dragonfly,
00:36:16.660
working side by side with the Chinese government to help round people up, to control the Chinese
00:36:24.560
population. Yeah. Okay. So let me go to China here for a second on that. Social credit scores.
00:36:31.520
Now it's hard to, it's hard to know how people really feel about it because they're all in China
00:36:40.080
and you don't dare speak out against it. But, um, you know, you'll hear these interviews with people
00:36:47.840
who are absolutely controlled every minute. China has so much control in many of their big cities
00:36:53.620
that they can actually cause you to never leave your house. They don't need to lock your door. They
00:37:00.180
don't need to post a guard. You will never leave your house because you're unwelcome everywhere.
00:37:06.140
And they're proud of that. But you talk to people, the average person, they're like, this is great.
00:37:12.940
We don't have any crime anymore. We don't have this. We don't have that. We kind of have the same
00:37:17.700
thing going on here in America. People love the fact that Amazon could, could predict me so much
00:37:24.980
that they're going to have exactly what I want. They, they, they love this. My life is easy.
00:37:31.740
So how, how do you fight against, against that?
00:37:38.180
Look, every single day I have to remind my own children, I have five wonderful children,
00:37:45.180
uh, please stop using Gmail. Please stop using Chrome, which is a Google's browser. Please,
00:37:51.460
please stop using Google products. Please don't get Android phones. Android is also part of Google.
00:37:58.040
I was at my son's, uh, we, we, I've never let my kids have phones. I don't let them have iPads. Um,
00:38:07.480
you know, we monitor what they're doing. We don't have Chrome. Um, I go to his high school
00:38:13.320
and they were so excited to announce that they are soon all going to be a Google classroom. And I thought,
00:38:22.900
you're handing the life of my child over to Google. You're giving them everything.
00:38:29.080
That's right. That's right. Uh, well, educational institutions will get the entire suite of
00:38:36.060
Google services for free. So a lot of major universities, even UCLA, even Columbia University,
00:38:42.120
uh, they, they, of course, they're going to take those services. So what, what's going on here?
00:38:46.500
What's going on here is that, uh, these companies are providing certain kinds of services for free,
00:38:54.620
quote unquote, it's not really free because you pay for it with your freedom, but they are giving
00:38:59.740
you services, which are sort of free and they work really well. And they're a lot of fun to use.
00:39:06.120
And what people... and, and let's say, and they're convenient. They're sure as heck
00:39:12.160
convenient. Cause if you're going to have to use some alternative that, that might actually
00:39:16.700
cost you a few bucks... Or every time I do something, people always call in: I use DuckDuckGo.
00:39:21.560
Have you tried DuckDuckGo? It sucks. Right. You know what I mean? So I get the convenience
00:39:28.020
part. What people don't realize is these are surveillance tools. You're, you're voluntarily,
00:39:34.780
uh, using a surveillance tool and their surveillance tools are not just the ones you're
00:39:42.060
kind of aware of. Something like Gmail, for example, is pretty kind of obvious. Uh, Chrome sees
00:39:48.500
everything you do online and reports it, uh, Android phones, even when you're offline, they're still
00:39:56.120
recording everything you do and everywhere you go. And the moment you go back online, all that
00:40:01.120
information is uploaded. Um, but the surveillance goes beyond that. Stop there for a second. Sure.
00:40:10.200
I've got nothing to hide. I'm not doing anything wrong. So why do I care? Yeah. I've had that
00:40:18.160
conversation. Well, first of all, everything, everyone has things to hide. So you can tell
00:40:24.060
me when we're off the air, give me a sample. Uh, but the nothing to hide idea that just doesn't
00:40:30.740
work because you've got to keep in mind that they're building a digital model of you. And that
00:40:38.560
digital model not only lets them sell things to you, but lets them control you. The more they know
00:40:46.020
about you, the easier it is for them to flip your thinking one way or the other, any way they choose.
00:40:54.040
I know this because this is what I do in my experiments. So nothing to hide
00:40:59.560
is irrelevant. You do not want a private company, which is not accountable to the public,
00:41:07.760
our public or any other public around the world to have that kind of power over you and all of your
00:41:14.180
children. And children are far more susceptible to these manipulations than adults are. Do you really
00:41:21.200
want to give that kind of power to a private company?
00:41:35.860
This is so overwhelming for the average person. Before we get into what can be done.
00:41:43.460
Uh, I remember, I remember almost 30 years ago now I said, there's going to come a time when
00:41:52.760
you're not going to believe your eyes. The digital manipulation will be so good that you will not
00:41:59.320
believe your eyes. And, uh, six years ago we started talking about deep fakes and I've been tracking
00:42:07.780
them online for the, for the audience. Here it is. Here it is. I'm telling you, this is
00:42:13.280
not about Nicolas Cage. This is going to get spooky very soon. A year ago, two years ago, I said 2020
00:42:20.940
is the year. All hell is going to break loose. What's coming with digital fakes, with deep
00:42:28.060
fakes? Well, 2020 is the year. So you called that because, um, next year, uh, especially in, in,
00:42:37.220
in connection with the, with the election, it's not just fake news we have to worry about. Um, it's fake
00:42:44.100
everything and fake videos next year are going to be so good that you're going to have, you literally
00:42:54.060
will have, uh, whoever the Democratic candidate is, will, you know, give a speech, and then
00:43:01.240
there's going to be for, for every true word that that candidate spoke, there might be a hundred
00:43:06.900
videos in which the words are changed or the same could be done with Donald Trump. You're, you're just
00:43:13.220
not going to know what's real and what isn't real. Now the, the solution quote unquote, that, that some
00:43:19.200
of the tech companies, uh, are talking about, uh, is, is to do what more of what they're doing now,
00:43:24.420
which is suppressing content that they think is defective in some way and boosting content that
00:43:31.220
they think is okay. That's extremely dangerous in and of itself because who are they? Well, who are
00:43:37.960
they? And that, that again, makes them official censors or as, or as one of these documents from
00:43:44.960
Google says is the good source. It makes us, no, it makes us the good censors. We're the good
00:43:51.420
censors. Okay. But I don't want them to have that power. I don't even want a government to have that
00:43:57.240
power. I don't want anyone to have that power over me and my kids, but 2020 is the year. There is going
00:44:04.160
to be incredible amount of fake stuff out there all around us, no matter what the government or any,
00:44:10.900
any company tries to do. Are you concerned at all? My fear was the first one, if it's big enough,
00:44:18.700
powerful enough, the first one that really breaks through will be War of the Worlds. I mean, we are,
00:44:24.960
I'm a big fan of Orson Welles. I studied that time period. How could he move the people to where a good
00:44:32.100
portion thought that was real? Well, when you understand the time period and new technology and
00:44:37.920
everything else, yeah, it makes sense. We're there again. And that first deep fake that's real and
00:44:45.540
hits the right nerve will be the Orson Welles War of the Worlds down the road, but it's going to happen
00:44:53.740
so fast that it could cause, it could cause war. It could, it could cause civil unrest, anything.
00:45:03.340
Are you, are you confident that, that somehow or another, we're going to be able to filter those
00:45:12.020
out? Are we going to, or do you think that that's realistic to happen? Okay. I don't think we're going
00:45:18.680
to be able to filter those out very well. I think, starting next year and
00:45:25.600
beyond, forever, there'll be more and more chaos around all kinds of fake content of all sorts.
00:45:34.120
Um, you know, there'll be avatars that are indistinguishable from people and there'll be
00:45:39.100
AIs that are indistinguishable from people. And so, and that's just part of the future.
00:45:43.860
Believe it or not, even though that chaos is coming, that doesn't worry me that much because
00:45:49.960
that kind of chaos is competitive and visible. In other words, at least you can see the fake video
00:45:58.120
and someone could post another fake video with different words. It's, it's, to me, it's like
00:46:03.900
billboards. You know, you put up your billboard over here, I put up my billboard over here. In my mind,
00:46:10.660
eventually we go, well, neither one of you is really probably right.
00:46:15.340
And you start to think for yourself. Well, maybe, but the point is at least, at least you're up
00:46:22.340
against, um, you know, uh, sources of influence, types of influence that you can see and where
00:46:28.980
there is competition. Okay. So what I'm much more concerned about are these, these other kinds of
00:46:36.040
influence, these creepy kinds of influence that you can't see and where there is no possibility of
00:46:44.480
competition. In other words, one way to put it is this: content, in a sense... a video is content,
00:46:51.100
an ad is content. Okay. Uh, a billboard is content. In a way, content doesn't really matter
00:46:58.420
anymore. What matters is who does the filtering of content and who does the ordering of content.
00:47:06.180
In other words, who decides what content we can see or not see and who decides the order in which
00:47:14.520
we are presented the content. Now, when you get to filtering and ordering, you're back to Google
00:47:19.720
number one, to a lesser extent, Facebook and to a much lesser extent, Twitter. And you're back
00:47:26.280
where we started, which is we've got some really dangerous companies with all kinds of power that
00:47:34.220
they should not have. So I know you are trying to set up a monitoring system and you need $50
00:47:42.840
million. So if anybody is watching this and you happen to have $50 million, here's where you should write
00:47:49.020
your check. Um, but, but average people are helping you as well. And, and for a $50 million project to
00:47:59.380
happen, you need somebody with deep pockets right now to be able to hit 2020, uh, and to monitor
00:48:06.180
this election and to be able to prove, yes, this was fair or this was corrupt. Um, first question,
00:48:17.500
can the average person help? Can you, can you start things with $10 million?
00:48:22.180
Obviously any, any funds we have that we can throw at this problem, uh, we're going to,
00:48:30.260
you're going to do, we're going to do it, of course. But this, this time around, I've, I've built
00:48:34.380
two monitoring systems already. And this time around, I want to do it on a very large scale. Uh,
00:48:39.260
and in, in a way that has, has just enormous credibility that's, that's, that's unimpeachable
00:48:46.600
to use the language that's common in the news these days. Um, and that means
00:48:53.460
having a very large group of, we call them field agents in all 50 States, thousands of them. And
00:49:01.340
we want to be looking over their shoulders with their permissions. And we want to be tracking
00:49:05.840
bias in news feeds, email suppression, uh, shadow banning, uh, biased search results,
00:49:13.220
search suggestions, and a dozen other things, which I won't mention, but we, we want to be
00:49:18.300
tracking this stuff in real time, analyzing it in real time so that we can report manipulations.
00:49:26.120
We can report irregularities as they are occurring. Now, the best result of that would be that these
00:49:34.360
companies back off and we actually end up with a true free and fair election. If they
00:49:43.020
back off, I will have done my job. If they don't back off, I think the repercussions
00:49:49.700
will, will take these companies down. I think the, I think that we'll get serious
00:49:56.720
if, if these companies continue this nonsense after being exposed over and over again, uh,
00:50:02.860
with a very, very, very large, highly credible system of monitoring. If they don't back down,
00:50:08.980
I think we're going to see, you know, fines that go through the roof and we're going to see
00:50:14.080
actual, uh, penalties and probably criminal prosecutions. So let me rephrase the question
00:50:20.480
that I asked Ray Kurzweil. Let's say I was going to build a system to monitor Google, Ray, and you knew
00:50:31.260
what I was doing and you knew the people because you have information on everyone. You know, the
00:50:37.980
people who I am monitoring. What would stop Google from just making sure those people's
00:50:47.380
algorithms are clean? Well, that's why when we built these systems in the past, we had to do what
00:50:52.800
the Nielsen company has been doing since 1950. You know, Nielsen recruits families, uh, to monitor
00:50:59.260
people's television watching. And it's very, very important that the identities of those families
00:51:04.380
not be known because imagine if they were known, imagine the pressure on those families to watch
00:51:10.840
this show and not that show. I mean, the Nielsen ratings determine advertising rates. They determine
00:51:17.100
whether a show is taken off the air. So they do things quite secretly. And that's what we did in
00:51:23.440
2016 and 2018. In 2020, much of what we do will be done in secret.
00:51:30.240
How many people do you think, think you're overly dramatic or, or paranoid?
00:51:40.320
Well, uh, as Alan Dershowitz once said in a talk, uh, that I was attending, there's a fine line
00:51:46.420
between paranoia and caution. Um, you know, I don't care how people label me. Um, my work adheres to the
00:51:56.100
very highest standards of scientific integrity. It always has, uh, the monitoring systems I've built
00:52:03.700
are rock solid when it comes to, uh, methodology and results. Um, I know I'm doing things right. Uh,
00:52:13.940
so whatever names people throw at me, it's irrelevant. I'm trying to get to the truth of
00:52:20.020
the matter. When we have massive amounts of data that have been preserved, which are normally lost
00:52:26.660
forever because what we're preserving are ephemeral experiences. That's key. That, that phrase,
00:52:31.780
which is used internally at Google is key. Define it. An ephemeral experience is something like
00:52:37.540
generating search results. You type in a term, search results appear, they're generated
00:52:42.900
on the fly just for you. And then you click and they're gone and they're gone forever. So it's a,
00:52:49.540
it's a short, short-lived experience, uh, that is generated for you uniquely and then disappears. And there's
00:52:59.220
no way you can go back in time and recreate it. And ephemeral experiences, that's what my monitoring
00:53:05.700
systems have been able to capture. That's, that's never been done before. And that is where we can
00:53:13.380
get a handle on these tech companies. And I, I believe that, that setting up big, you know, well-crafted
00:53:22.660
monitoring systems is critical, not just to try to, to try to guarantee a free, uh, and fair election
00:53:30.260
in 2020. I think these systems are our protection against the misuse of technologies, uh, moving
00:53:40.340
forward into the decades that follow. I mean, I think this is, this is how we protect humankind,
00:53:45.700
um, from technology: with monitoring systems.
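A minimal sketch, in Python, of what preserving an "ephemeral experience" might look like in practice: a field agent's client fetches a search results page and archives a timestamped copy for later bias analysis. The engine URL, archive layout, and function names here are hypothetical, not the actual monitoring system, and any real deployment would need the user's permission and would have to respect the search engine's terms of service.

# Minimal sketch (hypothetical, not the actual monitoring system): capture a
# search results page that would otherwise be generated once and lost forever,
# and store a timestamped copy for later analysis.
import datetime
import hashlib
import json
import pathlib
import urllib.parse
import urllib.request

ARCHIVE_DIR = pathlib.Path("ephemeral_archive")  # hypothetical local store

def capture_search(query, engine_url="https://www.bing.com/search"):
    # Build the query URL and fetch the raw HTML of the results page.
    url = engine_url + "?" + urllib.parse.urlencode({"q": query})
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req, timeout=30).read().decode("utf-8", "replace")

    # Record what was shown and when.
    stamp = datetime.datetime.utcnow().isoformat()
    record = {"query": query, "url": url, "captured_at": stamp, "html": html}

    # Write one JSON file per capture so the experience can be re-examined later.
    ARCHIVE_DIR.mkdir(exist_ok=True)
    name = hashlib.sha256((stamp + query).encode()).hexdigest()[:16] + ".json"
    path = ARCHIVE_DIR / name
    path.write_text(json.dumps(record), encoding="utf-8")
    return path

# Example (commented out; requires network access):
# print(capture_search("election news"))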
00:53:48.900
Why isn't, why isn't DARPA? Why isn't the Pentagon? This is to protect and to preserve
00:53:59.380
the Constitution of the United States and its people's freedom. Why isn't the government backing
00:54:08.740
this? Why, why, why could I get money for turtle tunnel research, but not this?
00:54:18.900
Uh, Glenn, I have to say in all honesty, I've been wondering that myself. Um, one of my contacts
00:54:23.620
in Washington says, well, maybe they are doing it and you just don't know. I mean, that could be.
00:54:28.420
If you could, what was the phrase you said? If you can think it, what was that? Uh, because my,
00:54:36.820
you know, the CIA helped fund Google, the NSA loves this kind of information. The Google, uh,
00:54:46.980
employees, strangely, for some reason, just, uh, you know, had a day strike where they all marched
00:54:53.780
in front and said, Google, you have to stop giving information to ICE. Well, wait,
00:55:00.580
when did that start? What do you mean, give information to ICE? What information are they
00:55:04.260
giving and who are they giving it to and how much are they giving? Um, my fear is that those two are
00:55:10.660
two peas in a pod that they are feasting on each other for protection, both sides.
00:55:18.980
Well, Google works very closely with government agencies. So one answer I could give to your
00:55:23.380
question is, I think for this first large scale monitoring system, I think we probably should try
00:55:29.300
to do it independent of government. Yeah. I just think that, you know, these systems need to exist
00:55:36.340
around the world. I think probably they should be independent of government. They should be
00:55:41.220
reporting possibly to government agencies as appropriate or to the media as appropriate or to
00:55:47.140
the Federal Election Commission as appropriate. But I think, uh, our best protection is to have
00:55:53.700
independent, um, you know, uh, organizations. They're basically nonpartisan, uh, nonprofit organizations
00:56:01.380
that exist, uh, for one purpose and one purpose only. And that is to protect humanity,
00:56:07.380
protect democracy, protect the free and fair election by looking over people's shoulders
00:56:13.860
and seeing what these companies are showing us. What are they, what are they telling us over these,
00:56:22.660
over these home assistants, uh, like, uh, like Alexa or Google Home, what are they saying? What kinds of
00:56:29.220
answers are they giving? Uh, that's one of the things we're going to monitor and we're going to
00:56:35.940
tabulate those data and we're going to analyze them as fast as we can using artificial intelligence.
00:56:41.700
You see, we're talking about tech. Tech is the problem. How do you fight tech? You fight tech
00:56:47.300
with tech. That's how you do it. Tech can keep up with tech. Laws and regulations, in my opinion,
00:56:56.260
cannot. No, they move way too slowly. And then you get into all the partisanship.
00:57:01.300
I talked to, I talked to people on the National Security Council at one point and we all sat down
00:57:08.340
and, uh, I said, I just want to talk to you about some future tech that's coming.
00:57:14.420
And they looked at each other and said, well, we're going to have to work with Congress. We're going to
00:57:17.940
have to... I mean, should we... They started talking among themselves: maybe we should look
00:57:22.260
into this law and this law. And I said, by the time you guys figure it out, it's already over.
00:57:29.620
It's over. They're onto something else. So isn't this kind of what Stephen Hawking was
00:57:39.540
talking about? The end of Homo sapiens, not meaning that we're all going to, that there won't be people
00:57:45.540
that necessarily look like us, but we will have to augment or we're never going to be able to keep
00:57:50.980
up with tech. Hawking was one of a number of people. Elon Musk is another who'd been,
00:57:58.820
who've been warning about, you know, uh, the dangers of, of new technologies, the various kinds of
00:58:05.620
threats that they pose to the human race. So I'm not taking on that, that whole gamut. No,
00:58:11.460
but I'm, but I'm saying, look, I've, I've, I've been doing some very rigorous scientific research
00:58:17.140
for a long time. It shows without any doubt, the power that these companies have to manipulate
00:58:23.460
people without their knowing and without leaving a paper trail again, ephemeral experiences. That's
00:58:29.940
the key here. And, and, and for someone who maybe who just tuned in, let me just repeat what I said
00:58:35.620
earlier. That phrase ephemeral experiences that actually comes from a memo that was leaked to the
00:58:42.420
Wall Street Journal last year from Google. They know the power of ephemeral experiences to influence
00:58:50.260
people. So, you know, that's, that's what I've stumbled onto. I've stumbled onto that. I've,
00:58:55.220
I've analyzed it. I've quantified it. I'm pretty sure I know how we can protect ourselves from it,
00:59:02.020
from that kind of manipulation. And so that's my, that's my focus. Now, maybe we'll all be taken
00:59:08.740
over by robots. And in fact, this will all be for nothing, but, um, at least in the short term,
00:59:15.700
I'm, I'm trying to protect the integrity of the free and fair election in 2020. Now I have friends and
00:59:23.380
family members who say, what, but that, that means you're helping Donald Trump. No, you're helping
00:59:29.300
the truth. Okay. I, I, who cares? Okay. I, just for the record, for the record, first of all,
00:59:37.540
first and foremost, I am not suicidal. Okay. Got to get that on the record. Did you, did you,
00:59:43.300
did you hear that Google? Okay. I am not suicidal. And secondly, I am not a Trump supporter.
00:59:48.440
Stop for a second. Yeah. I know you said that jokingly. Yeah. But you have to think those things,
00:59:55.280
at least in scary moments of your life. Like I should be, you know, when, when I first started
01:00:03.380
taking on George Soros, I, I couldn't imagine anyone would try to do, but there were times where
01:00:08.680
I like, I want everybody to know George Soros and I are at odds with each other. I mean, do you have
01:00:14.960
those moments of, um, let's, let's put it this way. Um, I had one, I'm not going to identify him,
01:00:23.300
but one of the, one of the people who's highest in our country in law enforcement, uh, came up to me
01:00:32.340
at one point after I gave a talk, and, and I, there was no smile, not, not a trace of a smile,
01:00:39.480
unfortunately. He said, you know, I think in the next few months, you're going to probably die in
01:00:43.740
an accident. He said, and just keep in mind that people die in accidents every day. And then he
01:00:49.800
walked away. Oh my gosh. So, uh, you know, uh, do such thoughts pop into my head? Well, sort of,
01:00:58.320
now and then. The, the, the latest whistleblower from Google, Zach Vorhies, uh, he was, until he
01:01:06.520
finally, you know, launched his documents out into the world. Uh, he was very much afraid for
01:01:12.080
his life, very much afraid for his life. Uh, and Google sent police to his home and sent a SWAT team
01:01:18.440
and, you know, so, uh, you know, anything's possible, but I, I know what I am. I'm a researcher.
01:01:28.460
I'm very, very good at what I do. I'm going to stay focused on doing the research, doing what I do
01:01:34.920
well. And, you know, analyzing my findings, reporting my findings, I'm going to do this,
01:01:42.680
uh, whether it helps Trump or not. I, I just don't think that's an issue that we should.
01:01:49.560
Jefferson said in the end, you have to trust the American people. They will make a mistake,
01:01:57.100
but in the end, they'll figure it out and they'll correct that mistake. It's not for us to decide.
01:02:03.960
That's, and to me, that is the problem with the early 20th century progressive.
01:02:09.940
I know better. Who are you to say you know better? Let people figure it out. People will
01:02:17.600
figure it out and we're going to make mistakes, but people have to make the choice, not an algorithm,
01:02:24.400
not a company, not a dictator. Well, that's, that's, that's again, where I'm digging in because
01:02:31.580
if we have no monitoring system in place, I mean, let's just talk about that possibility.
01:02:36.860
If there is no monitoring system in place, whether it's mine or the, or DARPA's or whoever,
01:02:42.640
if there's no monitoring system, no, no one turns the switch and turns the system on so that we can
01:02:48.720
look over people's shoulders and see what these companies are showing them and, and, and preserve
01:02:53.700
those ephemeral experiences. If we have no such system in place in 2020, we will have absolutely
01:03:02.260
no idea why whoever won the presidency won the presidency. We won't have a clue. And I can tell
01:03:10.240
you this without any doubt, millions of votes will have been shifted by big tech companies without our
01:03:17.700
understanding what happened and without us being able to go back in time and figure it out.
01:03:33.080
Are we living, or about to live, in the Matrix, but a different kind, where we're, we're not in pods,
01:03:40.260
but we're living in a controlled world? Anyway, my impression in, in, in the Matrix series of
01:03:49.580
movies was that, that, that people did make real choices and those choices had consequences. Maybe
01:03:56.280
I'm just, you know, fantasizing here, but that was my impression. Certainly the illusion of free choice
01:04:04.420
existed in the Matrix. Uh, you know, was it all just an illusion? I don't know. I, I don't think the
01:04:12.100
Matrix series really weighed in on that point, but we are definitely at a point in time in which,
01:04:19.740
uh, you know, as, as this, uh, British economist said long ago,
01:04:30.140
an unseen dictatorship is possible within the trappings of democracy. We're definitely at that
01:04:38.120
point in time right now where an unseen dictatorship is indeed possible. I think to
01:04:45.080
some extent it's already in place. I think to some extent democracy, as we originally conceived of it
01:04:52.280
is an illusion. And to some extent, uh, free will right now, uh, is also an illusion. We're being
01:05:02.060
nudged, but not, not just in the, in the sense that Professor Thaler, uh, outlined so
01:05:09.460
brilliantly, but we're, we're actually being pushed and shoved. I mean, uh, by, by forces that in some
01:05:18.540
sense are all subliminal. I mean, you cannot see bias in search results. You can't. Yeah. You can't
01:05:25.640
like, I remember when I was a kid and, and there was this big thing about subliminal editing and you
01:05:31.200
could slow the tape or the film down and you could go that frame right there. This doesn't have the
01:05:37.640
film. You can't look at the frame. It's gone. Well, normally it's gone. In 2016, uh, I preserved, um,
01:05:47.280
13,207 election related searches and, you know, was able to capture that, capture the web pages
01:05:56.380
in, in, in a, in a sense, slow things down so we could go back and analyze it. 2018, I, you know,
01:06:03.420
uh, preserved, you know, more than twice that information. 2020, I want to be able to preserve
01:06:08.700
millions of searches. And how, and what were the results in 16 and 18? In 16 and 18, both times,
01:06:17.280
uh, I found very, very strong, uh, pro-liberal bias in all 10 search positions on the first page of
01:06:28.080
Google search results, but not on Bing or Yahoo. So that's important. The other thing that's important
01:06:33.760
is the statistics. So we're not going to do a statistics course here, but let me just tell you
01:06:40.120
that the results, our results were significant, not at the 0.05 level, which is a kind of generous
01:06:48.560
cutoff that's often used in research, but at the 0.001 level, meaning the chances that we got our
01:06:57.200
results by chance alone were less than one in a thousand. Wow.
01:07:02.520
I'm kind of at this place to where I kind of found myself when, uh, I heard about ISIS taking slaves
01:07:14.660
and, and I thought, what am I doing with my life? I mean, how do we, how do we even help?
01:07:23.440
Where's the government on this? And then I realized, you know what? We don't need the government to do
01:07:28.640
those things. That's up for us to do. And, uh, we raised about $26 million and saved about 30,000
01:07:38.240
people, moved them out of the area, rescued some slaves. Um, and I kind of find myself in that same,
01:07:47.980
that same thing with you. Would you write a compelling letter to my colleagues?
01:07:58.640
And I'd like to send it out to every broadcaster and, uh, every podcaster that I know. And I'd like
01:08:07.920
to see if we can coordinate a broadcast podcast day, week, whatever it is, because I think the American
01:08:17.840
people, our, our footprint is 20 or 30 million, 30 million people in a month.
01:08:25.400
We can hit 50 million pretty easily in a week. If we would work together and everybody gave a dollar
01:08:32.740
and some people gave $10, we could hit that quickly. Would you be willing to write that letter?
01:08:41.520
Yeah. I think you are an extraordinarily brave man. Um, I know a little bit of what you have faced,
01:08:52.300
um, and, uh, and the temptation and probably the conversations with your family and your wife,
01:08:59.320
honey, is it really worth this? And, uh, the answer with you is yes. And I am deeply grateful.
01:09:09.480
Uh, and my children will be deeply grateful for your research and what you have had the courage
01:09:18.240
to stand and say. Thank you. Just a reminder, I'd love you to rate and subscribe to the podcast
01:09:30.020
and pass this on to a friend so it can be discovered by other people.