The John-Henry Westen Show - September 01, 2020


Google is spying on your private conversations, manipulating search results: Harvard-trained researcher


Episode Stats

Length: 55 minutes

Words per Minute: 137.5

Word Count: 7,677

Sentence Count: 552

Misogynist Sentences: 1

Hate Speech Sentences: 3


Summary

In this episode, Dr. Robert Epstein joins us to discuss his research findings on the massive manipulation of public opinion and the surveillance and censorship by Google, Facebook, and other tech giants, and why we should all be concerned about it.


Transcript

00:00:00.000 This show is going to be something of an eye-opener, and this is definitely not my usual guest.
00:00:07.420 We'll be speaking with a dyed-in-the-wool liberal, a Jewish liberal Democrat by his
00:00:12.360 own definition.
00:00:13.700 But his research findings into the massive manipulation of public opinion, the surveillance
00:00:19.040 and censorship by Google and Facebook and other tech giants are something we all need
00:00:23.800 to hear and take action on.
00:00:25.620 According to his Wikipedia entry, Dr. Robert Epstein is an American psychologist, professor,
00:00:31.560 author, and journalist.
00:00:32.700 He earned his PhD in psychology at Harvard University in 1981.
00:00:36.640 He was the editor-in-chief of Psychology Today, a visiting scholar at the University of California,
00:00:42.300 San Diego, and the founder and director emeritus of the Cambridge Center for Behavioral Studies
00:00:48.680 in Concord, Massachusetts.
00:00:50.100 But it's his exposing of the unbelievable control exerted by Google and other tech giants
00:00:57.260 that has shaken the world on all sides of the political spectrum.
00:01:02.120 Stay tuned.
00:01:03.020 Before we get into today's episode of the John-Henry Westen Show, I wanted to take a minute
00:01:26.120 to let you know that this episode is a quasi-kickoff to a new series of videos that we'll be publishing
00:01:32.800 called Uncensored: Big Tech vs. Free Speech.
00:01:37.000 This series will cover interviews with various experts, professionals, individuals, and organizations
00:01:42.700 who are all experiencing the suppression and censorship coming from big tech companies
00:01:47.980 like Google, Facebook, and Twitter.
00:01:50.680 These videos will be illuminating, shining a light on the dark truth of how these monopolies
00:01:57.160 actively push forward the liberal agenda while crushing Christian values.
00:02:02.960 We hope you'll watch each video as they come out weekly over the course of the next few months.
00:02:08.840 Dr. Epstein, welcome to the program.
00:02:10.760 It's my pleasure.
00:02:11.880 It is absolutely fascinating to speak with you.
00:02:14.780 I was saying just before we got on together that it's odd having you on my program because
00:02:20.280 I think we come at things from the absolute opposite side of the political spectrum.
00:02:25.160 Well, that's probably true.
00:02:27.840 I was at President Trump's acceptance speech a few days ago in D.C., but I am not a conservative.
00:02:35.880 I don't have a conservative bone in my body, as I said when I testified before Congress last year.
00:02:41.340 And so the message I've been trying to get out there has nothing to do with my political views.
00:02:51.220 In fact, I think all of us need to hear the message because, again, we need to set our politics aside
00:02:59.100 and try to understand what's happening with the tech companies and especially the way they're affecting democracy.
00:03:05.920 So I would say no matter what your politics are, this is something you need to learn about.
00:03:14.200 Okay, so there's an interesting question that stems even from that right off the bat.
00:03:19.100 So typically the direction that Google and Facebook are going is in what you might call your direction.
00:03:26.540 It's definitely slanted to the left.
00:03:29.360 I don't think there's any question about that.
00:03:30.780 So one of the first questions would be, why are you concerned about it since things seem to be going in your direction?
00:03:38.480 Oh, there's absolutely no question.
00:03:40.520 There might have been a few years ago, but now there's no question at all.
00:03:43.480 We know from leaks and we know from whistleblowers.
00:03:46.860 We know from the pattern of donations and lots of other things that these big companies,
00:03:52.640 especially Google and Facebook, that they definitely lean left, support the Democrats.
00:04:00.180 And I should be, as you say, I should be pleased with that.
00:04:04.220 Certainly a lot of members of my family are very pleased with that.
00:04:08.280 But I'm not pleased with it because I've been studying in controlled, randomized experiments now for almost eight years.
00:04:15.780 I've been studying and discovering the power that these companies have to shift opinions, attitudes, beliefs, purchases, and votes.
00:04:30.320 And I've been quantifying this power that they have with tens of thousands of people covering five national elections so far, soon to be six.
00:04:41.700 And, you know, I'm scared to death by what I have found.
00:04:47.080 So the reason I'm speaking up is because I don't care who these companies are favoring right now.
00:04:55.640 The fact is they shouldn't have this kind of power.
00:04:59.280 And years ago, I used to just speculate that they might be using the power.
00:05:03.540 Now I'm no longer speculating.
00:05:05.800 I know from my own data, I know from leaks of documents, emails, videos, and I know from statements made by whistleblowers
00:05:17.780 that these companies are using this power strategically and deliberately and on a very large scale.
00:05:23.960 Now the fact that they happen to be favoring Democrats in this country right now, who cares?
00:05:29.740 There's a much bigger issue here, because in different countries, by the way, they don't always favor people leaning to the left.
00:05:37.460 In Cuba, they favor people leaning to the right.
00:05:40.200 When Google has off and on been working with the Chinese government, well, they're favoring the Chinese government
00:05:47.180 in its effort to control its own people using technology.
00:05:53.100 In other words, they do whatever they want to do.
00:05:55.720 And so whoever they're favoring today might not be who they're favoring tomorrow.
00:06:01.700 These are big issues involving surveillance, involving censorship, involving free speech, human autonomy.
00:06:13.020 They're much bigger issues than who's going to win the next senatorial race or presidency.
00:06:19.920 Much, much bigger issues.
00:06:21.260 And we've got to try to get some perspective on this and stop focusing on kind of the minutiae,
00:06:28.140 because these issues are so important that I would say this year is a watershed year.
00:06:36.260 In other words, this is the year where we either wake up and try to fight these companies,
00:06:43.020 or we will be turning over democracy to these companies, Google especially.
00:06:49.400 We will literally be turning over democracy to these companies.
00:06:54.500 And a little bit of history here.
00:06:55.940 People have heard about President Eisenhower's last speech before Kennedy became president.
00:07:03.440 People remember a phrase from that speech.
00:07:06.600 He warned about the rise of a possible military-industrial complex.
00:07:10.740 But, you know, I went back and read the speech.
00:07:15.340 He said a lot more than that.
00:07:16.700 This was the most remarkable speech by, you know, a five-star general who led Allied forces
00:07:23.620 in World War II, very much an insider.
00:07:26.400 He also warned about the rise of a technological elite.
00:07:31.020 That's his phrase.
00:07:32.480 A technological elite that could control public policy without anyone knowing.
00:07:37.780 And he said, we have to be vigilant to make sure this doesn't happen.
00:07:42.800 And the fact is, we have not been vigilant.
00:07:46.400 That technological elite now exists.
00:07:49.880 And they are controlling public policy in ways that are absolutely mind-boggling.
00:07:56.780 And as I say, I think this is the watershed year.
00:08:00.240 Because I think this year we either fight them or we surrender democracy to these companies.
00:08:07.480 And it's very possible we will never, ever get it back.
00:08:13.360 Right, right.
00:08:14.040 So, I mean, one of the underlying factors here is these are unelected people who are making
00:08:20.140 decisions for all of society.
00:08:22.460 Not only all of American society.
00:08:25.260 What you're looking at is it's almost all over the world.
00:08:28.180 Well, it is all over the world, except for China.
00:08:30.580 And, of course, Google works with Chinese government.
00:08:33.540 It's all over the world.
00:08:34.940 It's more than two and a half billion people right now.
00:08:38.080 And within a year or two, we'll be past four billion people.
00:08:42.960 One of the scariest leaks from Google was an eight-minute video, which you can find these
00:08:48.620 days, called The Selfish Ledger, and in that eight-minute video, Google internally is saying to other
00:08:56.700 people at the company, you know, we have the ability to re-engineer humanity.
00:09:02.340 Maybe we should think about how to use that power.
00:09:05.840 That's where things stand at this point.
00:09:10.080 Right.
00:09:10.560 Okay, so let's get into what exactly they are doing, what your studies have found.
00:09:18.520 Well, of course, I could go on for hours about that.
00:09:21.840 In fact, I would urge people to go to my website and get—I have about 50 different articles,
00:09:34.920 scientific pieces, conference presentations at this point.
00:09:40.520 My written testimony before Congress is accessible online.
00:09:44.400 You can go to either my institute's website, which is AIBRT.org.
00:09:52.600 That's AIBRT.org.
00:09:55.020 Or if you go to MyGoogleResearch.com, that should take you to a page that will bring you to many
00:10:02.860 different links with lots of details.
00:10:06.020 But to summarize briefly, back in 2013, I discovered a phenomenon I ended up labeling
00:10:20.620 SEME, the search engine manipulation effect.
00:10:20.620 This was astonishing to me.
00:10:23.260 This was a series of experiments I ran.
00:10:26.840 These are controlled experiments in which I showed people search results which favored
00:10:36.260 one candidate or another or neither candidate.
00:10:39.320 That's a control group.
00:10:41.740 And I had reason to believe, based on marketing research, which had been done at the time,
00:10:46.320 that if indeed search results favored one candidate, and that means higher search results take
00:10:52.440 you to web pages that make that candidate look good, better than the opponent, that if there
00:10:59.500 was this kind of favoritism, I thought it could shift voting preferences by 2% or 3%.
00:11:05.540 In the very first experiment that I ran, I got a shift of over 48%, which I really barely
00:11:15.120 believed, because that seemed impossible. I repeated the experiment with more subjects, more participants,
00:11:23.280 and got a shift of over 63% in the second experiment.
00:11:31.280 63%?
00:11:32.280 Yeah.
00:11:33.340 Wow.
00:11:33.540 Well, by the time we got to the fourth experiment, we did a nationwide experiment with more than
00:11:40.420 2,000 people from all 50 states.
00:11:42.620 When you have a big study like that, you can start to look at demographic differences.
00:11:48.140 There, we not only got large shifts, but in one demographic group, happened to be moderate
00:11:54.760 Republicans, by the way, we got a shift of 80% after just one search.
00:12:00.760 We then went to India with real voters in the middle of a very intense election in 2014.
00:12:07.560 Working with real voters right in the middle of that intense election campaign,
00:12:15.520 we easily got overall shifts of 20% or more, and in some demographic groups, over 60%.
00:12:22.720 Some of our initial research was first reported by the Washington Post, I think, in 2013,
00:12:31.600 and then in 2015 we published the first set of experiments in the Proceedings of the
00:12:38.380 National Academy of Sciences, which is a pretty prestigious scientific journal.
00:12:44.860 And that article has gotten some attention, I can tell you, because it has been accessed
00:12:50.140 or downloaded from the National Academy of Sciences more than 300,000 times.
00:12:55.840 Now, I've never heard of that for a scientific article, no matter what it's about.
00:13:02.100 So there's definitely some interest.
00:13:03.920 Then I published a replication of that study, the SEME study, in 2017.
00:13:10.520 But that was just the first of about a dozen discoveries of techniques that the internet has
00:13:17.820 made possible, that are entirely in the hands of a couple of big tech companies, that people
00:13:24.780 are unaware of.
00:13:27.160 In other words, these shifts occur without people's awareness, they don't know they're
00:13:30.700 being manipulated, and these techniques don't leave a paper trail for authorities to trace.
00:13:38.700 Think about that.
00:13:39.660 In other words, yeah.
00:13:40.920 Right.
00:13:41.920 Right.
00:13:42.640 So if you can unpack that a little bit for us, what does search engine manipulation,
00:13:49.040 how does that even work?
00:13:49.940 Because the search engines are just serving up what you yourself search for.
00:13:54.840 So how is this manipulation?
00:13:58.940 It's manipulation if you're undecided.
00:14:01.860 That's the key.
00:14:02.560 If you're undecided and you're typing in not something like Trump is an idiot or Hillary
00:14:08.340 is a criminal, you're typing in something like Trump or tell me about Trump or, you know,
00:14:16.500 you're typing in something neutral.
00:14:18.860 You type in the wall with Mexico.
00:14:21.480 That's a neutral search term.
00:14:23.620 In all of our experiments and in our monitoring projects, we use neutral terms like that.
00:14:28.300 And the fact is, what happens in that case where someone is undecided, meaning they're
00:14:33.920 vulnerable?
00:14:34.780 They're vulnerable.
00:14:35.560 They can be shifted and those are the people who decide who wins elections.
00:14:40.580 And what we found was if people are undecided, so we always run experiments with people who
00:14:48.060 are undecided as best we can.
00:14:51.340 Well, what happens is if they click on high ranking search results, which is what we all
00:14:59.060 do, 50% of all clicks go to the top two search results.
00:15:04.080 Well, if those search results are, you could say, biased or statistically biased anyway, in
00:15:10.220 other words, if those search results favor one candidate, they take you to web pages which
00:15:14.760 make that candidate look really good,
00:15:17.260 then opinions shift.
00:15:21.540 And the problem is people are very trusting of computer output.
00:15:25.600 They're very trusting, especially of Google output.
00:15:29.420 People have absolutely no idea when we do these experiments that they're being manipulated.
00:15:34.900 Now, that's very, very dangerous because if you are being influenced and you can't see the
00:15:40.960 source of influence, you end up deciding that you made up your own mind.
00:15:46.580 That's very, very, very dangerous.
00:15:49.160 And that's something we've known for a long, long time.
00:15:51.160 And that's what happens with these new kinds of techniques.
00:15:56.580 So they're all subliminal and people can't see that they're occurring.
00:16:01.020 They end up concluding that they've made up their own mind.
00:16:05.140 And remember, we're randomly assigning people to group A or group B.
00:16:10.620 And if they're assigned to group A, we're pushing them in the direction of candidate A.
00:16:16.380 If they're in group B, we're pushing them in the other direction.
00:16:18.600 If they're in group C, we're not favoring either candidate.
00:16:23.360 And by the way, in all of these experiments, we use real search results and real web pages,
00:16:28.360 which are from real elections with real candidates.
00:16:31.620 So the fact that we can push undecided people toward any candidate, of course, that's the
00:16:39.760 nature of being undecided, is that you're vulnerable.
00:16:43.320 And information that might affect your thinking, it has an impact.
00:16:51.520 And again, in some demographic groups, it has an enormous impact.
00:16:56.120 In other words, with some individuals, it has an enormous impact.
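To make the experimental design Dr. Epstein describes concrete, here is a minimal, purely illustrative Python sketch of that randomized-assignment setup: undecided participants are randomly placed in a group whose search rankings favor candidate A, favor candidate B, or favor neither, and the resulting preference split is tallied per group. The group labels, probabilities, and shift sizes below are assumptions chosen for illustration, not figures from the actual studies.

import random

GROUPS = ["favor_A", "favor_B", "neutral"]

def simulate_participant(group):
    """Return the candidate an undecided participant ends up preferring."""
    p_a = 0.5                     # undecided: a 50/50 split before the search
    if group == "favor_A":
        p_a += 0.2                # assumed boost from rankings that favor A
    elif group == "favor_B":
        p_a -= 0.2                # assumed boost toward B
    return "A" if random.random() < p_a else "B"

def run_experiment(n=3000):
    """Randomly assign n participants to groups and tally final preferences."""
    tallies = {g: {"A": 0, "B": 0} for g in GROUPS}
    for _ in range(n):
        group = random.choice(GROUPS)          # random assignment, as described
        tallies[group][simulate_participant(group)] += 1
    return tallies

if __name__ == "__main__":
    for group, counts in run_experiment().items():
        total = counts["A"] + counts["B"]
        print(f"{group}: {counts['A'] / total:.1%} prefer candidate A")

Comparing the two biased groups against the neutral control group is what lets an effect like this be quantified at all; the point of the sketch is the design, not the particular numbers.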
00:16:59.140 So that was just the first discovery, though.
00:17:01.200 I mean, they've gotten, in some ways, worse and worse and worse and worse.
00:17:05.040 The one we're studying now is, I think, the scariest form of influence that we've ever studied.
00:17:14.880 Wow.
00:17:16.160 Absolutely unbelievable.
00:17:17.320 So this is just via search engine results that are being manipulated, where,
00:17:25.100 as you said, the top two results are clicked on by 50% of the people.
00:17:29.360 One of the things I read that you've written is that usually 95% of the people only look at the first page.
00:17:37.080 And therefore, you've got only 5% of the people doing anything else.
00:17:41.720 That's correct.
00:17:42.240 And that's really where these big numbers come from,
00:17:45.920 these big shift numbers.
00:17:47.880 That's why we get shifts that are so large, is because people trust high-ranking search results.
00:17:54.020 Very few go on to the next page.
00:17:57.560 Now, it's not just search results, though, that cause these shifts.
00:18:03.000 And it's not just search engines.
00:18:04.920 There's a lot happening online that we've learned about over the years that goes beyond search results.
00:18:11.100 So I'll give you an example.
00:18:12.960 Our second discovery was of something we call SSE, which is the search suggestion effect.
00:18:19.780 It turns out that when you start to type a search term into a Google search box, you're being manipulated from the very first character that you type.
00:18:29.680 Now, I said this when I was testifying before a Senate committee.
00:18:35.060 And the chair of the committee was Senator Ted Cruz.
00:18:38.540 He immediately pulled out his mobile phone and said, oh, yeah, what do you mean?
00:18:45.860 And I said, well, go to Google.com, type just the letter A into the search box.
00:18:52.580 And he did.
00:18:53.580 And I said, well, chances are the first, second, third position, you're going to see Amazon.com.
00:19:00.320 And you might see Amazon in all three positions.
00:19:03.620 It's a possibility.
00:19:04.900 And he said, well, yes, I see Amazon, Amazon Prime.
00:19:08.760 Then there was a fourth one that had to do, obviously, with his own personal searches.
00:19:13.320 And it was quite funny.
00:19:14.560 I don't remember what it was.
00:19:15.980 And the next one was Amazon again.
00:19:18.140 And I said, the reason why they're trying to send you to Amazon, just because you've typed the letter A, is because Amazon is Google's largest advertiser and Google is Amazon's single largest source of traffic.
00:19:34.520 This is a business partnership.
00:19:36.620 They're not trying to help you with your search.
00:19:39.680 They're trying to manipulate your search.
00:19:43.260 And so, you know, our experiments on SSE taught us a lot.
00:19:49.540 But the scariest part of our experiments on SSE is that we could, just by manipulating those suggestions that you flash at people when they're typing a search term,
00:20:00.020 we could turn a 50-50 split among undecided voters into nearly a 90-10 split with no one having the slightest idea they have been manipulated.
00:20:13.700 And again, there's no paper trail here for authorities to trace.
00:20:17.920 You can't go back in time and see what search suggestions people were shown or search results or news feeds or lots of other things.
00:20:27.620 Right.
00:20:27.740 So, okay, so this manipulation is happening.
00:20:33.320 You've discovered it.
00:20:34.920 What can be done about it?
00:20:36.640 And what have they done about it based on your research?
00:20:41.420 Well, there's lots of discoveries.
00:20:43.460 The one we're studying now is YME, the YouTube manipulation effect, which looks like it's going to be the largest effect we've ever found.
00:20:52.340 And again, people are unaware.
00:20:54.140 So there's lots of discoveries, about a dozen altogether now.
00:20:59.980 And are people aware?
00:21:02.760 Well, there are few people here and there.
00:21:05.580 Certainly, Senator Cruz is aware.
00:21:08.300 Senator Ron Johnson is aware.
00:21:11.620 There's a couple of members of Congress.
00:21:13.660 There's a couple of AGs, attorneys general from various states, who are aware, very much aware.
00:21:21.900 So, yeah, here and there, there's a handful of people who are aware.
00:21:26.360 President Trump tweeted about my research last summer.
00:21:30.820 So the president is aware.
00:21:34.740 Donald Trump Jr. published an article just a few weeks ago in Fox News in which he mentioned my work.
00:21:42.760 And so here and there, people are aware.
00:21:47.360 Tiffany Trump, very much aware.
00:21:49.740 I met with her when I was in D.C. a few days ago.
00:21:53.480 And she just finished law school at Georgetown.
00:21:56.660 Her specialty is technology and the law.
00:22:01.640 She knows.
00:22:03.240 Here and there, people are aware.
00:22:05.040 Are people generally aware?
00:22:06.900 No.
00:22:07.800 Do people generally care?
00:22:09.960 No.
00:22:10.300 Do the Democrats care?
00:22:12.120 Not at all.
00:22:13.880 My fellow Democrats, well, I'm not a registered Democrat, I'm independent, but the point is
00:22:19.280 the people who I sympathize with, you know, from a values perspective, they're happy as can be.
00:22:27.920 They are happy as can be because at the moment they're benefiting from these manipulations.
00:22:32.180 And there are plenty of Republicans who also don't really care because they don't like messing with commerce and with companies.
00:22:45.000 They don't like regulation.
00:22:46.440 So are people aware here and there?
00:22:51.360 Are they doing anything?
00:22:55.140 Not too much.
00:22:56.620 I mean, under Obama, the investigations into Google were all immediately shut down in his second term.
00:23:03.920 Six federal agencies were being run by former Google executives in Obama's second term, his chief technology officer, former Google executive, Hillary Clinton's chief technology officer, Stephanie Hannon, former Google executive.
00:23:19.840 I mean, literally, D.C. was run by Google.
00:23:23.060 Google representatives made more than 450 visits to the White House.
00:23:28.240 That's about 10 times more than any other company.
00:23:31.500 Now, under Trump, they've all been kicked out.
00:23:36.420 All the Google people have been kicked out.
00:23:38.020 The last one was from the U.S. Patent Office.
00:23:42.160 And investigations have started up again.
00:23:45.080 The Federal Trade Commission, the Department of Justice, and,
00:23:49.140 very recently, because of an executive order he signed, the Federal Communications Commission.
00:23:54.620 Both houses of Congress and literally 50 attorneys general are investigating Google.
00:24:05.620 So lots is happening in the Trump administration.
00:24:08.940 I'm not a Trump supporter, but I have to say his administration has been pretty aggressive in taking down and trying to constrain, I should say, trying to constrain these companies.
00:24:19.280 And last year, Facebook was fined by the Trump administration $5 billion for mishandling user data and violating user privacy.
00:24:31.900 So, you know, there's been some action.
00:24:34.440 But Trump, in my opinion, has no chance of being reelected.
00:24:38.880 And the first thing that's going to happen in January of next year is all of these investigations are going to be shut down.
00:24:45.180 That's why I said before, I think this is a watershed year.
00:24:48.560 I think that we either fight them or we just, that's it.
00:24:53.200 We just turn over democracy and free speech and human autonomy.
00:24:59.500 That we just turn it over to these companies.
00:25:03.920 If you were to summarize for folks what your solution would be, if you have one, to stopping this,
00:25:12.680 because we've heard suggestions of, you know, breaking up the monopoly, but nobody really knows what that means or what it even looks like.
00:25:19.720 What would you suggest if you had a, you know, just to be able to encapsulate what we might do to get out of this mess?
00:25:26.120 Well, breaking up these companies would accomplish nothing.
00:25:30.820 Absolutely nothing.
00:25:32.200 Kind of the antitrust solution to the problem accomplishes nothing.
00:25:36.800 It doesn't take away the power they have of surveillance, of censorship, and of manipulation.
00:25:42.920 Not at all.
00:25:45.180 Because you can't break up the basic search engine, and that's where most of Google's power lies.
00:25:51.280 So if you got them to, you know, get rid of a few companies they bought, like Fitbit,
00:25:56.020 it will barely have any impact at all on the kind of obscene power they have to manipulate.
00:26:05.460 Same with Facebook.
00:26:06.480 You can't break up the basic social media part of Facebook.
00:26:12.000 If they had to get rid of WhatsApp, again, that would make very little difference.
00:26:18.720 If they had to give up Instagram, what it would do is really it would enrich the main shareholders,
00:26:25.200 because they would benefit from these sales, but their power would not be reduced at all.
00:26:31.200 So in general, law and technology, excuse me, law and regulation cannot keep up with technology.
00:26:39.180 So I don't think law or regulation is going to solve the problem, and they move so slowly,
00:26:47.600 and Congress is so dysfunctional, et cetera, et cetera.
00:26:49.880 Plus, we're going to end up with a complete, you know, sweep by Democrats in D.C.
00:26:56.040 and many places around the country.
00:26:57.860 So that's the end of that.
00:26:59.640 There just isn't going to be any way to constrain these companies using law or regulation.
00:27:05.420 So, yes, I have proposed real solutions, practical solutions.
00:27:09.880 If your viewers and listeners actually want to help support one particular project, I can tell people how to do that.
00:27:21.360 It's to go to either mygoogleresearch.com or stopbigtechnow.com.
00:27:30.100 That particular web page should be available within the next two days or so.
00:27:38.880 Let's stop big tech now.
00:27:40.280 So what is the solution at the moment?
00:27:42.500 Can we protect, for example, the upcoming election in November?
00:27:46.100 And, yes, there is one thing we can do.
00:27:49.140 In 2016 and 2018, in the weeks before each election, I set up the first ever passive monitoring systems basically to spy on these companies just the way they spy on us and our kids 24-7.
00:28:07.140 So I set up monitoring systems.
00:28:09.160 What does that mean?
00:28:10.340 And that means, well, just like the Nielsen Company recruits in secret a lot of families, thousands of families, and observes their television watching, with their permission, of course.
00:28:22.060 We recruited field agents in various states, and we equipped them with special software so we could look over their shoulders as they were doing election-related searches on Google, Bing, and Yahoo.
00:28:36.580 We found in both 2016 and 2018 extreme political bias by Google, not by Bing or Yahoo, but by Google.
00:28:49.000 And that level of bias in 2016 was sufficient to have shifted between 2.6 and 10.4 million votes to Hillary Clinton, whom I supported.
00:29:00.860 In 2018, that level of bias could have shifted 78.2 million votes to Democrats around the country in hundreds of races.
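Here is a minimal sketch of the kind of bias scoring a monitoring project like the one described might perform once field agents' search results have been captured. The labeling scheme, in which each result is marked as favoring candidate A, candidate B, or neither, the scoring rule, and the sample data are assumptions for illustration only, not the project's actual methodology.

from statistics import mean

def page_bias(result_labels):
    """Score one captured results page from -1 (all favor B) to +1 (all favor A)."""
    scores = {"A": 1.0, "B": -1.0, "neutral": 0.0}
    return mean(scores[label] for label in result_labels)

def aggregate_bias(captured_pages):
    """Average the bias score across every page captured by the field agents."""
    return mean(page_bias(page) for page in captured_pages)

if __name__ == "__main__":
    # Hypothetical captures: each top result labeled by which candidate it favors.
    captures = [
        ["A", "A", "neutral", "B", "A"],
        ["A", "neutral", "A", "A", "B"],
        ["neutral", "A", "B", "A", "A"],
    ]
    print(f"Aggregate bias score: {aggregate_bias(captures):+.2f}")

A consistently nonzero aggregate score across many agents and many neutral queries is the sort of pattern that would be compared across search engines, as with the Google versus Bing and Yahoo comparison mentioned above.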
00:29:12.780 So, you know, the point is monitoring tells you something very important, and this year we're hoping to set up a system that is more aggressive and that looks at many, many things besides just search results.
00:29:27.020 And we're hoping to make announcements.
00:29:30.880 In the previous elections, we waited until after each election before we disclosed what we had found.
00:29:37.940 This year, there's so much at stake.
00:29:40.020 We are planning to make announcements, certainly in those last few weeks, which are the critical weeks.
00:29:45.480 We intend to make announcements to members of Congress, to AGs, and so on, indicating what we have found.
00:29:57.660 If we find bias, evidence of manipulation, favoritism, no matter where we find it, we're going to announce it.
00:30:06.380 And this could get these companies to back off, and when they do back off, we'll see it.
00:30:16.680 And if they don't back off, it's true the federal government, the new Democratic-controlled federal government, will not go after them, but the AGs will.
00:30:31.480 If they don't back off, we will have massive amounts of data, which could be used to bring criminal charges against the leaders of these companies.
00:30:42.100 Because, you see, if you are shifting votes in one direction on a massive scale, that's considered an in-kind campaign donation, and that's a violation of campaign finance law.
00:30:55.820 That's one of the grounds that was used to send Michael Cohen to jail, and the fact is, these executives are going to be in trouble.
00:31:05.180 So, I'm guessing they will back off, in which case, at least during those final critical weeks, we will have a free and fair election.
00:31:14.520 As I say, this year, we either fight them, or we surrender democracy to these tech companies.
00:31:22.700 You mentioned this in three areas, about manipulation, but also about surveillance and censorship.
00:31:35.940 To talk a little bit about surveillance, I think most people know, or many people know, that your search history is not deleted.
00:31:47.260 Even if you delete it, it's still there and sort of tagged to you.
00:31:51.040 Is that correct?
00:31:51.620 Yes, your search history, all of your Gmails.
00:31:57.160 If you write to anyone who uses Gmail, then all of your emails are not only recorded, archived permanently, but they're analyzed, and information is extracted from them that goes into your profile, and the profile of everyone who's mentioned in those emails, including your kids.
00:32:15.300 So, all the emails, including the private ones where you say things you wouldn't want anybody to read, the ones that you sort of mark confidential, don't tell anybody, all that is nevertheless public to Google if you use Gmail.
00:32:33.420 That's correct.
00:33:03.420 That's still stored by Google, still analyzed, but again, all we've mentioned so far is search history and Gmail, but they're actually monitoring people now over more than 200 different platforms.
00:33:18.820 Not just two, but 200, most of which people are completely unaware of.
00:33:25.520 If you use a Fitbit device, Google is monitoring all of the physiological data coming from your Fitbit device, which is, of course, internet connected, 24 hours a day.
00:33:39.300 Google Docs is a fantastic source of information for them.
00:33:46.780 Google deliberately gives all of their premium services free of charge to educational institutions around the world.
00:33:56.660 Elementary schools, high schools, colleges, top universities use Google, and all of that information is being...
00:34:09.300 Recorded and analyzed.
00:34:11.560 Top news organizations, including the New York Times and the Guardian, use Google services, and all, every single thing they're doing, all the confidential emails they're exchanging with sources of stories, all the attachments, all of that is being recorded and analyzed by Google.
00:34:36.080 In fact, that kind of information is considered, I know from my sources, extreme high-value content at Google.
00:34:49.920 So if you have a Nest thermostat in your home, well, when Google bought the company, the first thing they did was add a microphone to it.
00:34:59.900 More recently, they've added a camera to it.
00:35:02.340 So the Nest thermostat is observing everything that's happening in your home.
00:35:06.780 Google has been granted patents in recent years for analyzing sounds coming from microphones in your home so that they can, from the sounds, determine whether your kids are brushing their teeth enough, what your sex life is like, what your relationships are like.
00:35:28.080 Maybe you need some counseling. You know, this is all information which they monetize.
00:35:35.720 They monetize.
00:35:36.920 This is how they make their money.
00:35:38.260 They have no actual products.
00:35:40.460 You and your kids are the product.
00:35:45.420 Okay.
00:35:46.100 So this sounds way too conspiratorial.
00:35:48.360 The Nest, a very popular thermostat, has a microphone in it that's listening to you.
00:35:57.520 Yes, and the new versions have a camera.
00:36:01.380 But what does a thermostat need a microphone for anyway?
00:36:07.660 But, okay, so wait, there's a question in all of this because, okay, so we were told or, you know, back in the day you were told, oh, you know, information from you, yes, it's collected, but it's anonymous and it's sort of aggregate information that gets forwarded to the company and blah, blah, blah.
00:36:26.680 So it's not really information on you per se.
00:36:30.780 They don't have, like, your name down and can come back to you with it.
00:36:35.680 Or is that the case?
00:36:38.620 Yeah, I don't know that that was ever the case, but it certainly has not been the case for at least 10 years.
00:36:44.660 Google and Facebook, they know exactly who you are.
00:36:49.220 Even if you don't sign in to a Google product, they know exactly who you are.
00:36:53.840 They know it from the IP address associated with your computer.
00:36:58.680 They know it from what's called browser fingerprinting.
00:37:02.040 They can identify you instantly.
00:37:05.460 They know exactly who you are.
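For readers curious how identification without a sign-in can work, here is a minimal sketch of the general idea behind the browser fingerprinting mentioned above: a handful of attributes a browser routinely exposes are combined and hashed into a stable identifier. The attribute list and values are illustrative assumptions, not a description of any particular company's implementation.

import hashlib

def browser_fingerprint(attributes):
    """Hash a set of browser attributes into a short, stable identifier."""
    canonical = "|".join(f"{key}={value}" for key, value in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

if __name__ == "__main__":
    # Hypothetical values of the kind a web page can read without any permission prompt.
    visitor = {
        "user_agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
        "screen": "1920x1080",
        "timezone": "America/New_York",
        "language": "en-US",
        "fonts": "Arial,Calibri,Times New Roman",
    }
    # The same attributes yield the same identifier on every visit, no cookie needed.
    print("Fingerprint:", browser_fingerprint(visitor))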
00:37:08.040 Now, I personally use technology a little differently than most people do.
00:37:13.860 If your viewers go to a website, a URL, which is myprivacytips.com, that's myprivacytips.com,
00:37:23.760 they can read an article of mine in which I explain how I use technology.
00:37:30.140 And the first sentence is, I have not received a targeted ad on my mobile phone or computers since 2014.
00:37:38.380 So, you can use the Internet fully, absolutely fully, without giving up your identity and without giving away massive amounts of personal information.
00:37:55.760 I mean, the fact is, if you're a grown-up and you've been using the Internet for maybe close to 20 years,
00:38:01.400 Google alone has the equivalent of about 3 million pages of information about you.
00:38:11.880 And yes, I really said 3 million.
00:38:15.860 Yeah.
00:38:17.600 People have absolutely no idea.
00:38:19.980 People have no idea.
00:38:21.980 They don't know.
00:38:24.620 Right.
00:38:25.140 And you're a sane guy.
00:38:27.940 You're a very educated guy.
00:38:29.220 And you've done the studies to prove it, and you've heard the testimony of the whistleblowers from inside Facebook and Google who have come out and said this.
00:38:39.360 So, I think for a lot of us, for a lot of people, it's about waking up and actually realizing that this stuff is real.
00:38:46.500 It's not a conspiracy theory.
00:38:48.540 One of probably the most fascinating things that I read that you talked about was in addition to the Google search engine and the Gmail, which is just absolutely horrendous, in addition to, like, even Nest thermostats, but there's also your own phone.
00:39:08.680 Can you tell us a little bit about that?
00:39:10.300 Yes.
00:39:12.300 Well, if you use an Android phone, I mean, that's ridiculous because Android is an operating system which Google, well, borrowed and then basically enhanced so that they can monitor you even when you're offline.
00:39:28.840 In other words, if you're offline on your mobile device and it's an Android device, it's still tracking every single thing you do.
00:39:36.280 It's tracking every piece of music you listen to on your device and maybe every movie that you watch and every memo that you take and so on.
00:39:47.760 Your shopping lists, the moment you go back online, all the new information is immediately uploaded to Google.
00:39:55.420 So, Android is an extremely aggressive surveillance tool.
00:40:00.320 Chrome, you know, the Chrome browser, that's why they created a browser, because the search engine wasn't enough.
00:40:11.140 The search engine only gave them information if you were searching for something and then went to a website.
00:40:16.760 So they had to develop Chrome, which is a browser, because some people just go directly to websites without going through Google.
00:40:25.820 So, Chrome gives them everything.
00:40:28.720 Chrome lets them monitor every single thing you're doing online, period, whether you're going through Google or not or using any Google products.
00:40:37.520 By the way, you agree to this.
00:40:39.840 You agree to all this surveillance because under their terms of service, which we all accept whenever we use a Google product, even if we don't know we're using a Google product, they say: we can track you.
00:40:54.240 If you're using anything that we've created, we have the right to track you.
00:41:01.540 And by using anything that we've created, you're giving permission.
00:41:07.720 I mean, again, it makes the head spin because in some ways it sounds, it doesn't sound like a conspiracy theory, it just sounds like this is nuts.
00:41:19.140 Right.
00:41:19.660 It's almost impossible that we've gotten to this place because, I mean, if people generally believed that every search, everything you look up in the internet, everywhere you go, you're being tracked and followed such that they know it's you and they could come back to you with such information.
00:41:41.700 I mean, people search on the internet things that are so intimate to themselves that they wouldn't tell their spouses or their kids or whatever about this stuff.
00:41:51.380 And yet it's all out there, the information owned and given to a company that you didn't know you agreed to, and yet this is happening.
00:42:00.700 That's correct.
00:42:03.080 And by the way, if people want more information about these problems, there's a wonderful documentary that features my research.
00:42:11.100 It's called The Creepy Line.
00:42:13.740 And if you go to thecreepyline.com, you can learn about this documentary.
00:42:18.760 It's a full-length documentary.
00:42:20.040 It came out about a year and a half ago.
00:42:22.180 And it's also on Amazon.
00:42:25.220 It's on iTunes.
00:42:25.940 If you're an Amazon Prime member, you can watch it for free.
00:42:30.280 And it's an excellent film.
00:42:31.800 It really explains these three big areas, surveillance, censorship, and manipulation.
00:42:38.540 It explains them, I think, very clearly.
00:42:41.380 It's got some great graphics, so I recommend that highly.
00:42:44.720 Again, if people want to figure out how to use the internet a little differently, they can go to myprivacytips.com.
00:42:52.440 If they're interested in learning about or helping to support our monitoring systems, especially the one we're building this year, they can go to stopbigtechnow.com.
00:43:10.580 And that website should be up within the next day or two, stopbigtechnow.com.
00:43:19.840 So, yeah, the surveillance is out of control.
00:43:23.200 Personal devices are listening to you.
00:43:25.920 So, your mobile phones are listening to you.
00:43:28.920 Your Alexa device, which is an Amazon product, is listening to you, never stops listening or recording.
00:43:35.320 The Google Assistant, which is on Android devices, always listening.
00:43:39.140 And, of course, Google Home, the Google Home device, which is like Alexa, is always listening.
00:43:47.560 And Google has been trying very hard to convince people to put the home device into every single room in the home.
00:43:55.920 That's why it's called Google Home, because they want to own your home.
00:44:00.920 And these are listening even when you don't say, hey, Siri or Alexa, or call out whatever is supposed to turn the machine on.
00:44:13.980 They're actually listening before that, and recording what you're saying.
00:44:19.640 Oh, it never stops listening.
00:44:21.060 They never stop listening.
00:44:22.200 I mean, that's been proven over and over again.
00:44:24.280 And we've done experiments on that ourselves.
00:44:26.180 And they never stop listening and recording, never.
00:44:30.380 There's a court case now in which the prosecution – it was a domestic violence case.
00:44:35.200 The prosecution actually subpoenaed the audio records from the house in which an Alexa device was set up.
00:44:45.740 And they actually subpoenaed audio records from Amazon.
00:44:49.600 And sure enough, the audio records had recorded the argument.
00:44:53.280 I mean, there's just no question that these devices are doing this.
00:44:59.400 So, they're recording private arguments between spouses, the sexual encounters between spouses, fights between children, everything.
00:45:09.300 Everything.
00:45:10.840 Everything.
00:45:11.440 And as I say, Google has been issued patents recently on methods for interpreting the sounds that they're recording.
00:45:21.060 And more and more, of course, they're trying to get cameras into your home, too.
00:45:24.560 It's like Big Brother, you know, from 1984, except it's Big Brother on steroids.
00:45:33.120 I mean, no one ever imagined, you know, surveillance on this level, manipulation on this level.
00:45:45.740 And, of course, the third area is censorship, which is also –
00:45:50.680 Let's get into that a little bit.
00:45:53.420 What does censorship look like?
00:45:55.280 I mean, I know at LifeSite, we have been experiencing that.
00:45:58.680 We've had our search engine rankings go all over the place every once in a while for some reason.
00:46:03.980 And then certain things are blocked.
00:46:06.040 Certain ads we want to run are blocked.
00:46:08.200 Certain petitions are blocked, particularly around COVID.
00:46:11.760 Certain videos, even.
00:46:12.920 I know that YouTube is owned by Google.
00:46:16.480 And so, one of our videos was just taken off.
00:46:19.120 We tried to appeal it.
00:46:20.280 And there's no – it seems that there's no appeal.
00:46:21.920 It's just sort of tough luck.
00:46:23.200 It's gone.
00:46:23.680 And what is – what are they doing with censorship and how are they doing it?
00:46:31.240 Censorship is another big area.
00:46:32.980 One of the most fascinating leaks from Google last year was a two-minute video.
00:46:38.500 This was leaked by a former senior software engineer at Google named Zach Vorhies, who might be a very interesting guest for you.
00:46:46.100 And one of the things he brought out with him, besides 950 pages of documents, which he sent over to the Department of Justice, is a two-minute video in which the head of YouTube is talking to her staff about, well, really, censorship.
00:47:03.620 About how they're modifying their algorithm to boost content on YouTube that they think is authoritative and to demote content that they think is not authoritative.
00:47:15.680 That's censorship.
00:47:17.680 That's what censorship is.
00:47:19.020 And that's why a lot of people – Dennis Prager, for example, who runs something called Dennis Prager University – have found that dozens of his videos have pretty much disappeared or been restricted on YouTube.
00:47:32.100 And the fact is, this censorship is occurring on all these platforms.
00:47:36.800 So Google removes, of course, websites from its search results or demotes them, which is pretty much the same thing.
00:47:47.340 Twitter – Twitter suppresses tweets.
00:47:51.720 So Ann Coulter, whom I know, you know, has more than a million followers.
00:47:55.840 Now and then, when she tweets something, she gets the impression that no one's getting her tweet.
00:48:06.160 And there's no question, we know, again, because of leaks from Twitter, that, in fact, Twitter does this.
00:48:13.960 It's called shadow banning, and they really do it.
00:48:17.780 And they do it to suit their needs.
00:48:20.880 I mean, as you know, Twitter, of course, has put some warnings on some of President Trump's tweets, obviously.
00:48:28.920 Yes, we know very well.
00:48:29.980 Twitter has banned our account, and it has been banned for many, many months now.
00:48:34.240 So we know very well.
00:48:36.620 There's just no question.
00:48:38.160 And, of course, we have to, again, regardless of our political affiliation, we really have to ask,
00:48:43.840 do we want a private company that's not accountable to the public, any public, anywhere in the world,
00:48:50.600 to be deciding what 2.5 billion people around the world can or cannot see?
00:48:58.540 See, I'm not saying that sometimes some offensive material shouldn't be taken down.
00:49:05.100 I'm not saying that at all.
00:49:06.120 I'm saying who makes the decision, and at the moment, the decision-making is entirely in the hands of a couple of executives,
00:49:15.040 very arrogant executives, by the way, in Silicon Valley.
00:49:19.660 That is, I mean, we are going to look back on that someday and realize what the heck were we doing?
00:49:25.060 How did we ever allow that?
00:49:26.660 Again, think of what Eisenhower said back in 1961.
00:49:29.380 We have to be vigilant, or if not, we are going to be run by a technological elite without us even knowing that this is occurring.
00:49:40.040 Took a while, but that's exactly where we are now.
00:49:43.200 We are being run by a technological elite.
00:49:46.600 They're manipulating us in at least a dozen different ways that I've discovered.
00:49:51.500 They're manipulating us, again, through just deciding what content we can see or not see.
00:49:57.420 And then, of course, they're surveilling us.
00:50:02.420 They're surveilling us to a degree that's quite frightening.
00:50:10.960 Very good.
00:50:11.860 Well, if you can, just in wrapping up, give us some practical things that people can do to gain some freedom back
00:50:18.500 and to educate their own family and friends on either side of the spectrum,
00:50:22.400 just so that they can get more accurate information, stop being surveilled themselves,
00:50:27.640 overcome some of the censorship, and really lessen the manipulation in their lives.
00:50:34.380 Well, if people want to see how I use the Internet and how I protect myself and my family,
00:50:40.580 they can go to myprivacytips.com.
00:50:43.820 Again, that's myprivacytips.com.
00:50:45.920 About half of that article that I wrote is about getting away from Google.
00:50:52.980 You have to get away from Google.
00:50:55.300 That's the first thing.
00:50:56.360 So you have to, I say, jettison Gmail.
00:50:59.320 For example, you must not use Gmail.
00:51:02.340 You can keep your account open because they already have all that information.
00:51:06.600 So you can certainly keep it open.
00:51:08.260 So you have that as a kind of archive.
00:51:10.160 They have it anyway.
00:51:11.160 There's no harm done.
00:51:12.040 But you have to move on.
00:51:15.020 You have to move on to a type of email service that preserves privacy.
00:51:19.040 So the one I use and recommend at the moment is called ProtonMail.
00:51:23.420 It's based in Switzerland.
00:51:25.240 It's subject to very strict Swiss privacy laws.
00:51:30.880 The basic service is free.
00:51:32.960 Eventually, if you're using it heavily, you have to pay a few dollars each month.
00:51:37.340 My goodness, it's absolutely worth it.
00:51:41.780 Absolutely worth it.
00:51:42.820 Because if you're writing ProtonMail to ProtonMail, you're completely protected.
00:51:48.640 The information is encrypted end to end, which means even the people at ProtonMail can't read it.
00:51:54.640 And that's how it should be.
00:51:55.720 That's how communication should be.
00:51:58.120 And again, you've got to dump Chrome, Google's browser.
00:52:02.460 You've got to dump Android, which is Google's operating system.
00:52:05.300 And so part of the article is about...
00:52:07.540 And that means all Android phones, right?
00:52:10.160 It means almost all Android phones, yeah.
00:52:12.940 Certainly the vast majority of them.
00:52:15.480 There are a few Android phones which strip away Google's surveillance tools.
00:52:24.940 But generally speaking, yeah, you have to avoid Android.
00:52:29.200 You know, that's, again, a big part of it is just getting away from Google.
00:52:36.260 There are actually some websites and articles written just on that topic, which is how to get away from Google.
00:52:44.880 The search engine, of course, that was their original surveillance tool.
00:52:50.400 Well, you know what?
00:52:50.960 There are other search engines.
00:52:52.660 They're not always as good, but StartPage.com, which is based in the Netherlands, actually draws from Google.
00:53:03.540 So it gives you pretty good search results because it's drawing from Google.
00:53:07.900 And then I use one called SwissCows.com, believe it or not.
00:53:12.880 And SwissCows.com is pretty darn good.
00:53:16.140 So there are some alternatives.
00:53:17.420 Anyway, go to MyPrivacyTips.com, and then you can learn how at the individual level, the personal level, you can take some steps.
00:53:26.640 At the higher level, of course, we eventually need some authorities to do some pretty dramatic things.
00:53:34.660 At the moment, I would say it's unlikely those things are going to be done.
00:53:38.580 And Senator Cruz himself said to me, he said, the problem is the Democrats are benefiting from these companies and also getting huge donations from these companies and getting lots of votes.
00:53:53.660 And he said, the Republicans, we don't like to regulate, he said, so we're pretty much stuck.
00:53:59.580 But we all have to keep fighting.
00:54:02.260 We've got to fight at that high level.
00:54:04.840 We've got to get monitoring systems set up, for example.
00:54:07.780 That can be done.
00:54:09.740 That doesn't threaten anyone's politics.
00:54:13.640 I mean, monitoring systems have to become permanent around the world because monitoring systems are tech, and tech can keep up with tech.
00:54:23.400 So monitoring systems around the world, they can protect us, they can protect free speech, they can protect human autonomy, they can protect democracy.
00:54:34.300 And so that's one of my goals, just to try to see if I can get systems like this set up in such a way that they're permanent.
00:54:40.880 And again, if people want to help support the creation of this year's election monitoring system, they can go to StopBigTechNow.com, and that website should be up within a day or two.
00:54:57.500 Excellent. Excellent.
00:54:59.500 Dr. Epstein, I want to thank you very much for your time, for being with us, for explaining these things to us.
00:55:04.380 You've given us a lot to chew on, and I think for the sake of democracy and civil society, we really need to, A, in our own lives, get rid of these things and recognize the censorship and manipulation that we're undergoing ourselves, but also fight for it.
00:55:22.880 And in terms of fighting for it, we're really fighting for democracy itself.
00:55:26.620 Exactly. Well, thank you for taking an interest in my work. I appreciate it.
00:55:30.740 Thank you for joining us in this episode of The John-Henry Westen Show.
00:55:33.880 Please remember to like and share this video, and if you haven't already done so, please subscribe to this channel by hitting the subscribe button below, and clicking the bell to be notified of all future episodes.
00:55:45.220 For LifeSiteNews, this is John-Henry Westen, and may God bless you.