The Joe Rogan Experience


Joe Rogan Experience #1768 - Dr. Robert Epstein


Summary

In this episode of the Joe Rogan Experience, Joe sits down with author and researcher Dr. Robert Epstein to discuss his research on how Google and other tech platforms surveil users and manipulate their opinions, purchases, and votes. Epstein explains why he calls Google an "S&M" platform, meaning surveillance and manipulation, why he recommends never using the Google search engine, Chrome, Android phones, or devices like Amazon Alexa and Google Home, and how he says Android phones track and upload user data even with the SIM card removed. The conversation covers Google's blacklists, the 2009 glitch that flagged virtually the entire internet as dangerous, the Street View Wi-Fi scandal, leaks such as "The Selfish Ledger" video, and Epstein's controlled experiments on the search engine manipulation effect and the search suggestion effect, which he says can dramatically shift the preferences of undecided voters. He also describes the privacy-preserving alternatives he uses himself, including the Brave browser and its search engine.


Transcript

00:00:01.000 Joe Rogan Podcast, check it out!
00:00:04.000 The Joe Rogan Experience.
00:00:06.000 Train by day, Joe Rogan Podcast by night, all day.
00:00:12.000 First of all, thank you for coming.
00:00:13.000 I really appreciate it.
00:00:14.000 This is a very interesting subject, because I think search engine results have always been something people kind of take for granted, that the search engine is going to give you the most significant results at the top,
00:00:31.000 and they don't really think about the fact that this is kind of curated.
00:00:35.000 And, you know, we found it many times because we use two different search engines.
00:00:38.000 We'll use Google, and then we'll say, well, if we can't find it on Google, use DuckDuckGo.
00:00:43.000 And oftentimes, when you're looking for something very specific, you'll find that you can't find it on Google.
00:00:49.000 Like, if it's in there, it's deep, deep, deep, you know, many pages in.
00:00:53.000 Whereas DuckDuckGo will give you the relevant search results very quickly.
00:00:59.000 So something's going on with search engines.
00:01:02.000 And from your research, what you have found is that it can significantly affect the results of elections.
00:01:11.000 Well, not just that.
00:01:12.000 It can affect how people think.
00:01:14.000 It can affect your opinions, attitudes, purchases that you make.
00:01:20.000 Pretty much, it's a mind control machine.
00:01:24.000 It's the most powerful mind control machine that's ever been invented.
00:01:29.000 And by the way, you should never use the Google search engine.
00:01:34.000 Never.
00:01:34.000 Never?
00:01:35.000 Never.
00:01:35.000 Why is that?
00:01:37.000 Because this is what I call...
00:01:39.000 And this is an S&M platform.
00:01:44.000 Now, I'm not sure what S&M means to you.
00:01:46.000 I don't want to pry into your personal life.
00:01:49.000 But the point is that...
00:01:52.000 What I mean by S&M is that this is a surveillance and manipulation platform.
00:01:59.000 On the surface...
00:02:01.000 There are always two levels to everything with Google.
00:02:03.000 On the surface...
00:02:05.000 It's like a free public library kind of thing, right?
00:02:09.000 Yes.
00:02:09.000 That's always on the surface.
00:02:11.000 Beneath the surface, it's something different.
00:02:14.000 From a business perspective, it's an S&M platform.
00:02:17.000 It exists for two purposes only, and that is to trick people into giving up lots and lots of personal information.
00:02:26.000 Notice your public librarian doesn't do that.
00:02:29.000 Notice that they don't actually do that.
00:02:31.000 Right.
00:02:32.000 It's also used for manipulation because they discovered quite a few years ago that if they control the ranking of the search results, they can control people's opinions, purchases, votes.
00:02:49.000 They can't control everyone's opinions because a lot of people already have strong opinions.
00:02:54.000 So the people they're going after are the people who are undecided, the people who are vulnerable, and they know exactly who those people are.
00:03:02.000 And they literally, your mind is putty in their hands.
00:03:12.000 So you should never, ever use Google or any other S&M product like Amazon Alexa is an S&M product or the Google Home device or Android phones.
00:03:26.000 Android phones are bad.
00:03:28.000 An Android phone is an S&M device.
00:03:30.000 It's always listening.
00:03:32.000 It's always recording.
00:03:33.000 Android phones are always recording you?
00:03:36.000 Are you serious?
00:03:37.000 Yeah, I mean, I'm questioning this.
00:03:40.000 I mean, I believe you, but I just want you to elaborate.
00:03:43.000 Oh, yeah.
00:03:44.000 There have been court cases in which the recordings have been subpoenaed from whoever's controlling that so-called personal assistant or that device.
00:03:58.000 And courts have recovered recordings and transcripts when people are not even aware that they're being monitored.
00:04:06.000 I know that's the case with Alexa, right?
00:04:08.000 Yes.
00:04:09.000 But that's the case with Android phones as well?
00:04:11.000 Yes, in fact, Android phones, the equipment to prove this, which I didn't bring, but is so cheap now that literally anyone can confirm this.
00:04:23.000 Android phones, even if they're disconnected from your mobile service provider, even if you pull out the SIM card, okay, as long as the power is on, it is recording, tracking every single thing that you do.
00:04:39.000 So if you use it to read things, if you use it to listen to music, you use it to shop, whatever it is, and of course your location is always tracked.
00:04:50.000 Then when you go back online, the moment you're reconnected, it uploads all that information.
00:04:57.000 So some people wonder why their batteries run down, sometimes even when you're not really doing anything with your phone.
00:05:04.000 That's because with Android phones, I think it's 50 to 60 times per hour, it's uploading.
00:05:14.000 It's uploading about 12 megabytes of data per hour.
00:05:20.000 So that's a lot of energy.
00:05:22.000 That requires energy.
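For a rough sense of scale, here is a back-of-the-envelope calculation in Python using only the figures quoted in this exchange, roughly 50 to 60 uploads per hour and about 12 MB of data per hour; the numbers are taken at face value from the conversation rather than measured:

    # Back-of-the-envelope check of the upload figures quoted above
    # (roughly 50-60 uploads per hour, about 12 MB of data per hour).
    # These figures come from the conversation, not from any measurement here.
    UPLOADS_PER_HOUR = 55   # midpoint of the quoted 50-60 range
    MB_PER_HOUR = 12        # quoted upload volume

    mb_per_day = MB_PER_HOUR * 24
    gb_per_month = mb_per_day * 30 / 1024
    kb_per_upload = MB_PER_HOUR * 1024 / UPLOADS_PER_HOUR

    print(f"~{mb_per_day:.0f} MB/day, ~{gb_per_month:.1f} GB/month")
    print(f"~{kb_per_upload:.0f} KB per upload on average")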
00:05:24.000 So, I mean, the kind of phone I have is completely different.
00:05:27.000 It doesn't do anything like that.
00:05:30.000 Would you have like a no agenda type phone?
00:05:32.000 Do you know that show no agenda?
00:05:34.000 No?
00:05:34.000 It's my friend Adam Curry, who's the original Podfather.
00:05:38.000 He's the guy who invented podcasting, and his company develops these de-Googled phones where they take out all the tracking stuff, everything, and it's basically running on a different operating system.
00:05:51.000 Right.
00:05:52.000 So I have a phone that runs on a different operating system.
00:05:55.000 It's completely de-Googled.
00:05:57.000 What do you got?
00:05:58.000 Can you show it to me?
00:05:59.000 Yeah, I can show it to you.
00:06:01.000 I'm just interested.
00:06:02.000 It just looks like any old regular phone.
00:06:04.000 Right.
00:06:05.000 But it's not.
00:06:05.000 Is it running Linux?
00:06:06.000 What's it running?
00:06:07.000 No.
00:06:08.000 It's a different operating system.
00:06:10.000 Can you not tell me?
00:06:11.000 Well, I can tell...
00:06:12.000 It seems like you're trying to hide this, Robert.
00:06:16.000 Well, the point is, look, if you go to a website that says myprivacytips.com, that's an article.
00:06:30.000 You'll get to an article of mine.
00:06:32.000 And that article begins, I have not received a targeted ad on my mobile phone or my computer since 2014. Wow!
00:06:45.000 So, there is a different way to use all the technology that's out there so that you are not the product, okay?
00:06:54.000 So they're actually, you know, a user making use of services, but you're not the product.
00:07:00.000 And it can be done.
00:07:03.000 Yeah, is there a little inconvenience involved?
00:07:05.000 Yes, very little.
00:07:07.000 Is there some expense involved?
00:07:09.000 Very, very little.
00:07:11.000 All these services that you get for free, quote-unquote, they're not free.
00:07:15.000 You pay for them with your freedom.
00:07:18.000 If you want to get them in a paid form so that you're not being tracked, we're talking $10 to $15 a month.
00:07:26.000 Literally all of those so-called free services that are really, again, these S&M services, all of them together are worth $10 or $15 a month.
00:07:39.000 And how do you use your phone, though?
00:07:42.000 If you want to have a search engine, are you using a different search engine?
00:07:47.000 Like, what are you using?
00:07:49.000 Well, that's changed for me over time, but right now I'm using the Brave browser.
00:07:55.000 I use that.
00:07:56.000 Okay.
00:07:57.000 That's good.
00:07:57.000 That's really the best one right now.
00:07:59.000 And then Brave introduced a Brave search engine, which now, fortunately, very recently, you can make the default search engine on Brave.
00:08:11.000 So Brave doesn't track at all.
00:08:13.000 Brave works faster than Chrome.
00:08:15.000 Chrome is Google's surveillance browser.
00:08:19.000 Yeah.
00:08:35.000 And, you know, now, again, you can make the default search engine on Brave literally the Brave search engine.
00:08:43.000 Do you ever run into websites where they don't work properly because it's trying to upload ads or something and maybe there's a glitch?
00:08:51.000 Very, very rarely.
00:08:53.000 And then I will go over, occasionally I'll go over to Firefox, because Firefox was actually developed by a guy named Brendan Eich, who might be really interesting for you to talk to, by the way.
00:09:08.000 And then he left Mozilla, which was the nonprofit organization that developed Firefox.
00:09:15.000 By the way, the connection between Firefox and Google, don't even get me started.
00:09:20.000 It's disgusting.
00:09:21.000 But the point is, Brendan got sick of that situation, and he founded his own company, and he developed Brave.
00:09:29.000 So the same guy who developed Firefox developed Brave, very much into privacy, really a forward thinker.
00:09:37.000 He's an amazing guy.
00:09:39.000 So when did you first become interested in digital surveillance and privacy and like what you're giving up by using these free services like Google?
00:09:51.000 I wasn't interested at all.
00:09:52.000 I've been a researcher for 40 years and I had a lot of research underway.
00:09:57.000 I've done research on teenagers and creativity and stress management, all kinds of things.
00:10:03.000 I'm still doing all that stuff.
00:10:04.000 But on January 1st of the year 2012, I got, I don't know, eight or nine messages from Google telling me that my website had been hacked and that they were blocking access.
00:10:20.000 So I thought, the first thing I thought was, why am I getting these notices from Google?
00:10:26.000 Who made Google the sheriff of the Internet?
00:10:29.000 Why isn't this coming from the government?
00:10:31.000 Why isn't it coming from some nonprofit organization?
00:10:34.000 So that got my attention.
00:10:37.000 And then, because I'm a coder, I've been a programmer since I was a teenager, and then I started wondering, wait a minute, okay, they're blocking me on the Google search engine.
00:10:48.000 I get that.
00:10:49.000 That's them, right?
00:10:50.000 Yeah.
00:10:51.000 So they have these crawlers that look at all the websites every day, and their crawler found some malware on my website.
00:10:59.000 That happens all the time, too.
00:11:01.000 Everyone gets hacked.
00:11:02.000 I'm sure you've been hacked, and Google itself has been hacked.
00:11:05.000 So I get that.
00:11:08.000 They're blocking me on Google.
00:11:09.000 Google.com search engine.
00:11:11.000 I get it.
00:11:12.000 Okay.
00:11:13.000 But I noticed they're also blocking me on Firefox, which is owned by a non-profit.
00:11:18.000 They're blocking me on Safari, which is owned by Apple.
00:11:21.000 I thought, how could that be?
00:11:24.000 These are completely separate companies.
00:11:28.000 Took me a while.
00:11:30.000 Took me a while to figure this out.
00:11:31.000 I finally published a piece in U.S. News and World Report, an investigative piece called The New Censorship.
00:11:40.000 And I described nine of Google's blacklists.
00:11:45.000 This was 2016, so this was a while ago.
00:11:48.000 In detail, I described nine of Google's blacklists.
00:11:51.000 I explained how the blacklists work.
00:11:53.000 I explained Google can literally block access...
00:11:57.000 On multiple platforms that aren't even theirs, they can block access to any website that Google at one point in time, 2009, I think it was, I don't know, I might get the date wrong.
00:12:10.000 Let's just say January, whatever, 30th.
00:12:14.000 Google blocked access to the entire internet for 40 minutes.
00:12:23.000 Google...
00:12:23.000 Anyway, in this article...
00:12:24.000 When you say that, with all browsers?
00:12:28.000 When you say blocked access to the entire internet, it's like if you use the Brave browser back then.
00:12:32.000 Did it even exist back then?
00:12:34.000 Probably didn't exist.
00:12:35.000 Brave didn't exist.
00:12:36.000 But no, there were lots of search engines.
00:12:39.000 Google was not the first search engine.
00:12:41.000 It was the 21st search engine.
00:12:42.000 So what I'm saying is with all web browsers, it blocked access to the internet?
00:12:48.000 It blocked access to virtually the entire internet to virtually everyone in the entire world for 40 minutes.
00:12:55.000 What?
00:12:56.000 And this was reported in the news.
00:12:58.000 So what's happening with their system is because so many people are searching for things, because they're monitoring so many different things to add to their search engine, do they have some sort of ultimate control over the internet in some weird way?
00:13:14.000 Here it is right here.
00:13:15.000 Google blacklists entire internet.
00:13:18.000 Glitch causes world's most popular search engine to classify all web pages as dangerous.
00:13:24.000 Wow!
00:13:25.000 Google placed the internet on a blacklist today after a mistake caused every site in the search engine's result page to be marked as potentially harmful and dangerous.
00:13:33.000 Holy shit!
00:13:35.000 The fact that they can even do this, I like how it gives you, like, at the top, this article's more than 12 years old.
00:13:41.000 Okay.
00:13:41.000 Imagine that, like, 12 years means, like, it's ancient.
00:13:44.000 Like, they wrote it on stone tablets 12 years ago.
00:13:48.000 Yeah, but, you know, this is nonsense.
00:13:50.000 This report is nonsense.
00:13:52.000 Is it?
00:13:53.000 Of course.
00:13:54.000 They...
00:13:54.000 This is...
00:13:57.000 Google is full of geeks.
00:13:58.000 Okay, I'm part geek, so I can relate.
00:14:02.000 I can speak geek if you want.
00:14:06.000 And geeks do things for fun, okay, sometimes for profit, but most of the time it's just for fun, just to be cool and get their kicks and show how powerful they are.
00:14:18.000 To be leet.
00:14:19.000 Yeah, so they do crazy things.
00:14:21.000 So they shut down the internet.
00:14:24.000 I guarantee you it was a geek thing, because you know why I figured that out?
00:14:29.000 Because I kept wondering, why did they shut it down on this super early morning on a Saturday?
00:14:37.000 Why?
00:14:38.000 What's so special about that little period of time?
00:14:44.000 It took a while and I figured it out.
00:14:46.000 It's because that is one of the only intervals of time in the entire week when every single stock market in the world is closed.
00:14:59.000 So they did it to show that they could do it and have their fun, but they didn't want to get attention.
00:15:05.000 And if they had interfered with financial transactions, they would have gotten a lot of attention.
00:15:11.000 So no one was ever caught?
00:15:12.000 No one was ever caught, but they never denied that this happened either.
00:15:17.000 So this was done through Google, for sure.
00:15:22.000 They know this how?
00:15:26.000 It's reported in the news reports, and Google was queried, and Google said, yeah, yeah, that did happen, yeah, we fixed it.
00:15:34.000 So how does Google have the ability to even do something like that?
00:15:37.000 How can that even be done?
00:15:38.000 Well, that's what I explained in that article.
00:15:41.000 They have blacklists.
00:15:45.000 Let me jump ahead and then I'll...
00:15:47.000 Okay.
00:15:47.000 Okay.
00:15:47.000 But let me just jump ahead for a second because you got to see really how sinister this whole thing is.
00:15:53.000 It's just...
00:15:53.000 Seriously, if you knew...
00:15:57.000 A half of what I know about all this dark tech stuff, you would just say the hell with it and just give up.
00:16:05.000 You'd say, I don't want to bring up kids in this kind of world.
00:16:08.000 This is too crazy.
00:16:09.000 Anyway, blacklist.
00:16:12.000 I feel like we need to stop you there and make you elaborate.
00:16:15.000 What are you saying?
00:16:18.000 Well, what I ended up doing...
00:16:26.000 I think we should get to later in some detail, if you're still interested.
00:16:30.000 Yes.
00:16:43.000 And I am still, almost month by month, making more discoveries, running more experiments, getting very disturbing data.
00:16:52.000 I mean, so disturbing.
00:16:54.000 We just figured out, I think within the last month, that a single question and answer interaction on Alexa...
00:17:05.000 So you ask Alexa a question, and let's say it's about, I don't know, some political issue or political candidate.
00:17:12.000 Something you're undecided about.
00:17:14.000 So you ask Alexa.
00:17:15.000 And Alexa gives you back an answer.
00:17:17.000 And the answer, let's say, has a bias.
00:17:20.000 In other words, it favors, you know, one candidate, favors one party, favors one cause, right?
00:17:28.000 Question and answer interaction on Alexa, in a group of, say, 100 undecided people, can shift opinions by 40% or more, one interaction.
00:17:41.000 If there are multiple questions asked on that topic over time, you can get shifts of 65% or more, with no one having the slightest idea that they have been manipulated.
00:17:53.000 But are they doing it to manipulate you or is it just the fact that they distribute this information based on their algorithm?
00:18:01.000 It's manipulating you just by default, because the higher up or more prominently you find this information in the search engine, that's what's going to influence your opinion.
00:18:13.000 But are they doing it to influence your opinion or is that just the best answer?
00:18:18.000 Like if you have a question.
00:18:20.000 Who is Dr. Robert Epstein?
00:18:22.000 Yes, who is he?
00:18:25.000 Yes, exactly.
00:18:26.000 So if I ask that to Alexa and then it pulls up these results, it's going to pull up supposedly the most relevant result.
00:18:35.000 Now, if you have something like Alexa where you're asking a question and it's just reading it back to you, there has to be some sort of curation of that information, right?
00:18:51.000 Confession.
00:18:52.000 Okay.
00:18:53.000 I... I have not followed Joe Rogan over the years.
00:19:00.000 Okay, I have five kids.
00:19:01.000 My two eldest sons are like your biggest fans in the universe.
00:19:05.000 My eldest son is technically a bigger fan than the other son because he's recently gained 60 pounds because of COVID. So he's definitely the bigger of the two fans.
00:19:17.000 This is Julian and Justin.
00:19:18.000 Yeah, you get it.
00:19:19.000 Anyway, but I don't follow Joe Rogan.
00:19:24.000 Right?
00:19:25.000 Okay.
00:19:26.000 So now I've had to bone up and actually had to listen.
00:19:30.000 I was forced.
00:19:31.000 I had to listen to some of your shows and I'm thinking, wow, this is interesting.
00:19:37.000 This guy is genuinely curious.
00:19:41.000 About things.
00:19:42.000 You really are genuinely curious.
00:19:45.000 It's crazy.
00:19:46.000 Well, what's crazy is that that's crazy.
00:19:48.000 That's not crazy to be curious.
00:19:50.000 Most people are curious, aren't they?
00:19:52.000 No, not like you.
00:19:53.000 Because you actually, you dig in and you really want to know.
00:19:58.000 And I'm now, I'm so...
00:20:03.000 Now I'm going to say something that's not so nice, which is, on this issue, by the questions you're asking me, I can tell you have no idea what's going on.
00:20:16.000 Well, I kind of do, but you have to understand the way I do a show.
00:20:19.000 One of the things that I do when I want you to elaborate on information, it's like, maybe I know something.
00:20:24.000 But I want you to elaborate to everybody else that's listening.
00:20:27.000 So you pretend you don't know.
00:20:28.000 I don't pretend I don't know.
00:20:29.000 I just ask you questions.
00:20:30.000 I don't play stupid.
00:20:31.000 But I do ask questions like, please tell me more or elaborate or where did you come up with this or how do you know this for sure?
00:20:39.000 Maybe I know how you know it for sure.
00:20:41.000 But I want you to tell everybody.
00:20:43.000 So that's what I say.
00:20:44.000 The question you asked was...
00:20:46.000 The dark stuff.
00:20:47.000 Well, you're saying all this stuff that looks, maybe it's biased, maybe it might influence people, you know, where is it coming from?
00:20:55.000 Maybe it's just the algorithm.
00:20:57.000 Well, let's say it's just the algorithm.
00:21:00.000 Well, the algorithm was programmed by human beings, okay?
00:21:02.000 And those human beings have Biases.
00:21:07.000 They have beliefs.
00:21:08.000 And there's a lot of research now showing that that bias, whether it's conscious or unconscious, gets programmed into the algorithms.
00:21:17.000 So the algorithms all by themselves have biases built into them.
00:21:22.000 There's one way it can go.
00:21:24.000 Second way it can go, the Marius Milner effect.
00:21:27.000 You ever hear of Marius Milner?
00:21:29.000 No.
00:21:29.000 Okay.
00:21:31.000 Oh, this is great.
00:21:33.000 This is great.
00:21:33.000 Okay.
00:21:34.000 Marius Milner.
00:21:35.000 Okay.
00:21:36.000 A few years ago, you probably heard that Google got busted because their Street View vehicles were driving up and down.
00:21:44.000 They're still driving up and down streets all over the world, but they had been driving up and down streets all over the world, more than 30 countries, for more than four years.
00:21:52.000 And they weren't just taking pictures of our houses and our businesses.
00:21:56.000 They were also sucking up Wi-Fi data.
00:22:02.000 I mean, we're talking terabytes of Wi-Fi data, passwords, everything, including a lot of very deeply personal stuff.
00:22:12.000 So someone just like me, a professor type, figured this out, reported them to the government, the government went after them.
00:22:21.000 And so this is called the Google Street View scandal.
00:26:26.000 And so they got fined $25,000 for interfering with the investigation.
00:22:32.000 And they blamed the entire thing.
00:22:35.000 Google blamed the entire operation on one software engineer.
00:22:39.000 His name is Marius Milner.
00:22:42.000 So they fired him.
00:22:44.000 Oh, no, no, that's not true.
00:22:48.000 He's a hero at Google.
00:22:49.000 He's still working there.
00:22:51.000 If you look him up on LinkedIn, his profession is hacker.
00:22:55.000 He's a hero at Google.
00:22:58.000 They didn't fire him.
00:23:00.000 They love this kind of stuff.
00:23:03.000 So another possibility, besides the algorithm itself, is a single rogue programmer at the company can fiddle with content.
00:23:16.000 Can fiddle with any content.
00:23:18.000 And when a single rogue programmer does that, guess what?
00:23:24.000 That shifts thinking and opinions and behavior and purchases and votes.
00:23:30.000 A single rogue programmer can do it.
00:23:33.000 And then, of course, there's the executive level.
00:23:35.000 The executives can...
00:24:02.000 He just couldn't stand it anymore and he quit.
00:24:04.000 But unlike most of these people who've walked away, he brought with him 950 pages of documents and a video.
00:24:11.000 The video is two minutes long and it shows the CEO of YouTube, which is owned by Google, her name is Susan Wojcicki, and she's talking to her staff.
00:24:23.000 And she's explaining, this is 2017 after the horrible election results of 2016, and she's explaining how they're altering the up-next algorithm in YouTube to push up content that they think is legitimate and to suppress content that they think is not legitimate.
00:24:45.000 So if it's happening at that level, the executive level, Again, it still has the same effect.
00:24:52.000 Any of these possibilities, and there are others as well, ends up giving us content that impacts us and our kids especially in ways that people are entirely unaware of.
00:25:09.000 So, the way I like to put it is this.
00:25:11.000 You don't know what they don't show.
00:25:16.000 Now, I'm still confused as to how Google can blacklist websites and how they can shut down the entire internet for 40 minutes.
00:25:26.000 Because, do they have a switch?
00:25:29.000 I mean, like, is there a connection that all websites go through Google?
00:25:33.000 Like, how is that possible?
00:25:35.000 About three years ago, they shut down all of Japan.
00:25:40.000 Accidentally?
00:25:41.000 Well, that would take a whistleblower to figure that one out.
00:25:48.000 It was in the news at one point that the guy who was in charge of making these decisions, he actually has left Google, he once shut down an entire domain name which had 11 million websites on it because he thought it was kind of poor quality.
00:26:09.000 Poor quality?
00:26:10.000 Yes.
00:26:11.000 Poor quality, like, how so?
00:26:13.000 I don't know.
00:26:14.000 This is just his take that it was poor quality?
00:26:17.000 I have a copy of the internal manual.
00:26:20.000 I'm happy to send it to you from Google, showing the criteria they use in deciding which content to suppress.
00:26:28.000 And some of the criteria are pretty straightforward having to do with pornography and things like that.
00:26:33.000 And then there's this wide open area that says, or anything else.
00:26:40.000 Or anything else.
00:26:41.000 Pretty much, yeah.
00:26:42.000 So it's up to the discretion of the engineer?
00:26:47.000 There's a lot of discretion involved in making these decisions, and a lot of the decisions that get made in very recent years, since Trump was elected, they happen to be decisions, for the most part, that suppress conservative content,
00:27:03.000 but not always, not always.
00:27:04.000 Now, I'm going to circle back.
00:27:06.000 Can you please explain again?
00:27:09.000 I still don't know.
00:27:10.000 How do they shut down the Internet?
00:27:12.000 How does Google have that ability?
00:27:20.000 Let's see.
00:27:24.000 I can answer the question, but it's not a simple answer.
00:27:28.000 It's not like they have a switch, okay?
00:27:30.000 Okay.
00:27:31.000 But I'll give you a couple of clues here.
00:27:35.000 Okay.
00:27:35.000 Okay, first of all, what's the most popular browser right now?
00:27:39.000 It's Chrome, by far.
00:27:40.000 Well, Chrome is their browser.
00:27:42.000 So, obviously, anyone using their browser, it's a simple matter for them to block anything, to block access to anything through Chrome.
00:27:51.000 So that one's easy, right?
00:27:52.000 Okay.
00:27:53.000 They can block access to anything through their search engine, which is used for 92% of all search around the world.
00:27:59.000 So that takes care of a lot right there.
00:28:03.000 Then we get to, let's say, Siri.
00:28:06.000 Do you use an iPhone or Apple?
00:28:09.000 I use both.
00:28:10.000 I mean, iPhone or Android, you mean.
00:28:12.000 Yeah, I use both.
00:28:14.000 Yeah, so Siri.
00:28:16.000 Where does Siri get all her answers from?
00:28:19.000 Google.
00:28:21.000 Oh, good guess.
00:28:22.000 Nice.
00:28:23.000 Yes.
00:28:26.000 So, okay, let's take Firefox.
00:28:32.000 Okay, Firefox.
00:28:36.000 Before Firefox takes you to the website that you just typed in, guess what?
00:28:43.000 They have to make sure it's safe.
00:28:44.000 So how do they make sure it's safe?
00:28:48.000 I don't know.
00:28:49.000 They check?
00:28:50.000 Well, they check Google's blacklist.
00:28:53.000 This is what happened that day, or during that time period: when you searched something on Google and you clicked it, you would get this.
00:29:01.000 Warning, visiting this website may harm your computer.
00:29:05.000 I think maybe you could continue through like you can.
00:29:09.000 It happens from time to time now for strange reasons.
00:29:13.000 I don't know.
00:29:14.000 Do not proceed to internet.
00:29:16.000 Yeah, I don't know what happened then.
00:29:17.000 What about if you go through Safari or what if you go through Apple's browser?
00:29:21.000 Safari, same thing.
00:29:22.000 Safari, before they take you anywhere, they've got to check Google's blacklist.
00:29:25.000 So not only is Google getting information about your search on Safari, the fact is if Google wants to block you from going there through Safari, they just add it to their blacklist.
00:29:38.000 In other words, if they put everything on their blacklist, then no one can reach anything.
00:29:43.000 Really?
00:29:44.000 Yeah, really.
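To make the mechanism concrete, here is a deliberately simplified sketch, in Python, of the control point being described: a browser that consults a centrally maintained blocklist before loading any URL. This is not the real Safe Browsing protocol, and the hostnames and function names are hypothetical; it only illustrates how adding an entry to one shared list can make a site unreachable in every browser that defers to that list.

    # Simplified model of a pre-navigation blocklist check (illustration only;
    # not the actual Safe Browsing protocol, which uses hashed URL prefixes).
    from urllib.parse import urlparse

    # Hypothetical blocklist contents, for illustration.
    BLACKLISTED_HOSTS = {"malware.example", "phishing.example"}

    def is_blocked(url: str) -> bool:
        """Return True if the URL's host appears on the shared blocklist."""
        host = urlparse(url).hostname or ""
        return host in BLACKLISTED_HOSTS

    def navigate(url: str) -> str:
        # Any host added to the list becomes unreachable through every
        # browser that checks the list before loading the page.
        if is_blocked(url):
            return "Warning: visiting this website may harm your computer."
        return f"Loading {url} ..."

    print(navigate("https://malware.example/login"))  # blocked
    print(navigate("https://example.com/"))           # loads normally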
00:29:45.000 So all browsers go through Google, except Brave, right?
00:29:50.000 Except Brave, yeah.
00:29:51.000 That's the only one?
00:29:54.000 You know, there are small browsers out there no one's ever heard of, but I mean, Google's influence on the internet is...
00:30:06.000 It's beyond monopoly.
00:30:08.000 They're really in charge.
00:30:09.000 Outside of China and North Korea, they're in charge of pretty much everything that happens on the Internet.
00:30:17.000 Yahoo.
00:30:18.000 Let's take Yahoo.
00:30:19.000 Yahoo used to be one of the big search engines.
00:30:22.000 And some people still use it, except Yahoo stopped crawling the Internet.
00:30:28.000 About five years ago or more, they don't crawl the internet anymore.
00:30:32.000 They get their content from Google.
00:30:35.000 Really?
00:30:36.000 Yeah.
00:30:36.000 So Yahoo isn't really a search engine.
00:30:38.000 It just searches Google.
00:30:40.000 And your second favorite DuckDuckGo is also not a search engine.
00:30:45.000 God damn it.
00:30:46.000 What is it?
00:30:48.000 They have a crawler.
00:30:50.000 They do have a crawler.
00:30:52.000 And they do a little crawling.
00:30:54.000 But actually what DuckDuckGo does is it's a database aggregator.
00:30:58.000 They're checking databases.
00:31:01.000 And what is the difference there?
00:31:02.000 Oh, night and day.
00:31:04.000 In other words, Google is literally looking at billions of websites every day and it's looking for updates and changes and new websites and this and that.
00:31:13.000 So it's crawling and it's extracting information, especially looking for links because that's how it gets you good information.
00:31:19.000 It looks for what's linking to what.
00:31:23.000 But DuckDuckGo doesn't do that.
00:31:25.000 DuckDuckGo is looking at databases of information, and it's trying to answer your question based on information that is in databases.
00:31:32.000 Lots of different databases.
00:31:34.000 That's not what Google does.
00:31:36.000 Google's really looking at the whole internet.
00:31:38.000 And the Brave search engine, what does it do?
00:31:43.000 The Brave search engine is crawling.
00:31:45.000 So it is crawling.
00:31:46.000 It can't do it at the same level that Google can.
00:31:50.000 But obviously, this guy, you know, Brendan Eich, is very ambitious.
00:31:53.000 So he's, you know, he wants to do it at that level.
00:31:57.000 So no, no, they're doing...
00:31:59.000 Brave is trying to do what Google does except preserving privacy and suppressing ads.
00:32:08.000 And it seems like what happened with Google, before anyone even understood that the data is so valuable...
00:32:19.000 Before anyone, it was too late.
00:32:21.000 It was already an inexorable part of day-to-day life that people were using that and that people were using Gmail and using all these services and just giving up their data.
00:32:31.000 Yeah?
00:32:32.000 Yeah.
00:32:33.000 So there's no regulation?
00:32:39.000 No, there's no regulation.
00:32:40.000 There are no laws.
00:32:42.000 And in fact, the courts have ruled over and over again when someone has gone after Google that Google can do whatever they want.
00:32:50.000 So I'll give you an example.
00:32:52.000 A case I was following very closely and I was kind of working with these people to some extent.
00:32:57.000 Florida company called eVentures.
00:33:01.000 Again, someone at Google, it might have been that same guy that I mentioned earlier.
00:33:05.000 I think his name was Matt Cutts or something like that.
00:33:09.000 They all of a sudden shut down hundreds of URLs that eVentures was using for its business, saying they were not good quality.
00:33:21.000 I mean, for you to get that much information out of Google is like pulling teeth, because normally they just don't tell you anything.
00:33:29.000 But anyway, so they...
00:33:30.000 When they nearly shut down the company, the company decided to sue.
00:33:35.000 So Google, of course, kept them hung up in court for like a couple years because they wouldn't provide any information through Discovery.
00:33:45.000 Google always does that.
00:33:47.000 They just stonewall you just even on Discovery, which is like preliminary stuff before a lawsuit.
00:33:55.000 Anyway, so eVentures keeps pushing, pushing, pushing, pushing, finally goes to court.
00:34:04.000 And eVentures loses.
00:34:06.000 And they're slaughtered.
00:34:07.000 Literally, the decision of the judge in the case was, Google is a private company.
00:34:13.000 It can do what it wants.
00:34:15.000 It can demote you in search rankings.
00:34:19.000 It can delete you.
00:34:21.000 It can block access to your websites.
00:34:25.000 It can do anything it wants.
00:34:27.000 Literally, that was the decision.
00:34:29.000 So let's say if Donald Trump runs again in 2024 and they have a Trump campaign website, Google can decide that that website is a poor quality and deny people access to it so that when people go to Google Donald Trump,
00:34:45.000 they will never see his website.
00:34:47.000 Correct.
00:34:48.000 That's wild.
00:34:49.000 Well, they block access every day to several million websites.
00:34:53.000 So this is not a rare thing that they do.
00:34:56.000 And they block access based on their own decisions.
00:34:59.000 They're internal.
00:35:00.000 They don't have to justify them.
00:35:01.000 They don't have to have a criteria that they can establish that they're doing the right thing.
00:35:06.000 They just do it.
00:35:07.000 And in the United States, there are no relevant laws or regulations in place to stop them.
00:35:14.000 Do our regulators and do our elected officials even understand this?
00:35:18.000 Is this something that is of concern to them?
00:35:21.000 Has this been discussed?
00:35:22.000 There are a couple of them who understand.
00:35:27.000 And there are a couple of the attorneys general whom I know who understand.
00:35:31.000 Doug Peterson from Nebraska, he totally understands.
00:35:36.000 Ted Cruz.
00:35:37.000 He was behind my invitation to testify before Congress.
00:35:42.000 A couple months later, he invited me to D.C. We sat down, had a four-hour dinner.
00:35:47.000 Fabulous.
00:35:48.000 We never stopped talking.
00:35:51.000 And we never talked politics.
00:35:52.000 We did not talk politics the whole time.
00:35:54.000 We just talked tech.
00:35:56.000 Cruz totally understands.
00:35:59.000 But he's hamstrung.
00:36:03.000 How do you fight the most effective mind control machine that's ever been developed, which also is very rich, has $150 billion in the bank right now in cash, makes huge donations to political candidates,
00:36:20.000 and then can shift votes, millions of votes nationwide, without anyone knowing that they're doing so?
00:36:29.000 How do you fight that?
00:36:32.000 And it's not something that they set out to do when they first created the search engine.
00:36:38.000 It seems like because of the fact that this is something that was initially so you could search websites, that's what it was, right?
00:36:47.000 Did they know when they first made this that they were going to be able to have the kind of power that they have today?
00:36:54.000 Or is this something that we all have sort of awoken to?
00:36:58.000 Okay, I don't know Sergey Brin, Larry Page, the founders.
00:37:03.000 I don't know them.
00:37:04.000 I've lectured at Stanford in the same building where they invented Google, which is kind of cool.
00:37:08.000 But I don't know them.
00:37:09.000 But I think these guys were and probably still are utopians.
00:37:15.000 I think they had the best intentions in mind.
00:37:21.000 The top executive ever to leave Google is a guy named James Whittaker, who's gone completely silent, by the way, in recent years.
00:37:29.000 Completely silent.
00:37:30.000 But he was the first real executive to leave Google.
00:37:33.000 He finally issued a statement.
00:37:34.000 He was under pressure.
00:37:35.000 You know, why did you leave?
00:37:36.000 Why did you leave?
00:37:38.000 He issued a statement, which you can find online.
00:37:40.000 It's fascinating to see this.
00:37:43.000 And he says, look, when I first joined Google, which was practically in the beginning, he said it was just a cool place and we were doing cool things and that was it, he said.
00:37:53.000 And then he said, a few years later, he said we turned into an advertising company.
00:38:00.000 He said, and it was no more fun.
00:38:02.000 It was brutal.
00:38:03.000 This is brutal, profit-driven ad company.
00:38:08.000 Now, if you don't think of Google as an ad company, then, again, you're not getting it.
00:38:14.000 They are the largest advertising company by a factor of 20. I think the next largest one is based in London.
00:38:21.000 But Google is, what it's doing is tricking you, tricking all of us.
00:38:27.000 Well, not me personally, but it's tricking you into giving up personal information 24 hours a day, even when you don't know you're giving up personal information.
00:38:35.000 And then it's monetizing the information mainly by connecting up vendors with potential buyers.
00:38:44.000 It's an advertising company.
00:38:46.000 And so Whittaker actually quit because the nature of the business changed.
00:38:51.000 And then, of course, everyone knows about Google's slogan, right?
00:38:56.000 Don't be evil.
00:39:00.000 But no one seems to know that they dropped that slogan in 2015. Didn't they just add it to a part of a larger slogan?
00:39:08.000 Didn't we go over that, Jamie?
00:39:11.000 There was like a thing, I'm trying to remember what they exactly did, because we were sort of like, oh my god, they said don't be evil, and now they don't say it anymore.
00:39:20.000 Maybe they're evil, but I think they had added something and made it longer.
00:39:28.000 And so it wasn't that it's not their slogan anymore, it's just their slogan sort of morphed.
00:39:34.000 Right?
00:39:35.000 Was that it?
00:39:37.000 Jamie will find it in a moment.
00:39:38.000 Can I go back to something?
00:39:40.000 Please do.
00:39:40.000 Okay.
00:39:41.000 I just want to go back to blacklist.
00:39:42.000 Yes.
00:39:43.000 Because I wrote this big piece on nine of Google's blacklists.
00:39:49.000 Their biggest one is called the quarantine list.
00:39:52.000 That's that list that Safari has to check, and that list that Firefox has to check.
00:39:57.000 Everyone has to check that list before they take you to a website.
00:40:00.000 So that's a simple way, simple tool that Google uses to block access to websites.
00:40:05.000 Because we go to websites through browsers, right?
00:40:08.000 Okay, there we go.
00:40:10.000 I had never seen any of those nine blacklists, but I knew they existed as a programmer, and I talked about each one in detail.
00:40:18.000 2019, I'm invited to testify about my research, my experiments on manipulation and how I monitor elections now and all that stuff.
00:40:31.000 So who testifies before me?
00:40:35.000 A top executive, a vice president from Google.
00:40:38.000 He's under oath.
00:40:40.000 He's sworn in.
00:40:41.000 The senators are asking him some really tough questions.
00:40:46.000 And he's asked, point blank, does Google have blacklists?
00:40:53.000 I think the full question might have been, does Google have white lists and blacklists?
00:40:57.000 And his reply was, no, Senator, we do not.
00:41:01.000 So that was July of 2019. In August, literally three weeks later, Zach Vorhies, who I mentioned earlier, that's when he leaves Google.
00:41:16.000 Google sends like a, what's that called when the police, oh, SWAT team.
00:41:24.000 Google sends a SWAT team after him.
00:41:27.000 I kid you not.
00:41:28.000 Yep.
00:41:30.000 So they were very unhappy because he stole all this stuff.
00:41:33.000 And he sent it, he put it all in a box and sent it to the Attorney General of the United States.
00:41:38.000 This is 2019, August.
00:41:40.000 This is only less than a month after this hearing.
00:41:44.000 So he's got 950 pages of documents, all kinds of documents, and three of them are Google blacklists, which are actually labeled Blacklists!
00:41:56.000 Now, if I were putting together blacklists at my company, I would call them shopping lists.
00:42:02.000 I would call them, you know, I don't know, makeup lists, you know, lists for my kids' birthday presents.
00:42:12.000 I wouldn't call them blacklists.
00:42:14.000 So there are actually three of them he walked out with.
00:42:20.000 So you can look at the list.
00:42:21.000 You can see who's on the list.
00:42:22.000 You can see these are almost all or many of them prominent conservative organizations.
00:42:28.000 There are no left-wing organizations on those lists.
00:42:32.000 So this is real.
00:42:34.000 This is how they operate.
00:42:36.000 And they operate this way, do you think they do this because, are they financially driven to put those people on blacklist?
00:42:45.000 Is it maybe some, I mean, this is obviously speculation, but is it maybe some sort of a deal that they've made with certain politicians?
00:42:53.000 Is it something they've decided on their own because this is the right thing to do to suppress the bad people that put Donald Trump into office?
00:43:02.000 Like, why are they doing that?
00:43:05.000 What you just did was amazing.
00:43:08.000 What do I do?
00:43:09.000 Because you got almost all of it.
00:43:11.000 You just came up with it hypothetically, but you left out one area.
00:43:16.000 So the two areas you just nailed, one is to make money.
00:43:20.000 So they have three motives.
00:43:22.000 One is to make money, and that they do extremely well.
00:43:25.000 And no one who's tried to tangle with them has stopped that process.
00:43:31.000 In other words, the rate at which they're making money continues to increase every year.
00:43:36.000 So a few years ago when I was first looking at them, they were bringing in $100 billion a year.
00:43:41.000 Now they're bringing in $150 billion a year.
00:43:43.000 Money.
00:43:44.000 That's number one.
00:43:45.000 Number two, values.
00:43:48.000 I could talk for hours on this issue because of recent leaks of videos, PowerPoint presentations, documents, and of course what whistleblowers have been revealing.
00:44:03.000 They have very strong values there because the founders had very strong values and they hired people who had similar values and they have really strong values.
00:44:12.000 And they want the world to have those values.
00:44:17.000 They really think that their values are more valuable than other people's values, which means they don't understand what values are because...
00:44:26.000 And so their values are pretty much all of tech.
00:44:30.000 It's very left-leaning.
00:44:31.000 Very left-leaning.
00:44:32.000 So 94%, 96% of all donations out of Google go to Democrats, which I sympathize with.
00:44:42.000 I'm from a family of Democrats.
00:44:43.000 I lean left.
00:44:44.000 So I say, yeah, fine.
00:44:46.000 That's fine.
00:44:47.000 That's fine.
00:44:47.000 But it's not fine.
00:44:48.000 It's not fine.
00:44:50.000 Because they have the power to impose their thinking on other people around the world in a way no one has ever had such power, ever.
00:45:01.000 So values is second.
00:45:03.000 And this is serious.
00:45:05.000 One of the leaks from Google was an eight-minute video, which you should definitely watch.
00:45:09.000 It's so creepy.
00:45:10.000 And it's called The Selfish Ledger.
00:45:14.000 And it's eight minutes, and it was put together by their super-secret advanced products division.
00:45:19.000 It was never meant to leak out of that company.
00:45:22.000 And I have a transcript of it, too, which I've published, so I can get you all that stuff.
00:45:27.000 But the point is, what is this about?
00:45:30.000 This is about the ability that Google has to re-engineer humanity according to company values.
00:45:43.000 Re-engineer humanity according to company values.
00:45:46.000 And this is a directive?
00:45:47.000 Like this is something they're doing purposely?
00:45:50.000 Well, in the video, they're presenting this as an ability that we have.
00:45:56.000 This is an ability that we have.
00:46:03.000 So, that's the second area.
00:46:05.000 You nailed it.
00:46:07.000 Third one you didn't mention.
00:46:08.000 The third one is intelligence, because they had some support, Page and Brin, right in the very beginning at Stanford.
00:46:19.000 They had some support and had to be in regular touch with representatives from the NSA, the CIA, and another intelligence agency.
00:46:30.000 The intelligence agencies were doing their job, okay?
00:46:34.000 They realized that the internet was growing.
00:46:36.000 This is the 1990s.
00:46:40.000 So they realized that the internet is growing.
00:46:42.000 And they were thinking, hey, these are people building indexes, indices to the content.
00:46:50.000 So sooner rather than later, we're going to be able to find threats to national security by looking at what people are looking up.
00:47:00.000 If someone is going...
00:47:02.000 Online, they're using a search engine to find out instructions for building bombs, for example.
00:47:10.000 Okay, that's a potential threat to national security.
00:47:13.000 We want to know who those people are.
00:47:14.000 So right from the outset, and this is totally unlike Brave.
00:47:19.000 Okay, Brave doesn't do this.
00:47:21.000 But right from the very, very beginning, the Google search engine was set up to track and preserve search history.
00:47:33.000 So in other words, to keep track of who's doing the search and where did they search, that is very, very important to this day for intelligence agencies.
00:47:45.000 So Google, to this day, works very closely with intelligence agencies, not just in the U.S., but other agencies around the world.
00:47:53.000 So those are the three areas.
00:47:55.000 Money, values, intelligence.
00:47:59.000 And the intelligence stuff...
00:48:01.000 Is legit.
00:48:03.000 I mean, it's legit.
00:48:03.000 You know, it is an obvious place.
00:48:06.000 If you're in law enforcement, that's an obvious place to go to find bad guys and girls.
00:48:15.000 Yeah.
00:48:17.000 So Google has this ability that they've proclaimed, that they can sort of shift culture and direct the opinion of things and direct public consciousness.
00:48:35.000 How much of a percentage do you think they have in shifting?
00:48:39.000 Do they have a 30% swing?
00:48:44.000 Well, see, this is what I do.
00:48:45.000 Now you're getting close to what I actually do, what I've been doing for now for over nine years.
00:48:51.000 I quantify.
00:48:53.000 This is exactly what I do.
00:48:54.000 Every single day, that's what I do.
00:48:56.000 My team, my staff, that's what we do.
00:48:58.000 And it's cool.
00:49:00.000 Talk about cool.
00:49:02.000 We're doing the cool stuff now.
00:49:04.000 Google is not.
00:49:05.000 We're doing the cool stuff.
00:49:06.000 Because we have discovered a number of different tools that Google, and to a lesser extent other companies use, to shift thinking and behavior.
00:49:20.000 And what we do in randomized controlled experiments, which are also counterbalanced and double-blind and all that stuff, we measure the ability that these tools have to shift thinking and behavior.
00:49:37.000 And we pin it down to numbers, percentages, proportions.
00:49:41.000 We can make predictions in an election about how many votes can be shifted if they're using this technique or these three techniques.
00:49:53.000 Yeah, that's what we do.
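For readers who want to picture what this quantification looks like, here is a minimal sketch of the experimental logic Epstein describes: undecided participants randomly assigned to biased or unbiased search rankings, with the difference in resulting preferences taken as the measured shift. Every number in the sketch, the group size and the assumed effect, is an invented placeholder rather than data from the actual studies.

    # Toy sketch of a randomized experiment on ranking bias (illustration only).
    import random

    random.seed(0)

    def run_group(p_prefer_a: float, n: int = 100) -> float:
        """Simulate one group of undecided voters; return the share preferring A.

        p_prefer_a is an assumed probability that a participant ends up
        favoring candidate A after seeing the rankings (a modeling assumption).
        """
        return sum(random.random() < p_prefer_a for _ in range(n)) / n

    control = run_group(p_prefer_a=0.50)    # unbiased rankings
    treatment = run_group(p_prefer_a=0.70)  # rankings biased toward A (assumed)

    shift_points = (treatment - control) * 100
    print(f"Control: {control:.0%} for A, treatment: {treatment:.0%} for A")
    print(f"Estimated shift attributable to ranking bias: ~{shift_points:.0f} points")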
00:49:55.000 So we started with the search engine.
00:50:00.000 And it took years, years of work, but we really, I think at this point, have a good understanding of what the search engine can do.
00:50:10.000 But then along the way, we discovered other tools that they have and which they are definitely using.
00:50:17.000 And how do we know they're using these tools?
00:50:18.000 Well, we can get to that.
00:50:20.000 What are the tools?
00:50:22.000 Well, the first one we called SEME, search engine manipulation effect.
00:50:26.000 And that means they're either allowing, you know, one candidate or one party to rise to the top, you know, in search rankings, or they're making it happen.
00:50:35.000 And you don't know for sure whether, you know, which is occurring unless there's a whistleblower or there's a leak.
00:50:43.000 Okay, but the fact that it's occurring at all, that's important.
00:50:47.000 In a way, we don't care.
00:50:49.000 Because if it's just the algorithm that's doing it, well, that's horrible.
00:50:54.000 That means literally a computer program is deciding who's going to be the next president, who's going to be the next senator.
00:51:01.000 Do we want that decision made by an algorithm?
00:51:07.000 Anyway, we spent a lot of time on that.
00:51:10.000 We're still studying SEME. Then we learned about SSE, which is search suggestion effect.
00:51:17.000 When you start to type...
00:51:19.000 Ooh, in fact, if you have your phone handy, this will be fun.
00:51:21.000 If you start to type a search term into the box, a search box, suggestions flashed at you.
00:51:31.000 As fast as you're typing, that's how fast those suggestions come.
00:51:34.000 Right.
00:51:35.000 Well, guess what?
00:51:36.000 We learned in controlled experiments that by manipulating the suggestions that are being flashed at people, we could turn a 50-50 split in a group of undecided voters into nearly a 90-10 split without anyone having the slightest idea that they're being manipulated.
00:52:00.000 That's just by manipulating search suggestions.
00:52:02.000 Just by suggesting.
00:52:03.000 Yes.
00:52:03.000 And the reason why we started that work is because in June of 2016, a news organization, a small news organization released a video which went viral on YouTube and then got blocked on YouTube.
00:52:21.000 Frozen.
00:52:22.000 Still frozen.
00:52:24.000 But then it continued to go viral on Facebook, so 25 million views.
00:52:29.000 In this little video, this news organization is saying, we've made a discovery.
00:52:36.000 When you go to google.com and you look for information about Hillary Clinton, you can't get any negative search suggestions.
00:52:47.000 Really?
00:52:48.000 Really.
00:52:49.000 And they showed this.
00:52:51.000 What if you Google Clinton body count?
00:52:56.000 You could not get negatives.
00:52:58.000 Really?
00:52:59.000 Yeah.
00:52:59.000 It would give you nothing, probably, for Clinton body count.
00:53:03.000 But as you're typing, you go, Clinton B, it would go, you know, Clinton buys the best clothes.
00:53:11.000 I don't know.
00:53:11.000 It would give you something like that.
00:53:13.000 It would not give you something negative.
00:53:14.000 So, for example, "Hillary Clinton is," you do it on, and they showed this, you do it on Yahoo!, you do it on Bing, and you get "Hillary Clinton is the devil."
00:53:28.000 Hillary Clinton is evil.
00:53:29.000 Hillary Clinton is poison.
00:53:31.000 And literally, they're showing you eight or ten items that are extremely negative.
00:53:36.000 You can check on Google Trends.
00:53:38.000 That's, in fact, what people are really searching for.
00:53:40.000 So Bing is showing you what people are searching for.
00:53:44.000 "Hillary Clinton is" on Google at that time gives you...
00:53:49.000 Guess what?
00:53:50.000 Hillary Clinton is awesome.
00:53:52.000 Hillary Clinton is winning.
00:53:54.000 That's it.
00:53:56.000 Two suggestions.
00:53:58.000 So that's why we started doing this research on search suggestions, because I kept thinking, why?
00:54:05.000 Why would they do that?
00:54:07.000 Why would they suppress negatives for a candidate they presumably support?
00:54:12.000 And we figured it out.
00:54:18.000 It's because...
00:54:19.000 Did you ever hear of negativity bias?
00:54:21.000 Yes.
00:54:22.000 Okay.
00:54:23.000 So this is also called the cockroach in the salad phenomenon.
00:54:25.000 So you've got this big, beautiful salad.
00:54:27.000 You see a cockroach in the middle.
00:54:29.000 It ruins the whole salad.
00:54:32.000 We are drawn...
00:54:33.000 Our attention is drawn to negatives.
00:54:35.000 Negatives.
00:54:36.000 Right.
00:54:36.000 And that's good for evolutionary purposes.
00:54:39.000 Good for survival.
00:54:40.000 So it ruins the whole salad.
00:54:41.000 The opposite doesn't work.
00:54:43.000 If you have a plate of sewage and you put a nice piece of Hershey's chocolate in the middle, It does not make the sewage look any more appetizing.
00:54:52.000 So we're drawn to negatives.
00:54:54.000 Well, Google knows this, okay?
00:54:57.000 And we've quantified it.
00:54:59.000 Basically, if we allow one negative to pop up in a list and the rest are neutral or positive suggestions, that one negative for certain demographic groups can draw 10 to 15 times as many clicks as the other suggestions.
00:55:17.000 So, one of the simplest ways to support a candidate or a cause is to suppress the negative suggestions for that candidate. It's a simple look-up.
00:55:40.000 Okay, you delete it.
00:55:41.000 It's gone.
00:55:42.000 People don't see it.
00:55:43.000 But you let the negatives pop up in the search suggestions for the other candidate or the other cause.
00:55:51.000 And what that does is it draws people who are looking up, let's say, Donald Trump.
00:55:58.000 It draws people to websites.
00:56:01.000 Well, first of all, it generates search results that make that person look bad, because you just clicked on "Donald Trump is evil."
00:56:13.000 And so you clicked on that, caught your attention, boom!
00:56:16.000 You get a bunch of search results that support that.
00:56:19.000 You click on any of them and now you're at a website that makes him look terrible.
00:56:25.000 Very, very simple kind of manipulation, so subtle.
00:56:29.000 All you do is suppress negative suggestions for the candidate or the cause that you support.
00:56:38.000 And as I say, so we did a series of experiments.
00:56:42.000 We figured out, okay, to what extent can we mess with a group of 100 people or 1,000 people?
00:56:50.000 Yeah, we can turn a 50-50 split among undecided voters into nearly a 90-10 split.
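To make that arithmetic concrete, here is a toy calculation of the click share a single negative suggestion captures when it draws roughly 10 to 15 times the attention of each neutral item, as described above. The specific weight and the list lengths are assumptions chosen for illustration, not figures from the research.

```python
# Toy model of the "one negative among neutrals" click-share claim.
# All numbers here are illustrative assumptions, not measured values.

def negative_click_share(list_length: int, negative_weight: float = 12.0) -> float:
    """Share of clicks the single negative suggestion captures when each
    neutral/positive suggestion has weight 1 and the negative has
    `negative_weight` (the transcript's 10-15x range; 12 is an assumption)."""
    neutrals = list_length - 1
    return negative_weight / (negative_weight + neutrals)

for n in (4, 10, 100):
    print(f"{n:>3} suggestions: negative gets {negative_click_share(n):.0%} of clicks")
```

With a short list, the lone negative soaks up most of the clicks; bury it in a long list and its pull drops sharply, which is the setup for the "why four items" question that follows.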
00:56:57.000 When did they first start implementing this sort of search engine manipulation?
00:57:01.000 When did they implement the suggestion manipulation?
00:57:07.000 Well, we were able to estimate that to some extent.
00:57:10.000 And by the way, this landscape keeps changing.
00:57:16.000 So I'll give you an example.
00:57:18.000 When they first came up with search suggestions, actually one engineer there came up with this thing and it was cool.
00:57:26.000 And it was an opt-in thing when it first came out.
00:57:28.000 I think it was 2009. And it was cool and it was helpful because that was the idea initially.
00:57:37.000 So, then over time, I think, you know, with a lot of these services, a lot of these, you know, these little gizmos, people figured out that, wait a minute, we can do things, you know, that maybe we didn't intend to in the beginning,
00:57:52.000 but we can use these for specific purposes.
00:57:55.000 So, anyway, so, at some point, or rather a couple years later, it was no longer opt-in.
00:58:04.000 In fact, it was automatic and you can't opt out.
00:58:08.000 That's the first thing that happened.
00:58:10.000 And then you may remember there were always 10 items in the list initially.
00:58:16.000 But then in 2010 or so, suddenly they dropped to four items.
00:58:25.000 So in our experiments we actually figured out why they were showing four items and we went public with that information in 2017 and three weeks later Google went back to ten items.
00:58:38.000 Why do you think they went to four?
00:58:40.000 Because four is exactly, we know from the research, is exactly the number of search suggestions that allows you to maximize your control over people's searches.
00:58:54.000 Because look, if the list is too long and you've got a negative in there...
00:58:59.000 They're not gonna see it.
00:59:01.000 I mean, imagine if you had 100 search suggestions and you had one negative, right?
00:59:05.000 So it has to be short enough so that the negative pops out, right?
00:59:10.000 But it can't be too short.
00:59:12.000 If it's too short, then the likelihood that they type in their own damn search term and ignore your suggestions goes up.
00:59:19.000 So there has to be this optimal number.
00:59:22.000 It turns out the optimal number to maximize your control over search is four.
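A rough sketch of the trade-off being described: too long a list hides the lone negative, too short a list and people ignore the suggestions and type their own query. The functional forms and constants below are invented purely to illustrate that the product of the two effects peaks at a short list; they are not values from the experiments.

```python
# Toy illustration of the trade-off: a longer list hides the single negative;
# a too-short list makes users ignore suggestions entirely.
# Functional forms and constants below are invented for illustration only.

def p_uses_suggestions(n: int, c: float = 2.0) -> float:
    """Assumed chance the user picks *some* suggestion instead of typing
    their own query; grows with list length n."""
    return n / (n + c)

def p_negative_clicked(n: int, w: float = 10.0) -> float:
    """Assumed chance the single negative is the one clicked, given that a
    suggestion is used; the negative carries weight w vs. 1 for each other item."""
    return w / (w + (n - 1))

for n in range(1, 11):
    leverage = p_uses_suggestions(n) * p_negative_clicked(n)
    print(f"{n:>2} items: manipulation leverage = {leverage:.3f}")
# With these made-up parameters the product peaks at 4 items, echoing the
# claim that a short list maximizes control; real optima depend on real data.
```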
00:59:32.000 And we also learned that you are being manipulated on Google from the very first character you type into the search box.
00:59:42.000 If you have a phone handy, I can prove it.
00:59:44.000 Okay.
00:59:45.000 So, I'll Google.
00:59:48.000 And this is going to be, by the way, the last time...
00:59:52.000 So those of you who are watching or listening, you're all witnesses.
00:59:56.000 This is the last time that Joe Rogan ever uses Google.
01:00:00.000 Ever.
01:00:02.000 Well, watch.
01:00:03.000 Okay.
01:00:04.000 Okay.
01:00:05.000 So you got Google up there, right?
01:00:07.000 Yes.
01:00:08.000 And you're in the search box?
01:00:10.000 Yes.
01:00:10.000 Type A. What's it suggesting?
01:00:14.000 Amazon.
01:00:16.000 Yeah.
01:00:17.000 Well, it's doing more than one suggestion.
01:00:19.000 What are the suggestions?
01:01:20.000 Amazon, Academy Sports and Outdoors, Amazon Prime, Houston Astros.
01:00:28.000 And then a bunch of other people.
01:00:30.000 Alamo Draft House, American Airlines.
01:00:32.000 So your first and third suggestions, notably the first position is the most important, are Amazon.
01:00:39.000 Yes.
01:00:39.000 Well, it turns out everywhere in the world where Amazon does business, if you try to search for anything beginning with the letter A, and you type A, Google suggests Amazon.
01:00:50.000 Why is that?
01:00:51.000 Well, it turns out Amazon is Google's largest advertiser.
01:00:55.000 And...
01:00:57.000 Google is Amazon's largest single source of traffic.
01:01:01.000 It's a business relationship.
01:01:03.000 Get it?
01:01:04.000 If you type T, you're going to get Target and so on.
01:01:07.000 But what's interesting is when you type G. Just type G. What do you think I'll get?
01:01:19.000 Well, tell us.
01:01:20.000 Tell us what you got.
01:01:21.000 Grand Seiko.
01:01:25.000 Nothing interesting on there at all?
01:01:27.000 No.
01:01:29.000 Gastronomical, and then number four is Google Translate.
01:01:33.000 Number five is Gmail.
01:01:36.000 Number six is Google.
01:01:38.000 Okay.
01:01:39.000 Oh, I'm starting to see a pattern here.
01:01:40.000 Yeah, but I mean, like, the first ones are all, like, something that I would look up.
01:01:44.000 Well, they know your history, right?
01:01:46.000 So they know who they're...
01:01:48.000 So the first ones with G, they'll allow you to have a little...
01:01:54.000 They allow you to actually look up the things you're interested in or suggest things you're interested in?
01:01:59.000 First of all, you're Joe Rogan, okay?
01:02:02.000 So they may allow you to do all kinds of things.
01:02:04.000 Do you have specific allows for people like me?
01:02:08.000 Yeah, everything's personalized.
01:02:09.000 But I mean, is it personalized on purpose or personalized through the algorithm that sort of represents what you normally search for?
01:02:19.000 Yeah, that's called on purpose, yeah.
01:02:21.000 No, but I mean, they're not doing it specifically because it's me.
01:02:25.000 If I was any other person that was maybe anonymous, but I also looked up those things...
01:02:35.000 For most people, to answer your question, for most people, and folks out there, literally, pick up your phones, go to google.com, which, by the way, this is the last time you're ever going to use google.com, but just type in G and see what you see.
01:02:51.000 Most people, if they're getting five suggestions, four out of the five will be for Google.
01:02:56.000 So, the lesson there is if you're starting a new company, don't start...
01:03:03.000 Don't name it with a G. Don't name it with a G, right.
01:03:05.000 Yeah.
01:03:05.000 No G, because...
01:03:07.000 So, what they're showing, the point is, has to do with their agenda, their motives, okay?
01:03:13.000 Every single thing that they're doing has to do with their motives, which have to do with money, values, and intelligence.
01:03:21.000 And a public library does not do that.
01:03:27.000 You go, you borrow some books, you ask some questions, you get some answers, that's that.
01:03:33.000 That's the way the internet was meant to be.
01:03:37.000 It wasn't supposed to be this.
01:03:40.000 The whole internet around the world controlled mainly by two huge monopolies.
01:03:48.000 And to a lesser extent by some smaller monopolies like Twitter.
01:03:51.000 It wasn't supposed to be that way.
01:03:53.000 It was supposed to be like the public library.
01:03:56.000 And it is possible, you see, you can set up a company like Brave that doesn't play these stupid games and doesn't fool you and it's not deceptive.
01:04:08.000 This is the business model that Google invented.
01:04:12.000 It's called the surveillance business model.
01:04:14.000 It's fundamentally deceptive.
01:04:19.000 Because up here, at the level that you're interacting with it, it looks like a public library: free, cool.
01:04:25.000 And down here underneath, it's something completely different.
01:04:31.000 There's no reason for that.
01:04:33.000 Tim Cook, who's still the CEO of Apple, has publicly said, this is pretty recent, publicly said that this is a creepy business model and it should not be allowed.
01:04:45.000 Well, that is one area where Apple deserves credit, right?
01:04:48.000 That Apple has not taken up that same sort of net-like surveillance where they just kind of cast the net over everything you do and then sell it to advertisers.
01:04:58.000 And you can opt out of certain things in terms of like allowing apps to track purchases or allowing apps to track your use on other devices or on other applications rather.
01:05:12.000 I wish I could agree with you, but I can't, because the fact is Apple is still collecting all this information.
01:05:18.000 Apple is still listening.
01:05:22.000 They're doing the same things; it's just that, at the moment, so far, under the leadership they have right now, they're not exploiting it the same way, but that can change in a heartbeat.
01:05:32.000 Let's talk about Microsoft.
01:05:35.000 Okay.
01:05:36.000 So you probably know that Microsoft was Google's enemy number one.
01:05:43.000 Microsoft sued Google in practically every courtroom in the world.
01:05:48.000 Microsoft was submitting regulatory complaints.
01:05:51.000 Microsoft was funding organizations that existed to do nothing else but fight Google.
01:05:58.000 For a long, long time.
01:06:00.000 Early 2016, Google and Microsoft signed a secret pact.
01:06:06.000 So the fact that the pact was signed, that somehow leaked.
01:06:12.000 But to this day, no one knows the details of what's in it, except here's what happened.
01:06:18.000 Simultaneously, both companies around the world dropped all complaints against each other.
01:06:24.000 Google, excuse me, Microsoft withdrew all of its funding from all the organizations it had been supporting.
01:06:32.000 And there are some people who believe, because Bing, Microsoft's search engine, which draws about 2% of search, by the way, it's no Google, it had been bleeding money for Microsoft for years,
01:06:49.000 and some people believe that Bing...
01:06:54.000 As part of this deal, started drawing search results from Google.
01:06:59.000 We don't know, but we do know this, that Windows 10 is a tracking tool.
01:07:08.000 Windows 11 is a tracking tool.
01:07:10.000 These...
01:07:12.000 These new operating systems are so aggressive in tracking that it's very, even if you're a tech geek like me, it's very, very hard to get rid of all the tracking.
01:07:26.000 So I'm still using Windows 8.1, believe it or not, or Windows 7. Why didn't you switch to Linux or Unix or something like that?
01:07:35.000 Well, we use that for certain purposes as well, but for general stuff that you do, if you're using desktops and laptops, Windows is still the way to go, except the company shifted.
01:07:48.000 It has been shifting towards the surveillance business model, as thousands of other companies have, including Verizon.
01:07:54.000 Just because it's so profitable.
01:07:56.000 It's so easy.
01:07:57.000 You're getting all the information anyway.
01:07:59.000 All you're going to do now is start to monetize it.
01:08:03.000 You're just building new parts of the company that no one even sees.
01:08:09.000 Right.
01:08:10.000 And the real issue here seems to be that this wasn't a thing 20 years ago.
01:08:16.000 It's a thing now, and it's the most dominant thing in terms of the way people access information, the way people get data, the way people find answers.
01:08:24.000 What is it going to be in 20 years from now?
01:08:27.000 I mean, it seems like there's so much potential for control and so much potential for manipulation and that it could only just get worse.
01:08:37.000 If there's no regulation put in place and there's no way to stop use of algorithms, use of curated data, what is this going to be like?
01:08:47.000 Have you sort of extrapolated?
01:08:49.000 Have you looked at the future and Yeah, that's what I do.
01:08:53.000 That's what I do every day.
01:08:55.000 It's depressing.
01:08:56.000 What do you think is happening?
01:08:57.000 What do you think, like, when we're looking at 20 years from now, what's going to happen?
01:09:01.000 Well, you might not believe my answer, but 20 years from now already happened.
01:09:10.000 How so?
01:09:11.000 It's now.
01:09:14.000 It's here.
01:09:15.000 Now.
01:09:19.000 Okay.
01:09:20.000 Eisenhower.
01:09:22.000 You're not as old as I am, but you probably remember...
01:09:24.000 I know the speech.
01:09:25.000 Ah.
01:09:25.000 Famous speech.
01:09:26.000 Yeah.
01:09:27.000 And everyone always points to certain language from his speech.
01:09:30.000 This is his retirement speech, his last speech just a few days before John F. Kennedy became president.
01:09:37.000 And it was a very shocking speech because this is a guy who was head of Allied forces in World War II. This is a, you know, I don't know, four-star general.
01:09:46.000 I mean, he's an insider.
01:09:49.000 And in this speech, he says, you know what, this terrible kind of...
01:09:55.000 This entity has begun to emerge, you know, and I've watched it.
01:10:00.000 And he called it the military-industrial complex.
01:10:03.000 And you probably remember hippies like, you know, with signs and screaming, no military-industrial complex.
01:10:09.000 And Eisenhower actually warned about the growth of this military-industrial complex and how it's taking over businesses and it's affecting the government and blah, blah, blah.
01:10:20.000 What people fail to note is that he also warned, in the same speech, about the rise of a technological elite that could control public policy without anyone knowing.
01:10:31.000 This was 1961. Really?
01:10:35.000 Technological elite.
01:10:36.000 Same speech.
01:10:38.000 I mean, what technological capabilities were even available back then other than the media?
01:10:43.000 Other than, you know, broadcast television and radio?
01:10:47.000 Well, it means that whatever he was seeing behind the scenes, see?
01:10:52.000 Oh, Jesus.
01:10:53.000 Was scaring him.
01:10:55.000 And what I have to tell you is that you're worried about 20 years from now, the technological elite are now in control.
01:11:03.000 So, Apple, Google, Facebook, and to a lesser extent, the other social media platforms?
01:11:10.000 Correct.
01:11:11.000 And Google is by far the most aggressive, the most dangerous.
01:11:18.000 You know, Facebook, there's chaos within Facebook, but, you know, we had this amazing leak from Frances Haugen just recently, of documents showing that, you know, people at Facebook are very much aware that their social platform creates turmoil,
01:11:36.000 terrible turmoil on a massive scale, and that they like that.
01:11:41.000 They encourage that because the more turmoil, the more traffic, the more traffic, the more money.
01:11:49.000 But knowing that you're creating turmoil... Here's my thought on that.
01:11:56.000 Is it just human nature?
01:11:59.000 Because you were saying before about the negativity bias, that people gravitate towards things that are negative.
01:12:05.000 And that's one of the things that you'll find if you use YouTube.
01:12:12.000 When you go on YouTube, if you're a person who likes to get upset at things and you're a person who likes to...
01:12:18.000 Look for things that are disturbing or upsetting or political arguments, whatever.
01:12:24.000 You'll get those in your suggestions over and over and over again.
01:12:27.000 But if you're not interested in that, if you're only interested in airplanes and you start Googling airplanes or cars or watches, that's what it'll suggest to you.
01:12:37.000 It doesn't have to suggest to you negativity.
01:12:40.000 You gravitate towards that, naturally.
01:12:44.000 And so the algorithm represents what you're actually interested in.
01:12:49.000 So is it Facebook's fault that everyone, not everyone, most people generally interact more with things that are negative or things that upset them?
01:13:01.000 That's not their fault, but it is their fault that they take advantage of that to manipulate people.
01:13:06.000 That's entirely their fault.
01:13:07.000 But if their business model is to engage with people and to keep people engaged by giving them content that makes them stay engaged and click on links and read more and spend more time on the platform, and the only thing that it's doing is highlighting what you're actually interested in.
01:13:28.000 What are they supposed to do?
01:13:30.000 Are they supposed to make less money and then have no suggestions and have no algorithm and just leave it all up to chance?
01:13:38.000 Just leave it all up to you go find what you're interested in and then keep finding what you're interested in through a direct search.
01:13:47.000 Like through you trying to find these things directly with no suggestion whatsoever.
01:13:53.000 Because that's better for the human race.
01:13:57.000 For the past year or so, we have been doing controlled experiments on YouTube.
01:14:03.000 We have a YouTube simulator.
01:14:05.000 It's a perfect YouTube simulator.
01:14:08.000 And we have control.
01:14:12.000 We're using real content from YouTube, real videos from YouTube, All the titles, everything comes from YouTube, except we have control over the ordering, and we have control over the up-next algorithm.
01:14:29.000 That's where the power lies, the up-next algorithm.
01:14:33.000 So one of the things we learned recently, not from Frances Haugen, but from someone else who left Facebook, is that 70% of the videos that people watch on YouTube now around the world are suggested by YouTube's up-next algorithm.
01:14:49.000 70%.
01:14:50.000 70%.
01:14:52.000 Whoa.
01:14:53.000 Yeah.
01:14:54.000 And that's their algorithm.
01:14:56.000 And just like us in our lab, okay, we have control over what the up-next algorithm suggests.
01:15:06.000 And guess what we can do with our up-next algorithm?
01:15:12.000 What?
01:15:12.000 Well, it should be obvious.
01:15:16.000 You can manipulate people.
01:15:17.000 Yeah, we manipulate people.
01:15:19.000 We randomly assign them to this group or that group, and we just push people any old way we want to push them.
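As a sketch of what a randomized up-next experiment of this kind could look like in miniature: the group names, video fields, and bias rule here are hypothetical stand-ins, not the lab's actual simulator code.

```python
import random

# Minimal sketch of a randomized "up-next" bias experiment on a simulator.
# Group names, the bias rule, and the video fields are assumptions.

VIDEOS = [
    {"id": 1, "leans": "candidate_a"},
    {"id": 2, "leans": "candidate_b"},
    {"id": 3, "leans": "neutral"},
    {"id": 4, "leans": "candidate_a"},
    {"id": 5, "leans": "candidate_b"},
]

def assign_group(participant_id: int) -> str:
    """Random assignment to a bias condition, deterministic per participant."""
    random.seed(participant_id)
    return random.choice(["favor_a", "favor_b", "control"])

def up_next(candidates: list[dict], group: str) -> list[dict]:
    """Reorder the up-next queue according to the experimental condition."""
    if group == "control":
        return sorted(candidates, key=lambda v: v["id"])
    favored = "candidate_a" if group == "favor_a" else "candidate_b"
    # Push favored-leaning videos to the top of the queue.
    return sorted(candidates, key=lambda v: v["leans"] != favored)

print(up_next(VIDEOS, assign_group(42)))
```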
01:15:24.000 And when you're doing these tests and studies, how are you doing this?
01:15:28.000 How many people are involved in this?
01:15:30.000 Are they students?
01:15:31.000 How are you doing this?
01:15:32.000 Okay, we never do the, you know, subject pool at the university where you get, you know, 50 students from your college to, you know, be your research subjects.
01:15:43.000 We never do that.
01:15:44.000 So we're always reaching out to the community or we're doing things online.
01:15:48.000 So we do big studies online.
01:15:51.000 And we are getting very diverse groups of people.
01:15:55.000 We're getting—literally, we're getting people from lists of registered voters.
01:15:59.000 So we're getting people, you know, who look like the American population.
01:16:05.000 And we are—we can mess with them.
01:16:10.000 Can I say we can fuck with them?
01:16:12.000 You just did.
01:16:13.000 Oh, I guess I just did.
01:16:17.000 Oh, this is definitely not Fox.
01:16:18.000 No, we're on the internet.
01:16:20.000 This is not Fox News.
01:16:22.000 Yeah, but the internet, you see, the internet, though, because there are no regulations and rules, it does allow for some pretty evil things to take place.
01:16:32.000 And the fact is, in our experiments, we do these, usually our experiments have hundreds of people in them.
01:16:39.000 Sometimes they have thousands of people.
01:16:42.000 And we can fuck with people and they have absolutely no idea.
01:16:50.000 I'll tell you about something new, okay?
01:16:52.000 Okay.
01:16:52.000 Something new, brand new.
01:16:54.000 Okay, and this is...
01:16:55.000 Thank God I'm not talking about Google this time.
01:16:58.000 I'm just talking about something else that's happening.
01:17:01.000 There are websites that will help you make up your mind about something.
01:17:07.000 So, for example, there's a whole bunch of them right now that'll help you decide whether you're really a Democrat or you're really a Republican.
01:17:15.000 And the way they do that is they give you a quiz.
01:17:18.000 And based on your answers to how you feel about abortion and immigration and this and that, at the end of the quiz, they say, oh, you are definitely a Republican.
01:17:28.000 Sign up here if you want to join the Republican Party.
01:17:32.000 And this is called opinion matching.
01:17:34.000 And the research we do on this is called OME, the opinion matching effect.
01:17:40.000 And there are hundreds of websites like this.
01:17:43.000 And when you get near an election, a lot more of them turn up because the Washington Post will give you a quiz and help you decide who to vote for.
01:17:52.000 And Tinder, Tinder, okay, which is used for sexual hookups.
01:17:58.000 How about romantic, sir?
01:18:00.000 Oh.
01:18:00.000 Not just sex.
01:18:01.000 Sorry.
01:18:02.000 My mistake.
01:18:05.000 So Tinder actually set up a Swipe the Vote
01:18:09.000 option during the 2016 election: you swipe left if you think this, you swipe right if you think that, and then at the end of it, they say, oh, you should be voting for Hillary Clinton.
01:18:21.000 But how do you know when one of these websites is helping you make up your mind?
01:18:28.000 How do you know whether the algorithm is paying any attention to your answers at all?
01:18:34.000 How do you know?
01:18:35.000 You don't.
01:18:36.000 So what we've done is...
01:18:38.000 We've done two things.
01:18:39.000 One is we've gone to a lot of these websites and we've been typing in random answers.
01:18:44.000 Or we've actually set up algorithms that do it for us.
01:18:47.000 So if we want to do it 500 times, we set up an algorithm that does it.
01:18:52.000 And then we look and see what the recommendations were.
01:18:55.000 And guess what?
01:18:58.000 Guess what?
01:18:59.000 What?
01:19:01.000 Sometimes these websites are not paying any attention to your answers.
01:19:07.000 They're just telling you what they want to tell you, and they're using this quiz to suck you in, and then they add in—oh, this we love—they add in a timer.
01:19:19.000 So in other words, after you finish the quiz, it'll go tick, tick, tick, tick, tick, computing, computing, computing.
01:19:25.000 And there's this delay creating the impression that they're really thinking hard.
01:19:30.000 And then they give you your answer.
01:19:32.000 So all that is for credibility to manipulate you.
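The probing idea is simple enough to sketch: submit random answers many times and check whether the recommendation ever changes. The quiz below is a fake stand-in that ignores its input, just to show what the check would reveal; it is not one of the actual sites or the team's actual tooling.

```python
import random
from collections import Counter

# Feed a quiz random answers many times and see whether its recommendation
# depends on the answers at all.

def fake_quiz(answers: list[int]) -> str:
    return "Candidate X"          # a rigged quiz: ignores answers entirely

def probe(quiz, n_questions: int = 10, trials: int = 500) -> Counter:
    tally = Counter()
    for _ in range(trials):
        answers = [random.randint(1, 5) for _ in range(n_questions)]
        tally[quiz(answers)] += 1
    return tally

print(probe(fake_quiz))
# If one recommendation dominates no matter what random answers go in,
# the "matching" is not matching anything.
```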
01:19:37.000 Now, so over here we're going to websites and we're typing in random answers.
01:19:42.000 On the other side, we're doing experiments in which we...
01:19:47.000 We are giving people quizzes, and then we are giving people recommendations, and then we are measuring to see whether we can change anyone's mind.
01:19:56.000 And we're getting shifts of 70 to 90%, with not a single person, not one person, recognizing that they're being manipulated.
01:20:10.000 Not even one person recognizing that there's bias in the results we're giving them.
01:20:17.000 Not one!
01:20:18.000 Because how could you see the bias?
01:20:21.000 How could you see the manipulation?
01:20:23.000 You've just taken a quiz!
01:20:26.000 You're trying to make up your mind.
01:20:28.000 The thing that's so scary about this kind of manipulation is that it attracts exactly the right people.
01:20:36.000 Exactly the right people who can be manipulated.
01:20:40.000 Right.
01:20:40.000 Because who's taking quizzes?
01:20:41.000 The people who are trying to make up their minds.
01:20:44.000 They're unsure.
01:20:44.000 Right.
01:20:45.000 They're vulnerable.
01:20:46.000 Yeah.
01:20:48.000 And when you – so you spoke to Congress about this?
01:20:53.000 You spoke in front of Congress?
01:20:55.000 Mm-hmm.
01:20:56.000 Right.
01:20:57.000 And when you did, was there any sort of urgency?
01:21:03.000 Did anybody understand what the implications of this are?
01:21:07.000 Did anybody understand we're literally looking at these massive technologies that are used throughout the world that can completely change... Mm-hmm.
01:21:29.000 Well, there are some people.
01:21:32.000 There's a guy named Blumenthal.
01:21:34.000 He's a senator from Connecticut.
01:21:36.000 He gets it.
01:21:37.000 He understands.
01:21:38.000 He's kind of disgusted, I would say, with all this stuff.
01:21:42.000 But, you know, I said to Cruz, I said, why don't you work with Blumenthal?
01:21:46.000 And he said, well, no, I don't think that'll work out.
01:21:49.000 Because he's a Democrat?
01:22:10.000 And they're supporting them in these more subtle ways as well.
01:22:14.000 So the Democrats will do nothing.
01:22:16.000 If they even say something, like if they rattle their swords, they don't actually do anything.
01:22:24.000 And the Republicans hate regulation.
01:22:27.000 This is a perfect storm for these companies to do what they have done, which is they have already taken over.
01:22:37.000 You're thinking 20 years from now?
01:22:40.000 No.
01:22:40.000 They've already done it.
01:22:42.000 Well, I'm not thinking that they haven't already taken over, but I'm thinking, like, how much more control can they have in 20 years, if 20 years ago they didn't have any?
01:22:51.000 Like, as technology advances, do you think that this is going to be a deeper and deeper part of our world?
01:22:59.000 Well, look at Zuckerberg.
01:23:01.000 Zuckerberg's trying to get us all into the metaverse.
01:23:04.000 So, yeah, you have even more control if you get people into virtual realities.
01:23:09.000 Yes, you have more control.
01:23:12.000 Every single thing they're doing is moving us...
01:23:15.000 Farther and farther and farther down the rabbit hole.
01:23:19.000 Well, not just that.
01:23:19.000 I'm thinking like there was a time where Zuckerberg at least was publicly considering cryptocurrencies.
01:23:25.000 Right.
01:23:25.000 Like some sort of a Facebook cryptocurrency.
01:23:28.000 Imagine if Facebook cryptocurrency became the number one currency worldwide.
01:23:33.000 Maybe it was the number one crypto like Bitcoin is today.
01:23:36.000 Sure.
01:23:36.000 What the fuck?
01:23:38.000 So you're in the metaverse.
01:23:39.000 In order to exist and compete and buy things and prosper, you need Zuck bucks.
01:23:45.000 Yep.
01:23:46.000 Right?
01:23:47.000 Yeah.
01:23:47.000 Oh, I published a few years ago an essay calling for his resignation.
01:23:53.000 Roger McNamee, who was one of the first financial backers of both Google and Facebook, actually published a book about two years ago called Zucked,
01:24:07.000 about how Zuckerberg has taken over the world, and he basically said in that book straight out that if he had known what these companies, Google and Facebook, were going to turn into, he would never, never have backed them in those early days. Jamie, did we ever find out what Facebook,
01:24:24.000 or Google rather, changed their...
01:24:26.000 It got moved to the bottom of the Code of Conduct.
01:24:29.000 But it's still in there, right?
01:24:31.000 On the screen.
01:24:31.000 Okay, that's right.
01:24:32.000 And remember, don't be evil.
01:24:34.000 And if you see something that you think isn't right, speak up.
01:24:37.000 So it's still in there.
01:24:38.000 Sort of.
01:24:40.000 I think it's really...
01:24:54.000 And our company.
01:24:55.000 We expect all Googlers to be guided by both the letter and the spirit of this code.
01:25:02.000 Sometimes identifying the right thing to do isn't an easy call.
01:25:06.000 If you aren't sure, don't be afraid to ask questions of your manager, legal, or ethics and compliance.
01:25:13.000 And remember, don't be evil.
01:25:15.000 That was updated September 25th, 2020. Right.
01:25:20.000 Can you hold this?
01:25:21.000 Because I have to pee.
01:25:22.000 Unfortunately, I have to pee.
01:25:23.000 I drank way too much coffee today.
01:25:24.000 So we'll be right back.
01:25:26.000 Ladies and gentlemen, hold those thoughts.
01:25:28.000 This don't be evil thing.
01:25:30.000 This is where it gets interesting to me because the company is notoriously woke, right?
01:25:37.000 They've adopted these woke ethics.
01:25:40.000 And you hear about meetings that they have and there was...
01:25:45.000 That there was the one gentleman, Jimmy, what is his name?
01:25:51.000 He was fired from Google because he...
01:25:56.000 James Damore.
01:25:56.000 James Damore, thank you.
01:25:57.000 We actually had him on the podcast at one point in time because they asked him questions about...
01:26:01.000 Why don't women have more prominent roles in tech?
01:26:06.000 And is there some sort of gender bias?
01:26:09.000 Is it natural?
01:26:11.000 And he wrote a whole paper about choices and why people choose one thing or another.
01:26:17.000 And they decided he was a sexist piece of shit and they fired him.
01:26:19.000 And it was really wild.
01:26:21.000 Because if you read the actual paper, and the paper was available online, there was nothing that he said that was sexist at all.
01:26:28.000 Yeah, I read it, yeah.
01:26:30.000 Yeah.
01:26:30.000 So when a company has very clear motives, like they're in it to make money, like that's what they're doing.
01:26:40.000 How does that wokeness play into that?
01:26:44.000 Is that just a natural artifact of people coming from universities and then eventually working for Google?
01:26:51.000 Or is this like sort of a strategy that that's encouraged because it's more profitable that way?
01:26:59.000 Well, first of all, you have to understand that it's not that simple.
01:27:04.000 Because Google has had demonstrators, they've had literally their own employees holding demonstrations.
01:27:12.000 Not everyone is happy about the policies at Google.
01:27:18.000 People, for example, who have conservative leanings, which I don't, but people who have conservative leanings, they're miserable.
01:27:26.000 Because the values agenda is so strong there that it dominates everything.
01:27:34.000 Isn't it interesting that you feel like you have to announce that you don't have conservative leanings?
01:27:41.000 It's interesting because you've done it a couple times so far.
01:27:44.000 Three.
01:27:45.000 Yeah, but you do it because people want to say, oh, alt-right psychologist Dr. Robert Epstein.
01:27:55.000 They would like to do that, right?
01:27:58.000 Oh, I've been...
01:27:59.000 Ever since I did the testimony, I mean, a bunch of things happened, one of which is very sad.
01:28:07.000 But, yeah, I've gotten branded.
01:28:10.000 I've gotten branded as some sort of conservative right-wing nutcase.
01:28:14.000 And I don't have a conservative bone in my whole body.
01:28:17.000 So it really bothers me.
01:28:20.000 I'm doing what I do because...
01:28:23.000 I put humanity, democracy, America ahead of any particular party, any particular candidate.
01:28:33.000 There are bigger issues here.
01:28:35.000 It should be obvious, what you're saying.
01:28:38.000 I mean, what you're saying should concern people.
01:28:41.000 The idea that you would just be labeled as a part of a disparaged political party because it's an easy way to defame you and to discredit you.
01:28:51.000 That should be obvious, too.
01:28:53.000 You said that one of the things that happened was sad.
01:28:55.000 What was that?
01:28:58.000 Well, in 2019, one of the things I did, around the same time I did the testimony, is I did a private briefing for state attorneys general.
01:29:11.000 And so I did my thing, and, you know, I can scare people pretty well with my data.
01:29:18.000 We haven't got to my monitoring projects yet, but we will.
01:29:22.000 So, you know, I did my thing.
01:29:24.000 And then I went out into kind of the waiting room there and just waited because I was done.
01:29:30.000 And they started filing out.
01:29:31.000 And one of them came up to me.
01:29:32.000 I know exactly who it was.
01:29:33.000 I know what state he was from.
01:29:35.000 And he says, Dr. Epstein, I hate to tell you this, but he said, I think you're going to die in an accident.
01:29:44.000 Within the next few months.
01:29:46.000 And then he walked away.
01:29:48.000 Now, I did not die in an accident in the next few months, but my wife did.
01:30:00.000 Really?
01:30:01.000 Yeah.
01:30:02.000 So when this person said that to you, what does this person do?
01:30:09.000 He's an attorney general of a state.
01:30:12.000 And why did he say that to you?
01:30:13.000 Because he was concerned.
01:30:15.000 He thought I was pissing people off who had a lot of power and they wouldn't like that.
01:30:24.000 And how did your wife die in an accident?
01:30:26.000 What were the circumstances?
01:30:29.000 She lost control of her little pickup truck that I had bought her and got broadsided by a massive truck that was towing two loads of cement.
01:30:46.000 But her pickup truck was never examined forensically and it disappeared.
01:30:54.000 I was told that it had been sold to someone in Mexico, and it just disappeared.
01:30:59.000 Sold to someone in Mexico.
01:31:01.000 Obviously, it was totaled?
01:31:02.000 It was totaled, and the wreck, which I suppose was technically my property, disappeared.
01:31:10.000 It was never examined and disappeared and went to Mexico.
01:31:13.000 Now, was this an older truck?
01:31:15.000 Was it a newer truck?
01:31:17.000 It was an older truck, but, you know...
01:31:19.000 Older as in, like, how old?
01:31:23.000 Like 2002, but we kept it in very good shape.
01:31:27.000 Had low mileage, new tires.
01:31:29.000 The reason why I ask is, like, what kind of computer systems were involved in cars from 2002 as opposed to...
01:31:36.000 Do you remember the...
01:31:40.000 The story of the journalist, Michael Hastings, who wrote a story about a general during the time of Obama's administration, there was a volcano that erupted in Iceland and he was stuck overseas.
01:32:02.000 I believe it was Afghanistan or Iraq.
01:32:06.000 I think it was Afghanistan.
01:32:07.000 So he was over there writing a story for Rolling Stone, and because he was over there for so long, because he was trapped, because no flights were going, because the air cover was so bad because of this volcano, they got real comfortable with him.
01:32:22.000 And these soldiers started saying things, not even thinking this guy is like, you know, he's not one of them.
01:32:29.000 He is a journalist, and he's going to write all these things about it.
01:32:32.000 So he wrote this very damning article.
01:32:36.000 The general in question got fired.
01:32:38.000 And then this guy, Michael Hastings, started talking about how he was fearing for his own life.
01:32:44.000 And cut to sometime in the future, he sped up.
01:32:49.000 There's actually a video of it.
01:32:51.000 Sped up on Sunset Boulevard towards the west side and slammed into a tree going 120 miles an hour.
01:32:58.000 There was an explosion.
01:33:00.000 The car's engine was many yards from the car itself, and there was a lot of speculation.
01:33:08.000 That not only did the government have the ability to manipulate, that intelligence agencies had the ability to manipulate people's cars, but it's something they've actively done.
01:33:18.000 And people were very concerned that this guy was murdered because of what he had done.
01:33:23.000 Because that general wound up getting fired.
01:33:25.000 Obama wound up firing him because it made Obama look bad.
01:33:29.000 He was a very beloved general.
01:33:31.000 That kind of shit scares the fuck out of people.
01:33:34.000 Well, there's a very good book on this subject.
01:33:36.000 It's called Future Crimes.
01:33:38.000 And it starts out saying the kind of things that I've been saying to you, which is the future crimes, they're actually here now.
01:33:44.000 And this is an ex-FBI guy who wrote the book.
01:33:47.000 And he's talking about how tech is being used now to not only commit crimes, but to assassinate people.
01:33:54.000 One of the simplest ways to do it is you hack into a hospital computer and you change dosages on medication.
01:34:03.000 If the person you're going after has been hospitalized, that's a really simple way to just knock them off and have it look like just some silly little glitch or something.
01:34:17.000 So yeah, there's a lot of ways now that you can commit crimes that have never existed before.
01:34:25.000 And as far as I'm concerned, the kinds of things I study, in my opinion, should be considered crimes.
01:34:32.000 And I don't think we should ever be complacent and just say, oh, it's the algorithm.
01:34:39.000 Algorithms are written by people.
01:34:41.000 Algorithms are modified.
01:34:42.000 Google modifies its algorithm, its basic search algorithm, 3,000 times a year.
01:34:48.000 That's human intervention.
01:34:51.000 Do you think that that's what happened to your wife, or do you speculate, or do you just not know and just leave it at that?
01:35:00.000 How do you feel about that?
01:35:01.000 It depends on the day.
01:35:02.000 I think about Misty.
01:35:04.000 She's from Texas originally, and I think about her pretty much nonstop.
01:35:10.000 I'm still wearing my wedding band, even though the accident was two years ago.
01:35:15.000 I don't know.
01:35:16.000 I know that the accident made news, not just here but in Europe, because some people thought it was suspicious that my beautiful wife, you know,
01:35:31.000 we'd been together for eight years, and my beautiful wife was killed in this horrendous fashion.
01:35:37.000 And, you know, obviously I have pissed off some people at some big companies.
01:35:45.000 And I have work coming out.
01:35:46.000 I mean, the work I have coming out, I have right now 12 scientific papers under review and four that are in press, in other words, that have been accepted.
01:36:10.000 So I have stuff coming out that, over and over again, like a sledgehammer, is going to make certain companies look, well, very evil, I would say.
01:36:10.000 Do you think that they have the ability to suppress the kind of coverage of the data that you're putting out to the point where it's not going to impact them?
01:36:20.000 Like, how much has it impacted them currently?
01:36:23.000 I mean, we're talking about committing murder or potentially committing murder.
01:36:26.000 Like, how much have you impacted them if they're still in complete and total control and they're still utilizing all these algorithms and making massive amounts of profit?
01:36:36.000 You haven't put a crimp in that.
01:36:40.000 Well, I have.
01:36:41.000 I have put a crimp in it, yes.
01:36:46.000 So I do want to talk to you about the monitoring stuff, because there is a way.
01:36:51.000 There's more than one way, but there's one very practical way to literally just push these companies out of our personal lives and out of our elections.
01:37:05.000 And I've been working on that project since 2016. That project started because of a phone call I received from a state attorney general, Jim Hood.
01:37:17.000 He was attorney general of Mississippi at the time.
01:37:21.000 He called me in 2015 and he said, could Google mess with my reelection as attorney general?
01:37:29.000 Because in that state they elect them.
01:37:31.000 And I said, well, yeah, very easily.
01:37:33.000 And he said, well, how would they do it?
01:37:34.000 And I explained how they do it, et cetera, et cetera.
01:37:36.000 And he was very, very concerned.
01:37:37.000 And he said, but how would you know that they're doing it?
01:37:42.000 And my mind just started to spin.
01:37:44.000 I was thinking, gee, I don't know.
01:37:47.000 Well, a whistleblower, you know, a warrant, something.
01:37:51.000 And I became obsessed with trying to figure out how to know what these companies are actually showing real people.
01:38:02.000 Now, here and there, there's some researchers at Columbia who should be ashamed of themselves.
01:38:07.000 There's some reporters at The Economist who should be ashamed of themselves.
01:38:10.000 Here and there, people have set up a computer that they anonymize, and they type in lots of search terms, and they get back all these search results, and they conclude that there's no bias.
01:38:26.000 But that doesn't tell you anything because Google's algorithm can easily spot a bot, can easily spot an anonymized computer.
01:38:34.000 They know it's not a real person.
01:38:35.000 How do they know that?
01:38:37.000 Because it doesn't have a profile.
01:38:41.000 You have a profile.
01:38:47.000 How long have you been using the Internet?
01:38:49.000 Let's put it that way.
01:38:50.000 Long time.
01:38:51.000 Well, roughly.
01:38:52.000 25, 30 years, whatever it's been.
01:38:55.000 What was it?
01:38:57.000 94 I first got on?
01:38:59.000 Wow.
01:38:59.000 So almost 30 years.
01:39:00.000 Okay, so Google has a profile on you that has the equivalent of more than three million pages of content.
01:39:15.000 Now, you're probably thinking, well, how could I generate that?
01:39:18.000 Because everything you do goes into that profile.
01:39:23.000 So, yeah, it's a lot of content.
01:39:26.000 But the point is, they know the difference between you, because you have a big old profile, and an anonymized computer or a bot because there's no profile.
01:39:35.000 Right.
01:39:35.000 So it turns out, this is the simplest thing in the world to do, is that when they see a bot, okay, they just send out unbiased content.
01:39:46.000 We've shown this ourselves.
01:39:48.000 There's nothing to it.
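A toy illustration of the mechanism being alleged, namely that a request with no accumulated profile gets a neutral, reproducible ordering while a profiled user gets the personalized one. The threshold and the placeholder ranker are invented for illustration; this is a sketch of the claim, not Google's actual logic.

```python
# Requests with no accumulated profile get a neutral ranking; profiled users
# get the personalized (and potentially biased) one. Thresholds and fields
# are invented for illustration.

def personalized_rank(results: list[str]) -> list[str]:
    # Placeholder for whatever a profile-driven ranker would do.
    return list(reversed(results))

def rank_results(results: list[str], profile_pages: int) -> list[str]:
    looks_like_bot = profile_pages < 10     # assumed heuristic, not a real one
    if looks_like_bot:
        return sorted(results)              # neutral, reproducible ordering
    return personalized_rank(results)       # profile-driven ordering

print(rank_results(["a", "b", "c"], profile_pages=0))
print(rank_results(["a", "b", "c"], profile_pages=3_000_000))
```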
01:39:49.000 But that's not the challenge that General Hood was basically giving me.
01:39:55.000 He was saying, how would you find out what real people are seeing?
01:40:01.000 So, 2016, I got some funds.
01:40:05.000 I don't even know where they came from, but anyway, and we started recruiting people.
01:40:10.000 We call them field agents.
01:40:12.000 This is exactly what that company does, Nielsen, that does the Nielsen ratings.
01:40:18.000 They've been doing it since 1950. They're now in 47 countries.
01:40:22.000 And they recruit families and they keep their identities very secret.
01:40:26.000 And they equip the families with special gizmos so they can keep an eye on what television shows they're watching.
01:40:32.000 And that's where the Nielsen ratings come from, which are very important because they determine how much those shows can charge for advertising.
01:40:41.000 They determine whether or not a show stays on the air.
01:40:43.000 So it's important.
01:40:45.000 So we started recruiting field agents.
01:40:49.000 And we developed custom software literally from the ground up.
01:40:52.000 And when we screen a field agent and we say, okay, you want to join us?
01:40:58.000 We install on their computer special software, which allows us, in effect, to look over their shoulders.
01:41:06.000 This is with their permission, obviously.
01:41:08.000 Look over their shoulders and we can take snapshots.
01:41:13.000 So when we sign these people up, we're taking lots of snapshots, you know, all day long.
01:41:18.000 And then information's coming in and it's being aggregated.
01:41:21.000 So we can look at what real voters are being sent by Google, Facebook, YouTube, anybody.
01:41:31.000 And we take all kinds of precautions to make sure these people cannot be identified.
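In outline, a capture-and-aggregate client of this sort might look something like the sketch below, where the field names, the pseudonymizing scheme, and the in-memory store are assumptions for illustration rather than the project's actual software.

```python
import hashlib
import json
import time

# A field agent's client records the ephemeral content it was shown and ships
# it, without personal identifiers, to a central store.

def snapshot(agent_secret: str, query: str, results: list[str]) -> dict:
    agent_id = hashlib.sha256(agent_secret.encode()).hexdigest()[:12]  # pseudonym
    return {
        "agent": agent_id,
        "captured_at": time.time(),
        "query": query,
        "results": results,          # the ephemeral content, preserved
    }

AGGREGATE = []                        # stand-in for the central database

def submit(snap: dict) -> None:
    AGGREGATE.append(json.dumps(snap))

submit(snapshot("field-agent-secret", "candidate x", ["result 1", "result 2"]))
print(len(AGGREGATE), "snapshots preserved")
```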
01:41:37.000 We deliberately had a small group of people, Gmail users, to make it easy for Google to identify those people.
01:41:46.000 Guess what?
01:41:48.000 They got unbiased content.
01:41:51.000 But everyone else was getting highly biased content.
01:41:55.000 Why did the Gmail people get unbiased content?
01:41:57.000 Because Google knew they were our field agents.
01:42:02.000 So Google was aware of your study?
01:42:04.000 I probably can't even sneeze without Google being aware.
01:42:11.000 So you think Google manipulated the results of the people that were Gmail users to show that they didn't have bias?
01:42:21.000 No, I think they unmanipulated.
01:42:22.000 Yeah, that's what I'm saying.
01:42:23.000 I mean, manipulated in the sense of they didn't apply the algorithm to those people.
01:42:30.000 I had a reporter from D.C. I'm not going to name him.
01:42:35.000 He was doing a piece on my work.
01:42:37.000 Then he contacts me a couple days later and he said that he called up a woman who he believed was the head of Google's PR department.
01:42:47.000 He said, and I asked her questions about your work and she started screaming at me.
01:42:53.000 He said, that's very unprofessional.
01:42:55.000 I've never had that happen before.
01:42:57.000 He said, I'm going to tell you two things.
01:42:59.000 He said, number one, you have their attention.
01:43:03.000 And number two, if I were you, I would take precautions.
01:43:09.000 Jesus.
01:43:12.000 So, monitoring.
01:43:14.000 2016, we recruited 95 field agents in 24 states.
01:43:19.000 We preserved 13,000 election-related searches on Google, Bing, and Yahoo.
01:43:28.000 So it's 130,000 search results.
01:43:31.000 So each one has 10 results in it.
01:43:34.000 So it's 130,000 links.
01:43:36.000 And we also then also preserved the web pages.
01:43:41.000 So we had 98,000 unique web pages.
01:43:43.000 And then we analyzed it.
01:43:45.000 We found extreme pro-Hillary Clinton bias on Google search results, but not on Bing or Yahoo.
01:43:57.000 Now, here's number four, disclaimer number four.
01:44:00.000 I supported Hillary Clinton.
01:44:02.000 But still, I was very disturbed by this, extremely disturbed, because we knew from the experiments we had run that that was enough bias to have shifted over a period of time among undecided voters somewhere between 2.6 and 10.4 million votes without anyone having the slightest idea that this had occurred.
01:44:25.000 That's 2016. 2018, we monitored the midterms.
01:44:30.000 We preserved 47,000 searches.
01:44:33.000 So we were expanding.
01:44:34.000 We're getting bigger.
01:44:36.000 47,000.
01:44:37.000 And we found enough bias on Google, but not Bing or Yahoo, to have shifted 78 million votes.
01:44:46.000 That's spread across hundreds of elections, though, with no one knowing.
01:44:53.000 2020, we went all out.
01:44:55.000 We had more money, we went all out.
01:44:58.000 And we recruited 1,735 field agents just in swing counties, just in swing states, because we knew that's where the action was going to be.
01:45:11.000 We preserved 1.5 million ephemeral experiences, and I'll define that if you want, on Google, Bing, Yahoo, YouTube, Google's homepage, Facebook.
01:45:27.000 We, at this point, know how to preserve pretty much anything.
01:45:31.000 We preserve three million webpages.
01:45:36.000 And we're getting to the climax here.
01:45:40.000 We decided, which we hadn't done in the past, on October 30th, 2020, before the election, a few days before the election, we decided to go public with some of our initial findings.
01:45:51.000 And we did.
01:45:54.000 And as a result, on November 5th, two days after the election, three U.S. senators sent a very threatening letter to the CEO of Google, just summarizing all my work, my preliminary stuff.
01:46:13.000 And guess what happened then in Georgia?
01:46:16.000 We had over a thousand field agents in Georgia.
01:46:21.000 Google turned off the bias like that.
01:46:25.000 Google stopped with their homepage Go Vote reminders.
01:46:30.000 They stayed out of Georgia.
01:46:33.000 What does this say?
01:46:34.000 This tells you that if you monitor, if you do to them what they do to us 24 hours a day, you do that to them and you look for any kind of manipulation, any kind of bias, any kind of shenanigan, and you make that public,
01:46:49.000 you expose it.
01:46:54.000 They back down.
01:46:55.000 They back down.
01:46:56.000 They have to back down.
01:46:58.000 So doesn't this highlight that if our government is concerned about legitimate threats to democracy and legitimate threats to the way information is distributed and free speech and manipulation,
01:47:14.000 that they should be monitoring Google.
01:47:18.000 But is the problem money?
01:47:20.000 Because of the amount of money that they give to campaigns, the amount of money they give to support causes that these politicians back.
01:47:31.000 And the votes.
01:47:32.000 Don't forget the vote shifting.
01:47:34.000 Because some of these politicians understand that.
01:47:36.000 Yes.
01:47:37.000 Yes.
01:47:39.000 Forget the government.
01:47:40.000 Forget the government.
01:47:43.000 The government is not going to do this.
01:47:45.000 And would we even trust the government to do it?
01:47:48.000 So who should be doing it?
01:47:49.000 This should be done by probably a consortium, bipartisan or nonpartisan.
01:47:57.000 Nonprofit organizations and, you know, we should have hearings.
01:48:04.000 We should have, you know, very—everything should be transparent.
01:48:08.000 We should have wide representation of people serving on the boards and all that kind of thing... Well, like the UN, but this is a narrow kind of task.
01:48:20.000 Here's what we need.
01:48:21.000 We need to set up now, because now we know how to do it.
01:48:24.000 We need to set up a permanent, large-scale monitoring system in all 50 states in the United States.
01:48:31.000 That's how we start.
01:48:32.000 Eventually, we have to help people in other countries set up similar systems.
01:48:36.000 See, that's the real answer to your future question: that is how, now and in the future, we can get control over emerging technologies.
01:48:52.000 Not just Google, but the next Google and the Google after that.
01:48:55.000 There is no way to know what these companies are doing unless you are monitoring.
01:49:03.000 One of the simulators we have now that we developed actually within the past year, which is fabulous, I'm so proud of my staff, we have an Alexa simulator.
01:49:14.000 I mean, it just works just like Alexa.
01:49:17.000 And it talks.
01:49:18.000 It's fantastic.
01:49:19.000 Except we control what it's going to say.
01:49:24.000 And sure enough, can we shift people's opinions? Oh yeah, easy peasy, nothing to it.
01:49:29.000 But what that tells you is that's one of the things we have to monitor.
01:49:32.000 We have to monitor the answers... Right.
01:49:55.000 You add up these manipulations, and basically what Eisenhower predicted, it's here now.
01:50:06.000 It's just that you can't see it.
01:50:09.000 First of all, I'll give you an example.
01:50:12.000 Okay.
01:50:15.000 2016. And I bet you Mark Zuckerberg has been kicking himself in the butt ever since.
01:50:20.000 On election day, if Zuckerberg, with one click, if he had sent out go-vote reminders...
01:50:30.000 Just to Democrats that day?
01:50:33.000 Because, you know, I mean, Trump won basically by, what, 47,000 votes in four states?
01:50:38.000 I mean, if Zuckerberg had sent out go-vote reminders just to Democrats, and he knows who the Democrats are, right?
01:50:46.000 He could have generated that day 450,000 more votes for Hillary Clinton than she got.
01:50:55.000 How do we know that?
01:50:56.000 From Facebook's own published data.
01:50:59.000 They published a study in 2012 showing how they could get more people to vote in 2010 by sending out vote reminders.
01:51:09.000 If you just take the data that they published and move it over to 2016 and say, okay, Mark, press the button, Hillary would have absolutely won the election.
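As a back-of-envelope version of that claim: the 450,000 figure comes from the conversation above, while the audience size and per-user turnout lift below are assumed placeholders chosen to reproduce it, not Facebook's published numbers.

```python
# Back-of-envelope version of the go-vote-reminder claim. The 450,000 figure
# comes from the transcript; the audience size and per-user turnout lift are
# assumed placeholders, not Facebook's published numbers.

targeted_democratic_users = 60_000_000   # assumption
turnout_lift_per_user = 0.0075           # assumption: +0.75 percentage points

extra_votes = targeted_democratic_users * turnout_lift_per_user
print(f"{extra_votes:,.0f} extra votes")  # 450,000 with these assumptions

margin_cited_in_transcript = 47_000       # Trump's margin as cited on the show
print(extra_votes > margin_cited_in_transcript)
```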
01:51:23.000 He, I'm sure to this day, is kicking himself because he didn't do it.
01:51:28.000 But how would you know?
01:51:32.000 See, on any given day, any given election, how would you know whether that kind of reminder is going out, number one?
01:51:37.000 And number two, who it's going to?
01:51:40.000 Is it going to everybody?
01:51:43.000 Or is it going just to a select group?
01:51:45.000 Is it targeted?
01:51:46.000 There's no way to know that unless you have monitoring systems in place.
01:51:50.000 With a monitoring system, you would know within seconds or minutes If a targeted message like that was being sent out...
01:52:02.000 But if you had a targeted message like that, that's not illegal, right?
01:52:08.000 Which is part of the problem.
01:52:10.000 Like, even if they did it publicly, and you said, all we're doing is encouraging people to vote.
01:52:15.000 Yeah, but what if it's going just to members of one party?
01:52:18.000 Oh, I get it.
01:52:19.000 But I mean, would they be obligated to send that to everybody?
01:52:25.000 Or maybe they could use the excuse that it's only the people that are politically inclined.
01:52:31.000 Here's what I'm...
01:52:34.000 This is what I believe.
01:52:35.000 Okay.
01:52:36.000 Based on the experience that we just had a few months ago, where we got Google to stay out of Georgia, and by the way, we positively got them to stay out of Georgia because we had over a thousand field agents in Georgia, and we were collecting a massive amount of...
01:52:50.000 We collected more than a million ephemeral experiences.
01:52:54.000 I guess I'm going to have to define that.
01:52:55.000 In Georgia, I'm telling you, Google...
01:52:58.000 We have never seen...
01:53:00.000 So little bias in Google search results ever since we started monitoring in 2016. What's an ephemeral experience?
01:53:07.000 Okay.
01:53:08.000 2018, a leak to the Wall Street Journal from Google.
01:53:13.000 Bunch of emails.
01:53:15.000 One Googler is saying to others, how can we use ephemeral experiences to change people's views about Trump's travel ban?
01:53:27.000 In other words, I didn't make up this term.
01:53:29.000 This is from Google.
01:53:31.000 Internally, this is the kind of lingo that they use.
01:53:34.000 What's an ephemeral experience and why would they want to use ephemeral experiences to change people's minds?
01:53:40.000 Because an ephemeral experience is, well, most of the kinds of interactions we have online involve ephemeral experiences.
01:53:48.000 Search, you type a search term, you see a bunch of search results, it has an impact on you, you click on something, it disappears, it's not stored anywhere, and it's gone forever.
01:54:19.000 So there are these brief experiences, like a news feed, a list of search suggestions, an answer box, that affect users, disappear, are stored nowhere, and authorities cannot go back in time and figure out what people were being shown.
01:54:19.000 That's why internally at Google they want to use ephemeral experiences to impact people because unless someone like me, and I'm the only one doing this, unless some crazy guy like me is setting up monitoring systems and keeping everything secret while it's running,
01:54:41.000 no one will ever know.
01:54:46.000 That you just flipped an election.
01:54:48.000 No one will ever know.
01:54:51.000 As I say, the most powerful mind control machine ever invented, and it relies, for the most part, on ephemeral experiences, meaning no one knows.
01:55:02.000 You can't track it.
01:55:03.000 You can't track it.
01:55:05.000 You can't go back in time.
01:55:07.000 The only way to do it is you'd have to be looking over the shoulders of real users.
01:55:12.000 You have to look over their shoulders, and you have to grab it as it's occurring, and then you have to aggregate it, analyze it quickly.
01:55:21.000 That's not really possible.
01:55:22.000 Well, no, that's what we do!
01:55:24.000 But I mean, that's not possible for the entire country.
01:55:27.000 Yeah, well, that's why we have to take what we've done.
01:55:31.000 Do you see the irony in that though?
01:55:32.000 It's almost like the only way to prevent this manipulation is by massive surveillance of everyone.
01:55:41.000 No, no, no.
01:55:43.000 All you need is a representative sample.
01:55:45.000 You do what the Nielsen company does.
01:55:47.000 Same thing.
01:55:48.000 So we have, you know, a panel, it's called.
01:55:52.000 We have a panel of field agents around the country.
01:55:54.000 In a state like California, we'd have to have a lot because they have a lot of people.
01:56:00.000 Idaho, we don't need so many.
01:56:01.000 So you just take representative sample from each state.
01:56:06.000 Like a Nielsen thing.
01:56:07.000 Exactly like Nielsen.
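Proportional allocation of that kind is easy to sketch; the populations below are illustrative round numbers, not census figures, and the total panel size is arbitrary.

```python
# Sketch of Nielsen-style proportional allocation of field agents by state.
# Populations are illustrative round numbers, not census figures.

STATE_POP = {"California": 39_000_000, "Texas": 30_000_000, "Idaho": 2_000_000}
TOTAL_PANEL = 1_000

total_pop = sum(STATE_POP.values())
panel = {state: round(TOTAL_PANEL * pop / total_pop) for state, pop in STATE_POP.items()}
print(panel)   # more agents where there are more people, few needed in Idaho
```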
01:56:09.000 But would they be aware of who the Nielsen families are or the people that you're surveilling, you know, your Nielsens?
01:56:17.000 Would they be able to just have them receive unbiased data?
01:56:24.000 Well, that's the whole point.
01:56:26.000 The point of Nielsen is they have to keep the identities of those families secret because otherwise people would mess with them.
01:56:32.000 Right.
01:56:32.000 But if they have the amount of surveillance capabilities that we're talking about here, wouldn't they be able to know who these field agents are?
01:56:42.000 Well, that's why we're very, very careful about how we do the recruiting.
01:56:48.000 So it's expensive.
01:56:51.000 Nielsen has to take precautions in the way they do recruiting and equipping and training.
01:56:56.000 We have learned from that.
01:56:59.000 We take tremendous precautions.
01:57:01.000 And so, you know, you're asking, can this really be done?
01:57:06.000 I'm saying, yeah, I've done it four times, so I know it can be done.
01:57:10.000 But it takes effort.
01:57:12.000 There's a lot of security involved.
01:57:16.000 If someone is suspicious, we dump them.
01:57:19.000 Nielsen does the same thing.
01:57:21.000 So how do you find out if someone's suspicious?
01:57:26.000 Well, how do we do that?
01:57:30.000 For example, let's say we're aggregating information that they're getting on the search engines, let's say.
01:57:41.000 So it's coming in.
01:57:42.000 Our software is set up so that if the information we're getting from any particular field agent doesn't look right, then it goes over to human review.
01:57:54.000 So what could that mean?
01:57:56.000 That could mean, for example, that they are using an algorithm.
01:58:00.000 They're trying to tilt things in a particular direction.
01:58:03.000 So they're not actually typing in anything.
01:58:05.000 They're not using the computer the normal way they would use it, which is what they're supposed to do.
01:58:09.000 It means they've now developed or been equipped with an algorithm to, boom, just start generating a lot of stuff, which would mess up our numbers, right?
01:58:22.000 Well, those people immediately are flagged, and when that happens and we can't exactly figure out what's going on, we dump them.
01:58:31.000 And we dump their data.
01:58:34.000 If their information is coming in faster than a person can type, we dump them.
01:58:40.000 But there are other indications, too.
01:58:42.000 I mean, I can't reveal all that, but we're taking precautions exactly like Nielsen has been doing all the way since 1950. It can be done.
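[A minimal sketch, in Python, of the one screening rule he does describe: flag a field agent whose submissions arrive faster than a person could plausibly type. The record format and the typing-speed threshold are assumptions for illustration, not details of the actual system.]

from dataclasses import dataclass
from typing import List

MAX_PLAUSIBLE_CHARS_PER_SEC = 10.0   # hypothetical ceiling on human typing speed

@dataclass
class Submission:
    timestamp: float   # seconds since epoch when the query was captured
    query: str         # the search term the agent typed

def flag_for_review(submissions: List[Submission]) -> bool:
    """Return True if the agent's input rate looks non-human and needs human review."""
    subs = sorted(submissions, key=lambda s: s.timestamp)
    for prev, curr in zip(subs, subs[1:]):
        elapsed = curr.timestamp - prev.timestamp
        if elapsed <= 0:
            return True                       # simultaneous or out-of-order events
        if len(curr.query) / elapsed > MAX_PLAUSIBLE_CHARS_PER_SEC:
            return True                       # typed faster than a human could
    return False

# Example: a 30-character query arriving half a second after the previous one is flagged.
agent_log = [Submission(0.0, "georgia runoff results"),
             Submission(0.5, "early voting locations near me")]
print(flag_for_review(agent_log))   # True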
01:58:52.000 What is your goal with all this?
01:58:54.000 Do you think you can shift the way these companies do business?
01:58:59.000 Do you want to just inform and educate the public as to what's happening and how divisive and how interconnected all this stuff is?
01:59:15.000 It's hard to answer that question because I keep learning more, and believe me, what we've learned in the last year easily eclipses what we learned in the previous eight years.
01:59:26.000 We're learning so much.
01:59:28.000 The team is growing.
01:59:29.000 Our capabilities are growing.
01:59:35.000 I'll say at one point in time, what I was concerned about was how can we get Google under control?
01:59:44.000 So I published an article in Bloomberg Businessweek.
01:59:47.000 There's a great backstory there because, you know, it was scheduled to come out and then someone or other made a phone call to someone else and then, boom, the piece got pulled.
01:59:59.000 And this is a solution to the Google problem, literally.
02:00:03.000 The editor in chief is literally having arguments with the, you know, the higher ups, the publishers, because they pulled my piece on how to get Google under control, how to solve the Google problem.
02:00:15.000 I was scheduled to testify before Congress the following Tuesday.
02:00:20.000 The article had been pulled.
02:00:21.000 The editor-in-chief was determined to get this piece out.
02:00:26.000 He got it published in their online version on Monday, the day before the hearing.
02:00:32.000 So what is this about?
02:00:34.000 This is very simple, very light touch regulation.
02:00:38.000 The way to do it is to require Google to share its index, the database it uses to generate search results, with other companies.
02:01:03.000 And Google could still sell it, you know, when people like Bing want to use a lot of it, a lot of data from the database.
02:01:11.000 They could still make money.
02:01:12.000 But what would happen in that case, though, is that hundreds of other search engines would now be set up, and then thousands.
02:01:20.000 All pulling really good data from Google's database.
02:01:24.000 And then they would go after niche audiences.
02:01:27.000 And they'd all be giving great search results, but they're going after Lithuanians.
02:01:30.000 They're going after women.
02:01:31.000 They're going after gays.
02:01:32.000 And so you'd end up with a competitive search environment like there used to be when Google started out.
02:01:42.000 And more importantly, you'd end up with innovation in search.
02:01:45.000 There's been no innovation in search now for 20 years.
02:01:47.000 I mean, look at Google's...
02:01:50.000 It's the same and the methodology is the same.
02:01:53.000 So you'd end up with innovation, you'd end up with competition, all with one very simple regulatory intervention.
02:02:02.000 And this was done with AT&T back in the 1950s.
02:02:06.000 Is there any consideration to adopt this?
02:02:08.000 Have you had conversations where you think that this could actually become a real thing?
02:02:14.000 Positively, this could happen because there are members of Congress who get it.
02:02:19.000 And they recognize that this approach is light touch compared to a million other things like the breakups.
02:02:27.000 You know, they're going to do breakups.
02:02:29.000 That's just nonsense.
02:02:32.000 So, yeah, it could happen, but it doesn't need to happen here.
02:02:35.000 It could happen in the EU. Because Google has, I think, 18 data centers.
02:02:40.000 I think only half of them are in the United States.
02:02:44.000 I think nine of them are in the US, and five of them are in Europe.
02:02:50.000 And Brussels, they can't stand Google.
02:02:54.000 Of course, they've fined them, you know, these billion-euro fines.
02:02:58.000 They've hit them up so far with three massive fines.
02:03:02.000 Totaling more than 10 billion euros since, I think, 2017. And what have they fined them for?
02:03:10.000 Well, bias in search results.
02:03:13.000 How about that?
02:03:14.000 That's the first big fine.
02:03:17.000 And this is Brussels.
02:03:18.000 Yeah.
02:03:18.000 And why doesn't the United States implement some sort of a similar punishment?
02:03:23.000 Because Google owns the United States.
02:03:28.000 I mean, there's an antitrust action right now in progress against Google, and it's the attorney generals, I believe, from every single state in the United States except California.
02:03:41.000 Because the attorney general of California, his main supporter is Google.
02:03:47.000 Google's based in California.
02:03:49.000 So it's so crazy that they have this massive antitrust action in progress, and the AG of California is staying out of it.
02:03:59.000 Yeah.
02:04:00.000 His name is Becerra, I think.
02:04:03.000 But the point is that we're talking about light touch regulation.
02:04:08.000 That could actually be done.
02:04:09.000 It could be enacted by the European Union.
02:04:12.000 I've spoken in Brussels.
02:04:13.000 I've talked to people there.
02:04:15.000 That's the kind of thing they could do.
02:04:17.000 And if they did it, it would affect Google worldwide.
02:04:21.000 And you would end up with thousands of very good search engines, but each aiming at niche audiences.
02:04:28.000 And doesn't that sound like the world of media?
02:04:30.000 The world you're in?
02:04:31.000 Yeah.
02:04:32.000 And it doesn't seem like this would bankrupt Google.
02:04:35.000 Oh, no, no, not at all.
02:04:36.000 No, they still make massive amounts of profit.
02:04:38.000 Absolutely.
02:04:39.000 And it would fall in line with don't be evil.
02:04:44.000 Well, the fact is that depending on who the leadership is at any point in time at Google, they might look at that idea and say, hey, look, this will be great for us.
02:04:53.000 Really?
02:04:54.000 Sure.
02:04:55.000 But don't you think that any self-regulatory move like that would set up possible future regulatory moves?
02:05:04.000 Like, wouldn't they want to resist any kind of regulations for as long as they possibly can?
02:05:09.000 But if they thought that they were going to be attacked in some worse way and that this is a way out, you know, they're numbers people.
02:05:19.000 They're just numbers people.
02:05:20.000 I'll give you an example of Google just looking at numbers.
02:05:26.000 2018. Election Day.
02:05:31.000 So, because I already told you we got them to stop doing this in Georgia, but now I'm going back in time.
02:05:37.000 2018, Election Day, Google on its homepage posts this, go vote, everyone go vote.
02:05:45.000 So Google does this.
02:05:47.000 Now the question is, were they sending it to everyone?
02:05:51.000 I don't know.
02:05:52.000 But let's assume they were sending it to everyone.
02:05:56.000 Okay, the first thing that happened that day was all the major news services praised Google.
02:06:02.000 Praise Google!
02:06:06.000 For doing this amazingly good public service, right?
02:06:11.000 And I looked at it and I immediately said, that's not a public service.
02:06:15.000 That's a vote manipulation.
02:06:18.000 So I sat down and I did exactly what a data analyst or data scientist at Google would do, and I just ran the numbers.
02:06:27.000 And by the way, this is something we're also studying.
02:06:29.000 It's called DDE, the Differential Demographics Effect.
02:06:35.000 The fact is Google has more Democrat users than Republican users.
02:06:41.000 So if they sent that to everybody that day, that would give 800,000 more votes.
02:06:50.000 It would give, you know, more votes to everybody, but it would give 800,000 more votes to Democrats than to Republicans.
02:06:58.000 This is spread across, again, midterms, so it's hundreds of races.
02:07:02.000 So if they send it to everyone, they win.
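[A back-of-the-envelope Python sketch of the Differential Demographics Effect arithmetic described above. The user counts and the nudge rate below are hypothetical placeholders chosen only to show how a prompt sent to everyone can still produce the 800,000-vote differential he cites.]

dem_users = 120_000_000   # hypothetical Democratic-leaning users shown the prompt
rep_users = 100_000_000   # hypothetical Republican-leaning users shown the prompt
nudge_rate = 0.04         # hypothetical fraction who vote only because of the prompt

extra_dem_votes = dem_users * nudge_rate
extra_rep_votes = rep_users * nudge_rate
differential = extra_dem_votes - extra_rep_votes

print(f"Extra Democratic votes: {extra_dem_votes:,.0f}")
print(f"Extra Republican votes: {extra_rep_votes:,.0f}")
print(f"Net differential:       {differential:,.0f}")   # 800,000 with these made-up inputs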
02:07:06.000 So if Google is biased towards Democrats in terms of users, what are the Republicans using?
02:07:15.000 What kind of tech are they using?
02:07:17.000 I mean, if you're saying that Google's sending out these messages, right, and that most of their users or the majority of their users are Democrats, right?
02:07:25.000 So what's the majority of Republicans?
02:07:30.000 I'm not sure what you're asking.
02:07:32.000 You're saying Google's sending out this message, go vote.
02:07:35.000 And through that message, because of the bias, because of the difference in the numbers, more Democrats are getting it because more Democrats use Google, right?
02:07:46.000 So what do Republicans use?
02:07:48.000 Well, they're still using Google.
02:07:50.000 There's not more Democrats in the country.
02:07:54.000 Is it less Republicans are online?
02:07:56.000 What's the bias there?
02:07:58.000 What is the difference?
02:08:02.000 I believe there are fewer Republicans online.
02:08:06.000 That could be a factor.
02:08:08.000 Last time I looked at the numbers, it looked like there were a few more Democrats.
02:08:13.000 You know, people registered as Democrats, then people registered as Republicans.
02:08:17.000 So, you know, it's a combination of factors.
02:08:20.000 As you know, in recent years, Republicans and conservatives, they have tried to set up a number of platforms of their own.
02:08:30.000 Parler is one.
02:08:31.000 Yeah.
02:08:31.000 So they're, you know, they're...
02:08:33.000 Social media platforms.
02:08:34.000 Yeah, they're trying to, you know, carve out their own world, their own niche on the internet.
02:08:39.000 So they don't have to use these inherently biased platforms.
02:08:45.000 Now, all this search engine stuff and the manipulation, how much does this apply to social media as well?
02:08:56.000 And is there cross-contamination?
02:09:00.000 Well, social media is more complicated because social media, we're the ones who are posting all the stuff.
02:09:08.000 So we're providing all the- But not necessarily.
02:09:11.000 Because if you pay attention to manipulation, there's a lot of manipulation that's coming from overseas, allegedly.
02:09:18.000 My position has always been like, you know, who's funding these troll farms in Macedonia?
02:09:25.000 And how do we know that it's not someone in the middle of Kentucky?
02:09:29.000 Okay, you're absolutely right.
02:09:30.000 There's a tremendous amount of content that is posted by bots that is coming from organizations in other countries.
02:09:37.000 You're absolutely right.
02:09:38.000 I mean, I think Facebook, just in the first quarter of last year, took down 2 billion profiles?
02:09:48.000 Facebook's top 20 Christian sites, 19 of them, are run by troll farms.
02:09:53.000 Right.
02:09:54.000 So there's a lot of junk out there.
02:09:56.000 That's true.
02:09:57.000 So when you get to social media, the picture gets very complicated.
02:10:01.000 However, here's what you got to know.
02:10:06.000 It's algorithms that determine what goes viral.
02:10:10.000 Everyone believes this crazy myth.
02:10:13.000 Everyone believes this.
02:10:14.000 Everyone I know believes this.
02:10:15.000 My kids believe this.
02:10:17.000 Everyone believes that virality is mysterious.
02:10:21.000 It's like winning the lottery.
02:10:25.000 And that's not true.
02:10:26.000 Because if I control the algorithms, okay, I determine what's going to go viral and what is not.
02:10:34.000 Now that's, again, a tremendous source of power.
02:10:38.000 And of course, they do want a bunch of stuff to go viral, even crazy negative stuff, because more traffic, more money.
02:10:46.000 But the bottom line is they control the algorithms that determine what goes viral.
02:10:53.000 That's where a lot of the power lies in the world of social media.
02:10:58.000 That's where the Frances Haugen revelations are extremely important.
02:11:06.000 And just having that underbelly, that ugly underbelly of that company exposed.
02:11:12.000 So no matter how you look at this, for us to sit by, Eisenhower's speech actually says that we have to be vigilant.
02:11:25.000 He uses the word vigilant.
02:11:26.000 We have to be vigilant so that we don't let these kinds of powers take over our government, our democracy, our nation.
02:11:36.000 And we have not been vigilant and we're not being vigilant now.
02:11:40.000 And the research, you know, that we do and the monitoring systems both, the research is over here and the monitoring stuff's over here, that reminds me every single day.
02:11:51.000 I mean, I'm looking at numbers every single day.
02:11:53.000 You're keeping me away from my data and my research, by the way.
02:11:58.000 But I'm reminded every single day of just how serious this stuff is.
02:12:06.000 This is deadly serious for the future of not just our country, but all of humanity.
02:12:11.000 And the fact that people don't know it... Sometimes when I've given speeches, people say, I don't care.
02:12:20.000 I have nothing to hide.
02:12:22.000 I've heard that.
02:12:22.000 That infuriates me.
02:12:24.000 Yeah.
02:12:24.000 I've heard that about government surveillance, too.
02:12:26.000 Yeah.
02:12:28.000 Well, look at the Chinese.
02:12:30.000 The lives of the Chinese are strictly controlled by the government, and more and more they're using high tech.
02:12:37.000 And Google has worked with the government of China to improve that technology.
02:12:43.000 And to limit access to search results.
02:12:45.000 That is correct.
02:12:47.000 So Google does what's good for Google.
02:12:52.000 They're not the sweet little old lady running the library that people think.
02:12:58.000 That's not what they are.
02:12:59.000 They do what's good for Google.
02:13:00.000 I had a friend who worked at Google during the time they were working and having negotiations with China, and her position was that China was just going to copy Google's tech if they didn't do that.
02:13:12.000 Yeah.
02:13:13.000 I've heard that, yeah.
02:13:14.000 Yeah.
02:13:16.000 So like they were in this position where, you know, like Tiananmen Square, like you cannot bring up, like Tiananmen Square is not searchable.
02:13:23.000 You can't find that in China.
02:13:25.000 The results of it, like the guy standing in front of the tank, like there's a lot of information from Tiananmen Square that would look terrible.
02:13:34.000 That's right.
02:13:35.000 So you can't find it.
02:13:36.000 No, it's suppressed.
02:13:37.000 And, you know, Google's an expert at doing that kind of suppression.
02:13:42.000 They're the biggest censors in the history of humankind.
02:13:46.000 But still, look...
02:13:49.000 I know I'm a very idealistic person.
02:13:54.000 I've handed out tests of idealism in my classes when these are young people in their 20s.
02:14:00.000 And I outscore them.
02:14:02.000 I've always outscored all of my students.
02:14:05.000 I'm very idealistic.
02:14:06.000 I believe in truth, justice, the American way, like Superman, and all that crazy stuff.
02:14:15.000 But...
02:14:20.000 I'm going to do my best to get people to wake up.
02:14:24.000 That's why I said, yes, I'll give up a day of looking at my numbers.
02:14:30.000 I'm going to come and talk to you because I am trying to get people to listen.
02:14:35.000 I'm trying to figure out how to get people to listen.
02:14:39.000 People must listen.
02:14:41.000 Let me put it another way.
02:14:42.000 That monitoring system I keep talking about, That's not optional, okay?
02:14:47.000 That's not optional.
02:14:48.000 That must be set up.
02:14:50.000 If we don't set that up, we will have no clue.
02:14:55.000 We will not understand not only why this person or that person won an election, we will not understand what's happening with our kids.
02:15:07.000 I have five kids.
02:15:08.000 When my daughter Janelle was about 12, and I'm sure you've done this.
02:15:12.000 I think you have kids roughly that age.
02:15:14.000 So I did the thing a dad does sometimes.
02:15:17.000 I went into her bedroom just to check on her.
02:15:21.000 And I noticed one of her little electronic devices, the old iPod or whatever it was, is sitting next to her pillow.
02:15:28.000 And then I looked a little closer and I went, what?
02:15:31.000 There were five electronic devices encircling her pillow.
02:15:39.000 It's our kids that we need to be thinking about here.
02:15:43.000 It's not just their future, but literally, how are they being impacted right now?
02:15:51.000 What kind of content are they being shown?
02:15:54.000 Is it pornographic?
02:15:56.000 Is it violent?
02:15:57.000 I don't know.
02:15:58.000 Are they being pushed one way or another politically?
02:16:02.000 We are in the process right now of trying to expand our research to look at kids and to see what content these kids are being shown.
02:16:11.000 Because it doesn't matter how vigilant you are as a parent, the fact is 99% of what your kids are seeing online or experiencing online, you're unaware of.
02:16:26.000 And that's why, as I say, Solving these problems is not optional.
02:16:34.000 We must solve these problems.
02:16:37.000 We must set up monitoring systems.
02:16:39.000 And it's relatively cheap, by the way, because now that we've done it repeatedly, we know how to do it.
02:16:46.000 And if we don't, we are essentially being controlled by big tech forever.
02:16:53.000 We've turned over our democracy.
02:16:56.000 We've turned over...
02:16:59.000 Our children, we've turned over literally our minds.
02:17:03.000 We've turned them over to tech companies and algorithms.
02:17:12.000 I think that's insane.
02:17:14.000 It is insane.
02:17:15.000 And where does it go?
02:17:18.000 How bad can this get?
02:17:20.000 Yeah, but look, we got Google, with the help of some senators, we got Google to stay out of Georgia.
02:17:25.000 To me, that's a wake-up call that says, wait a minute, we know not only how to track these companies, but we can stop them.
02:17:36.000 Have you ever had a conversation with anybody from Google?
02:17:40.000 Well, Ray Kurzweil's an old friend of mine. His wife, Sonia, I was on the board of her school for autistic kids for 15 years.
02:17:48.000 I mean, I went to their daughter's bat mitzvah, they came to my son's bar mitzvah, etc., etc.
02:17:52.000 But he won't talk to me now.
02:17:55.000 He won't talk to you?
02:17:56.000 He's head of engineering at Google.
02:17:57.000 So he won't talk to you now because he's not allowed to?
02:18:00.000 Do you know why he won't talk to you?
02:18:02.000 I don't know.
02:18:03.000 And even Sonia won't talk to me now.
02:18:06.000 And I've never had any conflict with either, ever, ever, ever, going back, I don't know, 20 years.
02:18:12.000 Never, never.
02:18:13.000 I mean, they're lovely people.
02:18:15.000 They're very nice people.
02:18:16.000 I know their kids and, you know, neither of them now will talk to me.
02:18:19.000 Just because he is an executive at Google.
02:18:22.000 He's an executive at Google.
02:18:23.000 I was supposed to be on a panel with another top executive at Google who used to be a professor at like Stanford or some big school.
02:18:32.000 I'm supposed to be on a panel with him in Germany, and when he found out what it is I do, he pulled out.
02:18:42.000 He did not show up.
02:18:43.000 There were a thousand people in that audience who came to see him, the Google guy, not me.
02:18:50.000 He didn't show up.
02:18:52.000 Wow.
02:18:55.000 They, I believe, I'm pretty darn sure, and this upset my wife, Misty, at the time, they sent a private investigator to our house.
02:19:11.000 Posing as someone who wanted to be a research intern.
02:19:15.000 What did he do when he was in the house?
02:19:18.000 I don't know.
02:19:20.000 Did he leave bugs?
02:19:21.000 I don't know.
02:19:22.000 I have no idea what he did.
02:19:24.000 But, you know, I was sitting there with a staff person.
02:19:27.000 We're asking the guy questions like we do for anyone who applies to work with us.
02:19:32.000 And the guy, first of all, he's wearing like a white shirt and a tie, which no one does in San Diego.
02:19:38.000 But we're asking him questions, and his answers didn't make any sense at all.
02:19:43.000 Well, you know, I said, so you're interested at some point in going to graduate school in psychology?
02:19:49.000 And he goes, graduate school?
02:19:52.000 Psychology?
02:19:55.000 I don't know.
02:19:57.000 So none of this made sense.
02:19:58.000 We looked the guy up afterwards.
02:20:02.000 He was supposed to get back to us.
02:20:03.000 He didn't.
02:20:04.000 We looked him up.
02:20:05.000 He worked for a private investigation firm.
02:20:09.000 Now, why do I think Google sent him?
02:20:11.000 Because I had written to that executive at Google, who was supposed to be on that panel in Germany, and just telling him about my work, giving him links and so on, because he's a former professor.
02:20:26.000 It was only a few days after that that this guy showed up at our house.
02:20:31.000 And then it was a few days after that that the Google executive pulled out of that conference.
02:20:37.000 Jesus.
02:20:40.000 And so they're not interested in communicating with you.
02:20:45.000 They've obviously either told people not to communicate with you or the people that you would like to talk to are aware of your work and they feel that it would negatively impact their job or their career.
02:21:02.000 I'm telling you, they're...
02:21:06.000 This has just been, for me, in many ways, a nightmare, an absolute nightmare, because there are people who won't help us, who won't serve on our board, who won't do this, who won't do that.
02:21:18.000 We had an intern lined up who was very, very good.
02:21:22.000 You know, we get some really sharp people.
02:21:23.000 They come from all over the world, actually.
02:21:26.000 And we had this person all signed up, her start date was set up, and she called up and she said, I can't do the internship.
02:21:34.000 I said, why not?
02:21:36.000 My grandmother looked you up online and she thinks that you're like some sort of Trump supporter.
02:21:46.000 And she said she'll cut me off if I do this internship.
02:21:51.000 Jesus.
02:21:53.000 So that's one of the reasons I keep repeating.
02:21:56.000 I've done it four times, but I keep repeating, you know what?
02:22:00.000 You lean left.
02:22:03.000 Yeah, because...
02:22:06.000 But it doesn't help.
02:22:07.000 It doesn't help.
02:22:08.000 They don't care.
02:22:09.000 Well, ever since I testified, terrible things have happened.
02:22:17.000 One of my board members said to me, look, he said...
02:22:21.000 He said, in a way, you should be grateful and pleased that they left you alone for so many years.
02:22:28.000 He said, but that for them was, you know, that was it.
02:22:31.000 That was the final straw.
02:22:33.000 And, you know, what happened after that hearing was Trump tweeted about my testimony.
02:22:41.000 Hillary Clinton, whom I've been supporting forever, Hillary Clinton replies to Trump on Twitter and says, this man's work has been completely discredited.
02:22:53.000 It's all based on data from 21 undecided voters.
02:22:57.000 What?
02:22:59.000 Then...
02:23:00.000 She said that?
02:23:01.000 Yeah.
02:23:01.000 Can you sue her?
02:23:02.000 I could have, but it would take me away from the research.
02:23:11.000 It would cost a fortune.
02:23:13.000 Yes, I could have, and probably still could sue, yes.
02:23:17.000 Because that's a factual statement, which is false and defamatory.
02:23:22.000 But I have to try to stay focused.
02:23:24.000 I really do.
02:23:25.000 I understand.
02:23:26.000 I keep getting pulled away.
02:23:28.000 And believe me, what's happening right now in our work is tremendously exciting.
02:23:34.000 And everyone loves what we're doing.
02:23:38.000 We love each other.
02:23:39.000 We love the whole thing that we're doing.
02:23:42.000 We love the discoveries.
02:23:43.000 We're blown away over and over again by the numbers.
02:23:48.000 And we have very ambitious plans moving forward.
02:23:55.000 So, I mean, as long as I can still function, I'm going to keep doing this.
02:24:06.000 I mean, this is...
02:24:08.000 It's important.
02:24:10.000 I have five kids.
02:24:12.000 Someday I hope I'm going to have grandkids.
02:24:14.000 It's important for the world right now.
02:24:18.000 It's important for our democracy, which as far as I'm concerned is an illusion.
02:24:22.000 It's an illusion.
02:24:23.000 When you look at the numbers, you realize...
02:24:27.000 No.
02:24:28.000 There's a player in here that you don't see that doesn't leave a paper trail and that can shift millions of votes.
02:24:42.000 And if it didn't exist and someone introduced it...
02:24:47.000 Nefarious political parties or nefarious people would 100% be excited about it.
02:24:52.000 Like, look what we have now.
02:24:54.000 Yeah, yeah, yeah, yeah.
02:24:55.000 And then if we found out that someone who was, like, say, if Donald Trump, you know, if the Democrats found out that Donald Trump had implemented some sort of a system, like you're talking about, people would be furious.
02:25:09.000 That's right.
02:25:09.000 They would say, he is a threat to democracy.
02:25:12.000 He should be locked up.
02:25:13.000 He should be in prison for treason.
02:25:15.000 Mm-hmm.
02:25:17.000 Does it concern you that you're the only one?
02:25:21.000 Well, I don't understand that because this is really good science.
02:25:24.000 I mean, in other words, the work I do has been published in top journals.
02:25:30.000 That initial SEME paper, that was published in the Proceedings of the National Academy of Sciences.
02:25:34.000 It has since been downloaded or accessed from the National Academy of Sciences more than 100,000 times.
02:25:41.000 For a very technical, scientific paper, that's practically unheard of.
02:25:47.000 I've never had that happen before.
02:25:49.000 And the papers that I have coming out, they're in top journals.
02:25:53.000 We're submitting more in top journals.
02:25:55.000 This is good science.
02:25:56.000 So why aren't 20 universities doing this stuff?
02:26:01.000 You know why?
02:26:02.000 Because they're getting funding from Google or they're terrified of Google.
02:26:07.000 The head of Europe's largest publishing conglomerate, his name is Dopfner, published a piece a few years ago that's actually called Fear of Google.
02:26:17.000 It's a superb piece.
02:26:18.000 It's about how, in a lot of industries right now, you cannot make a move without taking into account how Google's going to react.
02:26:31.000 I want to Google Fear of Google.
02:26:34.000 Yeah, Google Fear of Google.
02:26:35.000 Let's see what happens here.
02:26:36.000 Yeah.
02:26:37.000 See if you get Dopfner.
02:26:40.000 Fear of...
02:26:41.000 What's it suggest?
02:26:43.000 Fear of rain?
02:26:44.000 Fear of God?
02:26:45.000 G-O-O? Fear of Google?
02:26:49.000 Here we go.
02:26:49.000 Yeah.
02:26:50.000 What does it say?
02:26:51.000 Who's afraid of Google?
02:26:53.000 Everyone.
02:26:54.000 Wired Magazine.
02:26:55.000 Why we fear Google?
02:26:58.000 That's some other website that I don't know.
02:27:03.000 What Americans fear most according to their Google searches.
02:27:08.000 Why are people afraid of Google?
02:27:11.000 Quora.
02:27:16.000 Well, don't forget, whatever you're getting has to do with your history.
02:27:21.000 So someone else is going to get a different list.
02:27:23.000 And that's scary because, you know, ask a con artist.
02:27:29.000 Do you know any con artists?
02:27:30.000 Because I've met one or two.
02:27:33.000 You ask a con artist and they will tell you straight out, if you want to do a con, the more you know about someone, the easier it is.
02:27:41.000 For sure.
02:27:42.000 So that's the problem there, is that everything is personalized and everything you're seeing there is based on you and your 20-plus year history and the 3 million pages of information they have about you.
02:27:53.000 They build digital models of all of us.
02:27:57.000 Do you use social media?
02:28:01.000 I've been trying to close my Facebook page for I think at least three years now.
02:28:07.000 They won't let me close it.
02:28:09.000 They won't let me change it.
02:28:11.000 It's still up there.
02:28:13.000 I didn't even set it up originally.
02:28:14.000 I think it was Misty, my wife, who set it up.
02:28:17.000 But they won't let me touch it.
02:28:19.000 And they won't let me close it.
02:28:21.000 Speaking of which, okay, I'm sitting next to a guy on an airplane the other day, and he's saying how he's very proud that he doesn't use any social media.
02:28:36.000 I said, so, wait, you mean you don't have a Facebook page?
02:28:40.000 And he goes, oh, no.
02:28:41.000 Positively, I do not have a Facebook page.
02:28:43.000 I said, you have a Facebook page.
02:28:45.000 He goes, no.
02:28:46.000 What are you telling me?
02:28:47.000 He says, I know.
02:28:47.000 I don't have a Facebook page.
02:28:48.000 I would know if I had a Facebook page.
02:28:49.000 I said, no, you don't understand.
02:28:52.000 Every time someone mentions you on Facebook or posts a photo in which you appear, that goes into your Facebook profile.
02:29:04.000 You have a big, an enormous Facebook profile, except that you can't see it.
02:29:12.000 And what did he say to that?
02:29:15.000 Well, I said it in a way I guess that was pretty convincing, and he was upset.
02:29:23.000 He didn't like the concept that he might have a Facebook profile that he doesn't know about.
02:29:28.000 That you can't opt out.
02:29:30.000 Well, it's not only that, but even when you think you're...
02:29:33.000 I mean, Google...
02:29:34.000 God, do they ever say anything truthful publicly?
02:29:37.000 That's a big question.
02:29:38.000 But, I mean, Google claims, for example, you can delete your Google data.
02:29:45.000 You can go through the motions of saying, I want to delete my Google data, and then from that point on, you can't see your Google data.
02:29:51.000 But they don't delete it.
02:29:54.000 They never delete it.
02:29:55.000 Even if they deleted it on one server, it's sitting there and backup after backup after backup.
02:30:02.000 And not only that, if you read, I think I'm the only one who reads these things, but if you read Google's Terms of Service and Google's Privacy Policy...
02:30:11.000 It says right in there, we reserve the right to hold on to your data as we might be required by law or in any other way that protects Google.
02:30:28.000 Now what about Twitter, Instagram, things like that?
02:30:36.000 Instagram and Facebook are the same entity, right?
02:30:39.000 Instagram, yeah.
02:30:41.000 And yeah, Instagram is part of Facebook.
02:30:45.000 So here my main concern is, again, is this power to suppress.
02:30:54.000 So I don't know what your opinion is.
02:30:58.000 In fact, I'd love to know what your opinion is of what happened early 2021, I think it was, when both Facebook and Twitter shut down Donald Trump.
02:31:09.000 What do you think of that?
02:31:10.000 I don't think they should be shutting down people at all.
02:31:13.000 And by the way, he was still president.
02:31:15.000 Yeah.
02:31:15.000 I think that what these things are, I think- We're at a time in history where you can't look at them as just private companies because the ability to express yourself is severely limited if you're not in those platforms.
02:31:33.000 I think they should be looked at like utilities, and I think they should be subject to the freedoms that are in our Constitution and the Bill of Rights. The way the First Amendment protects free speech, speech should be protected on social media platforms, because as long as you're not threatening someone or doxing someone or putting someone in harm's way or lying about them,
02:31:56.000 I think your ability to express yourself is a gigantic part of us trying to figure out the truth.
02:32:03.000 Like when it comes to what are people's honest opinions about things?
02:32:08.000 Do we know?
02:32:09.000 You know, we don't know if honest opinions are suppressed.
02:32:13.000 Because they don't match up to someone's ideology. I think it's a critical aspect of what it means to be American to be able to express yourself freely, and to find out how other people think is educational. If you only exist in an echo chamber and you only hear the opinions expressed by people that align with a certain ideology,
02:32:38.000 that's not free speech.
02:32:40.000 I think free speech is critical.
02:32:42.000 I think the answer to bad speech, and this is not my thought, many brilliant people believe this, is better speech: more thought, more convincing arguments, more logical, sustained reasoning and debate and discussion.
02:33:00.000 And I think as soon as they start suppressing ideas, as soon as they start suppressing and deleting YouTube videos and banning people from Twitter for things that have now been proven to be true, right?
02:33:15.000 There's a lot of people that were banned because they questioned the lab leak theory.
02:33:18.000 Mm-hmm.
02:33:19.000 You know, their videos were pulled down.
02:33:21.000 They were, you know, they were suspended from Twitter.
02:33:25.000 Now that's cover of Newsweek.
02:33:27.000 It's constantly being discussed.
02:33:29.000 It's discussed in the Senate.
02:33:31.000 Well, this is a very old idea.
02:33:32.000 The way Voltaire said it, I'm paraphrasing, is, you know, I may not agree with what you say, but I will defend to the death your right to say it.
02:33:41.000 Yeah.
02:33:42.000 And, you know, I think it was dead wrong.
02:33:45.000 I mean, I was happy, of course, that this happened, but I think it was dead wrong for Twitter and Facebook to literally cut off communication between the current president of the United States who's still in office and his supporters.
02:34:02.000 Yeah.
02:34:02.000 And the real question, too, is how much manipulation was being done by federal agents in the January 6th event. Like, did they engineer
02:34:18.000 people going into the Capitol? Did they encourage them? And you saw that Ted Cruz conversation with the woman from the FBI, where she said, I can't answer that. Did the FBI incite violence? I can't answer that.
02:34:30.000 You can't answer that? That should never be the answer. Would the FBI incite violence?
02:34:59.000 Five?
02:35:01.000 Something like that?
02:35:03.000 Multiple parts.
02:35:04.000 And it's great.
02:35:05.000 And you realize, like, how easily manipulated some of these poor folks are.
02:35:10.000 They get involved in these movements.
02:35:12.000 Now, if somebody wanted to disparage a political party or to maybe have some sort of a justification for getting some influential person like Donald Trump offline, that would be the way they would do it.
02:35:28.000 That's...
02:35:29.000 Yeah.
02:35:30.000 Yeah.
02:35:30.000 Say, look, he's responsible for violence.
02:35:32.000 He's responsible for...
02:35:33.000 Look, this is as bad as Pearl Harbor.
02:35:35.000 This is as bad as D-Day.
02:35:38.000 But, you know, the bottom line here really goes back to George Orwell, which is, you know, if you control information, you control everything.
02:35:47.000 Yes.
02:35:47.000 And what we've done is we have lost control...
02:35:55.000 We used to have authorities, gatekeepers, well-trained journalists, let's say.
02:36:02.000 We've lost control over information.
02:36:05.000 Information is now largely in the hands of algorithms which are controlled by executives who are not accountable to the American public.
02:36:14.000 They're accountable just to their shareholders.
02:36:18.000 So, you know, I just think we're in a terrible position.
02:36:21.000 I'm going to, you know, you asked me this before, but I'm going to continue.
02:36:25.000 I'm going to do my research.
02:36:27.000 I'm going to keep digging.
02:36:29.000 I'm going to do my monitoring.
02:36:30.000 I'm going to try to set up, I hope, this year, this...
02:36:39.000 It says tamebigtech.com.
02:36:43.000 That's your website?
02:36:44.000 Could you say it again?
02:36:46.000 tamebigtech.com.
02:36:48.000 Well done.
02:36:48.000 Well done.
02:36:48.000 Thank you.
02:36:49.000 Yes, because I need help from people.
02:36:53.000 I need people to provide funds, but also to help us find funds.
02:37:00.000 This is the year where I think we should set up this first large-scale nationwide monitoring system, which could be used not only to keep an eye on these midterm elections, but we could finally start to look at our kids.
02:37:15.000 That's become my main concern now, is our kids.
02:37:19.000 Because we don't know.
02:37:20.000 We don't understand what the hell they're doing.
02:37:22.000 We don't know what they're looking at, what they're listening to.
02:37:26.000 But I can tell you for sure that a lot of what's happening is really being done very deliberately and strategically by the big tech companies.
02:37:37.000 Because they have control over the information that everyone has access to, and they're going to do what's best for them, what makes them the most money, what spreads their values, and, of course, sometimes what's good for intelligence purposes.
02:37:52.000 They're going to do those things, and we have no idea what they're doing unless we track them.
02:38:02.000 So anyway, that's my fantasy.
02:38:04.000 This is the year where we're going to get this thing running in a way that it would be self-maintaining.
02:38:10.000 So it would continue year after year after year.
02:38:14.000 Not optional.
02:38:15.000 I've said that before.
02:38:16.000 It's not optional.
02:38:19.000 So if people go to tamebigtech.com, they can get more information.
02:38:25.000 I actually created a special booklet that we're going to give out for free.
02:38:30.000 I had a copy to bring you, and I left it in my car.
02:38:34.000 But we have this booklet.
02:38:37.000 I took my congressional testimony, I updated it, I expanded it, and I turned it into an essay, which is called Google's Triple Threat.
02:38:49.000 To democracy, our children, and our minds.
02:38:53.000 And it says right on it, prepared for Joe Rogan's whatever.
02:39:02.000 And I am doing this with the help of all my wonderful teammates.
02:39:09.000 I am so far still the only one.
02:39:14.000 And that's disgusting.
02:39:16.000 That's horrible.
02:39:17.000 There's something fundamentally wrong with that picture.
02:39:22.000 Imagine if you didn't exist.
02:39:24.000 Like, if you never had started this.
02:39:26.000 Would we be completely in the dark about this stuff?
02:39:29.000 You would be completely in the dark because there's no one doing these kinds of experiments and there's no one collecting all that.
02:39:40.000 But when you think about the internet and how many people on the internet are, you know, interested in politics and interested in the influence of big tech and the dangers of big tech.
02:39:51.000 When they talk about psychological dangers, like Jonathan Haidt's work with young girls and self-harm and suicide and the rise of depression amongst young people, you would think that this would also be something that people would investigate and dig into.
02:40:07.000 The fact that you're the only one, it's very strange.
02:40:11.000 It's a tremendous responsibility.
02:40:14.000 It's horrible.
02:40:15.000 I don't like the responsibility.
02:40:20.000 I'm gone at the moment, so we have two cats in our office, and I'm the poop cleaner.
02:40:30.000 So, when I'm gone, that means someone else has to clean the poop.
02:40:34.000 So I said to my associate director last night, I said, just remember that the more credentials you get, the more responsibilities you get, the more poop you're going to have to clean.
02:40:47.000 And that's the truth.
02:40:49.000 So it's very tough.
02:40:52.000 I don't like being in this position and I do wonder about Misty.
02:41:00.000 I'll probably always wonder about Misty and I'll never know because, again, her truck disappeared.
02:41:10.000 Well, listen, man, thank you very much for coming here, and thanks for taking your time.
02:41:14.000 I know you're very busy, and I'm glad we could get this out there.
02:41:22.000 I don't know what to say.
02:41:23.000 I mean, you've given me a great opportunity.
02:41:25.000 I hope this was, you know, interesting for you.
02:41:29.000 It was.
02:41:30.000 Okay.
02:41:30.000 Scary.
02:41:32.000 Good.
02:41:33.000 Then if I scared you, I'm doing my job.
02:41:34.000 Yeah, you scared the shit out of me.
02:41:35.000 Good.
02:41:38.000 All right.
02:41:39.000 Thank you, Robert.
02:41:40.000 Thank you, Joe.
02:41:41.000 Bye, everybody.