In this episode of the Joe Rogan Experience podcast, Joe sits down with an author and researcher to discuss how Google uses your data to monitor and manipulate opinions, purchases, and votes; why he avoids Google's search engine and other surveillance products like Amazon Alexa, Google Home, and Android phones; and what his election-monitoring research has uncovered about biased search results. If you're interested in digital surveillance and how big tech shapes public opinion, this is the episode for you.
00:00:14.000This is a very interesting subject, because I think search engine results have always been something people take for granted, assuming the search engine is going to give you the most significant results at the top,
00:00:31.000and they don't really think about the fact that this is kind of curated.
00:00:35.000And, you know, we've found this many times because we use two different search engines.
00:00:38.000We'll use Google, and then we'll say, well, if we can't find it on Google, use DuckDuckGo.
00:00:43.000And oftentimes, when you're looking for something very specific, you'll find that you can't find it on Google.
00:00:49.000Like, if it's in there, it's deep, deep, deep, you know, many pages in.
00:00:53.000Whereas DuckDuckGo will give you the relevant search results very quickly.
00:00:59.000So something's going on with search engines.
00:01:02.000And from your research, what you have found is that it can significantly affect the results of elections.
00:02:32.000It's also used for manipulation because they discovered quite a few years ago that if they control the ranking of the search results, they can control people's opinions, purchases, votes.
00:02:49.000They can't control everyone's opinions because a lot of people already have strong opinions.
00:02:54.000So the people they're going after are the people who are undecided, the people who are vulnerable, and they know exactly who those people are.
00:03:02.000And literally, your mind is putty in their hands.
00:03:12.000So you should never, ever use Google or any other S&M product. Amazon Alexa is an S&M product, so is the Google Home device, and so are Android phones.
00:03:44.000There have been court cases in which the recordings have been subpoenaed from whoever's controlling that so-called personal assistant or that device.
00:03:58.000And courts have recovered recordings and transcripts when people are not even aware that they're being monitored.
00:04:06.000I know that's the case with Alexa, right?
00:04:09.000But that's the case with Android phones as well?
00:04:11.000Yes. In fact, with Android phones, the equipment to prove this, which I didn't bring, is so cheap now that literally anyone can confirm it.
00:04:23.000Android phones, even if they're disconnected from your mobile service provider, even if you pull out the SIM card, okay, as long as the power is on, it is recording and tracking every single thing that you do.
00:04:39.000So if you use it to read things, if you use it to listen to music, you use it to shop, whatever it is, and of course your location is always tracked.
00:04:50.000Then when you go back online, the moment you're reconnected, it uploads all that information.
00:04:57.000So some people wonder why their batteries run down, sometimes even when you're not really doing anything with your phone.
00:05:04.000That's because with Android phones, I think it's 50 to 60 times per hour, it's uploading.
00:05:14.000It's uploading about 12 megabytes of data per hour.
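To put those two figures together, here is a quick back-of-the-envelope calculation in Python using only the numbers quoted above (50 to 60 uploads per hour, roughly 12 megabytes per hour). The per-upload payload size and the monthly total are implied estimates derived from those claims, not measurements from the show.

```python
# Back-of-the-envelope estimate from the figures quoted in the conversation.
# These inputs are the speaker's claims, not measured values.
uploads_per_hour_low, uploads_per_hour_high = 50, 60
megabytes_per_hour = 12

# Fewer uploads per hour implies a larger payload per upload, and vice versa.
per_upload_high_kb = megabytes_per_hour / uploads_per_hour_low * 1024
per_upload_low_kb = megabytes_per_hour / uploads_per_hour_high * 1024

daily_mb = megabytes_per_hour * 24

print(f"Implied payload per upload: {per_upload_low_kb:.0f}-{per_upload_high_kb:.0f} KB")
print(f"Implied volume: {daily_mb} MB/day, ~{daily_mb * 30 / 1024:.1f} GB/month")
```

Run as written, this prints an implied payload of roughly 205 to 246 KB per upload and about 8.4 GB per month, if the quoted rates were sustained around the clock.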
00:05:34.000It's my friend Adam Curry, who's the original Podfather.
00:05:38.000He's the guy who invented podcasting, and his company develops these de-Googled phones where they take out all the tracking stuff, everything, and it's basically running on a different operating system.
00:07:18.000If you want to get them in a paid form so that you're not being tracked, we're talking $10 to $15 a month.
00:07:26.000Literally all of those so-called free services that are really, again, these S&M services, all of them together are worth $10 or $15 a month.
00:07:39.000And how do you use your phone, though?
00:07:42.000If you want to have a search engine, are you using a different search engine?
00:08:53.000And then I will go over, occasionally I'll go over to Firefox, because Firefox was actually developed by a guy named Brendan Eich, who might be really interesting for you to talk to, by the way.
00:09:08.000And then he left Mozilla, which was the nonprofit organization that developed Firefox.
00:09:15.000By the way, the connection between Firefox and Google, don't even get me started.
00:09:39.000So when did you first become interested in digital surveillance and privacy and like what you're giving up by using these free services like Google?
00:10:04.000But on January 1st of the year 2012, I got, I don't know, eight or nine messages from Google telling me that my website had been hacked and that they were blocking access.
00:10:20.000So I thought, the first thing I thought was, why am I getting these notices from Google?
00:10:26.000Who made Google the sheriff of the Internet?
00:10:29.000Why isn't this coming from the government?
00:10:31.000Why isn't it coming from some nonprofit organization?
00:10:37.000And then, because I'm a coder, I've been a programmer since I was a teenager, I started wondering, wait a minute, okay, they're blocking me on the Google search engine.
00:11:53.000I explained Google can literally block access...
00:11:57.000on multiple platforms that aren't even theirs. They can block access to any website. And Google, at one point in time, 2009, I think it was, I don't know, I might get the date wrong.
00:12:10.000Let's just say January, whatever, 30th.
00:12:14.000Google blocked access to the entire internet for 40 minutes.
00:12:58.000So what's happening with their system is because so many people are searching for things, because they're monitoring so many different things to add to their search engine, do they have some sort of ultimate control over the internet in some weird way?
00:13:25.000Google placed the internet on a blacklist today after a mistake caused every site in the search engine's result page to be marked as potentially harmful and dangerous.
00:14:06.000And geeks for fun, okay, sometimes for profit, but most of the time it's just for fun, just to be cool and get their kicks and show how powerful they are.
00:17:17.000And the answer, let's say, has a bias.
00:17:20.000In other words, it favors, you know, one candidate, favors one party, favors one cause, right?
00:17:28.000A question-and-answer interaction on Alexa, in a group of, say, 100 undecided people, can shift opinions by 40% or more. One interaction.
00:17:41.000If there are multiple questions asked on that topic over time, you can get shifts of 65% or more, with no one having the slightest idea that they have been manipulated.
00:17:53.000But are they doing it to manipulate you or is it just the fact that they distribute this information based on their algorithm?
00:18:01.000It's manipulating you just by default because the higher or more likely you find this information from the search engine, like that's what's going to influence your opinion.
00:18:13.000But are they doing it to influence your opinion or is that just the best answer?
00:18:26.000So if I ask that to Alexa and then it pulls up these results, it's going to pull up supposedly the most relevant result.
00:18:35.000Now, if you have something like Alexa where you're asking a question and it's just reading it back to you, there has to be some sort of curation of that information, right?
00:19:01.000My two eldest sons are like your biggest fans in the universe.
00:19:05.000My eldest son is technically a bigger fan than the other son because he's recently gained 60 pounds because of COVID. So he's definitely the bigger of the two fans.
00:20:03.000Now I'm going to say something that's not so nice, which is, on this issue, by the questions you're asking me, I can tell you have no idea what's going on.
00:20:16.000Well, I kind of do, but you have to understand the way I do a show.
00:20:19.000One of the things that I do when I want you to elaborate on information, it's like, maybe I know something.
00:20:24.000But I want you to elaborate to everybody else that's listening.
00:21:36.000A few years ago, you probably heard that Google got busted because their Street View vehicles were driving up and down.
00:21:44.000They're still driving up and down streets all over the world, but at that point they had been doing it, in more than 30 countries, for more than four years.
00:21:52.000And they weren't just taking pictures of our houses and our businesses.
00:24:02.000He just couldn't stand it anymore and he quit.
00:24:04.000But unlike most of these people who've walked away, he brought with him 950 pages of documents and a video.
00:24:11.000The video is two minutes long and it shows the CEO of YouTube, which is owned by Google, her name is Susan Wojcicki, and she's talking to her staff.
00:24:23.000And she's explaining, this is 2017 after the horrible election results of 2016, and she's explaining how they're altering the up-next algorithm in YouTube to push up content that they think is legitimate and to suppress content that they think is not legitimate.
00:24:45.000So if it's happening at that level, the executive level, again, it still has the same effect.
00:24:52.000Any of these possibilities, and there are others as well, ends up giving us content that impacts us and our kids especially in ways that people are entirely unaware of.
00:25:41.000Well, that would take a whistleblower to figure that one out.
00:25:48.000It was in the news at one point that the guy who was in charge of making these decisions, he actually has left Google, he once shut down an entire domain that had 11 million websites on it because he thought it was kind of poor quality.
00:26:42.000So it's up to the discretion of the engineer?
00:26:47.000There's a lot of discretion involved in making these decisions, and a lot of the decisions that get made in very recent years, since Trump was elected, they happen to be decisions, for the most part, that suppress conservative content,
00:28:53.000This is what happened that day, or during that time period: when you searched something on Google and clicked a result, you would get this.
00:29:01.000Warning, visiting this website may harm your computer.
00:29:05.000I think maybe you could continue through, like you can now.
00:29:09.000It happens from time to time now for strange reasons.
00:29:22.000Safari, before they take you anywhere, they've got to check Google's blacklist.
00:29:25.000So not only is Google getting information about your search on Safari, the fact is if Google wants to block you from going there through Safari, they just add it to their blacklist.
00:29:38.000In other words, if they put everything on their blacklist, then no one can reach anything.
00:31:04.000In other words, Google is literally looking at billions of websites every day and it's looking for updates and changes and new websites and this and that.
00:31:13.000So it's crawling and it's extracting information, especially looking for links because that's how it gets you good information.
00:32:21.000It was already an inextricable part of day-to-day life that people were using that and that people were using Gmail and using all these services and just giving up their data.
00:34:29.000So let's say if Donald Trump runs again in 2024 and they have a Trump campaign website, Google can decide that that website is poor quality and deny people access to it, so that when people Google Donald Trump,
00:36:03.000How do you fight the most effective mind control machine that's ever been developed, which also is very rich, has $150 billion in the bank right now in cash, makes huge donations to political candidates,
00:36:20.000and then can shift votes, millions of votes nationwide, without anyone knowing that they're doing so?
00:37:43.000And he says, look, when I first joined Google, which was practically in the beginning, he said it was just a cool place and we were doing cool things and that was it, he said.
00:37:53.000And then he said, a few years later, he said we turned into an advertising company.
00:38:03.000This is a brutal, profit-driven ad company.
00:38:08.000Now, if you don't think of Google as an ad company, then, again, you're not getting it.
00:38:14.000They are the largest advertising company by a factor of 20. I think the next largest one is based in London.
00:38:21.000But what Google is doing is tricking you, tricking all of us.
00:38:27.000Well, not me personally, but it's tricking you into giving up personal information 24 hours a day, even when you don't know you're giving up personal information.
00:38:35.000And then it's monetizing the information mainly by connecting up vendors with potential buyers.
00:39:11.000There was like a thing, I'm trying to remember what they exactly did, because we were sort of like, oh my god, they said don't be evil, and now they don't say it anymore.
00:39:20.000Maybe they're evil, but I think they had added something and made it longer.
00:39:28.000And so it wasn't that it's not their slogan anymore, it's just their slogan sort of morphed.
00:41:40.000This is less than a month after this hearing.
00:41:44.000So he's got 950 pages of documents, all kinds of documents, and three of them are Google blacklists, which are actually labeled Blacklists!
00:41:56.000Now, if I were putting together blacklists at my company, I would call them shopping lists.
00:42:02.000I would call them, you know, I don't know, makeup lists, you know, lists for my kids' birthday presents.
00:42:36.000And they operate this way. Do you think they do this because... are they financially driven to put those people on blacklists?
00:42:45.000Is it maybe some, I mean, this is obviously speculation, but is it maybe some sort of a deal that they've made with certain politicians?
00:42:53.000Is it something they've decided on their own because this is the right thing to do to suppress the bad people that put Donald Trump into office?
00:43:48.000I could talk for hours on this issue because of recent leaks of videos, PowerPoint presentations, documents, and of course what whistleblowers have been revealing.
00:44:03.000They have very strong values there, because the founders had very strong values and they hired people who had similar values.
00:44:12.000And they want the world to have those values.
00:44:17.000They really think that their values are more valuable than other people's values, which means they don't understand what values are because...
00:44:26.000And so their values are pretty much the values of all of tech.
00:47:21.000But right from the very, very beginning, the Google search engine was set up to track and preserve search history.
00:47:33.000So in other words, to keep track of who's doing the search and where did they search, that is very, very important to this day for intelligence agencies.
00:47:45.000So Google, to this day, works very closely with intelligence agencies, not just in the U.S., but other agencies around the world.
00:48:17.000So Google has this ability that they've proclaimed, that they can sort of shift culture, direct the opinion of things, and direct public consciousness.
00:48:35.000How much of a percentage do you think they have in shifting?
00:49:06.000Because we have discovered a number of different tools that Google, and to a lesser extent other companies use, to shift thinking and behavior.
00:49:20.000And what we do in randomized controlled experiments, which are also counterbalanced and double-blind and all that stuff, we measure the ability that these tools have to shift thinking and behavior.
00:49:37.000And we pin it down to numbers, percentages, proportions.
00:49:41.000We can make predictions in an election about how many votes can be shifted if they're using this technique or these three techniques.
00:50:22.000Well, the first one we called SEME, the search engine manipulation effect.
00:50:26.000And that means they're either allowing, you know, one candidate or one party to rise to the top, you know, in search rankings, or they're making it happen.
00:50:35.000And you don't know for sure whether, you know, which is occurring unless there's a whistleblower or there's a leak.
00:50:43.000Okay, but the fact that it's occurring at all, that's important.
00:51:36.000We learned in controlled experiments that by manipulating the suggestions that are being flashed at people, we could turn a 50-50 split in a group of undecided voters into nearly a 90-10 split without anyone having the slightest idea that they're being manipulated.
00:52:00.000That's just by manipulating search suggestions.
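As a rough illustration of the kind of experiment being described, here is a minimal simulation sketch in Python. Every number in it, the group size, the position-based click weights, the size of the nudge a biased suggestion delivers, is an invented assumption chosen only to show the mechanics of a 50-50 group drifting toward 90-10; it is not the actual design, data, or code from the research.

```python
import random

random.seed(42)

def run_group(n_voters: int, biased: bool) -> float:
    """Simulate undecided voters exposed to a list of search suggestions.

    Hypothetical model: each voter starts at 0.5 probability of preferring
    candidate A. In the biased condition, the top suggestions favor A, and
    every pro-A suggestion a voter clicks nudges that probability upward.
    """
    # Position-weighted click probabilities: top suggestions draw most clicks.
    position_weights = [0.45, 0.25, 0.15, 0.10, 0.05]
    # In the biased list, the first three suggestions favor candidate A.
    pro_a = [True, True, True, False, False] if biased else [False] * 5

    votes_for_a = 0
    for _ in range(n_voters):
        p_prefers_a = 0.5
        for _ in range(10):  # ten searches over the course of the study
            slot = random.choices(range(5), weights=position_weights)[0]
            if pro_a[slot]:
                p_prefers_a = min(1.0, p_prefers_a + 0.05)  # assumed nudge size
        votes_for_a += random.random() < p_prefers_a
    return votes_for_a / n_voters

print(f"control group: {run_group(1000, biased=False):.0%} for candidate A")
print(f"biased group:  {run_group(1000, biased=True):.0%} for candidate A")
```

Under these made-up parameters, the control group hovers near 50% while the biased group lands around 90%, which is the shape of the effect described above, not a reproduction of it.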
00:52:03.000And the reason why we started that work is because in June of 2016, a news organization, a small news organization released a video which went viral on YouTube and then got blocked on YouTube.
00:53:11.000It would give you something like that.
00:53:13.000It would not give you something negative.
00:53:14.000So, for example, you type in Hillary Clinton is, and they showed this, you do it on Yahoo!, you do it on Bing, and you get Hillary Clinton is the devil.
00:54:43.000If you have a plate of sewage and you put a nice piece of Hershey's chocolate in the middle, it does not make the sewage look any more appetizing.
00:54:59.000Basically, if we allow one negative to pop up in a list and the rest are neutral or positive suggestions, that one negative for certain demographic groups can draw 10 to 15 times as many clicks as the other suggestions.
00:55:17.000So, one of the simplest ways to support a candidate or a cause is a simple look-up.
00:57:18.000When they first came up with search suggestions, actually one engineer there came up with this thing and it was cool.
00:57:26.000And it was an opt-in thing when it first came out.
00:57:28.000I think it was 2009. And it was cool and it was helpful because that was the idea initially.
00:57:37.000So, then over time, I think, you know, with a lot of these services, a lot of these, you know, these little gizmos, people figured out that, wait a minute, we can do things, you know, that maybe we didn't intend to in the beginning,
00:57:52.000but we can use these for specific purposes.
00:57:55.000So, anyway, at some point, or rather a couple years later, it was no longer opt-in.
00:58:04.000In fact, it was automatic and you can't opt out.
00:58:10.000And then you may remember there were always 10 items in the list initially.
00:58:16.000But then in 2010 or so, suddenly they dropped to four items.
00:58:25.000So in our experiments we actually figured out why they were showing four items and we went public with that information in 2017 and three weeks later Google went back to ten items.
00:58:40.000Because four is exactly, we know from the research, is exactly the number of search suggestions that allows you to maximize your control over people's searches.
00:58:54.000Because look, if the list is too long and you've got a negative in there...
01:00:39.000Well, it turns out everywhere in the world where Amazon does business, if you try to search for anything beginning with the letter A, and you type A, Google suggests Amazon.
01:02:21.000No, but I mean, they're not doing it specifically because it's me.
01:02:25.000If I was any other person that was maybe anonymous, but I also looked up those things...
01:02:35.000For most people, to answer your question, for most people, and folks out there, literally, pick up your phones, go to google.com, which, by the way, this is the last time you're ever going to use google.com, but just type in G and see what you see.
01:02:51.000Most people, if they're getting five suggestions, four out of the five will be for Google.
01:02:56.000So, the lesson there is if you're starting a new company, don't start...
01:03:03.000Don't name it with a G. Don't name it with a G, right.
01:03:53.000It was supposed to be like the public library.
01:03:56.000And it is possible, you see, you can set up a company like Brave that doesn't play these stupid games and doesn't fool you and it's not deceptive.
01:04:08.000This is the business model that Google invented.
01:04:12.000It's called the surveillance business model.
01:04:33.000Tim Cook, who's still the CEO of Apple, has publicly said, this is pretty recent, publicly said that this is a creepy business model and it should not be allowed.
01:04:45.000Well, that is one area where Apple deserves credit, right?
01:04:48.000That Apple has not taken up that same sort of net-like surveillance where they just kind of cast the net over everything you do and then sell it to advertisers.
01:04:58.000And you can opt out of certain things in terms of like allowing apps to track purchases or allowing apps to track your use on other devices or on other applications rather.
01:05:12.000I wish I could agree with you, but I can't, because the fact is Apple is still collecting all this information.
01:05:22.000They're doing the same things; it's just that at the moment, so far, under the leadership they have right now, they're not exploiting it the same way. But that can change in a heartbeat.
01:06:00.000Early 2016, Google and Microsoft signed a secret pact.
01:06:06.000So the fact that the pact was signed, that somehow leaked.
01:06:12.000But to this day, no one knows the details of what's in it, except here's what happened.
01:06:18.000Simultaneously, both companies around the world dropped all complaints against each other.
01:06:24.000Google, excuse me, Microsoft withdrew all of its funding from all the organizations it had been supporting.
01:06:32.000And there are some people who believe, because Bing, Microsoft's search engine, which draws about 2% of search, by the way, it's no Google, it had been bleeding money for Microsoft for years,
01:07:12.000These new operating systems are so aggressive in tracking that it's very, even if you're a tech geek like me, it's very, very hard to get rid of all the tracking.
01:07:26.000So I'm still using Windows 8.1, believe it or not, or Windows 7. Why didn't you switch to Linux or Unix or something like that?
01:07:35.000Well, we use that for certain purposes as well, but for general stuff that you do, if you're using desktops and laptops, Windows is still the way to go, except the company shifted.
01:07:48.000It has been shifting towards the surveillance business model, as thousands of other companies have, including Verizon.
01:08:10.000And the real issue here seems to be that this wasn't a thing 20 years ago.
01:08:16.000It's a thing now, and it's the most dominant thing in terms of the way people access information, the way people get data, the way people find answers.
01:08:24.000What is it going to be in 20 years from now?
01:08:27.000I mean, it seems like there's so much potential for control and so much potential for manipulation and that it could only just get worse.
01:08:37.000If there's no regulation put in place and there's no way to stop use of algorithms, use of curated data, what is this going to be like?
01:09:27.000And everyone always points to certain language from his speech.
01:09:30.000This is his retirement speech, his last speech just a few days before John F. Kennedy became president.
01:09:37.000And it was a very shocking speech because this is a guy who was head of Allied forces in World War II. This is a, you know, I don't know, four-star general.
01:09:49.000And in this speech, he says, you know what, this terrible kind of...
01:09:55.000This entity has begun to emerge, you know, and I've watched it.
01:10:00.000And he called it the military-industrial complex.
01:10:03.000And you probably remember hippies like, you know, with signs and screaming, no military-industrial complex.
01:10:09.000And Eisenhower actually warned about the growth of this military-industrial complex and how it's taking over businesses and it's affecting the government and blah, blah, blah.
01:10:20.000What people fail to note is that he also warned in the same speech about the rise of a technological elite that could control public policy without anyone knowing.
01:11:11.000And Google is by far the most aggressive, the most dangerous.
01:11:18.000You know, Facebook, there's chaos within Facebook, but, you know, we had this amazing leak from Frances Haugen just recently, of documents showing that, you know, people at Facebook are very much aware that their social platform creates turmoil,
01:11:36.000terrible turmoil on a massive scale, and that they like that.
01:11:41.000They encourage that because the more turmoil, the more traffic, the more traffic, the more money.
01:11:49.000But knowing that you're creating turmoil... Here's my thought on that.
01:11:59.000Because you were saying before about the negativity bias, that people gravitate towards things that are negative.
01:12:05.000And that's one of the things that you'll find if you use YouTube.
01:12:12.000When you go on YouTube, if you're a person who likes to get upset at things and you're a person who likes to...
01:12:18.000Look for things that are disturbing or upsetting or political arguments, whatever.
01:12:24.000You'll get those in your suggestions over and over and over again.
01:12:27.000But if you're not interested in that, if you're only interested in airplanes and you start Googling airplanes or cars or watches, that's what it'll suggest to you.
01:12:37.000It doesn't have to suggest to you negativity.
01:12:40.000You gravitate towards that, naturally.
01:12:44.000And so the algorithm represents what you're actually interested in.
01:12:49.000So is it Facebook's fault that everyone, not everyone, most people generally interact more with things that are negative or things that upset them?
01:13:01.000That's not their fault, but it is their fault that they take advantage of that to manipulate people.
01:13:07.000But their business model is to keep people engaged by giving them content that makes them stay engaged, click on links, read more, and spend more time on the platform. And the only thing it's doing is highlighting what you're actually interested in.
01:14:12.000We're using real content from YouTube, real videos from YouTube, all the titles, everything comes from YouTube, except we have control over the ordering, and we have control over the up-next algorithm.
01:14:29.000That's where the power lies, the up-next algorithm.
01:14:33.000So one of the things we learned recently, not from Frances Haugen, but from someone else who left Facebook, is that 70% of the videos that people watch on YouTube now around the world are suggested by YouTube's up-next algorithm.
01:15:32.000Okay, we never do the, you know, subject pool at the university where you get, you know, 50 students from your college to be your research subjects.
01:16:22.000Yeah, but the internet, you see, the internet, though, because there are no regulations and rules, it does allow for some pretty evil things to take place.
01:16:32.000And the fact is, in our experiments, we do these, usually our experiments have hundreds of people in them.
01:16:39.000Sometimes they have thousands of people.
01:16:42.000And we can fuck with people and they have absolutely no idea.
01:16:50.000I'll tell you about something new, okay?
01:16:55.000Thank God I'm not talking about Google this time.
01:16:58.000I'm just talking about something else that's happening.
01:17:01.000There are websites that will help you make up your mind about something.
01:17:07.000So, for example, there's a whole bunch of them right now that'll help you decide whether you're really a Democrat or you're really a Republican.
01:17:15.000And the way they do that is they give you a quiz.
01:17:18.000And based on your answers to how you feel about abortion and immigration and this and that, at the end of the quiz, they say, oh, you are definitely a Republican.
01:17:28.000Sign up here if you want to join the Republican Party.
01:17:34.000And the research we do on this is called OME, the opinion matching effect.
01:17:40.000And there are hundreds of websites like this.
01:17:43.000And when you get near an election, a lot more of them turn up because the Washington Post will give you a quiz and help you decide who to vote for.
01:17:52.000And Tinder, Tinder, okay, which is used for sexual hookups.
01:18:05.000So Tinder actually set up a swipe the vote.
01:18:09.000Option on Tinder during the 2016 election, you swipe left if you think this, you swipe right if you think that, and then at the end of it, they say, oh, you should be voting for Hillary Clinton.
01:18:21.000But how do you know when one of these websites is helping you make up your mind?
01:18:28.000How do you know whether the algorithm is paying any attention to your answers at all?
01:19:01.000Sometimes these websites are not paying any attention to your answers.
01:19:07.000They're just telling you what they want to tell you, and they're using this quiz to suck you in, and then they add in—oh, this we love—they add in a timer.
01:19:19.000So in other words, after you finish the quiz, it'll go tick, tick, tick, tick, tick, computing, computing, computing.
01:19:25.000And there's this delay creating the impression that they're really thinking hard.
01:19:32.000So all that is for credibility to manipulate you.
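To make the deception concrete, here is a hypothetical sketch in Python of how such a rigged quiz could work: the answers are collected but never consulted, and an artificial "computing" delay is added purely for credibility. The questions, the function name, and the predetermined result are all invented for illustration; this is not code from any actual site.

```python
import time

# Hypothetical opinion-matching quiz that ignores your answers entirely.
QUESTIONS = [
    "How do you feel about immigration?",
    "How do you feel about abortion?",
    "How do you feel about gun control?",
]

def rigged_quiz(predetermined_result: str) -> str:
    answers = [input(q + " ") for q in QUESTIONS]  # collected, then ignored

    # Fake "computing..." timer: pure theater that creates the impression
    # the algorithm is thinking hard about what you typed.
    for _ in range(3):
        print("computing...")
        time.sleep(1)

    del answers  # never used: the result was decided before you started
    return f"Our analysis shows you are definitely a {predetermined_result}!"

if __name__ == "__main__":
    print(rigged_quiz("Republican"))
```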
01:19:37.000Now, so over here we're going to websites and we're typing in random answers.
01:19:42.000On the other side, we're doing experiments in which we...
01:19:47.000We are giving people quizzes, and then we are giving people recommendations, and then we are measuring to see whether we can change anyone's mind.
01:19:56.000And we're getting shifts of 70 to 90%, with not a single person, not one person, recognizing that they're being manipulated.
01:20:10.000Not even one person recognizing that there's bias in the results we're giving them.
01:20:57.000And when you did, was there any sort of urgency?
01:21:03.000Did anybody understand what the implications of this are?
01:21:07.000Did anybody understand we're literally looking at these massive technologies that are used throughout the world that can completely change... Mm-hmm.
01:22:42.000Well, I'm not thinking that they haven't already taken over, but I'm thinking, like, how much more control can they have in 20 years, if 20 years ago they didn't have any?
01:22:51.000Like, as technology advances, do you think that this is going to be a deeper and deeper part of our world?
01:23:47.000Oh, I published a few years ago an essay calling for his resignation.
01:23:53.000Roger McNamee, who was one of the first financial supporters of both Google and Facebook, actually published a book about two years ago called Zucked,
01:24:07.000about how Zuckerberg has taken over the world. And he basically said in that book straight out that if he had known what these companies, Google and Facebook, were going to turn into, he would never, never have backed them in those early days. Jamie, did we ever find out what Facebook,
01:28:35.000It should be obvious, what you're saying.
01:28:38.000I mean, what you're saying should concern people.
01:28:41.000The idea that you would just be labeled as a part of a disparaged political party because it's an easy way to defame you and to discredit you.
01:30:29.000She lost control of her little pickup truck that I had bought her and got broadsided by a massive truck that was towing two loads of cement.
01:30:46.000But her pickup truck was never examined forensically and it disappeared.
01:30:54.000I was told that it had been sold to someone in Mexico, and it just disappeared.
01:31:40.000The story of the journalist Michael Hastings, who wrote a story about a general during the Obama administration. There was a volcano that erupted in Iceland, and he was stuck overseas.
01:32:07.000So he was over there writing a story for Rolling Stone, and because he was over there for so long, because he was trapped, because no flights were going, because the ash cloud was so bad because of this volcano, they got real comfortable with him.
01:32:22.000And these soldiers started saying things, not even thinking this guy is like, you know, he's not one of them.
01:32:29.000He is a journalist, and he's going to write all these things about it.
01:32:32.000So he wrote this very damning article.
01:33:00.000The car's engine was many yards from the car itself, and there was a lot of speculation.
01:33:08.000That not only did the government have the ability to manipulate, that intelligence agencies had the ability to manipulate people's cars, but it's something they've actively done.
01:33:18.000And people were very concerned that this guy was murdered because of what he had done.
01:33:23.000Because that general wound up getting fired.
01:33:25.000Obama wound up firing him because it made Obama look bad.
01:33:38.000And it starts out saying the kind of things that I've been saying to you, which is the future crimes, they're actually here now.
01:33:44.000And this is an ex-FBI guy who wrote the book.
01:33:47.000And he's talking about how tech is being used now to not only commit crimes, but to assassinate people.
01:33:54.000One of the simplest ways to do it is you hack into a hospital computer and you change dosages on medication.
01:34:03.000If the person you're going after has been hospitalized, that's a really simple way to just knock them off and have it look like just some silly little glitch or something.
01:34:17.000So yeah, there's a lot of ways now that you can commit crimes that have never existed before.
01:34:25.000And as far as I'm concerned, the kinds of things I study, in my opinion, should be considered crimes.
01:34:32.000And I don't think we should ever be complacent and just say, oh, it's the algorithm.
01:35:16.000I know that the accident made news, not just here but in Europe, because some people thought it was suspicious that my beautiful wife, you know,
01:35:31.000we'd been together for eight years, and my beautiful wife was killed in this horrendous fashion.
01:35:37.000And, you know, obviously I have pissed off some people at some big companies.
01:35:46.000I mean, the work I have coming out, I have right now 12 scientific papers under review and four that are in press, in other words, that have been accepted.
01:35:56.000So I have stuff coming out that, over and over again, like a sledgehammer, is going to make certain companies really look, well, very evil, I would say.
01:36:10.000Do you think that they have the ability to suppress the kind of coverage of the data that you're putting out to the point where it's not going to impact them?
01:36:20.000Like, how much has it impacted them currently?
01:36:23.000I mean, we're talking about committing murder or potentially committing murder.
01:36:26.000Like, how much have you impacted them if they're still in complete and total control and they're still utilizing all these algorithms and making massive amounts of profit?
01:36:46.000So I do want to talk to you about the monitoring stuff, because there is a way.
01:36:51.000There's more than one way, but there's one very practical way to literally just push these companies out of our personal lives and out of our elections.
01:37:05.000And I've been working on that project since 2016. That project started because of a phone call I received from a state attorney general, Jim Hood.
01:37:17.000He was attorney general of Mississippi at the time.
01:37:21.000He called me in 2015 and he said, could Google mess with my reelection as attorney general?
01:37:29.000Because in that state they elect them.
01:37:47.000Well, a whistleblower, you know, a warrant, something.
01:37:51.000And I became obsessed with trying to figure out how to know what these companies are actually showing real people.
01:38:02.000Now, here and there, there's some researchers at Columbia who should be ashamed of themselves.
01:38:07.000There's some reporters at The Economist who should be ashamed of themselves.
01:38:10.000Here and there, people have set up a computer that they anonymize, and they type in lots of search terms, and they get back all these searches, and they conclude that there's no bias.
01:38:26.000But that doesn't tell you anything because Google's algorithm can easily spot a bot, can easily spot an anonymized computer.
01:39:26.000But the point is, they know the difference between you, because you have a big old profile, and an anonymized computer or a bot because there's no profile.
01:40:12.000This is exactly what that company does, Nielsen, that does the Nielsen ratings.
01:40:18.000They've been doing it since 1950. They're now in 47 countries.
01:40:22.000And they recruit families and they keep their identities very secret.
01:40:26.000And they equip the families with special gizmos so they can keep an eye on what television shows they're watching.
01:40:32.000And that's where the Nielsen ratings come from, which are very important because they determine how much those shows can charge for advertising.
01:40:41.000They determine whether or not a show stays on the air.
01:44:02.000But still, I was very disturbed by this, extremely disturbed, because we knew from the experiments we had run that that was enough bias to have shifted over a period of time among undecided voters somewhere between 2.6 and 10.4 million votes without anyone having the slightest idea that this had occurred.
01:44:25.000That's 2016. 2018, we monitored the midterms.
01:44:58.000And we recruited 1,735 field agents just in swing counties, just in swing states, because we knew that's where the action was going to be.
01:45:11.000We preserved 1.5 million ephemeral experiences, and I'll define that if you want, on Google, Bing, Yahoo, YouTube, Google's homepage, Facebook.
01:45:27.000We, at this point, know how to preserve pretty much anything.
01:45:40.000We decided, which we hadn't done in the past, on October 30th, 2020, before the election, a few days before the election, we decided to go public with some of our initial findings.
01:45:54.000And as a result, on November 5th, two days after the election, three U.S. senators sent a very threatening letter to the CEO of Google, just summarizing all my work, my preliminary stuff.
01:46:13.000And guess what happened then in Georgia?
01:46:16.000We had over a thousand field agents in Georgia.
01:46:34.000This tells you that if you monitor, if you do to them what they do to us 24 hours a day, you do that to them and you look for any kind of manipulation, any kind of bias, any kind of shenanigan, and you make that public,
01:46:58.000So doesn't this highlight that if our government is concerned about legitimate threats to democracy and legitimate threats to the way information is distributed and free speech and manipulation,
01:47:14.000that they should be monitoring Google.
01:47:49.000This should be done by probably a consortium, bipartisan or nonpartisan.
01:47:57.000Nonprofit organizations and, you know, we should have hearings.
01:48:04.000We should have, you know, very—everything should be transparent.
01:48:08.000We should have wide representation of people serving on the boards and all that kind of like— Well, the UN, but this is a narrow kind of task.
01:48:32.000Eventually, we have to help people in other countries set up similar systems.
01:48:36.000See, that's the real answer to your future question: that is how, now and in the future, we can get control over emerging technologies.
01:48:52.000Not just Google, but the next Google and the Google after that.
01:48:55.000There is no way to know what these companies are doing unless you are monitoring.
01:49:03.000One of the simulators we have now that we developed actually within the past year, which is fabulous, I'm so proud of my staff, we have an Alexa simulator.
01:49:14.000I mean, it just works just like Alexa.
01:50:59.000They published a study in 2012 showing how they could get more people to vote in 2010 by sending out vote reminders.
01:51:09.000If you just take the data that they published and move it over to 2016 and say, okay, Mark, press the button, Hillary would have absolutely won the election.
01:51:23.000He, I'm sure to this day, is kicking himself because he didn't do it.
01:52:36.000Based on the experience that we just had a few months ago, where we got Google to stay out of Georgia, and by the way, we positively got them to stay out of Georgia because we had over a thousand field agents in Georgia, and we were collecting a massive amount of...
01:52:50.000We collected more than a million ephemeral experiences.
01:52:54.000I guess I'm going to have to define that.
01:53:31.000Internally, this is the kind of lingo that they use.
01:53:34.000What's an ephemeral experience and why would they want to use ephemeral experiences to change people's minds?
01:53:40.000Because an ephemeral experience is, well, most of the kinds of interactions we have online involve ephemeral experiences.
01:53:48.000Search, you type a search term, you see a bunch of search results, it has an impact on you, you click on something, it disappears, it's not stored anywhere, and it's gone forever.
01:53:58.000So there are these brief experiences, like a news feed, a list of search suggestions, an answer box, that affect users, then disappear, are stored nowhere, and authorities cannot go back in time and figure out what people were being shown.
01:54:19.000That's why internally at Google they want to use ephemeral experiences to impact people because unless someone like me, and I'm the only one doing this, unless some crazy guy like me is setting up monitoring systems and keeping everything secret while it's running,
01:54:51.000As I say, the most powerful mind control machine ever invented, and it relies, for the most part, on ephemeral experiences, meaning no one knows.
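A minimal sketch, in Python, of what "preserving an ephemeral experience" could look like in principle: snapshotting whatever content a field agent's browser was shown, with a timestamp and a content hash, before it vanishes. The function name, record format, and file name here are invented for illustration; the actual monitoring software described in the conversation is not public.

```python
import hashlib
import json
import time

def preserve_ephemeral_experience(agent_id: str, platform: str,
                                  query: str, content: str) -> dict:
    """Snapshot content that would otherwise vanish the moment it is shown.

    Hypothetical record format: the point is simply that once the page is
    captured and hashed, there is a durable, tamper-evident trace of what
    a real user was actually shown at a specific moment.
    """
    record = {
        "agent_id": agent_id,      # which (anonymous) field agent saw it
        "platform": platform,      # e.g. a search results page, a news feed
        "query": query,
        "captured_at": time.time(),
        "content": content,        # the raw HTML/text as rendered
        "sha256": hashlib.sha256(content.encode()).hexdigest(),
    }
    with open("ephemeral_archive.jsonl", "a") as f:
        f.write(json.dumps(record) + "\n")
    return record

# Example: archiving one search results page before it disappears.
preserve_ephemeral_experience("agent-0001", "search", "candidate X",
                              "<html>...results as shown to the agent...</html>")
```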
01:56:32.000But if they have the amount of surveillance capabilities that we're talking about here, wouldn't they be able to know who these field agents are?
01:56:42.000Well, that's why we're very, very careful about how we do the recruiting.
01:57:42.000Our software is set up so that if the information we're getting from any particular field agent doesn't look right, then it goes over to human review.
01:57:56.000That could mean, for example, that they are using an algorithm.
01:58:00.000They're trying to tilt things in a particular direction.
01:58:03.000So they're not actually typing in anything.
01:58:05.000They're not using the computer the normal way they would use it, which is what they're supposed to do.
01:58:09.000It means they've now developed or been equipped with an algorithm to, boom, just start generating a lot of stuff, which would mess up our numbers, right?
01:58:22.000Well, those people immediately are flagged, and when that happens and we can't exactly figure out what's going on, we dump them.
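Here is a hypothetical sketch, in Python, of the sort of screening logic being described: if a field agent's submissions arrive at inhuman speed or with machine-perfect regularity, flag them for human review, and drop the agent if the anomaly can't be explained. The thresholds and the heuristic itself are made up for illustration; the real system's rules aren't public.

```python
from statistics import pstdev

def flag_for_review(submission_gaps_seconds: list[float]) -> bool:
    """Flag a field agent whose activity doesn't look like a human at a keyboard.

    Hypothetical heuristics: humans type and browse with irregular timing;
    an algorithm pumping out queries tends to be both fast and regular.
    """
    if not submission_gaps_seconds:
        return False
    mean_gap = sum(submission_gaps_seconds) / len(submission_gaps_seconds)
    jitter = pstdev(submission_gaps_seconds)

    too_fast = mean_gap < 2.0    # assumed threshold: nonstop sub-2-second queries
    too_regular = jitter < 0.1   # assumed threshold: machine-perfect rhythm
    return too_fast or too_regular

# A human-looking agent vs. a script-looking agent (illustrative numbers).
print(flag_for_review([8.2, 31.0, 12.5, 64.1, 9.9]))  # False: goes to the dataset
print(flag_for_review([1.0, 1.0, 1.0, 1.0, 1.0]))     # True: goes to human review
```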
01:58:54.000Do you think you can shift the way these companies do business?
01:58:59.000Do you want to just inform and educate the public as to what's happening and how divisive and how interconnected all this stuff is?
01:59:15.000It's hard to answer that question because as I keep learning more, and believe me, what we've learned in the last year easily eclipses what we learned in the previous eight years.
01:59:35.000I'll say at one point in time, what I was concerned about was how can we get Google under control?
01:59:44.000So I published an article in Bloomberg Businessweek.
01:59:47.000There's a great backstory there because, you know, it was scheduled to come out and then someone or other made a phone call to someone else and then, boom, the piece got pulled.
01:59:59.000And this is a solution to the Google problem, literally.
02:00:03.000The editor in chief is literally having arguments with the, you know, the higher ups, the publishers, because they pulled my piece on how to get Google under control, how to solve the Google problem.
02:00:15.000I was scheduled to testify before Congress the following Tuesday.
02:03:18.000And why doesn't the United States implement some sort of a similar punishment?
02:03:23.000Because Google owns the United States.
02:03:28.000I mean, there's an antitrust action right now in progress against Google, and it's the attorney generals, I believe, from every single state in the United States except California.
02:03:41.000Because the attorney general of California, his main supporter is Google.
02:04:39.000And it would fall in line with don't be evil.
02:04:44.000Well, the fact is that depending on who the leadership is at any point in time at Google, they might look at that idea and say, hey, look, this will be great for us.
02:07:17.000I mean, if you're saying that Google's sending out these messages, right, and that most of their users or the majority of their users are Democrats, right?
02:07:25.000So what's the majority of Republicans?
02:07:32.000You're saying Google's sending out this message, go vote.
02:07:35.000And through that message, because of the bias, because of the difference in the numbers, more Democrats are getting it because more Democrats use Google, right?
02:11:26.000We have to be vigilant so that we don't let these kinds of powers take over our government, our democracy, our nation.
02:11:36.000And we have not been vigilant and we're not being vigilant now.
02:11:40.000And the research, you know, that we do and the monitoring systems both, the research is over here and the monitoring stuff's over here, that reminds me every single day.
02:11:51.000I mean, I'm looking at numbers every single day.
02:11:53.000You're keeping me away from my data and my research, by the way.
02:11:58.000But I'm reminded every single day of just how serious this stuff is.
02:12:06.000This is deadly serious for the future of not just our country, but all of humanity.
02:12:11.000And the fact that people don't know it... Sometimes when I've given speeches, people say, I don't care.
02:13:00.000I had a friend who worked at Google during the time they were working and having negotiations with China, and her position was that China was just going to copy Google's tech if they didn't do that.
02:13:16.000So like they were in this position where, you know, like Tiananmen Square, like you cannot bring up, like Tiananmen Square is not searchable.
02:13:25.000The results of it, like the guy standing in front of the tank, like there's a lot of information from Tiananmen Square that would look terrible.
02:15:58.000Are they being pushed one way or another politically?
02:16:02.000We are in the process right now of trying to expand our research to look at kids and to see what content these kids are being shown.
02:16:11.000Because it doesn't matter how vigilant you are as a parent, the fact is 99% of what your kids are seeing online or experiencing online, you're unaware of.
02:16:26.000And that's why, as I say, solving these problems is not optional.
02:20:11.000Because I had written to that executive at Google, who was supposed to be on that panel in Germany, and just telling him about my work, giving him links and so on, because he's a former professor.
02:20:26.000It was only a few days after that that this guy showed up at our house.
02:20:31.000And then it was a few days after that that the Google executive pulled out of that conference.
02:20:40.000And so they're not interested in communicating with you.
02:20:45.000They've obviously either told people not to communicate with you or the people that you would like to talk to are aware of your work and they feel that it would negatively impact their job or their career.
02:21:06.000This has just been, for me, in many ways, a nightmare, an absolute nightmare, because there are people who won't help us, who won't serve on our board, who won't do this, who won't do that.
02:21:18.000We had an intern lined up who was very, very good.
02:21:22.000You know, we get some really sharp people.
02:21:23.000They come from all over the world, actually.
02:21:26.000And we had this person all signed up, her start date was set up, and she called up and she said, I can't do the internship.
02:22:33.000And, you know, what happened after that hearing was Trump tweeted about my testimony.
02:22:41.000Hillary Clinton, whom I've been supporting forever, Hillary Clinton replies to Trump on Twitter and says, this man's work has been completely discredited.
02:22:53.000It's all based on data from 21 undecided voters.
02:24:55.000And then if we found out that someone who was, like, say, if Donald Trump, you know, if the Democrats found out that Donald Trump had implemented some sort of a system, like you're talking about, people would be furious.
02:26:02.000Because they're getting funding from Google or they're terrified of Google.
02:26:07.000The head of Europe's largest publishing conglomerate, his name is Döpfner, published a piece a few years ago that's actually called Fear of Google.
02:27:42.000So that's the problem there, is that everything is personalized and everything you're seeing there is based on you and your 20-plus year history and the 3 million pages of information they have about you.
02:27:53.000They build digital models of all of us.
02:28:21.000Speaking of which, okay, I'm sitting next to a guy on an airplane the other day, and he's saying how he's very proud that he doesn't use any social media.
02:28:36.000I said, so, wait, you mean you don't have a Facebook page?
02:29:55.000Even if they deleted it on one server, it's sitting there in backup after backup after backup.
02:30:02.000And not only that, if you read, I think I'm the only one who reads these things, but if you read Google's Terms of Service and Google's Privacy Policy...
02:30:11.000It says right in there, we reserve the right to hold on to your data as we might be required by law or in any other way that protects Google.
02:30:28.000Now what about Twitter, Instagram, things like that?
02:30:36.000Instagram and Facebook are the same entity, right?
02:30:58.000In fact, I'd love to know what your opinion is of what happened early 2021, I think it was, when both Facebook and Twitter shut down Donald Trump.
02:31:15.000I think that what these things are... We're at a time in history where you can't look at them as just private companies, because the ability to express yourself is severely limited if you're not on those platforms.
02:31:33.000I think they should be looked at like utilities, and I think they should be subject to the freedoms that are in our Constitution and the Bill of Rights. I think the way the First Amendment protects free speech, it should be protected on social media platforms, because I think as long as you're not threatening someone or doxing someone or putting someone in harm's way or lying about them,
02:31:56.000I think your ability to express yourself is a gigantic part of us trying to figure out the truth.
02:32:03.000Like when it comes to what are people's honest opinions about things?
02:32:09.000You know, we don't know if honest opinions are suppressed.
02:32:13.000Because they don't match up to someone's ideology. I think it's a critical aspect of what it means to be American to be able to express yourself freely. And to find out how other people think is educational. If you only exist in an echo chamber and you only hear the opinions expressed by people that align with a certain ideology,
02:32:42.000I think the answer to bad speech, and this is not my thought, this is many brilliant people believe this, is better speech, more thought, is more convincing arguments, more logical, sustained reasoning and debate and discussion.
02:33:00.000And I think as soon as they start suppressing ideas, as soon as they start suppressing and deleting YouTube videos and banning people from Twitter for things that have now been proven to be true, right?
02:33:15.000There's a lot of people that were banned because they brought up the lab leak theory.
02:33:32.000The way Voltaire said it, I'm paraphrasing, is, you know, I may not agree with what you say, but I will defend to the death your right to say it.
02:33:42.000And, you know, I think it was dead wrong.
02:33:45.000I mean, I was happy, of course, that this happened, but I think it was dead wrong for Twitter and Facebook to literally cut off communication between the current president of the United States who's still in office and his supporters.
02:34:02.000And the real question, too, is how much manipulation was being done by federal agents in the January 6th event. Like, did they engineer
02:34:18.000people going into the Capitol? Did they encourage them? And you saw that Ted Cruz conversation with the woman from the FBI, where she said, I can't answer that. Did the FBI incite violence?
02:34:30.000I can't answer that. You can't answer that? That should never be... Would the FBI incite violence?
02:35:12.000Now, if somebody wanted to disparage a political party or to maybe have some sort of a justification for getting some influential person like Donald Trump offline, that would be the way they would do it.
02:35:38.000But, you know, the bottom line here really goes back to George Orwell, which is, you know, if you control information, you control everything.
02:36:53.000I need people to provide funds, but also to help us find funds.
02:37:00.000This is the year where I think we should set up this first large-scale nationwide monitoring system, which could be used not only to keep an eye on these midterm elections, but we could finally start to look at our kids.
02:37:15.000That's become my main concern now, is our kids.
02:37:20.000We don't understand what the hell they're doing.
02:37:22.000We don't know what they're looking at, what they're listening to.
02:37:26.000But I can tell you for sure that a lot of what's happening is really being done very deliberately and strategically by the big tech companies.
02:37:37.000Because they're going to do—they have control over the information that everyone has access to, and they're going to do what's best for them, what makes them the most money, what spreads their values, and, of course, sometimes what's good for intelligence purposes.
02:37:52.000They're going to do those things, and we have no idea what they're doing unless we track them.
02:39:26.000Would we be completely in the dark about this stuff?
02:39:29.000You would be completely in the dark because there's no one doing these kinds of experiments and there's no one collecting all that.
02:39:40.000But when you think about the internet and how many people on the internet are, you know, interested in politics and interested in the influence of big tech and the dangers of big tech.
02:39:51.000When they talk about psychological dangers, like Jonathan Haidt's work with young girls and self-harm and suicide and the rise of depression amongst young people, you would think that this would also be something that people would investigate and dig into.
02:40:07.000The fact that you're the only one, it's very strange.
02:40:20.000I'm away at the moment. See, we have two cats in our office, and I'm the poop cleaner.
02:40:30.000So, when I'm gone, that means someone else has to clean the poop.
02:40:34.000So I said to my associate director last night, I said, just remember that the more credentials you get, the more responsibilities you get, the more poop you're going to have to clean.