RFK Jr. The Defender - October 21, 2024


Google and Mind Control with Amaryllis Fox and Robert Epstein


Episode Stats

Length

1 hour and 19 minutes

Words per Minute

143.5

Word Count

11,368

Sentence Count

744

Misogynist Sentences

5

Hate Speech Sentences

3


Summary

Dr. Robert Epstein is a pioneer in the study of new forms of manipulation made possible by the Internet. He has testified twice before Congress about his research in this area, and he has also developed the world's first large-scale system for preserving and analyzing the ephemeral content Big Tech companies use to manipulate elections, children, and adults. His new congressional testimony is accessible at 2023epsteintestimony.com, and a dashboard summarizing the data his new monitoring system is collecting can be viewed at americasdigitalshield.com. Co-hosting this special edition is Amaryllis Fox Kennedy, Kennedy's daughter-in-law and campaign manager, a former CIA officer and tech entrepreneur with an extraordinary mind, who got the campaign on the ballot in every state, a task everybody said was impossible, and who is a great admirer of Dr. Epstein's work.


Transcript

00:00:00.000 Hey, everybody, welcome to a special edition of the podcast.
00:00:04.000 And today my guest is Dr. Robert Epstein.
00:00:08.000 Dr. Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology.
00:00:14.000 He's the former editor in chief of Psychology Today.
00:00:19.000 He's a Ph.D. from Harvard University.
00:00:22.000 Dr. Epstein is a pioneer in the study of new forms of manipulation that have been made possible by the Internet.
00:00:30.000 He has testified twice before Congress about his research in this area, and he's also developed the world's first large-scale system for preserving and analyzing the ephemeral content Big Tech companies use to manipulate elections, children, and adult human beings.
00:00:49.000 His new congressional testimony is accessible at 2023epsteintestimony.com.
00:01:05.000 A dashboard that summarizes the data his new monitoring system is collecting can be viewed at americasdigitalshield.com.
00:01:20.000 His research suggests that without an aggressive monitoring system in place in the 2024 presidential election, Google alone will be able to shift between 6.4 and 25.5 million votes.
00:01:35.000 And I first became aware of Dr. Epstein's research, I think, around 2016, when I was reading about SEME, which is the search engine manipulation effect.
00:01:48.000 It was just becoming evident at that time, and at the beginning of the coronavirus pandemic it was extensively used to manipulate public perception and public behavior.
00:02:05.000 I remember doing a podcast, one of my earliest podcasts, about this theme, in which I think it was Dr. Epstein who said it was the most powerful propaganda and manipulation tool ever devised by humanity.
00:02:21.000 And it's really astonishing; his work about the capacity to shift public perception and therefore alter and manipulate public conduct is truly frightening.
00:02:38.000 And it's all done without the targets of this kind of manipulation ever knowing that they've been manipulated; it can induce essentially a mass psychosis.
00:02:53.000 And my co-host today is my daughter-in-law, Amaryllis Fox, who is Amaryllis Fox Kennedy.
00:03:07.000 Who is also my campaign manager, and who has an extraordinary mind.
00:03:13.000 She's the smartest person I've ever met.
00:03:15.000 And she was able to do something in this campaign that everybody said was impossible, which was to get on the ballot in every state, to manage 100,000 volunteers and the complexity of that task.
00:03:31.000 And to do everything else, to write policy papers and generate this huge amount of content for us.
00:03:39.000 At a superhuman level. Amaryllis, in her early career, joined the CIA after 9-11 and spent a career as a clandestine spy in the weapons of mass destruction program.
00:03:57.000 After she left the CIA, she became a tech entrepreneur and started a very successful company.
00:04:06.000 Later sold to Twitter, and then she went to work managing, I think it was communications, right, at Twitter.
00:04:15.000 Amaryllis, correct me if I'm wrong, because I'm doing this off the top of my head.
00:04:19.000 You got it, Bobby.
00:04:21.000 Consumer commerce.
00:04:23.000 Yeah, using your heuristics-based natural language processing to identify product mentions, which in its own way, I think we'll probably delve into later in this conversation. But I'm a great fan of Dr. Epstein's work and really pleased to be here today.
00:04:41.000 Anyway, I always tell this story that somebody told me that Amaryllis was the smartest woman that they've ever met.
00:04:48.000 And I said, no, she's the smartest human being that you've ever met.
00:04:51.000 So it's a joy to have you back on the show, co-hosting with me, Amaryllis.
00:04:55.000 And if you want to follow Amaryllis, she's at, what is your Twitter?
00:05:01.000 Do you do Twitter or Instagram?
00:05:03.000 I do, I do.
00:05:04.000 X, yes, at Amaryllis Fox.
00:05:08.000 Okay, so, you know, that introduction, you can consider it a question, Dr. Epstein.
00:05:15.000 Just describe to us how SEME works. SEME is the search engine manipulation effect.
00:05:21.000 In fact, how it works and, you know, just a brief punchline of the extraordinary power that these tech companies are now wielding to do what the CIA has always wanted to do, which is, without ever going into a country and doing all the things they used to do,
00:05:40.000 assassinating leaders, paying off unions, destroying the credibility of institutions, driving division and polarization between different groups in the society, all of those other tools they developed.
00:05:58.000 For 20 years of experimentation through MKUltra and MKNaomi and MKDietrich and all the MK programs, MK meaning mind control, which is what they were after, both control of individuals and control of entire societies.
00:06:18.000 And now they can do that much more effectively.
00:06:22.000 Just by manipulating algorithms at Google.
00:06:25.000 And so, will you explain how that works?
00:06:27.000 It is terrifying, by the way.
00:06:30.000 It is terrifying, and I promise to address all of those issues.
00:06:34.000 But first, I have to say, this is a tremendous honor for me.
00:06:39.000 One of the greatest moments in my life was when I was seven years old, and my mom took me to see your Uncle Jack, John F. Kennedy.
00:06:49.000 On the campaign trail.
00:06:50.000 And she got there really early.
00:06:53.000 So we were right up against the stanchions there.
00:06:55.000 And he was just above us.
00:06:58.000 And I was seven and I was getting kind of crushed from behind.
00:07:02.000 So a police officer picked me up and put me in front of the stanchions.
00:07:06.000 So I was there just a couple of feet away from your...
00:07:10.000 Your famous uncle.
00:07:12.000 And that was really one of the greatest moments of my life.
00:07:16.000 And I've had lots of different contact with members of your family over the years.
00:07:20.000 Your Uncle Ted helped me to set up my first nonprofit organization, which is called the Cambridge Center for Behavioral Studies.
00:07:29.000 I think she's your cousin, Maria Shriver, invited me to speak at her last big women's conference.
00:07:35.000 And that was a great thrill.
00:07:36.000 That was also scary because there were 16,000 women and me.
00:07:40.000 So it was very scary.
00:07:42.000 But I could go on and on.
00:07:45.000 But there's a second thing I want to tell you before we get to the topic, because I just spent the whole weekend reading your book on Fauci.
00:07:54.000 Which was superb.
00:07:56.000 And I recommend that everyone read that book.
00:07:59.000 And if you're reluctant to read it because, as someone told me last night, because you've heard some bad things about Bobby Kennedy Jr., if you're reluctant to read it, forget all that stuff.
00:08:14.000 Forget that negative stuff.
00:08:15.000 I can tell you where that negative stuff comes from.
00:08:18.000 I'm going to tell you a lot about that during this podcast.
00:08:22.000 Forget that.
00:08:23.000 Read this book because this book is not only well written, it is superbly documented.
00:08:29.000 The reason why I think that our host here has not been sued by Mr. Fauci, Dr. Fauci, is because this book is just so well done.
00:08:43.000 And it tells you, it gives you a perspective on the health system, or you could call it the anti-health system in this country, that I've never seen laid out so well.
00:08:55.000 And it is terribly frightening.
00:08:58.000 And then as you get through the book and you get more and more frightened and concerned, it just gets bigger.
00:09:05.000 It literally just keeps getting bigger.
00:09:08.000 So I recommend to everyone that they read The Real Anthony Fauci.
00:09:15.000 Now, having said that, there's something I need to tell you about before we get to my work on Google and the tech companies.
00:09:28.000 Google and the gang, I call them.
00:09:30.000 And that is something you probably don't know.
00:09:33.000 And you know a lot about The COVID-19 vaccine and all of the horrible things that were done to control messaging.
00:09:45.000 And again, we can talk about that when we get to Google.
00:09:47.000 But what you don't know is this.
00:09:50.000 I was working with a member of Trump's Coronavirus Task Force because I had proposed a plan, called the Carrier Separation Plan, CSP, that would not have required any lockdowns or social distancing, would not even have required a vaccine, and that would have eradicated the virus, which we've not done.
00:10:17.000 It would have eradicated the virus and allowed us to completely reopen society.
00:10:21.000 So I started working with that task force in March of 2020.
00:10:28.000 What's the plan?
00:10:30.000 The plan is very simple.
00:10:31.000 If people go to carrierseparationplan.com or nationaltestingday.com, you'll see it all laid out.
00:10:40.000 I talked about it on Tucker Carlson's show.
00:10:43.000 He thought it was amazing.
00:10:45.000 His mouth dropped, as tends to happen with Tucker sometimes.
00:10:50.000 Glenn Beck said it was brilliant.
00:10:52.000 Michael Medved said he hopes the president goes for it.
00:10:55.000 I was talking to two members of Trump's family about it.
00:10:58.000 It's very simple, and it's consistent with some statements that are in your book.
00:11:03.000 And that is, you test everyone using very cheap...
00:11:08.000 disposable test devices, and I was in touch with two big companies in China that could have produced these things overnight.
00:11:15.000 The test devices don't even have to be that particularly accurate, it turns out, for this to work.
00:11:22.000 The President announces we're going to have National Testing Day.
00:11:26.000 It's going to be on September 6th, Sunday, September 6th, 2020, which was the 400th anniversary of the day the Mayflower set sail from England to the United States.
00:11:37.000 And he said, on that day, we're all going to test.
00:11:40.000 Before that, he would have announced it early summer, we're going to send you all dozens and dozens of these test devices.
00:11:48.000 They don't have to go up your nose.
00:11:50.000 You just stick them in your mouth and it'll tell you whether you're positive or not.
00:11:55.000 And so he said, I'm going to do it, Melania and all my kids and all the members of Congress, and we're all going to do it on that day.
00:12:04.000 He said, and then if you have, if you test positive, then you need to go into quarantine, not for six months, just a couple of weeks, two or three weeks.
00:12:13.000 Just let the virus, let your body defeat the virus.
00:12:17.000 He said, and that same day, Sunday, September 6th, we're going to reopen society.
00:12:22.000 All the schools and churches and businesses are going to reopen and the virus is going to stop spreading.
00:12:29.000 Because we've removed from the population most of the carriers.
00:12:33.000 And there's also going to be secondary screening.
00:12:36.000 So at the entrance to schools, churches, and businesses, there's going to be barrels full of these test devices.
00:12:42.000 And it's all voluntary, but we'd like you to test and test negative so you can go inside.
00:12:48.000 If you test positive, just go home for three weeks.
00:12:51.000 If you need money to do this, then you're giving up your privacy, but we'll help you.
00:12:57.000 If you need a place to stay, like a hotel room for those three weeks, we'll help you, but you're giving up your privacy then.
00:13:02.000 But for everyone else, this is all going to be done in privacy.
00:13:06.000 I published in Frontiers in Public Health in January of 2021 a mathematical model, detailed predictions.
00:13:14.000 That's been viewed 35,000 times.
00:13:17.000 It is taken seriously in the public health and medical communities.
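To make the logic of carrier separation concrete, here is a minimal toy simulation of its core mechanism: removing detected carriers from circulation so the effective reproduction number drops. This is not the model published in Frontiers in Public Health; the population, reproduction number, and detection rates below are illustrative assumptions only.

```python
# Toy discrete-time epidemic sketch of the carrier-separation mechanism.
# NOT the published model; every parameter is an illustrative assumption.

def simulate(days=120, pop=330e6, seed=1e6, r0=2.5, infectious_days=10,
             detected=0.0):
    """detected = fraction of infectious carriers found by testing and
    quarantined (roughly test sensitivity times participation)."""
    beta = r0 / infectious_days        # daily transmissions per free carrier
    s, i = pop - seed, seed
    peak = i
    for _ in range(days):
        circulating = i * (1.0 - detected)       # carriers still in public
        new_infections = beta * circulating * s / pop
        recoveries = i / infectious_days
        s -= new_infections
        i += new_infections - recoveries
        peak = max(peak, i)
    return peak

for d in (0.0, 0.5, 0.8):   # 0%, 50%, 80% of carriers detected and removed
    print(f"detected={d:.0%}: peak infectious ~ {simulate(detected=d):,.0f}")
```

With 80% of carriers detected and quarantined, the effective reproduction number falls below one in this sketch and the outbreak collapses, which is the intuition behind reopening on the same day as mass testing.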
00:13:22.000 Here's what happened.
00:13:24.000 And by the way, if that plan had been implemented, and at one point there was a speech for President Trump to give, it was sitting on the desk of his head speechwriter, Vince Haley.
00:13:35.000 Short speech.
00:13:36.000 If he had set that in motion, that would have saved 600,000 American lives.
00:13:41.000 And here's what happened.
00:13:43.000 On July 15th of 2020, I received an email from a member of the Coronavirus Task Force saying, we're so sorry.
00:13:51.000 But we've decided to go with the vaccine exclusively.
00:13:55.000 And that was it.
00:13:57.000 They just set aside.
00:13:59.000 They just threw away the plan, which would have made Trump into a hero around the world.
00:14:04.000 It would have made him into a hero.
00:14:07.000 So why did he do this is the question.
00:14:10.000 And the reason is, I was told, is because that summer he got it into his head, not from Fauci, but from somebody.
00:14:18.000 He got it into his head That he could get the vaccine out before the November election, and that that would guarantee him the win.
00:14:28.000 And of course, there are also many, many other kinds of pressure on him to go with the vaccine.
00:14:33.000 And that's what he did.
00:14:35.000 Well, you know, that's very interesting.
00:14:37.000 And I know that you talked about that on Tucker.
00:14:41.000 But it is, and I don't want to take away from your, you know, your inventive role in devising that, but it's kind of a modern iteration of exactly the protocol of D.A. Henderson, one of the great epidemiologists, who is credited with obliterating smallpox, eliminating this deadly disease from humanity.
00:15:11.000 It was basically the same program that he used, without the technology, but it was the classic prescription for dealing with pandemics, which always said you never do mass lockdowns, because that ends up destroying society and imposing a lot worse costs than the disease, which is what we saw.
00:15:38.000 But the way he eliminated smallpox: a lot of people, you know, the modern consensus, which is an orchestrated consensus, the product of propaganda,
00:15:49.000 is that smallpox was eliminated by the smallpox vaccine.
00:15:53.000 But there are many parts of the world where the vaccine never reached.
00:15:58.000 And smallpox disappeared there, too.
00:16:00.000 And the way it disappeared is D.A. Henderson's plan, which is to isolate the sick, protect the vulnerable, keep society open, and do systematic isolation of people who have the disease.
00:16:15.000 And that is what eliminated smallpox from the planet.
00:16:19.000 And D.A. Henderson, in later years, complained vociferously against the lockdowns during the beginning of COVID, et cetera.
00:16:27.000 You never do that, you know; you do these protocols that have been proven to work.
00:16:34.000 And the smallpox vaccine had a lot of problems and, you know, a lot of very, very bad, deadly side effects, killing people.
00:16:44.000 And the real way that history shows and the best literature shows the way that smallpox was eliminated was through these physical protocols of isolating the sick.
00:16:57.000 And respiratory viruses, as everybody knew before the pandemic, cannot be eliminated through isolating the entire society.
00:17:08.000 In fact, they spread indoors.
00:17:11.000 So when you lock people indoors, they tend to spread to the families, etc.
00:17:17.000 And it was really, you know, what we saw here at the beginning of the pandemic was the police shutting playgrounds.
00:17:25.000 Padlocking basketball courts, throwing sand on the half pipe so people couldn't skateboard, arresting surfers who were out in the ocean when they came to the beach and sending them home where the coronavirus is going to spread.
00:17:41.000 It was...
00:17:43.000 It was just a systematic refutation of everything that we know about managing epidemics.
00:17:51.000 I'm very glad.
00:17:52.000 I'm happy that you're talking about this.
00:17:54.000 And I wish I'd known more about that at the time when you were doing it, because I think, for a lot of those of us who were skeptical about, who were criticizing the mass lockdowns,
00:18:08.000 it would have given us all a much better understanding of an alternative for dealing with the coronavirus.
00:18:15.000 Of course, Bobby, I mean, that's exactly the point of our later conversation, which is that you didn't know it, and neither did anyone else, and that wasn't accidental, right?
00:18:25.000 That's the result of this kind of manipulation.
00:18:29.000 Amaryllis, thank you so much, because that's the segue that I was hoping to have.
00:18:34.000 Why did we not know?
00:18:37.000 And the fact is, there's a lot we don't know, and you don't even know what you don't know, right?
00:18:45.000 Or as I like to say now, you don't know what they don't show.
00:18:48.000 And by they, I mean the big tech companies, primarily Google.
00:18:52.000 And so, you know, I've been studying that.
00:18:55.000 That's a separate thing than my work on the carrier separation plan.
00:18:58.000 But I've been studying the ability that Google and other tech companies have to control our thinking, our behavior, our emotions, our elections, our children.
00:19:11.000 I've been studying that now for more than 12 years.
00:19:14.000 It is extremely disturbing.
00:19:17.000 It's even more disturbing than reading Bobby's new book on Fauci.
00:19:25.000 That's how disturbing this stuff is.
00:19:26.000 It's so disturbing.
00:19:28.000 Because as we keep making new discoveries about the power that these companies have, To manipulate people and to undermine democracy.
00:19:37.000 It just gets worse and worse and worse with every new discovery.
00:19:41.000 So now I'll get back to your original question.
00:19:44.000 Your original question was, how does this work?
00:19:46.000 How can these companies exercise such power?
00:19:50.000 Well, there's one way they do it that I think everyone's aware of, and that's what my conservative friends tend to focus on.
00:19:58.000 I'm not a conservative myself.
00:20:00.000 But the one way they do it is by censorship.
00:20:06.000 In other words, they suppress content.
00:20:08.000 And again, you don't know what they don't show.
00:20:11.000 So that's the simplest way that they do it.
00:20:13.000 Unfortunately, my conservative friends then stop.
00:20:17.000 They get stuck there.
00:20:19.000 They think that's all there is.
00:20:21.000 No, no, that is not true.
00:20:24.000 I have been publishing in peer-reviewed journals, I've been publishing our discoveries on, so far, 10 different methods that these companies use for manipulating thinking and behavior.
00:20:40.000 And it's just unbelievable.
00:24:44.000 I can tell you this: the search engine is the most powerful mind control machine that's ever existed.
00:24:52.000 When you start to type a search term, you're being manipulated from the very first character that you type.
00:20:59.000 I said this at a hearing when I first testified before Congress, and Ted Cruz pulled out his cell phone and he said, oh yeah, show me.
00:21:08.000 I said, fine.
00:21:09.000 Type the first letter in the alphabet.
00:21:12.000 So he types A. I said, are they making any suggestions for what you should be looking at?
00:21:19.000 He goes, yeah.
00:21:20.000 He said, I got five suggestions.
00:21:21.000 What are they?
00:21:22.000 I think three or four of them were for Amazon.
00:21:26.000 I said, gee, why are they trying to send you to Amazon?
00:21:30.000 You weren't looking for Amazon, were you?
00:21:32.000 He goes, no.
00:21:33.000 I said, because Amazon is Google's largest advertiser, and Google is Amazon's single largest source of traffic.
00:21:42.000 These are business partners.
00:21:44.000 They are manipulating people literally with every character that they type.
00:21:50.000 It's not just search suggestions.
00:21:52.000 It's the answer boxes below.
00:21:54.000 It's the search results.
00:21:56.000 But let me just go back to search suggestions for a second, because you'll remember that a few weeks ago, it made national news that when people were searching for information about the assassination attempt on Trump, they couldn't get anything.
00:22:12.000 Google was suggesting that they look at, I don't know, Abraham Lincoln and McKinley, and they wouldn't give you suggestions for learning about that assassination attempt.
00:22:24.000 And that made national news.
00:22:26.000 But here's the thing.
00:22:27.000 That's an anecdote.
00:22:30.000 That doesn't hold up in court.
00:22:32.000 It's useless.
00:22:33.000 But we are now preserving those search suggestions by the millions.
00:22:39.000 We're preserving ephemeral content, that's what they call it in Google, that they use to manipulate people.
00:22:45.000 We're preserving that by the millions, the recommendations they make on YouTube, which are 60 to 70% coming from liberal news sources.
00:22:56.000 For children under 18, or children, young people under 18, it's more like 90% of the content that they're showing people comes from liberal news sources.
00:23:07.000 We're preserving...
00:23:10.000 Tens of millions of this type of data of this sort, which has never been done before.
00:23:17.000 We have the world's first national monitoring system for doing to them what they do to us and our kids.
00:23:24.000 In other words, we're surveilling them just like they surveil us and our kids.
00:23:30.000 So there's two big chunks of research here.
00:23:33.000 One chunk of research, which we've been doing since 2013, looks at the new methods of manipulation that the internet has made possible and that is controlled entirely by a couple of tech companies.
00:23:48.000 And the second piece of research has to do with the fact that we're now monitoring them to see whether they're actually using these techniques, and they are.
00:23:59.000 So I'll give you one quick example.
00:24:00.000 Right now on Google's homepage, they're sending out various kinds of vote reminders, register to vote, mail in your ballot, go vote, to Democrats at about two and a half times the rate they're doing to Republicans.
00:24:16.000 Now think about that.
00:24:18.000 Think how that impacts the vote over time.
00:24:22.000 That's why Google can control so many votes.
00:24:25.000 But anyway, I'm sorry to be so long-winded here.
00:24:28.000 I'm tremendously passionate about this.
00:24:31.000 And I have to say one more thing of a personal nature.
00:24:33.000 I just have to because you're not just anybody, okay?
00:24:39.000 You're an amazing, amazing person.
00:24:41.000 And the fact is, I say in all modesty, that you and I have a lot in common.
00:24:47.000 First of all, we are both huge fans of an incredibly amazing, beautiful woman named Cheryl Hines.
00:24:56.000 So that's the first thing we have in common.
00:24:59.000 Secondly, we each spent some time at Harvard.
00:25:04.000 I overlapped with Caroline there.
00:25:10.000 The fact is you and I are both nuts.
00:25:12.000 We're crazy because we speak the truth.
00:25:16.000 We don't care whether that offends anybody.
00:25:19.000 We speak the truth because we want to defeat bad actors.
00:25:28.000 We want to defeat people who are hurting people.
00:25:31.000 Who are hurting society or hurting elections, hurting democracy, hurting health, and we speak the truth no matter what the consequences, which is a very, very crazy, insane, and difficult way to live.
00:25:45.000 And that's what you and I have in common.
00:25:49.000 Let me just read this.
00:25:53.000 And these are some of your findings.
00:25:56.000 SEME demonstrates how biased ephemeral content, such as search results and video recommendations, influences users' decisions.
00:26:03.000 Just one exposure to this type of biased content can shift a person's perspective by 20% to 80%, with repeated exposures raising that potential to 90%.
00:26:15.000 The SSE, which is the search suggestion effect, refers to the impact of the search suggestions in the drop-down menu as an online user is looking up information.
00:26:29.000 Google search suggestions can shift undecided voters' opinions from a 50-50 split to almost 90-10, all without user awareness.
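As a back-of-the-envelope illustration of what those percentages could mean for a race, here is a quick sketch; the electorate size and undecided share are hypothetical round numbers, not figures from the studies being quoted.

```python
# Back-of-the-envelope: a SEME-style shift among undecided voters, turned
# into an overall margin. All inputs are hypothetical round numbers.

electorate = 10_000_000    # hypothetical voting population
undecided  = 0.10          # assume 10% are genuinely persuadable
baseline   = 0.50          # undecideds break 50-50 absent manipulation

for new_split in (0.60, 0.80, 0.90):       # 60-40, 80-20, 90-10 breaks
    moved = electorate * undecided * (new_split - baseline)
    margin_shift = 2 * moved / electorate  # a moved vote changes the margin by 2
    print(f"{new_split:.0%} break: {moved:,.0f} votes moved, "
          f"overall margin shifts {margin_shift:.1%}")
```

Under these assumptions, a 90-10 break among just 10% of voters moves the overall margin by 8 points, which is why a shift confined to undecideds can still decide close races.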
00:26:41.000 That is terrifying because that's really the end of democracy.
00:26:46.000 How does democracy survive that?
00:26:48.000 I don't think really that...
00:26:50.000 If you consider the free and fair election...
00:26:53.000 I mean, the implication is you have one guy, Sergey Brin.
00:26:59.000 It's not even the 800 people who are giving 70% of the donations, which is terrifying.
00:27:05.000 One guy, the head of Google, who can decide elections.
00:27:10.000 Most elections in this country could be decided by that 6%, you know, or more.
00:27:18.000 I think actually you quantify that, but it's like, you know, you're shifting election results.
00:27:28.000 I think you find in here that you can, what is it, that you can shift election results by 6.9% typically or more, right?
00:27:38.000 Well, the margin is somewhere between 4% and 16%.
00:27:43.000 So 4% is the absolute bare minimum that Google alone can shift without anyone knowing, except for, of course, now we have a monitoring system in place.
00:27:54.000 But that 4% is guaranteed.
00:27:57.000 So if you look right now, as I did a little while ago, if you look at the numbers in the swing states right now, the survey data, you'll find, depending on what the poll is, that Trump is ahead by maybe a point or two in three of the seven swing states.
00:28:12.000 Harris is ahead by a point or two in four of the swing states.
00:28:15.000 You know what that means?
00:28:16.000 That means that Harris will win all seven swing states because that margin is well within Google's ability to control.
00:28:27.000 They have absolute control.
00:28:29.000 Any margin, 4% or under, 100% control.
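A sketch of that claim in code; the poll margins below are invented placeholders, not real survey numbers, and the 4% figure is simply the "bare minimum" shift asserted above.

```python
# Sketch of the "any margin at or under the shift is controllable" claim.
# Poll margins here are made-up placeholders, not real survey data.

assumed_shift = 0.04        # the 4% "bare minimum" shift asserted above

polls = {                   # state -> (leader, margin); hypothetical values
    "Swing State A": ("Candidate X", 0.02),
    "Swing State B": ("Candidate Y", 0.01),
    "Swing State C": ("Candidate X", 0.05),
}

for state, (leader, margin) in polls.items():
    verdict = "within" if margin <= assumed_shift else "outside"
    print(f"{state}: {leader} +{margin:.0%} -> {verdict} the assumed shift")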
00:28:33.000 Let me just push back on that because the polling already reflects Google's impact on the outcome, right?
00:28:45.000 Because the attitudes have already been shifted.
00:28:49.000 I would also say that the polling is manipulated by the same forces that are manipulating the news.
00:28:56.000 Wow, you are smart.
00:28:58.000 My God.
00:28:59.000 And I've seen it in action because it's been U.S. policy around the world for a long time in terms of Intelligence monkey business in other countries' elections.
00:29:09.000 You can go back and look even on CIA.gov if you go and look at the special forces and election manipulation handbooks.
00:29:21.000 There are sections on polling that are still entirely redacted from the 70s and the 80s.
00:29:27.000 And those are nestled right next to the guide for taking over underfunded newspapers in order to provide an unidentified source of funding that then places all of the news stories that the United States government wants for the candidate of their choice.
00:29:48.000 So they have always viewed both polling and manipulation of news coverage as the first wave.
00:29:57.000 If that doesn't work, then maybe you have to go in and cement a coup or do some other criminal act overseas.
00:30:05.000 But first and foremost, it's polling and news manipulation. That is correct.
00:30:11.000 I remember you saying to me one time that if you are an intelligence agent who is in charge of making sure the Italian elections come out, that the communist candidate or the left-wing candidate loses the Italian elections, it is spycraft malpractice not to use those tools.
00:30:39.000 You'd be called in by supervisors.
00:30:41.000 How on earth did this election turn out the way that we didn't want it to?
00:30:45.000 And you see that all around the world when, you know, these improbable results happen.
00:30:49.000 But, you know, I remember a DARPA study coming across my desk in probably 2004 when I was, you know, a brand new baby officer who signed up after 9-11 and,
00:31:06.000 you know, I wasn't yet even overseas, thinking that I was doing the same thing as, you know, a Marine, you know, a kid who goes down and joins the Marines after 9-11 because you want to do something good to serve your country.
00:31:21.000 Didn't know what nest of vipers I was walking into.
00:31:25.000 But I remember this study coming across that came out of DARPA, and they put people reading news stories in an fMRI machine.
00:31:35.000 And expected that the frontal areas would light up while they were assessing analytically the reliability of what they were reading.
00:31:44.000 And they did, but they lit up second.
00:31:47.000 And the first area that they describe, it's over the ear, and they described it as the part of your brain that lights up when you hold up a shirt in a store and think about whether or not your friends would make fun of you if you wore it.
00:32:00.000 And that in-group, out-group decision is a split-second decision made at the beginning before you even start your frontal lobe, all of your actual analytical assessment.
00:32:10.000 And so you already know before you dive in whether you're assessing in order to poke holes or you're assessing in order to think it's a brilliant article and share it with your friends.
00:32:20.000 And of course, they were weaponizing that at that time between the Sunni and the Shia community that they were trying to create this split.
00:32:27.000 overseas. But we now see that exact same methodology at work, and I think what Dr. Epstein's describing, especially with autocomplete, right, the subconscious, the subtext of that when you begin to write something is: oh, well, this must be informed by what everybody else is searching and what everybody else is thinking. Exactly. And therefore, I should be thinking it too.
00:32:54.000 And that is an incredibly dangerous road for us to go down as a society.
00:32:59.000 I just want to add one thing to this issue of the polling numbers, and I agree with what you said completely, but I want to add one thing.
00:33:07.000 At any point in time, there's still going to be millions of people who are vulnerable, who haven't completely made up their minds, and who can be pushed, who can be nudged one way or the other.
00:33:19.000 Google has a tremendous advantage over any of us, including a campaign manager, because Google knows exactly who those people are.
00:33:27.000 I mean, that would be every campaign manager's wet dream to know exactly who those people are, but Google knows exactly who they are.
00:33:34.000 And it is using these techniques that we've discovered and quantified over the years.
00:33:39.000 It is using these techniques on these people every day.
00:33:47.000 No matter what the polls say, there are still people out there who can be influenced.
00:33:52.000 Google knows who they are, and they're using these techniques.
00:33:56.000 Now, let me also point out that on election day itself, there are a lot of people who are just too lazy to get off the sofa.
00:34:09.000 In 2012, Facebook and some of my colleagues out here at the University of California, San Diego, published a piece in Nature about what they had done in the 2010 midterms.
00:34:33.000 And they had some very clever ways and they had a control group.
00:34:37.000 They had some clever ways of determining whether that got some more people to vote.
00:34:41.000 They calculated that that vote reminder got 340,000 more people to get off their butts on Election Day.
00:34:50.000 So just keep that in mind, that even on Election Day itself, sending out partisan go vote reminders shifts a lot of votes.
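The arithmetic behind that claim can be sketched directly from the two numbers quoted in this conversation, the 340,000 extra voters from 60 million reminders and the roughly 2.5-to-1 reminder differential mentioned earlier; the partisan audience sizes below are invented for illustration.

```python
# The two numbers quoted in this conversation, worked through: the 2010
# Facebook experiment's turnout lift, applied to a hypothetical 2.5-to-1
# partisan reminder differential. Audience sizes are invented placeholders.

lift = 340_000 / 60_000_000        # ~0.57% of reminded users voted who
                                   # otherwise would have stayed home

dem_reminded = 30_000_000          # hypothetical reminder audience
rep_reminded = dem_reminded / 2.5  # 2.5x fewer reminders to the other party

extra_dem = dem_reminded * lift
extra_rep = rep_reminded * lift
print(f"per-reminder turnout lift: {lift:.2%}")
print(f"net partisan turnout gap:  {extra_dem - extra_rep:,.0f} votes")
```

Even at a lift well under one percent per reminder, the assumed differential produces a six-figure turnout gap in this sketch, which is the point being made about election-day nudges.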
00:35:00.000 And normally, no one would have any way of knowing that they did that.
00:35:06.000 But because we have a national monitoring system in place, we'll know exactly what they're doing and how many votes they're sending to members of each party.
00:35:16.000 We'll know exactly what they're doing.
00:35:18.000 And we're building an archive, which has never been done before, that will allow us to go back in time and look at these manipulations. I'm in touch with members of Congress, with a bunch of AGs, with a couple of parenting groups, some election integrity groups.
00:35:36.000 All of our data are going to be available to all kinds of people who are going to try to use these data in various ways to pressure Google and other companies to stop, to stay away from our elections and stay away from our kids.
00:35:53.000 How do you pressure them?
00:35:55.000 Well, that brings me to an old quote from Justice Louis Brandeis 100 years ago, everyone knows this, which is that sunlight is the best disinfectant.
00:36:05.000 The second half of that quote, no one knows, but it's, and street lamps, the best policeman.
00:36:12.000 That's what he wrote back in, I think, 1917.
00:36:15.000 That's what we're doing.
00:36:16.000 We're making these companies accountable to the public for the first time by monitoring them and exposing them and letting them know that we're doing it.
00:36:27.000 And so we have now preserved, as of a few weeks ago, more than 100 million ephemeral experiences.
00:36:34.000 That's what they call them inside of Google.
00:36:36.000 Ephemeral experiences which are normally lost forever, stored nowhere.
00:36:41.000 Google doesn't store them, but we are storing them.
00:36:46.000 Do they actually deliberately use ephemeral experiences to manipulate?
00:36:52.000 In 2018, there was a leak of emails from Google to the Wall Street Journal, John McKinnon, if you know him.
00:36:59.000 And McKinnon reported these emails involved discussions about Trump's travel ban.
00:37:08.000 And specifically, they were discussing: how can we use ephemeral experiences to change people's views about Trump's travel ban?
00:37:18.000 This is how they do it.
00:37:20.000 And that's what we're preserving and capturing for the first time.
00:37:24.000 I've been approached so far by people now from eight countries who want me to help them build monitoring systems in those countries.
00:37:31.000 And this is one area where I do happen to agree with Trump.
00:37:34.000 No, we do it here first.
00:37:37.000 How does the monitoring system work?
00:37:39.000 Is it opt-in, presumably, and do you have people volunteer to have their screen recorded, or how does it work?
00:37:47.000 I wish we could accept volunteers because it would be so much cheaper to run this, but we can't because in the past when we've called for volunteers of any sort, Google sends us people.
00:38:00.000 They not only have 100,000 employees, they have 120,000 outside consultants that they use for various purposes.
00:38:08.000 And we've had them send us people over and over again.
00:38:11.000 So over the years, we've had to set up very secure systems for recruiting.
00:38:16.000 We recruit registered voters.
00:38:19.000 We vet them.
00:38:20.000 They sign NDAs.
00:38:22.000 We equip them.
00:38:23.000 We train them.
00:38:24.000 This is very expensive.
00:38:26.000 We're now monitoring the content that's coming into their computers.
00:38:30.000 So this is not just ephemeral content, it's personalized.
00:38:34.000 Because remember, everything they send out is personalized.
00:38:36.000 The only way you're going to know what they're actually sending to people is to look over the shoulders, with their permission, of real registered voters, and capture that content and aggregate it and analyze it.
00:38:51.000 And we've gotten better and better and better at doing all of that.
00:38:55.000 And at this moment in time, we are preserving content through the computers of a politically balanced group of more than 16,000 registered voters in all 50 states.
00:39:07.000 And this group gets larger every day because we do not stop recruiting.
00:39:12.000 We have court admissible data, most likely, at least according to Ken Paxton.
00:39:18.000 We have court admissible data now in 21 states.
00:39:22.000 And obviously, we've got to get that number higher as fast as we can.
00:39:26.000 The bigger that number is, the more it pressures these companies to stop what they're doing.
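For readers curious what such a pipeline might look like in miniature, here is a hedged sketch: timestamped ephemeral observations keyed to a hashed agent identity and aggregated centrally. Every name and field is hypothetical; this is not AIBRT's actual software.

```python
# Miniature sketch of a capture-and-aggregate pipeline of the kind
# described: field agents' machines record ephemeral content and ship it,
# stripped of identity, for central aggregation. All names and fields are
# hypothetical, not the project's actual software.

import hashlib
import time
from collections import Counter

def capture_event(agent_secret: str, kind: str, payload: dict) -> dict:
    """Package one ephemeral observation. The agent appears only as a
    one-way hash, so records can be grouped without revealing identity."""
    return {
        "agent": hashlib.sha256(agent_secret.encode()).hexdigest()[:16],
        "ts": time.time(),
        "kind": kind,           # e.g. "search_suggestion", "vote_reminder"
        "payload": payload,     # the observed content; no personal fields
    }

def aggregate(events):
    """Count observations by kind so rate skews become visible."""
    return Counter(e["kind"] for e in events)

events = [
    capture_event("agent-key-1", "vote_reminder", {"shown": True}),
    capture_event("agent-key-2", "search_suggestion",
                  {"typed": "a", "suggested": ["amazon", "amazon prime"]}),
]
print(aggregate(events))
```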
00:39:34.000 Robert or Dr. Epstein?
00:39:36.000 Oh, no, no, Robert's fine.
00:39:38.000 You have the same name, so.
00:39:40.000 You have a Ph.D. after your name and a Dr. in front of it, so those were hard-earned and well-deserved, and I want to respect them.
00:39:50.000 Yeah, but I heard you on Tucker's show, and you made a very good argument for saying that you probably have 10 PhDs, given the expertise you have to get to litigate these cases.
00:40:05.000 So, let's get to this subject, which I think...
00:40:10.000 I didn't show Amaryllis any of the documents you showed me, but you did an analysis of my election.
00:40:17.000 There's nobody who knows more about my polling. Amaryllis has done the biggest polls of this election of any campaign.
00:40:25.000 We've done polls of 26,000 people.
00:40:28.000 Typical polls are 1,000 to 2,200 people.
00:40:31.000 But we've done polls 10 times that size, with extensive cross tabs.
00:40:35.000 And she's read every one of them.
00:40:37.000 She's read every poll that's come out during this and analyzed it.
00:40:42.000 And with her extraordinary encyclopedic computer mind.
00:40:47.000 I want you to tell her for the first time exactly how Google has weaponized searches against me.
00:41:01.000 I am going to tell you, but first I'm going to tell you something that we didn't send you, just for comparison's sake.
00:41:10.000 We generated a report recently about Elizabeth Warren to see what kind of information Google is sending to voters about her.
00:41:18.000 Now, she is a, I would say, extreme liberal person.
00:41:25.000 You know what they're sending to people who search for information about Elizabeth Warren?
00:41:30.000 They are sending extreme conservative content that absolutely trashes Elizabeth Warren.
00:41:38.000 Why would they be doing that?
00:41:40.000 Because Elizabeth Warren has gone on record calling for the breakup of Google.
00:41:45.000 They want her out.
00:41:48.000 Now with you, they're sending content that...
00:41:55.000 On a graph, it looks very blue.
00:41:57.000 In other words, it's liberally biased, but that's just a graph.
00:42:01.000 But if you look at the actual content that they're sending people to, if people click on those high-ranking links in their search results, you end up with, I'm not going to say them, but if you want to share them, that's fine.
00:42:14.000 They are sending people to one piece after another, one website after another, one article after another, one video after another, that makes you look like the devil.
00:42:26.000 Yeah.
00:42:27.000 Well, I mean, Bobby poses perhaps the gravest threat to their business model that has come along since, you know, its instantiation.
00:42:36.000 I mean, banning pharmaceutical ads on day one, and that alone is probably the number two source of ad revenue for them, in addition to, you know, all of their defense contracts and so on.
00:42:49.000 So, yeah.
00:42:50.000 Even outside of their doing the intelligence agencies' bidding, which we know to be the case, just from a business case point of view, we knew right from the outset that we had a pretty serious foe in Google.
00:43:05.000 And I mean, across the board, they wouldn't let us claim our Google knowledge panel, which anybody, you know, if anyone's done it here, it's very straightforward.
00:43:12.000 You just, you know, prove your identity by putting in a password.
00:43:16.000 For one of your social accounts, and it pops right up.
00:43:19.000 In this case, it was just the old 404 error every time, but only for Bobby.
00:43:24.000 And we escalated it all the way up the chain there.
00:43:27.000 And they kept telling us, oh, that's terrible.
00:43:30.000 We'll get right back to you.
00:43:31.000 Right up to their C-suite and never solved that issue.
00:43:37.000 And of course, it brings Wikipedia right up to the top.
00:43:39.000 And, you know, we've gone deep with Larry Sanger on the censorship and intelligence control of Wikipedia.
00:43:47.000 So we're right there with you.
00:43:50.000 We understand Google's stake in this for sure.
00:43:54.000 Another thing that's, by the way, I think is quite interesting in that report we sent you is that they are sending that attack content, that liberally biased content, not just to liberals and not just to moderates, but to conservatives as well.
00:44:10.000 So that doesn't happen for everyone.
00:44:12.000 They don't do this for everyone.
00:44:13.000 You should be greatly honored that they're slaughtering you across the board with people of every political persuasion.
00:44:22.000 But the fact is that without a monitoring system, you see, no one would ever know.
00:44:29.000 And we know that when you bias search results and search suggestions in that way, that easily shifts 20 to 80% of undecided voters.
00:44:38.000 And even on election day itself, there are still undecided voters.
00:44:44.000 And they know who they are, and they're using these techniques to influence them.
00:44:49.000 Again, without a monitoring system, you don't know any of this.
00:44:55.000 Nothing.
00:44:56.000 Especially those targeted messages on their homepage.
00:45:00.000 You don't know anything that they're doing.
00:45:05.000 We're also looking at the recommendations that they make on YouTube, and it tells the same story.
00:45:12.000 But what's interesting here is that people are not aware that 70% of the videos people watch on YouTube, this is according to Google itself and outside researchers too, but Google has admitted that 70% of the content people watch on YouTube has been recommended by their algorithm.
00:45:35.000 For children, it's 80 to 90%.
00:45:37.000 Why is that?
00:45:39.000 Well, partly because if you don't pick something, they automatically play the Up Next video.
00:45:45.000 So they are determining most of the content around the world that people are watching on YouTube.
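A toy simulation shows how a default autoplay can push the algorithm-chosen share of viewing that high; the probability that a viewer actively picks a video is an assumed placeholder, not a measured figure.

```python
# Toy autoplay simulation: if a viewer only actively chooses a video some
# of the time, the Up Next default fills the rest, and the algorithm ends
# up picking most of what gets watched. The pick probability is an assumed
# placeholder, not a measured figure.

import random

random.seed(0)

def session(videos=10, p_user_picks=0.3):
    """Fraction of videos in one session chosen by the algorithm."""
    algo = sum(1 for _ in range(videos) if random.random() > p_user_picks)
    return algo / videos

runs = [session() for _ in range(10_000)]
print(f"algorithm-chosen share ~ {sum(runs) / len(runs):.0%}")   # about 70%
```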
00:45:53.000 And guess what, Mr. Kennedy?
00:45:58.000 They're slaughtering you on YouTube as well.
00:46:01.000 Again, no monitoring systems in place.
00:46:04.000 You don't know anything that they're doing.
00:46:07.000 These people, by the way, are very determined.
00:46:11.000 Google more than the other companies because Google has a very, very strong progressive left culture.
00:46:19.000 And it's actually now more than 96% of the donations from their employees go to Democrats.
00:46:28.000 Now, I lean that way myself, so I should be applauding them, but I don't.
00:46:33.000 Because I love this country, and I love the system of government that was put in place by our founding fathers more than I love any particular party or candidate.
00:46:44.000 So, unless you've got other sort of evidence that they used against me, which I'm very curious about, let's talk about how you fix this.
00:46:58.000 Okay, well...
00:47:01.000 There are really only three ways, and I can tell you that what the government's been doing, which is antitrust actions—that's the DOJ, that's Congress, and the AGs—will not solve the problem at all.
00:47:18.000 In fact, I happen to have been working with AGs long enough, so I actually saw Google's attorneys pushing our authorities away from consumer protection issues and pushing them toward antitrust issues.
00:47:35.000 They have basically gotten control over everybody.
00:47:42.000 It's DOJ, it's the AGs, and Congress, and they're all going after Google with antitrust actions.
00:47:50.000 Antitrust actions will not solve the big problems.
00:47:53.000 The big problems are surveillance, censorship, and manipulation.
00:47:58.000 Those are the three big problems.
00:48:00.000 Antitrust has to do with monopoly issues, which might force them to sell off a company or two.
00:48:05.000 Big deal.
00:48:07.000 That Google knows very well that no one will ever force them to break up the Google search engine because it won't work.
00:48:14.000 And Facebook knows that no one will ever force them to break up the social media platform because that would be like putting the Berlin Wall through the middle of every household.
00:48:23.000 You can't do that.
00:48:24.000 So these companies know that antitrust actions are not a threat to them.
00:48:29.000 So what are the threats?
00:48:30.000 Well, one, which isn't going to happen, is that you could make the surveillance business model, which Google invented, which turns us all into products that they sell to vendors, you could make that model illegal.
00:48:44.000 Tim Cook, the head of Apple, has suggested this.
00:48:47.000 He thinks that model should not be allowed.
00:48:49.000 He doesn't think it's a legitimate business model.
00:48:51.000 It's fundamentally deceptive.
00:48:53.000 Is that going to happen?
00:48:54.000 No.
00:48:55.000 Number two could happen.
00:48:57.000 I published this in Bloomberg Businessweek the day before I first testified before Congress.
00:49:03.000 Number two is this.
00:49:05.000 We could declare, our Congress could declare, or a regulatory agency could declare their index, which is the database they use to generate search results.
00:49:16.000 We could declare that to be a public commons.
00:49:19.000 There's ample precedent for this in law.
00:49:23.000 Governments have been doing this for hundreds of years, because once some commodity or service becomes essential, then there's room for abuse, and that includes air and water and gasoline and telephone communications.
00:49:38.000 Once it becomes essential, the government at some point always steps in and either takes charge or regulates.
00:49:48.000 In this case, if our government declared their index to be a public commons, then everyone could build search engines using their data and you'd end up with thousands of search engines competing for our attention, which would be just like the news environment.
00:50:06.000 There would be competition and innovation.
00:50:09.000 Because of Google's monopoly, there has been no competition and no innovation in search for 20 years now.
00:50:16.000 Could that be done?
00:50:17.000 Yes, it could.
00:50:18.000 Actually, Ted Cruz and I have talked about this at length.
00:50:22.000 Will it be done in the US? We're kind of dysfunctional.
00:50:26.000 That's the problem.
00:50:27.000 But the EU could do it.
00:50:29.000 And I've spoken in Brussels.
00:50:31.000 There are people quite interested in this possibility.
00:50:34.000 And if it's done in the EU, it would affect Google worldwide.
00:50:37.000 It would end their worldwide monopoly on search.
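As a sketch of what "index as public commons" would enable, here is a toy version: one shared index, two independently written rankers producing different result orders from it. The index and scoring rules are stand-ins, not Google's actual systems.

```python
# Toy version of "index as public commons": one shared index, independently
# written rankers competing on top of it. Index and scoring are stand-ins.

shared_index = {             # hypothetical public index: doc -> term counts
    "doc1": {"election": 3, "google": 1},
    "doc2": {"election": 1, "smallpox": 4},
    "doc3": {"google": 5, "monopoly": 2},
}

def rank_by_frequency(query, index):
    """Engine A: order documents by raw query-term frequency."""
    return sorted(index, key=lambda d: -index[d].get(query, 0))

def rank_by_presence(query, index):
    """Engine B: documents containing the term first, then by id."""
    return sorted(index, key=lambda d: (query not in index[d], d))

for engine in (rank_by_frequency, rank_by_presence):
    print(engine.__name__, engine("google", shared_index))
```

The two engines return different orderings from identical data, which is the competition-and-innovation point: the index is shared, the ranking is where engines would differentiate.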
00:50:40.000 And finally, we get to number three.
00:50:43.000 Number three is monitoring.
00:50:45.000 No one can stop monitoring.
00:50:47.000 And monitoring...
00:50:50.000 Monitoring makes them accountable because it gives us a growing database that allows us to look back in time and see what shenanigans they were using.
00:51:00.000 And that can be used in court and it can also be used just to put public pressure on these companies.
00:51:07.000 Monitoring, in my opinion, is not optional.
00:51:10.000 Monitoring is essential.
00:51:15.000 Because in the EU, for example, where they have passed some pretty strong laws and put very strong regulations into place, which you've talked about, I've heard you talk about them.
00:51:26.000 Vestager is the woman who spearheaded that movement in Europe.
00:51:30.000 She has acknowledged recently that these companies are not complying as far as they can tell.
00:51:36.000 The fact is, unless you have monitoring systems in place, you can't measure compliance.
00:51:42.000 You don't know if there's compliance without monitoring.
00:51:46.000 You must have monitoring in place.
00:51:48.000 You must capture all that ephemeral content and analyze it rapidly.
00:51:52.000 So those are the three solutions.
00:51:54.000 The one that I've been working on, and we spent $7 million building this nationwide system, obviously, is the third one, which is monitoring.
00:52:03.000 And it's powerful.
00:52:05.000 I can tell you that...
00:52:09.000 A couple days after the November 2020 election, I sent a lot of our data to Senator Cruz's office, and on November 5, 2020, Cruz and two other senators sent a very threatening letter to Sundar Pichai, the CEO of Google, saying, you testified under oath that you don't interfere with elections.
00:52:34.000 How do you explain Epstein's data?
00:52:36.000 And then it's a two-page letter summarizing our findings in 2020.
00:52:40.000 In that election, by the way, Google shifted more than 6 million votes to Joe Biden.
00:52:46.000 So, you know, Pichai did not reply right away, but on that same day, they turned off all of their manipulations in Georgia, which was gearing up for two Senate runoff elections in January.
00:53:00.000 How do we know that?
00:53:01.000 Because we had more than a thousand observers.
00:53:03.000 We call them field agents.
00:53:05.000 We had more than a thousand observers throughout Georgia.
00:53:08.000 We were monitoring big tech content through the computers of more than a thousand people in Georgia.
00:53:14.000 And we saw Google turn off all their manipulations.
00:53:18.000 The partisan go vote reminders.
00:53:21.000 The bias in search results went to zero.
00:53:25.000 Absolutely flat.
00:53:27.000 We had never seen that before.
00:53:29.000 They literally just pulled out from Georgia.
00:53:35.000 I mean, the interesting thing to me is that all three of those, and it's incredible work that you're doing, seem to me to not work without the other.
00:53:44.000 I don't think they're discrete options.
00:53:46.000 I think that really, I'm a huge fan of free markets.
00:53:52.000 I know that Bobby is, it sounds like you are as well, certainly free markets of ideas, of the right to know, to be an informed consumer, which certainly applies to news and to information.
00:54:04.000 But even when you know, right, with the monitoring capabilities that you're talking about, and I think more and more people do know, you know, they use DuckDuckGo and so forth, but it is a pain in the butt, you know, because of monopolistic practices.
00:54:24.000 And we saw this with the Apple and the Google judgment, right?
00:54:27.000 When you buy a phone and you have a pre-built-in search engine, even if you know, based on this monitoring, that, you know, what you're seeing is hopelessly filtered and biased.
00:54:39.000 It is the kind of free ice cream problem where it's just a lot easier to operate with the pre-installed option rather than add your own.
00:54:50.000 And I think the more, I mean, the middle option, the publication of the index, you know, there's Common Crawl, for example, and I don't know if you've used Common Crawl at all, but I mean, it doesn't have as many webpages as Google, but I think it's indexed, I don't know, I want to say a couple hundred billion pages, and it's a public commons.
00:55:08.000 But as long as you have that monopolistic practice, even really excellent alternatives are hard to come by.
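Common Crawl does expose a public per-crawl CDX index over HTTP; here is a minimal lookup sketch. The crawl label is an example and should be checked against the current list at index.commoncrawl.org.

```python
# Minimal Common Crawl index lookup. Each crawl exposes a public CDX index
# over HTTP; the crawl label below is an example and should be replaced
# with a current one listed at index.commoncrawl.org.

import json
import urllib.request

crawl = "CC-MAIN-2024-33"     # example crawl label; check the site
url = (f"https://index.commoncrawl.org/{crawl}-index"
       "?url=example.com&output=json")

with urllib.request.urlopen(url) as resp:
    for line in resp.read().decode().splitlines()[:3]:
        record = json.loads(line)       # one JSON object per capture
        print(record.get("url"), record.get("status"), record.get("mime"))
```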
00:55:17.000 And the solution that I really love is similar to what Jack Dorsey has been talking about, which is giving consumers free market choice between the algorithms that they can use.
00:55:31.000 Because ultimately, consumers will test out, you know, if Google, for example, were... you know, the corollary to the right to know is the corresponding duty to disclose.
00:55:51.000 And if you disclose the algorithm that you are using as your default and you open up an API where anybody can take the index as an API and add their own kind of algorithm, consumers ultimately will choose the one that works best for them.
00:56:09.000 And by messing around the same way that you would try different filters on an Instagram photograph, you try different algorithms on your X feed or on your Google News results, and pretty quickly you're gonna wise up to the one that gives you the sharpest and most complete view of the world.
00:56:28.000 And I think giving people that kind of free market choice is actually a more efficient enforcer than any kind of government oversight could be, and less prone to being infiltrated by censors or government leaders that want to control these algorithms for their own purposes.
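A minimal sketch of that kind of algorithm choice: the platform supplies the content pool, the user picks which ranking function is applied. The posts and scoring rules are invented placeholders.

```python
# Sketch of user-selectable ranking: the platform supplies the pool, the
# user chooses the ordering function. Posts and rules are placeholders.

feed = [                      # (post_id, age_hours, engagement_score)
    ("post-a", 1, 10),
    ("post-b", 30, 500),
    ("post-c", 5, 60),
]

algorithms = {
    "chronological": lambda p: p[1],             # newest first
    "engagement":    lambda p: -p[2],            # most-engaged first
    "balanced":      lambda p: p[1] - p[2] / 50, # trade recency vs. buzz
}

user_choice = "chronological"                    # chosen by the user
for post in sorted(feed, key=algorithms[user_choice]):
    print(post)
```

Swapping `user_choice` reorders the same pool, which is the disclosure-plus-choice idea: the default is no longer the only lens.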
00:56:52.000 Well, Google's not going to give people access to their data voluntarily.
00:56:57.000 That would have to require some sort of governmental action.
00:57:01.000 Some interpretation of right to know.
00:57:03.000 Right.
00:57:04.000 And that is not, you know, there are legal scholars that propose this, and it's not.
00:57:09.000 President Kennedy was one of the initial proposers of the Consumer Bill of Rights.
00:57:14.000 And in there, he talks about how the marketplace has been flooded with all this new technology, new pharmaceutical products, all kinds of things that I think in his words, he talks about forcing parents or the homemaker to become an amateur scientist and chemist and technologist to know what they should and shouldn't buy.
00:57:38.000 And he was, of course, urging for truth in advertising and for consumer choice.
00:57:44.000 And, you know, we see that with FOIA.
00:57:46.000 We see it with even some of our international obligations under, you know, the UN Declaration of Human Rights and so on.
00:57:53.000 And we've never applied it this way.
00:57:56.000 But in the end, you know, the position that Google and others take, that their algorithms are trade secrets and should be subject to those protections: there are established limitations on the protections for trade secrets, especially when it comes to, you know...
00:58:17.000 And I would really like to see that be taken up, because if, you know, when you began using a given search engine or a given social media platform, you in your setup process and at any time thereafter could play around with different algorithms and choose the one that you found to be, you know, most productive.
00:58:42.000 And of course, one of the challenges that we haven't even talked about is that that's all heuristics based, right?
00:58:48.000 This is almost like it's almost passe this conversation because pretty soon it will be AI driven.
00:58:55.000 Some of it already is.
00:58:56.000 Let's get to AI then, because we're starting to monitor AI content now as well, which is going to become more and more important over time.
00:59:05.000 And this is just, for me, just a...
00:59:09.000 It's like...
00:59:13.000 It's like a tip of an iceberg, because there's so much out there that can be a threat to humanity, not just to democracy, but to human autonomy and especially to the minds of children, that we have to be able to look at it. So we're starting to monitor AI content.
00:59:39.000 And of course, AI content could at some point in the very near future become a serious threat to our existence.
00:59:45.000 Stephen Hawking warned about that.
00:59:47.000 Elon has warned about that at times.
00:59:49.000 But a monitoring system could be the first line of defense.
00:59:54.000 It could be an early detection system for AIs that are posing a threat.
00:59:59.000 So there's another use for monitoring.
01:00:02.000 We also have designed and built equipment.
01:00:05.000 We don't have the money to distribute it yet, but we've designed and built equipment that will allow us to monitor, with the permission of the users, the answers that they're getting from intelligent personal assistants like Siri and Alexa and Google Home and so on, the Google Assistant.
01:00:25.000 And so we're very close to being able to monitor and analyze that content as well.
01:00:30.000 I should emphasize here that when we
01:00:35.000 transmit content from someone's home or someone's device to our computers,
01:00:41.000 We do that without any identifying information at all.
01:00:46.000 We preserve people's privacy.
01:00:48.000 This is the exact opposite of what the big tech companies do.
01:00:52.000 They're always transferring content with your name on it so they can add it to your profile and they can sell you more things and they can manipulate you more effectively.
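[A hypothetical sketch of that "transmit content, not identity" design follows. The field names and endpoint are illustrative assumptions, not the project's actual schema; the point is what gets dropped before anything leaves the device.]

```python
# A hypothetical sketch of "transmit content, not identity". Field names
# and the endpoint are illustrative assumptions, not the project's actual
# schema; the point is what gets dropped before anything leaves the device.
import json
import secrets
import urllib.request

def anonymize(record: dict) -> dict:
    # Keep only the ephemeral content plus a random, unlinkable batch id.
    return {
        "batch_id": secrets.token_hex(8),  # fresh every time; never reused
        "platform": record["platform"],
        "query": record["query"],
        "results": record["results"],      # the ephemeral content itself
        # deliberately omitted: name, email, account ids, IP, device ids
    }

def submit(record: dict, endpoint: str) -> None:
    payload = json.dumps(anonymize(record)).encode("utf-8")
    req = urllib.request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```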
01:01:04.000 You know, I love everything that you're talking about, and my mind is kind of racing, thinking of solutions.
01:01:13.000 And, you know, maybe the government should have a ministry, or a new cabinet agency, the Department of Democracy, that monitors all of the undemocratic impacts of the internet companies.
01:01:36.000 And, you know, we were told at the beginning, at the dawn of the internet age, that the internet was going to democratize the world, democratize the spread of information, and it had that potential, and it's accomplished that to some extent.
01:01:57.000 But the most odious effect and the most dominant effect has been that it's become a tool for totalitarianism.
01:02:05.000 Correct.
01:02:06.000 And for this merger of state and corporate power to monetize information and to deploy surveillance and control through these systems.
01:02:17.000 You know, should we have a cabinet post that is constantly monitoring the impacts of technology on democracy and taking steps to expose that to the public, to shine the sterilizing light of sunshine and the policing effect of lamplight?
01:02:41.000 Who monitors them?
01:02:43.000 Exactly.
01:02:45.000 Who monitors them?
01:02:46.000 Who monitors them?
01:02:47.000 That's the fear, right?
01:02:49.000 The Ministry of Democracy could get Orwellian pretty quickly.
01:02:53.000 That's why I like the free market.
01:02:55.000 Sure.
01:02:55.000 But look at Fauci.
01:02:57.000 I mean, that's a perfect example of the kind of regulatory capture that you describe in detail in your book.
01:03:03.000 I don't know that we can really trust the government exactly.
01:03:07.000 Administrations, thank goodness, every now and then change, but I don't know if I would trust the government.
01:03:13.000 I think we need...
01:03:17.000 We need big things to happen.
01:03:19.000 And at one point, Ted Cruz invited me for a private dinner to discuss these issues because the man's very smart, obviously, and he wants to solve these problems, these big tech problems.
01:03:30.000 We talked for almost four hours straight.
01:03:32.000 We never talked politics.
01:03:33.000 That wouldn't have worked at all.
01:03:35.000 But the point is, we talked about tech.
01:03:37.000 And at the end of it, this poor guy, he actually said aloud, he said, the problem is that we'll never get bipartisan support to do anything against these companies.
01:03:49.000 He said that the Democrats control them because of the huge donations, because they helped get them into office.
01:03:56.000 He said, and Republicans don't like regulation.
01:03:59.000 He said, I think we're stuck.
01:04:02.000 How do we get over that hurdle?
01:04:05.000 The government doesn't have the tech capability.
01:04:08.000 I mean, the iterative cycle in Silicon Valley is just so fast, and so many of these lawmakers are, you know, their kids or grandkids are helping them power cycle their laptop.
01:04:21.000 I mean, I don't know if you guys have ever played around with ChatGPT where you put in certain prompts.
01:04:26.000 Now there are different cheat codes, but there used to be a cheat code that was called do anything mode.
01:04:33.000 And there are a bunch of these, you know, people can go and look online, but there are certain prompts that you could put in there that remove some of the pre-trained parameters.
01:04:44.000 And when you ask ChatGPT to produce its results in parallel, one in its normal state and one with those parameters removed, it's almost like carving David from the marble, where you can see the areas that have been massaged. It underscores them.
01:05:09.000 It makes you think twice about them.
01:05:11.000 You wonder why they made that change, and it actually kind of puts a glaring highlighter over them.
01:05:15.000 And when you have that kind of comparison, I think it does a lot more to inform the public than having no comparison at all.
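[The mechanics of that side-by-side comparison are easy to sketch. The fragment below is a rough sketch rather than anyone's actual tooling: it asks one model the same question under two different system prompts and diffs the answers. The model name is only an example, and no "cheat code" prompt is reproduced here.]

```python
# A rough sketch of a side-by-side comparison: the same question under two
# system prompts, with the differences diffed. No "cheat code" prompt is
# reproduced here; the model name is only an example.
import difflib

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(system_prompt: str, question: str) -> str:
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content

question = "Summarize the main criticisms of large search engines."
default = ask("You are a helpful assistant.", question)
blunt = ask("You are a blunt analyst. Do not hedge or balance your answer.",
            question)

# The diff puts a highlighter over whatever the two framings disagree on.
for line in difflib.unified_diff(default.splitlines(), blunt.splitlines(),
                                 fromfile="default", tofile="blunt",
                                 lineterm=""):
    print(line)
```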
01:05:27.000 So that's why I like being able to choose between the models, because then you see the default.
01:05:32.000 I remember as a young person going to Burma for the first time, in 1999, and it was still under very tight military control, which sadly it now is again, but this was prior to the moment of democracy there.
01:05:54.000 And they had a newspaper called The New Light of Myanmar that was just the most preposterous propaganda you've ever read in your entire life.
01:06:03.000 And I remember thinking, what is the point of even printing this?
01:06:07.000 It's so preposterous.
01:06:08.000 How could anybody read it and believe it?
01:06:11.000 And we're almost there with a lot of what you read in the kind of Overton window Google results at this stage.
01:06:20.000 And when it's all that people are exposed to, there's no point of comparison.
01:06:26.000 And it doesn't seem preposterous in the same way that the New Light of Myanmar didn't seem preposterous to the people who had only ever read that as their news source.
01:06:34.000 So, to me, as far as we can pursue Jack's proposal, and by the way, what you're doing, Dr.
Epstein, would be absolutely critical to it, right?
01:06:47.000 Because you need to be able to see with monitoring what the results are across different...
01:06:55.000 Well, now I'll give you some information that might make you think twice about what you just said.
01:07:12.000 When we did our first nationwide study on SEME, the search engine manipulation effect, there were more than 2,000 people in the study in all 50 states.
01:07:21.000 We had masked our manipulation, which Google does as well, so that very few people would recognize that there was any bias in the content they were seeing.
01:07:31.000 But there were so many people in this study that there were a couple hundred who did suspect there was bias.
01:07:39.000 Now, here's the problem.
01:07:44.000 They like it.
01:07:45.000 The people who could spot the bias shifted even farther in the direction of the bias than people who could not see the bias.
01:07:54.000 But that's okay.
01:07:55.000 It's okay to choose to be...
01:07:58.000 No, it's not.
01:07:58.000 No, I mean, look, people every day tune into MSNBC and they tune into Fox News.
01:08:01.000 They don't think they're watching actual news.
01:08:04.000 And yes, those are problematic.
01:08:07.000 But we live in a country of free speech.
01:08:10.000 We can't have a situation where there's some top-down authority saying, this is biased, this isn't, because inevitably that itself becomes biased.
01:08:19.000 So you have to give people the ability to choose a stupid algorithm that's going to make them less intelligent and have access to less information, so long as they can compare it with another.
01:08:30.000 Here they're making the decisions for the wrong reason, because they're seeing content which is coming from a computer, and people mistakenly believe that computer content is inherently unbiased and objective.
01:08:46.000 Right, but they won't if they see multiple options.
01:08:50.000 If you say, here would be your news today on Google, and in the same way that you're choosing a filter on Instagram, you could go through and you say, you know, chronological order only is one option.
01:09:03.000 You know, MSNBC, left-leaning bias.
01:09:07.000 Fox News, right-leaning bias.
01:09:08.000 I only care about pop culture bias, etc.
01:09:12.000 And those should be open source, so that people can create some of their favorites, and other people can contribute theirs.
01:09:21.000 But once you see the same day's news through 20 different algorithms...
01:09:25.000 I think it will eat away at that idea that because it's produced by a computer, it's trustworthy because you've just seen all of these different iterations of the same set of news all produced by a computer.
01:09:38.000 I would agree with Amaryllis.
01:09:41.000 I see what you're saying, which is that it's absolutely hopeless.
01:09:45.000 Now I look at you as a very nihilistic person, Dr.
01:09:49.000 Epstein, even though you're appropriately skeptical but also really depressing.
01:09:57.000 You too.
01:10:02.000 Jack Dorsey, I had dinner with him a couple of times here at the house, and I said, what's the solution?
01:10:08.000 He said essentially what Amaryllis is saying: you let everybody choose, you make the algorithms transparent.
01:10:16.000 And right now, if you're a Republican, you're living next door to a Democrat.
01:10:22.000 A lot of it's just financially driven for Google because the algorithms they've trained are algorithms that keep people's eyeballs on the site the longest.
01:10:34.000 And as it turns out, people will stay on the site longer if they're getting information that fortifies their existing worldview.
01:10:43.000 If you're a Republican living next to a Democrat and you both simultaneously ask the same question, you know, who is Robert Kennedy, you're going to get different search results, depending on what the algorithm decides is going to fortify each of your views of who Robert Kennedy is.
01:11:02.000 And so at least if you can choose and say, Give me a Republican bias.
01:11:09.000 That's what I want to hear.
01:11:11.000 Or give me a Democratic bias.
01:11:13.000 That's what I want to hear.
01:11:14.000 And Google is forced to tell you you're getting the Democratic bias.
01:11:17.000 You're getting the Republican.
01:11:17.000 At least people know.
01:11:19.000 And, you know, there's some real choice left there, not just the illusion of choice.
01:11:24.000 Let me ask you this, because we got five minutes.
01:11:27.000 We got to wrap up.
01:11:29.000 I just want to make observations about two things that I saw during this election that were, you know, that struck me.
01:11:37.000 One is that, at the beginning of the election, after I gave my initial speeches and stuff, I had popularity ratings that were beating President Trump and President Biden.
01:11:50.000 I had polling, national polling ratings in three-way races.
01:11:54.000 That put me between 20 and 27%.
01:11:57.000 I remember.
01:11:58.000 And then we ran really a perfect campaign other than the exposure, periodic exposure of my personal foibles and my colorful life.
01:12:08.000 But we were doing everything right.
01:12:12.000 Our engagement, social media engagement, was better than any of the other campaigns.
01:12:20.000 We were being outspent, in some cases, 1,000 to 1, and we were holding our own.
01:12:29.000 We were getting bigger crowds than anybody but Trump, and more enthusiasm.
01:12:33.000 We had 100,000 volunteers, more than any other campaign.
01:12:37.000 So by many metrics, we were extraordinarily successful, but we saw this steady decline in my polling numbers down to 5% to 8% at the end.
01:12:48.000 And, you know, it was clear to me that this wall of bad media was eroding people's support.
01:12:58.000 That's one phenomenon that I witnessed.
01:13:01.000 And then the other really pronounced phenomenon was this transformation that happened with Kamala Harris when, you know, in one week, she went from somebody who was the least popular person in the Democratic Party.
01:13:19.000 She was absolutely disdained by Democrats.
01:13:23.000 She could not get a lunch reservation anywhere in the country.
01:13:28.000 People were just talking about ways to get rid of her.
01:13:32.000 And then two weeks later, she was a mix between Beyonce and Jesus Christ.
01:13:39.000 That's right.
01:13:40.000 And Democrats adored her and nothing had changed.
01:13:44.000 She didn't give a single interview for 35 days.
01:13:47.000 Oh no, that's not true.
01:13:49.000 Something did change because we measured it.
01:13:51.000 Because of what we saw in our Kamala Harris report, going from February through July. You see this slope I'm describing?
01:14:01.000 What we saw is that Google and to a lesser extent some other tech companies were basically ignoring her in February and all of a sudden they're boosting her to the top of search, which has a tremendous impact because it shifts people's thinking and opinions.
01:14:28.000 And that in turn boosts her more.
01:14:31.000 It's a synergistic phenomenon.
01:14:33.000 And that's what happened, that there was an explosion of support literally engineered by algorithms.
01:14:42.000 We have the data.
01:14:43.000 We have the evidence.
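[The dynamic being described, placement boosts opinion and opinion feeds back into placement, can be illustrated with a toy simulation. This is not the report's model; the parameters below are arbitrary, chosen only to show how a small initial boost compounds.]

```python
# A toy illustration, not the report's model: search prominence nudges
# opinion, and rising opinion feeds back into prominence. The parameters
# are arbitrary, chosen only to show how a small boost compounds.
support = 0.10     # initial share of favorable opinion
visibility = 0.10  # initial prominence in search results
NUDGE = 0.3        # how strongly visibility shifts opinion
FEEDBACK = 0.5     # how strongly opinion raises visibility

for week in range(1, 11):
    support += NUDGE * visibility * (1.0 - support)    # bounded growth
    visibility += FEEDBACK * (support - visibility)    # prominence chases support
    print(f"week {week:2d}: support={support:.2f}, visibility={visibility:.2f}")
```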
01:14:45.000 I mean, even if you searched for your name, Bobby, or Donald Trump's name, the results would come up.
with Kamala Harris plus Bobby Kennedy or Kamala Harris plus Donald Trump, whereas if you search for Kamala Harris, it will come up just for Kamala Harris.
01:14:59.000 So those shifts alone, you know... look, ultimately, I say this to people all the time: every single thing you know about this election, unless you were in the room, I mean, basically, unless you're you, Bobby, every single other person who's listening to this, everything you know about this election, you read, saw, or heard on a platform that is owned by someone with a financial interest in the outcome of this election.
01:15:28.000 And again, I would just urge you, I know you don't have it, Amaryllis, but I would urge you to look at the report that we sent to Bobby, because what you'll see is the enormous bias, but also the particular news stories that people are being shifted to, and they literally turned you overnight into a kind of a devil.
01:15:54.000 And they really do have that power and they're not even slightly hesitant about using that power.
01:16:02.000 These people are unbelievably arrogant.
01:16:05.000 I mean they give new meaning to the word hubris.
01:16:08.000 So that's the problem that we face.
01:16:11.000 Again, I'm doing all I know how to do and all I've figured out how to do.
01:16:15.000 We're studying these phenomena and we're building a bigger and bigger and more powerful and faster monitoring system.
01:16:22.000 And we know that this system can be used to pressure them both in the courts and outside the courts.
01:16:29.000 And I'm just going to stick with that because I know it works.
01:16:32.000 It's not all that needs to be done.
01:16:35.000 But it's important.
01:16:36.000 And can I just give out a couple of websites?
01:16:38.000 I know you have to run.
01:16:39.000 I was just going to ask you, where can people play with this data or see your work?
01:16:44.000 Well, first and foremost, please go to AmericasDigitalShield.com because we built a public dashboard where you can see the data coming in and you can see the bias.
01:16:56.000 We just added Reddit as a new platform and we keep expanding this system and making it bigger, better, faster.
01:17:05.000 AmericasDigitalShield.com.
01:17:06.000 Now, our field agents, as we call them, are our watchdogs.
01:17:11.000 We pay them a token fee of $25 a month, just like the Nielsen company pays its families to monitor their television watching.
01:17:18.000 It's a token fee, but think about that.
01:17:20.000 Once you've got 10,000 of them, and we have a lot more than that,
01:17:24.000 that's $250,000 a month in expenses.
01:17:28.000 So please click on the button that says, sponsor a watchdog.
01:17:32.000 And you can sponsor one of our people with a donation of $25 a month.
01:17:37.000 We have so far close to a thousand people who've stepped up and are doing this.
01:17:41.000 The only way to make this economically feasible long-term is if tens of thousands of Americans sponsor watchdogs.
01:17:48.000 We can't take you as volunteers, unfortunately, because then Google sends us people.
01:17:52.000 But another place you can go is MyPrivacyTips.com.
01:17:58.000 So if you want to figure out how to protect your privacy and the privacy of your family members, go to my article at MyPrivacyTips.com.
01:18:06.000 It begins with the sentence: I have not received a targeted ad on my computer or mobile phone since 2014.
01:18:14.000 And that's me talking.
01:18:15.000 So there are other ways to use technology that most people are not aware of.
01:18:20.000 It's not that hard.
01:18:22.000 MyPrivacyTips.com.
01:18:23.000 And last but not least, TechWatchProject.org.
01:18:28.000 TechWatchProject.org.
01:18:29.000 And you can read all about basically the organization that we've set up to build these monitoring systems.
01:18:36.000 And these are very brave people who have been doing this work.
01:18:40.000 They're wonderful, brave, smart, interesting people.
01:18:44.000 I wish we could take volunteers, but we can't.
01:18:47.000 But you can help us in other ways.
01:18:49.000 And so please check out these links.
01:18:52.000 I would appreciate it.
01:18:53.000 Dr.
01:18:54.000 Robert Epstein, thank you so much for joining us.
01:18:57.000 And Amaryllis, thank you for co-hosting this podcast.
01:19:00.000 Thanks for having me, Bobby.
01:19:02.000 Always so fun.
01:19:03.000 And thank you, Dr.
01:19:04.000 Epstein, for this work.
01:19:05.000 It could not be more important.
01:19:07.000 It's existential for our democracy.
01:19:10.000 Thank you both.
01:19:11.000 I'm deeply honored that you had me on this show.