Dr. Robert Epstein is a pioneer in the study of new forms of manipulation that have been made possible by the Internet. He has testified twice before Congress about his research in this area, and he's also developed the world's first large-scale system for preserving and analyzing the ephemeral content Big Tech companies use to manipulate elections, children, and adults. His new congressional testimony is accessible at https://2023epsteintestimony.com, and a dashboard that summarizes the data his new monitoring system is collecting can be viewed at https://americasdigitalshield.com. Co-hosting this episode is Amaryllis Fox Kennedy, my daughter-in-law and campaign manager, the smartest person I've ever met. She helped get me on the ballot in every state and managed a campaign that everybody said was impossible, because she has an extraordinary mind. She's a great fan and admirer of Dr. Epstein's work. So it's a joy to have him on the show, and a pleasure to have her co-hosting with me.
00:00:23.000Epstein is a pioneer in the study of new forms of manipulation.
00:00:28.000that have been made possible by the Internet.
00:00:30.000He has testified twice before Congress about his research in this area, and he's also developed the world's first large-scale system for preserving and analyzing the ephemeral content Big Tech companies use to manipulate elections, children, and adult human beings.
00:00:49.000His new congressional testimony is accessible at https://2023epsteintestimony.com.
00:01:05.000A dashboard that summarizes the data his new monitoring system is collecting can be viewed at https://americasdigitalshield.com.
00:01:20.000His research suggests that without an aggressive monitoring system in place in the 2024 presidential election, Google alone will be able to shift between 6.4 and 25.5 million votes.
00:01:38.000I first came across Epstein's research, I think, around 2016, when I was reading about SEME, which is the search engine manipulation effect.
00:01:48.000Which was just becoming evident at that time, at the beginning of the coronavirus pandemic, and was extensively used thereafter to manipulate public perception and to manipulate public behavior.
00:02:05.000I remember doing a podcast, one of my earliest podcasts, about this theme in which I think it was Dr.
00:02:13.000Epstein who said it was the most powerful propaganda and manipulation tool ever devised by humanity.
00:02:21.000And it's really astonishing; his work about the capacity to shift public perception and therefore alter and manipulate public conduct is truly frightening.
00:02:38.000And it's all done without the targets of this kind of manipulation ever knowing that they've been manipulated; it can induce essentially a mass psychosis.
00:02:53.000And my co-host today is my daughter-in-law, Amaryllis Fox Kennedy.
00:03:07.000Who is also my campaign manager, and who has an extraordinary mind.
00:03:13.000She's the smartest person I've ever met.
00:03:15.000And she was able to do something in this campaign that everybody said was impossible, which was to get on the ballot in every state, to manage 100,000 volunteers and the complexity of that task.
00:03:31.000And to do everything else, to write policy papers and generate this huge amount of content for us.
00:03:39.000At a superhuman level. Amaryllis joined the CIA after 9/11 and spent her early career as a clandestine spy in the weapons of mass destruction program.
00:03:57.000After she left the CIA, she became a tech entrepreneur and started a very successful company.
00:04:06.000Later sold to Twitter, and then she went to work managing, I think it was communications, right, at Twitter.
00:04:15.000Amaryllis, correct me if I'm wrong, because I'm doing this off the top of my head.
00:04:23.000Yeah, using heuristics-based natural language processing to identify product mentions, which in its own way, I think we'll probably delve into later in this conversation. But a great fan of Dr.
00:04:38.000Epstein's work and really pleased to be here today.
00:04:41.000Anyway, I always tell this story that somebody told me that Amaryllis was the smartest woman that they've ever met.
00:04:48.000And I said, no, she's the smartest human being that you've ever met.
00:04:51.000So it's a joy to have you back on the show, co-hosting with me, Amaryllis.
00:04:55.000And if you want to follow Amaryllis, she's at, what is your Twitter?
00:05:15.000Just describe to us how SEME works; SEME is the search engine manipulation effect.
00:05:21.000In fact, how it works and, you know, just a brief punchline of the extraordinary power that these tech companies are now wielding to do what the CIA has always wanted to do, which is, without ever going into a country and doing all the things they used to do,
00:05:40.000assassinating leaders, paying off unions, destroying the credibility of institutions, driving division and polarization between different groups in the society, all of those other tools they developed.
00:05:58.000For 20 years of experimentation through MKUltra and MKNaomi and MKDietrich and all the MK programs, MK meaning mind control, which is what they were after, both control of individuals and control of entire societies.
00:06:18.000And now they can do that much more effectively.
00:06:22.000Just by manipulating algorithms at Google.
00:06:25.000And so, will you explain how that works?
00:07:45.000But there's a second thing I want to tell you before we get to the topic, because I just spent the whole weekend reading your book on Fauci.
00:07:56.000And I recommend that everyone read that book.
00:07:59.000And if you're reluctant to read it because, as someone told me last night, because you've heard some bad things about Bobby Kennedy Jr., if you're reluctant to read it, forget all that stuff.
00:08:38.000Fauci, is because this book is just so well done.
00:08:43.000And it tells you, it gives you a perspective on the health system, or you could call it the anti-health system in this country, that I've never seen laid out so well.
00:09:50.000I was working with a member of Trump's Coronavirus Task Force because I had proposed a plan, called the Carrier Separation Plan (CSP), that would not have required any lockdowns or social distancing, would not even have required a vaccine, and would have eradicated the virus, which we've not done.
00:10:17.000It would have eradicated the virus and allowed us to completely reopen society.
00:10:21.000So I started working with that task force in March of 2020.
00:10:52.000Michael Medved said he hopes the president goes for it.
00:10:55.000I was talking to two members of Trump's family about it.
00:10:58.000It's very simple, and it's consistent with some statements that are in your book.
00:11:03.000And that is, you test everyone using very cheap...
00:11:08.000disposable test devices, and I was in touch with two big companies in China that could have produced these things overnight.
00:11:15.000The test devices don't even have to be that particularly accurate, it turns out, for this to work.
00:11:22.000The President announces we're going to have National Testing Day.
00:11:26.000It's going to be on September 6th, Sunday, September 6th, 2020, which was the 400th anniversary of the day the Mayflower set sail from England to the United States.
00:11:37.000And he said, on that day, we're all going to test.
00:11:40.000Before that, he would have announced it early summer, we're going to send you all dozens and dozens of these test devices.
00:11:50.000You just stick them in your mouth and it'll tell you whether you're positive or not.
00:11:55.000And so he said, I'm going to do it, Melania and all my kids and all the members of Congress, and we're all going to do it on that day.
00:12:04.000He said, and then if you have, if you test positive, then you need to go into quarantine, not for six months, just a couple of weeks, two or three weeks.
00:12:13.000Just let the virus, let your body defeat the virus.
00:12:17.000He said, and that same day, Sunday, September 6th, we're going to reopen society.
00:12:22.000All the schools and churches and businesses are going to reopen and the virus is going to stop spreading.
00:12:29.000Because we've removed from the population most of the carriers.
00:12:33.000And there's also going to be secondary screening.
00:12:36.000So at the entrance to schools, churches, and businesses, there's going to be barrels full of these test devices.
00:12:42.000And it's all voluntary, but we'd like you to test and test negative so you can go inside.
00:12:48.000If you test positive, just go home for three weeks.
00:12:51.000If you need money to do this, then you're giving up your privacy, but we'll help you.
00:12:57.000If you need a place to stay, like a hotel room for those three weeks, we'll help you, but you're giving up your privacy then.
00:13:02.000But for everyone else, this is all going to be done in privacy.
00:13:06.000I published in Frontiers in Public Health in January of 2021 a mathematical model with detailed predictions.
00:13:24.000And by the way, if that plan had been implemented, and at one point there was a speech for President Trump to give, it was sitting on the desk of his head speechwriter, Vince Haley.
00:14:35.000Well, you know, that's very interesting.
00:14:37.000And I know that you talked about that on Tucker.
00:14:41.000But it is, and I don't want to take away from your, you know, your inventive role in devising that, but it's kind of a modern iteration of exactly the protocol used by D.A. Henderson, one of the great epidemiologists, who is credited with obliterating smallpox, eliminating this deadly disease from humanity.
00:15:11.000It was basically the same program that he used, without the technology, but it was the classic prescription for dealing with pandemics, which always said you never do mass lockdowns, because that ends up destroying society and imposing a lot worse costs than the disease, which is what we saw.
00:15:38.000But the way he eliminated smallpox, a lot of people, you know, the modern consensus, which is an orchestrated consensus, the product of propaganda,
00:15:49.000is that smallpox was eliminated by the smallpox vaccine.
00:15:53.000But there are many parts of the world where the vaccine never reached.
00:16:00.000And the way it disappeared is D.A. Henderson's plan, which is to isolate the sick, protect the vulnerable, keep society open, and do systematic isolation of people who have the disease.
00:16:15.000And that is what eliminated smallpox from the planet.
00:16:19.000And D.A. Henderson, in later years, complained vociferously against the lockdowns during the beginning of COVID, et cetera.
00:16:27.000You never do that, you know; you do these protocols that have been proven to work.
00:16:34.000And the smallpox vaccine had a lot of problems and, you know, a lot of very, very bad, deadly side effects, killing people.
00:16:44.000And the real way that history shows and the best literature shows the way that smallpox was eliminated was through these physical protocols of isolating the sick.
00:16:57.000And respiratory viruses, as everybody knew before the pandemic, cannot be eliminated through isolating an entire society.
00:17:11.000So when you lock people indoors, they tend to spread it to their families, etc.
00:17:17.000And it was really, you know, what we saw here at the beginning of the pandemic was the police shutting playgrounds.
00:17:25.000Padlocking basketball courts, throwing sand on the half pipe so people couldn't skateboard, arresting surfers who were out in the ocean when they came to the beach and sending them home where the coronavirus is going to spread.
00:17:52.000I'm happy that you're talking about this.
00:17:54.000And I wish I'd known more about that at the time when you were doing it, because I think it would have given a lot of those of us who were skeptical about, who were criticizing, the mass lockdowns
00:18:08.000a much better understanding of an alternative for dealing with the coronavirus.
00:18:15.000Of course, Bobby, I mean, that's exactly the point of our later conversation, which is that you didn't know it, and neither did anyone else, and that wasn't accidental, right?
00:18:25.000That's the result of this kind of manipulation.
00:18:29.000Amaryllis, thank you so much, because that's the segue that I was hoping to have.
00:18:37.000And the fact is, there's a lot we don't know, and you don't even know what you don't know, right?
00:18:45.000Or as I like to say now, you don't know what they don't show.
00:18:48.000And by they, I mean the big tech companies, primarily Google.
00:18:52.000And so, you know, I've been studying that.
00:18:55.000That's a separate thing than my work on the carrier separation plan.
00:18:58.000But I've been studying the ability that Google and other tech companies have to control our thinking, our behavior, our emotions, our elections, our children.
00:19:11.000I've been studying that now for more than 12 years.
00:20:24.000I have been publishing, in peer-reviewed journals, our discoveries on, so far, 10 different methods that these companies use for manipulating thinking and behavior.
00:21:56.000But let me just go back to search suggestions for a second, because you'll remember that a few weeks ago, it made national news that when people were searching for information about the recent assassination attempt, they couldn't get anything.
00:22:12.000Google was suggesting that they look at, I don't know, Abraham Lincoln and McKinley, and they wouldn't give you suggestions for learning about that assassination attempt.
00:22:33.000But we are now preserving those search suggestions by the millions.
00:22:39.000We're preserving ephemeral content, that's what they call it in Google, that they use to manipulate people.
00:22:45.000We're preserving that by the millions, the recommendations they make on YouTube, which are 60 to 70% coming from liberal news sources.
00:22:56.000For children, young people under 18, it's more like 90% of the content that they're showing comes from liberal news sources.
00:23:10.000We're preserving tens of millions of data points of this sort, which has never been done before.
00:23:17.000We have the world's first national monitoring system for doing to them what they do to us and our kids.
00:23:24.000In other words, we're surveilling them just like they surveil us and our kids.
00:23:30.000So there's two big chunks of research here.
00:23:33.000One chunk of research, which we've been doing since 2013, looks at the new methods of manipulation that the internet has made possible and that is controlled entirely by a couple of tech companies.
00:23:48.000And the second piece of research has to do with the fact that we're now monitoring them to see whether they're actually using these techniques, and they are.
00:24:00.000Right now on Google's homepage, they're sending out various kinds of vote reminders, register to vote, mail in your ballot, go vote, to Democrats at about two and a half times the rate they're sending them to Republicans.
00:25:12.000We're crazy because we speak the truth.
00:25:16.000We don't care whether that offends anybody.
00:25:19.000We speak the truth because we want to defeat bad actors.
00:25:28.000We want to defeat people who are hurting people.
00:25:31.000Who are hurting society or hurting elections, hurting democracy, hurting health, and we speak the truth no matter what the consequences, which is a very, very crazy, insane, and difficult way to live.
00:25:45.000And that's what you and I have in common.
00:25:56.000SEME demonstrates how biased ephemeral content, such as search results and video recommendations, influences users' decisions.
00:26:03.000Just one exposure to this type of biased content can shift a person's perspective by 20% to 80%, with repeated exposures raising that potential to 90%.
00:26:15.000The SSE, which is the search suggestion effect, refers to the impact of the search suggestions in the drop-down as an online user is looking up information.
00:26:29.000Google search suggestions can shift undecided voters' opinions from a 50-50 split to almost 90-10, all without user awareness.
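The kind of shift being described can be illustrated with a toy simulation. This is a hypothetical sketch for intuition only, not Dr. Epstein's actual experimental methodology; the number of results each voter reads and the 90% bias rate are assumptions:

```python
import random

# Toy illustration only: NOT the SEME experiments themselves.
# Assumption: undecided voters mostly read the top few search results,
# and a `bias` fraction of those results favor candidate A.

random.seed(42)  # make the simulation repeatable

def simulate_undecided_voters(n_voters=10_000, bias=0.5, reads_per_voter=3):
    """Fraction of undecided voters who end up favoring candidate A
    when a `bias` share of the results each voter reads favor A."""
    votes_a = 0
    for _ in range(n_voters):
        favors_a = sum(random.random() < bias for _ in range(reads_per_voter))
        if favors_a * 2 > reads_per_voter:  # majority of what they read favors A
            votes_a += 1
    return votes_a / n_voters

print(f"Neutral results (50% favor A): {simulate_undecided_voters(bias=0.5):.0%} choose A")
print(f"Biased results (90% favor A):  {simulate_undecided_voters(bias=0.9):.0%} choose A")
```

Even this crude model shows how a near 50-50 split under neutral rankings collapses into a lopsided one once the top results lean heavily one way.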
00:26:41.000That is terrifying because that's really the end of democracy.
00:26:50.000If you consider the free and fair election...
00:26:53.000I mean, the implication is you have one guy, Sergey Brin.
00:26:59.000It's not even the 800 people who are giving 70% of the donations, which is terrifying.
00:27:05.000One guy, the head of Google, who can decide elections.
00:27:10.000Most elections in this country could be decided by that 6%, you know, or more.
00:27:18.000I think actually you quantify that, but it's like, you know, you're shifting election results.
00:27:28.000I think you find in here that you can, what is it, that you can shift election results by 6.9% typically or more, right?
00:27:38.000Well, the margin is somewhere between 4% and 16%.
00:27:43.000So 4% is the absolute bare minimum that Google alone can shift without anyone knowing, except for, of course, now we have a monitoring system in place.
00:27:57.000So if you look right now, as I did a little while ago, if you look at the numbers in the swing states right now, the survey data, you'll find, depending on what the poll is, that Trump is ahead by maybe a point or two in three of the seven swing states.
00:28:12.000Harris is ahead by a point or two in four of the swing states.
00:28:59.000And I've seen it in action because it's been U.S. policy around the world for a long time in terms of Intelligence monkey business in other countries' elections.
00:29:09.000You can go back and look even on CIA.gov if you go and look at the special forces and election manipulation handbooks.
00:29:21.000There are sections on polling that are still entirely redacted from the 70s and the 80s.
00:29:27.000And those are nestled right next to the guide for taking over underfunded newspapers in order to provide an unidentified source of funding that then places all of the news stories that the United States government wants for the candidate of their choice.
00:29:48.000So they have always viewed both polling and manipulation of news coverage as the first wave.
00:29:57.000If that doesn't work, then maybe you have to go in and cement a coup or do some other criminal act overseas.
00:30:05.000But first and foremost, it's polling and news manipulation. That is correct.
00:30:11.000I remember you saying to me one time that if you are an intelligence agent who is in charge of making sure the communist candidate or the left-wing candidate loses the Italian elections, it is spycraft malpractice not to use those tools.
00:30:41.000How on earth did this election turn out the way that we didn't want it to?
00:30:45.000And you see that all around the world when, you know, these improbable results happen.
00:30:49.000But, you know, I remember a DARPA study coming across my desk in probably 2004 when I was, you know, a brand-new baby officer who signed up after 9/11 and,
00:31:06.000you know, wasn't yet even overseas, thinking that I was doing the same thing as, you know, a Marine, you know, a kid who goes down and joins the Marines after 9/11 because you want to do something good to serve your country.
00:31:21.000Didn't know what nest of vipers I was walking into.
00:31:25.000But I remember this study coming across that came out of DARPA, and they put people reading news stories in an fMRI machine.
00:31:35.000And they expected that the frontal areas would light up while subjects were assessing analytically the reliability of what they were reading.
00:31:47.000And the first area that they describe, it's over the ear, and they described it as the part of your brain that lights up when you hold up a shirt in a store and think about whether or not your friends would make fun of you if you wore it.
00:32:00.000And that in-group, out-group decision is a split-second decision made at the very beginning, before you even engage your frontal lobe and all of your actual analytical assessment.
00:32:10.000And so you already know before you dive in whether you're assessing in order to poke holes or you're assessing in order to think it's a brilliant article and share it with your friends.
00:32:20.000And of course, they were weaponizing that at that time between the Sunni and the Shia communities, where they were trying to create this split
00:32:27.000overseas. But we now see that exact same methodology at work, and I think what Dr. Epstein's describing, especially with autocomplete, right, the subconscious subtext of that when you begin to type something is: oh, well, this must be informed by what everybody else is searching and what everybody else is thinking. Exactly. And therefore, I should be thinking it too.
00:32:54.000And that is an incredibly dangerous road for us to go down as a society.
00:32:59.000I just want to add one thing to this issue of the polling numbers, and I agree with what you said completely, but I want to add one thing.
00:33:07.000At any point in time, there's still going to be millions of people who are vulnerable, who haven't completely made up their minds, and who can be pushed, who can be nudged one way or the other.
00:33:19.000Google has a tremendous advantage over any of us, including a campaign manager, because Google knows exactly who those people are.
00:33:27.000I mean, that would be every campaign manager's wet dream to know exactly who those people are, but Google knows exactly who they are.
00:33:34.000And it is using these techniques that we've discovered and quantified over the years.
00:33:39.000It is using these techniques on these people every day.
00:33:47.000No matter what the polls say, there are still people out there who can be influenced.
00:33:52.000Google knows who they are, and they're using these techniques.
00:33:56.000Now, let me also point out that on election day itself, there are a lot of people who are just too lazy to get off the sofa.
00:34:09.000In 2012, Facebook and some of my colleagues out here at the University of California, San Diego, published a piece in Nature about what they had done in the 2010 midterms.
00:34:26.000They had sent go vote reminders on election day all day long to 60 million Facebook users.
00:34:33.000And they had some very clever ways and they had a control group.
00:34:37.000They had some clever ways of determining whether that got some more people to vote.
00:34:41.000They calculated that that vote reminder got 340,000 more people to get off their butts on Election Day.
00:34:50.000So just keep that in mind, that even on Election Day itself, sending out partisan go vote reminders shifts a lot of votes.
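The arithmetic behind that study can be laid out explicitly. The second half of this sketch uses made-up reach numbers, purely to illustrate how an asymmetric reminder, like the 2.5-to-1 pattern described earlier, turns a small per-person lift into a net margin:

```python
# Back-of-the-envelope arithmetic for the 2010 Facebook experiment
# described above (published in Nature in 2012): go-vote reminders
# sent to roughly 60 million users produced about 340,000 extra voters.

recipients = 60_000_000
extra_voters = 340_000
lift = extra_voters / recipients
print(f"Turnout lift per recipient: {lift:.2%}")  # prints 0.57%

# Hypothetical scenario (illustrative numbers, not a measurement):
# if reminders reached 25 million supporters of one party but only
# 10 million of the other, the same lift becomes a partisan margin.
reached_a, reached_b = 25_000_000, 10_000_000
margin = (reached_a - reached_b) * lift
print(f"Net votes shifted in this scenario: {margin:,.0f}")  # prints 85,000
```

The point is that the lift per person is tiny, well under one percent, yet at platform scale an uneven distribution of reminders can plausibly move tens of thousands of votes.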
00:35:00.000And normally, no one would have any way of knowing that they did that.
00:35:06.000But because we have a national monitoring system in place, we'll know exactly what they're doing and how many votes they're sending to members of each party.
00:35:16.000We'll know exactly what they're doing.
00:35:18.000And we're building an archive, which has never been done before, that will allow us to go back in time and look at these manipulations. I'm in touch with members of Congress, with a bunch of AGs, with a couple of parenting groups, some election integrity groups.
00:35:36.000All of our data are going to be available to all kinds of people who are going to try to use these data in various ways to pressure Google and other companies to stay away from our elections and stay away from our kids.
00:35:55.000Well, that brings me to an old quote from Justice Louis Brandeis 100 years ago, everyone knows this, which is that sunlight is the best disinfectant.
00:36:05.000The second half of that quote, which no one knows, is: and street lamps, the best policeman.
00:36:12.000That's what he wrote back in, I think, 1917.
00:36:16.000We're making these companies accountable to the public for the first time by monitoring them and exposing them and letting them know that we're doing it.
00:36:27.000And so we have now preserved, as of a few weeks ago, more than 100 million ephemeral experiences.
00:36:34.000That's what they call them inside of Google.
00:36:36.000Ephemeral experiences which are normally lost forever, stored nowhere.
00:37:39.000Is it opt-in, presumably, and do you have people volunteer to have their screen recorded, or how does it work?
00:37:47.000I wish we could accept volunteers because it would be so much cheaper to run this, but we can't because in the past when we've called for volunteers of any sort, Google sends us people.
00:38:00.000They not only have 100,000 employees, they have 120,000 outside consultants that they use for various purposes.
00:38:08.000And we've had them send us people over and over again.
00:38:11.000So over the years, we've had to set up very secure systems for recruiting.
00:38:26.000We're now monitoring the content that's coming into their computers.
00:38:30.000So this is not just ephemeral content, it's personalized.
00:38:34.000Because remember, everything they send out is personalized.
00:38:36.000The only way you're going to know what they're actually sending to people is to look over the shoulders, with their permission, of real registered voters, and capture that content and aggregate it and analyze it.
00:38:51.000And we've gotten better and better and better at doing all of that.
00:38:55.000And at this moment in time, we are preserving content through the computers of a politically balanced group of more than 16,000 registered voters in all 50 states.
00:39:07.000And this group gets larger every day because we do not stop recruiting.
00:39:12.000We have court admissible data, most likely, at least according to Ken Paxton.
00:39:18.000We have court admissible data now in 21 states.
00:39:22.000And obviously, we've got to get that number higher as fast as we can.
00:39:26.000The bigger that number is, the more it pressures these companies to stop what they're doing.
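The capture-aggregate-analyze step described above can be sketched in miniature. The records, field names, and numbers here are all invented for illustration; this is not the monitoring system's actual schema or data:

```python
# Minimal sketch: aggregate panel-captured content by party to compare
# what each group is shown. Every record below is made up.

captured = [  # (panelist_party, content_type) pairs captured from screens
    ("D", "vote_reminder"), ("D", "vote_reminder"), ("D", "news_link"),
    ("D", "vote_reminder"), ("D", "news_link"),
    ("R", "vote_reminder"), ("R", "news_link"), ("R", "news_link"),
    ("R", "news_link"),
]

def reminder_rate(records, party):
    """Share of captured items shown to `party` panelists that were vote reminders."""
    shown = [content for p, content in records if p == party]
    return shown.count("vote_reminder") / len(shown)

d_rate = reminder_rate(captured, "D")  # 3 of 5 -> 60%
r_rate = reminder_rate(captured, "R")  # 1 of 4 -> 25%
print(f"D: {d_rate:.0%}, R: {r_rate:.0%}, ratio: {d_rate / r_rate:.1f}x")  # 2.4x
```

With a politically balanced panel, a persistent ratio like this, computed over millions of real captures rather than nine toy records, is exactly the kind of asymmetry the monitoring system is designed to surface.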
00:39:40.000You have a Ph.D. after your name and a Dr. in front of it, so those were hard-earned and well-deserved, and I want to respect them.
00:39:50.000Yeah, but I heard you on Tucker's show, and you made a very good argument for saying that you probably have 10 PhDs, given the expertise you have to get to litigate these cases.
00:40:05.000So, let's get to this subject, which I think...
00:40:10.000I didn't show Amaryllis any of the documents you showed me, but you did an analysis of my election.
00:40:17.000There's nobody who knows more about my polling; Amaryllis has done the biggest polls of this election of any campaign.
00:41:57.000In other words, it's liberally biased, but that's just a graph.
00:42:01.000But if you look at the actual content that they're sending people to, if people click on those high-ranking links in their search results, you end up with, I'm not going to say them, but if you want to share them, that's fine.
00:42:14.000They are sending people to one piece after another, one website after another, one article after another, one video after another, that makes you look like the devil.
00:42:27.000Well, I mean, Bobby poses perhaps the gravest threat to their business model since, you know, its instantiation.
00:42:36.000I mean, banning pharmaceutical ads on day one, and that alone is probably their number two source of ad revenue, in addition to, you know, all of their defense contracts and so on.
00:42:50.000Even outside of their doing the intelligence agencies' bidding, which we know to be the case, just from a business point of view, we knew right from the outset that we had a pretty serious foe in Google.
00:43:05.000And I mean, across the board, they wouldn't let us claim our Google knowledge panel, which anybody, you know, if anyone's done it here, it's very straightforward.
00:43:12.000You just, you know, prove your identity by putting in a password.
00:43:16.000for one of your social accounts, and it pops right up.
00:43:19.000In this case, it was just the old 404 error every time, but only for Bobby.
00:43:24.000And we escalated it all the way up the chain there.
00:43:27.000And they kept telling us, oh, that's terrible.
00:43:50.000We understand Google's stake in this for sure.
00:43:54.000Another thing that's, by the way, I think is quite interesting in that report we sent you is that they are sending that attack content, that liberally biased content, not just to liberals and not just to moderates, but to conservatives as well.
00:44:56.000Especially those targeted messages on their homepage.
00:45:00.000You don't know anything that they're doing.
00:45:05.000We're also looking at the recommendations that they make on YouTube, and it tells the same story.
00:45:12.000But what's interesting here is that people are not aware that 70% of the videos people watch on YouTube, this is according to Google itself and outside researchers too, but Google has admitted that 70% of the content people watch on YouTube has been recommended by their algorithm.
00:45:58.000They're slaughtering you on YouTube as well.
00:46:01.000Again, no monitoring systems in place.
00:46:04.000You don't know anything that they're doing.
00:46:07.000These people, by the way, are very determined.
00:46:11.000Google more than the other companies because Google has a very, very strong progressive left culture.
00:46:19.000And actually, now, more than 96% of the donations from their employees go to Democrats.
00:46:28.000Now, I lean that way myself, so I should be applauding them, but I don't.
00:46:33.000Because I love this country, and I love the system of government that was put in place by our founding fathers more than I love any particular party or candidate.
00:46:44.000So, unless you've got other sort of evidence that they used against me, which I'm very curious about, let's talk about how you fix this.
00:47:01.000There are really only three ways, and I can tell you that what the government's been doing, which is antitrust actions—that's the DOJ, that's Congress, and the AGs—will not solve the problem at all.
00:47:18.000In fact, I happen to have been working with AGs long enough, so I actually saw Google's attorneys pushing our authorities away from consumer protection issues and pushing them toward antitrust issues.
00:47:35.000They have basically gotten control over everybody.
00:47:42.000It's DOJ, it's the AGs, and Congress, and they're all going after Google with antitrust actions.
00:47:50.000Antitrust actions will not solve the big problems.
00:47:53.000The big problems are surveillance, censorship, and manipulation.
00:48:07.000Google knows very well that no one will ever force them to break up the Google search engine, because it won't work.
00:48:14.000And Facebook knows that no one will ever force them to break up the social media platform because that would be like putting the Berlin Wall through the middle of every household.
00:48:30.000Well, one, which isn't going to happen, is that you could make the surveillance business model, which Google invented, which turns us all into products that they sell to vendors, you could make that model illegal.
00:48:44.000Tim Cook, the head of Apple, has suggested this.
00:48:47.000He thinks that model should not be allowed.
00:48:49.000He doesn't think it's a legitimate business model.
00:49:05.000We could declare, our Congress could declare, or a regulatory agency could declare, their index, which is the database they use to generate search results.
00:49:16.000We could declare that to be a public commons.
00:49:19.000There's ample precedent for this in law.
00:49:23.000Governments have been doing this for hundreds of years, because once some commodity or service becomes essential, then there's room for abuse, and that includes air and water and gasoline and telephone communications.
00:49:38.000Once it becomes essential, the government at some point always steps in and either takes charge or regulates.
00:49:48.000In this case, if our government declared their index to be a public commons, then everyone could build search engines using their data and you'd end up with thousands of search engines competing for our attention, which would be just like the news environment.
00:50:06.000There would be competition and innovation.
00:50:09.000Because of Google's monopoly, there has been no competition and no innovation in search for 20 years now.
00:50:50.000Monitoring makes them accountable because it gives us a growing database that allows us to look back in time and see what shenanigans they were using.
00:51:00.000And that can be used in court and it can also be used just to put public pressure on these companies.
00:51:07.000Monitoring, in my opinion, is not optional.
00:51:15.000Because in the EU, for example, they have passed some pretty strong laws and put very strong regulations into place, which you've talked about; I've heard you talk about them.
00:51:26.000Vestager is the woman who spearheaded that movement in Europe.
00:51:30.000She has acknowledged recently that these companies are not complying as far as they can tell.
00:51:36.000The fact is, unless you have monitoring systems in place, you can't measure compliance.
00:51:42.000You don't know if there's compliance without monitoring.
00:51:54.000The one that I've been working on, and we spent $7 million building this nationwide system, obviously, is the third one, which is monitoring.
00:52:09.000A couple days after the November 2020 election, I sent a lot of our data to Senator Cruz's office, and on November 5, 2020, Cruz and two other senators sent a very threatening letter to Sundar Pichai, the CEO of Google, saying, you testified under oath that you don't interfere with elections.
00:52:36.000And then it's a two-page letter summarizing our findings in 2020.
00:52:40.000In that election, by the way, Google shifted more than 6 million votes to Joe Biden.
00:52:46.000So, you know, Pichai did not reply right away, but on that same day, they turned off all of their manipulations in Georgia, which was gearing up for two Senate runoff elections in January.
00:53:29.000They literally just pulled out from Georgia.
00:53:35.000I mean, the interesting thing to me is that all three of those, and it's incredible work that you're doing, seem to me to not work without the other.
00:53:46.000I think that really, I'm a huge fan of free markets.
00:53:52.000I know that Bobby is, it sounds like you are as well, certainly free markets of ideas, of the right to know, to be an informed consumer, which certainly applies to news and to information.
00:54:04.000But even when you know, right, with the monitoring capabilities that you're talking about, and I think more and more people do know, you know, they use DuckDuckGo and so forth, but it is a pain in the butt, you know, because of monopolistic practices.
00:54:24.000And we saw this with the Apple and the Google judgment, right?
00:54:27.000When you buy a phone and you have a pre-installed search engine, even if you know, based on this monitoring, that what you're seeing is hopelessly filtered and biased,
00:54:39.000it is the kind of free ice cream problem where it's just a lot easier to operate with the pre-installed option rather than add your own.
00:54:50.000And the middle one, the publication of the index: there's Common Crawl, for example, and I don't know if you've used Common Crawl at all, but it doesn't have as many webpages as Google. I think it's indexed, I want to say, a couple hundred billion, and it's a public commons.
00:55:08.000But as long as you have that monopolistic practice, even really excellent alternatives are hard to come by.
00:55:17.000And the solution that I really love is similar to what Jack Dorsey has been talking about, which is giving consumers free market choice between the algorithms that they can use.
00:55:31.000Because ultimately, consumers will test them out. The corollary to the right to know is the corresponding duty to disclose.
00:55:51.000And if you disclose the algorithm that you are using as your default, and you open up an API where anybody can take the index and add their own kind of algorithm, consumers ultimately will choose the one that works best for them.
00:56:09.000And by messing around, the same way that you would try different filters on an Instagram photograph, you try different algorithms on your X feed or on your Google News results, and pretty quickly you're going to wise up to the one that gives you the sharpest and most complete view of the world.
00:56:28.000And I think giving people that kind of free market choice is actually a more efficient enforcer than any kind of government oversight could be, and less prone to being infiltrated by censors or government leaders that want to control these algorithms for their own purposes.
00:56:52.000Well, Google's not going to give people access to their data voluntarily.
00:56:57.000That would have to require some sort of governmental action.
00:57:04.000And that's not unheard of; you know, there are legal scholars who propose this.
00:57:09.000President Kennedy was one of the initial proposers of the Consumer Bill of Rights.
00:57:14.000And in there, he talks about how the marketplace had been flooded with all this new technology, new pharmaceutical products, all kinds of things; in his words, it was forcing parents, or the homemaker, to become an amateur scientist and chemist and technologist to know what they should and shouldn't buy.
00:57:38.000And he was, of course, urging for truth in advertising and for consumer choice.
00:57:56.000But in the end, you know, as for the position that Google and others take, that their algorithms are trade secrets and should be subject to those protections, there are established limitations on the protections for trade secrets, especially when it comes to, you know...
00:58:17.000And I would really like to see that be taken up, because then, when you began using a given search engine or a given social media platform, you could, in your setup process and at any time thereafter, play around with different algorithms and choose the one that you found to be, you know, most productive.
00:58:42.000And of course, one of the challenges that we haven't even talked about is that that's all heuristics based, right?
00:58:48.000It's almost passé, this conversation, because pretty soon it will be AI-driven.
00:58:56.000Let's get to AI then, because we're starting to monitor AI content now as well, which is going to become more and more important over time.
00:59:13.000It's like the tip of an iceberg, because there's so much out there that can be a threat to humanity, not just to democracy, but to human autonomy and especially to the minds of children, that we have to be able to look at it. So we're starting to monitor AI content.
00:59:39.000And of course, AI content could at some point in the very near future become a serious threat to our existence.
00:59:49.000But a monitoring system could be the first line of defense.
00:59:54.000It could be an early detection system for AIs that are posing a threat.
00:59:59.000So there's another use for monitoring.
01:00:02.000We also have designed and built equipment.
01:00:05.000We don't have the money to distribute it yet, but we've designed and built equipment that will allow us to monitor, with permission of the users, the answers that they're getting from intelligent personal assistants like Siri, Alexa, Google Home, and so on, the Google Assistant.
01:00:25.000And so we're very close to being able to monitor and analyze that content as well.
01:00:30.000I should emphasize here that when we transmit content from someone's home or someone's device to our computers,
01:00:41.000we do that without any identifying information at all.
01:00:48.000This is the exact opposite of what the big tech companies do.
01:00:52.000They're always transferring content with your name on it so they can add it to your profile and they can sell you more things and they can manipulate you more effectively.
01:01:04.000You know, I love everything that you're talking about, and my mind is kind of racing and thinking of solutions.
01:01:13.000And, you know, maybe the government should have a new cabinet agency, a Department of Democracy, that monitors all of the undemocratic impacts of the internet companies.
01:01:36.000And, you know, we were told at the beginning, at the dawn of the internet age, that the internet was going to democratize the world, democratize the spread of information, and it had that potential, and it's accomplished that to some extent.
01:01:57.000The most odious effect, and the most dominant effect, has been that it's become a tool for totalitarianism.
01:02:06.000And for, you know, this corporate merger of state and corporate power to monetize information and to deploy surveillance and control through these systems.
01:02:17.000You know, should we have a cabinet post that is constantly monitoring the impacts of technology on democracy and taking steps to expose that to the public, to shine the sterilizing light of sunshine and the policing effect of lamplight?
01:03:19.000And at one point, Ted Cruz invited me for a private dinner to discuss these issues because the man's very smart, obviously, and he wants to solve these problems, these big tech problems.
01:03:30.000We talked for almost four hours straight.
01:03:35.000But the point is, we talked about tech.
01:03:37.000And at the end of it, this poor guy actually said aloud, he said, the problem is that we'll never get bipartisan support to do anything against these companies.
01:03:49.000He said that the Democrats control them because of the huge donations, because they helped get them into office.
01:03:56.000He said, and Republicans don't like regulation.
01:04:05.000The government doesn't have the tech capability.
01:04:08.000I mean, the iterative cycle in Silicon Valley is just so fast, and so many of these lawmakers, you know, their kids or grandkids are helping them power cycle their laptop.
01:04:21.000I mean, I don't know if you guys have ever played around with ChatGPT.
01:04:26.000Now there are different cheat codes, but there used to be a cheat code that was called do anything mode.
01:04:33.000And there are a bunch of these, you know, people can go and look online, but there are certain prompts that you could put in there that remove some of the pre-trained parameters.
01:04:44.000And when you ask ChatGPT to produce its results in parallel, one in its normal state and one with those parameters removed, it's almost like carving David from the marble, where you can see the areas that have been massaged. It underscores them.
01:05:11.000It makes you ask why they made that change, and it actually kind of puts a glaring highlighter over them.
01:05:15.000And when you have that kind of comparison, I think it does a lot more to inform the public than having no comparison at all.
01:05:27.000So that's why I like being able to choose between the models, because then you see the default.
01:05:32.000I remember as a young person going to Burma for the first time, in 1999, and it was still under very tight military control, which sadly now it is again, but this was prior to the moment of democracy there.
01:05:54.000And they had a newspaper called The New Light of Myanmar that was just the most preposterous propaganda you've ever read in your entire life.
01:06:03.000And I remember Thinking, what is the point of even printing this?
01:06:08.000How could anybody read it and believe it?
01:06:11.000And we're almost there with a lot of what you read in the kind of Overton window Google results at this stage.
01:06:20.000And the more that it's all people are exposed to, then there's no point of comparison.
01:06:26.000And it doesn't seem preposterous in the same way that the New Light of Myanmar didn't seem preposterous to the people who had only ever read that as their news source.
01:06:34.000So, to me, as far as we can pursue Jack's proposal, and by the way, what you're doing, Dr.
01:06:44.000Epstein, would be absolutely critical to it, right?
01:06:47.000Because you need to be able to see with monitoring what the results are across different...
01:06:55.000Well, now I'll give you some information that might make you think twice about what you just said.
01:07:12.000When we did our first nationwide study on SEME, the search engine manipulation effect, there were more than 2,000 people in the study in all 50 states.
01:07:21.000We had masked our manipulation, which Google does as well, so that very few people would recognize that there was any bias in the content they were seeing.
01:07:31.000But there were so many people in this study that there were a couple hundred who did suspect there was bias.
01:08:07.000But we live in a country of free speech.
01:08:10.000We can't have a situation where there's some top-down authority saying, this is biased, this isn't, because inevitably that itself becomes biased.
01:08:19.000So you have to give people the ability to choose a stupid algorithm that's going to make them less intelligent and have access to less information, so long as they can compare it with another.
01:08:30.000Here they're making the decisions for the wrong reason, because they're seeing content which is coming from a computer, and people mistakenly believe that computer content is inherently unbiased and objective.
01:08:46.000Right, but they won't if they see multiple options.
01:08:50.000If you say, here would be your news today on Google, and in the same way that you're choosing a filter on Instagram, you could go through and you say, you know, chronological order only is one option.
01:09:08.000I only care about pop culture bias, etc.
01:09:12.000And those should be open source, and they can create some of their favorites, and other people can contribute theirs.
01:09:21.000But once you see the same day's news through 20 different algorithms...
01:09:25.000I think it will eat away at that idea that because it's produced by a computer, it's trustworthy because you've just seen all of these different iterations of the same set of news all produced by a computer.
01:10:02.000Jack Dorsey, I had dinner with him a couple of times here at the house, and I said, what's the solution?
01:10:08.000He said essentially what Amaryllis is saying: you let everybody choose, you make the algorithms transparent.
01:10:16.000And right now, if you're a Republican, you're living next door to a Democrat.
01:10:22.000A lot of it's just financially driven for Google because the algorithms they've trained are algorithms that keep people's eyeballs on the site the longest.
01:10:34.000And as it turns out, it will stay on the site longer if they're getting information that fortifies their existing worldview.
01:10:43.000If you're a Republican living next to a Democrat and you both ask simultaneously the same question, you know, who is Robert Kennedy, you're going to get different search results, depending on what the algorithm has determined is going to fortify your view of who Robert Kennedy is.
01:11:02.000And so at least if you can choose and say, Give me a Republican bias.
01:11:29.000I just want to make observations about two things that I saw during this election that were, you know, that struck me.
01:11:37.000One is that, at the beginning of the election, after I gave my initial speeches and stuff, I had popularity ratings that were beating President Trump and President Biden.
01:11:50.000I had polling, national polling ratings in three-way races.
01:12:12.000Our engagement, social media engagement, was better than any of the other campaigns.
01:12:20.000We were being outspent, in some cases, 1,000 to 1, and we were holding our own.
01:12:29.000We were getting bigger crowds than anybody, but Trump, more enthusiasm.
01:12:33.000We had 100,000 volunteers more than any other campaign.
01:12:37.000So by many metrics, we were extraordinarily successful, but we saw this steady decline in my polling numbers down to 5% to 8% at the end.
01:12:48.000And, you know, it was clear to me that this wall of bad media was eroding people's support.
01:12:58.000That's one phenomenon that I witnessed.
01:13:01.000And then the other really pronounced phenomenon was this transformation that happened with Kamala Harris, when, you know, in one week, she went from being the least popular person in the Democratic Party.
01:13:19.000She was absolutely disdained by Democrats.
01:13:23.000She could not get a lunch reservation anywhere in the country.
01:13:28.000People were just talking about ways to get rid of her.
01:13:32.000And then two weeks later, she was a mix between Beyoncé and Jesus Christ.
01:13:49.000Something did change because we measured it.
01:13:51.000Because what we saw in our Kamala Harris report going from February through July, see this slope I'm describing?
01:14:01.000What we saw is that Google and to a lesser extent some other tech companies were basically ignoring her in February and all of a sudden they're boosting her to the top of search, which has a tremendous impact because it shifts people's thinking and opinions.
01:14:45.000I mean, even if you searched for your name, Bobby, or Donald Trump's name, the results would come up
01:14:52.000with Kamala Harris plus Bobby Kennedy or Kamala Harris plus Donald Trump, whereas if you search for Kamala Harris, they will come up just for Kamala Harris.
01:14:59.000So those shifts alone, you know, look, ultimately, I say this to people all the time: everything you know about this election, unless you were in the room, I mean, basically, unless you're you, Bobby, every single person who's listening to this, everything you know about this election, you read, saw, or heard on a platform that is owned by someone with a financial interest in the outcome of this election.
01:15:28.000And again, I would just urge you, I know you don't have it, Amaryllis, but I would urge you to look at the report that we sent to Bobby, because what you'll see is the enormous bias, but also the particular news stories that people are being shifted to, and they literally turned you overnight into a kind of devil.
01:15:54.000And they really do have that power and they're not even slightly hesitant about using that power.
01:16:02.000These people are unbelievably arrogant.
01:16:05.000I mean they give new meaning to the word hubris.
01:16:39.000I was just going to ask you, where can people play with this data or see your work?
01:16:44.000Well, first and foremost, please go to AmericasDigitalShield.com because we built a public dashboard where you can see the data coming in and you can see the bias.
01:16:56.000We just added Reddit as a new platform and we keep expanding this system and making it bigger, better, faster.