Dr. Robert Epstein joins us on the show to talk about the documentary "The Creepy Line" and why he believes that Google has been manipulating our votes in recent elections.
00:00:37.000You don't get a sip because that's not really my thing.
00:00:39.000I leave that to Steven when he is sitting in this chair, but I still have the mug and it's still got a wonderful tasty warm beverage in it.
00:00:49.000We don't always get the time to fully dive into certain topics and so we'll take an entire hour to do that.
00:00:55.000on occasion. You've seen us do that before when we did the investigative piece on McKinsey.
00:00:59.000We did it when we dove into the career of Muhammad Ali. A lot of different times we take a chance
00:01:07.000and go into a subject that we think that you are going to be interested in and in this one,
00:01:11.000this is not much of a chance. You need to be interested in this.
00:01:15.000Today's subject, it really does fit perfectly in the realm of something that requires us to dive just a little bit deeper, and I appreciate you guys giving us your time.
00:01:25.000Before we jump into it, really quickly, I just want to make sure that if you are not a member of Mug Club, you go to louderwithcrowder.com slash Mug Club to sign up.
00:01:32.000For $89 you get this show, you get the Hodgetwins' show, you get Alex Jones, you get Mrgunsngear,
00:01:36.000you get Brian Callen, you get Nick DiPaolo. Ladies and gentlemen, all of the guys,
00:01:42.000I mean, every single one of these guys is fantastic. It affords us opportunities to do shows like this as well,
00:01:47.000where it's not just kind of the normal show that we do, which is fun and exciting, and there's always great topics
00:01:51.000to discuss there. But every once in a while we need to take a minute
00:01:54.000and dive just a little bit more deeply into a subject. And really, let me just kind of start at the beginning, and
00:02:00.000then I'll introduce our guest who is waiting very patiently for that moment
00:02:11.000I either watched it then or I watched it in 2019.
00:02:13.000And this was basically a movie, or kind of a documentary, about how Google would manipulate its search function and, you know, potentially end up manipulating elections.
00:02:23.000There were a lot of really creepy things, not to steal the name of the movie, but it just really, it bothered me.
00:02:31.000And it was something that I didn't know much about.
00:02:34.000Did you know anything about the movie, The Creepy Line, the documentary, or maybe some of the work related to how Google could be shifting elections, manipulating you into voting?
00:02:45.000Uh, if you're in the right district and happen to be of the right political party persuasion and that party needs to turn out the vote to win that district.
00:02:55.000And unfortunately, it's going to scare the hell out of you, but that's okay.
00:02:59.000We've got somebody on the case for us today and joining us today in third chair is Dr. Robert Epstein.
00:03:08.000A psychologist known now for publicizing the influence of search engines. And you've kind of had this very storied
00:03:14.000career, and this has been something that has kind of flipped
00:03:17.000the script on you a little bit, right? You're not a conservative.
00:03:20.000Let's start out there. It's okay, though. We're all going to be okay, right?
00:03:24.000We're all going to talk and have a discussion where we have things that we definitely agree upon
00:03:28.000But maybe some of the other issues we don't agree on. That's okay with everybody.
00:03:31.000You're not a conservative, but conservatives kind of dragged you into this a little bit and your world kind of turned upside down.
00:03:37.000Tell me a little bit about that in 2016 when that happened.
00:03:42.000What was Google doing, and how did you get connected to Donald Trump and the Trump family, and labeled a conservative and no longer a scientist?
00:03:51.000Let me just point out that you have a gun on your desk.
00:04:20.000So tell me what happened that kind of threw things, I'm imagining that kind of turned life upside down just a little bit.
00:04:27.000Well, it started turning upside down, actually, on New Year's Day in 2012, because I got a bunch of messages from Google, there might have been eight of them or even more, saying your website has been hacked and we're blocking access.
00:04:43.000So, you know, I've been a programmer my whole life, or at least since I was 13, and I was interested.
00:04:50.000I mean, you know, everyone's websites get hacked.
00:06:40.000When was the last time that you went down to the little Google thing that had the numbers and how many pages and results came up and clicked next?
00:06:57.000Life is found in that top half of that first page.
00:07:01.000Well, actually 50% of all clicks go to the top two items.
00:07:04.000So I was thinking, if people trust those high-ranking items so much, could those high-ranking items be used to shift people's thinking?
00:07:21.000So I did a controlled experiment in early 2013.
00:07:24.000I predicted that if we randomly assign people to this group or that group, Candidate A or Candidate B, then in the Candidate A group, people see search results that favor Candidate A. So, if you click up high, you get to a webpage that makes that candidate look really good.
00:07:41.000If you're in the other group, it's the opposite.
00:07:43.000So, random assignment, good scientific methodology, and I thought that with this kind of manipulation, I could shift voting preferences by two or three percent.
00:07:56.000The first shift we got was something like 43 percent, which I- Really?
00:08:00.000Yes, of course, which I did not believe.
00:08:03.000So we had- You're like, okay, run that again.
00:08:19.000I'm thinking, wait a minute, you know, we triple check, we did it over and over again.
00:08:25.000The bottom line is that I ended up discovering something which is now called SEME, S-E-M-E, the Search Engine Manipulation Effect.
00:08:34.000If you go to searchenginemanipulationeffect.com, you'll actually see the seminal publication, which was in the Proceedings of the National Academy of Sciences, which is kind of a big deal if you're in science.
00:08:48.000And that, you know, that described five experiments in two countries, more than 4,000 participants, and it described one of the most powerful manipulation techniques ever discovered in the behavioral sciences in a hundred years, SEME.
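For readers who want to see the shape of the randomized experiment being described, here is a minimal sketch in Python. The design follows the transcript (random assignment to a group whose search rankings favor Candidate A or Candidate B, then measuring the difference in preferences), but the group sizes, the probability that the manipulation lands, and the preference model are hypothetical placeholders, not Epstein's actual materials or effect sizes.

```python
import random

# Hypothetical sketch of a SEME-style randomized experiment (illustrative
# parameters only): participants are randomly assigned to see search rankings
# biased toward Candidate A or Candidate B, and we compare how many undecided
# participants end up preferring Candidate A in each group.

def run_trial(bias_toward, n_participants=500, base_pref=0.5, shift=0.2):
    """Simulate one group. `shift` is an assumed probability that exposure
    to favorable high-ranking results flips an undecided participant."""
    prefers_a = 0
    for _ in range(n_participants):
        leans_a = random.random() < base_pref   # undecided: a coin flip
        if random.random() < shift:             # the manipulation "lands"
            leans_a = (bias_toward == "A")
        prefers_a += leans_a
    return prefers_a / n_participants

random.seed(1)
group_a = run_trial("A")   # rankings favor Candidate A
group_b = run_trial("B")   # rankings favor Candidate B
print(f"Pro-A share, A-biased group: {group_a:.1%}")
print(f"Pro-A share, B-biased group: {group_b:.1%}")
print(f"Shift between groups: {group_a - group_b:+.1%}")
```

The point of the random assignment is that any gap between the two groups can only come from the biased rankings, which is why a large measured shift is so striking.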
00:09:06.000And basically it means, wait a minute, Google has this power and it's not like
00:09:12.000the power to put a TV commercial on the air or to buy Facebook ads.
00:09:17.000This is a very different kind of power because they have a monopoly on search worldwide.
00:09:22.000Yeah, what is it, 85% of the search market is controlled by Google?
00:09:39.000But if you have that kind of control and you can shift opinions by that much
00:09:44.000without people knowing, because people don't even know.
00:09:48.000I mean how would you look at a bunch of search results and recognize that they're biased toward one candidate?
00:09:53.000You'd have to click on every one, look at the webpages, read the webpages, you know.
00:09:58.000So people don't know and you can't counteract what they're doing.
00:10:04.000So it's unlike billboards and TV commercials and radio commercials because if you buy a billboard, I can buy two billboards.
00:10:11.000Exactly. Well, and a lot of people, when you go and search, you're just assuming that an algorithm is basically taking whatever keywords you're putting in and finding you the articles or the sites that are most relevant, minus some paid spots, right?
00:10:24.000We know that there are sponsored posts and stuff like that, sponsored listings, or, you know, maybe like the first four, but they always note those as being sponsored or paid or an advertisement, right?
00:10:33.000And you're just assuming, okay, whatever I get next has got to be the most relevant thing to what I've typed. But you're saying it's not, or theoretically it's not. I'm saying that they fool us, because most of the searches people conduct, about 86%, are for very simple facts.
00:10:55.000Are people really that afraid of just factual statements getting out, or at least opinions stated as opinions? Well, well, some information...
00:11:04.000There are forces out there that don't want that information out there.
00:11:11.000So they suppress it. And it turns out that, obviously, the press in general has always had that
00:11:18.000ability, for sure, but the press is very competitive. So, you know, you want the story nobody else
00:11:24.000is covering. Right, right. If everybody's running this direction and somebody comes to you and says, well, wait a minute, no, I've got this piece right here, you're going to run with it because it gets people to buy your paper, click on your website, whatever it is.
00:11:34.000Well, in general, too, we have a relatively free press, probably the freest press of any country in the world.
00:11:40.000So, you know, the different points of view are out there.
00:11:44.000The problem, though, with content on a platform like Google is that there is no competitor, and there's that trust that we have, not just in Google, but in algorithms and computer output.
00:12:02.000They trust those high-ranking search results, for example.
00:12:06.000They trust the search suggestions Google's flashing at you.
00:12:10.000Search suggestions, believe me, that's part of the game here.
00:12:15.000They manipulate us from the very first character we type into the search bar, literally.
00:12:20.000I mean, if you don't believe me, pick up your phone or your computer, type the letter A, they're going to flash some suggestions at you and chances are that some of them or most of them are going to be for Amazon.
00:12:57.000We'll put it up on the screen here for you to see where to send it to, @scrowder on X. I always want to call it Twitter still, because that name is just, there's so much kind of brand identity behind that and equity built up.
00:13:29.000I didn't mean to interrupt there, but... When I was testifying before Congress, and if you want to see my testimony, just go to EpsteinTestimony.com.
00:13:37.000As you can see, I've bought a lot of URLs.
00:13:39.000I was about to say, you own a lot of websites.
00:13:40.000Yeah, whoever you're hosting through really likes you.
00:13:44.000So, when I was testifying for Congress, EpsteinTestimony.com, I gave this challenge to Senator Ted Cruz, and he pulls out his phone right in the hearing, and he types in an A, and I said, well, okay, tell me what you see.
00:13:59.000Three out of the four suggestions were for Amazon.
00:14:51.000Well, first of all, if you type in something that has a bias, like you type in "Hillary is the devil," then of course you're gonna get related search suggestions. But if you just type something
00:15:04.000simple like "Donald Trump is," you should get a mix of things, probably. "Hillary Clinton is,"
00:15:34.000I mean, I'm not a fan of Hillary Clinton, but I wouldn't say that she's a baby eater.
00:15:38.000I don't know where that one came from.
00:15:41.000But if you type that in though on Google that summer of 2016, you got nothing like that.
00:15:47.000You got "Hillary Clinton is winning" and "Hillary Clinton is awesome," which no one was searching for.
00:15:53.000So what we learned in our research is this: what Google does when they're supporting a candidate or a cause is they suppress the negative search suggestions.
00:16:06.000Ah, so you're only getting neutral or positive, which, if you click on it, is going to generate search results that are kind of neutral or positive, and you're rarely if ever going to see anything negative, because they're suppressing the negative search suggestions.
00:16:21.000So that can take that 50-50 split and turn it into a 90-10 among undecided voters?
00:16:26.000Among undecided people. People are undecided on anything, it turns out.
00:16:30.000Google has that ability to manipulate with search suggestions,
00:16:34.000answer boxes, which give you the answer, the search results.
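A minimal sketch of the mechanism just described, with made-up numbers: if negative suggestions for one candidate are scrubbed while the other candidate's are left alone, an undecided user's reading diet becomes lopsided, and the 50-50 split tips hard toward the protected candidate. The page counts and negativity rates below are assumptions for illustration only, not measured values.

```python
import random

# Hypothetical model of suggestion suppression (illustrative numbers only):
# undecided users read a handful of suggestion-driven pages per candidate,
# and lean toward whichever candidate produced fewer negative impressions.

def final_lean(p_neg_x, p_neg_y, pages=6):
    """True if the user ends up leaning toward Candidate X."""
    neg_x = sum(random.random() < p_neg_x for _ in range(pages))
    neg_y = sum(random.random() < p_neg_y for _ in range(pages))
    if neg_x == neg_y:
        return random.random() < 0.5   # still a coin flip
    return neg_x < neg_y               # fewer negatives wins the lean

random.seed(3)
n = 10_000
# Negatives almost fully suppressed for X (1% slip through) vs. 35% for Y.
share_x = sum(final_lean(0.01, 0.35) for _ in range(n)) / n
print(f"Undecided users leaning X: {share_x:.0%}")  # roughly 90% with these inputs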
00:17:13.000You're typing an email because you're really mad at your girlfriend, your boyfriend, your boss, and you're typing this horrible email and what you're saying, and then you look at it and you go, Wait, I'm not sending that.
00:17:31.000If people knew the extent of the surveillance and the extent of the manipulation, first of all, I guarantee you they would stop using Gmail.
00:17:39.000I guarantee you they would stop using Chrome, which is a Google browser.
00:17:44.000They would stop using Android phones, because that's a Google product.
00:17:58.000I'm just saying that there's a world of hurt here that people are just unaware of, and that is, the more I have studied it, the more I've learned about it, the more concerned I've become.
00:18:09.000First of all, because I'm a dad, so I'm concerned about young people and what are they seeing.
00:18:15.000Is there any hidden messaging going on there?
00:18:18.000Is anyone trying to shift their thinking in any particular way?
00:18:21.000Well, one of the leaks from Google, which was, I think, in 2018 or so, 2019, was an eight-minute video from their advanced products division, which is called The Selfish Ledger.
00:18:35.000Not their advanced products division, but this video.
00:18:37.000The video is called The Selfish Ledger.
00:18:40.000And if you type in The Selfish Ledger plus my name, you'll get a transcript of it and you'll get a link where you actually can go see this video.
00:18:49.000This video is about the ability that Google has to re-engineer humanity.
00:18:56.000And they say in the video, they call it re-sequencing human behavior according to, it's right in the film, company values.
00:19:20.000It's actually without any... It's probably worse.
00:19:22.000It's much worse because it's without any accountability to anyone.
00:19:26.000And they're controlling all of this stuff, the surveillance, the censorship, the manipulation in every country in the world outside of mainland China.
00:19:52.000Again, who made them sheriff and who gave them all this power?
00:19:56.000And why aren't our public officials doing anything about this?
00:20:00.000That's the question that I think a lot of our audience will have because, look, we've discussed this before.
00:20:06.000You're very familiar with the Hunter Biden laptop story and how that swayed the election.
00:20:10.000So I think it was a combined total of 17% who would either have switched their vote or simply not voted for Biden.
00:20:17.000Just the suppression of that one story potentially could change an election.
00:20:21.000Suppressing somebody's voice on YouTube, so like you said owned by Google, could change the outcome of an election, could change the pushback on restrictive policies regarding lockdowns or masks or things that you disagree with or could hide things maybe that you do agree with, right?
00:20:36.000So it can censor people, and their rules are arbitrary.
00:20:39.000We've had strikes on our channel for things that don't even make any sense.
00:20:44.000And there's no way for us to get around it.
00:20:47.000There's no way for us to appeal it and get them to change their minds on something.
00:20:56.000It basically goes to the trash bin, I think, somewhere at Google and notifies them that, aha, maybe we need another strike to get back in line with YouTube.
00:21:04.000The problem is that politicians aren't doing anything.
00:21:07.000And we've had Senator Ted Cruz on, we've had Senator Marco Rubio on, we've had a lot of people on, and I'm like, look guys, enough talking about Section 230 as it specifically relates to those platforms, right?
00:21:18.000Either you do something about it or you don't exist.
00:21:21.000I'm not saying that's us, I'm saying that's them.
00:21:23.000If you don't do something about it, and they are exactly who you say they are, and they are skewed against conservative policy makers, then they could just make it to where people don't find you.
00:21:33.000And an election comes around and you should win, but you don't.
00:21:37.000And you've lost your opportunity to do anything.
00:21:38.000And I think probably on Section 230, you would agree like, Hey, this is part of the problem.
00:21:42.000I saw you nodding your head when I mentioned it, like Section 230 kind of being this ambiguous thing where they get the benefits of not being sued, but none of the responsibility to allow voices on as long as they're not breaking the law.
00:21:52.000And we have rules and guidelines for that, what that looks like.
00:23:35.000And where did she even get those ideas from?
00:23:37.000I'm sure That someone at Google gave her that language.
00:23:42.000That then got picked up by this machine, which I had heard of, and I didn't know it was real, but it got picked up by the New York Times and a hundred other- They did a fact check.
00:23:51.000A hundred other mainstream news sources, many of which I've dealt with in the past, I know the people, I've published in them before, and I got cut off.
00:24:03.000Just like Dershowitz, I got completely cut off from mainstream media, mainstream news, even though, again, in many cases I know the people.
00:24:11.000I mean, I was editor-in-chief of Psychology Today magazine for four years.
00:24:14.000I mean, I worked for New York magazines.
00:25:27.000One of the AGs comes out and he says, well, based on what you told us, Dr. Epstein, he said, I don't want to scare you, but he said, I think you're probably going to be killed in some sort of accident in the next few months.
00:26:13.000And that's a terrible thing, and that would be enough.
00:26:16.000But there's more to that story that makes what he said to you, one of the attorneys general there, kind of ring in the back of your mind a little bit, like, wait a minute, can I trust that this was just an accident?
00:26:29.000So, just give us a couple of things there, because when I read about it, I was like, you've got to be kidding me.
00:26:34.000Well, the little pickup truck that she was driving, um, I mean, some things were just, they were not right.
00:26:41.000The truck was never examined forensically.
00:27:32.000I don't want you to dive deeply into that.
00:27:36.000You're now in a position where you aren't suicidal, correct?
00:27:41.000I just want to make sure I get that on the record right now.
00:27:44.000If we end up finding you dead, you did not kill yourself, right?
00:27:47.000That's, and we joke around about that, but at the same time, it's, it's a real thing because these are very powerful forces that probably don't want you tinkering with what they're doing.
00:27:56.000Well, I have to keep going on X, formerly Twitter, and I had to do it again a few days ago, and I have to say, just a reminder everyone, I am not suicidal.
00:28:03.000Now, why do I have to keep doing that?
00:28:05.000Because just a few days ago there was a very good article that came out about my work, and about what I'm doing now to try to stop these companies, which I'm sure you'll want to talk about.
00:28:14.000And it was in a publication called PJ Media, which I think is very good, although it leans heavily to the right, but still I think it's very good.
00:28:24.000And it was this article about my work in PJ Media, and it compares me to another Epstein.
00:28:34.000And it says, you know, they do have one thing in common, because this Epstein that I'm writing about now, me, is just as likely to be suicided as the other Epstein was.
00:28:46.000So I have to go back online and say, no, I'm not suicidal.
00:28:50.000But I am doing work which is definitely inherently dangerous because my team and I,
00:28:59.000and I have almost 50 people working with me, brave souls.
00:29:03.000And we are discovering things about Google and to some extent, some other tech companies
00:29:15.000For one thing, we've learned not just how to test the powers that they have, which we do in our basic science, our basic experimental research.
00:29:24.000And that's good enough to know, but then, okay, now what do I do, right?
00:29:27.000So, you know, that's part of it, the basic science, that tells about their capabilities, which is interesting, but that doesn't mean they're actually doing anything, right?
00:29:36.000But starting in 2016, we have learned how to preserve what, at Google, they call ephemeral experiences.
00:29:44.000Now, in a million years, they never imagined that anyone was going to capture the ephemeral content that they show people.
00:30:36.000My head spun when I saw that, because I had been studying in experiments since 2013 the power that ephemeral experiences have. Search results are ephemeral, search suggestions are ephemeral, answer boxes, YouTube sequences, up-next videos, that's all ephemeral.
00:30:53.000And here were employees of the company saying, basically acknowledging, that they know the power that these ephemeral experiences have to shift people's thinking and behavior.
00:31:05.000Now that's incredibly dangerous that they know this and presumably maybe even do it.
00:31:24.000And, you know, if you believe that the election was stolen by, you know, these machines doing stuff or voter fraud, we were talking about that before.
00:31:39.000And so, for you to have come up with a way to monitor that and to catalog that, massive amounts of data, I'm assuming, that is not cheap, by the way, to be able to store that, to process that, to pay people to do it.
00:31:51.000I don't know if I want to go into the details of how, but you guys have a system, let's just say that.
00:31:56.000What's the name of this system that captures all this data?
00:31:59.000Isn't there like a network name or something for it that you guys have?
00:32:02.000Internally we call it our monitoring system, but publicly now we're starting to call it America's Digital Shield.
00:32:09.000And we actually have a mock-up of a website that's going to be up soon.
00:32:13.000It's going to be live, and it is going to, I hope, generate a lot of interest.
00:33:49.000And we hide their identities, just like the Nielsen Company hides the identities of the Nielsen families that they use to get the TV ratings.
00:33:57.000It's very, very expensive, very slow, very labor intensive.
00:34:01.000But each day, we have a team of almost 50 people who work on this, and each day we're able to successfully recruit, vet, NDA, equip, and train another 30 to 60 people.
00:34:18.000And then we've started getting some of their kids signed up too.
00:34:22.000So in 2016, we miraculously got 95 field agents in 24 states and preserved 13,000 searches on Google, Bing, and Yahoo.
00:34:35.000We looked at the actual search results that real people were getting and found there was extreme bias in favor of Hillary Clinton, whom I supported, and I no longer do, obviously. But the point is that we calculated from our research at the time that that level of bias, if it had been present nationwide, would have shifted between 2.6 and 10.2 million votes to Hillary Clinton, with no one knowing.
00:35:01.000As it happened, she won the popular vote by 2.8 million votes, so it looks like, if you took Google out of that election, the popular vote would probably have been very, very close.
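The interview doesn't spell out the arithmetic behind the 2.6-to-10.2-million figure, but the general shape of such an extrapolation is simple: an estimate of how many persuadable voters saw the biased results, multiplied by low and high estimates of the cumulative shift among them. The sketch below uses placeholder numbers chosen only to land in a similar range; none of them come from the actual study.

```python
# Back-of-envelope shape of a bias-to-votes extrapolation. Every number here
# is a placeholder assumption, not a figure from Epstein's 2016 monitoring data.

persuadable_googlers = 25e6   # assumed undecided voters exposed to biased results
cum_shift_low  = 0.10         # assumed cumulative shift among them, low end
cum_shift_high = 0.40         # assumed high end (lab SEME shifts ran far higher)

print(f"Low estimate:  {persuadable_googlers * cum_shift_low:,.0f} votes")
print(f"High estimate: {persuadable_googlers * cum_shift_high:,.0f} votes")
# -> 2,500,000 to 10,000,000: the same order of magnitude as the quoted range.
```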
00:36:40.000Yeah, but the story gets really crazy now, because at this time we were learning how to analyze data faster and faster and faster, so I sent it to Senator Cruz's office.
00:36:55.000Because after I had testified, a few months later, he invited me to DC and we had a four-hour private dinner, talking tech for four hours. That's why.
00:37:03.000The man is really smart and he understands these issues.
00:37:10.000A couple days later, November 5th, so this is two days after the presidential election, Cruz and two other senators sent a very threatening letter to the CEO of Google.
00:37:20.000If you want to see it, go to lettertogoogleceo.com.
00:38:27.000That says go vote, or register to vote, or mail in your ballot.
00:38:30.000They can put anything they want there, but what we have found is they do this in a partisan fashion.
00:38:36.000So for example, one example: 2022, in Florida, 100% of liberals got that go-vote reminder all day on election day.
00:38:46.000That go-vote reminder, I mean, Google's home page is seen 500 million times a day just in the United States.
00:38:53.000But 100% of liberals got that reminder in Florida in 2022; only 59% of conservatives did.
00:39:02.000That is a really extreme manipulation, because we know from actual research that Facebook conducted and published that in that kind of election, a national election, if Google is sending out a go-vote reminder in a partisan way, then literally on election day itself they can send 450,000 more votes to one candidate than to the other.
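As a rough illustration of how a partisan turnout nudge turns into a vote margin, here is the shape of the arithmetic. The reach numbers are invented; the 0.4% turnout lift is in the ballpark of Facebook's published 2012 voter-mobilization experiment that Epstein references, but treat every input as an assumption.

```python
# Shape of the partisan go-vote arithmetic (all inputs assumed, not measured):
# a turnout nudge shown to far more of one side than the other nets extra
# votes for that side on election day itself.

lib_reach = 60e6          # liberal-leaning voters shown the reminder (assumed)
con_reach = 60e6 * 0.59   # same-sized conservative group, but only 59% shown
lift      = 0.004         # assumed extra probability of voting after a reminder

net_votes = (lib_reach - con_reach) * lift
print(f"Net advantage from the reminder alone: {net_votes:,.0f} votes")
# -> about 98,000 with these inputs; larger reach or lift pushes the figure
#    toward the hundreds of thousands described above.
```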
00:39:38.000So this illustrates a very important point that Justice Brandeis made over a hundred years ago, which is, sunlight is the best disinfectant.
00:39:48.000And the second part of that quote, which no one knows except me, is, and street lamps are the best policemen.
00:40:22.000So let's just, I just want to back this off just a little bit and make sure that you guys understand, this shifts so many elections potentially.
00:40:33.000And nobody should have that power, no matter what my particular belief would be about candidates, or who should be in office.
00:40:42.000It should be a vote, not somebody... I mean, they have get-out-the-vote campaigns.
00:40:48.000We know that it works because candidates have been doing it for as long as voting has been a thing.
00:40:52.000Where you go through districts that maybe you need to win this one a little bit more.
00:41:18.000Nationwide, that's 400,000-plus votes to one candidate.
00:41:20.000To one candidate, just on election day alone with that manipulation, the partisan go-vote manipulation.
00:41:26.000Now, there's also a bit of good news, though, because the fact that we did get them to back off, the fact that sunlight... Right, so you've got some results.
00:42:24.000I happen to like that direction, but I don't like the fact that a private company is running our country, that the free and fair election is an illusion. I don't like that at all.
00:42:54.000One, that's a stupid comment, because how do you prove it's the freest and most fair election that we've had in our history as a country?
00:42:59.000I didn't realize that the other ones were less free and fair before that.
00:43:02.000And I'm not talking about the 2020 issues, but then why is one side out there saying, hey, no, these aren't necessarily... We need to make sure that these elections... Now, it's not just a bias of "our guy didn't win," right?
00:43:16.000It was, I'm seeing some stuff, and this doesn't look like it's fair. But then the other side is completely saying, well, no, these are the freest and fairest elections that we've ever had.
00:43:55.000You may think he's like this intense guy that kind of runs around these... By the way, he's been right so many weird times where I'm like, that's crazy.
00:44:01.000And then six months later, I'm like, well, son of a gun.
00:44:04.000He said they were turning the frogs gay, and it sounded weird, but there's the report!
00:44:08.000Well, you know, he just invited me to be on his show, and I don't know what to do.
00:44:58.000You know this fascination, almost an obsession, that some conservatives have with things like ballot harvesting and messing with voting machines and ballot stuffing and on and on and on, all that stuff?
00:45:14.000All of which, by the way, was claimed by, you know, Google's legal, excuse me, Trump's legal teams
00:45:19.000in 2020, and that got thrown out of 63 courts. All those beliefs, all that stuff, is actually
00:45:27.000spread by Google and to a lesser extent some other tech companies.
00:45:33.000Now why would Google want to spread a bunch of conspiracy theories and get people believing that stuff and get people repeating that stuff and then inspire OAN and Newsmax and Fox to run story after story after story?
00:46:45.000These are inherently competitive techniques.
00:46:47.000They have relatively little net effect.
00:46:51.000But Google, which wants you to believe in all these conspiracy theories, what Google does is different because they're not shifting a few hundred votes here and there, a few thousand votes here and there.
00:47:00.000They are shifting millions of votes in ways you can't see and you cannot counteract.
00:47:12.000When you started talking about that, I kind of figured that you were, once you started talking about it, I figured where you were gonna go with this.
00:47:18.000And the reason that I can say that, guys, you can go back and you can look at our shows when we had Mayor Giuliani on and said, why are we going after these?
00:47:28.000And just to be clear, on the 63 cases, the vast, vast, vast majority of those were thrown out either for standing or on procedural issues, and never actually even got to discovery, where the judge basically said, nope, we can't go down this road, not gonna go down this road.
00:47:39.000It wasn't that the entire thing was adjudicated and determined to be false, but your point I think is very valid because it was my point as well regardless.
00:47:47.000I was saying, Why are you going after these things?
00:47:50.000The main point that you can go after... Now, in your case, you're saying, hey guys, we need to be focusing on Google.
00:47:55.000I agree 100%, but we also need to be focusing on states like Pennsylvania that changed their constitution illegally.
00:48:01.000They went to the Supreme Court of Pennsylvania, who did it, and said, well, yeah, that's fine.
00:48:04.000I know it was supposed to be in two legislative sessions back-to-back.
00:48:06.000The point is, there were much more solid grounds to go on and to try to make arguments.
00:48:11.000See, you're actually proving my point.
00:48:29.000And I have a feeling if you and I chatted and chatted and chatted, more and more and more, you would come up with stuff that you think is real and big and consequential.
00:48:39.000It's probably not, because dirty tricks have always been played.
00:49:35.000I'm just telling you that you have to... What I have been studying for a long time now is a whole different kind of beast, and the beast... Oh, your beast is way worse.
00:49:45.000It's way worse because... It's the one we have to be fighting.
00:51:47.000But the point is that I have scared people, I've gotten them for a few seconds to kind of consider the truth of what I'm saying, and then they go right back.
00:51:58.000And if they're not talking about Pennsylvania, they're talking about, oh yeah, but what about this thing in Ohio?
00:52:18.000Now and then, someone actually gets it.
00:52:21.000Because Kari Lake, who lost the governorship, lost that race in 2022, she has now gotten it.
00:52:29.000She gets it and she's actually saying now, wait a minute, maybe she was distracted with some of these small issues and maybe really, maybe she should be looking at the tech companies.
00:52:43.000Ramaswamy is now saying, well, all those things you're concerned about, yes, they're all real, we should be concerned, but— There's the thing.
00:53:40.000Because, I don't have much faith, yes, Cruz is very impressive, I admit, but I don't actually have much faith in the legal system, especially these days, or in regulators, because in general, even without massive dysfunction, laws and regulations move very slowly.
00:54:20.000Because I don't, we run up against problems all the time that we find and that frustrate us, and we don't necessarily know how to have an effect to change things, right?
00:54:30.000So you have a monitoring system out there.
00:54:33.000How do we practically go, I want to back that guy and jump in with you on this?
00:54:41.000And unfortunately, you said this, you're the guy standing between us and...
00:57:06.000And in a way, that would be the best thing for our democracy and for our republic, that would be the best thing, is that we just stop them.
00:57:13.000And by the way, if I were them, I would stop, because... So we started building last fall.
00:57:19.000I managed to get just over three million dollars from some very generous donors who believed in this monitoring concept, which we now call America's Digital Shield, and we started building.
00:57:36.000And as of today, let's see, I knew this was going to come up so I went online.
00:58:18.000Now, our goal is to try to have a full system in place, with representative samples, court admissible, by the end of this year, so we can actually go right after the primaries.
00:58:30.000Literally, we can protect the primaries.
00:58:35.000And, you know, we're on the way, but the three million is pretty much gone.
00:58:44.000And I'm in a very tough position right now because to shut this system down would be insane.
00:58:52.000It would be even more insane than the fact that I built it.
00:58:55.000But it would be insane to shut it down because that would be literally handing over the free and fair election to the tech companies permanently, probably, and handing over the minds of our kids to the tech companies permanently.
01:00:08.000Whatever people can do to kind of support your cause, where do they go if they want to pitch in?
01:00:16.000You should go to mygoogleresearch.com, but let me just point out that when I go on a show that has a big audience, which this show does, and I go on a show, I get a gazillion people trying to reach me or reaching me saying, I'll be a field agent, I'll be a field agent.
01:00:35.000Don't do that, because we can't ever accept volunteers, because if we accepted volunteers, Google would send us thousands of volunteers, and we know because they have sent us people.
01:00:59.000Now, we only pay our field agents $25 a month.
01:01:03.000It's just a token fee for the privilege of being able to look over their shoulders when they're, you know, getting content from these tech companies.
01:01:11.000So, we preserve and protect their privacy.
01:01:14.000So, when that data is transmitted to us, it has no identifying information ever, and we only look at data in aggregate.
01:01:22.000So, we're doing exactly the opposite of what the tech companies do.
01:01:25.000The tech companies look at your individual data, and then, like the misers they are, they go, oh, look, more data, and then they monetize it, and they use it to manipulate you.
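Here is a minimal sketch of the aggregate-only idea he's describing: identifying fields are stripped the moment a record arrives, and analysts only ever query tallies, never individual rows. The field names and record format are hypothetical, not the project's actual schema.

```python
from collections import Counter

# Sketch of "aggregate only, no identities" data handling (hypothetical
# record format): identifying fields are dropped at ingestion, and analysis
# only ever sees counts, never individual rows.

IDENTIFYING_FIELDS = {"name", "email", "ip_address", "device_id"}

def ingest(record: dict) -> dict:
    """Strip anything identifying before the record is stored."""
    return {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}

def aggregate(records: list) -> Counter:
    """Analysts see only tallies, e.g. suggestion sentiment by state."""
    return Counter((r["state"], r["suggestion_sentiment"]) for r in records)

raw = [
    {"name": "A. Agent", "ip_address": "203.0.113.7",
     "state": "FL", "suggestion_sentiment": "positive"},
    {"name": "B. Agent", "ip_address": "203.0.113.9",
     "state": "FL", "suggestion_sentiment": "negative"},
]
stored = [ingest(r) for r in raw]
print(aggregate(stored))  # Counter({('FL', 'positive'): 1, ('FL', 'negative'): 1})
```

This is the inverse of the ad-tech model he criticizes: the individual row is worthless by design, and only the population-level pattern is retained.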
01:03:17.000So if anyone out there, if you have connections to a major foundation, connections to a generous person who can afford a major gift, I can't do this myself.
01:03:30.000I literally need help from other people.
01:03:33.000But the good news is we've gone a very long way in a relatively short time.
01:03:38.000We started building this permanent system last fall.
01:03:42.000We now have 11,638 field agents, all registered voters in all 50 states.
01:03:48.000As far as those representative samples go, we've hit that minimum in 10 states so far.
01:03:55.000So we're going state by state by state.
01:03:57.000Are you starting kind of in swing states where your focus is?
01:04:00.000Because like you said, that's where the action is going to be.
01:04:02.000We did start with the swing states because those are absolutely critical in elections.
01:04:06.000But we're after far more now, because of the content we're seeing, because we're getting data now from kids.
01:04:16.000We're seeing the content that's being sent to their devices.
01:04:18.000They don't even have to be in front of their devices, we're still seeing the content.
01:04:22.000And that is actually getting me even more concerned than the political bias, because these companies are sending data to our kids that is so creepy that at the moment we don't even know how to analyze it.
01:05:59.000For those in the know, in the social sciences, it's been studied in many fields.
01:06:03.000It's very, very real and very powerful.
01:06:07.000You know, that's what Google makes use of when they suppress negative search suggestions.
01:06:12.000Because if they let negative search suggestions appear, those negative search terms will draw 10 to 15 times as many clicks as neutral or positive terms.
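A tiny worked example of that 10-to-15x claim, with an assumed suggestion mix: give one negative suggestion a 12x draw relative to three neutral-or-positive ones, and it captures the overwhelming majority of clicks, which is exactly why suppressing it changes what people end up reading.

```python
# Worked example of the click-draw claim (suggestion mix and 12x weight assumed).
weights = {"negative": 12.0, "neutral": 1.0, "positive_1": 1.0, "positive_2": 1.0}
neg_share = weights["negative"] / sum(weights.values())
print(f"Click share of the single negative suggestion: {neg_share:.0%}")  # 80%
```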
01:07:25.000How do you – we can watch a video, we can see the creepiness in the video, but by the way, there's a creepiness that goes beyond even that one horrible moment, the traumatic moment.
01:07:40.000There's creepiness all through that we know is important.
01:07:44.000But we just don't know yet how to articulate the importance of it, and we don't know how to analyze it mathematically yet.
01:08:24.000But literally, as of when I checked this morning, we had preserved more than 41 million of these ephemeral experiences.
01:08:32.000And we're going to preserve hundreds of millions and eventually billions.
01:08:37.000And we're getting better and better at doing the analyses.
01:08:41.000And I'm just telling you, what we're doing is crazy, insane, complicated.
01:08:49.000In the beginning it seemed almost impossible, but we're doing it.
01:08:53.000And this, to me, is the real protection that we can have from not just Google, but the next Google after this one, because if you don't monitor and capture, preserve, archive ephemeral content, you'll never understand what's going on.
01:09:12.000You'll never understand why this person won the election versus that person.
01:09:16.000You'll never understand what's happening to kids.
01:09:21.000I mean, human autonomy literally will be undermined and people won't know.
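What might "preserving an ephemeral experience" look like as data? Below is a hypothetical sketch, not the project's actual system: snapshot exactly what a panelist's device displayed, timestamp it, and fingerprint the record so an archive of millions of snapshots is tamper-evident. All field names are invented for illustration.

```python
import hashlib
import json
from datetime import datetime, timezone

# Hypothetical record format for preserving an ephemeral experience:
# capture the displayed content, timestamp it, and hash the whole record
# so later tampering is detectable. (Not Epstein's actual system.)

def preserve(agent_id_hash: str, surface: str, content: list) -> dict:
    snapshot = {
        "agent": agent_id_hash,          # already anonymized upstream
        "surface": surface,              # e.g. "search_suggestions"
        "content": content,              # exactly what was displayed
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(snapshot, sort_keys=True).encode()
    snapshot["sha256"] = hashlib.sha256(payload).hexdigest()
    return snapshot

record = preserve("3fa9c2d1", "search_suggestions",
                  ["candidate x is winning", "candidate x is awesome"])
print(record["sha256"][:16], record["captured_at"])
```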
01:09:51.000I wrote this question down earlier, because, you said, with Google, obviously, all of these things exist, and I'm thinking, okay, well, what service do you use? And then I was thinking, well, wait a minute, does incognito mode actually do anything? So what do you use?
01:10:30.000So MyPrivacyTips.com will take you to a piece which begins, it's an article I wrote, and it begins, I have not received a targeted ad on my mobile phone or computer since 2014.
01:10:42.000So, it'll explain to you how you can get started in preserving your privacy, and I first published that in 2017, and good news for your viewers is that I just updated it a few days ago, so now everything's up to date, and it'll get you started.
01:11:01.000One of the things I mentioned, I'm just going to flash this and put it down, is that every single person here has a surveillance phone in their pocket, in their purse, somewhere.
01:11:20.000And remember a few years ago, I don't know, you look a lot younger than I am, but a few years ago, it wasn't that long ago, you could remove a battery from your phone.
01:11:29.000Now they solder them in so you can't remove the battery.
01:11:49.000If you disconnect from your mobile service provider, then yeah, they're just storing everything locally on the phone, but the moment you reconnect, all of it gets uploaded.
01:13:10.000Your information, all of that data, that is the product.
01:13:13.000So, we don't use, and I say we, my friends, my family, my staff, none of us use anything that has anything to do with Google.
01:13:21.000So, none of us use the Google search engine.
01:13:24.000None of you listening or watching should ever use it again because it's an extremely aggressive surveillance tool, but it's also the most powerful mind control machine ever invented in history.
01:13:39.000There's so many different ways in which that search engine is manipulating you that, again, if you really understood, trust me, you would never touch it again.
01:15:32.000It encrypts the messages end-to-end, and the attachments, so that even the folks at ProtonMail can't read them. Now, that's a very different model. Very,
01:15:45.000very different model. So instead, they just charge the people who really use it heavily, or people who are
01:15:50.000in business, a few bucks a month. That's what we should be doing. In
01:15:55.000other words, these companies should not be allowed to use surveillance in a way that we're
01:16:04.000not aware of to not just sell us stuff but also to manipulate us.
01:16:10.000The surveillance business model should be illegal.
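For the curious, here is a minimal sketch of the end-to-end model being contrasted with the surveillance model. It uses the PyNaCl library and illustrates the general idea, not ProtonMail's actual implementation: the message is sealed with keys only the two users hold, so the provider stores ciphertext it cannot read and has to charge money rather than mine data.

```python
from nacl.public import PrivateKey, Box  # pip install pynacl

# Sketch of the end-to-end model (illustrative, not ProtonMail's actual code):
# keys live only on the users' devices, so the server sees ciphertext only.

alice, bob = PrivateKey.generate(), PrivateKey.generate()

# Sender encrypts with their private key and the recipient's public key.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at noon")

# This is all the mail server ever stores:
print("server stores:", ciphertext.hex()[:32], "...")

# Only the recipient's private key, which is never uploaded, can open it.
print("recipient reads:", Box(bob, alice.public_key).decrypt(ciphertext))
```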
01:16:13.000Well, and a lot of companies now are switching away from, or at least trying to decouple themselves from, the advertisers being the only way they get paid, because then the advertisers are in control of the content that gets put on your platform, and they can turn that off at the drop of a hat depending on how they feel about you that day.
01:16:32.000When we look at 2024, when we look at the upcoming elections, obviously we've talked about the primaries, we know that you guys are there, you're monitoring this stuff, and you're hoping that your mere presence is enough to get Google to stand down and not do anything with these elections.
01:16:50.000This seems like a national security issue because it is.
01:16:53.000There is no question that this is about national security when you're talking about our elections.
01:16:58.000If we don't have free and fair elections, really what do we have and how long do we have until
01:17:03.000this country descends into a thing that you're not going to want to be a part of, right?
01:17:08.000You're not going to want to be here for that because it won't be freedom and there won't be any semblance of the America that you grew up with.
01:17:15.000And I'm not talking about cultural issues.
01:17:16.000I'm talking about just not having any choice in who represents you.
01:17:19.000That's the most basic thing that we have here.
01:17:23.000Why isn't the government, even in small parts, or organizations that are government adjacent, or somebody, stepping up and saying, you know what, that's right, we do need to fund something like that so that we can do it?
01:17:35.000Now, I don't want the government to do it.
01:17:40.000And if it's the government, I want it to be a state government or a local government funding it, because you're doing it on a state-by-state basis anyway, and so states can say, you know what, hey, we'll take care of everybody in the state of Texas.
01:17:50.000We'll take care of everybody in the state of Georgia.
01:17:52.000We'll take care of everybody in the state of Michigan.
01:17:56.000Especially right now, your strongest argument is to conservatives, because they're the group that feels like they're targeted the most by this kind of stuff.
01:18:03.000Your data seems to suggest that too, obviously, right?
01:18:06.000Why are they not stepping up and saying, we got to get this thing funded because at least it will give us a fair chance with our ideas.
01:18:13.000And if our ideas are good enough, people will vote for us.
01:18:15.000If they're not good enough, they won't.
01:18:40.000One of the best articles ever written about this problem was written by the head of the biggest publishing conglomerate in Europe, and his piece was actually called "Fear of Google," and he called it an open letter to the CEO of Google. And he was talking about the fact that they can't do anything in their business without deciding how Google's going to react, because Google can easily just snap their fingers, with or without cause, and the courts, by the way, have said, yes, you can do this, you're a private company, and they can just demote you or delete you.
01:19:17.000And if you're a publishing house or any other company, you're dead.
01:19:21.000And the courts in the United States have over and over again said, yes, you can do that, Google, either under CDA 230, which you mentioned, or under the First Amendment.
01:19:31.000They still have the First Amendment, don't forget.
01:19:34.000Which gives them the right to suppress speech, apparently.
01:19:39.000But the point is that there are some very, very wealthy conservatives who I have spoken to about this problem.
01:20:03.000He cannot take a chance on offending Google.
01:20:07.000There's a guy who in the past has supported us.
01:20:11.000But he owns a chain of blankety blanks and he or his lawyers or some of his accountants finally explained to him or his marketing people, you know what?
01:20:46.000You're in a position to make a difference.
01:20:49.000I'm sorry, but it's time to grow a pair and actually put your money where your mouth is.
01:20:55.000Nobody ever said this was going to be easy.
01:20:57.000Nobody ever said preserving democracy was not going to be painful.
01:21:01.000Nobody ever told you that being in a position of influence and power, being blessed by God with resources—I'm not saying hard work doesn't play a part, I know it does—but you're given that as a steward.
01:22:15.000If, here's the caveat, if you actually believe what you say you believe. If it's just a convenient idea, that's different, but then you need to own that.
01:22:24.000You need to look in the mirror and say, you know what, I don't really believe in this country and the ideals that started this, that gave me the opportunity to create a business of what's it called around the country, doesn't matter what it is.
01:22:34.000All of that, I don't really believe in that.
01:22:36.000I'm just glad it's here that I could take advantage of that system and make my money and go sit in my house and make sure and pray to God that I don't offend these guys and they don't come for me next.
01:22:46.000The only way to make sure that they don't come for anybody next is for you to stand up.
01:23:07.000But you're out there fighting this cause.
01:23:10.000There's so many other things that I really wanted to get to.
01:23:12.000I know we've already kind of gone long here, but these conversations could go on and on and on, and we've already listed a number of ways that people can support you.
01:24:40.000We have to protect the rights that allow us to do that, and we have to step up, and we have to give, and we have to do our part in this.
01:24:47.000I don't know if it takes exposing more politicians to what's going on, super PACs to what's going on, groups that say they fight for justice and for freedom and for the First Amendment and the Second Amendment.
01:25:42.000If you as people who have resources don't step up, if the elected officials that we have don't step up, if there aren't future Robert Epsteins that stand up, what are we doing here?
01:25:57.000This is all theater, if that's the case.