Louder with Crowder - October 05, 2023


Google Wants to Steal the 2024 Election, This is How we Stop Them! | Feat. Dr. Epstein


Episode Stats

Length

1 hour and 26 minutes

Words per Minute

177.7

Word Count

15,362

Sentence Count

1,097

Misogynist Sentences

8

Hate Speech Sentences

5


Summary

Dr. Robert Epstein joins us on the show to talk about the documentary "The Creepy Line" and why he believes Google has been manipulating votes, including shifting millions of votes toward Joe Biden in the 2020 election.


Transcript

00:00:00.000 🎵 🎵You're a stranger in love🎵
00:00:16.000 🎵I know what I know🎵 🎵
00:00:21.000 🎵You're a stranger in love🎵 🎵I got to follow🎵
00:00:25.000 🎵 🎵I'm in the spirit🎵
00:00:29.000 🎵 🎵
00:00:33.000 🎵 Alright, good morning everybody
00:00:37.000 You don't get a sip because that's not really my thing.
00:00:39.000 I leave that to Steven when he is sitting in this chair, but I still have the mug and it's still got a wonderful tasty warm beverage in it.
00:00:46.000 Glad you could be with us today.
00:00:47.000 It's a little bit of a different day.
00:00:49.000 We don't always get the time to fully dive into certain topics and so we'll take an entire hour to do that.
00:00:55.000 on occasion. You've seen us do that before when we did the investigative piece on McKinsey.
00:00:59.000 We did it when we dove into the career of Muhammad Ali. A lot of different times we take a chance
00:01:07.000 and go into a subject that we think that you are going to be interested in and in this one,
00:01:11.000 this is not much of a chance. You need to be interested in this.
00:01:15.000 Today's subject, it really does fit perfectly in the realm of something that requires us to dive just a little bit deeper, and I appreciate you guys giving us your time.
00:01:25.000 Before we jump into it, really quickly, I just want to make sure that if you are not a member of Mug Club, you go to louderwithcrowder.com slash Mug Club to sign up.
00:01:32.000 $89, you get this show, you get the Hodge Twins' show, you get Alex Jones, you get
00:01:36.000 MrGunsNGear, you get Brian Callen, you get Nick DiPaolo. Ladies and gentlemen, all of the guys,
00:01:42.000 I mean, every single one of these guys is fantastic. It affords us opportunities to do shows like this as well,
00:01:47.000 where it's not just kind of the normal show that we do, which is fun and exciting and there's always great topics
00:01:51.000 to discuss there, but every once in a while we need to take a minute
00:01:54.000 and dive just a little bit more deeply into a subject. And really, let me just kind of start at the beginning and
00:02:00.000 then I'll introduce our guest who is waiting very patiently for that moment
00:02:03.000 I appreciate that.
00:02:04.000 I was watching a movie called The Creepy Line, and I believe it was around when it came out.
00:02:09.000 2018 was when it came out.
00:02:11.000 I either watched it then or I watched it in 2019.
00:02:13.000 And this was basically a movie or kind of a documentary about how Google would manipulate its search function and, you know, potentially end up leading to manipulating elections.
00:02:23.000 There were a lot of really creepy things, not to steal the name of the movie, but it just really, it bothered me.
00:02:31.000 And it was something that I didn't know much about.
00:02:33.000 So do me a favor, comment below.
00:02:34.000 Did you know anything about the movie, the creepy line, the documentary, or maybe some of the work related to how Google could be shifting elections, manipulating you into voting?
00:02:45.000 Uh, if you're in the right district and happen to be of the right political party persuasion and that party needs to turn out the vote to win that district.
00:02:52.000 Were you aware of that?
00:02:53.000 I think you will be after today.
00:02:55.000 And unfortunately, it's going to scare the hell out of you, but that's okay.
00:02:59.000 We've got somebody on the case for us today and joining us today in third chair is Dr. Robert Epstein.
00:03:08.000 A psychologist renowned for publicizing the influence of search engines, and you've kind of had this very storied
00:03:14.000 career. Uh, and this has been something that has kind of flipped
00:03:17.000 the script on you a little bit, right? You're not a conservative
00:03:20.000 Let's start out there. It's okay, though. We're all going to be okay, right?
00:03:24.000 We're all going to talk and have a discussion where we have things that we definitely agree upon
00:03:28.000 But maybe some of the other issues we don't agree on. That's okay with everybody.
00:03:31.000 You're not a conservative, but conservatives kind of drug you into this a little bit and your world kind of turned upside down.
00:03:37.000 Tell me a little bit about that in 2016 when that happened.
00:03:42.000 What was Google doing and how did you get connected to Donald Trump and the Trump family and being a conservative and no longer a scientist?
00:03:51.000 Let me just point out that you have a gun on your desk.
00:03:53.000 I do, yes.
00:03:55.000 I'm not going to point it at anyone though.
00:03:57.000 Don't worry, it's safe.
00:03:58.000 Inanimate objects rarely kill anyone.
00:04:00.000 It's usually people that do the doing, and none of the people here want that to happen.
00:04:04.000 Okay.
00:04:05.000 Because next time I'm coming with protection.
00:04:08.000 I'm packing next time.
00:04:08.000 Don't worry.
00:04:11.000 It's totally acceptable here in the studio.
00:04:12.000 There are a lot of firearms around for everybody's protection.
00:04:16.000 We've never used them on each other.
00:04:17.000 No.
00:04:18.000 Ever.
00:04:18.000 Not once.
00:04:18.000 No.
00:04:19.000 That we can remember.
00:04:20.000 So tell me what happened that kind of threw things, I'm imagining that kind of turned life upside down just a little bit.
00:04:27.000 Well it started turning upside down actually on New Year's Day in 2012 because I got a bunch of messages from Google, there might have been eight of them or even more, saying your website has been hacked and we're blocking access.
00:04:43.000 So, you know, I've been a programmer my whole life, or at least since I was 13, and I was interested.
00:04:50.000 I mean, you know, everyone's websites get hacked.
00:04:53.000 It's very common.
00:04:53.000 Google itself has been hacked.
00:04:55.000 But why was I getting messages from Google?
00:04:55.000 For sure.
00:04:58.000 Why not from a government agency or some nonprofit organization who made them the sheriff of the internet?
00:05:05.000 So that was the first thing.
00:05:07.000 Second, I noticed that they were somehow blocking access not just on their search engine, which makes sense, it's theirs.
00:05:14.000 But they were blocking access on Firefox, which is owned by a non-profit organization.
00:05:18.000 They're blocking access on Safari, which is owned by Apple.
00:05:22.000 How could Google possibly do that?
00:05:25.000 I did eventually figure that out, and I published a big investigative piece in U.S.
00:05:30.000 News & World Report.
00:05:31.000 People can get to it at thenewcensorship.com, and I explain about Google's blacklisting.
00:05:39.000 I go into detail about their blacklisting.
00:05:41.000 That was published, I think, in 2016. At that time, I had never seen any of these blacklists.
00:05:47.000 None had leaked yet.
00:05:49.000 Since then, they have leaked.
00:05:51.000 I was writing about things which I knew existed within the company because I'm a programmer, and I wasn't guessing.
00:05:59.000 I knew for sure that these things existed.
00:06:01.000 The point is, I was learning a lot about them and getting more concerned about them. That was 2012.
00:06:09.000 2013, I started doing experiments because I started to think, wait a minute, Google maybe has some power here that we're not aware of.
00:06:19.000 There were marketing studies from late 2012 that caught my eye.
00:06:25.000 The marketers were pointing out that if you could just move up one notch in those search results, that could increase your sales by 30%.
00:06:31.000 It could make the difference between life and death for your company.
00:06:35.000 And if you get knocked off that first page, you're dead.
00:06:39.000 Yeah, you don't exist.
00:06:40.000 When was the last time that you went down to the little Google thing that had the numbers and how many pages and results came up and clicked next?
00:06:47.000 Oh, it was exactly never ago.
00:06:50.000 You barely scrolled down past the bottom of the current screen.
00:06:53.000 I don't know any time that I actually get all the way down to that part of the page anymore.
00:06:57.000 So you're right.
00:06:57.000 Life is found in that top half of that first page.
00:07:01.000 Well, actually 50% of all clicks go to the top two items.
00:07:04.000 So I was thinking, if people trust those high-ranking items, if they trust those high-ranking items so much, could those high-ranking items be used to shift people's thinking?
00:07:14.000 Maybe their opinions?
00:07:15.000 Maybe their purchases?
00:07:16.000 Maybe their votes?
00:07:19.000 So I'm a researcher.
00:07:20.000 This is what I do.
00:07:21.000 So I did a controlled experiment early 2013.
00:07:24.000 I predicted that if we randomly assign people to this group or that group, Candidate A or Candidate B, then in the Candidate A group, people see search results that favor Candidate A. So, if you click up high, you get to a webpage that makes that candidate look really good.
00:07:41.000 If you're in the other group, it's the opposite.
00:07:43.000 So, random assignment, good scientific methodology, and I thought, with this kind of manipulation, I could shift voting preferences by two or three percent.
00:07:56.000 The first shift we got was something like 43 percent, which I- Really?
00:08:00.000 Yes, of course, which I did not believe.
00:08:03.000 So we had- You're like, okay, run that again.
00:08:06.000 Exactly.
00:08:08.000 And we're not doing this with college sophomores, by the way.
00:08:10.000 We're doing this with the full age range, people who look like the American voting population.
00:08:15.000 We do it again, we get 66 percent.
00:08:19.000 I'm thinking, wait a minute, you know, we triple check, we did it over and over again.
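[Editor's note: to make the experimental design described above concrete, here is a minimal, purely illustrative Python sketch of a SEME-style controlled experiment: participants are randomly assigned to a group whose search rankings favor Candidate A or Candidate B, clicks concentrate on high-ranked results, and reading a favorable page can persuade an undecided participant. The click rates, persuasion probability, sample sizes, and candidate names are invented assumptions, not Dr. Epstein's data or code.]

```python
import random

# Illustrative rank-weighted click rates: most clicks go to the top results
# (echoing the "50% of clicks go to the top two items" figure quoted later).
CTR_BY_RANK = [0.30, 0.20, 0.13, 0.09, 0.07, 0.06, 0.05, 0.04, 0.03, 0.03]

def run_group(n_participants, favored, persuasion_prob, rng):
    """Simulate one randomly assigned group whose rankings favor `favored`.

    Every high-ranked result links to a page favorable to `favored`; each
    favorable page a participant reads persuades them with probability
    `persuasion_prob`, otherwise they choose at random (undecided).
    """
    votes = {"A": 0, "B": 0}
    for _ in range(n_participants):
        persuaded = False
        for ctr in CTR_BY_RANK:
            if rng.random() < ctr and rng.random() < persuasion_prob:
                persuaded = True
        choice = favored if persuaded else rng.choice(["A", "B"])
        votes[choice] += 1
    return votes

rng = random.Random(0)
print("Rankings favor A:", run_group(500, "A", 0.5, rng))
print("Rankings favor B:", run_group(500, "B", 0.5, rng))
```

[Comparing the two groups' vote shares gives the kind of shift being described; the size of the shift here depends entirely on the made-up parameters.]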
00:08:25.000 The bottom line is that I ended up discovering something which is now called SEME, S-E-M-E, the Search Engine Manipulation Effect.
00:08:34.000 If you go to searchenginemanipulationeffect.com, you'll actually see the seminal publication, which was in the Proceedings of the National Academy of Sciences, which is kind of a big deal if you're in science.
00:08:45.000 Yeah, it is, absolutely.
00:08:46.000 In any of the sciences.
00:08:48.000 And that, you know, that described five experiments in two countries, more than 4,000 participants, and it described one of the most powerful manipulation techniques that's ever been discovered in the behavioral sciences in a hundred years, SEME.
00:09:06.000 And basically it means, wait a minute, Google has this power and it's not like
00:09:12.000 the power to put a TV commercial on the air or to buy Facebook ads.
00:09:17.000 This is a very different kind of power because they have a monopoly on search worldwide.
00:09:22.000 Yeah, what is it, 85% of the search market is controlled by Google?
00:09:25.000 That's more than 92% now.
00:09:27.000 Worldwide and it's going up still.
00:09:27.000 Is it really?
00:09:30.000 And so Bing and DuckDuckGo haven't really dented.
00:09:34.000 Well, Bing is about 2%, DuckDuckGo is about 1%.
00:09:37.000 They don't impact elections.
00:09:38.000 That's insane, yeah.
00:09:39.000 But if you have that kind of control and you can shift opinions by that much
00:09:44.000 without people knowing, because people don't even know.
00:09:48.000 I mean how would you look at a bunch of search results and recognize that they're biased toward one candidate?
00:09:53.000 You'd have to click on every one, look at the webpages, read the webpages, you know.
00:09:58.000 So people don't know and you can't counteract what they're doing.
00:10:04.000 So it's unlike billboards and TV commercials and radio commercials because if you buy a billboard, I can buy two billboards.
00:10:11.000 Exactly. Well, and a lot of people, when you go and search, you're just assuming that an algorithm is basically taking whatever keywords you're putting in and finding you the articles or the sites that are most relevant, minus some paid spots, right?
00:10:24.000 We know that there are sponsored posts and stuff like that, sponsored listings, or, you know, maybe like the first four, but they always note those as being sponsored or paid or an advertisement, right?
00:10:33.000 And you're just assuming, okay, whatever I get next has got to be the most relevant thing to what I've typed, but you're saying it's not, or theoretically it's not. I'm saying that they fool us, because most of the searches people conduct, about 86%, are for very simple facts.
00:10:51.000 What is the capital of Texas?
00:10:54.000 Is it really that necessary?
00:10:55.000 Are people really that afraid of just factual statements getting out, or at least opinions stated as opinions? Well, some information,
00:11:04.000 there are forces out there that don't want that information out there,
00:11:11.000 so they suppress it. And it turns out that, obviously, the press in general has always had that
00:11:18.000 ability, for sure, but the press is very competitive. So, you know, you want the story nobody else
00:11:24.000 is covering, right? Right. If everybody's running this direction and somebody comes to you and says, well, wait a minute, no, I've got this piece right here, you're going to run with it because it gets people to buy your paper, click on your website, whatever it is.
00:11:34.000 Well, in general, too, we have a relatively free press, probably the freest press of any country in the world.
00:11:40.000 So, you know, the different points of view are out there.
00:11:44.000 The problem, though, with content on a platform like Google is that there is no competitor, and there's that trust that we have, not just in Google, but in algorithms and computer output.
00:11:56.000 We have this trust.
00:11:57.000 I mean, we've been studying this now for more than 11 years.
00:12:00.000 We've been measuring it.
00:12:01.000 People really trust that.
00:12:02.000 They trust those high-ranking search results, for example.
00:12:06.000 They trust the search suggestions Google's flashing at you.
00:12:10.000 Search suggestions, believe me, that's part of the game here.
00:12:15.000 They manipulate us from the very first character we type into the search bar, literally.
00:12:20.000 I mean, if you don't believe me, pick up your phone or your computer, type the letter A, they're going to flash some suggestions at you and chances are that some of them or most of them are going to be for Amazon.
00:12:32.000 Now, you're looking for aardvarks.
00:12:34.000 You're looking for apples.
00:12:35.000 You're not interested in Amazon.
00:12:36.000 Some other A. Do us a favor.
00:12:37.000 So that's a really great thing.
00:12:38.000 Let's do this.
00:12:40.000 Everybody out there right now do this.
00:12:41.000 Go to Google, type in the letter A. I want everybody to do the exact same thing.
00:12:45.000 Just the letter A, not capitalized, and then take a screenshot of what you get.
00:12:50.000 I want to see how different those results are across this audience, and send it to us at any of our social media accounts.
00:12:56.000 You can make sure you put it up.
00:12:57.000 We'll put it up on the screen here for you to see where to send it to, @scrowder on X. I always want to call it Twitter still because there's just so much kind of brand identity behind that name and equity built up.
00:13:08.000 But put it out there.
00:13:09.000 I want to see it, and I want you, more importantly, to be able to see what everybody else's results are and how different they are.
00:13:14.000 Now that's, that is, I mean, if it's Amazon, then they're pushing a company, right?
00:13:19.000 But it seems innocuous enough where you're like, okay, it could be different.
00:13:22.000 After we finish our conversation, you'll be like, holy crap, okay, now I see it.
00:13:26.000 I've done an experiment, and you've shown me this is weird.
00:13:29.000 So keep going.
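[Editor's note: for readers who want to repeat the letter-A exercise programmatically rather than by screenshot, the sketch below queries Google's long-standing but unofficial autocomplete endpoint at suggestqueries.google.com. That endpoint is an assumption in the sense that Google does not document or guarantee it; it may be rate-limited or changed at any time, and because the request carries no account history, the suggestions can differ from what a signed-in user sees in the search box.]

```python
import json
import urllib.parse
import urllib.request

def google_suggestions(prefix):
    """Fetch autocomplete suggestions for `prefix` from the unofficial
    suggest endpoint and return them as a list of strings."""
    url = ("https://suggestqueries.google.com/complete/search"
           "?client=firefox&q=" + urllib.parse.quote(prefix))
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        payload = json.loads(resp.read().decode("utf-8", errors="replace"))
    return payload[1]  # payload looks like [query, [suggestion, ...], ...]

if __name__ == "__main__":
    for suggestion in google_suggestions("a"):
        print(suggestion)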
00:13:29.000 I didn't mean to interrupt there, but... When I was testifying before Congress, and if you want to see my testimony, just go to EpsteinTestimony.com.
00:13:37.000 As you can see, I've bought a lot of URLs.
00:13:39.000 I was about to say, you own a lot of websites.
00:13:40.000 Yeah, whoever you're hosting through really likes you.
00:13:44.000 So, when I was testifying for Congress, EpsteinTestimony.com, I gave this challenge to Senator Ted Cruz, and he pulls out his phone right in the hearing, and he types in an A, and I said, well, okay, tell me what you see.
00:13:59.000 Three out of the four suggestions were for Amazon.
00:14:02.000 Really?
00:14:03.000 And he said, well, why?
00:14:04.000 I said, well, because Amazon is Google's biggest advertiser and Google is Amazon's biggest single source of traffic.
00:14:13.000 These are business partners.
00:14:16.000 We've shown in experiments that just by manipulating those search suggestions, they're flashing at you.
00:14:21.000 We can turn a 50-50 split among undecided voters into a 90-10 split with no one having the slightest idea that that has occurred.
00:14:32.000 But so give me a little bit more information there.
00:14:35.000 So what would happen, so if I was on Google right now and I was typing in some information, are you saying something like, how do I vote?
00:14:44.000 Or which candidate?
00:14:45.000 Or what did Trump say?
00:14:46.000 Or what did Biden say?
00:14:48.000 What am I typing in that is skewing me?
00:14:50.000 Or is it all of the above?
00:14:51.000 Well, first of all, if you type in something that has a bias, like you type in Hillary is the devil, then of course you're gonna get related search suggestions. But if you just type something
00:15:04.000 simple like Donald Trump is, you should get a mix of things probably. Hillary Clinton is,
00:15:11.000 you should get a mix of things.
00:15:13.000 Well, 2016, before that election, on Bing or Yahoo, if you typed in Hillary Clinton is,
00:15:19.000 you got what people were generally searching for, which ironically you could verify at the time on
00:15:25.000 Google Trends, which was things like Hillary is the devil, Hillary is sick, Hillary eats babies,
00:15:31.000 whatever, all those crazy things.
00:15:33.000 Babies?
00:15:34.000 I mean, I'm not a fan of Hillary Clinton, but I wouldn't say that she's a baby eater.
00:15:38.000 I don't know where that one came from.
00:15:41.000 But if you type that in though on Google that summer of 2016, you got nothing like that.
00:15:47.000 You got Hillary Clinton is winning and Hillary Clinton is awesome, which no one was searching for.
00:15:53.000 So what we learned in our research is this: what Google does when they're supporting a candidate or a cause is they suppress the negative search suggestions.
00:16:06.000 Ah, so you're only getting neutral or positive, which is going to take you usually, which is going to generate, if you click on it, it's going to generate search results that are kind of neutral or positive, and you're rarely if ever going to see anything negative because they're suppressing the negative search suggestions.
00:16:21.000 So that can take that 50-50 split and turn it into a 90-10 among undecided voters?
00:16:26.000 Among undecided people. People are undecided on everything, it turns out.
00:16:30.000 Google has that ability to manipulate with search suggestions,
00:16:34.000 answer boxes, which give you the answer, the search results.
00:16:38.000 Google owns YouTube.
00:16:39.000 They're manipulating people with a sequence of videos, with the up next video.
00:16:44.000 They are also, it turns out, indoctrinating our kids, which is a new area of study for us.
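[Editor's note: the mechanism described above, suppressing negative autocomplete suggestions so that undecided searchers are funneled toward neutral or positive pages, can be illustrated with a few lines of Python. The suggestion pool, sentiment labels, and candidate name below are hypothetical; this is a sketch of the described mechanism, not a claim about what any real suggestion list contains.]

```python
# Hypothetical suggestion pool with sentiment labels (illustration only).
RAW_SUGGESTIONS = [
    ("candidate x is winning", "positive"),
    ("candidate x is a crook", "negative"),
    ("candidate x is sick",    "negative"),
    ("candidate x is awesome", "positive"),
    ("candidate x is married", "neutral"),
]

def shown_suggestions(pool, suppress_negative=False, limit=4):
    """Return the suggestions a searcher would be shown.

    With suppress_negative=True, negative items are removed before the
    list is truncated, which is the manipulation described in the interview.
    """
    if suppress_negative:
        pool = [item for item in pool if item[1] != "negative"]
    return [text for text, _tone in pool[:limit]]

print(shown_suggestions(RAW_SUGGESTIONS))                          # unfiltered
print(shown_suggestions(RAW_SUGGESTIONS, suppress_negative=True))  # suppressed
```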
00:16:50.000 But the point is that if you had the slightest idea of what Google is doing...
00:16:58.000 For example, they monitor all your gmails.
00:17:03.000 Not just the ones you write, not just the incoming emails coming from whatever they're coming from, the attachments.
00:17:12.000 How about this?
00:17:13.000 You're typing an email because you're really mad at your girlfriend, your boyfriend, your boss, and you're typing this horrible email and what you're saying, and then you look at it and you go, Wait, I'm not sending that.
00:17:24.000 That's crazy.
00:17:25.000 And then you erase it.
00:17:26.000 Google already has recorded it.
00:17:28.000 That's part of your profile now.
00:17:31.000 If people knew the extent of the surveillance and the extent of the manipulation, first of all, I guarantee you they would stop using Gmail.
00:17:39.000 I guarantee you they would stop using Chrome, which is a Google browser.
00:17:44.000 They would stop using Android phones because that's a Google.
00:17:47.000 Yeah!
00:17:48.000 Now we're on the same page.
00:17:48.000 You know what?
00:17:50.000 There we go.
00:17:51.000 I'm not saying I'm a Tim Cook, you know, loyalist or anything.
00:17:53.000 I just like the products a little bit better than Android.
00:17:55.000 But continue.
00:17:56.000 You're singing my tune.
00:17:58.000 I'm just saying that there's a world of hurt here that people are just unaware of, and that is, the more I have studied it, the more I've learned about it, the more concerned I've become.
00:18:09.000 First of all, because I'm a dad, so I'm concerned about young people and what are they seeing.
00:18:15.000 Is there any hidden messaging going on there?
00:18:18.000 Is anyone trying to shift their thinking in any particular way?
00:18:21.000 Well, one of the leaks from Google, which is I think in 2018 or so, 2019, was an eight-minute video from their advanced products division, which is called The Selfish Ledger.
00:18:35.000 Not their advanced products division, but this video.
00:18:37.000 The video is called The Selfish Ledger.
00:18:40.000 And if you type in The Selfish Ledger plus my name, you'll get a transcript of it and you'll get a link where you actually can go see this video.
00:18:49.000 This video is about the ability that Google has to re-engineer humanity.
00:18:56.000 And they say in the video, they call it re-sequencing human behavior according to, it's right in the film, company values.
00:19:06.000 Really?
00:19:07.000 But whatever they deem to be the correct behavior.
00:19:11.000 It's sounding like the Chinese Communist Party without actually having the communism or the party.
00:19:15.000 It's just this big corporation that's doing this.
00:19:18.000 That's terrifying.
00:19:20.000 It's actually without any... It's probably worse.
00:19:22.000 It's much worse because it's without any accountability to anyone.
00:19:26.000 And they're controlling all of this stuff, the surveillance, the censorship, the manipulation in every country in the world outside of mainland China.
00:19:35.000 And they're accountable to no one.
00:19:38.000 So it's much, much worse than China.
00:19:41.000 The Chinese know they're being surveilled and manipulated by the Chinese government.
00:19:44.000 The Chinese government doesn't hide that.
00:19:46.000 But here you've got a private company that's accountable to nobody.
00:19:51.000 We didn't elect them.
00:19:52.000 Again, who made them sheriff and who gave them all this power?
00:19:56.000 And why aren't our public officials doing anything about this?
00:20:00.000 That's the question that I think a lot of our audience will have because, look, we've discussed this before.
00:20:06.000 You're very familiar with the Hunter Biden laptop story and how that swayed the election.
00:20:10.000 So I think it was a combined total of 17% who would either have switched their vote or simply not voted for Biden.
00:20:17.000 Just the suppression of that one story potentially could change an election.
00:20:21.000 Suppressing somebody's voice on YouTube, so like you said owned by Google, could change the outcome of an election, could change the pushback on restrictive policies regarding lockdowns or masks or things that you disagree with or could hide things maybe that you do agree with, right?
00:20:36.000 So it can censor people, and their rules are arbitrary.
00:20:39.000 We've had strikes on our channel for things that don't even make any sense.
00:20:44.000 And there's no way for us to get around it.
00:20:47.000 There's no way for us to appeal it and get them to change their minds on something.
00:20:51.000 That doesn't work.
00:20:52.000 They say, there's an appeals process you can submit.
00:20:53.000 Click here if you disagree with this strike.
00:20:55.000 And it's like, yes.
00:20:56.000 It basically goes to the trash bin, I think, somewhere at Google and notifies them that, aha, maybe we need another strike to get back in line with YouTube.
00:21:04.000 The problem is that politicians aren't doing anything.
00:21:07.000 And we've had Senator Ted Cruz on, we've had Senator Marco Rubio on, we've had a lot of people on, and I'm like, look guys, enough talking about Section 230 as it specifically relates to those platforms, right?
00:21:17.000 Enough talking about it.
00:21:18.000 Either you do something about it or you don't exist.
00:21:21.000 I'm not saying that's us, I'm saying that's them.
00:21:23.000 If you don't do something about it, and they are exactly who you say they are, and they are skewed against conservative policy makers, then they could just make it to where people don't find you.
00:21:33.000 And an election comes around and you should win, but you don't.
00:21:37.000 And you've lost your opportunity to do anything.
00:21:38.000 And I think probably on Section 230, you would agree like, Hey, this is part of the problem.
00:21:42.000 I saw you nodding your head when I mentioned it, like Section 230 kind of being this ambiguous thing where they get the benefits of not being sued, but none of the responsibility to allow voices on as long as they're not breaking the law.
00:21:52.000 And we have rules and guidelines for that, what that looks like.
00:21:52.000 Right.
00:21:55.000 Otherwise it should be, you have the ability to have some free speech in the public square, but they're not even allowing you to do that.
00:22:02.000 That's the obvious stuff.
00:22:03.000 We know about that.
00:22:04.000 That's become like a car alarm to us, unfortunately.
00:22:07.000 We get a strike on YouTube and it's like, yeah, it's a Tuesday.
00:22:09.000 Fine.
00:22:10.000 That happens.
00:22:11.000 This is even worse.
00:22:12.000 And so let me just kind of rewind a little bit to 2016.
00:22:15.000 So where did you get your PhD?
00:22:19.000 A conservative bastion?
00:22:21.000 A little university in Cambridge, Massachusetts called Harvard.
00:22:25.000 So Harvard train.
00:22:25.000 Harvard.
00:22:25.000 Okay.
00:22:26.000 You're a smart guy, right?
00:22:28.000 You're not a conservative.
00:22:30.000 So that would make you, would you call yourself a liberal?
00:22:33.000 I lean left.
00:22:34.000 Lean left.
00:22:35.000 Okay, perfect.
00:22:36.000 Leans left, supported Hillary Clinton, and then this happens.
00:22:39.000 Donald Trump tweets out, wow, report just out, Google manipulated 2.6 million to 16
00:22:45.000 million votes for Hillary Clinton in 2016 election.
00:22:48.000 This, this, and I think this next one, this was put out by a Clinton supporter, not a Trump supporter.
00:22:52.000 Google should be sued.
00:22:53.000 My victory was even bigger than thought. @JudicialWatch.
00:22:56.000 Now there's some context there.
00:22:57.000 Those numbers aren't quite right, I believe.
00:22:59.000 I think you said potentially up to 2 million, to 16 million, or 12 million, whatever it was.
00:23:03.000 I think it was off a little bit on the top number.
00:23:06.000 And it didn't go into some of the nuance.
00:23:08.000 But immediately what happened?
00:23:10.000 Hillary Clinton replied to him.
00:23:13.000 Now, she had no reason to reply to him.
00:23:16.000 None!
00:23:17.000 And what she said was this.
00:23:19.000 She said that that research has been debunked, which is completely false.
00:23:25.000 But I checked with lawyers, that's an opinion.
00:23:28.000 And she said, and it's based on data obtained from 21 undecided voters.
00:23:34.000 And I went, what?
00:23:35.000 And where did she even get those ideas from?
00:23:37.000 I'm sure That someone at Google gave her that language.
00:23:42.000 That then got picked up by this machine, which I had heard of, and I didn't know it was real, but it got picked up by the New York Times and a hundred other- They did a fact check.
00:23:50.000 I read it today.
00:23:51.000 A hundred other mainstream news sources, many of which I've dealt with in the past, I know the people, I've published in them before, and I got cut off.
00:24:03.000 Just like Dershowitz, I got completely cut off from mainstream media, mainstream news, even though, again, in many cases I know the people.
00:24:11.000 I mean, I was editor-in-chief of Psychology Today magazine for four years.
00:24:14.000 I mean, I worked for New York magazines.
00:24:17.000 Yeah.
00:24:18.000 And they cut me off.
00:24:22.000 How did that—so as somebody who leans left, how did that make you feel?
00:24:25.000 I mean, because in your mind, you're like, guys, I'm going where the data's telling me.
00:24:28.000 Like, we have a massive problem here that you should care about, too.
00:24:31.000 And that's really all you're saying.
00:24:32.000 Just because Donald Trump chose to grab a hold of that in that moment shouldn't have—shouldn't have changed what was going on.
00:24:39.000 But it seems like it did.
00:24:40.000 What— Oh, it's—in many ways—let me explain.
00:24:48.000 Trump's tweet came in the summer of 2019.
00:24:50.000 That was shortly after I had testified before Congress about my work.
00:24:58.000 And also that summer, I gave a private briefing to a bunch of state attorneys general, including the guy who just got off the hook.
00:25:09.000 Ken Paxton.
00:25:10.000 Ken Paxton, who actually had been really, really aggressive and getting more and more aggressive against Google over the years.
00:25:20.000 And so I'm standing out in the hall.
00:25:24.000 This is all the summer of 2019.
00:25:27.000 One of the AGs comes out and he says, well, based on what you told us, Dr. Epstein, he said, I don't want to scare you, but he said, I think you're probably going to be killed in some sort of accident in the next few months.
00:25:41.000 Then he walked away.
00:25:42.000 I know exactly who it was.
00:25:43.000 He's still an AG.
00:25:45.000 And obviously I'm here, so I wasn't killed, but my wife was.
00:25:50.000 And that was a very sad story that you and I were talking about, but it's not just a sad story, it's terrible.
00:25:56.000 I'm married, I have two young kids.
00:25:58.000 How many kids do you have?
00:25:59.000 Five.
00:25:59.000 You have five children.
00:26:01.000 What's the oldest and youngest?
00:26:02.000 Give me the spread here.
00:26:03.000 Oh, I don't want to.
00:26:04.000 Okay, they're older.
00:26:05.000 Are they older or are they young?
00:26:06.000 It's a spread.
00:26:07.000 All right, so I have two very young children right now, and so I'm just entering into the dadhood phase.
00:26:07.000 Okay, it's a spread.
00:26:13.000 And that's a terrible thing, and that would be enough.
00:26:16.000 But there's more to that story That makes what he said to you, one of the attorneys general there, said to you, kind of ring in the back of your mind a little bit, like, wait a minute, can I trust that this was just an accident?
00:26:29.000 So, just give us a couple of things there, because when I read about it, I was like, you've got to be kidding me.
00:26:34.000 Well, the little pickup truck that she was driving, um, I mean, some things were just, they were not right.
00:26:41.000 The, the truck was never examined forensically.
00:26:43.000 It disappeared from the impound yard.
00:26:46.000 I said, where, where is it?
00:26:47.000 And they said, oh, it was the remains of the truck were purchased by some dealer in Mexico.
00:26:53.000 So it had disappeared to Mexico.
00:26:54.000 That's your property though, right?
00:26:56.000 I thought it was my property.
00:26:59.000 That doesn't make any sense.
00:27:01.000 Did they ever give you an explanation for why it happened?
00:27:04.000 No.
00:27:04.000 And an explanation at that point wouldn't have made any difference because the thing was gone.
00:27:09.000 So, uh, you know, and I did talk with a woman who was actually in the car right behind her.
00:27:15.000 And from what she, the way she was explaining it to me, it sounded like something had gone suddenly wrong with her brakes.
00:27:22.000 And of course, brakes can be tampered with, but yeah.
00:27:25.000 Anyway, who knows?
00:27:26.000 The problem is, I'll never know.
00:27:28.000 No, I know.
00:27:29.000 And I'm sorry for that.
00:27:30.000 I don't want to relive it for you.
00:27:32.000 I don't want you to dive deeply into that.
00:27:36.000 You're now in a position where you're, you're, you aren't suicidal, correct?
00:27:41.000 I just want to make sure I get that on the record right now.
00:27:44.000 If we end up finding, you did not kill yourself, right?
00:27:47.000 That's, and we joke around about that, but at the same time, it's, it's a real thing because these are very powerful forces that probably don't want you tinkering with what they're doing.
00:27:56.000 Well, I have to keep going on ex-formerly Twitter, and I had to do it again a few days ago, and I have to say, just a reminder everyone, I am not suicidal.
00:28:03.000 Now, why do I have to keep doing that?
00:28:05.000 Because just a few days ago there was a very good article came out about my work, and about what I'm doing now to try to stop these companies, which I'm sure you'll want to talk about.
00:28:14.000 And it was in a publication called PJ Media, which I think is very good, although it leans heavily to the right, but still I think it's very good.
00:28:24.000 And it was this article about my work in PJ Media, and it compares me to another Epstein.
00:28:33.000 He was an Epstein.
00:28:34.000 And it says, you know, they do have one thing in common, because this Epstein that they're writing about now, me, is just as likely to be suicided as the other Epstein was.
00:28:46.000 So I have to go back online and say, no, I'm not suicidal.
00:28:50.000 But I am doing work which is definitely inherently dangerous because my team and I,
00:28:59.000 and I have almost 50 people working with me, brave souls.
00:29:03.000 And we are discovering things about Google and to some extent, some other tech companies
00:29:12.000 that are really scary because...
00:29:15.000 For one thing, we've learned how to, not just how to test the powers that they have, that we do in our basic science, our basic experimental research.
00:29:24.000 And that's good enough to know, but then, okay, now what do I do, right?
00:29:27.000 Exactly.
00:29:27.000 So, you know, that's part of it, the basic science, that tells about their capabilities, which is interesting, but that doesn't mean they're actually doing anything, right?
00:29:36.000 But we've, starting in 2016, we have learned how to preserve what, at Google, they call ephemeral experiences.
00:29:44.000 Now, in a million years, they never imagined that anyone was going to capture the ephemeral content that they show people.
00:29:52.000 So, what's ephemeral content?
00:29:54.000 Why is that so important?
00:29:55.000 I'm glad you asked the question, because it made me seem smarter for not having to.
00:29:59.000 Maybe the audience would like to know.
00:30:01.000 I know, but maybe they don't know.
00:30:02.000 Well, I'll just tell you, and then you can pass it on.
00:30:05.000 So, ephemeral means fleeting, short-lived.
00:30:10.000 And by definition, ephemeral means it disappears and you can't go back and see what it was.
00:30:16.000 2018, there's a leak of emails to the Wall Street Journal.
00:30:20.000 Of course.
00:30:20.000 Of course.
00:30:23.000 What are these Googlers, they call themselves, what are they discussing?
00:30:28.000 How can we use ephemeral experiences to change people's views about Trump's travel ban?
00:30:35.000 Well, wow!
00:30:36.000 My head spun when I saw that because I had been studying in experiments since 2013 the power that ephemeral experiences have. Search results are ephemeral, search suggestions are ephemeral, answer boxes, YouTube sequences, up next videos, that's all ephemeral.
00:30:53.000 And here were employees of the company basically acknowledging that they know the power that these ephemeral experiences have to shift people's thinking and behavior.
00:31:05.000 Now that's incredibly dangerous that they know this and presumably maybe even do it.
00:31:12.000 I'll get to that in a minute.
00:31:14.000 Because it disappears.
00:31:17.000 In other words, it leaves no paper trail.
00:31:20.000 Normally, when people commit a crime, there's a paper trail.
00:31:23.000 It's very hard to prosecute.
00:31:23.000 Right.
00:31:23.000 Exactly.
00:31:24.000 And, you know, if you believe that the election was stolen by, you know, these machines doing stuff or voter fraud, we were talking about that before.
00:31:32.000 How do you prove that?
00:31:33.000 It's one of the hardest things.
00:31:34.000 This is worse.
00:31:35.000 This is worse because these things don't leave a trace.
00:31:38.000 It makes it even harder.
00:31:39.000 And so, for you to have come up with a way to monitor that and to catalog that, massive amounts of data, I'm assuming, that is not cheap, by the way, to be able to store that, to process that, to pay people to do it.
00:31:51.000 I don't know if I want to go into the details of how, but you guys have a system, let's just say that.
00:31:56.000 What's the name of this system that captures all this data?
00:31:59.000 Isn't there like a network name or something for it that you guys have?
00:32:02.000 Internally we call it our monitoring system, but publicly now we're starting to call it America's digital shield.
00:32:09.000 And we actually have a mock-up of a website that's going to be up soon.
00:32:13.000 It's going to be live that is going to, I hope, generate a lot of interest.
00:32:20.000 Yeah, there we go.
00:32:21.000 We've got it right there.
00:32:22.000 All right.
00:32:24.000 What we have learned to do better and better and better starting in 2016 is to preserve ephemeral content.
00:32:31.000 So how do you do this?
00:32:35.000 Imbeciles at places like The Economist and Columbia University, and I have to call them imbeciles because… Is this a Harvard snob thing?
00:32:43.000 I'm just kidding.
00:32:44.000 A little competitiveness?
00:32:44.000 A little bit.
00:32:45.000 A little bit.
00:32:47.000 Maybe a little bit.
00:32:48.000 That's okay.
00:32:49.000 It's a nice university.
00:32:50.000 But you know what they do to try to see if there's bias in Google content?
00:32:54.000 They use an anonymized computer.
00:32:57.000 And that one computer over and over again, they're running searches and then they look at the stuff and they say, there's no bias.
00:33:03.000 There's no bias.
00:33:03.000 There's no bias.
00:33:04.000 Okay.
00:33:05.000 You can't look at anonymized, at content coming from an anonymized computer because their algorithm recognizes that as a bot.
00:33:13.000 It recognizes that it's not a real person because it doesn't have a long 20 year history of data in their database.
00:33:22.000 So they can play it all square because it's a bot potentially and not show bias.
00:33:27.000 Oh, we've shown this. We've shown this. If they know it's not a person, they just send perfectly unbiased stuff.
00:33:33.000 Really?
00:33:34.000 So, how are you going to see the real stuff?
00:33:36.000 So now, starting in 2016, we've come up with better and better ways of recruiting people around the country.
00:33:44.000 We call them our field agents.
00:33:45.000 They're all registered voters.
00:33:47.000 They're real people.
00:33:49.000 And we hide their identities, just like the Nielsen Company hides the identities of the Nielsen families that they use to get the TV ratings.
00:33:56.000 Same thing.
00:33:57.000 It's very, very expensive, very slow, very labor intensive.
00:34:01.000 But each day, we have a team of almost 50 people who work on this, and each day we're able to successfully recruit, vet, NDA, equip, and train another 30 to 60 people.
00:34:18.000 And then we've started getting some of their kids signed up too.
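[Editor's note: Dr. Epstein does not describe his monitoring system's internals here, so the sketch below is only a generic illustration of what a panel-based monitor has to do: record each "ephemeral experience" (query, suggestions, result URLs, timestamp) from a real panelist's machine under a pseudonymous ID so it can be compared later. The field names, hashing scheme, and data format are assumptions made for illustration, not the actual America's Digital Shield design.]

```python
import hashlib
import json
import time

def panelist_id(name, salt):
    """Pseudonymous panelist ID, hiding identity the way Nielsen hides its families."""
    return hashlib.sha256((salt + name).encode()).hexdigest()[:16]

def preserve_ephemeral(pid, query, suggestions, result_urls):
    """Serialize one ephemeral experience so it survives after the page is gone.

    A real panel client would transmit this record to a collection server;
    here we simply return the JSON line that would be stored.
    """
    return json.dumps({
        "panelist": pid,
        "captured_at": time.time(),
        "query": query,
        "suggestions": suggestions,
        "result_urls": result_urls,
    })

pid = panelist_id("example field agent", salt="per-study-secret")
print(preserve_ephemeral(pid, "candidate x",
                         ["candidate x is winning"],
                         ["https://example.com/story1"]))
```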
00:34:22.000 So in 2016, we miraculously got 95 field agents in 24 states, preserved 13,000 searches on Google, Bing, and Yahoo.
00:34:35.000 Looked at the actual search results that real people were getting and found there was extreme bias in favor of Hillary Clinton, whom I supported, and I no longer do, obviously, but the point is that we calculated from our research at the time that that level of bias, if it had been present nationwide, would have shifted between 2.6 and 10.2 million votes to Hillary Clinton, with no one knowing.
00:35:01.000 As it happened, she won the popular vote by 2.8 million votes, so it looks like If you took Google out of that election, the popular vote would probably have been very, very close.
00:35:14.000 We built a bigger system in 2018.
00:35:17.000 2020, we didn't have 95 field agents.
00:35:21.000 2020, we had 1,735 field agents in four swing states, because that's where the action is.
00:35:27.000 And we preserved more than 1.5 million ephemeral experiences.
00:35:32.000 And we found, guess what?
00:35:34.000 Extreme political bias favoring Joe Biden.
00:35:39.000 Whom I was supporting, but I no longer do.
00:35:42.000 Anyway, so wait a minute.
00:35:46.000 That means that they shifted more than 6 million votes to Joe Biden.
00:35:51.000 Wait a minute.
00:35:52.000 They shifted 6 million votes?
00:35:53.000 More than 6 million, at least 6 million.
00:35:55.000 Can you tell me what the swing states were?
00:35:58.000 If I can remember, I think Florida, Arizona, Ohio, fourth one, I don't know, could be Michigan, I'm not sure.
00:36:05.000 Okay, so these are states that change in election, potentially, right?
00:36:10.000 These are battles, yeah.
00:36:10.000 Because we know that there were razor-thin margins in these places.
00:36:14.000 Now, so, what was it in Georgia?
00:36:17.000 It was 11,000 votes or something like that.
00:36:20.000 In Arizona and in Michigan and Wisconsin, it was really, really, really close.
00:36:25.000 Pennsylvania was close.
00:36:27.000 All these places.
00:36:28.000 2020, Georgia was one of the four.
00:36:30.000 So I've left out Georgia, but Georgia we had more than a thousand field agents just in Georgia because there was so much going on there.
00:36:36.000 There were those two.
00:36:37.000 Six million in just those four.
00:36:40.000 Yeah, but the story gets really crazy now, because at this time we were learning how to analyze data faster and faster and faster, so I sent it to Senator Cruz's office.
00:36:53.000 Why did you pick Senator Ted Cruz?
00:36:55.000 Because after I had testified, a few months later he invited me to DC. We had a four-hour private dinner, talking tech for four hours. That's why.
00:37:03.000 The man is really smart and he understands these issues.
00:37:05.000 He is, yeah.
00:37:06.000 So I sent him the data that we had.
00:37:10.000 A couple days later, November 5th, so this is two days after the presidential election, Cruz and two other senators sent a very threatening letter to the CEO of Google.
00:37:20.000 If you want to see it, go to lettertogoogleceo.com.
00:37:24.000 I'm not kidding.
00:37:25.000 One more of your site's letters.
00:37:26.000 Lettertogoogleceo.com, and you'll actually see the letter written by these senators.
00:37:31.000 And now, the shocker, November 5th, 2020, Google, that day, shut off all of the bias going to voters in Georgia.
00:37:42.000 How do we know?
00:37:43.000 Because we had more than a thousand field agents there.
00:37:45.000 Why does this matter?
00:37:46.000 Because they were, at this point, gearing up for the two Senate runoff elections in January.
00:37:52.000 But Google backed away from that election.
00:37:56.000 Literally, the bias, which we've never seen this happen before, the bias in Google went to zero.
00:38:01.000 Google search went to zero.
00:38:04.000 But more important, those partisan GoVote reminders that they had been sending out stopped.
00:38:12.000 So tell me what the GoVote reminder is.
00:38:14.000 Is that on the search page?
00:38:15.000 Where are you talking about that?
00:38:17.000 Oh no, this is far more dangerous.
00:38:18.000 This is on Google's home page, which is usually blank, but now and then they put a big message there.
00:38:25.000 It says GoVote.
00:38:27.000 That says go vote, or register to vote, or mail in your ballot.
00:38:30.000 They can put anything they want there, but what we have found is they do this in a partisan fashion.
00:38:36.000 So for example, one example, 2022, in Florida, 100% of liberals got that go vote reminder all day on election day.
00:38:46.000 That go vote reminder, I mean, Google's home page has seen 500 million times a day just in the United States.
00:38:53.000 But 100% of liberals got that reminder in Florida, 2022, only 59% of conservatives did.
00:39:02.000 That is a really extreme manipulation, because we know from actual research that Facebook conducted and published that in that kind of election, a national election, if Google is sending out a go vote reminder in a partisan way, then literally on election day itself they can send 450,000 more votes to one candidate than to the other.
00:39:02.000 Really?
00:39:26.000 They can get people to vote who would normally stay sitting on their sofas all day.
00:39:31.000 They can get that many people off their sofas.
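[Editor's note: the arithmetic behind a claim like "450,000 extra votes from a partisan go-vote reminder" can be sketched as a back-of-the-envelope calculation. In the sketch below, the per-impression turnout lift of roughly 0.4 percentage points is borrowed, as an assumption, from Facebook's published 2010 voter-turnout experiment; the audience sizes are placeholders, and the 100% versus 59% coverage split echoes the Florida 2022 figures quoted above. The output is only as good as those assumed numbers and is not Epstein's actual model.]

```python
# All inputs are illustrative assumptions, not measured data.
TURNOUT_LIFT = 0.004  # assumed extra probability of voting per reminder shown

def net_votes(liberal_users, conservative_users,
              liberal_coverage, conservative_coverage):
    """Net extra votes for the favored side from unequal reminder coverage."""
    lifted_liberals = liberal_users * liberal_coverage * TURNOUT_LIFT
    lifted_conservatives = conservative_users * conservative_coverage * TURNOUT_LIFT
    return lifted_liberals - lifted_conservatives

# Placeholder audiences of 30 million per side, 100% vs. 59% coverage.
print(round(net_votes(30_000_000, 30_000_000, 1.00, 0.59)))
```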
00:39:34.000 The point is they stopped.
00:39:37.000 They stopped.
00:39:38.000 So this illustrates a very important point that Justice Brandeis made over a hundred years ago, which is, sunlight is the best disinfectant.
00:39:48.000 And the second part of that quote, which no one knows except me, is, and street lamps are the best policemen.
00:39:55.000 Yeah.
00:39:56.000 That's what Justice Brandeis said in 1917 or something like that.
00:40:00.000 It's perfect.
00:40:00.000 And we use it all the time and we don't use the full one.
00:40:02.000 We're going to change that.
00:40:04.000 You can't just use sunlight as the best disinfectant anymore.
00:40:06.000 You have to have the cops thing in there.
00:40:07.000 That's right.
00:40:08.000 By the way, are you starting to get the picture a little bit here?
00:40:11.000 Is this not a little bit creepy and scary?
00:40:14.000 I got a couple of guys in here doing the stuff, the tech wise.
00:40:16.000 These guys are like, really?
00:40:20.000 Are you serious?
00:40:21.000 A little shocked.
00:40:22.000 So let's just, I just want to back this off just a little bit and make sure that you guys understand, this shifts so many elections potentially.
00:40:33.000 And nobody should have that power, no matter what my particular belief would be about candidates, or who should be in office.
00:40:42.000 It should be a vote, not somebody... I mean, they have get-out-the-vote campaigns.
00:40:48.000 We know that it works because candidates have been doing it for as long as voting has been a thing.
00:40:52.000 Where you go through districts that maybe you need to win this one a little bit more.
00:40:52.000 Right?
00:40:56.000 Make sure you do a get out the vote campaign.
00:40:57.000 Have door knockers.
00:40:58.000 Have people calling.
00:40:59.000 Have people giving rides.
00:41:00.000 Have signs out to remind people to go vote because they can't be bothered in their daily life to remember that it's an election day.
00:41:05.000 That's kind of an important thing to determine who is going to be in charge of your area or your country.
00:41:11.000 And they do this to swing, in just one state, you're saying 400 and some odd thousand?
00:41:15.000 No, no, no.
00:41:16.000 Nationwide.
00:41:18.000 Nationwide, they can shift 450,000 votes to one candidate.
00:41:20.000 To one candidate, just on election day alone with that manipulation, the partisan go-vote manipulation.
00:41:26.000 Now, there's also a bit of good news, though, because the fact that we did get them to back off, the fact that Sunlight... Right, so you've got some results.
00:41:35.000 That's right.
00:41:36.000 So what this was telling us was, wait, monitoring could be the solution here to a lot of the terrible things that they are doing.
00:41:44.000 And again, we have more and more evidence over time.
00:41:48.000 2022, we had 2,742 field agents, all registered voters.
00:41:53.000 By the way, these field agents are politically balanced.
00:41:56.000 There's no bias, and we're very careful to politically balance them, because we publish our work in peer-reviewed journals.
00:42:03.000 So we're following all the rules to make sure we have credible data.
00:42:07.000 So at this point we're in ten swing states.
00:42:11.000 I'll give you a couple of little glimpses of what we found.
00:42:15.000 First of all, Google easily shifted tens of millions of votes in hundreds of elections nationwide.
00:42:21.000 They all shifted in one direction.
00:42:24.000 I happen to like that direction, but I don't like the fact that a private company is running our country, that the free and fair election is an illusion. I don't like that at all.
00:42:54.000 One, that's a stupid comment, because how do you prove it's the freest and most fair election that we've had in our history as a country?
00:42:59.000 I didn't realize that the other ones were less free and fair before that.
00:43:02.000 And I'm not talking about the 2020 issues, but then why is one side out there saying, hey, No, these aren't necessarily, we need to make sure that these elections... Now, it's not just a bias of, our guy didn't win, right?
00:43:16.000 It was, I'm seeing some stuff, and this doesn't look like it's fair, but then the other side completely saying, well no, these are the freest and fairest elections that we've ever had.
00:43:25.000 Why is that divided?
00:43:26.000 Okay, my answer is going to surprise you.
00:43:29.000 Okay.
00:43:29.000 But it's actually the truth.
00:43:30.000 You think it's just candidate bias?
00:43:32.000 My guy won, so I'm happy with it?
00:43:33.000 No, no, no.
00:43:34.000 No, I'm going to tell you the truth.
00:43:34.000 Okay.
00:43:37.000 And all I can ask you to do, since it's going to sound a little nutty, is just... That's okay.
00:43:45.000 Did I mention that Alex Jones comes on this show?
00:43:46.000 Oh my God.
00:43:47.000 That's what I'm saying.
00:43:48.000 We're fine.
00:43:49.000 We're in good company.
00:43:50.000 If you've ever met Alex, he's one of the nicest guys you'll ever meet.
00:43:53.000 It's hilarious.
00:43:55.000 You may think he's like this intense guy that kind of runs around these... By the way, he's been right so many weird times where I'm like, that's crazy.
00:44:01.000 And then six months later, I'm like, well, son of a gun.
00:44:04.000 He said they were turning the frogs gay, and it sounded weird, but there's the report!
00:44:08.000 Well, you know, he just invited me to be on his show, and I don't know what to do.
00:44:12.000 I have my limits.
00:44:14.000 He's a great guy.
00:44:15.000 I've spent some time with him.
00:44:17.000 I've gone down to Austin.
00:44:18.000 I've done a show with him.
00:44:19.000 He's been up here a lot.
00:44:21.000 He really is a great guy.
00:44:22.000 He's very fair.
00:44:23.000 He'll ask you some questions, for sure.
00:44:28.000 I'm putting my tinfoil hat on.
00:44:30.000 Tell me where it gets creepier.
00:44:32.000 I'm going to tell you something now.
00:44:34.000 Again, you'll have to mull it over, but it's the actual truth, okay?
00:44:39.000 And if you want to read my written version of this... You have another website?
00:44:45.000 I do, and it's howgooglestoptheredwave.com.
00:44:49.000 You're saying they mess with 22.
00:44:53.000 Oh, yes.
00:44:53.000 Big time.
00:44:54.000 Okay, lay it on.
00:44:55.000 I'm going to shut up.
00:44:56.000 Here it is.
00:44:56.000 Here it is.
00:44:58.000 You know this fascination, almost an obsession, that some conservatives have with things like ballot harvesting and messing with voting machines and ballot stuffing and on and on and on, all that stuff?
00:45:12.000 Yeah.
00:45:14.000 all of which, by the way, was claimed by, you know, Google's legal, uh, excuse me, Trump's legal teams
00:45:19.000 in 2020, and that got thrown out of 63 courts. All those beliefs, all that stuff, is actually
00:45:27.000 spread by Google and, to a lesser extent, some other tech companies.
00:45:33.000 Now why would Google want to spread a bunch of conspiracy theories and get people believing that stuff and get people repeating that stuff and then inspire OAN and Newsmax and Fox to run story after story after story?
00:45:51.000 Why would Google want that?
00:45:54.000 Because they do what magicians do, yeah, they use misdirection. They want you looking over there because they don't want
00:46:02.000 you looking at them. Yeah. And that is the truth. So these stories,
00:46:09.000 most of them are completely false. Even the ones that are true, they don't have much of a net impact on elections.
00:46:18.000 So even, you know, Attorney General Bill Barr tried to explain that to Donald Trump, made no difference.
00:46:24.000 The point is they don't have much net impact because they're inherently competitive.
00:46:28.000 Inherently competitive.
00:46:30.000 I just saw a speech by Senator Steve Daines, in which he's complaining about all the ballot harvesting that the Dems did in 2020.
00:46:39.000 He said, we got to get better at that!
00:46:42.000 I mean, that's the point.
00:46:43.000 You see what I'm saying?
00:46:45.000 These are inherently competitive techniques.
00:46:47.000 They have relatively little net effect.
00:46:51.000 But Google, which wants you to believe in all these conspiracy theories, what Google does is different because they're not shifting a few hundred votes here and there, a few thousand votes here and there.
00:47:00.000 They are shifting millions of votes in ways you can't see and you cannot counteract.
00:47:09.000 That is insane.
00:47:11.000 I'm gonna tell you this.
00:47:12.000 When you started talking about that, I kind of figured that you were, once you started talking about it, I figured where you were gonna go with this.
00:47:18.000 And the reason that I can say that, guys, you can go back and you can look at our shows when we had Mayor Giuliani on and said, why are we going after these?
00:47:28.000 And just to be clear, on the 63 cases, the vast, vast, vast majority of those were either thrown out for standing or procedural issues that never actually even got to discovery, where the judge basically said, Nope, we can't go down this road, not gonna go down this road.
00:47:39.000 It wasn't that the entire thing was adjudicated and determined to be false, but your point I think is very valid because it was my point as well regardless.
00:47:47.000 I was saying, Why are you going after these things?
00:47:50.000 The main point that you can go after... Now, in your case, you're saying, hey guys, we need to be focusing on Google.
00:47:55.000 I agree 100%, but we also need to be focusing on states like Pennsylvania that changed their constitution illegally.
00:48:01.000 They went to the Supreme Court of Pennsylvania, who did it, and said, well, yeah, that's fine.
00:48:04.000 I know it was supposed to be in two legislative sessions back-to-back.
00:48:06.000 The point is, there were much more solid grounds to go on and to try to make arguments.
00:48:11.000 See, you're actually proving my point.
00:48:13.000 That was right, though.
00:48:14.000 We researched that.
00:48:15.000 They just didn't bring that case.
00:48:19.000 You're proving my point, because as Mark Twain said, it's easy to fool people, but hard to convince them that they've been fooled.
00:48:27.000 And you are proving my point.
00:48:29.000 And I have a feeling if you and I chatted and chatted and chatted, more and more and more, you would come up with stuff that you think is real and big and consequential.
00:48:39.000 It's probably not, because dirty tricks have always been played.
00:48:39.000 And you know what?
00:48:44.000 They always will be played.
00:48:45.000 For sure.
00:48:46.000 They're inherently competitive.
00:48:48.000 What you have to do is you have to look and say, wait, where did these stories come from?
00:48:52.000 How are they circulated?
00:48:54.000 Remember this.
00:48:55.000 These tech companies have 100% power to control virality.
00:49:02.000 No, it's not 99.2%, it's 100%.
00:49:07.000 They decide what stories go viral and what stories do not.
00:49:11.000 If stories like that...
00:49:14.000 about Pennsylvania or any other state, and some dirty tricks, and, I don't know, Dominion voting machines...
00:49:21.000 Don't lump those two things together.
00:49:23.000 I know what you're saying, but Pennsylvania is something that you can look in the court documents there and see what happened.
00:49:28.000 Dominion voting machines saying the ghost of Hugo Chavez manipulated the votes in there, that's crazy.
00:49:33.000 Right?
00:49:35.000 I'm just telling you that you have to... What I have been studying for a long time now is a whole different kind of beast, and the beast... Oh, your beast is way worse.
00:49:45.000 It's way worse because... It's the one we have to be fighting.
00:49:47.000 I'm agreeing with you 100%.
00:49:49.000 I'm just making the point that there were, even though he didn't know about Google... You're going to do it again.
00:49:54.000 You're going to do it again.
00:49:54.000 Let me say it this way.
00:49:56.000 Even though Giuliani didn't know about the Google thing, or if he did, he wasn't doing anything about it.
00:50:01.000 There were stronger things to go after and fight against
00:50:04.000 than what they did go after.
00:50:06.000 And what it did is played perfectly into the hands of Google.
00:50:10.000 By Google's own manipulation, most likely, which is the case you're making, so that people would believe in the conspiracy theories.
00:50:15.000 What I'm saying is the Pennsylvania case is not a conspiracy theory.
00:50:18.000 Stop, stop.
00:50:19.000 How do you say it is?
00:50:19.000 Hold on.
00:50:22.000 I literally looked into the sessions.
00:50:23.000 Let me tell you the law.
00:50:24.000 In Pennsylvania, you cannot have mail-in voting unless the legislature meets in two separate sessions and approves it.
00:50:30.000 They met in one.
00:50:31.000 and they didn't meet in a second. Now you could make the argument,
00:50:34.000 yes but that didn't sway the election, and I would grant you that premise, okay maybe it didn't,
00:50:38.000 but it was something that happened that I'm like, why did they do that? COVID?
00:50:43.000 You're still doing it. Why do you think that's a conspiracy?
00:50:46.000 I'm gonna take that gun.
00:50:47.000 No, I'm genuinely trying to understand.
00:50:51.000 I'm not pushing that as like the grounds for the 2020 election at all.
00:50:54.000 I think if we were prior to the Google story, it'd be Hunter Biden's laptop thing, right?
00:50:58.000 That's much more consequential, you know, across the country.
00:51:02.000 But in this particular case, why is it not okay for me to be frustrated that states did stuff like that?
00:51:08.000 Or at least in this one case it happened.
00:51:09.000 I'm not saying that overturns everything.
00:51:13.000 I'm just saying that frustrates me.
00:51:14.000 It's like, well, you have laws and you broke them in this case. And our main point was that mail-in ballots are a hard thing to do.
00:51:22.000 Doesn't mean it's going to help me, doesn't mean it's going to help this guy.
00:51:24.000 It's just a hard thing to do.
00:51:26.000 It's going to take a bigger gun than that, apparently, to have any effect here.
00:51:30.000 I'm trying to tell you this, that I have given speeches to big crowds of Tea Party people, all of whom were carrying guns.
00:51:39.000 And none of them used them on you, see?
00:51:41.000 We're nice.
00:51:44.000 I've been sewn up nicely, let's say.
00:51:47.000 But the point is that I have scared people, I've gotten them for a few seconds to kind of consider the truth of what I'm saying, and then they go right back.
00:51:58.000 And if they're not talking about Pennsylvania, they're talking about, oh yeah, but what about this thing in Ohio?
00:52:04.000 What about this thing?
00:52:05.000 I have friends in Arizona now who just keep going on and on and on about voting machine stuff in Arizona.
00:52:12.000 And, you know, I just testified before the Arizona State Legislature.
00:52:17.000 You know what?
00:52:18.000 Now and then, someone actually gets it.
00:52:21.000 Because Kari Lake, who lost the governorship, lost that race in 2022, she has now gotten it.
00:52:29.000 She gets it and she's actually saying now, wait a minute, maybe she was distracted with some of these small issues and maybe really, maybe she should be looking at the tech companies.
00:52:40.000 And guess who else gets it?
00:52:42.000 Ramaswamy!
00:52:43.000 Ramaswamy is now saying, well, all those things you're concerned about, yes, they're all real, we should be concerned, but— There's the thing.
00:52:50.000 You didn't give me the but, though.
00:52:52.000 You told me that that was silly, and I'm saying, well, hold on now.
00:52:55.000 I agree that your thing is bigger, and it's the thing to focus on.
00:52:58.000 Because you can't really do anything about these other things, even if they are real and they're much smaller in scope, right?
00:53:03.000 So I grant that.
00:53:03.000 See, but it's not a little bigger.
00:53:05.000 No, I didn't say that.
00:53:06.000 It's a lot bigger.
00:53:07.000 It's like a little... It's the game.
00:53:09.000 It's like a tiny little marble compared to, like, the sun.
00:53:13.000 It's a whole different kettle of fish.
00:53:16.000 It's really, really scary because it's operating on a massive scale.
00:53:22.000 By 2015, Google was deciding the outcome of 25% of the national elections in the world.
00:53:32.000 And that number has gone up over the years.
00:53:37.000 Can I talk about monitoring systems?
00:53:39.000 Can I talk about the digital shield?
00:53:40.000 Because this is, I don't have much faith, yes, Cruz is very impressive, I admit, but I don't actually have much faith in the legal system, especially these days, or in regulators, because in general, even without massive dysfunction, laws and regulations move very slowly.
00:54:03.000 Tech moves... Yeah.
00:54:05.000 ...at the speed of light, kind of, by definition.
00:54:08.000 You're actually getting to one of my next questions, but, I guess, to sum all of that up: we need to be focusing on this.
00:54:14.000 All this other stuff is the distraction.
00:54:16.000 I'm agreeing with you, and we need to be doing that.
00:54:18.000 So how do we do that?
00:54:20.000 Because I don't, we run up against problems all the time that we find and that frustrate us, and we don't necessarily know how to have an effect to change things, right?
00:54:30.000 So you have a monitoring system out there.
00:54:33.000 How do we practically go, I want to back that guy and jump in with you on this?
00:54:41.000 And unfortunately, you said this, you're the guy standing between us and...
00:54:45.000 This stuff kind of falling apart.
00:54:46.000 And that's a, that's a strange thing, but somebody said that to you.
00:54:49.000 It was in an article, right?
00:54:50.000 Miranda Devine, the woman who broke the story for the New York Post on Hunter Biden's laptop.
00:54:55.000 If you search for Epstein on NewYorkPost.com, you will get to her recent piece about my work.
00:55:03.000 And she ends that article.
00:55:05.000 It's an amazing article.
00:55:07.000 I could not have paid someone to write a better article.
00:55:10.000 She ends it with, only Epstein is standing in the way.
00:55:16.000 I did not pay her to do that.
00:55:17.000 Well, I'm sure you probably never set out to have that sentence exist with your name in it.
00:55:22.000 I don't know who would.
00:55:25.000 I don't think I've ever blushed before reading an article, but I think I probably did blush reading that.
00:55:31.000 I mean, wow.
00:55:33.000 Now, why would she say that?
00:55:36.000 What is it that I'm doing?
00:55:38.000 Well, in 2022, we didn't have 95 field agents.
00:55:43.000 We had 2,742 field agents in 10 states.
00:55:47.000 We preserved more than 2.5 million ephemeral experiences on multiple platforms.
00:55:52.000 And we got faster and faster and faster and faster at doing what we're doing.
00:55:57.000 And I made a decision, which is kind of risky.
00:55:59.000 And in fact, this is where your people... And did we come to the, did we come to your aid, the Crowder
00:56:09.000 people?
00:56:10.000 This is where your people, uh, could be very helpful, because I made a decision.
00:56:19.000 I decided, okay, the time has come.
00:56:23.000 We have to go beyond elections.
00:56:26.000 This country needs a permanent, a permanent monitoring system, monitoring the tech companies 24 hours a day in all 50 states.
00:56:36.000 We need to have representative samples.
00:56:37.000 It needs to be politically balanced.
00:56:39.000 So all our data are court admissible.
00:56:42.000 And we will either catch them and put these people in jail, or we will stop them.
00:56:48.000 And when we stop them, we'll see it in the data, because we'll see the bias disappear.
00:56:54.000 We will see that the crazy stuff they're sending to kids will stop.
00:56:59.000 We will see the go-vote reminders, those partisan go-vote reminders, they will stop.
00:57:04.000 We'll see it.
00:57:06.000 And in a way, that would be the best thing for our democracy and for our republic, that would be the best thing, is that we just stop them.
00:57:12.000 They just stop.
00:57:13.000 And by the way, if I were them, I would stop, because... So we started building last fall.
00:57:19.000 I managed to get just over three million dollars from some very generous donors who believed in this monitoring concept, where we now call it America's Digital Shield, and we started building.
00:57:36.000 And as of today, let's see, I knew this was going to come up so I went online.
00:57:42.000 You have the website.
00:57:44.000 I need to know what website search engine he uses.
00:57:47.000 Oh, I'll get to that.
00:57:48.000 Oh, I'll get to that.
00:57:50.000 As of today, at this moment, we have 11,638 field agents, all registered voters in all
00:57:58.000 50 states, plus 2,377 children, and we have preserved more than 41 million ephemeral experiences
00:58:09.000 on Google, on Facebook, on YouTube, on Bing, Yahoo, and other platforms.
00:58:16.000 In other words, we're building it.
00:58:18.000 Now, our goal is to try to have a full system in place with representative samples, court admissible, by the end of this year, so we can actually go right after the primaries.
00:58:30.000 Literally, we can protect the primaries.
00:58:35.000 And, you know, we're on the way, but the three million is pretty much gone.
00:58:44.000 And I'm in a very tough position right now because to shut this system down would be insane.
00:58:52.000 It would be even more insane than the fact that I built it.
00:58:55.000 But it would be insane to shut it down because that would be literally handing over the free and fair election to the tech companies permanently, probably, and handing over the minds of our kids to the tech companies permanently.
00:59:11.000 But I don't know.
00:59:12.000 I'm not a fundraiser.
00:59:13.000 I'm a very good researcher.
00:59:16.000 So the money, I'm kind of baffled there.
00:59:20.000 And I need help.
00:59:22.000 Right.
00:59:22.000 Well, how can we help?
00:59:24.000 What do you want people watching this show right now?
00:59:27.000 And look, you spend your hard-earned dollar as you will, right?
00:59:31.000 We encourage you to join Mug Club.
00:59:32.000 That's one of the things that you can do to help fight back against stuff like this.
00:59:37.000 What do we do in practice? Because we always talk to you guys about making a difference practically.
00:59:41.000 It's not just about hearing about a story that gets you all rah-rah fired up. It isn't about
00:59:46.000 that. It's not, we're not looking to just do this to get people to watch our show and just to be
00:59:50.000 entertained. We try to do entertainment as well as news and kind of make it somewhat palatable
00:59:54.000 and help preserve, you know, the ability to have free and fair elections. That's first and foremost.
01:00:00.000 So here's an opportunity.
01:00:01.000 Here's an opportunity to kind of put your money where your mouth is.
01:00:04.000 And I'm assuming that you guys take donations big and small.
01:00:07.000 Doesn't matter, right?
01:00:08.000 Whatever people can do to kind of support your cause, where do they go if they want to pitch in?
01:00:16.000 You should go to mygoogleresearch.com, but let me just point out that when I go on a show that has a big audience, which this show does, I get a gazillion people trying to reach me, or reaching me, saying, I'll be a field agent, I'll be a field agent.
01:00:33.000 So don't do that.
01:00:35.000 Don't do that, because we can't ever accept volunteers, because if we accepted volunteers, Google would send us thousands of volunteers, and we know because they have sent us people.
01:00:46.000 They will water it down.
01:00:47.000 Of course, so that they can control the system.
01:00:50.000 So we have to find people using the methods that we use.
01:00:54.000 We find registered voters.
01:00:57.000 So, don't do that.
01:00:58.000 What we need are funds.
01:00:59.000 Now, we only pay our field agents $25 a month.
01:01:03.000 It's just a token fee for the privilege of being able to look over their shoulders when they're, you know, getting content from these tech companies.
01:01:11.000 So, we preserve and protect their privacy.
01:01:14.000 So, when that data is transmitted to us, it has no identifying information ever, and we only look at data in aggregate.
01:01:22.000 So, we're doing exactly the opposite of what the tech companies do.
01:01:25.000 The tech companies look at your individual data, and then, like the misers they are, they go, oh, look, more data, and then they monetize it, and they use it to manipulate you.
01:01:38.000 We're doing exactly the opposite.
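As a rough illustration of the aggregate-only handling described here, the sketch below strips identifying fields at ingestion and then counts items only by state and lean. It is a minimal, hypothetical Python example: the field names, the salted hash, and the "lean" label are assumptions made for illustration, not the project's actual schema or pipeline.

```python
# Hypothetical sketch: strip identifiers at ingestion, analyze only in aggregate.
import hashlib
from collections import Counter

def anonymize(record: dict, salt: str) -> dict:
    """Drop identifying fields and replace the agent ID with a salted one-way hash."""
    return {
        "agent": hashlib.sha256((salt + record["agent_id"]).encode()).hexdigest()[:12],
        "state": record["state"],
        "platform": record["platform"],
        "item_lean": record["item_lean"],   # e.g. "left", "right", "neutral" (illustrative label)
    }

def aggregate_by_state(records: list[dict]) -> dict:
    """Count items per (state, lean) pair; individual records are never inspected."""
    return dict(Counter((r["state"], r["item_lean"]) for r in records))

raw = [
    {"agent_id": "a-001", "name": "Jane Doe", "state": "PA", "platform": "search", "item_lean": "left"},
    {"agent_id": "a-002", "name": "John Roe", "state": "PA", "platform": "search", "item_lean": "right"},
]
clean = [anonymize(r, salt="rotate-me-daily") for r in raw]
print(aggregate_by_state(clean))   # {('PA', 'left'): 1, ('PA', 'right'): 1}
```

The point of the sketch is only the design choice being described in the conversation: identifying information is discarded before anything is stored, and analysis happens over counts, not over any one person's data.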
01:01:39.000 So we preserve people's privacy, but we only pay them $25 a month.
01:01:45.000 But if you add it up, we have 11,000 field agents already.
01:01:47.000 Yeah.
01:01:49.000 And that number is going to be much, much higher.
01:01:51.000 That's a lot of money.
01:01:52.000 If you want to help us sponsor a field agent, I mean, literally, go to MyGoogleResearch.com.
01:01:57.000 Leave that up at the bottom.
01:01:59.000 Just put that up at the bottom and leave that for a little bit.
01:02:01.000 MyGoogleResearch.com if you guys want to do it.
01:02:04.000 Is it set up like you can just pick to sponsor or are you just saying give $25 a month as the dollar amount?
01:02:10.000 Is it set up that way?
01:02:11.000 It's set up that you can just click monthly and put any amount you want.
01:02:14.000 If you want to sponsor 5 field agents or 10 field agents.
01:02:19.000 We're getting right this minute and this is wonderful.
01:02:25.000 We're getting about $1,000 in small donations every single day from people we don't know.
01:02:32.000 And, you know, again, that's amazing, that's wonderful.
01:02:37.000 But do you know what it costs us?
01:02:39.000 That's not enough.
01:02:40.000 I'll say it for you, it's not enough.
01:02:43.000 It's fantastic, it's wonderful, but we can do better if we want to preserve free and fair elections.
01:02:47.000 This is a very, very big project because it has tremendous people needs.
01:02:56.000 It has tremendous security needs, which are very, very unusual because we have to protect the data and we have to protect our people.
01:03:03.000 It's a very expensive project, which at the moment is costing us more than $10,000 a day to build.
01:03:09.000 And that gets us 30 to 60 new field agents every day who've gone through the whole process.
01:03:15.000 And it's expensive.
01:03:17.000 So if anyone out there, if you have connections to a major foundation, connections to a generous person who can afford a major gift, I can't do this myself.
01:03:30.000 I literally need help from other people.
01:03:33.000 But the good news is we've gone a very long way in a relatively short time.
01:03:38.000 We started building this permanent system last fall.
01:03:42.000 We now have 11,638 field agents, all registered voters in all 50 states.
01:03:48.000 As far as those representative samples go, we've hit that minimum in 10 states so far.
01:03:55.000 So we're going state by state by state.
01:03:57.000 Are you starting kind of in swing states where your focus is?
01:04:00.000 Because like you said, that's where the action is going to be.
01:04:02.000 We did start with the swing states because those are absolutely critical in elections.
01:04:06.000 But we're after far more now because the content we're seeing, because we're getting data now from kids.
01:04:12.000 Yeah.
01:04:13.000 And it's not from kids, it's from their devices.
01:04:15.000 Right.
01:04:16.000 We're seeing the content that's being sent to their devices.
01:04:18.000 They don't even have to be in front of their devices, we're still seeing the content.
01:04:22.000 And that is actually getting me even more concerned than the political bias, because these companies are sending data to our kids that is so creepy that at the moment we don't even know how to analyze it.
01:04:38.000 I'm telling you the truth.
01:04:40.000 YouTube videos that kids actually watch, because the data we're getting shows that kids are actually watching these videos.
01:04:50.000 It'll be like something that kind of looks like just a crappy cartoon.
01:04:55.000 A parent walks by, just sees a crappy cartoon, keeps walking.
01:04:58.000 Four or five minutes into the cartoon, boom, someone's head explodes, something horrible happens.
01:05:07.000 It's very brief, speaking of ephemeral, it's very brief, but then if you look on YouTube, you can do this.
01:05:14.000 If you go down along the bottom.
01:05:17.000 Yeah, you can see what parts people watch the most.
01:05:19.000 Exactly right, and there'll be a spike right at that point where this terrible thing happens
01:05:25.000 because kids replay that over and over again.
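The graph being described is YouTube's "most replayed" curve along the bottom of the player. As a rough sketch of how a replay spike like that could be flagged automatically, here is a short Python example; the replay values are invented, the 30-second bucket size is an assumption, and this is not a description of how the project's own analysis works.

```python
# Hypothetical sketch: flag moments whose replay count far exceeds the video's typical level.
import statistics

def find_replay_spikes(replays: list[float], factor: float = 3.0) -> list[int]:
    """Return indices of time buckets whose replay count exceeds factor * the median level."""
    baseline = statistics.median(replays)
    return [i for i, r in enumerate(replays) if r > factor * baseline]

# Made-up curve: mostly flat, with a burst of replays around the three-minute mark.
replay_curve = [10, 12, 11, 9, 13, 12, 95, 90, 11, 10]   # one bucket per 30 seconds (assumed)
spikes = find_replay_spikes(replay_curve)
print([f"~{i * 30}s" for i in spikes])                    # ['~180s', '~210s']
```

Using the median as the baseline is a deliberate choice in this sketch: the spike itself then cannot inflate the reference level it is compared against.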
01:05:28.000 Now, why would they do that?
01:05:30.000 Because that's how you addict people to content or to your platform.
01:05:34.000 That's how you addict them because we are drawn, all of us, to accidents on the side of the road.
01:05:40.000 Of course, yeah.
01:05:41.000 You can't take your eyes off certain kinds of content.
01:05:43.000 Negativity, bias.
01:05:44.000 Wow!
01:05:46.000 I know a few things.
01:05:48.000 Wow, that's a first.
01:05:51.000 Not for me, even in general.
01:05:52.000 It wasn't directed at me.
01:05:54.000 I'm a Notre Dame man, come on!
01:05:56.000 Wow, I'm very impressed.
01:05:58.000 That is called negativity bias.
01:05:59.000 For those in the know, in the social sciences, it's been studied in many fields.
01:06:03.000 It's very, very real and very powerful.
01:06:07.000 You know, that's what Google makes use of when they suppress negative search suggestions.
01:06:12.000 Because if they let negative search suggestions appear, those negative search terms will draw 10 to 15 times as many clicks as neutral or positive terms.
01:06:24.000 It's the same kind of deal.
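As a back-of-the-envelope illustration of the 10-to-15-times figure just quoted: if one negative suggestion sits in a list alongside nine neutral ones, that multiplier alone implies the negative term captures more than half of the clicks. The list size of ten and the snippet below are assumptions made purely for the arithmetic.

```python
# Hypothetical arithmetic: click share of one negative suggestion in a 10-item list,
# assuming it draws 10x-15x the clicks of each neutral suggestion (the range cited above).
def negative_click_share(multiplier: float, neutral_items: int = 9) -> float:
    """Share of total clicks captured by a single negative suggestion."""
    return multiplier / (multiplier + neutral_items)

for m in (10, 15):
    print(f"{m}x draw -> {negative_click_share(m):.0%} of clicks")
# 10x draw -> 53% of clicks
# 15x draw -> 62% of clicks
```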
01:06:26.000 Anyway, Google is making use of negativity bias in content that they're sending to our kids.
01:06:33.000 That's insane.
01:06:34.000 To addict kids to that platform and to increase watch time, which is the number one metric they have for revenue.
01:06:43.000 Right.
01:06:44.000 So let me just clarify in that cartoon scenario that you have.
01:06:47.000 One, I'm assuming that people would want to be able to kind of verify something like that.
01:06:52.000 Are you guys able to capture that kind of data?
01:06:53.000 Because you were talking about not really even knowing how to capture that yet.
01:06:57.000 Are you able to capture that where you can actually see it happen again and show somebody, see, this is what I was talking about?
01:07:03.000 Is it provable or at least admissible?
01:07:07.000 Replay that tape because I think you may have missed a couple words there.
01:07:12.000 We know how to capture it.
01:07:13.000 We're capturing, from kids now, we're capturing massive amounts of data 24 hours a day.
01:07:20.000 We know how to capture it.
01:07:21.000 We don't yet know how to analyze it.
01:07:25.000 How do you – we can watch a video, we can see the creepiness in the video, but by the way, there's a creepiness that goes beyond even that one horrible moment, the traumatic moment.
01:07:40.000 There's creepiness all through that we know is important.
01:07:44.000 But we just don't know yet how to articulate the importance of it, and we don't know how to analyze it mathematically yet.
01:07:52.000 But we will figure it out.
01:07:53.000 In a scalable way, obviously, right?
01:07:55.000 We're talking about massive amounts of people that would be needed just to watch this stuff and then to extrapolate.
01:08:01.000 Like you said, mathematically, what is going on here?
01:08:05.000 Well, the good news is we're building an archive.
01:08:10.000 In other words, these companies never imagined that anyone would actually capture and preserve and archive ephemeral content.
01:08:18.000 That's what we're doing.
01:08:18.000 Well, I mean, the number keeps growing every minute, every second.
01:08:24.000 But literally, as of when I checked this morning, we had preserved more than 41 million of these ephemeral experiences.
01:08:32.000 And we're going to preserve hundreds of millions and eventually billions.
01:08:37.000 And we're getting better and better at doing the analyses.
01:08:41.000 And I'm just telling you, what we're doing is crazy, insane, complicated.
01:08:49.000 In the beginning it seemed almost impossible, but we're doing it.
01:08:53.000 And this, to me, is the real protection that we can have from not just Google, but the next Google after this one, because if you don't monitor and capture, preserve, archive ephemeral content, you'll never understand what's going on.
01:09:12.000 You'll never understand why this person won the election versus that person.
01:09:16.000 You'll never understand what's happening to kids.
01:09:21.000 I mean, human autonomy literally will be undermined and people won't know.
01:09:27.000 You have to monitor.
01:09:29.000 In other words, this is not optional for humanity.
01:09:32.000 Okay, maybe my monitoring system is optional because I could get, you know, run over by a Google Street View vehicle tomorrow.
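For readers wondering what "capturing and preserving ephemeral content" can look like mechanically, here is a minimal, hypothetical Python sketch: each captured item is stored with a timestamp and a content hash so it can be produced and re-verified later. The function names and storage format are invented for illustration and are not the project's actual system.

```python
# Hypothetical sketch: append-only archive of ephemeral items, each stored with a
# capture timestamp and a SHA-256 digest so the record can be checked later.
import hashlib, json, time

def archive_item(archive: list[dict], platform: str, content: str) -> dict:
    record = {
        "captured_at": time.time(),
        "platform": platform,
        "content": content,
        "sha256": hashlib.sha256(content.encode()).hexdigest(),
    }
    archive.append(record)
    return record

def verify_item(record: dict) -> bool:
    """Re-hash the stored content and compare with the digest recorded at capture time."""
    return hashlib.sha256(record["content"].encode()).hexdigest() == record["sha256"]

archive: list[dict] = []
rec = archive_item(archive, "search_suggestions",
                   json.dumps(["candidate a scandal", "candidate a rally"]))
print(verify_item(rec))   # True
```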
01:09:38.000 Please, let's go.
01:09:38.000 I'm sorry, let's go a different direction.
01:09:40.000 Different direction now.
01:09:42.000 So, incognito mode, does that do anything for us?
01:09:45.000 Is that just a...
01:09:46.000 Are they just messing with us with incognito mode?
01:09:48.000 It's completely fake.
01:09:49.000 Completely fake.
01:09:50.000 I knew it.
01:09:51.000 I wrote this question down earlier because you said, with Google, obviously all of these things exist, and I'm thinking, okay, well, what service do you use? And then I was thinking, well, wait a minute, does incognito mode actually do anything? So what do you use?
01:10:03.000 If you go to MyPrivacyTips.com.
01:10:06.000 You have this whole list.
01:10:07.000 I do.
01:10:08.000 Okay.
01:10:09.000 That's why I finally started using these cool domain names, because I couldn't remember the links.
01:10:14.000 So MyPrivacyTips.com will take you and the four members of your audience to...
01:10:21.000 Oh, come on.
01:10:22.000 It's five.
01:10:23.000 Let's do five.
01:10:23.000 Is it five?
01:10:24.000 You said four or five.
01:10:25.000 Damn, that's a 25% increase.
01:10:26.000 It is.
01:10:27.000 Wow.
01:10:27.000 We'll get Google on getting more here.
01:10:29.000 Okay.
01:10:30.000 So MyPrivacyTips.com will take you to a piece which begins, it's an article I wrote, and it begins, I have not received a targeted ad on my mobile phone or computer since 2014.
01:10:42.000 So, it'll explain to you how you can get started in preserving your privacy, and I first published that in 2017, and good news for your viewers is that I just updated it a few days ago, so now everything's up to date, and it'll get you started.
01:11:01.000 One of the things I mentioned, I'm just going to flash this and put it down, is that every single person here has a surveillance phone in their pocket, in their purse, somewhere.
01:11:12.000 Those are all surveillance devices.
01:11:15.000 Yes, they really do listen to every single thing.
01:11:19.000 I don't doubt it.
01:11:20.000 And remember a few years ago, I don't know, you look a lot younger than I am, but a few years ago, it wasn't that long ago, you could remove a battery from your phone.
01:11:29.000 Now they solder them in so you can't remove the battery.
01:11:31.000 Why would they do that?
01:11:33.000 Convenience?
01:11:34.000 I'm just kidding.
01:11:35.000 Of course I know.
01:11:36.000 No.
01:11:37.000 It's so you can never turn off your phone.
01:11:40.000 You think it's off and it's not off.
01:11:40.000 Not completely.
01:11:43.000 Those are surveillance devices.
01:11:46.000 They're listening, they're recording.
01:11:49.000 If you disconnect from your mobile service provider, then yeah, they're just storing everything locally on the phone, but the moment you reconnect, all of it gets uploaded.
01:11:59.000 We don't do that.
01:12:00.000 We actually build our own phones for our key staff members, and these are secure phones.
01:12:07.000 This is what people who work for, you know, NSA or the CIA, this is what they use.
01:12:15.000 If you go to MyPrivacyTips.com, I'll explain to you if you want to start experimenting with secure phones how you can do that.
01:12:24.000 That's very practical, good advice too.
01:12:26.000 I don't think a lot of people know this.
01:12:29.000 Maybe it's that they don't want to know this.
01:12:31.000 Ignorance is bliss.
01:12:32.000 It's not in this case because you just end up in a very dark place as a society.
01:12:37.000 But I think we're pulling back the blinders for a lot of people.
01:12:39.000 I mean, people have come on and said, ah, you shouldn't have a phone or like, ah, whatever.
01:12:43.000 You just don't like Apple or you're an Android guy, whatever it may be.
01:12:46.000 But you're basically saying, like, look: if you're serious about this, these are the things that you can do, the steps you can take.
01:12:53.000 You can support this kind of surveillance system, right?
01:12:55.000 You can sponsor the $25 a month at a minimum.
01:13:00.000 I know we can get stuff like that done, right?
01:13:03.000 But there's also things you can do with your phones.
01:13:05.000 You can stop being the product, right?
01:13:07.000 Because that's what Google counts on is you.
01:13:09.000 You're the product, right?
01:13:10.000 Your information, all of that data, that is the product.
01:13:13.000 So, we don't use, and I say we, my friends, my family, my staff, none of us use anything that has anything to do with Google.
01:13:21.000 So, none of us use the Google search engine.
01:13:24.000 None of you listening or watching should ever use it again because it's an extremely aggressive surveillance tool, but it's also the most powerful mind control machine ever invented in history.
01:13:39.000 There's so many different ways in which that search engine is manipulating you that, again, if you really understood, trust me, you would never touch it again.
01:13:47.000 So what do you use?
01:13:49.000 We use Brave, brave.com.
01:13:52.000 That's what we use for our browser.
01:13:55.000 Brave has its own search engine.
01:13:56.000 It's built into it.
01:13:57.000 It's the Brave search engine.
01:13:59.000 It works extremely well.
01:14:03.000 For texting and that kind of stuff, we use Signal.
01:14:07.000 It's a free app.
01:14:08.000 It's run by a nonprofit organization.
01:14:10.000 Signal is excellent.
01:14:13.000 It's true that some of the big tech companies have tried to diss it and tried to make up stories about it.
01:14:20.000 I won't lie.
01:14:21.000 But, even Edward Snowden, the guy who brought all that creepy data out of the intelligence agencies a few years ago, even he uses Signal.
01:14:31.000 So, we use Signal.
01:14:33.000 You can use Signal for video calls, for regular calls.
01:14:37.000 It encrypts everything.
01:14:38.000 For email, please stop using Gmail.
01:14:42.000 Please, please, please, I beg you.
01:14:44.000 Well, you should be using ProtonMail.
01:14:47.000 And that's based in Switzerland.
01:14:50.000 It's subject to very strict Swiss privacy laws.
01:14:54.000 There's a free version, which will probably get you at least through your first year.
01:14:59.000 But even if you at some point need the paid version of ProtonMail, it's like six bucks a month.
01:15:04.000 I mean, all of these quote-unquote free services, they're not free.
01:15:08.000 You pay for them with your freedom.
01:15:10.000 Now that's not free to me.
01:15:12.000 If you're paying with your freedom.
01:15:15.000 That's one of the best articles I ever wrote.
01:15:17.000 I should buy that URL.
01:15:18.000 I wrote an article called Free Isn't Freedom.
01:15:21.000 But I need to get that domain name.
01:15:24.000 I should do that.
01:15:26.000 You pay with your freedom.
01:15:28.000 You know what?
01:15:29.000 Don't do that.
01:15:29.000 Don't use Gmail.
01:15:31.000 Get ProtonMail.
01:15:32.000 It encrypts the messages end-to-end and the attachments so that even the folks at ProtonMail can't read it. Now that's a very different model. Very,
01:15:45.000 very different model. So instead, they just charge people who really use it heavily, or people who are
01:15:50.000 in business, a few bucks a month. That's what we should be doing. In
01:15:55.000 other words, these companies should not be allowed to use surveillance in a way that we're
01:16:04.000 not aware of to not just sell us stuff but also to manipulate us.
01:16:10.000 The surveillance business model should be illegal.
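On the end-to-end claim a moment earlier, that even the provider cannot read the mail: here is a minimal sketch of that property using the PyNaCl library in Python. ProtonMail itself uses OpenPGP, so this only illustrates the general idea that the service in the middle stores ciphertext it cannot open, not ProtonMail's actual protocol.

```python
# Hypothetical sketch of the end-to-end idea using PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

alice_sk, bob_sk = PrivateKey.generate(), PrivateKey.generate()

# Alice encrypts to Bob's public key; only Bob's private key can open it.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")

# The mail provider in the middle would store only `ciphertext`; without bob_sk it is opaque.
plaintext = Box(bob_sk, alice_sk.public_key).decrypt(ciphertext)
print(plaintext)   # b'meet at noon'
```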
01:16:13.000 Well, and a lot of companies now are switching away from, or at least trying to decouple themselves from the advertisers being the only way they get paid because then the advertisers are in control of the content that gets put on your platform and they can turn that off at the drop of a hat depending on how they feel about you that day.
01:16:32.000 When we look at 2024, when we look at the upcoming elections, obviously we've talked about the primaries, we know that you guys are there, you're monitoring this stuff, and you're hoping that your mere presence is enough to get Google to stand down and not do anything with these elections.
01:16:50.000 This seems like a national security issue because it is.
01:16:53.000 There is no question that this is about national security when you're talking about our elections.
01:16:58.000 If we don't have free and fair elections, really what do we have and how long do we have until
01:17:03.000 this country descends into a thing that you're not going to want to be a part of, right?
01:17:08.000 You're not going to want to be here for that because it won't be freedom and there won't be any semblance of the America that you grew up with.
01:17:15.000 And I'm not talking about cultural issues.
01:17:16.000 I'm talking about just not having any choice in who represents you.
01:17:19.000 That's the most basic thing that we have here.
01:17:23.000 Why isn't the government, even in small parts, or organizations that are government-adjacent, or somebody, stepping up and saying, you know what, that's right, we do need to fund something like that so that we can do it?
01:17:35.000 Now, I don't want the government to do it.
01:17:36.000 I don't trust them to do it.
01:17:38.000 I understand.
01:17:40.000 And if it's the government, I want it to be a state government or a local government funding it, because you're doing it on a state-by-state basis anyway, and so states can say, you know what, hey, we'll take care of everybody in the state of Texas.
01:17:50.000 We'll take care of everybody in the state of Georgia.
01:17:52.000 We'll take care of everybody in the state of Michigan.
01:17:54.000 Why are these groups not stepping up?
01:17:56.000 Especially right now, your strongest argument is to conservatives, because they're the group that feels like they're targeted the most by this kind of stuff.
01:18:03.000 Your data seems to suggest that too, obviously, right?
01:18:06.000 Why are they not stepping up and saying, we got to get this thing funded because at least it will give us a fair chance with our ideas.
01:18:13.000 And if our ideas are good enough, people will vote for us.
01:18:15.000 If they're not good enough, they won't.
01:18:20.000 Fear of Google.
01:18:24.000 A lot of people who have access to a lot of money, they depend on Google for their livelihood.
01:18:31.000 They can't take a chance of Google demonetizing them, demoting them, removing them, deleting them.
01:18:38.000 They can't take a chance.
01:18:40.000 One of the best articles ever written about this problem was written by the head of the biggest publishing conglomerate in Europe, and his piece was actually called Fear of Google. He called it an open letter to the CEO of Google, and he was talking about the fact that they can't do anything in their business without deciding how Google's going to react, because Google can easily just snap their fingers, with or without cause, and the courts, by the way, have said, yes, you can do this, you're a private company, and they can just demote you or delete you.
01:19:17.000 And if you're a publishing house or any other company, you're dead.
01:19:21.000 And the courts in the United States have over and over again said, yes, you can do that, Google, either under CDA 230, which you mentioned, or under the First Amendment.
01:19:31.000 They still have the First Amendment, don't forget.
01:19:34.000 Which gives them the right to suppress speech, apparently.
01:19:39.000 But the point is that there are some very, very wealthy conservatives who I have spoken to about this problem.
01:19:47.000 One quite recently.
01:19:50.000 And, you know, I can't really give you details, but he owns a chain of something or others around the country and he's very, very wealthy.
01:19:58.000 He can't gamble.
01:19:59.000 He cannot take a chance.
01:20:02.000 He's in retailing.
01:20:03.000 He cannot take a chance on offending Google.
01:20:07.000 There's a guy who in the past has supported us.
01:20:11.000 But he owns a chain of blankety blanks and he or his lawyers or some of his accountants finally explained to him or his marketing people, you know what?
01:20:22.000 You can't do that anymore.
01:20:23.000 You can't be supporting that work because we can't take a chance on offending Google.
01:20:28.000 You know, the fact that I live my life 24 hours a day offending Google is pretty nuts.
01:20:34.000 That's pretty crazy.
01:20:36.000 It absolutely is.
01:20:37.000 Let me, let me just change what you said.
01:20:39.000 Those companies, those guys say we can't take a chance on offending Google.
01:20:43.000 You can't afford not to.
01:20:46.000 You're in a position to make a difference.
01:20:49.000 I'm sorry, but it's time to grow a pair and actually put your money where your mouth is.
01:20:55.000 Nobody ever said this was going to be easy.
01:20:57.000 Nobody ever said preserving democracy was not going to be painful.
01:21:01.000 Nobody ever told you that being in a position of influence and power, being blessed by God with resources—I'm not saying hard work doesn't play a part, I know it does—but you're given that as a steward.
01:21:13.000 There are things worth fighting for.
01:21:14.000 There are things worth putting yourself at risk for.
01:21:16.000 There are things worth investing in and saying, you know what?
01:21:20.000 If everybody feels like I do right now and doesn't do anything, then this is only going to get worse.
01:21:25.000 And eventually you won't be able to speak up against it.
01:21:27.000 Eventually you won't be able to make a change.
01:21:30.000 You have to be willing to put something on the line.
01:21:33.000 And I agree.
01:21:33.000 There are people out there that have more to put on the line.
01:21:35.000 You're putting your life on the line every single day.
01:21:38.000 We're putting something on the line every single day as well.
01:21:41.000 And then by having you on, it's like a double whammy, right?
01:21:45.000 But we're talking about the opportunity for you to stand up and do this.
01:21:47.000 I've talked to some CEOs recently, and one of them, a big company, and they deal with these issues.
01:21:53.000 Well, we can't afford to do this or be associated with that.
01:21:55.000 And I'm like, you can't afford not to.
01:21:58.000 Our entire right to do it depends on you doing it and depends on other people in your position not being afraid to do that.
01:22:06.000 It will cost you something.
01:22:09.000 But if you don't do it, it will cost everybody else everything.
01:22:13.000 So you have to do it.
01:22:15.000 If, here's the caveat, if you actually believe what you say you believe. If it's just a convenient idea, that's different, but then you need to own that.
01:22:24.000 You need to look in the mirror and say, you know what, I don't really believe in this country and the ideals that started this, that gave me the opportunity to create a business of what's it called around the country, doesn't matter what it is.
01:22:34.000 All of that, I don't really believe in that.
01:22:36.000 I'm just glad it's here that I could take advantage of that system and make my money and go sit in my house and make sure and pray to God that I don't offend these guys and they don't come for me next.
01:22:46.000 The only way to make sure that they don't come for anybody next is for you to stand up.
01:22:50.000 You have to do it.
01:22:52.000 I'm sorry, that kind of pisses me off because people think that this is going to be easy.
01:22:55.000 It never has been in history.
01:22:57.000 Nothing great ever has been.
01:22:59.000 And it's going to cost you something.
01:23:02.000 And you're out there fighting, and you don't even necessarily wear the team jersey!
01:23:05.000 You're not a conservative, right?
01:23:07.000 But you're out there fighting this cause.
01:23:10.000 There's so many other things that I really wanted to get to.
01:23:12.000 I know we've already kind of gone long here, but these conversations could go on and on and on, and we've already listed a number of ways that people can support you.
01:23:19.000 Bring that back up really quickly.
01:23:21.000 They can go there.
01:23:22.000 If you're wondering, we're gonna donate.
01:23:23.000 Louder with Crowder is going to donate.
01:23:25.000 I want you to join us.
01:23:26.000 MyGoogleResearch.com.
01:23:28.000 Do it.
01:23:30.000 We talk all the time about joining Mug Club.
01:23:31.000 That's another way to fight.
01:23:33.000 But sometimes there are causes we want to contribute to outside of that.
01:23:36.000 This is one of them.
01:23:38.000 Do you want to be able to look at the election and trust that it was secure?
01:23:42.000 This is a way to do that.
01:23:43.000 Guess what?
01:23:44.000 If you can never trust that your elections are secure again, it's only a matter of time before people take to the streets.
01:23:49.000 It's only a matter of time.
01:23:51.000 Eventually, you're just going to think, well, it doesn't really matter, and you're going to be devoid of hope.
01:23:58.000 And we don't want to be in that place.
01:23:59.000 We don't want to be in a place that's polarized.
01:24:01.000 We want to be in a place where people have diverse ideas.
01:24:02.000 Yes, but there's a core set of beliefs that unite us, and then we understand we're going to have differences.
01:24:10.000 He used to support Hillary Clinton.
01:24:11.000 Doesn't anymore.
01:24:12.000 It's fine.
01:24:12.000 I'm not going to ask you why, but that doesn't matter.
01:24:16.000 We're on the same team right now.
01:24:18.000 We're fighting for the same cause.
01:24:20.000 There are things like that that unite us, and I know there's speeches about being far more that unites us than divides us.
01:24:26.000 That's absolutely true.
01:24:27.000 Now, there's some big divides right now, and we have to bridge those gaps a little bit, but we have to start here first.
01:24:33.000 We have to be able to trust that the people that we voted for are the people that end up in office.
01:24:37.000 That is 100% primary.
01:24:40.000 We have to protect the rights that allow us to do that, and we have to step up, and we have to give, and we have to do our part in this.
01:24:47.000 I don't know if it takes exposing more politicians to what's going on, super PACs to what's going on, groups that say they fight for justice and for freedom and for the First Amendment and the Second Amendment.
01:24:56.000 All of these groups.
01:24:58.000 You'll take, probably, donations from whoever's willing to step up and give them to you, right?
01:25:01.000 As long as there's no strings attached to that money, right?
01:25:05.000 So, do it.
01:25:06.000 Do whatever you can do to get more people to watch this episode, to understand what's going on, and then send them to give.
01:25:12.000 Because that's really the only way that we can stop this.
01:25:15.000 It's not just about elections, though.
01:25:16.000 That's the major thing, right?
01:25:17.000 It's about our kids.
01:25:19.000 It's about every single aspect of our lives being controlled by big corporations that are accountable to no one.
01:25:28.000 And they get favorable treatment, it seems, in the courts, where it's, sorry, nothing we can do about that.
01:25:33.000 You just, I guess, don't exist.
01:25:35.000 They're a private company.
01:25:36.000 They can do whatever they want.
01:25:37.000 You're the digital town square now.
01:25:40.000 Things have changed.
01:25:42.000 If you as people who have resources don't step up, if the elected officials that we have don't step up, if there aren't future Robert Epsteins that stand up, what are we doing here?
01:25:57.000 This is all theater, if that's the case.
01:26:00.000 It's not making any difference.
01:26:01.000 We have to stand up.
01:26:02.000 We have to fight.
01:26:04.000 Dr. Epstein, thank you very much for doing that.
01:26:07.000 I really appreciate it.
01:26:08.000 Thank you for spending some time with us today.
01:26:09.000 I know we went over a little bit on time.
01:26:12.000 I'm a little fired up.
01:26:13.000 This is intense stuff that you're doing.
01:26:16.000 He's not selling something.
01:26:19.000 He's not coming to you to try to line his own pockets.
01:26:22.000 He's trying to fight to save our country.
01:26:26.000 Let's join him in that fight.