Stay Free - Russell Brand - September 28, 2023


Google Are Doing THIS To Manipulate You! With Dr Robert Epstein


Episode Stats

Length

45 minutes

Words per Minute

149.9

Word Count

6,753

Sentence Count

397

Hate Speech Sentences

2


Summary

In this episode, Russell Brand speaks to Dr. Robert Epstein, Director of the American Institute for Behavioral Research and Technology (AIBRT) and author of 15 books. Dr. Epstein argues that Google is able to influence the outcome of elections around the world, swaying results toward its preferred candidates through ephemeral content such as biased search results, search suggestions, and targeted go-vote reminders. He describes the permanent monitoring system his team is building across all 50 US states to capture and preserve that content, and warns that without it, this technology can be used as a political tool, and even a weapon, to shape our everyday decisions and election outcomes worldwide. Stay tuned to this episode of Stay Free with Russell Brand to find out more.


Transcript

00:00:00.000 Hello there, you Awakening Wonders.
00:00:01.000 Welcome to Stay Free with Russell Brand.
00:00:03.000 What extraordinary times we're living in where reality appears to be curated to an enormous degree.
00:00:09.000 How do you manage reality?
00:00:11.000 How do you manage perception?
00:00:12.000 How do you manage information?
00:00:15.000 Joining me now is Dr. Robert Epstein, a former Harvard psychology professor, author of 15 books and current director of the American Institute for Behavioral Research and Technology.
00:00:25.000 Thank you very much for joining us today, Doctor.
00:00:27.000 My pleasure.
00:00:28.000 One of the claims that you have made that is most astonishing, difficult almost to believe, is that Google are essentially able to curate and control reality.
00:00:40.000 Google, that we all use as an ordinary tool in most people's lives, you claim can be used to drive and direct an agenda, that it can be used as a political tool and even a weapon. In particular, I'd like to ask you about your claim that Google was able to direct six million extra votes to Joe Biden.
00:01:03.000 And obviously, that's an incredibly contentious claim, because talking about electoral fraud and electoral meddling seems to be one of the subjects that's most difficult to discuss and has to be discussed with incredible caution.
00:01:19.000 So can you tell me exactly what it is you mean by Google directing six million extra votes to a presumably preferred presidential candidate and how on earth Google would be able to do that?
00:01:31.000 Well, I've been doing very rigorous scientific research on this topic for more than 11 years.
00:01:37.000 And what should really shock you here is that people's preoccupation with election fraud and ballot stuffing and all that, that preoccupation, that obsession is actually engineered by Google and to a lesser extent, other tech companies.
00:01:52.000 There's nothing really there, and that's what they do.
00:01:55.000 They redirect attention like magicians do, so that you won't look at them.
00:02:03.000 That's exactly what they're doing.
00:02:05.000 So they're directing us to look at things that are very trivial, that are competitive,
00:02:09.000 that have little net effect on elections, because they don't want you looking at them
00:02:14.000 because they in fact have the power and use the power to shift millions of votes in elections,
00:02:20.000 not just in the US election in 2020, where they did indeed shift more than 6 million votes
00:02:26.000 to Joe Biden, but in elections around the world.
00:02:30.000 By the year 2015, Google alone was determining the outcomes of upwards of 25%
00:02:38.000 of the national elections in the world.
00:02:41.000 How do we know this?
00:02:43.000 Well, in 2020, for example, we had 1,735 field agents in four swing states in the U.S.
00:02:53.000 That's where the action is.
00:02:55.000 What does that mean?
00:02:56.000 That means that we had recruited registered voters, equipped them with special software so that we could look over their shoulders as they're getting content from Google and other tech companies.
00:03:08.000 And we recorded all that content.
00:03:10.000 In other words, we were seeing the real content that they're sending to real voters during the days leading up to an election.
00:03:18.000 And then we measured the bias in that content.
00:03:22.000 We found extreme political bias favoring Joe Biden, whom I actually supported, although I no longer do.
00:03:29.000 The point is we found extreme political bias and we know from randomized controlled experiments we've been conducting since 2013 that that level of bias shifted at least six million votes to Biden in that election.
00:03:44.000 In 2022 we had 2,742 field agents in 10 swing states.
00:03:51.000 So in other words, we're monitoring real content sent to real voters by these companies,
00:03:57.000 recording it in real time and analyzing it in real time.
00:04:01.000 In 2022, they shifted millions of votes in hundreds of midterm elections throughout the US.
00:04:08.000 We know they did this for Brexit, by the way, in the UK.
00:04:11.000 And again, they're very good at redirecting attention.
00:04:16.000 What we're doing now is much, much bigger.
00:04:18.000 We decided to build a permanent monitoring system in all 50 US states.
00:04:23.000 At this moment in time, we have 11,638 field agents in all 50 states, which means 24 hours a day, we are monitoring and preserving and archiving ephemeral content.
00:04:36.000 That's what they use to manipulate us.
00:04:39.000 Ephemeral content.
00:04:40.000 Through the computers of more than 11,000 registered voters in the U.S., 24 hours a day, we're on the verge of setting up a permanent system like this that will keep these companies away from our elections and from our kids permanently.
00:04:58.000 Whilst I understand that you're able, with these agents that you described, to monitor the information that Google is publishing, promoting and directing, it does seem, given the sort of literally global scale of the endeavour that Google are undertaking, to be a relatively small sample size.
00:05:21.000 I will add, of course, that I understand that there are significant contracts that are explicit between Google and the government in areas like data, security, military-industrial complex, defence.
00:05:33.000 There are explicit financial ties as well as donations and lobbying money, as well as numerous people in Congress and the Senate owning significant shares
00:05:42.000 in companies, big tech companies, particularly in this instance, that they are supposed to regulate.
00:05:46.000 So the possibility and opportunity for corruption is plainly there.
00:05:51.000 But I do wonder how you're able, with that sample size, to deduce such a significant number, specifically six million.
00:06:01.000 And also the other figure that I've heard in association with your work is that a 50/50 split among undecided voters,
00:06:06.000 you know, I know we're talking about swing states anyway, can turn into a 90/10 split.
00:06:11.000 How do you map these relatively small figures onto like, you know, such a global number?
00:06:18.000 And also you suggested that part of your work going forward is to regulate and oppose this trend and tendency.
00:06:25.000 How would you do that?
00:06:29.000 You're shocking me here because you sound skeptical and yet you have been victimized
00:06:34.000 by exactly these kinds of manipulations and are being victimized now.
00:06:40.000 You've been victimized because you have been suppressed.
00:06:43.000 Your content has been suppressed.
00:06:45.000 You've been demonetized.
00:06:47.000 These companies have enormous power to determine what people see and what people don't see.
00:06:53.000 And what we measure in our experiments is how that impacts people's opinions and people's votes, their voting preferences.
00:07:02.000 That's what we measure in controlled experiments.
00:07:04.000 We present at scientific meetings.
00:07:06.000 We publish in peer-reviewed journals.
00:07:09.000 Our work follows the very highest standards of scientific integrity.
00:07:12.000 And this issue of sample size, you've got that backwards.
00:07:16.000 These are enormous sample sizes for statistical and analytical purposes.
00:07:22.000 These are very, very large samples.
00:07:24.000 And so the effects that we keep replicating over and over again, other teams have now replicated,
00:07:30.000 those are significant, for those of you who know any stats here, at the 0.001 level,
00:07:37.000 meaning the probability that we're making mistakes is less than 1 in 1,000.
00:07:44.000 We're highly confident about what we've been finding.
00:07:48.000 And the problem here is that we're up against...
00:07:51.000 the most powerful mind control machine that's ever been developed by humankind, and it's operating in every country in the world except mainland China, and it impacts how people see those companies.
00:08:04.000 They're impacting not just our elections, they're not just indoctrinating our kids, they're literally altering the way we perceive them as a company.
00:08:14.000 That's extremely dangerous.
00:08:16.000 And most of these manipulations that they have access to now, that they control exclusively because they're a monopoly, most of these manipulations cannot be seen by the people who are being manipulated.
00:08:29.000 That makes it even more dangerous.
00:08:31.000 So your ability to observe them and to track them, it operates against what type of control?
00:08:39.000 If you're able to say that people are being sent this information that's highly biased, what would unbiased information look like?
00:08:48.000 I'm open, of course, to the possibility that this unprecedented and fully immersive technology would be used by people that have an appetite to control information, and it seems quite plain to me
00:09:00.000 that that does happen. But because it's so extraordinary and revelatory, because it's so significant, and if it were able to be opposed it could be so seismic in our ability to have true democracy and a public sphere worthy of the name where dissent and conversation could take place freely,
00:09:19.000 I feel that it's important that I understand exactly how, well, not exactly, because of probably the limitations of my ability to understand, but as precisely as I might, the way that you're able to say,
00:09:31.000 "Look, this would constitute neutral information.
00:09:35.000 Look at what you're actually getting."
00:09:38.000 Because I feel that it's very important.
00:09:41.000 Again, you're shocking me because you're being the skeptic here,
00:09:45.000 but you know, good scientists are also skeptics, and there's no one more skeptical
00:09:49.000 about the research I do than me.
00:09:53.000 So, let me give you an example, and I'll just show you exactly how this works.
00:09:57.000 In 2020, where we had collected a massive amount of data, we had preserved more than 1.5 million ephemeral experiences on Google and other platforms, and you're asking, "Ephemeral experiences?"
00:10:09.000 What are those?
00:10:10.000 Those are those fleeting experiences that we all have online when we're shown search suggestions or answer boxes or search results or news feeds.
00:10:20.000 They appear, they impact you, they disappear, they're stored nowhere, so no one can go back in time and see what was being done.
00:10:29.000 That's what we've learned to preserve over the years.
00:10:32.000 So, here we go.
00:10:32.000 2020, we find, again, massive, overwhelming evidence of extreme bias. We'd preserved 1.5 million ephemeral experiences.
00:10:44.000 And I sent the data in to the office of Senator Ted Cruz.
00:10:50.000 He and two other senators sent a very threatening letter to the CEO of Google.
00:10:56.000 This was November 5th, 2020, two days after the presidential election.
00:11:01.000 And lo and behold, that same day, Google turned off all the bias in the state of Georgia, which was gearing up for two Senate runoff elections in January.
00:11:13.000 We saw them turn the bias off.
00:11:17.000 It was literally like flipping a light switch, as I was told by a Google whistleblower, literally like flipping a light switch.
00:11:23.000 We had more than a thousand field agents in Georgia.
00:11:26.000 So we saw the extreme bias that was being shown.
00:11:30.000 We saw them turn it off.
00:11:31.000 Among other things, they stopped sending partisan go vote reminders.
00:11:36.000 In other words, they were sending go vote reminders mainly to members of one party.
00:11:42.000 But on that day in Georgia, no one got go-vote reminders from Google anymore.
00:11:48.000 So believe me, they have this power.
00:11:51.000 They exercise this power.
00:11:52.000 This is now being confirmed by multiple leaks from the company.
00:11:56.000 For example, emails that were leaked to the Wall Street Journal in which Google employees were discussing how they could use, and I put this in quotes, "ephemeral experiences" to change people's views about Trump's travel ban.
00:12:12.000 This has been confirmed by multiple whistleblowers, leaks of documents, leaks of videos of a PowerPoint presentation.
00:12:21.000 This is how the company operates.
00:12:23.000 They literally know that they have the power to re-engineer humanity.
00:12:29.000 That's a leak of a video called The Selfish Ledger from Google.
00:12:34.000 Literally, that's what the video is all about.
00:12:37.000 And that's what we're tracking.
00:12:39.000 In other words, we're doing to them what they do to us and our kids 24 hours a day.
00:12:44.000 We have learned how to surveil them and to preserve that very,
00:12:50.000 very powerful ephemeral content, which normally is never preserved.
00:12:54.000 And they never in a million years imagined that anyone would be sophisticated enough,
00:13:00.000 competent enough, audacious enough to preserve that content.
00:13:05.000 And that's what we are doing.
00:13:07.000 And as of this moment in time, we have preserved in recent months more than 44 million
00:13:12.000 ephemeral experiences on Google and other platforms.
00:13:16.000 We have the data.
00:13:18.000 We have the evidence and it's court admissible.
00:13:21.000 Wow.
00:13:23.000 So that's fascinating.
00:13:24.000 So presumably there are relationships and an agenda where interests converge to the degree where there is an established and undemocratic consensus about the nature of this reality that's being formulated, i.e.
00:13:40.000 this is the data that is promoted, this is the information that's amplified, this is the information that's censored, this is the information that people just don't get to see.
00:13:50.000 I wonder if, when you presumably began to garner your expertise and education in behaviourism, tools of this magnitude didn't exist and were not available.
00:14:05.000 Throughout the pandemic period there was a lot of talk about nudge units, certainly in our country there was, how behavioural nudges could be offered, and sort of B.F. Skinner-type nomenclature about how behaviour can be controlled, how certain traits can be amplified, certain impressions can be projected and promoted and others maligned, ignored.
00:14:25.000 I wonder how your expertise and background in behaviourism, Robert, maps onto this new reality and what advantages they now have having this kind of utility.
00:14:38.000 How does this, what do I want to say, how does this marry to your conventional understanding of behaviourism in a normal propagandist state, like in the last century, where there have been print media and TV media?
00:14:51.000 And can you tell us what techniques of observation and measurement are preserved and have sustained what must be an epochal shift?
00:14:59.000 I was the last doctoral student at Harvard University of B.F. Skinner,
00:15:02.000 the man who some would say helped to create behavioral psychology.
00:15:09.000 And Skinner himself did not anticipate what has actually happened.
00:15:15.000 He would be shocked.
00:15:17.000 If he hadn't been cremated, I would say he'd be rolling over in his grave right now.
00:15:21.000 Because what is happening is astonishing.
00:15:25.000 It's just, it's unprecedented.
00:15:28.000 Companies like Google, and there are others too, but they're the worst offender.
00:15:32.000 Companies like Google now have access, because of the internet, to new types of manipulations.
00:15:37.000 These aren't nudges.
00:15:39.000 These are massive manipulations.
00:15:42.000 I mean, when we started doing experiments, controlled experiments, on these new techniques, which we had to discover, we had to name, and then we had to learn how to quantify them, I didn't believe our data.
00:15:55.000 In the first experiment we ran in 2013, I thought by showing people biased search results, I could shift their voting preferences by two or three percent, which I thought would be, you know, important possibly in a close election.
00:16:08.000 The first shift we got was 43 percent, which I thought was incorrect.
00:16:12.000 So we repeated the experiment.
00:16:14.000 These are not with college sophomores, by the way.
00:16:16.000 This is with a representative sample of U.S. voters.
00:16:20.000 And the fact is, we repeated that experiment.
00:16:22.000 We got a shift of 66 percent.
00:16:25.000 We continued to replicate.
00:16:28.000 Other teams have replicated this effect.
00:16:30.000 We did a national survey in the U.S.
00:16:32.000 We did research in India, research in the U.K.
00:16:35.000 This has been going on now for more than 11 years.
00:16:38.000 This is rock solid research and Skinner himself would be flabbergasted.
00:16:46.000 Because what we're seeing now are techniques for shifting people's thinking and behavior without their knowledge on a massive scale to an extent that has never been possible before in human history.
00:16:59.000 That's what the internet has made available.
00:17:01.000 Now, this wouldn't necessarily be that much of a threat.
00:17:05.000 Except for the fact that the way the internet has evolved, which no one anticipated, is it's controlled mainly by two big monopolies, to a lesser extent by a couple of other monopolies.
00:17:16.000 And because they're monopolies, it means that these techniques of control, we can't counter.
00:17:22.000 If you, in an election, you support a candidate and you buy a billboard, I can buy another billboard.
00:17:28.000 You buy a TV commercial, I can buy two TV commercials.
00:17:32.000 But if one of these big platforms, like Google, if they want to support a candidate, or they want to support a Brexit vote, or they want to support a political party, there is nothing you can do to counter what they're doing.
00:17:46.000 What we've developed are systems to surveil them, to preserve the evidence, preserve the data.
00:17:53.000 That's the only way I know of to stop them: by gathering the evidence in a way that is, again, scientifically valid, so that the data are admissible in court, and that is what we're doing right now.
00:18:09.000 If people want to know the details, they can go to mygoogleresearch.com, they can go to techwatchproject.org, MyGoogleResearch.com will give them lots and lots of links to lots of published papers, lots of talks I've given.
00:18:23.000 This is serious work and what's happening here, again our attention is being misdirected away from what they're doing, but what's happening here, what they're really doing is extremely dangerous and very scary.
00:18:37.000 It undermines democracy.
00:18:39.000 It makes democracy into a kind of a joke. And since you haven't interrupted me yet, thank God, I want to just tell you that President Dwight D. Eisenhower, who was head of Allied Forces in World War II, I mean, he was an insider.
00:18:53.000 In the last speech he gave as president in 1961, some people are aware that he talked about the rise of a military-industrial complex, but, you know, in that same speech, he warned about the rise of a technological elite.
00:19:09.000 This was 1961.
00:19:11.000 He warned about the rise of a technological elite that could someday control public policy without people's knowledge.
00:19:20.000 And that is what has happened.
00:19:23.000 The technological elite are now in control.
00:19:26.000 Oh my God, it's terrifying.
00:19:29.000 One of the things that you covered there was, I suppose, the monopolization, or at best,
00:19:36.000 duopolization of the public space.
00:19:40.000 Sometimes when I have something of this scale described to me, Robert, I find it inconceivable
00:19:46.000 to envisage that it could ever be opposed.
00:19:50.000 And yet there's something oddly traditional about the dynamics suggested by this.
00:19:56.000 We once believed that, in a sense, it was the function of the evolved state to preserve
00:20:02.000 and protect the interests of the public against corporate behemoths and corporate gigantism.
00:20:10.000 Now we have a gigantism that's unprecedented, way beyond the instantiations of a previous century, where it would have been steel and minerals and resources.
00:20:21.000 But attention and consciousness itself is the faculty, the object, of this monopolization?
00:20:31.000 It's extraordinary to hear how effective they are at managing and manipulating, to 43% or 66%.
00:20:40.000 These numbers are sort of astonishing to hear.
00:20:44.000 I wonder what you think about Google's attempt to overturn that $2.6 billion EU antitrust fine.
00:20:53.000 I wonder what you think about, for example, we know we're on Rumble, that when Rumble covered the Republican primaries, it was apparently very difficult to find on Google.
00:21:04.000 And I wonder, perhaps most of all, about whether or not, given that it appears that there is
00:21:09.000 a political bias built into the system's current modality, whether or not an alliance with
00:21:17.000 the alternative political party is a possibility in order to regulate and break up these monopolies,
00:21:25.000 because that would seem to be the only way that it could be challenged. And that's the
00:21:30.000 sort of traditional component that I'm referring to, unless you have some kind of like, other
00:21:34.000 than the state or an incredibly mobilized population, even with the information that
00:21:39.000 you are curating and compiling, how do you ever challenge something of this scale?
00:21:46.000 It can be challenged, but the antitrust actions that are currently being used in the EU and
00:21:53.000 also in the United States were actually designed by Google's legal team.
00:21:59.000 They're absolute shams, complete shams.
00:22:03.000 It makes it look like our public officials are doing something to protect us.
00:22:08.000 They're not.
00:22:09.000 Google works closely with governments around the world, even with the government of mainland China.
00:22:16.000 And works closely with intelligence agencies around the world.
00:22:20.000 The people at Google know that no one can ever break them up because you can't break up the search engine.
00:22:26.000 That's their main tool.
00:22:28.000 If you broke up the search engine, it wouldn't work.
00:22:30.000 Facebook knows this, too.
00:22:31.000 You can't break up their basic social media platform.
00:22:34.000 That would be like putting a Berlin Wall through every family in the world.
00:22:40.000 Are there ways to stop them?
00:22:42.000 Yes, but antitrust actions aren't going to do much.
00:22:46.000 What could be done, though, and this is very light-touch regulation,
00:22:50.000 there's precedent for it in law,
00:22:52.000 there's precedent for it in Google's business practices,
00:22:55.000 is that you could declare the index, the database they use to generate search results,
00:23:01.000 you could declare that to be a public commons.
00:23:04.000 The EU could do it.
00:23:06.000 In other words, you would allow other parties, other companies, high school students,
00:23:10.000 you'd allow them to build their own search engine with access to Google's index.
00:23:14.000 You'd end up with thousands of search engines, all competing for our attention,
00:23:19.000 all trying to attract niche audiences exactly like the news media domain.
00:23:26.000 That's exactly what happens in news media.
00:23:28.000 That could be done simply by giving everyone access to Google's index.
00:23:35.000 Google would fight it in court, of course, and we'd see what happens, but that's one way.
00:23:39.000 But the only sure way that I know of to stop these companies, because they're affecting not just our elections, but our thinking, what we focus on.
00:23:52.000 They're in control of what content we see, such as your content, and what content we don't see, such as your content.
00:24:01.000 The only way to really stop them is through monitoring, because by monitoring, what happens is we preserve their manipulations.
00:24:10.000 We preserve them.
00:24:12.000 We can make them public 24 hours a day.
00:24:15.000 We can share the findings with public officials, both in the US and other countries, and give people, give organizations, give government agencies, give political campaigns the power they need to bring effective legal action against Google, because we're talking about massive amounts of data collected in a scientifically rigorous way.
00:24:42.000 I'll give you one quick example of how hard it is to find them if you don't have the data.
00:24:47.000 Last year, the Republican National Committee sued Google because Google was diverting tens of millions of emails that the party was sending to Republicans, and Google was diverting all those emails into spam boxes.
00:25:05.000 So the Republican Party sued them.
00:25:07.000 That case got thrown out of court.
00:25:09.000 Why?
00:25:10.000 They didn't have sufficient data to prove their claim.
00:25:13.000 Now, Google was really doing this and we were not monitoring that.
00:25:18.000 We are now.
00:25:19.000 The point is we can monitor what they're doing, preserve the data on a very large scale that can be used in the courts and that can be used with various government agencies.
00:25:30.000 And will they stop what they're doing?
00:25:33.000 Yes.
00:25:34.000 How do we know that?
00:25:35.000 Because in 2020, when we shared our data with some US senators, they sent a threatening letter to the CEO of Google and Google stopped.
00:25:44.000 They'll have to stop.
00:25:46.000 If they know that they're being monitored on a massive scale, 24 hours a day, worldwide eventually, by the way, we've already been approached by five other countries asking us to help set up monitoring systems.
00:25:59.000 If these tech execs know that their data are being captured, That we're doing to them what they do to us and our kids 24 hours a day.
00:26:08.000 That we're monitoring them.
00:26:10.000 We're preserving data that they thought could never be preserved.
00:26:15.000 They will stop, because you know why?
00:26:17.000 They can still make billions of dollars.
00:26:19.000 They don't have to at the same time be messing with our thinking, be messing with our elections, and especially be messing with our kids.
00:26:28.000 One of the new areas of research that we've started is looking at data coming onto the devices of more than 2,000 children throughout the U.S.
00:26:37.000 We're just beginning to look at that and our heads are spinning because what these companies are sending to kids is just unbelievable and parents are unaware.
00:26:50.000 There's a kind of social engineering occurring here on a massive scale that people are unaware of.
00:26:59.000 You can see it in leaks from Google.
00:27:01.000 You can see this.
00:27:02.000 That this is the intention of some of the top people at that company is to make a better world according to quote unquote company values.
00:27:12.000 That's actually in a video that leaked from the company that was about the power the company has to reengineer humanity.
00:27:21.000 Literally, they're using the phrase according to company values.
00:27:26.000 We can stop them.
00:27:29.000 The first step is to be aware of what it is they're doing.
00:27:34.000 It sounds like the kind of...
00:27:36.000 banalized dystopia described both by Huxley and David Foster Wallace to a degree in Infinite Jest, a sort of corporatized cultural space where the ideologies are masked in the kind of language of convenience, safety, no real moral spikes, no real ideological thrusts, you know, until there are, but mostly it's kind of
00:28:08.000 presented as normalcy.
00:28:11.000 I suppose that in order to significantly change society, you have to change the parameters of
00:28:16.000 what people regard as normal significantly. Now, one of the things that you've talked about is the
00:28:21.000 possibility of dissent and the likelihood of dissent being closed down in such a space.
00:28:29.000 What do you think is the role of independent media within this space?
00:28:33.000 How can independent media Succeed in such a highly controlled and curated space.
00:28:40.000 And what do we have to do to ensure that independent voices are able to be heard in a space like this?
00:28:48.000 And I'm very encouraged, by the way, by what you say about the monitoring; the effectiveness of monitoring does seem to, you know, somewhat slow and curtail the proclivities of this organisation in particular.
00:29:05.000 And the possibility for sharing that tech and, you know, making Google search stuff open source.
00:29:11.000 That does seem like an amazing way of dissolving that power.
00:29:15.000 But what do we do in particular about the sort of news media organizations like this one that necessarily exist within a space that's controlled to that degree?
00:29:24.000 Well, at the moment, you're in grave danger.
00:29:26.000 I mean, that's the bottom line.
00:29:28.000 At the moment, independent media of any sort are in grave danger.
00:29:32.000 One of the most remarkable pieces ever written about this problem, long before, by the way, he ever became aware of my research, was written by the head of the EU's largest publishing conglomerate, the German company Axel Springer.
00:29:47.000 And he published a long letter in English and in German called Fear of Google.
00:29:54.000 And it was about how his company, they're in constant fear of Google and every decision they make, every business decision they make, they have to make in such a way as to not offend Google.
00:30:05.000 Because when Google decides to suppress content, for example, to demote you in their search results or delete you, there's nothing you can do, there's no recourse at all, and you are now out of business.
00:30:22.000 And that's the environment in which we live.
00:30:24.000 So no matter what content you want to contribute to the world, and I'm speaking of you personally here, it's a whim on their part.
00:30:31.000 You're literally under the influence of whims at that company about whether you can continue to get your message out.
00:30:43.000 They've done this repeatedly with independent news sources.
00:30:47.000 They have reduced their traffic to 10% of what it was.
00:30:51.000 They can do that with the flip of a switch.
00:30:55.000 And by the way, that was confirmed to me by one of the whistleblowers from Google.
00:30:59.000 I'm in touch with a lot of the whistleblowers.
00:31:01.000 I'm in touch with people at Google who haven't even blown the whistle yet.
00:31:05.000 So I know way, way, way too much about what's going on there.
00:31:09.000 But, yes, at the moment, they have that power.
00:31:12.000 They decide what more than 5 billion people around the world can see and cannot see.
00:31:20.000 And at the moment, there is no way to counteract what they're doing.
00:31:24.000 In the U.S., the courts have said over and over again, when they have, for example, shut down hundreds of websites belonging to one particular company, yes, they have that ability.
00:31:34.000 They can block websites.
00:31:36.000 They block millions of websites every day.
00:31:39.000 January 31st, 2009, they blocked access to the entire internet for 40 minutes.
00:31:45.000 That was reported by The Guardian and that was never denied by the company.
00:31:50.000 I eventually figured out, by the way, why they chose those particular 40 minutes to shut down the internet.
00:31:55.000 The point is they have this incredible power.
00:31:58.000 They use this incredible power.
00:31:59.000 The courts in the U.S. have said they have every right to do that because they're a private company.
00:32:05.000 And see, that's the problem here.
00:32:07.000 In other words, even though I agree with a lot of their values because I lean left myself politically, I don't like the idea of a private company that's not accountable to us, to any public, having this kind of power.
00:32:23.000 That's the problem here.
00:32:24.000 The problem is not necessarily their values.
00:32:26.000 The problem is the power that they have and that they're utilizing without any accountability.
00:32:35.000 To us, to any population, any group of people around the world, they're simply not accountable.
00:32:43.000 I hope some of your viewers find that to be objectionable.
00:32:48.000 I hope some of your viewers will go to mygoogleresearch.com because this big national monitoring system that we started setting up last year, I had raised about $3 million to get us going on it.
00:33:03.000 It's going extremely well.
00:33:05.000 We've preserved now more than 44 million ephemeral experiences.
00:33:10.000 We have a panel nationwide of more than 11,000 field agents in all 50 U.S. states,
00:33:16.000 because we've got to get the system going here fully before we start helping other countries.
00:33:22.000 But the fact is that $3 million is now almost gone.
00:33:26.000 I need access to other major funding.
00:33:31.000 One of our advisors is trying to get us in touch with people in Switzerland who he feels might be very interested.
00:33:40.000 Are there people in Europe or in the UK who could help us?
00:33:46.000 Because this system has to exist.
00:33:48.000 This is not optional for humanity.
00:33:51.000 If we don't monitor them, we will never know how they're influencing elections or kids or human autonomy, with no system like that in place.
00:34:05.000 I'll make a specific statement.
00:34:07.000 If this system is not fully running next year in the United States, with all of our data being shared with authorities and with the public every single day, if this system is not there, Google alone will be able to shift between 6.4 and 25.5 million votes in the presidential election of 2024 without anyone knowing what they're doing, without anyone being able to go back in time and look at all that ephemeral content.
00:34:39.000 That's what we're up against here.
00:34:41.000 That's why we must have systems like this, monitoring systems in place, that catch the data that they thought could never be caught.
00:34:50.000 That's what we've learned how to do.
00:34:51.000 I need your help and your audience's help in making this happen.
00:34:56.000 This horrifying power that you described, already present, already active, already operating according to your data, is as yet un-augmented by a fully capable AI technology.
00:35:14.000 What are your thoughts on how the AI component will advance these capacities?
00:35:20.000 And what do you feel about, for example, the sort of chatbot story and the talk of sentience and, you know, the sacking of software engineer Blake Lemoine or Lemien or whatever his name was.
00:35:32.000 What do you feel about that, Doc?
00:35:35.000 AI is part of the story, obviously.
00:35:38.000 It is also potentially dangerous in its own right.
00:35:43.000 It will make these capabilities that they have even more powerful.
00:35:47.000 For example, we just finished, in fact, I've not announced this publicly, this will be my first announcement, but we've just finished our first exploration of what we call DPE, the Digital Personalization Effect.
00:36:00.000 And what we've shown is that if we show people biased content, we can produce shifts easily of 20% or more in their voting preferences.
00:36:10.000 If we personalize the content, which of course Google is famous for doing, if we personalize it based on some things we know about those people and what kinds of media sources they trust and news sources and celebrities, if we personalize the content so it's coming from sources they personally trust, that shift goes up to over 70 percent. From a 20 percent shift to a 70 percent
00:36:34.000 shift, just by personalizing. And AI, of course, makes it much, much easier and smoother to
00:36:41.000 personalize content. That's one of the main dangers here.
00:36:46.000 So the fact that these companies have always relied on AI to some extent, and now are relying on it more and more, makes them more powerful and far more dangerous.
00:36:57.000 All the more reason why we have to capture the ephemeral content that they use to manipulate people.
00:37:03.000 And I'm going to say it a third time, MyGoogleResearch.com, because we are desperately in need of help.
00:37:10.000 I'm just being honest with you.
00:37:11.000 I mean, we desperately need help.
00:37:13.000 We can't do this ourselves.
00:37:15.000 I have a team of almost 50 people working on this day and night.
00:37:20.000 A lot of them are volunteers.
00:37:23.000 It's very, very difficult what we're doing.
00:37:25.000 It's never been done before, but we're doing it and we're doing it well.
00:37:30.000 And we need people's help to make sure that this can be done on a larger scale.
00:37:35.000 For those of you out there who care about such things, all donations are going to a 501(c)(3) public charity.
00:37:43.000 They're all fully tax deductible.
00:37:44.000 I'm so sorry that I have to keep interrupting with this begging for money, but that's the reality.
00:37:53.000 What we're doing is expensive, it's new, and it's important.
00:37:58.000 It's extremely important.
00:38:00.000 It sounds important in a way that's almost difficult to conceive of.
00:38:03.000 When you were talking before about the impact of personalised data, it made me realise that we're simply not evolved to live in a world where information can be curated in that manner.
00:38:20.000 I imagine, I imagine, that the roots of behaviourism must have, you know, a component that's anthropological and ethnographic, in how we are evolved to relate to one another and how we're evolved to trust sources of information.
00:38:36.000 How a consensus between a group is established and to have tools that can wallpaper your reality like a kind of chrome sphere surrounding your mind is...
00:38:50.000 It's in a sense beyond our, it's beyond sugar.
00:38:54.000 It's beyond sugar in terms of an agent of interruption, stimulation and control.
00:39:00.000 So I recognize how important what you're doing is.
00:39:02.000 I can hear that you're, you know, necessarily evangelical about continuing the work because it's seismic and pertains to sort of cornerstones of our as-yet-still-called civilization, like democracy, like judiciary, like the ability to have open conversations, like important principles around which we presumed society was being built, but for a while we have suspected that in a sense these are simply gestures that are put in place while real power does what real power wants to do.
00:39:34.000 And that kind of power with this kind of utility is truly terrifying.
00:39:40.000 Can you speak for a moment about the aspect of, from a behaviourist perspective, how we are not, you know, because in a sense, right, I'm a person, obviously, and I imagine that I'd be able to go, oh, well, I'm getting very biased information here from Google.
00:39:58.000 How is it that we simply are not able to discern, tackle, remain objective,
00:40:04.000 keep some kind of distance from this experience?
00:40:08.000 Why is it so powerful from a, almost from a anthropological and behavioral perspective?
00:40:14.000 A couple of issues there.
00:40:15.000 First of all, most people can't see bias in the content that's being presented to them.
00:40:21.000 So those people are very easy to shift.
00:40:22.000 And in some demographic groups, you can easily shift upwards of 80% of voting preferences.
00:40:31.000 Some people are just very, very vulnerable to this kind of manipulation because they trust companies like Google, they trust algorithmic output because they have no idea what algorithms are.
00:40:45.000 They trust computers because they think computers are inherently objective.
00:40:48.000 So, you've got all that working against you.
00:40:51.000 And then there's another factor, which is really, really scary.
00:40:55.000 In some of the big studies that we've done, there's always a small group of people who do see the bias.
00:41:02.000 Now, it's a small group, but with a big enough study, you know, that group is large enough for us to look at them separately.
00:41:08.000 And here's the thing.
00:41:10.000 The people who see the bias, they shift even farther in the direction of the bias.
00:41:17.000 Now, how can that be?
00:41:19.000 Why would that be?
00:41:21.000 Well, because, presumably, they're thinking, well, I can see there's bias here, and of course, Google is objective, or computer output is objective, or algorithms are objective, and it's clearly preferring that candidate over this candidate.
00:41:38.000 So that candidate really must be the best.
00:41:41.000 And those shifts, the shifts among the people who can see the bias are larger than the shifts among the people who can't see the bias.
00:41:50.000 So, you know, there are no protections here.
00:41:53.000 This is a whole new world of influence and manipulation.
00:41:57.000 The only protection that I know of for sure that works is by doing to them what they do to us, by surveilling them, capturing the data so that it can be looked at carefully by authorities and courts.
00:42:14.000 You know, I'll tell you something, the UK and the EU, as you know, have been far more aggressive against Google in particular than any government agency in the US.
00:42:24.000 Because, you know, it's a US company.
00:42:27.000 So the EU has fined Google over and over again, more than 10,000, excuse me, 10 million euros in fines, also big fines in the UK.
00:42:38.000 You know what?
00:42:39.000 It has no impact on these companies whatsoever.
00:42:42.000 They've ordered Google to do this and that.
00:42:44.000 Google has completely ignored them.
00:42:46.000 What is lacking in the EU and the UK is a monitoring system to measure compliance with whatever the new laws and regulations are; there are no monitoring systems in the EU and the UK, and Google knows that.
00:43:01.000 They completely ignore all of these various agreements and orders because no one is monitoring.
00:43:08.000 No monitoring means you can't measure compliance.
00:43:12.000 Well, I imagine we're going to have to be pretty clear about how to find your work because I don't imagine it comes up very easily on a Google search.
00:43:22.000 Dr Robert Epstein, thank you so much for joining us today.
00:43:26.000 Thank you for conveying this complex information in such an easy-to-understand way, in spite of the vastness of the task and the scale of the challenge.
00:43:37.000 Thanks for giving us some suggestions of what a solution might look like and making it clear this is something that's happening right now and how difficult it is to detect. And yet there is a way to oppose it, and I would recommend that all of you learn more about Dr. Robert's work by going to drrobertepstein.com and mygoogleresearch.com.
00:43:58.000 Doctor, thank you so much for joining us.
00:44:00.000 I'm sure we'll be talking again, although these conversations might be difficult to find online.
00:44:05.000 Thank you.
00:44:05.000 Thank you.
00:44:06.000 That shows you the necessity of supporting us by clicking the red Awaken button and joining and supporting our community.
00:44:14.000 Without a direct connection to you, it's going to become increasingly difficult to communicate with you in a curated and controlled cultural space.
00:44:22.000 On the show tomorrow, we have Glenn Greenwald.
00:44:25.000 Imagine the information that he's going to be able to convey on this subject, as well as war, the pandemic, legacy media, corruption.
00:44:33.000 If you do become an Awakened Wonder and join our community, and I urge you to do that, you've just heard what Dr. Robert Epstein has described, almost a necessity to do that, you'll get access to guided meditations, readings, questions and answers.
00:44:45.000 And I want to thank you that have recently become new annual supporters like Truthfulergave, Barloo, Lucky Lou, Magic Peace, Love Ray, Pardon, Snuffle Dog, The Kennedys, Freddie, Flintstone, and so many more.
00:44:55.000 Thank you for joining us.
00:44:56.000 We really need you now more than ever.
00:44:59.000 Join us tomorrow, not for more of the same.
00:45:00.000 We'd never insult you with that, but for more of the different.