In this episode, Russell Brand speaks to Dr. Robert Epstein, director of the American Institute for Behavioral Research and Technology (AIBRT) and author of 15 books. Dr. Epstein argues that Google is able to influence the outcome of elections around the world, using ephemeral content to sway results toward its preferred candidates and to shape our everyday decisions. He also describes the permanent monitoring system he is building in all 50 US states to capture that content and hold the company to account. Stay tuned to this episode of Stay Free With Russell Brand to find out more.
00:00:15.000Joining me now is Dr. Robert Epstein, a former Harvard psychology professor, author of 15 books and current director of the American Institute for Behavioral Research and Technology.
00:00:25.000Thank you very much for joining us today, Doctor.
00:00:28.000One of the claims that you have made that is most astonishing, difficult almost to believe, is that Google are essentially able to curate and control reality.
00:00:40.000Google, that we all use as an ordinary tool in most people's lives, you claim can be used to drive and direct an agenda, that it can be used as a political tool and even a weapon. In particular, I'd like to ask you about your claim that Google was able to direct six million extra votes to Joe Biden.
00:01:03.000And obviously, that's an incredibly contentious claim, because talking about electoral fraud and electoral meddling seems to be one of the subjects that's most difficult to discuss and has to be discussed with incredible caution.
00:01:19.000So can you tell me exactly what it is you mean by Google directing six million extra votes to a presumably preferred presidential candidate and how on earth Google would be able to do that?
00:01:31.000Well, I've been doing very rigorous scientific research on this topic for more than 11 years.
00:01:37.000And what should really shock you here is that people's preoccupation with election fraud and ballot stuffing and all that, that preoccupation, that obsession is actually engineered by Google and to a lesser extent, other tech companies.
00:01:52.000There's nothing really there, and that's what they do.
00:01:55.000They redirect attention like magicians do, so that you won't look at them.
00:02:56.000That means that we had recruited registered voters, equipped them with special software so that we could look over their shoulders as they're getting content from Google and other tech companies.
00:03:10.000In other words, we were seeing the real content that they're sending to real voters during the days leading up to an election.
00:03:18.000And then we measured the bias in that content.
00:03:22.000We found extreme political bias favoring Joe Biden, whom I actually supported, although I no longer do.
00:03:29.000The point is we found extreme political bias and we know from randomized controlled experiments we've been conducting since 2013 that that level of bias shifted at least six million votes to Biden in that election.
00:03:44.000In 2022 we had 2,742 field agents in 10 swing states.
00:03:51.000So in other words, we're monitoring real content sent to real voters by these companies,
00:03:57.000recording it in real time and analyzing it in real time.
00:04:01.000In 2022, they shifted millions of votes in hundreds of midterm elections throughout the US.
00:04:08.000We know they did this for Brexit, by the way, in the UK.
00:04:11.000And again, they're very good at redirecting attention.
00:04:16.000What we're doing now is much, much bigger.
00:04:18.000We decided to build a permanent monitoring system in all 50 US states.
00:04:23.000At this moment in time, we have 11,638 field agents in all 50 states, which means 24 hours a day, we are monitoring and preserving and archiving ephemeral content.
00:04:36.000That's what they use to manipulate us.
00:04:40.000Through the computers of more than 11,000 registered voters in the U.S., 24 hours a day, we're on the verge of setting up a permanent system like this that will keep these companies away from our elections and from our kids permanently.
00:04:58.000Whilst I understand that you're able, with these agents that you described, to monitor the information that Google is publishing, promoting and directing, it does seem to be, given the sort of literally global scale of the endeavour that Google are undertaking, to be a relatively small sample size.
00:05:21.000I will add, of course, that I understand that there are significant contracts that are explicit between Google and the government in areas like data, security, military-industrial complex, defence.
00:05:33.000There are explicit financial ties as well as donations and lobbying money, as well as numerous people in Congress and the Senate owning significant shares.
00:05:42.000In companies, big tech companies, particularly in this instance, that they are supposed to regulate.
00:05:46.000So the possibility and opportunity for corruption is plainly there.
00:05:51.000But I do wonder how you're able, with that sample size, to deduce such a significant number, specifically six million.
00:06:01.000And also, the other figure that I've heard in association with your work is that a 50/50 split among undecided voters,
00:06:06.000you know, I know we're talking about swing states anyway, can turn into a 90/10 split.
00:06:11.000How do you map these relatively small figures onto like, you know, such a global number?
00:06:18.000And also you suggested that part of your work going forward is to regulate and oppose this trend and tendency.
00:07:24.000And so the effects that we keep replicating over and over again, other teams have now replicated,
00:07:30.000those are significant, for those of you who know any stats here, at the 0.001 level,
00:07:37.000meaning the probability that we're making mistakes is less than 1 in 1,000.
00:07:44.000We're highly confident about what we've been finding.
00:07:48.000And the problem here is that we're up against...
00:07:51.000the most powerful mind-control machine that's ever been developed by humankind, and it's operating in every country in the world except mainland China, and it impacts how people see those companies.
00:08:04.000They're impacting not just our elections, they're not just indoctrinating our kids, they're literally altering the way we perceive them as a company.
00:08:16.000And most of these manipulations that they have access to now, that they control exclusively because they're a monopoly, most of these manipulations cannot be seen by the people who are being manipulated.
00:08:31.000So your ability to observe them and to track them, it operates against what type of control?
00:08:39.000If you're able to say that people are being sent this information that's highly biased, what would unbiased information look like?
00:08:48.000I'm open, of course, to the possibility that this unprecedented and fully immersive technology would be used by people that have an appetite to control information, and it seems quite plain to me
00:09:00.000that that does happen. But because it's so extraordinary and revelatory, because it's so significant, and because, if it were able to be opposed, it could be so seismic in our ability to have true democracy and a public sphere worthy of the name, where dissent and conversation could take place freely,
00:09:19.000I feel that it's important that I understand exactly how, or not exactly, because of the limitations of my ability to understand, but as precisely as I might, the way that you're able to say,
00:09:31.000"Look, this would constitute neutral information.
00:09:35.000Look at what you're actually getting."
00:09:38.000Because I feel that it's very important.
00:09:41.000Again, you're shocking me because you're being the skeptic here,
00:09:45.000but you know, good scientists are also skeptics, and there's no one more skeptical
00:09:53.000So, let me give you an example, and I'll just show you exactly how this works.
00:09:57.000In 2020, where we had collected a massive amount of data, we had preserved more than 1.5 million ephemeral experiences on Google and other platforms, and you're asking, "Ephemeral experiences?"
00:10:10.000Those are those fleeting experiences that we all have online when we're shown search suggestions or answer boxes or search results or news feeds.
00:10:20.000They appear, they impact you, they disappear, they're stored nowhere, so no one can go back in time and see what was being done.
00:10:29.000That's what we've learned to preserve over the years.
00:10:32.000In 2020, we find, again, massive, overwhelming evidence of extreme bias, having preserved 1.5 million ephemeral experiences.
00:10:44.000And I sent the data in to the office of Senator Ted Cruz.
00:10:50.000He and two other senators sent a very threatening letter to the CEO of Google.
00:10:56.000This was November 5th, 2020, two days after the presidential election.
00:11:01.000And lo and behold, that same day, Google turned off all the bias in the state of Georgia, which was gearing up for two Senate runoff elections in January.
00:11:52.000This is now being confirmed by multiple leaks from the company.
00:11:56.000For example, emails that were leaked to the Wall Street Journal in which Google employees were discussing how they could use, and I put this in quotes, "ephemeral experiences" to change people's views about Trump's travel ban.
00:12:12.000This has been confirmed by multiple whistleblowers, leaks of documents, leaks of videos of a PowerPoint presentation.
00:13:24.000So presumably there are relationships and an agenda where interests converge to the degree where there is an established and undemocratic consensus about the nature of this reality that's being formulated, i.e.
00:13:40.000this is the data that is promoted, this is the information that's amplified, this is the information that's censored, this is the information that people just don't get to see.
00:13:50.000I wonder if when you presumably began to garner your expertise and education in behaviouralism, tools of this magnitude didn't exist and were not available.
00:14:05.000Throughout the pandemic period there was a lot of talk about nudge units, certainly in our country there were, how behavioural nudges could be offered and sort of BF Skinner type nomenclature about how behaviour can be controlled, how certain traits can be amplified, certain impressions can be projected and promoted and others maligned, ignored.
00:14:25.000I wonder how your expertise and background in behaviouralism, Robert, maps onto this new reality and what advantages they now have having this kind of utility.
00:14:38.000How does this How does this, what do I want to say, how does this marry to your conventional understanding of behaviouralism in a normal propagandist state like in the last century where there have been print media and TV media?
00:14:51.000And can you tell us what techniques of observation and measurement are preserved and have sustained what must be an epochal shift?
00:14:59.000I was the last doctoral student at Harvard University of B.F.
00:15:02.000Skinner, the man who some would say helped to create behavioral psychology.
00:15:09.000And Skinner himself did not anticipate what has actually happened.
00:15:42.000I mean, when we started doing experiments, controlled experiments, on these new techniques, which we had to discover, we had to name, and then we had to learn how to quantify them, I didn't believe our data.
00:15:55.000In the first experiment we ran in 2013, I thought by showing people biased search results, I could shift their voting preferences by two or three percent, which I thought would be, you know, important possibly in a closed election.
00:16:08.000The first shift we got was 43 percent, which I thought was incorrect.
00:16:32.000We did research in India, research in the U.K.
00:16:35.000This has been going on now for more than 11 years.
00:16:38.000This is rock solid research and Skinner himself would be flabbergasted.
00:16:46.000Because what we're seeing now are techniques for shifting people's thinking and behavior without their knowledge on a massive scale to an extent that has never been possible before in human history.
00:16:59.000That's what the internet has made available.
00:17:01.000Now, this wouldn't necessarily be that much of a threat.
00:17:05.000Except for the fact that the way the internet has evolved, which no one anticipated, is it's controlled mainly by two big monopolies, to a lesser extent by a couple of other monopolies.
00:17:16.000And because they're monopolies, it means that these techniques of control, we can't counter.
00:17:22.000If you, in an election, you support a candidate and you buy a billboard, I can buy another billboard.
00:17:28.000You buy a TV commercial, I can buy two TV commercials.
00:17:32.000But if one of these big platforms, like Google, if they want to support a candidate, or they want to support a Brexit vote, or they want to support a political party, there is nothing you can do to counter what they're doing.
00:17:46.000What we've developed are systems to surveil them, to preserve the evidence, preserve the data.
00:17:53.000That's the only way I know of to stop them: by gathering the evidence in a way that is, again, scientifically valid, so that the data are admissible in court, and that is what we're doing right now.
00:18:09.000If people want to know the details, they can go to mygoogleresearch.com or techwatchproject.org; MyGoogleResearch.com will give them lots and lots of links to lots of published papers and lots of talks I've given.
00:18:23.000This is serious work and what's happening here, again our attention is being misdirected away from what they're doing, but what's happening here, what they're really doing is extremely dangerous and very scary.
00:18:39.000It makes democracy into a kind of a joke. And since you haven't interrupted me yet, thank God, I want to just tell you that President Dwight D. Eisenhower, who was head of Allied Forces in World War II, I mean, he was an insider.
00:18:53.000In the last speech he gave as president, in 1961, some people are aware that he talked about the rise of a military-industrial complex, but in that same speech, he warned about the rise of a technological elite.
00:19:40.000Sometimes when I have something of this scale described to me, Robert, I find it inconceivable
00:19:46.000to envisage that it could ever be opposed.
00:19:50.000And yet there's something oddly traditional about the dynamics suggested by this.
00:19:56.000We once believed that, in a sense, it was the function of the evolved state to preserve
00:20:02.000and protect the interests of the public against corporate behemoths and corporate gigantism.
00:20:10.000Now we have a gigantism that's unprecedented, way beyond the instantiations of a previous century, where it would have been steel and minerals and resources.
00:20:21.000But attention, and consciousness itself, is the faculty that is the object of this monopolization.
00:20:31.000It's extraordinary to hear how effective they are at managing and manipulating to 46% or 66%.
00:20:40.000These numbers are sort of astonishing to hear.
00:20:44.000I wonder what you think about Google's attempt to overturn that $2.6 billion EU antitrust fine.
00:20:53.000I wonder what you think about, for example, we know we're on Rumble, that when Rumble covered the Republican primaries, it was apparently very difficult to find on Google.
00:21:04.000And I wonder, perhaps most of all, about whether or not, given that it appears that there is
00:21:09.000a political bias built into the system's current modality, whether or not an alliance with
00:21:17.000the alternative political party is a possibility in order to regulate and break up these monopolies,
00:21:25.000because that would seem to be the only way that it could be challenged. And that's the
00:21:30.000sort of traditional component that I'm referring to, unless you have some kind of like, other
00:21:34.000than the state or an incredibly mobilized population, even with the information that
00:21:39.000you are curating and compiling, how do you ever challenge something of this scale?
00:21:46.000It can be challenged, but the antitrust actions that are currently being used in the EU and
00:21:53.000also in the United States were actually designed by Google's legal team.
00:23:06.000In other words, you would allow other parties, other companies, high school students,
00:23:10.000you'd allow them to build their own search engine with access to Google's index.
00:23:14.000You'd end up with thousands of search engines, all competing for our attention,
00:23:19.000all trying to attract niche audiences exactly like the news media domain.
00:23:26.000That's exactly what happens in news media.
00:23:28.000That could be done simply by giving everyone access to Google's index.
00:23:35.000Google would fight it in court, of course, and we'd see what happens, but that's one way.
00:23:39.000But the only sure way that I know of to stop these companies... because they're affecting not just our elections, but our thinking, what we focus on.
00:23:52.000They're in control of what content we see, such as your content, and what content we don't see, such as your content.
00:24:01.000The only way to really stop them is through monitoring, because by monitoring, what happens is we preserve their manipulations.
00:24:12.000We can make them public 24 hours a day.
00:24:15.000We can share the findings with public officials, both in the US and other countries, and give people, give organizations, give government agencies, give political campaigns the power they need to bring effective legal action against Google, because we're talking about massive amounts of data collected in a scientifically rigorous way.
00:24:42.000I'll give you one quick example of how hard it is to find them if you don't have the data.
00:24:47.000Last year, the Republican National Committee sued Google because Google was diverting tens of millions of emails that the party was sending to Republicans, diverting all those emails into spam boxes.
00:25:19.000The point is we can monitor what they're doing, preserve the data on a very large scale that can be used in the courts and that can be used with various government agencies.
00:25:30.000And will they stop what they're doing?
00:25:46.000If they know that they're being monitored on a massive scale, 24 hours a day, worldwide eventually, by the way, we've already been approached by five other countries asking us to help set up monitoring systems.
00:25:59.000If these tech execs know that their data are being captured, that we're doing to them what they do to us and our kids 24 hours a day...
00:26:17.000They can still make billions of dollars.
00:26:19.000They don't have to at the same time be messing with our thinking, be messing with our elections, and especially be messing with our kids.
00:26:28.000One of the new areas of research that we've started is looking at data coming onto the devices of more than 2,000 children throughout the U.S.
00:26:37.000We're just beginning to look at that and our heads are spinning because what these companies are sending to kids is just unbelievable and parents are unaware.
00:26:50.000There's a kind of social engineering occurring here on a massive scale that people are unaware of.
00:27:36.000banalized dystopia described both by Huxley and, to a degree, by David Foster Wallace in Infinite Jest, a sort of corporatized cultural space where the ideologies are masked in the language of convenience and safety, with no real moral spikes, no real ideological thrusts, you know, until there are, but mostly it's kind of
00:28:11.000I suppose that in order to significantly change society, you have to change the parameters of
00:28:16.000what people regard as normal significantly. Now, one of the things that you've talked about is the
00:28:21.000possibility of dissent and the likelihood of dissent being closed down in such a space.
00:28:29.000What do you think is the role of independent media within this space?
00:28:33.000How can independent media succeed in such a highly controlled and curated space?
00:28:40.000And what do we have to do to ensure that independent voices are able to be heard in a space like this?
00:28:48.000And I'm very encouraged, by the way, by what you say about the monitoring; the effectiveness of monitoring does seem to, you know, somewhat slow and curtail the proclivities of this organisation in particular.
00:29:05.000And the possibility for sharing that tech and, you know, making Google search stuff open source.
00:29:11.000That does seem like an amazing way of dissolving that power.
00:29:15.000But what do we do in particular about the sort of news media organizations like this one that necessarily exist within a space that's controlled to that degree?
00:29:24.000Well, at the moment, you're in grave danger.
00:29:28.000At the moment, independent media of any sort are in grave danger.
00:29:32.000One of the most remarkable pieces ever written about this problem, long before, by the way, he ever became aware of my research, was written by the head of the EU's largest publishing conglomerate, the German company Axel Springer.
00:29:47.000And he published a long letter in English and in German called Fear of Google.
00:29:54.000And it was about how his company, they're in constant fear of Google and every decision they make, every business decision they make, they have to make in such a way as to not offend Google.
00:30:05.000Because when Google decides to suppress content, for example, to demote you in their search results or delete you, there's nothing you can do, there's no recourse at all, and you are now out of business.
00:30:22.000And that's the environment in which we live.
00:30:24.000So no matter what content you want to contribute to the world, and I'm speaking of you personally here, it's a whim on their part.
00:30:31.000You're literally under the influence of whims at that company about whether you can continue to get your message out.
00:30:43.000They've done this repeatedly with independent news sources.
00:30:47.000They have reduced their traffic to 10% of what it was.
00:30:51.000They can do that with the flip of a switch.
00:30:55.000And by the way, that was confirmed to me by one of the whistleblowers from Google.
00:30:59.000I'm in touch with a lot of the whistleblowers.
00:31:01.000I'm in touch with people at Google who haven't even blown the whistle yet.
00:31:05.000So I know way, way, way too much about what's going on there.
00:31:09.000But yes, at the moment, they have that power.
00:31:12.000They decide what more than 5 billion people around the world can see and cannot see.
00:31:20.000And at the moment, there is no way to counteract what they're doing.
00:31:24.000In the U.S., the courts have said over and over again, when they have, for example, shut down hundreds of websites belonging to one particular company, yes, they have that ability.
00:32:07.000In other words, even though I agree with a lot of their values because I lean left myself politically, I don't like the idea of a private company that's not accountable to us, to any public, having this kind of power.
00:32:24.000The problem is not necessarily their values.
00:32:26.000The problem is the power that they have and that they're utilizing without any accountability.
00:32:35.000To us, to any population, any group of people around the world, they're simply not accountable.
00:32:43.000I hope some of your viewers find that to be objectionable.
00:32:48.000I hope some of your viewers will go to mygoogleresearch.com because this big national monitoring system that we started setting up last year, I had raised about $3 million to get us going on it.
00:33:51.000If we don't monitor them, with no system like that in place, we will never know how they're influencing elections or kids or human autonomy.
00:34:07.000If this system is not fully running next year in the United States, with all of our data being shared with authorities and with the public every single day, if this system is not there, Google alone will be able to shift between 6.4 and 25.5 million votes in the presidential election of 2024 without anyone knowing what they're doing, without anyone being able to go back in time and look at all that ephemeral content.
00:34:51.000I need your help and your audience help in making this happen.
00:34:56.000This horrifying power that you described, already present, already active, already operating according to your data, is as yet un-augmented by a fully capable AI technology.
00:35:14.000What are your thoughts on how the AI component will advance these capacities?
00:35:20.000And what do you feel about, for example, the sort of chatbot story and the talk of sentience and, you know, the sacking of software engineer Blake Lemoine, or Lemien, or whatever his name was?
00:35:38.000It is also potentially dangerous in its own right.
00:35:43.000It will make these capabilities that they have even more powerful.
00:35:47.000For example, we just finished, in fact, I've not announced this publicly, this will be my first announcement, but we've just finished our first exploration of what we call DPE, the Digital Personalization Effect.
00:36:00.000And what we've shown is that if we show people biased content, we can produce shifts easily of 20% or more in their voting preferences.
00:36:10.000If we personalize the content, which of course Google is famous for doing, if we personalize it based on some things we know about those people and what kinds of media sources they trust and news sources and celebrities, if we personalize the content so it's coming from sources they personally trust, that shift goes up to over 70 percent, from a 20 percent
00:36:34.000shift to a 70 percent shift, just by personalizing. And AI, of course, makes it much, much easier and smoother to
00:36:41.000personalize content. That's one of the main dangers here.
00:36:46.000So the fact that these companies have always relied on AI to some extent, and now are relying on it more and more, makes them more powerful and far more dangerous.
00:36:57.000All the more reason why we have to capture the ephemeral content that they use to manipulate people.
00:37:03.000And I'm going to say it a third time, MyGoogleResearch.com, because we are desperately in need of help.
00:38:00.000It sounds important in a way that's almost difficult to conceive of.
00:38:03.000When you were talking before about the impact of personalised data, it made me realise that we're simply not evolved to live in a world where information can be curated in that manner.
00:38:20.000I imagine, I imagine, that the roots of behaviouralism must have, you know, a component that's anthropological and ethnographic: how we are evolved to relate to one another and how we're evolved to trust sources of information.
00:38:36.000How a consensus between a group is established and to have tools that can wallpaper your reality like a kind of chrome sphere surrounding your mind is...
00:38:50.000It's in a sense beyond our, it's beyond sugar.
00:38:54.000It's beyond sugar in terms of an agent of interruption, stimulation and control.
00:39:00.000So I recognize how important what you're doing is.
00:39:02.000I can hear that you're, you know, necessarily evangelical about continuing the work, because it's seismic and pertains to the cornerstones of what is as yet still called civilization, like democracy, like the judiciary, like the ability to have open conversations, like important principles around which we presumed society was being built, but for a while have suspected that, in a sense, these are simply gestures that are put in place while real power does what real power wants to do.
00:39:34.000And that kind of power with this kind of utility is truly terrifying.
00:39:40.000Can you speak for a moment about the aspect of, from a behaviouralist perspective, how we are not, you know, because in a sense, right, I'm a person, obviously, and I imagine that I'd be able to go, oh, well, I'm getting very biased information here from Google.
00:39:58.000How is it that we simply are not able to discern, tackle, remain objective?
00:40:04.000Keep some kind of distance from this experience.
00:40:08.000Why is it so powerful from a, almost from a anthropological and behavioral perspective?
00:40:15.000First of all, most people can't see bias in the content that's being presented to them.
00:40:21.000So those people are very easy to shift.
00:40:22.000And in some demographic groups, you can easily shift upwards of 80% of voting preferences.
00:40:31.000Some people are just very, very vulnerable to this kind of manipulation because they trust companies like Google, and they trust algorithmic output because they have no idea what algorithms are.
00:40:45.000They trust computers because they think computers are inherently objective.
00:40:48.000So, you've got all that working against you.
00:40:51.000And then there's another factor, which is really, really scary.
00:40:55.000In some of the big studies that we've done, there's always a small group of people who do see the bias.
00:41:02.000Now, it's a small group, but with a big enough study, you know, that group is large enough for us to look at them separately.
00:41:21.000Well, because, presumably, they're thinking, well, I can see there's bias here, and of course, Google is objective, or computer output is objective, or algorithms are objective, and it's clearly preferring that candidate over this candidate.
00:41:38.000So that candidate really must be the best.
00:41:41.000And those shifts, the shifts among the people who can see the bias are larger than the shifts among the people who can't see the bias.
00:41:50.000So, you know, there are no protections here.
00:41:53.000This is a whole new world of influence and manipulation.
00:41:57.000The only protection that I know of for sure that works is by doing to them what they do to us, by surveilling them, capturing the data so that it can be looked at carefully by authorities and courts.
00:42:14.000You know, I'll tell you something, the UK and the EU, as you know, have been far more aggressive against Google in particular than any government agency in the US.
00:42:46.000What is lacking in the EU and the UK is a monitoring system to measure compliance with whatever the new laws and regulations are; there are no monitoring systems in the EU and the UK, and Google knows that.
00:43:01.000They completely ignore all of these various agreements and orders because no one is monitoring.
00:43:08.000No monitoring means you can't measure compliance.
00:43:12.000Well, I imagine we're going to have to be pretty clear about how to find your work because I don't imagine it comes up very easily on a Google search.
00:43:22.000Dr Robert Epstein, thank you so much for joining us today.
00:43:26.000Thank you for conveying this complex information in such an easy-to-understand way, in spite of the vastness of the task and the scale of the challenge.
00:43:37.000Thanks for giving us some suggestions of what a solution might look like, and for making it clear this is something that's happening right now and how difficult it is to detect. And yet there is a way to oppose it, and I would recommend that all of you learn more about Dr. Epstein's work by going to drrobertepstein.com and mygoogleresearch.com.
00:43:58.000Doctor, thank you so much for joining us.
00:44:00.000I'm sure we'll be talking again, although these conversations might be difficult to find online.
00:44:06.000That shows you the necessity of supporting us by clicking the red Awaken button and joining and supporting our community.
00:44:14.000Without a direct connection to you, it's going to become increasingly difficult to communicate with you in a curated and controlled cultural space.
00:44:22.000On the show tomorrow, we have Glenn Greenwald.
00:44:25.000Imagine the information that he's going to be able to convey on this subject, as well as war, the pandemic, legacy media, corruption.
00:44:33.000If you do become an Awakened Wonder and join our community, and I urge you to do that, you've just heard what Dr. Robert Epstein has described, almost a necessity to do that, you'll get access to guided meditations, readings, questions and answers.
00:44:45.000And I want to thank you that have recently become new annual supporters like Truthfulergave, Barloo, Lucky Lou, Magic Peace, Love Ray, Pardon, Snuffle Dog, The Kennedys, Freddie, Flintstone, and so many more.