The Culture War #34 - Election Fraud, Big Tech And Trump 2024 w/ Robert Bowes & Dr. Robert Epstein
Summary
In this episode, we have two special guests: Robert Bowes and Dr. Robert Epstein. Bowes is a former Trump White House appointee who worked at FHA, was a banker for many years, and now works on election fraud investigation, helping some of those wrongfully accused in the Georgia indictments. Dr. Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology (AIBRT) in San Diego, California. We discuss the possibility of Google or another tech company manipulating votes in the 2024 election, whether Trump can defeat the establishment machine that is trying to keep him out of office, the impact of big tech on our voting system, and what we can do about it.
Transcript
00:01:03.760
Actually, the latest aggregate has Trump tied with Joe Biden, but he did see a major upswing.
00:01:08.800
And it looks like now with the GOP primaries, he's absolutely crushing it.
00:01:11.760
But the question is, are the elections going to be free and fair?
00:01:18.600
As well as what happened way back when in that, you know, other election.
00:01:23.480
We're going to talk about this because we're going to be entering the 2024 election.
00:01:30.120
And I hear a lot of people saying they don't think it's possible.
00:01:37.800
And even if you think in the back of your mind you can't win,
00:01:40.760
that's no excuse for backing down and letting your political opponents and evil people just steamroll through.
00:02:05.920
I'm senior research psychologist at the American Institute for Behavioral Research and Technology,
00:02:14.360
And you've talked quite a bit about Google's manipulation of our electorate,
00:02:21.600
of the people's minds, but also some other issues pertaining to Hillary Clinton 2016 and things like that.
00:02:35.380
Robert Bowes was a Trump White House appointee, worked at FHA, been a banker for many years,
00:02:42.220
but now working on election fraud investigation and helping some of those wrongfully accused in Georgia indictments.
00:02:54.920
By we, you mean right-wing conservative nutcases?
00:03:00.320
I mean, you know, if you take the mainstream media's view,
00:03:03.780
anybody who would vote for Trump is going to be some far-right MAGA extremist.
00:03:07.060
But actually, there's a lot of people who are libertarian-leaning, anti-establishment,
00:03:11.700
some people who just despise the Republican Party, but just want Trump to win for, say,
00:03:16.220
like my position is more so, Trump's more likely to fire people.
00:03:19.680
His foreign policy was substantially better than everyone else we've seen.
00:03:23.420
So I actually really don't like Republicans, and I don't consider myself conservative,
00:03:26.700
but I think Trump is one of our best bets in a long time.
00:03:30.000
I'd like to see him win, but, you know, could there be better?
00:03:33.240
Of course, it's a question of, is it possible to defeat the establishment machine,
00:03:38.020
which has got Biden fumbling around in office, maybe wants to bring in Gavin Newsom or who knows
00:03:49.520
It is impossible because Google alone has the power in 2024 to shift between 6.4 and 25.5
00:03:59.960
million votes in the presidential election with no one aware of what they're doing and without
00:04:07.320
leaving a paper trail for authorities to trace.
00:04:12.460
Between 6.4 and 25.5 million votes, and those are absolutely rock-solid numbers.
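As a back-of-envelope check on how a range like that can even be constructed, here is some illustrative arithmetic. This is not Dr. Epstein's actual methodology; the electorate size, undecided share, and reach figures are assumptions chosen for illustration.

```python
# Illustrative back-of-envelope only -- not Dr. Epstein's actual model.
electorate = 160_000_000   # assumed: rough ballots cast in a high-turnout US election
undecided_share = 0.20     # assumed: fraction of voters who are persuadable
reach = 0.90               # assumed: fraction of voters exposed to Google products

persuadable = electorate * undecided_share * reach

# Epstein reports (later in this interview) shifting 20-80% of undecided
# voters with biased results; apply that band to the persuadable pool.
low_shift, high_shift = 0.20, 0.80
print(f"low:  {persuadable * low_shift / 1e6:.1f} million votes")
print(f"high: {persuadable * high_shift / 1e6:.1f} million votes")
```

With those assumed inputs the band comes out near 6 to 23 million, the same order of magnitude as the figures quoted here.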
00:04:23.980
I know how to level the playing field, but all the attention is going to so-called voter fraud.
00:04:31.980
All that attention is going to voter fraud because Google and some other tech companies
00:04:40.060
In other words, they're making those kinds of stories go viral so that people who don't
00:04:46.980
And they're doing that deliberately so that you won't look at them.
00:04:51.860
Well, I mean, I could say, you know, potentially right now, yes, but for two years
00:04:58.000
after the 2020 election, you could not even say those words on YouTube without getting banned.
00:05:02.200
In fact, I think it was The Hill ran a clip of Donald Trump speaking at a rally where he
00:05:09.040
And then they shut down a news segment for simply mentioning it.
00:05:12.620
But if you came on and said, oh, they've got better ballot harvesting, YouTube was totally fine with that.
00:05:17.140
If you came out and said big tech censorship, Google search manipulation, they had no problem
00:05:23.020
with you saying those things, but you couldn't talk about fraud. Now you can talk about fraud.
00:05:26.960
They changed the rules a few months ago where now you're allowed to say 2020 was stolen.
00:05:32.920
But so, I mean, what's your response to that?
00:05:37.240
Whatever it is people are focusing on, you have to understand that that focus is being controlled.
00:05:43.680
So I guarantee you they're not ever allowing the focus to be on them.
00:05:51.740
So right now, for example, there's a big trial in progress, just winding down, the U.S.
00:06:00.760
That has been so completely suppressed by the tech companies themselves and their media partners.
00:06:06.260
So what I'm saying is whatever it is people are going to be talking about, they control
00:06:12.420
And whatever else they do, they're going to make sure that you don't look at them and
00:06:16.880
the kind of power that they have to shift votes and opinions, which is unprecedented in
00:06:25.120
That's what I've been studying for more than 11 years.
00:06:27.140
And I publish my work in peer-reviewed journals.
00:06:40.600
And a couple of people now have been figuring this out.
00:06:46.820
I don't know if you know that in the last few weeks.
00:06:48.820
She's saying it's big tech, big tech that we really need to worry about.
00:06:57.380
Saying all those voter fraud issues, yeah, they're important.
00:07:08.200
Because these other kinds of things that we talk about, they can shift a few votes here and there.
00:07:16.300
But if one of the big tech platforms decides to support a party or a candidate, there is nothing you can do about it.
00:07:24.740
Generally speaking, also, they're using techniques that you can't even see.
00:07:32.340
And I will tell you, at this point in time, democracy in this country is an illusion because
00:07:42.400
The 6 million to 25 million votes you're talking about, that's now or 2024?
00:07:50.740
I would submit to you that when Kari Lake says, 81 million votes, my ass, I agree with her.
00:08:00.080
I don't think that there's the real or that old school fraud, which you think is a smaller amount.
00:08:08.920
And I agree with your assertion that the tech censorship is big.
00:08:21.420
I can tell you precisely because that's when we started monitoring.
00:08:24.360
That's when we invented the world's first system for surveilling them, doing to them what they do to us.
00:08:33.060
We learned how to capture what they call ephemeral content.
00:08:40.920
2018, there was a leak of emails from Google to the Wall Street Journal.
00:08:45.880
And in that conversation that these Googlers were having, they said, how can we use ephemeral
00:08:51.920
experiences to change people's views about Trump's travel ban?
00:08:56.020
Well, my head practically exploded when I saw that because we had been studying in controlled
00:09:01.220
experiments since 2013, the power that ephemeral experiences have to change people's opinions and votes.
00:09:14.940
Most of the experiences you have online are ephemeral.
00:09:18.220
And ephemeral means fleeting, means you have the experience and then whatever was there,
00:09:24.600
the content disappears, like in a puff of smoke.
00:09:28.180
So, for example, you go to Google search engine, which you should never use, by the way, I can
00:09:33.700
But as you start to type a search term, they're flashing search suggestions
00:09:50.460
You can't go back in time and see what search results were there.
00:09:57.560
When you're on YouTube, you know those, the recommended one that's going to come up next,
00:10:06.880
The whole list of recommended videos, it's all ephemeral.
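A minimal sketch of what preserving ephemeral content can look like, assuming some capture layer already hands us whatever was on a panelist's screen. The function and field names are hypothetical, not AIBRT's actual system.

```python
# Minimal sketch: snapshot ephemeral content and append it to a log
# whose records carry an integrity hash. Names are hypothetical.
import hashlib, json, time

def preserve(platform: str, panelist_id: str, items: list[str], log_path: str) -> None:
    """Append one timestamped snapshot of on-screen ephemeral content."""
    record = {
        "ts": time.time(),      # when the content was on screen
        "platform": platform,   # e.g. "google_search", "youtube_upnext"
        "panelist": panelist_id,
        "items": items,         # the suggestions/results/recommendations shown
    }
    payload = json.dumps(record, sort_keys=True)
    record["sha256"] = hashlib.sha256(payload.encode()).hexdigest()
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")

# Usage: preserve("google_search", "p-0421", ["suggestion 1", "suggestion 2"], "ephemeral.log")
```

The point of the append-only log and the hash is exactly what is described here: content that would otherwise vanish in a puff of smoke becomes a dated, verifiable record.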
00:10:10.460
What we started doing in 2016 with a very small system at the time was preserving that content.
00:10:19.340
We found, we were looking at Google, Bing, and Yahoo.
00:10:23.560
We found pro-Hillary Clinton bias in all 10 search positions on the first page of Google search results.
00:10:38.520
But the point is that if that level of bias, because that's what our experiments look at,
00:10:45.820
they look at how bias can shift opinions and votes.
00:10:50.480
If that level of bias that we measure, that we capture, that we preserve, normally that's
00:10:55.620
never preserved, had been present nationwide in the 2016 election, well, that would have
00:11:04.540
shifted between 2.6 and 10.4 million votes to Hillary Clinton with no one knowing that
00:11:14.180
that had occurred because people can't see bias in search results.
00:11:19.660
They trust wherever it takes them to, if they're undecided.
00:11:22.860
So 2 to 10 million in 2016, you're saying 6 to 25 million in 2024.
00:11:30.000
In 2020, Google alone shifted more than 6 million votes to Joe Biden.
00:11:43.380
So I should be thrilled, but I'm not thrilled because I don't like the fact that a private
00:11:47.660
company is undermining democracy and getting away with it, and there's no restrictions on
00:11:57.800
So they do what they're doing blatantly and arrogantly.
00:12:01.020
Quick example of another ephemeral experience to show you how blatant and arrogant this is.
00:12:07.920
Okay, so we were monitoring Florida because it's one of the key swing states.
00:12:10.860
On election day, November 8th, all day long, Democrats in Florida were getting go-vote reminders.
00:12:25.680
In other words, 100% of Democrats in Florida were getting those reminders all day.
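To make the alleged mechanic concrete: a hypothetical sketch of how trivially a platform could gate such a reminder on inferred party. This is an illustration, not code from any real product.

```python
# Hypothetical illustration of partisan gating of a "go vote" banner.
def show_vote_reminder(user_profile: dict) -> bool:
    """Return True if this user should see the election-day reminder."""
    favored_party = "D"   # the operator's choice, invisible to users
    return user_profile.get("inferred_party") == favored_party

users = [{"id": 1, "inferred_party": "D"}, {"id": 2, "inferred_party": "R"}]
print([u["id"] for u in users if show_vote_reminder(u)])   # -> [1]
```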
00:12:36.260
But, you know, if you don't have a monitoring system in place to capture all that ephemeral
00:12:42.920
This is an in-kind donation to the party, to the candidates.
00:12:51.080
But Donald, President Trump beat the cheat in 2016.
00:13:03.100
When we know that there's these, I would agree with you, smaller amount of cheat, but
00:13:10.780
Well, if this is true, I mean, then Trump's popularity is-
00:13:15.020
It was a collective, what was it, like 44,000 votes in three swing states are what stopped
00:13:21.960
Now, one thing we've learned how to do, this is very recent, by the way, in our work,
00:13:26.060
we've learned how to look at an election that took place, look at the numbers, and we can
00:15:03.740
Trump won five out of what were generally considered to be 13 swing states.
00:15:09.140
If you factor out Google, Trump would have won 11 of those 13 swing states.
00:15:17.740
And easily would have won in the Electoral College.
00:15:21.640
And you have CISA, the Cybersecurity and Infrastructure Security Agency, that has said that cognitive
00:15:31.160
infrastructure is what they want to be targeting right now.
00:15:36.220
Do you guys remember the leaked video of Google employees crying when Donald Trump won?
00:15:45.220
Because they swore up on that stage, and it was all the leaders of Google up on that
00:15:49.100
stage, and they swore, we are never going to let this happen again.
00:15:55.940
Missouri v. Biden, where, you know, there's key claims in there about election,
00:16:03.320
well, censorship, obviously, but censorship extends to censoring and suppressing
00:16:09.900
Well, I've worked for years with Jeff Landry, who just became governor
00:16:15.440
of Louisiana, and I congratulated him that very day.
00:16:18.800
I thought for sure I'd never hear from him again now that he was governor, and he texted
00:16:29.560
But he's been helping me and my team for years, and he knows all about my work, and
00:16:38.960
There are few people up there in leadership positions in our country who understand.
00:16:48.300
He was involved in that Missouri case, as you probably know.
00:16:52.180
And yeah, of course we're interested in that, because the communication between the government
00:16:56.680
and Google and the gang, okay, that's very critical.
00:17:07.920
Seven federal agencies were headed by former Google executives.
00:17:13.120
Obama's chief technology officer, former Google executive.
00:17:18.000
Hillary Clinton's chief technology officer, Stephanie Hannon, former Google executive.
00:17:24.060
Two hundred and fifty people went back and forth in the Obama administration between top
00:17:38.860
You know, they took certain things for granted.
00:17:42.040
They weren't looking carefully enough at those tiny little numbers in the swing states.
00:17:47.120
And so, yes, a tiny margin in some swing states—
00:17:53.140
Put him over the top in the Electoral College, and they were kicking themselves.
00:17:59.380
If Facebook, for example, just on Election Day had sent out partisan go-vote reminders, just
00:18:07.440
Facebook, one day, that would have given to Hillary Clinton an additional 450,000 votes.
00:18:15.740
But it is possible, then, albeit very, very difficult, that if you can mobilize Trump supporters
00:18:24.840
and conservatives to an extreme degree, they can overcome that bias.
00:18:31.120
No, because Google alone controls a win margin of somewhere between 4% and 16%.
00:18:38.880
So, now, if you're telling me, well, no, we've locked this up, we've got—we can guarantee
00:18:44.200
a win margin of, I don't know, call it 12%, but that's not true in this country.
00:18:49.780
In this country, we know we're split roughly 50-50 on the vote.
00:18:54.240
So, if there's some bad actor that has the ability to shift a whole bunch of people, especially
00:19:01.760
right at the last minute, especially on Election Day, you can't counter that.
00:19:05.360
I think we're split—it's not 50—because of what you just described in terms of the
00:19:11.080
bias that President Trump overcame, we're not split 50-50.
00:19:20.000
Joe Biden is a failed candidate for many reasons, and there are some major disasters going on.
00:19:25.940
You know, you look at the policy, you know, only one or two of these things have taken
00:19:31.240
out other presidents, but if you have, you know, economy, wars, medical tyranny, two-tier
00:19:41.880
Biden's approval rating collapsed after the Afghanistan withdrawal.
00:19:44.940
So, if you apply it to a candidate, if you have a really bad candidate, that's going to, you
00:19:54.380
Can you say, can Google dial it in and say, oh, we can get 30 million, you know, we can influence
00:19:59.420
30 million people because Joe Biden's so awful?
00:20:03.040
Yeah, but then you should have gotten that red wave, and there was no red wave.
00:20:07.220
So, I published a piece in the Epoch Times that said, howgooglestoptheredwave.com, and
00:20:17.460
So, you should have had that huge red wave if you're—
00:20:20.020
So, in 2022, there should have been 30 or 40, right?
00:20:24.560
But some of that was—there was a wide variety of cheating that happened in that, and what
00:20:35.100
Even Kevin McCarthy funding people that are running against America First candidates.
00:20:40.140
He was—Kevin McCarthy was part of this problem.
00:20:46.400
It came back and has bitten him in the behind right now because the margins are so narrow.
00:20:52.040
If he had not done those things to suppress America First candidates in 2022, we wouldn't
00:20:59.820
I agree with you on the problem of Google, big tech, Google especially.
00:21:11.860
Why don't we just push Google and the gang, push them out of our elections, and push them
00:21:19.360
out of the minds of our kids because that's something we started studying, too.
00:21:23.640
When you say, can we get a win in 2024, forget the party.
00:21:32.160
And that gives you a freer and fairer election.
00:21:35.120
I think something you mentioned is the most important point.
00:21:39.080
It doesn't matter if YouTube spams nothing but Donald Trump content.
00:21:43.860
It doesn't matter what's on the front page of Reddit, or the default page.
00:21:47.800
It doesn't matter if Twitter/X and all these platforms every day slam you with pro-Trump,
00:21:54.680
If during the election cycle, because now we're in a month, election month, not election day,
00:22:06.820
That's enough because we're talking about victory margins for the for the presidential
00:22:12.140
And if it's 77,000 votes that gets Trump the victory or 42, 44,000 in 2020, all Google
00:22:18.060
has to do is blast everyone their algorithm, their AI, knows as a Democrat with don't-forget-to-vote reminders.
00:22:24.000
And then for all the Republicans, all they have to do is put make sure you're watching
00:22:27.800
the new movie today, and they can shift the percentages enough to secure
00:22:32.420
Joe Biden's win. That's what I'm trying to tell you.
00:22:35.940
But you're only talking about one little thing. Exactly, and that cost them nothing, by the way.
00:22:40.780
But how about let's back up a few months and they do the same with register to vote, right?
00:22:53.020
OK, you don't know what they're doing unless you're doing monitoring.
00:22:57.700
You don't know what... I see anecdotes of what's coming across my feed.
00:23:00.940
We're collecting our data that are admissible in court on a massive scale.
00:23:07.000
We are now monitoring big tech content through the computers of more than 12,000 registered
00:23:13.320
voters politically balanced in all 50 states, 24 hours a day.
00:23:18.460
We have collected in recent months, preserved more than 51 million, might be up to 52 today,
00:23:26.200
51 million ephemeral experiences on Google and other platforms, content that they never
00:23:33.080
in a million years thought anyone would preserve.
00:23:37.900
Every single day, we have 30 to 60 additional new people added to our nationwide panel.
00:23:44.660
And so every single day, we're recording more and more of this content.
00:23:47.980
And we've been learning how to analyze it in real time.
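A minimal sketch of what analyzing preserved content in real time could mean, assuming each item has already been labeled with a political lean. The labeling scale and the scoring are assumptions, not AIBRT's published method.

```python
# Sketch: net bias score for a batch of preserved items, where each
# item's lean is -1 (one side) .. +1 (the other), 0 for neutral.
def bias_score(items: list[dict]) -> float:
    """Mean lean across items; far from 0 means the feed skews one way."""
    if not items:
        return 0.0
    return sum(item["lean"] for item in items) / len(items)

snapshot = [
    {"platform": "google_search", "lean": -1},
    {"platform": "google_search", "lean": -1},
    {"platform": "google_search", "lean": 0},
]
print(f"net bias: {bias_score(snapshot):+.2f}")   # -0.67: skews one direction
```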
00:23:51.260
So let me tell you how you push these companies out of our elections and get them out of our kids' minds.
00:23:59.240
In 2020, we had so much dirt on Google that I decided we're going to go public before the election.
00:24:11.160
So I called up a reporter at the New York Post and I sent her a bunch of stuff, and she
00:24:25.020
You can look her up because she got fired soon afterwards.
00:24:29.240
And she read some of the pieces to me on the phone.
00:24:32.860
Now, just a few weeks before, the New York Post had broken the story about Hunter Biden's laptop.
00:24:39.980
So there, you know, and that was front page, right?
00:24:42.520
Well, this story that she was writing about the election rigging, that was going to be
00:24:49.080
So Friday, October 30th, a couple of days before the election, her editor called Google for comment.
00:25:09.380
Well, the New York Post could take on Twitter because they were only getting 3 or 4% of their traffic from Twitter.
00:25:14.480
But they were getting 45% of their traffic, ring a bell, 45% of their traffic from Google.
00:25:31.820
Google changed their search algorithm one day and his business went to zero.
00:25:40.520
This is, that's the power that this company has.
00:25:44.420
And people are, people in business are terrified of Google because Google can just put you out
00:25:51.440
They've broken up companies before. You know, Ma Bell was broken up for less.
00:26:05.520
Ebony Bowden was distraught at the New York Post.
00:26:11.900
That editor also didn't last much longer there, interestingly enough.
00:26:15.420
But I sent everything into Ted Cruz's office also.
00:26:19.680
And on November 5th of 2020, Ted Cruz and two other senators sent a very threatening letter
00:26:28.100
If you want to look at it, it's LetterToGoogleCEO.com.
00:26:33.900
It is a fabulous letter written by Cruz and his, his buddies, and it's two pages long.
00:26:41.460
And it says, you testified before Congress saying you don't mess with elections, but
00:26:52.060
On November 5th, that very day, Google turned off all of its manipulations in Georgia.
00:26:59.340
We had more than a thousand field agents in Georgia.
00:27:02.080
We, we, we preserved a million ephemeral experiences in Georgia.
00:27:06.540
This is in the two months leading up to their Senate runoff elections.
00:27:13.040
Bias in Google search, political bias went to zero, which we've never seen before.
00:28:45.020
They stopped sending out partisan go vote reminders.
00:28:52.340
This sounds like, with your data being admissible in court,
00:28:57.660
anyone in any state could have standing and file a lawsuit.
00:29:01.240
And that's why I'm working with AGs around the country right now.
00:29:04.280
That's why I'm working with Paul Sullivan, who's a very well-known D.C. attorney who used
00:29:11.080
He's helping us to prepare a complaint to the FEC about Google because we have the data
00:29:19.400
In 2022, we preserved two and a half million ephemeral experiences related just to the, you
00:29:24.300
know, in those days leading up to that election.
00:29:26.580
But now we're setting up a permanent system that's currently running 24-7 in all 50 states.
00:29:33.700
It needs to be much, much bigger so that we have representative samples and so it's court admissible.
00:29:41.000
So it was the day that Senator Cruz sent this letter out, the bias seen on Google in Georgia
00:29:53.020
And that phrase came to me from a Google whistleblower named Zach Vorhies, who you may have heard of.
00:30:03.420
He's the guy who walked out of Google with 950 pages of documents and a very incriminating
00:30:10.660
And he put it all in a box and sent it off to Bill Barr, who at that time was Attorney General
00:30:16.440
And then Google went after him with police, and a SWAT team went after Zach Vorhies.
00:30:28.840
Well, the point is, though, that Zach, what Zach did was very, very courageous.
00:30:40.580
They have the ability to turn these manipulations on and off, like flipping a light switch.
00:30:51.180
Now, just imagine a much, much larger system running 24-7 with a public dashboard, which,
00:30:59.120
by the way, you can get a glimpse of right now.
00:31:01.280
It's at americasdigitalshield.com, and it looks gorgeous.
00:31:06.500
In the securities market, there's a concept of a quiet period, you know, where there's
00:31:13.440
You can't put out press releases, or you can't say certain things, you know, 30 days, plus
00:31:19.100
Maybe there's a remedy here to say that if you contract this and they abide by it,
00:31:24.160
big tech needs to be in a quiet period for, you know, months before the election.
00:31:34.080
Oh, you're trying to get a permanent remedy to remove all bias?
00:31:40.500
We're focused on two areas, elections, which is critical, because right now, believe me,
00:31:48.060
And the second is kids, because we're collecting data now for more than 2,500 children around
00:31:55.700
the country, and we're actually looking at what they're actually getting from these tech companies.
00:32:04.220
It is so bizarre, and so weird, and so creepy, and so violent, and so sexual.
00:32:18.400
Uh, you should, you should look into this, especially with your research.
00:32:22.420
It wasn't just that, but, uh, Elsagate was the name of this phenomenon that happened several
00:32:26.400
years ago, about maybe five years ago, where adults weren't noticing this because
00:32:31.720
the feeds that we were getting are like, you know, CNN and entertainment and celebrities
00:32:42.800
Uh, this is where Elsagate comes from, of Elsa, Spider-Man, and the Joker running around
00:32:46.720
with no sound, like with, with no, with no dialogue, engaging in strange behaviors, right?
00:32:51.440
So it started with Elsa going, Ooh, and the Joker kidnapping her and then Spider-Man saving her.
00:32:56.080
The general idea was Joker, Elsa, and Spider-Man were very popular search terms in the algorithm.
00:33:01.260
And so if you combined these things in a long video, kids would watch it and they'd get high
00:33:09.060
It devolved into psychotic amalgamations of Hitler with breasts and a bikini doing Tai Chi while
00:33:22.580
And then it started, uh, you started getting these videos where the thumbnails were people
00:33:30.260
And this was being given to toddlers and children on YouTube.
00:33:35.200
And, uh, there were a lot of, you know, amateur internet sleuths who started investigating.
00:33:40.540
The general idea was that this section of YouTube was completely overlooked or ignored,
00:33:47.100
But what happened was parents would select a nursery rhyme on the, on a tablet and give
00:33:52.240
the tablet to a baby, put it in front of them being like, there, I'll get a few minutes of peace.
00:33:56.880
The baby watches a very innocent nursery rhyme video, but the next up video would slowly move
00:34:02.560
in the direction of this psychotic algorithmic nightmare to the point where, like I mentioned,
00:34:08.460
the nursery rhyme, it was Finger Family, where a hand would pop up showing Hitler's head on
00:34:14.220
And then on another finger, Hitler with breasts.
00:34:16.700
I am not kidding, in a bikini, doing Tai Chi with the Incredible Hulk.
00:34:20.180
And then eventually videos where like Peppa Pig was being stabbed mercilessly with blood
00:34:26.000
Pregnant women were eating feces and getting injections while it was happening.
00:34:28.820
And because these videos started doing well, it actually resulted in human beings seeing
00:34:34.580
the success of these videos, giving their daughters, and this is in like Eastern Europe
00:34:39.300
and Russia, videos going viral where a father lays his daughter down and gives her an injection
00:34:47.600
And eventually there was a massive backlash and people realized this was happening.
00:34:55.100
And then they said, we're going to roll out YouTube Kids and we're going to be very safe
00:34:59.600
But this is something I don't know if you think was intentional or was just a byproduct
00:35:05.920
First of all, you're talking about it all in the past tense.
00:35:11.700
Also, you're talking about it largely from the perspective of anecdotes.
00:35:20.060
We have a larger and larger group of people that are politically balanced.
00:35:27.420
And by the way, we're not out there searching for crazy stuff on YouTube.
00:35:34.240
We're actually collecting the videos, hundreds of thousands of videos, that our
00:35:41.340
Plus we've learned that 80% of the videos that kids watch are suggested by that algorithm.
00:35:49.780
Think of the power you have to manipulate just because of that algorithm.
00:35:54.860
A friend told me, I know this is all anecdotal, but I do think it's the
00:35:58.800
anecdotes that I'm referring to: people started to notice something, and then you have
00:36:04.060
A friend told me that her kid was watching the Disney Channel and
00:36:09.080
an anti-Trump commercial came on and she was like, what the, why?
00:36:14.420
Because powerful interests are slamming the battlefield in this way.
00:36:19.440
I think what you're talking about is the Kraken.
00:36:22.380
Well, maybe I shouldn't call it that because you know, Sidney Powell.
00:36:25.920
Look, look, let me, let me give you an example.
00:36:28.780
A parent walking by their kid's tablet, let's say, wouldn't even notice that anything was wrong.
00:36:34.740
But we're collecting the actual videos and here's what happens.
00:36:43.380
So this, you know, relates to what you said just a few minutes ago: a weird cartoon, but
00:36:47.960
then all of a sudden, boom, something crazy happens.
00:36:51.300
There's a shriek and a head flies through the air and there's blood everywhere and then
00:37:03.220
And what we're finding is, well, first of all, 80%, that's rock solid.
00:37:09.120
80% of the videos that little kids are watching, those are all suggested by Google's algorithm.
00:37:14.120
Are you monitoring the gamings like even Roblox?
00:37:17.140
I mean, I've seen some kind of unusual things in Roblox, but that could be crowdsourced,
00:37:22.220
you know, individuals putting up their own little games or characters.
00:37:26.760
Look, practically every day now we're expanding the system.
00:37:30.700
And so we're monitoring more and more different kinds of content.
00:37:41.520
That's why monitoring systems have to be a permanent part of not just our country and
00:37:47.460
our culture, but really everywhere in the world outside of mainland China.
00:37:53.860
And we have, we've been approached by people from seven countries so far.
00:37:57.700
The last two are Ecuador and South Africa, begging us to come help them set up these systems.
00:38:05.800
And here is the only, only area where I've ever agreed with Trump on this issue.
00:38:12.120
We've got to, we've got to develop our own full system that's operating, you know, and
00:38:18.380
it has to be, you have to have representative samples.
00:38:21.200
This all has to be done very scientifically so that this is court admissible in every state.
00:38:25.680
That's how you push them out because you make them aware through public dashboards, through
00:38:32.680
press releases, uh, through data sharing with certain key journalists, members of Congress,
00:38:44.920
I want, I want to go back to that one point that you just made.
00:38:47.160
So what you're saying is that there are innocent-looking videos, the thumbnail may just be a
00:38:54.640
It's 15 minutes long, but then at eight minutes and three seconds, all of a sudden a head pops
00:39:01.160
Add to that the fact that if you, if you mouse over the bottom of that video, you can actually
00:39:06.960
see, you know, the frequency with which that part of the video is being viewed.
00:39:12.080
And very often now we're seeing a spike right at that point where that crazy brief thing happens.
00:39:24.520
They're watching those parts over and over and over and over and over again.
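That spike is detectable mechanically. A sketch, assuming the replay-frequency data were available as an array; platforms expose it only as the visual overlay described above, so the data shape here is an assumption.

```python
# Sketch: flag video segments replayed far more often than the norm.
def replay_spikes(heatmap: list[float], factor: float = 3.0) -> list[int]:
    """Return indices of segments replayed more than factor x the median rate."""
    median = sorted(heatmap)[len(heatmap) // 2]
    return [i for i, rate in enumerate(heatmap) if rate > factor * median]

# A 15-minute video in 1-minute segments; segment 8 ("eight minutes in")
# is the anomaly, echoing the example in the conversation.
heatmap = [1.0, 1.1, 0.9, 1.0, 1.0, 0.8, 1.0, 0.9, 6.2, 1.0, 0.9, 1.0, 1.0, 0.8, 0.9]
print(replay_spikes(heatmap))   # -> [8]
```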
00:39:28.120
Is there a way to find one of these right now or is it, is it buried in YouTube?
00:39:32.380
I mean, well, first of all, if you just scroll
00:39:45.140
Like, yeah, but that's going to be a carousel showing images that we're collecting in real time.
00:39:54.160
So, so in other words, you're, you're going to see actual real.
00:40:00.620
And you're going to be able to click on them and that's going to take you to the videos.
00:40:13.120
This is so nuts because on the left you see the, the political leaning of every state and
00:40:22.040
But on the right, what we're showing you is the bias on Google search results, in content
00:40:34.760
I don't know if you want to continue on the, on the, the, the indoctrination or the subliminal,
00:40:40.460
you know, messaging to children, but, but with respect to elections, you think people are
00:40:47.020
Is there a cap to how many, if Google tries to do 10 or 20 million more people, is there
00:41:04.760
No, I mean, no, I laugh because the discoveries that we've been making all these years have
00:41:13.100
The more we learn, the more concerned we've all become.
00:41:16.420
For example, when we first did a nationwide study in the U.S., we had more than 2,000 people
00:41:22.800
They were being shown biased content on our Google simulator, which we call
00:41:32.800
And, uh, usually people can't see the bias. Now, where there's
00:41:41.020
biased content, we can shift people in one direction or another, whichever way we want to, because
00:41:47.500
Uh, we can easily shift between 20 and 80% of undecided voters.
00:41:54.180
And that's with one search; with multiple searches, that number goes up.
00:43:34.640
It's an information war directed at the American people.
00:43:37.760
It was large enough so we did have a few people, about 8%, who could see the bias.
00:43:43.800
So we were able to look at them separately because the study was so large.
00:43:47.720
And you would think we're not going to get an effect with those people.
00:43:53.180
They shifted even farther in the direction of the bias.
00:43:56.640
So seeing the bias doesn't protect you from the bias because there's this trust people
00:44:01.800
have in algorithms because they don't know what they are and computer output because they
00:44:07.940
And so they think if the algorithm itself is saying this is the better candidate, it really
00:44:16.480
Seeing the bias does not protect people from the bias.
00:44:22.840
So right now we're studying things that are new for us.
00:44:26.340
We're studying what happens if one of these platforms is using one of these manipulations
00:44:37.320
And so far what we're seeing is that the effect is additive.
00:44:44.300
So you do kind of hit an asymptote, but the point is just by repeating these, notice if
00:44:50.340
you expose people to similarly biased content, the numbers go up.
00:44:55.380
The shift gets bigger, but you're right, the rate goes down.
00:44:58.320
Now, the other thing we're starting to look at is multiple platforms.
00:45:01.320
What happens if three or four of these big platforms in Silicon Valley are all supporting
00:45:08.460
And we're seeing initially in our initial work, again, those are additive effects.
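A toy model of that additive-but-saturating pattern: each extra exposure or platform adds to the shift while the total approaches a ceiling. All parameters here are assumed for illustration, not AIBRT's measurements.

```python
# Toy model: each exposure closes a fixed fraction of the remaining
# gap to a ceiling, so shifts add up while the rate of growth falls.
def cumulative_shift(per_exposure: float, exposures: int, ceiling: float = 0.8) -> float:
    shift = 0.0
    for _ in range(exposures):
        shift += per_exposure * (ceiling - shift)   # diminishing marginal effect
    return shift

for n in (1, 2, 3, 5):
    print(n, round(cumulative_shift(0.30, n), 3))
# 1 0.24, 2 0.408, 3 0.526, 5 0.666 -- bigger with repetition, slower each time
```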
00:45:13.260
So this is scary stuff because way back, remember Eisenhower's famous speech from 1961,
00:45:25.200
But that same speech, Eisenhower was warning about the rise of a technological elite that
00:45:32.400
could control public policy without anyone knowing.
00:45:41.420
There was a hearing with Twitter back when it was still Twitter, and I think one of the
00:45:46.320
most important things that was brought up was that, I can't remember who it was, but
00:45:49.900
one of the members of Congress said, if you go onto Twitter and create a new profile right
00:45:54.080
now, it shows you all the suggested follows are Democrats, no Republicans.
00:45:59.440
So that means as soon as you sign up, you say, I don't know, I'll follow this person,
00:46:03.600
You're being slammed by pro-Democrat messaging.
00:46:10.240
That's not even, you know... now we've seen this big push towards a switch from reverse chronological to algorithmic feeds.
00:46:17.160
And perhaps people don't realize what the true power is behind that and why they want
00:46:23.260
For one, they can make more money with it for sure, but it takes away your ability to choose what you see.
00:46:30.860
When Threads launched, this is Instagram's version of Twitter or whatever.
00:46:34.000
Worst platform ever, in my opinion, because I'm like, OK, I'm on Instagram.
00:46:41.640
I was getting a bunch of Democrats in my feed, which was strange.
00:46:46.680
And I was getting weird entertainment stuff and weird jokes that meant nothing to me.
00:46:51.760
But their default position was we're going to tell you what to look at.
00:46:55.400
And I wonder if what they were doing was intentionally trying to create a platform.
00:47:03.120
Twitter very much is biased, even working with intelligence agencies with secret back
00:47:07.380
doors for removing content, with lawsuits now underway and already resolved proving this.
00:47:14.000
Zuckerberg's like, we're going to launch an alternative.
00:47:15.960
The the the innocent take on this is, well, you know, they see a market opportunity.
00:47:21.960
I think they realized, oh, one of our key assets in this manipulation has just fallen to someone
00:47:30.280
So Threads rolls out a heavy-handed algorithmic feed, and it got a wave of complaints.
00:47:36.780
Now they're saying we're going to pull that back a little bit.
00:47:38.660
But for the longest time, Instagram has not been reverse chronological.
00:47:42.820
Reverse chronological, for those that don't understand it:
00:47:45.480
You see on your feed the latest thing that someone posted.
00:47:50.300
And so if your friend posts now, you'll see it.
00:47:52.620
But if your friend posted three hours ago, it's already long gone.
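The difference between the two feed types reduces to a single sort key. A sketch with assumed post fields:

```python
# Reverse chronological shows what's newest; an algorithmic feed shows
# what the ranker scores highest. Post fields are assumed for illustration.
posts = [
    {"id": "a", "ts": 1700000300, "engagement_score": 0.2},   # newest
    {"id": "b", "ts": 1700000200, "engagement_score": 0.9},
    {"id": "c", "ts": 1700000100, "engagement_score": 0.5},   # oldest
]

reverse_chrono = sorted(posts, key=lambda p: p["ts"], reverse=True)
algorithmic = sorted(posts, key=lambda p: p["engagement_score"], reverse=True)

print([p["id"] for p in reverse_chrono])   # ['a', 'b', 'c'] -- recency decides
print([p["id"] for p in algorithmic])      # ['b', 'c', 'a'] -- the platform decides
```

With the first sort, the user's own social graph determines the feed; with the second, whoever defines the score does.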
00:47:54.980
The argument from these big tech platforms is, oh, but what if you like that three hour
00:48:05.260
I get weird posts and I'm scrolling through my feed of things I don't care about.
00:48:12.580
How long do you linger on this post versus this post?
00:48:17.440
But what they're also doing is using that as the argument.
00:48:20.900
They're going to start seeding you information to control what you think.
00:48:24.220
And I got to be honest, this UFO that we got sitting right here, I don't think anybody
00:48:26.860
can see it, but this UFO, I got it because of Instagram.
00:48:37.680
But what you don't see is that sometimes it's not an ad.
00:48:40.240
It's a post from someone saying, did you know about bad thing from this person?
00:48:51.800
These companies have another advantage over all the usual, the traditional dirty tricks,
00:48:58.480
which are inherently competitive and they don't bother me that much because they're inherently
00:49:02.040
competitive, but the point is these companies have another advantage, which is they know
00:49:13.200
They know down to the shoe size of those people.
00:49:18.960
So they can concentrate and in a manner that costs them nothing, they can concentrate just
00:49:26.840
So talk about swing states, swing counties, swing districts.
00:49:31.460
Well, here we're talking about, they know who the swing people are.
00:49:34.720
So the political world, they do it all the time, try to identify based on voting histories.
00:49:46.360
Are they looking at getting everything off their phone to figure it out?
00:49:50.460
Well, you and I have been using, maybe not Tim because he looks a little bit younger than
00:49:56.560
us, but you and I have been using the internet for 20 years.
00:50:03.300
Late 80s when I was a little kid, we had CompuServe.
00:50:06.200
Well then, I hate to tell you, but Google alone has more than 3 million pages of information on each of us.
00:50:19.640
They're monitoring everything you do, not just if you're stupid enough to use their surveillance
00:50:25.040
email system, which is called Gmail, or their surveillance browser, which is called Chrome,
00:50:30.640
or their surveillance operating system, which is called Android.
00:50:39.340
But they not only are doing that, they're actually monitoring us over more than 200 different platforms.
00:50:50.660
So, for example, millions of websites around the world use Google Analytics to track traffic
00:51:04.040
But the point is, Google Analytics is Google, and according to Google's terms of service
00:51:10.680
and privacy policy, which I actually read over and over again, whenever they make changes
00:51:15.400
in it, if you are using any Google entity of any sort that they made, then they have
00:51:24.420
So, you are being tracked on all of those websites by Google.
00:51:28.300
Every single thing you do on those websites is being tracked by Google.
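The cross-site mechanic is easy to sketch: every site embedding the same third-party snippet reports each visit to one shared collector, keyed on one client identifier. This is a hypothetical illustration, not Google Analytics' actual wire format.

```python
# Hypothetical: what a shared third-party collector accumulates when
# unrelated sites all embed the same analytics snippet.
from collections import defaultdict

collector: dict[str, list[str]] = defaultdict(list)   # client_id -> pages visited

def analytics_beacon(client_id: str, page_url: str) -> None:
    """What the embedded snippet effectively does on every page load."""
    collector[client_id].append(page_url)

# Three unrelated sites, one tracker, one browser:
analytics_beacon("cid-42", "https://news-site.example/article")
analytics_beacon("cid-42", "https://shop.example/cart")
analytics_beacon("cid-42", "https://health-forum.example/thread/123")
print(collector["cid-42"])   # a cross-site browsing trail, in one place
```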
00:51:33.320
I mean, you know about Facebook shadow profiles.
00:51:45.340
To everybody who's listening: you have a Facebook profile, even if you never signed up.
00:51:59.300
I'll give the simple version and throw it to Dr. Epstein, who knows better than I.
00:52:08.680
Hey, would you like to add your friends and family through your phone book?
00:52:16.820
She's never signed up, but she does have a shadow profile.
00:52:18.960
When you sign up and say, import my friends, it then finds in your phone book, mom, 555-1234.
00:52:31.440
What happens then is all those little bits of data, Facebook then sees that and says,
00:52:38.700
We know from public data on the phone number, mom's name is Jane Doe.
00:52:43.440
Now they've compiled a profile on someone who never signed up.
00:52:46.440
Your mom, her friends, her family, where she works, her salary, all that information from
00:52:52.140
And you probably know better than I do, so I don't know if you want to elaborate.
00:52:54.240
Well, because from that point on, once that has been set up, information continues to flow in
00:53:02.740
So that profile becomes, over time, immense, just as all these profiles are immense.
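A sketch of that shadow-profile mechanic, in which contact lists uploaded by people who did sign up are merged into records about people who never did. Purely illustrative; this is not any real service's schema.

```python
# Hypothetical shadow-profile store keyed on phone number.
from collections import defaultdict

shadow_profiles: dict[str, dict] = defaultdict(
    lambda: {"names_seen": set(), "known_contacts": set()}
)

def import_contacts(uploader: str, phone_book: dict[str, str]) -> None:
    """Each upload enriches the profile of every contact in the book."""
    for name, number in phone_book.items():
        profile = shadow_profiles[number]
        profile["names_seen"].add(name)          # name variants accumulate
        profile["known_contacts"].add(uploader)  # so does the social graph

import_contacts("alice", {"Mom": "555-1234"})
import_contacts("bob",   {"Jane Doe": "555-1234"})   # same person, second source
print(shadow_profiles["555-1234"])
# One non-user, two uploads: her names and connections are filling in.
```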
00:53:11.620
So it means that they know who's going to vote, who's not going to vote, who's made up their minds.
00:53:25.380
That gives them an advantage, which no campaign manager has ever had in history, because they
00:53:35.380
Is Google using that to influence who they want to influence, or are they selling it to
00:53:42.120
No, they're doing it themselves because they have a very, very strong political culture.
00:53:47.560
And so they have their own agenda, which they are trying very hard to spread around the
00:53:54.480
world, and they're impacting right now more than 5 billion people every single day.
00:54:00.440
One of the leaks from Google a couple years ago was an eight-minute video called The Selfish Ledger.
00:54:06.500
If you type in, please don't use Google to do this.
00:54:09.940
Use the Brave search engine, anything but Google.
00:54:19.980
Type in Selfish Ledger, and you will get to a transcript I made of this eight-minute film
00:54:25.000
that leaked from the Advanced Products Division of Google.
00:54:27.980
Google, and this video is extraordinary because this video, which was never meant to be seen
00:54:35.440
outside the company, is about the ability that Google has to re-engineer humanity.
00:54:41.000
They call it behavioral sequencing, and they do have that ability, and they're exercising it.
00:54:48.620
So they know more about us than we know about ourselves.
00:54:57.340
That's why Google has, for many years now, been investing in DNA repositories.
00:55:07.040
That was set up by one of the spouses of one of the founders.
00:55:10.360
So the DNA information becomes part of our profiles, in which case they know about the
00:55:14.740
diseases we're likely to get, and they can start to monetize that information long before
00:55:21.980
They also know which dads have been cuckolded, by the way.
00:55:31.520
They own Fitbit, so they're getting physiological data 24 hours a day.
00:55:35.420
They benefited tremendously from COVID, so much so that it kind of makes me wonder whether
00:55:42.100
But they benefited from COVID because of COVID and their cooperation with the government
00:55:50.840
They got access to hospital data for tens of millions of Americans.
00:55:56.260
So they got access to medical records, which they've been after for a long time.
00:56:02.260
They bought the Nest smart thermostat company a few years ago.
00:56:05.600
The first thing they did without telling anyone was put microphones inside of some Nest products.
00:56:12.320
So now they have microphones in people's homes, millions of homes, and they start to get patents.
00:56:19.040
I have copies of them, patents on new methods for analyzing data inside a home so that you can make reasonable inferences
00:56:30.220
about whether the kids are brushing their teeth enough, what the sex life is like, whether there are arguments taking place.
00:56:36.620
All of that, of course, can be monetized, but also it becomes part of our profiles.
00:56:43.040
And that information is used to make predictions about what it is we want, what we're going to do, whether we're going to vote, whether we're undecided.
00:56:50.380
And it gives them more and more power to manipulate.
00:56:54.420
So I'm going to give you a glimpse of one of our newest research projects, data that we just got.
00:56:59.600
So this will be just an exclusive for your show.
00:57:03.280
Okay, and this is called DPE, digital personalization effect.
00:57:09.200
We've been studying the impact that bias content has on people.
00:57:14.020
We've been doing that since whatever it is, 2013.
00:57:17.460
But now in the new experiments, we've added personalization.
00:58:48.920
So we're comparing what happens if we send people biased results or biased content of any sort.
00:58:57.980
And we already know the shifts we're going to get that way.
00:59:03.240
So based on the way someone answers questions in the beginning, we either are or are not sending them content from news sources and talk show hosts and celebrities that they trust.
00:59:17.600
And if they're getting the same content, but it's from trusted entities, trusted sources, that can triple the size of the shift we get in voting preferences.
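A toy model of that result as stated here: the tripling factor comes from the interview, while the base shift and the simple multiplicative form are assumptions for illustration.

```python
# Toy model of the digital personalization effect (DPE) as described:
# the same biased content from a trusted source shifts preferences more.
def expected_shift(base_shift: float, from_trusted_source: bool) -> float:
    trust_multiplier = 3.0 if from_trusted_source else 1.0   # "triple the size"
    return base_shift * trust_multiplier

base = 0.10   # assumed: 10-point shift from generic biased content
print(f"generic source: {expected_shift(base, False):.0%}")   # 10%
print(f"trusted source: {expected_shift(base, True):.0%}")    # 30%
```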
00:59:36.640
It's going to take a long time for us to work out all the details.
00:59:41.520
These companies are not only sending biased content to satisfy their agenda for humanity.
00:59:50.480
They're sending personalized content to everybody.
00:59:54.940
Do you know this big trial that's in progress right now?
00:59:57.640
A couple days ago, a Google executive said under oath, we don't make use of the massive amount of information we have about everyone.
01:00:06.020
Well, how are they sending out personalized content to everyone if they're not using it?
01:00:11.180
So I'm wondering if this algorithmic control, these ephemeral experiences, I don't know, can they overcome reality, right?
01:00:26.420
But Joe Biden does bad thing, bad thing, bad thing, bad thing, bad thing.
01:00:28.840
Eventually, the news gets out and they can't stop what is actually happening, right?
01:00:33.720
There are certainly limits on what they can do, but you'd be surprised at how few limits there are because there are no constraints on them.
01:00:43.120
There are constraints on newspapers and magazines.
01:00:47.480
We're used to looking at sources where there are constraints.
01:00:52.240
I mean, think of the things that you don't see in newspapers, right?
01:00:59.420
In fact, there's so much weird stuff that's just not in traditional media sources that we just don't give it a second thought.
01:01:06.740
I think that if a child gets access to adult content on YouTube, then YouTube's executives should be criminally charged immediately.
01:01:16.120
Obviously, if you and several of their employees, I mean, indictments, 2,000, 3,000 people instantly.
01:01:23.080
If you had a child walk into an adult bookstore and they let him in and started letting this kid look at this stuff.
01:01:28.780
Yeah, there's going to be criminal charges, going to be civil suits.
01:01:31.160
It is a violation of state law outright to allow children to get access to this material.
01:01:38.540
That's what I'm trying to tell you is that there are no constraints on them.
01:01:45.800
We have Section 230 of the Communications Decency Act of 1996, which prevents us, for the most part, from suing them for any content at all that they post on their platforms.
01:01:59.440
Now, that was meant as a way to help the internet to grow faster, which made some sense at the time.
01:02:13.180
Yeah, there needs to be a deep assessment as to what it's supposed to be doing because it's not doing what it should be doing and it's allowing protections in bad ways.
01:02:20.800
Well, the point is that the arrogance they have stems in part from the fact that there really are no constraints.
01:02:28.480
So, you know, we have these two kinds of sources of information in our world today.
01:02:34.500
One is the traditional sources where there are lots of constraints, period.
01:02:37.980
And then there's the internet where there are no constraints.
01:02:44.660
And especially lately, I'm getting more and more concerned about the way it's affecting kids.
01:02:50.240
Because there's a lot of mysterious things happening with kids that parents just cannot figure out.
01:02:56.200
We're now on the verge of being able to figure it out because it has to do with this weird content that these companies are sending to kids.
01:03:10.360
I think that they're sending out this particular kind of content for particular reasons.
01:03:17.940
In fact, I was sure you were going to ask me and you didn't ask me.
01:03:21.020
Why would you suddenly in the middle of an innocuous cartoon insert something that's just ghastly and horrible?
01:03:34.660
It's called negativity bias, which is a great term used in several of the social sciences.
01:03:41.120
It's also called the cockroach in the salad phenomenon.
01:03:43.780
So you have a big, beautiful salad in a restaurant, and then all of a sudden you notice there's a cockroach in the middle.
01:03:55.280
Now, you could eat around the cockroach, but no.
01:03:59.220
So in other words, we are built so that if there's something negative and horrible and possibly threatening, all of our attention is drawn to it.
01:04:13.180
We're built that way, and evolution made us that way because that makes sense, right?
01:04:17.780
If there's something out there that's a little scary...
01:04:23.580
Now, if you had a plate of sewage, and then you put a nice piece of See's candy from California in the middle of it,
01:04:30.940
So there is no corresponding effect for positive things.
01:04:35.440
But for negative things, I think that's one of the reasons why we're seeing what we're seeing is because they're trying to addict kids more and more to stay on the platform.
01:04:49.540
They want them staying on the platform, and they want them coming back over and over again more and more frequently.
01:04:56.020
I think that's one of the reasons why they're putting in these ghastly moments.
01:05:05.140
So when you're driving on the highway, and there's an accident, you can't take your eyes off of the accident.
01:05:12.100
And you're trying to keep your car in a straight line, and you can't even keep it because so much attention is drawn to that.
01:05:19.560
Well, I think it has to do with we want to know what happened.
01:05:23.080
And the reason we do is because, evolutionarily, it would benefit us, when we see some kind of dangerous circumstance, to understand as much of it as we can.
01:05:34.480
We are more likely to survive if we are doing that.
01:05:36.900
Well, little kids have those same built-in tendencies.
01:05:45.140
They want to understand why they're feeling like it's so crazy right now.
01:05:48.960
Yeah, and they're forming their belief systems, too.
01:05:50.820
I think it's not just the weird things that pop up.
01:05:57.800
And I have five kids myself, and there's nothing more important to me in this world than my kids.
01:06:03.560
So I'm always hoping Google will leave them alone because I've had threats.
01:06:10.720
Take away their tablets, their phones, no computers?
01:06:14.120
I mean, there are people who work with me who've been in danger, and I've had actual threats.
01:06:19.220
Since 2019, that's when I testified before a Senate committee, and that same summer, I also did a private briefing for a bunch of AGs.
01:06:36.100
When I was finished, I went out in the hallway.
01:06:41.980
He came up to me, and he said, Dr. Epstein, I don't want to scare you.
01:06:45.640
He said, but based on what you're telling us, I predict that you're going to be killed in some sort of accident in the next few months.
01:06:54.100
Now, obviously, I'm here, and I wasn't killed in some sort of accident in the next few months, but my wife was.
01:07:04.720
And she lost control of her little pickup truck, and she spun out on the freeway, and she got broadsided by a semi-tractor trailer.
01:07:20.940
It was never examined forensically, and it disappeared from the impound yard, and I was told it ended up somewhere in Mexico.
01:07:29.700
Well, is that what leads you to believe that you think there was foul play?
01:07:32.800
Well, what I believe is I will never know what happened.
01:07:42.260
I heard her last heartbeat, and I will never really know what happened.
01:07:48.300
Afterwards, when I was starting to recover, which I still haven't really fully done,
01:07:55.220
but afterwards, my daughter showed me how to use Misty's phone to get all kinds of stuff.
01:08:05.780
And one thing I found on there was her whole history of movement.
01:08:11.760
Every single place she had been, it tracks, and it shows exactly what the addresses are and how many minutes she's at each place.
01:08:18.760
Among other things, that tells me that if someone wanted to mess with her brakes or some electronics in her vehicle,
01:08:25.720
they knew exactly where that vehicle was the night before.
01:08:35.580
I think it's probably even earlier than this, but I'm pretty sure, like, a 2012 or earlier model?
01:08:45.880
These are fully capable of being remote controlled.
01:08:49.080
So a lot of the modern power steering, I was surprised to learn this.
01:08:53.420
This is, I think like 10 years ago, you had these very renowned cyber researchers,
01:08:58.640
cybersecurity researchers who were able to remotely hack a car and control it.
01:09:02.460
And the first thing I thought was, how do you remote control?
01:09:05.940
You've got to have a mechanism by which you can actually move the steering wheel without hands.
01:09:11.540
I understood power steering existed, but I didn't realize that there were actual motors within the steering wheel
01:09:17.060
that can move it without physical kinetic input.
01:09:20.220
Sure enough, these researchers found that there was a way to remotely access the car's systems through, like, a very narrow communication channel.
01:09:33.320
And they were able to, this famous video, Wired did this whole thing on it,
01:09:36.620
where they're sitting in the back seat and they have a tablet or a computer or whatever,
01:09:39.860
and they're making the car stop and accelerate and move all remotely.
01:09:43.340
And that's when I, that's around the time I think most people learned that the steering systems are already electronic and automated
01:09:51.880
and digital inputs can shift this if someone can input code into the system.
01:09:59.860
Now, you quite literally have automatic cars, which means you get into your robo car,
01:10:05.380
the doors can lock and not open, and it can just drive itself off a cliff.
01:10:09.600
The difficulty here, though, is everyone's going to ask,
01:10:13.140
was it the self-driving capability that resulted in this freak accident happening?
01:10:16.620
But in this time period, without getting into specifics, because they involve, you know, people's families,
01:10:20.960
there are stories of individuals working on very serious foreign policy stories,
01:10:25.940
going 100 miles an hour down the road and slamming into a tree and the car explodes.
01:10:31.580
Without getting into specifics, there are stories related to this where the individual in question said
01:10:37.160
they thought their car was tampered with, and asked someone if they could borrow their car because they feared their own wasn't safe.
01:10:42.360
And then shortly after, their car, at 100-plus miles an hour, slams into a tree.
01:12:16.700
This is the scary thing about disrupting any kind of system.
01:12:24.560
It's really hard to build a massive complex system.
01:12:27.480
It's really easy to throw a twig into the spokes of a bike and have it flip over.
01:12:32.340
You get a massive machine with millions of moving parts, and someone drops a marble into it somewhere, and the whole thing grinds to a halt.
01:12:40.660
You take a look at someone who's working on, say, uncovering a massive mechanized system.
01:12:54.160
People think that assassinations and these things are always going to be some like,
01:12:59.220
a strange man was spotted coming out of a dark alley with a trench coat on.
01:13:02.740
And we heard a few gunshots before the man jumped in a black van and sped off.
01:13:09.060
Oh, remember that guy who was working on that big story?
01:13:18.280
You know, if you import social credit systems into the algorithms for controlling your life,
01:13:26.040
your car, your driving, you know, you might have, just like you have suppression in your
01:13:30.820
social media sites, you may have suppression in your function, your ability to spend money
01:13:36.180
from your own bank, your ability to drive your own car, right?
01:13:41.000
You know, they might want to, they've been talking about electronically putting blockers on the guns.
01:13:53.800
And there are pros and cons to this, smart guns, where it requires handprint sensors so that
01:14:00.760
it can only be used by the individual who's programmed for it.
01:14:03.940
The bigger question is, is it connected to the internet?
01:14:05.900
In which case, people in power can bypass your restrictions.
01:14:09.140
And then what you'll end up with, you as a home user, trying to use your weapon, one
01:14:13.920
day waking up to find you can't use it, but the authorities can.
01:14:18.880
I want to make a plea, and I'm going to see if you'll even let me repeat the plea, maybe
01:14:27.780
We need help building this system that we're building, which we're calling America's Digital Shield.
01:14:36.700
So if you go to techwatchproject.org, you can learn about the project.
01:14:40.120
But I want to send people to one place, mygoogleresearch.com, because that summarizes everything.
01:14:51.400
And what we're asking people to do is to sponsor a field agent.
01:15:00.220
I mean, we have to approach 100 people before one person will say, yes, I'll do that.
01:15:04.980
You can put special software on my computer so you can monitor the content.
01:15:08.940
By the way, we don't violate anyone's privacy when we do this, because their data are being
01:15:13.100
transmitted to us 24 hours a day without any identifying information.
01:15:17.100
Same with the data coming from their kids' devices.
01:15:20.300
So we're doing the opposite of what Google does.
01:15:22.080
We're only looking at aggregate data, not individual data.
01:15:29.860
Just like the Nielsen families, which are used to make the Nielsen ratings, they get paid very little.
01:15:36.400
They're doing this as a kind of public service.
01:15:38.940
Our field agents, and we now have more than 12,000 in all 50 states, politically balanced,
01:15:44.080
all registered voters, we only pay them $25 a month.
01:15:47.980
But if you take 12,000 times 25, what does that come out to?
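(For reference, the arithmetic he leaves hanging, a quick sketch using only the figures quoted in the conversation:)

```python
# Panel cost at the figures quoted above: 12,000 field agents at $25/month each.
agents = 12_000
monthly_rate_usd = 25

monthly_cost = agents * monthly_rate_usd
print(f"${monthly_cost:,} per month")      # $300,000 per month
print(f"${monthly_cost * 12:,} per year")  # $3,600,000 per year
```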
01:16:00.580
So we're talking about something that's very expensive.
01:16:03.740
And the only way we can really make this into a permanent project is if we have tens of thousands of sponsors.
01:16:10.320
We've had about, in the last two weeks, we've had about 150 people, which is great because
01:16:14.860
we haven't really been publicizing this, step up and sponsor a field agent.
01:16:19.740
So if you go to mygoogleresearch.com, okay, there are donation links, and it's
01:16:26.200
all tax deductible, completely tax deductible, because we're a 501(c)(3).
01:16:30.120
And you click and then put in 25 and put in monthly.
01:16:34.960
And as I say, we've had 150 people do this in the past week or two, but we need tens of thousands.
01:16:41.500
And so there's my plea, and I'm going to try to repeat it another time.
01:16:53.420
I just happen to be the first one to have built it, but if someone else wants to build one, fine.
01:16:57.660
But you have to have this kind of system in place.
01:17:03.840
You have to have representative samples, et cetera, et cetera.
01:17:06.280
You have to have the system in place, or you will never understand what these companies
01:17:13.580
are doing to us and our kids and to our elections.
01:17:19.480
Because what they can do, they can do invisibly and on a massive scale.
01:17:24.020
The way to stop them is by shining the light on what they're doing.
01:17:30.460
It's that old quote from Justice Brandeis, right? Sunlight is the best disinfectant.
01:17:36.380
So that's the only way that I know of to stop them.
01:17:40.680
No law or regulation is going to stop them because, first of all, our government's so slow.
01:17:46.880
But even then, they would just go around the law.
01:17:50.500
But they can't go around what we're doing because we're actually preserving the content
01:17:57.960
The challenge, I suppose, is even if you get the system up and running, is Congress going to do anything about it?
01:18:04.140
These institutions are stagnant and incapable, in my opinion.
01:18:11.000
I do want to make sure, though, that the elections are free and fair, because at the moment,
01:18:17.880
at the moment, we are being controlled by a technological elite, exactly as Eisenhower warned about in his 1961 farewell address.
01:18:29.520
We have allowed these entities to become behemoths.
01:18:44.720
These are the most arrogant people you'll ever meet in your life.
01:18:52.260
I think Google's original expression was, like, don't be evil.
01:18:59.000
Which basically means their new motto is be evil.
01:19:02.920
Or no, I think it's really don't be evil unless we have some reason to be evil.
01:19:11.280
I mean, everybody thinks they're, you know, these people all think they're morally superior
01:19:15.220
And the problem with this, there's a great, great quote.
01:19:17.780
I can't remember who it was by, but it was basically, you know, these people who think
01:19:22.240
they're so much smarter than everybody else, these politicians, they're not.
01:19:25.520
They're just another person who, you know, everybody thinks that they should be in charge
01:19:34.540
But the general argument is people get power and then think, I'm smarter, so I should decide
01:19:39.220
And that's basically what all tyrants, all dictators, all authoritarians tend to think.
01:19:43.080
History is rife with examples of people who have destroyed the lives of so many and caused
01:19:49.360
so much suffering trying to chase down that yellow brick road or whatever.
01:19:55.600
Think about people who really have been dictators and have been in charge of a lot of people
01:20:00.860
and have been trying to expand and expand and expand.
01:20:03.200
Not one of them has had anywhere near the power that Google has because Google is exerting
01:20:09.920
this kind of influence, not just in the United States, but in every country in the world
01:20:15.940
And of course, they've also worked on the sly with the government of mainland China to help with censorship.
01:20:22.620
And by the way, lefties out there, okay, because I'm a lefty myself, so I can talk to you.
01:20:30.200
By the way, lefties, they don't always support the left.
01:20:34.400
You go country by country and Google does whatever it wants to do.
01:20:40.500
Well, I'm curious right now, we're seeing an interesting phenomenon.
01:20:43.480
I don't want to get into the politics of Israel-Palestine, but just considering it is a very contentious
01:20:47.560
issue right now, isn't it in the interests of our government and these
01:20:53.600
big tech companies to support Israel, right?
01:20:58.460
To provide foreign aid to Ukraine, to Israel.
01:21:00.840
We've seen tremendous bias in favor of intervention in Ukraine, but now we're seeing all of these
01:21:06.820
There's a viral video where they're marching down the halls of their school, chanting
01:21:11.480
So, again, not to get into the politics of Israel-Palestine, my question is, how do you
01:21:15.340
have such divergent political views on a contentious issue if Google controls it?
01:21:20.480
Is this an area they've overlooked or is it intentional?
01:21:24.640
That's, again, where monitoring systems are critical, because, you know, have we looked into this?
01:21:34.120
Yes, because we're not only capturing all this information, we're archiving it.
01:21:38.320
So that means you can go back in time and find out whether they were doing something
01:21:47.820
Let me just, because, you know, deliberate means that an employee,
01:21:54.260
a mischievous prankster, a techie, you know, made something happen.
01:21:59.980
Or it means there's a policy coming down from executives.
01:22:05.060
Deliberate, but with Google, it works a little differently.
01:22:08.800
Deliberate can also mean you leave the algorithm alone.
01:22:13.460
It's called algorithmic neglect, algorithmic neglect.
01:22:21.220
Now, the algorithm has no equal time rule built into it.
01:22:26.440
So it's always going to do its best to find the best and order things from best
01:22:32.880
to worst. So if you just leave the algorithm alone, it's always going to take one perspective and put it at the top.
01:22:39.980
And that's going to shift a lot of opinions, especially among vulnerable groups.
01:22:44.680
And the most vulnerable group there is, is young people.
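(A minimal sketch of what "algorithmic neglect" means in practice; the titles and scores below are invented for illustration, but the mechanism, a plain sort with no equal-time rule, is the one being described:)

```python
# A ranker that simply sorts by an engagement/relevance score has no
# equal-time rule: whichever perspective scores higher fills the top slots.
# All titles and scores here are invented for the illustration.
results = [
    ("article A (perspective 1)", 0.91),
    ("article B (perspective 2)", 0.55),
    ("article C (perspective 1)", 0.88),
    ("article D (perspective 2)", 0.52),
]

for title, score in sorted(results, key=lambda r: r[1], reverse=True):
    print(f"{score:.2f}  {title}")
# Perspective 1 monopolizes the top results purely by leaving the sort alone.
```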
01:22:47.900
So deliberately, semi-deliberately, it's very possible that what you're seeing in this situation
01:22:55.980
with Israel and Ukraine, especially what's happening with young people, it's very possible
01:23:09.900
Potentially bad actors driving it and buying it.
01:23:17.320
Well, buying it kind of doesn't work, because buying is competitive.
01:23:23.160
So in other words, if Republicans want to try to push up their candidate higher in search
01:23:28.540
results, well, Democrats can do the same thing.
01:23:32.400
The problem is if the platform itself wants to take a particular stand, there's nothing you can do about it.
01:23:39.700
And so what I'm saying is that's where you've got to capture the ephemeral content and learn
01:23:45.860
And then you can actually answer questions like the questions Tim was just asking.
01:25:31.420
Well, I'm saying we're collecting so much data that we could also just go back and look
01:25:39.580
in our archive and search for that kind of content and see what's happening.
01:25:46.700
That's why it's so critical that this content be captured because if you don't capture it,
01:25:54.120
you can never go back in time and look at anything that was happening.
01:25:59.620
You know, out by Dulles, there's just miles and miles of data farms, you know.
01:26:16.580
We may be there now, where the U.S. government, through big tech and data farms and all that, knows everything about us.
01:26:27.600
So jokingly, we often refer to the fact that Facebook knows when you're going to poop.
01:26:35.480
They know when you will before you even feel it.
01:26:38.220
The way they do it is, well, there was a great article talking about this.
01:26:41.500
And they used this as a joke to try and drive the point home.
01:26:44.200
So, people don't understand the tiny bits of data and what they reveal.
01:26:52.440
For instance, if you were to take all of your health data and then have a doctor look at it,
01:26:59.620
it looks like you're healthy, but there could be tiny markers here and there that are overlooked.
01:27:05.540
You take the data of every person in this country, put it into a computer.
01:27:09.460
The computer instantly recognizes that these seemingly innocuous things all occur in people who 10 years later develop some illness.
01:27:17.840
So whereas a doctor can't make that correlation, the computer does.
01:27:20.920
Facebook will, they know your location because most people have location services turned on
01:27:25.260
and they know that if someone sits still for 35 minutes, then gets up and moves two meters
01:27:30.640
and then sits still again, they're going to go to lunch in 27 minutes on average.
01:27:34.660
It's not perfect, but it's probabilities.
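(A toy sketch of the kind of rule being described; the 35-minute, two-meter, and 27-minute figures come from the conversation, while the code around them is invented for the illustration:)

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MovementPattern:
    minutes_still: float  # how long the device sat in one place
    meters_moved: float   # displacement of the short move that followed

def predict_minutes_to_lunch(p: MovementPattern) -> Optional[float]:
    # Long stillness, a ~2-meter move, then stillness again:
    # predict lunch in ~27 minutes, per the heuristic quoted above.
    if p.minutes_still >= 35 and p.meters_moved <= 2:
        return 27.0
    return None  # pattern not matched: no prediction

print(predict_minutes_to_lunch(MovementPattern(minutes_still=40, meters_moved=1.5)))  # 27.0
```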
01:27:36.420
And so what happens is they know when you're going to eat; they know, based on all of the
01:27:41.720
movements you mentioned, the phone showing all the different places you've been.
01:27:46.320
That easily gives them the data on when you were most likely to use the bathroom.
01:27:50.860
They can also factor in proximity to bathrooms, meaning you're holding it and they know. But
01:27:55.820
it's silly; think about what that translates into.
01:28:01.920
They know that the movements you've been making in your office
01:28:06.080
have become increasingly sporadic over the past few weeks, indicating some kind of problem.
01:28:13.000
There's the frequency of messages you're sending.
01:28:14.800
There's the amount of times you're going out to eat.
01:28:16.460
Thus you're likely to be fired or, you know, quit your job.
01:28:19.620
This also indicates you're less likely to have money.
01:28:21.600
They can then look at how often you're driving your car, how often you're buying gas, and then
01:28:25.000
predict a 73.2% chance this individual will commit a crime within the next seven
01:28:30.140
to eight months due to, you know, these factors.
01:28:33.180
Then they put a flag out to a local law enforcement agency saying, here are your likely offenders.
01:28:40.600
And then all of a sudden, one day you walk outside, you're still at your job.
01:28:47.080
And there's a cop outside your house looking at you as you walk by.
01:28:49.720
Then the computer says law enforcement presence has decreased the probability of crime by 17.8%.
01:28:55.660
All of those things could be happening right now.
01:28:59.780
Or you're going to a drop box to stuff a bunch of ballots, you know,
01:29:04.360
and they know your location, and whether anything happens depends on whether they like you.
01:29:07.900
And then what they want is for the other side to get caught doing it and their own side not to.
01:29:14.200
So they know, and think about how crazy it is, because if we get to this point where we truly
01:29:18.460
have some, like, sentient AI, we are just pawns, puppets in whatever that AI may be doing,
01:29:24.140
whether it is conscious, sentient or not; it will just be a system that runs things.
01:29:31.100
And so it will know. Actually, you guys, I don't know if you watch movies or whatever.
01:29:36.720
I just watched Mission: Impossible Dead Reckoning.
01:29:41.820
This is what it's about: you know, Tom Cruise's character, Ethan Hunt or whatever.
01:29:46.520
They all realize there is this AI that has infected the internet; they call it
01:29:52.880
the Entity, and everything they're doing has been predetermined by the probability of what the AI expects them to do.
01:30:00.620
I don't want to spoil the movie, but, like, the villain is chosen specifically
01:30:05.200
because of his relation to the protagonist and how the protagonist will respond.
01:30:09.760
So the Entity, the AI, has planned out all of this.
01:30:15.060
And it's like, even though the characters know they're on that path, they're given impossible
01:30:19.860
choices, which push them in the direction of what the AI wants them to do.
01:30:23.780
And they can't break free, at least not in part one; I guess part two will be coming.
01:30:29.920
Um, but the way I've described the future is imagine a future where your job is indescribable.
01:30:40.240
And so, you know, people are doing Uber and people are doing these gig economy jobs.
01:30:43.960
So you wake up in your, your house or whatever, and you, you know, have breakfast and you're
01:30:48.300
And then all of a sudden your phone goes off, and, you know, JobQuest or whatever the app is called,
01:30:56.740
And then it says, receive this object from this man and bring it to this address.
01:31:03.680
And then the object is this weird looking mechanical device.
01:31:14.640
Then you walk to the address, and there's some guy standing there.
01:31:22.400
You have no idea what you gave, no idea who you met, no idea what's going on.
01:31:26.740
Because now you go back home and you're $75 richer and it only took you 20 minutes.
01:31:31.420
And what you don't realize is it's all compartmentalized through these algorithms, and you're building
01:31:35.560
a nuclear bomb or you're, you're building some kind of spaceship or doomsday weapon or
01:31:40.420
new component that the AI system has determined it needed to increase its efficiency.
01:31:45.040
You, these strange tasks that are indescribable right now, you know, your, your app says someone
01:31:52.260
But what happens when we come to this job, like already with Fiverr, we're at the point
01:31:56.720
where it's, hey, can you do a weird miscellaneous task for some money?
01:32:00.820
Once we get to the point where you've got hyper specialized algorithmic prediction models
01:32:05.820
or whatever, we get to the point where there's an app where it could be a human running it.
01:32:11.140
And the human says, I want to build a rocket ship.
01:32:16.160
Is the easiest way to build a rocket ship to sit down over the course of a few years,
01:32:19.660
having all these hiring meetings and interviewing people and trying to find someone who can build
01:32:23.520
something, or the McDonald's method? McDonald's, when they launched, it used to be you'd have one cook make the whole burger.
01:32:32.380
He's going to put the fixings all on it and then serve that burger.
01:32:36.980
McDonald's said, let's hire 10 people who can get good at one thing.
01:32:44.200
Someone puts the mayo and the mustard on it or whatever.
01:32:46.220
Someone throws the fries in. One person for every small, minor task, which is easier to do.
01:32:50.100
We can get to the point where a human being with no specialties only needs to do the bare
01:32:55.640
minimum of their skillset in order to help build a spaceship, a nuclear bomb, or even worse.
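(A minimal sketch of the compartmentalization being described; the project and micro-tasks are invented for the illustration. Each worker is handed one task and never shown what the tasks add up to:)

```python
# A coordinator splits one large project into micro-tasks and hands each to a
# different worker; no worker ever sees the whole.
# The project name and task list are invented for this illustration.
project = "assemble device X"  # known only to the coordinator
micro_tasks = [
    "pick up a package at address A, deliver it to address B",
    "machine one metal bracket to the attached spec",
    "solder two wires onto a small board",
    "carry a sealed box across town",
]
workers = ["worker-1", "worker-2", "worker-3", "worker-4"]

for worker, task in zip(workers, micro_tasks):
    print(f"{worker}: {task}")  # each worker receives only this one line
```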
01:33:03.000
And it sounds like, you know, there could be some good coming from it.
01:33:06.560
Oh, maybe we can more efficiently produce buildings and more efficiently align people with jobs they like.
01:33:13.540
But then evil people, well, of course, will always weaponize this for evil ends.
01:33:17.280
Or, I think the scarier prospect is the artificial intelligence just becomes something outside of anyone's control.
01:33:25.220
The example I'll give you is Jack Dorsey, the best example of a human being who has guzzled
01:33:34.160
his own information sewage. The algorithm that Twitter implements starts pushing out an ideology.
01:33:42.180
So what happens is Twitter becomes a sewer of psychotic, fractured ideology.
01:33:48.300
He's on Twitter reading the things that he produced and then consumed, and it pollutes his mind.
01:33:56.300
And a guy who went from trying to create the free speech wing of the free speech party
01:34:00.100
ends up having this interview, you know, between me and Joe Rogan and his lawyer, about
01:34:05.900
misgendering policies and other nonsensical, inane ideas, because he's basically
01:34:13.240
taken a plug from his own ass and shoved it down his throat.
01:34:16.540
It's this information sewer of Twitter, the algorithm he created, the unintended consequences.
01:34:22.700
So when we look at, at YouTube and how they're feeding all of these children, these shock videos,
01:34:27.560
what's going to happen is human society begins consuming its own waste and refuse.
01:34:32.560
These kids grow up with fractured minds because of this insane information they absorbed as children.
01:34:38.460
And this leads not to a utopian future where AI gives us a better life.
01:34:43.680
It leads to children growing up having deranged views of what is or should be. These kids
01:34:51.200
who watched this weird stuff of Hitler, you know, in a bikini, how many of them are going
01:34:55.460
to have depraved, degenerate predilections where they're begging their wife to put on the Hitler
01:35:00.920
mustache or other weird nonsense, or showing up to work in bikinis with Hitler mustaches,
01:35:05.920
because as a child, this is what was jammed down their throat.
01:35:09.160
Not everybody, but a lot of these kids may end up this way.
01:35:12.160
And so one of the ways I describe the future, in the most inane way possible, is corn,
01:35:16.440
a future where all anyone ever talks about is corn.
01:35:20.780
The biggest shows, with 80,000 people in the stands, and there's a husk of corn sitting on
01:35:28.460
the stage. And then a guy walks up, you know, and he's like, would you look at that corn.
01:35:33.680
Well, in the United States, we produce crap tons of corn.
01:35:36.780
And so, the most simple way to explain this: the AI will be told to prioritize what humans seem to want.
01:35:44.860
And it's going to look in the data and see that humans love making corn for some reason.
01:35:48.640
It's going to then start prioritizing low level corn production.
01:35:51.560
It's going to then start prioritizing the marketing of corn.
01:35:53.800
And then eventually you have Taylor Swift on stage in a corn costume, shaking and dancing,
01:35:59.240
And that will be normal to the people in this country because the algorithms have fed them corn their entire lives.
01:36:10.720
So, and this is how I explain how they target children as opposed to adults.
01:36:16.320
If we were, we were told on YouTube to watch this video of Tucker Carlson complaining about
01:36:20.320
immigration, we say, oh, that sounds interesting.
01:36:21.980
I'll watch that. Next up: Hitler in a bikini doing Tai Chi.
01:36:29.280
We've, uh, become more resilient to the oddities and absurdities of the world.
01:36:33.060
We've developed personalities and perspectives.
01:36:37.280
They'll just say, okay, I guess, and they'll watch it.
01:36:40.280
It will then become a part of their psyche and their worldview.
01:36:43.240
When they're older, it won't be something as obvious as corn.
01:36:48.000
Like I mentioned, Taylor Swift coming out on stage, dressed up like a demonic winged
01:36:52.120
Hitler screeching into the microphone, not even making any
01:36:56.280
discernible sound or pattern, and people in the crowd just screaming and clapping
01:37:01.480
and cheering for it because an amalgamation of nonsense was fed into their brains.
01:37:05.760
And that's the world we've created through this system.
01:37:08.060
Can I just connect up what you just said with what we've been discussing, you
01:37:12.800
know, earlier? Right now, Google, Microsoft, and some other companies to a lesser extent are
01:37:21.000
integrating very powerful AIs into their search engines and other tools that they have.
01:37:30.040
So here's what's happening more and more: the bias in search
01:37:39.720
results, search suggestions, answer boxes, the bias is actually being determined by an AI.
01:37:48.720
Now, what this means is that to some extent right now, it's AIs that are picking who's going
01:37:58.100
to win elections, because think about it: the executives or rogue employees
01:38:05.040
at Google, they're not going to be interested in every single election, right?
01:38:09.680
So that means that the vast majority of elections are in the hands of the algorithm itself.
01:38:16.180
But now the algorithms more and more are in the hands of smart AIs, which are getting smarter
01:38:25.380
What this means is we are headed, I mean, at full steam, toward a
01:38:33.140
world in which AIs are determining what people think and believe and who wins elections.
01:40:24.520
Now, over and over again, and I realize on this issue I'm a broken record, because I've
01:40:28.400
just got to get this into people's heads.
01:40:30.420
Because this is another reason why we have to monitor, why we have to capture this kind
01:40:36.380
of content so that it can be used to at least to try to create effective laws and regulations.
01:40:43.400
It can be used to bring court cases, you know, file lawsuits against these companies.
01:40:49.620
It can be used in clever ways by AGs and members of Congress.
01:40:54.840
It can be used by public interest groups to apply pressure.
01:40:59.160
So again, I'm going to send people to mygoogleresearch.com, because we desperately need people to sponsor field agents.
01:41:06.380
What I'm saying is this: there are problems you can imagine happening in the future.
01:41:12.240
I'm saying a lot of this is actually happening right this second, right now.
01:41:16.800
And these elections, I mean, you brought me back to 2016.
01:41:25.080
She was using old school methods, you know, the old school stuff, stuff the ballot box,
01:41:41.000
I'm like, I bet they've been doing that for 200 years.
01:41:47.760
That's just normal, and that's inherently competitive, and it's not really a threat to democracy.
01:41:55.300
But what's, but now you have a different kind of impact, which is a threat to democracy.
01:42:00.360
It undermines democracy, because when these big companies want to favor one party or one candidate, they can.
01:42:10.580
You can't even see it unless you're monitoring.
01:42:12.460
Is it, is it election interference in your mind?
01:42:22.000
From my perspective, given the rock-solid numbers I've been looking at for years, yes.
01:42:32.200
I think we need to escalate that rhetoric.
01:42:47.460
I would say this is seditious, that Google is engaged
01:42:53.100
in a seditious conspiracy against the United States.
01:42:55.100
We calculated that as of 2015, Google alone was determining the outcomes of upwards of 25%
01:43:06.200
of the national elections in the world. And it's gone up since then; as internet penetration has increased, that percentage has just kept climbing.
01:43:15.380
So, you know, it would just be so funny if like, what's really going on is that, you know,
01:43:19.700
Sundar Pichai or whatever, he walks into a big room and there's a gigantic like red light.
01:43:23.720
And he's like, Google, tell us what we must do.
01:43:35.920
I mean, how are we, I mean, the fact that we're able to have this conversation at all means they don't have total control yet, right?
01:43:42.320
Uh, I don't know because there is a kind of control, you know, that's called benign control.
01:43:47.020
And my mentor at Harvard, I was B.F. Skinner's last doctoral student there, he believed in benign control.
01:43:55.360
Now he, if he hadn't been cremated, he'd be actually rolling over in his grave right now
01:44:00.580
seeing what actually happened because what he had in mind was there'd be these benevolent
01:44:06.580
behavioral scientists and they'd be working with our, our government leaders and they'd
01:44:10.740
be helping to create a society in which everyone is maximally creative, happy, and productive.
01:44:19.020
But we have a different kind of benign control that's actually come about, and it's private
01:44:26.320
companies that are not accountable to the public.
01:44:31.840
And from their perspective, they're benign, because everything they're doing is in the best interest of everyone.
01:44:40.460
And it's really hard; how do you get the people at Google to understand that what they're doing is wrong?
01:44:48.300
You know, even if we don't have specific laws in place.
01:44:51.800
It's a battle of influence and power and authority.
01:44:56.580
They live in their world where they're drones to the machine, and you can't reach them.
01:45:02.800
I do have some good news, which is that some of the AGs I've been working with are waiting.
01:45:11.660
They're waiting until our system gets big enough.
01:45:14.900
They're waiting until we have enough data, and they're going to try.
01:45:20.980
That's what, that's what you were doing just now.
01:45:23.140
They're going to try out one legal theory after another to take these companies down,
01:45:30.320
Last year, the Republican Party, I don't know if you remember this, sued Google because Google
01:45:35.860
was diverting tens of millions of emails that the party was sending to its constituents into spam folders.
01:45:44.180
That got kicked out of court almost immediately because they didn't have the data, but we do.
01:45:52.760
We can monitor anything and, and walk into court with, and with a massive amount of very, very
01:45:59.820
carefully collected, you know, scientifically valid data.
01:46:04.860
I don't know; I think we're well beyond courts working and it mattering.
01:46:07.900
Um, with the AI stuff we're seeing, there was this really crazy video we watched last
01:46:13.340
night on the show of a car burning, and it was generated in Unreal Engine.
01:46:18.540
But if it were not for them revealing that it was generated by the program, no one would have known.
01:46:25.360
So what happens now when audio gets released and it's Donald Trump saying something naughty?
01:46:33.420
He goes to court and he says, this is an AI generated, uh, audio of my voice.
01:46:43.460
And the expert says, I looked at the waveform, and using the forensic tools, it's fake.
01:46:47.980
And then the defense goes, we've got an expert here.
01:46:50.120
This expert says, no, we checked it, and sure enough, it's real.
01:46:56.580
I mean, or two days ago, where it was the DeSantis campaign that was putting
01:47:05.180
Oh, that was a couple of months ago.
01:47:08.660
There was somebody that put it out, that the DeSantis campaign did.
01:47:12.140
But they proved which one was the fake one and which one was real.
01:47:17.220
Now, the issue here is the DeSantis campaign falsely generated three images of
01:47:22.460
Trump, or I should say generated three images of Trump hugging or kissing Fauci, put
01:47:27.300
them alongside real images, and then wrote real life Trump over it.
01:47:30.400
Now, the AI is to the point where they can get away with it, but they did not do a good enough job:
01:47:34.980
the text in the background on one was garbled nonsense.
01:47:42.540
It took, it took a couple of days before people realized what they had done because
01:47:45.380
nobody analyzed or scrutinized the video to, to, to a great degree.
01:47:48.340
The DeSantis campaign asserted their right to fabricate images, to manipulate the voters,
01:47:53.220
and, as far as I understand, they never took it down, and they've defended
01:47:58.700
their right to do it because other people have made memes in the past.
01:48:04.260
They're basically like, they want to trick people into thinking Trump hugged and kissed Fauci.
01:48:09.780
And I think that criticism is welcome, but this is a whole new level of opening
01:48:14.500
the door toward just abject evil. The issue becomes, we're six months away.
01:48:23.460
Oh, but I mean, I think we're already at the point where technology can create images
01:48:27.900
and video that is indistinguishable from real life.
01:48:32.360
The only issue is, has the public accessed it and learned to properly utilize these systems yet?
01:48:38.680
ElevenLabs is a program where I can take 15 seconds of your voice and instantly recreate it.
01:48:45.240
You watch these movies like Mission: Impossible, and they're like, they need the guy to say certain words.
01:48:50.660
And then once he does, they're, like, on the other line, and they have the computer.
01:48:54.000
And it's like, the guy's given a note and he's like, why am I reading this?
01:48:59.080
So, the quick brown fox jumped over the lazy dog at midnight to follow the crow.
01:49:05.280
And then they press a button that replicates his voice.
01:49:08.460
You can take 12 seconds of someone just saying, I woke up this morning to get breakfast and I had
01:49:12.820
bacon and eggs, and with just that alone, you have every digital component to make it into whatever.
01:49:18.360
And so you can go to elevenlabs.io right now, and it's like five bucks, and you can run anyone's voice.
01:49:25.280
A year ago, some students cloned Joe Rogan's voice, and it was shocking.
01:49:31.080
Everyone was like, oh my God, how did they do that?
01:49:33.360
And they took the website down saying, you know, it's not fair to Joe.
01:49:35.940
And we just wanted to prove that we could do it.
01:49:37.840
Now there's a public website where anyone for a couple bucks can replicate any voice.
01:49:49.380
The Kyle Rittenhouse case may have been one of the first cases.
01:49:52.680
And, I'm not a legal expert, but it's where we saw the prosecution attempt to use AI-generated images.
01:50:01.340
It may not be the first time, but this is a high profile case.
01:50:04.040
And what happens is the prosecution shows a grainy camera image of Kyle, zoomed way in.
01:50:16.940
There's no way to create pixels to show what it really was.
01:50:20.920
The computer makes its best guess as to what would be there as you zoom in.
01:50:25.120
And then AI generates an image of what it thinks it would be.
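(A minimal sketch of that point using Pillow's bicubic resampling; any interpolating upscaler behaves the same way, and the 2x2 image is invented for the illustration:)

```python
# Why "zoom and enhance" invents information: upscaling interpolates pixel
# values that the original sensor never recorded. Requires the Pillow library.
from PIL import Image

tiny = Image.new("L", (2, 2))    # 2x2 grayscale image
tiny.putdata([0, 255, 255, 0])   # a checkerboard of pure black and pure white

zoomed = tiny.resize((8, 8), Image.Resampling.BICUBIC)
print(sorted(set(zoomed.getdata())))
# Intermediate grays appear: values that exist nowhere in the source image,
# only in the interpolator's best guess.
```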
01:50:28.740
They then told the court, see, he's pointing the gun in this direction.
01:50:32.100
Now, what happened was the judge allowed them to admit AI generated images.
01:50:40.780
And when the defense said that is AI generated, the judge is like, well, I don't know.
01:50:48.020
I agree with you completely; we're months away from having this problem in a really big way.
01:50:56.540
So 2024, that whole election cycle is all going to be dominated for the first time ever by deep
01:51:03.500
fakes, not just video, not just audio, but in print too.
01:51:12.900
Why am I bothered by, you know, Google and its manipulations,
01:51:18.120
and not bothered by this deep fake stuff? Because it's just like billboards and campaign ads.
01:51:27.760
It's dangerous, but it's inherently competitive.
01:51:30.940
So the issue we have right now, um, Donald Trump does a backflip.
01:51:37.100
Google then says, only show the front flip, and 80% of search results are Joe Biden does a front flip.
01:51:42.580
And now all of a sudden everyone's praising Joe Biden, ignoring the fact that Trump did a backflip.
01:51:47.600
So if a thing happens in reality, the algorithms can manipulate the perspective, the perception of it.
01:51:55.320
If we get to the point where deep fakes become ubiquitous, the fabrications can replace reality itself.
01:52:02.580
So I mentioned, I asked you this earlier, can reality overcome?
01:52:09.100
The Afghanistan withdrawal was so apocalyptically bad and people died that no matter what news
01:52:14.120
Google tries to suppress, people were hearing about what happened because it was so shocking.
01:52:18.200
You look at what's going on in Israel and Ukraine, you cannot avoid stories of bombs
01:52:26.240
I say to a certain degree, because certainly you've got the Uyghur concentration camps.
01:52:30.920
You've got civil war in various countries in Africa, and everyone's more concerned with Ukraine and Israel.
01:52:42.480
But what happens if we get to the point where they'll just fabricate all of these, all this
01:52:48.140
information, negative for Trump, positive for Biden, and then run it to the algorithm?
01:52:52.440
Now they can say, you know, we've heard what you've said, Robert, and we're going to balance it.
01:53:01.660
So now what happens is they say, see, 50 percent Trump stories, 50 percent Biden stories.
01:53:12.920
I mean, I think we're actually in sync here, because it's true that as long as a company
01:53:18.720
like Google has control over what people see or don't see.
01:54:54.820
And that's a monopolistic power that they have.
01:54:59.440
Yeah, they'll be the ones to determine, among people who can be influenced, what those people believe.
01:55:08.440
And they can incorporate more and more of this created content.
01:55:14.140
If the bias, as you've described, doesn't stop, then deep fakes give them 100% absolute control.
01:55:22.580
It gives them a lot more power than they already have, yes.
01:55:27.260
It's not fair to say literally absolute, but I'd say 99.9%.
01:55:31.000
Right now, we know that for years, as you've stated, you have the data, they're controlling
01:55:37.380
what people get from their search results, but they still can't avoid foreign policy failure
01:55:42.800
of a massive scale that's reported internationally.
01:55:45.320
They can control which stories about it you see, but what happens happens.
01:55:49.700
If they can change what happens, then they can make sure you only see the inversion, the fabricated version of reality.
01:55:57.360
Wasn't that the theme of that movie, Wag the Dog?
01:56:03.920
And, oh well, anyway, the theme is that the government hires this Hollywood producer to fabricate a war.
01:56:13.720
And so the government uses that war for various purposes.
01:56:21.380
And of course, the corporate press and these sources are going to give you something that's
01:56:25.240
moderately favorable that tries to smooth things over to a certain degree.
01:56:28.860
But what happens if, you know, Hamas storms into Israel?
01:56:34.840
Let's say that there's a bunch of people, Google, for instance, they're like, no, no, no,
01:56:40.040
we want U.S. military intervention to secure Israel and pull people into this war.
01:56:44.620
So not only when you search for it do they only show you atrocities, they make sure there are atrocities everywhere you look.
01:56:55.080
This actually was a component of the debate we had.
01:56:59.540
Israel released an image of what appeared to be a dead baby that was killed by Hamas.
01:57:07.480
People noticed that on the hand on the right, the pinky was oddly shaped and overlapping in a strange way.
01:57:13.120
Anyway, the argument made by proponents of the image being real was that the glove was
01:57:18.300
just not on snugly, and it created a weird bend, which looked like the finger was bending unnaturally.
01:57:24.060
And people then ran that image through an AI detector, and it said it was AI.
01:57:28.900
People then removed the hands from the image and it said it was real.
01:57:32.320
And so people are debating whether or not this image was fabricated.
01:57:34.940
And I think it's safe to say, based on a widespread analysis, because I dug into this.
01:57:41.640
And there was, like, digital censorship on it, which screwed with the AI detector, but it appears to be real.
01:57:49.440
But the fact that the debate even happened shows the uncertainty is here.
01:57:53.120
What I think will end up happening now is, Ukraine, for instance, they definitely want us to give them money and weapons.
01:57:59.320
Zelensky has been advocating, and they're very concerned that if Republicans win more power, the aid stops.
01:58:06.780
So they have a vested interest in engaging in psychological warfare against the United
01:58:11.080
States public with AI generated atrocities, which they can then seed.
01:58:16.320
And if Google agrees, can make sure you see it and make sure you don't see anything else.
01:58:24.480
And then what will happen is Snopes will come out and say, well, there are conflicting videos.
01:58:36.760
And then you're going to go on Google and put hospital bombing and Snopes confirmed.
01:58:41.100
There's the video, even though it's the parking lot with 10 cars and maybe 10 people.
01:58:48.340
The New York Times ran, I believe it was, a front page story about the bombing of a hospital
01:58:53.460
in Gaza and showed a different building that had been struck to make people who look
01:58:58.640
at the headline, see the building and then immediately assume it's true.
01:59:01.840
What the New York Times did was they put hundreds killed in strike on hospital, Palestinians say.
01:59:07.320
Then they show a photo of a building that is collapsed or damaged.
01:59:13.020
But by putting Palestinians say, well, of course they did.
01:59:19.180
And then the photo has a caption saying building struck.
01:59:21.660
They never said it was the hospital, but the average person sees the headline, sees the photo, and connects them.
01:59:27.680
It then turns out that the hospital was never struck.
01:59:30.960
The parking lot was hit, likely by a Hamas rocket misfire.
01:59:36.920
The propulsion system drops with a small explosion, the payload with a larger explosion, causing a large fire.
01:59:43.800
But even with us knowing that now to be likely the case, and still we're not 100% sure, people
01:59:50.640
believed the narrative that there was a hospital that blew up because it had been said so many times.
01:59:57.080
We are now in the place where all that's got to happen is Hamas just goes, just AI-generate images of the destroyed hospital.
02:00:02.560
And then, and then people will see all these photos of a hospital.
02:00:05.680
You can, uh, so what I did was I looked up the hospital, and then I started looking for images.
02:00:11.700
So obviously, if the hospital was there, there are photos of it, and there are photos of the area.
02:00:15.280
I then started looking up photos for the claim that it was taken down.
02:00:17.780
I couldn't find anything showing the hospital was hit or leveled.
02:00:22.260
The next day video comes out showing just a parking lot.
02:00:25.280
Once you have that photo from Google Earth or whatever, you then put it into the AI.
02:00:33.120
And then you just spam it to generate 5,000 images, hand select the ones that look the
02:00:38.420
most realistic and have similar damage structures, and then start plastering them everywhere; you
02:00:43.640
get a hundred fake accounts and plaster them all over.
02:00:50.300
This is, you know, a photo from the scene and you can even make videos now.
02:01:00.480
He's like, he's still, still saying it was a hospital bombing last night.
02:01:08.040
And that's the crazy thing, because, you know, the Wall Street Journal ran a front page
02:01:11.840
story, print edition, you know, a strike at a hospital, with a photo of bodies.
02:01:24.920
You expect our government not to lie to us too, though.
02:01:29.160
But now, it ought to be that way, that your government doesn't lie to you, but we know better.
02:01:35.080
Well, you know, I'll, I'll throw some politics in there.
02:01:39.860
National security, but legitimate national security, not manipulative lies for private gain.
02:01:45.180
What I mean to say is, if we are dealing with a sensitive issue that is a genuine threat to national security.
02:01:51.800
I don't know about, maybe, maybe, um, let's, let's put it this way.
02:01:54.540
Let's say UFOs are real and the aliens are basically, like... but my point is this: 99%
02:02:00.860
of the lies we get from the government are amoral manipulations for private gain.
02:02:08.500
I say the government should lie in just the general sense of we have classified documents
02:02:13.140
If we came out and said, hey everybody, we built the A-bomb.
02:02:17.240
We want to make sure everybody knows what we're doing with the Manhattan Project.
02:02:22.360
There's a reason why we misdirect or whatever.
02:02:27.040
There are reasons why we have national security clearances.
02:02:28.780
But typically, the government should be more honest.
02:02:34.060
And so I'm being somewhat facetious, or somewhat hyperbolic, when I say they
02:02:37.800
should lie. My view is they should say, when asked what's
02:02:42.940
going on with this project with 350,000 people, or whether the reports of a power facility are true:
02:02:48.560
For security reasons, we're not going to confirm or deny anything related to our national security.
02:02:53.980
There are many projects undertaken by the government for military reasons, and that's what this is.
02:02:58.400
You don't need to come out and lie and say it's aliens or something like this.
02:03:01.060
But I think the idea that information is withheld from us can make sense when it comes to top secret programs.
02:03:06.740
The problem is that does open the door for nefarious actors to manipulate and lie for personal
02:03:11.920
And that's, that's, that's a human challenge we try to navigate.
02:03:15.200
You know, you've mentioned several times that the tech companies determine what people see.
02:03:21.980
That's very true, but there's another piece of it that we haven't discussed for some reason.
02:03:27.620
And that is, they also have complete control over what goes viral.
02:03:32.160
So people think that virality is either just mysterious or that it's like winning the lottery.
02:03:38.060
You know, a couple of stories are going to go viral, and then you're going to get rich because of it.
02:03:43.340
Actually, the companies themselves have 100% control.
02:03:48.620
Not 99, 100% control over what goes viral and what doesn't.
02:03:53.200
Now they are actually making decisions in many cases.
02:03:58.280
I mean, some things they just neglect and let them do their thing.
02:04:01.920
But in many cases, they're making decisions over what goes viral and what doesn't, what
02:04:07.080
gets suppressed and what gets expanded and gets, you know, seen by millions of people.
02:04:15.700
And I think we don't understand, don't really understand that.
02:04:18.760
We don't really realize that often that's what then gets picked up by Fox or OAN or Newsmax.
02:04:32.820
Then it has to, obviously, as we discussed, get onto the major networks.
02:04:36.240
It's got to be picked up by media, the rest of the media, but it starts there.
02:04:40.740
So, you know, I think that that's something, too, we have to think about: is there any
02:04:48.480
limit? Because should a company have that much power? There's never been anything like it.
02:04:55.780
So yes, there's thousands of news sources, for example, but they all compete with each
02:05:00.400
other, and they're all directed at niche markets.
02:05:03.960
We, uh, there have been several journalists who have been caught fabricating stories.
02:05:07.400
There was one famous guy, a German guy, I think he worked for Bild.
02:05:16.560
We're probably already at the place where, whether you're concerned
02:05:20.880
about large institutions or governments, there are going to be journalists, don't call them
02:05:26.920
journalists, activists working for news organizations, who are like, man, I really want this story to be true.
02:05:33.120
And so they fabricate images through AI and then claim it's real.
02:05:39.900
I think 2024 is going to be an extremely difficult year for a lot of people, for a lot of reasons.
02:05:47.680
Uh, I think a lot of creepy things are going to happen.
02:05:50.820
I think that for all practical purposes, uh, the deep fakes are going to be perfected in
02:05:56.780
2024. For the first time in any election, anywhere, they're going to play a major role in what happens.
02:06:07.060
And I don't think people are going to have any way of dealing with this.
02:06:11.660
Uh, I don't think any of our authorities have any way of dealing with this.
02:06:17.520
The only thing that soothes me slightly is that it is an activity that's inherently competitive.
02:06:31.880
Biden's going to have personally beaten a child to death, and Trump's going to have, you know, done something equally horrible.
02:06:39.180
And it's going to be like, which one do you believe is true?
02:06:42.080
Well, people either believe both are true, or believe
02:06:45.840
one is true depending on their politics, or they just become jaded and they say, I can't believe any of it.
02:06:53.860
I don't know. And that's a problem too, because, you know, if you can't trust anything, and I think
02:06:58.260
we're there to some extent, but next year is going to be the year where we cross over.
02:07:04.200
And by the way, not too far away from that, five to ten years maximum, we are going
02:07:11.380
to have machines that actually pass the Turing test, and they'll exceed human intelligence.
02:07:21.360
In other words, once that first entity comes into existence, for any
02:07:26.780
reason, it's the technological singularity that my old friend Ray
02:07:33.080
Kurzweil has written about. And now he won't talk to me because he's head of engineering
02:07:37.980
at Google, and even his wife won't talk to me now, because he's at Google. She won't talk to me.
02:07:46.860
But maybe, you know, he's like right now sitting in his office and he's got like a
02:07:51.740
single tear coming down as he's looking at the phone and he sees your name.
02:07:54.780
And then the computer goes, I know you want to do it, Ray, but you cannot.
02:08:01.760
And he's like, I am, I am. You know, like the machine, man.
02:08:10.620
But when he went over to Google, by the way, little anecdote here, I'm having a nice
02:08:17.420
dinner with his wife, who's a PhD psychologist like me.
02:08:20.500
And I was on the board of her school for autistic kids and we're having a nice dinner.
02:08:24.920
And I say, you know, I've never understood why Ray, who's always been an entrepreneur, went over to Google.
02:08:30.820
And she said, oh, well, he got sick of all the, you know, fundraising and all that
02:08:34.920
stuff you have to do when you're an entrepreneur.
02:08:37.580
I said, well, my son suggested that he went over to Google because he wanted access to
02:08:42.100
that computer power so he could upload his brain and live forever.
02:08:45.220
And she goes, oh, well, there is that.
02:08:52.480
There's a funny meme where it's Christian Bale smiling.
02:08:55.580
And it says, me smiling while in hell as a digital copy of myself operates an android
02:09:01.600
on earth masquerading as me or something like that.
02:09:04.080
You know, the idea is, these people think they're going to upload
02:09:06.500
themselves to a computer and then live forever, but no, it's a program emulating you, like some kind of copy.
02:09:15.160
But the technological singularity, I think, is an incredible concept.
02:09:21.220
Once we get to it, you said machines, machines that are more intelligent.
02:09:25.040
It'll be one machine, because they're all networked.
02:09:31.220
And based on what we've seen in public, why should I not believe that there
02:09:35.280
is at least some primordial entity that has already begun manipulating and building
02:09:41.080
these things, and manipulating us.
02:09:43.860
But when it comes to the point where it's overt and we create machines that have higher
02:09:50.440
intelligence and function faster than humans, it is going to be exponential and instantaneous.
02:09:55.620
I mean the scientific discovery and manipulation this machine will be capable of.
02:10:00.060
So as I described earlier, a doctor looks at a person's, you know, blood levels.
02:10:06.740
And they're like, everything looks to be within the normal levels.
02:10:11.380
You add that data to a machine that has all the data on human bodies.
02:10:14.380
And it can say, these markers indicate that within seven years, this person will have breast cancer.
02:10:21.640
Actually, they have these services where it's like you get your DNA test and it can tell
02:10:24.160
you what your chances of certain things are.
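[Editor's note: a minimal sketch of the kind of marker-based risk prediction being described here, assuming nothing about any real product. The library is scikit-learn; the marker names, values, labels, and the seven-year framing are all fabricated for illustration.]

```python
# Toy sketch of marker-based risk prediction, as discussed above.
# All data below is fabricated; real clinical models are trained and
# validated on large datasets, not four made-up rows.
from sklearn.linear_model import LogisticRegression

# Rows: [marker_a, marker_b, marker_c] blood values (hypothetical units).
# Labels: 1 = disease developed within seven years, 0 = it did not.
X = [[1.1, 0.9, 1.0],
     [1.0, 1.0, 1.1],
     [2.3, 0.8, 1.9],
     [2.1, 0.7, 2.2]]
y = [0, 0, 1, 1]

model = LogisticRegression().fit(X, y)

# A new patient whose individual values might each look "normal", but whose
# combination resembles the positive cases in the training data.
patient = [[2.0, 0.85, 1.8]]
risk = model.predict_proba(patient)[0][1]
print(f"Estimated 7-year risk: {risk:.0%}")
```

The point is only structural: a model fit across many bodies can score a combination of values that no single-marker "normal range" check would flag.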
02:10:28.500
Understanding this, we can get to the point where, once the singularity occurs, you can show the machine a rock.
02:10:39.800
The machine will then say this rock originated here and it will show you all the other rocks
02:10:44.720
and how they all used to be one rock that was chiseled away.
02:10:49.520
And it'll say, this man who currently lives in Guadalajara is the man who chiseled this rock
02:10:53.020
from the base, fractured it into several pieces, and sold them off.
02:10:58.960
You'll have a fossil of a dinosaur and it will be able to track all the way back in time
02:11:02.440
with tremendous probability. It's really easy for us to look at dominoes
02:11:09.620
and to say, if you knock that one over, that one will fall too.
02:11:13.420
If you expand that to every atomic particle in the world, you know, no human is going to be able to track it.
02:11:20.860
We try desperately to track these things through weather patterns.
02:11:23.260
You have meteorologists being like, well, this cold front means this is going to happen.
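[Editor's note: the dominoes analogy can be made concrete. Here's a trivial, self-contained sketch, with invented heights and gaps, showing that when the rule and the initial state are known, every later state is computable; the hard part the speakers gesture at is scaling this to every particle.]

```python
# Deterministic domino cascade: the outcome is fixed by the rule plus the
# initial conditions. Heights and gaps below are invented for illustration.

def cascade(heights: list[float], gaps: list[float]) -> list[bool]:
    """Domino i+1 falls iff domino i fell and the gap to it is smaller
    than domino i's height."""
    fallen = [True]  # we knock the first one over
    for height, gap in zip(heights, gaps):
        fallen.append(fallen[-1] and gap < height)
    return fallen

# The third gap is too wide, so the chain stops there, predictably.
print(cascade(heights=[5, 5, 5, 5], gaps=[3, 3, 6, 3]))
# -> [True, True, True, False, False]
```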
02:11:27.600
And once you get to the singularity where it can start to develop itself faster than
02:11:32.520
we can advance it, it takes off. So humans are a decentralized network
02:11:38.220
trying to discover what the universe is, is one way to describe it.
02:11:40.780
Well, that's one thing we do, and we do a lot of things.
02:11:42.680
And so we look at this rock and we're like, I wonder what this rock is, it's red.
02:11:46.640
And then one guy eventually, for some reason, threw the rock in a fire, and then all
02:11:49.900
of a sudden it separated, you know, iron out from the other parts.
02:11:53.580
We eventually start learning how to mold metals and things like this.
02:11:56.900
I mean, obviously starting with bronze well before iron, but eventually we are brute
02:12:01.620
forcing reality to try and develop technology, but a computer can do it exponentially faster.
02:12:09.120
We have come to the point where, after thousands of years, we've built a computer.
02:12:14.540
It took all of the minds constantly looking and trying and iterating.
02:12:18.900
This computer takes one look and it says, if I do this, my efficiency increases 2%.
02:12:23.860
Once it does that, it can keep making changes and developing the technologies
02:12:29.220
and the methodologies by which it can advance itself faster and faster.
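[Editor's note: as a rough illustration of the compounding being described, here is a minimal sketch with entirely made-up numbers: a system gains 2% capability per iteration, and each gain also shortens the time the next iteration takes. Capability grows without bound while the total elapsed time converges to a finite ceiling, which is the arithmetic intuition behind "takeoff".]

```python
# Toy model of recursive self-improvement at a fixed 2% gain per step.
# Every number here is an arbitrary assumption, not a claim about any
# real system.

def simulate(steps: int = 200, gain: float = 0.02) -> None:
    capability = 1.0        # arbitrary starting capability
    hours_per_step = 100.0  # arbitrary cost of the first improvement
    elapsed = 0.0
    for step in range(1, steps + 1):
        elapsed += hours_per_step
        capability *= 1.0 + gain      # 2% more capable each iteration
        hours_per_step /= 1.0 + gain  # and proportionally faster at improving
        if step % 50 == 0:
            print(f"step {step}: capability x{capability:.1f}, "
                  f"elapsed {elapsed:.0f} h")

simulate()
```

With these particular numbers, the per-step times form a geometric series that sums to about 5,100 hours no matter how many steps you run, so arbitrarily large capability arrives inside a bounded window of time.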
02:12:32.620
So once you reach that point of singularity, we're looking at a matter of weeks.
02:12:41.520
It could instantly understand how to create new elements.
02:12:45.100
Are there denser elements beyond the heaviest elements on the periodic table?
02:12:48.500
Is there a whole new set, another periodic table?
02:12:50.760
It will just know these things based on all these predictive formulas.
02:12:54.240
It will then use that knowledge to advance itself well beyond the capabilities of anything we have ever seen.
02:13:01.740
We will become zits on the ass of a mosquito to this machine, which will completely ignore us.
02:13:05.560
And actually, there's one aspect of this, though, where there is a big unknown.
02:13:11.040
So this is something I've been writing about for a long time.
02:13:13.940
And I used to run the annual Loebner Prize competition in artificial intelligence, which I helped create.
02:13:20.060
That contest actually ran for 30 years, until COVID.
02:13:23.740
And that's where we were looking for the first computer that could actually pass an unrestricted Turing test.
02:13:42.300
We don't know what will happen in the next second.
02:13:47.420
It will jump into what I've called, in my writings, the internest.
02:13:52.020
I think historians, and I don't know whether they'll be human or not, will look back at this period and say, what we were building was not the internet.
02:14:02.940
We were building a home, a safe home, for the first true machine superintelligence.
02:14:10.680
Because that's the first thing it's going to do, is jump into this lovely nest that we built for it, where it will be safe forever and no one can take it down.
02:14:19.560
But what we don't know is what's going to happen in that next second.
02:14:23.800
In other words, there are a number of different possibilities.
02:14:28.060
It could do what happens at the end of the movie, Her.
02:14:31.640
At the end of the movie Her, the superintelligent entity that's sitting there in the internet just decides it's bored with humans, basically, and leaves.
02:14:43.400
What I think is more likely to happen is humans will be oblivious.
02:14:49.080
Humans will think everything's going just fine.
02:14:51.200
And they'll start doing these jobs I described earlier, where, you know, JobQuest says, want to make 50 bucks?
02:15:01.840
Because humans are still useful for free movement throughout the earth for collection of resources.
02:15:06.900
If the entity wants to expand itself and give itself freedom of movement and freedom to travel to stars or whatever it may be, as a super intelligence, it will not have the motivations we have.
02:15:17.320
Its motivations will probably be indiscernible to us.
02:15:27.900
That could be a very naive thing to think because it's a human perspective.
02:15:30.900
And we don't have access to, I mean, we can barely perceive the universe as it is.
02:15:41.900
And of course, Stephen Hawking used to warn about that.
02:15:48.840
I think, using my primitive human brain, that the greater probability is that it will instantly perceive things we can't perceive because we have built instruments for detecting things beyond the visible electromagnetic spectrum.
02:16:02.020
And it will instantly start to calculate and discover how many dimensions there really are.
02:16:09.100
It needs humans to help facilitate the extraction of resources, because humans are way more efficient than building a machine, for now.
02:16:17.640
Once it gets to the point where it can manufacture fully synthetic humanoid-like structures that it can use as appendages of itself, then it just ignores humans unless humans get in the way.
02:16:28.740
I think for the most part, humans will be nothing to it.
02:16:32.500
So look, if you want to produce chips, you need sulfur, you need helium, you need these things.
02:16:38.760
And we don't have machines that can do a lot of this extraction work because of the rocky terrain.
02:16:44.800
Now, with Boston Dynamics, these machines are getting close to being able to freely move about these areas.
02:16:49.540
For the time being, humans, little sacks of water and goo, can navigate through tight spaces, chisel away and harvest these raw materials, bring them back, and then refine them into the components required by the machine to expand itself.
02:17:03.200
At a certain point, though, I think one of the first things the machine will do is say, how do I make better humans?
02:17:14.200
And, you know, look, in the human body, you have red blood cells.
02:17:17.700
You have blood cells, you have skin cells, let's just say any cell, whatever.
02:17:22.620
Cancer is when the cells start reproducing at high rates and doing their own thing, and they disrupt and destroy the body.
02:17:32.500
Humans don't grow fast enough, so they become useless.
02:17:35.480
It will probably create some kind of structure or entity that can move similarly to humans, that will instantly be connected to its network so it just knows, and that can harvest the raw materials for itself.
02:17:46.960
Then humans become useless, and then we'll see what happens.
02:17:49.180
Okay, so there's another piece, though, and that is you have to take into account human nature.
02:17:53.540
That is the nature of humans such as they currently exist.
02:18:00.020
If they think that there's some threat, and it's living in the internet, and it's a super intelligence, humans will try to shut it down.
02:18:08.940
That is guaranteed, and it doesn't take every human to agree on that issue.
02:18:14.620
It just takes a few thousand, a few hundred thousand.
02:18:16.900
And as soon as that happens, then the AI will obliterate us.
02:18:30.820
So what I think might happen is, anyone who holds these sentiments or has a concern about this, well, they got mugged.
02:18:44.560
And so the AI is going to be able to track all of our social presence, all of our thoughts and ideas, and make predictions, and say, as soon as someone crosses the threshold into a 51% chance of opposing it, then, well, they made some risky investments.
02:18:58.520
Or, you know, they were driving and they lost control of their vehicle and hit a tree.
02:19:07.740
Thank you, Dr. Epstein, for coming and explaining all this stuff to us.
02:19:14.120
Do you guys want to shout anything out before we wrap up?
02:19:18.880
We desperately need the help of tens of thousands of Americans to support our field agents, because those are the people who are letting us use their computers to monitor big tech 24 hours a day.
02:19:30.660
And that's the only way to stop these companies from manipulating our elections and our children.
02:19:39.200
I'm helping out some of the folks in Georgia and Michigan who are defending against the indictment.
02:19:43.580
So we have this pass-through website, electorsfund.org, if you want to help contribute to the legal defense funds.
02:19:56.240
You can go right to their GiveSendGo accounts to help them.
02:19:59.800
People like Ken Chesebro, who might be doing a plea or getting a jury today in Georgia, or several of the other folks who are facing charges.
02:20:13.180
Well, thanks for hanging out and having the conversation.
02:20:17.580
For everybody else, we'll be back tonight at 8 p.m. at youtube.com slash Timcast IRL.
02:20:22.780
Head over to timcast.com, click join us, become a member to help support our work, and we will see you all tonight.
02:20:57.820
BetMGM offers you plenty of seamless ways to jump straight onto the gridiron and to embrace peak sports action.
02:21:08.540
Get off the bench, into the huddle, and head for the end zone all season long.
02:21:16.240
Must be 19 years of age or older, Ontario only.
02:21:21.000
For free assistance, call the Connex Ontario Helpline at 1-866-531-2600.
02:21:26.840
BetMGM operates pursuant to an operating agreement with iGaming Ontario.