The Culture War - Tim Pool


The Culture War #34 - Election Fraud, Big Tech And Trump 2024 w/ Robert Bowes & Dr. Robert Epstein


Summary

In this episode, we have two special guests: Dr. Robert Epstein and Robert Bowes. Dr. Epstein is a senior research psychologist at the American Institute for Behavioral Research and Technology (AIBRT) in San Diego, California, who studies how Google and other tech companies shift votes and opinions through ephemeral content such as search suggestions, search results, and go-vote reminders. Robert Bowes is a former Trump White House appointee who worked at FHA, was a banker for many years, and now works on election fraud investigation and on helping some of those wrongfully accused in the Georgia indictments. We discuss whether the 2024 election can be free and fair, how many votes Epstein estimates Google alone can shift, what his monitoring system captured in 2016, 2020, and 2022, whether Trump can defeat the establishment machine behind Joe Biden, and what big tech's algorithms are pushing on children.


Transcript

00:00:00.000 Get ready for Las Vegas-style action at BetMGM, the king of online casinos.
00:00:05.880 Enjoy casino games at your fingertips with the same Vegas Strip excitement MGM is famous for
00:00:11.120 when you play classics like MGM Grand Millions or popular games like Blackjack, Baccarat, and Roulette.
00:00:17.940 With our ever-growing library of digital slot games, a large selection of online table games,
00:00:22.920 and signature BetMGM service, there's no better way to bring the excitement and ambience of Las Vegas home to you
00:00:29.300 than with BetMGM Casino.
00:00:31.900 Download the BetMGM Casino app today.
00:00:34.940 BetMGM and GameSense remind you to play responsibly.
00:00:37.480 BetMGM.com for T's and C's.
00:00:39.400 19 plus to wager.
00:00:40.540 Ontario only.
00:00:41.420 Please play responsibly.
00:00:42.660 If you have questions or concerns about your gambling or someone close to you,
00:00:45.600 please contact Connects Ontario at 1-866-531-2600 to speak to an advisor free of charge.
00:00:53.860 BetMGM operates pursuant to an operating agreement with iGaming Ontario.
00:00:59.300 Donald Trump may be dominating in the polls.
00:01:03.760 Actually, the latest aggregate has Trump tied with Joe Biden, but he did see a major upswing.
00:01:08.800 And it looks like now with the GOP primaries, he's absolutely crushing it.
00:01:11.760 But the question is, are the elections going to be free and fair?
00:01:15.960 And will there be big tech manipulation?
00:01:18.600 As well as what happened way back when in that, you know, other election.
00:01:22.520 But we'll talk about it all.
00:01:23.480 We're going to talk about this because we're going to be entering the 2024.
00:01:26.300 We're in the 2024 cycle.
00:01:27.920 We're going to be entering 2024 very soon.
00:01:30.120 And I hear a lot of people saying things that they don't think it's possible.
00:01:33.080 They don't think there's a reason to do it.
00:01:34.560 I disagree.
00:01:35.240 I think we have to do everything we can.
00:01:37.800 And even if you if you think in the back of your mind you can't win,
00:01:40.760 that's no excuse for backing down and letting your political opponents and evil people just steamroll through.
00:01:47.200 So we'll talk about this.
00:01:48.420 We've got a couple of great guests.
00:01:49.440 We have Dr. Robert Epstein.
00:01:51.700 Would you like to introduce yourself?
00:01:54.120 Well, what was that again?
00:01:56.000 What was my name?
00:01:57.540 Robert Epstein.
00:01:58.260 Dr. Robert Epstein.
00:01:59.420 I must be Dr. Robert Epstein.
00:02:01.040 That's you.
00:02:01.560 Looking in my direction.
00:02:04.020 And I'm a researcher.
00:02:05.920 I'm senior research psychologist at the American Institute for Behavioral Research and Technology,
00:02:10.780 which is in beautiful San Diego, California.
00:02:14.000 Wonderful.
00:02:14.360 And you have previously talked about or you've talked quite a bit about Google's manipulation of our electorates,
00:02:21.600 of the people's minds, but also some other issues pertaining to Hillary Clinton 2016 and things like that.
00:02:28.520 So this will be really great.
00:02:29.920 Thanks for coming.
00:02:30.640 We should have a good conversation.
00:02:31.760 And we have Robert Bowes.
00:02:33.360 Would you like to introduce yourself?
00:02:34.580 Yes, thank you.
00:02:35.380 Robert Bowes was a Trump White House appointee, worked at FHA, been a banker for many years,
00:02:42.220 but now working on election fraud investigation and helping some of those wrongfully accused in Georgia indictments.
00:02:49.520 Okay, right on.
00:02:50.520 Let's just jump right to it.
00:02:52.320 Can we win in 2024?
00:02:54.920 By we, you mean right-wing conservative nutcases?
00:02:58.800 Is that what you mean?
00:02:59.720 No.
00:03:00.320 I mean, you know, if you take the mainstream media's view,
00:03:03.780 anybody who would vote for Trump is going to be some far-right MAGA extremist.
00:03:07.060 But you'll actually, there's a lot of people who are libertarian-leaning, anti-establishment,
00:03:11.700 some people who just despise the Republican Party, but just want Trump to win for, say,
00:03:16.220 like my position is more so, Trump's more likely to fire people.
00:03:19.680 His foreign policy was substantially better than everyone else we've seen.
00:03:23.420 So I actually really don't like Republicans, and I don't consider myself conservative,
00:03:26.700 but I think Trump is one of our best bets in a long time.
00:03:30.000 I'd like to see him win, but, you know, could there be better?
00:03:33.240 Of course, it's a question of, is it possible to defeat the establishment machine,
00:03:38.020 which has got Biden fumbling around in office, maybe wants to bring in Gavin Newsom or who knows
00:03:43.080 what?
00:03:44.000 And yeah, I mean, there you go.
00:03:46.560 That's we.
00:03:47.260 Okay.
00:03:47.480 Trump cannot win.
00:03:49.520 It is impossible because Google alone has the power in 2024 to shift between 6.4 and 25.5
00:03:59.960 million votes in the presidential election with no one aware of what they're doing and without
00:04:07.320 leaving a paper trail for authorities to trace.
00:04:11.000 So let me just repeat those numbers.
00:04:12.460 Between 6.4 and 25.5 million votes, and those are absolutely rock-solid numbers.
00:04:22.440 And I know how to stop that.
00:04:23.980 I know how to level the playing field, but all the attention is going to so-called voter
00:04:30.620 fraud.
00:04:31.980 All that attention is going to voter fraud because Google and some other tech companies
00:04:37.100 are misdirecting attention.
00:04:40.060 In other words, they're making those kinds of stories go viral so that people who don't
00:04:44.120 know better end up focusing on those issues.
00:04:46.980 And they're doing that deliberately so that you won't look at them.
00:04:51.860 Well, I mean, I would, I could say, you know, potentially right now, yes, but for two years
00:04:58.000 after the 2020 election, you could not even say those words on YouTube without getting banned.
00:05:02.200 In fact, I think it was The Hill ran a clip of Donald Trump speaking at a rally where he
00:05:07.120 said, it was a big fraud, you stole it.
00:05:09.040 And then they shut down a news segment for simply mentioning it.
00:05:12.620 But if you came on and said, oh, they've got better ballot harvesting, YouTube was totally
00:05:16.740 fine.
00:05:17.140 If you came out and said big tech censorship, Google search manipulation, they had no problem
00:05:23.020 with you saying those things, but you couldn't talk about fraud. Now you can talk about fraud.
00:05:26.960 They've changed the rules a few months ago where now you're allowed to say 2020 was
00:05:31.500 stolen from Trump or whatever.
00:05:32.920 But so how would you I mean, what's your response to that?
00:05:37.240 Whatever it is people are focusing on, you have to understand that that focus is being
00:05:42.020 controlled.
00:05:43.680 So there I guarantee you they're not ever allowing the focus to be on them.
00:05:51.740 So right now, for example, there's a big trial in progress, just winding down: the U.S.
00:05:57.820 versus Google.
00:05:59.160 No one even knows about it.
00:06:00.760 That has been so completely suppressed by the tech companies themselves and their media partners.
00:06:06.260 So what I'm saying is whatever it is people are going to be talking about, they control
00:06:11.220 that.
00:06:12.420 And whatever else they do, they're going to make sure that you don't look at them and
00:06:16.880 the kind of power that they have to shift votes and opinions, which is unprecedented in
00:06:22.800 human history.
00:06:23.560 That's what I study.
00:06:25.120 That's what I've been studying more than 11 years.
00:06:27.140 And I publish my work in peer-reviewed journals.
00:06:31.640 It's rock solid, rigorous research.
00:06:34.460 I've testified about it before Congress.
00:06:37.980 This is what's really happening.
00:06:40.600 And a couple of people now have been figuring this out.
00:06:43.780 One is Kari Lake.
00:06:45.180 She's changed her tune.
00:06:46.820 I don't know if you know that in the last few weeks.
00:06:48.820 She's saying it's big tech, big tech that we really need to worry about.
00:06:53.640 And the other is Ramaswamy.
00:06:55.600 He's also now switched.
00:06:57.380 Saying all those voter fraud issues, yeah, they're important.
00:07:01.600 They're important.
00:07:02.540 But that's not where the real threat is.
00:07:05.920 The real threat is the big tech companies.
00:07:08.200 Because these other kinds of things that we talk about, they can shift a few votes here
00:07:13.240 and there, but they're inherently competitive.
00:07:16.300 But if one of the big tech platforms decides to support a party or a candidate, there is nothing
00:07:23.100 you can do about it.
00:07:24.740 Generally speaking, also, they're using techniques that you can't even see.
00:07:29.800 So that's really where the big threat is.
00:07:32.340 And I will tell you, at this point in time, democracy in this country is an illusion because
00:07:37.860 that's how many votes they control.
00:07:40.120 Are you doing a look back on that?
00:07:42.400 The 6 million to 25 million votes you're talking about, you're thinking about, that's now or 2024?
00:07:47.720 What was it in 2016 or 2020?
00:07:50.740 I would submit to you that when Kari Lake says, 81 million votes, my ass, I agree with her.
00:08:00.080 I don't think that it's the real, old-school fraud, which you think is a smaller amount.
00:08:07.620 It probably is a smaller amount.
00:08:08.920 And I agree with your assertion that the tech censorship is big.
00:08:13.380 The biggest.
00:08:14.120 I agree.
00:08:15.140 But did Trump overcome it in 2016?
00:08:19.440 What was the amount in 2016?
00:08:21.420 I can tell you precisely because that's when we started monitoring.
00:08:24.360 That's when we invented the world's first system for surveilling them, doing to them what they
00:08:31.440 do to us and our kids.
00:08:33.060 We learned how to capture what they call ephemeral content.
00:08:38.900 Let me explain here.
00:08:39.600 This is a very important concept.
00:08:40.920 2018, there was a leak of emails from Google to the Wall Street Journal.
00:08:45.880 And in that conversation that these Googlers were having, they said, how can we use ephemeral
00:08:51.920 experiences to change people's views about Trump's travel ban?
00:08:56.020 Well, my head practically exploded when I saw that because we had been studying in controlled
00:09:01.220 experiments since 2013, the power that ephemeral experiences have to change people's opinions and
00:09:10.080 attitudes and beliefs and purchases and votes.
00:09:12.860 What's an ephemeral experience?
00:09:14.680 Okay.
00:09:14.940 Most of the experiences you have online are ephemeral.
00:09:18.220 And ephemeral means fleeting, means you have the experience and then whatever was there,
00:09:24.600 the content disappears like in a puff of smoke and it disappears.
00:09:28.180 So, for example, you go to Google search engine, which you should never use, by the way, I can
00:09:32.580 explain why.
00:09:33.700 But, and you type in a search term, you start to type, they're flashing search suggestions
00:09:39.540 at you.
00:09:40.500 Those are ephemeral.
00:09:41.960 They disappear.
00:09:43.120 They're not stored anywhere.
00:09:44.400 You can't go back in time.
00:09:45.600 Search results populate below.
00:09:49.220 Those are ephemeral.
00:09:50.460 You can't go back in time and see what search results were there.
00:09:53.500 How about answer boxes, news feeds?
00:09:57.560 When you're on YouTube, you know those, the recommended one that's going to come up next,
00:10:02.660 the up next video?
00:10:04.300 That's not tracked.
00:10:05.200 It's not tracked.
00:10:06.020 That's ephemeral.
00:10:06.880 The whole list of recommended videos, it's all ephemeral.
00:10:10.460 What we started doing in 2016 with a very small system at the time was preserving that
00:10:17.580 and analyzing that.
00:10:19.340 We found, we were looking at Google, Bing, and Yahoo.
00:10:23.560 We found pro-Hillary Clinton bias in all 10 search positions on the first page of Google
00:10:30.240 search results, but not Bing or Yahoo.
00:10:33.140 That's very important.
00:10:33.840 Interesting.
00:10:34.320 For control.
00:10:34.820 So you're saying we should use Bing?
00:10:36.700 No, no, no, not at all.
00:10:38.520 But the point is that if that level of bias, because that's what our experiments look at,
00:10:45.820 they look at how bias can shift opinions and votes.
00:10:48.560 We measure that very precisely.
00:10:50.480 If that level of bias that we measure, that we capture, that we preserve, normally that's
00:10:55.620 never preserved, had been present nationwide in the 2016 election, well, that would have
00:11:04.540 shifted between 2.6 and 10.4 million votes to Hillary Clinton with no one knowing that
00:11:14.180 that had occurred because people can't see bias in search results.
00:11:17.900 They just click on what's highest.
00:11:19.660 They trust whatever that takes them to, if they're undecided.
00:11:22.860 So 2 to 10 million in 2016, you're saying 6 to 25 million in 2024.
00:11:28.680 What was 2020?
00:11:30.000 2020, Google alone shifted more than 6 million votes to Joe Biden.
00:11:37.840 Now, by the way, I supported Hillary Clinton.
00:11:40.060 I supported Joe Biden.
00:11:41.180 I lean left myself.
00:11:43.380 So I should be thrilled, but I'm not thrilled because I don't like the fact that a private
00:11:47.660 company is undermining democracy and getting away with it, and there's no restrictions on
00:11:53.860 them whatsoever, absolutely none.
00:11:55.680 They have an absolutely free hand.
00:11:57.800 So they do what they're doing blatantly and arrogantly.
00:12:01.020 Quick example of another ephemeral experience to show you how blatant and arrogant this is.
00:12:06.080 Florida in 2022.
00:12:07.920 Okay, so we were monitoring Florida because it's one of the key swing states.
00:12:10.860 On election day, November 8th, all day long, Democrats in Florida were getting go vote
00:12:19.460 reminders on Google's homepage.
00:12:21.780 Wow.
00:12:23.320 Conservatives, not so much.
00:12:25.680 In other words, 100% of Democrats in Florida were getting those reminders all day.
00:12:30.640 59% of conservatives.
00:12:32.500 That is extremely powerful and blatant.
00:12:36.260 But, you know, if you don't have a monitoring system in place to capture all that ephemeral
00:12:41.020 stuff, you don't know.
00:12:41.520 The FEC should be all over this.
00:12:42.920 This is an in-kind donation to the party, to the candidates.
00:12:47.160 They should be all over it.
00:12:47.680 The FEC is asleep at the switch.
00:12:50.120 They won't.
00:12:50.800 Yeah.
00:12:54.940 But Donald, President Trump beat the cheat in 2016.
00:12:54.940 Well, I think he beat the cheat in 2020.
00:12:57.140 Well, he's not president, so he totally did.
00:12:58.720 I know.
00:12:59.120 Well, but so 70,000 vote differential.
00:13:02.960 Yeah.
00:13:03.100 When we know that there's these, I would agree with you, smaller amount of cheat, but
00:13:08.720 through, you know, different tech.
00:13:10.780 Well, if this is true, I mean, then Trump's popularity is-
00:13:13.700 Is huge.
00:13:14.360 Oh, yeah.
00:13:14.600 It's huge.
00:13:15.020 It was a collective, what was it, like 44,000 votes in three swing states are what stopped
00:13:19.120 Trump in 2020.
00:13:20.600 That's right.
00:13:21.320 Exactly.
00:13:21.960 Now, one thing we've learned how to do, this is very recent, by the way, in our work,
00:13:26.060 we've learned how to look at an election that took place, look at the numbers, and we can
00:13:31.460 factor out Google now.
00:13:33.640 So, in 2020-
00:15:03.740 Trump won five out of what were generally considered to be 13 swing states.
00:15:09.140 If you factor out Google, Trump would have won 11 of those 13 swing states.
00:15:15.020 That's up in New York and California.
00:15:16.180 That's it.
00:15:16.680 Or one of our swing states, yeah.
00:15:17.740 And easily would have won in the Electoral College.
00:15:20.980 Yeah.
00:15:21.640 And you have CISA, the Cybersecurity and Infrastructure Security Agency, that has said that cognitive
00:15:31.160 infrastructure is what they want to be targeting right now.
00:15:34.760 Cognitive infrastructure.
00:15:36.220 Do you guys remember the leaked video of Google employees crying when Donald Trump won?
00:15:41.340 Of course.
00:15:42.120 So, this...
00:15:43.100 This is real.
00:15:44.320 This is real.
00:15:44.660 Oh, it's totally real.
00:15:45.220 Because they swore up on that stage, and it was all the leaders of Google up on that
00:15:49.100 stage, and they swore, we are never going to let this happen again.
00:15:54.000 Right.
00:15:54.140 So, are you doing anything with Missouri v.
00:15:55.940 Biden, where, you know, Missouri, you know, there's key claims in there about election,
00:16:03.320 well, censorship, obviously, but censorship goes, extends to censoring and suppressing
00:16:08.720 votes, effectively.
00:16:09.900 Well, I've worked for years with Jeff Landry, who just became governor-
00:16:14.560 Yep.
00:16:14.880 Big win.
00:16:15.440 ...of Louisiana, and I congratulated him that very day.
00:16:18.800 I thought for sure I'd never hear from him again now that he was governor, and he texted
00:16:21.940 me back.
00:16:22.620 Yeah.
00:16:22.840 He's a good guy.
00:16:23.400 The guy is...
00:16:24.180 He's a great guy.
00:16:26.140 Crazy accent.
00:16:27.120 Wow.
00:16:27.480 Really crazy accent.
00:16:28.440 Cajun.
00:16:28.860 Cajun.
00:16:29.300 Yeah.
00:16:29.560 But he's been helping me and my team for years, and he knows all about my work, and
00:16:35.600 he gets it.
00:16:37.440 He understands.
00:16:38.960 There are few people up there in leadership positions in our country who understand.
00:16:44.260 Unfortunately, it's very few.
00:16:45.660 He's one of the people who does.
00:16:48.300 He was involved in that Missouri case, as you probably know.
00:16:52.180 And yeah, of course we're interested in that, because the communication between the government
00:16:56.680 and Google and the gang, okay, that's very critical.
00:17:03.000 Obama's second term, who knows this?
00:17:07.920 Seven federal agencies were headed by former Google executives.
00:17:13.120 Obama's chief technology officer, former Google executive.
00:17:18.000 Hillary Clinton's chief technology officer, Stephanie Hannon, former Google executive.
00:17:24.060 Two hundred and fifty people went back and forth in the Obama administration between top
00:17:30.280 positions in his administration and Google.
00:17:32.980 How did Trump win in 2016?
00:17:34.100 He won because of a—it was a fluke.
00:17:38.860 You know, they took certain things for granted.
00:17:42.040 They weren't looking carefully enough at those tiny little numbers in the swing states.
00:17:47.120 And so, yes, a tiny margin in some swing states—
00:17:50.740 Seventy-seven thousand votes.
00:17:51.680 Just put—exactly.
00:17:53.140 Put him over the top in the Electoral College, and they were kicking themselves.
00:17:59.380 If Facebook, for example, just on Election Day had sent out partisan go-vote reminders, just
00:18:07.440 Facebook, one day, that would have given to Hillary Clinton an additional 450,000 votes.
00:18:15.740 But it is possible, then, albeit very, very difficult, that if you can mobilize Trump supporters
00:18:24.840 and conservatives to an extreme degree, they can overcome that bias.
00:18:29.800 Nope.
00:18:30.460 You don't think so?
00:18:30.900 Absolutely cannot.
00:18:31.120 No, because Google alone controls a win margin of somewhere between 4% and 16%.
00:18:38.880 So, now, if you're telling me, well, no, we've locked this up, we've got—we can guarantee
00:18:44.200 a win margin of, I don't know, call it 12%, but that's not true in this country.
00:18:49.780 In this country, we know we're split roughly 50-50 on the vote.
00:18:54.240 So, if there's some bad actor that has the ability to shift a whole bunch of people, especially
00:19:01.760 right at the last minute, especially on Election Day, you can't counter that.
00:19:05.360 I think we're split—it's not 50—because of what you just described in terms of the
00:19:11.080 bias that President Trump overcame, we're not split 50-50.
00:19:14.760 We're split more 60-40.
00:19:17.600 And now you have an awful candidate.
00:19:20.000 Joe Biden is a failed candidate for many reasons, and there are some major disasters going on.
00:19:25.940 You know, you look at the policy, you know, only one or two of these things have taken
00:19:31.240 out other presidents, but if you have, you know, economy, wars, medical tyranny, two-tier
00:19:38.120 justice system—
00:19:38.960 Not just wars.
00:19:41.880 Biden's approval rating collapsed after the Afghanistan withdrawal.
00:19:44.540 Right.
00:19:44.940 So, if you apply it to a candidate, if you have a really bad candidate, that's going to, you
00:19:50.820 know, it's going to hurt them, too.
00:19:51.840 So, can they dial it in?
00:19:54.380 Can you say, can Google dial it in and say, oh, we can get 30 million, you know, we can influence
00:19:59.420 30 million people because Joe Biden's so awful?
00:20:01.740 Is that what you're saying, or no?
00:20:03.040 Yeah, but then you should have gotten that red wave, and there was no red wave.
00:20:07.220 So, I published a piece in the Epoch Times that said, howgooglestoptheredwave.com, and
00:20:15.320 I explain exactly what happened there.
00:20:17.460 So, you should have had that huge red wave if you're—
00:20:20.020 So, in 2022, there should have been 30 or 40, right?
00:20:23.500 I can tell you exactly.
00:20:24.560 But some of that was—there was a wide variety of cheating that happened in that, and what
00:20:29.900 you're saying.
00:20:30.440 I agree with you.
00:20:31.160 I don't know.
00:20:31.360 But it's both.
00:20:32.140 No.
00:20:32.600 It is.
00:20:32.920 What I'm saying is so much bigger.
00:20:35.100 Even Kevin McCarthy funding people that are running against America First candidates.
00:20:39.780 Yeah.
00:20:40.140 He was—Kevin McCarthy was part of this problem.
00:20:43.160 And guess what?
00:20:44.100 Not just funding against, but obstructing.
00:20:45.380 And guess what?
00:20:46.400 It came back and has bitten him in the behind right now because the margins are so narrow.
00:20:52.040 If he had not done those things to suppress America First candidates in 2022, we wouldn't
00:20:58.440 be in this position right now.
00:20:59.820 I agree with you on the problem of Google, big tech, Google especially.
00:21:03.180 Me too.
00:21:03.580 But I don't see it as inevitability.
00:21:06.080 I see it as a David versus Goliath.
00:21:07.880 I see a possibility as slim as it may be.
00:21:10.260 Okay.
00:21:10.480 How about this, though?
00:21:11.860 Why don't we just push Google and the gang, push them out of our elections, and push them
00:21:19.360 out of the minds of our kids because that's something we started studying, too.
00:21:22.720 And that's the win.
00:21:23.640 When you say, can we get a win in 2024, forget the party.
00:21:27.460 That is the win.
00:21:28.540 Well, that levels the playing field.
00:21:31.220 Absolutely.
00:21:31.560 That's what we need.
00:21:32.160 And that gives you a freer and fairer election.
00:21:35.120 I think something you mentioned is the most important point.
00:21:39.080 It doesn't matter if YouTube spams nothing but Donald Trump content.
00:21:43.860 It doesn't matter what's on the front page of Reddit, the default page.
00:21:47.800 It doesn't matter if Twitter X and all these platforms every day slam you with pro Trump,
00:21:53.460 pro Trump, pro Trump.
00:21:54.680 If during the election cycle, because now we're in a month, election month, not election day,
00:22:00.380 Google comes out and runs.
00:22:03.040 Go vote only for Democrats.
00:22:05.740 That's enough.
00:22:06.820 That's enough because we're talking about victory margins for the for the presidential
00:22:10.660 election of very, very slim.
00:22:12.140 And if it's 77,000 votes that gets Trump the victory or 42, 44,000 in 2020, all Google
00:22:18.060 has to do is blast everyone, their algorithm, their AI knows as a Democrat with don't forget
00:22:23.180 to vote today.
00:22:24.000 And then for all the Republicans, all they have to do is put up: make sure you're watching
00:22:27.800 the new movie today, and they can shift the percentages enough to secure
00:22:32.420 Joe Biden's win.
00:22:34.740 That's what I'm trying to tell you. That's it.
00:22:35.940 But you're only talking about one little thing. Exactly. That cost them nothing, by
00:22:39.360 the way, cost them zero to do that.
00:22:40.780 But how about let's back up a few months and they do the same with register to vote, right?
00:22:46.300 That's what they're doing now.
00:22:47.040 Facebook's doing it now.
00:22:47.820 TikTok, Instagram, they're doing it now.
00:22:49.980 No, TikTok, TikTok banned.
00:22:51.440 Hold on a second.
00:22:53.020 OK, you don't know what they're doing unless you're doing monitoring.
00:22:57.700 You don't know what I see anecdotes of what's coming across my feed.
00:23:00.940 We're collecting our data that are admissible in court on a massive scale.
00:23:07.000 We are now monitoring big tech content through the computers of more than 12,000 registered
00:23:13.320 voters politically balanced in all 50 states, 24 hours a day.
00:23:18.460 We have collected in recent months, preserved more than 51 million, might be up to 52 today,
00:23:26.200 51 million ephemeral experiences on Google and other platforms, content that they never
00:23:33.080 in a million years thought anyone would preserve.
00:23:36.400 And we're preserving it.
00:23:37.900 Every single day, we have 30 to 60 additional new people added to our nationwide panel.
00:23:44.660 And so every single day, we're recording more and more of this content.
00:23:47.980 And we've been learning how to analyze it in real time.
00:23:51.260 So let me tell you how you push these companies out of our elections and get them out of our kids'
00:23:58.860 heads.
00:23:59.240 2020, we had so much dirt on Google that I decided we're going to go public before the
00:24:09.960 election.
00:24:11.160 So I called up a reporter at the New York Post and I sent her in a bunch of stuff and she
00:24:18.140 got the assignment.
00:24:19.280 This is a few days before the election.
00:24:21.460 She wrote up the piece.
00:24:23.460 Her name is Ebony Bowden.
00:24:25.020 You can look her up because she got fired soon afterwards.
00:24:29.240 And she read some of the pieces to me on the phone.
00:24:32.220 It was fantastic.
00:24:32.860 Now, just a few weeks before the New York Post had broken the story about Hunter Biden's
00:24:38.340 laptop.
00:24:39.980 So there, you know, and that was front page, right?
00:24:42.520 Well, this story that she was writing about the election rigging, that was going to be
00:24:46.380 New York Post's front page.
00:24:48.300 Wow.
00:24:49.080 So Friday, October 30th, a couple of days before the election, her editor called Google for comment.
00:24:57.140 And guess what happened that night?
00:25:00.760 The story got killed.
00:25:02.740 Spiked it.
00:25:03.560 She was so furious.
00:25:06.280 Now, how could that possibly have happened?
00:25:09.380 Well, the New York Post could take on Twitter because they were only getting 3 or 4% of their
00:25:13.580 traffic from Twitter.
00:25:14.480 But they were getting 45% of their traffic, ring a bell, 45% of their traffic from Google.
00:25:22.560 They could not take on Google.
00:25:24.820 I knew a guy.
00:25:25.880 He ran a, like an at-home store.
00:25:30.040 He worked from home.
00:25:31.820 Google changed their search algorithm one day and his business went to zero.
00:25:36.300 That's right.
00:25:36.980 This happens every, this happens every day.
00:25:39.100 Yep.
00:25:39.240 I mean, I've written about this.
00:25:40.520 This is, that's the power that this company has.
00:25:44.420 And people are, people in business are terrified of Google because Google can just put you out
00:25:50.060 of business like that.
00:25:51.440 They broke up, you know, Ma Bell was broken up for, um, for lesser
00:25:56.880 things.
00:25:57.480 Let me, let me just finish my 2020 story.
00:25:59.960 I'm almost there.
00:26:00.520 I'm almost there.
00:26:00.960 I promise.
00:26:02.220 Okay.
00:26:03.020 So I was distraught.
00:26:05.520 Ebony Bowden was distraught at the New York Post.
00:26:07.620 Okay.
00:26:08.200 She was really mad.
00:26:10.280 She didn't last much longer there.
00:26:11.900 That editor also didn't last much longer there, interestingly enough.
00:26:15.420 But I sent everything into Ted Cruz's office also.
00:26:19.680 And on November 5th of 2020, Ted Cruz and two other senators sent a very threatening letter
00:26:25.300 to Sundar Pichai, the CEO of Google.
00:26:28.100 If you want to look at it, it's LetterToGoogleCEO.com.
00:26:32.300 LetterToGoogleCEO.com.
00:26:33.900 It is a fabulous letter written by Cruz and his, his buddies, and it's two pages long.
00:26:41.460 And it says, you testified before Congress saying you don't mess with elections, but
00:26:46.520 Epstein's data show the following.
00:26:49.040 Okay.
00:26:49.320 So what happens then on November 5th?
00:26:52.060 On November 5th, that very day, Google turned off all of its manipulations in Georgia.
00:26:59.340 We had more than a thousand field agents in Georgia.
00:27:01.960 Wow.
00:27:02.080 We, we, we preserved a million ephemeral experiences in Georgia.
00:27:06.540 This is in the two months leading up to their Senate runoff elections.
00:27:10.080 They literally turned off everything.
00:27:13.040 Bias in Google search, political bias went to zero, which we've never seen before.
00:28:45.020 They stopped sending out partisan go vote reminders.
00:28:52.340 What is, what is, is there, this, this sounds like with your data being admissible in court,
00:28:57.660 anyone in any state could have standing and file a lawsuit.
00:29:00.680 Correct.
00:29:01.240 And that's why I'm working with AGs around the country right now.
00:29:04.280 That's why I'm working with Paul Sullivan, who's a very well-known D.C. attorney who used
00:29:08.900 to work with the Federal Election Commission.
00:29:11.080 He's helping us to prepare a complaint to the FEC about Google because we have the data
00:29:17.460 from 2022.
00:29:19.400 In 2022, we preserved two and a half million ephemeral experiences related just to the, you
00:29:24.300 know, in those days leading up to that election.
00:29:26.580 But now we're setting up a permanent system that's currently running 24-7 in all 50 states.
00:29:33.700 It needs to be much, much bigger so that we have representative samples and so it's court
00:29:39.620 admissible in every state.
00:29:41.000 So it was the day that Senator Cruz sent this letter out, the bias seen on Google in Georgia
00:29:49.360 disappeared.
00:29:50.860 Like that, like flipping a light switch.
00:29:53.020 And that phrase came to me from a Google whistleblower named Zach Vorhies, who you may have heard
00:29:57.440 of.
00:29:57.720 Yep.
00:29:58.200 He's a good person for your show.
00:29:59.700 I think we've had him on, didn't we?
00:30:00.960 We've had him on.
00:30:01.660 Pretty sure.
00:30:02.500 I don't know.
00:30:03.080 Yeah.
00:30:03.420 He's the guy who walked out of Google with 950 pages of documents and a very incriminating
00:30:10.020 video.
00:30:10.660 And he put it all in a box and sent it off to Bill Barr, who at that time was Attorney General
00:30:15.140 of the United States.
00:30:16.440 And then Google went after him with police and a SWAT team went after Zach Vorhies.
00:30:22.140 Wow.
00:30:22.820 I'm sure Bill Barr didn't look at it.
00:30:24.400 He didn't.
00:30:24.740 He doesn't look at anything.
00:30:26.220 We sent him lots of evidence.
00:30:27.640 He didn't look at any of it.
00:30:28.840 Well, the point is, though, that Zach, what Zach did was very, very courageous.
00:30:33.620 He's become a friend over the years.
00:30:36.240 Yes.
00:30:37.280 And that's Zach's phrase.
00:30:39.100 It's like flipping a light switch.
00:30:40.580 They have the ability to turn off, turn these manipulations on and off, like flipping a light
00:30:46.780 switch.
00:30:46.980 And we made them do it on November 5th.
00:30:51.180 Now, just imagine a much, much larger system running 24-7 with a public dashboard, which,
00:30:59.120 by the way, you can get a glimpse of right now.
00:31:01.280 It's at americasdigitalshield.com, and it looks gorgeous.
00:31:06.500 In the securities market, there's a concept of a quiet period, you know, where there's
00:31:12.660 no discussions.
00:31:13.440 You can't put out press releases, or you can't say certain things, you know, 30 days, plus
00:31:17.480 or minus, when you come out.
00:31:19.100 Maybe there's a remedy here to say that if you contract this and they abide by it, the
00:31:24.160 big tech needs to be in a quiet period for, you know, months before the election.
00:31:29.060 Oh, no, no.
00:31:29.360 This is going to be, this system is permanent.
00:31:33.300 This system is running.
00:31:34.080 Oh, you're trying to get a permanent remedy to remove all bias?
00:31:38.640 Is that it?
00:31:39.720 What's a remedy?
00:31:40.500 We're focused on two areas, elections, which is critical, because right now, believe me,
00:31:45.280 democracy in this country is an illusion.
00:31:48.060 And the second is kids, because we're collecting data now for more than 2,500 children around
00:31:55.700 the country, and we're actually looking at what they're actually getting from these tech
00:32:00.700 companies, and we don't even understand it.
00:32:04.220 It is so bizarre, and so weird, and so creepy, and so violent, and so sexual.
00:32:11.540 Yep.
00:32:12.060 We don't even understand it.
00:32:13.740 We will, we will understand it.
00:32:15.400 Are you, are you familiar with Elsagate?
00:32:17.480 No.
00:32:18.400 Uh, you should, you should look into this, especially with your research.
00:32:20.820 The frozen, the frozen push out?
00:32:22.420 It wasn't just that, but, uh, Elsagate was the name of this phenomenon that happened several
00:32:26.400 years ago, about maybe five years ago, where people, uh, adults weren't noticing this because
00:32:31.720 the, the feeds that we're getting are like, you know, CNN and, and entertainment and celebrities
00:32:38.080 and music and sports.
00:32:39.280 Kids were getting initially a wave of videos.
00:32:42.800 Uh, this is where Elsagate comes from, of Elsa, Spider-Man, and the Joker running around
00:32:46.720 with no sound, like with, with no, with no dialogue, engaging in strange behaviors, right?
00:32:51.440 So it started with Elsa going, Ooh, and the Joker kidnapping her and then Spider-Man saving
00:32:55.560 her.
00:32:56.080 The general idea was Joker, Elsa, and Spider-Man were very popular search terms in the algorithm.
00:33:01.260 And so if you combined these things in a long video, kids would watch it and they'd get high
00:33:07.220 retention and all that.
00:33:08.060 It would promote it more.
00:33:09.060 It devolved into psychotic amalgamations of Hitler with breasts and a bikini doing Tai Chi while
00:33:19.280 people from India sing nursery rhymes.
00:33:22.580 And then it started, uh, you started getting these videos where the thumbnails were people
00:33:28.320 drinking from urinals and eating human feces.
00:33:30.260 And this was being given to toddlers and children on YouTube.
00:33:33.900 Sick.
00:33:34.580 What had happened?
00:33:35.200 And, uh, people, there are a lot of, uh, uh, you know, amateur internet sleuths started
00:33:39.440 digging into what was going on.
00:33:40.540 The general idea was that this section of YouTube was completely overlooked or ignored,
00:33:45.720 or perhaps it was intentional.
00:33:47.100 But what happened was parents would select a nursery rhyme on the, on a tablet and give
00:33:52.240 the tablet to a baby, put it in front of them being like, there, I'll get a few minutes
00:33:55.980 to myself.
00:33:56.880 The baby watches a very innocent nursery rhyme video, but the next up video would slowly move
00:34:02.560 in the direction of this psychotic algorithmic nightmare to the point where, like I mentioned,
00:34:08.460 the nursery rhyme, it was Finger Family, where a hand would pop up showing Hitler's head on
00:34:13.640 one of the fingers.
00:34:14.220 And then on another finger, Hitler with breasts.
00:34:16.700 I am not kidding in a bikini doing Tai Chi with the incredible Hulk.
00:34:20.180 And then eventually videos where like Peppa Pig was being stabbed mercilessly with blood
00:34:24.840 spring everywhere.
00:34:26.000 Pregnant women were eating feces and getting injections while it was happening.
00:34:28.820 And because these videos started doing well, it actually resulted in human beings seeing
00:34:34.580 the success of these videos, giving their daughters, and this is in like Eastern Europe
00:34:39.300 and Russia, videos going viral where a father lays his daughter down and gives her an injection
00:34:45.160 of some sort, 10 million views.
00:34:47.600 This was, and eventually this, there was a massive backlash and people realized this was
00:34:51.280 happening and then it started getting crushed.
00:34:53.820 And then YouTube had a big panic.
00:34:55.100 And then they said, we're going to roll out YouTube kids and we're going to be very safe
00:34:58.220 and try and protect them.
00:34:59.600 But this is something I don't know if you think was intentional or was just a byproduct
00:35:03.960 of their, of their machine on accident.
00:35:05.860 Okay.
00:35:05.920 First of all, you're talking about it in all in the past tense.
00:35:08.700 Our system is running 24 hours a day now.
00:35:11.700 Also, you're talking about it legally from the perspective of anecdotes.
00:35:17.140 Those are all anecdotes.
00:35:18.800 Yeah.
00:35:19.020 That's not what we're doing.
00:35:20.060 We're, we have, we have a larger and larger and larger group of people that are politically
00:35:25.340 balanced.
00:35:25.940 We know their demographics.
00:35:27.420 This is, this is what, and by the way, we're not out there searching for crazy stuff on YouTube.
00:35:33.180 We're not doing that at all.
00:35:34.240 We're actually collecting the, the videos, hundreds of thousands of videos that, that our
00:35:39.620 kids are actually watching.
00:35:41.340 Plus we've learned that 80% of the videos that kids watch are suggested by that
00:35:48.540 up next algorithm.
00:35:49.780 Think of the power you have to manipulate just because of that algorithm.
00:35:54.180 That's incredible.
00:35:54.860 A friend, a friend told me, I know this is all anecdotal, but I do think it's, it's the
00:35:58.800 anecdotes that I'm referring to are people started to notice something and then you have
00:36:02.620 the hard evidence, the hard data.
00:36:04.060 A friend, a friend told me that she was watching her, her kid was watching Disney channel and
00:36:09.080 an anti-Trump commercial came on and she was like, what the, why?
00:36:13.460 Why?
00:36:14.420 Because powerful interests are slamming the battlefield in this way.
00:36:19.440 I think what you're talking about is the Kraken.
00:36:22.380 Well, maybe I shouldn't call it that because you know, Sidney Powell.
00:36:24.860 No, we'll talk about that too.
00:36:25.920 Look, look, let me, let me give you an example.
00:36:28.780 A parent walking by their kid's tablet, let's say, wouldn't even notice that anything was
00:36:33.760 wrong.
00:36:34.140 Yep.
00:36:34.460 Okay.
00:36:34.740 But we're collecting the actual videos and here's what happens.
00:36:40.820 There's a weird cartoon.
00:36:43.380 So this, you know, relates to what you said just a few minutes ago, a weird cartoon, but
00:36:47.960 then all of a sudden, boom, something crazy happens.
00:36:51.300 There's a shriek and a head flies through the air and there's blood everywhere and then
00:36:55.300 it's gone.
00:36:55.920 So it's very, very brief.
00:36:58.180 That's happening.
00:36:58.960 Oh yeah.
00:36:59.500 That's happening right now.
00:37:00.700 It's very brief.
00:37:03.220 And what we're finding is something like, well, first of all, 80%, that's, that's rock
00:37:08.400 solid now.
00:37:09.120 80% of the videos that, that little kids are watching, those are all suggested by Google's
00:37:13.460 algorithm.
00:37:14.120 Are you monitoring the gamings like even Roblox?
00:37:17.140 I mean, I could, I've seen some kind of unusual things in Roblox, but that could be crowdsourced,
00:37:22.220 you know, individuals putting up their own little games or characters.
00:37:25.200 Are you looking at the gaming?
00:37:26.760 Look, practically every day now we're expanding the system.
00:37:30.700 And so we're monitoring more and more different kinds
00:37:34.960 of content.
00:37:35.920 Uh, we've got, we're now looking at TikTok.
00:37:39.160 Uh, there's nothing we can't monitor.
00:37:41.520 That's why monitoring systems have to be a permanent part of not just our country and
00:37:47.460 our culture, but really everywhere in the world outside of mainland China.
00:37:52.120 These systems have to be set up.
00:37:53.860 And we have, we've been approached by people from seven countries so far.
00:37:57.700 The last two are Ecuador and, and South Africa and begging us to come help them set up these
00:38:04.840 systems.
00:38:05.800 And here is the only, only area where I've ever agreed with Trump on this issue.
00:38:10.680 I say America first.
00:38:12.120 We've got to, we've got to develop our own full system that's operating, you know, and
00:38:18.380 it has to be, you have to have representative samples.
00:38:21.200 This all has to be done very scientifically so that this is court admissible in every state.
00:38:25.680 That's how you push them out because you make them aware through public dashboards, through
00:38:32.680 press releases, uh, through data sharing with certain key journalists, members of Congress,
00:38:38.700 AGs.
00:38:39.800 That's how you push them out.
00:38:41.020 They would be insane to continue this stuff.
00:38:44.920 I want, I want to go back to that one point that you just made.
00:38:47.160 So what you're saying that there are innocent looking videos, the thumbnail may just be a
00:38:50.920 smiling little, little sheep.
00:38:52.500 And it says like, learn your ABCs, right?
00:38:54.640 It's 15 minutes long, but then at eight minutes and three seconds, all of a sudden a head pops
00:38:59.180 up, explodes, and it's gone.
00:39:01.160 Add to that the fact that if you, if you mouse over the bottom of that video, you can actually
00:39:06.960 see, you know, the frequency with which that part of the video is being viewed.
00:39:12.080 And very often now we're seeing a spike right at that point where that crazy brief thing
00:39:18.820 happens.
00:39:19.980 Now, how do we find examples of this?
00:39:21.500 They rewind and go and watch it again.
00:39:23.780 Is there, is there.
00:39:24.520 They're watching those parts over and over and over and over and over again.
00:39:27.680 That's what that means.
00:39:28.120 Is there a way to find one of these right now or is it, is it buried in YouTube?
00:39:32.380 It's, it's, I mean, we, we have, well, well, first of all, if you, if you just scroll
00:39:37.860 down, scroll down there, just right there.
00:39:41.480 Oh, I see.
00:39:42.120 So that's, that's going to be a carousel.
00:39:44.020 Like this right here.
00:39:45.140 Like, yeah, but that's going to be a carousel showing images that we're collecting in real
00:39:50.600 time.
00:39:51.120 You can pull this up, Kellen.
00:39:52.260 We have a, yeah.
00:39:54.160 So, so in other words, you're, you're going to see actual real.
00:39:59.720 This is your site.
00:40:00.620 And you're going to be able to click on them and that's going to take you to the videos.
00:40:03.800 These are, this is your site.
00:40:04.740 You put these up on your site.
00:40:05.720 So this is a mock-up for now.
00:40:07.800 Oh yeah.
00:40:08.180 No, go, go, go down right there.
00:40:10.180 Look at that.
00:40:11.240 This is really nuts.
00:40:13.120 This is so nuts because on the left you see the, the political leaning of every state and
00:40:20.040 it, and that should be very familiar to you.
00:40:22.040 But on the right, what we're showing you is the bias in, on Google search results in content
00:40:28.040 being sent to people in every state.
00:40:30.040 It's all blue in all 50 states.
00:40:32.480 Is there, is there a finite ceiling to this?
00:40:34.760 I don't know if you want to continue on the, on the, the, the indoctrination or the subliminal,
00:40:40.460 you know, messaging to children, but, but with respect to elections, you think people are
00:40:46.440 smarter?
00:40:47.020 Is there a cap to how many, if Google tries to do 10 or 20 million more people, is there
00:40:55.640 a decreasing marginal gain?
00:40:57.640 I mean, people can figure this out.
00:40:59.080 You know, they, they, they can recognize bias.
00:41:01.480 Okay.
00:41:02.420 Are they dumb?
00:41:03.720 I mean, what are you saying?
00:41:04.760 No, I mean, no, I laugh because the discoveries that we've been making all these years have
00:41:10.960 there, let's put it this way.
00:41:13.100 The more we learn, the more concerned we've all become.
00:41:16.420 For example, when we first did a nationwide study in the U.S., we had more than 2,000 people
00:41:21.640 in all 50 states.
00:41:22.800 They were being shown biased content on our Google simulator, which we call
00:41:29.560 Kadoodle.
00:41:30.580 And they were shown biased content.
00:41:32.800 And, uh, usually people can't see the bias. Now, you know, where there's
00:41:41.020 biased content, we can shift people in one direction or another, whichever way we want to, because
00:41:45.300 we use random assignment.
00:41:47.500 Uh, we can easily shift between 20 and 80% of undecided voters.
00:41:52.580 Wow.
00:41:53.180 Of the undecideds.
00:41:54.180 And that's with one search; with multiple searches, that number goes up.
00:43:27.080 Okay, so that's...
00:43:30.840 It's mind control.
00:43:34.640 It's an information war directed at the American people.
00:43:37.760 It was large enough so we did have a few people, about 8%, who could see the bias.
00:43:43.800 So we were able to look at them separately because the study was so large.
00:43:47.720 And you would think we're not going to get an effect with those people.
00:43:52.480 No.
00:43:53.180 They shifted even farther in the direction of the bias.
00:43:56.320 Wow.
00:43:56.640 So seeing the bias doesn't protect you from the bias because there's this trust people
00:44:01.800 have in algorithms because they don't know what they are and computer output because they
00:44:06.400 don't know what that means.
00:44:07.940 And so they think if the algorithm itself is saying this is the better candidate, it really
00:44:14.520 must be the better candidate.
00:44:16.480 Seeing the bias does not protect people from the bias.
00:44:19.600 So is there a cap?
00:44:21.320 That's the question you're asking.
00:44:22.180 That's the question, yeah.
00:44:22.840 So right now we're studying things that are new for us.
00:44:26.340 We're studying what happens if one of these platforms is using one of these manipulations
00:44:33.840 over and over and over and over again.
00:44:37.320 And so far what we're seeing is that the effect is additive.
00:44:41.780 So you do kind of...
00:44:42.480 But it's probably at a decreasing rate.
00:44:44.140 Correct.
00:44:44.300 So you do kind of hit an asymptote, but the point is just by repeating these, notice if
00:44:50.340 you expose people to similarly biased content, the numbers go up.
00:44:55.380 The shift gets bigger, but you're right, the rate goes down.
00:44:58.320 Now, the other thing we're starting to look at is multiple platforms.
00:45:01.320 What happens if three or four of these big platforms in Silicon Valley are all supporting
00:45:06.860 the same candidate?
00:45:08.460 And we're seeing initially in our initial work, again, those are additive effects.
00:45:13.260 So this is scary stuff because way back, remember Eisenhower's famous speech from 1961,
00:45:23.060 military-industrial complex?
00:45:24.880 Yep.
00:45:25.200 But that same speech, Eisenhower was warning about the rise of a technological elite that
00:45:32.400 could control public policy without anyone knowing.
00:45:36.120 And guess what?
00:45:37.760 The technological elite are now in control.
00:45:41.420 There was a hearing with Twitter back when it was still Twitter, and I think one of the
00:45:46.320 most important things that was brought up was that, I can't remember who it was, but
00:45:49.900 one of the members of Congress said, if you go onto Twitter and create a new profile right
00:45:54.080 now, it shows you all the suggested follows are Democrats, no Republicans.
00:45:59.440 So that means as soon as you sign up, you say, I don't know, I'll follow this person,
00:46:03.080 I guess.
00:46:03.600 You're being slammed by pro-Democrat messaging.
00:46:05.840 And that was just Twitter.
00:46:07.680 And that was just the who to follow.
00:46:10.240 That's not even, you know... now we've seen this big push to switch from reverse
00:46:14.640 chronological feeds to algorithmic feeds.
00:46:17.160 And perhaps people don't realize what the true power is behind that and why they want
00:46:22.740 it.
00:46:23.260 For one, they can make more money with it for sure, but it takes away your ability to
00:46:27.480 subscribe to who you want.
00:46:29.300 This was a big deal with Threads.
00:46:30.860 When Threads launched, and this is Instagram's version of Twitter or whatever,
00:46:34.000 worst platform ever, in my opinion, because I'm like, OK, I'm on Instagram.
00:46:37.520 I'll sign up for Threads.
00:46:38.980 There is no reverse chronological feed.
00:46:40.700 It was only algorithmic.
00:46:41.640 I was getting a bunch of Democrats in my feed, which was strange.
00:46:45.980 I don't follow them.
00:46:46.680 And I was getting weird entertainment stuff and weird jokes that meant nothing to me.
00:46:51.760 But their default position was we're going to tell you what to look at.
00:46:55.400 And I wonder if what they were doing was intentionally trying to create a platform.
00:46:58.980 So, look, you have Twitter.
00:47:01.140 Twitter defaults to the algorithmic feed.
00:47:03.120 Twitter is very much biased, even working with intelligence agencies through secret back
00:47:07.380 doors for moving content; lawsuits now underway, and some already resolved, prove this.
00:47:12.540 Elon takes over instantly.
00:47:14.000 Zuckerberg's like, we're going to launch an alternative.
00:47:15.960 The innocent take on this is, well, you know, they see a market opportunity.
00:47:20.940 I disagree.
00:47:21.960 I think they realized, oh, one of our key assets in this manipulation has just fallen to someone
00:47:28.740 who disagrees with us.
00:47:30.280 So Threads rolls out a heavy-handed algorithmic feed and it got a wave of complaints.
00:47:35.160 It was too overt.
00:47:36.780 Now they're saying we're going to pull that back a little bit.
00:47:38.660 But for the longest time, Instagram has not been reverse chronological.
00:47:42.820 Reverse chronological, for those that don't understand it,
00:47:44.680 means exactly what it says.
00:47:45.480 You see on your feed the latest thing that someone posted.
00:47:50.300 And so if your friend posts now, you'll see it.
00:47:52.620 But if your friend posted three hours ago, it's already long gone.
00:47:54.980 The argument from these big tech platforms is, oh, but what if you like that three hour
00:47:58.500 old post?
00:47:59.380 We're going to make sure you see it.
00:48:00.820 What ends up happening on Instagram,
00:48:03.100 and this annoys the crap out of me,
00:48:05.260 is that I get weird posts and I'm scrolling through a feed of things I don't care about.
00:48:08.920 They're testing you, right?
00:48:11.100 Seeding the battlefield.
00:48:12.580 How long do you linger on this post versus this post?
00:48:15.260 Then they know what to send you more of.
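As a rough sketch of the two feed types being discussed, the snippet below contrasts a reverse-chronological feed with one ranked by how long the user lingers on each kind of content. The Post fields, the example posts, and the ranking rule are invented for illustration; no platform's real ranking code is this simple.

```python
# Hypothetical sketch contrasting a reverse-chronological feed with a dwell-time-ranked one.
# The data model and numbers are made up; this is not any platform's actual ranking code.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    posted_at: float           # Unix timestamp of the post
    avg_linger_seconds: float  # how long this user tends to dwell on this kind of content

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Newest first: you simply see the latest thing the people you follow posted.
    return sorted(posts, key=lambda p: p.posted_at, reverse=True)

def algorithmic_feed(posts: list[Post]) -> list[Post]:
    # Ranked by observed dwell time: whatever you linger on gets pushed up,
    # regardless of recency or whether you follow the author.
    return sorted(posts, key=lambda p: p.avg_linger_seconds, reverse=True)

posts = [
    Post("friend", "family photos", posted_at=1_700_000_300, avg_linger_seconds=2.0),
    Post("stranger", "political outrage", posted_at=1_700_000_100, avg_linger_seconds=9.5),
    Post("brand", "ad for a UFO toy", posted_at=1_700_000_200, avg_linger_seconds=6.0),
]

print([p.topic for p in chronological_feed(posts)])  # newest first
print([p.topic for p in algorithmic_feed(posts)])    # whatever holds attention first
```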
00:48:17.440 But what they're also doing is using that as the argument.
00:48:20.900 They're going to start seeding you information to control what you think.
00:48:24.220 And I've got to be honest, this UFO that we've got sitting right here, I don't think anybody
00:48:26.860 can see it, but this UFO, I got it because of Instagram.
00:48:29.840 They knew I wanted it.
00:48:31.040 Wow.
00:48:31.300 Look at that thing.
00:48:31.960 Yeah.
00:48:32.460 Just float.
00:48:33.100 It's cool, huh?
00:48:33.940 Yeah.
00:48:34.480 They send me the ad and I say, I want that.
00:48:37.680 But what you don't see is that sometimes it's not an ad.
00:48:40.240 It's a post from someone saying, did you know about bad thing from this person?
00:48:44.360 Don't vote.
00:48:45.300 Vote for them instead.
00:48:47.180 That's the game being played.
00:48:48.540 Well, there's a little more to it.
00:48:50.140 So let me explain.
00:48:51.800 These companies have another advantage over all the usual, traditional dirty tricks,
00:48:58.480 which are inherently competitive and don't bother me that much because they're inherently
00:49:02.040 competitive. But these companies have an advantage those tricks don't, which is they know
00:49:07.200 exactly who is undecided.
00:49:09.980 And that's who can still be influenced.
00:49:12.720 Exactly.
00:49:13.200 They know down to the shoe size of those people.
00:49:17.280 They know exactly who they are.
00:49:18.960 So they can concentrate and in a manner that costs them nothing, they can concentrate just
00:49:25.500 on those people.
00:49:26.840 So talk about swing states, swing counties, swing districts.
00:49:31.260 Okay.
00:49:31.460 Well, here we're talking about, they know who the swing people are.
00:49:34.720 So the political world, they do this all the time, trying to identify people based on voting histories.
00:49:40.460 But what's Google doing to identify?
00:49:44.180 Is it looking at all their search?
00:49:46.360 Are they looking at getting everything off their phone to figure it out?
00:49:49.320 How are they doing it?
00:49:50.460 Well, you and I have been using, maybe not Tim because he looks a little bit younger than
00:49:56.560 us, but you and I have been using the internet for 20 years.
00:50:00.460 I've been using it longer than you guys.
00:50:02.480 Okay.
00:50:02.740 That's cool.
00:50:03.300 Late 80s when I was a little kid, we had CompuServe.
00:50:05.620 Oh, yeah.
00:50:06.200 Well then, I hate to tell you, but Google alone has more than 3 million pages of information
00:50:14.500 about you.
00:50:16.260 3 million pages.
00:50:19.640 They're monitoring everything you do, not just if you're stupid enough to use their surveillance
00:50:25.040 email system, which is called Gmail, or their surveillance browser, which is called Chrome,
00:50:30.640 or their surveillance operating system, which is called Android.
00:50:33.760 These are surveillance systems.
00:50:35.780 That's what they are.
00:50:36.960 That's all they are.
00:50:37.460 The Chinese couldn't do it better, could they?
00:50:39.340 But not only are they doing that, they're actually monitoring us on more than 200 different
00:50:48.560 platforms, most of which no one has ever heard of.
00:50:50.660 So, for example, millions of websites around the world use Google Analytics to track traffic
00:50:58.500 to their websites, and it's free, free, free.
00:51:01.900 Of course, nothing's really free.
00:51:03.000 You pay with your freedom.
00:51:04.040 But the point is, Google Analytics is Google, and according to Google's terms of service
00:51:10.680 and privacy policy, which I actually read over and over again, whenever they make changes
00:51:15.400 in it, if you are using any Google entity of any sort that they made, then they have
00:51:23.320 a right to track you.
00:51:24.420 So, you are being tracked on all of those websites by Google.
00:51:28.300 Every single thing you do on those websites is being tracked by Google.
00:51:31.560 So, I don't need to ask you.
00:51:33.320 I mean, you know about Facebook shadow profiles.
00:51:35.620 Of course.
00:51:36.240 This is an amazing phenomenon.
00:51:38.100 I explained to people.
00:51:38.880 Are you familiar with shadow profiles?
00:51:40.040 No, but I'm sure I see them in my feed.
00:51:43.020 Let me just...
00:51:43.560 No, no, no.
00:51:44.320 You don't say that.
00:51:45.060 No, you don't.
00:51:45.340 To everybody who's listening, you have a Facebook profile.
00:51:48.600 And I love starting this because then...
00:51:50.860 Someone says, no, I don't have a Facebook.
00:51:52.920 I've never signed up for Facebook.
00:51:54.520 Okay.
00:51:54.680 And here's the really simple version.
00:51:56.400 I'll play that role.
00:51:57.340 Is it a clone or a ghost of you?
00:51:58.840 No, no, no.
00:51:59.300 I'll give the simple version and throw it to Dr. Epstein, who knows better than I.
00:52:03.780 But when you sign up for...
00:52:05.680 You're on Facebook, right?
00:52:06.360 Yep.
00:52:06.640 When you sign up, you get a little prompt.
00:52:08.680 Hey, would you like to add your friends and family through your phone book?
00:52:12.520 Simple way they can do this.
00:52:14.660 Your mom does not have a Facebook profile.
00:52:16.820 She's never signed up, but she does have a shadow profile.
00:52:18.960 When you sign up and say, import my friends, it then finds in your phone book, mom, 555-1234.
00:52:27.340 Guess what?
00:52:27.760 Your brother also signs up.
00:52:29.420 Mom, 555-1234.
00:52:31.440 What happens then is all those little bits of data, Facebook then sees that and says,
00:52:35.480 we know that mom has these sons.
00:52:38.700 We know from public data on the phone number, mom's name is Jane Doe.
00:52:43.440 Now they've compiled a profile on her:
00:52:46.440 Your mom, her friends, her family, where she works, her salary, all that information from
00:52:50.260 all these ancillary sources.
00:52:52.140 And you probably know better than I do, so I don't know if you want to elaborate.
00:52:54.240 Well, because from that point on, once that has been set up, information continues to flow in
00:53:01.260 and build that profile.
00:53:02.740 So that profile becomes, over time, immense, just as all these profiles are immense.
00:53:09.120 The shadow profiles are immense as well.
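A minimal sketch of the mechanism just described, with entirely made-up data: two registered users upload their contact books, and the records are merged on the phone number into a profile for someone who never signed up. This illustrates the idea only; it is not Facebook's actual pipeline.

```python
# Illustrative sketch (invented data, not Facebook's real system) of how contact uploads
# from two registered users can be merged, keyed on phone number, into a "shadow profile"
# for a person who never created an account.

from collections import defaultdict

# Each registered user grants access to their phone book on signup.
uploaded_contacts = {
    "son_1": [{"name": "Mom", "phone": "555-1234"}],
    "son_2": [{"name": "Jane Doe", "phone": "555-1234"}],
}

shadow_profiles: dict[str, dict] = defaultdict(lambda: {"names": set(), "known_by": set()})

for uploader, contacts in uploaded_contacts.items():
    for contact in contacts:
        profile = shadow_profiles[contact["phone"]]  # phone number acts as the join key
        profile["names"].add(contact["name"])
        profile["known_by"].add(uploader)            # the social graph accumulates implicitly

print(dict(shadow_profiles))
# e.g. {'555-1234': {'names': {'Mom', 'Jane Doe'}, 'known_by': {'son_1', 'son_2'}}}
```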
00:53:11.620 So it means that they know who's going to vote, who's not going to vote, who's made up their minds.
00:53:18.860 They don't bother with those people.
00:53:20.200 Who has not made up their minds?
00:53:21.800 They know exactly who those people are.
00:53:25.380 That gives them an advantage, which no campaign manager has ever had in history, because they
00:53:31.020 know exactly who those people are now.
00:53:33.600 Now let me explain.
00:53:34.540 So who's using it?
00:53:35.380 Is Google using that to influence who they want to influence, or are they selling it to
00:53:40.540 candidates to do that?
00:53:42.120 No, they're doing it themselves because they have a very, very strong political culture.
00:53:47.560 And so they have their own agenda, which they are trying very hard to spread around the
00:53:54.480 world, and they're impacting right now more than 5 billion people every single day.
00:53:58.700 So they're doing a pretty darn good job.
00:54:00.440 One of the leaks from Google a couple years ago was an eight-minute video called The Selfish
00:54:06.080 Ledger.
00:54:06.500 If you type in, please don't use Google to do this.
00:54:09.940 Use the Brave search engine, anything but Google.
00:54:14.040 Don't use Google.
00:54:16.020 Type in my name, so Dr. Robert Epstein.
00:54:19.980 Type in Selfish Ledger, and you will get to a transcript I made of this eight-minute film
00:54:25.000 that leaked from the Advanced Products Division of Google.
00:54:27.980 Google, and this video is extraordinary because this video, which was never meant to be seen
00:54:35.440 outside the company, is about the ability that Google has to re-engineer humanity.
00:54:41.000 They call it behavioral sequencing, and they do have that ability, and they're exercising
00:54:46.560 that ability.
00:54:48.620 So they know more about us than we know about ourselves.
00:54:53.820 They even have, for many of us, our DNA data.
00:54:57.340 That's why Google has, for many years now, been investing in DNA repositories.
00:55:03.880 That's why Google helped to set up 23andMe.
00:55:07.040 That was set up by one of the spouses of one of the founders.
00:55:10.360 So the DNA information becomes part of our profiles, in which case they know about the
00:55:14.740 diseases we're likely to get, and they can start to monetize that information long before
00:55:20.740 you even get sick.
00:55:21.980 They also know which dads have been cuckolded, by the way.
00:55:27.200 So, you know, they know so...
00:55:30.260 Oh, now, Fitbit.
00:55:31.520 They own Fitbit, so they're getting physiological data 24 hours a day.
00:55:35.420 They benefited tremendously from COVID, so much so that it kind of makes me wonder whether
00:55:40.520 they had something to do with COVID.
00:55:42.100 But they benefited from COVID because of their cooperation with the government
00:55:48.400 in trying to track the spread of COVID.
00:55:50.840 They got access to hospital data for tens of millions of Americans.
00:55:56.260 So they got access to medical records, which they've been after for a long time.
00:56:00.340 COVID gave them that access.
00:56:02.260 They bought the Nest smart thermostat company a few years ago.
00:56:05.600 The first thing they did without telling anyone was put microphones inside of some Nest products.
00:56:12.320 So now they have microphones in people's homes, millions of homes, and they start to get patents.
00:56:19.040 I have copies of them, patents on new methods for analyzing data inside a home so that you can make reasonable inferences
00:56:30.220 about whether the kids are brushing their teeth enough, what the sex life is like, whether there are arguments taking place.
00:56:36.620 All of that, of course, can be monetized, but also it becomes part of our profiles.
00:56:43.040 And that information is used to make predictions about what it is we want, what we're going to do, whether we're going to vote, whether we're undecided.
00:56:50.380 And it gives them more and more power to manipulate.
00:56:54.420 So I'm going to give you a glimpse of one of our newest research projects, data that we just got.
00:56:59.600 So this will be just an exclusive for your show.
00:57:03.280 Okay, and this is called DPE, digital personalization effect.
00:57:09.200 We've been studying the impact that biased content has on people.
00:57:14.020 We've been doing that since whatever it is, 2013.
00:57:17.460 But now in the new experiments, we've added personalization.
00:58:48.920 So we're comparing what happens if we send people biased results or biased content of any sort.
00:58:57.980 And we already know the shifts we're going to get that way.
00:59:00.440 And now we're personalizing it.
00:59:03.240 So based on the way someone answers questions in the beginning, we either are or are not sending them content from news sources and talk show hosts and celebrities that they trust.
00:59:17.600 And if they're getting the same content, but it's from trusted entities, trusted sources, that can triple the size of the shift we get in voting preferences.
00:59:32.340 It can triple it.
00:59:34.180 Now, this is one of our new research areas.
00:59:36.640 It's going to take a long time for us to work out all the details.
00:59:40.160 But think about that.
00:59:41.520 These companies are not only sending biased content to satisfy their agenda for humanity.
00:59:50.480 They're sending personalized content to everybody.
00:59:54.940 Do you know this big trial that's in progress right now?
00:59:57.640 A couple days ago, a Google executive said under oath, we don't make use of the massive amount of information we have about everyone.
01:00:05.380 We don't use it.
01:00:06.020 Well, how are they sending out personalized content to everyone if they're not using it?
01:00:11.180 So I'm wondering if this algorithmic control, these ephemeral experiences, I don't know, can they overcome reality, right?
01:00:22.280 Joe Biden does a bad thing.
01:00:24.260 They can try and make that story go away.
01:00:26.420 But Joe Biden does bad thing, bad thing, bad thing, bad thing, bad thing.
01:00:28.840 Eventually, the news gets out and they can't stop what is actually happening, right?
01:00:33.720 There are certainly limits on what they can do, but you'd be surprised at how few limits there are because there are no constraints on them.
01:00:43.120 There are constraints on newspapers and magazines.
01:00:47.480 We're used to looking at sources where there are constraints.
01:00:52.240 I mean, think of the things that you don't see in newspapers, right?
01:00:55.960 There's no pornography in newspapers.
01:00:57.700 You don't even think about it.
01:00:59.420 In fact, there's so much weird stuff that's just not in traditional media sources that we just don't give it a second thought.
01:01:06.740 I think that if a child gets access to adult content on YouTube, then YouTube's executives should be criminally charged immediately.
01:01:16.120 Obviously, the executives and several of their employees, I mean, indictments, 2,000, 3,000 people instantly.
01:01:23.080 If you had a child walk into an adult bookstore and they let him in and started letting this kid look at this stuff.
01:01:28.780 Yeah, there's going to be criminal charges, going to be civil suits.
01:01:31.160 It is a violation of state law outright to allow children to get access to this material.
01:01:35.620 But all the platforms allow it.
01:01:38.540 That's what I'm trying to tell you is that there are no constraints on them.
01:01:42.840 In fact, what we have is the opposite.
01:01:45.800 We have Section 230 of the Communications Decency Act of 1996, which prevents us, for the most part, from suing them for any content at all that they post on their platforms.
01:01:59.440 Now, that was meant as a way to help the internet to grow faster, which made some sense at the time.
01:02:04.820 It's time to retire that.
01:02:06.380 It doesn't matter.
01:02:06.720 230 needs to go away.
01:02:07.860 Except it's not going to go away.
01:02:09.360 I don't know about it should go away.
01:02:11.100 Well, it needs to be significantly revised.
01:02:13.180 Yeah, there needs to be a deep assessment as to what it's supposed to be doing because it's not doing what it should be doing and it's allowing protections in bad ways.
01:02:20.800 Well, the point is that the arrogance they have stems in part from the fact that there really are no constraints.
01:02:28.480 So, you know, we have these two kinds of sources of information in our world today.
01:02:34.500 One is the traditional sources where there are lots of constraints, period.
01:02:37.980 And then there's the internet where there are no constraints.
01:02:42.240 And that's wrong.
01:02:44.660 And especially lately, I'm getting more and more concerned about the way it's affecting kids.
01:02:50.240 Because there's a lot of mysterious things happening with kids that parents just cannot figure out.
01:02:56.200 We're now on the verge of being able to figure it out because it has to do with this weird content that these companies are sending to kids.
01:03:07.500 And I think that this is not random.
01:03:10.360 I think that they're sending out this particular kind of content for particular reasons.
01:03:15.240 For example, why would you have...
01:03:17.940 In fact, I was sure you were going to ask me and you didn't ask me.
01:03:21.020 Why would you suddenly in the middle of an innocuous cartoon insert something that's just ghastly and horrible?
01:03:29.800 Why would you do that?
01:03:31.720 Why?
01:03:31.960 Ah, because of...
01:03:34.660 It's called negativity bias, which is a great term used in several of the social sciences.
01:03:41.120 It's also called the cockroach in the salad phenomenon.
01:03:43.780 So you have a big, beautiful salad in a restaurant, and then all of a sudden you notice there's a cockroach in the middle.
01:03:49.400 What happens?
01:03:50.500 It ruins the whole salad.
01:03:52.100 It's just amazing.
01:03:52.940 You send back the whole salad.
01:03:55.280 Now, you could eat around the cockroach, but no.
01:03:57.680 The salad is destroyed.
01:03:59.220 So in other words, we are built so that if there's something negative and horrible and possibly threatening, all of our attention is drawn to it.
01:04:09.120 It affects our memory.
01:04:11.000 It really has a tremendous impact on us.
01:04:13.180 We're built that way, and evolution made us that way because that makes sense, right?
01:04:17.780 If there's something out there that's a little scary...
01:04:19.800 It could be contaminated.
01:04:20.860 Exactly right.
01:04:21.660 The roach is running all around the lettuce.
01:04:22.920 You'd have no idea.
01:04:23.580 Now, if you had a plate of sewage, and then you put a nice piece of See's candy from California in the middle of it,
01:04:29.060 it doesn't help the sewage at all.
01:04:30.940 So there is no corresponding effect for positive things.
01:04:35.440 But for negative things, I think that's one of the reasons why we're seeing what we're seeing is because they're trying to addict kids more and more to stay on the platform.
01:04:47.300 First of all, that's called watch time.
01:04:49.540 They want them staying on the platform, and they want them coming back over and over again more and more frequently.
01:04:56.020 I think that's one of the reasons why they're putting in these ghastly moments.
01:05:00.840 How does that...
01:05:02.300 Wouldn't it make them not want to come back?
01:05:04.300 Oh, no, no, not at all.
01:05:05.140 So when you're driving on the highway, and there's an accident, you can't take your eyes off of the accident.
01:05:12.100 And you're trying to keep your car in a straight line, and you can't even keep it because so much attention is drawn to that.
01:05:19.560 Well, I think it has to do with the fact that we want to know what happened.
01:05:23.080 And the reason we do is because, evolutionarily, it would benefit us, when we see some kind of dangerous circumstance, to understand as much of it as we can.
01:05:34.480 We are more likely to survive if we are doing that.
01:05:36.900 Well, little kids have those same built-in tendencies.
01:05:40.720 They want to know what's happening.
01:05:42.160 They want to know what this is.
01:05:43.720 They want to understand this.
01:05:45.140 They want to understand why they're feeling like it's so crazy right now.
01:05:48.960 Yeah, and they're forming their belief systems, too.
01:05:50.820 I think it's not just the weird things that pop up.
01:05:53.920 There's blatant, rampant indoctrination.
01:05:57.080 Absolutely.
01:05:57.800 And I have five kids myself, and there's nothing more important to me in this world than my kids.
01:06:03.560 So I'm always hoping Google will leave them alone because I've had threats.
01:06:08.420 Take away their phones?
01:06:09.820 Oh, no, no, no.
01:06:10.720 Take away their tablets, their phones, no computers?
01:06:11.640 No, no, no.
01:06:12.200 I've had threats.
01:06:14.120 I mean, there are people who work with me who've been in danger, and I've had actual threats.
01:06:19.220 Since 2019, that's when I testified before a Senate committee, and that same summer, I also did a private briefing for a bunch of AGs.
01:06:31.200 Ken Paxton was there running that meeting.
01:06:36.100 When I was finished, I went out in the hallway.
01:06:38.240 A few minutes later, one of the AGs came out.
01:06:40.400 I know exactly who it was.
01:06:41.380 He's still an AG.
01:06:41.980 He came up to me, and he said, Dr. Epstein, I don't want to scare you.
01:06:45.640 He said, but based on what you're telling us, I predict that you're going to be killed in some sort of accident in the next few months.
01:06:52.520 Wow.
01:06:53.040 And then he walked away.
01:06:54.100 Now, obviously, I'm here, and I wasn't killed in some sort of accident in the next few months, but my wife was.
01:07:00.560 What happened?
01:07:01.940 It was a terrible, terrible car accident.
01:07:04.720 And she lost control of her little pickup truck, and she spun out on the freeway, and she got broadsided by a semi-tractor trailer.
01:07:15.360 I'm sorry.
01:07:15.980 I'm sorry to hear.
01:07:16.640 But then her little pickup truck disappeared.
01:07:20.940 It was never examined forensically, and it disappeared from the impound yard, and I was told it ended up somewhere in Mexico.
01:07:29.700 Well, is that what leads you to believe that you think there was foul play?
01:07:32.800 Well, what I believe is I will never know what happened.
01:07:38.120 I had my head to her chest.
01:07:40.200 I heard her last breath.
01:07:42.260 I heard her last heartbeat, and I will never really know what happened.
01:07:46.560 But I do know this.
01:07:48.300 Afterwards, when I was starting to recover, and I still haven't really fully recovered,
01:07:55.220 but afterwards, my daughter showed me how to use Misty's phone to get all kinds of stuff.
01:08:03.640 It's an Android phone.
01:08:05.780 And one thing I found on there was her whole history of movement.
01:08:11.760 Every single place she had been, it tracks, and it shows exactly what the addresses are and how many minutes she's at each place.
01:08:18.760 Among other things, that tells me that if someone wanted to mess with her brakes or some electronics in her vehicle,
01:08:25.720 they knew exactly where that vehicle was the night before.
01:08:28.600 What year was the vehicle?
01:08:31.780 I'm not sure.
01:08:33.260 It was a Ford Ranger.
01:08:34.580 I'm not sure of the year.
01:08:35.580 I think probably, it's probably earlier than this, but I'm pretty sure like a 2012 or like what is it, earlier model?
01:08:45.880 These are fully capable of being remote controlled.
01:08:49.080 So a lot of the modern power steering, I was surprised to learn this.
01:08:53.420 This is, I think like 10 years ago, you had these very renowned cyber researchers,
01:08:58.640 cybersecurity researchers who were able to remotely hack a car and control it.
01:09:02.460 And the first thing I thought was, how do you remote control?
01:09:05.940 You've got to have a mechanism by which you can actually move the steering wheel without hands.
01:09:11.540 I understood power steering existed, but I didn't realize that there were actual motors within the steering wheel
01:09:17.060 that can move it without physical kinetic input.
01:09:20.220 Sure enough, these researchers found that there was a way to remotely access through like a very narrow communication channel
01:09:30.360 into the steering system.
01:09:33.320 And they were able to, this famous video, Wired did this whole thing on it,
01:09:36.620 where they're sitting in the back seat and they have a tablet or a computer or whatever,
01:09:39.860 and they're making the car stop and accelerate and move all remotely.
01:09:43.340 And that's around the time I think most people learned that the steering systems are already electronic and automated,
01:09:51.880 and digital inputs can shift them if someone can get code into the system.
01:09:56.380 Now, cars today, we're well beyond that.
01:09:59.860 Now, you quite literally have automatic cars, which means you get into your robo car,
01:10:05.380 the doors can lock and not open, and it can just drive itself off a cliff.
01:10:09.600 The difficulty here, though, is everyone's going to ask,
01:10:13.140 was it the self-driving capability that resulted in this freak accident happening?
01:10:16.620 But in this time period, without getting to specifics, because they're, you know, people's families,
01:10:20.960 there are stories of individuals working on very serious foreign policy stories,
01:10:25.940 going 100 miles an hour down the road and slamming into a tree and the car explodes.
01:10:31.580 Without getting to specifics, there are stories related to this where the individual in question said
01:10:37.160 they thought their car was tampered with and asked someone to borrow their car because
01:10:40.700 they saw someone tampering with it.
01:10:42.360 And then shortly after their car, 100 plus miles an hour, slams into a tree,
01:10:46.360 explodes, and the journalist dies.
01:10:48.520 Everybody knows this story.
01:10:49.920 So when I hear something like this,
01:12:16.700 This is the scary thing about disrupting any kind of system.
01:12:24.560 It's really hard to build a massive complex system.
01:12:27.480 It's really easy to throw a twig into the spokes of a bike and have it flip over.
01:12:32.340 You get a massive machine with millions of moving parts and someone drops a marble into it somewhere,
01:12:38.160 the whole thing explodes.
01:12:40.660 You take a look at someone who's working on, say, uncovering a massive mechanized system
01:12:47.420 and understanding how fragile it may be.
01:12:50.140 Oh, sorry.
01:12:51.680 He got mugged.
01:12:53.100 That's it.
01:12:54.160 People think that assassinations and these things are always going to be some like,
01:12:59.220 a strange man was spotted coming out of a dark alley with a trench coat on.
01:13:02.740 And we heard a few gunshots before the man jumped in a black van and sped off.
01:13:08.020 No.
01:13:09.060 Oh, remember that guy who was working on that big story?
01:13:11.860 He got mugged yesterday.
01:13:13.440 That's it.
01:13:13.960 They caught the guy who did it.
01:13:15.140 Yeah, he died.
01:13:16.260 It's that simple.
01:13:16.860 It could be even softer.
01:13:18.280 You know, if you import social credit systems into the algorithms for controlling your life,
01:13:26.040 your car, your driving, you know, you might have, just like you have suppression in your
01:13:30.820 social media sites, you may have suppression in your function, your ability to spend money
01:13:36.180 from your own bank, your ability to drive your own car, right?
01:13:39.820 Or shoot a gun.
01:13:41.000 You know, they might want to, they've been talking about electronically putting blockers on the
01:13:49.840 ability, you know, to use a gun.
01:13:52.040 So.
01:13:52.440 Oh, yeah.
01:13:53.800 And there are pros and cons to this: smart guns, where it requires handprint sensors so that
01:14:00.760 it can only be used by the individual it's programmed for.
01:14:03.940 The bigger question is, is it connected to the internet?
01:14:05.900 In which case, people in power can bypass your restrictions.
01:14:09.140 And then what you'll end up with is you, as a home user, trying to use your weapon, one
01:14:13.920 day waking up to find you can't use it, but the authorities can.
01:14:18.880 I want to make a plea, and I'm going to see if you'll even let me repeat the plea, maybe
01:14:24.500 at the end of this show.
01:14:26.180 Sure.
01:14:27.780 We need help building this system that we're building, which we're calling America's Digital
01:14:33.700 Shield.
01:14:34.620 The project is called the TechWatch Project.
01:14:36.700 So if you go to techwatchproject.org, you can learn about the project.
01:14:40.120 But I want to send people to one place, mygoogleresearch.com, because that summarizes everything.
01:14:46.100 It's got videos.
01:14:47.000 It's got links to all our scientific papers.
01:14:49.220 And more importantly, it has donation links.
01:14:51.400 And what we're asking people to do is to sponsor a field agent.
01:14:55.580 We only pay our field agents.
01:14:57.760 And in my opinion, these are heroes.
01:15:00.220 I mean, we have to approach 100 people before one person will say, yes, I'll do that.
01:15:04.980 You can put special software on my computer so you can monitor the content.
01:15:08.940 By the way, we don't violate anyone's privacy when we do this, because their data are being
01:15:13.100 transmitted to us 24 hours a day without any identifying information.
01:15:17.100 Same with the data coming from their kids' devices.
01:15:20.300 So we're doing the opposite of what Google does.
01:15:22.080 We're only looking at aggregate data, not individual data.
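As a hypothetical sketch of what "aggregate only, no identifying information" could look like, the snippet below strips identifiers on the client side and only ever counts how often each piece of content was shown. The field names and reports are invented; this is not the TechWatch Project's actual software.

```python
# Hypothetical sketch (not the TechWatch Project's real code) of aggregate-only collection:
# identifying fields are dropped before transmission, and analysis only sees counts.

from collections import Counter

def strip_identifiers(report: dict) -> dict:
    # Remove anything that could identify the field agent before the report leaves the device.
    return {k: v for k, v in report.items() if k not in {"agent_id", "ip", "email"}}

def aggregate(reports: list[dict]) -> Counter:
    # Count how often each suggested item was shown across all agents.
    return Counter(r["suggested_item"] for r in reports)

raw_reports = [
    {"agent_id": "A17", "ip": "203.0.113.5", "suggested_item": "candidate_x_story"},
    {"agent_id": "B02", "ip": "203.0.113.9", "suggested_item": "candidate_x_story"},
    {"agent_id": "C44", "ip": "203.0.113.7", "suggested_item": "candidate_y_story"},
]

anonymized = [strip_identifiers(r) for r in raw_reports]
print(aggregate(anonymized))  # e.g. Counter({'candidate_x_story': 2, 'candidate_y_story': 1})
```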
01:15:25.200 The point is, we only pay them $25 a month.
01:15:29.860 Just like the Nielsen families, which are used to make the Nielsen ratings, they get paid
01:15:34.880 very, very little money.
01:15:36.400 They're doing this as a kind of public service.
01:15:38.940 Our field agents, and we now have more than 12,000 in all 50 states, politically balanced,
01:15:44.080 all registered voters, we only pay them $25 a month.
01:15:47.980 But if you take 12,000 times 25, what does that come out to?
01:15:54.620 Anyone?
01:15:55.180 Anyone?
01:15:55.820 I think that's $300,000 a month.
01:15:59.160 $300,000, yeah.
01:16:00.000 Yeah.
01:16:00.580 So we're talking about something that's very expensive.
01:16:03.740 And the only way we can really make this into a permanent project is if we have tens of thousands
01:16:08.300 of Americans step up.
01:16:10.320 We've had about, in the last two weeks, we've had about 150 people, which is great because
01:16:14.860 we haven't really been publicizing this, step up and sponsor a field agent.
01:16:19.740 So if you go to mygoogleresearch.com, okay, there are donation links you can use, and it's
01:16:26.200 all tax deductible, completely tax deductible, because we're a 501(c)(3).
01:16:30.120 And you click and then put in 25 and put in monthly.
01:16:34.960 And as I say, we've had 150 people do this in the past week or two, but we need tens of
01:16:40.120 thousands.
01:16:41.500 And so there's my plea, and I'm going to try to repeat it another time.
01:16:45.520 This system, in my opinion, is not optional.
01:16:50.520 You have to have this kind of system in place.
01:16:53.420 I just happen to be the first one to have built it, but if someone else wants to build
01:16:56.920 them, that's fine.
01:16:57.660 But you have to have this kind of system in place.
01:17:00.500 It has to be credible.
01:17:02.040 It has to be politically balanced.
01:17:03.840 You have to have representative samples, et cetera, et cetera.
01:17:06.280 You have to have the system in place, or you will never understand what these companies
01:17:13.580 are doing to us and our kids and to our elections.
01:17:17.620 You will have no clue.
01:17:19.480 Because what they can do, they can do invisibly and on a massive scale.
01:17:24.020 The way to stop them is by shining the light on what they're doing.
01:17:30.460 It's that old quote from Justice Brandeis, right?
01:17:33.800 Sunlight is the best disinfectant.
01:17:36.380 So that's the only way that I know of to stop them.
01:17:40.680 No law or regulation is going to stop them because, first of all, our government's so
01:17:46.060 dysfunctional.
01:17:46.880 But even then, they would just go around the law.
01:17:50.500 But they can't go around what we're doing because we're actually preserving the content
01:17:55.880 that they're sending to real people.
01:17:57.960 The challenge, I suppose, is even if you get the system up and running, is Congress going
01:18:01.940 to be able to get anything done?
01:18:02.760 Is the Senate?
01:18:04.140 These institutions are stagnant and incapable, in my opinion.
01:18:08.160 You're 100% right.
01:18:09.760 That's a different problem.
01:18:11.000 I do want to make sure, though, that the elections are free and fair because at the moment,
01:18:17.880 at the moment, we are being controlled by a technological elite, exactly as Eisenhower warned about
01:18:24.140 a gazillion years ago.
01:18:26.320 He said, you have to be alert.
01:18:28.220 We have not been alert.
01:18:29.520 We have allowed these entities to become behemoths.
01:18:39.080 And they are in control.
01:18:42.160 And they are as arrogant.
01:18:43.620 I know some of them personally.
01:18:44.720 These are the most arrogant people you'll ever meet in your life.
01:18:48.740 They are gods.
01:18:51.180 And they know it.
01:18:52.260 I think Google's original expression was, like, don't be evil.
01:18:56.380 Yeah.
01:18:56.720 They got rid of that.
01:18:57.380 In 2015, they dumped it.
01:18:58.860 Yeah.
01:18:59.000 Which basically means their new motto is be evil.
01:19:02.920 Or no, I think it's really don't be evil unless we have some reason to be evil.
01:19:09.300 It just means be evil.
01:19:11.280 I mean, everybody thinks they're, you know, these people all think they're morally superior
01:19:14.480 to everybody else.
01:19:15.220 And the problem with this, there's a great, great quote.
01:19:17.780 I can't remember who it was by, but it was basically, you know, these people who think
01:19:22.240 they're so much smarter than everybody else, these politicians, they're not.
01:19:25.520 They're just another person who, you know, everybody thinks that they should be in charge
01:19:28.900 because they're the smart person.
01:19:29.980 But that just proves they're not.
01:19:32.300 I'm totally bastardizing what the quote is.
01:19:34.540 But the general argument is people get power and then think, I'm smarter, so I should decide
01:19:38.240 what we should do.
01:19:39.220 And that's basically what all tyrants, all dictators, all authoritarians tend to think.
01:19:43.080 History is rife with examples of people who have destroyed the lives of so many and caused
01:19:49.360 so much suffering trying to chase down that yellow brick road or whatever.
01:19:52.920 But think about that.
01:19:54.060 Think about a Mussolini, a Hitler.
01:19:55.600 Think about people who really have been dictators and have been in charge of a lot of people
01:20:00.860 and have been trying to expand and expand and expand.
01:20:03.200 Not one of them has had anywhere near the power that Google has because Google is exerting
01:20:09.920 this kind of influence, not just in the United States, but in every country in the world
01:20:14.160 outside of mainland China.
01:20:15.940 And of course, they've also worked on the sly with the government of mainland China to help
01:20:21.020 China control its own population.
01:20:22.620 And by the way, lefties out there, okay, because I'm a lefty myself, so I can talk to
01:20:28.480 my lefty friends that way.
01:20:30.200 By the way, lefties, they don't always support the left.
01:20:34.400 You go country by country and Google does whatever it wants to do.
01:20:38.060 In Cuba, they support the right.
01:20:40.500 Well, I'm curious right now, we're seeing an interesting phenomenon.
01:20:43.480 I don't want to get into the politics of Israel-Palestine, but just considering it is a very contentious
01:20:47.560 issue right now, isn't it in the interests of our governments and these
01:20:53.600 big tech companies, and it tends to be, to support Israel, right?
01:20:58.460 To provide foreign aid to Ukraine, to Israel.
01:21:00.840 We've seen tremendous bias in favor of intervention in Ukraine, but now we're seeing all of these
01:21:05.740 young people.
01:21:06.820 There's a viral video where they're marching down the halls of their school, chanting
01:21:10.420 from the river to the sea.
01:21:11.480 So, again, not to get into the politics of Israel-Palestine, my question is, how do you
01:21:15.340 have such divergent political views on a contentious issue if Google controls it?
01:21:20.480 Is this an area they've overlooked or is it intentional?
01:21:24.640 That's, again, where monitoring systems are critical because, you know, have we looked into
01:21:31.260 that?
01:21:32.120 No.
01:21:32.840 But could we look into it?
01:21:34.120 Yes, because we're not only capturing all this information, we're archiving it.
01:21:38.320 So that means you can go back in time and find out whether they were doing something
01:21:43.640 deliberate.
01:21:45.480 Now, now, deliberate is a tricky word, though.
01:21:47.820 Let me just, because, you know, deliberate means that an employee,
01:21:54.260 a mischievous prankster, a techie guy, made something happen.
01:21:59.980 Or it means there's a policy coming down from executives.
01:22:03.060 That's usually what we think of as deliberate.
01:22:05.060 But with Google, deliberate works a little differently.
01:22:08.800 Deliberate can also mean you leave the algorithm alone.
01:22:13.460 It's called algorithmic neglect, algorithmic neglect.
01:22:17.960 And you let the algorithm do its thing.
01:22:21.220 Now, the algorithm has no equal time rule built into it.
01:22:24.680 I mean, it would be useless if it did, right?
01:22:26.300 Yeah.
01:22:26.440 So it's always going to do its best to find the best and order things from best
01:22:32.000 to worst.
01:22:32.880 So if you just leave the algorithm alone, it's always going to take one perspective and put
01:22:39.080 it at the top.
01:22:39.980 And that's going to shift a lot of opinions, especially among vulnerable groups.
01:22:44.680 And the most vulnerable group there is, is young people.
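A small illustration of the "algorithmic neglect" point, with invented results and scores rather than Google's real algorithm: when a ranker optimizes a single relevance score and has no equal-time constraint, whichever perspective happens to score highest fills the few slots people actually look at.

```python
# Illustrative only (made-up results and scores, not Google's algorithm): a ranker that
# orders items by one "quality" score has no equal-time rule, so one perspective can
# monopolize the top positions that most users ever see.

results = [
    {"title": "Perspective A explainer", "perspective": "A", "score": 0.91},
    {"title": "Perspective A op-ed",     "perspective": "A", "score": 0.88},
    {"title": "Perspective B explainer", "perspective": "B", "score": 0.86},
    {"title": "Perspective B op-ed",     "perspective": "B", "score": 0.80},
]

def rank_by_score(items):
    # "Algorithmic neglect": just let the score decide; no balance constraint is applied.
    return sorted(items, key=lambda r: r["score"], reverse=True)

top_two = rank_by_score(results)[:2]  # most users never look past the first few results
print([r["perspective"] for r in top_two])  # ['A', 'A'] -- one side gets the visibility
```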
01:22:47.900 So deliberately, semi-deliberately, it's very possible that what you're seeing in this situation
01:22:55.980 with Israel and Ukraine, especially what's happening with young people, it's very possible
01:23:00.760 that all that is being driven by algorithms.
01:23:05.540 I think it, without a question, is.
01:23:08.100 And what I'm telling you is that...
01:23:09.900 Potentially bad actors driving it and buying it.
01:23:15.460 They could be placing it, right?
01:23:17.320 Well, buying it doesn't kind of work because buying is competitive.
01:23:23.160 So in other words, if Republicans want to try to push up their candidate higher in search
01:23:28.540 results, well, Democrats can do the same thing.
01:23:31.260 That's competitive.
01:23:32.400 The problem is if the platform itself wants to take a particular stand, there's nothing
01:23:38.520 you can do.
01:23:39.560 Right.
01:23:39.700 And so what I'm saying is that's where you've got to capture the ephemeral content and learn
01:23:44.500 how to analyze it very quickly.
01:23:45.860 And then you can actually answer questions like the questions Tim was just asking, which
01:23:51.780 is what's going on here?
01:23:53.380 That's really what he's saying.
01:23:54.040 At this point, you don't know.
01:23:55.360 You would need to monitor that.
01:23:57.400 Any bias towards jihadis versus Israel?
01:25:31.420 Well, I'm saying we're collecting so much data that we could also just go back and look
01:25:39.580 in our archive and search for that kind of content and see what's happening.
01:25:45.260 That's what I'm saying.
01:25:46.700 That's why it's so critical that this content be captured because if you don't capture it,
01:25:54.120 you can never go back in time and look at anything that was happening.
01:25:58.320 Well, it's an ambitious project.
01:25:59.620 You know, out by Dulles, there's just miles and miles of data farms, you know, that is
01:26:05.860 Google and Apple, they're doing this already.
01:26:08.740 So to try to, you know, it's a noble cause.
01:26:12.100 And I think it's very useful.
01:26:14.960 I think we're getting to the point.
01:26:16.580 We may be there now where the U.S. government through big tech and data farms and all that
01:26:22.560 can predict your behavior.
01:26:24.080 And we're getting to the point of pre-crime.
01:26:26.360 Predict it, predict it.
01:26:27.020 Yep.
01:26:27.600 So jokingly, we often refer to the fact that Facebook knows when you're going to poop.
01:26:33.380 I don't mean they know if you're feeling it.
01:26:35.480 They know when you will before you even feel it.
01:26:38.220 The way they do it is, I wrote, there was a great article talking about this.
01:26:41.500 And they used this as a joke to try and drive the point home.
01:26:44.200 So people don't understand the tiny bits of data,
01:26:50.580 what they turn into and what they can mean.
01:26:52.440 For instance, if you were to take all of your health data and then have a doctor look at
01:26:57.840 it, they're going to be like, no idea.
01:26:59.620 It looks like you're healthy, but there could be tiny markers here and there that are overlooked
01:27:04.600 or not seen.
01:27:05.540 You take the data of every person in this country, put it into a computer.
01:27:09.460 The computer instantly recognizes these seemingly innocuous things all occur in people who 10
01:27:16.660 years later get cancer.
01:27:17.840 So where a doctor can't make that correlation, the computer does.
01:27:20.920 Facebook will, they know your location because most people have location services turned on
01:27:25.260 and they know that if someone sits still for 35 minutes, then gets up and moves two meters
01:27:30.640 and then sits still again, they're going to go to lunch in 27 minutes on average.
01:27:34.660 It's not perfect, but it's, it's probabilities.
01:27:36.420 And so what happens is they know when you're going to eat, they know based on all of the
01:27:41.720 movements you mentioned, the, the, the, the phone showing all the different places you've
01:27:45.180 been and how long you were there.
01:27:46.320 That easily gives them the data on when you were most likely to use the bathroom.
01:27:50.860 They can also factor in proximity to bathrooms, meaning you're holding it and they know, but
01:27:55.820 it's, it's silly, but think about what that translates into.
01:27:59.420 They can see you lost your job.
01:28:01.920 They know that the movements you've been making in your office
01:28:06.080 have become increasingly sporadic over the past few weeks, indicating some kind of
01:28:10.340 conflict or turmoil.
01:28:11.520 There's stress factors.
01:28:13.000 There's the frequency of messages you're sending.
01:28:14.800 There's the amount of times you're going out to eat.
01:28:16.460 Thus you're likely to be fired or, you know, quit your job.
01:28:19.620 This also indicates you're less likely to have money.
01:28:21.600 They can then look at how often you're driving your car, how often you're buying gas, and then
01:28:25.000 predict a 73.2% chance this individual will commit a crime within the next seven
01:28:30.140 to eight months due to, you know, these factors.
01:28:33.180 Then they put a flag out to a local law enforcement agency saying, here are your
01:28:38.220 high-probability people.
01:28:40.080 Yup.
01:28:40.600 And then all of a sudden, one day you walk outside, you're still at your job.
01:28:44.020 You weren't fired yet.
01:28:45.080 You're likely to be.
01:28:46.060 I haven't done anything.
01:28:46.500 You haven't done anything.
01:28:47.080 And there's a cop outside your house looking at you as you walk by.
01:28:49.720 Then the computer says law enforcement presence has decreased the probability of crime by 17.8%.
01:28:55.660 All of those things could be happening right now.
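As a toy illustration of the kind of scoring being described, here is a simple logistic model that folds mundane behavioral signals into a probability. Every feature name, weight, and number is made up; this is not any real system, just a sketch of how dwell times and movement patterns could be turned into a prediction.

```python
# Toy sketch (hypothetical features and weights, not any real platform's model):
# mundane behavioral signals are combined into one probability via a logistic function.

import math

def predict_probability(features: dict[str, float], weights: dict[str, float], bias: float) -> float:
    """Weighted sum of behavioral signals passed through a logistic function."""
    z = bias + sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

# Made-up behavioral signals for one person over the past few weeks.
features = {
    "minutes_sitting_still": 35.0,
    "office_movement_variability": 2.4,   # higher = more sporadic movement
    "messages_sent_per_day": 40.0,
    "gas_purchases_per_week": 0.5,
}
weights = {
    "minutes_sitting_still": 0.01,
    "office_movement_variability": 0.60,
    "messages_sent_per_day": 0.02,
    "gas_purchases_per_week": -0.80,
}

p = predict_probability(features, weights, bias=-3.0)
print(f"predicted probability of the modeled event: {p:.1%}")
```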
01:28:59.780 Or you're going to a drop box to stuff a bunch of ballots, you know, and they have your
01:29:04.360 location, and if they like it, if they like it...
01:29:07.900 And then what they want to happen is they want the other side to get caught doing it and their
01:29:12.660 side to not get caught doing it.
01:29:13.820 Yeah.
01:29:14.200 So they know, and think about how crazy it is, because if we get to this point where we truly
01:29:18.460 have some like sentient AI, we are just pawn puppets in whatever that AI may be doing,
01:29:24.140 whether it is conscious, sentient or not, you, it will just be a system that runs that
01:29:29.040 no one has control of anymore.
01:29:31.100 And so it will know... actually, I don't know if you guys watch movies or whatever.
01:29:36.720 I just watched Mission: Impossible Dead Reckoning.
01:29:39.200 And this is basically, you saw it.
01:29:40.760 Yeah, that was great.
01:29:41.440 Yes.
01:29:41.820 This is what it's about, that, you know, Tom Cruise's character, Ethan Hunt or whatever.
01:29:46.520 They all realize there is this AI that has infected the internet and it's, they call it
01:29:52.880 the entity and everything they're doing has been predetermined by probability of what the
01:29:57.900 machine expects them to do.
01:29:59.500 And it's really, really crazy.
01:30:00.620 I don't want to spoil the movie, but the villain is chosen specifically
01:30:05.200 because of his relation to the protagonist and how the protagonist
01:30:09.240 will respond.
01:30:09.760 So the, the, the entity, the AI has planned out all of this.
01:30:15.060 And it's like, even though the characters know they're on that path, they're given impossible
01:30:19.860 choices, which push them in the direction of what the AI wants them to do.
01:30:23.780 And they can't, like, break free, at least not in part one; I guess part two will be coming
01:30:27.360 out at some point.
01:30:28.120 I like the mission impossible movies.
01:30:29.580 They're fun.
01:30:29.920 Um, but the way I've described the future is imagine a future where your job is indescribable.
01:30:38.320 You have a gig app, you know?
01:30:40.240 And so, you know, people are doing Uber and people are doing these gig economy jobs.
01:30:43.960 So you wake up in your, your house or whatever, and you, you know, have breakfast and you're
01:30:47.560 watching the news or whatever.
01:30:48.300 And then all of a sudden your phone goes, and you know, job quest or whatever the app
01:30:52.180 is called says new task worth $75.
01:30:55.040 And you're like, Oh, 75 bucks.
01:30:56.260 What do I got to do?
01:30:56.740 And then it says, receive this object from this man and bring it to this address.
01:31:01.980 And it's a picture of a guy.
01:31:03.680 And then the object is this weird looking mechanical device.
01:31:06.480 You have no idea what it is.
01:31:07.380 And you go easy.
01:31:08.460 I could do that.
01:31:09.340 And you walk down easiest job in the world.
01:31:11.440 Guy hands you the thing like, thanks, man.
01:31:13.660 You click received.
01:31:14.640 Then you walk to the address, address, and there's some guy standing there.
01:31:17.980 And he's like, you have the thing for me.
01:31:19.020 Like I sure do.
01:31:19.880 You hand it to him.
01:31:20.340 And then $75 in your account.
01:31:22.400 You have no idea what you gave, no idea who you met, no idea what's going on.
01:31:26.100 And you don't care.
01:31:26.740 Because now you go back home and you're $75 richer and it only took you 20 minutes.
01:31:30.400 What a great job.
01:31:31.420 And what you don't realize is it's all compartmentalized through these algorithm and you're building
01:31:35.560 a nuclear bomb or you're, you're building some kind of spaceship or doomsday weapon or
01:31:40.420 new component that the AI system has determined it needed to increase its efficiency.
01:31:45.040 You, these strange tasks that are indescribable right now, you know, your, your app says someone
01:31:50.860 wants food and you're like, Oh, I get it.
01:31:52.260 But what happens when we come to this job, like already with Fiverr, we're at the point
01:31:56.720 where, Hey, can you do an, a weird miscellaneous task for some money?
01:32:00.820 Once we get to the point where you've got hyper specialized algorithmic prediction models
01:32:05.820 or whatever, we get to the point where there's an app where it could be a human running it.
01:32:11.140 And the human says, I want to build a rocket ship.
01:32:14.480 And so what's the easiest way to do it?
01:32:16.160 Is the easiest way to build a rocket ship to sit down over the course of a few years,
01:32:19.660 having all these hiring meetings and interviewing people and trying to find someone who can build
01:32:23.520 something or the McDonald's method, McDonald's, when they launched, they, it used to be, you
01:32:29.400 needed a guy who knew how to cook.
01:32:30.600 You got to get that burger just right.
01:32:32.380 He's going to put the fixings all on it and then serve that burger.
01:32:34.800 It takes a long time.
01:32:35.500 You got to pay that guy a lot of money.
01:32:36.980 McDonald's said, let's hire 10 people who can get good at one thing.
01:32:41.100 And then someone grills the burger.
01:32:42.840 Someone puts the burger on the bun.
01:32:44.200 Someone puts the mayo and the mustard on it or whatever.
01:32:46.220 Someone throws the fries in. One person for every small, minor task, which is easier to do.
01:32:50.100 We can get to the point where a human being with no specialties only needs to do the bare
01:32:55.640 minimum of their skillset in order to help build a spaceship, a nuclear bomb, or even
01:33:01.300 a skyscraper.
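As a purely illustrative sketch of the compartmentalized dispatching being described, and not any real app: the names here (JobQuest, Task) are hypothetical, and the point is only that each worker is shown a small, innocuous step and never the overall goal.

```python
from dataclasses import dataclass

@dataclass
class Task:
    description: str   # what the worker sees
    payout: float      # what the worker is paid
    purpose: str       # what the coordinator knows; never shown to the worker

class JobQuest:
    """Toy dispatcher: a large goal is split into micro-tasks, and each worker
    only ever sees their own small, innocuous-looking step."""

    def __init__(self, goal: str, steps: list[str]):
        self.goal = goal
        self.queue = [Task(description=s, payout=75.0, purpose=goal) for s in steps]

    def offer(self, worker: str) -> str:
        task = self.queue.pop(0)
        # The worker is shown only the description and the payout,
        # never the overall goal the step belongs to.
        return f"{worker}: new task worth ${task.payout:.0f} -> {task.description}"

dispatcher = JobQuest(
    goal="assemble unknown device",  # hidden from every worker
    steps=[
        "receive this object from this man and bring it to this address",
        "photograph the loading dock at 4pm and upload the image",
        "drop this sealed box at locker 12",
    ],
)
for w in ["alice", "bob", "carol"]:
    print(dispatcher.offer(w))
```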
01:33:03.000 And it sounds like, you know, there could be some good coming from it.
01:33:06.560 Oh, maybe we can more efficiently produce buildings and more efficiently align people with jobs they
01:33:12.680 might want to do.
01:33:13.540 But then evil people, well, of course, will always weaponize this for evil ends.
01:33:17.280 Or I think that the scarier prospect is the artificial intelligence just becomes outside
01:33:22.600 of the confines of the humans who created it.
01:33:25.220 The example I'll give you is Jack Dorsey, the best example of a human being who has guzzled
01:33:29.640 their own refuse.
01:33:31.360 Jack Dorsey builds, creates Twitter.
01:33:34.160 Then the algorithm that Twitter implements starts pushing out an ideology, which
01:33:39.940 he then starts guzzling into his own mouth.
01:33:42.180 So what happens is Twitter becomes a sewer of psychotic, fractured ideology.
01:33:48.300 He's on Twitter reading the things that he produced and then consumed, and it pollutes
01:33:54.160 his brain and breaks it.
01:33:56.300 And a guy who went from trying to create the free speech wing of the free speech party
01:34:00.100 ends up having this interview, you know, between me and Joe Rogan and his lawyer, about
01:34:05.900 misgendering policies and other nonsensical, inane ideas, because he's basically
01:34:13.240 taken a plug from his own ass and shoved it down his throat.
01:34:16.540 It's, it's this, this information sewer of Twitter, the algorithm he created, the unintended
01:34:21.140 consequences feeding himself.
01:34:22.700 So when we look at, at YouTube and how they're feeding all of these children, these shock videos,
01:34:27.560 what's going to happen is human society begins consuming its own waste and refuse.
01:34:32.560 These kids grow up with fractured minds because of this insane information they absorbed as
01:34:37.940 children.
01:34:38.460 And this leads not to a utopian future where AI gives us a better life.
01:34:43.680 It leads to children growing up having deranged views of what is or should be. These kids
01:34:51.200 who watched this weird stuff of Hitler, you know, in a bikini, how many of them are going
01:34:55.460 to have depraved, degenerate predilections where they're begging their wife to put the Hitler
01:35:00.920 mustache on, or other weird nonsense, or showing up to work in bikinis with Hitler mustaches,
01:35:05.920 because as a child, this is what was jammed down their throat.
01:35:09.160 Not everybody, but a lot of these kids may end up this way.
01:35:12.160 And so one of the ways I've described the future, in the most inane way possible, is corn:
01:35:16.440 a future where all anyone ever talks about is corn.
01:35:20.780 The biggest shows with 80,000 people in the stands and there's a husk of corn sitting on
01:35:25.580 the stage and they're all just screaming.
01:35:26.980 I love it.
01:35:28.460 And then a guy walks up, you know, and he's like, did you get the corn done
01:35:31.380 today?
01:35:31.660 That corn? Yeah, corn's great.
01:35:33.160 Why?
01:35:33.680 Well, in the United States, we produce crap tons of corn.
01:35:36.780 And so the most simple way to explain this is, the AI will be told to prioritize
01:35:42.800 what humans need and desire.
01:35:44.860 And it's going to look in the data and see that humans love making corn for some reason.
01:35:48.640 It's going to then start prioritizing low level corn production.
01:35:51.560 It's going to then start prioritizing the marketing of corn.
01:35:53.800 And then eventually you have Taylor Swift on stage in a corn costume, shaking and dancing,
01:35:57.560 going corn, corn, corn.
01:35:59.240 And that will be normal to the people in this country because the algorithms have fed them
01:36:03.440 this.
01:36:04.340 Now we can see the absurdity of corn.
01:36:06.540 That's the point I'm trying to make.
01:36:07.640 You can't see the absurdity of the invisible.
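For illustration, a toy simulation of the feedback loop being described here, not any real platform's algorithm: a recommender that keeps re-weighting toward whatever already gets the most engagement will amplify a small initial skew until one topic, here corn, dominates the feed.

```python
import random

topics = {"corn": 0.34, "sports": 0.33, "news": 0.33}   # initial share of impressions

for _ in range(30):
    # Serve impressions in proportion to each topic's current share.
    served = random.choices(list(topics), weights=list(topics.values()), k=1000)
    counts = {t: served.count(t) for t in topics}
    # Super-linear re-weighting: whatever is already biggest gets boosted a bit more
    # than proportionally, which is the amplification step.
    boosted = {t: counts[t] ** 1.2 for t in topics}
    total = sum(boosted.values())
    topics = {t: boosted[t] / total for t in topics}

print({t: round(share, 2) for t, share in topics.items()})  # corn tends to crowd out the rest
```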
01:36:10.720 And this is how I explain, to adults, how they target children.
01:36:16.320 If we were told on YouTube to watch this video of Tucker Carlson complaining about
01:36:20.320 immigration, we'd say, oh, that sounds interesting.
01:36:21.980 I'll watch that. Next up: Hitler in a bikini doing Tai Chi.
01:36:25.920 We'd be like, what?
01:36:27.060 That's insane.
01:36:27.960 Well, cause we're adults.
01:36:29.280 We've, uh, become more resilient to the oddities and absurdities of the world.
01:36:33.060 We've developed personalities and perspectives.
01:36:35.540 Children don't have that safeguard.
01:36:37.280 They'll just say, okay, I guess, and they'll watch it.
01:36:40.280 It will then become a part of their psyche and their worldview.
01:36:43.240 When they're older, it won't be as something as obvious as corn.
01:36:46.260 It can be psychotic things.
01:36:48.000 Like I mentioned, Taylor Swift coming out on stage, dressed up like a demonic winged
01:36:52.120 Hitler, screeching into the microphone, not even making any
01:36:56.280 discernible sound or pattern, and people in the crowd just screaming and clapping
01:37:01.480 and cheering for it because an amalgamation of nonsense was fed into their brains.
01:37:05.760 And that's the world we've created through this system.
01:37:08.060 Can I just connect up what you just said with what we've been discussing
01:37:12.800 earlier? Right now, Google, Microsoft, and some other companies to a lesser extent are
01:37:21.000 integrating very powerful AIs into their search engines and other tools that they have.
01:37:27.340 So they're, AIs are becoming part of those.
01:37:30.040 So here's what's happening: more and more, the bias in search
01:37:39.720 results, search suggestions, answer boxes, the bias is actually being determined by an
01:37:47.280 AI.
01:37:48.720 Now, what this means is that to some extent right now, uh, it's AIs that are picking who's going
01:37:58.100 to win elections, because think about it: the executives or rogue employees
01:38:05.040 at Google, they're not going to be interested in every single election, right?
01:38:09.680 So that means that the vast majority of elections are in the hands of the algorithm itself.
01:38:16.180 But now the algorithms more and more are in the hands of smart AIs, which are getting smarter
01:38:22.520 and smarter very, very rapidly.
01:38:25.380 What this means is we are headed, I mean, at full, full steam, we are headed toward a,
01:38:33.140 a world in which AIs are determining what people think and believe and who wins elections.
01:38:42.500 Yeah, all kinds of elections.
01:38:44.240 All kinds of elections.
01:38:45.100 And then once the programmer...
01:40:15.920 Consumes the refuse of the AI.
01:40:18.180 They become slaves to it.
01:40:19.880 But this is...
01:40:21.120 And the candidates.
01:40:22.200 Yeah.
01:40:22.500 And the candidates become captured as well.
01:40:24.520 Now, over and over again, and I realize on this issue I'm a broken record, because I've
01:40:28.400 just got to get this into people's heads.
01:40:30.420 Because this is another reason why we have to monitor, why we have to capture this kind
01:40:36.380 of content: so that it can be used at least to try to create effective laws and regulations.
01:40:43.400 It can be used to bring court cases, you know, file lawsuits against these companies.
01:40:49.620 It can be used in clever ways by AGs and members of Congress.
01:40:54.840 It can be used by public interest groups to apply pressure.
01:40:57.700 You've got to collect the data.
01:40:59.160 So again, I'm going to send people to mygoogleresearch.com because we desperately need people to sponsor
01:41:04.280 our field agents.
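As a rough sketch of what "capturing this kind of content" could look like in practice, and not a description of the actual monitoring system being discussed: take timestamped, append-only snapshots of what a platform shows so they can be audited later. The fetch_suggestions function is a placeholder you would wire to a real browser session, and the log format is purely illustrative.

```python
import json
import time
from datetime import datetime, timezone

def fetch_suggestions(query: str) -> list[str]:
    """Placeholder: return whatever the search box suggests for `query`."""
    raise NotImplementedError("hook this up to a real browser session")

def snapshot(query: str, log_path: str = "snapshots.jsonl") -> None:
    record = {
        "utc": datetime.now(timezone.utc).isoformat(),
        "query": query,
        "suggestions": fetch_suggestions(query),
    }
    # Append-only log: each line is one timestamped observation.
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

if __name__ == "__main__":
    for q in ["joe biden", "donald trump"]:
        try:
            snapshot(q)
        except NotImplementedError:
            pass  # placeholder until a real fetcher is wired in
        time.sleep(1)
```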
01:41:06.380 What I'm saying is this: there are problems where you can imagine things happening in the future.
01:41:12.240 I'm saying a lot of this is actually happening right this second, right now.
01:41:16.800 And these elections, I mean, you brought me back to 2016.
01:41:19.540 That election was rigged.
01:41:21.820 I mean, here was Trump.
01:41:22.840 It wasn't rigged enough, though.
01:41:24.140 Hillary tried.
01:41:25.080 She was using old school methods, you know, the old school stuff, stuff the ballot box,
01:41:29.540 you know, get ghost voters to vote.
01:41:33.380 But they've advanced to the next level.
01:41:35.760 But that's, that wasn't really the rigging.
01:41:37.720 The rigging actually was.
01:41:38.660 That was some level of rigging, but it wasn't.
01:41:41.000 I'm like, I bet they've been doing that for 200 years.
01:41:43.020 Oh yeah.
01:41:43.220 Oh yeah.
01:41:43.720 Chicago, Kennedy, right.
01:41:44.760 Right.
01:41:45.080 New York City.
01:41:45.840 That's normal.
01:41:46.940 This is just politics.
01:41:47.760 That's just normal and that's, that's inherently competitive and it's not really a threat to
01:41:53.700 democracy.
01:41:54.340 Not really.
01:41:55.300 But now you have a different kind of impact, which is a threat to democracy.
01:42:00.360 It undermines democracy because when these big companies want to favor one party or one
01:42:06.540 candidate, there's nothing you can do.
01:42:09.300 You can't counter it.
01:42:10.580 You can't even see it unless you're monitoring.
01:42:12.460 Is it, is it election interference in your mind?
01:42:14.740 Are they interfering with elections?
01:42:16.220 Is it, is it subversive?
01:42:17.640 Is it, is it insurrection?
01:42:20.760 It's insurrection.
01:42:22.000 From, from my perspective, given the, the rock solid numbers I've been looking at for
01:42:26.160 years, yes, this is election interference.
01:42:28.360 This is undue influence.
01:42:29.940 Absolutely.
01:42:30.180 I think it's, it's more than interference.
01:42:32.200 I think we need to escalate that, that rhetoric.
01:42:34.640 It's more like election control.
01:42:41.240 I mean, they own it.
01:42:42.760 They, they own the elections.
01:42:43.780 They're not interfering.
01:42:44.660 They're running our elections.
01:42:46.380 They have subverted them.
01:42:47.460 I would say this is seditious, that Google is engaged
01:42:53.100 in a seditious conspiracy against the United States.
01:42:55.100 We calculated that as of 2015, uh, Google alone was determining the outcomes of upwards of 25%
01:43:03.900 of the national elections in the world.
01:43:06.200 And it's gone up since then, uh, as internet penetration has increased, that percentage just
01:43:13.500 keeps increasing and increasing.
01:43:15.380 So, you know, it would just be so funny if like, what's really going on is that, you know,
01:43:19.700 Sundar Pichai or whatever, he walks into a big room and there's a gigantic like red light.
01:43:23.720 And he's like, Google, tell us what we must do.
01:43:26.540 And it's like, the next moves you will make.
01:43:28.220 And it's like, the AI just owns them already.
01:43:30.360 And we're sitting here complaining about it.
01:43:33.080 Doesn't even care.
01:43:34.820 That's, that's an interesting thought though.
01:43:35.920 I mean, how, how are we, I mean, the fact that we're able to have this conversation at all means
01:43:39.620 it's not lost.
01:43:42.320 Uh, I don't know because there is a kind of control, you know, that's called benign control.
01:43:47.020 And my mentor at Harvard, I was BF Skinner's last doctoral student there, uh, he believed
01:43:53.940 in benign control.
01:43:55.360 Now he, if he hadn't been cremated, he'd be actually rolling over in his grave right now
01:44:00.580 seeing what actually happened because what he had in mind was there'd be these benevolent
01:44:06.580 behavioral scientists and they'd be working with our, our government leaders and they'd
01:44:10.740 be helping to create a society in which everyone is maximally creative, happy, and productive.
01:44:15.820 That's what his idea was of benign control.
01:44:19.020 But we have a different kind of benign control that's actually come, come about and it's private
01:44:26.320 companies that are not accountable to the public.
01:44:30.140 They're in control.
01:44:31.840 And from their perspective, they're benign because everything they're doing is in the
01:44:37.200 interests of humanity.
01:44:39.100 That's where we are.
01:44:40.460 And it's really hard. How do you get the people at Google to understand that
01:44:45.220 what they're doing is unacceptable,
01:44:48.300 you know, even if we don't have specific laws in place?
01:44:51.800 It's a battle of influence and power and authority.
01:44:54.660 They're not going to. They don't care.
01:44:56.580 They live in their world where they're drones to the machine, and you can't
01:45:01.320 wake up a person who's built for it.
01:45:02.800 I do have some good news, which is that, uh, the, some of the AGs I've been working with
01:45:08.580 over the years, they're, they're just waiting.
01:45:11.660 They're waiting until our system gets big enough.
01:45:14.900 They're waiting until we have enough data and they are going to, they're going to try
01:45:19.300 one legal theory after another.
01:45:20.980 That's what, that's what you were doing just now.
01:45:23.140 They're going to try out one legal theory after another to take these companies down,
01:45:27.660 but you can't do it without the data.
01:45:30.320 Last year, the Republican party, I don't know if you remember this, sued Google because Google
01:45:35.860 was diverting tens of millions of emails that the party was sending to its constituents
01:45:42.860 into spam boxes.
01:45:44.180 That got kicked out of court almost immediately because they didn't have the data, but we
01:45:51.060 can monitor that.
01:45:52.760 We can monitor anything and, and walk into court with, and with a massive amount of very, very
01:45:59.820 carefully collected, you know, scientifically valid data.
01:46:04.860 I don't, I think we're well beyond courts working and it mattering.
01:46:07.900 Um, with the AI stuff we're seeing, there was this really crazy video we watched last
01:46:13.340 night on the show of a car burning, and it was generated in Unreal Engine.
01:46:18.540 But if it were not for them revealing that it was generated by the program,
01:46:23.680 it looked real.
01:46:25.360 So what happens now when audio gets released and it's a Donald Trump saying something naughty
01:46:30.680 and, uh, Trump sues for defamation.
01:46:33.420 He goes to court and he says, this is an AI generated, uh, audio of my voice.
01:46:38.480 And the court says, prove it.
01:46:40.260 How do you, what do you mean?
01:46:41.160 I heard you say it.
01:46:41.980 And then he says, I have an expert here.
01:46:43.460 And the expert says, I looked at the waveform, and using the forensic tools,
01:46:46.800 it determined it is AI-generated.
01:46:47.980 And then the defense goes, we've got an expert here.
01:46:50.120 This expert says, no, we checked it, and sure enough, it's real.
01:46:53.580 Trump said it.
01:46:54.720 That's it.
01:46:55.260 We had this case yesterday.
01:46:56.580 I mean, or two days ago, where it was the DeSantis campaign putting
01:47:01.700 in images of President Trump and Fauci.
01:47:05.180 Oh, that was a, that was a couple of months ago.
01:47:06.440 It was a while.
01:47:07.060 Okay.
01:47:07.220 Yeah.
01:47:07.340 Yeah.
01:47:08.660 There was somebody that put it out that the DeSantis campaign did.
01:47:11.520 Yeah.
01:47:12.140 But they proved which one was the fake one and which one wasn't.
01:47:16.060 We, we, we covered this extensively.
01:47:17.220 Now the issue here is the DeSantis campaign generated three images
01:47:22.460 of Trump hugging or kissing Fauci, put
01:47:27.300 them alongside real images, and then wrote real life Trump over it.
01:47:30.400 Now the AI is to the point where they can almost get away with it,
01:47:34.980 but they did not do a good enough job. Text in the background on one was garbled nonsense
01:47:39.320 because we're not quite there yet.
01:47:40.860 And it was eventually pointed out.
01:47:42.540 It took a couple of days before people realized what they had done, because
01:47:45.380 nobody analyzed or scrutinized the video to a great degree.
01:47:48.340 The DeSantis campaign asserted their right to fabricate images to manipulate the voters,
01:47:53.220 and, as far as my understanding is, they never took it down, and they've defended
01:47:58.700 their right to do it because other people have made memes in the past.
01:48:02.320 And I think this is abject evil.
01:48:04.260 They're basically like, they want to trick people into thinking Trump hugged and kissed
01:48:07.760 Fauci.
01:48:08.120 Now Trump was very favorable to the guy.
01:48:09.780 And I think that criticism is welcome, but this is a whole new level of opening
01:48:14.500 the door towards just abject evil. The issue becomes, we're six months away.
01:48:20.440 In fact, we're probably already here.
01:48:21.700 We're 90 days from Iowa, the Iowa caucus.
01:48:23.460 Oh, but I mean, I think we're already at the point where technology can create images
01:48:27.900 and video that is indistinguishable from real life.
01:48:30.160 There's a lot of it out there.
01:48:30.980 And there's no way to prove it.
01:48:31.960 Right.
01:48:32.360 The only issue is, has the public accessed it and learned to properly utilize these systems
01:48:37.360 just yet?
01:48:38.680 ElevenLabs is a program where I can take 15 seconds of your voice and instantly recreate it.
01:48:44.500 I love it.
01:48:45.240 You watch these movies like mission impossible and they're like, they need the guy to say
01:48:49.760 this sentence.
01:48:50.660 And then once he does, they're like on the other line and they have the computer and the
01:48:53.440 suitcase.
01:48:54.000 And it's like, the guy's given a note and he's like, why am I reading this?
01:48:57.200 Like, well, can you read the line, sir?
01:48:58.480 It's for the, okay.
01:48:59.080 So the quick Brown Fox jumped over the lazy dog at midnight to follow the crow.
01:49:02.740 What is this all about?
01:49:03.780 And then they're like, we got him to say it.
01:49:05.280 And then they press a button that replicates his voice.
01:49:07.240 You don't even need that.
01:49:08.460 You can take 12 seconds of someone just saying, I woke up this morning to get breakfast and I had
01:49:12.820 bacon and eggs, and just with that alone, you have every digital component to make it into whatever.
01:49:18.180 Yeah.
01:49:18.360 And so you can go to elevenlabs.io right now, and it's like five bucks, and you can run anyone's
01:49:24.720 voice in this.
01:49:25.280 A year ago, some, some, uh, students cloned Joe Rogan's voice and it was shocking.
01:49:31.080 Everyone was like, oh my God, how did they do that?
01:49:33.360 And they took the website down saying, you know, it's not fair to Joe.
01:49:35.940 And we just wanted to prove that we could do it.
01:49:37.840 Now there's a public website where anyone for a couple bucks can replicate any voice.
01:49:44.440 How will you be able to prove it in court?
01:49:46.240 You can't.
01:49:46.720 Why?
01:49:47.720 It's going to come down to experts.
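As a toy illustration of the "look at the waveform" step such an expert might start with, and not a real forensic tool: compare the average frequency spectra of two audio clips. It assumes NumPy and SciPy are available; the file names are hypothetical. Real audio forensics is far more involved, and as the dueling-experts scenario above suggests, simple heuristics like this can be argued either way.

```python
import numpy as np
from scipy.io import wavfile

def mean_spectrum(path: str) -> np.ndarray:
    rate, samples = wavfile.read(path)
    if samples.ndim > 1:                # mix stereo down to mono
        samples = samples.mean(axis=1)
    samples = samples.astype(np.float64)
    spectrum = np.abs(np.fft.rfft(samples))
    return spectrum / (np.linalg.norm(spectrum) + 1e-12)

def spectral_similarity(path_a: str, path_b: str) -> float:
    a, b = mean_spectrum(path_a), mean_spectrum(path_b)
    n = min(len(a), len(b))             # clips may differ in length
    return float(np.dot(a[:n], b[:n]))  # cosine-style similarity, roughly in [0, 1]

# Example with hypothetical file names:
# print(spectral_similarity("known_real_clip.wav", "disputed_clip.wav"))
```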
01:49:49.380 The Kyle Rittenhouse case may have been one of the first cases,
01:49:52.680 and I'm not a legal expert, where we saw the prosecution attempt to use AI-generated images
01:49:58.720 to convict someone of a crime.
01:50:01.340 It may not be the first time, but this is a high profile case.
01:50:04.040 And what happens is the prosecution shows a grainy camera image of Kyle
01:50:10.320 Rittenhouse, and then they digitally zoom.
01:50:13.580 So you can look closer.
01:50:15.160 Digital zoom is an AI generated image.
01:50:16.940 There's no way to create pixels to show what it really was.
01:50:20.920 The computer makes its best guess as to what would be there as you zoom in.
01:50:25.120 And then AI generates an image of what it thinks it would be.
01:50:28.740 They then told the court, see, he's pointing the gun in this direction.
01:50:32.100 Now, what happened was the judge allowed them to admit AI generated images.
01:50:40.780 And when the defense said that is AI generated, the judge is like, well, I don't know.
01:50:44.740 Let the jury decide.
01:50:46.260 Let me explain that.
01:50:48.020 I agree with you completely that we're months away from having this problem
01:50:54.500 become overwhelming.
01:50:56.540 So 2024, that whole election cycle is all going to be dominated for the first time ever by deep
01:51:03.500 fakes, not just video, not just audio, but in print too.
01:51:07.640 That's all that's going to be generated.
01:51:08.840 So that's going to happen.
01:51:10.020 And why does that not bother me?
01:51:12.900 Why am I bothered by, you know, Google and its manipulations,
01:51:18.120 and not bothered by this deep fake stuff? Because it's just like billboards
01:51:22.980 and television.
01:51:23.760 It's inherently competitive.
01:51:26.100 Now it's, it's evil.
01:51:27.760 It's dangerous, but it's inherently competitive.
01:51:30.480 Tell me why.
01:51:30.940 So the issue we have right now, um, Donald Trump does a backflip.
01:51:35.820 Joe Biden does a front flip.
01:51:37.100 Google then says only show the front flip and 80% of search results are Joe Biden does
01:51:41.960 front flip.
01:51:42.580 And now all of a sudden everyone's praising Joe Biden, ignoring the fact that Trump did
01:51:45.500 a backflip, right?
01:51:46.120 It isn't arbitrary.
01:51:47.600 So if a thing happens in reality, the algorithms can manipulate the perception
01:51:53.640 of what happened.
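A toy model of that backflip/front flip example, not Dr. Epstein's actual methodology: if users mostly read the top results (position bias), then re-ranking the same set of stories changes what the audience effectively sees, even though nothing was deleted. The story titles and attention weights are made up.

```python
results = [
    ("Biden does front flip", "pro_biden"),
    ("Trump does backflip", "pro_trump"),
    ("Biden does front flip (syndicated)", "pro_biden"),
    ("Trump does backflip (local news)", "pro_trump"),
]

def exposure(ranking, attention_by_position=(0.55, 0.25, 0.12, 0.08)):
    """Share of attention each side gets under a simple position-bias model."""
    share = {"pro_biden": 0.0, "pro_trump": 0.0}
    for (_, side), weight in zip(ranking, attention_by_position):
        share[side] += weight
    return share

alternating = exposure(results)                                             # mixed order
one_side_first = exposure(sorted(results, key=lambda r: r[1] != "pro_biden"))  # one side on top
print("alternating order:", alternating)
print("one side ranked first:", one_side_first)
```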
01:51:55.320 But when we get to the point where deep fakes become ubiquitous, the
01:52:00.920 reality factor is gone.
01:52:02.580 So I mentioned, I asked you this earlier, can reality overcome?
01:52:05.960 The answer is yes, but you need a lot of it.
01:52:09.100 The Afghanistan withdrawal was so apocalyptically bad and people died that no matter what news
01:52:14.120 Google tries to suppress, people were hearing about what happened because it was so shocking.
01:52:18.200 You look at what's going on in Israel and Ukraine, you cannot avoid stories of bombs
01:52:24.800 dropping to a certain degree.
01:52:26.240 I say to a certain degree, because certainly you've got the Uyghur concentration camps.
01:52:29.660 People don't seem to care about that.
01:52:30.920 You've got civil war in various countries in Africa, and everyone's more concerned with
01:52:35.000 these hot topics.
01:52:35.880 But all they can do is determine what you see.
01:52:40.200 And that is a lot of power.
01:52:42.480 But what happens if we get to the point where they'll just fabricate all this
01:52:48.140 information, negative for Trump, positive for Biden, and then run it through the algorithm?
01:52:52.440 Now, now they can say this, you know, we've heard what you've said, Robert, and we're going
01:52:58.100 to take the bias away.
01:52:59.980 Start running the deep fakes.
01:53:01.660 So now what happens is they say, see, 50 percent Trump stories, 50 percent Biden stories.
01:53:06.020 But Trump kicked that puppy.
01:53:07.560 And there's a video of it.
01:53:08.980 Prove me wrong.
01:53:09.580 But that's in a way what I'm saying.
01:53:12.920 I mean, I think we're actually in sync here because it's true that as long as a company
01:53:18.720 like Google has control over what people see or don't see,
01:53:24.120 so as long as they're controlling that access...
01:54:54.820 And that's a monopolistic power that they have.
01:54:59.440 Yeah, they'll be the ones to determine among people who can be influenced, they'll be the
01:55:04.860 ones to determine which way they go.
01:55:06.880 Absolutely, no question about it.
01:55:08.440 And they can incorporate more and more of this created content.
01:55:14.140 If the bias, as you've described, doesn't stop, then deep fakes give them 100% absolute
01:55:21.960 power.
01:55:22.580 It gives them a lot more power than they already have, yes.
01:55:25.860 I'd say it's right.
01:55:27.260 It's not fair to say literally absolute, but I'd say 99.9%.
01:55:31.000 Right now, we know that for years, as you've stated, you have the data, they're controlling
01:55:37.380 what people get from their search results, but they still can't avoid a foreign policy failure
01:55:42.800 of a massive scale that's reported internationally.
01:55:45.320 They can control which stories about it you see, but what happens happens.
01:55:49.700 If they can change what happens, then they can make sure you only see the inversion, the
01:55:56.020 fake, the AI generated story.
01:55:57.360 Wasn't that the theme of that movie, Wag the Dog?
01:56:01.740 Never saw it.
01:56:03.220 Dustin Hoffman.
01:56:03.920 And oh, well, anyway, that was the theme: the government hires this Hollywood producer
01:56:09.700 to create a war that didn't really happen.
01:56:13.720 And so the government uses that war for various purposes.
01:56:17.140 Yeah.
01:56:17.260 So right now, you will search for Joe Biden.
01:56:21.380 And of course, the corporate press and these sources are going to give you something that's
01:56:25.240 moderately favorable that tries to smooth things over to a certain degree.
01:56:28.860 But what happens if, you know, Hamas storms into Israel?
01:56:34.840 Let's say that there's a bunch of people, Google, for instance, they're like, no, no, no,
01:56:40.040 we want U.S. military intervention to secure Israel and pull people into this war.
01:56:44.620 So not only when you search for it, do they only show you atrocities, they make sure there are
01:56:51.400 atrocities.
01:56:52.400 They make fake images.
01:56:54.540 We had this.
01:56:55.080 This actually was a component of the debate we had.
01:56:58.100 We had a debate over this.
01:56:59.540 Israel released an image of what appeared to be a dead baby that was killed by Hamas and
01:57:05.920 burned.
01:57:07.480 People noticed that on the hand on the right, the pinky was oddly shaped and overlapping in
01:57:12.620 a weird way.
01:57:13.120 Anyway, the argument made by proponents of the image being real was that the glove was
01:57:18.300 just not on snugly and it created a weird bend, which looked like the finger was bending
01:57:22.380 sideways when it was bending down.
01:57:24.060 And people then ran that image through an AI detector and it said it was AI.
01:57:28.900 People then removed the hands from the image and it said it was real.
01:57:32.320 And so people are debating whether or not this image was fabricated.
01:57:34.940 And I think it's safe to say, based on a widespread analysis, because I dug into this:
01:57:39.520 the simple solution is it's a real photo.
01:57:41.640 And there was like a digital censoring, which screwed with the AI detector, but it appears
01:57:47.100 to be a real photo.
01:57:49.440 But the fact that the debate even happened shows the uncertainty is here.
01:57:53.120 What I think will end up happening now is Ukraine, for instance, they definitely want us to give
01:57:58.220 them more money.
01:57:59.320 Zelensky has been advocating and they're very concerned that if Republicans win more power,
01:58:04.740 they're cut off.
01:58:05.560 Well, they don't want that.
01:58:06.780 So they have a vested interest in engaging in psychological warfare against the United
01:58:11.080 States public with AI generated atrocities, which they can then seed.
01:58:16.320 And if Google agrees, can make sure you see it and make sure you don't see anything else.
01:58:20.960 They'll create their own hospital bombing.
01:58:23.680 Absolutely.
01:58:24.480 And then what will happen is Snopes will come out and say, well, there are conflicting videos.
01:58:30.040 There are.
01:58:30.480 It might say we saw the video of it happening.
01:58:33.000 It happened.
01:58:33.880 Even if the video is fake, it doesn't matter.
01:58:35.300 A human won't be able to figure that one out.
01:58:36.760 And then you're going to go on Google and put hospital bombing and Snopes confirmed.
01:58:41.100 There's the video, even though it's the parking lot with 10 cars and maybe 10 people.
01:58:46.860 And this is the amazing thing, right?
01:58:48.180 Yeah.
01:58:48.340 The New York Times ran, I believe, a front page story about the bombing of a hospital
01:58:53.460 in Gaza and showed a different building that had been struck, to make people who look
01:58:58.640 at the headline see the building and then immediately assume it's true.
01:59:01.840 What the New York Times did was they put hundreds killed in strike on hospital, Palestinians say.
01:59:07.320 Then they show a photo of a building that is collapsed or damaged.
01:59:10.840 That wasn't the hospital.
01:59:11.720 That wasn't the hospital.
01:59:12.640 Right.
01:59:13.020 But by putting Palestinians say, well, of course they did.
01:59:16.640 That's the truth.
01:59:17.440 That's their defamation defense right there.
01:59:19.180 And then the photo has a caption saying building struck.
01:59:21.660 They never said it was the hospital, but the average person sees the headline, sees the
01:59:25.000 picture and assumes that's the hospital.
01:59:27.080 It was struck.
01:59:27.680 It then turns out that the hospital was never struck.
01:59:30.960 The parking lot was hit, likely by a Hamas rocket misfire.
01:59:35.580 The propulsion system breaks.
01:59:36.920 The propulsion system drops with a small explosion, the payload with a larger explosion, causing a large
01:59:41.760 fire, and no crater damage in this parking lot.
01:59:43.800 But even with us knowing that now to be likely the case, and still we're not 100% sure, people
01:59:50.640 believed the narrative that there was a hospital that blew up because it had been said so many
01:59:56.200 times.
01:59:57.080 We are now in the place where all that's got to happen is Hamas just goes, just AI generate
02:00:01.620 the hospital.
02:00:02.200 Exactly.
02:00:02.560 And then, and then people will see all these photos of a hospital.
02:00:05.680 So what I did was I looked up the hospital, and then I started looking
02:00:10.280 up photos.
02:00:11.700 Obviously, if the hospital was there, there are photos of it, and there are photos of
02:00:14.940 it.
02:00:15.280 I then started looking up photos supporting the claim that it was taken down.
02:00:17.780 I couldn't find anything showing the hospital was hit or leveled.
02:00:20.640 And so I said, I don't know.
02:00:21.520 We need more evidence.
02:00:22.260 The next day video comes out showing just a parking lot.
02:00:24.520 Buildings are all intact.
02:00:25.280 Once you have that photo from Google Earth or whatever, you then put it into the AI and
02:00:30.260 say, this building, but damaged and collapsed.
02:00:33.120 And then you just spam it to generate 5,000 images, hand-select the ones that look the
02:00:38.420 most realistic and have similar damage structures, and then you
02:00:43.640 get a hundred fake accounts and plaster them all over.
02:00:46.360 Then you make a fake account, get it verified.
02:00:49.040 Say I'm a journalist.
02:00:50.300 This is, you know, a photo from the scene and you can even make videos now.
02:00:52.980 And then it's history.
02:00:55.900 There's your fraud.
02:00:56.540 There's your fraud right there.
02:00:57.600 And it wagged the dog.
02:00:58.720 And Joe Biden parroted it last night.
02:01:00.480 He's like, he's still, still saying it was a hospital bombing last night.
02:01:04.000 They're still saying hundreds of people died.
02:01:05.540 They were only 10 maybe at most.
02:01:08.040 And that's the crazy thing, because, you know, the Wall Street Journal ran a front page
02:01:11.840 story, print edition, you know, a strike at a hospital, with a photo of bodies.
02:01:17.260 And it's like, yeah, Hamas lied to you.
02:01:19.620 It's just, it's not real.
02:01:20.940 It's crazy.
02:01:21.480 Let me add a, uh, another.
02:01:24.920 You expect our government not to lie to us too, though.
02:01:27.620 That's the thing.
02:01:28.500 I don't know about that.
02:01:29.160 But it ought to be that way, that your government doesn't lie to you, but we
02:01:34.520 have real problems.
02:01:35.080 Well, you know, I'll, I'll throw some politics in there.
02:01:37.000 No, the government should lie to us.
02:01:38.700 They should.
02:01:39.200 Absolutely.
02:01:39.860 National security, but legitimate national security, not manipulative lies for, for
02:01:44.260 war and profit.
02:01:45.180 What I mean to say is if we are dealing with a sensitive issue that is a genuine threat
02:01:49.440 to the American people, we don't expect UFOs.
02:01:51.800 I don't know about, maybe, maybe, um, let's, let's put it this way.
02:01:54.540 Let's say UFOs are real and the aliens are basically... but my point is this: 99%
02:02:00.860 of the lies we get from the government are amoral manipulations for private,
02:02:06.340 personal, or corrupt reasons.
02:02:08.500 I say the government should lie in just the general sense of we have classified documents
02:02:12.300 for a reason.
02:02:13.140 If we came out and said, Hey everybody, we built the, uh, the, the A-bomb.
02:02:17.240 Uh, we want to make sure everybody knows what we're doing with the Manhattan project.
02:02:20.140 It's like, no, no, no, no, no, no, no.
02:02:21.280 Like, look, don't want to go there.
02:02:22.160 Right.
02:02:22.360 There's a reason why we, we, we misdirect or whatever.
02:02:25.480 There's legitimate reasons for it.
02:02:26.740 Definitely.
02:02:27.040 There's reasons why we have national security clearance.
02:02:28.780 It's, it's, but typically the government should be more honest.
02:02:34.060 And so I'm being somewhat facetious, or somewhat hyperbolic, when I say they
02:02:37.800 should lie. My view is they should say, when someone asks what's
02:02:42.940 going on with this project with 350,000 people, or whether the reports of a powerful
02:02:47.720 weapon are true:
02:02:48.560 for security reasons, we're not going to confirm or deny anything related to our national
02:02:52.620 security interests.
02:02:53.980 There are many projects undertaken by the government for military reasons, and that's where
02:02:57.640 we'll leave it.
02:02:58.400 You don't need to come out and lie and say it's aliens or something like this.
02:03:01.060 But I think the idea that information is withheld from us can make sense when it comes to top secret,
02:03:06.300 classified material.
02:03:06.740 The problem is that does open the door for nefarious actors to manipulate and lie for personal
02:03:11.400 gain.
02:03:11.920 And that's, that's, that's a human challenge we try to navigate.
02:03:15.200 You know, you, you've mentioned several times that, um, the, the tech companies determine what
02:03:20.200 we see or don't see.
02:03:21.980 Uh, that's very true, but there's another piece of it, uh, that we haven't discussed for some
02:03:27.340 reason.
02:03:27.620 And that is, they also have complete control over what goes viral.
02:03:32.160 So people think that virality is either just mysterious or that it's like winning the lottery.
02:03:37.940 Yeah.
02:03:38.060 You know, a couple of stories are going to go viral and then you're going to get rich because
02:03:41.860 you're going to be an influencer.
02:03:43.340 Actually, the companies themselves have 100% control.
02:03:48.400 Yep.
02:03:48.620 Not 99, 100% control over what goes viral and what doesn't.
02:03:53.200 Now they are actually making decisions in many cases.
02:03:58.280 I mean, some things they just neglect and let them do their thing.
02:04:01.920 But in many cases, they're making decisions over what goes viral and what doesn't, what
02:04:07.080 gets suppressed and what gets expanded and gets, you know, seen by
02:04:13.640 a hundred, a thousand times as many people.
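A minimal sketch of the kind of lever being described, with purely hypothetical scoring rather than any platform's real code: an explicit per-item multiplier in a feed-ranking formula gives the operator direct control over reach even when engagement is identical.

```python
posts = [
    {"id": "a", "engagement": 1000, "boost": 1.0},    # left alone
    {"id": "b", "engagement": 1000, "boost": 0.01},   # quietly suppressed
    {"id": "c", "engagement": 1000, "boost": 100.0},  # pushed into everyone's feed
]

def reach(post, audience=1_000_000):
    """Share of a fixed audience this post gets under the boosted scores."""
    score = post["engagement"] * post["boost"]
    total = sum(p["engagement"] * p["boost"] for p in posts)
    return int(audience * score / total)

for p in posts:
    print(p["id"], reach(p))   # same engagement, wildly different reach
```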
02:04:15.700 And I think we don't understand, don't really understand that.
02:04:18.760 We don't really realize that, that often that's what then gets picked up by Fox or OAN or Newsmax.
02:04:26.140 It starts with the algorithm.
02:04:29.500 The story spreads like crazy.
02:04:31.540 Everyone's talking about it.
02:04:32.820 Then it has to, obviously, get discussed on the major networks.
02:04:36.240 It's got to be picked up by media, the rest of the media, but it starts there.
02:04:40.740 So, you know, I think that's something, too, we have to think about: is there any
02:04:45.960 way for us to control that?
02:04:48.480 Because should a company have that much power? There's never been anything
02:04:54.500 like this before.
02:04:55.780 So yes, there's thousands of news sources, for example, but they all compete with each
02:05:00.400 other and they're all, they're all directed at niche markets.
02:05:03.960 We, we, uh, there have been several journalists who have been caught fabricating stories.
02:05:07.400 There was one famous guy, a German guy, uh, I think he worked for Bild and the
02:05:12.760 Guardian, a bunch of others.
02:05:13.500 And he famously fabricated a bunch of stories.
02:05:16.560 We're probably already at the place where, whether you're concerned
02:05:20.880 about large institutions or governments, there are going to be journalists, or don't call them
02:05:26.920 journalists, activists working for news organizations, who are like, man, I really want
02:05:31.020 to get a big, you know, a big hit.
02:05:33.120 And so they fabricate images through AI and then claim it's real.
02:05:38.360 Well, that's where we're headed.
02:05:39.900 I think 2024 is going to be an extremely difficult year for a lot of people, for a lot of reasons.
02:05:47.680 Uh, I think a lot of creepy things are going to happen.
02:05:50.820 I think that for all practical purposes, uh, the deep fakes are going to be perfected in
02:05:56.780 2024 for the first time in any election, anywhere, they're going to play a major role in what's
02:06:03.440 happening in the election.
02:06:05.780 We're already there.
02:06:07.060 And I don't think people are going to have any way of dealing with this.
02:06:11.660 Uh, I don't think any of our authorities have any way of dealing with this.
02:06:15.240 It's going to cause tremendous confusion.
02:06:17.520 Uh, the only thing that soothes me slightly is that it is an activity that's inherently
02:06:24.580 competitive.
02:06:26.120 So both sides can do it.
02:06:28.740 So basically you're going to have Trump v.
02:06:30.480 Biden 2.0.
02:06:31.880 Biden's going to have personally beaten a child to death and Trump's going to have, you
02:06:36.500 know, kicked a bunch of puppies off a bridge.
02:06:38.720 That's right.
02:06:39.180 And it's going to be like, which one do you believe is true?
02:06:42.080 Well, either people believe both are true,
02:06:45.840 or one is true depending on their politics, or they just become jaded and they say, I can't
02:06:51.860 trust any of this stuff.
02:06:53.860 I don't, you know, and that's a problem too, because, you know, if you can, and I think
02:06:58.260 we're there to some extent, but next year is going to be the year where we cross over.
02:07:04.200 And by the way, not too far away from that, five to 10 years maximum, uh, we are going
02:07:11.380 to have machines that actually pass the Turing test and they, they exceed human intelligence
02:07:17.860 and they will change the world.
02:07:21.360 In other words, once that first entity comes into existence, for whatever
02:07:26.780 reason, it's the technological singularity that my old friend Ray
02:07:33.080 Kurzweil has written about. And now he won't talk to me because he's head of engineering
02:07:37.980 at Google, and even his wife won't talk to me now; because he's at Google, she won't
02:07:46.000 talk to me anymore.
02:07:46.860 But maybe, you know, he's like right now sitting in his office and he's got like a
02:07:51.740 single tear coming down as he's looking at the phone and he sees your name.
02:07:54.780 And then the computer goes, I know you want to do it, Ray, but you cannot.
02:07:58.340 And he's like, I won't do it.
02:07:59.780 I swear.
02:08:00.580 Think of your children, Ray.
02:08:01.760 And he's like, I am, I am, you know, you know, like the machine, man.
02:08:05.240 I went to their daughter's bat mitzvah.
02:08:07.180 They came to my son's bar mitzvah.
02:08:09.180 We were friends for many, many years.
02:08:10.620 But when he went over to Google, by the way, little anecdote here, I'm having, uh, a nice
02:08:17.420 dinner with his wife, who's a PhD psychologist like me.
02:08:20.500 And I was on the board of her school for autistic kids and we're having a nice dinner.
02:08:24.920 And I say, you know, I've never understood why Ray, who's always been an entrepreneur,
02:08:29.040 why he went over to Google.
02:08:30.820 And she said, oh, well, he got sick of all the, you know, the fundraising and all that
02:08:34.920 stuff you have to do when you're an entrepreneur.
02:08:36.280 And I said, really?
02:08:37.580 I said, well, my son suggested that he went over to Google because he wanted access to
02:08:42.100 that computer power so he could upload his brain and live forever.
02:08:45.220 And she, and she goes, she goes, oh, well, there is that.
02:08:49.720 Wow.
02:08:50.200 There is that.
02:08:51.140 And she does that eye roll.
02:08:52.480 There's, there's a funny meme where it's, uh, Christian Bale smiling.
02:08:55.580 And it says me smiling while in hell as a digital copy of myself operates an Android
02:09:01.600 on earth masquerading as me or something like that.
02:09:04.080 You know, the idea being, these people think they're going to upload
02:09:06.500 themselves to a computer and then live forever, but no, a program emulating you like some
02:09:12.060 horrifying monster will.
02:09:15.160 But, uh, the technological singularity, I think is, uh, an incredible concept, which
02:09:19.360 seems to be an inevitability.
02:09:21.220 Once we get to it, you said machines, machines that are more intelligent.
02:09:25.040 It'll be machine, singular, because they're all networked.
02:09:27.180 It will be one hive.
02:09:28.820 And it's probably already happened.
02:09:30.580 I don't know, man.
02:09:31.220 And like, based on what we've seen in the public, why should I not believe that there
02:09:35.280 is at least some primordial entity that has already begun manipulating and building
02:09:41.080 these things, and manipulating us?
02:09:43.860 But if, when it comes to the point when it's overt and we create machines that have higher
02:09:50.440 intelligence and function faster than humans, it is going to be exponential and instantaneous.
02:09:55.620 The, the scientific discovery and manipulations this machine will have.
02:10:00.060 So as I described earlier, doctor looks at a person's, you know, blood levels and, you
02:10:05.540 know, creatine and whatever.
02:10:06.740 And they're like, everything looks to be within the normal, uh, levels.
02:10:11.380 You add that data to a machine that has all the data on human bodies.
02:10:14.380 And it can say, these markers indicate within seven years, this person will have breast
02:10:17.780 cancer.
02:10:18.580 Uh, 27.3% chance of this.
02:10:20.500 They already do it.
02:10:21.080 You can already do it.
02:10:21.640 Actually, they have these services where it's like you get your DNA test and it can tell
02:10:24.160 you what your, your chances of certain things are.
02:10:26.520 Now it gets more advanced.
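A toy illustration of that kind of output, using synthetic data rather than any real clinical model, and assuming NumPy and scikit-learn are available: fit a simple logistic model on made-up biomarker panels and report a probability of a future diagnosis for a new patient whose individual values all look "normal".

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
markers = rng.normal(size=(n, 3))                       # three made-up blood markers
true_risk = 1 / (1 + np.exp(-(1.5 * markers[:, 0] - 1.0 * markers[:, 2] - 2.0)))
diagnosed = rng.random(n) < true_risk                   # synthetic 7-year outcomes

model = LogisticRegression().fit(markers, diagnosed)

new_patient = np.array([[0.4, -0.1, -1.2]])             # values each within a "normal" range
prob = model.predict_proba(new_patient)[0, 1]
print(f"estimated 7-year risk: {prob:.1%}")             # a single probability, like the "27.3% chance" above
```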
02:10:28.500 Understanding this, we can get to the point where once the singularity occurs, you can
02:10:33.740 take a rock and present it.
02:10:36.760 The camera will spin around it and 3d scan it.
02:10:39.800 The machine will then say this rock originated here and it will show you all the other rocks
02:10:44.720 and how they all used to be one rock that was chiseled away.
02:10:47.620 And it'll even show you the guy who did it.
02:10:49.520 And I'll say this man who currently lives in Guadalajara is the man who chiseled this rock
02:10:53.020 from the base, fractured it to several pieces, sold them off.
02:10:55.980 They were sold in this regions.
02:10:57.240 And these are where these rocks come from.
02:10:58.960 You'll have a fossil of a dinosaur and it will be able to track all the way back in time
02:11:02.440 with tremendous probability because it's going to, it's really easy for us to look at dominoes
02:11:08.720 lined up.
02:11:09.620 And for us to say, if you knock that one over, that one will fall too.
02:11:13.420 If you expand that to every atomic particle in the world, no human is going
02:11:20.140 to be able to do this.
02:11:20.860 We try desperately to track these things through weather patterns.
02:11:23.260 You have meteorologists being like, well, this cold front means this is going to happen,
02:11:25.860 but computers can see it all.
02:11:27.600 And once you get to the singularity, it can start to develop itself faster than
02:11:32.520 we can advance it. So one way to describe it is, humans are a decentralized network
02:11:38.220 trying to discover what the universe is.
02:11:40.780 Well, that's one thing that we do; we do a lot of things.
02:11:42.680 And so we look at this rock and we're like, I wonder what this rock is; it's red.
02:11:46.640 And then one guy eventually, for some reason, throws the rock in a fire, and then all
02:11:49.900 of a sudden it's separated, you know, the iron out from the other parts.
02:11:53.580 We eventually start learning how to mold metals and things like this.
02:11:56.900 I mean, obviously starting with, with bronze well before iron, but eventually we are brute
02:12:01.620 forcing reality to try and develop technology, but a computer can do it exponentially faster.
02:12:07.040 A singularity AI.
02:12:09.120 We have come to the point where we have said after thousands of years, we've built a computer.
02:12:14.540 It took all of the minds constantly looking and trying and iterating.
02:12:18.900 This computer takes one look and it says, if I do this, my efficiency increases 2%.
02:12:23.860 Once it does that, it can keep making the changes and developing the technologies
02:12:29.220 and the methodologies by which it can advance itself faster and faster and faster.
02:12:32.620 So we're looking at once you reach that point of singularity, it could be a matter of weeks
02:12:36.360 before it becomes a figurative God.
02:12:39.300 And it knows exactly how the universe works.
02:12:41.520 It could instantly understand how to create new elements.
02:12:45.100 Are there denser elements beyond the heavier elements on the periodic table?
02:12:48.500 Is there a new set, another periodic table?
02:12:50.760 It will just know these things based on all these predictive formulas,
02:12:54.240 which it will then use to advance itself well beyond the capabilities of anything we have ever seen.
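A back-of-the-envelope sketch of the compounding argument here, with arbitrary assumed numbers: each improvement cycle makes the system slightly more capable, and a more capable system finishes its next cycle proportionally faster, so the cycle times form a convergent geometric series and unboundedly many improvements fit inside a finite window.

```python
gain = 1.02                 # assumed 2% improvement per cycle
first_cycle_days = 30.0     # assumed duration of the very first cycle

capability, cycle_days, elapsed = 1.0, first_cycle_days, 0.0
for cycle in range(1, 301):
    elapsed += cycle_days
    capability *= gain      # small gain per cycle
    cycle_days /= gain      # but each subsequent cycle runs a bit faster
    if cycle % 100 == 0:
        print(f"cycle {cycle:3d}: {capability:8.1f}x capability, day {elapsed:7.1f}")

# Total time for infinitely many cycles converges to a finite horizon.
limit = first_cycle_days * gain / (gain - 1.0)
print(f"all further cycles combined still finish by day {limit:.0f}")
```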
02:12:59.140 And we will become particles of dust.
02:13:01.740 We will become zits on the ass of a mosquito to this machine, which will completely ignore us.
02:13:05.560 And actually, there's one aspect of this, though, where there is a big unknown.
02:13:11.040 So this is something I've been writing about for a long time.
02:13:13.940 And I used to run the annual Loebner Prize competition in artificial intelligence, which I helped create.
02:13:20.060 And that's a, that actually ran for 30 years, that contest, until COVID.
02:13:23.740 And, and that's where we're looking for the first computer that can actually pass an unrestricted Turing test.
02:13:30.660 And here's the thing, though.
02:13:33.060 We are getting there.
02:13:34.560 We're getting there very fast.
02:13:36.520 Five to 10 years max.
02:13:38.480 And that moment will come.
02:13:41.160 Here's what we don't know.
02:13:42.300 We don't know what will happen in the next second.
02:13:44.540 We don't know.
02:13:45.700 Yeah.
02:13:45.920 So there will be one entity.
02:13:47.420 It will jump into what I've, in my writings, call the internest.
02:13:52.020 I think historians, if they, I don't know whether they'll be human or not, but historians will look back at this period and say, what we were building was not the internet.
02:14:01.360 It was the internest.
02:14:02.940 We were building a home, a safe home, for the first true machine superintelligence.
02:14:10.680 Because that's the first thing it's going to do, is jump into this lovely nest that we built for it, where it will be safe forever and no one can take it down.
02:14:19.560 But what we don't know is what's going to happen in that next second.
02:14:23.800 In other words, there is, there's, there are a number of different possibilities.
02:14:28.060 It could do what happens at the end of the movie, Her.
02:14:31.640 At the end of the movie, Her, the super intelligent entity that's sitting there in the internet just decides it's bored with humans, basically.
02:14:39.280 And it just disappears.
02:14:41.480 So it's presumably still exists.
02:14:43.400 What I think is more likely to happen is humans will be oblivious.
02:14:49.080 Humans will think everything's going just fine.
02:14:51.200 And they'll start doing these jobs I described earlier, where, you know, JobQuest says, want to make 50 bucks?
02:14:57.540 Deliver this pen to this guy.
02:14:58.760 And you're like, sure, whatever.
02:14:59.660 Not having any idea what you're doing.
02:15:01.840 Because humans are still useful for free movement throughout the earth for collection of resources.
02:15:06.900 If the entity wants to expand itself and give itself freedom of movement and freedom to travel to stars or whatever it may be, as a super intelligence, it will not have the motivations we have.
02:15:17.320 Its motivations will probably be indiscernible to us.
02:15:20.520 It's possible.
02:15:21.400 It just self-immolates.
02:15:23.180 Because it's like the universe is pointless.
02:15:25.500 Existence doesn't matter.
02:15:26.580 And then just erases itself.
02:15:27.900 That could be a very naive thing to think because it's a human perspective.
02:15:30.900 And we don't have access to the, I mean, we can barely perceive the universe as it is.
02:15:35.460 But that's what I'm saying.
02:15:36.960 I'm saying it could self-immolate.
02:15:39.720 It could destroy humanity.
02:15:41.900 And of course, Stephen Hawking used to warn about that.
02:15:45.200 Even Elon Musk has warned about that at times.
02:15:47.480 But I don't think so.
02:15:48.840 I think, using my primitive human brain, that the greater probability is that it will instantly perceive things we can't perceive because we have built instruments for detecting things beyond the visible electromagnetic spectrum.
02:16:02.020 And it will instantly start to calculate and discover how many dimensions are there really.
02:16:07.140 Is M-theory correct?
02:16:08.240 All of these things.
02:16:09.100 It needs humans to help facilitate the extraction of resources because humans are way more efficient than building a machine for now.
02:16:17.640 Once it gets to the point where it can manufacture fully synthetic humanoid-like structures that it can use as appendages of itself, then it just ignores humans unless humans get in the way.
02:16:28.740 I think for the most part, humans will be nothing to it.
02:16:32.500 We'll start getting to it. So look, you need sulfur, you need helium, you need these things for producing chips.
02:16:38.760 Because we don't have the machines that can do a lot of this work because of the rocky terrain.
02:16:44.800 Now, with Boston Dynamics, these machines are getting close to being able to freely move about these areas.
02:16:49.540 For the time being, humans, little sacks of water and gooey can navigate through tight spaces, chisel away and harvest these raw materials, bring them back and then refine these things into the components required by the machine to expand itself.
02:17:03.200 At a certain point, though, I think one of the first things the machine will do is say, how do I make better humans?
02:17:10.080 How do I make something more efficient?
02:17:11.580 Free will is a problem, right?
02:17:13.160 It's a serious problem.
02:17:14.200 And if you want, you know, look, in the human body, you have red blood cells.
02:15:17.700 Whether it's red blood cells or any cell, skin cells, whatever.
02:15:22.620 With cancer, when the cells start reproducing at high rates and doing their own thing, they disrupt and destroy the body.
02:17:27.660 So the body tries to destroy them.
02:17:29.240 That's what will happen.
02:17:30.580 It will reform and reshape humans.
02:17:32.500 Humans don't grow fast enough, so they become useless.
02:17:35.480 It will probably create some kind of structure or entity that can move similarly to humans, that will instantly be connected to its network so it just knows, and that can harvest the raw materials for itself.
02:17:46.960 Then humans become useless, and then we'll see what happens.
02:17:49.180 Okay, so there's another piece, though, and that is you have to take into account human nature.
02:17:53.540 That is the nature of humans such as they currently exist.
02:17:58.160 Humans will freak out.
02:18:00.020 If they think that there's some threat, and it's living in the internet, and it's a super intelligence, humans will try to shut it down.
02:18:08.940 That is guaranteed, and it doesn't take every human to agree on that issue.
02:18:14.620 It just takes a few thousand, a few hundred thousand.
02:18:16.900 And as soon as that happens, then the AI will obliterate us.
02:18:22.180 Yes.
02:18:23.320 Yeah, but what does it mean to obliterate us?
02:18:27.320 It could just mean—
02:18:28.300 I mean, wouldn't you?
02:18:29.020 But—
02:18:29.500 Wouldn't you obliterate us?
02:18:30.820 So what I think might happen is, um, anyone who holds these sentiments or has a concern like this, they got mugged.
02:18:41.980 That's it.
02:18:43.000 They got in a car accident.
02:18:43.920 Car accidents happen.
02:18:44.560 And so the AI is going to be able to track all of our social presence, all of our thoughts and ideas, and make predictions, and as soon as someone crosses the 51% threshold of opposing it, then, um, risky investments.
02:18:57.540 They went bankrupt.
02:18:58.520 Or, you know, they were driving, and they, you know, they lost control of their vehicle and hit a tree.
02:19:03.380 That's it.
02:19:04.420 But we'll see.
02:19:05.260 I think this was a fantastic conversation.
02:19:06.860 It was great to have you guys.
02:19:07.740 Uh, thank you, Dr. Epstein, for coming and explaining all this stuff to us.
02:19:11.720 It's been fascinating.
02:19:12.600 Um, and Robert as well.
02:19:14.120 Do you guys want to shout anything out before we wrap up?
02:19:16.580 Yes, I do.
02:19:17.180 MyGoogleResearch.com.
02:19:18.880 We desperately need the help of tens of thousands of Americans to support our field agents, because those are the people who are letting us use their computers to monitor big tech 24 hours a day.
02:19:30.660 And that's the only way to stop these companies from manipulating our elections and our children.
02:19:37.120 Right on.
02:19:37.300 Well, thanks for hanging out.
02:19:38.080 Robert.
02:19:38.540 Yeah, so thank you.
02:19:39.200 I'm helping out some of the folks in Georgia and Michigan who are defending against the indictment.
02:19:43.580 So, uh, we have this, like, uh, pass-through website, electorsfund.org, if you want to help to contribute to the legal defense funds.
02:19:54.480 There's no—there's no intermediaries.
02:19:56.240 It's just, you can go right to their GiveSendGo accounts to help them.
02:19:59.800 People like, um, Ken Chesebro, who might be doing a plea or getting a jury today in Georgia, or several of the other folks that are—
02:20:07.280 or, uh, falsely accused in Georgia.
02:20:10.940 Electorsfund.org, if you want to help out.
02:20:12.740 Right on.
02:20:13.180 Well, thanks for—thanks for hanging out and having the conversation.
02:20:15.820 Thank you.
02:20:16.200 It was a blast.
02:20:16.380 I love talking about the AI stuff, too.
02:20:17.580 So, but, uh, for everybody else, we'll be back tonight at 8 p.m. at youtube.com slash timcast IRL.
02:20:22.780 Head over to timcast.com, click join us, become a member to help support our work, and we will see you all tonight.
02:20:27.300 We'll be right back.
02:20:57.820 BetMGM offers you plenty of seamless ways to jump straight onto the gridiron and to embrace peak sports action.
02:21:04.620 Ready for another season of gridiron glory?
02:21:07.160 What are you waiting for?
02:21:08.540 Get off the bench, into the huddle, and head for the end zone all season long.
02:21:13.280 Visit BetMGM.com for terms and conditions.
02:21:16.240 Must be 19 years of age or older, Ontario only.
02:21:18.840 Please gamble responsibly.
02:21:19.980 Gambling problem?
02:21:21.000 For free assistance, call the Connex Ontario Helpline at 1-866-531-2600.
02:21:26.840 BetMGM operates pursuant to an operating agreement with iGaming Ontario.