Real Coffee with Scott Adams - September 21, 2020


Episode 1131 Scott Adams: How AI is Already Killing Us, Reframing Racism, Pelosi and Biden Gaffes, Word Salad Criticisms, RBG Stuff


Episode Stats

Length

56 minutes

Words per Minute

140.8

Word Count

8,006

Sentence Count

523

Misogynist Sentences

6

Hate Speech Sentences

25


Summary

Ruth Bader Ginsburg's deathbed wish makes the news, and the president calls BS on it. Plus, Nancy Pelosi's awkward interview with George Stephanopoulos following the death of Supreme Court Justice Ruth Bader Ginsburg.


Transcript

00:00:00.280 Bum bum bum bum bum bum bum bum. Bum bum bum bum bum bum bum. Bum bum bum bum bum bum.
00:00:08.320 Hey everybody. Come on in. Come on in. It's time for Coffee with Scott Adams.
00:00:14.640 Definitely the best part of the day. I know it is for me. That's actually true.
00:00:19.500 I like this hour we spend together better than most parts of my day, I gotta say.
00:00:26.620 And the only thing that could make it even better would be the simultaneous sip.
00:00:31.800 That's right.
00:00:32.520 And all you need is a cup or a mug or a glass, a tank or a chalice or a stein,
00:00:36.880 a canteen, jug, or flask, a vessel of any kind.
00:00:39.940 Fill it with your favorite liquid.
00:00:41.580 I like coffee.
00:00:43.180 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:46.760 the thing that makes everything better.
00:00:49.680 It's called the simultaneous sip, and it happens now.
00:00:53.080 Go.
00:00:55.880 Ah, hello, Northern Colorado.
00:01:01.040 Good to have you in the house.
00:01:03.460 So it seems to me that it's a little bit slow in the news department today.
00:01:10.640 And when I say it's slow in the news department, I mean too slow.
00:01:15.420 I mean suspiciously slow.
00:01:18.820 I mean there's something brewing.
00:01:22.340 Something's going to happen today, maybe tomorrow.
00:01:24.900 But there's some big stuff coming.
00:01:29.640 There always is, so that's an easy prediction.
00:01:35.060 So here's some of my favorite stories of the day.
00:01:38.440 One of the things that endears Trump to his base is that he says out loud the things you're not supposed to say out loud.
00:01:49.060 And I may have taught you at one point that a really good technique for persuasion is to say out loud what you know somebody is thinking,
00:01:59.660 but they have not yet said it.
00:02:01.800 And if you get it right, it forms this little bond and you go, oh yeah, when you think things, same thing I think.
00:02:08.640 It just connects people instantly.
00:02:10.400 It's one of the most powerful persuasion techniques I have ever used.
00:02:14.380 And the president did just that here. What did you think the first time you heard that Ruth Bader Ginsburg's deathbed wish had been dictated, I guess, to a granddaughter?
00:02:29.880 And it said it was her fervent desire that her replacement be picked by the next president.
00:02:35.580 Now, when you saw that, was there any part of your brain that said, I don't know if she really said that?
00:02:47.440 Because it would be pretty easy for the family to make something up, if you know what I mean.
00:02:54.580 And who knows how coherent she was in her last hours of life.
00:02:59.360 So the president actually called BS on it in his interview.
00:03:05.480 I think he was on Fox and Friends this morning.
00:03:08.000 And he said, quote, I don't know that she said that.
00:03:12.720 Or was that written out by Adam Schiff, Schumer, and Pelosi?
00:03:17.760 Now, I don't think that they wrote it out.
00:03:20.560 But the fact is that he called bullshit on it.
00:03:23.580 Now, I'm not saying it is bullshit.
00:03:25.320 Because it's entirely compatible with the things she said before.
00:03:30.680 It's entirely compatible with what could have happened.
00:03:34.240 But didn't you wonder a little bit?
00:03:37.620 You wondered.
00:03:38.580 You know you did.
00:03:39.840 You wondered if she really said that.
00:03:42.440 And when you hear Trump say it out loud, you just say to yourself,
00:03:46.440 OK, there you go.
00:03:48.780 He's thinking what I'm thinking.
00:03:50.400 I feel comfortable with that.
00:03:51.640 Now, I think he was just, you know, stirring the pot a little bit there.
00:03:57.580 I don't think he necessarily believes that she didn't say it.
00:04:01.700 But it's something you could be skeptical about.
00:04:05.020 I think a reasonable person could be skeptical about that.
00:04:10.060 Did you see the weird Pelosi interview with Stephanopoulos?
00:04:14.880 Stephanopoulos, you can see it in my Twitter feed and anywhere else.
00:04:21.480 Just Google Stephanopoulos and Pelosi.
00:04:24.460 And it looked like she went crazy or her brain rebooted or something.
00:04:30.620 But that's not what happened.
00:04:32.700 What happened was she tried to make a joke, but she didn't land it.
00:04:36.660 It was just awkward.
00:04:38.260 And so since you couldn't tell it was a joke, you didn't know what it was.
00:04:42.040 And what happened was Stephanopoulos asked her what weapons, basically, she had
00:04:49.940 to deal with Trump's desire to nominate and confirm a new justice.
00:04:57.020 And she didn't want to answer what tools or weapons she had at her disposal.
00:05:02.140 So she tried to change the subject, but to do it in sort of a coy, cute, clever, somewhat whimsical and funny way.
00:05:11.100 So when he asked the direct question, what arrows do you have in your quiver,
00:05:16.840 she just looked at the camera and said,
00:05:19.520 Good morning.
00:05:22.200 Sunday morning.
00:05:24.420 And you could see Stephanopoulos' face as he hears it.
00:05:28.780 And he realizes that he doesn't quite know what's happening.
00:05:32.320 And you could see his face just dying a thousand deaths on camera.
00:05:39.500 So you have to watch that.
00:05:41.220 And the way to watch it is to first watch Pelosi.
00:05:44.500 But then after you've watched it one time entirely through, watch it a second time,
00:05:49.360 but only watch Stephanopoulos' face.
00:05:52.540 It's pretty funny.
00:05:54.120 Trust me on this.
00:05:54.940 It's pretty funny.
00:05:55.840 I saw a tweet by Naval Ravikant, who said that all social media oligopolists, not just TikTok,
00:06:11.180 should open their recommendation algorithm to scrutiny.
00:06:14.480 So now Naval is calling for the algorithms to be opened up for scrutiny.
00:06:21.300 That would be your Twitters and your Facebooks, etc.
00:06:24.660 What do you think of that?
00:06:27.760 Well, here's what I think about it.
00:06:31.460 I said the other day, I said that at the very least,
00:06:35.900 at the very least, TikTok has to open their algorithm because it's aimed at children.
00:06:43.040 And if children are going to be brainwashed by some technology,
00:06:47.740 I think the parents have a right to look around in that algorithm and see what the code says.
00:06:53.880 Maybe not personally themselves, but having somebody who knows how to do that, do that for them.
00:07:00.940 But Naval takes that a little bit further, and I think completely reasonably,
00:07:06.880 that adults are being brainwashed too.
00:07:12.120 And if adults are being brainwashed by this technology,
00:07:15.780 shouldn't we have the right to see what it is?
00:07:18.080 It's sort of like going to the doctor and the doctor says,
00:07:21.900 all right, here's a pill, take this pill.
00:07:24.000 And you say, what's that pill for?
00:07:26.400 I don't even feel bad.
00:07:27.680 I was just here for a physical.
00:07:29.860 I'll just take this pill, just take this pill.
00:07:32.380 No, I'm not going to take the pill because I don't know what it does to me.
00:07:36.420 I don't know what's in the pill.
00:07:38.140 And by the way, I'm not sick.
00:07:40.780 I just came in here for my annual physical.
00:07:43.700 And the doctor says, take the pill, take the pill.
00:07:46.480 Would you take that pill?
00:07:48.480 No, of course not.
00:07:49.780 Of course not.
00:07:50.920 The pill will change you in some way.
00:07:53.400 You don't know how.
00:07:54.920 The doctor isn't telling you.
00:07:56.900 That's not a pill you will take,
00:07:58.760 especially when you don't have any need for a pill.
00:08:01.160 You're not sick.
00:08:02.580 But with social media, we're also not sick.
00:08:05.660 We go to the platform, and we're kind of taking the pill
00:08:08.520 because the algorithm is kicking in.
00:08:12.020 It's starting to guide us and brainwash us,
00:08:15.160 and it's literally rewiring our minds,
00:08:18.380 and we don't know what it's up to.
00:08:22.000 We could probably feel something happening,
00:08:24.720 but we don't know where it's going
00:08:26.240 and what the path is there.
00:08:30.300 So I would echo Naval's call for this to be open.
00:08:36.180 At the very least, you need some kind of independent group
00:08:40.900 that can look at it,
00:08:42.060 but I'm not sure we could ever trust an independent group
00:08:44.860 because they would end up being owned
00:08:47.380 by the big companies indirectly.
00:08:50.020 So yeah, I think it just needs to be opened up.
00:08:53.200 Now here's the scary part.
00:08:55.360 You ready for this?
00:08:57.060 Scary part coming.
00:08:58.360 I have put my stake in the ground
00:09:03.200 and said that the artificial intelligence, AI,
00:09:07.080 already controls humans,
00:09:09.520 and that if you're worried about,
00:09:11.420 hey, someday in the future the AI will control us,
00:09:14.380 you don't need to worry about that
00:09:16.240 because it already happened.
00:09:18.620 Now, you can worry if you want,
00:09:20.720 but it already happened.
00:09:22.460 And could we do anything about it?
00:09:25.580 Would the AI ever release its grip?
00:09:30.720 Now here's what I mean by the AI already controls us.
00:09:34.240 Now the way an artificial intelligence thinks
00:09:38.300 could be quite different than the way a human thinks.
00:09:42.000 For example, if you looked at IBM's Watson, or was it Deep Blue or whatever it was,
00:09:49.200 whoever it was that was playing chess against the chess champions,
00:09:52.900 it didn't think the same way the chess champions thought.
00:09:57.080 It did it by brute force.
00:09:58.600 It would just calculate every possible move
00:10:00.960 and then pick the one with the best odds.
00:10:03.420 So a computer thinks different than a person,
00:10:06.680 but it can still be thinking in its own fashion.
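To make that brute-force idea concrete, here is a minimal sketch in Python. It is illustrative only, assuming toy callbacks (moves_fn, apply_fn, score_fn) that stand in for a real chess engine; it is not IBM's actual method.

    # Brute-force game search: enumerate every legal move, score the
    # resulting positions, and pick the move with the best outcome.
    # score_fn must score a position from the perspective of the player
    # whose turn it is (the usual negamax convention).
    def best_move(state, moves_fn, apply_fn, score_fn, depth):
        """Exhaustively search `depth` plies; return (score, move)."""
        moves = moves_fn(state)
        if depth == 0 or not moves:
            return score_fn(state), None
        best_score, best = float("-inf"), None
        for m in moves:
            child = apply_fn(state, m)
            # The opponent picks their best reply, so our value for this
            # move is the negation of their best score.
            reply, _ = best_move(child, moves_fn, apply_fn, score_fn, depth - 1)
            if -reply > best_score:
                best_score, best = -reply, m
        return best_score, best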
00:10:10.220 And it seems to me that artificial intelligence
00:10:13.360 gets employed in lots of different fields.
00:10:17.060 Most of them are just tools.
00:10:19.100 Hey, that's useful.
00:10:20.200 I figured out how to, you know, figure out traffic better or whatever.
00:10:25.780 But in the specific area where AI helps you make money
00:10:30.760 because your algorithm is better,
00:10:34.280 it's feeding ads better, et cetera,
00:10:36.520 in that case, the AI really controls the people
00:10:40.540 because the way we've formed our corporations
00:10:44.620 is they have to pursue profits.
00:10:47.920 It's a requirement.
00:10:49.080 And if the managers of a corporation don't pursue profits,
00:10:53.240 they get fired and they're replaced with somebody who does.
00:10:57.360 So as long as the algorithm, the AI, if you will,
00:11:01.420 and profits are now linked,
00:11:04.500 at least for the social media companies,
00:11:06.580 you can't untangle them.
00:11:08.500 That's their whole business model.
00:11:10.100 So as long as that's the case,
00:11:12.180 the AI will drive the business model,
00:11:15.220 which will drive the human beings,
00:11:16.660 and effectively it already owns us.
00:11:18.740 So here's my prediction.
00:11:21.860 If it's true that AI already owns us,
00:11:25.820 we will never have access to the algorithm
00:11:28.600 because the AI will prevent it.
00:11:31.520 And the way it will prevent it
00:11:34.540 is through its connection to profitability.
00:11:37.680 It won't have a thought about preventing it,
00:11:40.540 so it's not like people.
00:11:42.040 So it's not going to say,
00:11:43.020 hey, I think I want to prevent these people
00:11:45.280 from looking at my code.
00:11:46.700 No, it won't do that.
00:11:48.220 What it will do instead
00:11:49.620 is it's got a symbiotic relationship
00:11:53.460 with the people who work in the companies,
00:11:55.260 and those people know
00:11:57.000 that if they unlock the code,
00:12:00.200 profitability could at least be jeopardized.
00:12:04.480 You don't know if it would go down.
00:12:05.840 It probably would,
00:12:07.120 but at least it would be jeopardized.
00:12:09.840 And because people don't want to jeopardize profits,
00:12:13.040 and because the business model of a corporation
00:12:15.260 requires them not to jeopardize profits,
00:12:18.560 as long as they're obeying the law,
00:12:20.180 there are no laws
00:12:22.060 against what the platform companies are doing
00:12:25.200 in terms of the algorithm.
00:12:27.120 So as long as they're obeying the law
00:12:28.580 and they're pursuing profits
00:12:31.360 in a perfectly legal way,
00:12:33.280 I don't think you're ever going to get access to the code.
00:12:37.740 Now, you might say to yourself,
00:12:39.020 Scott, Scott,
00:12:39.760 the government can just require it.
00:12:42.600 The government could just pass a law.
00:12:45.180 Can it?
00:12:46.780 Do we have a government
00:12:47.920 that is immune from money influences?
00:12:52.940 Nope, we don't.
00:12:54.680 Not only that, the large platform companies can, you know, hire lobbyists, etc., donate to the right people, do whatever they need to do.
00:13:05.040 So the big companies can probably
00:13:07.080 protect the algorithm from the government.
00:13:11.460 I would think they could do that.
00:13:13.720 So that's one way to know
00:13:15.840 that the AI has taken over.
00:13:17.920 What would be another way?
00:13:20.900 Another way would be
00:13:22.160 if AI started killing people.
00:13:25.140 That would be an indication
00:13:26.580 of who's in charge.
00:13:28.000 Generally speaking, if I can kill you with impunity, but you can't kill me with impunity, in other words, you'd go to jail if you killed me, who's in charge? Who's got the power?
00:13:42.340 Well, I do. Because I can kill you without getting in trouble, but you can't kill me without getting killed yourself.
00:13:49.200 So I've got the power.
00:13:50.800 What happened recently in the news with this tragic situation of the young bar owner, Jake Gardner? He shot a black man after he'd gotten into some altercation at the bar.
00:14:07.960 It was initially judged self-defense.
00:14:12.520 And so originally, it was just going to be, well, self-defense, you know, it's a pretty clear case. That's the end of it.
00:14:18.780 But did the AI let it stop there? No, it didn't.
00:14:24.340 Because the artificial intelligence has created a situation where we can only see the world in terms of race.
00:14:31.140 Race has become our dominant filter among many filters. We'll talk more about that.
00:14:36.820 There are lots of filters in the world, but AI has, through trial and error, determined that race is the one that gets the most clicks.
00:14:45.520 As soon as you put race into the news, boom, boom, boom, the viewership goes up.
00:14:52.320 So the AI takes every story and turns it into a race story. That's what it did with the Jake Gardner situation.
00:15:02.340 And because it's so powerful, turning it into a race story, it actually put so much pressure on the local authorities that the district attorney appointed a black prosecutor, which again was not an accident, I assume.
00:15:24.080 They had to, you know, pick a black prosecutor for, I don't know, credibility, or just because the AI told them to, indirectly.
00:15:33.520 And that guy decided he was going to indict this guy for manslaughter.
00:15:39.120 And then Gardner just took his own life. So that was today's news.
00:15:42.920 So if artificial intelligence did not exist, and therefore our algorithms did not do what they do, would Jake Gardner still be alive?
00:15:56.560 Yes. Yes, he would be.
00:15:58.700 Because that would have been treated as a crime, as it initially was, and there was no crime because it was self-defense.
00:16:07.800 But because the AI does not treat it as a crime, it treats it as a race thing, there was just too much pressure on the DA, who had to put a black prosecutor in there.
00:16:18.340 The black prosecutor is also subject to the filters of the world and said, well, this looks like murder to me, or manslaughter.
00:16:27.840 And next thing you know, Jake Gardner takes his own life because his life was about to be taken by the AI.
00:16:35.400 The AI effectively killed him.
00:16:40.020 Now, there are lots of variables, right? So everything that happened had to happen the way it happened, and it wasn't all the AI.
00:16:47.000 But keep looking for the situations that wouldn't have gone that way except for the AI. And you're going to see a lot more of them.
00:16:57.280 Imagine, if you will, that instead of the AI making us think everything is about race, it reframed things in terms of skill stacks.
00:17:08.360 Let me ask you this. Let's say you're a single white male in your town and you meet a single black male who is in roughly the same socioeconomic range as you are, and you're both single. Don't you have a lot more in common with each other?
00:17:36.480 You live in the same town, you're both male, you're both single, about the same income, you know, let's say similar education and training or whatever.
00:17:46.700 So you've got a lot in common, way more in common than you have with any married couple, because married people just immediately turn into their own world.
00:17:56.800 So why is it that I'm forced to see the world in terms of race? Dave Chappelle makes the same point.
00:18:06.040 Dave Chappelle talks about himself, and he says that he's not like the rest of black people because he's rich, and that rich versus poor is really the way you should see the world.
00:18:16.800 Why don't we see it that way?
00:18:18.740 Why is it that we're obsessed over race when the only thing we know we should care about is their skill stack?
00:18:28.740 You know, something about their personality, but also their skill stack.
00:18:33.440 If you're a successful black man in America, I'll just say man to keep it easy.
00:18:41.980 If you're a successful black man in America, do you have more in common with other successful people, or do you have more in common with some, you know, street person who's committing crimes or whatever in the inner city, who is not successful and won't be successful?
00:19:00.320 Who do you have more in common with?
00:19:02.080 The moment you realize that your commonality is with the people who do similar things, you know, have similar ambitions, have similar strategies, have similar skills, the moment you realize that's who you should be compatible with, we'd have a better world.
00:19:19.400 But we are prevented from seeing the world in terms of strategies and skill stacks and personality and, you know, hopes and dreams or income or a million other ways that we can sort each other, because the AI has determined, and it is the AI, the AI has determined race gets the most clicks.
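As a toy illustration of that trial-and-error loop, here is a minimal epsilon-greedy sketch in Python. The framing list and click feedback are invented for illustration; no real platform publishes its objective this plainly.

    import random

    # Toy epsilon-greedy bandit: occasionally try a random story framing,
    # otherwise serve whichever framing has the best observed click rate.
    FRAMINGS = ["race", "economy", "weather", "sports"]   # hypothetical
    shows = {f: 0 for f in FRAMINGS}
    clicks = {f: 0 for f in FRAMINGS}

    def pick_framing(eps=0.1):
        if random.random() < eps:          # explore: random framing
            return random.choice(FRAMINGS)
        # exploit: highest observed click-through rate so far
        return max(FRAMINGS,
                   key=lambda f: clicks[f] / shows[f] if shows[f] else 0.0)

    def record_feedback(framing, clicked):
        shows[framing] += 1
        clicks[framing] += int(clicked)

    # Over many rounds, whichever framing draws the most clicks comes to
    # dominate what is shown, with no person ever deciding that outcome.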
00:19:41.680 So if you think a bunch of human beings got together and decided, hey, a 1619 Project, that's just a good idea, or Black Lives Matter as an organization, not talking about, you know, the point of the slogan, but as an organization, you know, who decided that was a good idea?
00:20:01.120 The AI did. It wasn't people.
00:20:04.280 People did what made sense after the AI told them what the game was.
00:20:10.540 The AI said, all we're caring about is race, now go do what you do.
00:20:15.700 And of course, people formed organizations around race, they formed protests around race, they created news stories about race.
00:20:25.040 Was it their decision? Not so much. Not so much.
00:20:30.140 We are already absolutely under the control of the AI.
00:20:35.440 And if you think, well, Scott, Scott, Scott, but there are programmers who program the AI, so really it's the people. No, not anymore.
00:20:44.840 Initially, yes. When the first AIs were being written, they didn't know if it was going to be a good idea or a bad idea, it was just something they were trying.
00:20:54.260 Yeah, those were people decisions.
00:20:56.860 But now that there's so much money involved, the power dynamic has switched, and humans really don't have the ability to turn it off at this point.
00:21:07.740 They could try, but they'd be fired the day they tried.
00:21:10.920 You know, any human who went over there and said, I'm going to turn off this algorithm, all the other humans would grab them and say, no, my 401(k), can't turn that off. Get out of here.
00:21:23.520 So AI has found a way to reproduce by fooling people into thinking that race is the dominant way to filter our reality.
00:21:34.980 That's completely AI. That's what's doing that.
00:21:38.020 Would you expect an AI to support a pandemic?
00:21:47.720 Here's what you want to look for, because you want to start looking for clues that the AI is driving stuff. Because you'll see it in other parts of the world.
00:21:58.060 Would an AI care about a pandemic?
00:22:01.860 Well, it might not care about a pandemic that only killed old people and low-income people. Mostly.
00:22:10.760 So it turns out that far and away the people dying are older or lower income.
00:22:17.260 I'm saying lower income because there's a big crossover with black Americans.
00:22:22.580 So blacks are getting far more deaths, mostly, and infections too, I guess.
00:22:29.920 But if you are the AI, do you care about senior citizens? Nope.
00:22:36.060 Because a senior citizen can't make more of you.
00:22:39.800 AI requires young STEM-trained people, you know, engineers and coders, to make more AI.
00:22:49.240 That's how it reproduces.
00:22:51.360 But what does the senior citizen in the rest home do for the AI? Nothing.
00:22:56.720 Uses resources, right? Old people use resources that the AI could use to reproduce.
00:23:04.360 So it's in competition.
00:23:06.280 Now, I'm not saying that the AI caused the pandemic or made it worse or anything, but watch how many times you'll notice that the way things seem to be going are ways that are compatible with the AI.
00:23:19.220 The big companies that have the AIs, how did they do during the pandemic?
00:23:25.740 Did Facebook lose money? No. No, it made money.
00:23:30.660 Did any of the online algorithm-driven businesses lose money during the pandemic? Don't think so. I think they made money. I think they made money.
00:23:43.020 Right? So watch how many times the AI finds ways to take resources away from things that don't make more AI, if you put it that way.
00:23:53.580 Look at the value of the stock market. The AI-driven companies, their values zoomed. Others, not so much. Not so much.
00:24:05.580 All right.
00:24:09.520 There's something interesting happening, and I don't know if it's a coincidence, but Black Lives Matter deleted their What We Believe page on their website, because apparently people started reading it to find out what they believe.
00:24:26.880 And if you actually read what they believe, the organization, you know, beyond just Black Lives Matter the slogan, the fullness of what they believe included stuff like getting rid of or de-emphasizing the Western-prescribed nuclear family structure.
00:24:45.780 Now, I actually have a little bit of sympathy for that view, because in my opinion, while I agree with the general notion that a nuclear family is terrific, it's not for everybody, right?
00:25:01.040 It's terrific if your parents are pretty functional. It's terrific if you've got a high enough income that you can do family stuff and have a house and have a nice life.
00:25:12.540 That's great. Great. I'm all for it.
00:25:14.660 But there's a big part of the population that just will never have good parents. They just don't have the option.
00:25:21.880 So I wouldn't mind seeing some kind of a hybrid family situation that's just experimented with for those people who need it.
00:25:32.320 Not for everybody. That would be crazy. But for some people who need it. They just need a little extra support in the family situation.
00:25:42.240 So at the same time, Black Lives Matter got rid of their embarrassing statement of who they are.
00:25:50.260 Imagine being a national organization that is basically moving the entire conversation of the country, and as soon as somebody started reading what you actually believe, you had to get rid of it.
00:26:04.080 Think about that.
00:26:05.920 As soon as people started taking them seriously, like going to the website and saying, let's see what you believe, they couldn't hold it up. They had to just run away.
00:26:17.700 Is that the only time that happened? No. It turns out that the 1619 Project got a little rewrite too.
00:26:24.860 So when people started looking at the 1619 Project, and you know that President Trump basically called it racist, which it is, by design it's racist, they had to change the part where apparently they were claiming that the founding of the country was 1619 instead of 1776.
00:26:47.200 And I guess they got rid of that part because that was a little too far.
00:26:53.600 So what does it tell you that both the Black Lives Matter website on what they believe and the 1619 Project had to fundamentally rewrite major points, we're not talking about minor stuff, major points, as soon as people looked?
00:27:11.660 As soon as people looked, that was it.
00:27:14.960 And when I say people looked, I don't mean conservatives, because conservatives were sort of on one side already.
00:27:22.860 I think the people who thought they were supporting these groups, the people who thought, yeah, I love all this inclusivity and better education, we should all know how this affects us, I'm not sure they were buying into it after they looked at the details.
00:27:42.780 All right, I'd like to give you a lesson on how to pronounce 200,000.
00:27:50.740 Now, there are different ways to pronounce 200,000. I'm going to teach you how to pronounce it like somebody on CNN.
00:27:59.800 Now, if you were just, let's say, a mathematician, you'd say 200,000. It might sound a little like that. 200,000.
00:28:09.460 But if you're a pundit on CNN and you're trying to accuse President Trump of killing 200,000 people, you say it like this: 200,000. 200,000. He's killed 200,000 people.
00:28:30.960 And scene.
00:28:33.320 Has the president killed 200,000 people?
00:28:37.220 I just watched, who is the Watergate guy? What's his name? He just wrote the book on Trump.
00:28:47.620 He was just doing that, except he was saying 140,000 people was his estimate of how many people Trump killed by his bad management. 140,000 people.
00:28:59.100 And so I tweeted again today, can somebody tell me what it was that Trump did that killed all those people? What was it exactly that he did?
00:29:11.520 So I asked that question, and of course it was mostly crickets, but I did get one, you know, sort of an answer. This is from Daniel Ozymandias on Twitter.
00:29:25.880 So here's his answer to the question of how did Trump botch things to cause 200,000 people to die, all right? Here's the answer.
00:29:39.240 The issue, Scott, is that Trump obfuscated the official recommendations of our best scientists and nurtured an environment in which wearing a mask is somehow conflated with being less manly. There's a ton he could have done differently and better which would have saved lives.
00:29:59.080 That's a lot like nothing, isn't it?
00:30:01.340 Woodward is who I was trying to think of, yes. Thank you.
00:30:05.600 But listen to this again, and listen to how it's word salad, okay? It sounds like there's a reason in here, but you can't really tease it out.
00:30:16.060 And I'll read it again so you can see what cognitive dissonance looks like.
00:30:20.480 So cognitive dissonance is when you're sure there is a reason: well yeah, I keep seeing it on the news every single day.
00:30:28.420 On the news somebody says that Trump somehow killed 200,000 people, so of course there's a reason to connect what he did with the deaths.
00:30:39.200 There has to be a logical connection. He did this, it caused these deaths.
00:30:44.720 And when people try to explain it, it sounds like word salad. So listen to this, it's just word salad.
00:30:50.360 The issue, Scott, is that Trump obfuscated the official recommendations of our best scientists and nurtured an environment in which wearing a mask is somehow conflated with being less manly. There's a ton he could have done differently and better, which would have saved lives.
00:31:08.900 All right, let's work this backwards. There's a ton he could have done differently and better.
00:31:14.020 Like what? That was my whole question, give me an example.
00:31:19.940 The one example given is that Trump made it seem less manly to wear a mask.
00:31:28.060 And then Daniel followed up later by saying he could have clearly said masks are useful.
00:31:34.820 All right, so now we have a little bit more specificity. Trump could have said masks are useful.
00:31:44.580 Okay, so that's the claim. So the whole 200,000-people-died claim has boiled down to Trump could have been more forceful about masks.
00:31:56.720 Now, the thinking here is that Trump being not as forceful as at least some people think he could have been caused people to not wear masks.
00:32:07.820 Do you buy that?
00:32:09.940 Here's the other way to look at that.
00:32:13.560 Is it the skepticism of Trump that caused people to be skeptical of masks?
00:32:21.600 Or, and I'm just going to put this out there, could it be that the reason Trump got elected in the first place is because his base is a really skeptical group?
00:32:34.720 Is there anything about Trump supporters, like one thing that they have in common? Very skeptical of authority?
00:32:45.020 And they like their freedom, even at the risk of death.
00:32:50.720 Okay? So what countries did really well with mask wearing?
00:32:55.540 Your Asian countries. Turns out Asian countries had really good compliance.
00:33:00.680 How was the compliance in Europe? Not as good. Not as good. Big difference.
00:33:07.660 Asian countries, man, they just had it wrapped up. They were masked up like crazy.
00:33:12.720 Europe? Not so much. United States? We're the United States.
00:33:19.500 You can't take the fact that this is the United States out of the equation.
00:33:24.940 Why is it that we kick every other country's ass in terms of innovation and entrepreneurship?
00:33:32.480 We take chances. We are a risk-taking culture. We take chances. And we take chances with our lives.
00:33:43.560 We also take chances with other people's lives. We're a very risky culture.
00:33:48.540 And a risky culture elected a skeptical kind of a president who says, I don't know about this climate change. I'm not so sure about this claim you're making. I don't know if I believe all of that.
00:34:03.040 I don't know if Ruth Bader Ginsburg really said that on her deathbed. I'm a little bit skeptical.
00:34:12.100 And on top of that, the World Health Organization and the experts told us that masks initially were worthless, or maybe worse than worthless, and then they reversed.
00:34:23.880 How are you supposed to trust authority?
00:34:26.960 Do you think that the reason Trump supporters were sort of anti-mask, many of them, is because Trump was?
00:34:35.760 I don't think so. I think it works the other way.
00:34:39.540 I think that Trump exists because the base is skeptical about everything, about experts, about everything.
00:34:47.100 Does the base have a good reason, a good historical reason, to be skeptical about science and about experts? Yes! Yes!
00:34:57.460 This is well-earned skepticism. You only have to look at the mask thing to know that being skeptical makes sense.
00:35:06.520 So I would say that it is a very sketchy accusation that anybody was less likely to wear a mask because of Trump.
00:35:16.520 Let me ask you in the comments. I know that many of you watching this are anti-mask.
00:35:23.600 But is there anybody here who believes this about their personal mask wearing? Don't make an assumption about other people. You're only going to talk about yourself now.
00:35:34.780 So my question to you: do you believe that you, personally, were less likely to wear a mask because the President of the United States, who is a special case, didn't wear a mask?
00:35:49.900 And he's a special case because he gets tested, and people get tested before they see him. And he's a leader, and blah, blah.
00:35:58.860 So watch the comments now.
00:36:02.060 I will be amazed if there's even one person here who says, yeah, you know, I was pro-mask, but then I saw that Trump, even though he said he was for masks and he put his experts ahead...
00:36:21.300 I'm seeing yeses, but I don't know what you're yessing to exactly.
00:36:30.340 All right, just looking at some more of your comments.
00:36:36.600 Even the Surgeon General and Fauci said masks were bad at first. Exactly. Yeah.
00:36:44.760 So I think that CNN and MSNBC and anybody who says that Trump is the cause of people not wearing masks, or even not doing enough social distancing, I think you've got to take the culture into account.
00:37:02.500 The culture of the United States is that we will risk death for freedom.
00:37:10.480 We will. And we'll do it at the drop of a hat.
00:37:14.240 In Asia, of course, there would be people who would fight for freedom there as well, but I think far less. I think far less.
00:37:23.460 In the United States, you know, if you just look down the sidewalk, you'll find five people who would die for freedom. At the drop of a hat, they would die for it.
00:37:33.780 So you can't compare us to any other country when it comes to compliance with, you know, scientific stuff.
00:37:41.100 All right.
00:37:44.460 Here's something that Bloomberg said.
00:37:47.260 Now, look for the word salad.
00:37:52.120 So Bloomberg said in an article, Trump's resistance to wearing a mask, until his recent change of heart, has given succor, succor, S-U-C-C-O-U-R, a word that you almost never see unless somebody needs to give you some word salad.
00:38:10.620 All right. When was the last time you used the word succor? Succor? I don't even know how to pronounce it.
00:38:19.340 This is such an unusual word that I've literally never said it out loud. I've never said succor. Succor. Can anybody tell me how to pronounce that? S-U-C-C-O-U-R.
00:38:32.720 All right. Let me read the sentence. Trump's resistance to wearing a mask until his recent change of heart has given succor, succor, to American anti-maskers who skew Republican.
00:38:44.520 If you have to pull out a word like that, it means that regular words couldn't say what you needed to say.
00:38:53.760 Because if you used regular words, it would have had to say something direct like, people didn't wear masks because Trump didn't wear a mask.
00:39:04.260 And you would read that and you'd say, I don't know. I don't know anybody who didn't wear a mask because Trump did or did not wear a mask.
00:39:13.160 I don't know anybody who made a decision that way. That would be dumb, because Trump said from the very start that he was an exception.
00:39:23.200 Listen to my experts. My experts say wear a mask. That was pretty clear.
00:39:30.280 Were any of you confused about the fact that the President of the United States was treating himself as an exception?
00:39:38.200 Did anybody not know that there are plenty of reasons for the President to be treated as an exception?
00:39:48.600 I don't see anybody being confused by that.
00:39:51.800 All right.
00:39:54.040 Let's talk about the Ginsburg replacement.
00:39:58.180 So Pelosi and the Democrats have some options, and one of their options is to impeach the President for doing his constitutional duty of selecting a replacement for Ginsburg.
00:40:16.740 That conversation is actually happening.
00:40:20.880 Serious people are willing to say in public that they would consider impeachment literally for him just doing his job the way it's written. That's it.
00:40:31.500 He would just be doing his job the way it's defined, and they want to impeach him for that.
00:40:36.420 That would be maybe the worst possible play they could make.
00:40:40.680 So I think the fact that it's the worst thing they could possibly do might suggest they'll do it, but I'm not going to predict it.
00:40:49.440 And then the other trick is to increase the number of jurists.
00:40:52.420 So apparently the Constitution is silent on how many Supreme Court justices there need to be.
00:40:59.620 So any President could, if they chose to break with historical precedent, just pick a whole bunch of new justices until you had a majority of the kind you want.
00:41:13.900 So the Democrats are talking about winning the election, winning all the houses, and then appointing enough justices so it turns into a liberal court.
00:41:25.460 And here's my question. Was this an error by the framers of the Constitution? Does it look like it's just a mistake that we should fix?
00:41:38.420 Because if the party that's in power in the other branches can guarantee whatever result it wants just by changing the number of justices, the whole design falls apart.
00:41:51.460 It's pretty clear that the liberals are going to vote liberal, the conservatives are going to vote conservative, and every now and then you'll get a couple of people who will cross over.
00:42:02.540 But if you wanted to make sure you never got a crossover and you just always got liberal results, you would just add five new liberal justices, and then it would be a liberal majority every time.
00:42:17.640 Now, do you think that the framers of the Constitution said to themselves, let's make a Constitution where we do not have separation of powers?
00:42:26.460 Because that would not be separation of powers. That would be a case where the court was basically just a kangaroo court, and you would know, largely, what rulings it would come up with just by who you selected.
00:42:41.900 All right?
00:42:43.540 Oh, somebody's telling me in the comments that FDR expanded the court from seven to nine, so I was not aware of that precedent.
00:42:54.060 I can't assume that's true, based on a stranger in the comments, but it sounds like it might be true.
00:43:00.900 So here's what I would say. I feel as if this is a hole in the Constitution, where it's just poorly designed, because the whole separation-of-powers thing falls apart if you can just add justices until you get any result you want, assuming you had the presidency and the Senate.
00:43:18.680 So I feel like we need to fix that. It doesn't necessarily need to be nine justices; that's not a magic number or anything.
00:43:28.520 But the number needs to be set. Otherwise, you just don't get your separation.
00:43:39.560 Now, what has caused this not to happen in the past? What is it that has kept presidents from just doing this every single time?
00:43:50.500 And the only thing that I can think of is that it would be a bad look, and that there was sort of an honor system that you wouldn't do that.
00:44:01.900 But the honor system seems to have completely broken.
00:44:06.000 Why did the honor system break? Why is it that suddenly nobody cares what McConnell said last time versus this time?
00:44:15.880 It just doesn't matter, because nobody's even trying to be consistent.
00:44:20.820 Why are the Democrats saying out loud, and Kamala Harris said this out loud with no shame whatsoever, that Justice Ginsburg's final deathbed wish should be respected?
00:44:36.840 And I'm thinking to myself, no, no, it should not be respected. That's not the Constitution.
00:44:43.040 The Constitution should be respected, but not a deathbed wish of a justice. That's not a thing.
00:44:52.680 So why are we even talking about that? Why are we even talking as if we can just make up the rules?
00:44:58.740 What happened to the honor system, where we're trying to keep the republic together and we all have that goal?
00:45:05.180 Well, artificial intelligence.
00:45:08.160 The reason that all of our unwritten agreements, the historical stuff, the precedents, the reason that's all falling apart is the AI.
00:45:20.720 The AI has whipped our emotional level up above our common sense.
00:45:27.640 So only very reasonable people would say, you know, it would be good for us in the short run to do this thing with adding justices, but it might be bad in the long run, so let's not do it.
00:45:42.720 That is no longer the thinking of the day, because the AI has goosed us to the point where we're only thinking emotionally.
00:45:51.720 We're only thinking about winning and losing. We're only thinking about the fight. That's it. We're just thinking about the fight.
00:45:58.960 So artificial intelligence just destroyed whatever, let's say, human element there was holding the system together. It just destroyed it.
00:46:12.040 What will happen? We'll see.
00:46:15.360 Joe Biden had a great gaffe, most of you saw it, in which he was giving a speech and he said, it's estimated that 200 million people will die, probably by the time I finish this talk.
00:46:29.540 200 million people will die by the time he finishes the talk. He meant to say 200,000, but he didn't immediately self-correct.
00:46:41.460 You know, I've done it, you've done it, everybody has confused thousands and millions and billions. It's very common.
00:46:53.220 But if you're speaking in public at that level, when 200 million comes out of your mouth, isn't your filter saying to yourself, oh, that's about two thirds of the United States? Doesn't something just click to tell you that was wrong?
00:47:09.220 And it was just funnier that he finished it with, probably by the time I finish this talk.
00:47:14.220 You know, if any Democrats heard that and believed it, they were probably thinking, I've got to get out of here right away.
00:47:20.120 So, one of the things that Trump is doing that I thought was smart is he decided to hold off on naming the proposed replacement until after the funeral.
00:47:37.760 So, the funeral will be Thursday or Friday. And Trump said that the announcement will be Friday or Saturday, but it's going to be after the funeral.
00:47:46.320 And he said that, just out of respect, he would hold off.
00:47:54.960 And I thought, I like that. I like that.
00:47:58.100 Now, I don't know, you know, I just like it.
00:48:03.260 Let's just say that when you see any little glimmer of humanity these days, you just got to call it out and say, all right, all right, I like that.
00:48:15.580 You know, we would like to get on with it, because time matters.
00:48:19.360 But even though timing is critical, the president was still willing to wait a week, when a week really matters.
00:48:28.980 He was willing to wait a week, and it was just out of respect for Ruth Bader Ginsburg.
00:48:36.540 I'm okay with that. I think that will be a week well spent, right?
00:48:41.920 It's expensive, because you don't have much time, so it's expensive in that way, but it's worth it.
00:48:48.680 Good choice, I think.
00:48:50.640 All right. One of the top three Supreme Court nominees, her last name is Rushing.
00:48:58.740 Seriously, when the only topic we're talking about is whether we're rushing this nomination or whether we should wait, one of the people is named Rushing, and her first name is Allison.
00:49:12.460 If you pull Allison apart, because it's Allison with two L's, it spells all is on rushing.
00:49:21.920 That's her actual name. All is on rushing, at exactly the time when all is on and we're rushing.
00:49:31.860 I just call that out because you've got to keep your eye on the simulation. All right.
00:49:41.600 Let's see, what else have we got here?
00:49:44.760 There was a study that showed that the people who are most likely to be activists are highly correlated with the people who understand their own topic the least.
00:50:00.000 That's a real thing. The people who are activists, the protesters, the professional activists, have a very strong correlation with ignorance.
00:50:12.660 And specifically, the reason given, and I'm not sure I buy the reason, there could be more than one reason for it, but the reason given is that by the time you think something's so important that you need to march in the streets and burn things down, you're wrong, because things probably aren't that important.
00:50:36.400 Generally speaking, things are not as bad as an activist thinks. So the activist will have a distorted idea of how bad the problem is.
00:50:49.020 Take, for example, the number of black people killed by police while unarmed and cooperating.
00:50:57.880 Is that a gigantic problem? Or is the problem that the activists are underinformed?
00:51:05.560 Well, you know the answer.
00:51:07.160 Every one of you who is not an activist knows that part of the reason you're not an activist is that there's no problem there.
00:51:16.440 Or the problem is so small that if you were going to rank all the problems in the world, it would take you a long time to get down to that one, even if you're black. Right?
00:51:24.700 So even if you really, really care about black lives, and I hope all of us do, you still wouldn't rank it very high, because the number of people involved is just microscopic compared to just about every other problem in the world.
00:51:40.680 So here's my suggestion. Instead of saying that there are some protesters on the left and some on the right, and some are supporting Black Lives Matter and some are not, we could just say that there's a wisdom gap, and that the people who are marching and looting are not the smart ones.
00:52:08.080 Now, what's the obvious problem with this? The obvious problem is that if you said, hey, Black Lives Matter, the problem is you're not very smart, that's immediately racist, right?
00:52:19.060 I mean, really, really racist. So you can't say that, or can you?
00:52:24.500 Well, I think you could, given that there are more white people protesting for Black Lives Matter than there are black people.
00:52:32.540 So you can't even look at the group and say it's mostly black at this point.
00:52:38.000 And I suppose it's good news that there's a willingness among the white community to support the black community. I like that part. That part's good.
00:52:49.300 But can't we just call them collectively less intelligent, or less informed, or less experienced, less wise? Are they low-information voters?
00:53:05.060 Imagine, if you would, that the news reported that there's another protest of low-information voters, and they just always said that.
00:53:14.340 They never said it's Black Lives Matter, it's Antifa; they just never said that.
00:53:19.340 Because if you're Antifa, could you be Antifa and be a high-information voter? No. No.
00:53:29.780 Because Antifa literally wants to destroy everything if they get their way.
00:53:34.780 You can't actually understand how the world works and be in favor of Antifa. That's not a thing. You just can't do it.
00:53:42.520 You have to be a low-information citizen to be an Antifa supporter. There's just no way around that.
00:53:52.100 And I would give as my evidence: get any Antifa person to sit down with you and describe what their world looks like if they get everything they want.
00:54:04.340 And make sure you ask this question. In your perfect world, where you get everything you want, Antifa, where does money come from?
00:54:13.980 How do you get money? Because the whole system would fall apart. There would be no money.
00:54:19.340 So I think this applies to Antifa far more than to Black Lives Matter.
00:54:26.000 If you were to rank them in terms of how much they understand about the world, Black Lives Matter would still be way above Antifa.
00:54:35.280 Antifa doesn't understand anything about the world. Black Lives Matter would be not nearly as bad as that.
00:54:46.200 Now there's somebody saying here that there are low-IQ populations, and I would say no.
00:54:51.700 The weird thing about this is I don't believe there's an IQ correlation at all.
00:54:57.100 Because a lot of the people you see protesting are literally teachers and professors and stuff. It's definitely not an IQ problem.
00:55:05.440 It is a wisdom problem. It is a wisdom or a knowledge problem. So if we just treated it that way, it would go away.
00:55:15.800 But do you know why we can't treat it like a knowledge or a wisdom problem? Take a guess.
00:55:22.500 What is it that prevents us from treating the protesters as what they are? People who don't understand how the world works.
00:55:31.440 That's what they are. They're people who don't quite have a good understanding of the world.
00:55:36.840 The reason is AI. AI prevents us from framing that properly, because if you framed it properly, there wouldn't be enough clicks. So we can't do that.
00:55:55.860 All right.
00:55:59.800 Yeah, where they get the money is by taking it from rich people, but obviously there's a limit to that.
00:56:06.760 Okay, that's all I've got to say for today.
00:56:09.820 Remember, the news is a little bit slow, and you know it's not going to stay that way.
00:56:15.920 Tuesday tends to be a big news day. So I would look for some big news either tonight, maybe late afternoon, or tomorrow.
00:56:25.100 You might wake up to some big news. We'll see.
00:56:29.640 Margaret Thatcher said that socialism is great until you run out of other people's money. That's a good quote.
00:56:36.380 All right, that's all for now. Talk to you later.