Real Coffee with Scott Adams - June 15, 2020


Episode 1028 Scott Adams: I Help You Calculate Reparations, CHAZ Updates, How to Fix Everything That is Broken


Episode Stats

Length: 52 minutes
Words per Minute: 145.8
Word Count: 7,620
Sentence Count: 503
Misogynist Sentences: 2
Hate Speech Sentences: 29


Summary

In this episode of the podcast, Scott Adams talks about the anti-Trump backlash, why Kamala Harris is likely to be the Democratic vice presidential pick, and why it's OK to vote for a candidate based on their race and gender.


Transcript

00:00:00.000 Hey everybody, come on in. It's time. It's time for a coffee with Scott Adams, and you already know who you are, so put us all together and you've got everything you need, except you might need some kind of a container for your beverage.
00:00:30.000 What kind of container? Well, it could be a cup or a mug or a glass, a tank or a chalice or a stein, a canteen or a jug or a flask, a vessel of any kind. Fill it with your favorite liquid, I like coffee. And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better, including coronaviruses, economies, and race relations. Yeah, everything. Happens now. Sip. Go.
00:01:00.000 Ah. I can even feel some of our natural disasters mitigating just a little bit. The wind is dying down a few degrees. Yeah, it's looking good.
00:01:16.560 So you can tell that the anti-Trumpers are running out of material when they start to repeat. And some of their attacks are the dumbest attacks you've ever seen.
00:01:30.660 For example, they're going after Trump for his health. Now, I think it's fair to go after anybody over 70 for their health if they're running for president. So the attack in general, perfectly fair.
00:01:44.420 But the trouble is, the candidate he's running against is actually decomposing in a basement somewhere. So any health issues that you highlight on Trump, even if they're true, it's going to bring the question of health to the front.
00:02:05.840 I don't know if that's what they want. I don't know if that's what they want to do, if Biden is your candidate. I'm seeing more and more evidence that Kamala Harris will be the vice presidential pick.
00:02:17.000 The smart people have decided that she's the only one who can be picked. Guess why? Because of her race and her gender.
00:02:24.240 Why is it okay that the job of vice president of the United States, at least the candidacy on the Democrat side, will be determined by race and gender?
00:02:41.040 Why are we okay with that?
00:02:43.720 Did it just sort of, we just ended up here without paying attention?
00:02:47.840 Isn't picking a candidate based on their skin color and their gender the very thing we're supposed to be guarding against?
00:02:57.280 And we're not even pretending, you know, in the old days you would sort of pretend you weren't, but you really were.
00:03:03.880 You know, when Obama got elected, he got something like 95% of the black vote.
00:03:09.540 But, you know, nobody was, well, not nobody, but people didn't want to say, well, we want a black president, but people sort of wanted a black president.
00:03:22.640 There were a lot of people who just said, yeah, it's time. Let's get us a black president.
00:03:27.640 I was one of those people, and I still think it was good that we had a black president for two terms.
00:03:33.620 Because I think it served us well, in terms of how the country feels about itself.
00:03:40.140 It did not solve racism, as it turns out.
00:03:43.700 It didn't really solve anything.
00:03:49.080 All right, so I wouldn't worry about the attack on Trump's health.
00:03:52.840 He did say that the ramp was slippery. That's what I told you.
00:03:56.540 So if you want to find out the future, you know where to come.
00:04:00.900 Anyway, there's a funny story about the ticket sales for Trump's rally.
00:04:07.800 Reportedly, they've sold, or people have asked for something like a million tickets,
00:04:13.120 but there's only room for a few hundred thousand or whatever.
00:04:16.960 So the ticket sales, or at least the inquiries, are through the roof.
00:04:21.080 But here's what's funny.
00:04:22.060 So there's some suggestion that Democrats were buying tickets
00:04:27.920 so that they could essentially keep Republicans out of there,
00:04:33.520 so that it would be empty.
00:04:36.820 But I don't think they thought this through,
00:04:39.280 because there are always so many people outdoors
00:04:41.580 that if the Democrats bought a bunch of tickets for seats
00:04:45.660 and then didn't show up,
00:04:48.660 they would just let the people who were outdoors inside.
00:04:52.700 And they would not have had to pay for a ticket
00:04:54.920 because the Democrats paid for their tickets.
00:04:57.600 So I don't know that they've thought this through,
00:05:00.380 because I think they've cleverly figured out a way
00:05:02.520 for Democrats to buy tickets for Republicans.
00:05:04.760 If any of that's happening, that would be hilarious.
00:05:12.140 So the president is apparently going to sign
00:05:14.380 some kind of executive order on policing tomorrow.
00:05:18.380 I don't know what the details are.
00:05:19.960 But do you think that will help?
00:05:23.260 No.
00:05:25.420 No.
00:05:26.360 Doing something isn't going to help.
00:05:28.580 So the protesters have a number of demands.
00:05:32.060 Let's say you met them all.
00:05:33.580 Let's say they had 15 demands
00:05:37.180 and you just gave them all 15 things.
00:05:39.500 What would happen?
00:05:40.880 Would the activists then retire?
00:05:44.140 No.
00:05:44.860 Because that's their job.
00:05:46.200 If you're an activist,
00:05:47.860 you're sort of stuck being an activist.
00:05:50.180 You don't really go from activist to,
00:05:52.640 well, I think I'll be an accountant now.
00:05:54.960 Once you're an activist,
00:05:56.200 you're sort of locked into that activist life.
00:05:59.220 So if the activists for Black Lives Matter,
00:06:01.900 just as a mental experiment,
00:06:06.060 suppose they got everything they asked for.
00:06:08.780 Well, they're not going to retire and claim victory.
00:06:12.660 It doesn't work that way.
00:06:14.540 They would still be activists.
00:06:16.540 So if you gave them everything they wanted,
00:06:18.500 they don't have any kind of retirement plan.
00:06:21.120 They would have to keep telling you
00:06:23.060 that you didn't give them enough
00:06:24.140 because it's sort of the job.
00:06:26.840 So while I do think that one should look
00:06:30.700 to fix any systems that have problems,
00:06:34.120 and policing certainly is quite imperfect,
00:06:37.380 I don't think we should think of them
00:06:39.360 in terms of solving the problem.
00:06:41.800 Because you could solve the police problem,
00:06:44.220 but it doesn't really have much to do
00:06:45.820 with the overall racism problem.
00:06:47.680 So that's sort of an evergreen situation.
00:06:52.980 I saw a funny headline on Fox News
00:06:55.320 that Black Lives Matter is starting to doubt
00:06:58.460 the sincerity of the white activists
00:07:00.400 in the Chaz zone.
00:07:02.740 They're starting to wonder
00:07:03.620 if the white activists are quite in there
00:07:05.580 for the right reasons.
00:07:06.480 So I don't know what could be funnier
00:07:11.940 than difficult race relations
00:07:15.720 within the protesters themselves.
00:07:18.440 Now, I don't want to see anybody get hurt,
00:07:21.520 but I wouldn't be unamused
00:07:24.940 if racial strife broke out
00:07:28.040 within the people protesting against racial strife.
00:07:33.760 Just for entertainment purposes,
00:07:35.540 again, I wouldn't want anybody to get hurt,
00:07:38.240 but for entertainment purposes,
00:07:39.800 that would be quite terrific,
00:07:41.360 and it looks like it's heading in that direction.
00:07:45.920 I woke up early this morning
00:07:47.520 to start work, 3 a.m. or so,
00:07:50.240 and the first thing I see on Twitter
00:07:54.460 is Jack Posobiec did a livestream
00:07:57.040 as he was leaving the Chaz zone,
00:08:00.440 where apparently he's been undercover
00:08:01.980 for three days.
00:08:03.020 So Jack and somebody who was working with him on this
00:08:08.260 were undercover with masks and hats and stuff,
00:08:12.620 and they were just operating
00:08:14.040 and taking videos and stuff
00:08:15.440 within the Chaz zone,
00:08:17.260 and now he's out.
00:08:19.480 So he's letting us know
00:08:20.940 because he's left the zone.
00:08:22.220 He's not going back.
00:08:24.020 But good job.
00:08:25.200 Do you remember seeing all that good video on CNN
00:08:30.640 from inside Chaz?
00:08:33.660 Nope.
00:08:34.480 You didn't see any of that.
00:08:36.100 Do you remember on Fox News
00:08:37.540 all that good video inside Chaz?
00:08:39.940 No?
00:08:40.200 No, you didn't see any.
00:08:41.920 But you did see a lot of video
00:08:44.140 coming out of Chaz,
00:08:45.400 and some large percentage of that,
00:08:49.120 I guess, came from Jack Posobiec.
00:08:51.980 So we're going to talk more about that in a minute.
00:08:58.540 I decided to be in favor of reparations,
00:09:02.400 but not for the reasons you might imagine.
00:09:04.820 I'm in favor of reparations
00:09:06.420 because I really, really want to see people
00:09:08.460 try to calculate it.
00:09:11.060 Don't you?
00:09:12.340 Aren't you just a little bit curious about that?
00:09:14.600 I don't think there would be anything funnier
00:09:17.940 than watching, you know,
00:09:20.620 the dog catch the car.
00:09:22.160 You know, they always say,
00:09:23.060 dogs chase cars,
00:09:25.040 but what if they catch one?
00:09:26.740 What are they going to do with it?
00:09:28.520 Well, similarly,
00:09:29.600 I'm not comparing anybody to a dog,
00:09:31.800 so don't say that.
00:09:32.560 It's just an analogy.
00:09:34.220 Similarly,
00:09:35.700 what would happen, you know,
00:09:38.020 if everybody agreed
00:09:40.300 to calculate reparations?
00:09:42.820 What if people said, you know,
00:09:44.060 yeah, that's a good idea.
00:09:45.260 Let's sit down,
00:09:46.780 let's get out the pencil,
00:09:47.900 let's get out the spreadsheet,
00:09:49.560 let's figure out who's owed what
00:09:52.100 and what do they owe.
00:09:54.660 For example,
00:09:55.780 I have some open questions.
00:09:57.240 I'm descended from abolitionists,
00:10:01.080 people who tried to end slavery.
00:10:04.500 So, as a descendant of abolitionists,
00:10:07.960 how much do I owe Oprah?
00:10:10.920 Because that's my understanding,
00:10:12.620 is that I would owe Oprah some money
00:10:14.960 because not only was I descended
00:10:19.260 from people who fought to end slavery
00:10:22.140 and did not own any slaves themselves,
00:10:24.200 they sacrificed to try to end it,
00:10:27.500 but also I'm white,
00:10:29.900 so that, you know,
00:10:31.500 there's got to be some expense involved
00:10:33.180 with that by coincidentally being white.
00:10:36.120 So I would just like to know
00:10:37.320 how much I owe Oprah.
00:10:39.820 That's just one question.
00:10:41.260 Some of you would have questions too.
00:10:43.380 For example,
00:10:44.100 let's say you didn't have any of these,
00:10:46.100 the immense privileges
00:10:49.600 that you thought were owed to you
00:10:51.820 by being white,
00:10:53.440 well, do you still have to pay?
00:10:55.800 Well, yes, you do.
00:10:57.640 And it's explained to me
00:10:59.820 that even if you were poor and white
00:11:02.380 and your family didn't do anything useful for you
00:11:04.800 and you had to make it on your own,
00:11:07.820 that you still owe black people money
00:11:10.500 because your family could have taken advantage
00:11:15.100 of being white
00:11:15.900 and it's not black people's fault
00:11:18.240 if you didn't do it,
00:11:19.300 if your family didn't do it.
00:11:21.160 That's sort of on you.
00:11:22.980 So that's the argument,
00:11:24.320 I've been told.
00:11:26.300 You might find that unconvincing.
00:11:28.200 But that's the argument.
00:11:34.000 So I've got two days left
00:11:36.640 in my one-week challenge
00:11:37.900 for somebody to explain to me
00:11:39.920 what systemic racism means
00:11:41.900 with a current example.
00:11:45.120 Now, lots of people have explained to me
00:11:46.900 what it means
00:11:47.700 with examples of things
00:11:50.280 that don't apply anymore,
00:11:51.460 like laws that used to be the laws, etc.
00:11:54.960 Tim Pool had an interesting definition,
00:11:58.920 which I've never heard from anybody else.
00:12:01.680 So I don't know how common it is.
00:12:03.820 I can at least say
00:12:04.460 that I've never heard it before.
00:12:06.340 And it goes like this.
00:12:07.420 And I think there's a phrase here
00:12:08.920 called going on tour,
00:12:10.620 which refers to
00:12:12.220 getting in a small amount of trouble,
00:12:14.940 which you can't get out of
00:12:17.000 and it just magnifies the rest of your life.
00:12:19.280 For example,
00:12:20.740 let's say you got a traffic ticket,
00:12:23.100 but you couldn't afford to pay it
00:12:24.800 because you're poor and you're black.
00:12:27.540 The poor part being more important
00:12:29.840 than the black part.
00:12:31.800 And so you don't pay your ticket
00:12:33.180 and then you get pulled over again
00:12:35.540 and now you've got an outstanding ticket.
00:12:38.060 They take your license,
00:12:39.620 but you still have to drive your car
00:12:40.960 because that's the only way to get to work.
00:12:43.700 And then the next time you get pulled over,
00:12:45.580 now you've got tickets and no license.
00:12:47.500 Now you go to jail,
00:12:48.400 but you can't afford a lawyer.
00:12:49.940 Now you've got a criminal record.
00:12:51.540 So that by being poor,
00:12:54.720 you can't extricate yourself from small problems
00:12:57.820 because you can't even pay a fine.
00:13:01.520 Now, I don't know how...
00:13:03.140 So this...
00:13:04.200 Don't take too much from the specific example,
00:13:07.140 but rather generalize it
00:13:08.380 to the larger point
00:13:10.120 that small troubles can be compounded over time
00:13:14.520 and that Tim Pool would like to define that
00:13:18.100 as sort of a structural systemic racism.
00:13:23.500 But it still doesn't pass the test
00:13:26.060 of how does any other poor person
00:13:28.420 get out of this?
00:13:30.140 If you're poor and white,
00:13:32.460 how do you pay your ticket?
00:13:34.840 I mean, there's something terribly missing from this.
00:13:37.980 Do the poor white people
00:13:39.260 suddenly produce money out of nothing
00:13:41.280 and they can pay the ticket
00:13:43.080 despite having no money before they got stopped?
00:13:46.160 Like, how do they get out of trouble?
00:13:48.820 Why isn't it exactly the same for all poor people?
00:13:51.760 So what makes it systemic racism
00:13:53.560 if it applies to everybody who doesn't have...
00:13:57.760 Now, I think there was, on top of this,
00:13:59.880 there was the example of something
00:14:03.020 that no longer is the case
00:14:04.680 where each municipality would be stopping people
00:14:08.080 and maybe they stopped black people at a higher rate.
00:14:11.180 So if black people were being stopped at a higher rate
00:14:14.380 for no particular reason,
00:14:17.400 well, that would be just regular racism.
00:14:20.440 I don't know if that's systemic.
00:14:22.800 That would just be a cop being a racist
00:14:25.060 and if there were more than one cop,
00:14:27.500 of course there are,
00:14:28.280 who would also have those same opinions,
00:14:31.020 it would be multiple cops who are racist.
00:14:33.400 But is that systemic racism?
00:14:35.620 It just looks like cops being racist,
00:14:38.420 just like every other profession
00:14:40.300 would have some people being racist.
00:14:44.060 So, two days left,
00:14:46.380 nobody has given me a coherent definition of systemic.
00:14:51.220 I've gotten lots of definitions
00:14:52.520 and now I've gotten lots of examples.
00:14:55.360 But the examples are all the things that are illegal now.
00:14:58.580 They're all examples from the past.
00:15:00.440 I tweeted today that
00:15:06.580 if it seems as if every one of your trusted institutions
00:15:10.200 has lost credibility in the past few years,
00:15:14.420 that's probably an illusion.
00:15:17.580 Now you're thinking to yourself,
00:15:18.720 wait a minute,
00:15:19.860 that the news went bad,
00:15:21.780 that Congress went extra bad,
00:15:23.940 that basically our intelligence agencies,
00:15:27.880 the police,
00:15:28.600 basically everything went bad.
00:15:30.720 All the things we trusted
00:15:32.500 turned out to be corrupt.
00:15:35.200 All of them.
00:15:35.840 Everything.
00:15:36.280 Basically everybody's lying to you,
00:15:38.100 top to bottom,
00:15:39.460 government,
00:15:40.380 or just everybody.
00:15:42.220 Corporations, you name it.
00:15:43.300 And here was my take on that.
00:15:49.660 My take is that nothing changed.
00:15:52.540 The only thing changed is that now you know it.
00:15:56.340 That those institutions were not worse.
00:15:59.880 They were not worse now
00:16:01.580 than they have been in the past.
00:16:04.020 You just found out.
00:16:05.800 They were all always corrupt.
00:16:08.600 They were all always lying to you.
00:16:11.440 You just found out.
00:16:13.300 Now,
00:16:14.680 there's certainly a matter of degree,
00:16:16.400 and I think that
00:16:17.060 certainly in the news business,
00:16:19.100 there's been a worsening.
00:16:22.840 But it's not like the news
00:16:24.360 was always giving it to you straight.
00:16:27.120 Right?
00:16:27.840 I mean,
00:16:28.180 back in the Walter Cronkite days,
00:16:30.440 do you think the news was always honest?
00:16:32.800 Right down the middle?
00:16:34.640 I don't think so.
00:16:36.440 I don't think so.
00:16:37.380 But now we're just more aware of it.
00:16:39.720 Now,
00:16:39.980 I will point out
00:16:40.960 that when Trump was running for president,
00:16:43.280 even before he was elected,
00:16:44.760 I told you,
00:16:45.960 not all of you,
00:16:47.200 but I said publicly a number of times,
00:16:49.400 that he would change more than politics.
00:16:52.020 He would change our view of reality.
00:16:54.720 And he did.
00:16:58.120 He sure did.
00:16:59.540 Because some of the things you've learned about reality
00:17:01.820 are that we don't make decisions based on facts and reason.
00:17:06.220 A few years ago,
00:17:07.560 you would have said we did.
00:17:09.360 You know,
00:17:09.640 sure,
00:17:09.940 we don't do it well,
00:17:10.980 so we need to improve it.
00:17:12.820 But now you know it's not even part of the process,
00:17:15.340 don't you?
00:17:16.460 Now you know
00:17:17.440 facts and reason
00:17:19.240 don't work.
00:17:20.740 And the reason they don't work
00:17:22.460 is we don't know what the facts are.
00:17:24.760 They're all lies.
00:17:26.560 All of our facts
00:17:27.680 are unreliable.
00:17:30.200 But,
00:17:30.800 at least you have your sense of reason, right?
00:17:33.560 But we observe in public
00:17:35.340 people unable to reason.
00:17:37.120 So what happened was,
00:17:39.620 over time,
00:17:41.200 we human beings,
00:17:42.680 we started out being
00:17:43.940 irrational and superstitious,
00:17:46.860 and we didn't have any ability to reason.
00:17:49.860 We didn't have any math or philosophy skills.
00:17:52.860 We were just primitive people.
00:17:55.520 But over time,
00:17:57.240 oh boy,
00:17:57.720 did we evolve mentally.
00:17:59.700 We learned science.
00:18:01.920 We learned how to do controlled tests.
00:18:04.440 We learned reason and logic.
00:18:07.980 And then we put them all together,
00:18:09.580 our ability to find facts,
00:18:12.520 and then marry that with logic and reason
00:18:14.820 to make good decisions.
00:18:17.360 How'd that work out?
00:18:20.000 Not at all.
00:18:21.640 Soon as you get it outside of the realm
00:18:23.340 of science or math,
00:18:25.180 where you can actually check
00:18:26.560 to see if something makes sense,
00:18:27.880 you run the math,
00:18:29.640 you test it,
00:18:30.480 and you can find out
00:18:31.160 if something's true or not over time.
00:18:32.640 But in the messy real world,
00:18:35.260 all of our facts are lies.
00:18:37.840 Or even if they're true,
00:18:39.460 we don't know they're true,
00:18:40.640 so you can't even trust them.
00:18:42.520 Things sometimes are true by coincidence.
00:18:45.640 So in a world in which
00:18:47.260 your facts are all lies,
00:18:48.820 or at least you can't trust them,
00:18:51.160 and there's no two people
00:18:52.820 you can put in a room
00:18:54.180 who will agree what is a logical way
00:18:56.060 to approach something,
00:18:57.540 we've created a system that can't work.
00:19:00.840 That's right.
00:19:01.320 We've created a system
00:19:03.200 which absolutely favors
00:19:06.040 logic and data,
00:19:08.520 and there are two things
00:19:10.280 that we don't have any of
00:19:11.820 and can't get.
00:19:13.720 Logic and accurate data.
00:19:17.340 So we built a system
00:19:18.920 that relies completely
00:19:21.260 on the two things we don't have
00:19:23.360 and can't get.
00:19:26.100 That's where we are.
00:19:27.300 Now, amazingly,
00:19:29.260 we have such robust systems
00:19:30.860 that they seem to survive
00:19:32.400 all of this.
00:19:33.840 Somehow, you know,
00:19:35.120 the republic keeps chugging along.
00:19:37.660 Somehow, democracy
00:19:38.780 and the republic seem to work.
00:19:41.120 Somehow, capitalism
00:19:42.060 keeps chugging along.
00:19:43.820 You know, we have protests
00:19:44.680 and stuff,
00:19:45.220 but we get past them.
00:19:46.960 So amazingly,
00:19:48.100 we built a system
00:19:49.160 that can't work.
00:19:51.340 Just by design,
00:19:52.440 it can't work.
00:19:53.040 If you don't have the right data
00:19:54.760 and you don't have anybody
00:19:55.680 who can do logic and reason,
00:19:57.620 that's not much of a system.
00:20:00.040 But that's where we are.
00:20:03.120 I would say that,
00:20:04.340 so anyway,
00:20:05.640 the big point is that
00:20:06.920 nothing got worse,
00:20:09.600 you got smarter.
00:20:11.860 Nothing got worse,
00:20:13.700 you just found out.
00:20:14.820 And that was the Trump effect.
00:20:18.280 Because if Trump
00:20:19.220 had not framed the news
00:20:21.680 as fake news,
00:20:23.720 it probably would not be
00:20:25.380 nearly as convincing
00:20:26.460 that they really do
00:20:27.780 make up the news.
00:20:29.360 Like, actually,
00:20:30.120 just make it up.
00:20:31.660 I don't know that we would have
00:20:32.980 understood that
00:20:34.260 as a civilization.
00:20:36.040 And now I think
00:20:36.760 everybody understands it.
00:20:37.920 Although half of the country
00:20:38.880 thinks it only happens
00:20:39.900 with, you know,
00:20:40.940 the other side.
00:20:42.340 There's still half of the country
00:20:43.440 that has not reached
00:20:44.360 the level of awareness
00:20:45.400 to know it's both sides
00:20:46.540 that make stuff up.
00:20:47.860 They still think
00:20:48.600 it's only the other side.
00:20:50.320 If that's where you're at,
00:20:51.800 you need to catch up.
00:20:54.720 All right?
00:20:55.140 There's no such thing
00:20:57.140 as the one side
00:20:58.000 that's telling you the truth
00:20:59.140 and the other side
00:21:00.260 that's lying.
00:21:01.140 If you're stuck there,
00:21:02.800 you're very confused
00:21:04.260 about your world.
00:21:05.700 But I have hope
00:21:07.280 that you'll make it
00:21:07.980 to the next level.
00:21:08.780 I have a suspicion
00:21:14.620 that the inhabitants
00:21:17.100 of Chaz,
00:21:19.180 the new autonomous zone
00:21:20.480 in what used to be Seattle,
00:21:22.520 whatever's left of Seattle,
00:21:25.100 I have a feeling
00:21:27.320 that they are not economists
00:21:28.860 on the whole.
00:21:31.460 Probably not a lot
00:21:32.460 of engineers in there.
00:21:34.960 But I bet they have
00:21:35.880 a lot of artists.
00:21:37.300 So this is a...
00:21:39.940 And I've said this before,
00:21:41.140 and I know that people
00:21:42.080 can't tell if I'm kidding
00:21:43.200 about this,
00:21:44.200 because parody and reality
00:21:45.480 have gotten so close.
00:21:46.900 But I'll tell you directly,
00:21:48.480 and this will be
00:21:49.040 my promise to you.
00:21:50.340 If I ever tell you directly
00:21:52.200 the way I'm doing it now,
00:21:54.180 I am telling you directly,
00:21:56.720 you can trust it.
00:21:58.560 All right?
00:21:59.040 If I tried to leave
00:22:00.280 something out,
00:22:01.300 or I'm trying to...
00:22:02.380 I don't know.
00:22:03.340 If I were trying
00:22:04.300 to just persuade,
00:22:05.400 that might look
00:22:07.940 a little different.
00:22:09.360 But if I tell you directly,
00:22:11.840 I forgot what I was going to say.
00:22:16.880 You know,
00:22:17.560 have you noticed
00:22:18.340 how often I'll go on a tangent,
00:22:20.380 and then when I get down
00:22:21.500 the tangent,
00:22:22.180 I'm like,
00:22:22.860 ah,
00:22:23.660 I wish I knew
00:22:25.000 what that was related to
00:22:26.480 by the time
00:22:27.020 I got to the end of it.
00:22:27.920 This is one of those times
00:22:29.300 when having no sense
00:22:30.360 of embarrassment
00:22:31.120 really comes in handy.
00:22:35.040 Anyway,
00:22:35.740 I don't think
00:22:36.240 Chaz has many
00:22:37.420 engineers or economists,
00:22:40.320 and,
00:22:40.700 oh,
00:22:41.200 here's the part
00:22:41.680 I was going to say.
00:22:42.880 I am completely serious,
00:22:45.120 no parody,
00:22:46.740 completely serious
00:22:47.680 when I say
00:22:48.300 I am very interested
00:22:50.060 in how Chaz works out.
00:22:52.000 I'm very interested
00:22:53.140 in letting it run
00:22:54.300 a little bit.
00:22:55.280 Now,
00:22:55.520 I don't think
00:22:56.080 it can run forever
00:22:57.080 because the city
00:22:58.100 has to be reclaimed.
00:22:59.660 I don't think
00:23:00.120 you can have
00:23:00.520 a lawless zone
00:23:01.560 in the middle of,
00:23:02.480 in the middle of the city.
00:23:04.940 But,
00:23:05.560 I'm completely in favor
00:23:07.020 of the very,
00:23:08.100 very,
00:23:08.480 very gentle way
00:23:09.920 that it's being treated.
00:23:12.500 I'm very much
00:23:13.440 in favor
00:23:13.920 of learning
00:23:15.560 something from it,
00:23:16.820 and I'm very much
00:23:17.660 in favor
00:23:18.160 of the people involved
00:23:19.540 learning something
00:23:20.500 from it.
00:23:21.380 I just watched
00:23:22.100 a video,
00:23:23.060 I don't know
00:23:23.980 if I retweeted it,
00:23:24.780 I just watched
00:23:25.340 it before I got on,
00:23:26.940 of a black activist
00:23:29.160 who,
00:23:31.180 and why do I have
00:23:32.040 to say black,
00:23:33.080 right?
00:23:34.080 Like,
00:23:34.520 what's wrong
00:23:35.740 with the world
00:23:36.400 that in this story,
00:23:37.500 which has nothing
00:23:38.220 to do with the guy
00:23:39.000 being black,
00:23:40.060 I still think
00:23:41.000 it's important
00:23:41.540 to the story.
00:23:43.100 And it is.
00:23:43.840 It actually is
00:23:44.460 important to the story,
00:23:45.700 but it sure shouldn't be,
00:23:47.540 right?
00:23:48.320 So he's
00:23:49.140 an activist
00:23:50.140 who happens
00:23:50.740 to be black,
00:23:52.060 and because we're
00:23:52.800 in the middle
00:23:53.240 of what we're
00:23:53.800 in the middle of,
00:23:54.180 the part
00:23:55.900 about being black
00:23:56.680 actually is
00:23:57.500 important to the story.
00:23:59.700 And he was
00:24:00.580 an activist
00:24:02.440 against police brutality,
00:24:05.460 against the black population
00:24:07.260 in particular,
00:24:07.900 and the police
00:24:09.600 asked him
00:24:10.120 to come in
00:24:10.600 for a training day
00:24:11.660 so that he would
00:24:12.700 train like a police
00:24:13.780 officer,
00:24:14.540 and they gave him
00:24:15.480 certain set-up
00:24:16.360 situations
00:24:16.960 where a police
00:24:18.300 officer would
00:24:19.060 pretend to be
00:24:19.780 uncooperative,
00:24:20.800 being arrested
00:24:21.840 and such things.
00:24:23.400 And at the end
00:24:24.080 of the training,
00:24:24.800 the activist said,
00:24:25.920 okay,
00:24:27.260 you changed my mind
00:24:28.100 completely.
00:24:29.560 Now that I see
00:24:30.700 the split-second
00:24:32.460 decisions
00:24:33.020 and the amount
00:24:34.380 of perceived
00:24:35.640 risk that even
00:24:36.700 I had
00:24:37.160 just in
00:24:38.480 an artificial
00:24:39.680 environment,
00:24:41.220 he said he
00:24:41.680 completely changed
00:24:42.440 his mind
00:24:42.860 and people should
00:24:43.600 cooperate with
00:24:44.280 police,
00:24:45.100 and that the
00:24:45.680 real problem
00:24:46.260 is people
00:24:47.560 not knowing
00:24:48.180 they really
00:24:48.600 need to cooperate
00:24:49.260 with the police.
00:24:50.740 Now,
00:24:51.640 that was just
00:24:52.480 one guy
00:24:52.980 and one anecdote.
00:24:53.920 I don't know
00:24:54.320 that that could
00:24:54.880 be repeated,
00:24:55.900 but I think
00:24:56.600 there's something
00:24:57.180 happening with
00:24:58.180 Chaz that's
00:25:00.240 just like that
00:25:01.120 because they're
00:25:02.360 watching themselves
00:25:03.660 have racial strife.
00:25:06.220 I guarantee
00:25:07.300 that there's at
00:25:08.180 least somebody
00:25:09.000 there who's
00:25:10.440 worried about
00:25:11.000 the fact that
00:25:11.700 the security
00:25:13.260 slash police
00:25:14.200 force is
00:25:15.480 mostly black,
00:25:16.800 if not all
00:25:17.440 black.
00:25:18.080 I don't know.
00:25:18.660 So you don't
00:25:19.680 think there's
00:25:20.240 somebody white
00:25:21.380 in Chaz
00:25:22.780 who is worried
00:25:23.920 about the fact
00:25:24.560 that the police
00:25:25.200 force seems
00:25:25.800 mostly black
00:25:26.640 and that maybe
00:25:27.640 there might be
00:25:28.200 some bias
00:25:28.740 involved?
00:25:30.100 I don't know.
00:25:31.740 Since this is
00:25:32.680 a self-selected
00:25:34.160 group of people
00:25:34.900 who identify
00:25:37.120 as having a
00:25:37.900 lack of bias,
00:25:39.220 maybe not.
00:25:40.400 Maybe not.
00:25:41.080 Maybe they're
00:25:41.740 so self-selected
00:25:42.780 and filtered
00:25:43.220 that that doesn't
00:25:45.080 register as any
00:25:46.040 kind of a bias.
00:25:46.720 But I think
00:25:48.040 if you let
00:25:48.540 the experiment
00:25:49.200 run a little
00:25:49.980 bit longer,
00:25:52.000 some racial
00:25:54.260 tension will
00:25:55.260 emerge,
00:25:56.360 which would be
00:25:57.780 interesting to
00:25:58.400 know since
00:25:58.960 these are all
00:25:59.500 the anti-racist
00:26:00.540 people.
00:26:01.800 I think they'll
00:26:02.740 recreate systems
00:26:03.860 and find out
00:26:04.640 that they need
00:26:05.400 money and
00:26:06.160 economies and
00:26:07.220 all those
00:26:07.620 things,
00:26:08.100 incentives.
00:26:09.300 So let's
00:26:10.880 run it.
00:26:11.540 But ultimately
00:26:12.060 they have to
00:26:12.620 leave there because
00:26:13.220 it's not their
00:26:13.680 property.
00:26:14.060 And I would love to see this experiment taken somewhere else. Like I would love, love, love to see some part of a state just carved out for experimenting. Let people go there and develop a system and see if they can live with no police force.
00:26:35.340 Because I think if you designed your community right, and part of the design would be deciding who could be there and who couldn't. So you can imagine controlling the elements of a design of a community. I can imagine that you could get your need for a police force down to almost nothing.
00:26:55.960 For example, imagine you built a community and one of the rules is that every block had to have a working or retired police officer who owned a gun. Just imagine that on every block, you know, there's at least one police officer, and you just make that a rule.
00:27:17.300 Alright, one of these apartments or one of these homes has to be reserved for a police officer. Now that police officer doesn't need to be working in that community. Their job could be miles away somewhere else.
00:27:29.800 It's just that if you put a certain kind of person with a certain kind of training and you sprinkle them around the community, you probably get a better result. Just one of many things you could do to design a safer place.
00:27:47.080 Alright, I would love to see an experiment in which something like Chaz gets set up in which the dominant theme is a victim, let's say a victim-dominant system. Meaning that whoever can most effectively make the case that they're the biggest victim in whatever situation is involved, that the biggest victim gets the most say.
00:28:18.460 That the most resources and attention go to whoever can present themselves most effectively as the biggest victim. Now that's the system that Chaz, not explicitly, but by their collective actions, seems to favor.
00:28:36.760 I'd like to see how it goes because I think they need to see how it goes too. Now the alternative to this would be a system that's more aspirational and self-reliant and, you know, you can do what you want to do, and then just run an experiment.
00:28:52.760 Alright, let me tell you what I think is wrong with everything in the world right now. And I'm going to do this on the whiteboard. Yes, there's a whiteboard.
00:29:04.020 And what I've done is I've taken our current situation, the world situation, and I put it into the model of a car engine. Now if you don't know how car engines work, I'll give you the really quick explanation of just the car part. And then I'll talk about the human system that's operating like a car.
00:29:26.500 So in a car you've got this thing called a carburetor that takes air and gasoline, mixes them into a sort of a mist, if you will, and feeds that mist over to the spark plug, which adds the fire to the gasoline that's now in a mist. There's another word for it, a gaseous air form. What would you call it?
00:29:51.040 But anyway, the spark plug ignites that, and the timing of the spark plug is controlled by other devices in the car. These small explosions in confined spaces make the pistons move, and the pistons are connected to the wheels through a series of gears.
00:30:09.300 So if all of this is working the way it's supposed to, you would have the air and the gas mixed in just the right amount, you would have a spark plug which is really well timed, so it's sparking just right against just the right amount of gas and air, and then it makes the pistons and wheels turn.
00:30:30.100 Now, the second layer is in the purple letters. So just by way of overlaying the human system on top of this so you can see it as a machine. Oh, and by the way, vapor is the word, thank you. Fumes, vapor, atomizes, those are the words I was looking for, not mist.
00:30:52.280 Now, what you're seeing here is a technique I use a lot when I'm trying to think through a complicated situation. Often I'll translate it in my mind into a machine so that I can look at the machine in my mind and rotate it and see all the parts, and then I diagnose it like an engineer.
00:31:13.400 And it's just a way to hold things in your mind in a way that you can see them connected. So it doesn't matter what the machine is as long as it forms a good analogy. So that's what I've done here.
00:31:27.740 Imagine, if you will, that the gas is the GOP and the Democrats are the air. Now, it's just kind of fun that you've got air and gas, and if you had to assign a political party to those, you would definitely give the Democrats the air and the Republicans the gas, right? It just seems like it fits.
00:31:50.000 But anyway, so they're feeding their stuff into the carburetor, which is like the news. So the news is the thing that takes all of these inputs and sort of normalizes them and turns them into a workable form for the public and for the government, and then sends it over to the spark plug.
00:32:08.600 The spark plug is like Trump, the president in general, but Trump right now. Now, normally the spark plug is controlled by advisers. So the president doesn't just spew anything that comes to his mind.
00:32:25.700 It's filtered and massaged, and any controversial or offensive things are taken out of it. It's all scrubbed up so it's nice and clean and tight and doesn't offend anyone. That's what advisers do. But we don't have that situation.
00:32:42.940 What we have is a president who has a Twitter account, and the advisers are not controlling his timing. So you've got a wild spark plug, and you've got a news business that no longer acts as a carburetor. Instead, it acts as an accelerant because their business model changed.
00:33:05.240 In the old days, the news was trying to give you something that was sort of a common view of the world that would also be productive. So the carburetor used to work to make things work better. Even if the news wasn't right, we were all on the same page. So the carburetor used to be helpful, but now the news really just tries to throw as much fuel onto the fire as possible.
00:33:31.520 So they're putting way too rich a formula onto the spark plug. So you've got the news, which is ramped up out of control. It's like a broken carburetor. And it's feeding too much fuel to the spark plug, which is Trump, who has already got his own timing issues. So you've got a wild ride going on here. And, of course, you can still move forward, maybe just lurching instead of smoothly driving.
00:33:59.040 But here's my take on the engine of society. The engine of society is broken. And it's primarily because the news business is no longer a traditional news business. The news business is in the business of, as many smart people have told you, jacking up that part of your brain that gets you excited. So it's not about informing you whatsoever. It's just about getting you excited.
00:34:27.180 So your excitation gets applied to issues which you might have handled differently if you had not been all excited by the news. And so my view of the problem with the world is the news. And then if you fix that one thing, everything else would be easier to fix.
00:34:51.980 So, in my opinion, the foundational crisis is the news. The news business being broken, completely broken, just absolutely worthless. And the news business is no longer trying to help the country. They've simply taken a team and they're just trying to help their team. Nobody's helping the country. Did I say nobody? Nobody's helping the country?
00:35:17.400 Well, what's interesting is that this engine has a new factor, which is the independent journalists and voices on the Internet. I'm talking about your Tim Pools. I'm talking about your Mike Cernoviches. I'm talking about me. You know, just anybody who is an independent voice and isn't especially worried about getting canceled.
00:35:40.040 Now, to some extent, it would also be like a Jack Posobiec, but he works for, you know, a larger news entity. But the people who are not in, let's say, the most mainstream of the mainstream news, it's people outside of that who are trying to hold this model together. We're trying to keep the car on the road because the news business used to do that function.
00:36:05.200 It used to be a carburetor, but now it's not. So, you do see the independent Internet personalities trying to calm down the system, trying to give it a little less fakeness in the news, calling out the mistakes, et cetera. It helps a little. I don't know if it's enough.
00:36:28.040 Yeah, Taibbi is another good example. Sharyl Attkisson, another good example. Yeah. Informational warlords. Somebody's using that phrase, informational warlords, or persuasion warlords. Yeah. There is something like that.
00:36:49.700 So, I believe that our problems with race are mostly a news problem. Meaning, I'm not saying that the problems don't exist, because somebody will take me out of context and say, you said there's no problems. No, I'm not saying that. I'm saying that the way we're approaching the problems is because the news has assigned us opinions to make us fight, basically.
00:37:16.120 So, the news business is the foundational crisis. The other crises happen on top of that. Right? And they're, in many ways, caused by a broken news business.
00:37:30.920 All right. What about sharks with laser guns? Somebody asked me. All right. Yeah. All right. That's all I wanted to say about that topic. Let's see what else we've got going on here.
00:37:48.600 Do you think it's immoral to tell people that they're victims? I was thinking about that today. It seems to me that the way black people are being abused in this country, it just changed forms and it didn't stop happening.
00:38:10.640 In other words, you know, all of the past racial injustices, to a large extent, you know, we've been working pretty hard as a society to try to reduce those. But then we just come up with a whole new way to abuse black people.
00:38:27.240 And right now it's happening because we're all lying to them. You know, me included, of course. I don't take myself out of that. We're just lying to black people. Do you know why we're lying to black people? It's because the news is broken. The news is broken. So you have to lie because it's a survival thing.
00:38:51.560 What do I think black people need to do differently to have more success? I don't know if I could tell them. Because I feel like I'm sort of forced into lying, because to move forward with anything, you pretty much have to be able to talk about it. How do you make a decision or work together when you can't even talk? That doesn't make sense. And that's where we are.
00:39:16.120 So we have a news business which has fully endorsed the victim-driven system: that if you complain the most and you make the best case, you get more resources. Now, if you were to design a system that was based on that principle, it would surely fail. So we have a system that can't work, and we have a news business that can't tell you it can't work. And then you have people like me who'd love to help. I mean, I really would. I'd really like to be helpful.
00:39:50.800 But I'm forced to lie to black people because black people require it. I mean, it's basically, you know, it's almost demanded. White people require it. The news requires it. My career requires it. We just all have to lie.
00:40:05.780 So anybody who wants the truth, you're going to have to do something different to get it, because what you're doing now isn't going to give you any truth. So as long as victimhood is the dominant preference, there's not much we can do here.
00:40:23.900 And so I ask you this. Suppose you've got your average young black man and your average young white man. So just consider a young black man and a young white man.
00:40:42.240 And you tell the young black man that the deck is stacked against him and the system is racist and he's going to have all these problems. Now, forget for a moment whether that's true or untrue, because it's not going to matter to my point.
00:40:59.580 My point is that people have sort of an operating system about how to succeed. If you give somebody an operating system that says the system is stacked against you specifically, how are they going to perform? Poorly. And we know that because, you know, there have been experiments in which school children are told they're gifted or not. And if they're told they're gifted, they perform better.
00:41:27.760 When people are told that they can succeed, they perform better. When people are told, no, there's a problem. It's a big problem. It's beyond what you can handle. In other words, you can't personally fix racism. It's something bigger than you. You're just a victim of it. So, you know, good luck. You're not going to succeed as well as other people.
00:41:48.160 I would say it's immoral to give anybody that message because it would so handicap their abilities to succeed that, I mean, it just feels like evil. I don't know how else to say it. It's not just bad advice. It's so bad it's evil, even if it's true.
00:42:08.260 All right. So if you're hung up on, "But Scott, it's true. There is racism. It is a problem. It will be an obstacle," I say to you, you're off my point. Whether it's true or not, and obviously it's true, that's irrelevant to what strategy you use.
00:42:32.180 If your strategy is to act like it's not a problem, you'll have a better life. And isn't that what everybody wants? To simply go through life like, yeah, it's an obstacle, but it's like a cardboard obstacle, because I have these good strategies and, you know, my mental game is strong. My physical game is strong. My career is good. My family is good. Yeah, there's this little cardboard in the way, but push that cardboard out of the way. Ah, there's more cardboard. Are you kidding me? There's more cardboard?
00:43:04.620 So you probably never have to stop pushing the cardboard out of the way because, like I said, racism is sort of built into the system. You can't get rid of it completely, no matter how hard you try. But you can certainly have a life strategy that makes it cardboard instead of concrete. I think that is realistic.
00:43:24.560 So I would say that the current messaging from all the people who mean well is absolutely immoral, even if true. Because your strategy has to be separate from the facts. Like, the facts don't limit your strategy. Your strategy should be whatever works best.
00:43:45.900 And when did race become a privileged problem? And here's what I mean by that. Now, I don't want to diminish, you know, racial problems. So accepting that racial problems have been big, are big, will be big. You know, no diminishing of that. Aren't there other problems that people have?
00:44:18.160 If you have cancer, is that a bigger problem or a smaller problem than being black in America? If, you know, if you're 4'9", is that a bigger problem or a smaller problem, trying to get a job, trying to have a good life, than being black in America?
00:44:39.400 If you're really ugly, is that, you know, and I can say that because I'm a part of that community. If you're really ugly, is your life going to be just as good? Are you going to have, you know, the same number of obstacles as everybody else?
00:45:02.920 So, here's my point. Let's say you have mental illness. Let's say you're addicted and you've got a genetic propensity for it. Is your life great because you're also white in these cases? How is it that race became such a privileged problem, the problem that sits over and above all problems? I don't know. Seems to be lots of people have problems.
00:45:32.160 Yes. And of course, we've just overshot the mark here. The fact that we're talking about Kamala Harris really being the only one that Joe Biden can pick because she's black, or a person of color, whatever details you want to put on that. I don't know. I don't feel like that's a better world. I really don't think it is. I understand how we got here, but at this point, I think it's immoral to continue lying to black people.
00:46:05.360 And let me be more specific about the lie. The lie looks like this. The lie is that you should take the truth and then build your strategy of victimhood around that, and that that will get you to a good place. It's just a lie. The truth is if you use the basic principles of strategy that everybody else who succeeds uses, your odds of succeeding are really high, really, really high, like almost guaranteed if you do the right strategy.
00:46:40.220 So denying people, anybody, the truth that your strategy is what you should be working on, not your victimhood, is immoral. It's just flat out immoral that, you know, people like me can't just be honest and say, look, I know you've got issues. Everybody's got problems. What's your best strategy? And then work on the thing you can change. That's it.
00:47:07.980 Just looking at your comments for a moment. And I guess, you know, one of the things I've heard cited as systemic racism, I read an example from an African-American woman on Twitter who told a story about when she went in to get a loan for her first home. The lender said, you know, maybe you could get family members to help you with a down payment, 50, a hundred thousand dollars, you know, get a family member to help. And as she tells the story, she doesn't have any family members who have an extra 50 or a hundred thousand dollars. So if you do, you're starting from a privileged, white-privileged position. To which I say, you're right.
00:48:00.280 It's not the white part, though. Isn't it the money part? Which part of that was the white part? Because there are poor white people who are not getting 50 and a hundred thousand dollars from their parents either. In fact, what percentage of white people have ever gotten $50,000 from their parents? I don't know the answer to this, but it's not a big number, right? I've never gotten $50,000 from my parents. In fact, I helped support my parents. I paid them. So I don't know.
00:48:42.340 If you put a number on it, let me put a guess on this: the percentage of white families who could help a child get a house. So we're talking, you know, tens of thousands, whatever number you want to put on it. What percentage of white families could do that? 20%? I'm going to say, I'd guess, a maximum of 20%. Does that feel right? Maybe 10%, but no more than 20.
00:49:17.000 So you've got 80% of the white world who is just being thrown under the bus because they don't have money and they don't have any victim privilege. So their white privilege didn't buy them a fucking thing because their parents got nothing. And yeah, they didn't even get good advice. So they got nothing, and they're not even victims. So they don't get that extra victim juice. You know, if you're a poor black kid, all you need is an education and you can walk into any Fortune 500 company and you're hired the same day.
00:49:55.500 Now, if you don't know that, then your strategy needs work, because you can, every time, just walk into any Fortune 500 company with a college degree. Probably a hundred percent of them would hire you as long as you didn't have, you know, any bad, I don't know, criminal record or some weird thing.
00:50:20.940 So I'm seeing guesses from five to 20%, but certainly 80% can't do that. So every time I see one of these examples of somebody telling me what they think systemic racism is, I always apply it to the same test. All right, a poor white kid with no money. Now talk me through this. How does the poor kid with no money get a benefit because there are other strangers, who are also white, who have money? How does that work? Do I get their money because I'm also white? How does that work?
00:50:57.380 All right. So the news business and social media and the activists have decided that, even if we want to be helpful, we have to instead treat black people like they're children, and we can't have an honest conversation. You know, I guess the rules of society are that I can't be useful. I can't be helpful. I can't help anybody with any strategy because I would just be racist. So there you are.
00:51:30.860 All right. Victim juice. Yeah, that is a funny saying. All right. Looking at your comments, it looks like I've said enough for today, and I will talk to you all tomorrow.