Real Coffee with Scott Adams - June 10, 2020


Episode 1023 Scott Adams: I Fix the Racism Problem in America With a Whiteboard, Ironically. You Won't Want to Miss it.


Episode Stats

Length

46 minutes

Words per Minute

152.1

Word Count

7,005

Sentence Count

530

Misogynist Sentences

2

Hate Speech Sentences

6


Summary

Racism is a problem that needs to be solved. But how long does it take to solve? And what are the best ways to do so? Scott Adams explains why we should stop being defensive about white privilege and start being honest about racism.


Transcript

00:00:00.000 Bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum bum hey everybody come on in it's time it's time for coffee with Scott Adams best part of the day best part of the week sometimes the best part of the year and today will be an extra special one because I'm going to solve racism yeah I know what took me so long you'll be mad at me for not
00:00:29.980 doing it sooner once you see it. But before we solve racism, can it wait another minute?
00:00:38.620 We can wait on that, right? All right. So we'll wait on solving racism for a few minutes.
00:00:43.940 First, we're going to have to do something called the simultaneous sip. But what do you need for
00:00:47.980 that? Not much. A cup or a mug or a glass, a tank or chalice or stein, a canteen jug or flask,
00:00:52.320 a vessel of any kind. Fill it with your favorite liquid. I like coffee. And join me now for the
00:00:59.920 unparalleled pleasure, the dopamine at the end of the day, the thing that makes everything better,
00:01:05.180 including pandemics, including racism, including the economy. It's all getting better now, folks.
00:01:11.780 Take a sip. It's a simultaneous sip. Go. More improvement. Well, I haven't checked the stock
00:01:22.160 market today. I can't believe it would be up again. But let's see. It should be. If I had to guess,
00:01:29.220 it will pull back a little bit today. Yep. So far, a little bit. I don't know what's up
00:01:36.000 with Apple. Apple's just going nuts. And Amazon. All right. Let's talk about all the things.
00:01:45.080 Mark Cuban was at some kind of an event where they're talking about racism. And he tweeted,
we need to stop being defensive about white privilege. We need to stop being defensive
00:02:00.200 about white privilege. I would say that that is true. True statement. It is, however, only 50% of what
00:02:08.800 we need to be honest about. So I would call this a call to be half as honest as you need to be
00:02:16.760 to make any difference at all. So I completely agree. Nobody should be defensive about any of
00:02:25.600 this. Right? So white privilege is either a thing or it's not a thing. It applies here. It doesn't
00:02:33.240 apply here. It does or it doesn't. But you don't need to be defensive about it. You should just be
00:02:39.200 able to discuss it along with everything else. But since you can't discuss everything else,
00:02:43.580 there's really no point. There's no point at all. As long as you don't have freedom of speech,
00:02:51.220 then I don't. Some of you might. But I don't personally have freedom of speech, except in a
00:02:57.320 legal, technical way. In a practical sense, as a person who lives in the country and wants to survive
00:03:03.180 and thrive, I don't have freedom of speech.
00:03:07.400 Question. What is the right amount of time to wait for an example of systemic racism
00:03:17.580 before discarding it as an illusion? Now, here's the reason I asked this. When I first started hearing
00:03:25.920 the idea that there was systemic and institutional racism, I thought to myself, what's that? What's that
00:03:34.060 mean? And then I heard a definition of it. And it has to do with the entire structure of how things
00:03:41.560 are put together could favor or not favor one group or another. And I thought, all right, that makes
sense. I could totally see how systems would be formed that would favor one group over another.
00:03:55.000 That makes complete sense. And then I kept waiting for some examples.
00:04:01.960 And I thought to myself, I don't want to be rude. I don't want to, you know, I don't want to question
00:04:07.160 whether this is real or imagined, because that would be kind of obnoxious, isn't it? If somebody
00:04:14.700 says, I have this gigantic problem, it's like, you know, one of the biggest problems in my life in the
00:04:21.060 country. And it's called this. You don't say, no, it isn't. You kind of wait, right? I mean, good,
00:04:30.680 good social behavior, good manners says that you wait, you listen. And you say, all right, all right,
00:04:37.900 let me know what you're thinking. And I waited. And I thought, well, pretty soon we're going to be
00:04:44.860 talking about some examples. Because until you get into the actual examples of like, all right,
00:04:50.560 here's one. Here's some institutional or systematic racism. Here's an example. And then once you see a
00:04:59.300 few examples, you could maybe spot your own. You know, once I've seen what you're talking about,
00:05:03.840 I go, oh, okay, now I get the concept. And then I could pick out some examples without any help,
00:05:09.800 because I get the theme. And I waited. And I waited. And I waited. I'm still waiting. It has been now
00:05:21.700 years since I first heard this term, systemic racism. And I've never heard an example. Now, I don't,
00:05:32.560 again, I don't want to be rude. Just, you know, normal, good social behavior says that you don't
00:05:38.900 say it doesn't exist. But isn't it fair to ask for an example at this point? You know, now that,
00:05:45.300 you know, the country's on fire, figuratively speaking, on fire, a little bit not figuratively
00:05:52.540 recently. But shouldn't this be the time when we start seeing some examples? I've seen exactly
00:06:02.520 one example. And it applied to poor people equally. The only example I've ever seen, I don't remember
what it was, because it applied to poor people equally. So it wasn't a good example. So let
00:06:16.840 me say this. So the reason I put it in this form of a question, how long should you wait for an
00:06:23.300 example before you discard it as an illusion? I would say a week, a week more. What do you think is
00:06:31.380 fair? Given that we're all having this conversation, given that, you know, hundreds of thousands of
00:06:38.620 people will see this periscope eventually, given all of that attention, how long should I wait
00:06:45.300 before I get an example? And I'm not telling you I'm arguing with the examples. I'm just saying that
00:06:51.160 I need to see one, because I think I could help. Right? But, you know, one of the, I've said this
00:06:58.740 before, the greatest untapped resource in the United States is the helpfulness of white people.
00:07:06.540 We like to help. If I can make that gross generalization about people who coincidentally
look like me in one particular way. If there's some way I can help make this
00:07:20.380 institutional racism go away, I'm all in. I like fixing stuff. And if there's some inequality,
00:07:27.500 why wouldn't I want to fix it? Well, what would be my incentive not to fix an inequality? I don't
have any. So examples, please. But if, at the end of the week, let's say a week from today,
00:07:41.660 what's today, Wednesday? Let's say a week from today, if I've not heard any examples, given the
00:07:48.220 amount of attention I'll get just by asking for them, if I haven't heard any examples, would it be
00:07:53.340 wrong for me to conclude it doesn't exist? That it's more of an illusion? Is that fair? Because I
think that's fair, to wait a full week for just an example of something that's pervasive and
00:08:10.340 everywhere. So give it a week. And then I would have to conclude that it's more of a
00:08:17.640 hallucination than a real thing. But with all the people who say it's real, I assume it's real. And
00:08:22.740 I assume I'll get some examples. But it's weird that I haven't seen any yet. All right, were there
any protests yesterday? I saw zero coverage of protests yesterday. Can anybody confirm? Were there any?
00:08:37.620 Because I guess it was the funeral yesterday. So maybe the funeral was a stand-in for what would
00:08:44.200 have been protests? Because I think, I think the protests might be starting to fizzle out. Now,
one of the predictions that I made is that you're not seeing a genuine
00:09:00.800 phenomenon, you're seeing a phenomenon which is a coincidence of a perfect storm of something
00:09:07.780 happening that is rare. And the perfect storm is that people were locked up for months, and then
warm weather came. If you lock people up for months, you know, their energy is kind
00:09:19.980 of building up and not being released. It needs to be released. Now, the trigger, of course, was the
00:09:28.580 video. And everybody had the same feeling about the video. We were all appalled by it. We all think
00:09:34.360 everybody thinks it looks like a crime. But that was the trigger. Now, when I said this before, of
00:09:40.500 course, because I don't have freedom of speech, somebody committed a hate crime against me by saying
00:09:47.040 I must be a racist. Because I'm white, basically. And I say good things about Trump now and then. So
00:09:54.720 anyway. Police were everywhere last night in San Diego. Anyway, my point is, I think it may be
00:10:06.320 that although the trigger was the video, and then somebody said, you racist, don't you realize
00:10:11.840 that, you know, there were huge underlying problems. And of course, I do. I'm just saying that those huge
00:10:20.620 underlying problems would not have turned into this specific form of expression that ended up
00:10:27.520 attracting looters and everything else, that that wouldn't have happened without the trigger.
00:10:32.820 If you took that trigger away, we just would have limped into the summer with all the same problems we
00:10:38.020 always had, but without the riots. Now, you could argue that the protests and the looting, etc.,
00:10:44.940 were productive. But I don't think you could argue that they would have happened on their own,
00:10:49.760 not without some trigger. So you had a trigger. But I think that if you take the energy out of the
situation, which is what happens when everybody just gets tired. If you've gone out,
00:11:03.560 let's say you protested three or four nights, how many more nights do you want to do it?
00:11:08.740 All right, your energy to protest the first night, sky high. Second night, still high. Third night's
pretty good. Fourth night? Do you protest a fourth night, you know, as an individual? You start to
00:11:25.400 run out of energy. So I think the natural direction of this is to reduce in energy. I would,
00:11:33.260 however, say that the protesters have accomplished at least one thing that never
00:11:40.400 happened before. One thing that they've accomplished for sure is that a lot more people are talking about
00:11:46.440 specific solutions. So we never saw before, I don't think, maybe, I could be wrong, but we've never
00:11:54.540 seen before Congress putting together an actual set of, you know, laws and legislation to try to deal with,
00:12:02.120 you know, police brutality. Or let's say just police conduct, police misconduct.
00:12:10.360 So something happened. You know, you can't argue that it didn't create some kind of activity. Now, none of
00:12:16.620 those laws have been passed. Who knows if any of it will turn into anything. But I would, I would say that it did
00:12:22.280 turn into something, at least something positive, even if the net was not as positive.
I've, of course, told you that I'm not interested in talking about the specifics of fixing
00:12:37.520 the problem. Because unlike some people in America, I don't have free speech. Now, I don't have freedom of
00:12:44.260 speech in the sense that, you know, the entire topic of racism, I can only sort of talk about in surface-y
00:12:50.980 ways, you know, approved surface-y ways. So I don't have actual freedom of speech to really get into the
00:12:58.200 details and talk about what is true and what's not true and the data and stuff like that. If I did have freedom of
00:13:04.420 speech in this country, and again, I don't mean legally, of course, legally, I have freedom of speech. But in a
00:13:11.300 practical sense, I don't, like most of you don't. So you can't really work on the suggestions without
00:13:18.220 the ability to talk about them. So I'd say it's a waste of time to actually talk about the solutions
00:13:23.900 as a citizen. If Congress passes something and it works, that'd be great. But in terms of my
00:13:30.000 contribution, I couldn't possibly be useful without being able to talk about it.
00:13:33.760 So there's that. So here's something really interesting. A Democrat. Keyword, Democrat.
00:13:46.700 Now, when I tell you the rest of the story, just keep in your mind, this is a Democrat. Okay? That's the
key part of the story. Democrat. Vernon Jones, a representative from Georgia. He says,
00:14:05.760 let's call this what it is: a hate crime. Basically, he said, I've watched, this is a tweet
00:14:12.580 from him, Vernon Jones. I've watched countless videos of Trump supporters getting attacked in
00:14:17.440 the streets simply due to their support of Donald Trump. Let's call this what this is, a hate crime.
00:14:25.080 He's a Democrat. A hate crime. And as we return to the legislature next week, I'll be introducing
00:14:31.700 legislation that'll make it such. So it's a Democrat. Vernon Jones, I believe he's African American,
00:14:40.720 which also, you know, gives a little context to the story. He's recommending legislation to make it
00:14:49.140 illegal to attack a Trump supporter for just being a Trump supporter, such as, you know, wearing a MAGA hat
00:14:56.260 or something. Now, how much do you love this guy? How much do you love Vernon Jones? A Democrat.
A Democrat. He's the guy introducing legislation to protect Republicans.
00:15:10.720 So I asked myself, Republicans, where were the Republicans? There's not one Republican who
00:15:22.620 wanted to introduce some legislation to protect Republicans. It had to come from Vernon Jones,
00:15:29.360 an African American Democrat. Now, what if I taught you about reciprocity?
Right? I've been teaching you that reciprocity is probably the single most important tool
00:15:43.960 for success. If you could get one thing right, of all the things you should do right for success,
00:15:52.300 you know, you want to stay out of jail and stuff like that. But reciprocity is just, it's just the king
00:15:59.080 of the hill for getting what you want out of life. Do something for somebody else. That's it. You do
00:16:06.360 something for somebody else. And it's going to far increase your odds that something good will come
00:16:12.080 back to you directly or indirectly. So did I just spend 10 minutes praising a Democrat? I've never met
00:16:19.920 a Democrat named Vernon Jones. And thank you. So Vernon Jones, thank you. Sincerely, thank you. This is
what actual leadership looks like. Because he's, you know, bucking the majority, I would
00:16:34.840 imagine. Now, I would like to take this excellent idea and extend it. I will extend it thusly.
00:16:43.420 Suppose you, a social media group, attack somebody for being a Trump supporter, so much so that they
00:16:52.580 lose their job. Let's say that somebody's tweets are rounded up, and they're not that bad. They just
00:17:01.000 show that it's somebody who's an avid Trump supporter. Let's say they take those tweets, and
00:17:08.400 they send them to an employer, and it causes somebody to get fired. Should it be illegal to
00:17:15.220 get somebody fired for being a Trump supporter? Well, I would say it should not only be illegal,
00:17:21.980 it should be a hate crime. Because if somebody is attacked because of their support of a political
00:17:30.300 party, that is a completely legal, functioning political party, Republican, if you get fired for
00:17:38.040 that, and the reason you get fired is that, you know, somebody organized an attack to talk to your
00:17:43.680 employer to get you fired, under those conditions, should that not be a hate crime? Because it is.
You know, if you steal somebody's job, just because you hate them, basically getting them
00:17:58.580 fired, that's stealing their job. You know, you're not stealing it to keep it, but you're taking it
00:18:02.840 from them. All right. If you take money from somebody, or let's say you burn down somebody,
00:18:09.320 let me give you a cleaner example. If I burned down your house because you were a certain political
00:18:14.720 party, would it be called a hate crime? Yeah, it would be. If I burned your house, even if you
00:18:22.040 weren't in it, let's say there was no danger to anybody physically. If I burned down your house
00:18:26.940 because of your race, your beliefs, your religion, is that a hate crime? And the answer is yes. Now,
00:18:36.380 a house is just an economic good, right? It's something that has a dollar amount. It's not a
00:18:43.240 person. It's just a dollar amount economic entity. Now you live in it, you got your private pictures in
00:18:49.680 there, so it's worse. But my point is, if you take somebody's job, it's not that different from
00:18:56.340 burning down their house. And if you would agree that it would be certainly a hate crime to burn
00:19:01.660 down somebody's house for being a certain political party, it's a fucking crime if you get them fired.
00:19:09.600 If you get them fired for their political beliefs, and this is happening in this country.
00:19:15.820 I'll be telling you a story about one that I know of later, but not today.
00:19:23.620 This needs to be a hate crime, not just a crime. It needs to be a hate crime, and there should be jail time.
00:19:30.560 So I think you should actually go to jail if you're getting somebody fired, or even trying to.
00:19:37.180 I would say even if you tried to get somebody fired for their political beliefs and have failed,
00:19:43.660 you should still go to jail. Jail. Actual jail. I'm not talking about a fine. I'm not talking about
00:19:52.420 being sued. I'm talking about jail. I'm talking about putting you in a cell, and you spend some time
00:20:00.680 there. Because if you burn down somebody's house for their political views, would you go to jail?
00:20:08.820 You would go to fucking jail every time, assuming you got convicted and everything. Every time.
00:20:17.080 So what is the real difference between burning down somebody's house for their political views
00:20:21.780 versus organizing a mob to take the person's job? There's no difference. It's the same fucking crime.
00:20:33.580 Jail. There should be jail for that.
00:20:39.340 All right. Let's talk about some other things.
There's a Rasmussen poll. I'm going to break some news for you. You want to hear some Rasmussen
00:20:49.940 poll results that you have not heard anywhere else, because I actually got permission to tell you
before it gets posted, which will be very soon. All right. Here was a survey. Rasmussen Reports.
00:21:03.660 A thousand U.S. likely voters. And they conducted this June 3rd and 4th. So use your imagination to
00:21:12.660 wind back your brain to June 3rd or 4th. We're sort of right in the middle of a lot of the
00:21:19.640 protests when things were pretty hot. And here was the first question. Do you have a very favorable,
00:21:25.680 somewhat favorable, somewhat unfavorable, or very unfavorable impression of Black Lives Matter?
00:21:31.760 So this is the existing thing. All right. So a very favorable, 32%. Somewhat favorable, 30%.
00:21:40.740 So the favorables are 62% when you put them all together. 62% of the people surveyed, and the survey,
I believe, I saw the details, was very close to the population of the United States. So the sample was
00:21:57.780 about 13% African American. And, you know, they got a mix that was similar to the United States
00:22:05.280 demographic. And 62% have a favorable view of Black Lives Matter, right in the middle of the protests.
00:22:12.620 And then the somewhat unfavorable and unfavorables added up to 31%. So twice as many people have a
00:22:21.120 positive view as a negative view. But here's the more interesting question. After several days of
00:22:26.880 protests, and following the death of George Floyd, do you have a more favorable or less favorable
00:22:33.440 opinion of Black Lives Matter? So here's the interesting thing. Did all of the recent news
00:22:38.760 cause people to like Black Lives Matter more or less? What is your guess? Give me your guess
00:22:47.480 before I give you the answer. Did the protests make the public at large like Black Lives Matter more
00:22:55.520 or less? The answer is both. The people who said they had a more favorable opinion, 30%. The people who
00:23:08.620 said they had a less favorable opinion, exactly 30%. In other words, it broke even. Now, it didn't really
00:23:18.580 break even because those aren't necessarily the same people. And maybe they were already favorable,
00:23:23.440 but they're a little more favorable now. And 38% said their opinion is about the same. I don't know
00:23:29.680 how your opinion could be about the same. What kind of person could watch all that and say,
ah, opinion is about the same? I guess that's possible. Yeah, I suppose if you already
00:23:40.100 had a very high opinion, or a very low opinion, it probably wouldn't change. You would just feel
stronger about it. Yeah, so I'm watching the comments go by. And I think what we would
00:23:52.180 find is that the conservatives liked Black Lives Matter less. And I'm just speculating. This is not
00:23:59.660 based on the data that I'm looking at. But probably, you know, probably people on the left like Black
00:24:06.580 Lives Matter more because it was doing more, you know, more active, raising the issue. All right.
00:24:13.340 All right. So that's from Rasmussen. And
00:24:19.180 so I tweeted a little test tweet. So I've been testing this concept that Republicans feel as if
00:24:31.840 they don't have freedom of speech. So I tweeted and pinned this just to see how many likes I got. And
00:24:36.980 here was the sentence. I said, Republicans only have free speech on election day.
Now, the point of that was to find out if the Republicans who were watching this would sort of
00:24:50.560 agree that they don't have free speech, but that the price will be paid. In other words,
00:24:58.040 if you're a Republican, and you are aware that there's a big national topic, and you are also aware
00:25:04.900 that you don't have freedom of speech. Where are you going to go to compensate for that?
00:25:10.480 Like your incentive to vote is just sort of through the roof, because it'll be the first time you can
00:25:17.320 be honest. The one time you can be honest is when you walk in that voting booth, and you pull the
00:25:24.940 lever or fill out your form or whatever, because you can honestly vote. You can totally honestly vote.
00:25:32.180 But you can't honestly talk. You don't have freedom of speech in a practical sense. You do in a legal
sense, of course. And it got 4,400 retweets, which on the size of my account is a lot. So I love
00:25:50.720 that I could use Twitter to quickly test a notion. So I was quickly testing if other people
00:25:57.120 were feeling the same. And then this morning, I wake up to at least three major accounts tweeting
the same concept, that we don't have freedom of speech. So we can't really talk about the topic,
00:26:09.400 honestly. I think four different accounts had the same theme, that we can't make progress
00:26:17.200 because we can't talk about it. So you can see more of that. You know, I would also say it should
00:26:25.080 be a hate crime for social media, somebody on social media to label a Trump supporter a white
00:26:31.440 supremacist. What do you think of that? If somebody on social media were to call a Trump supporter who's
00:26:39.940 just a Trump supporter, you know, hasn't done anything specific to be, you know, objectionable,
00:26:45.480 and they're labeled a white supremacist just for being a Trump supporter. Is that a hate crime?
00:26:53.100 If they do it publicly, if somebody calls you out and labels you a white supremacist on Twitter,
00:26:59.500 would you call that a hate crime? I would. I would. Because, you know, you couldn't use the n-word,
00:27:06.800 you couldn't call somebody any other kind of offensive, hateful thing. Now, calling somebody a
00:27:13.340 white supremacist really is an invitation to violence, I would say, because I think that most
00:27:19.420 people would feel justified in violence against a white supremacist. So I would say that if somebody
00:27:25.160 labels you a white supremacist for having, you know, just an opinion about things, and it's not a
00:27:32.760 white supremacist opinion, you're just a Republican, I would say that's a hate crime, because it is an
00:27:39.340 invitation to violence directly and indirectly. So I'd like to see those changes. And without those
00:27:46.640 changes, I think social media just has to be regulated like any other utility or publisher.
00:27:56.520 Let's see. There's additional scientific evidence for the fact that your genetics make a
00:28:04.840 big difference in how susceptible you are to COVID and coronavirus. So there's now a second study,
00:28:13.680 there was one in China that had the same result, that found that blood type O is more resistant.
00:28:21.640 Now, it's not completely resistant. It's, you know, it's in the teens of resistance, you know,
00:28:27.320 might be 10 to 19 percent or something in that range. But there is a pretty measurable difference.
00:28:34.820 If you've got type O, you're more resistant. My guess is that there are other factors in your
genetics that would also be predictive. So remember I told you early on, if you're keeping
00:28:50.040 score of which pundits predicted correctly as we went, one of my earliest predictions
00:28:56.200 is that there would be a strong genetic component. And if we knew that genetic component, it might help
00:29:03.040 us determine who to keep safe and who's at risk. Remember, that was one of the first things I said
00:29:07.780 months ago, and now science has confirmed. So I was right about masks, right about closing the airports,
00:29:14.020 even before the president said it.
00:29:19.600 You know, I don't know how many times to say it, because it blows my mind that my medical advice
was consistently superior to the World Health Organization. And most of you too. I think
00:29:36.520 a monkey with a dartboard would have been better than the World Health Organization.
But man, I freaking nailed it. Let me say this as starkly as I can. If you had ignored the World
00:29:49.340 Health Organization and just took your medical advice from me, so far, you would have been
00:29:57.080 way ahead. Now, in the future, I don't recommend you take medical advice from me. That'd be probably a bad
idea as a general principle. But it is nonetheless true that if you'd only listened to my medical advice, and I
00:30:12.100 have, you know, no medical training, obviously, you would have been way ahead. That's just a fact.
00:30:17.800 There's no way you can argue with that statement. So guessing is better than expertise, sometimes.
Bill Barr says, and he's talking about the social media companies, he says, quote, I think,
00:30:36.020 clearly, these entities are now engaged in censorship. And they originally held themselves out as
00:30:43.440 open forums. So I think that's a direct statement from the Attorney General, that he is aware. I mean,
00:30:55.340 wouldn't you say that it's a pretty strong statement from the Attorney General, to say that they are
engaged in censorship. So the Attorney General did not say, we're looking into censorship. He did
00:31:13.040 not say we're concerned that they might. He's not saying people have complained that they do. He said, he's the
00:31:20.740 Attorney General of the United States. And he said, clearly, these entities are now engaged in censorship.
00:31:26.480 He said, clearly, like it's a done deal. What? All right. Let me cure racism. I'm going to take some
00:31:38.060 things which I've said before, but I've not said them as well as I'm going to package them now. So I'm
00:31:43.620 going to take you to my whiteboard, and I'm going to solve racism for you. Sure, maybe not all at once,
00:31:50.580 but the cumulative effect will be a complete solution to racism. It comes from understanding
00:31:59.760 it like this. All right. So some of this you've seen before, but I'm going to package it better.
00:32:06.260 You have a brain. It's a big old lump of stuff inside your skull. Your brain is a pattern recognition
00:32:14.260 machine, but it's not a very good one. And that's where all our problems come from.
00:32:19.400 So your brain doesn't have a choice of being biased. It's designed to be biased because you
00:32:27.840 can't do a scientific controlled experiment for all of the thousands of decisions you make every day
00:32:34.620 from whether to cross the street, how to start your car, what clothes you put on, thousands of
00:32:40.740 decisions every day. And you don't have any data or scientific studies. So you use your bias. You say
00:32:47.800 to yourself, well, the last five times I did whatever, it worked out well. So I'll do that
00:32:54.640 again. Or the last three times I tried this, it didn't work. So I won't do that again. So your brain
00:33:02.280 is really just a pattern recognition machine that's running all the time, and you can't turn that off.
00:33:07.880 Even though it doesn't do a good job of it. So if you see a pattern that's not a real pattern,
00:33:15.180 you're still going to think it's a pattern. Because you don't have time to do a deep dive
00:33:19.980 on everything. So you can be misled by your pattern generating machine, and being misled
00:33:25.880 will make you biased, bigoted, racist, sexist, ageist, and all the other isms. Because your brain
00:33:32.360 is very biased. It just looks for patterns. It's dumb. It sees a pattern, even if it's not a real
00:33:39.800 pattern. It could be an imagined pattern. And it just says, well, it's the best I got. I'll go with it.
00:33:45.240 Now, the problem with Black Lives Matter, and their approach to things and systemic racism, etc.,
00:33:55.480 is that they're trying to fix this. They're trying to fix the fact that people are biased,
00:34:00.880 bigoted, racist, sexist, etc. Well, they don't care about the bottom one so much, but they're
00:34:05.800 working on the racist part. This, the problem they're trying to solve, can't be solved.
00:34:13.760 So if you're wondering why racism hasn't been solved, it's because it can't. It's not a thing.
00:34:21.920 It's not a thing that has a solution. I'm not saying it's hard. I'm not saying we don't
00:34:28.000 know how. I mean, it is logically unsolvable by its nature. Because to solve this, you'd have to
00:34:35.960 remove our brains. Because you can't rewire a brain to make it not a pattern recognition machine.
00:34:42.100 That's its basic nature. If you made your brain not capable, or at least not automatically operating
00:34:50.180 on patterns, it would also no longer be a brain. It just wouldn't do anything. You would just sit in
00:34:55.160 the chair and starve to death. So this is simply not solvable in any sense. There's nothing you
00:35:03.520 can do. You cannot solve the basic nature of the human brain. And if you try, you're doomed to fail. And
00:35:12.300 here we are. Here's what you can do. So instead of doing what you can't do, how about doing what you
00:35:19.300 can do? Your bias is permanent. You can't get rid of bias. It might change over time as you have new
00:35:25.700 experiences and they cause new or revised biases. But you're always going to have bias. But what you
00:35:32.340 can do is make sure that your filters are strong. Your moral filter says, no, I don't want to treat
00:35:39.120 people with bias. So I will use my higher level thinking to try to tamp down that instinct. You
00:35:47.260 can put your social filter on it and say, no, I want to be a good person. All my friends are good
00:35:54.520 people. I want to be a good person. I don't want to be that kind of person. So I'll use my social filter
00:36:00.160 to tamp down my bias. And then there's a practical filter, which is: how would society work if we
00:36:08.060 just go around discriminating against each other for race or anything else? It doesn't work very
00:36:12.880 well. So from a practical perspective, if you want to have a better world, of course you want to
00:36:20.140 tamp down your bias because it just works better. If somebody comes in and asks for a job, do you want
00:36:26.260 to lose the possibility that you could have hired the best employee you'd ever had because you were
00:36:31.760 being biased? No, it's just not practical. You would rather look at the resume and say, oh, okay,
00:36:38.080 I was a little biased when you walked in the door because you weigh 110 pounds and the job requires
00:36:43.480 lifting heavy objects. But now that I've looked at your resume, I see here that you're a bodybuilder.
00:36:49.640 I couldn't tell by the way you're dressed, but it looks like you could actually handle this job.
00:36:53.300 Okay, you're hired. So on a practical basis, you don't want to lose the chance to hire the best
00:36:59.620 person you ever had for the job. So you want to overcome your bias. So here's what I'm saying.
00:37:06.080 If you spend your time trying to fix this, it's just a waste of time. You can't fix it. You
00:37:11.500 shouldn't fix it. It's the basic nature of your operating system for your brain. You can't change
00:37:16.660 that. But you can certainly strengthen your filters. You can strengthen your filters. So imagine
00:37:23.920 starting early and saying to kids, look, you're going to be biased. There's nothing you can do about
00:37:29.020 that. But you can strengthen your filters. Here's why being biased doesn't work for anybody. Here's why
00:37:36.220 it's immoral. Here's why you're not going to have a good time with your social life. It's just not going to
00:37:41.580 be as good if you're operating on your biases. So to some extent, I'm stealing these ideas from
00:37:53.180 Morgan Freeman, for example. Morgan Freeman, and this isn't his version of it, but I borrowed
00:38:01.000 from his version as inspiration. So I'd say I'll use Morgan Freeman as inspiration, which is
00:38:07.980 that you just sort of ignore the racism part and laugh about it. Now, imagine
00:38:16.720 the world. You know, I'm not going to say I have a dream, but you can see where I'm going
00:38:23.440 here. Imagine a world where somebody walks in for a job interview or just walks in the room in a social
00:38:31.100 setting. And your first thought is whatever your bias is. And you can make a joke about it.
00:38:39.060 And the other person could make a joke about it too. Let me give you an example. If somebody
00:38:44.720 mocks me for being a white guy who has no rhythm and can't dance, what is my first
00:38:51.780 reaction to that? Let's say a black guy mocks me for being a bad dresser,
00:38:58.580 which I am, or being a bad dancer. If he mocks me in a funny way, I just think it's kind of funny
00:39:06.280 because it's true. I don't dress well. I don't have style. I don't have rhythm. Now I don't care
00:39:14.000 if these stereotypes do or do not have some kind of statistical meaning. I don't really care because
00:39:21.000 they don't apply to me. I just think it's funny. And anytime somebody teases me for, you know,
00:39:27.080 playing air guitar. I don't play air guitar, but you know what I mean? Or dancing with an overbite.
00:39:33.280 Do I say to myself, ah, hate crime, hate crime. Or do I say to myself, ah, that's kind of funny.
00:39:39.360 Kind of funny. So I think, you know, I think Morgan Freeman is my inspiration here that you should just
00:39:49.740 accept. Let me say it more directly. What we're trying to do is to minimize our feelings of bias
00:39:58.400 so that in our interactions, they don't come out. It's probably exactly the wrong instinct.
00:40:05.960 It's probably backwards. I don't know. But imagine it going exactly the other way.
00:40:12.280 What if we could speak freely about our biases? What if you could just speak freely about it?
00:40:19.940 But at the same time, you would strengthen your filters to the point where you could overcome it.
00:40:27.360 So somebody walks into your office for an interview and you say, oh man, you do not look like the right
00:40:33.620 person for the job. For whatever reason. You're too short. You're too old. You're the wrong gender.
00:40:39.960 Maybe you're even a racist. But imagine a world where you could just say that. Say, man,
00:40:45.080 I don't even think you're the right person for the job. You just laugh about it. And then the person
00:40:49.700 pulls out their resume, says, well, my Harvard degree and my 10 years, you know, writing published
00:40:56.900 papers in this says otherwise. And then the person who was laughing said, oh damn, pretty darn good.
00:41:03.260 You're hired. And nobody thinks anything about it. They just say, okay, I had a bias because
00:41:10.140 I'm a human. How about just saying humans have bias? But we also have tools. So just accept
00:41:18.740 your bias and then use your tools. Make sure you've strengthened your filters so you can build
00:41:24.880 a system that works best. So that's what I would suggest. All of this stuff about how you
00:41:30.460 feel about it and trying to cleanse your soul and, you know, because you speak a certain
00:41:36.080 way in your dark heart, you've got these bad feelings. Every bit of that is counterproductive
00:41:41.900 because it imagines people as something we are not, which is free from bias, or able to become free
00:41:48.920 from bias. Or that there are some people who don't have bias and some people who do. That's
00:41:53.660 a big illusion. Everybody's got bias. Somebody says, you are such a tool. That's a good productive
00:42:03.400 comment there. Thanks for jumping in with such useful advice. Somebody says, unrealistic but
00:42:12.320 good. Yes, it is unrealistic in the sense that I do not expect that people will automatically
00:42:19.800 say, oh, okay, good idea, you got me. Now let me tell you how this works. If this is a
00:42:26.680 good frame, it will live on its own. Meaning that there are enough people here who will
00:42:33.320 see it that if I said anything useful on this topic or anything else, it forms sort of almost
00:42:40.640 a creature. You know, a good idea is almost an entity that can travel and it can be spread
00:42:46.880 like a virus from one person to another. So one of two things just happened. Either I
00:42:52.960 had an idea that doesn't have much traction and that's the end of it. Last time you'll
00:42:57.480 ever hear of it. Or there are some few people who heard this who said, damn, that really gives
00:43:04.980 me a better way to think about it and maybe I can do something about that. And if it's a
00:43:11.660 good idea, it will grow on its own over time. So you can just release an idea like that and
00:43:18.680 find out if it has any power. All right. Somebody says it's not a good frame. Well, you would
00:43:27.500 need to define that a little bit better. I'm not going to block you, but if you're new, know that it's
00:43:34.500 my policy to block people who have a criticism that doesn't have a reason attached. You don't
00:43:39.400 have to go deep with your reason. But when you said it's, I think you said it's not practical
00:43:44.720 or something, not a good frame. Give me just a hint of a reason if you can summarize
00:43:51.180 it. Usually you can. All right. Swaddle me, Captain. That's always funny every time I hear
00:44:01.160 it. Yeah. And I've said this before, but we should be concentrating on strategy.
00:44:10.480 Here's something that's provocative that I would love to tell a 12-year-old, a class of 12-year-old
00:44:17.060 black Americans. So if I could talk to a classroom at that age, I would tell them this.
00:44:23.700 Yeah. Racism will always be a problem. It's not going away. It can't go away, really.
00:44:32.200 But while you might be disadvantaged, let's say socially or even systemically, I haven't
00:44:39.200 heard examples of that, but I'm sure I will. So you might be disadvantaged in this life because
00:44:45.540 of racism, et cetera. But you also have a tactical advantage. And they would say, what? Yes, that's
00:44:54.160 right. You have a disadvantage because of racism, but you also have a tactical advantage if you
00:44:59.720 have the right strategy. And the tactical advantage, and I'll just give you the one obvious example,
00:45:04.900 which is you could walk into any Fortune 500 company, and if you have the same qualifications
00:45:10.220 as a white candidate, you'll get the job every time because the company needs diversity as well
00:45:16.320 as employees. So if you can get a good employee who has the same capability as everybody else who's
00:45:21.580 applying, but you can also get some diversity, that's far preferred. And that's as close
00:45:29.500 to a universally true statement as you can make about corporate America. So I would teach the class,
00:45:35.400 yes, you were born into a world that's going to have some racism. It's always going to be
00:45:39.860 there. So here's your strategy. Be tactical. You can slice through that like it didn't even exist
00:45:47.500 because your tactical advantage is so strong if you use it, if you have the right strategy.
00:45:52.780 So that's the way I would go. Let's find out if we have any freedom of speech,
00:45:57.940 and we'll get back to social media, and we'll find out. And I will talk to you later.