Real Coffee with Scott Adams - February 09, 2020


Episode 814 Scott Adams: "Extremist Face", Van Jones the One-Eyed King, Bad Risk Management, Blame


Episode Stats

Length: 56 minutes

Words per Minute: 151.1

Word Count: 8,532

Sentence Count: 595

Misogynist Sentences: 1

Hate Speech Sentences: 2
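
A side note on the stats above: the words-per-minute figure appears to be the word count divided by the episode's exact runtime rather than the rounded 56-minute length. Here is a minimal sketch of that arithmetic, assuming a runtime of roughly 56 minutes 28 seconds (an assumption inferred from the listed figure, not stated in the episode data):

```python
# Minimal sketch: deriving the words-per-minute stat from the other stats.
# Assumption: the exact runtime is about 56 minutes 28 seconds; the listing
# above only shows the rounded 56-minute length.

word_count = 8532
runtime_seconds = 56 * 60 + 28  # assumed exact runtime (~56.47 minutes)

words_per_minute = word_count / (runtime_seconds / 60)
print(round(words_per_minute, 1))  # ~151.1, consistent with the stat above
```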


Summary

Some idiot in a van crashed into a GOP voter registration tent at a Florida shopping center, and no one knows whether he was aiming at the people or just the tent. Does he look more like an Antifa type or a neo-Nazi? And why do extremists have such an identifiable look?


Transcript

00:00:00.000 Bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, bum, hey, everybody, it's good to see you, come on in, gather around, it's time for the best part of the day, except for the rest of it, which is going to be pretty good too, yes, it's going to be coffee with Scott Adams this morning, and all you need, you don't need much.
00:00:30.000 You need a cup or a mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask, a vessel of any kind, fill it with your favorite liquid, I like coffee, join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better, the simultaneous sip, Erica, I see you there, Dr. Funk Juice, grab your mugs, Marla, come on, go.
00:00:54.160 So, ah, sublime, let's talk about the news, and stuff, stuff in the news.
00:01:08.240 Well, there's a big story about some idiot in a white van who crashed into a voter, a GOP voter registration tent in a shopping center in Florida.
00:01:25.180 Now, as luck would have it, he did not kill anybody, but apparently he wasn't that far away from killing somebody.
00:01:31.520 This is one of those stories where you sort of had to be there to know. If you were there, could you tell in advance, or could you tell by observing, whether he did or did not intend to hurt people?
00:01:46.060 So, one report is that he got pretty close to somebody, but I don't know if that person was jumping out of the way and was barely missed, or if, you know, he aimed his truck at the objects and not the people.
00:02:00.180 So, we'll find out, but I'm not sure it makes it that much better if he was aiming at their tent.
00:02:07.500 Apparently, the guy got out, took a video of it, and gave them the finger, so it was clearly political.
00:02:14.480 But then we see his mugshot, and you look at the mugshot, and you say to yourself,
00:02:22.580 okay, I'm tired of pretending that I don't see the correlation.
00:02:29.920 There's a correlation.
00:02:31.920 So, just before I got on, I tweeted in the replies to another tweet.
00:02:38.000 I asked the question of whether there's something called Antifa eyes.
00:02:41.720 In other words, do they have a look that you could identify?
00:02:46.280 So, I took a picture of a page full of Antifa mugshots, people who definitely were in Antifa, they got caught.
00:02:57.320 And then I compared it to a page of shots of members of Congress, just headshots in both cases.
00:03:03.800 Take a look at them.
00:03:08.080 See if it looks like you could identify them just walking down the street.
00:03:14.400 You know, not necessarily that you're Antifa or that you're in Congress,
00:03:18.400 but that in one case, there's somebody who's likely to obey the law,
00:03:22.880 and in another case, someone a little less likely.
00:03:25.820 Now, someone said, quite correctly,
00:03:28.880 hey, that guy who probably is associated with, or at least compatible with, the philosophy of Antifa,
00:03:37.080 the guy with the van who ran into the voter registration tent,
00:03:41.320 he looks more like one of those neo-Nazis.
00:03:45.400 To which I say, yeah, he does.
00:03:48.720 So, here's my hypothesis.
00:03:50.220 I think that there is an extremist look.
00:03:56.860 I believe that you could identify people who are extreme on the right or extreme on the left.
00:04:04.620 And here's how I think you could identify them.
00:04:08.180 By mental illness.
00:04:11.060 I think that's the identifying characteristic, is mental illness.
00:04:14.540 Because if you're mentally ill, you're going to be far more likely to be gullible or drawn into something that's extreme.
00:04:23.460 And, you know, depending on where you started from, you might say,
00:04:26.200 oh, I'm extreme left or extreme right.
00:04:28.980 But there's lots of science to suggest that human beings can detect illness.
00:04:37.320 In other words, if you, let's say there's somebody you're very close to.
00:04:42.680 It's a spouse, boyfriend, girlfriend, somebody in your family, somebody you see all the time.
00:04:47.620 And you walk into the room, and you look at them, and in a minute, you can tell if they're sick.
00:04:53.600 Most of the time, right?
00:04:54.860 Not every time.
00:04:55.600 It's not one of those 100% things.
00:04:57.080 But we're really good, we're really good at identifying illness.
00:05:09.000 And I think that would probably include mental illness.
00:05:12.020 So I don't think it's a coincidence that when we see these extremists, they have a look.
00:05:17.660 And to me, again, there's no science in what I'm going to say next.
00:05:24.040 In my opinion, subjectively, the people on the extreme left and the extreme right look mentally ill.
00:05:31.960 It might be different types of mental illness, but they certainly have a distinctive, generic look.
00:05:39.220 And maybe it would be more productive to talk about them that way,
00:05:43.700 because we end up talking about them in political ways.
00:05:46.320 So the left will say, hey, that one neo-Nazi killed somebody, so all you Republicans are bad.
00:05:54.080 And, of course, Republicans say, hey, those Antifa guys hit Andy Ngo in the head,
00:05:59.180 so I guess all Democrats are bad.
00:06:01.620 You know, they don't say it exactly that way, but, you know, it becomes a political thing.
00:06:06.600 What I think, it's mostly a health thing.
00:06:10.000 I think if you drilled down, you'd find a health problem, a mental health problem.
00:06:13.460 Anyway, just a hypothesis.
00:06:18.320 Mike Bloomberg has a commercial.
00:06:20.440 It's an anti-Trump commercial, in which he shows Trump quoting, well, not quoting,
00:06:26.320 he shows Trump saying something about climate change.
00:06:30.220 And I don't have Trump's exact words, but it's something like this.
00:06:34.820 Trump says, you know, climate change, blah, blah, blah, you know, a lot of it is a hoax.
00:06:41.020 I think he said a lot of it, or much of it, or something like that.
00:06:43.800 So, in other words, the quote that Bloomberg puts on the screen is not Trump saying climate change is a hoax.
00:06:52.900 He wants you to think that, but if you listen to the actual words, that's not what he's saying.
00:06:58.840 Trump says something more like a lot of it is a hoax.
00:07:03.240 Now, I think that's just true.
00:07:05.840 I think that's objectively true.
00:07:07.920 Now, you have to accept that in the political realm, the word hoax is being used very broadly for something that's not quite right, something that's not true.
00:07:17.240 It doesn't necessarily mean that there's a prankster behind it who's running a practical joke on you.
00:07:23.720 You know, we've sort of morphed the word hoax from the original meaning to just mean there's something not right, doesn't add up.
00:07:30.220 You know, somebody's trying to get rich off it in general, that sort of thing.
00:07:35.220 And if you were to look at it that way, there is, objectively speaking, let's just say the Paris Accords.
00:07:42.360 Because if you just looked at that, that looks like a hoax, again, in the general sense, not of a prankster playing a trick, but of something that didn't add up.
00:07:53.680 It didn't make sense to be a part of it, wasn't helping anything.
00:07:57.320 It was costing us money, but it wasn't helping anything, except, you know, the argument is in some leadership way it would make a difference, but not really.
00:08:05.680 So Trump's opinion of climate change is, first of all, not what it's being presented, because there's a really big difference between saying the science is a hoax, which he didn't say, I don't know what he thinks, I'm just telling you what he said, versus saying climate change, a lot of it is a hoax.
00:08:28.200 Because the whole topic is not just the science, it's the prediction models that are a little less dependable, probably, than the science.
00:08:38.340 There's the economic models that are even less dependable, and then there's the politics.
00:08:43.800 If you look at the politics and the models and stuff, saying that a lot of that looks like a hoax, that's pretty reasonable.
00:08:51.780 There's no departure from science whatsoever to say that, because it's not even a comment about science.
00:08:58.200 So Bloomberg shows that with the intention of showing that Trump is anti-science.
00:09:07.580 And then comes the ridiculous part. One of the things that politicians have to do is create content that matches and paces the people they're trying to reach.
00:09:23.460 So I'm not going to say that this is necessarily Mike Bloomberg's personal private opinion, but it's something that he approved to put in an ad.
00:09:33.180 And it says in the ad, after mocking Trump for his opinion, it said, Mike Bloomberg knows his science.
00:09:42.620 To which I say, does he?
00:09:46.680 Does he?
00:09:47.880 Does Mike Bloomberg know his climate science?
00:09:51.940 Because I don't think he's a scientist.
00:09:53.740 Do you know who else doesn't know their science?
00:09:58.960 All of the scientists, well, I won't put it that way.
00:10:04.800 It is loserthink, and I devoted part of a chapter in my book, Loserthink, to this:
00:10:11.100 imagining that you can, quote, do your own research on a field that you don't understand.
00:10:18.920 And let's face it, if you're not a climate scientist, you probably don't understand enough about climate science
00:10:25.740 to do the research on your own and come to a conclusion that says you know science.
00:10:31.900 It's a ridiculous concept.
00:10:35.900 So Mike Bloomberg is running with this proposition.
00:10:42.300 I think this is fair to say.
00:10:44.240 I think his proposition is he's running against somebody who doesn't understand science
00:10:49.140 or doesn't respect it or bow to it.
00:10:53.060 And he's the more rational candidate.
00:10:56.520 That's the proposition, right?
00:10:58.180 One is rational.
00:10:59.140 One, Mike Bloomberg, looks at the facts.
00:11:02.900 And the other is irrational, according to Bloomberg, this Trump fellow.
00:11:08.900 But his commercial is 100% irrational.
00:11:12.880 And it probably works.
00:11:15.260 So in terms of political ads, it probably works.
00:11:20.100 Because I think the people who want to believe this say,
00:11:22.440 oh, yeah, Trump thinks science is a hoax, which he didn't say.
00:11:27.140 And if he had said it, I think Mike Bloomberg would have used that clip instead of the clip
00:11:34.540 where he talks generally about much of it being a hoax, which I take to be the political part.
00:11:40.680 Now, this is the most irrational thing you could ever say.
00:11:45.800 And many of you have said the same thing.
00:11:48.400 So this is about you, too.
00:11:50.380 If you believe that you can do your own research on the topic of climate change
00:11:55.980 and reach an opinion that is rational because you've done your own research,
00:12:02.220 you're not a rational person.
00:12:04.900 You can't do that.
00:12:06.100 There's no way that you or I could research climate change,
00:12:11.960 no matter how much work we put into it.
00:12:14.620 If we don't have a background in it, we haven't waded into it,
00:12:18.500 haven't really wrestled with the details in a scientific way,
00:12:21.760 we can't reach an opinion on that stuff.
00:12:24.500 All you can do is believe people who told you stuff.
00:12:27.640 That's it.
00:12:28.300 So here's what Bloomberg's commercial should have said if it had been honest.
00:12:34.300 And isn't he running on honesty?
00:12:36.520 Isn't that one of his biggest propositions?
00:12:39.600 Trump's a big liar.
00:12:41.180 I'll give it to you honestly.
00:12:43.220 And then he produces a campaign commercial that's pretty much the opposite of that.
00:12:48.080 Because if he'd been honest about Trump's opinion,
00:12:51.640 he would have said, okay, we don't know what he means by the hoax part.
00:12:55.100 That would have been honest, right?
00:12:56.880 We don't know what he means.
00:12:59.280 Is he talking about the basic science?
00:13:01.720 He didn't say that.
00:13:03.120 Or is he talking about the political element of it?
00:13:05.820 That's the best interpretation.
00:13:07.600 But again, you'd have to get his clarification on it.
00:13:11.640 So it's a complete mischaracterization of the president's opinion.
00:13:15.560 So first of all, Bloomberg is essentially lying by context omission.
00:13:21.200 Just a lie.
00:13:22.980 And he's running to be the honest guy.
00:13:24.820 And it still works.
00:13:27.780 Because his people will accept this message the way he wants them to.
00:13:34.700 So it's a weird situation.
00:13:36.860 All right.
00:13:37.100 I don't trust the guy who believes what the experts tell him.
00:13:44.340 I've got a little more trust in the one who says, maybe, but it looks fishy to me.
00:13:52.720 Who do you trust?
00:13:53.440 I kind of have a little more natural trust for the person who says, maybe, but there's a part of this that could be a little sketchy.
00:14:04.340 I just naturally trust that person more.
00:14:07.360 All right.
00:14:09.300 I want to talk about Van Jones, my favorite Democrat.
00:14:12.460 And I mean that literally.
00:14:15.020 Of all the people who are Democrats, he is my favorite one.
00:14:19.200 And what I love about watching him is this. You've heard the saying that in the land of the blind, the one-eyed man is king.
00:14:29.300 So Van Jones is like the only person who has an eyeball, who's a Democrat too.
00:14:35.160 I keep watching to see him suffer from TDS, and he doesn't.
00:14:43.080 He doesn't.
00:14:44.140 I've been watching him for a long time, since before the election, after the election, including last night, his comments about the impeachment, his comments about the debates.
00:14:55.660 I just can't find any Trump derangement syndrome at all.
00:14:59.640 It's like he doesn't have it.
00:15:00.740 But how in the world could he live and operate in that world and be immune when everyone around him, obviously, is deeply infected with TDS?
00:15:14.380 And I have a hypothesis.
00:15:17.040 You ready?
00:15:20.720 I probably shouldn't say this in public.
00:15:23.680 So I apologize to you, Van Jones, in advance, if you would wish I had not.
00:15:30.740 But I'm just going to put this out here.
00:15:33.060 And the only people who are going to really understand this is people who have had a similar situation.
00:15:38.060 I predict that at some time in his life, maybe not recently, but at some time maybe in his younger life, Van Jones experienced some hallucinogens.
00:15:48.880 Now, when I said that, all of you who have never taken a hallucinogenic drug don't know what I mean.
00:16:00.180 Everybody who has knows exactly what I just said.
00:16:04.680 And you'll see it in the comments.
00:16:06.460 All the people who know what I mean are just going to say, uh-huh.
00:16:10.100 Yeah, just watch the comments.
00:16:14.480 You'll see all the people agree.
00:16:17.100 Now, I mean this totally as a compliment.
00:16:20.340 As you know, I'd like to see some kinds of hallucinogens legal because they're greatly implicated in helping people with various mental health problems, solving addictions, all kinds of good stuff.
00:16:32.060 But one of their benefits is that you only have to do it once, and you can start seeing the filters of life.
00:16:40.580 The filters of life are the people who get caught in a little mental prison, and then they can only see the world through one filter.
00:16:49.000 And they think that their one filter is reality.
00:16:52.260 But it's not.
00:16:54.740 If you have one experience with hallucinogens, you experience living a world through a different filter, even temporarily.
00:17:02.440 And when you're done, you always remember it.
00:17:06.500 Now, you don't see the world the same way as you saw it when you were doing the hallucinogen.
00:17:10.840 But what you come away with is the understanding that you could have a completely wrong or different view of the world, and you can still operate.
00:17:21.940 You can still eat and procreate and go to work and everything else.
00:17:27.480 So if you had never had that experience, and you were surrounded with people who said, you know, the world is this one way.
00:17:34.840 This President Trump is an existential threat.
00:17:37.260 He's going to destroy the world.
00:17:39.940 What would you do?
00:17:41.280 Well, if your entire experience of life was that there's just this one reality and the smart people can see it and the dumb people can't, you would buy into the majority view.
00:17:53.560 It's the most natural thing you would do.
00:17:55.440 You would go with the people around you.
00:17:57.200 It's like, oh, everybody's seeing the world the same way.
00:18:00.140 They all seem pretty smart.
00:18:02.180 I think I'll probably adopt that view.
00:18:04.660 Why wouldn't I?
00:18:05.420 But time and time again, Van Jones, surrounded by TDS, I mean, just, he's like right in the middle of this boiling cauldron of insanity, and he's not affected.
00:18:19.840 Not even a little bit.
00:18:22.840 And what can explain that?
00:18:26.820 Well, you have my hypothesis.
00:18:26.820 I believe he has had that experience, and it doesn't necessarily have to have been hallucinogens.
00:18:32.920 But my hypothesis is that Van Jones, at least once in his life, has experienced seeing the world through a different filter.
00:18:41.420 I don't know what that filter was.
00:18:42.800 But if you see the world through two different filters at two different times, and you come to realize the subjectivity of your experience, it's easier to see past other false filters and know that they're more of a lifestyle choice.
00:18:58.440 They could be a mental prison, et cetera.
00:19:01.160 Here's how Van Jones described this in his own words, roughly speaking.
00:19:07.100 In a clip, I just saw it today.
00:19:11.040 I'm not sure when he said it, but it was recently.
00:19:13.720 And he said that the Democrats are playing, quote, fantasy football.
00:19:19.140 And then he explains it this way.
00:19:20.820 Now, listen to how much this sounds like two movies playing on one screen, but it's a slightly different version.
00:19:27.200 And his examples of fantasy football that the Democrats are playing, according to him, are that, remember when Trump first got elected, there was all that talk that they would throw out the Electoral College results?
00:19:41.240 How realistic was that?
00:19:43.340 It wasn't really realistic, was it?
00:19:45.480 And then it didn't happen, and they're like, God, I'm so surprised.
00:19:48.420 That didn't work.
00:19:49.520 And then they did the Mueller report, and that didn't work.
00:19:54.040 It didn't work.
00:19:54.580 It was like a fantasy of getting rid of Trump, and then there was the impeachment, which also didn't have a chance of working.
00:20:03.420 And Van Jones explained those three things, and those were just three examples.
00:20:08.300 By no means would I expect these to be the exhaustive list.
00:20:13.640 But isn't that description, fantasy football, kind of really good, kind of perfect?
00:20:19.500 Because they're playing a game with the Electoral College and Mueller and impeachment that they can fantasize is real.
00:20:29.060 Meaning that in their minds, they're like, yeah, we got them now.
00:20:33.420 When Mueller brings in the goods, we got them now.
00:20:36.700 So it allows them to live in a manufactured fantasy world in which they're winning, or they're about to win.
00:20:45.780 It's going to be good any minute now.
00:20:48.180 Sure, it's been bad for us for years in a row, but any minute now.
00:20:53.560 I thought that was a great, great way to frame this thing.
00:20:57.160 Anyway, so I love watching Van Jones talking reasonably in the cauldron of TDS,
00:21:08.700 because I don't think other people know quite what to do with it.
00:21:11.560 They don't really push back.
00:21:13.500 They just sort of change the subject.
00:21:16.100 Because if you're the only rational person in an irrational room,
00:21:21.240 sometimes maybe they just change the subject.
00:21:23.800 So, I tweeted this this morning.
00:21:27.960 I said there are two things that MSNBC pundits know to be true.
00:21:33.860 These are the two things they know to be true, if you're on MSNBC.
00:21:38.500 Number one, President Trump is an, quote, existential threat.
00:21:43.860 It's a phrase they use all the time.
00:21:46.260 Existential threat.
00:21:48.080 Meaning, he's literally a threat to our existence.
00:21:51.260 That we could all die, or at least a lot of us could die, because of President Trump.
00:21:57.920 So, they use this exact phrase.
00:21:59.500 He's an existential threat.
00:22:01.420 And they know that to be true, because they all say it without any pushback.
00:22:06.660 There's nobody on MSNBC who says, whoa, wait a minute.
00:22:10.340 You know, that's hyperbole, right?
00:22:13.000 You don't actually literally mean we're all going to die.
00:22:18.020 Nobody does that.
00:22:19.000 Do you know why?
00:22:21.660 Fantasy football.
00:22:23.980 They've all taken on this fantasy that there's this president who's going to kill us all.
00:22:30.420 And they're all on the other team.
00:22:33.140 Now, I'm adding that to Van Jones' list.
00:22:35.660 He didn't put that on the list.
00:22:37.460 But to me, it seems the same.
00:22:39.660 That there's like a fantasy filter on this.
00:22:43.620 So, that's the first thing they know, that President Trump is an existential threat.
00:22:47.000 Here's the second thing they know.
00:22:50.060 It's a special kind of existential threat that disguises itself for the first four years as a continuous flow of good news, meaning that the country is in good shape.
00:23:01.860 Things are going well.
00:23:02.680 And so, the people who believe that Trump is an existential threat, I would say to them, I can kind of see what you meant three or four years ago.
00:23:15.300 Remember, by election day, we'll have four continuous years of a Trump administration.
00:23:21.780 If he's an existential threat, wouldn't we see some signs of that?
00:23:27.840 You know, it's one thing to say, this is a big unknown.
00:23:32.380 You know, the election of President Trump.
00:23:34.600 On day one, when he's sworn in, it's a pretty big unknown.
00:23:38.780 So, you would expect, in the face of unknown, unknowns, that people would have different opinions.
00:23:45.440 Because they're just saying, you know, they're predicting with their biases or their preferences or their fantasies, whatever.
00:23:51.700 So, fairly reasonable, fairly reasonable to be afraid of President Trump on his first day of office.
00:23:58.980 I disagreed, I had a different opinion, and I had a strong, different opinion.
00:24:04.600 But I don't think it was crazy to be afraid of the unknown.
00:24:09.200 Because he did talk, you know, he did talk in a way that people hadn't seen, people found it scary.
00:24:15.120 You know, so, reasonable, reasonable.
00:24:17.480 Turned out to be wrong, but it wasn't crazy.
00:24:21.420 But now it's three years in.
00:24:23.840 And when election day comes around, it's four years in.
00:24:26.500 And I think, at that point, it's kind of crazy.
00:24:30.800 Right?
00:24:31.320 You should be able to change your opinion if four years go by and all we are is better off.
00:24:37.680 But no.
00:24:40.180 All right.
00:24:41.280 Rudy Giuliani went on Jesse Watters' show last night and made news.
00:24:46.160 I don't know if Jesse knew that he was going to be breaking a big story.
00:24:50.840 He looked surprised.
00:24:53.020 But there it was.
00:24:54.120 Now, you may have seen President Trump tweeting ahead of last night's shows that there was going to be a great lineup on Fox last night.
00:25:05.480 So President Trump basically just did a tweet commercial for Judge Jeanine, the Greg Gutfeld show, and Jesse Watters' show.
00:25:14.560 And now we know why, because Rudy Giuliani was on, and, of course, he knew he would get favorable coverage in general.
00:25:23.820 But Rudy Giuliani has this claim that he's got three witnesses ready to name names in an investigation into Hunter Biden.
00:25:34.640 And so there's alleged corruption.
00:25:39.120 And Rudy Giuliani says he's got at least one document and three people who are willing to testify to something.
00:25:45.880 I don't know exactly, had something to do with money laundering through different countries and making a loan not look like a loan.
00:25:54.880 I don't know the details, so I don't want to characterize it.
00:25:57.260 But Rudy is making a pretty big claim that there was something that wasn't too hard to find, apparently.
00:26:06.840 I mean, Rudy found it without a lot of help.
00:26:09.480 And that it would change how you saw President Trump's request to Ukraine.
00:26:16.580 Now, do you think that Rudy Giuliani really has the goods?
00:26:24.380 What do you think?
00:26:26.740 If you had to put odds on it, well, of course, we don't know.
00:26:30.300 But there are two possibilities.
00:26:31.540 One is that Rudy is just being political, and he doesn't really have the goods.
00:26:36.920 But maybe it's something that he could convince you looked a little suspicious.
00:26:41.760 So maybe.
00:26:42.700 So that's one possibility.
00:26:43.580 The other possibility is he totally has the goods.
00:26:48.360 He totally has it.
00:26:50.140 Now, given that Rudy Giuliani is literally famous as being, what, one of the most effective prosecutors of all time,
00:27:01.760 what are the odds that Rudy Giuliani can't tell the difference between having the goods, the smoking gun, as he called it,
00:27:09.040 and not having a smoking gun?
00:27:11.100 I feel as if he'd know the difference.
00:27:13.580 And you would have to believe that he was intentionally lying to you about the quality of his evidence
00:27:19.200 in order to think that there's something to question here.
00:27:23.500 And Rudy Giuliani has names, and he says he can show them to you.
00:27:27.800 So, in other words, he's offering, he's practically begging to show his sources.
00:27:33.000 When people lie, do they beg you to look at their original sources, the same stuff they looked at, to reach their opinion?
00:27:41.820 Well, not that often.
00:27:43.740 He wants us to look at the evidence, which suggests, and given his experience and background,
00:27:49.960 if I had to guess, you know, if I had to put money on it, sounds like he's got something.
00:27:58.640 So, we'll see.
00:27:59.320 In general, I have a proposition.
00:28:08.060 One way to tell the difference between Republicans and Democrats is risk management.
00:28:14.660 And I often see that Republicans seem, let's say, better skilled.
00:28:24.000 I think that's the way to say it.
00:28:26.400 Better skilled at managing risk.
00:28:28.560 Now, I've said, and I've said it often, that managing risk is not something you're born with, necessarily.
00:28:36.880 It's a learned skill.
00:28:39.040 So, if you had never learned to manage risk, you'd probably think you're good at it, but you wouldn't be.
00:28:45.900 Because it's like a lot of things.
00:28:47.260 You get better over time.
00:28:49.560 And you notice that a lot of Antifa people either don't have corporate jobs, to say the least,
00:28:55.780 or they're very young, people in their 20s.
00:28:59.360 What would you say is generally true of people who have never worked in the corporate world and are young?
00:29:08.200 They're not all young, but many of them are.
00:29:10.220 So, it's people who don't have corporate experience and they're young.
00:29:14.180 I would say the one thing that really characterizes that group,
00:29:17.440 there would be a number of things that they'd have in common, I suppose.
00:29:20.300 But one of them is that they had never learned risk management.
00:29:25.360 You know, how do we evaluate risks?
00:29:27.220 And you see that all the time.
00:29:29.380 There's no better example than this one, that Warren and Bernie Sanders want to radically change the economy
00:29:37.420 at a time when the economy is doing better than it has ever done.
00:29:41.360 If you were good at risk management, would you ever take something that's operating better than it has ever worked and change it completely?
00:29:51.760 You could argue with the word completely, but let's say a big change.
00:29:57.360 I don't think you would.
00:29:59.060 I don't think there's anybody who would make a big change to something that's working perfectly.
00:30:04.280 Now, the exception to that would be, let's say you're in business and you make a good product and it's better than your competition's product.
00:30:13.860 In that case, you know your competition is going to catch up soon.
00:30:18.520 So in that case, you actually would make a big change because you anticipate that your competition will match you.
00:30:24.360 So you'd better cannibalize your own product and make a big change.
00:30:29.760 It's like, oh, yeah, our old one was dominating the market, but we're still going to throw it away.
00:30:35.340 Because if we don't, the competition's going to catch us.
00:30:38.640 So they throw away their own product, even though it's the best one in the market, to try to make an even better one.
00:30:45.280 Now, in business, you have to do that because you can depend on competition pretty much 100% of the time.
00:30:51.300 But when you're talking about the American economy, well, we have sort of a general competition with China and other countries,
00:30:59.520 but we're beating all of them and probably will continue to because we have a superior system.
00:31:05.320 And I think that's a really big difference.
00:31:08.740 Now, take climate change.
00:31:10.320 The climate change response, if the progressives got everything they wanted, would be a gigantic change, again, to the economy,
00:31:19.800 and would be really expensive, and it would be a big risk in terms of how much money would be spent
00:31:26.220 and how it would change the way we live.
00:31:29.340 But, they say, there's a big risk of not doing it, so this makes sense.
00:31:34.300 So the risk management analysis is that, sure, it would be risky to disrupt the economy, make all these changes,
00:31:43.960 but it's an even bigger risk to let climate change do what it's doing, according to them.
00:31:49.540 Here's what's wrong with that.
00:31:51.400 It's very bad risk management.
00:31:53.540 In other words, it's sort of risk management 101.
00:31:58.120 It's how you manage risk if you don't understand risk.
00:32:01.300 The Republican approach is to say, well, we don't know exactly what the risk is, the models are not that accurate,
00:32:11.000 but there's probably some risk.
00:32:12.600 So let's do all the things that don't hurt us that would also be the right thing to do if it were a big risk.
00:32:19.420 Nuclear power, planting trees, just two examples.
00:32:23.580 President Trump is all in favor of planting trees, and he's all in favor of developing nuclear power.
00:32:30.960 At least, he doesn't say it enough, and I think that's a flaw.
00:32:34.160 But in terms of the Department of Energy and the government, they're doing a lot.
00:32:38.600 They're doing a lot.
00:32:39.980 So if you were to look at it not as a left or right situation, but rather as a risk management situation,
00:32:47.860 which one of those is better?
00:32:50.080 I would argue that unambiguously the Trump administration approach is better no matter what you think of climate change
00:32:58.600 because their risk management approach does all the things that would be good no matter what.
00:33:03.680 Well, we can always use more trees.
00:33:06.180 Trees aren't bad.
00:33:06.980 It's not much of a risk, especially since everyone in the world is doing it.
00:33:11.140 And it's not like we're the only ones planting trees.
00:33:13.620 So risk management, you look for that in all of the differences.
00:33:21.220 It's a risk management difference, and I think it's an experience difference as well.
00:33:25.720 All right.
00:33:27.720 Here's a theory that I have some appreciation of because of my experience as the author of the Dilbert comic,
00:33:37.640 and it goes like this.
00:33:38.560 When the Dilbert comic first started, it kind of reached its peak during the mid-'90s
00:33:46.400 when there was a lot of downsizing, and employees had very little power,
00:33:51.580 and their jobs were being sent overseas and all that.
00:33:56.440 And Dilbert was immensely popular, and people were always sending me suggestions
00:34:00.940 to mock their bosses and the bad management and how bad business is.
00:34:06.620 So it was sort of the heyday for being the Dilbert guy,
00:34:09.580 because people were giving me all this great material from their own unhappy experiences.
00:34:15.160 And then Bill Clinton came along, and the dot-com situation happened.
00:34:19.240 Now, when the dot-com thing happened, it looked like everybody was, not everybody,
00:34:24.640 but it looked like the only thing keeping you from getting rich was yourself.
00:34:30.100 Because so many people were doing so well in so many different ways during the dot-com boom
00:34:35.900 that if I asked you, hey, what's wrong with your career, and I did ask, here's exactly what happened:
00:34:44.540 people stopped sending me complaints.
00:34:47.880 And I thought to myself, come on.
00:34:50.080 Even if, let's say, 10% of the population is doing great because of this dot-com boom,
00:34:57.720 it can't be more than 10%,
00:34:59.640 the other 90% still have the same bad boss they had before and the same job.
00:35:06.080 If you were complaining before with the same job, the same boss, the same co-workers,
00:35:11.760 why did you just stop complaining?
00:35:14.100 I literally couldn't get people to send me complaints during the dot-com era.
00:35:18.380 To the point where I actually asked people to give me their phone number
00:35:21.940 so I could call them at work and ask them if they have any complaints
00:35:26.680 because I wasn't getting any solicitation.
00:35:30.460 And so some people volunteered, a lot of them actually,
00:35:33.600 and I would call them at work, and sometimes they would pick up, sometimes not.
00:35:37.780 And I'd say, hi, I'm Scott Adams, I'm the Dilbert guy.
00:35:41.120 Tell me what's bothering you about your job.
00:35:44.240 And they'd say, oh, yeah, I'm having a good day, nothing's bothering me.
00:35:46.540 I'd go, no, really, there's nobody who has a perfect job and a perfect day.
00:35:51.180 Just tell me what's really bugging you about your job.
00:35:55.120 And I couldn't pry it out of them.
00:35:57.800 I couldn't pry out of them a complaint about their boss or their company.
00:36:05.820 It was amazing.
00:36:06.760 And then after the dot-com bust, things got better for the Dilbert guy, me,
00:36:14.780 because people went back to complaining.
00:36:16.780 Here's how this is relevant to our political situation.
00:36:21.200 When the economy is bad, who do you blame?
00:36:26.040 Who do you blame when the economy is bad?
00:36:29.120 Well, you blame the government.
00:36:30.920 Even if it's not the government's fault, you're still going to blame them.
00:36:34.800 Well, you should have done more.
00:36:36.360 If you had done this, we wouldn't be so bad.
00:36:39.080 You also blame management.
00:36:41.540 You blame rich people.
00:36:43.340 You blame big corporations.
00:36:46.140 Right?
00:36:46.620 Yeah.
00:36:47.100 So you're blaming banks and businesses and corporations and stuff.
00:36:51.020 Who does that sound like?
00:36:52.440 Who did I just describe?
00:36:53.900 Did that sound like Republicans?
00:36:55.860 Or did that sound like Democrats?
00:36:58.520 Answer?
00:36:59.440 Democrats.
00:36:59.800 So Democrats, their message of blaming the government, the rich people, the system,
00:37:09.000 that's what Democrats do.
00:37:10.440 They blame the government, big business, banks, Wall Street, rich people.
00:37:15.340 That makes complete sense when the economy isn't working.
00:37:20.340 The Democrat message is perfectly suited for a bad economy.
00:37:24.900 What happens when the economy is really, really good?
00:37:29.800 People blame themselves.
00:37:34.760 Now, of course, these are generalities.
00:37:37.000 So most people never change their mind, no matter what the evidence is.
00:37:40.900 So most Democrats will just vote Democrat and feel the same way they always feel.
00:37:45.000 But for the small sliver of persuadables, they suddenly found themselves going from, yeah, the government's bad, big business is bad, to, do you have any complaints about your job?
00:37:57.180 Bob, you personally, just you personally, don't talk about other people, not employed, don't tell me that other people are being discriminated against.
00:38:05.040 I get that.
00:38:05.860 I'm accepting all that to be true.
00:38:07.560 Just you.
00:38:08.080 You personally, Bob, Bob, are you okay?
00:38:14.040 How are you doing?
00:38:15.460 Well, you probably saw there was a study that said 90% of people are happy in their personal lives.
00:38:20.460 Because there's nobody to blame.
00:38:21.980 If your economy is screaming, and you're not happy with your situation, whose fault is it?
00:38:31.620 Right?
00:38:32.740 People understand that if the system is giving them low unemployment, they can kind of change jobs.
00:38:40.720 It's pretty easy to change jobs in this economy.
00:38:44.980 So if you haven't changed jobs, if you haven't done what you need to do, if you haven't taken those courses to get that promotion, it's kind of on you.
00:38:54.740 So we have what I would call a Republican-biased situation, in which Republicans, by philosophy, say: you have a problem?
00:39:05.140 That's your problem.
00:39:06.660 Go solve your problem.
00:39:07.940 Well, sure, we'll try to do what we can to get the government out of your way.
00:39:12.900 But the government isn't there to solve your problem.
00:39:16.440 That's what you're for.
00:39:18.220 You're the one who solves your problem.
00:39:21.460 And that makes sense.
00:39:22.760 When the economy is good, people will accept that message.
00:39:25.040 So the point is, I don't know if anybody has mentioned this effect.
00:39:29.500 But you can't get a strong Democrat turnout for an election.
00:39:35.340 And I think it would be reflected mostly in turnout.
00:39:37.940 You're not going to get a strong turnout from people who blame themselves for the problem.
00:39:45.160 And that's the situation we're in.
00:39:46.900 So I suggest there will be a low voter turnout because people won't want to abandon their hatred for Trump.
00:39:53.760 They just might be busy that day.
00:39:56.640 You know what I mean?
00:39:58.100 They're not going to say, I didn't vote because even though I don't like Trump, I have to admit things are going pretty well.
00:40:04.520 It's not going to be that.
00:40:06.480 It's going to be, well, I could vote or, I don't know, I was invited to this thing.
00:40:13.100 I can't say no to the thing.
00:40:15.140 I'm going to have to go to the thing instead of vote today.
00:40:17.440 So I think you're going to see low voter turnout, and mostly because of that.
00:40:20.900 All right.
00:40:22.840 You've probably seen by now the clip of Buttigieg, maybe you saw it live, when he gave this long, nonsensical word salad,
00:40:31.500 which sounded like a corporate consultant's answer to a question during the debate.
00:40:35.340 And I watched that live, and Buttigieg starts talking, and it's all just concepts and words put into sentences.
00:40:46.800 And the sentence sort of made sense, but in some cases not.
00:40:51.080 And I kept waiting for that to turn into a crisp point.
00:40:56.040 You know, I would do this, or here's the problem, here's the solution.
00:40:59.560 I thought it was going to turn into that, and then he just ran out of time and stopped.
00:41:06.200 And it was almost a full minute of talking with no content.
00:41:11.660 And I thought to myself, where have I heard that before?
00:41:16.060 Where have I been in my life, in my history, where I've heard somebody talk for a full minute without any content?
00:41:24.900 And I thought, oh, yeah.
00:41:27.640 I was sitting in a meeting at my old employer, Pacific Bell,
00:41:31.280 and a high-end, very expensive consultant from McKinsey was telling us how they were going to make everything better.
00:41:40.280 It's consultant talk.
00:41:42.640 It's corporate jargon, except taken into the political world.
00:41:46.980 It was pure, empty calories.
00:41:49.760 And I thought to myself, I've never seen somebody create a kill shot for themselves that was so effective.
00:41:57.640 If he became the candidate and ran against Trump, all Trump would have to do is run that clip
00:42:04.160 and show a picture of an empty suit.
00:42:07.140 Am I right?
00:42:11.360 Just show the clip, and then do a split screen, and it's like an empty suit.
00:42:16.020 It's standing there, and there's nobody in it.
00:42:18.240 Empty suit.
00:42:19.480 It's over.
00:42:20.180 So, it's a big weakness he's got there.
00:42:24.200 All right.
00:42:30.520 So, here's somebody who responded to my comments about Buttigieg.
00:42:36.020 Or actually, I think I retweeted somebody else's tweet.
00:42:39.960 I think it was Kathy's tweet.
00:42:45.380 And so, this is what this individual said.
00:42:48.460 Now, remember I've told you that you can identify people who are artists without looking at their profile on Twitter?
00:42:56.220 You can identify them by the way they address topics.
00:43:01.900 All right.
00:43:02.280 So, I'm going to give you this example.
00:43:03.840 So, on the tweet where people were making fun of that Buttigieg statement during the debate, the one that was all word salad,
00:43:13.040 this is what this person, presumably a supporter, or at least a Democrat, says.
00:43:16.960 Quote,
00:43:17.640 word salad, question mark, question mark.
00:43:21.380 It's called being smart and informed.
00:43:24.240 Trump can't even spell, for crying out loud.
00:43:27.440 Do you know who was notoriously against intellectuals?
00:43:30.800 Yeah, the Nazis.
00:43:33.100 Why are you so threatened by intelligence?
00:43:35.980 Someone is not elite, because they read.
00:43:40.100 All right, now, as I mentioned, my book, Loserthink, talks about all the bad ways of thinking.
00:43:48.800 They're almost all in this one tweet.
00:43:51.380 Let me call them out.
00:43:52.380 So, he's saying it's not word salad.
00:43:56.520 It's called being smart and informed.
00:43:59.160 What is that?
00:44:00.860 That's what I call word thinking.
00:44:04.100 He didn't give a reason.
00:44:06.100 He just relabeled it.
00:44:08.480 That's it.
00:44:08.820 He just relabeled it.
00:44:10.580 Relabeling things is not thinking.
00:44:12.860 You just put different words on the same thing.
00:44:14.960 We're looking at exactly the same thing.
00:44:17.100 There's no question about the facts.
00:44:19.400 Just labeling it a different label didn't get you anything.
00:44:22.220 So, word thinking.
00:44:24.520 Then he says Trump can't even spell.
00:44:27.480 What's that got to do with Buttigieg?
00:44:30.140 What does spelling have to do with what Buttigieg just did?
00:44:37.080 It's a ridiculous comparison.
00:44:39.400 So, that's the next thing that artists do.
00:44:41.000 They don't know how to compare things effectively.
00:44:43.540 And it's not a good comparison because spelling things wrong is so common that everyone here says to themselves,
00:44:52.600 Yeah, I spelled a tweet wrong once.
00:44:54.720 I went to college and I spelled a tweet wrong.
00:44:58.160 How many times have I spelled a tweet wrong or spelled something wrong in a blog post or mispronounced something?
00:45:05.640 Fairly often.
00:45:07.740 I've got a college degree.
00:45:10.200 I've got a master's degree.
00:45:12.180 I do this for a living.
00:45:13.220 I'm a professional writer.
00:45:15.560 And I still spell stuff wrong all the time.
00:45:17.680 It means nothing.
00:45:19.780 So, it was a weird comparison.
00:45:22.300 Then he does the Hitler thing.
00:45:24.160 He goes, Do you know who was notoriously against intellectuals?
00:45:27.740 Yeah, the Nazis.
00:45:29.720 So, that's what I call analogy thinking.
00:45:31.960 Where you imagine that because something reminds you of something,
00:45:36.220 that there's something that you're learning because of that.
00:45:40.060 You're not learning anything.
00:45:41.060 You just got reminded of something.
00:45:43.160 That's it.
00:45:43.700 That's the entire end of the story.
00:45:46.360 I was reminded of something.
00:45:48.000 You don't take that and then say,
00:45:50.360 Therefore, I predict Trump will become Hitler.
00:45:53.940 So, that's the analogy thinking.
00:45:56.580 And then he says, Why are you so threatened by intelligence?
00:46:01.720 Who was threatened by it?
00:46:04.480 Was there somebody in this story who was threatened by intelligence?
00:46:09.760 That's just mind reading.
00:46:11.680 He's mind reading and getting the wrong answer.
00:46:14.600 So, he's got word thinking, bad comparisons, analogy thinking, and mind reading.
00:46:19.460 And then he ends with, Someone is not elite because they read.
00:46:23.660 And, oddly enough, he had a typo.
00:46:27.480 He spelled a word wrong.
00:46:29.360 In his tweet, in his tweet, in which he was mocking the president for a spelling error,
00:46:36.000 he has a typo.
00:46:36.780 In this case, it's a typo, not a spelling error.
00:46:39.260 But I'm thinking, Well, you know, don't throw stones.
00:46:42.620 Anyway, he says, Someone is not elite because they read.
00:46:46.460 And I'm thinking, Who is arguing anything like that?
00:46:50.280 Who does he imagine he is countering?
00:46:55.040 He basically creates a straw man argument and then argues against it.
00:46:59.240 So, he's got word thinking, bad comparison, analogy thinking, mind reading, and a straw man.
00:47:05.020 All in one tweet.
00:47:07.860 And when I saw that, I said to myself,
00:47:10.720 Artist?
00:47:12.860 Click on profile.
00:47:15.240 Professional writer.
00:47:17.240 For television.
00:47:20.060 Was I surprised?
00:47:22.120 No, I was not.
00:47:24.320 Now, since the moment I pointed this out,
00:47:27.880 and many of you have seen me point this out for a while,
00:47:30.700 that you can identify an artist by their comments
00:47:33.880 because they don't know
00:47:36.100 how to, let's say,
00:47:40.320 understand the world in rational ways.
00:47:42.780 It's really obvious.
00:47:44.600 All of the worst comments on Twitter are kind of from people who are professional artists or want to be.
00:47:53.660 It's not a coincidence.
00:47:55.080 All right.
00:47:57.680 Let me tell you the secret to success.
00:48:04.740 All right.
00:48:05.380 You ready?
00:48:05.940 If you stayed to the end, you get this little nugget.
00:48:09.100 One of the greatest secrets to success I learned from a salt salesman.
00:48:17.440 He was my neighbor.
00:48:19.020 And he'd gotten rich after being born into poverty.
00:48:22.380 He was actually born in a shack in the South that didn't have running water.
00:48:27.880 All right.
00:48:27.980 So this is a guy who was born into extreme poverty,
00:48:32.080 lied about his age to get into the Navy.
00:48:34.680 I think he was 16 because he just needed some way out of his extreme poverty.
00:48:39.840 But when I met him, he was living in a mansion,
00:48:42.940 a small mansion, but, you know, he was a rich guy, lived in my neighborhood.
00:48:47.280 And I once asked him at a party what his path to success was.
00:48:53.140 And he told me that he started out after the Navy, he became a salt salesman.
00:49:00.040 He would sell salt to grocery stores.
00:49:03.580 So he would go in and say, you should carry my brand of salt instead of the other one.
00:49:07.360 And I laughed and I said, how in the world can you sell salt?
00:49:12.860 It's just price.
00:49:14.940 Salt is salt.
00:49:16.240 Am I wrong?
00:49:17.760 You know, how could anybody say my salt is better than your salt?
00:49:21.160 And I said, how in the world did you sell it?
00:49:24.360 And he said, well, and he told me this story.
00:49:28.000 He said, well, yeah, iodine and blah, blah, blah.
00:49:30.840 But the point is, from the consumer's point of view, it's a pretty generic thing.
00:49:37.740 So he told me this story.
00:49:38.920 He said there was a local grocery store guy who, when he called upon him,
00:49:42.740 the guy said that he was going to be,
00:49:44.200 he learned that the guy was going to be reorganizing his store over the weekend.
00:49:48.480 So the guy was going to have to come in and, you know, work all night or weekend or something,
00:49:52.800 reorganizing his shelves for some purpose.
00:49:55.800 So the salt salesman shows up unannounced to help.
00:50:01.200 He just shows up and says, well, you were going to reorganize your shelves,
00:50:05.060 so I'm going to show up and help.
00:50:07.360 So he works with him.
00:50:09.400 And when he was done, that store owner bought this guy's salt forever.
00:50:13.680 All right, now, what is the lesson from this?
00:50:17.560 The lesson is that people who receive things, if they're smart,
00:50:22.760 they created that situation by giving something and asking nothing in return.
00:50:27.740 So he gave something to somebody, and that was his system.
00:50:32.940 He had a system, not a goal.
00:50:35.040 His system was, I'm going to be nice to people.
00:50:37.260 I'm going to help them.
00:50:38.220 I'm not going to ask for anything.
00:50:39.380 And when they decide who they want to buy salt from, I'm going to get my share.
00:50:45.140 And he became the best salt salesman in his company.
00:50:49.140 And that created a bunch of money because, you know, he got paid on commission.
00:50:52.880 And he parlayed that into a number of other businesses and became a famous entrepreneur.
00:50:58.620 Here's the reason I bring this up.
00:51:00.760 Somebody else is using this technique.
00:51:03.980 Somebody who's watching this Periscope right now.
00:51:06.600 You probably already know him.
00:51:09.540 His name is Dr. Funk Juice.
00:51:12.240 Dr. Funk Juice started tweeting about my Periscopes, the one you're watching right now.
00:51:20.200 So I'm not sure how long ago, several weeks ago.
00:51:23.040 And he made a little image of a coffee cup, I guess it was a GIF,
00:51:27.980 and announced that my Periscope comes up at, you know, the same time, 10 Eastern, 7 Pacific,
00:51:35.900 and told people to watch it.
00:51:38.100 Now, the first time I saw it, I thought, oh, great.
00:51:40.920 A fan.
00:51:41.480 He's such a fan.
00:51:42.240 He made a GIF and was promoting my thing.
00:51:45.740 And I don't know if I tweeted it or I liked it or whatever, but I enjoyed seeing it.
00:51:49.740 Then the next day, he did it again and again and again.
00:51:57.780 Now, I had been thinking to myself, you know, I should do that myself.
00:52:01.480 But it was just one extra thing, and I didn't feel like doing it.
00:52:05.120 But it was convenient for me because every time Dr. Funk Juice would do one of those tweets,
00:52:11.040 it would come in at exactly the right time, like half an hour or so before the actual Periscope,
00:52:15.560 perfect timing, and I would see it in my Twitter feed, because I'm always on Twitter right before
00:52:19.800 I come on here, and I'd say, oh, I don't have to do a tweet.
00:52:23.880 I'll just retweet Dr. Funk Juice.
00:52:27.260 And day after day after day, a person I've never met, who owes me nothing, gave me something.
00:52:36.300 It was just a gift.
00:52:38.060 Now, you know, he apparently likes the content that comes out of this.
00:52:41.900 Now, normally, I don't like to mention ethnicity, but I think it matters in this case.
00:52:51.380 So Dr. Funk Juice, based on his profile, is an African-American man and is a DJ.
00:52:59.200 And he has found the secret to success.
00:53:04.320 Because who am I talking about right now?
00:53:06.920 Dr. Funk Juice.
00:53:07.920 And I'm going to tell you that you should follow him at DJ underscore DR underscore Funkjuice.
00:53:17.760 One word, Funkjuice.
00:53:20.200 And I've got to say that if I were to bet on somebody, I would bet on this guy.
00:53:26.840 Because he's figured out the salt salesman trick.
00:53:31.100 He simply gave me something for free and got my attention.
00:53:36.220 And then he gave me something else for free and got my attention.
00:53:39.960 And now I'm giving him a commercial.
00:53:42.420 So presumably, he will gain some followers, you know, get something out of it.
00:53:47.080 But I don't think he had that specifically in mind.
00:53:50.620 I'm guessing that he was running it like a system.
00:53:53.520 I can't read his mind.
00:53:54.420 But I'm guessing he didn't have a specific outcome in mind.
00:53:58.020 He just knew what the salt salesman knew.
00:54:02.080 That if I do this good thing and ask nothing in return, something good might happen.
00:54:07.320 But I don't think he necessarily did it for that reason.
00:54:10.420 I think he just understands the world at a deeper level.
00:54:14.380 And the reason I mention ethnicity is this is the following point.
00:54:17.980 I've often thought that one of the biggest forms of, I would call it almost, what's the word, industrial racism, or sort of built-in racism,
00:54:36.160 is that if you're born a typical white kid and you've got a successful white family, you're getting all this advice.
00:54:42.960 Even if you don't want it.
00:54:44.280 You know, just being born in a family where you've got entrepreneurs and people who are, you know, going to college and stuff.
00:54:51.020 You're just going to sort of pick up advice.
00:54:53.600 And that's got to be a tremendous advantage over somebody who has some different situation and they don't get the benefit of that, you know, learning things by osmosis.
00:55:03.120 Just being around it.
00:55:04.920 And I've often thought that there should be some kind of a class or a lesson.
00:55:10.480 It doesn't have to be for, you know, African-Americans specifically, but as a group, they may have less access to people who have already made it.
00:55:20.680 And that's part of the sort of intrinsic bias of our society: some people just don't have access to mentors.
00:55:27.400 So Dr. Funk Juice, either by being smart or possibly because he had some good experiences with family members who were also successful, is on to this secret.
00:55:39.120 Very, very powerful secret.
00:55:42.360 Now, can he share it?
00:55:44.280 You know, I would imagine that if he has kids, I don't know if he has, or he has them someday, they're going to have the benefit of his experience.
00:55:52.400 And by osmosis, they'll learn what he did and maybe pick up some tricks.
00:55:57.900 How could you expand that to, let's say, inner cities, people who really needed it?
00:56:02.940 When I wrote my book, How to Fail at Almost Everything and Still Win Big, I was thinking in those terms.
00:56:09.320 But I don't think that book necessarily cracks every community.
00:56:13.740 So anyway, big callout to Dr. Funk Juice for being so smart about success in particular.
00:56:23.360 So thanks for that, and thanks for the help.
00:56:25.340 That's all I have for today, and I will talk to you all tomorrow.