Real Coffee with Scott Adams - July 03, 2023


Episode 2158 Scott Adams: Let's Chat About Headlines While You Sip Your Coffee & Pet Your Dog Or Cat


Episode Stats

Length: 46 minutes
Words per Minute: 147.4
Word Count: 6,899
Sentence Count: 572
Misogynist Sentences: 2
Hate Speech Sentences: 23


Summary

The mother of Hunter Biden's child is described as a stripper. Elon Musk and Zuckerberg might stage a cage fight that could raise a billion dollars for charity. Joe Biden reportedly won't recognize his own grandchild, the daughter of Hunter and a woman who may or may not have been a stripper.


Transcript

00:00:00.000 Do-do-do-do-do-do-do.
00:00:03.020 Good morning, everybody, and welcome to the highlight of human civilization,
00:00:08.120 possibly robot civilization, but it's too early to know.
00:00:12.360 Today will be a barn burner.
00:00:16.840 There will be a whiteboard.
00:00:19.940 A whiteboard, yes.
00:00:21.940 And single-sided, but still thrilling, I think you would agree.
00:00:25.180 Now, would you like to take your experience up to levels that nobody's ever seen in... ever?
00:00:33.620 Yes, you would, and all you need for that is a cup or mug or a glass, a tank or chalice or stein,
00:00:38.120 a canteen jug or flask, a vessel of any kind, fill it with your favorite liquid, like coffee.
00:00:45.800 Join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:48.720 the thing that makes everything better. It's called the simultaneous sip.
00:00:52.440 It happens now. Go.
00:00:55.180 Well, I think you all, all you Americans know what tomorrow is.
00:01:07.160 That's right.
00:01:08.740 Tomorrow would be the anniversary of the death of John Adams, President John Adams,
00:01:16.460 and Thomas Jefferson.
00:01:19.120 Little-known fact, Thomas Jefferson and John Adams
00:01:23.800 had been nemeses, but then they became friends in later life,
00:01:30.260 and allegedly, when, was it when Jefferson was dying, he said Adams is still alive,
00:01:36.280 or was it the other way around?
00:01:37.620 Which way did it go?
00:01:39.000 When Adams died, he said Jefferson's still alive?
00:01:43.640 It was Adams who said, oh, Jefferson still lives.
00:01:46.520 And they died the same day.
00:01:49.320 Yeah, amazing.
00:01:51.140 Well, anyway, let's talk about stuff.
00:01:55.980 Do you want to see if I can make a billion dollars sound like a small amount of money?
00:02:03.200 Watch this.
00:02:04.520 Here's the power of contrast.
00:02:06.360 Normally, if I said to you, hey, I'd like to give you a billion dollars,
00:02:12.640 wouldn't you think that was a lot of money?
00:02:14.920 You would, right?
00:02:16.200 Because you'd be contrasting that to what you already have.
00:02:19.920 I'm going to make a billion dollars sound like not a lot of money now.
00:02:23.820 You ready for this?
00:02:24.500 It has nothing to do with Ukraine.
00:02:26.020 There's some estimates that the cage match between Zuckerberg and Elon Musk, if it happened,
00:02:36.400 could charge $100 per person for pay-per-view, and it would bring in over a billion dollars.
00:02:45.200 Just one billion.
00:02:48.300 All right, between the two of them, they're like worth $300 billion.
00:02:50.960 Has there ever been a situation where somebody who was worth a combined $300 billion fought for $1 billion that goes to charity?
00:03:04.360 It makes a billion dollars sound like a small amount.
00:03:07.940 Wait, are you telling me that Elon Musk is going to do a cage fight for only a billion dollars,
00:03:15.060 and it doesn't even go to him?
00:03:17.860 All right, well, I just thought it was interesting.
00:03:20.960 So, you know, I always tell you that the summer stories in politics are really stretching it to make a story out of nothing.
00:03:30.980 In order for a good summer story, it has to be a story where there's no new facts,
00:03:37.120 because the reporters are on vacation.
00:03:39.780 So you want a story you could talk about endlessly, but not have any facts.
00:03:44.540 So both sides are going to do it, right?
00:03:46.300 You know, Democrats are going to talk about Trump's documents that they don't know what they are.
00:03:52.660 We still don't know what they are, but if.
00:03:55.760 If it was about this, it's bad.
00:03:57.700 You know, the if.
00:03:59.040 So the Republicans have their own version of that,
00:04:02.660 and so their version of the summer story is that Joe Biden won't recognize his own granddaughter,
00:04:09.100 or the child of the stripper, which is a real messed up thing to say.
00:04:16.840 I hate that that came out of my mouth.
00:04:19.840 Like, I hate that I accepted the frame.
00:04:24.420 Let me, can I chastise myself for a moment?
00:04:27.820 I just called somebody's mother a stripper.
00:04:31.600 How fucked up is that?
00:04:32.960 I mean, that might accurately describe her profession at some point.
00:04:39.300 I doubt she's doing it now, but I don't know.
00:04:42.060 So, yeah, that's pretty messed up.
00:04:44.700 That's pretty messed up.
00:04:45.460 I don't like when the news does it.
00:04:47.440 It's descriptive.
00:04:49.120 It's accurate.
00:04:50.720 It's just fucked up.
00:04:52.000 It's somebody's mother.
00:04:53.200 I mean, if you're talking about her in the context of being somebody's mother,
00:04:57.200 I don't know, it's just wrong.
00:04:58.840 Anyway, so I fell into the same trap, so I apologize.
00:05:02.960 But let me finish the story.
00:05:07.120 The story is that the Bidens, Joe Biden and Jill,
00:05:12.680 don't recognize the, can we say illegitimate,
00:05:17.580 the unmarried daughter of, you know, one night of fun
00:05:21.320 between Hunter and a woman who may have had stripping in her past.
00:05:26.320 So, and the story is that the staffers at the White House have all been informed to say
00:05:36.300 that Joe Biden has six grandchildren and she would have been the seventh.
00:05:40.140 So the story is that he's ignoring the grandchild.
00:05:44.240 How could he not want to hug her and include her and all that stuff?
00:05:47.720 Now, here's my question to you.
00:05:52.140 Knowing in advance that all stories about public figures are fake,
00:05:56.900 all of them, they're all fake.
00:06:00.020 I know you don't believe that.
00:06:02.120 But when I say fake, I don't mean that the details that are reported are untrue.
00:06:06.540 I mean that they're often reported with a frame or a framing or some context left out.
00:06:12.100 There's always something important left out.
00:06:14.040 Don't you think there's something important left out of this story?
00:06:17.860 I don't know what it is.
00:06:19.620 But there's something important being left out.
00:06:23.660 Now, it could be, if you had to speculate,
00:06:27.340 maybe there's something about the mom that if they mix it, it just becomes terrible.
00:06:34.700 But that doesn't really explain why you deny the existence of the child.
00:06:39.780 You could simply not invite them to official things.
00:06:42.720 So, wouldn't you say there's something terribly missing in the story?
00:06:48.720 Because as written, it doesn't make any sense.
00:06:51.680 If you tell me that Jill Biden doesn't want to recognize her own granddaughter,
00:06:58.660 I don't think there's any chance of that.
00:07:01.420 I'm sure she does.
00:07:03.240 So, there's something going on.
00:07:04.660 I'm just going to leave it here.
00:07:06.600 I'm going to leave it with there's some fact about this that's just not true
00:07:10.160 or not being reported.
00:07:13.060 And it could be that, you know, there's a reason that the Bidens don't want to tell us about it.
00:07:17.680 It might be something that's private or sensitive or creepy.
00:07:23.420 There could be 50 reasons that they don't want to talk about it.
00:07:26.820 So, that makes them vulnerable.
00:07:29.940 So, anybody else can frame it the way they want because they're not talking about it.
00:07:34.360 Anyway, it's just a summer story and I don't believe it's true.
00:07:37.980 And that's the nature of all summer stories.
00:07:40.960 They're not true.
00:07:42.420 Or at least they're framed in a weird way.
00:07:44.220 All right, ending affirmative action, according to polls, was pretty popular.
00:07:52.040 So, here's the breakdown, according to one poll.
00:07:55.320 75% of Republicans are in favor of it.
00:07:59.460 Doesn't that seem low?
00:08:01.400 That only 75% of Republicans are in favor of ending affirmative action?
00:08:07.020 Only 75%.
00:08:08.500 How could you actually be a Republican and be in favor of it?
00:08:14.220 It's almost like a, it's like a definitional problem of some kind.
00:08:18.620 Anyway, but 75% of Republicans, independents are at 58% support.
00:08:23.560 And even a solid 26% of Democrats approve of it.
00:08:32.040 So, that's a winner.
00:08:33.760 Now, let me ask you this.
00:08:35.000 I'm no expert on Supreme Courts and Supreme Court history and stuff.
00:08:39.320 But can you answer this question for me?
00:08:41.040 Would the affirmative action ruling have happened the way it did, overturning it?
00:08:48.180 If Trump had never been president?
00:08:50.620 If Trump had lost in 2016?
00:08:55.280 Are you sure it's no?
00:08:58.860 Because if he lost, the conservative judges would have been liberal judges and they would have kept it.
00:09:05.040 So, why is nobody calling this a Trump win?
00:09:11.960 Why is the news not framing this as one of the most popular things Trump ever did?
00:09:19.900 This is, oh, Jack Posobiec.
00:09:22.740 So, doesn't that seem like an obvious oversight by the news?
00:09:29.720 I remind you of my best prediction.
00:09:33.880 My best predictions are the ones that are most contrarian.
00:09:38.340 So, this is a super, well, I think it was contrarian.
00:09:40.880 Actually, maybe it's not.
00:09:41.800 Maybe this is not contrarian because you might agree with me.
00:09:44.200 When Trump lost the election in 2020, I predicted that Trump would be seen as a better president the longer time goes by.
00:09:56.720 That the longer you waited, the better he would look.
00:10:00.640 Right?
00:10:01.520 Now, look at the things that, now that we've waited.
00:10:04.460 You know that I had an experience with affirmative action.
00:10:12.980 It took out at least four of my career attempts.
00:10:16.680 You know, four in a row, I think.
00:10:19.680 And I was very happy that Trump did it.
00:10:25.700 I mean, you're confirming what I thought, that Trump was the agent of that change.
00:10:29.780 That's huge.
00:10:30.520 In my mind and in my life, it's really big.
00:10:35.540 It's a really, really big deal.
00:10:37.440 And it's so big that unless Trump had started a war, I would be happy if he only did that.
00:10:45.340 If that's the only thing he did, I'd say he's one of our best presidents.
00:10:52.280 So, there's that.
00:10:54.520 But what else did Trump get right?
00:10:56.760 Well, let's look at how correct Trump looks right now.
00:11:02.180 And I'm going to give you one word.
00:11:05.140 And then you tell me if Trump looks smart and prescient.
00:11:08.880 All right.
00:11:09.080 One word.
00:11:10.840 France.
00:11:14.300 All right.
00:11:15.420 Am I right?
00:11:17.020 Trump warned us.
00:11:18.540 He warned us about France.
00:11:20.300 He tried to stop it here.
00:11:22.740 And he may have.
00:11:23.920 He may have actually stopped it.
00:11:25.500 Because I don't actually know what the current situation is with immigration.
00:11:31.440 But we must be doing something right in this country, even under Biden.
00:11:37.040 What happened to Islamic terrorism?
00:11:40.540 What happened to it?
00:11:43.000 Where did it go?
00:11:44.980 How in the world did it just stop in America, in the homeland?
00:11:48.480 Now, my speculation is that this, the, what do you call it, the, what do you call it, when the security state is spying on everybody?
00:12:01.300 The, what do you call it, state?
00:12:03.860 What's the word I'm looking for?
00:12:06.080 It's like, not the investigation, the, you know what it is.
00:12:09.740 All right.
00:12:09.940 So the fact that everybody is being analyzed is surveillance.
00:12:13.500 The surveillance state, I figure, is the reason we're not getting a huge terrorism problem.
00:12:20.420 I think, I think that the government actually is sort of listening to everything.
00:12:24.780 And probably using AI.
00:12:26.700 So it's not people listening to us.
00:12:28.540 But I'm sure AI is listening to all of our traffic.
00:12:32.220 Are you?
00:12:33.640 Wouldn't you assume that to be true by now?
00:12:36.160 That AI is listening to all of the traffic everywhere?
00:12:39.460 It's just picking out the keywords and looking for terrorists.
00:12:42.280 But it must be good at it.
00:12:44.100 It must be good at it.
00:12:45.840 And I would imagine there's a reason it's good at it.
00:12:49.360 Because AI is built on pattern recognition.
00:12:54.060 Specifically language pattern recognition.
00:12:56.180 But you don't think that language pattern recognition could identify a terrorist pretty quickly?
00:13:03.060 I'll bet that's one of the easiest things it can do.
00:13:05.620 There's no way that a terrorist talks like everybody else.
00:13:09.840 They would almost certainly have a pattern to the way they approach things.
00:13:15.920 So yeah.
00:13:17.180 And, you know, did, was Trump behind that?
00:13:20.400 Well, that would be a top, top, top, top, top, top secret thing.
00:13:24.460 But I assume so.
00:13:26.180 I assume so, when he was in office.
00:13:28.520 And I would assume that Biden's behind it, too.
00:13:31.240 And whatever they're doing, that's working.
00:13:34.820 Because we're not France.
00:13:36.760 So, I don't know.
00:13:38.680 So I assume that France's, you know, wide open immigration is what causes them all the problems.
00:13:44.400 All right.
00:13:44.580 I'm going to give you one more word.
00:13:46.100 And you tell me if Trump gets the win for a problem that you wouldn't have caused.
00:13:54.980 Ukraine.
00:13:57.020 Ukraine.
00:13:59.060 Right?
00:13:59.920 Now, we don't know for sure.
00:14:01.520 We don't know.
00:14:02.900 But there's a reasonable argument he could have stopped it and it wouldn't have happened.
00:14:07.940 There's a reasonable argument.
00:14:09.280 I'm not saying you could know that for sure.
00:14:11.040 How about, this one's weird.
00:14:17.000 Is it my imagination or has the hysteria about climate change seemed to be tamped down?
00:14:23.600 It seems to be dampened lately because the events in the world are not supporting the data.
00:14:31.040 Or the data is not supporting the hypothesis as smoothly as maybe it did in the past.
00:14:36.880 Now, that doesn't mean it's not true.
00:14:39.400 I'm not telling you the data is right or wrong.
00:14:42.040 I'm just saying that the zeitgeist or the way we're feeling about it is that I feel like,
00:14:48.600 this is just my sense of where the society is at, that we're just not as worried about
00:14:55.100 climate change as we used to be for maybe a variety of reasons.
00:14:59.700 Maybe it's big improvements in energy production.
00:15:03.540 Might be that.
00:15:04.680 Might be that we're in a period where it doesn't seem to be getting hotter at the moment.
00:15:11.540 And, of course, we're all driven by anecdote.
00:15:13.260 So, I mean, you should know that we could go 20 years without any warming and global warming
00:15:20.880 could still be totally valid.
00:15:23.160 You understand that, right?
00:15:24.920 That over any 20-year period, it might not be getting any warmer.
00:15:30.200 But after that period, maybe it gets extra warm because there was some other,
00:15:34.220 might have been some counterbalancing thing that was going on for 20 years.
00:15:37.940 But as soon as the counterbalance is gone, then all the CO2 kicks in.
00:15:43.560 So it doesn't mean much that there's a 20-year warming or a 20-year cooling, either one,
00:15:50.360 because that's too short a period.
00:15:52.700 However, common sense tells you that if you add 20 years of increasing CO2
00:16:00.660 and the temperature didn't go up, I'm not saying that happened.
00:16:05.600 I'm just saying that as we observe it, we're anecdotally driven, right?
00:16:11.000 Humans are anecdotally driven.
00:16:12.240 If you see one example of something, hey, it's snowed today.
00:16:15.780 I guess there can't be any global warming or climate change.
00:16:19.360 So it seems to me that the anecdotes have become unfriendly to the climate change hysteria.
00:16:27.540 Is that true or not?
00:16:28.640 Are you feeling that the anecdotes and the data are not supporting it like they used to?
00:16:33.740 A little more sketchy scientific situation?
00:16:39.660 Now, that would work toward Trump's favor, right?
00:16:43.120 If we were in a serious frenzy about climate change, there's no way Trump could win.
00:16:51.140 No way.
00:16:51.680 But in a world where everything that the scientists and experts have told us seems to be wrong,
00:17:00.020 how does climate change look now?
00:17:02.000 If we had never had the pandemic, wouldn't climate change look more secure as a theory?
00:17:09.800 It would.
00:17:10.520 Because we'd be believing all those experts, wouldn't we?
00:17:13.480 Well, the experts said so.
00:17:15.620 Experts said so.
00:17:17.260 I guess the experts say so.
00:17:18.820 So what are we going to do?
00:17:20.620 We're just unwashed citizens.
00:17:23.560 Those experts.
00:17:25.400 But now that we basically don't trust experts, the entire climate change argument lives on one thing.
00:17:34.840 One thing.
00:17:36.880 Experts.
00:17:37.980 That's it.
00:17:38.560 Because you and I can't penetrate climate change, right?
00:17:43.580 Did you do your own analysis?
00:17:46.360 Do you have your own climate change model?
00:17:49.900 No.
00:17:50.260 We have no access to it.
00:17:52.060 We only have access to the experts telling us stuff.
00:17:55.020 So in a world in which all experts have been discredited, 2023, doesn't Trump look better than he's ever looked on the topic of climate change?
00:18:07.680 And not because he's right.
00:18:10.220 Right?
00:18:10.500 I'm not saying that.
00:18:12.080 I'm just saying that where the energy is moving, it's moving in his direction.
00:18:18.280 And there's no way around that.
00:18:19.780 It's moving in his direction.
00:18:21.400 How about this one?
00:18:22.320 I think I gave you this statistic the other day, which I don't remember.
00:18:28.560 What percentage of the general public thinks the 2024 election has a good chance of being rigged?
00:18:36.540 It turns out it's pretty high whether you're a Democrat or a Republican or an Independent.
00:18:42.040 It's pretty high.
00:18:43.700 Right?
00:18:44.160 Now, does that hurt Trump or help him?
00:18:47.580 It helps him.
00:18:49.300 Right?
00:18:49.460 Because Trump is the experts have lied to you guy.
00:18:54.460 That's who he is.
00:18:55.300 He's the experts are lying to you.
00:18:57.440 Let's figure out what to do about it.
00:18:59.560 So in a world in which all of us are becoming convinced that the experts are flawed, way more than we understood,
00:19:08.060 the guy who's been saying it for years is looking pretty smart.
00:19:11.020 So if you put that all together, France, affirmative action, Ukraine war, climate change, and then trust in the elections.
00:19:26.440 And then you throw in energy policy, where Trump dominates.
00:19:34.160 You throw in Biden's embarrassment.
00:19:37.820 Do you remember when we used to say that we'd be embarrassed if Trump went overseas and represented the country?
00:19:44.040 Doesn't that seem funny now?
00:19:46.940 We would be embarrassed, we were told.
00:19:50.160 Embarrassed.
00:19:50.780 As citizens, we'd be embarrassed if Trump was the person who represented us overseas.
00:19:57.540 How's that feel now?
00:19:59.760 It's 2023.
00:20:00.540 2023, we're sending Joe Biden's decomposed bag of bones to represent us.
00:20:07.540 You feeling good about that?
00:20:09.220 Oh, yeah.
00:20:10.600 America's back.
00:20:11.960 We're back.
00:20:13.120 We got Joe Biden over there tearing it up overseas.
00:20:16.900 Man, do they respect that guy.
00:20:19.080 Oh, they respect him so much.
00:20:20.700 Not like Trump.
00:20:21.560 Oh, no, no.
00:20:22.840 No, they were just laughing at Trump behind his back.
00:20:24.900 I'm sure they've never laughed about Joe Biden behind his back.
00:20:29.780 No?
00:20:30.540 Do you think Joe Biden has ever done anything that would cause an overseas leader to, like,
00:20:37.580 chuckle a little bit when they're drinking their tea with their pinky out?
00:20:41.800 Do you think?
00:20:42.300 Oh, just ever?
00:20:44.160 I think so.
00:20:46.100 So you could pretty much go right down the line of everything that Trump was pilloried for.
00:20:54.560 Is that a word I've never used before?
00:20:57.660 I've never used that word.
00:20:59.980 Pilloried?
00:21:01.380 I've read it.
00:21:02.360 I think it fits.
00:21:04.100 Does that fit here?
00:21:05.940 He was pilloried, right?
00:21:07.720 Like, put in a pillar?
00:21:09.560 Pillory?
00:21:10.480 I don't know where that comes from.
00:21:12.460 But he was pilloried.
00:21:13.560 Let's say he was pilloried.
00:21:14.500 A very good word.
00:21:15.120 If you're at home and you're listening to this, go with your impulse, because right now you
00:21:20.800 desperately want to say pilloried out loud.
00:21:23.260 If there's nobody around but your pet, which is the way you should consume this, you can
00:21:28.740 consume it with a pet while jogging on the beach or even boxing with your brother.
00:21:34.920 Any one of those things is fine.
00:21:36.100 But say pilloried.
00:21:40.520 All right.
00:21:41.060 You know you want to.
00:21:43.840 All right.
00:21:44.400 Here's the funniest tweet of the week.
00:21:46.800 I'm giving credit to Ricky Schlott, who I think is a real woman.
00:21:52.740 R-I-K-K-I, Ricky.
00:21:56.160 But there's also a big story about...
00:21:58.700 Remember I told you a story about a woman?
00:22:00.840 I wasn't sure she was real, but then she apologized, so I thought she was.
00:22:05.140 Now people are saying there was always a bot.
00:22:09.120 It's pretty disturbing.
00:22:12.720 Anyway.
00:22:14.120 But this is a different story.
00:22:15.460 So the New York Times reported this.
00:22:18.520 So here's a tweet from the New York Times.
00:22:20.100 To build a diverse class of students, the medical school at UC Davis ranks applicants
00:22:26.220 by the disadvantages they have faced.
00:22:29.440 The disadvantage scale helped turn UC Davis into one of the most diverse medical schools
00:22:34.560 in the U.S.
00:22:35.480 Can it work nationally?
00:22:37.240 So that's what the New York Times tweeted.
00:22:39.400 And then Ricky Schlott tweeted this.
00:22:42.500 Because everyone wants their brain surgeon to be as disadvantaged as possible.
00:22:46.240 If I had a microphone now, I would drop it.
00:22:52.800 I'm going to use my spare lavalier plug-in.
00:22:57.340 Drop.
00:22:58.700 That's right.
00:22:59.840 Yeah, when I pick a brain surgeon, I'd like to know that they had a really tough life.
00:23:05.040 Maybe still have an attitude about it.
00:23:06.780 Now, when I talk about reality and parody merging, here's your perfect example.
00:23:21.200 Because what makes this funny is that she didn't make anything up.
00:23:26.340 That's why it's funny.
00:23:27.460 It's funny because you can simply describe it in accurate terms and it'll make you laugh.
00:23:32.540 You can describe it in neutral, accurate terms and it will still make you laugh.
00:23:41.500 That's absurdity and reality merging.
00:23:44.700 See, if it were just an absurdity that you needed to make up some fun stuff about it to make fun of it,
00:23:51.400 that would not be reality and parody merging.
00:23:55.240 That's just parody.
00:23:55.920 But when you can just simply describe it with ordinary words and it's hilarious,
00:24:02.740 that's a problem.
00:24:05.600 That's too far.
00:24:07.080 That's too far.
00:24:11.080 But I look forward to a time five years from now
00:24:14.480 when a highly qualified black doctor will be judged to be one of the best doctors
00:24:22.840 because otherwise how could you get through that whole system?
00:24:25.920 And honestly, I feel like black Americans got a huge promotion.
00:24:32.940 I don't think they feel that way.
00:24:36.000 I understand it because something is taken away from them in the short term.
00:24:39.800 And it definitely is.
00:24:41.180 So they're losing something in the short term.
00:24:43.320 But what they're getting back is respect.
00:24:48.880 You only get one.
00:24:53.980 And you made the right choice.
00:24:56.360 You know what's weird?
00:24:58.740 I want to see if I can get agreement on this.
00:25:01.580 It's a weird observation.
00:25:03.100 White people have had this interesting time lately with the affirmative action ruling
00:25:09.820 where we get to sort of hide in the crack.
00:25:12.600 And we get to say, well, you know, this is sort of between you Asian Americans
00:25:17.420 and you black Americans and we're just watching.
00:25:20.620 Leave us out of this.
00:25:21.960 You guys duke it out.
00:25:24.020 We'll just call the plays as we see them.
00:25:26.260 But this is not about us.
00:25:27.360 Keep us out of this.
00:25:28.480 You guys go wild.
00:25:30.120 Hey, you guys, work it out, work it out.
00:25:32.020 We'll just go with whatever you decide.
00:25:34.020 Hey, whatever you guys decide, that's okay with us.
00:25:36.720 We're not part of this.
00:25:38.700 But here's the thing I realized the other day.
00:25:42.320 Yeah, I was in a conversation recently about some class that had exactly, you know,
00:25:48.220 one or two Asian Americans in the whole class.
00:25:51.540 And one of them was valedictorian.
00:25:54.120 Right?
00:25:54.580 And we both laughed.
00:25:56.760 We just both laughed because it's sort of, you know, stereotypical.
00:26:01.120 Now, here's the thing.
00:26:04.200 100%.
00:26:04.680 I'm going to make an exaggerated claim, but I want to see if you agree with it or not.
00:26:11.600 100% of white people are completely aware that Asian Americans kill us in academics.
00:26:19.960 Would you say that's true?
00:26:21.580 That 100% of white Americans are completely aware that we're being lapped in academics by Asian Americans.
00:26:29.760 All right?
00:26:30.140 We agree.
00:26:30.920 Now, here's the trick question.
00:26:32.740 You ready for this?
00:26:34.680 Have you ever heard a single white American say something negative about that?
00:26:42.280 Ever?
00:26:43.660 You have?
00:26:45.020 Negative.
00:26:46.160 You've heard somebody say negative about that?
00:26:48.540 Seriously?
00:26:49.540 What the fuck state do you live in?
00:26:51.420 In what state is somebody complaining about Asian Americans doing really well in school?
00:26:58.700 I don't believe that.
00:27:00.540 I don't believe you've ever heard that.
00:27:02.860 No.
00:27:03.220 And here's the thing.
00:27:06.140 White Americans are just saying, I don't know why.
00:27:09.800 You know, maybe it's cultural.
00:27:11.300 Maybe it's something else.
00:27:12.680 I don't know why.
00:27:13.600 It's just a fact.
00:27:14.580 And the only thing I feel about it, here's my only feeling about it.
00:27:20.400 I'm sure glad they're in my country.
00:27:22.080 I'm sure glad they're inventing things for Americans.
00:27:27.360 I'm sure glad they're my doctors.
00:27:29.880 Right?
00:27:30.020 If I see my optometrist and it's Asian American, I think, okay, you did it the hard way.
00:27:38.400 That's what I want.
00:27:40.100 You studied.
00:27:41.360 I want you.
00:27:42.180 But isn't it weird that we don't, and I'm making a universal statement that's clearly
00:27:49.500 not universal, but white people don't have a problem with that.
00:27:54.140 And you would think that they would.
00:27:56.860 You would think that at least behind closed doors or in private conversations, people would
00:28:02.100 say bad things about that.
00:28:03.440 But it just doesn't happen.
00:28:05.100 It just doesn't happen.
00:28:05.960 I don't even know if that's important to anything, but I've never heard anybody point
00:28:12.780 it out, that when white people talk about Asian Americans and success, we talk about it
00:28:18.220 with respect.
00:28:20.020 The one word that is universally there is just respect.
00:28:23.080 Okay.
00:28:24.240 You did it the way it's supposed to be done.
00:28:26.600 Boom.
00:28:28.000 All right.
00:28:29.540 I'm going to show you a whiteboard that's associated with that in a few minutes, but a few other
00:28:34.060 things.
00:28:34.460 I've decided that what would be fun would be, well, let me start with some background.
00:28:40.540 Those of you who have been in relationships, let's say you've been married a long time,
00:28:45.180 I think you would agree with this.
00:28:47.100 No matter, we'll get rid of Theron, who doesn't know that you're getting hidden on this channel
00:28:53.600 for all caps.
00:28:57.560 You've been in the experience where you love your spouse and you love spending time with
00:29:03.180 your spouse.
00:29:04.700 But true or not, if your spouse had to do a business trip and be away for a day, you'd
00:29:10.400 kind of really enjoy that day alone, wouldn't you?
00:29:13.540 Am I wrong?
00:29:14.760 Even if you love your spouse, love spending time with them.
00:29:18.340 But, you know, every now and then you get that day alone, it feels great.
00:29:21.840 Now, the only reason it feels great is because you still have a spouse, right?
00:29:27.320 It wouldn't feel great if you were single, necessarily, because you'd be like, oh, I'm lonely.
00:29:32.200 But as long as you've got that person there, even if they're on a trip, you're like, oh,
00:29:36.660 I got the day to do what I want.
00:29:37.960 So I thought, I wonder if you could create an AI girlfriend who is, who you take to be your
00:29:46.980 girlfriend, but she's always on a business trip.
00:29:50.320 She's just always on a trip.
00:29:52.180 Oh, you're definitely married.
00:29:53.980 But every day she'll call in a couple times on Zoom, say, hey, you know, just got out of my
00:29:59.240 meetings, want to check in with you.
00:30:02.420 What are you doing today?
00:30:05.700 And I wonder if, I wonder if the feeling that you're, you know, you'd be pretending, of course,
00:30:10.900 but the feeling that you do have a spouse, they're just on a business trip, checking in with you.
00:30:17.320 I wonder if it would make you feel better.
00:30:19.800 I don't know.
00:30:20.660 It's just a funny thought.
00:30:21.920 I wouldn't take it too seriously.
00:30:24.020 All right.
00:30:27.760 Ukraine's a big counteroffensive.
00:30:30.440 It's racking up the gains, let's see, according to CNN, they've gotten, oh, listen to this.
00:30:38.320 They've lost tremendous casualties and lots of military equipment, but they do have something
00:30:43.720 to show for it.
00:30:44.460 Ukraine has taken back nine square kilometers.
00:30:49.220 Wait, nine?
00:30:50.580 Nine?
00:30:51.960 Nine square kilometers?
00:30:56.340 They're having a whole war to reclaim territory.
00:31:00.760 And the amount that they've reclaimed, I could walk?
00:31:03.980 I could walk from one end to the other of the entire territory they've reclaimed.
00:31:08.780 After how many losses?
00:31:10.280 What?
00:31:11.140 50,000 dead people?
00:31:13.640 And they got back, like, a nice baseball field-sized area there.
00:31:19.380 I mean, it's bigger than that.
00:31:20.640 All right, that's not good.
00:31:24.660 You probably saw this story.
00:31:26.200 I didn't know what to say about it, but it's just ridiculous.
00:31:29.580 So this will be another one of those.
00:31:31.560 I'm going to just describe something with just ordinary language.
00:31:36.280 And watch how absurd it is.
00:31:38.220 I won't add anything.
00:31:40.600 This is just what actually happened.
00:31:42.400 Do you know Jemele Hill?
00:31:46.200 She's a woman who's very active in saying things, usually about black Americans, and she thinks
00:31:53.200 everybody's racist.
00:31:54.280 That's sort of the summary.
00:31:55.260 So she accused, so Jemele Hill accused Asians of, quote, carrying the water for white supremacy
00:32:01.640 for backing affirmative action decisions.
00:32:04.320 Asian Americans carrying the water for white supremacy.
00:32:13.680 The reason I didn't talk about it, even though it's been in the news for a few days, is that
00:32:19.080 there's nothing to add to it.
00:32:21.640 I like to add the absurdity to the story, but it's already there.
00:32:26.240 Like, what the hell am I going to do with that story?
00:32:28.540 There's nothing you can do with that.
00:32:30.240 It's complete.
00:32:32.380 So I'll just tell you about it and let it go.
00:32:34.320 All right, there was a Trump rally in the town of Pickens.
00:32:38.800 It was in South Carolina.
00:32:40.320 There were only 3,400 residents, but you got 50,000 people to show up for the rally.
00:32:45.420 3,400 people in the town, 50,000 went to the rally.
00:32:51.520 Now, some are saying that the energy is starting to favor Trump.
00:32:59.060 It's starting to look like that, isn't it?
00:33:01.400 It's starting to look very much like that.
00:33:03.340 Now, too early to make a prediction, and if you're joining me now, I'm a single-issue voter,
00:33:09.220 so I'm backing Ramaswamy, because he's tough on fentanyl.
00:33:14.180 So is Trump.
00:33:15.340 Both of them are tough on fentanyl, in terms of the Mexican cartels.
00:33:21.020 But if it's a tie, I go with the younger man.
00:33:24.320 All right.
00:33:24.540 There's some reporting that veteran Trump staff members are saying that when Trump was
00:33:33.840 allegedly holding up an Iran attack secret document that we heard about on an audio recording,
00:33:41.060 that he really was not holding up that document because it doesn't really exist,
00:33:45.160 and he was just bullshitting with the reporters.
00:33:46.960 I don't believe any of that.
00:33:51.640 That doesn't sound true.
00:33:55.000 I don't know what was true, and maybe we never will.
00:33:59.140 Goodbye, Don, for overcapitalizing.
00:34:02.120 Don, it was nice having you here, but you're gone now.
00:34:07.320 All right.
00:34:08.380 So who knows about that story?
00:34:11.380 Who knows?
00:34:12.340 There's still talk of a third-party spoiler for the election.
00:34:21.680 Do you think there'll be a serious third-party candidate?
00:34:25.320 I think Cornel West is a third party, but not a big one.
00:34:30.440 And then the experts are saying if there's a third-party candidate that would try to be a centrist,
00:34:35.260 apparently, they're a centrist group.
00:34:39.080 What are they called?
00:34:40.880 They have a name.
00:34:41.740 No Labels, a centrist group.
00:34:45.200 I couldn't think of their name.
00:34:48.200 I couldn't figure out their name because they didn't have good branding,
00:34:51.200 and the name of their group is No Labels.
00:34:53.620 It's like, maybe you should get a label.
00:34:55.840 You know what would work really good in this case
00:34:57.840 when people can't remember the name of your organization?
00:35:01.520 A label.
00:35:02.960 Get yourself a label.
00:35:04.260 That would work really well.
00:35:05.760 All right.
00:35:08.000 So here's the funniest part of the story.
00:35:10.040 The experts, who probably are right, they say that if this No Labels centrist thing got going,
00:35:16.460 it would take more from the left than the right, and it would guarantee that Trump won.
00:35:21.360 I don't know if that's true.
00:35:22.860 I suppose it would depend who that centrist candidate was.
00:35:25.500 But I love the fact that everybody assumes the Democrats are too dumb to know that.
00:35:34.880 That's just assumed.
00:35:36.800 That the Democrats, presumably, the last thing they'd want would be a President Trump.
00:35:41.260 So if they went ahead and voted for this non-Biden third party, they would effectively just be electing Trump.
00:35:48.820 Now, people are acting like that's just going to happen.
00:35:52.040 And it might.
00:35:52.940 I'm not saying it won't.
00:35:55.160 But isn't it funny that the only way it can happen is if Democrats are so dumb,
00:36:01.940 they don't know that they're voting against their own interests,
00:36:04.540 in the most obvious way you could ever vote against your own interests.
00:36:07.560 It'll be all over the news.
00:36:08.900 I mean, long before you actually cast your vote, the news would tell you,
00:36:13.220 people, people, listen to us on CNN.
00:36:16.760 If you vote for that third party, you're electing Trump.
00:36:21.560 Your worst nightmare.
00:36:23.900 Don't do this.
00:36:24.920 Don't push that ballot.
00:36:27.120 See this finger?
00:36:28.120 No.
00:36:28.480 Don't do this finger thing with that ballot.
00:36:30.640 That will make your worst nightmare happen.
00:36:35.400 And everybody assumes they're going to do it anyway.
00:36:38.900 I don't have much to say about that, but it's just funny that no matter how self-destructive
00:36:44.700 a thing is, you can count on 25% of the public doing it.
00:36:49.540 Right?
00:36:50.620 Yeah.
00:36:50.860 It's a terrible idea.
00:36:52.360 Well, let's sign up 25% of the public.
00:36:54.360 They'll do it.
00:36:55.500 How would you like to go to the Titanic in an under-tested submarine?
00:37:01.080 25% of the public.
00:37:03.480 Yep.
00:37:04.980 Sign me up.
00:37:07.360 All right.
00:37:08.220 DEI is declining.
00:37:09.860 You can see the signs everywhere.
00:37:11.480 DEI is diversity, equity, and inclusion.
00:37:14.820 So it's forced on a lot of companies through social and other forces.
00:37:20.320 But I guess DeSantis is banning DEI in Florida, and that means only in universities, because
00:37:28.820 he doesn't have control of the corporations, right?
00:37:30.960 Am I right about that?
00:37:32.140 He's only banning it in universities and schools?
00:37:35.440 Just universities, right?
00:37:38.300 Correct.
00:37:38.920 Yeah.
00:37:39.000 He can't control the private companies.
00:37:40.520 Then I saw Wall Street Journal was saying that the staffs of DEI departments are being cut
00:37:48.440 because if you hire less, I guess the hiring is decreasing.
00:37:53.600 If you hire fewer people, you don't need as much DEI because they're trying to get their
00:37:58.120 diversity or something.
00:37:59.600 But that sounds like an excuse to cut the DEI staff that isn't giving you any profits.
00:38:05.360 And then we're seeing, who said this?
00:38:09.000 This is also in the Wall Street Journal.
00:38:11.500 Laura Agarcar, NASDAQ's global head of diversity and equity in culture, she said that they're
00:38:18.660 seeing a dip in the interest of DEI.
00:38:21.320 So she's saying that the energy around DEI is conspicuously down.
00:38:26.800 Is that good or bad for Trump?
00:38:30.040 Well, it suggests that the zeitgeist is moving in this direction.
00:38:33.940 If it were moving toward DEI, that would be a bad sign for Trump.
00:38:37.760 If it's moving away from DEI and away from affirmative action, that feels like the public
00:38:44.520 is shifting a little in a Trumpian way, even if they don't know it.
00:38:50.000 So I would say that all of the forces in the world seem to be lining up to make Trump
00:38:55.260 your next president.
00:38:56.520 Now, I promised you a whiteboard.
00:39:00.540 The whiteboard begins now.
00:39:02.900 So I'm going to show you my take on the affirmative action ruling and the, let's say, the context
00:39:12.140 of all of that.
00:39:14.480 So here's how I see the world.
00:39:17.600 And I would like, as my gift to black America, to solve all your problems right here.
00:39:24.580 All right.
00:39:26.980 So the American success plan is usually the same.
00:39:33.060 If you want to succeed, you first have to get rid of your obstacles, right?
00:39:36.780 If you have an obstacle to success, logically, would you, how many of you would agree?
00:39:42.560 If you have an obstacle to success and you want to succeed, do you not need to remove your
00:39:47.760 obstacle first?
00:39:49.420 Yes or no?
00:39:49.960 Oh, if somebody says no, interesting, interesting.
00:39:55.660 It's because you're way ahead of me, isn't it?
00:39:57.340 Some of you say yes.
00:39:58.700 We'll get to that.
00:39:59.640 All right.
00:39:59.900 So let's say you've got obstacles you've got to get rid of.
00:40:03.360 And let's say one of your obstacles is systemic racism.
00:40:06.300 And systemic racism could be connected to a bunch of other problems you've got, from poverty
00:40:10.460 to maybe poor nutrition.
00:40:12.680 You're in a crime neighborhood, too many drugs, et cetera.
00:40:16.240 The et cetera on this is doing a lot of work.
00:40:18.180 But you also, if you could handle your obstacles, you could find a way around them, then you'd
00:40:25.180 also have a strategy.
00:40:27.020 And your strategy might be study continuously, build a talent stack, you know, learn things
00:40:32.580 all the time.
00:40:33.360 That's always good for success.
00:40:34.960 You'd want to stay off drugs and obey the law.
00:40:37.500 And here again, the et cetera is doing a lot of work, right?
00:40:41.300 You'd want to network.
00:40:42.860 You'd want to, you know, there's a whole bunch of things you'd dress right.
00:40:45.640 So there's a whole bunch of things that we could all name that would be your strategies
00:40:51.280 for success.
00:40:52.740 Now, in my opinion, and here's where I'm going to get in trouble, it's great to have
00:40:58.600 free speech.
00:40:59.520 If you ever get a chance to have free speech like I have, that most of you don't, it's
00:41:04.220 really awesome.
00:41:04.840 It's a great feeling to be able to say whatever you think is true and helpful, all right?
00:41:11.620 Now, I try to use my free speech just to be helpful, and this would be an example, right?
00:41:17.000 If you're an Asian American, do you have any systemic racism?
00:41:22.360 I'd say yes.
00:41:23.680 Historically, most people would agree yes.
00:41:25.720 And do any Asian Americans come over here with any problems of poverty and bad nutrition?
00:41:31.800 Probably.
00:41:32.900 There are plenty of poor people of all types, et cetera.
00:41:35.620 But in my opinion, what I see is that the Asian American community works on their strategies
00:41:41.740 for success.
00:41:43.760 And you know what that does?
00:41:45.780 What happens if you work on your strategies and you ignore your obstacles?
00:41:50.720 It removes all your obstacles.
00:41:52.300 It removes your obstacles.
00:41:55.680 Because if you've got a really good skill set, nothing's going to stop you.
00:42:01.300 Nothing.
00:42:02.180 If you're not doing drugs, you stay out of jail, and you develop useful skills that the
00:42:07.460 world wants, nothing's going to stop you.
00:42:10.360 You soon will fix your own nutrition.
00:42:13.220 Your poverty will be fixed.
00:42:15.180 Your systemic racism might still be there, but you wouldn't even notice it.
00:42:19.120 You'd slice through it like it didn't exist.
00:42:20.840 So I believe that the black American strategy for success is accidentally backwards.
00:42:29.900 Accidentally.
00:42:31.080 It's accidentally backwards.
00:42:32.960 And I believe that we're being, we all of us, are being sold an idea that if you don't
00:42:38.280 remove the obstacles to success, systemic racism being at the top, if you don't remove
00:42:44.240 these, black Americans are going to be depressed and suppressed, because they won't have the
00:42:50.900 same level of success.
00:42:52.980 Asian Americans do it the other way around.
00:42:56.280 They get themselves right.
00:42:58.440 They fix themselves.
00:43:00.800 And then all this becomes irrelevant.
00:43:02.580 So this is the biggest thing that black Americans probably need to imitate.
00:43:10.540 Success is mostly imitation.
00:43:12.940 Imitation, there's no crime in being imitators.
00:43:20.600 Right.
00:43:20.900 I learned to be a cartoonist by imitating other cartoonists.
00:43:24.940 Right.
00:43:25.300 So imitation is the way to succeed.
00:43:27.720 You look at other people.
00:43:28.800 What did they do?
00:43:30.040 And then you try to do that thing.
00:43:31.380 And what works for every person, black, Asian American, anything else, Hispanic American,
00:43:38.840 you name it, what works for everybody is this part.
00:43:41.860 You do this part first.
00:43:43.700 You don't ignore the obstacles.
00:43:46.500 You know, if there are people who want to be working on those directly, that's great.
00:43:49.620 But if everybody did this, you wouldn't even talk about that.
00:43:53.300 It would just disappear in time.
00:43:54.900 And that, ladies and gentlemen, is my contribution to black America.
00:44:02.160 I believe that if every black American took the free assets that are available, and I'm
00:44:08.080 a perfect example, I could give you advice on how to do this right.
00:44:14.300 I really could.
00:44:15.420 A lot of people could.
00:44:16.800 I write books on it, so you can buy my books, like How to Fail at Almost Everything
00:44:20.720 and Still Win Big.
00:44:21.520 It tells you exactly, and in a pretty easy way.
00:44:24.900 I had to do the strategies right to succeed.
00:44:29.460 Black America, because I have freedom of speech, I got it the hard way.
00:44:35.840 But let me tell you, if you do this, work on the strategy first, the other stuff is going
00:44:42.320 to disappear in importance in a fairly quick period of time.
00:44:46.640 But if you work on the, you know, focusing on the obstacles and you tell yourself, well,
00:44:52.020 there's no point in working on myself, because I got all these obstacles, you will be guaranteed
00:44:56.800 to fail.
00:44:57.780 And honestly, it's going to be hard to care.
00:45:00.980 It's going to be hard for other people to care about you.
00:45:04.480 Right?
00:45:04.980 Do you know why Asian Americans never had Asian American lives matter?
00:45:11.520 Because nobody questioned it.
00:45:12.840 It wasn't a question.
00:45:15.540 Nobody even thought about it.
00:45:16.900 Of course they do.
00:45:18.120 The reason that black Americans wanted to make sure that black lives matter is that it felt
00:45:24.280 like they didn't.
00:45:25.700 It just felt like they didn't.
00:45:27.480 But if you want to make black lives matter, stop working on this side.
00:45:32.680 That's not how you get there.
00:45:34.300 This is a false path.
00:45:35.980 These are all real things.
00:45:38.140 There really is systemic racism.
00:45:40.080 You know, I'm a big believer in it.
00:45:42.420 But you need to ignore it if you want to personally succeed.
00:45:46.500 If you want to personally succeed, work on strategies.
00:45:49.600 The rest of it will become irrelevant pretty quickly.
00:45:52.600 Pretty quickly.
00:45:53.760 Now, you might say to me, Scott, you're being simplistic.
00:45:56.900 Because the reason you can't work on those personal strategies is you're in such a bad
00:46:01.580 situation.
00:46:02.180 To which I say that is a loser's strategy.
00:46:06.400 Or a loser's philosophy.
00:46:08.280 If you believe you can't because you have obstacles, well, then you can't.
00:46:13.100 Somebody smart said that once.
00:46:14.660 If you believe you can't, you're right.
00:46:16.940 If you believe you can, you're probably right.
00:46:21.500 So, this is my contribution.
00:46:24.520 I do it at, obviously, great personal risk.
00:46:29.140 But I'm in a position where I can do that.
00:46:31.340 And so I thought it was useful.
00:46:32.940 And I only do it because it's useful.
00:46:34.480 It probably looks like I'm trying to get clicks.
00:46:37.480 But not this.
00:46:38.740 This is only because I think it's useful.
00:46:41.740 And that, ladies and gentlemen, concludes our presentation for today.
00:46:46.880 YouTube, I'll talk to you tomorrow.