Real Coffee with Scott Adams - June 17, 2023


Episode 2142 Scott Adams: Trump & Truth, Systemic Racism Cortisol Theory, Schiff Censure, AI Debates


Episode Stats

Length

1 hour and 1 minute

Words per Minute

144.2

Word Count

8,818

Sentence Count

729

Misogynist Sentences

16

Hate Speech Sentences

30


Summary

In this episode of Coffee with Scott Adams, host Scott Adams talks about the latest breakthroughs in battery technology, and why he thinks you should build a railroad from here to the Indian Ocean. Plus, the latest in the Biden/WannabeDictator saga.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the best thing that's ever happened to anybody in the history of the universe.
00:00:07.960 It's called Coffee with Scott Adams. And aren't you glad you're here? Wow. Wow. Yeah.
00:00:14.120 And if you'd like to take this experience up to levels where, how can I explain it?
00:00:20.440 What kind of a level could we take this up to?
00:00:22.680 The level where you could build, let's say, a railroad from here to the Indian Ocean.
00:00:28.420 That's how good it's going to be.
00:00:30.260 And all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:37.280 Fill it with your favorite liquid. I like coffee.
00:00:40.780 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything awesome.
00:00:47.720 It's called the Simultaneous Sip. It happens now. Go.
00:00:55.340 Ah. Yes.
00:00:57.560 All right. We're private on locals and we're public on YouTube and it's time to go.
00:01:05.440 So, did anybody notice that there's no news?
00:01:10.440 That the news just stopped?
00:01:12.500 If there's anything that would tell you that all the news is artificial,
00:01:18.360 it's that as soon as summer vacation comes,
00:01:22.320 as soon as it's summer vacation, the news stops.
00:01:25.580 Come on.
00:01:26.180 Doesn't that tell you the news is fake?
00:01:29.060 That the news stops when the reporters go on vacation?
00:01:33.260 Like, how could that be?
00:01:35.500 Well, that's not going to stop me from talking about things that are fascinating and interesting
00:01:41.080 and will make you so happy that you're here.
00:01:44.280 I don't know.
00:01:44.840 I think it might be the best day you've ever had in your life.
00:01:48.920 Now, I'm going to be hypnotizing all of you with casual conversation.
00:01:54.700 Here's what I'm going to do.
00:01:56.220 This is going to be very simple hypnosis.
00:01:58.980 It's the simplest form of hypnosis,
00:02:02.160 which is associating something with another thing.
00:02:07.220 Have you noticed that your mood can be influenced by other people?
00:02:11.480 Well, you've all noticed that, right?
00:02:13.580 Somebody sad comes into the room and you're like,
00:02:15.840 ugh, your energy goes down.
00:02:18.120 But somebody happy comes into the room
00:02:19.780 and all of a sudden you're like,
00:02:21.540 whoa, all right, that took me up a little bit.
00:02:24.700 Well, here's what I'm going to do to all of you.
00:02:27.780 I'm in a better mood than you are.
00:02:31.300 That's true.
00:02:32.580 I'm in a great mood.
00:02:33.500 And my good mood, just by itself,
00:02:37.540 is going to start to seep into your being.
00:02:40.880 Your mere exposure to my good, happy mood today
00:02:43.800 will make you happier.
00:02:45.560 You don't have to do anything.
00:02:47.300 Just sit there and let it happen.
00:02:50.720 Go.
00:02:51.860 First story, good news.
00:02:53.520 There's another battery breakthrough.
00:02:55.920 Looks like, who did this?
00:02:58.440 Big old car company.
00:03:01.480 Was it maybe Toyota?
00:03:04.700 Doesn't matter.
00:03:05.740 It's some big car company.
00:03:07.400 They've got some new solid-state battery
00:03:10.960 that's way cool.
00:03:12.860 It'll give you almost a 1,000-mile range
00:03:15.860 and 10-minute charge.
00:03:18.760 Does that sound pretty good?
00:03:20.920 A 932-mile range and a 10-minute charge.
00:03:25.220 And apparently it's all doable.
00:03:27.080 You know, the technology can be demonstrated.
00:03:29.420 It looks like it can be scaled up for mass production.
00:03:33.460 And it's solid-state.
00:03:36.640 So it would be lighter.
00:03:38.040 Well, they can pack more goodness into a smaller space.
00:03:41.640 Anyway, compare that to Tesla Model Y
00:03:44.520 that has one-third the range, sort of,
00:03:48.780 and 15-minute charge.
00:03:50.520 15 minutes, not bad, but one-third the range.
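(As a rough aside, here is the arithmetic behind that comparison in a minimal Python sketch, using only the numbers quoted here; the Model Y figure is just "one-third the range" taken literally, not an official spec.)

```python
# Back-of-the-envelope comparison using only the figures quoted above.
# Real-world ranges and charge rates vary by trim and conditions.
solid_state_range_mi = 932       # claimed range of the new solid-state pack
solid_state_charge_min = 10      # claimed charge time

model_y_range_mi = solid_state_range_mi / 3   # "one-third the range", ~311 miles
model_y_charge_min = 15

# Miles of range recovered per minute of charging, a rough figure of merit.
solid_state_rate = solid_state_range_mi / solid_state_charge_min   # ~93 mi/min
model_y_rate = model_y_range_mi / model_y_charge_min               # ~21 mi/min

print(f"Range ratio: {solid_state_range_mi / model_y_range_mi:.1f}x")
print(f"Charging rate: {solid_state_rate:.0f} vs {model_y_rate:.0f} miles per minute")
```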
00:03:53.640 So this is not the only battery breakthrough.
00:03:56.080 I always tell you about all the new battery breakthrough things.
00:03:58.840 Because I think the battery breakthrough is everything.
00:04:04.220 I mean, correct me if I'm wrong,
00:04:05.540 but the big problem with green energy is
00:04:07.560 it doesn't work when there's no sun
00:04:10.000 and the wind isn't blowing.
00:04:11.760 But if your batteries get cheap enough and awesome enough,
00:04:15.000 and here's a gigantic step in that direction,
00:04:18.300 then all problems are solved.
00:04:20.460 You just store your energy until you need it.
00:04:24.340 All right.
00:04:25.580 I'm loving the hashtag wannabedictator.
00:04:28.080 You know that came from the Fox News chyron hero
00:04:32.380 who referred to Biden as a wannabe dictator
00:04:36.100 in the last chyron that he did before he was let go.
00:04:42.320 It's still the best quitting story I've ever heard.
00:04:48.560 Nobody's ever quit a job better than that.
00:04:51.040 That is the highest style points for a job quitting
00:04:54.460 there has ever been in the history of the universe.
00:04:58.080 But I'd like to check in on the wannabe dictator
00:05:02.380 Biden's health.
00:05:04.800 Let's see.
00:05:05.700 He still wants to build a railroad to the Indian Ocean
00:05:08.400 and God save the Queen.
00:05:11.040 Why did he say God save the Queen at the end of his speech?
00:05:19.120 Nobody really knows.
00:05:24.240 Maybe it was something about trans and LGBTQ,
00:05:27.760 but I don't think so.
00:05:29.340 I think it was just some kind of a brain dysfunction.
00:05:32.540 But, you know, I always tell you,
00:05:36.880 my mother told me when I was young
00:05:39.020 that you could get used to anything if you did it long enough,
00:05:44.080 including hanging.
00:05:45.620 And I don't think she made that up.
00:05:47.160 I think that was an old saying from somewhere.
00:05:49.420 But have we gradually gotten used to
00:05:53.000 how degraded Joe Biden is?
00:05:56.520 Is it my imagination, or can you tell me if I'm exaggerating?
00:06:02.380 I can't tell if this is hyperbole.
00:06:05.120 But it seems to me that Biden is about 10 minutes
00:06:08.480 from what I'll call the taxidermy phase,
00:06:11.220 where they literally just stuff him
00:06:13.620 and wheel him around on a cart,
00:06:15.820 and everybody acts like he's still there.
00:06:18.100 Because we're acting like he's still there now.
00:06:24.380 Could you ever imagine we would be in this place?
00:06:28.020 Could you ever imagine that the president
00:06:29.820 would say in public,
00:06:32.540 God save the Queen, randomly,
00:06:34.820 and we want to build a railroad to the Indian Ocean?
00:06:38.680 And that we let those things go.
00:06:40.900 I mean, we talked about them.
00:06:42.300 They were both news, and we had fun with it,
00:06:44.220 and we tweeted, and we made some memes.
00:06:46.540 But generally speaking, we just gave it a pass.
00:06:52.100 We actually got used to having a brainless president.
00:06:57.660 Now, is my mother a genius?
00:07:00.380 Is my mother the smartest person who ever lived?
00:07:02.980 When she said you can get used to anything,
00:07:05.760 I didn't believe that.
00:07:07.600 I didn't believe you can get used to anything.
00:07:10.200 I thought it was like an exaggeration,
00:07:12.720 as in, well, you can get used to some things.
00:07:15.240 There are some things, a small group of things,
00:07:18.420 that you might not think you can get used to,
00:07:20.060 but this small group of things,
00:07:22.000 actually you can.
00:07:22.940 You can get used to them.
00:07:25.080 But no, it turns out you can get used to anything.
00:07:29.020 Anything.
00:07:30.300 We're going to stuff this motherfucker
00:07:32.040 like a pet bear or whatever you stuff.
00:07:36.380 We're going to wheel him around on a cart,
00:07:38.660 and the Marines are going to salute him.
00:07:40.720 The Marines are going to salute.
00:07:43.420 They're going to wheel this fucker
00:07:44.660 to the helicopter on a cart like this,
00:07:49.680 and the Marine outside the helicopter will be.
00:07:55.380 And everybody will act like nothing happened.
00:08:00.220 I don't know.
00:08:01.220 I think it's funny.
00:08:02.040 Apparently on June 22nd,
00:08:08.620 I saw a Clay Travis tweet on this,
00:08:11.320 June 22nd, which is just a few days from now,
00:08:15.700 Trump's exclusive contract with Truth Social ends,
00:08:20.420 meaning that he can tweet anywhere he wants,
00:08:24.600 anytime he wants.
00:08:26.040 At the moment, he's limited to Truth.
00:08:28.580 Now, what do you think he's going to do?
00:08:29.860 Because if Trump starts tweeting on Twitter,
00:08:36.100 and it's the same tweets,
00:08:38.900 why would anybody be on Truth?
00:08:41.900 I mean, it might just destroy the entire model,
00:08:44.220 if it's working at all.
00:08:45.500 So I don't know if Truth is, you know,
00:08:48.300 on the way to make money or not.
00:08:50.340 But he's got a big decision,
00:08:52.480 because I believe he's got a sizable equity interest in that.
00:08:56.660 He would like it to work.
00:08:57.920 On the other hand, he wants to be president.
00:09:00.640 You know what?
00:09:01.480 I'm almost semi-convinced that he should stay on Truth
00:09:06.920 for his presidential benefit.
00:09:11.500 Do you know why?
00:09:13.600 Because the less the people see of him, the better.
00:09:17.980 But the people on Truth are the people who absolutely want to see him.
00:09:21.640 Those are the people who said,
00:09:23.680 I'm here to see you.
00:09:25.460 Give me more of you.
00:09:27.080 And then he gives it to them.
00:09:28.720 And all the people on Twitter would only benefit from less of him.
00:09:33.180 Am I wrong?
00:09:34.640 They would only benefit from less of him.
00:09:36.820 Yeah, people do retweet him, but it doesn't get the same energy.
00:09:43.780 Like, I almost never see a Don Jr. tweet.
00:09:47.080 Actually, I had never seen Don Jr.'s tweets.
00:09:49.060 I just realized that Don Jr. is completely removed from my Twitter feed.
00:09:55.060 On a side note,
00:09:59.960 the stuff that is surfaced to me on Twitter
00:10:04.740 is so different than it used to be.
00:10:07.140 But I can tell that people I really would like to see their tweets are just gone.
00:10:12.820 So why is it that people I follow, I don't see their tweets?
00:10:16.580 Like, at all.
00:10:18.660 Isn't the whole point of following somebody
00:10:20.400 that you would see all of their tweets?
00:10:22.560 Like, in theory, wouldn't I see 100% of his tweets, right?
00:10:26.920 But I don't remember the last time I've seen one.
00:10:30.000 So there's something going on with the Twitter algorithm that's not cool.
00:10:34.400 Because I'm following people and I'm not seeing their tweets at all.
00:10:39.460 Like, zero times.
00:10:41.000 What's up with that?
00:10:42.520 I mean, the most basic thing of Twitter is
00:10:44.260 you follow somebody so you can see their stuff.
00:10:46.600 I don't follow things so they can show me whatever the fuck they want.
00:10:50.300 But that seems to be what's happening.
00:10:52.620 I don't understand it at all.
00:10:55.180 So, anyway, that needs to get fixed, I think.
00:10:58.540 So I saw a funny meme, if you call it that,
00:11:02.320 video that was an AI version of Biden and Trump doing a debate.
00:11:08.700 Now, it was a funny version where they were just doing insults.
00:11:12.980 But I thought to myself,
00:11:14.960 suppose you've got two candidates who don't want to debate.
00:11:19.760 It could be in the primary, it could be anywhere else.
00:11:21.780 How hard would it be to create an AI deepfake of both of the candidates,
00:11:29.440 train them both on all of the policies and all of the recent speeches?
00:11:35.020 You don't want to go back too far.
00:11:37.180 But just say recent speeches, you know, last two, three years, something.
00:11:41.460 You don't think you could get the AIs to do the debate
00:11:44.500 and you would learn as much?
00:11:47.180 Would you not learn as much as if you saw the real people debate?
00:11:50.220 Because ask yourself, what's the point of a debate?
00:11:54.260 What's the point of a debate?
00:11:56.820 Do you know the only thing we get out of a debate
00:11:59.120 is fake gotchas?
00:12:02.940 That's it.
00:12:04.780 Fake gotchas.
00:12:06.480 Because in terms of policy, it's already published.
00:12:09.860 Right?
00:12:10.140 Their policies are already on their websites.
00:12:12.560 They say it every day.
00:12:14.160 So you could certainly teach the AI
00:12:16.820 to answer any question about policy.
00:12:20.220 Have you used AI enough to know that that's true?
00:12:24.640 Chat GPT would give you a perfect impression
00:12:27.460 of their policy preferences
00:12:29.460 and could answer a question,
00:12:31.840 actually answer a question based on the policies.
00:12:35.120 And it could give you a good answer,
00:12:37.000 a very complete one with caveats and everything else.
00:12:40.480 In fact, it could do it better than the candidates themselves.
00:12:43.280 Do you disagree?
00:12:47.300 I think AI could represent its real person
00:12:52.240 better than the real person could represent themselves.
00:12:57.020 Because the AI would not get flustered.
00:13:00.140 The AI would not do a human thing that shows its flaws.
00:13:04.900 And AI would not forget what it was going to say.
00:13:07.840 It would not get excited about something.
00:13:10.360 It would not give the stupid answer.
00:13:14.060 And if it did give a stupid answer,
00:13:15.740 you wouldn't blame the candidate.
00:13:17.420 You'd blame the program.
00:13:19.760 So it could actually make a mistake,
00:13:21.840 and then the public wouldn't care.
00:13:23.480 They'd say, well, that was just the AI.
00:13:25.440 That wasn't the candidate.
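As an aside, here is a minimal sketch of that idea, assuming a generic chat-style LLM sits behind a hypothetical ask_llm() helper; the function names, prompts, and "Candidate A/B" labels are illustrative only, not any real campaign material or a specific provider's API.

```python
# A minimal sketch: two LLM "personas," each grounded only in published positions
# and recent speeches, take turns answering a moderator's question.
# `ask_llm` is a hypothetical placeholder for whatever chat-completion API you use.

def ask_llm(system_prompt: str, transcript: list[str]) -> str:
    """Placeholder: send the persona prompt plus the debate so far to an LLM
    and return its next statement. Wire this to your provider of choice."""
    raise NotImplementedError

def build_persona(name: str, published_positions: str) -> str:
    # Ground the persona in material that is already public, with caveats,
    # which is the point made above: the policies are already published.
    return (f"You are a debate stand-in for {name}. Answer only from these "
            f"published positions, adding caveats where the record is unclear:\n"
            f"{published_positions}")

def run_debate(persona_a: str, persona_b: str, question: str, rounds: int = 3) -> list[str]:
    transcript = [f"Moderator: {question}"]
    for _ in range(rounds):
        transcript.append("Candidate A: " + ask_llm(persona_a, transcript))
        transcript.append("Candidate B: " + ask_llm(persona_b, transcript))
    return transcript
```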
00:13:28.360 So you don't think that there will be a debate
00:13:30.420 between Biden and RFK Jr.?
00:13:33.600 Prediction.
00:13:34.900 Prediction.
00:13:35.340 There will be a debate,
00:13:37.500 but it will be two AI characters.
00:13:40.460 How much attention could a real debate
00:13:42.900 between two AI characters get?
00:13:45.440 Like, how many views would that get?
00:13:48.220 An actual, let's say, one-hour debate
00:13:50.220 with real deep fakes.
00:13:52.800 Real deep fakes.
00:13:54.860 I believe it would get millions.
00:13:57.280 Yeah, I think it would get millions.
00:13:58.920 Because I would not be less interested
00:14:01.060 in the AI version.
00:14:02.440 I would be fascinated to see how the AI
00:14:05.600 handled their two competing deals.
00:14:11.360 There are some of you who wouldn't watch it,
00:14:12.960 and I get that.
00:14:13.860 But it would be really interesting.
00:14:15.140 I think a lot of people would watch it.
00:14:17.180 Now, that's interesting, too.
00:14:20.080 The deep fake, here's what I would do
00:14:22.920 if I made the deep fake of RFK Jr.
00:14:26.060 I would start with his actual voice
00:14:28.320 as it is today.
00:14:29.200 And then, in fairly short time,
00:14:33.180 I would improve it as he talks.
00:14:36.460 So that you would almost not notice
00:14:39.160 that the imperfections start to decrease.
00:14:42.340 And then, say, five minutes in,
00:14:44.700 it just disappears.
00:14:46.640 And you're just listening to a voice like everybody else's.
00:14:49.840 That would be cool.
00:14:51.300 Now, it might also be insensitive
00:14:54.020 to people with disabilities.
00:14:58.760 Do you see that?
00:15:00.440 I think it would be insensitive
00:15:02.140 to take his disability away
00:15:03.880 in a public forum.
00:15:07.360 Because that would be sort of like saying
00:15:08.860 you have to change him to make him better
00:15:10.540 or something like that.
00:15:11.300 I don't know.
00:15:11.900 I feel like people with actual other disabilities
00:15:14.880 might be insulted
00:15:15.780 just by the fact that the AI
00:15:18.460 thought it needed to fix it.
00:15:20.580 You know what I mean?
00:15:22.000 Am I all alone on this?
00:15:24.020 I think I'm all alone on that.
00:15:26.780 I don't know.
00:15:27.460 I feel like there's a sensitivity there
00:15:29.240 where we should accept him the way he is,
00:15:32.060 not imagine that we need to fix him in our minds.
00:15:36.140 I might be a little sensitive to this
00:15:37.820 because I had the same issue with my voice.
00:15:40.120 So I might be a little tightly wound on that topic.
00:15:44.380 All right.
00:15:45.300 I'd love to see it, though.
00:15:46.340 So I guess Juneteenth is...
00:15:51.280 Monday is the celebration of Juneteenth,
00:15:54.260 which I understand to be the last day
00:15:57.000 that slavery was legal in an American state.
00:16:02.720 Now, what do you think of Juneteenth?
00:16:05.900 Do you think it...
00:16:07.380 in Texas specifically?
00:16:09.100 And what do you think of that?
00:16:10.860 Should we have that holiday?
00:16:12.380 Or are you against it?
00:16:13.780 Anybody opposed to it?
00:16:17.400 Yeah, I see people saying,
00:16:19.040 F that and blah, blah, blah.
00:16:22.200 All right.
00:16:23.460 I want to tell you how easily I was persuaded.
00:16:28.420 All right.
00:16:28.860 I have to admit I went into it with a little bit of,
00:16:31.640 all right, this might be a little over-woke.
00:16:34.960 You know what I mean?
00:16:36.180 Because if you read enough right-leaning media,
00:16:39.360 there's like a little bit of a sense
00:16:41.760 that maybe it's a little unnecessary,
00:16:46.720 a little too much,
00:16:48.220 a little too for one group,
00:16:50.700 maybe a little too woke.
00:16:53.180 And I have to admit I was leaning in that direction.
00:16:55.240 I didn't...
00:16:56.440 I had no problem with it.
00:16:58.060 I mean, I didn't have a disagreement with it.
00:16:59.700 I just thought, you know,
00:17:01.140 sign of the times.
00:17:02.660 It's a sign of the times.
00:17:04.180 Then I saw a quick video clip
00:17:05.860 of apparently the woman
00:17:07.560 who seems to be behind it becoming a big deal
00:17:11.840 in the United States.
00:17:13.240 It was a black woman.
00:17:14.840 She looks like a grandmother now.
00:17:17.200 And she was talking...
00:17:18.500 And it was just a little clip,
00:17:19.800 probably a 10-second clip
00:17:23.380 of her making her point.
00:17:26.040 And in 10 seconds,
00:17:28.060 she completely changed my mind.
00:17:32.000 And I'm going to quote her, roughly, paraphrase,
00:17:35.980 and I'm going to see if I can do the same to you.
00:17:39.380 So she's talking to the crowd
00:17:40.920 about the importance of Juneteenth,
00:17:42.500 and she says something like this,
00:17:44.220 rough quote,
00:17:46.000 it's not about black, it's not about white,
00:17:47.840 it's about America.
00:17:49.520 It's an America celebration.
00:17:52.040 Sold.
00:17:53.360 Sold.
00:17:55.680 Don't say anything else.
00:17:58.620 Do you know what I always talk about?
00:18:01.560 Don't keep selling past the sale.
00:18:04.320 Once you've closed the sale,
00:18:06.040 shut up.
00:18:07.280 Shut up.
00:18:08.440 You just made the sale.
00:18:10.680 It's about America.
00:18:12.620 Sold.
00:18:13.780 Good.
00:18:14.320 I'm all in.
00:18:15.460 100% sold.
00:18:17.840 The person who came up with it
00:18:19.440 tells me it's about America.
00:18:22.460 Yep.
00:18:23.540 I'm in.
00:18:26.020 So let's celebrate that on June 19th.
00:18:29.680 All right.
00:18:30.480 Here's my bottom line at the moment
00:18:32.800 on those Mar-a-Lago boxes.
00:18:37.600 See if you agree with these following statements.
00:18:40.340 Professional lawyers,
00:18:42.940 the best lawyers in the world,
00:18:46.600 cannot agree whether a crime of any importance was committed.
00:18:51.580 True or false?
00:18:53.480 There are definitely top lawyers who say yes,
00:18:56.700 but are there not also top lawyers who say no or maybe?
00:19:00.420 So without saying whether it is a crime or is not,
00:19:07.380 can we simply agree that the biggest experts on what is a crime can't tell?
00:19:13.560 Meaning they don't agree.
00:19:15.160 I mean, if you ask them, they say they can tell,
00:19:17.260 but they don't agree.
00:19:19.720 Now, number two.
00:19:21.300 I think you can agree on that.
00:19:22.740 Number two.
00:19:24.920 The public is unlikely ever to see the documents
00:19:28.660 to judge for ourselves
00:19:30.880 whether these are important secrets that got released.
00:19:34.740 Now, I'm not saying that we should.
00:19:37.480 Somebody on Twitter,
00:19:39.280 like a very dumb person on Twitter,
00:19:41.080 said,
00:19:41.800 are you saying that we should see all of these secret documents?
00:19:45.780 No.
00:19:47.340 No.
00:19:47.960 I'm not saying we should.
00:19:49.040 I'm saying we won't.
00:19:51.840 In all likelihood.
00:19:53.600 In all likelihood.
00:19:54.680 Because if the documents were so unimportant
00:19:57.180 that they could just declassify them today,
00:20:00.380 then that would take away the whole point of the case.
00:20:03.720 Not the whole point,
00:20:04.980 but it would weaken the case
00:20:06.480 because that would be an admission
00:20:08.080 that there was nothing there of importance
00:20:09.980 if they just showed the public.
00:20:12.580 But can you live with this situation
00:20:15.220 that might be developing?
00:20:16.480 Can you imagine
00:20:18.300 putting a past president
00:20:21.540 and current candidate
00:20:22.640 in prison
00:20:24.340 for life
00:20:26.400 for something that
00:20:28.900 the best lawyers can't even agree
00:20:30.860 is a crime
00:20:31.780 and the public is not allowed
00:20:34.060 to see the most important part
00:20:36.640 of the evidence?
00:20:38.200 Can you even imagine it?
00:20:40.440 Honestly, I can't.
00:20:42.460 I actually can't imagine that.
00:20:45.340 Like, actually can't.
00:20:46.440 Like, when I say I can't imagine it,
00:20:49.300 I mean my imagination
00:20:50.500 literally can't concoct it.
00:20:52.920 I can't play it
00:20:54.340 as like a little movie in my head.
00:20:56.280 It's just too far.
00:20:59.320 So this is what I mean by too far.
00:21:02.040 That's just too far.
00:21:04.160 And so when I say
00:21:05.100 it's not going to happen,
00:21:07.160 I don't mean, you know,
00:21:08.700 I'm going to do some personal thing
00:21:10.460 that would make it not happen.
00:21:11.980 I mean, it's not up to me.
00:21:13.020 I'm saying it's not going to happen.
00:21:16.940 Because one way or another,
00:21:18.980 the country won't let that happen.
00:21:21.940 We won't let it happen.
00:21:24.300 Collectively.
00:21:25.920 And I don't have to say,
00:21:27.640 you know,
00:21:27.880 what will the country do?
00:21:29.540 You know,
00:21:29.740 what will be the response?
00:21:30.920 I don't think we're talking about,
00:21:32.360 you know,
00:21:32.600 an armed revolution or anything.
00:21:36.860 Nothing violent.
00:21:37.760 I'm just saying,
00:21:40.020 no.
00:21:41.140 No.
00:21:42.400 Absolutely not.
00:21:44.060 That's just too far.
00:21:46.580 And whatever it takes
00:21:49.240 to make that stop,
00:21:50.580 short of violence,
00:21:51.600 we don't need that.
00:21:52.680 But whatever it takes.
00:21:55.020 I don't know,
00:21:55.660 it might be extreme civil disobedience.
00:21:59.060 You know,
00:21:59.300 I was sort of
00:22:01.520 half seriously saying that,
00:22:05.280 you know,
00:22:05.480 people might stop paying taxes.
00:22:07.040 But realistically,
00:22:08.600 it's kind of hard to do that.
00:22:11.040 I think something would happen.
00:22:13.460 I think something would happen.
00:22:15.680 It wouldn't surprise me
00:22:16.700 if everybody who had a gun
00:22:18.000 decided to
00:22:19.500 march down the street with it.
00:22:23.600 Imagine how scary that would be.
00:22:25.740 If just everybody who had a gun
00:22:27.320 just took it out
00:22:28.480 and just started marching down the street.
00:22:30.660 Didn't say anything.
00:22:32.340 No words necessary.
00:22:34.220 Just take all the guns out.
00:22:35.960 If you've got three guns,
00:22:37.160 take them all.
00:22:38.280 And just walk down the street
00:22:39.360 all together.
00:22:41.980 Now,
00:22:42.620 that's not a suggestion.
00:22:45.580 I'm using that as an example of
00:22:48.200 there's probably something
00:22:50.220 that the public would do,
00:22:53.260 but there's no way
00:22:54.320 they're going to let that happen.
00:22:55.660 There's just no way
00:22:56.840 that with what we know today,
00:23:00.240 they're going to put him in prison.
00:23:01.300 So I would see it
00:23:03.180 as more of a political move
00:23:04.820 to try to suppress
00:23:05.940 his election chances.
00:23:08.680 But I'll tell you,
00:23:10.520 here's how I read the room.
00:23:13.580 I think Trump is getting
00:23:14.880 more popular every day,
00:23:16.980 and some of it has to do
00:23:18.540 with the fact that the media
00:23:19.620 is reducing his visibility.
00:23:22.300 I think it's working for him.
00:23:25.280 They're reducing his visibility
00:23:26.780 at the same time
00:23:27.660 they're sucking all the energy
00:23:28.800 from the other candidates.
00:23:30.380 It's sort of the perfect situation.
00:23:33.520 Because they can't stop
00:23:34.760 talking about him,
00:23:36.500 but they're trying to keep
00:23:38.320 the talking about him
00:23:39.640 in this little narrow area
00:23:40.960 of his bad behavior
00:23:42.960 as they see it.
00:23:44.660 Except that so many people
00:23:46.220 don't see it as that bad,
00:23:48.880 that they're spending
00:23:49.520 all the time on a thing
00:23:50.480 that people really
00:23:51.160 don't give a shit about.
00:23:53.400 Do you think there's actually
00:23:54.440 a Democrat who cares
00:23:55.540 about his Mar-a-Lago boxes?
00:23:56.960 Like, actually really care?
00:24:00.920 Like, even one
00:24:01.940 in the whole world?
00:24:04.100 Do you think there's one?
00:24:05.220 You know, it would be funny
00:24:06.180 to see if you could get
00:24:07.500 that person to say it
00:24:08.800 on video.
00:24:10.900 Hey, what do you think
00:24:11.660 about these Mar-a-Lago boxes?
00:24:14.780 Well, I believe we'll need
00:24:17.080 to bring somebody in for that.
00:24:21.860 Dale?
00:24:23.260 What do you think
00:24:23.980 of President Trump
00:24:24.980 and those Mar-a-Lago boxes?
00:24:27.420 It's pretty bad.
00:24:28.480 Worst thing in the world.
00:24:29.960 Worst thing in the world.
00:24:33.020 Worst thing in the world.
00:24:34.960 Is it worse than inflation?
00:24:37.780 It's not worse than inflation,
00:24:39.140 but second worst thing
00:24:40.460 in the world.
00:24:41.940 Is it worse than
00:24:42.980 potential nuclear war
00:24:44.400 over Ukraine?
00:24:45.260 Not worse than that.
00:24:46.380 Third worst thing
00:24:47.400 in the world.
00:24:47.940 Is it worse than
00:24:49.880 the high unemployment?
00:24:51.940 Unemployment?
00:24:52.760 It's not worse than that,
00:24:53.720 but it's the fifth worst
00:24:55.060 thing in the world.
00:24:57.280 And then you just keep going.
00:25:00.180 You get to see
00:25:01.080 where Dale takes his stand.
00:25:04.040 All right.
00:25:06.100 This is the farthest
00:25:07.260 I'm going to go.
00:25:08.380 It's the 27th
00:25:09.600 most important thing
00:25:10.440 in the world.
00:25:13.160 What about climate change?
00:25:15.000 28th.
00:25:15.620 28th most important thing
00:25:16.920 in the world.
00:25:17.400 And so it goes.
00:25:21.740 But seriously,
00:25:23.320 you know,
00:25:24.040 joking aside,
00:25:25.360 do you think you could get
00:25:26.500 an actual regular human being
00:25:28.540 on, you know,
00:25:31.040 on video
00:25:31.620 to say,
00:25:33.180 yeah,
00:25:34.200 I tell you,
00:25:34.760 those Mar-a-Lago boxes
00:25:36.140 are keeping me up at night.
00:25:38.280 I was worried about
00:25:39.680 the climate
00:25:40.460 frying the world.
00:25:42.260 I was worried about
00:25:43.680 nuclear holocaust
00:25:45.580 because of the Ukraine war.
00:25:47.400 But not anymore.
00:25:49.460 Not anymore.
00:25:50.080 Now it's the Mar-a-Lago boxes.
00:25:52.420 I can't tell you
00:25:53.240 about the nightmares
00:25:54.080 I've had
00:25:54.700 about the boxes.
00:25:57.160 And when I imagine
00:25:58.520 in my mind
00:25:59.260 Trump rustling papers
00:26:01.600 and talking about secrets,
00:26:03.320 oh, no,
00:26:04.820 no,
00:26:05.260 this sound.
00:26:06.940 Here,
00:26:07.140 here's some secrets.
00:26:08.020 Here's some secrets.
00:26:08.980 I just hear it in my mind.
00:26:10.960 And it makes me crazy.
00:26:12.340 Yeah.
00:26:12.540 I mean,
00:26:15.880 in all seriousness,
00:26:17.620 wouldn't it be funny
00:26:18.420 to get somebody
00:26:19.160 to try to tell you
00:26:20.660 that they really care
00:26:22.500 about the boxes,
00:26:23.720 that they actually
00:26:25.140 care about them?
00:26:27.100 That would be hilarious
00:26:28.220 because at first
00:26:30.360 they'd say yes,
00:26:31.640 but then you push them
00:26:32.460 on it.
00:26:33.280 Yeah,
00:26:33.460 where is this,
00:26:34.440 where is it
00:26:34.980 in your list
00:26:35.480 of priorities?
00:26:36.080 If you were to compare
00:26:38.060 the Mar-a-Lago boxes
00:26:39.500 to, let's say,
00:26:41.140 the health of your family,
00:26:43.940 is it higher than that?
00:26:45.900 You can see
00:26:46.680 where that would go.
00:26:48.480 Get back in the box,
00:26:49.480 Dale.
00:26:52.460 All right.
00:26:55.820 One of the things
00:26:56.740 I've been waiting to see
00:26:58.100 is something that
00:26:59.060 famous investor
00:27:00.320 Bill Ackman
00:27:01.160 wants to see as well.
00:27:03.300 You're aware
00:27:03.840 that Joe Rogan
00:27:05.480 did an extensive interview
00:27:06.680 with RFK Jr.,
00:27:08.080 and RFK Jr.
00:27:09.360 made a number
00:27:10.000 of claims
00:27:10.580 about vaccinations
00:27:14.980 and cell phones
00:27:18.600 being bad for you
00:27:18.600 and basically vaccinations
00:27:21.820 and some other things.
00:27:24.800 So if you're just listening
00:27:27.540 to one person make claims,
00:27:29.640 they would be very persuasive.
00:27:32.320 Have I ever taught you
00:27:33.520 that listening to one lawyer
00:27:35.460 talk about their case
00:27:37.460 tells you absolutely
00:27:39.380 nothing about the truth?
00:27:41.600 Nothing about the truth.
00:27:43.220 Because a lawyer,
00:27:44.300 by training,
00:27:46.520 and that's what RFK Jr. is,
00:27:48.780 they're trained to give you
00:27:50.180 their version of things,
00:27:51.400 and they're trained
00:27:52.060 to make it sound
00:27:52.740 persuasive as hell
00:27:53.920 until you hear the other side.
00:27:56.740 It's hearing the other side
00:27:57.920 where everything falls apart.
00:27:58.980 So there's a three-hour Joe Rogan.
00:28:03.800 Was that a useful service
00:28:06.000 to America?
00:28:08.360 Do you think Joe,
00:28:09.280 because I consider Joe Rogan
00:28:10.580 a national treasure,
00:28:12.340 one of the most useful,
00:28:13.620 useful citizens
00:28:15.760 America has ever produced.
00:28:18.020 And his show is a,
00:28:19.960 you know,
00:28:20.260 it's a force of nature.
00:28:21.280 All right,
00:28:23.140 but everybody who said
00:28:23.920 that's useful,
00:28:24.680 useful,
00:28:25.260 you are so wrong.
00:28:26.300 Oh my God,
00:28:26.900 you're wrong.
00:28:27.920 You could not be more wrong.
00:28:30.240 There's nothing more destructive
00:28:31.500 than three hours
00:28:32.460 of one lawyer's opinion.
00:28:34.620 In fact,
00:28:35.120 if you were to make a list
00:28:36.140 of the worst fucking things
00:28:37.500 you could ever do,
00:28:38.240 that would be it.
00:28:39.580 That's the worst thing
00:28:40.660 you could do.
00:28:42.060 So I would say that
00:28:43.340 the Joe Rogan
00:28:44.760 three hours with RFK
00:28:47.060 was damaging to the country.
00:28:49.320 That was just bad
00:28:50.320 for the country.
00:28:51.260 Absolutely fucking bad.
00:28:53.200 Really bad.
00:28:54.420 Like,
00:28:54.620 seriously,
00:28:55.600 seriously bad.
00:28:56.960 It's like,
00:28:57.520 one of the worst things
00:28:58.300 that's happened this year.
00:29:00.120 Now,
00:29:00.660 am I saying that RFK
00:29:01.740 was wrong about anything?
00:29:03.560 Nope.
00:29:04.580 Nope.
00:29:05.540 You didn't hear that.
00:29:07.180 That was not my opinion.
00:29:09.560 Do I think he's probably wrong
00:29:11.600 about some things he claims?
00:29:14.860 I would put a really big bet
00:29:16.700 on that.
00:29:18.640 Because when I listen to him talk,
00:29:20.820 I hear a lot of correlation
00:29:22.200 that he thinks is causation.
00:29:24.620 And I can't,
00:29:26.320 I just can't buy into that.
00:29:28.560 I understand the correlation,
00:29:30.580 and I understand
00:29:31.620 it would be worth looking into.
00:29:33.100 I agree with that completely.
00:29:35.200 But to assume
00:29:37.780 that we know something
00:29:38.780 because of correlation
00:29:39.880 is a really big ask,
00:29:42.300 and I can't get there.
00:29:43.960 So here's what would
00:29:45.160 have been useful.
00:29:46.400 What would have been
00:29:47.180 really useful
00:29:47.800 is somebody who had
00:29:48.740 the same amount of knowledge
00:29:50.780 as RFK Jr. has,
00:29:53.140 but has a different
00:29:54.240 point of view.
00:29:55.560 And have both of them
00:29:56.840 on for three hours,
00:29:57.880 and one says a claim,
00:29:59.160 the other says
00:29:59.740 what they think,
00:30:00.800 then maybe,
00:30:02.300 maybe you would have
00:30:03.160 something good
00:30:03.720 for the country.
00:30:04.360 But whether Joe Rogan
00:30:06.800 had RFK Jr. on
00:30:08.320 or somebody
00:30:10.320 just as powerful
00:30:11.220 who had completely
00:30:12.380 opposite points of view,
00:30:14.360 my opinion would be
00:30:15.280 the same.
00:30:16.520 It wouldn't matter
00:30:17.160 if you had RFK Jr.
00:30:18.360 for three hours,
00:30:19.380 or somebody disagreed
00:30:20.740 with everything he said
00:30:21.700 for three hours.
00:30:22.700 If you have them on
00:30:23.780 by themselves
00:30:24.640 with just Joe Rogan,
00:30:26.780 that is a huge mistake
00:30:28.380 and a disservice
00:30:29.180 to the country.
00:30:29.880 A very big one.
00:30:31.000 It's a really big disservice.
00:30:32.460 And I think,
00:30:34.800 because I love me
00:30:36.260 some Joe Rogan,
00:30:37.620 as most of you do.
00:30:39.580 And so, you know,
00:30:41.800 I have ultimate respect
00:30:42.960 for everything he's doing.
00:30:44.720 But that's a real big mistake
00:30:46.380 in terms of benefit
00:30:47.820 to the country.
00:30:48.980 It's real good
00:30:49.580 for ratings.
00:30:51.320 You know,
00:30:51.500 it's good for,
00:30:52.380 you know,
00:30:53.460 entertainment.
00:30:54.400 But in terms of
00:30:55.440 informing the country,
00:30:56.740 it's a disaster.
00:30:58.220 It's a fucking disaster.
00:31:00.480 And not because
00:31:01.520 RFK is wrong.
00:31:02.840 You get that, right?
00:31:04.360 I'm not saying
00:31:05.160 RFK Jr. is wrong.
00:31:07.240 I'm saying
00:31:07.840 if you don't put
00:31:08.440 the other side,
00:31:09.440 when there is
00:31:10.180 such a big other side,
00:31:11.800 if you don't put them
00:31:12.600 on there at the same time,
00:31:14.160 you are misinforming
00:31:15.260 the public,
00:31:16.180 period.
00:31:17.480 Period.
00:31:18.120 That's just propaganda
00:31:19.240 at that point.
00:31:20.900 Now, it's accidental.
00:31:22.620 There's no way
00:31:23.380 that Joe Rogan
00:31:24.160 is, like,
00:31:24.740 intentionally trying
00:31:25.680 to mislead people.
00:31:27.220 There's no way
00:31:28.240 that's true.
00:31:29.380 We've all watched
00:31:30.320 him long enough.
00:31:31.060 We know he's
00:31:32.360 pretty dedicated
00:31:33.140 to what's true.
00:31:34.600 So there's no way
00:31:35.380 he's trying to do
00:31:36.040 anything that's misleading.
00:31:37.500 It just works out
00:31:38.420 that way.
00:31:39.120 That's just the outcome.
00:31:40.660 So I would bet
00:31:41.460 there's a whole bunch
00:31:42.580 of people who are
00:31:43.380 very persuaded
00:31:44.240 by RFK Jr.
00:31:45.360 and you should not
00:31:47.440 have been.
00:31:48.800 And I don't mean
00:31:49.520 because his arguments
00:31:50.420 are not good,
00:31:51.280 because they're actually
00:31:51.780 quite good.
00:31:52.680 His arguments are
00:31:53.320 very persuasive-sounding.
00:31:56.180 I just can't judge them.
00:31:59.740 Yeah, you would be
00:32:00.640 very, very poorly served
00:32:02.360 if you watched that.
00:32:03.320 In fact,
00:32:03.700 that's the reason
00:32:04.140 I didn't watch it.
00:32:05.400 I saw some clips,
00:32:07.000 and I said,
00:32:07.600 I'm not going to watch this.
00:32:09.260 This is not good.
00:32:11.120 This does not help anybody.
00:32:14.200 All right.
00:32:15.200 However,
00:32:15.840 I love the fact
00:32:16.600 that RFK Jr.
00:32:17.540 will go anywhere
00:32:18.480 and talk to anybody.
00:32:19.600 He can do it
00:32:20.060 without notes.
00:32:21.140 He can do it persuasively.
00:32:22.580 He's impressive as hell.
00:32:24.600 So let me say that.
00:32:25.920 RFK Jr.,
00:32:27.420 impressive as hell.
00:32:29.560 Doesn't mean he's right.
00:32:30.600 And I've got some
00:32:32.540 big questions about that.
00:32:34.900 But, impressive.
00:32:37.020 All right.
00:32:37.760 I did see a Twitter user
00:32:39.700 called BadStats.
00:32:41.940 And I retweeted it
00:32:43.200 if you're looking for it.
00:32:44.280 In which he went through
00:32:45.280 and tried to,
00:32:47.260 I'm going to use the word
00:32:48.440 debunk,
00:32:49.700 but since I don't know
00:32:50.700 which one is right,
00:32:52.500 don't assume that the debunker
00:32:54.300 is the right one.
00:32:55.360 I'm just using that
00:32:56.280 as a shorthand.
00:32:57.260 So somebody who disagreed.
00:32:59.300 And it was a long,
00:33:00.600 long thread
00:33:00.940 in which he went through
00:33:01.900 various things
00:33:02.840 and showed his work.
00:33:05.440 So, that was useful.
00:33:08.020 If the only thing you saw
00:33:09.220 was RFK Jr.
00:33:11.000 on Joe Rogan,
00:33:13.360 you should see somebody
00:33:15.020 who's got a different opinion
00:33:16.660 and some knowledge
00:33:17.840 pushing back on it.
00:33:20.300 The pushback is just as strong.
00:33:23.720 Right?
00:33:24.160 If you listen to RFK Jr.
00:33:25.440 by himself,
00:33:26.140 you're totally persuaded
00:33:27.180 because he's very persuasive.
00:33:29.240 As soon as you see
00:33:30.080 the thread
00:33:30.860 that calls out,
00:33:32.980 you know,
00:33:33.300 some,
00:33:34.040 let's say,
00:33:36.020 I don't want to say flaws,
00:33:37.080 but a different opinion
00:33:37.800 on the same situation,
00:33:39.960 you completely change your mind.
00:33:41.840 You're like,
00:33:42.220 oh,
00:33:43.080 ooh,
00:33:43.380 I didn't know that.
00:33:44.800 Ooh,
00:33:45.220 really?
00:33:46.000 Really?
00:33:46.360 That's a correlation,
00:33:47.220 not a causation?
00:33:48.100 Ooh.
00:33:48.980 But then you look
00:33:49.900 at the debunk
00:33:50.760 and you say to yourself,
00:33:51.740 okay,
00:33:51.960 there's one part
00:33:52.780 of this debunk
00:33:53.500 that I know
00:33:53.960 looks like bullshit.
00:33:54.660 And then you say to yourself,
00:33:57.380 all right,
00:33:57.680 who's behind this debunking?
00:33:59.100 Is it Big Pharma?
00:34:00.300 So you never really
00:34:01.300 can get to full confidence.
00:34:03.680 I'm just saying
00:34:04.220 if you haven't seen both sides,
00:34:05.440 you've seen nothing.
00:34:07.680 All right.
00:34:08.160 I have a theory
00:34:09.520 of systemic racism
00:34:11.160 that cortisol levels
00:34:15.900 could be measured
00:34:18.500 to find out
00:34:19.300 where systemic racism
00:34:20.740 is bad,
00:34:21.880 and then you could use that,
00:34:23.580 that knowledge,
00:34:25.460 to know what to do about it.
00:34:27.320 Now,
00:34:27.620 cortisol is the chemical
00:34:29.100 released when you're
00:34:30.380 under stress.
00:34:32.420 And what happens is
00:34:33.700 if your cortisol level
00:34:34.820 goes up,
00:34:35.440 John,
00:34:40.240 are you stupid?
00:34:41.920 I can't tell.
00:34:43.560 Let's figure out
00:34:44.560 if John is stupid.
00:34:45.960 Here's what he says.
00:34:46.780 Scott,
00:34:47.280 Scott,
00:34:48.120 big issues with RFK,
00:34:49.860 but little mountain
00:34:51.240 with his argument
00:34:52.100 is because he refuses
00:34:53.900 to hear him speak.
00:34:56.220 So somebody's saying
00:34:57.160 that my refusing
00:34:58.240 to hear one side
00:34:59.680 is making me dumb.
00:35:02.520 Whereas my only point was
00:35:04.220 you should hear
00:35:04.780 both sides.
00:35:06.580 But you think
00:35:07.380 that refusing to hear
00:35:08.240 one side by itself
00:35:09.360 makes me dumb,
00:35:11.180 John.
00:35:12.800 Do better.
00:35:13.640 Do better, John.
00:35:14.860 You can do better than that.
00:35:18.520 Who argues against
00:35:20.020 hearing both sides
00:35:21.020 of an argument
00:35:21.580 besides John?
00:35:23.420 All right.
00:35:33.920 Cortisol.
00:35:35.220 So if your cortisol
00:35:36.380 levels are high,
00:35:37.540 it lowers your
00:35:38.300 cognitive function.
00:35:39.800 Did you know that?
00:35:41.340 If you're stressed,
00:35:42.440 you actually can't
00:35:43.240 think as well.
00:35:44.040 You don't,
00:35:44.560 and if it's long term,
00:35:46.840 long term,
00:35:47.620 you're just not going
00:35:48.900 to do as well
00:35:49.660 in school.
00:35:50.200 So your academic
00:35:52.320 performance,
00:35:53.200 your actual IQ
00:35:54.120 will drop
00:35:54.880 if you have enough
00:35:56.120 cortisol for long
00:35:57.080 enough.
00:35:57.360 So that's based on
00:35:58.260 stress.
00:35:59.260 Now, where would
00:35:59.900 cortisol be highest?
00:36:02.060 Where do you think
00:36:02.900 would be the highest
00:36:03.600 cortisol?
00:36:04.680 Well, I'll speculate.
00:36:07.060 I would imagine
00:36:07.660 if you were poor,
00:36:09.420 you would have
00:36:10.140 higher cortisol.
00:36:10.920 There's at least one
00:36:11.640 study that shows
00:36:12.740 a strong correlation.
00:36:14.300 So poverty alone
00:36:15.420 would give you
00:36:15.840 higher cortisol.
00:36:16.640 But also,
00:36:18.360 I would think
00:36:19.180 living in a high
00:36:20.380 crime area,
00:36:22.220 you know,
00:36:22.440 dangerous neighborhood,
00:36:23.600 high cortisol.
00:36:24.700 I suppose,
00:36:26.000 logically,
00:36:27.120 if you had one
00:36:27.860 parent,
00:36:28.960 you might feel
00:36:29.640 a little more
00:36:30.500 anxious than
00:36:31.620 if you had two,
00:36:32.840 if there were,
00:36:33.340 you know,
00:36:33.540 two good parents
00:36:34.280 who were taking
00:36:34.720 care of you.
00:36:35.500 You'd feel a little
00:36:36.460 less safe.
00:36:38.620 So you can imagine
00:36:39.720 a whole host of
00:36:40.880 things that would
00:36:41.940 make you have
00:36:42.660 high cortisol
00:36:43.280 if you lived
00:36:43.940 in some urban,
00:36:45.520 high density,
00:36:46.200 high crime area.
00:36:49.620 Doesn't that
00:36:50.200 almost perfectly
00:36:51.140 explain the situation?
00:36:53.060 That the people
00:36:53.900 who are in the place
00:36:55.240 that creates
00:36:55.860 high cortisol
00:36:56.520 are the ones
00:36:58.460 who are performing
00:36:59.180 poorly.
00:37:00.480 They're not doing
00:37:01.000 as well in school.
00:37:03.080 Now, I'm not saying
00:37:03.920 that's the one thing
00:37:05.360 that's causing it.
00:37:06.520 I'm saying that
00:37:07.320 it's a really big thing.
00:37:09.380 And if we were
00:37:09.940 to only measure
00:37:10.880 the cortisol levels
00:37:12.080 in, let's say,
00:37:14.040 the entire black population
00:37:15.380 versus the entire
00:37:16.280 white population,
00:37:17.300 that would tell you
00:37:18.560 what the difference
00:37:20.820 is on people's
00:37:22.260 actual reality,
00:37:24.260 their bodies.
00:37:25.980 Because if black
00:37:27.380 people in general
00:37:28.360 are having
00:37:30.180 high cortisol
00:37:31.120 situations,
00:37:32.200 just because of
00:37:32.980 where they live
00:37:33.540 mostly,
00:37:34.800 you would expect
00:37:36.060 that would be
00:37:36.520 a huge drag
00:37:37.500 on performance.
00:37:38.920 But it would also
00:37:39.840 give you something
00:37:40.460 to fix
00:37:41.100 or at least
00:37:42.320 something to test.
00:37:43.680 So you could test
00:37:44.560 the cortisol theory
00:37:45.600 fairly easily
00:37:46.300 by taking some
00:37:48.000 random black kids
00:37:49.440 out of an urban area,
00:37:51.200 putting them in,
00:37:52.000 let's say,
00:37:52.940 a nicer environment,
00:37:55.440 remove all the danger,
00:37:57.780 so you just put them
00:37:58.360 in a no-danger situation
00:37:59.800 where they still have
00:38:00.460 a path to success.
00:38:02.340 Just see what happens.
00:38:03.840 First of all,
00:38:04.540 see if their cortisol
00:38:05.420 levels go down.
00:38:06.420 So that's the first
00:38:08.300 thing you test.
00:38:09.740 And I think it would.
00:38:11.140 And then you test
00:38:12.020 if that seemed
00:38:14.100 to be correlated
00:38:14.780 with better
00:38:15.420 academic achievement.
00:38:17.500 Probably would.
00:38:19.780 Now, you know,
00:38:20.880 you also have
00:38:21.360 other influences
00:38:22.120 of the inner city,
00:38:23.220 drugs, etc.
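For what it's worth, here is a minimal sketch, with entirely made-up numbers, of the two checks just described: whether cortisol actually drops after the change of environment, and whether the size of the drop tracks academic improvement. It assumes numpy and scipy; the data and variable names are illustrative only, and even a strong result here would still be correlation, not proof of causation.

```python
# Minimal sketch of the proposed test, using hypothetical paired measurements
# for the same students before and after the change of environment.
import numpy as np
from scipy import stats

cortisol_before = np.array([18.2, 21.5, 19.8, 24.1, 22.3, 20.0])
cortisol_after  = np.array([14.1, 17.0, 18.5, 19.2, 16.8, 15.5])
score_change    = np.array([ 6.0,  9.0,  2.0,  8.0, 11.0,  7.0])  # test-score gains

# (1) Paired t-test: is the average cortisol drop more than noise?
t_stat, p_drop = stats.ttest_rel(cortisol_before, cortisol_after)

# (2) Correlation between how much cortisol fell and how much scores rose.
drop = cortisol_before - cortisol_after
r, p_corr = stats.pearsonr(drop, score_change)

print(f"mean cortisol drop: {drop.mean():.1f} (paired t-test p={p_drop:.3f})")
print(f"drop vs. score gain: r={r:.2f} (p={p_corr:.3f})")
```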
00:38:26.180 So,
00:38:28.740 but I'm going to
00:38:30.360 dovetail this
00:38:31.000 into my bigger point,
00:38:32.640 which is
00:38:33.020 the big secret
00:38:35.760 of personal success
00:38:37.080 is to decouple
00:38:39.740 what is true
00:38:40.700 from what works.
00:38:43.900 And that's
00:38:44.660 totally counterintuitive,
00:38:45.940 and I think
00:38:46.380 it's one of the
00:38:47.080 biggest problems
00:38:47.680 in the world
00:38:48.280 is that that's
00:38:49.080 counterintuitive.
00:38:50.540 And it's been my,
00:38:51.360 sort of my
00:38:51.940 recent life's work,
00:38:54.260 you know,
00:38:54.420 the second half
00:38:55.040 of my life,
00:38:55.960 has been
00:38:56.520 specifically on this,
00:38:59.080 this illusion.
00:39:00.800 The illusion
00:39:01.760 that you have
00:39:02.700 to focus
00:39:03.280 on what's true
00:39:04.380 as opposed
00:39:06.060 to what works.
00:39:08.520 And here's
00:39:09.040 the perfect
00:39:09.840 example of it.
00:39:11.600 Systemic racism
00:39:12.540 is true.
00:39:15.880 Right?
00:39:16.240 Now, I think
00:39:16.820 that there's
00:39:17.720 a systemic
00:39:19.020 poverty problem
00:39:20.400 that's, you know,
00:39:22.680 the bigger part of it.
00:39:23.840 So if you're
00:39:24.340 a poor anything,
00:39:26.060 if you're just poor,
00:39:27.600 that's going
00:39:28.360 to have an effect
00:39:29.120 on your generation
00:39:30.040 and maybe the next
00:39:30.760 one too.
00:39:32.080 So,
00:39:33.440 but systemic
00:39:34.120 racism is real.
00:39:35.640 Right?
00:39:36.080 So if you say,
00:39:37.320 but what's true?
00:39:39.100 Is systemic
00:39:39.720 racism true?
00:39:40.900 Yes.
00:39:41.780 Does it have
00:39:42.360 an impact
00:39:42.860 on some people?
00:39:44.160 Yes.
00:39:46.020 So what do you
00:39:46.780 do about that?
00:39:47.620 It's true,
00:39:48.940 has an impact,
00:39:50.000 so that's where
00:39:50.580 you put your resources,
00:39:51.580 right?
00:39:52.260 That's the thing
00:39:52.920 you fix,
00:39:53.380 because it's true
00:39:54.120 and it has
00:39:55.940 a negative effect,
00:39:56.900 so you should
00:39:57.380 go fix that thing.
00:39:58.660 That makes
00:39:59.260 common sense,
00:40:00.000 right?
00:40:00.520 That's straightforward.
00:40:01.780 Here's my problem.
00:40:02.660 Here's the solution.
00:40:04.100 Everybody agrees.
00:40:05.340 The trouble is
00:40:06.080 that gets you
00:40:06.480 a terrible outcome.
00:40:09.860 Sorry.
00:40:11.020 I'm sorry.
00:40:11.560 The truth
00:40:11.920 gets you
00:40:12.260 a bad outcome.
00:40:13.520 So if you want
00:40:14.500 a good outcome,
00:40:15.880 you have to go
00:40:17.040 a different direction,
00:40:18.460 which is you
00:40:19.260 don't ignore
00:40:19.880 the truth
00:40:20.480 that systemic
00:40:21.500 racism exists.
00:40:23.660 So you don't
00:40:24.120 ignore it.
00:40:24.960 You just find
00:40:25.820 a way to slice
00:40:26.540 through it,
00:40:27.620 which is to focus
00:40:28.620 on personal
00:40:30.020 success strategies.
00:40:31.480 Make sure
00:40:32.060 that you go
00:40:32.660 to school,
00:40:33.780 even if other
00:40:34.420 people don't.
00:40:35.160 Make sure
00:40:35.600 you don't get
00:40:36.360 on drugs,
00:40:37.300 even if other
00:40:37.960 people do.
00:40:38.960 You know,
00:40:39.240 basic stuff.
00:40:40.960 Stay off
00:40:41.420 of drugs,
00:40:41.980 don't get
00:40:42.300 anybody pregnant,
00:40:43.160 don't get
00:40:43.520 pregnant too
00:40:44.500 early.
00:40:46.660 You know,
00:40:46.920 build a talent
00:40:47.560 stack,
00:40:48.540 systems over
00:40:49.280 goals,
00:40:50.040 you know,
00:40:50.180 make sure
00:40:50.500 your sleep
00:40:51.740 and your diet
00:40:52.640 are on point.
00:40:54.400 I mean,
00:40:54.700 the methods
00:40:55.680 for being successful
00:40:56.840 have always
00:40:57.700 been the same.
00:40:59.160 You know,
00:40:59.300 they improve
00:40:59.820 a little bit
00:41:00.280 over time.
00:41:01.660 But how to do
00:41:03.260 it is pretty
00:41:04.440 well defined.
00:41:05.700 People do
00:41:06.220 understand how
00:41:07.020 to be successful.
00:41:08.360 The trouble is,
00:41:09.480 if you're focusing
00:41:10.220 on what's true,
00:41:11.900 you're going to
00:41:13.480 go the wrong
00:41:13.940 direction.
00:41:15.120 You're going to
00:41:15.560 feel a victim.
00:41:17.060 Do you know
00:41:17.360 what happens
00:41:17.840 if somebody
00:41:18.360 tells you
00:41:18.900 you're a victim
00:41:19.520 and somebody
00:41:20.580 tells you
00:41:21.140 that you're
00:41:21.480 a victim
00:41:21.800 of discrimination?
00:41:22.660 It raises
00:41:25.000 your cortisol
00:41:26.940 levels.
00:41:28.180 It would be
00:41:28.600 another form
00:41:29.620 of stress.
00:41:30.980 All stress
00:41:31.620 raises your
00:41:32.420 cortisol.
00:41:33.780 So telling
00:41:34.260 people that
00:41:34.880 there is
00:41:35.220 systemic racism
00:41:36.200 and it could
00:41:36.840 hold them
00:41:37.280 back should
00:41:38.840 make them
00:41:39.960 fail.
00:41:41.360 Everything we
00:41:42.260 know about
00:41:42.660 everything says
00:41:44.200 that telling
00:41:44.640 people that
00:41:45.160 they have an
00:41:45.540 obstacle they
00:41:46.220 can't easily
00:41:47.640 get past
00:41:48.340 will make
00:41:49.700 them fail.
00:41:50.240 However,
00:41:52.760 everything we
00:41:53.500 know about
00:41:53.880 the world
00:41:54.280 suggests that
00:41:55.060 if you tell
00:41:55.440 people that
00:41:55.940 nothing can
00:41:56.500 stop them,
00:41:57.660 which is
00:41:58.160 what?
00:41:59.000 It's a lie.
00:42:00.840 That's a lie.
00:42:02.240 Of course
00:42:02.720 things can
00:42:03.200 stop you.
00:42:04.620 There's a lot
00:42:05.360 of stuff that
00:42:05.800 can stop you.
00:42:06.860 Stop you
00:42:07.360 right in your
00:42:07.740 tracks.
00:42:08.920 But if you
00:42:10.020 act as though
00:42:10.860 that's not
00:42:11.500 true,
00:42:12.260 when it's
00:42:12.740 clearly true,
00:42:14.140 your result
00:42:15.260 will be way
00:42:16.100 better.
00:42:17.440 Because you're
00:42:18.180 going to push
00:42:18.540 through the
00:42:19.000 smaller,
00:42:19.340 you're going
00:42:20.040 to push
00:42:20.320 through the
00:42:20.660 smaller problems.
00:42:21.860 You're going
00:42:22.300 to say,
00:42:23.180 well,
00:42:23.360 there's a
00:42:23.780 door in
00:42:24.260 front of
00:42:24.500 me that's
00:42:24.800 locked.
00:42:25.900 I guess
00:42:26.340 I'm done
00:42:26.760 here.
00:42:27.960 Systemic
00:42:28.360 racism.
00:42:29.440 It's a
00:42:29.840 locked door.
00:42:30.540 What can
00:42:30.860 I do?
00:42:31.720 But if you
00:42:32.260 told yourself
00:42:32.820 that nothing
00:42:33.280 could stop
00:42:33.820 you,
00:42:34.760 you'd find
00:42:35.680 the locksmith,
00:42:36.620 you'd figure
00:42:37.700 out how to
00:42:38.160 pick a
00:42:38.480 lock,
00:42:38.980 you'd go
00:42:39.600 to the
00:42:39.860 gym until
00:42:40.280 you could
00:42:40.620 kick that
00:42:41.000 door down.
00:42:42.420 And then
00:42:42.780 you would
00:42:43.060 kick it
00:42:43.360 down.
00:42:45.100 And then
00:42:45.460 there'd be
00:42:45.820 another
00:42:46.280 obstacle.
00:42:47.400 And then
00:42:47.680 you'd say,
00:42:48.020 ah,
00:42:48.280 another
00:42:49.340 just like
00:42:49.920 I expected.
00:42:51.020 And then
00:42:51.340 you dismantle
00:42:52.200 the wall
00:42:53.260 one brick
00:42:53.780 at a time,
00:42:54.700 and you
00:42:54.920 go to the
00:42:55.220 next one.
00:42:56.160 That's
00:42:56.500 life.
00:42:58.200 That's
00:42:58.580 life.
00:42:59.580 Now,
00:42:59.980 here's the
00:43:00.300 biggest problem
00:43:01.200 with systemic
00:43:01.940 racism,
00:43:02.680 in my opinion.
00:43:04.980 I was
00:43:05.720 raised to
00:43:06.240 believe that.
00:43:07.440 That's
00:43:07.740 exactly how
00:43:08.460 I was
00:43:08.720 raised,
00:43:09.240 to believe
00:43:09.900 that I
00:43:10.440 could push
00:43:10.900 through any
00:43:11.460 obstacle.
00:43:13.300 Were you?
00:43:14.620 How many
00:43:15.240 of you were
00:43:15.660 raised that
00:43:16.080 way,
00:43:16.900 to believe
00:43:17.320 that you
00:43:17.580 could push
00:43:18.020 through whatever
00:43:18.420 you needed
00:43:18.800 to push
00:43:19.160 through?
00:43:22.360 And I'll
00:43:22.940 bet you
00:43:23.200 all the
00:43:23.480 people saying
00:43:23.980 yes did
00:43:24.460 okay in
00:43:24.940 life.
00:43:25.920 You probably
00:43:26.740 raised a
00:43:27.260 family,
00:43:28.540 took care of
00:43:29.080 your retirement.
00:43:30.560 I'll bet
00:43:30.880 you did.
00:43:31.900 And I'll bet
00:43:32.540 you if you
00:43:32.900 looked at the
00:43:33.360 population of
00:43:34.060 people who
00:43:34.420 were told they
00:43:34.760 couldn't
00:43:35.060 succeed because
00:43:35.840 of this
00:43:36.100 big old
00:43:36.580 racist
00:43:37.600 country that's
00:43:38.380 working against
00:43:39.000 them, I'll
00:43:39.960 bet they
00:43:40.280 didn't do
00:43:40.600 so well.
00:43:42.840 And I'm
00:43:44.200 sure there
00:43:44.620 are lots of
00:43:45.960 reasons for
00:43:46.780 the differences
00:43:47.300 in performance
00:43:48.240 across groups.
00:43:49.460 There are
00:43:49.920 lots of
00:43:50.180 reasons.
00:43:51.460 But I'll
00:43:51.720 bet if you
00:43:52.080 just remove
00:43:52.660 this one
00:43:53.480 thing, you'd
00:43:54.420 have the
00:43:54.720 biggest improvement.
00:43:58.120 If you could
00:43:58.660 just convince
00:43:59.260 people that,
00:44:00.740 yeah, systemic
00:44:01.540 racism exists,
00:44:03.140 and you can
00:44:03.760 slice through it
00:44:04.600 like it doesn't.
00:44:06.960 Is that true?
00:44:08.300 It's true enough.
00:44:09.840 It's true enough
00:44:10.760 for the purpose
00:44:11.700 that you want it
00:44:12.440 to be, which is
00:44:13.060 to inspire people
00:44:14.120 to do their
00:44:14.560 best.
00:44:14.820 But it's
00:44:15.960 not true,
00:44:16.900 true, you
00:44:18.500 know, like
00:44:18.920 true, true,
00:44:19.720 capital T,
00:44:21.080 true and, you
00:44:21.860 know, that
00:44:22.180 kind of
00:44:22.500 true, because
00:44:23.740 a rock could
00:44:24.520 fall on your
00:44:25.000 head and
00:44:25.320 kill you.
00:44:26.820 Right?
00:44:27.360 A drunk
00:44:27.820 driver could
00:44:28.440 hit you
00:44:28.760 tomorrow.
00:44:29.940 No, you can't do anything that you want to do all the time, but if your mindset is that you can, you're going to do better.
00:44:38.880 So there's no way to get around the fact that personal success requires believing, or acting as if you believe, something that's absolutely not true.
00:44:52.720 You get that, right? Does everybody get that? That personal success requires you to accept that which you know not to be true.
00:45:01.500 That you can do anything. You can't do anything. You can't flap your wings and fly.
00:45:08.580 But you can get through most problems. You can solve most problems. That is true.
00:45:14.380 And I would even go further and say that if you were, let's say, a black American, you could take advantage of the various advantages that everybody has in different categories.
00:45:26.380 Your personal advantage might be different than mine.
00:45:30.040 So my advantage might be that, let's say, I could get a job with a bunch of racists if I'm the same color, but you couldn't.
00:45:39.860 So there'd be little pockets of things where I've got an advantage.
00:45:43.280 But then if I went to a Fortune 500 company that was trying to improve their diversity, and we walked in with the same qualifications for the same job, the black guy gets it every time.
00:45:54.380 Every time. 100% of the time the black guy is going to get the job with equal qualifications, because he's a twofer.
00:46:03.820 The black guy is a twofer, but I'm a onefer. I give them just a good employee, I hope, but the black guy gives them a good employee and solves their diversity problem a little bit.
00:46:15.440 It's a twofer. Of course they're going to pick the twofer. Why wouldn't they?
00:46:19.840 You would be an idiot if you didn't. I would. I would do the same thing.
00:46:24.480 So everybody's got an advantage, you just have to find it instead of complaining about the other person having an advantage, which they do.
00:46:32.540 All right.
00:46:37.680 Apparently this Adam Schiff censure, which I was opposed to when it included a $16 million fine, is going to be reintroduced without the monetary part, I guess, so he'll just get censured, which I think should be enough to get all of the Republicans to vote for it.
00:46:53.640 So I think there's a good chance he'll get censured, and I think this is a far better outcome, because I don't want to set that precedent of Congress fining each other.
00:47:04.500 Like, that would just be the end of any useful Congress, if they're useful at all.
00:47:11.720 All right, here's the latest on Ukraine.
00:47:17.780 Ukraine is just such a weird thing to talk about because there's such a lack of information.
00:47:23.200 But now there's talk of giving Ukraine what they would call Israel status.
00:47:29.880 Israel status. Meaning that with Israel, we give them permanent, ongoing, regular military assistance, just sort of forever.
00:47:44.540 And the idea is that instead of coming up with specific funding for Ukraine for a specific war, they would become our, let's say, military dependent forever.
00:47:57.160 Yeah, we must be pretty dumb. We must be pretty dumb.
00:48:05.580 I don't really understand what's going on there.
00:48:09.720 It's got to be that the neocons are just trying to destroy Russia, and they just don't want to say it directly. Although they do say it directly, some of them.
00:48:19.720 Yeah. The whole idea of making Ukraine NATO, unless, well, let me put it this way: if we could destroy Russia, then making Ukraine NATO might make some sense, I suppose, just to keep the lid on the thing you destroyed.
00:48:37.720 But if you're not going to destroy Russia, it doesn't seem like a good idea to keep, you know, keep attacking it with more and more NATO surrounding it.
00:48:49.080 It feels like a really bad idea.
00:48:52.100 Yeah, it feels like the military industrial complex just wants another permanent piggy bank. And there you go.
00:49:03.620 Make Russia the 51st state. Yeah. All right.
00:49:10.340 So that should be alarming.
00:49:12.880 And then here's the news from the Ukraine counteroffensive.
00:49:18.020 And remember, I've taught you that if there are too many explanations for a thing, all you know is that nobody knows about the thing.
00:49:26.580 Listen to how many explanations there are of what's happening over there.
00:49:30.340 Because the counteroffensive is moving very slowly, right? There's almost nothing happening in terms of change of territory. A little bit, but nothing important.
00:49:40.560 And so one take on it is that what the Ukrainians are doing is a strategy of testing all along the front line.
00:49:50.660 And as they're testing, they're learning.
00:49:53.300 And really, the testing phase is a legitimate military strategy in which you test, test, test. And then once you find the weak spot, then you mass up and you go through the weak spot.
00:50:06.340 So is that what's happening? Are they probing? Are they probing and learning? Maybe. That's one possibility.
00:50:13.160 The other is that they're waiting for the Russians to make mistakes.
00:50:18.600 And because the Russian military always makes mistakes, all you really have to do is get them to react.
00:50:24.940 So simply getting them to react will cause them to make mistakes, and then you let them make the mistakes.
00:50:31.640 Another take is that what the Ukrainians were doing was trying to wipe out the Russian conscripts first.
00:50:39.900 Instead of going after big assets, they were going after troops, because if they could kill enough of the troops, it would have a devastating effect on the psychology of the Russian citizens, who might try to stop the war.
00:50:53.840 The other is, it's really all about cutting off Crimea and making it look like they'd better negotiate, because that would be a big win if they could do it.
00:51:05.440 So, how many billions of dollars are we giving to Ukraine, and our press can't tell us what the strategy is?
00:51:15.420 Now, are you bothered by that?
00:51:18.400 Now, I'm not sure that I want the press to know, you know, detailed military strategy, because then the other side would know too.
00:51:25.380 But don't you feel that for our billions and billions of dollars, we should have some idea what they're trying to do over there?
00:51:33.320 Are they really trying to win? Or are they trying to just negotiate better?
00:51:39.260 Or is the entire thing about creating a permanent system where the military industrial complex can just feed weapons there forever?
00:51:47.980 I have no idea what we're doing there. But I know it's expensive.
00:51:53.100 So, to me, it just looks like nothing but a military industrial complex play that has nothing to do with American interests. That's what it looks like to me.
00:52:05.760 It's hard for me to be in favor of this war, because I just don't see the reasons.
00:52:10.720 So, yeah, it looks like a money grab to me, which will kill millions of people. Just like usual.
00:52:18.260 In other words, you can get used to anything. We actually watch this happening, and then we just go to lunch.
00:52:28.300 There are too many outrages. I feel like when the Vietnam War was on, the total number of outrages was not that many.
00:52:39.200 So, you could really get mad about this one thing: oh, this war.
00:52:43.680 But now we've got everything from climate change to, you know, January 6th insurrection and things that they make up every day that are not even real problems.
00:52:54.720 There are so many at one time that for any one problem, it's hard. I can't muster enough outrage.
00:53:01.000 We're spending billions of dollars on a war that's probably only making us more, you know, more at risk.
00:53:07.080 And I can't even get outraged, because everything else is outrageous at the same time.
00:53:12.680 It's a cortisol industry, yeah.
00:53:15.900 Gold diffusion. I think it's outrage diversification. Outrage diffusion.
00:53:21.720 Now, by the way, I think that's a real thing. The outrage dilution. That we're just outraged at so many things, we can't be outraged at anything enough.
00:53:37.400 Yeah. Outrage dilution? Not saturation.
00:53:43.520 My hypnosis is failing? Yeah, I think it's my problem. Call everyone a Nazi? Rage diversity.
00:53:57.860 All right. Is there anything else happening?
00:54:01.520 It's another slow news day, but that's the end of my prepared comments.
00:54:06.940 Have you noticed that your mood has improved, or did all the news make you feel sad?
00:54:12.580 How Amazon locks you out of your house if the driver accuses you of racism?
00:54:16.800 Oh, can Amazon do that? Lock you out of your own house? I've never heard that story.
00:54:26.660 I bummed you out?
00:54:34.800 Well, oh yeah, there's a story about Joe Biden grabbing some side boob of Eva Longoria, and the video is showing the slow motion of her grabbing his hands.
00:54:49.900 He goes in for the hug, but then he slides his hands past her side boob on the way out, and she grabs his hands and twists his thumb over and hands his hands back to him.
00:55:01.740 All right. All right.
00:55:10.900 Black owner locked out by Amazon. Wow.
00:55:17.180 Amazon locked the smart home device?
00:55:21.180 Oh, they just locked him out of his devices and his... Yeah, okay.
00:55:24.660 Yeah, I'm sure that's happened to her before. That's true.
00:55:33.660 I did watch the Black Mirror episode.
00:55:36.200 So there's a new episode of Black Mirror that, let's say, deals with the simulation idea, but in a very clever way.
00:55:50.280 You could see it coming, but it was still super clever. Really good writing. Great show.
00:55:56.880 I watched it last night. It was terrific.
00:55:59.840 Best one I've seen, actually. Of the Black Mirrors, it's the best episode I've seen.
00:56:07.420 Also good acting. Good writing, good acting. It was really well done.
00:56:12.700 All right. Black Mirror is on Netflix.
00:56:24.660 What about Peter Zeihan taking on Russia invading?
00:56:29.840 I haven't caught up with Peter Zeihan lately.
00:56:33.760 What's he saying about... Is he saying that Russia is totally, definitely going to win?
00:56:39.800 Do you think anybody knows that?
00:56:42.400 Do you think there's anybody in the world who can tell you where the Ukraine thing is going, other than it's going to be roughly the same now as it will be later?
00:56:51.540 All right. Here's my prediction. That neither side can win, and they will have to negotiate.
00:57:01.280 Now, it might be after Ukraine maybe puts some threat on Crimea.
00:57:07.040 Because some people are saying that Putin can't risk losing Crimea.
00:57:13.180 So if it comes down to it looking like they do have a lock on taking Crimea back, then Putin will get flexible in his negotiating.
00:57:22.480 Because if he could negotiate to keep Crimea, he can claim something good happened. At least he kept Crimea.
00:57:29.840 But I don't see any way that the war ends in anything that looks like a victory.
00:57:38.500 It has to turn into either a permanent war or a DMZ or a negotiated settlement.
00:57:44.520 I think Putin's too smart to have a permanent DMZ. I think nobody wins with that.
00:57:50.220 So I think he would negotiate. Because I think he's not crazy.
00:57:54.060 Do you remember in the early part of the war when all the propaganda told us that Putin was sick and maybe crazy? Do you remember that?
00:58:05.560 And from today's perspective, isn't it really obvious that you were being propagandized by your own country?
00:58:14.020 That was such propaganda. It doesn't look like any of that was true, or even a little bit true.
00:58:21.140 Yeah, I said it too, probably.
00:58:23.580 But what it looks like is that now we have a better understanding of how the Ukraine thing emerged.
00:58:33.140 It looks like Putin was pushed into his position by NATO.
00:58:39.720 And it looks like he was acting somewhat rationally the entire time. And I'm not sure we were.
00:58:47.200 If you're going to argue who was being irrational, it looked like my team. It looked like my team was the irrational ones. That's what it looks like.
00:58:55.580 Now, that didn't look like that in the beginning. In the beginning, it looked like Putin all bad. Putin bad, Putin bad. Putin crazy. He'll use his nukes.
00:59:06.640 Yeah. None of that was true.
00:59:09.720 Basically, the Democrats did to Putin what they did to Trump, which is they accused Putin of their own crimes, of being the aggressors. It's exactly what they did to Trump. Exactly the same thing.
00:59:24.500 You were told.
00:59:27.080 Can we do the thing where you pretend that I was wrong and you told me the right thing all along, which never happened? We're going to pretend that, right?
00:59:35.820 Would it help you if I give you a husband apology and pretend that I didn't know that, and that you told me, and that now I'm just figuring it out? Can we do that?
00:59:45.880 Let's do a husband apology, because the people on YouTube are in sort of a different head space.
00:59:52.640 I'd like to apologize for ever having thought that Putin was insane and that it was all his fault.
01:00:01.620 So you have my apology. I'm sorry, honey, if it made you feel bad. I didn't mean that. And I will buy you a nice diamond.
01:00:12.020 Are we good now? Husband apology, given and accepted.
01:00:18.240 All right. I'll tell you, if you don't think husband apologies should be used in politics, you're wrong. They work every time.
01:00:28.400 Yeah, if you feel better now, good.
01:00:31.640 All right. That's all for now. YouTube, talk to you tomorrow. Bye for now. Bye for now.