Real Coffee with Scott Adams - August 22, 2022


Episode 1843 Scott Adams: Being Bad At Data Analysis Is No Reason To Be Mad At Me. And More Fun


Episode Stats

Length: 1 hour and 8 minutes

Words per Minute: 149.9

Word Count: 10,273

Sentence Count: 858

Misogynist Sentences: 10

Hate Speech Sentences: 17


Summary

In this episode, I talk about how a video I made had a profound impact on two people's lives, and how it changed the way they think about their mental health problems. Tweet me if you have any thoughts or suggestions on how to improve your mental health! Timestamps: 2:23 - Is this real? 3:00 - How many people feel that I helped them cure or reduce a mental health problem?


Transcript

00:00:00.560 And congratulations!
00:00:02.620 You made it to the highlight of civilization.
00:00:05.440 Yeah, you did that.
00:00:07.200 Good job.
00:00:08.320 You woke up this morning and you said to yourself, I think I'll do something amazing
00:00:11.620 today.
00:00:12.620 And then you did.
00:00:13.620 You showed up here.
00:00:14.820 Just in time.
00:00:16.180 Unless you're watching this on a recorded replay, in which case you're just as awesome
00:00:20.760 but a little bit delayed.
00:00:23.160 How would you like to take it up to a higher level yet?
00:00:25.840 Highest level of awareness possible?
00:00:27.720 Yeah.
00:00:28.720 Let's max it out.
00:00:30.240 And all you need is a cup or mug or a glass, a tank or chalice or stein, a canteen jug or a flask,
00:00:34.620 a vessel of any kind.
00:00:37.040 Fill it with your favorite liquid.
00:00:39.080 I like coffee.
00:00:41.080 And join me now for the unparalleled pleasure, the dopamine of the day.
00:00:46.240 The thing that makes everything better.
00:00:48.180 It's called the simultaneous sip and it's happening now.
00:00:51.100 Get ready everybody go!
00:00:56.440 You know, you'll always remember where you were when you took the first simultaneous sip.
00:01:04.760 Probably right where you are right now, in most cases.
00:01:08.800 But you won't forget that, will you?
00:01:14.120 So, yesterday was an interesting day for me.
00:01:19.160 As my days often are.
00:01:21.160 And it was interesting because I heard from a couple of different people that I'd said something or done something that had an impact on their lives.
00:01:30.480 And I saw two people in particular believe that videos that I've produced have cured mental health problems.
00:01:41.000 And it was real people telling their stories.
00:01:49.320 And I thought, how many people have had that experience?
00:01:54.320 Is it more than two?
00:01:55.320 Because the videos were created for that purpose, but I don't have a direct way to get feedback.
00:02:02.640 How many of you in the comments feel that something I produced or said, a video usually, fixed a mental health issue for you or somebody close to you?
00:02:16.640 Oh my goodness, that's a lot of yeses.
00:02:23.640 Is this real?
00:02:28.640 Over on the locals platform they see different videos than you do on YouTube.
00:02:32.640 So they see what I call my micro lessons.
00:02:35.640 Now the micro lessons are designed to reframe and reprogram people.
00:02:40.640 And I'm seeing a wall of yeses.
00:02:44.640 Now are you answering the right question?
00:02:46.640 The question was specifically a mental health problem.
00:02:52.640 It's a wall of yeses.
00:02:54.640 I was not expecting that.
00:02:56.640 Oh my God.
00:02:58.640 Something's happening.
00:03:01.640 You know, I'm going to just turn the screens toward each other.
00:03:04.640 You have to see this.
00:03:06.640 YouTube, you've got to see this.
00:03:08.640 I'm not making this up.
00:03:10.640 Look at the comments.
00:03:12.640 I'm going to ask you again.
00:03:14.640 How many people feel that I helped them cure or reduce a mental health problem with a video that I created?
00:03:25.640 Can you see that?
00:03:28.640 I wasn't expecting that at all.
00:03:31.640 Holy shit.
00:03:33.640 What is going on here?
00:03:36.640 Good Lord.
00:03:40.640 I don't know.
00:03:42.640 My head is exploding.
00:03:46.640 I don't even know what to make of this.
00:03:48.640 I'm a little bit stunned.
00:03:50.640 I'm way stunned.
00:03:52.640 Here's what I expected.
00:03:53.640 I thought I'd get two or three people.
00:03:55.640 I didn't expect I'd see a wall of yes.
00:03:59.640 Wow.
00:04:01.640 All right.
00:04:02.640 Well, there was a specific video that people were talking about.
00:04:09.640 Was it one?
00:04:10.640 What was the number on it?
00:04:12.640 I wrote it down and now I can't find it.
00:04:16.640 Anyway, if you check my Twitter feed — 791.
00:04:20.640 Yeah, so my video labeled 791 if you're looking for it.
00:04:27.640 791.
00:04:28.640 I think that's right.
00:04:29.640 That sounds right.
00:04:30.640 So that was the one that cured people's mental health problems.
00:04:33.640 Now, if you're new to my livestream, I'm a trained hypnotist, and I'm writing a book right now on how to do that.
00:04:42.640 But apparently that book's going to be more important than I thought.
00:04:46.640 So what the book would try to do is capture all of the reframes that people seem to have responded to.
00:04:54.640 And I guess whatever it was that fixed all these people's mental health problems, it was a reframe, but I don't know which one it was.
00:05:01.640 So I'm going to have to go figure it out myself.
00:05:04.640 Was there a specific reframe that I used?
00:05:07.640 Or was it more the totality of it?
00:05:10.640 Like one specific sentence?
00:05:13.640 Because usually it is.
00:05:14.640 It's like one sentence that switches your brain.
00:05:19.640 The addiction reframe.
00:05:22.640 Systems.
00:05:23.640 It's different stuff.
00:05:24.640 Okay.
00:05:25.640 So I guess I don't know exactly what it was.
00:05:27.640 We'll figure that out.
00:05:30.640 TikTok has announced that it's going to be helping people with elections in the United States.
00:05:37.640 Now, that's good news, isn't it?
00:05:39.640 What is that?
00:05:40.640 So a Chinese-owned company is going to help American voters make decisions on who to vote for.
00:05:47.640 That's actually happening.
00:05:49.640 A Chinese-owned company has announced that it has a whole variety of things it's going to be doing to help American voters make the right decisions and not be fooled by fake news.
00:06:02.640 What could go wrong?
00:06:04.640 What could go wrong when you have the Chinese government in charge of deciding what American voters will see before they make a decision?
00:06:14.640 I see no problem with that whatsoever.
00:06:16.640 Do you?
00:06:17.640 So helpful.
00:06:18.640 So helpful.
00:06:19.640 So one of the questions I'm wondering is, do you think the TikTok algorithms will show people a lot of news on the topic of China sending fentanyl precursors to Mexican cartels to kill 100,000 Americans again this year?
00:06:36.640 If you had to guess, do you think the TikTok algorithm will give you a lot of news about the Uyghurs?
00:06:46.640 How many TikToks about the Uyghurs are we going to see?
00:06:49.640 A lot of them?
00:06:50.640 Because they want to give you the facts, right?
00:06:53.640 Give you the facts.
00:06:54.640 Just straight facts.
00:06:55.640 How in the world is this legal?
00:07:00.640 Now, I understand that the data is going to be stored at some Oracle American facility, but that's just the data.
00:07:13.640 That's just the data.
00:07:14.640 What about the algorithm?
00:07:15.640 The algorithm is the one that's programming people, not the data.
00:07:19.640 If the thing you're worried about is China has my data, they were going to get your data anyway, probably.
00:07:25.640 I'm not sure if the data is the big thing.
00:07:28.640 I mean, I'd be worried about it.
00:07:29.640 I wouldn't want them to have my data.
00:07:31.640 But isn't the algorithm the big thing?
00:07:36.640 If they still control the algorithm, they decide what we see and how often we see it.
00:07:42.640 That is all it takes to program your brain.
00:07:45.640 What do you see and how often do you see it?
00:07:47.640 Nothing else.
00:07:48.640 That's it.
00:07:49.640 And they have complete control of that.
00:07:51.640 And we're letting that happen.
00:07:54.640 Oh, yeah, that's okay.
00:07:56.640 Do they let that happen to them?
00:07:58.640 Does China let their citizens get programmed by American media companies?
00:08:03.640 I don't believe they do.
00:08:05.640 I don't believe they do.
00:08:07.640 Do you know why?
00:08:08.640 Because they're smart.
00:08:09.640 Do you know why we do allow our foreign adversaries to reprogram the minds of our own children?
00:08:16.640 Because we're dumb.
00:08:18.640 I guess we're dumb.
00:08:20.640 Or we're already controlled by China or something.
00:08:23.640 But if Trump is going to run for re-election, banning TikTok needs to be right up there at the top of his list.
00:08:34.640 Because it's sort of like the wall.
00:08:36.640 It's easy for everybody to understand.
00:08:38.640 And it's right.
00:08:40.640 If it's easy to understand and it's unambiguously right for America, that's just like a classic Trump campaign thing, right?
00:08:51.640 It's simple and it's obvious.
00:08:54.640 And we're not doing it.
00:08:56.640 You can't get a better campaign topic than that.
00:08:59.640 All right.
00:09:01.640 How many of you have followed the story of Andrew Taint?
00:09:06.640 And I know some of you think that I'm pronouncing it with an N, but that's just a Yanni and Laurel thing.
00:09:16.640 You hear it that way because I primed you.
00:09:19.640 But I'm actually saying T-A-T-E.
00:09:22.640 So if it sounds like taint to you, that's probably just a psychological phenomenon that's happening on your end.
00:09:31.640 But I heard there was this video of him allegedly beating up a girl.
00:09:36.640 Have you heard that?
00:09:38.640 There's a video of him allegedly beating up a girl.
00:09:42.640 So I said to myself, well, I'm going to have to see that to find out how evil is this guy.
00:09:50.640 And I first watched it with the sound off.
00:09:54.640 And then I listened to it with the sound on.
00:09:57.640 And that was an interesting experiment.
00:10:01.640 When I watched with the sound off, I said to myself, why is everybody so concerned about this?
00:10:07.640 It looks like consensual sex to me.
00:10:10.640 There was hair pulling.
00:10:13.640 There was spanking.
00:10:15.640 But there wasn't much in terms of resistance.
00:10:19.640 It looked like two people who had probably negotiated what was in and what was out.
00:10:26.640 And it looked like they were just doing their little kinky fun.
00:10:29.640 Now that was with the sound off.
00:10:31.640 And then you turn the sound on.
00:10:36.640 You turn the sound on.
00:10:38.640 And it just sounds really frightening.
00:10:41.640 And I say to myself, was any of that manipulated?
00:10:45.640 Because even if you heard the same sound, but you'd heard it sort of muffled, I don't think it would have sounded the same.
00:10:52.640 There was something about the loudness of the sound compared to the video.
00:10:58.640 For example, there was a part at the beginning where it looks like he slapped her in the face.
00:11:05.640 When you see it with no sound on, it looks like it probably was more like a, you know, within the context of negotiated kinky play, more of a get your attention, not intended to hurt.
00:11:21.640 So without sound, it looked like this was a slap.
00:11:25.640 Hey, sort of like that.
00:11:28.640 Now, I'm not defending or condemning, right?
00:11:31.640 So I'm not giving you any opinion on what's happening.
00:11:33.640 I'm just describing it.
00:11:34.640 So if you hear it without the sound, it looks like something that easily could have been within the realm that two people negotiated.
00:11:43.640 I don't know if they did.
00:11:44.640 So I don't know what the woman said afterwards.
00:11:46.640 By the way, has anybody heard from the woman?
00:11:48.640 Did the woman say afterwards that she was within the realm of things she had negotiated or was she not?
00:11:56.640 I don't know.
00:11:58.640 But when you hear the sound, it's frightening.
00:12:00.640 Because the sound of the slap goes from something that I would have imagined would not have even made a sound to something really loud.
00:12:07.640 Because remember, he was an MMA fighter.
00:12:12.640 If he wanted to hurt her, it wouldn't have looked anything like that.
00:12:17.640 If his intention had been to hurt her, the visual would have looked a lot different, I think.
00:12:22.640 I think.
00:12:23.640 Now, I'm not defending him, right?
00:12:25.640 I want to be really clear.
00:12:27.640 Because I don't know what was the situation.
00:12:30.640 I don't have information.
00:12:31.640 But I will say that when you hear it without sound, it looks consensual.
00:12:36.640 Or it looks like it could be.
00:12:38.640 You don't know for sure.
00:12:39.640 But it looks like it could easily be within the realm of two people doing something that maybe you don't do.
00:12:45.640 But maybe other people do.
00:12:47.640 Let me ask you this.
00:12:49.640 In the comments.
00:12:51.640 Have any of you ever been involved in sex play in which there was hair pulling, light slapping, and spanking,
00:13:02.640 and the other person, who was the recipient of such, clearly enjoyed it?
00:13:11.640 So, as expected, there's a combination of hell no and yes.
00:13:16.640 Lots of yeses and lots of noes.
00:13:18.640 Right?
00:13:19.640 Now, what would you conclude?
00:13:20.640 If you were one of the noes, if you were one of the people who said no, what would you conclude about the people who said yes?
00:13:28.640 Are they all people who sexually abuse people and are not aware of it?
00:13:34.640 Maybe the woman didn't like it as much, or it could have been either way.
00:13:38.640 It didn't have to be the woman.
00:13:39.640 But do you imagine that women don't enjoy this kind of play?
00:13:46.640 Sometimes.
00:13:47.640 Some women in some situations with some men.
00:13:51.640 See, the trouble is that what you see visually is well within the realm of what I would consider fairly routine sexual practices that are not exactly in the middle of the mainstream, but they're not too far out.
00:14:08.640 But as soon as you hear the audio, it turns into an assault.
00:14:13.640 And so the question I would ask, is there anything about that audio that's been manipulated?
00:14:20.640 And I would look for specifically the sound of the slap.
00:14:23.640 Because remember, the two people were not mic'd.
00:14:26.640 There was no microphone.
00:14:28.640 But the sound of the slap is really loud.
00:14:31.640 And yet it doesn't exactly match what you see on the video.
00:14:35.640 It looks like there might have been a little video audio trickery going on, but I don't know.
00:14:43.640 Anyway, so let me say again, I'm not defending him, because I don't like him.
00:14:50.640 So if there's one thing you can be sure of, I'm not going to defend him.
00:14:55.640 Right?
00:14:56.640 Because he's a nemesis.
00:14:58.640 So, but just be aware that the days of, if you saw it on video, it's true, are long gone.
00:15:06.640 It does not mean it's true if you saw it with your own eyes on video.
00:15:10.640 If you heard it with your own ears on video, it doesn't mean it happened.
00:15:13.640 It really doesn't.
00:15:15.640 That used to mean something.
00:15:16.640 It doesn't mean anything anymore.
00:15:18.640 In fact, I would argue that the more famous a video is, the less likely it's true.
00:15:26.640 So just be careful about that.
00:15:30.640 So I'm not going to defend him.
00:15:31.640 I'm just going to say, be careful about what you believe.
00:15:35.640 All right.
00:15:36.640 And I think I would want to hear what the woman said about it afterwards.
00:15:39.640 I'd probably base my decision on that.
00:15:41.640 I haven't seen that.
00:15:45.640 I had a friend who I talked to yesterday who made me laugh for a long time.
00:15:51.640 Because he escaped the scenery.
00:15:55.640 And it's funny when somebody does it.
00:15:58.640 And here's what I mean.
00:15:59.640 He stopped using social media except for an obscure hobby he has.
00:16:04.640 So he follows one hobby, but that's it.
00:16:08.640 And the reason he described...
00:16:11.640 I wish I...
00:16:12.640 Actually, I do have his actual words.
00:16:14.640 I'm going to give you his actual words for describing what he was experiencing.
00:16:19.640 And what I heard when I saw his words was, it sounded like somebody who had just achieved a higher level of awareness
00:16:26.640 and was trying to deal with it.
00:16:28.640 All right.
00:16:30.640 I've got to read the exact words.
00:16:40.640 Oh, damn it.
00:16:41.640 I'm not going to find it.
00:16:42.640 But the idea was that once he noticed that everybody was playing a part, like a character,
00:16:48.640 he couldn't debate politics anymore because he wasn't really debating the topic.
00:16:52.640 He was in a play with characters.
00:16:55.640 So there would be one character who would be the American flag wearing American guy.
00:17:00.640 And there would be another character who would be the left leaning, I'm so left person.
00:17:06.640 And then the third one is what made me laugh.
00:17:08.640 He goes, and then there's always a coach, some kind of coach for whatever the fuck.
00:17:14.640 And I just lost it.
00:17:15.640 I thought, oh my God.
00:17:17.640 There is always the American flag guy.
00:17:19.640 There's always the leftist crazy guy.
00:17:22.640 And there's always a coach.
00:17:24.640 There's always a coach.
00:17:25.640 There's always a coach.
00:17:26.640 And it's always a coach for whatever the fuck.
00:17:29.640 And I thought, why is there always a coach?
00:17:33.640 There is always a coach.
00:17:35.640 Am I wrong?
00:17:36.640 And when I heard that, I just thought, oh my God.
00:17:39.640 It is so obviously, it's just like Civil War recreations.
00:17:45.640 You know how the people who dress in Civil War outfits and they go do the fake battles to recreate the battles?
00:17:53.640 That's all social media is now.
00:17:55.640 It's people literally taking and putting on a costume and going to do symbolic battle.
00:18:00.640 They don't know what the real data is and they don't care.
00:18:03.640 And one of them is always a coach.
00:18:05.640 And I thought, no, this is exactly like Halloween.
00:18:09.640 Politics has just become Halloween.
00:18:13.640 So what happens a few weeks before actual Halloween?
00:18:16.640 People start talking about what costume they're going to wear.
00:18:19.640 And what costume do people pick?
00:18:22.640 Well, they do it for a variety of reasons.
00:18:25.640 But often, often people will pick a costume that's sort of how they're feeling.
00:18:30.640 Like, oh, on the inside, I feel like a devil.
00:18:33.640 I'm going to be a devil.
00:18:35.640 Or I feel like livestock because I'm in my cubicle all day.
00:18:39.640 I'm going to dress as a cow.
00:18:41.640 So at some level, the Halloween costume you pick is a sort of a lifestyle decision of who you want to pretend to be.
00:18:49.640 And I think that's what politics have become.
00:18:51.640 Because we definitely don't have the data to use data to make decisions.
00:18:55.640 It's all sketchy data.
00:18:57.640 And we definitely don't have the logic skills.
00:19:00.640 But everybody knows how to act.
00:19:02.640 Everybody knows how to pretend.
00:19:04.640 So instead of debates, it's now just pretending.
00:19:07.640 And if you understand people's reactions as pretending to play a role, everything makes sense.
00:19:14.640 If you imagine that they're trying to be logical and failing, it's just really frustrating.
00:19:20.640 Because you'll be like, oh, why?
00:19:23.640 Why is your logic failing?
00:19:25.640 Why?
00:19:26.640 Or is it me?
00:19:27.640 You always have that question.
00:19:29.640 And the answer is, nobody's even pretending to use logic.
00:19:32.640 They're just dressing up.
00:19:34.640 It would be like asking, why are you dressing like a devil on Halloween when you know you're not really a devil?
00:19:41.640 That's not really the right question, is it?
00:19:43.640 Why are you dressing like a devil when you're not really one?
00:19:46.640 It's exactly the same with politics.
00:19:49.640 Why are you arguing that case when you know that doesn't make sense, right?
00:19:53.640 It is compatible with the costume you're wearing, but you know it doesn't make sense, right?
00:19:59.640 You do know that.
00:20:01.640 That's what it feels like.
00:20:02.640 And once he described the situation that way, I couldn't see it another way.
00:20:07.640 It just feels like people wearing costumes.
00:20:10.640 All right.
00:20:12.640 So watch for the coach of whatever the fuck.
00:20:16.640 How many of you have seen the video, there's a black and white video from long ago, maybe the 60s,
00:20:25.640 in which a Russian, an ex-Russian spy allegedly, who defected from the Soviet Union, I guess a Soviet spy,
00:20:35.640 was explaining the Soviet Union secret plan to brainwash their enemies, like Americans,
00:20:44.640 into accepting socialism that would destroy America from inside.
00:20:48.640 You've all seen that, right?
00:20:50.640 I can't tell you how many people have sent that to me.
00:20:55.640 Yeah, what's his name?
00:20:57.640 Yeah, blah, blah, blah.
00:20:59.640 How many of you think Yuri Bezmenov?
00:21:04.640 It wasn't black and white.
00:21:05.640 It was just shitty video.
00:21:07.640 Yuri Bezmenov, right?
00:21:11.640 How many of you think that he was a credible person telling you something useful?
00:21:16.640 Did you believe what he was saying?
00:21:19.640 Sounded pretty good, didn't it?
00:21:21.640 Sounded pretty credible?
00:21:23.640 It's complete bullshit.
00:21:25.640 It is absolute bullshit.
00:21:28.640 Now, I hadn't watched the whole thing before. I'd seen clips,
00:21:31.640 and I imagined it was bullshit from the clips,
00:21:35.640 so I'd never bothered to spend any time listening.
00:21:38.640 So I listened to an extended explanation.
00:21:43.640 It's really obvious that it's bullshit.
00:21:45.640 It's really obvious.
00:21:47.640 Now, I'll tell you why, and maybe it'll be obvious after I tell you.
00:21:52.640 Number one on your detecting bullshit checklist.
00:21:58.640 What is number one on the checklist?
00:22:00.640 If something's a hoax or a prank, let's say it's intentionally wrong,
00:22:05.640 what's the first thing you look for?
00:22:07.640 Too on the nose.
00:22:08.640 Exactly.
00:22:09.640 Too on the nose.
00:22:10.640 What is it that Americans are afraid of?
00:22:13.640 That those darn communists are trying to infiltrate the brains of our children.
00:22:18.640 Ever since the 60s, we've been afraid of that.
00:22:21.640 And there's that.
00:22:23.640 There he is.
00:22:25.640 Just the way you'd expect.
00:22:27.640 The exact person telling you that behind the scenes,
00:22:31.640 that's exactly what the bad guys are doing.
00:22:34.640 They're reprogramming your children.
00:22:36.640 Your children!
00:22:39.640 So, it's a little on the nose, isn't it?
00:22:42.640 Now, that doesn't mean they don't want to reprogram your children.
00:22:46.640 Because I was just talking about TikTok, as somebody said.
00:22:49.640 But here's the difference.
00:22:51.640 TikTok can reprogram your children in an hour.
00:22:57.640 In an hour.
00:22:59.640 TikTok can change your children's opinions in an hour.
00:23:05.640 Do you know how long the Soviets were willing to wait?
00:23:08.640 Uh, 15 to 50 years.
00:23:10.640 Because that's how long they said it takes to educate the next generation of kids
00:23:15.640 to believe in all this socialism.
00:23:18.640 Well, have I ever told you the trick about doing something random
00:23:23.640 and then claiming credit when things go your way?
00:23:27.640 For example, when Biden said he was going to release the national oil reserves
00:23:35.640 to increase the supply of oil and therefore decrease the price,
00:23:40.640 what did all of the experts say?
00:23:42.640 They said, but there's so little of that oil,
00:23:45.640 that even if you released all of it, which would put us at some risk
00:23:48.640 because we wouldn't have the reserve then.
00:23:50.640 Even if you used all of it, it wouldn't change the volume enough
00:23:53.640 to change the price much.
00:23:57.640 But what did Biden know?
00:23:59.640 Well, he knew that if prices went up, he could say it would have been worse.
00:24:03.640 Right?
00:24:04.640 It would have been worse.
00:24:06.640 And he knew that if prices went down, which in the long run, of course,
00:24:09.640 they were going to go down, that he could claim credit.
00:24:12.640 So if you know something's going to change anyway,
00:24:15.640 you know business will go up and you know business will go down,
00:24:18.640 you do what all Dilbert managers do.
00:24:21.640 So you change something, and then you claim credit.
00:24:28.640 So do you think that the Russians were, or the Soviets,
00:24:31.640 or maybe just this one guy, Yuri,
00:24:34.640 do you think that it was clever for him to say,
00:24:36.640 oh yes, the reason that people are asking for more socialism
00:24:40.640 is because we've been hypnotizing them for decades.
00:24:45.640 Do you know what causes people to ask for more socialism?
00:24:49.640 A good economy.
00:24:51.640 That's what does it.
00:24:53.640 It was going to happen anyway.
00:24:55.640 Socialism was coming, no matter what the Soviets did.
00:24:59.640 Because if you've got a bunch of rich people,
00:25:01.640 and they're living the good life,
00:25:03.640 and they're living right next to people who do not have a good life,
00:25:06.640 what are the people who don't have a good life going to do?
00:25:09.640 They're going to say, I have a good reason,
00:25:11.640 you should give me some of your stuff.
00:25:13.640 And what would be the good reason?
00:25:15.640 Socialism, right?
00:25:17.640 It was going to happen on its own.
00:25:20.640 And it happened everywhere.
00:25:22.640 Is there a country somewhere that had a good economy,
00:25:25.640 and the poor people weren't asking to get a bigger share?
00:25:29.640 That happened nowhere.
00:25:31.640 So the first part where the socialism is embedded into our schools
00:25:38.640 to make everybody socialist,
00:25:40.640 I feel like that was just going to happen on its own.
00:25:42.640 And he's sort of taking credit for something that's a normal trend.
00:25:47.640 Right?
00:25:48.640 Then what about the demoralizing part?
00:25:51.640 That once we don't trust anything,
00:25:53.640 then we get all demoralized.
00:25:56.640 Do you think we're more demoralized than we've ever been?
00:26:00.640 Not really.
00:26:01.640 The only thing we've done is woken up to the fact that our news was probably always fake.
00:26:05.640 Did we come out behind by learning that our news was always fake?
00:26:12.640 Because when this guy, Yuri, was talking,
00:26:15.640 this was in a time when the CIA was hypnotizing the citizens,
00:26:19.640 and you didn't know it.
00:26:21.640 The CIA was actively managing movies and TV to hypnotize the public to get some result that the government wanted.
00:26:33.640 Was that good?
00:26:34.640 Are you better off now than knowing that the Soviets are trying to influence you?
00:26:39.640 Not Soviets, the Russians, and the Chinese.
00:26:45.640 And also knowing that your own government is trying to influence you.
00:26:48.640 I'd say we're ahead.
00:26:50.640 How did we come out behind?
00:26:51.640 To me, we're way ahead, because now we understand that we were being hypnotized way back then.
00:26:57.640 And not only by opponents, but by our own government,
00:27:01.640 and quite intentionally and quite aggressively.
00:27:04.640 So, I feel like Yuri was more of an opportunist.
00:27:11.640 Was he selling a book, by the way?
00:27:13.640 Does anybody know, was he selling a book at the time?
00:27:16.640 Because if he wasn't selling a book, I would worry that he was still working for the Soviet Union.
00:27:22.640 And if he wasn't selling a book, it means that he probably was not so much an ex-spy, if you know what I mean.
00:27:29.640 He might have been a current spy.
00:27:31.640 He was a professor.
00:27:33.640 Okay.
00:27:35.640 But people are always selling a book.
00:27:38.640 I don't know.
00:27:39.640 So, I would say I wouldn't trust anything about him, because his explanations were academic and generic.
00:27:47.640 And they really described things that were going to happen anyway for different reasons.
00:27:51.640 And his too-on-the-nose part is a big flag.
00:27:54.640 Yes, I am selling several books.
00:28:02.640 Does that make me less credible, because I'm selling books?
00:28:07.640 It would depend on the book you're selling, wouldn't it?
00:28:10.640 But anyway, the beauty of my books is that with the funny ones, you can either laugh or not laugh.
00:28:16.640 And that's all they're trying to do.
00:28:18.640 And the ones that are self-help, you can look at the people who already tried them, and you can see what their outcome was.
00:28:25.640 So, you don't really have to depend on me for credibility.
00:28:29.640 Yesterday, I came upon, just by accident, the first Ronald Reagan press conference from 1981.
00:28:36.640 And I tweeted it around.
00:28:39.640 It's really interesting.
00:28:40.640 If you look at Reagan's first press conference, check his, let's say, his skill level, his communication ability, and compare it to Biden and Trump.
00:28:54.640 It's really interesting.
00:28:55.640 And I'll tell you what my impression was, and I think yours might be different, right?
00:28:59.640 Because we're all going to have a subjective impression.
00:29:02.640 But my first impression was, my goodness, this guy is good.
00:29:05.640 Wow.
00:29:06.640 I mean, I remembered him as being good, but he was really good.
00:29:11.640 Like, his charisma and his control of the stage and how he kept the conversation at a high level, like he didn't get in the weeds.
00:29:21.640 He kept everything at a higher level, was masterful.
00:29:25.640 He was great.
00:29:26.640 But, here's a big but.
00:29:30.640 When he started taking questions about all different things, because it seemed like in those days they would ask you more obscure questions.
00:29:39.640 Somebody would stand up and say,
00:29:41.640 Mr. President, the Native American reservations have spotted eagles that are being destroyed by the tractors.
00:29:51.640 What are you going to do about it?
00:29:54.640 And it would be some topic that the president has never even heard of.
00:29:58.640 And then they have to respond to it.
00:30:00.640 Now, it seems like they used to ask that kind of question all the time.
00:30:04.640 And it was really good for seeing how the president, he or she, would deal with a question that was tricky.
00:30:12.640 And I don't think they asked questions like that anymore.
00:30:16.640 But Reagan was actually a little bit stumped.
00:30:19.640 Because there were things he had made no decisions on, didn't have a thought on.
00:30:23.640 And he kept having to say, well, we've just been here nine days.
00:30:29.640 We've been working on this big legislation.
00:30:31.640 And when I'm done with that, you know, we'll be able to look at some other things.
00:30:34.640 But I don't have anything to tell you about that now.
00:30:36.640 Now, I do think there's a way to say that confidently and in a way that makes the country say, oh, okay, I get it.
00:30:45.640 That's coming up.
00:30:46.640 You haven't done it yet.
00:30:48.440 But he did seem a little bit scared and lost during the questions where he didn't have a prepared answer.
00:30:56.640 So what I saw is that where he had prepared answers, like a good actor, he could deliver his lines and he would wow you.
00:31:04.460 I mean, his presentation was really spectacular.
00:31:08.300 But when he didn't have prepared lines, he did look a little lost.
00:31:12.620 He did look a little lost.
00:31:13.960 And I thought you could actually detect maybe the beginnings of some lack of capability.
00:31:20.460 But that's probably my imagination.
00:31:22.720 Because we know that, you know, toward the end of his term, there were more questions about his mental capability.
00:31:27.280 But if you look at it through today's lens and compare it to Trump's performance and to Biden's, it's really, really interesting if you have any historical curiosity.
00:31:40.440 All right.
00:31:43.560 You know, it's funny we keep redefining terms.
00:31:47.260 And my critics are funny.
00:31:49.660 So here are some things that my critics have told me in the last 24 hours.
00:31:56.240 That if you got the jab, you were taking a dangerous experimental drug.
00:32:01.920 That's one thing they said.
00:32:03.700 And that people who voluntarily take a dangerous experimental drug are called cowards.
00:32:09.320 So the people who do something that's dangerous and they volunteer to do it are called cowards.
00:32:15.960 This is different than how I used to use that word.
00:32:18.160 I used to think that people who volunteered to do something that was dangerous would be called brave.
00:32:26.380 But no, they're called cowards.
00:32:30.240 And I also learned that staying home is called freedom.
00:32:35.980 It's called freedom.
00:32:37.160 But getting a vaccination so you can travel the world is a form of slavery.
00:32:42.400 Because you're doing what other people want you to do, which is get the shot.
00:32:45.940 So while I was in Bora Bora and Santorini, and my critics were in their basements staying home because they couldn't travel,
00:32:55.860 they were experiencing something called freedom while I was suffering in a five-star resort with my lack of freedom.
00:33:05.400 And that's what I learned this week.
00:33:09.520 So here's my takeaway on that.
00:33:13.500 I don't know what is the best take on anything, necessarily.
00:33:18.480 But I'll tell you what the lowest, worst take is.
00:33:21.920 Here's the worst take.
00:33:23.360 That there was somebody during the pandemic who was operating on fear, and there was somebody who was not.
00:33:32.700 Nothing like that happened.
00:33:34.700 You were either afraid of the shot, or you were afraid of not getting the shot.
00:33:42.100 Nobody had information by which they could make a good decision.
00:33:44.820 So it was fear and guessing.
00:33:46.780 So you used whatever you were most afraid of, and then you rationalized it.
00:33:53.360 That's all that happened.
00:33:54.820 There was nobody brave.
00:33:57.040 Every single person made a fear-based decision.
00:33:59.960 Were you afraid of the government?
00:34:01.360 Afraid of losing your job?
00:34:02.620 Afraid of, were you afraid of not being able to travel?
00:34:06.480 Were you afraid of being alone?
00:34:08.080 Were you afraid of something?
00:34:09.380 So to imagine that some group were afraid and cowards, and some group were not, is really the lowest level of understanding.
00:34:19.720 It's not even a logic problem.
00:34:21.480 It's like a low level of understanding what a human is.
00:34:25.980 That's like a complete misunderstanding of how a brain works, how people operate in the real world.
00:34:32.380 We're basically just afraid of stuff, and then rationalizing our choices after the fact.
00:34:37.280 That's pretty much all we do.
00:34:38.420 We're afraid of stuff, and then we come up with reasons why it makes sense.
00:34:43.540 That's it.
00:34:44.840 And if you think we're a reason-based species, and that some of us are brave and some of us are not, nothing like that's happening.
00:34:53.780 You're just going to be confused if you go through life thinking that's what your reality looks like.
00:35:00.820 All right.
00:35:01.160 And then, because I can't talk about a story without becoming part of it, I was asking on Twitter, because I keep seeing graphs showing there's all these excess deaths after the pandemic.
00:35:17.940 Have you seen those on the Internet?
00:35:19.140 A graph that's showing we expect this many deaths in a normal situation, but it seems like everywhere they're higher, pretty much globally.
00:35:28.380 Wherever you can determine it, they're higher.
00:35:30.960 Now, let's do a bias check.
00:35:33.320 If you believe that the data is correct, what is the most reasonable hypothesis for why they're higher?
00:35:42.380 Go.
00:35:43.480 So here you're going to be just giving me the most...
00:35:46.040 We don't know, because nobody's saying they know.
00:35:49.360 All right.
00:35:49.660 So could we agree that nobody knows?
00:35:51.480 But what is the most reasonable hypothesis for why they would be higher?
00:35:59.560 Somebody thinks it has to do with how the financial incentives work.
00:36:05.300 Okay.
00:36:07.300 People who died with COVID instead of because of...
00:36:10.640 I don't know if that fits into excess deaths.
00:36:13.420 The data could be wrong, right?
00:36:15.680 What about the odds that the data is just wrong?
00:36:17.680 Some are saying the most obvious is the lockdown.
00:36:23.240 Some are saying the most obvious is the delay of health care.
00:36:27.540 Some are saying it's obesity.
00:36:29.340 People got fatter and less active.
00:36:32.160 What about increasing depression?
00:36:34.260 Depression.
00:36:35.160 There's a hypothesis.
00:36:36.700 All right.
00:36:37.720 So here's what I think is how to identify the worst take.
00:36:44.180 The worst take would be it's one of those things.
00:36:47.680 No matter which one you picked.
00:36:50.160 I think that's the worst take.
00:36:52.340 Because people did get fatter.
00:36:55.420 And there's no question that made a difference.
00:36:58.460 People did delay care.
00:37:00.780 There's no question that the delay made a difference.
00:37:03.300 The vaccinations themselves may have had...
00:37:07.320 Don't know this.
00:37:08.660 But may have had more complications than a normal vaccination.
00:37:11.920 Which doesn't mean it wasn't a good idea.
00:37:14.000 It just means there might have been more complications.
00:37:18.720 So there are lots of reasons.
00:37:20.360 It's probably five different reasons.
00:37:23.600 But let's rank them.
00:37:27.180 Let's rank them in order.
00:37:28.480 Give me a...
00:37:29.020 Give me a...
00:37:29.820 Let's say...
00:37:30.620 Let's give me a 1 to 10 on this.
00:37:32.960 So 10 means that this would be a really important variable.
00:37:36.800 And 1 means it's not much of a variable.
00:37:40.160 What are the odds?
00:37:41.300 Because we don't know.
00:37:42.620 What are the odds that the data is just misleading?
00:37:46.440 Go.
00:37:46.940 Scale of 1 to 10.
00:37:48.160 What are the odds that the data is just misleading?
00:37:53.820 I'm saying 8 out of 10.
00:37:55.160 Right?
00:37:55.340 There's a very high chance of it.
00:37:56.880 Very high chance.
00:37:58.680 But if I told you the data everywhere is consistent.
00:38:03.040 So the way we collect data in the US is different than how they do it in the UK.
00:38:07.860 But all the people collecting data are finding the same thing.
00:38:11.460 I think.
00:38:12.220 I think that's true.
00:38:13.560 So now if I told you no matter how you collect the data,
00:38:16.260 it doesn't matter how you do it,
00:38:18.140 you end up with the same general direction.
00:38:20.860 That there are excess deaths.
00:38:22.700 Okay?
00:38:23.260 Would that tell you that maybe it's not a data problem?
00:38:25.500 If I told you that everybody, no matter how they collect it,
00:38:28.900 gets the same answer,
00:38:30.440 does that tell you maybe not data?
00:38:34.340 So am I lowering your percentage on that one?
00:38:38.840 Yeah.
00:38:39.120 It still could be the data.
00:38:40.960 Because it could be everybody.
00:38:43.060 That's totally possible.
00:38:44.400 All right.
00:38:44.580 How about the next one?
00:38:45.800 The next one is that the elites, whoever they are,
00:38:50.780 whoever controls the pharma companies
00:38:52.720 or whoever the pharma companies control,
00:38:55.000 that there's some elite group
00:38:56.200 who is intentionally hypnotizing you
00:38:59.980 not to notice the excess deaths.
00:39:02.760 So under this hypothesis,
00:39:05.260 which I don't believe,
00:39:07.140 the vaccinations themselves
00:39:09.980 are causing the excess deaths,
00:39:11.760 and that fact is being hidden from you
00:39:14.000 by all the elites.
00:39:15.400 Give me the odds on that.
00:39:16.780 The odds that it's the vaccinations themselves
00:39:18.820 and the truth is being hidden by the elites.
00:39:22.480 Give me your odds.
00:39:24.820 I'm seeing everything from 2% to 100%.
00:39:27.540 Yeah.
00:39:30.660 I don't know.
00:39:31.300 I don't know the odds on that one.
00:39:33.200 So how about the odds that it is delayed care?
00:39:38.680 Now, doctors,
00:39:40.480 doctors say the problem is probably delayed care
00:39:43.740 because the types of problems
00:39:45.740 they're seeing people die of
00:39:47.000 are the ones that need constant care.
00:39:51.900 So it turns out there's a pretty strong correlation,
00:39:55.200 at least anecdotally.
00:39:56.540 I don't believe there's any randomized controlled trial.
00:39:59.700 But anecdotally, doctors are saying
00:40:01.320 the people we're seeing dying
00:40:02.760 seem to be the ones that could have used treatment
00:40:06.240 like diabetes and cancer,
00:40:10.460 I think were the two examples.
00:40:11.560 They're two that you really need to stay on top of
00:40:14.040 to reduce the death rate.
00:40:17.300 Now, suppose that's true.
00:40:19.040 And I don't know that that's been proven.
00:40:21.280 Suppose it's true that it's obvious
00:40:23.400 that the people dying are the ones
00:40:24.880 who didn't get continuous care
00:40:26.280 because of the pandemic.
00:40:27.640 That would be pretty convincing.
00:40:29.520 But I don't trust the data yet.
00:40:34.140 So here's how, once again,
00:40:35.880 I become part of the story.
00:40:37.440 I'd forgotten temporarily
00:40:39.560 that Alex Berenson is back on Twitter.
00:40:42.200 So somebody tweeted
00:40:43.380 that I should have a conversation with him.
00:40:46.620 And he saw that, and he weighed in.
00:40:48.700 And his statement is this on Twitter.
00:40:54.300 He said,
00:40:55.380 but yes, the excess deaths are real.
00:40:57.580 So he's confirming that the excess deaths are real.
00:41:00.540 And I think I believe that.
00:41:02.100 So, so far,
00:41:03.240 so far, I am tentatively believing
00:41:06.460 there are excess deaths
00:41:08.100 just because so many people are seeing it
00:41:10.540 in so many places.
00:41:12.180 So, so far, we're on the same page.
00:41:13.700 There probably are excess deaths.
00:41:15.360 But he says it's happening in all
00:41:17.520 or nearly all of the mRNA-vaccinated countries.
00:41:21.000 That part, I don't have data on
00:41:22.760 and don't believe.
00:41:23.660 So, I believe that there were so many mRNA countries
00:41:28.320 that unless all of them are experiencing this,
00:41:31.860 the fact that nearly all of them are experiencing it
00:41:34.320 would tell me that's not the problem.
00:41:35.840 If all of them were,
00:41:38.360 I'd say, oh, there's a problem.
00:41:40.560 But if not all of them,
00:41:42.240 and only nearly all of them,
00:41:44.200 then I'd say,
00:41:45.020 hmm, maybe it just means that
00:41:47.200 a lot of countries who can collect data
00:41:49.780 can also have lots of vaccines.
00:41:52.500 So it seems like there's a correlation between
00:41:54.620 are you a country that even could get the mRNA vaccination?
00:42:01.080 And if you are,
00:42:04.200 you know, what does that say about you too?
00:42:06.000 All right.
00:42:08.440 So what do you think?
00:42:10.440 Do you think the shots are the most obvious?
00:42:12.940 So Alex Berenson says it's the most obvious explanation.
00:42:16.700 Is it the most obvious?
00:42:19.400 Because I would think most obvious
00:42:20.960 is what the doctors say.
00:42:23.340 The actual doctors who are treating people
00:42:25.360 are saying the most obvious explanation
00:42:27.600 is delayed treatment.
00:42:30.000 But why is Alex Berenson looking at the same data
00:42:32.820 and saying the most obvious explanation
00:42:35.060 is one that agrees with something he said
00:42:37.480 before we had the data?
00:42:40.560 Well, one of them is being consistent
00:42:42.740 with prior statements.
00:42:47.280 If you were looking for a cognitive dissonance,
00:42:49.780 which I don't see here yet, by the way,
00:42:51.720 so there's not a signal for it.
00:42:53.740 Well, maybe there is.
00:42:56.060 Do you think this is a signal?
00:42:59.040 If doctors are the ones saying
00:43:00.940 the most obvious thing is delayed treatment,
00:43:04.080 why would somebody who's not a doctor
00:43:05.820 say the most obvious thing is the shots themselves
00:43:08.320 when the doctors are not saying
00:43:11.920 that that's the most obvious thing?
00:43:13.820 What is the definition of obvious?
00:43:15.820 I would think if doctors don't see it,
00:43:19.780 but a non-doctor does,
00:43:21.840 it doesn't qualify as obvious.
00:43:24.700 It doesn't mean Alex Berenson is wrong.
00:43:27.380 He could be the one that's right
00:43:28.520 and the doctors could be wrong.
00:43:30.980 But I think obvious is a word that doesn't fit.
00:43:34.240 Does it?
00:43:36.740 Yeah.
00:43:37.600 So you could say that the word obvious
00:43:39.700 is a tell
00:43:40.500 or something out of the ordinary here.
00:43:45.940 But I don't think you can conclude
00:43:47.880 that he is wrong
00:43:48.840 based on anything that we know now.
00:43:51.140 Now, here's some other data.
00:43:52.560 There was a VA study
00:43:53.640 which looked at only veterans
00:43:55.900 and only unvaccinated ones.
00:43:58.100 And it found that long COVID
00:44:00.180 causes all kinds of problems,
00:44:02.140 especially heart disease later.
00:44:04.660 So if we know that long COVID
00:44:06.800 causes people to die later,
00:44:10.500 it could just be that.
00:44:13.360 So if you took the delay in treatment
00:44:16.100 and you added it to the fact
00:44:18.580 that long COVID itself
00:44:19.820 might cause you to have a heart problem
00:44:21.980 down the road,
00:44:23.000 and this is down the road.
00:44:24.980 So the VA found that you could have
00:44:26.780 these problems months down the road.
00:44:29.640 So we're only months out of the pandemic itself.
00:44:32.940 So this is exactly when we would see
00:44:35.140 COVID, long COVID problems
00:44:38.140 killing people at higher rates.
00:44:39.700 Exactly now.
00:44:42.620 So anyway, I would say
00:44:44.360 that
00:44:45.040 the most obvious ones
00:44:46.180 would be the COVID itself,
00:44:47.840 the delay in care,
00:44:49.540 the increase in weight.
00:44:51.600 You have to question
00:44:52.660 whether the data is right.
00:44:54.240 And then you can't rule out
00:44:56.180 the vaccination.
00:44:57.400 Can you?
00:44:59.160 And alcohol.
00:44:59.880 Yeah.
00:45:00.220 Drug abuse.
00:45:01.160 Exactly.
00:45:02.240 Drug abuse would certainly be
00:45:03.720 in there as an obvious choice, too.
00:45:09.880 But what I worry about
00:45:11.240 is that the people
00:45:12.220 who were anti-vax from the beginning,
00:45:14.320 they need to be right.
00:45:16.280 If you were anti-vax from the beginning,
00:45:18.900 you need the excess deaths
00:45:20.640 to be high and unexplained.
00:45:23.940 You need that.
00:45:24.940 So the only way
00:45:26.780 the anti-vaxxers can be right
00:45:29.920 is if the excess deaths
00:45:32.400 are from the vaccination.
00:45:34.320 So I would worry
00:45:35.500 that cognitive dissonance
00:45:36.760 or confirmation bias
00:45:37.780 are kicking in.
00:45:39.600 But everybody is,
00:45:41.600 I think we're all subject to it now, right?
00:45:44.340 Wouldn't you say,
00:45:45.440 would you say it's a true statement
00:45:46.780 that every one of us now
00:45:48.980 is deeply in the grip
00:45:51.100 of confirmation bias?
00:45:52.540 Because whatever you thought was true,
00:45:55.440 you have found a reason
00:45:56.480 to believe it's true
00:45:57.420 based on data.
00:45:59.200 I can do it.
00:46:00.720 You don't think
00:46:01.320 that Alex Berenson
00:46:02.100 could do what I'm going to do right now?
00:46:04.480 I could easily tell you
00:46:05.760 that everything I predicted
00:46:07.000 came true.
00:46:07.780 Because I have.
00:46:09.180 There were some that didn't.
00:46:11.220 But most of the things
00:46:12.320 I could argue
00:46:12.920 came true,
00:46:13.500 and here's my data.
00:46:14.660 But people who said the opposite,
00:46:16.800 the very opposite
00:46:17.740 of what I predicted,
00:46:19.360 also say they were right
00:46:21.060 and here's my data.
00:46:23.240 We both have data.
00:46:25.300 It turns out
00:46:25.840 the data's useless.
00:46:28.280 It turns out
00:46:29.140 the data's completely useless.
00:46:32.820 All right.
00:46:38.100 That turns out to be
00:46:39.760 the bulk of what
00:46:42.040 I wanted to talk about.
00:46:45.280 Yes, no mandates.
00:46:47.000 No mandates.
00:46:48.800 Was there a topic I missed?
00:46:50.240 Is there anything
00:46:51.200 that I should have talked about
00:46:53.180 that I didn't?
00:46:53.920 I feel like I missed something.
00:46:55.100 It was like a big topic
00:46:56.000 or something.
00:47:01.220 Oh, so the Salisbury Hill thing
00:47:03.220 is what helped you.
00:47:06.500 Huh.
00:47:08.460 The car bomb,
00:47:09.420 we talked about
00:47:09.920 the Russian car bomb
00:47:10.980 that I think
00:47:12.600 it was probably Ukraine.
00:47:14.060 And if it wasn't Ukraine,
00:47:15.300 why not?
00:47:16.700 If Ukraine is not
00:47:17.700 trying to kill
00:47:18.440 Russian leadership,
00:47:20.060 why not?
00:47:21.660 All right.
00:47:22.360 Here's a question
00:47:24.120 I have for you.
00:47:27.460 Do you feel
00:47:27.980 that things are,
00:47:30.380 forget about
00:47:31.060 where things are,
00:47:31.980 but just the direction
00:47:32.840 of things.
00:47:33.640 It feels to me
00:47:34.660 like everything's
00:47:35.520 starting to trend
00:47:36.380 positive.
00:47:38.680 Does anybody else
00:47:39.560 feel that?
00:47:41.600 Does anybody feel like,
00:47:42.840 there's still some things
00:47:44.280 we don't like?
00:47:46.000 Well, let me make
00:47:46.980 my argument.
00:47:49.460 Energy prices
00:47:50.320 are falling.
00:47:52.060 Now remember,
00:47:52.900 my argument is not
00:47:53.820 that things are good.
00:47:55.360 My argument is that
00:47:56.360 things are
00:47:57.060 no longer moving bad.
00:47:59.860 Right?
00:48:00.220 That they're trending
00:48:01.160 in the right direction.
00:48:02.060 They're not good yet,
00:48:02.920 but they're trending right.
00:48:04.060 So the stock market
00:48:05.280 has improved.
00:48:07.220 The supply chain problem,
00:48:10.560 I feel like we've
00:48:12.220 gotten past the worst
00:48:13.140 of it.
00:48:14.000 Maybe there's more ahead,
00:48:15.640 but I feel like
00:48:16.440 we got past the worst.
00:48:17.960 I'm not really seeing
00:48:19.380 enough worry
00:48:21.620 about starvation,
00:48:23.220 and I don't know
00:48:24.420 if that's a reporting
00:48:25.220 problem or we're not sure,
00:48:27.240 but I feel as if,
00:48:29.320 oh, stocks are down today,
00:48:31.840 somebody said.
00:48:33.400 I wouldn't read anything
00:48:34.640 into today.
00:48:35.780 Let's see how bad it is.
00:48:37.300 Eh, 1.6%.
00:48:39.560 That's no big deal.
00:48:41.720 That could come back
00:48:42.780 in a day.
00:48:44.020 So I wouldn't worry
00:48:44.760 about that.
00:48:45.720 So we've got,
00:48:48.280 we're working
00:48:48.960 through the shortages.
00:48:50.340 We're working
00:48:50.900 through the economy.
00:48:53.660 It looks like
00:48:54.640 climate change
00:48:55.620 will be addressed
00:48:57.080 with nuclear.
00:48:58.100 Nuclear is on the rise.
00:49:01.160 Elon Musk is sending
00:49:02.360 things to Mars.
00:49:04.360 I would say
00:49:06.300 the Ukraine situation
00:49:07.480 has settled
00:49:08.960 into some kind
00:49:09.760 of a stalemate
00:49:10.800 that at least
00:49:12.040 doesn't risk
00:49:12.780 nuclear war.
00:49:14.920 I would say
00:49:15.680 that we are
00:49:16.360 decoupling
00:49:17.360 from China.
00:49:21.140 We do not have,
00:49:24.520 well,
00:49:25.580 the mortality
00:49:26.420 is increasing,
00:49:27.560 but we're not sure
00:49:28.260 it is.
00:49:29.360 Is it?
00:49:29.860 The excess mortality,
00:49:33.100 you're right,
00:49:33.680 but it's probably
00:49:34.460 also temporary
00:49:35.340 because whatever
00:49:37.340 caused this
00:49:38.060 excess mortality
00:49:39.000 is,
00:49:39.840 unless it's the
00:49:40.460 vaccinations,
00:49:41.160 I suppose,
00:49:42.060 but even if it is,
00:49:43.160 it's likely to be
00:49:44.460 done in a year
00:49:45.940 because if what you
00:49:47.880 did is delay
00:49:48.560 something,
00:49:49.400 delay treatment,
00:49:50.180 it's going to get
00:49:50.720 you in a year,
00:49:52.060 right?
00:49:52.900 If the vaccinations
00:49:54.260 were a problem,
00:49:55.420 and I don't see
00:49:56.100 that as being true,
00:49:57.460 but if they were,
00:49:58.320 most of it's
00:49:59.460 going to happen
00:49:59.900 soonish.
00:50:01.340 Most of it
00:50:01.980 doesn't happen
00:50:02.480 down the road.
00:50:03.780 So whatever is
00:50:04.760 causing the excess
00:50:05.560 mortality,
00:50:06.300 we've probably
00:50:07.180 already chewed
00:50:08.300 through most of the
00:50:09.680 problem.
00:50:11.460 Right?
00:50:15.520 So what else
00:50:17.620 is going on?
00:50:18.700 I mean,
00:50:19.000 even our ability
00:50:20.260 to mitigate
00:50:21.180 serious death
00:50:23.200 and risk
00:50:23.700 from cataclysms
00:50:25.580 is really good.
00:50:26.820 So we might
00:50:27.860 have worse
00:50:28.340 hurricanes,
00:50:29.140 maybe not,
00:50:30.700 but we'll
00:50:31.300 survive them
00:50:31.960 better.
00:50:36.240 Yeah,
00:50:37.140 fentanyl is
00:50:37.700 getting worse,
00:50:39.300 but I feel
00:50:40.200 like,
00:50:40.580 I feel fentanyl
00:50:41.620 is approaching
00:50:43.640 a tipping point.
00:50:46.100 Here's the
00:50:46.800 tipping point.
00:50:47.880 Let me give you
00:50:49.340 an example of
00:50:49.920 how close it is
00:50:50.660 to the tipping
00:50:51.160 point.
00:50:52.180 I'm not going
00:50:53.120 to do this,
00:50:54.820 but imagine
00:50:56.020 I did.
00:50:57.580 Imagine I ran
00:50:58.420 for president
00:50:59.000 and I had
00:51:00.220 only one policy.
00:51:02.140 I'm going to
00:51:02.540 stop fentanyl.
00:51:05.080 And then the
00:51:05.980 reporters would
00:51:06.760 say,
00:51:07.040 what is your
00:51:07.780 view on
00:51:09.380 abortion?
00:51:10.100 And I would
00:51:10.400 say,
00:51:11.020 you work on
00:51:11.840 abortion.
00:51:12.500 I'm going to
00:51:12.900 work on
00:51:13.300 stopping fentanyl.
00:51:14.980 And then they
00:51:15.340 say,
00:51:15.980 but Scott,
00:51:16.940 what do you
00:51:17.640 feel about
00:51:18.560 international trade
00:51:20.180 and the trade
00:51:21.500 agreement?
00:51:22.060 And I'd say,
00:51:22.840 well,
00:51:23.080 why don't you
00:51:23.440 work that out?
00:51:24.100 I'm going
00:51:24.820 to go
00:51:25.000 stop fentanyl.
00:51:26.940 It's all
00:51:27.560 I'm going
00:51:27.900 to do.
00:51:29.360 And let's
00:51:29.740 say I had
00:51:30.160 an idea
00:51:30.560 that actually
00:51:31.180 looked like
00:51:31.600 it could
00:51:31.880 work.
00:51:33.260 You know,
00:51:33.540 like stopping
00:51:34.100 all trade
00:51:34.500 from China
00:51:35.020 until they
00:51:35.420 stop it,
00:51:35.960 for example.
00:51:36.900 Or killing
00:51:37.740 the guy
00:51:38.160 in the
00:51:39.220 country.
00:51:42.280 Here's what I mean by a tipping point.
00:51:45.360 I could get elected on that.
00:51:48.380 I could get elected on that.
00:51:50.360 I could get elected on: I'm not even going to talk about the other stuff.
00:51:54.120 I'm just going to solve one problem.
00:51:57.420 And I won't even run for re-election if I solve it.
00:52:00.300 I'll give you one term. I'll solve one problem.
00:52:03.660 Now, the other stuff, I'll do my best.
00:52:05.540 I'll listen to the experts, but I'm not even going to debate it with you.
00:52:09.140 What are you going to do about taxes? I don't know. I don't know.
00:52:12.780 We'll work that out later. I'm not even worried about it.
00:52:15.740 Just one thing.
00:52:21.280 One way or another, I'm going to ship all their fucking students back home.
00:52:26.200 I'm going to stop all trade.
00:52:27.720 I'm going to remove our ambassadors.
00:52:29.500 I'm going to kill that fucking dealer in country, in China, and I'm not going to have any remorse for it.
00:52:37.700 That's all I'd promise.
00:52:39.460 Now, here's what I want you to think.
00:52:41.900 Now, none of this is real because I'm not running for president.
00:52:44.560 But the point is, I could win.
00:52:48.520 Think about it. You don't think I could win? With just one point.
00:52:53.980 Because nobody could touch me.
00:52:56.860 I would be the only one who was doing anything for the country, and you would fucking know it.
00:53:02.320 You would know it.
00:53:03.800 Because if I stopped the fentanyl trade, the odds of me getting killed would be very high,
00:53:08.340 by either the cartel or by China or somebody.
00:53:12.640 I would be risking my life to save 100,000 Americans,
00:53:18.700 and I would let everything else work its way through the system.
00:53:22.040 I'd let Congress make decisions.
00:53:24.220 If I ran as a Democrat, I'd let the Democrats give me some proposals and I'd look at them.
00:53:29.280 If I ran as a Republican, I'd let them give me some proposals.
00:53:31.940 And I'd let either side give me some proposals.
00:53:34.880 I would just look at the experts, I'd look at the thing,
00:53:38.220 and I'm not even going to tell you what I'm going to decide.
00:53:40.560 I'll just tell you how I decide.
00:53:42.720 I won't even tell you what the decision will be.
00:53:45.560 All right, here's how I will decide.
00:53:47.600 I will debate this in public. That's how I'll decide.
00:53:51.080 What are you going to do about this tax change? I'll debate it in public.
00:53:57.140 I'll let the public see it the same time I see it.
00:54:00.060 I'll ask tough questions, and if you can't answer these tough questions in public,
00:54:04.080 you're going to be very embarrassed, because I'm going to make you embarrassed.
00:54:07.720 How will it turn out? I don't know. But that's my process.
00:54:12.760 So here's the thing that should blow your mind.
00:54:15.680 If you don't think fentanyl is really close to being the only topic that matters, you're not paying attention.
00:54:24.280 It's killing 100,000 people a year, and it's intentional.
00:54:28.900 It's that close.
00:54:30.660 And do you think that I have the skill to make people stop caring about everything else for a while?
00:54:37.200 I do.
00:54:38.820 Enough people, right? It just has to be enough to get elected.
00:54:42.220 I do. I do have that skill.
00:54:44.300 And it's because the topic makes it easy.
00:54:47.640 Nobody can top the topic.
00:54:49.380 You can't, there's no high ground above that topic. You can't get there.
00:54:54.220 If somebody says, ah, and I'm going to worry about your pronouns, and I'll say, that's great.
00:54:59.860 I think everybody should be treated with respect.
00:55:02.440 You're also not doing anything useful for the public.
00:55:06.460 I'm going to be stopping fentanyl, and you're going to be talking about pronouns.
00:55:10.880 There's your choice, public. Take your choice.
00:55:13.840 It would be easy to become president with just one topic.
00:55:18.320 That's how weak politics is right now.
00:55:20.840 Even though Trump is a monster, you know, like he's very capable even if you don't like what he's doing,
00:55:27.700 there still isn't anybody good.
00:55:31.240 Because Reagan was good. Reagan was good.
00:55:35.200 We don't have anybody like that.
00:55:37.920 Trump is his own thing.
00:55:41.420 I mean, he's capable in a way I've never seen anything like it.
00:55:45.220 He's very persuasive.
00:55:46.700 But he's not Reagan.
00:55:50.620 You know, in some ways, he's got some advantages over Reagan, I would say.
00:55:55.480 But overall, there's nobody operating at a Reagan level.
00:56:00.280 And what I heard Reagan do was defuse his criticism of the other side.
00:56:05.240 I forget what the topic was.
00:56:06.840 But there was one of those red meat situations where a politician of today
00:56:12.060 would have used that to just crap on the other side.
00:56:15.260 And Reagan took the high ground.
00:56:16.940 I forget what it was.
00:56:18.420 But he treated the other side with such respect.
00:56:22.260 I thought, wow, you don't see that.
00:56:24.700 And it was effective.
00:56:26.740 Because it made the people on the other side say, oh, looks like he's listening to us a little bit.
00:56:32.340 That's not so bad.
00:56:34.460 Isn't that how he got elected by such landslides?
00:56:37.920 Because he treated the other team like, you know, he did this sort of joshing,
00:56:42.940 ha, ha, ha, you know, I'm definitely right, and I'm confident that I'm right, but I love you guys too.
00:56:50.300 It felt like a fatherly, I love you even if you disagree,
00:56:54.840 whereas Trump is more like, you're on my team or I must destroy you.
00:56:59.060 I mean, he's not, because he's team America.
00:57:00.940 But you could easily feel that way because of the style of his rhetoric.
00:57:09.000 Yeah, Trump is noisy and expensive. That's true.
00:57:17.060 All right, is there any other topic?
00:57:21.480 Yeah, if all I did was stop fentanyl and TikTok, I would be the greatest president ever elected,
00:57:31.720 and I would get out after one term.
00:57:36.480 I would go all George Washington on it.
00:57:41.560 All right.
00:57:48.300 We'll never know. Yeah, I can't run.
00:57:50.800 Do you know why I can't run?
00:57:53.600 Does anybody know why I could never run for president?
00:57:57.260 I mean, legally, I could, I suppose.
00:58:00.000 Yeah, I think I can.
00:58:02.860 Too old? No. I'm almost too old.
00:58:11.040 Yeah, bad Vs. Yeah, bad Vs.
00:58:15.460 I can't walk up and down those ramps.
00:58:19.240 No, my problem would be my background.
00:58:23.680 So the only way that I could run for office is with the following promise.
00:58:30.120 I guess I could do it.
00:58:31.380 And the promise would be this.
00:58:34.260 I'm not going to be your role model, and if you look into my personal life, you're not going to like it.
00:58:39.920 But if it entertains you, you should believe all of it's true. It won't be.
00:58:44.800 You know, half of what you read about my personal life is not going to be true.
00:58:48.200 But if it entertains you, you should believe it.
00:58:53.400 But my proposition would be the following.
00:58:56.600 Whatever you think about me personally, or what I have said or done, assume it's all true.
00:59:02.820 Don't vote for me by giving me the benefit of the doubt.
00:59:06.480 Assume the worst things you hear about me are true,
00:59:10.100 and I'm only going to want to solve one problem for you.
00:59:13.840 Fentanyl.
00:59:14.940 If you think I can do that, then maybe look out for your own best interest.
00:59:21.220 Your own best interest is to get that problem solved.
00:59:24.260 It's not in your best interest to make a moral point about your moral superiority to me,
00:59:29.420 because I stipulate that.
00:59:32.240 If you'd like me to stipulate that the voters are morally superior to me as a candidate,
00:59:38.820 I give you that, freely and without reservations.
00:59:41.420 You are my moral betters.
00:59:44.700 I'm not running to be your God.
00:59:47.360 I'm not running to be Jesus.
00:59:49.480 I'm just running to stop fentanyl from getting into your kids. That's it.
00:59:53.260 If you want that, hire me like you would hire a plumber.
00:59:58.300 You got a leak in your pipes? I'm going to fix your leak.
01:00:01.100 You don't need to know what I'm doing at home.
01:00:04.460 Just assume it's all true.
01:00:08.500 Somebody on Locals says I sound like a boring Trump.
01:00:13.680 Could boring get elected?
01:00:16.120 Do you think boring could get elected?
01:00:19.200 It could if it's provocative.
01:00:21.700 But I guess that wouldn't be boring.
01:00:28.140 Yeah, I think you could be provocative and honest at the same time.
01:00:32.600 Usually the way people are provocative is by saying something that's a little sketchy.
01:00:37.940 That's how you get people's attention.
01:00:39.360 DeSantis is kind of boring but he can get elected.
01:00:47.180 Is he? I don't know.
01:00:50.440 I think DeSantis is learning to be less boring,
01:00:54.860 because I think it's a learned skill and I think he is learning it.
01:00:59.420 Yeah, Tom Cotton would need some help.
01:01:02.220 So Tom Cotton has a real solid resume, but he doesn't light up the screen.
01:01:09.360 But here's the question. Could he?
01:01:11.620 Do you think Tom Cotton could learn to be more of a charismatic presence?
01:01:20.580 I believe he could.
01:01:23.700 Do you know the definition of charisma?
01:01:26.100 I say this a lot, but it's one of the most useful things you'll ever understand.
01:01:30.960 Especially if you want to have charisma.
01:01:33.740 Charisma is a combination of power and empathy.
01:01:38.620 You have to have both.
01:01:40.380 And here's why that makes sense.
01:01:41.760 If somebody has power but they don't have empathy for you, what do you think of them?
01:01:46.920 They're dangerous.
01:01:48.400 The worst thing is to be around somebody powerful who doesn't care about you.
01:01:52.340 You don't want to be anywhere near that.
01:01:54.840 So power without empathy is just danger.
01:01:58.100 Suppose you had lots of empathy but you had no power.
01:02:01.600 Well, that's somebody who's going to need your help because they have no power.
01:02:07.280 So empathy doesn't help you.
01:02:10.280 It's nice that they have empathy. They're nice people.
01:02:12.720 But if they have no power, they're probably just a cost to you,
01:02:18.700 because you're going to have to give them something, in the form of money or something else.
01:02:23.800 So if you have power and you have empathy, everybody wants to be around you,
01:02:29.420 because they want to take advantage of your power, and they want to take advantage of it
01:02:32.780 for their benefit, which is what the empathy gets them.
01:02:36.020 So how could Tom Cotton have more charisma?
01:02:40.640 It would be easy, because that's the formula.
01:02:44.000 There's a formula for charisma. Just follow it.
01:02:46.700 So does he have power? Yes.
01:02:50.180 Because he's a sitting senator. He's highly educated. He's successful in his job.
01:02:56.620 He gets on TV. People listen to him.
01:02:59.720 And he'll probably run for president or he's thinking about it.
01:03:03.540 He has power.
01:03:04.660 Now if he became president he would have far more real power.
01:03:08.980 So the power part he has nailed.
01:03:11.580 But what about the empathy part?
01:03:14.320 That's the part he's missing.
01:03:16.660 Now I'm not saying he doesn't have empathy on the inside. I'm saying it's not projecting.
01:03:21.220 Could he learn to project it?
01:03:23.980 Yes. Because it's acting.
01:03:27.820 Yeah, the empathy they have on the inside, nobody can see.
01:03:33.400 So the acting is making sure everybody can see it.
01:03:36.440 And then you say to yourself, but that would be a big phony.
01:03:39.500 No, everybody's a phony. I'm a phony right now.
01:03:44.220 Do you think the way that I'm talking to you right now on live stream
01:03:47.640 is the way that I talk to a six-year-old? No.
01:03:51.460 Is it the way I would talk to my mother? No.
01:03:54.340 Is it the way I would talk to a lover? No.
01:03:57.240 No, I talk differently for every situation, just like you do.
01:04:01.200 We're all acting all the time. You have to get over that.
01:04:05.860 If you don't get over the fact that we're all acting all the time...
01:04:10.220 Some people say they're not acting. No, you're acting. Sorry.
01:04:14.980 And if you're not acting, you're doing it wrong.
01:04:17.820 Because acting is usually what gets you a better outcome.
01:04:22.380 So, could I teach Tom Cotton to have more charisma?
01:04:26.920 Yes, because he has the power, and he would have more if he's president.
01:04:30.880 And his charisma is really just about adding more empathy.
01:04:35.360 Because when Tom Cotton tells you what we should or should not do, he takes a hawkish approach,
01:04:41.980 and he says, here are the hard things we should do, and we should do these right away.
01:04:46.000 What's missing is understanding how we're feeling about it.
01:04:51.380 And all he'd have to do is say it out loud.
01:04:54.140 All right, I understand that this could be scary for all of you,
01:04:57.720 but here are the things we're weighing, and I'm going to make sure that the least scary thing is what happens.
01:05:03.360 Right? Something like that.
01:05:05.100 So in other words, you just have to use your words and first say what the people are feeling.
01:05:11.220 After you say, I understand what everybody's feeling, that's the empathy part,
01:05:15.520 then you can be as tough as you want.
01:05:18.300 So Reagan did that right.
01:05:20.540 Reagan acted like he had genuine empathy, and then when he acted tough,
01:05:25.620 people were like, okay, all right.
01:05:28.700 Now, Trump does the opposite.
01:05:31.300 Trump is on the side of Americans, and he says it intellectually and consistently,
01:05:37.500 but the Democrats don't feel it, do they?
01:05:40.760 They feel his power. They definitely feel his power.
01:05:44.940 Everybody can feel that.
01:05:46.680 But do they feel his empathy?
01:05:48.420 They do not. They do not.
01:05:51.540 And that's Trump's biggest flaw.
01:05:54.760 His biggest advantage is he's a fighter, and I liked him for that. Still do.
01:06:00.840 I like the fact that you can depend on him to fight and not give up.
01:06:05.520 The trouble is, if it's something like losing an election, he still fights and he doesn't give up.
01:06:12.580 So I have not condemned him as much as others have, because I knew what I was getting.
01:06:20.960 If you get a porcupine as a pet, and then you hurt your hand trying to pet it,
01:06:28.580 it's not exactly the porcupine's problem, is it?
01:06:32.940 Is it? Because you petted a porcupine.
01:06:36.440 If you voted for Trump, did you think you were voting for the person who would give up?
01:06:43.160 No. No, the thing you voted for is he doesn't give up.
01:06:48.440 And then when the election went the way it did, he didn't give up.
01:06:52.340 I don't know. Am I disappointed?
01:06:55.120 I wish things had gone a different way.
01:06:57.840 But I don't wish that he was a different person, because that's what I bought.
01:07:03.020 I bought that. Right?
01:07:06.140 If you buy a gun and you leave it loaded and it shoots you in the foot or something,
01:07:11.880 is it the gun's problem?
01:07:12.960 No, it's what you did and what you expected of the gun.
01:07:18.860 Right?
01:07:19.020 If you buy Trump and then he acts like Trump from the beginning to the end,
01:07:24.420 don't tell me that's Trump's problem.
01:07:27.780 You bought that. You bought exactly that.
01:07:31.100 You didn't buy approximately that, something in that direction. You bought that.
01:07:37.280 And if you didn't know you were buying that, what did you think you were getting?
01:07:42.680 I mean, the most consistent thing about him is he doesn't give up.
01:07:46.740 It's the most consistent thing about his personality. He doesn't give up.
01:07:52.960 All right.
01:07:58.420 I think that's all for today. I think we've done it.
01:08:02.240 YouTube, thanks for joining. Spotify, too.
01:08:06.040 If you, too, would like to have your life changed, apparently, I've done that for a number of people.
01:08:12.760 So go look for that video.
01:08:15.720 791, it was.
01:08:16.940 They're all listed, the live streams I do.
01:08:19.440 So just Google my name and Coffee with Scott Adams and episode 791, and it should pop right up.
01:08:25.800 Maybe it'll change your life, too.
01:08:28.360 And that is all I have for today.
01:08:30.980 Bye for now.