Real Coffee with Scott Adams - March 26, 2022


Episode 1694 Scott Adams: All The News About Russian Generals, Vaccination Safety, Movies Are Dying


Episode Stats

Length

56 minutes

Words per Minute

149.9

Word Count

8,434

Sentence Count

681

Misogynist Sentences

6

Hate Speech Sentences

22


Summary

Every time I find a great drummer to follow, something bad happens to them. Like, a drummer dies hours before going on stage at a rock concert. And no cause of death is given, so you kind of know the cause of death.


Transcript

00:00:00.000 Hey everybody, congratulations and welcome to another amazing, amazing day.
00:00:12.000 Did you know that if I tell you you're going to have a good day, it actually makes it better?
00:00:16.060 No, it's true.
00:00:17.080 If I simply tell you today will be a good day, it will bias you.
00:00:20.760 And your confirmation bias will kick in.
00:00:23.220 And so, as the official hypnotist of live streaming,
00:00:27.180 I'm going to snap my fingers and tell you to have a good day today.
00:00:31.600 Have a good day.
00:00:34.460 Watch how many people are going to have a better day today.
00:00:38.120 It's amazing.
00:00:39.060 Now, suppose you'd like to take it up a level and you'd like to really enjoy the day,
00:00:44.120 really dig in, really give some traction.
00:00:46.260 All you need is a cup or mug or a glass, a tank or chalice or stein,
00:00:48.840 a canteen, jug or flask, a vessel of any kind,
00:00:51.280 filled with your favorite liquid.
00:00:54.400 I like coffee.
00:00:55.180 And join me now for the unparalleled pleasure.
00:00:58.340 It's the dopamine hit of the day.
00:00:59.980 Everybody's talking about it.
00:01:01.040 It's the simultaneous sip,
00:01:02.460 and it's going to make all of you sexier, smarter, and more capable.
00:01:06.300 Go.
00:01:11.300 Hold on, hold on, everybody.
00:01:12.940 Hold on.
00:01:14.200 Ben isn't ready.
00:01:16.240 Everybody?
00:01:17.160 We're all in this together.
00:01:19.440 I just got an emergency message on YouTube from Ben.
00:01:22.680 Wasn't quite prepared.
00:01:23.720 Everybody, hold on.
00:01:25.800 Ben?
00:01:27.180 Ben, are you ready?
00:01:29.180 I think he's ready.
00:01:30.840 Go.
00:01:34.820 All right.
00:01:35.600 Now, if you didn't make it that time, Ben,
00:01:38.700 it just wasn't there for you.
00:01:42.300 Well, did anybody join me last night in the man cave?
00:01:49.420 I did an unscheduled live stream in the man cave in which I decided that my thing is going to be telling you stories.
00:01:56.200 Because it turns out I've accumulated a vast amount of weird, interesting stories.
00:02:04.420 And so I was trying them out last night on my local subscription platform.
00:02:10.120 And I think those might have been available to other people.
00:02:14.380 I can't remember what setting I had.
00:02:17.320 But that's going to be my thing.
00:02:18.760 I'll do some more man cave live streams.
00:02:22.280 Well, in a weird simulation-like tragedy, as you know, I've been learning to play the drums in recent years.
00:02:32.120 So it's sort of my main hobby is learning to play the drums.
00:02:36.360 And so part of that process and that journey, you end up following great drummers because you want to see how the great ones do it.
00:02:43.840 And so I feel like every time I find a great drummer, something bad happens to them.
00:02:50.200 So I started out, you know, looking at Ginger Baker from the old group Cream.
00:02:57.500 But he's dead.
00:02:58.680 So I was like, oh, wouldn't it be cool to follow somebody who's alive?
00:03:02.360 And so I was looking at Neil Peart.
00:03:05.360 And he died.
00:03:07.640 And then I started getting interested in the drummer for the Foo Fighters, Taylor Hawkins.
00:03:13.140 He died last night at age 50.
00:03:16.480 I don't think any of this is my fault.
00:03:20.900 John Bonham, deceased.
00:03:23.940 Yeah, so basically Ringo Starr is all we have left at this point.
00:03:29.080 Charlie Watts, exactly.
00:03:31.040 Charlie Watts was literally who I was copying when he died.
00:03:36.620 I was actually playing his videos and like, oh, he's got...
00:03:39.520 He had a simpler style that was designed for being danceable.
00:03:44.920 So that was Charlie Watts.
00:03:46.040 It just has to be danceable.
00:03:47.280 Everything else doesn't matter.
00:03:48.420 So how weird is this?
00:03:53.680 Every time I love a drummer, they die.
00:03:55.500 Now, the story is that he was about to...
00:03:58.460 He was a few hours from going on stage in Colombia, I believe.
00:04:01.400 And he was age 50.
00:04:04.100 He looked perfectly healthy.
00:04:06.320 I mean, if you looked at him, he looked...
00:04:08.640 He didn't look 50, first of all.
00:04:11.720 Did you know he was 50?
00:04:13.140 I mean, he looked 40.
00:04:14.880 So he didn't look like he was going to die at all.
00:04:17.900 But he died at age 50, and no cause of death was given.
00:04:23.520 No cause of death.
00:04:24.780 He's in Colombia.
00:04:26.720 He was 50.
00:04:30.880 It was right before a rock concert.
00:04:34.940 He was maybe preparing for the rock concert.
00:04:37.640 You know, I really feel like when they don't tell you the cause of death,
00:04:46.420 you kind of know the cause of death, don't you?
00:04:48.780 I mean, I don't want to throw this guy under the bus,
00:04:51.600 because I actually respected him quite a bit as a performer.
00:04:56.080 So I liked him a lot.
00:04:57.340 So I don't want to make up some crappy rumors about the guy that aren't true.
00:05:01.260 We'll wait to see what happens.
00:05:02.700 So I don't want to predict.
00:05:08.880 Oh, yes, I do.
00:05:10.980 Yes, I do.
00:05:12.460 It's probably drugs and probably fentanyl.
00:05:15.640 Now, what's it going to take?
00:05:19.120 What's it going to take?
00:05:20.380 Now, I'm not going to assume that I'm right there.
00:05:22.600 So if it turns out that it's not fentanyl,
00:05:25.340 I have an apology to give,
00:05:27.500 and I will definitely be giving it.
00:05:28.940 All right, so if tomorrow you find out he had a heart problem or something,
00:05:34.640 I will be apologizing profusely.
00:05:37.360 But I am concerned that this looks obviously...
00:05:41.860 Oh, Stewart Copeland is still alive.
00:05:43.800 You're right.
00:05:45.240 This looks exactly like what it looks like.
00:05:47.960 And I hope it isn't.
00:05:49.380 I really hope it isn't.
00:05:51.580 But if it is,
00:05:53.360 we might be closer to taking it seriously.
00:05:56.400 Well, is there any story that starts with the three words
00:06:01.180 a Florida man that is not entertaining?
00:06:05.200 I should do a survey on that.
00:06:07.160 Have you ever seen a story
00:06:08.420 that started with the three words
00:06:10.620 a Florida man,
00:06:12.280 and then you found yourself saying,
00:06:14.420 well, that's not going to be interesting?
00:06:16.840 Well, here's the story today.
00:06:18.260 I think it was in CNN.
00:06:19.400 A Florida man
00:06:21.680 has died after crashing his car
00:06:25.020 into an 11-foot alligator.
00:06:30.280 And then they reported
00:06:31.460 that both the driver
00:06:33.040 and the alligator were deceased.
00:06:37.260 I kind of wonder
00:06:38.260 why there are not more alligator fatalities.
00:06:42.860 You know,
00:06:43.480 the only times I've been in Florida,
00:06:45.180 I feel like I always see an alligator by the road.
00:06:49.800 Am I wrong?
00:06:51.280 If you're driving around in Florida,
00:06:53.080 don't you,
00:06:54.000 you always see alligators by the road?
00:06:56.440 How in the world
00:06:57.160 are there not continuous vehicle deaths
00:07:00.780 based on alligators?
00:07:04.560 I don't know.
00:07:05.180 The thing that surprises me
00:07:06.340 is that we don't hear it every day.
00:07:07.380 Well, here's a story
00:07:11.440 about Bill Maher's bubble.
00:07:14.180 Here's something that he said,
00:07:15.600 Bill Maher said on the show,
00:07:16.880 Friday, I guess.
00:07:18.240 He said,
00:07:18.700 I think that today's Republicans
00:07:20.000 would not do that.
00:07:21.900 I think that they would be thrilled
00:07:23.080 to have no black seats on the court
00:07:25.340 talking about the Supreme Court.
00:07:28.340 And then he clarified,
00:07:29.640 you know,
00:07:29.860 okay, maybe a few.
00:07:32.280 Well,
00:07:32.900 you know,
00:07:33.160 he said,
00:07:33.500 okay,
00:07:34.060 a lot of them.
00:07:35.000 So he wasn't saying
00:07:36.000 every single Republican.
00:07:40.380 But I would like to add
00:07:41.700 the following
00:07:42.440 to his bubble.
00:07:45.120 Here's something
00:07:45.800 that Bill Maher
00:07:46.900 really, really doesn't understand.
00:07:49.660 And I want to see
00:07:50.620 if you all understand it
00:07:51.900 or disagree with me.
00:07:53.840 You know,
00:07:54.040 you're free to disagree.
00:07:56.160 I'm going to tell you
00:07:57.240 a story of two people talking
00:07:59.080 and how it goes.
00:08:02.220 And you tell me,
00:08:03.120 this is,
00:08:03.660 I'm just making this up.
00:08:05.180 But you tell me
00:08:05.780 this isn't how it goes.
00:08:08.040 There's a white conservative guy
00:08:10.220 gets into conversation
00:08:11.840 with a black man
00:08:13.460 and he doesn't know
00:08:14.940 if the black guy
00:08:15.640 is conservative or what.
00:08:18.180 And maybe they get
00:08:19.160 into a little disagreement
00:08:20.000 about something
00:08:20.640 and then the black guy says,
00:08:21.940 well,
00:08:22.060 I just want to clarify.
00:08:23.700 I'm a conservative.
00:08:25.680 And the white guy says,
00:08:26.760 what?
00:08:27.180 Oh,
00:08:27.700 odds were against that.
00:08:29.500 And then the guy says,
00:08:30.420 yeah,
00:08:30.660 I'm totally conservative.
00:08:31.840 Church going.
00:08:33.000 Love my constitution.
00:08:34.060 I think everybody
00:08:35.540 should earn their own way
00:08:36.980 and race should not be
00:08:38.140 part of any decisions
00:08:39.280 in this country.
00:08:41.040 And then what does
00:08:41.780 the white guy say?
00:08:42.900 He's a white conservative.
00:08:44.660 Does he say,
00:08:45.480 in his mind,
00:08:46.320 does he say,
00:08:47.720 you know,
00:08:48.420 but,
00:08:49.040 yeah,
00:08:49.400 you do agree with me
00:08:50.420 on all the things
00:08:51.080 that I find most dear,
00:08:53.080 but,
00:08:53.400 you know,
00:08:53.780 you're still black.
00:08:55.740 So,
00:08:56.380 is that what happens?
00:08:58.000 Is that what happens?
00:08:59.600 Does the white guy,
00:09:00.540 the white conservative,
00:09:01.280 the Republicans say,
00:09:03.080 oh,
00:09:03.380 you know,
00:09:03.720 you do agree with me
00:09:04.540 on all the important stuff,
00:09:06.180 but,
00:09:06.520 you know,
00:09:06.720 you're still black.
00:09:07.660 So,
00:09:08.740 so,
00:09:09.820 no.
00:09:12.220 I don't think
00:09:13.340 in the history of the world
00:09:14.220 that's ever happened.
00:09:15.500 Let me tell you
00:09:16.080 how that goes.
00:09:17.760 Oh,
00:09:18.260 you probably don't know,
00:09:19.160 but I'm actually
00:09:19.860 a conservative.
00:09:21.220 And then the white conservative
00:09:22.120 says,
00:09:22.540 what?
00:09:23.380 Yeah,
00:09:23.740 right down the line.
00:09:25.080 Constitution,
00:09:25.860 Bible,
00:09:26.360 family,
00:09:27.360 whole deal.
00:09:27.880 And then the white conservative
00:09:30.320 says,
00:09:31.480 huh,
00:09:32.860 are you free for lunch?
00:09:34.560 Let's do lunch.
00:09:36.540 You know that's the fucking
00:09:38.020 way it goes,
00:09:39.080 right?
00:09:39.660 Why does Bill Maher
00:09:40.720 not know that?
00:09:42.380 Am I right?
00:09:43.660 I'm literally right.
00:09:45.840 If,
00:09:46.300 if the conservative
00:09:47.780 finds somebody
00:09:48.520 who agrees
00:09:49.440 with the philosophy,
00:09:51.660 they're 100% okay.
00:09:54.200 Not 99.
00:09:56.160 Not 99%.
00:09:56.880 99%.
00:09:57.760 They're 100%
00:09:59.500 okay.
00:10:01.240 No exception.
00:10:03.620 That's my opinion,
00:10:04.760 and I've never seen
00:10:05.380 an exception to it.
00:10:07.260 Now,
00:10:07.940 I tell you often
00:10:08.900 that I'm not conservative,
00:10:10.300 because I'm not.
00:10:11.740 But,
00:10:12.540 I definitely appreciate
00:10:14.300 that about conservatives.
00:10:16.580 There is a consistency there
00:10:18.420 that's impressive.
00:10:20.940 Right?
00:10:21.640 It's that consistency
00:10:22.980 that draws me
00:10:24.900 to conservatives
00:10:25.680 and Republicans.
00:10:27.720 You know,
00:10:28.000 they,
00:10:28.540 I'm not going to be
00:10:29.500 the guy who tells you
00:10:30.240 that the Republicans
00:10:31.240 don't have a lot
00:10:31.980 of conspiracy theories
00:10:33.180 running around
00:10:33.880 in their brains,
00:10:34.700 because they do.
00:10:35.860 They do.
00:10:36.760 So does,
00:10:37.200 so does everybody.
00:10:38.360 Just different ones.
00:10:39.620 So it's not about that.
00:10:41.100 It's not about
00:10:41.780 the conservatives
00:10:42.420 are always right
00:10:43.240 and,
00:10:43.720 you know,
00:10:44.820 the Democrats
00:10:45.420 are always wrong.
00:10:46.160 I don't really see that.
00:10:46.980 What I see
00:10:49.260 is a group
00:10:50.000 that has a consistent
00:10:51.080 workable system
00:10:52.240 that they respect
00:10:53.820 and they're,
00:10:55.280 they're quite consistent
00:10:56.400 about it.
00:10:57.580 I love that.
00:10:59.140 I love that.
00:11:00.220 Even when it gives me
00:11:01.500 a result
00:11:02.000 that's not exactly
00:11:02.800 my preference.
00:11:03.960 But I love the system
00:11:04.860 and I love,
00:11:05.680 I love the people
00:11:06.740 who buy into that system.
00:11:07.960 It's not like I dislike
00:11:08.780 the other people,
00:11:09.540 but there,
00:11:10.380 there's much to respect
00:11:11.440 about people
00:11:13.140 who take seriously
00:11:14.260 their own philosophy.
00:11:15.140 And I do think
00:11:16.880 conservatives
00:11:17.360 take seriously
00:11:18.220 their own philosophy.
00:11:20.240 All right.
00:11:21.220 I've been telling you
00:11:22.060 for a while
00:11:22.380 that I think movies
00:11:23.400 are a dead art form
00:11:24.920 for a variety of reasons.
00:11:26.220 The biggest one
00:11:26.940 is that our attention
00:11:27.940 spans have declined.
00:11:30.120 Watching a two
00:11:31.120 or three hour movie
00:11:31.920 just sounds like
00:11:32.500 torture to me.
00:11:33.720 I can't imagine
00:11:34.500 how anybody
00:11:34.960 does it anymore.
00:11:36.300 But beyond that,
00:11:37.260 the movies themselves
00:11:38.020 are complete crap.
00:11:40.300 There must have been
00:11:41.320 a time in our world
00:11:43.240 where we were not
00:11:44.160 exposed to so much
00:11:45.340 ugliness
00:11:45.920 that we could watch
00:11:47.480 it for entertainment.
00:11:50.700 And I don't know
00:11:51.500 if this is what
00:11:52.040 happened to me.
00:11:52.780 It feels like it.
00:11:53.920 Maybe it happened
00:11:54.560 to you too.
00:11:55.720 I suppose in my childhood
00:11:57.360 I just was not exposed
00:11:59.780 to as many negative things,
00:12:01.720 maybe because no internet
00:12:02.700 or whatever.
00:12:04.000 TV was all,
00:12:05.060 you know,
00:12:05.700 sanitized.
00:12:06.940 So if I went to a movie
00:12:08.380 and the movie showed
00:12:09.320 some like horrible thing
00:12:10.360 happening to somebody,
00:12:12.020 it would be so
00:12:13.560 out of the usual,
00:12:14.900 out of the normal
00:12:15.640 for my experience,
00:12:16.800 I'd be interested in it
00:12:18.020 even if it's horrible
00:12:18.760 because unfortunately
00:12:20.580 that's the way brains work.
00:12:22.320 But now we live
00:12:23.440 in a world in which
00:12:24.280 we are inundated
00:12:25.160 with real world
00:12:26.400 disaster scenarios.
00:12:29.080 We're all going to die
00:12:30.180 from the bio labs
00:12:31.360 and the inflation
00:12:32.080 and the food shortages
00:12:33.300 and Russia's going to attack
00:12:35.140 and it's a nuclear war
00:12:36.360 and everything.
00:12:37.400 And my capacity
00:12:38.740 to handle ugliness
00:12:40.180 is always
00:12:41.620 completely full
00:12:43.040 so that any extra bit
00:12:45.660 of ugliness
00:12:46.360 that's added
00:12:46.920 it just makes me
00:12:47.800 flip out, right?
00:12:51.280 Everybody's the same.
00:12:52.560 There's a limit
00:12:53.380 that you can take
00:12:54.060 and then there's
00:12:54.580 over the limit.
00:12:55.920 And to me,
00:12:57.220 paying money
00:12:58.040 to go across town,
00:13:00.360 let's say you're
00:13:00.820 watching it in person,
00:13:02.960 and sit in an
00:13:04.160 uncomfortable seat
00:13:05.160 and watch three hours
00:13:07.460 of something
00:13:08.000 that is designed
00:13:09.100 to make you
00:13:09.900 feel uncomfortable.
00:13:11.900 It's made that way.
00:13:13.500 That's not an accident,
00:13:14.440 that's the feature.
00:13:15.680 The feature of a movie
00:13:16.900 is to make you
00:13:17.600 feel really bad
00:13:18.720 so that when
00:13:19.940 the end of the movie
00:13:20.660 comes they can
00:13:21.440 relieve that.
00:13:22.320 It's like an itch
00:13:23.060 and a scratch.
00:13:24.380 So first they make you
00:13:25.360 itch really badly
00:13:26.140 and then if you
00:13:28.160 can sit through
00:13:28.820 all three hours
00:13:29.500 they'll give you
00:13:30.200 a little scratch
00:13:30.880 and that's your payoff.
00:13:32.960 Who does that?
00:13:35.180 Why?
00:13:35.540 Why would you
00:13:37.360 go to a movie?
00:13:39.020 It makes no sense
00:13:40.240 because you have
00:13:42.100 infinite alternative
00:13:43.900 entertainment
00:13:45.180 on the internet.
00:13:46.400 You can go
00:13:47.020 into YouTube
00:13:47.480 and have
00:13:48.180 50 cool
00:13:50.240 short experiences
00:13:51.280 that are just
00:13:51.880 what you wanted
00:13:52.540 with nobody
00:13:53.520 tied to a chair
00:13:54.300 to be tortured
00:13:54.880 for information.
00:13:57.800 By the way,
00:13:58.860 there's a standard
00:13:59.720 that I use
00:14:00.400 that I would
00:14:00.800 recommend to all
00:14:02.080 of you.
00:14:02.900 It might make
00:14:03.620 movies better.
00:14:04.400 It goes like
00:14:05.540 this.
00:14:06.560 As soon as
00:14:07.280 the movie
00:14:07.740 involves somebody
00:14:09.040 tied to a chair
00:14:10.140 turn it off.
00:14:13.100 Turn it off.
00:14:14.780 That's it.
00:14:15.540 Because as soon
00:14:16.260 as you see
00:14:16.620 somebody tied
00:14:17.140 to a chair
00:14:17.640 you know
00:14:18.040 this isn't
00:14:18.500 a good movie.
00:14:20.380 This is a movie
00:14:21.220 designed to make
00:14:21.920 you feel bad
00:14:22.500 and then you
00:14:23.980 got suckered
00:14:24.540 into going
00:14:25.400 to it.
00:14:26.060 Yeah.
00:14:26.840 Right.
00:14:27.360 As soon as
00:14:27.920 you see the chair
00:14:28.600 even before
00:14:30.480 they're tied
00:14:30.880 to it
00:14:31.220 you know
00:14:31.740 they're going
00:14:32.000 to get tied
00:14:32.380 to the chair
00:14:32.800 turn it off.
00:14:34.400 There's nothing
00:14:34.920 good there.
00:14:36.220 Here's another
00:14:36.940 little tip
00:14:37.580 for watching
00:14:38.560 comedies.
00:14:41.120 If you want
00:14:41.860 to know
00:14:42.120 whether a comedy
00:14:42.980 let's say
00:14:43.380 a sitcom
00:14:43.920 on TV
00:14:44.420 is worth
00:14:45.420 watching to
00:14:45.980 the end
00:14:46.400 see how long
00:14:48.760 it takes them
00:14:49.300 to make a
00:14:49.860 food joke.
00:14:51.680 Now this is
00:14:52.520 real insider
00:14:53.360 humorist
00:14:54.320 stuff here.
00:14:56.440 Bad writers
00:14:57.320 make food jokes.
00:14:58.620 Do you know
00:14:58.920 why?
00:14:59.180 Because when
00:15:01.260 they're writing
00:15:01.780 they're eating.
00:15:06.640 And a bad
00:15:07.780 writer does
00:15:08.340 this.
00:15:09.700 Blank page.
00:15:13.120 What am I
00:15:13.780 going to write
00:15:14.040 about?
00:15:15.700 Eating my
00:15:16.420 donut.
00:15:17.280 Got to come
00:15:18.060 up with an
00:15:18.420 idea.
00:15:20.560 Hey
00:15:20.960 what about
00:15:22.200 something about
00:15:22.660 food?
00:15:23.080 Now
00:15:25.100 it is
00:15:27.320 impossible
00:15:27.860 to make
00:15:28.620 a funny
00:15:29.040 food joke.
00:15:30.540 I actually
00:15:31.160 write about
00:15:31.620 this very
00:15:32.040 thing.
00:15:32.800 You can do
00:15:33.260 jokes about
00:15:33.720 people and
00:15:35.100 about how
00:15:35.460 people feel
00:15:36.000 and maybe
00:15:36.380 you can do
00:15:36.820 a joke about
00:15:37.340 how somebody
00:15:37.840 feels about
00:15:38.940 food.
00:15:39.680 But you can't
00:15:40.220 really make a
00:15:40.740 joke about
00:15:41.120 the food.
00:15:42.540 And if you
00:15:42.960 see somebody
00:15:43.360 making a
00:15:43.860 food joke
00:15:44.820 like
00:15:46.000 love those
00:15:46.920 donuts.
00:15:47.320 Now, Jim
00:15:49.420 Gaffigan.
00:15:50.380 Jim Gaffigan's
00:15:51.420 jokes are
00:15:51.840 really about
00:15:52.280 himself.
00:15:54.200 So that's a
00:15:55.200 little bit of
00:15:55.540 a trick.
00:15:56.620 Yeah, Jim
00:15:57.000 Gaffigan can
00:15:57.800 do hot
00:15:58.700 pocket jokes
00:15:59.480 because it's
00:16:00.740 not really
00:16:01.080 about the
00:16:01.420 food.
00:16:02.520 It's about
00:16:03.200 his reaction
00:16:03.980 to food.
00:16:04.620 That's the
00:16:04.900 only thing
00:16:05.220 that makes
00:16:05.520 it funny.
00:16:08.060 So if you
00:16:08.860 see jokes
00:16:09.300 that are
00:16:09.560 about the
00:16:09.940 food and
00:16:10.560 not the
00:16:10.880 person's
00:16:11.380 response to
00:16:11.900 the food,
00:16:12.260 that's a
00:16:12.520 bad writer.
00:16:13.580 Turn it
00:16:13.920 off.
00:16:15.260 All right,
00:16:15.820 I tweeted
00:16:16.220 this and
00:16:16.760 then I
00:16:17.180 caused an
00:16:17.820 accidental
00:16:18.420 controversy.
00:16:19.900 Here's my
00:16:20.500 tweet.
00:16:22.140 No system
00:16:23.180 can survive
00:16:23.940 the addition
00:16:24.440 of one
00:16:25.080 person to
00:16:25.700 the situation.
00:16:27.840 No
00:16:28.320 personal
00:16:28.800 system for
00:16:29.420 success
00:16:30.040 can survive
00:16:31.900 adding one
00:16:32.700 person to
00:16:33.200 the process.
00:16:34.460 And a
00:16:35.120 lot of
00:16:35.320 people said,
00:16:35.800 oh, you're
00:16:36.380 publicly
00:16:37.420 complaining about
00:16:38.380 your divorce.
00:16:39.380 No, it's
00:16:40.080 not about
00:16:40.400 that.
00:16:40.820 It's actually
00:16:41.320 literally just
00:16:41.960 about systems.
00:16:43.340 So it's
00:16:43.780 not about
00:16:44.140 me personally.
00:16:45.400 It's something
00:16:46.080 I've been
00:16:46.400 noticing.
00:16:46.720 Here's a
00:16:47.920 specific
00:16:48.300 example.
00:16:49.700 Now, if
00:16:50.140 you think
00:16:50.480 this example
00:16:51.060 is unique
00:16:51.720 and maybe
00:16:52.880 an example
00:16:53.360 of the
00:16:53.700 exception,
00:16:54.800 the whole
00:16:55.520 point is
00:16:56.180 that this
00:16:56.480 happens over
00:16:57.020 and over
00:16:57.300 and over
00:16:57.580 again.
00:16:58.500 So what
00:16:58.840 sounds like
00:16:59.480 an exception,
00:17:00.520 trust me,
00:17:01.060 you can
00:17:01.420 generalize
00:17:01.960 this a
00:17:03.040 lot.
00:17:04.460 So I
00:17:05.200 have a
00:17:05.500 system of
00:17:06.780 rewarding
00:17:07.360 myself for
00:17:08.120 work, so
00:17:09.300 I don't
00:17:09.560 mind the
00:17:09.880 work as
00:17:10.220 much, just
00:17:10.780 like a
00:17:11.060 dog.
00:17:11.820 If I do
00:17:12.340 my work, I
00:17:12.940 get a
00:17:13.180 treat.
00:17:13.400 Now, the
00:17:14.420 treat that
00:17:14.800 I give
00:17:15.040 myself is
00:17:15.700 a raisin
00:17:18.420 bagel with
00:17:20.160 coffee in
00:17:20.940 the morning.
00:17:21.940 It's just
00:17:22.540 about my
00:17:23.000 favorite thing
00:17:24.080 to do, to
00:17:25.260 have a nice
00:17:25.660 bagel.
00:17:26.720 You know, you
00:17:27.060 have to wait
00:17:27.520 for it in the
00:17:28.440 toaster, it's
00:17:29.280 piping hot, and
00:17:31.060 there's something
00:17:31.820 about the
00:17:32.240 process of
00:17:32.960 making it that's
00:17:33.600 kind of pleasant,
00:17:34.700 and all of this
00:17:35.560 I treat as a
00:17:36.560 reward.
00:17:37.020 So it's
00:17:38.700 working for
00:17:39.620 years and
00:17:40.900 years and
00:17:41.180 years, my
00:17:42.400 reward.
00:17:44.120 Now, I
00:17:46.400 improved my
00:17:47.480 system.
00:17:48.840 In the
00:17:49.240 morning, I
00:17:49.680 would order
00:17:50.140 fresh bagels
00:17:50.920 every three
00:17:51.640 days or so,
00:17:52.480 because you
00:17:52.740 can eat
00:17:53.740 them for a
00:17:54.380 few days
00:17:54.720 before they
00:17:55.120 get stale.
00:17:56.220 So about
00:17:57.040 every three
00:17:57.380 days, I'll
00:17:57.820 order fresh
00:17:58.280 bagels in
00:17:58.720 the morning,
00:17:59.540 and they
00:17:59.920 get delivered
00:18:00.400 to the
00:18:00.680 door.
00:18:01.740 And a
00:18:02.700 member of
00:18:03.080 my household,
00:18:03.880 I'm going
00:18:04.200 to speak
00:18:04.440 very generically
00:18:05.220 now, just
00:18:05.740 for privacy
00:18:06.320 purposes,
00:18:07.020 but I
00:18:07.640 have several
00:18:08.340 members of
00:18:08.860 my household.
00:18:10.640 And one
00:18:11.560 member of the
00:18:11.960 household said,
00:18:13.560 hey, bagels,
00:18:14.860 can you get
00:18:15.580 me some
00:18:16.280 blueberry bagels,
00:18:18.480 because I
00:18:18.780 really like
00:18:19.280 blueberry bagels.
00:18:20.860 Not really
00:18:21.480 like any other
00:18:22.120 kind, but really
00:18:22.900 like the
00:18:23.320 blueberry bagels.
00:18:25.060 And so, I
00:18:26.400 took my
00:18:26.900 system of
00:18:28.140 ordering raisin
00:18:29.660 bagels on a
00:18:31.100 regular basis,
00:18:31.700 which are
00:18:32.000 awesome, and
00:18:33.300 I added to
00:18:33.820 that the
00:18:34.480 blueberry bagels,
00:18:35.900 so they come
00:18:36.960 in the same
00:18:37.280 order.
00:18:38.740 Now, here's
00:18:39.600 what I
00:18:39.800 didn't count
00:18:40.260 on.
00:18:41.380 A blueberry
00:18:42.000 bagel and
00:18:43.000 a raisin
00:18:43.440 bagel look
00:18:44.740 exactly the
00:18:45.580 same to
00:18:46.580 idiots.
00:18:47.860 In this
00:18:48.480 case, I
00:18:48.960 would be
00:18:49.200 the idiot.
00:18:50.580 Now, I
00:18:50.940 know what
00:18:51.140 you're saying.
00:18:52.280 Scott, I
00:18:53.000 could pretty
00:18:53.420 much easily
00:18:54.000 tell the
00:18:54.420 difference between
00:18:55.020 a blueberry
00:18:55.520 bagel and a
00:18:56.280 raisin bagel.
00:18:57.220 You just have
00:18:57.800 to sniff them,
00:18:59.120 to which I
00:18:59.840 say I do
00:19:00.480 not have a
00:19:02.440 sense of
00:19:02.820 smell.
00:19:03.060 I have to
00:19:04.340 rely on
00:19:04.840 visual.
00:19:06.220 Visually, they
00:19:07.020 look about the
00:19:07.760 same until you
00:19:08.460 cut them open,
00:19:09.680 and then you
00:19:10.100 realize you
00:19:10.520 got the wrong
00:19:10.960 one.
00:19:11.420 Now, they
00:19:11.720 come in a
00:19:12.100 big box.
00:19:13.320 They're all
00:19:13.580 mixed up.
00:19:15.580 So, every
00:19:16.480 morning, my
00:19:18.180 system, which
00:19:19.160 used to be
00:19:19.660 treated as a
00:19:20.360 reward, where
00:19:21.880 I just, oh, I
00:19:22.980 love that
00:19:23.460 raisin bagel,
00:19:24.660 has now been
00:19:25.360 transformed by
00:19:26.780 the addition of
00:19:27.540 one extra
00:19:28.300 person into the
00:19:29.720 situation.
00:19:31.160 Now, every
00:19:32.420 morning, I go
00:19:33.160 and I look at
00:19:33.940 the bag of
00:19:34.380 bagels, and I
00:19:35.380 say to myself,
00:19:36.920 God fucking
00:19:37.880 damn it, I'm
00:19:39.100 not going to be
00:19:39.540 able to know
00:19:39.940 which one is
00:19:40.440 the right bagel.
00:19:41.140 I'm going to
00:19:41.440 waste my
00:19:41.900 fucking time
00:19:42.740 toasting this
00:19:43.780 bagel.
00:19:44.280 I'm going to
00:19:44.640 be done with
00:19:45.160 it.
00:19:45.260 I'm going to
00:19:45.480 put it in my
00:19:45.920 mouth, and
00:19:46.600 I'm going to
00:19:47.020 say, fuck,
00:19:47.940 fuck, fuck.
00:19:48.940 Once again, I
00:19:49.900 picked the
00:19:50.260 wrong bagel.
00:19:52.320 And so, my
00:19:54.260 beautiful system
00:19:55.400 of rewarding
00:19:56.020 myself has
00:19:57.600 turned into a
00:19:58.420 bagel
00:19:58.960 hellscape
00:19:59.660 from which I
00:20:00.940 have not yet
00:20:01.560 figured out a
00:20:02.240 way to emerge.
00:20:04.020 I was thinking
00:20:04.880 I could do two
00:20:05.840 orders of
00:20:06.360 bagels, one of
00:20:07.860 just raisin and
00:20:09.200 one of just
00:20:09.960 blueberry, but it
00:20:11.580 literally doubles
00:20:12.360 my work.
00:20:14.080 I don't want to
00:20:14.960 double my work.
00:20:16.420 That's not a
00:20:16.960 good system.
00:20:18.120 Yeah, two
00:20:18.600 orders?
00:20:19.460 Nope.
00:20:21.980 Now, I could
00:20:22.800 ask them, of
00:20:23.460 course, it's
00:20:23.820 DoorDash, I
00:20:24.400 could ask them
00:20:24.900 to put them in
00:20:25.360 separate bags, but
00:20:26.640 they won't.
00:20:27.780 If you've ever
00:20:28.340 tried to make a
00:20:29.040 special request
00:20:29.820 to DoorDash,
00:20:30.980 they don't
00:20:31.420 really read
00:20:31.920 those.
00:20:32.620 As far as I
00:20:33.240 can tell, they
00:20:33.660 don't read
00:20:34.020 them.
00:20:34.820 So, I don't
00:20:35.640 really know what
00:20:36.100 to do.
00:20:37.300 Now, is this
00:20:38.540 the biggest
00:20:38.860 problem in the
00:20:39.400 world?
00:20:39.680 No, this is
00:20:40.160 the ultimate
00:20:40.960 rich person's
00:20:42.420 problem.
00:20:43.380 However, watch
00:20:45.200 how well that
00:20:46.020 generalizes.
00:20:47.080 Let me give you
00:20:47.480 another example.
00:20:49.120 I created a
00:20:50.020 system where
00:20:50.560 downstairs, I
00:20:51.820 had one
00:20:53.360 drawer in the
00:20:54.700 kitchen that
00:20:55.380 was just for
00:20:55.980 things that I
00:20:56.740 always had to go
00:20:57.400 upstairs to get.
00:20:59.220 So, I kept
00:20:59.860 putting anything
00:21:00.360 important in that
00:21:01.140 drawer.
00:21:01.940 One of the
00:21:02.380 things I put in
00:21:02.960 there was my
00:21:03.460 wallet, because I
00:21:05.080 was always running
00:21:05.600 upstairs to get it if
00:21:06.540 I needed to leave
00:21:07.300 the house, or my
00:21:08.080 keys, you know, that
00:21:09.000 sort of stuff.
00:21:10.500 So, my ex-wife at
00:21:13.140 the time said, don't
00:21:15.660 keep a wallet down
00:21:16.420 there, somebody might
00:21:17.820 get at it.
00:21:18.860 Now, of course, I
00:21:19.640 had made that
00:21:20.640 conscious decision that
00:21:22.460 there was a little
00:21:22.900 extra risk, but the
00:21:24.540 convenience was worth it.
00:21:25.480 But when somebody
00:21:26.880 else tells you not
00:21:27.860 to do it, you're
00:21:28.300 like, ah, yeah, I'm
00:21:30.040 going to have to get
00:21:30.600 into this conversation
00:21:31.360 again.
00:21:31.740 So, I was like,
00:21:33.020 okay, I'll move the
00:21:34.440 wallet, but it'll
00:21:35.800 still be my drawer of
00:21:36.880 everything else.
00:21:38.200 Well, about that
00:21:40.300 point, my ex-wife
00:21:41.040 decided that that was
00:21:41.920 a good junk drawer.
00:21:43.740 And so, my well-
00:21:45.460 positioned drawer of
00:21:46.500 only the things that
00:21:47.500 were clearly visible
00:21:48.540 became a pile of
00:21:50.400 garbage that I had to
00:21:51.320 pile through every
00:21:52.640 time I wanted
00:21:53.240 anything.
00:21:53.720 And it was the
00:21:54.880 stuff I wanted most
00:21:55.720 often.
00:21:57.320 So, take those two
00:21:59.260 examples, and if
00:22:00.240 you're tempted to say,
00:22:01.200 Scott, that's just
00:22:01.900 about those two
00:22:03.180 people.
00:22:04.660 I mean, that's not
00:22:06.020 everybody.
00:22:06.940 You can't add just
00:22:08.280 everybody to a system
00:22:09.300 and it breaks.
00:22:10.380 No, I'm saying
00:22:11.340 everybody.
00:22:12.440 I'm saying it has
00:22:13.080 nothing to do with
00:22:13.800 the personalities of
00:22:14.680 the people involved.
00:22:15.760 It is a general rule
00:22:17.360 that as soon as you
00:22:18.680 add one person to
00:22:20.560 your system, it'll
00:22:21.260 break.
00:22:21.620 So, look for ways
00:22:23.720 to avoid adding
00:22:24.920 anybody to your
00:22:25.560 system, and just
00:22:26.620 be aware, the moment
00:22:27.600 you let that other
00:22:28.820 person in with their
00:22:30.220 little preferences,
00:22:31.740 they start getting in
00:22:32.540 your head, your whole
00:22:33.720 system is gone.
00:22:35.220 So, just remember that
00:22:36.100 when you're building
00:22:36.500 your systems.
00:22:38.420 Rasmussen says 66%
00:22:40.180 of the public thinks
00:22:41.560 the Hunter laptop
00:22:42.300 story is important.
00:22:44.420 Scott did a poll in
00:22:45.360 his head and found
00:22:46.120 out that 0% of people
00:22:47.720 will base their vote
00:22:49.160 on Hunter's laptop.
00:22:50.740 I don't think
00:22:52.600 anybody's going to
00:22:53.180 vote on that, do you?
00:22:54.700 They should.
00:22:56.160 I mean, it feels like
00:22:57.160 it's pretty darn
00:22:57.720 important, but I don't
00:23:00.500 think anybody will
00:23:01.240 because the
00:23:02.780 Republicans, we're all
00:23:03.740 going to vote
00:23:04.620 Republican, and the
00:23:05.680 Democrats, we're going
00:23:06.420 to vote Democrat.
00:23:07.700 So, I'm not sure
00:23:08.600 that's telling us
00:23:09.160 anything.
00:23:11.560 We have further
00:23:13.480 confirmation today
00:23:14.660 in a more meaningful
00:23:16.240 publication.
00:23:16.940 So, it's the UK's
00:23:19.060 Daily Mail.
00:23:19.800 You can insert your
00:23:21.480 own comments about
00:23:22.220 the credibility of
00:23:23.120 the media, and that
00:23:23.880 one in particular.
00:23:25.260 But, it is reporting
00:23:27.040 a lot of details
00:23:27.780 about a Ukrainian
00:23:29.140 biolab that was
00:23:30.560 funded in part by
00:23:32.220 Hunter Biden's
00:23:35.120 efforts.
00:23:37.300 So, he was part of
00:23:37.960 an investment group,
00:23:39.420 and he did get some
00:23:40.660 investments for at
00:23:41.480 least one biolab that
00:23:43.760 was involved in stuff
00:23:44.800 that at least could
00:23:45.880 have been weaponized.
00:23:47.320 Don't know exactly
00:23:48.100 what they were doing.
00:23:49.620 Now, I saw the
00:23:50.700 funniest tweet from a
00:23:53.680 Twitter user called
00:23:54.560 Nothing.
00:23:55.560 That's his handle,
00:23:56.460 Nothing.
00:23:57.500 He goes, I'm
00:23:58.900 genuinely a little
00:23:59.860 bit upset.
00:24:00.680 He's talking to me,
00:24:01.600 and he says, I'm
00:24:02.560 genuinely a little
00:24:03.380 bit upset that you
00:24:04.220 guessed the Hunter
00:24:04.900 Biden biolab thing.
00:24:09.160 He's genuinely upset
00:24:10.700 that I made that
00:24:11.400 prediction and that it
00:24:12.120 was proven correct within 48
00:24:13.520 hours.
00:24:13.860 Now, is anybody
00:24:15.240 else impressed that
00:24:18.560 I made that
00:24:19.040 prediction, that
00:24:20.380 Hunter Biden would
00:24:21.200 be tied to the
00:24:21.840 biolabs?
00:24:23.160 Thank you.
00:24:24.280 You should be
00:24:24.980 impressed by that.
00:24:26.660 You know, there
00:24:27.200 aren't that many
00:24:27.760 things I do that
00:24:29.200 even I think you
00:24:30.060 should be impressed
00:24:30.640 by, but that one
00:24:31.380 you should be
00:24:31.760 impressed by.
00:24:33.100 I mean, that
00:24:33.500 was a frozen
00:24:36.740 rope home run over
00:24:38.360 the center field
00:24:39.020 fence.
00:24:39.820 I mean, there's no
00:24:40.340 way around that
00:24:40.920 one.
00:24:42.480 Now, as I
00:24:43.780 replied to my
00:24:44.680 critics who were
00:24:45.660 wondering how the
00:24:46.340 hell I guessed
00:24:46.880 that, was it a
00:24:48.220 guess or did I
00:24:48.960 have inside
00:24:49.520 information?
00:24:50.240 I will tell you I
00:24:50.900 didn't have inside
00:24:51.920 information.
00:24:53.400 So on this
00:24:54.120 particular topic,
00:24:54.980 sometimes I do.
00:24:56.820 I have to admit,
00:24:57.860 sometimes my
00:24:58.560 predictions are
00:25:00.180 based on a
00:25:00.940 little bit of
00:25:01.400 inside information.
00:25:02.660 Not always.
00:25:03.520 Sometimes.
00:25:04.680 This one had no
00:25:05.640 inside information.
00:25:06.720 The technique I
00:25:07.480 used I explained to
00:25:08.660 you ahead of
00:25:09.220 time.
00:25:12.620 Somebody says,
00:25:13.880 Scott pretends to
00:25:14.940 make predictions
00:25:15.540 based on insider
00:25:16.560 information.
00:25:17.500 Did you write that
00:25:18.240 before I just said
00:25:19.180 that or after?
00:25:21.160 But let me clarify.
00:25:23.020 I rarely make
00:25:24.420 predictions based on
00:25:25.680 insider information.
00:25:27.660 Is that different
00:25:28.680 than what you
00:25:29.200 think?
00:25:31.840 I think I rarely
00:25:32.900 do it.
00:25:33.560 Now, I don't
00:25:33.960 always tell you
00:25:34.680 because the source
00:25:35.680 of the inside
00:25:36.180 information might be
00:25:37.540 private.
00:25:39.340 But rarely,
00:25:41.360 rarely,
00:25:42.020 but sometimes.
00:25:43.000 I don't always
00:25:43.560 have inside
00:25:44.060 information.
00:25:45.460 All right.
00:25:47.120 Now, the part
00:25:47.900 that would make
00:25:48.440 this amazing
00:25:49.540 and really a
00:25:50.740 movie is if we
00:25:51.400 found out the
00:25:51.960 coronavirus came
00:25:52.860 out of the
00:25:53.460 Hunter Biden
00:25:53.980 funded biolab.
00:25:55.820 Now, I don't
00:25:56.360 think that's going
00:25:57.080 to happen.
00:25:58.080 But it would be
00:25:58.600 the perfect movie
00:25:59.380 ending, wouldn't
00:25:59.920 it?
00:26:00.720 So the technique
00:26:02.580 I use to predict
00:26:03.560 this is that
00:26:04.480 reality will follow
00:26:05.860 a movie script
00:26:07.420 path because
00:26:09.540 we're a...
00:26:12.200 There's a
00:26:14.380 Cristino video.
00:26:15.740 Don't know
00:26:16.280 about that.
00:26:19.160 So there's a...
00:26:21.400 By the way,
00:26:22.700 let me ask you
00:26:23.460 something.
00:26:25.400 If you see
00:26:26.340 anything about me
00:26:27.440 that you think
00:26:29.140 you need to send
00:26:29.920 to my ex,
00:26:30.600 don't do it.
00:26:31.160 And if you see
00:26:32.240 anything about her
00:26:32.940 in the media
00:26:33.500 or anything else
00:26:34.340 and you think,
00:26:35.420 well, he's got to
00:26:36.020 know about that,
00:26:36.800 just don't do it.
00:26:38.520 Just do me a favor.
00:26:39.320 Just don't do it.
00:26:40.140 Just stay out of it.
00:26:43.320 Everything's fine.
00:26:44.180 Just stay out of it.
00:26:45.600 All right.
00:26:51.060 Mushrooms are coming.
00:26:54.820 There's a weird
00:26:55.520 thing happening.
00:26:56.440 So a number of
00:26:57.160 Republicans and
00:26:58.260 Democrats around
00:26:59.200 the country in
00:27:00.500 about a dozen
00:27:01.100 states are
00:27:03.060 pushing legislation
00:27:04.200 to legalize
00:27:05.980 some form of
00:27:08.100 psychedelic
00:27:09.180 mushrooms.
00:27:10.300 Now, the context
00:27:11.060 usually is medical
00:27:12.020 or always, I think.
00:27:13.940 And it has to do
00:27:14.760 with experimentation
00:27:15.440 and maybe some
00:27:16.460 treatment.
00:27:17.460 And so what
00:27:18.100 they're doing is
00:27:18.540 different.
00:27:18.920 But here's the
00:27:19.420 amazing thing.
00:27:21.020 It's bipartisan.
00:27:23.060 There doesn't seem
00:27:23.980 to be like an
00:27:25.520 obvious difference
00:27:26.480 about whether this
00:27:27.740 is a Republican
00:27:28.440 or Democrat
00:27:30.080 thing.
00:27:31.920 And, you know,
00:27:33.760 the thinking is
00:27:34.420 that it might be
00:27:34.880 going the same
00:27:35.420 path as marijuana
00:27:36.540 legalization.
00:27:38.280 But I don't think
00:27:40.060 that's the
00:27:40.400 interesting part of
00:27:41.100 the story.
00:27:41.580 I do believe
00:27:42.400 that psychedelic
00:27:44.900 mushrooms are
00:27:45.580 guaranteed to
00:27:47.720 become legal and
00:27:49.280 routinely used in a
00:27:51.620 medical context.
00:27:53.500 That's my belief.
00:27:54.740 Actually, I'll make
00:27:55.400 that a prediction.
00:27:56.720 It's barely a
00:27:57.600 prediction because
00:27:58.160 it's just so
00:27:58.680 obvious.
00:27:59.640 I don't think
00:28:00.080 anybody disagrees,
00:28:01.020 do they?
00:28:01.660 Is there anybody
00:28:02.060 who would disagree?
00:28:03.320 I don't think so.
00:28:04.220 I think that's just
00:28:04.800 obvious at this
00:28:05.460 point.
00:28:06.220 So maybe some
00:28:06.840 other psychedelics
00:28:09.520 as well.
00:28:10.300 But mushrooms for
00:28:11.080 sure.
00:28:11.760 I will tell you
00:28:12.640 that in the last
00:28:13.380 few weeks, the
00:28:16.200 number of people
00:28:17.020 who I would call
00:28:18.080 normies, people
00:28:20.260 that you would
00:28:20.680 not associate
00:28:21.280 with advanced
00:28:23.180 drug use, I
00:28:25.440 think at least
00:28:25.980 three normies
00:28:26.920 have said
00:28:29.060 something about
00:28:29.840 wanting to do
00:28:30.760 mushrooms.
00:28:32.220 And apparently
00:28:33.140 there's some
00:28:33.720 chocolate bars,
00:28:35.960 and I don't know
00:28:36.540 if they're legal
00:28:37.120 or illegal, but
00:28:38.120 now there's a
00:28:40.720 well-made product
00:28:41.800 that puts the
00:28:43.260 mushrooms into
00:28:44.020 chocolate squares,
00:28:45.440 and I hate to
00:28:46.340 tell you this,
00:28:47.340 but the
00:28:48.180 main thing that
00:28:49.000 was keeping
00:28:49.400 mushroom use from
00:28:50.180 growing among
00:28:51.320 the public
00:28:51.840 was dosage.
00:28:54.520 If you get a
00:28:55.260 bag of actual
00:28:56.160 mushrooms, you
00:28:57.640 don't know how
00:28:58.240 much to take.
00:28:59.800 And for me,
00:29:00.640 that's a full
00:29:02.020 stop.
00:29:03.560 If you don't
00:29:04.560 know a quantity
00:29:05.280 to take, full
00:29:06.880 stop.
00:29:08.000 And there's no
00:29:08.520 way I'm going to
00:29:09.000 play with something
00:29:09.680 whose quantity I
00:29:10.200 don't know, especially
00:29:11.480 if a small
00:29:12.140 quantity could have.
00:29:13.380 Now, I don't
00:29:13.760 think, I've never
00:29:14.400 even heard of
00:29:14.880 anybody having a
00:29:15.520 bad mushroom
00:29:16.260 experience.
00:29:16.960 It must happen.
00:29:18.060 Like, one assumes
00:29:18.860 it does happen.
00:29:20.080 Never heard of it,
00:29:21.120 so I don't know
00:29:21.800 how dangerous it
00:29:22.480 is if I've never
00:29:23.120 heard of one
00:29:23.700 example.
00:29:25.440 But, so
00:29:27.240 mushrooms are
00:29:27.800 coming, and
00:29:28.860 there's nothing
00:29:29.280 to stop it.
00:29:30.580 However, that's
00:29:32.120 not the big
00:29:32.560 story.
00:29:32.860 I just got an
00:29:41.860 anonymous text
00:29:42.660 from somebody
00:29:43.200 who's saying
00:29:43.520 the chocolate
00:29:44.020 squares are
00:29:44.620 amazing.
00:29:45.620 So, mushrooms
00:29:47.520 have moved into
00:29:48.320 the mainstream.
00:29:51.900 You know that,
00:29:52.760 right?
00:29:53.860 So, ordinary
00:29:54.780 people, you just
00:29:55.700 don't associate
00:29:56.280 with this, are
00:29:56.960 just saying,
00:29:57.380 yeah, I'm going
00:29:58.240 to get some
00:29:58.580 mushrooms, and
00:29:59.240 mushrooms are
00:29:59.640 great.
00:29:59.820 Now, the
00:30:00.660 medicinal value
00:30:02.880 is almost
00:30:04.500 incalculable.
00:30:06.900 Mushrooms have
00:30:08.040 been reported
00:30:08.700 to be curing
00:30:10.660 everything from
00:30:11.760 addiction to
00:30:15.960 different mental
00:30:16.940 health illnesses,
00:30:18.340 anxiety, all
00:30:19.500 kinds of stuff.
00:30:20.640 And it's because
00:30:21.520 they all have the
00:30:22.100 same root.
00:30:24.440 Do you believe
00:30:25.520 that?
00:30:25.740 The reason
00:30:27.100 that mushrooms
00:30:28.180 and psychedelics
00:30:29.660 are reported
00:30:31.060 to, you know,
00:30:31.880 I'm not the
00:30:32.260 doctor, so I'm
00:30:32.900 not going to
00:30:33.180 tell you what's
00:30:33.600 true or not,
00:30:34.360 but reportedly
00:30:35.160 have these, you
00:30:36.660 know, miracle
00:30:37.180 cures against
00:30:38.980 all kinds of
00:30:39.680 different, seemingly
00:30:40.600 different conditions.
00:30:42.080 Like, is
00:30:42.520 addiction really
00:30:43.280 the same thing
00:30:43.960 as anxiety
00:30:45.500 or some
00:30:46.980 other mental
00:30:47.460 illness?
00:30:47.840 Not really.
00:30:49.180 But it seems
00:30:49.920 to fix all of
00:30:50.620 them.
00:30:51.240 Do you know
00:30:51.580 why?
00:30:52.780 Tell me why.
00:30:53.540 Why do
00:30:54.780 mushrooms seem
00:30:55.500 to work against
00:30:56.140 a bunch of
00:30:57.040 different things?
00:30:58.900 Somebody says
00:30:59.540 trauma equals
00:31:00.460 an, that's
00:31:02.380 interesting, trauma
00:31:03.960 equals inflammation.
00:31:05.720 I wouldn't bet
00:31:06.360 against that.
00:31:08.340 I haven't heard
00:31:09.400 that hypothesis,
00:31:10.300 but I like it.
00:31:13.100 All right, let
00:31:13.700 me, there we
00:31:14.900 go.
00:31:15.860 Yeah.
00:31:17.040 Yeah.
00:31:18.060 Mushrooms
00:31:18.560 dissolve your ego.
00:31:20.520 It's like rebooting
00:31:21.680 your whole mentality.
00:31:22.640 It allows you to
00:31:24.000 reframe your
00:31:24.860 existence.
00:31:26.500 So you basically
00:31:27.180 do a demo on
00:31:28.620 your entire thought
00:31:29.440 process, and
00:31:31.320 then by eliminating
00:31:33.300 your ego, because
00:31:34.180 in a way your ego
00:31:35.160 is what keeps all of
00:31:36.200 your existing thoughts
00:31:37.220 intact, even when
00:31:38.120 they're stupid.
00:31:39.660 Your ego is the
00:31:40.580 part that says,
00:31:41.100 well, you think I'm
00:31:42.520 wrong, but I'm not
00:31:43.160 wrong, because the
00:31:44.500 ego is protecting
00:31:45.260 itself.
00:31:45.860 It's the thing that
00:31:46.500 makes you not able to
00:31:47.400 change your opinion,
00:31:48.220 even when the evidence
00:31:48.980 says you should.
00:31:49.680 So if you get rid
00:31:51.100 of your ego, what
00:31:53.120 happens to your
00:31:53.740 confirmation bias?
00:31:55.760 It goes away.
00:31:57.680 Because it's your
00:31:58.620 ego protecting your
00:32:00.220 reputation, your
00:32:01.500 feelings and stuff.
00:32:02.380 That's the only reason
00:32:03.160 the confirmation bias
00:32:04.120 works, because you're
00:32:05.680 trying to protect
00:32:06.200 yourself.
00:32:07.060 If you take away the
00:32:08.020 protecting yourself, no
00:32:10.020 confirmation bias.
00:32:11.900 Imagine seeing the
00:32:12.940 world for the first
00:32:13.920 time without bias.
00:32:16.160 Now, it's impossible
00:32:21.240 to describe what
00:32:22.760 people experience
00:32:23.580 under any kind of
00:32:24.460 hallucinogenic state.
00:32:26.080 There aren't any
00:32:26.960 words in the
00:32:27.640 non-hallucinogenic
00:32:28.480 state that capture
00:32:29.780 that in a way you
00:32:30.460 can transmit it.
00:32:32.000 All I'm going to
00:32:32.880 say is that it
00:32:34.800 doesn't surprise me
00:32:35.820 that if you can
00:32:37.200 eliminate your ego
00:32:38.780 long enough to
00:32:40.080 reprogram your
00:32:41.240 frames, that when
00:32:43.580 you come back
00:32:44.140 online after the
00:32:46.040 mushrooms, and
00:32:48.300 again, let me say
00:32:49.120 this to avoid
00:32:50.500 demonetization, I'm
00:32:52.420 very much against
00:32:53.240 anybody using any
00:32:54.260 kind of drug that's
00:32:55.180 not doctor-recommended
00:32:56.420 and doctor-supervised.
00:32:58.640 So, you know, don't
00:32:59.420 go wild and just
00:33:00.340 shove magic mushrooms
00:33:01.740 in your mouth.
00:33:03.120 Be a little smart,
00:33:04.180 okay?
00:33:06.360 All right.
00:33:07.320 The Supreme Court
00:33:08.360 ruled that Biden is
00:33:09.460 the commander-in-chief.
00:33:11.200 You know, I thought
00:33:12.100 that was like a joke
00:33:13.020 headline.
00:33:14.800 Was it Politico
00:33:16.420 had that headline?
00:33:18.220 And then I read the
00:33:19.400 article, I was like,
00:33:20.020 oh yeah, actually that
00:33:20.800 headline does capture
00:33:21.600 it.
00:33:21.940 So the Supreme Court
00:33:22.740 was ruling on
00:33:25.060 whether Biden can
00:33:26.280 order the troops to
00:33:27.440 get vaccinated, or
00:33:29.680 because it's maybe
00:33:30.920 unconstitutional, the
00:33:33.020 commander-in-chief could
00:33:34.140 not tell people to get
00:33:35.880 mandatory vaccines.
00:33:37.260 But the court ruled
00:33:38.500 that the commander-in-chief
00:33:39.460 can pretty much tell the
00:33:40.660 troops to do anything
00:33:41.540 because he or she is
00:33:44.300 the commander-in-chief.
00:33:45.860 That's why.
00:33:47.680 And I feel I agree
00:33:50.080 with that.
00:33:51.960 You know, you could say
00:33:53.140 that there's some kind
00:33:54.220 of evil going on with
00:33:55.300 forcing anybody to get
00:33:56.380 a shot, but I don't
00:33:59.800 think we treat the
00:34:00.980 military like any other
00:34:02.500 part of society.
00:34:03.360 Am I wrong that the
00:34:05.840 military can expressly
00:34:08.320 discriminate?
00:34:09.480 They can, can't they?
00:34:12.480 Can't the military
00:34:13.700 discriminate?
00:34:15.560 They can.
00:34:17.680 It's the only part of
00:34:19.360 society, well, that's
00:34:20.360 not true.
00:34:21.120 But it's one part of
00:34:22.260 society that is
00:34:23.100 explicitly allowed to
00:34:24.480 discriminate.
00:34:26.740 Now, they can't
00:34:27.980 discriminate against
00:34:28.860 race.
00:34:30.020 Do you know why?
00:34:32.540 Because race apparently
00:34:33.940 doesn't affect your
00:34:35.040 fighting capability.
00:34:37.280 Makes sense, right?
00:34:38.960 There just wouldn't be
00:34:39.940 any reason.
00:34:40.640 Now, if there was a
00:34:41.540 reason, that would be
00:34:43.040 dicey.
00:34:44.880 If it turned out there
00:34:45.960 was some reason, the
00:34:47.520 race made a difference
00:34:48.500 in your fighting, they
00:34:49.320 would discriminate.
00:34:50.680 They would.
00:34:52.080 Because that's their
00:34:52.840 job.
00:34:53.700 But as it turns out, it's
00:34:55.720 probably not a factor.
00:34:57.200 So, they can discriminate
00:34:58.960 against your height.
00:35:01.000 Am I right?
00:35:02.240 You could be too small.
00:35:04.080 You could be too old.
00:35:05.720 Too disabled.
00:35:06.640 So, the military is all
00:35:08.800 about discriminating.
00:35:10.280 And when it comes to the
00:35:11.200 trans question, I think
00:35:13.860 they only care about cost
00:35:15.520 and readiness.
00:35:17.780 And otherwise, that's your
00:35:18.760 problem.
00:35:19.960 So, we do allow the
00:35:21.580 military to discriminate,
00:35:23.720 and the Supreme Court has
00:35:25.040 backed them in that
00:35:26.060 ability.
00:35:27.100 The discrimination is a
00:35:28.860 weird kind because it's
00:35:29.660 about the vaccinations.
00:35:30.420 All right.
00:35:33.980 Fact-checkers on Twitter
00:35:35.740 were going wild today, and
00:35:37.580 they were fact-checking as
00:35:39.200 false a claim that
00:35:41.100 vaccines were hurting
00:35:42.560 people more than they
00:35:43.540 were helping them.
00:35:44.700 So, they say, the fact-
00:35:46.180 checkers say, this is not
00:35:47.240 me, this is the fact-
00:35:48.100 checkers.
00:35:49.020 And can we pause for a
00:35:50.320 moment?
00:35:51.380 Give you a little context.
00:35:54.200 If you look at something
00:35:55.320 like factcheck.org, one of
00:35:58.460 the ways you can know that
00:35:59.540 they're right about
00:36:00.480 everything, is that fact
00:36:02.740 is right in their name.
00:36:04.440 Fact-check.
00:36:06.120 So, therefore, logically,
00:36:08.420 it follows that 100% of
00:36:10.960 the things they say are
00:36:12.080 facts.
00:36:13.360 And so, when I read this
00:36:14.280 fact-check, you should
00:36:15.120 definitely believe the
00:36:16.100 fact-check.
00:36:17.520 Do not believe the
00:36:18.740 alleged fake fact.
00:36:21.400 Don't believe that.
00:36:23.140 Oh, wait a minute.
00:36:24.720 Okay.
00:36:26.080 All right.
00:36:26.700 I did some mushrooms, and
00:36:28.520 now I'm thinking more
00:36:29.220 clearly.
00:36:30.220 It does turn out that the
00:36:31.400 fact-checks are exactly as
00:36:33.500 non-credible as any
00:36:35.420 conspiracy theory.
00:36:37.660 Now, I'm not going to back
00:36:38.880 the conspiracy theory in
00:36:40.300 this specific case, but it
00:36:43.320 is sadly true that the
00:36:45.960 fact-checkers do not have
00:36:47.460 more credibility than the
00:36:49.180 conspiracy theory.
00:36:50.440 They can be right.
00:36:51.840 I'm not saying they're
00:36:52.520 wrong.
00:36:53.620 They don't have
00:36:54.260 credibility.
00:36:55.300 Certainly not on this.
00:36:56.220 And let me tell you why
00:36:57.560 they don't have
00:36:58.040 credibility.
00:36:59.520 So, here's the story.
00:37:01.080 So, there's a bogus claim
00:37:02.220 by some doctors' group
00:37:03.900 that's organized
00:37:05.960 to basically say
00:37:08.660 contrarian things about
00:37:09.900 the pandemic.
00:37:11.920 You've probably seen them.
00:37:13.920 And they say that
00:37:17.340 there's a misleading graphic
00:37:18.860 to show that the deaths
00:37:20.800 rose even though the
00:37:22.020 vaccinations were rolled
00:37:23.280 out.
00:37:23.520 So, the graph purported to
00:37:25.960 show that there was no
00:37:27.340 difference in the deaths
00:37:28.420 and they kept going up
00:37:29.340 after vaccinations rolled
00:37:30.620 out.
00:37:32.240 The fact-check says they
00:37:33.700 made this mistake.
00:37:35.860 They confuse cumulative with
00:37:38.360 daily.
00:37:40.680 What?
00:37:42.260 Who does that?
00:37:43.520 The graph, apparently the
00:37:45.600 entire graph, it only
00:37:49.760 tells the story that these
00:37:50.920 doctors wanted to tell
00:37:52.040 because they confused
00:37:53.900 cumulative deaths, which, by
00:37:56.200 the way, you do know that
00:37:57.880 cumulative deaths always go
00:38:01.360 up, right?
00:38:02.620 They never really turn down.
00:38:05.740 You can't add anything to
00:38:07.460 anything and get less of it.
00:38:09.820 It doesn't work.
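The cumulative-versus-daily confusion described above can be shown with a toy example (hypothetical numbers, purely to illustrate the arithmetic): a daily series can fall sharply while its running total keeps climbing, so a rising cumulative curve tells you nothing about whether the daily trend improved.

```python
# Hypothetical daily death counts: the trend falls after a peak.
daily = [10, 20, 30, 25, 15, 8, 4]

# The cumulative series is simply the running total of the daily series.
cumulative = []
total = 0
for d in daily:
    total += d
    cumulative.append(total)

print(cumulative)  # [10, 30, 60, 85, 100, 108, 112]

# A cumulative count can never decrease ("you can't add anything to
# anything and get less of it"), even while the daily count is dropping.
assert all(b >= a for a, b in zip(cumulative, cumulative[1:]))
assert daily[-1] < max(daily)  # the daily counts did fall
```

Reading the cumulative curve as if it were the daily curve is exactly the mistake the fact-check attributed to the graph.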
00:38:10.800 And so, the doctors who
00:38:12.700 are, have I ever mentioned
00:38:13.660 this?
00:38:14.640 Have I ever mentioned that
00:38:15.620 doctors, while they might be
00:38:18.040 excellent at doctoring, do not
00:38:21.060 necessarily have skills at data
00:38:23.080 analysis?
00:38:25.860 Exhibit A.
00:38:27.340 If you can't tell the
00:38:28.560 difference between cumulative
00:38:30.260 and daily, well, maybe you
00:38:32.780 shouldn't talk in public about
00:38:33.940 data.
00:38:35.660 But, and then the fact-checkers
00:38:38.340 pointed to two studies, more
00:38:40.360 recent studies, that showed
00:38:42.160 that the vaccines totally
00:38:43.320 worked and that they saved
00:38:45.200 America.
00:38:47.540 And so, you can believe the
00:38:48.920 fact-check because they
00:38:49.960 pointed to two studies.
00:38:52.580 And if it's in the study, am I
00:38:55.200 right?
00:38:56.260 It's in the study.
00:38:57.880 Well, it's got to be true.
00:38:59.400 It's got to be true if it's in
00:39:00.680 a study.
00:39:02.660 No.
00:39:04.060 No.
00:39:05.020 What about two studies?
00:39:06.140 There are two studies that say
00:39:08.680 that it works.
00:39:10.320 So, you believe that?
00:39:12.360 Two studies?
00:39:14.540 No.
00:39:15.740 No.
00:39:16.760 No, I happen to think it's
00:39:18.460 probably true that the vaccines
00:39:21.300 worked.
00:39:22.340 And I would place a fairly large
00:39:24.200 bet on that, actually.
00:39:25.720 But, we can't tell.
00:39:28.240 There is no fucking way that you
00:39:30.840 and I know if the vaccines
00:39:32.560 worked or not.
00:39:33.740 Because we only get bullshit.
00:39:35.280 The only data is unreliable
00:39:37.960 data.
00:39:38.420 That's all we have.
00:39:39.360 We have only unreliable data.
00:39:42.340 So, I have my bias.
00:39:43.820 My bias is that the vaccine
00:39:45.020 probably worked.
00:39:46.040 Do you think that I'm biased by
00:39:47.300 the fact that I got vaccinated?
00:39:49.780 Do you think my belief that
00:39:52.040 vaccines maybe were safer than
00:39:54.160 some people thought, do you
00:39:55.620 think that it's in any way
00:39:56.520 influenced by the fact that I
00:39:58.500 got vaccinated?
00:39:59.260 In any way?
00:40:00.260 In any way?
00:40:01.260 Yes!
00:40:02.540 Yes!
00:40:04.060 If I've taught you nothing,
00:40:06.360 you should know that.
00:40:08.200 Of fucking course, I'm biased.
00:40:11.420 Now, does it help that I'm
00:40:15.200 completely aware of the source
00:40:17.460 of the bias?
00:40:18.800 A little bit.
00:40:19.940 But not much, right?
00:40:22.940 If people could see past their
00:40:24.620 bias because they understood
00:40:25.940 where it was coming from,
00:40:27.620 well, we'd be in pretty good
00:40:28.720 shape.
00:40:29.560 But apparently, people can't do
00:40:30.820 that.
00:40:31.500 And why would I think I can?
00:40:33.460 If I observe that it just
00:40:35.240 doesn't seem to be something
00:40:36.360 that human beings can do,
00:40:38.620 why would I think I can do it?
00:40:41.360 Now, it feels like I can do it,
00:40:43.860 but it feels like you can do it
00:40:45.360 too, doesn't it?
00:40:46.080 And I'm not so sure you can do
00:40:47.660 it.
00:40:48.340 So the fact that it feels like,
00:40:50.700 you know, I can come up with
00:40:51.840 good, unbiased opinions means
00:40:53.780 nothing.
00:40:54.860 I definitely feel like I can.
00:40:56.800 If you ask me, Scott, like,
00:40:58.860 really deep down in your bones,
00:41:00.320 do you think you could overcome
00:41:02.080 these biases?
00:41:02.080 I'd say, with a completely
00:41:03.360 straight face, I'd say, you
00:41:05.900 know, I really think I can.
00:41:08.720 Unlike everybody else in the
00:41:10.180 world, I really think I can.
00:41:12.440 And then I step back and I hear
00:41:13.880 myself, and I think, okay,
00:41:15.920 listen to yourself.
00:41:17.460 Listen to yourself.
00:41:18.780 Now listen to everybody else on
00:41:20.480 earth who you know can't do
00:41:22.380 that.
00:41:23.180 Yeah, they all think they can.
00:41:25.460 So that's the only way you can
00:41:28.180 get to anything like humility,
00:41:29.920 is to understand that you're
00:41:31.760 claiming a superpower that
00:41:33.980 everybody on earth claims, and
00:41:35.620 nobody has.
00:41:37.060 Nobody has it.
00:41:38.440 The ability to be unbiased.
00:41:41.780 So factor that in when you look
00:41:44.000 at my claims.
00:41:45.140 Now, here's the payoff.
00:41:47.220 There's nothing interesting in
00:41:48.900 knowing that the fact-checkers
00:41:50.380 say something you don't think is
00:41:52.000 true.
00:41:52.820 But here's the trick.
00:41:56.420 Are you ready?
00:41:58.480 Was this the only data and the
00:42:00.520 only graph and the only group
00:42:02.280 saying that vaccinations might
00:42:05.240 have been more dangerous than we've
00:42:06.520 been told?
00:42:07.600 Only one?
00:42:08.900 Nobody else out there?
00:42:09.960 Did anybody else have different
00:42:12.280 arguments, but they came to a
00:42:15.160 similar conclusion?
00:42:16.840 Why is it that the fact-check orgs
00:42:18.880 just had multiple orgasms on this
00:42:22.380 one specific claim?
00:42:25.840 Why is that?
00:42:28.460 Because I'm seeing stuff by other
00:42:30.100 people online, I won't name names,
00:42:32.700 that seem far more compelling than
00:42:36.000 this one would have.
00:42:38.340 But I don't hear anybody fact-checking
00:42:39.880 them.
00:42:41.100 Do you?
00:42:42.440 Have you now seen completely
00:42:43.960 different claims that vaccines
00:42:46.200 might have a problem?
00:42:47.040 I don't know if they're true, and I'm
00:42:49.240 not saying that they are.
00:42:50.720 In fact, I'd bet against them.
00:42:52.020 I don't think they're true.
00:42:53.640 But I don't know.
00:42:54.580 I don't know, and you don't know
00:42:56.060 either.
00:42:57.260 So why is it that they picked this
00:42:59.240 one to debunk?
00:43:03.200 If you don't know this, I will be
00:43:05.600 disappointed, because you've been
00:43:07.580 watching me for a while.
00:43:08.900 Why'd they pick this one to debunk?
00:43:14.480 It's a diversion.
00:43:15.340 They picked the weakest one to
00:43:17.920 debunk, because they can't debunk
00:43:20.700 the strong one.
00:43:22.040 If they debunk the weakest one, they
00:43:23.720 could say, we've debunked the idea
00:43:25.240 that vaccinations are dangerous.
00:43:27.360 Just look at it.
00:43:27.960 There's my fact-check.
00:43:29.140 We just debunked it.
00:43:30.420 Totally debunked.
00:43:32.180 But why did they debunk the weakest,
00:43:35.540 most obviously wrong one that had no
00:43:37.520 value at all, when there are stronger
00:43:39.780 claims that also might be false, but
00:43:42.200 they appear to be more substantive?
00:43:45.340 This is propaganda.
00:43:48.420 This is brainwashing.
00:43:50.700 And again, I'm not saying that
00:43:52.140 vaccinations are dangerous, because I
00:43:53.840 think if I had to bet, I'd bet that
00:43:56.320 the risk balance was appropriate.
00:44:00.460 Don't know.
00:44:01.500 Could be totally wrong, because I'm
00:44:03.420 biased.
00:44:04.480 You don't need to tell me I'm biased.
00:44:05.880 I know that.
00:44:07.380 All right.
00:44:09.020 But if you see a fact-checker going
00:44:10.780 after the weakest argument, you should
00:44:12.520 say, fuck you, a fact-checker.
00:44:14.580 You are trying to manipulate me by going
00:44:16.740 after the weakest argument.
00:44:17.940 Go after the strong one.
00:44:19.660 Do you know how I tell you that if you're
00:42:21.440 trying to debunk a list of persuasion,
00:44:24.460 somebody comes at you with a list of 10
00:44:26.280 reasons why something's true?
00:44:27.700 If they've got 10 reasons why something's
00:44:31.020 true, the only way you should attack that
00:44:33.540 is say, give me your one best reason.
00:44:36.320 Would you agree?
00:44:37.140 If I can debunk your one best reason,
00:44:39.280 you'll go rethink the others.
00:44:41.320 That's all you can do.
00:44:42.400 That's the best you can do.
00:44:46.220 Well, here's a question.
00:44:47.740 Do you think Russia will run out of
00:44:49.340 generals before Ukraine runs out of
00:44:52.340 bullets?
00:44:52.700 Because it seems like every day or so
00:44:55.440 we're hearing about another Russian
00:44:56.660 general got killed by the Ukrainians.
00:45:01.500 How many of you believe that is true?
00:45:05.700 Well, I'm going to agree with Twitter
00:45:09.980 user, I hope I wrote down his name.
00:45:15.380 Maybe I'll find it later.
00:45:18.480 But I think that after the war is over,
00:45:21.200 we're going to find out that maybe some
00:45:22.560 of those generals are alive.
00:45:23.900 What do you think?
00:45:25.600 I think maybe some of those generals
00:45:27.060 might be alive.
00:45:28.240 Maybe more than we think.
00:45:30.420 All right.
00:45:30.660 There was, speaking of Russian
00:45:31.600 generals, there was a Russian general
00:45:32.860 who was talking about what they were
00:45:35.440 trying to accomplish, what Russia is
00:45:36.900 trying to accomplish.
00:45:38.240 And it feels like he was moving the
00:45:40.300 goalpost a little bit.
00:45:42.540 So here's what the Russian general says
00:45:44.280 they're trying to accomplish.
00:45:45.500 In his own words, the combat potential
00:45:49.740 of the armed forces of Ukraine has been
00:45:51.660 significantly reduced, allowing us,
00:45:54.360 I emphasize again, to focus the main efforts
00:45:57.780 on achieving the main goal, the liberation
00:46:01.020 of Donbass and the other region there.
00:46:03.780 I forget his name.
So it seems that a Russian general, speaking
in public, presumably not off script,
00:46:12.240 is saying that the reason they're there
00:46:14.100 is to basically degrade the Ukrainian military
00:46:18.040 and to liberate Donbass.
Now, that's backwards mission creep, right?
00:46:24.940 You see what's forming here, right?
00:46:27.100 This is the exit strategy.
00:46:28.340 So Putin is putting in play an option
00:46:31.640 if it turns out he can't conquer the whole country,
00:46:35.460 which one assumes would be the preferred thing.
00:46:38.760 But he's creating an option to say,
00:46:41.900 these were our goals, even if they weren't,
00:46:44.260 and here's how we accomplished them,
00:46:45.820 even if we didn't, and now we can get out.
00:46:49.240 Right?
00:46:49.700 So you can see the end game now.
00:46:51.320 The end game is really clear.
00:46:53.180 The end game is that Putin will create a situation
00:46:55.760 in which he can definitely claim victory.
00:46:59.400 Here's my opinion as of today.
00:47:03.140 Russia won.
00:47:05.000 In my opinion, the war is basically over.
00:47:08.680 I mean, it still needs to run out.
00:47:10.740 But here's what I think.
00:47:12.460 I don't think we were trying to win this war.
00:47:15.860 Because that's not what winning a war looks like.
00:47:18.080 Am I right?
00:47:19.640 When I say we, I mean the Western people
00:47:22.420 backing the Ukrainians.
00:47:23.980 We weren't trying to win this war.
00:47:26.240 We weren't.
00:47:28.520 Because I don't think we wanted to.
00:47:32.020 Because when we want to do something
00:47:33.620 and we have the ability to do it,
00:47:36.040 well, don't we do it?
00:47:37.700 Am I right?
00:47:38.660 If we have the ability to do something
00:47:40.320 and we want to do it,
00:47:42.540 we do it.
00:47:44.140 So here we have the ability
00:47:45.340 to do far more damage,
00:47:47.720 and we didn't do it.
00:47:49.500 That looks like a choice.
It looks like a choice to simply destroy Ukraine,
and destroy Russia
for maybe decades,
to get an advantage over them
so that they couldn't fund their military
and couldn't use their energy against us.
00:48:06.100 So I think both sides are going to get what they want,
00:48:09.920 weirdly, in a way.
00:48:12.400 Here's what Putin's going to get.
00:48:14.260 Complete victory.
00:48:16.060 It looks at this point, if he survives,
00:48:18.580 and it looks like he will,
00:48:20.160 that Putin just won.
00:48:21.120 I think he just won straight up.
00:48:23.860 Because he can tell his people he won.
00:48:26.560 He did get things he didn't have before.
00:48:29.340 Now what he loses
00:48:30.160 is maybe international reputation
00:48:34.260 and then sanctions.
00:48:37.320 But let me tell you,
00:48:38.220 if the sanctions last 10 years,
00:48:40.900 but their productive control of Ukraine
00:48:43.880 lasts 100,
00:48:46.380 Putin is going to look like
00:48:47.420 one of the best leaders Russia ever had.
00:48:49.180 Am I wrong about that?
All he has to do is serve out his time,
you know,
and keep the history books
saying what he wants them to say.
00:48:58.960 And he's going to say,
00:48:59.760 we got rid of the Russians,
00:49:01.140 or I'm sorry,
00:49:01.600 we got rid of the Nazis,
00:49:03.180 we repatriated the Russian-speaking lands,
00:49:06.680 we built a land bridge,
00:49:08.120 and we neutralized the Ukrainian threat,
00:49:12.340 and we kept NATO out.
00:49:14.900 I think he ran the table.
00:49:16.320 Am I right?
00:49:18.720 He ran the table.
00:49:19.840 He got everything.
00:49:21.120 Because I don't know
00:49:21.680 that he really cared about Kiev.
00:49:23.140 I mean,
00:49:23.800 Kiev is irrelevant.
00:49:25.280 If you own the parts you want,
00:49:26.720 you've got everything you need,
00:49:27.860 and you can come back if you need to,
00:49:30.120 and blah, blah, blah.
00:49:31.480 So Putin himself won't starve.
00:49:33.840 His country might go through some tough times.
00:49:35.880 I think they will.
00:49:37.000 But so will everybody else.
00:49:39.140 I mean,
00:49:39.460 we're heading toward a potential global food shortage.
00:49:43.900 And I think we're going to work our way around that.
00:49:47.080 I think we'll be okay on the food.
00:49:48.760 But I think Russia's going to go through a tough 10 or 20 years,
00:49:54.060 and then in 100 years,
00:49:56.420 Putin will look like a hero,
00:49:57.880 one of the great builders of the nation,
00:50:00.060 and he will win.
00:50:01.840 And it will never have that much effect on his personal life,
00:50:04.420 because he'll go to his dacha
00:50:05.820 and have his harem and whatever else he's doing.
00:50:09.000 So it looks like he won.
00:50:17.940 What else has happened?
00:50:20.580 Oh, it was Gregory Maccles on Twitter.
00:50:25.420 This was his tweet.
00:50:26.560 I'm agreeing with this.
00:50:27.400 He said,
00:50:27.740 My money is that most of those generals will pop up alive after the war,
and only the people who follow edgy Russian sources...
00:50:35.100 Oh, and only the people who follow edgy Russian sources will know about it.
00:50:39.880 I wouldn't bet against that.
00:50:42.380 Now, you know, if one general died,
00:50:45.040 well, I'd believe that, or maybe two or three.
00:50:47.140 I'd believe that.
00:50:47.940 I mean, anything could happen.
00:50:49.260 But seven or eight generals?
00:50:51.520 No.
00:50:52.860 Here's another fake news.
00:50:54.800 There's a story that a Russian commander has died
00:50:58.360 after being run over by a tank by his own mutinous troops.
00:51:02.520 So that was in the news today.
00:51:03.700 Do you believe that?
00:51:06.200 It comes from a Ukrainian journalist.
00:51:09.720 One source.
00:51:11.260 A Ukrainian journalist believes that he witnessed it,
00:51:14.780 or he was close to the scene, so he found out about it,
00:51:17.980 that the mutinous troops ran over their own commander
00:51:21.420 with a tank.
00:51:25.140 Now, how fast does a tank go?
00:51:30.360 And how fast can a Russian general get out of the way?
00:51:33.960 Are you buying a Russian tank ran over a general?
00:51:38.860 Yeah.
00:51:39.320 Now, somebody says it could be an accident.
00:51:41.520 How many people in a war zone during a hot war,
00:51:44.320 how many people die by mutinous murder
00:51:49.940 versus accidents and acts of war?
00:51:53.680 If you had to put a bet on this one,
00:51:56.700 somebody says 30 or 40 miles an hour, but very loud,
00:52:00.120 I'm guessing you could see it coming.
00:52:02.960 Yeah, I wouldn't stand in front of it, that's for sure.
00:52:04.700 A tank goes faster than a person can run,
00:52:09.700 but does a tank turn faster than a person
00:52:13.240 can just walk out of the way?
00:52:15.620 If you saw a tank coming at you at 30 or 40 miles an hour,
00:52:19.560 and remember, it's not going to be at top speed,
00:52:21.420 because it's...
00:52:22.680 Could you get out of the way?
00:52:26.300 Can tanks go sideways too easily?
00:52:29.280 Somebody says they're surprisingly agile,
00:52:32.840 but at that speed.
00:52:36.600 Have you looked up Operation Gladio yet?
00:52:39.460 No, should I?
00:52:41.960 All right, I'm going to call this fake news,
00:52:44.060 because it's a little bit too on the nose.
00:52:46.340 You know, when you predict,
00:52:47.280 hey, I think Ukraine's going to do some propaganda like this,
00:52:50.400 and then some propaganda like this comes out,
00:52:53.180 right when you expect it.
00:52:55.000 Because this is right when you expect
00:52:56.740 the mutinous Russian soldier stories.
00:53:00.380 Now, I did get credit.
00:53:03.960 I think Andres Bekhaus gave me credit for...
00:53:07.520 And he said if, big question,
00:53:09.280 he said if it's true, then I predicted it.
00:53:13.080 But I don't think it's true.
00:53:14.100 So I'm not going to take credit for that,
00:53:15.340 because I don't think it's true.
00:53:17.840 And that, ladies and gentlemen,
00:53:19.760 brings us to the end of possibly the best livestream
00:53:22.160 that has ever happened in the history of civilization.
00:53:25.700 Civilization.
00:53:28.400 And try ordering your bagels in separate boxes.
00:53:33.860 That would double my work.
00:53:35.840 I know how to fix it if I double my work.
00:53:38.900 Trust me.
00:53:40.860 I easily know how to fix it by doubling my work.
00:53:44.240 I'm trying to avoid that.
00:53:47.660 Oh, my God.
00:53:48.740 Zelensky being recognized by the Oscars
00:53:50.780 is a strike against validity.
00:53:53.280 Yep.
00:53:54.720 It is.
00:54:02.980 Get a girl to sniff my bagels?
00:54:05.340 Oh, yeah.
00:54:06.060 Maybe I could get a bagel-sniffing service animal.
00:54:10.640 Not a woman.
00:54:11.380 But maybe like a squirrel or something.
00:54:17.160 You don't really need a whole dog for that job.
00:54:19.640 I guess some kind of a ferret or something like that.
00:54:22.680 Ferrets.
00:54:23.140 There we go.
00:54:25.080 Putin won by ignoring Biden?
00:54:27.460 Maybe.
00:54:28.760 Now, who do you think owns Mariupol?
00:54:31.880 So the Ukrainian city
00:54:34.180 that has been demolished by Russia.
00:54:37.480 Who do you think owns Mariupol?
00:54:41.320 The news tells you that Ukraine still owns it.
00:54:46.820 I'm not so sure about that.
00:54:49.280 Who owns rubble?
00:54:50.960 If you've surrounded the rubble, do you own it?
00:54:53.320 Or if you live in the rubble, do you own it?
00:54:55.440 Who owns the rubble?
00:54:56.980 The people living in it and starving?
00:54:59.020 Or the people who have surrounded it
00:55:00.420 with a massive military?
00:55:03.240 Oh, that's a bad pun.
00:55:05.600 Marry a rubble.
00:55:06.460 Ugh.
00:55:07.480 It's clever.
00:55:10.680 Too soon.
00:55:11.680 Too soon.
00:55:15.320 Yeah.
00:55:16.440 I don't know.
00:55:17.460 So I don't think any of the news coming out of Ukraine
00:55:19.440 should be believed.
00:55:20.680 I think you've got that story by now.
00:55:23.480 And how do we do today?
00:55:29.260 Was it amazing?
00:55:32.820 It was, was it?
00:55:33.820 Yeah, best ever, I think.
00:55:35.120 Probably the highlight of your day.
00:55:38.580 And now, I'll be turning off YouTube.
00:55:43.260 I'm glad you enjoyed the perfect audio experience.
00:55:47.500 And now, go enjoy the rest of your day.
Go on, go.