Real Coffee with Scott Adams - September 11, 2023


Episode 2228 Scott Adams: Bad Behavior By Everyone - I Call It "The News." Bring Coffee


Episode Stats

Length

1 hour and 23 minutes

Words per Minute

146.3

Word Count

12,276

Sentence Count

865

Misogynist Sentences

11

Hate Speech Sentences

35


Summary

The dopamine hit of the day: the thing that makes everything better. Also, the winner of the U.S. Open is a man who rejects vaccinations. And a company that can predict crime before it happens.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization, possibly
00:00:11.540 Mars civilization, in the not-too-distant future.
00:00:16.680 Ah, YouTube doesn't have sound.
00:00:19.780 Probably YouTube would be better if I put my microphone on.
00:00:27.420 Probably this would be better if I had my microphone on, too.
00:00:30.900 I'll bet everything would be better with sound.
00:00:33.720 That's what I think.
00:00:35.420 If you'd like to take your...
00:00:37.360 Uh-oh.
00:00:38.200 Oh, shit.
00:00:40.740 Apparently, I have to turn off the comments for every episode.
00:00:47.320 I thought I had them off.
00:00:49.000 But the YouTube comments, I will be ignoring, because there are too many anti-Semites.
00:00:53.440 If you'd like to make comments, you could always be on the Subscription Locals platform,
00:00:58.060 or you could be on the X platform that's streaming live right now at the same time.
00:01:04.520 But too many anti-Semites on YouTube, and I think they're not organic.
00:01:10.080 I think somebody's sending them here, so we're going to ignore them.
00:01:12.640 But if you'd like to take your experience up to levels which you've never seen before,
00:01:17.640 all you need is a cup or a mug or a glass, a tank or a chalice or a stein,
00:01:21.180 a canteen jug or a flask, a vessel of any kind,
00:01:24.300 filled with your favorite liquid.
00:01:26.140 I like coffee.
00:01:27.700 And join me now for the unparalleled pleasure,
00:01:30.180 the dopamine hit of the day, the thing that makes everything better.
00:01:33.040 It's called the Simultaneous Sip.
00:01:35.240 Go.
00:01:35.420 Ah, so good.
00:01:43.580 So good.
00:01:45.340 Well, I hear that people who are ordering my hardcover version of Reframe Your Brain,
00:01:51.420 are getting notices that it's taking until October 19th,
00:01:54.240 which means you probably don't want to wait beyond November to order it
00:02:00.620 if you're going to get it for gifts.
00:02:01.860 Now, a lot of people are buying, you know, five to ten copies of the book
00:02:06.160 because it turns out it's sort of the perfect gift for literally everyone,
00:02:11.040 anybody who can read,
00:02:12.900 because it will change lives.
00:02:14.880 It's guaranteed.
00:02:16.540 And some people told me that the best thing about giving this book as a gift
00:02:21.240 is it's the first time they can explain who I am
00:02:24.680 because a lot of people are getting heat for listening to me.
00:02:28.400 If you've been getting heat for listening to me,
00:02:34.260 this book will explain me in a way that other people will understand for the first time.
00:02:39.320 So that's one benefit from it.
00:02:40.900 If you're having a hard time explaining why you would listen to somebody like me,
00:02:44.800 that'll do it for you.
00:02:46.020 It's already changing lives in a very big way.
00:02:48.900 So don't wait too long or you won't have yours by Christmas.
00:02:51.360 All right, because, as you know,
00:02:54.580 reality likes to follow the path of most entertainment,
00:02:59.460 the winner of the U.S. Open tennis was Djokovic.
00:03:05.160 And Djokovic is famous for what?
00:03:09.000 Besides tennis.
00:03:11.220 Famous for rejecting vaccinations
00:03:13.700 to the point where, was it last year,
00:03:16.680 he was not even allowed to play in the U.S. Open
00:03:18.700 because he was unvaccinated.
00:03:20.600 Well, he came back, he won the damn tournament,
00:03:24.020 and he was not only the winner,
00:03:26.800 but the sponsor of the tournament.
00:03:31.200 Let's see, one of them was Moderna.
00:03:33.740 Moderna.
00:03:34.640 And so he was featured in the Moderna shot of the day.
00:03:39.360 That's right.
00:03:40.000 They showed a highlight of Djokovic,
00:03:42.500 the most famous denier of vaccinations,
00:03:46.160 and it was labeled the shot of the day.
00:03:49.440 Moderna.
00:03:50.600 Yep, yep.
00:03:57.100 That's almost as bad as TikTok advertising on Fox News.
00:04:02.160 As soon as you hear the advertiser, you go,
00:04:06.060 oh, man, what's going on here?
00:04:08.360 So that's happened.
00:04:12.500 There's a company called Voyager Labs
00:04:15.080 who theoretically, hypothetically,
00:04:18.740 can predict crime before it happens.
00:04:22.500 Now, New York City is using the software,
00:04:24.780 but not in that way.
00:04:25.820 So that's not the only thing it does.
00:04:28.140 So they say New York City is not using it to predict crime,
00:04:31.540 but it could do it.
00:04:34.580 There's a thought that it could do it.
00:04:36.700 But here's my question.
00:04:38.760 As much as I'm not sure I want the government
00:04:41.460 to be able to predict crime,
00:04:43.100 because you know that that's going to go terribly wrong, right?
00:04:46.660 If the government can predict crime,
00:04:49.700 it's just not a good predictive sort of thing.
00:04:53.940 But here's the question I ask you.
00:04:56.960 What if citizens could use it to protect themselves?
00:05:00.640 Would that be wrong?
00:05:02.800 Suppose you could run it against everybody
00:05:06.160 that AI knows is living in your neighborhood,
00:05:09.400 and even could check the police
00:05:11.800 that are your local police that are on your streets.
00:05:15.980 Suppose you got stopped by a police officer
00:05:20.440 for speeding or something.
00:05:22.520 Suppose you could tell immediately
00:05:24.180 if your police officer was going to be dangerous
00:05:26.700 to people like you.
00:05:29.780 Wouldn't that be useful?
00:05:31.660 If as soon as you saw the police officer come up
00:05:34.660 and you saw the name,
00:05:36.660 if somehow you had some way to know,
00:05:38.440 uh-oh, this one's a bad one.
00:05:40.500 This one's been accused of five different abuses already,
00:05:44.740 so you better keep it under wraps.
00:05:48.300 Or suppose they would tell you to move.
00:05:51.420 Suppose you ran this software and it said,
00:05:53.200 you're a single woman living in this neighborhood,
00:05:56.760 your odds of being abused are 80%.
00:05:59.380 You better get the hell out of here.
00:06:01.980 I don't know.
00:06:02.400 I feel like something that was predictive of crime
00:06:05.280 would be useful to citizens
00:06:07.780 because you could even know
00:06:09.160 if your own friends are likely to steal from you.
00:06:13.400 Wouldn't that be handy?
00:06:15.220 To know if the people you're working with
00:06:17.140 or the people you want to do a deal with
00:06:19.060 are likely to rob you?
00:06:22.900 Well, it's an interesting question.
00:06:24.720 I don't know if the software
00:06:25.800 could actually identify that kind of risk.
00:06:27.600 But if it could,
00:06:29.400 if it could,
00:06:30.900 would you use it?
00:06:33.380 It's kind of creepy.
00:06:35.200 It would definitely discriminate,
00:06:37.020 but it would also keep you safe.
00:06:40.220 Well, Joe Biden,
00:06:41.520 can anybody confirm that this really happened?
00:06:43.580 I saw it on a tweet,
00:06:45.080 but I did not see a source.
00:06:47.500 Is it true that Joe Biden said,
00:06:49.820 I guess yesterday maybe,
00:06:51.180 that China has become too weak
00:06:53.680 to invade Taiwan
00:06:54.760 because their economy is so bad?
00:06:59.740 You actually said that, huh?
00:07:01.200 What do you think of that?
00:07:04.160 Do you think that's true?
00:07:07.440 You know what's interesting is
00:07:09.280 it might be a little bit true.
00:07:11.820 It's certainly not true enough, right?
00:07:14.500 It's not 100% true.
00:07:15.900 And it's probably not 50% true.
00:07:19.280 But it's an interesting point
00:07:20.940 that their economy
00:07:23.100 is a little bit sketchy.
00:07:26.500 But I would be surprised
00:07:27.820 if they couldn't mount a war
00:07:29.080 with their existing assets.
00:07:31.680 So I think that's an overstatement,
00:07:34.900 but it's a variable.
00:07:37.340 It might be a variable
00:07:38.400 that's making China have second thoughts.
00:07:42.780 Imagine if you are
00:07:44.280 President Xi.
00:07:48.340 And sure, you're a dictator,
00:07:50.400 so you've got a lot of control,
00:07:51.820 but you still need to keep
00:07:53.340 other people happy, right?
00:07:55.840 Because the other elites
00:07:57.480 could gang up on you
00:07:58.560 if you were just totally out of control.
00:08:01.180 So even the head of China
00:08:03.940 needs to make other people happy.
00:08:07.080 Imagine if you're the head of China,
00:08:10.760 President Xi,
00:08:11.740 and you see that you've got
00:08:13.040 all these other economic woes,
00:08:15.040 and they're really big ones,
00:08:16.260 like demographic problems,
00:08:18.020 things you can't really easily fix.
00:08:20.440 So let's say you see
00:08:21.760 manufacturing is leaving.
00:08:23.980 You know, you're maybe losing
00:08:25.540 some of your access
00:08:26.280 to some high-tech stuff.
00:08:28.780 You know, you're a little bit
00:08:30.660 pressured by everybody.
00:08:32.920 You've got to spend a lot
00:08:33.920 on social services.
00:08:35.040 You've got 50% unemployment
00:08:37.480 among the youth.
00:08:39.000 Suppose you've got
00:08:39.720 all those problems,
00:08:40.720 and at the moment,
00:08:41.680 they're sort of
00:08:42.300 a little bit under wraps,
00:08:44.940 meaning that the average
00:08:46.200 Chinese citizen
00:08:47.460 is probably not thinking
00:08:48.740 about them every moment
00:08:49.780 of the day.
00:08:51.100 So you're kind of stable,
00:08:53.220 but there's a lot of stuff,
00:08:55.280 a lot of big stuff,
00:08:57.360 that could be a problem
00:08:58.460 and fairly soon.
00:09:00.280 Under those conditions,
00:09:02.440 would you want to take
00:09:03.180 the chance of your economy
00:09:04.660 imploding coincidentally
00:09:07.840 at the same time
00:09:08.620 that you started a war?
00:09:10.180 Because the problem is,
00:09:11.600 if it looked like
00:09:12.380 you made a mistake
00:09:13.200 starting a war,
00:09:14.820 at the same time
00:09:15.580 it looked like
00:09:16.080 you made a mistake
00:09:16.800 managing the economy,
00:09:18.780 that might be
00:09:19.680 too many mistakes.
00:09:21.620 So if you were
00:09:22.320 a Chinese leader,
00:09:23.280 could you stand
00:09:24.300 the internal
00:09:25.880 national pressure
00:09:26.880 of having a failed economy
00:09:29.140 and a war that's not
00:09:30.340 maybe going too smoothly?
00:09:31.860 Because who knows
00:09:32.740 what would happen.
00:09:33.600 I feel like Biden
00:09:36.480 is on to something.
00:09:38.240 I feel like if China's economy
00:09:40.340 were 100% solid,
00:09:43.100 that President Xi
00:09:43.900 could take a chance
00:09:44.860 of screwing up a war
00:09:46.260 and still stay in power.
00:09:49.560 But if the economy is bad
00:09:51.480 and there's a war,
00:09:53.060 the Chinese public
00:09:53.900 are going to say,
00:09:54.760 why are we spending money
00:09:55.640 on a war
00:09:56.180 when we've got
00:09:57.920 these other problems,
00:09:59.120 just like the American economy is?
00:10:00.880 So it could be
00:10:03.000 that the best thing
00:10:04.020 to reduce the chance
00:10:05.820 of war in the world
00:10:06.760 is bad economies.
00:10:09.960 Because wars
00:10:10.800 make economies worse,
00:10:12.540 especially if you lose.
00:10:14.800 All right.
00:10:15.900 So maybe Biden's
00:10:17.280 on to something
00:10:17.800 a little bit there.
00:10:21.700 So Biden was in Vietnam
00:10:23.300 embarrassing the country.
00:10:26.040 But he picked
00:10:26.860 the right country
00:10:27.480 to do it.
00:10:28.900 You know,
00:10:29.120 if you're going to send
00:10:29.820 Biden out
00:10:30.660 onto the international stage,
00:10:34.400 wouldn't you want
00:10:35.520 to send him
00:10:35.980 to a country
00:10:36.500 that values
00:10:37.580 the elderly?
00:10:39.480 Yeah.
00:10:40.080 I don't think
00:10:40.720 we should send
00:10:41.280 Biden anywhere
00:10:42.340 where they don't have
00:10:43.780 a real strong
00:10:44.540 cultural preference
00:10:45.720 for showing respect
00:10:47.460 to the elderly.
00:10:48.720 Because can you imagine
00:10:49.820 the face
00:10:50.460 of the Vietnamese
00:10:52.280 leaders
00:10:54.000 who were greeting him?
00:10:56.160 Yeah,
00:10:57.600 you're totally all right.
00:10:58.860 Uh-huh.
00:11:00.780 The things coming
00:11:01.560 out of your mouth
00:11:02.320 totally make sense.
00:11:04.320 Oh,
00:11:04.780 we respect you,
00:11:05.740 you elder.
00:11:08.160 Yeah,
00:11:08.440 I've got a feeling
00:11:09.160 that they can fake it
00:11:10.200 better than other countries.
00:11:13.140 Can you imagine
00:11:14.060 the Russians?
00:11:15.740 Imagine Biden
00:11:16.640 taking this performance
00:11:18.920 to Russia,
00:11:19.660 you know,
00:11:19.880 as if he would travel
00:11:20.780 to Russia.
00:11:21.200 but imagine
00:11:23.360 the Russians
00:11:23.900 watching Biden,
00:11:25.380 you know,
00:11:25.660 flustered
00:11:26.500 and mumbling around
00:11:28.160 and wandering off.
00:11:30.420 They would literally
00:11:31.120 be drinking
00:11:31.960 and laughing.
00:11:33.920 But you take him
00:11:34.900 to Vietnam
00:11:35.420 and everybody's like,
00:11:37.060 oh,
00:11:37.280 we respect the elderly.
00:11:39.240 Let's just
00:11:40.160 let him be.
00:11:43.080 So,
00:11:44.380 I don't know.
00:11:44.920 the fact that
00:11:47.000 there's sort of
00:11:48.040 a cats-on-the-roof
00:11:49.320 quality
00:11:50.520 to the old
00:11:51.100 Biden experience,
00:11:53.080 you know what I mean?
00:11:54.780 Biden is failing
00:11:56.320 right in front of us,
00:11:57.480 but because
00:11:58.880 every day
00:11:59.720 is sort of
00:12:01.100 similar to the day
00:12:01.940 before,
00:12:02.500 but,
00:12:02.760 you know,
00:12:03.300 half a percent worse,
00:12:05.940 every day you say,
00:12:07.100 ah,
00:12:07.360 today,
00:12:08.040 well,
00:12:08.460 it's not that much
00:12:10.880 worse than it was
00:12:11.860 yesterday.
00:12:12.320 And then the next day,
00:12:14.500 ah,
00:12:15.560 it's just not
00:12:17.580 that much worse
00:12:18.560 than yesterday.
00:12:19.700 But if you start
00:12:20.600 adding it up,
00:12:21.480 you know,
00:12:22.120 at least half of
00:12:22.960 one percent worse
00:12:23.780 every day,
00:12:24.940 things are looking
00:12:25.640 pretty bad in six months,
00:12:27.060 if you know what I mean.
00:12:28.880 Compounding.
00:12:29.800 Compounding interest.
00:12:32.760 So,
00:12:33.300 that's how we got here
00:12:34.080 to this absurd,
00:12:35.080 ridiculous situation.
00:12:37.120 I guess
00:12:37.860 Kamala Harris
00:12:38.600 was sent to the
00:12:39.400 9-11
00:12:40.220 event today,
00:12:42.320 and Biden
00:12:44.180 said he was
00:12:45.000 going back to bed
00:12:45.960 after he talked
00:12:46.560 to the people
00:12:47.200 in Vietnam.
00:12:48.260 Now,
00:12:48.600 to be fair,
00:12:49.720 the time change
00:12:50.420 must be brutal.
00:12:51.980 You know,
00:12:52.200 beyond a certain age,
00:12:54.340 traveling and
00:12:55.300 time changes
00:12:56.200 and tough schedules,
00:12:57.680 it's got to be brutal,
00:12:58.580 I have to admit.
00:13:00.860 But,
00:13:01.460 maybe you shouldn't
00:13:02.160 say that loud.
00:13:03.340 I'm going back to bed.
00:13:05.460 All right.
00:13:08.640 California
00:13:09.120 is
00:13:09.900 kind of
00:13:11.240 quietly
00:13:11.900 looking to
00:13:13.540 reverse
00:13:14.120 a California
00:13:15.380 law
00:13:15.840 that would
00:13:17.160 punish
00:13:18.120 doctors
00:13:18.820 for spreading
00:13:19.620 COVID
00:13:20.120 misinformation.
00:13:22.900 Now,
00:13:23.700 in all of the,
00:13:24.640 can we call it
00:13:25.880 fuckery,
00:13:26.800 of the
00:13:27.600 pandemic,
00:13:29.700 somehow I'd
00:13:30.580 forgotten
00:13:31.040 that California
00:13:33.060 had a law
00:13:33.860 that doctors
00:13:35.600 who are doing
00:13:36.980 their best
00:13:38.160 to help
00:13:38.560 their customers,
00:13:40.360 but if
00:13:41.200 they happen
00:13:42.000 to say
00:13:42.360 something
00:13:42.700 that was
00:13:43.000 not the
00:13:43.560 approved
00:13:44.840 orthodoxy
00:13:45.640 of what
00:13:45.940 you can say,
00:13:47.300 they could
00:13:47.860 be punished
00:13:48.720 for being
00:13:51.100 wrong
00:13:52.540 according to
00:13:53.200 other people.
00:13:54.180 Not for being
00:13:54.880 wrong.
00:13:56.260 We're not
00:13:56.840 talking about
00:13:57.320 punishing the
00:13:57.940 doctors for
00:13:58.500 being wrong.
00:14:00.000 No,
00:14:00.300 that's not
00:14:00.700 even the topic.
00:14:01.500 It's not about
00:14:02.000 being wrong.
00:14:02.500 it's that
00:14:03.440 they can't
00:14:03.920 even say
00:14:04.440 it if
00:14:05.900 it's against
00:14:06.460 the standard
00:14:08.380 interpretation.
00:14:10.260 So,
00:14:10.540 the fact
00:14:11.000 that it's
00:14:11.340 not about
00:14:11.780 being right
00:14:12.280 or wrong
00:14:12.740 should be
00:14:13.060 the scariest
00:14:13.540 thing in
00:14:13.940 the world.
00:14:14.980 It's really
00:14:15.520 just about
00:14:16.060 not being
00:14:16.600 with the
00:14:18.080 government's
00:14:18.800 allowed
00:14:19.260 interpretation.
00:14:22.100 So,
00:14:22.700 they're slowly,
00:14:23.500 it looks like
00:14:23.960 they're sticking
00:14:25.040 it in some
00:14:25.660 other legislation,
00:14:27.160 kind of slowly
00:14:28.060 walking it back.
00:14:28.920 but at
00:14:32.660 least it's
00:14:33.060 being walked
00:14:33.540 back.
00:14:35.300 However,
00:14:35.860 you can't be
00:14:36.340 too happy
00:14:36.780 about it
00:14:37.240 because it
00:14:38.400 existed in
00:14:39.080 the first
00:14:39.400 place.
00:14:40.460 In modern
00:14:41.280 times.
00:14:42.440 In our
00:14:43.040 lifetime,
00:14:43.600 that existed.
00:14:44.500 That was a
00:14:45.000 real thing
00:14:45.440 that was a
00:14:45.840 real law.
00:14:46.960 And still
00:14:47.380 is, by the
00:14:48.060 way.
00:14:48.360 They just
00:14:48.680 haven't
00:14:48.920 reversed it
00:14:49.440 yet.
00:14:51.340 I mean,
00:14:51.840 the level
00:14:53.160 of wrongness
00:14:54.380 that we've
00:14:55.500 experienced in
00:14:56.280 the last
00:14:56.540 five years
00:14:57.460 or so,
00:14:57.820 I think
00:14:59.180 it's
00:14:59.340 unprecedented.
00:15:01.000 Now,
00:15:01.480 I saw a
00:15:02.020 scientific
00:15:02.600 study that
00:15:03.360 is being
00:15:03.960 questioned,
00:15:04.600 so we
00:15:04.840 don't know
00:15:05.120 if it's
00:15:05.360 true,
00:15:06.220 that the
00:15:07.060 human brain
00:15:07.920 has shrunk
00:15:08.680 in the
00:15:10.520 past,
00:15:11.180 I don't
00:15:11.500 know,
00:15:12.180 however many
00:15:12.860 thousands of
00:15:13.560 years.
00:15:14.320 And that
00:15:15.000 we used to
00:15:15.500 have bigger
00:15:15.880 brains,
00:15:17.820 you know,
00:15:18.080 like before
00:15:18.600 the Ice Age
00:15:19.220 or something.
00:15:20.320 So our
00:15:20.720 brains used
00:15:21.180 to be bigger
00:15:21.660 and now
00:15:21.900 they're smaller.
00:15:23.360 Do you
00:15:23.600 know what
00:15:23.820 that would
00:15:24.080 explain?
00:15:26.340 What would
00:15:26.960 that explain,
00:15:27.820 if our
00:15:28.560 brains had
00:15:29.060 in fact
00:15:29.640 gotten
00:15:29.920 smaller?
00:15:33.140 No,
00:15:33.600 the answer
00:15:33.900 is everything.
00:15:34.940 It would
00:15:35.400 explain everything.
00:15:37.100 Yeah,
00:15:37.320 you know how
00:15:37.720 we couldn't
00:15:38.380 understand how
00:15:39.300 the ancients
00:15:40.500 could build
00:15:40.960 pyramids,
00:15:41.640 but we
00:15:41.940 wouldn't know
00:15:42.320 how to do
00:15:42.740 it today
00:15:43.120 using their
00:15:43.680 technology?
00:15:45.640 What if
00:15:46.140 they were
00:15:46.340 just smarter?
00:15:48.800 That's the
00:15:49.280 whole story.
00:15:50.480 It's like,
00:15:50.860 well,
00:15:51.120 if you were
00:15:51.480 smart as we
00:15:52.100 used to be,
00:15:53.080 you could have
00:15:53.880 built a
00:15:54.220 pyramid too.
00:15:55.520 Maybe that's
00:15:56.180 the whole
00:15:56.420 answer.
00:15:56.760 Maybe the
00:15:58.320 answer to
00:15:59.940 what's wrong
00:16:00.480 with everything
00:16:01.280 is that
00:16:03.220 we're dumber
00:16:03.600 and maybe
00:16:04.660 there's nothing
00:16:05.120 else to it.
00:16:06.620 I don't
00:16:07.240 think so.
00:16:08.080 And the
00:16:08.440 other scientists
00:16:09.400 who are not
00:16:10.700 part of this
00:16:11.080 study are not
00:16:11.780 so sure that
00:16:13.120 we're getting
00:16:13.640 dumber.
00:16:14.420 Some of them
00:16:15.160 are saying,
00:16:15.980 oh yes,
00:16:17.040 generally speaking,
00:16:17.940 the bigger the
00:16:18.380 brain,
00:16:18.660 the smarter you
00:16:19.240 are,
00:16:19.540 but there
00:16:20.600 could be an
00:16:21.080 exception where
00:16:22.400 if you have
00:16:22.760 more folds,
00:16:23.500 it could be
00:16:24.200 tighter and more
00:16:24.900 compact like a
00:16:25.740 microchip.
00:16:26.740 So it might
00:16:27.020 actually be
00:16:27.520 better.
00:16:28.760 But there's
00:16:29.260 one theory that
00:16:29.940 says that
00:16:30.660 brains shrunk
00:16:32.080 because we
00:16:33.340 didn't need to
00:16:34.080 think individually.
00:16:36.900 That once you
00:16:37.460 can depend on
00:16:38.140 other people to
00:16:38.940 remember things
00:16:39.880 and do some
00:16:40.900 of your thinking
00:16:41.420 for you,
00:16:42.020 sort of group
00:16:42.700 thinking,
00:16:43.740 that you didn't
00:16:44.360 need as much
00:16:44.860 individual intelligence
00:16:45.880 to stay alive.
00:16:47.780 And therefore it
00:16:48.660 atrophied because
00:16:49.720 you didn't need it
00:16:50.220 so much.
00:16:51.460 Not buying that.
00:16:52.500 So what should
00:16:54.440 you believe about
00:16:55.160 this new study
00:16:55.920 about brain
00:16:56.460 shrinking?
00:16:58.180 Nothing.
00:16:59.320 Including the
00:17:00.040 fact that they
00:17:00.540 shrunk.
00:17:01.320 Because even
00:17:01.760 that's questioned.
00:17:03.560 Other people
00:17:04.340 say you can't
00:17:05.040 really measure
00:17:05.640 the size of a
00:17:06.920 brain back then
00:17:07.820 because it's
00:17:08.840 not preserved
00:17:09.480 and blah,
00:17:10.260 blah, blah.
00:17:10.920 Can't really
00:17:11.440 tell.
00:17:13.980 So ignore all
00:17:15.100 that.
00:17:15.300 So here's the
00:17:17.120 CNN narrative
00:17:18.240 on Trump
00:17:19.220 courtesy of
00:17:20.320 Stephen Collinson
00:17:21.180 and by the
00:17:22.780 way I try to
00:17:23.720 predict who
00:17:24.680 the opinion
00:17:26.700 piece will be by
00:17:27.980 based on the
00:17:29.200 title.
00:17:30.340 So I look at
00:17:30.900 the title of
00:17:31.520 the opinion
00:17:31.900 piece before
00:17:32.620 I see the
00:17:33.100 name associated
00:17:34.200 and try to
00:17:35.080 guess which
00:17:35.700 person it is
00:17:37.120 and this one
00:17:38.060 as soon as I
00:17:39.020 saw it I was
00:17:39.360 like that looks
00:17:40.160 like Stephen
00:17:40.600 Collinson to
00:17:41.240 me.
00:17:41.460 Sure enough.
00:17:45.740 Here are some
00:17:46.800 things it says.
00:17:48.520 The Republican
00:17:49.100 front runner,
00:17:49.920 meaning Trump,
00:17:50.420 his stark
00:17:51.400 speech,
00:17:53.060 they're saying
00:17:53.920 stark instead
00:17:54.660 of dark.
00:17:56.300 I think we
00:17:56.980 embarrassed him
00:17:57.680 off dark
00:17:58.360 and so now
00:17:59.760 it's turned
00:18:00.100 stark.
00:18:01.980 So it's very
00:18:02.740 stark.
00:18:04.380 The front
00:18:05.160 runner's
00:18:05.560 stark speech
00:18:06.620 raised the
00:18:07.640 prospect of a
00:18:08.800 second presidency
00:18:09.700 that would be
00:18:10.880 even more
00:18:11.360 extreme and
00:18:12.180 challenging to
00:18:12.860 the rule of
00:18:13.460 law than
00:18:13.980 his first.
00:18:16.200 Now, do you
00:18:17.780 believe that you
00:18:18.400 could objectively
00:18:19.240 make the case
00:18:20.180 that the Trump
00:18:21.740 administration
00:18:22.460 just
00:18:24.160 purely
00:18:24.500 objectively
00:18:25.100 had more
00:18:27.600 let's say
00:18:28.020 disregard for
00:18:29.000 law than
00:18:30.660 the Biden
00:18:31.640 administration?
00:18:34.840 Does anybody
00:18:35.740 think that's the
00:18:36.420 case?
00:18:38.480 I can only
00:18:39.680 think of one
00:18:40.340 instance where
00:18:41.420 this is even a
00:18:42.220 question and the
00:18:43.600 one instance is
00:18:44.440 January 6th and
00:18:46.080 January 6th is
00:18:47.040 an op, it's a
00:18:48.780 misinformation op
00:18:49.800 by the left, it's
00:18:50.660 not even real.
00:18:51.900 It's not even real
00:18:52.760 in the sense that
00:18:53.420 obviously it wasn't
00:18:54.320 an insurrection, but
00:18:55.880 half of the country
00:18:56.600 thinks you can
00:18:57.300 overturn a country
00:18:58.780 with some
00:18:59.800 paperwork and some
00:19:00.700 trespassing.
00:19:04.120 So the thing
00:19:08.560 with the
00:19:09.140 Democrats is that
00:19:11.080 they're so
00:19:11.500 brainwashed to
00:19:13.300 believe that they're
00:19:13.960 the good guys, that
00:19:15.520 you can just put
00:19:16.200 this out there like
00:19:17.040 it's true and you
00:19:18.060 don't have to
00:19:18.480 support it.
00:19:19.420 Well, you know,
00:19:19.980 there was this one
00:19:20.580 administration that
00:19:22.040 was ignoring the
00:19:22.880 law and the
00:19:25.100 Democrats go, oh
00:19:26.300 yeah, oh yeah,
00:19:27.760 thank goodness we've
00:19:28.880 got an administration
00:19:29.760 that just follows
00:19:30.520 the law now.
00:19:34.500 Is it amazing?
00:19:36.860 To me it's just
00:19:37.600 like mind-boggling
00:19:38.700 that there's even a
00:19:40.460 possibility that
00:19:41.480 somebody thinks that
00:19:42.460 they could measure
00:19:43.320 one administration's
00:19:45.280 flouting of the
00:19:46.240 law compared to
00:19:47.740 the other.
00:19:48.440 How in the world
00:19:49.160 would he even
00:19:49.660 compare?
00:19:51.240 So that's
00:19:52.120 brainwashing
00:19:53.060 propaganda.
00:19:54.900 Point number one
00:19:55.880 is to make you
00:19:57.140 accept uncritically
00:19:58.480 that one of them
00:19:59.620 is the honest one.
00:20:00.820 And we just
00:20:01.340 start from there.
00:20:02.100 Well, start from
00:20:02.900 the assumption
00:20:03.380 that one is the
00:20:04.560 honest side.
00:20:05.540 Now look at all
00:20:06.200 these crimes.
00:20:07.300 Whoa!
00:20:09.980 Yeah.
00:20:11.340 How about we
00:20:12.280 check that
00:20:12.900 assumption of
00:20:13.820 which one
00:20:14.340 flouts the law?
00:20:18.560 And Collinson
00:20:19.080 goes on to say
00:20:20.580 that, talking about
00:20:21.880 Trump, his view
00:20:23.080 that the Oval
00:20:23.740 Office confers
00:20:24.740 unfettered powers
00:20:26.020 Wait.
00:20:28.560 So Stephen
00:20:29.080 Collinson on
00:20:30.200 CNN wrote this
00:20:31.720 sentence about
00:20:32.540 Trump.
00:20:33.560 His view that
00:20:34.440 the Oval
00:20:34.960 Office confers
00:20:36.040 unfettered
00:20:36.920 powers.
00:20:41.180 What evidence
00:20:42.200 is there of
00:20:42.720 that?
00:20:44.600 Exactly what
00:20:45.420 evidence is there
00:20:46.240 of his internal
00:20:47.320 unspoken thoughts?
00:20:50.080 It's just put
00:20:50.920 here like it's
00:20:51.540 a fact.
00:20:53.480 It's mind
00:20:54.040 reading.
00:20:54.780 It's mind
00:20:55.160 reading.
00:20:55.780 By the way,
00:20:57.820 I'll say this
00:20:58.560 again.
00:20:59.220 Once I
00:20:59.780 introduce the
00:21:00.500 idea that a
00:21:02.200 lot of what
00:21:02.580 you see in
00:21:03.020 politics is
00:21:03.700 people pretending
00:21:04.500 they can read
00:21:05.180 minds,
00:21:06.040 when obviously
00:21:06.660 they can't.
00:21:07.920 Once you see
00:21:08.660 it, you see
00:21:09.140 it everywhere,
00:21:09.720 right?
00:21:10.340 The mind
00:21:11.020 reading frame.
00:21:13.660 It's like
00:21:14.380 pervasive.
00:21:15.780 And then once
00:21:16.700 you realize that
00:21:17.400 it's completely
00:21:19.040 illegitimate speech,
00:21:20.900 meaning that
00:21:21.660 it's just somebody
00:21:23.480 trying to brainwash
00:21:24.380 you, it's not
00:21:24.840 somebody who has
00:21:25.520 like a real
00:21:26.860 opinion.
00:21:27.200 So let me
00:21:29.340 finish this.
00:21:29.900 This is mind
00:21:30.620 blowing.
00:21:31.340 His view,
00:21:32.320 as if he
00:21:33.120 could know
00:21:33.440 this,
00:21:33.880 his view
00:21:34.320 that the
00:21:34.680 Oval Office
00:21:35.300 confers unfettered
00:21:36.860 powers suggests
00:21:38.180 Trump would
00:21:38.800 indulge in
00:21:39.440 similar conduct
00:21:40.360 as that for
00:21:41.080 which he is
00:21:41.520 awaiting trial.
00:21:43.580 All right,
00:21:43.840 so see if I can
00:21:44.580 put this all
00:21:45.080 together,
00:21:45.520 because he's
00:21:45.880 talking about
00:21:46.300 the January
00:21:46.740 6th.
00:21:47.660 So the fake
00:21:48.980 accusations
00:21:49.860 about January
00:21:50.980 6th form a
00:21:53.400 base,
00:21:53.880 completely fake
00:21:55.240 accusations,
00:21:56.920 that we see
00:21:58.480 that if we
00:21:59.360 read his mind
00:22:00.400 and we read
00:22:01.900 his mind,
00:22:02.520 we see that
00:22:03.000 he's likely to
00:22:03.860 do more of
00:22:05.300 the thing that
00:22:05.820 never happened,
00:22:07.120 because we
00:22:08.780 read his mind.
00:22:10.320 Did I state
00:22:11.160 that right?
00:22:12.100 They read his
00:22:12.760 mind,
00:22:13.280 Trump's mind,
00:22:14.620 and in it
00:22:15.260 they can see
00:22:15.900 that he's likely
00:22:16.600 to do more
00:22:17.420 of the things
00:22:18.600 he's falsely
00:22:19.340 accused of.
00:22:20.120 That's called
00:22:25.360 the news on
00:22:26.340 the left.
00:22:27.240 That's the
00:22:27.640 news.
00:22:28.700 It's an
00:22:29.060 opinion piece,
00:22:29.760 but it's on
00:22:30.300 a news site.
00:22:33.440 Well,
00:22:34.160 there's a lot
00:22:35.960 of concern by
00:22:36.700 the people who
00:22:37.600 care about AI
00:22:38.280 that there will
00:22:38.780 be an October
00:22:40.680 surprise.
00:22:42.560 You know,
00:22:42.740 that's the big
00:22:43.420 news story that
00:22:44.200 you don't expect,
00:22:45.060 but of course
00:22:45.480 you always expect
00:22:46.200 it before an
00:22:47.840 election,
00:22:48.560 the October
00:22:49.000 before the
00:22:49.560 election.
00:22:50.120 So that not
00:22:51.220 this year,
00:22:52.980 but one year
00:22:54.500 from this
00:22:55.200 October,
00:22:56.200 that we'll have
00:22:56.840 an AI-inspired
00:22:58.040 deep fake
00:22:58.900 that will be
00:23:00.000 convincing enough
00:23:00.880 that it will change
00:23:01.460 the election.
00:23:03.100 Now,
00:23:03.780 here's my
00:23:04.280 question.
00:23:06.120 I feel as if
00:23:07.460 this could go
00:23:08.700 either way.
00:23:09.920 We might have
00:23:10.860 reached the point
00:23:11.520 where people will
00:23:12.380 stop looking at
00:23:13.460 photos and
00:23:14.680 videos.
00:23:16.560 Suppose
00:23:17.120 you had a
00:23:19.060 site that
00:23:19.580 wouldn't show
00:23:20.120 you any of
00:23:20.580 it.
00:23:20.780 They wouldn't
00:23:21.840 show you a
00:23:22.340 video,
00:23:23.040 and they
00:23:23.780 would never
00:23:24.140 show you a
00:23:24.680 picture.
00:23:25.860 They would
00:23:26.280 only describe
00:23:27.020 it.
00:23:28.300 That would be
00:23:28.840 as close as
00:23:29.480 you could get
00:23:29.920 to real news,
00:23:30.800 because videos
00:23:31.820 and pictures
00:23:32.480 lie.
00:23:33.440 They're the
00:23:33.800 biggest liars.
00:23:35.820 So I feel
00:23:36.740 like if I knew
00:23:37.360 a human wrote
00:23:38.060 the story,
00:23:39.420 and there were
00:23:39.860 no pictures or
00:23:40.640 videos to
00:23:41.260 distract me and
00:23:42.140 lie to me,
00:23:43.360 that might be
00:23:44.280 closer to real.
00:23:45.640 But also,
00:23:46.780 if there's no
00:23:47.220 picture or video,
00:23:48.160 it's easier to
00:23:48.800 lie.
00:23:50.020 So there's no
00:23:50.940 real good solution
00:23:52.180 here.
00:23:54.840 But what do you
00:23:55.660 think?
00:23:55.960 Do you think that
00:23:56.500 there will be a
00:23:57.440 deep fake October
00:23:58.640 surprise?
00:23:59.620 I say no.
00:24:01.600 And the reasons I
00:24:02.360 say no, there'll be
00:24:03.160 lots of attempts.
00:24:04.540 So there'll be
00:24:05.180 plenty of deep
00:24:05.940 fake attempts.
00:24:06.980 But I think we'll
00:24:07.660 be so onto the
00:24:08.660 fakes that we
00:24:10.380 will be primed to
00:24:11.360 see everything as
00:24:12.160 fake even when it's
00:24:13.140 not.
00:24:14.040 I think it's more
00:24:14.880 likely we'll think
00:24:15.840 a real thing is
00:24:16.620 fake.
00:24:17.980 Oh, okay, here's
00:24:18.960 my prediction.
00:24:20.160 Here's my
00:24:20.600 prediction.
00:24:21.420 It's more likely
00:24:22.480 we'll think a
00:24:23.360 real thing was
00:24:24.700 faked than we'll
00:24:26.800 think a fake
00:24:27.560 thing was real.
00:24:29.960 Anybody want to
00:24:30.760 take the other
00:24:31.200 side of that bet?
00:24:32.140 It'd be hard to
00:24:32.800 prove.
00:24:33.580 But look for a
00:24:35.160 situation where
00:24:35.900 there's a real
00:24:36.440 thing that one
00:24:38.760 side or the other
00:24:39.500 believes is fake.
00:24:41.280 I think that's
00:24:41.960 slightly more
00:24:43.180 likely than
00:24:43.980 actually believing
00:24:44.880 a deep fake.
00:24:46.080 Now, we might
00:24:46.800 believe a deep fake
00:24:47.740 for a day, but we
00:24:49.760 wouldn't believe a
00:24:50.480 deep fake for two
00:24:51.380 days if we're
00:24:52.280 watching the news
00:24:54.060 because it would
00:24:54.820 get debunked pretty
00:24:55.580 quickly.
00:24:56.940 And then I think
00:24:57.820 that AI will be
00:24:58.880 able to detect AI.
00:25:01.400 So we should have
00:25:02.320 pretty quickly an AI
00:25:03.660 that's a watchdog of
00:25:04.920 other AI.
00:25:06.660 Hey, do you know
00:25:07.180 why there's no such
00:25:08.020 thing as an AI
00:25:09.300 fact check?
00:25:10.060 Isn't that the
00:25:12.760 obvious thing?
00:25:13.620 If ChatGPT can do
00:25:16.440 search, shouldn't it
00:25:18.560 also be able to
00:25:19.300 watch your posts on
00:25:20.900 social media and
00:25:22.520 automatically add a
00:25:23.840 fact check?
00:25:25.640 We already have
00:25:26.860 that ability right
00:25:27.640 now, right?
00:25:28.600 How hard would it
00:25:29.500 be?
00:25:29.740 It would be somewhat
00:25:30.520 easy.
00:25:32.080 Do you know why you
00:25:32.900 don't see it?
00:25:34.180 Why is there no
00:25:35.220 ChatGPT fact check?
00:25:36.980 Because the
00:25:41.000 Democrats can't
00:25:41.880 control it.
00:25:44.300 They do know that
00:25:45.860 ChatGPT is biased
00:25:47.100 toward the left, but
00:25:48.720 not in every way.
00:25:50.980 Not in every way.
00:25:52.140 It still will give you
00:25:52.980 facts sometimes.
00:25:54.760 So if you let ChatGPT
00:25:56.800 do an automatic
00:25:57.860 fact checking, it
00:25:59.240 would destroy the
00:26:00.060 Democratic Party.
00:26:00.900 Because their
00:26:02.980 entire operation
00:26:03.920 involves gaslighting
00:26:07.240 their own team.
00:26:08.520 It just doesn't
00:26:09.380 work on the other
00:26:10.020 side, because the
00:26:11.000 other side is, you
00:26:11.980 know, they're sort of
00:26:12.880 primed to reject
00:26:13.680 everything that comes
00:26:14.500 from that side just
00:26:15.340 automatically.
00:26:16.380 So it doesn't work,
00:26:18.040 except for their own
00:26:18.920 team.
00:26:20.220 But what would
00:26:21.040 happen if they
00:26:21.620 thought ChatGPT
00:26:22.900 was real?
00:26:24.260 It's real enough.
00:26:25.300 And then it kept
00:26:26.860 disagreeing with
00:26:27.560 their own team.
00:26:29.420 There's no way that
00:26:30.180 Democrats can allow
00:26:31.220 ChatGPT to become,
00:26:33.900 or AI in general,
00:26:35.300 to be part of the
00:26:36.200 public conversation
00:26:37.100 unless they've biased
00:26:38.420 it so badly that it
00:26:39.980 can't possibly be
00:26:40.840 useful.
00:26:41.940 Just think about that.
00:26:42.960 So that's my
00:26:43.460 prediction.
00:26:44.540 My prediction is
00:26:45.360 Democrats will never
00:26:46.460 allow AI to be part
00:26:48.880 of fact checking
00:26:49.620 unless they've
00:26:50.940 corrupted it so badly
00:26:52.080 that it can't really
00:26:52.920 do the fact checking.
00:26:55.300 Anybody want to
00:26:56.880 take the other side
00:26:57.500 of that prediction?
00:26:58.960 You don't want to
00:26:59.780 take the other side
00:27:00.420 of that one.
00:27:01.560 There will never be
00:27:02.720 an AI fact checker
00:27:06.380 approved by Democrats.
00:27:09.020 It can't happen.
00:27:10.640 Just think about
00:27:11.280 that fact for a while.
00:27:12.540 So this is going to
00:27:13.340 be the biggest dog
00:27:14.280 not barking.
00:27:15.800 Once it becomes
00:27:16.820 like incredibly
00:27:18.660 clear to every
00:27:20.360 citizen that ChatGPT
00:27:21.980 could identify the
00:27:23.040 bullshit, you're
00:27:25.120 going to ask why
00:27:25.660 it's not doing it.
00:27:27.540 If it can identify
00:27:29.000 fakes and bullshit
00:27:30.080 and facts that are
00:27:33.300 just a spin, and
00:27:34.840 of course it can do
00:27:35.640 that, why aren't we
00:27:37.940 massively using it?
00:27:39.960 Now it doesn't mean
00:27:40.580 it will get everything
00:27:41.260 right.
00:27:41.980 It could just flag
00:27:42.960 things you should look
00:27:43.840 into more deeply.
00:27:45.540 But wouldn't that be
00:27:46.280 useful?
00:27:47.520 Yeah.
00:27:48.020 Look how useful the
00:27:49.000 community notes are
00:27:50.160 on the X platform.
00:27:52.580 The best thing ever.
00:27:53.620 The community notes
00:27:55.220 where they add the
00:27:56.360 context.
00:27:57.220 Because the community
00:27:57.860 notes do not seem to
00:27:59.400 be targeting one side.
00:28:01.500 Would you agree?
00:28:03.120 If you've watched
00:28:03.820 anything?
00:28:04.560 The community notes
00:28:05.680 have taken down people
00:28:06.660 on the left and the
00:28:07.500 right.
00:28:09.060 It's just the fact
00:28:10.300 is the fact.
00:28:11.620 So, yeah, I think
00:28:13.820 that's the most, I
00:28:14.820 think the community
00:28:15.440 notes are the most
00:28:16.360 successful fact-checking
00:28:19.160 process so far.
00:28:22.640 All right.
00:28:27.500 Jake Tapper looks
00:28:29.480 like he's trying to
00:28:30.140 get Elon Musk
00:28:30.940 assassinated or
00:28:32.020 jailed because he's
00:28:34.280 pushing a narrative
00:28:35.480 via his questioning.
00:28:37.980 He's pushing the
00:28:38.860 narrative.
00:28:40.200 He was talking to
00:28:40.780 Tony Blinken.
00:28:42.820 And he says, I'll
00:28:44.500 paraphrase, but Jake
00:28:46.080 Tapper said that Musk
00:28:47.740 effectively sabotaged
00:28:49.600 a military attack
00:28:51.000 against a U.S.
00:28:52.620 ally.
00:28:53.900 Now, his version of
00:28:54.960 it is that Musk
00:28:56.680 deactivated the Starlink
00:28:59.480 satellites, his own
00:29:00.380 satellites, that Ukraine
00:29:02.200 was using.
00:29:02.860 And they were planning to
00:29:03.760 use it for a maritime
00:29:05.280 attack, which would
00:29:06.940 have been essentially a
00:29:07.920 Pearl Harbor attack on
00:29:09.760 the Russian fleet
00:29:10.720 before the Russian fleet.
00:29:13.380 You know, I don't think
00:29:14.180 they're all in one place
00:29:15.000 now.
00:29:15.200 So there was that one
00:29:16.520 point during the early
00:29:17.300 part of the war when
00:29:18.020 the fleet was in one
00:29:18.820 place, or a lot of it.
00:29:20.700 And an attack of that
00:29:23.140 size would have made
00:29:24.840 Musk uncomfortable
00:29:25.920 because he says he
00:29:26.960 talked to some high
00:29:27.780 ranking Russians who
00:29:29.380 threatened nuclear
00:29:30.760 response if things got
00:29:32.640 too bad.
00:29:33.760 And that that looked
00:29:34.920 like it was possibly a
00:29:36.200 nuclear trigger.
00:29:38.200 And that Musk himself,
00:29:41.360 apparently, decided that
00:29:43.240 he would not turn on
00:29:44.240 those satellites, and
00:29:45.420 there's some difference
00:29:46.140 in the reporting.
00:29:47.700 The way Tapper
00:29:48.900 describes it, it sounds
00:29:50.140 like he turned them
00:29:50.920 off.
00:29:52.320 But indeed, they were
00:29:53.420 never on in the first
00:29:54.340 place.
00:29:55.060 So they were actually
00:29:55.960 asking him to
00:29:56.660 participate in something
00:29:59.240 that could have started
00:30:00.180 World War III.
00:30:01.500 And he said, no.
00:30:05.080 He just said no.
00:30:06.240 He never promised
00:30:12.700 coverage over Crimea
00:30:13.780 because he thought
00:30:14.540 Crimea was a red line.
00:30:17.340 Now, Blinken, quite
00:30:20.400 wisely, I thought he
00:30:21.480 handled it well.
00:30:22.940 Instead of saying
00:30:23.840 anything good or bad
00:30:25.140 about Musk or what he
00:30:26.280 did or did not do,
00:30:27.960 Blinken just said, and
00:30:29.240 this is the right answer.
00:30:30.140 You know, I'm not a big
00:30:33.200 Blinken fan, but he
00:30:35.300 gave the exact right
00:30:36.300 answer.
00:30:37.200 He said, we think
00:30:38.580 Starlink is important,
00:30:40.260 has been important, and
00:30:41.340 will be important to the
00:30:42.480 future in our war.
00:30:45.640 And then, you know, and
00:30:46.860 then Tapper tried to
00:30:48.740 humiliate him for
00:30:49.920 avoiding the question, to
00:30:51.900 which I say, thank you
00:30:53.020 for avoiding that
00:30:53.760 question.
00:30:55.100 I wanted you to avoid
00:30:56.560 that question.
00:30:57.100 I wanted you to say
00:30:58.080 Starlink is important,
00:30:59.740 and we're going to keep
00:31:01.020 using it where we can.
00:31:03.140 That's all I want to
00:31:04.100 hear.
00:31:04.880 I don't want to hear you
00:31:06.060 say anything about an
00:31:07.140 American citizen who's
00:31:09.380 trying to help every way
00:31:10.680 he can.
00:31:11.920 I don't want to hear any
00:31:13.160 criticisms about him in
00:31:14.840 that context.
00:31:15.780 I mean, criticism is
00:31:16.640 fine, but not in that
00:31:17.960 context.
00:31:18.780 So Blinken played that
00:31:19.700 exactly right, because
00:31:21.220 you don't want to piss off
00:31:22.120 Musk if he controls the
00:31:23.800 biggest asset in the sky.
00:31:27.100 But to watch Jake Tapper
00:31:29.340 frame this in a way that
00:31:31.080 would get Musk killed or
00:31:34.160 jailed is probably one of
00:31:37.060 the least ethical things
00:31:38.240 I've ever seen in my
00:31:39.100 life.
00:31:40.340 Would you agree?
00:31:42.200 If anybody saw the clip,
00:31:44.120 did it look like the least
00:31:45.200 moral and ethically
00:31:48.000 responsible thing you've
00:31:49.780 ever seen?
00:31:50.220 And the idea here is that
00:31:53.420 Musk was doing his own
00:31:54.440 foreign policy, which would
00:31:56.980 be illegal.
00:31:58.980 Now, at what point is
00:32:01.320 talking to somebody else
00:32:02.820 foreign policy?
00:32:06.080 Or is it the fact that he
00:32:07.560 didn't turn on the
00:32:08.620 satellites as a foreign
00:32:10.340 policy?
00:32:11.120 If I don't give you assets
00:32:13.020 for your military stuff, is
00:32:16.280 that interfering?
00:32:17.180 Am I doing my own foreign
00:32:19.100 policy if I decide not to go
00:32:20.920 kill people you want me to
00:32:21.980 kill?
00:32:23.120 That's my own foreign
00:32:24.000 policy.
00:32:26.340 Refusing to help in a war
00:32:27.760 that doesn't look legitimate
00:32:29.120 to some people, that would
00:32:32.020 be foreign policy.
00:32:34.400 Well, let me say this.
00:32:36.940 We would have a big
00:32:38.080 conversation about what
00:32:39.200 foreign policy means.
00:32:41.120 Because do you think I've
00:32:42.580 never talked to anybody in
00:32:43.700 another country?
00:32:44.320 That's what the X platform
00:32:48.220 does.
00:32:49.080 It allows people in other
00:32:50.360 countries to DM me.
00:32:53.700 Are there not plenty of
00:32:55.300 public figures, and I would
00:32:56.760 be one of them, who have
00:32:58.080 had conversations with
00:32:59.280 people in other countries
00:33:00.340 in which they tell me
00:33:02.060 things that inform my
00:33:03.760 opinion?
00:33:05.180 You know, in many cases I
00:33:06.160 don't believe them because
00:33:06.880 it looks like propaganda.
00:33:08.540 But we're not allowed to
00:33:10.440 talk to people in other
00:33:11.320 countries?
00:33:11.660 Under what conditions are
00:33:15.140 we not allowed to talk to
00:33:16.160 other people?
00:33:17.200 Because I'm not aware of
00:33:18.320 any law that would prevent
00:33:19.440 me from talking to anybody
00:33:21.100 under any conditions.
00:33:24.900 Who can tell me who I can
00:33:26.180 talk to in another country?
00:33:28.620 Is there a law about that?
00:33:32.040 You know, if it's my job to
00:33:33.640 go over there and influence
00:33:34.580 them, that's something.
00:33:35.800 But I can't have a
00:33:36.700 conversation where I learn
00:33:39.540 things and they learn things
00:33:40.760 that are, you know, not
00:33:41.840 secrets.
00:33:45.060 Yeah.
00:33:46.120 You know, it's one
00:33:48.220 thing to be, I guess, Kerry
00:33:49.660 was kind of negotiating with
00:33:50.880 Iran, somebody said.
00:33:52.140 So, I mean, that's the real
00:33:53.300 kind that might be a problem.
00:33:55.340 But if all Musk did is talk
00:33:57.300 to a bunch of high-ranking
00:33:58.800 Russians and they told him
00:34:00.800 something that is a little
00:34:02.200 bit obvious, you know, that
00:34:03.880 they would consider nukes under
00:34:05.220 some situations, and that he
00:34:07.220 acted on that, how is that
00:34:09.860 wrong?
00:34:10.900 Like, what kind of law did
00:34:12.200 that violate?
00:34:14.800 Not wanting to be involved in
00:34:16.420 starting World War III is
00:34:17.840 doing your own foreign policy?
00:34:20.060 That's quite a stretch.
00:34:22.120 So, to me, it looks like
00:34:23.400 powers on the left are
00:34:26.560 ganging up to take Trump, or
00:34:28.920 to take Musk out.
00:34:30.600 I think that they need him
00:34:32.000 gone in order for their
00:34:33.500 propaganda gaslighting
00:34:36.460 machine to work.
00:34:37.240 Because if you can't get the
00:34:39.460 X platform to go along, it's
00:34:42.700 a little more obvious what
00:34:43.700 you're up to.
00:34:47.000 Well, let's talk about Nikki
00:34:48.220 Haley, whose prospects seem to
00:34:50.160 be rising.
00:34:50.860 A lot of people are liking her.
00:34:53.020 But she is pro-war in Ukraine
00:34:56.720 and funding it, whereas Trump
00:34:59.140 and Vivek, to name two, would be
00:35:02.420 a little more aggressive in trying
00:35:04.380 to wind down that war and get us
00:35:06.240 out of there.
00:35:07.440 But she said a win for Russia in
00:35:10.940 that war is a win for China.
00:35:14.540 Do you see that?
00:35:15.840 A win for Russia is a win for
00:35:17.400 China?
00:35:19.400 That feels like a weak,
00:35:22.360 bullshit reason, doesn't it?
00:35:25.320 There's something about that
00:35:26.600 reason that makes me think she
00:35:28.780 doesn't have a reason.
00:35:29.620 Yeah, it's like, why do you
00:35:32.020 have to stretch?
00:35:33.240 You're giving billions of
00:35:34.360 dollars for a war, and you've
00:35:36.320 got to come up with some
00:35:37.480 third-party effect for why it's
00:35:40.020 important.
00:35:41.000 There's not a first-party effect,
00:35:43.000 as in people dying right now,
00:35:45.340 and it either does make us safer
00:35:47.960 or not, in terms of Russia.
00:35:50.560 But the real concern is China.
00:35:53.560 Is she just bad at explaining
00:35:55.320 things, or is that her actual
00:35:57.440 reason?
00:35:59.860 I don't know.
00:36:00.720 I feel like her explanations of
00:36:03.020 the Ukraine situation, I'm not
00:36:06.300 saying that they're wrong, but
00:36:08.840 they're at the very least poorly
00:36:10.640 explained.
00:36:11.780 And I don't think you could have a
00:36:13.680 war that's poorly explained.
00:36:17.820 Like, that feels like really
00:36:19.020 important to explain it accurately.
00:36:21.600 So, yeah, this whole win for Russia
00:36:23.460 is a win for China, that's a little
00:36:25.980 too thin.
00:36:26.980 That's some weak soup right there.
00:36:29.640 Which is different from saying we
00:36:31.840 should or should not fund them.
00:36:33.560 All right?
00:36:34.360 I have my opinions on that, but
00:36:35.700 what we're talking about now is
00:36:36.800 just this explanation just doesn't
00:36:39.040 even feel real.
00:36:41.020 It sounds like somebody had to sit
00:36:43.040 down and think it up.
00:36:43.940 All right, can you come up with a
00:36:45.960 reason why we're funding this war?
00:36:47.420 Well, Russia, well, we'll, let's see,
00:36:53.260 we'll be safer from Russia.
00:36:55.200 No, we're not going to be safer from
00:36:56.440 Russia.
00:36:57.420 We'll actually be in more danger from
00:36:59.500 Russia.
00:37:00.640 So, the reason would be to save
00:37:03.120 Ukraine.
00:37:03.800 Well, no, Ukraine's totally destroyed,
00:37:05.860 so it's not about saving Ukraine.
00:37:08.700 It's about making sure that NATO is,
00:37:14.140 not really, I mean, it's not really
00:37:16.360 about NATO.
00:37:17.820 So, I feel like it was somebody who
00:37:19.940 had to come up with an essay for
00:37:23.160 a college class.
00:37:25.800 And he had to say something that
00:37:27.280 somebody hadn't said already.
00:37:28.900 Oh, I think I'll write an essay.
00:37:30.380 Well, let's say I'll, I'll talk about
00:37:33.860 the third, third effect, you know,
00:37:36.980 downstream.
00:37:39.100 I mean, it doesn't even, it just
00:37:40.180 doesn't seem real.
00:37:42.020 It sounds like CIA.
00:37:44.420 Let's just say what it sounds like.
00:37:46.280 It sounds like somebody in a
00:37:48.080 propaganda job came up with this
00:37:52.040 ridiculous, you know, China is really
00:37:55.000 about, that Russia is really about
00:37:57.140 China and Ukraine is really about
00:37:59.500 China.
00:38:01.460 I'm sorry.
00:38:02.260 Nope.
00:38:04.380 Nope.
00:38:05.180 Do you know what you'll never see?
00:38:07.280 No, you'll definitely see somebody
00:38:08.820 like Vivek say that, you know, we
00:38:12.340 should be focusing on China and it's
00:38:14.620 not helping us to be focused over
00:38:16.380 here.
00:38:17.180 That's a lot different than saying a
00:38:20.560 win for Russia is a win for China.
00:38:22.740 Those don't sound the same to me.
00:38:25.240 But I get the fact that our focus is
00:38:27.200 in the wrong place.
00:38:28.820 But, wait a minute.
00:38:31.140 I've got this backwards.
00:38:32.260 So Vivek is saying we should get
00:38:33.840 out because being there is good for
00:38:38.680 China.
00:38:39.640 And Nikki Haley is saying we should
00:38:41.840 stay there and fund it because getting
00:38:44.820 out would be good for China.
00:38:48.620 So there are two major Republicans who
00:38:52.420 can't even tell you for sure.
00:38:54.220 Well, one of them's right, but we don't
00:38:56.380 know which one's right.
00:38:57.140 I'm not smart enough.
00:38:58.720 So we can't even tell if a war is good
00:39:03.240 or bad for our main, let's say,
00:39:06.520 competition in China.
00:39:08.360 That's a pretty fundamental problem.
00:39:10.800 Do you know what I say when you can't
00:39:12.320 decide if a war is good or bad for you?
00:39:15.180 How do you handle it when you can't
00:39:16.620 decide if it's good or bad for you?
00:39:18.120 You don't do it.
00:39:21.340 You don't do a war if you're not sure.
00:39:24.700 Right?
00:39:25.040 Unless it's a defensive war and then
00:39:26.580 you're sure.
00:39:28.680 But that's pretty scary.
00:39:30.540 Even the Republicans don't know why
00:39:32.500 we're there.
00:39:34.200 Yeah.
00:39:36.280 Here's a little interesting tidbit.
00:39:39.020 There's a drug called memantine
00:39:43.240 that's given to adults with dementia.
00:39:48.520 It's a memory drug.
00:39:49.620 So it's an already approved existing
00:39:51.620 memory drug.
00:39:53.860 And there's some newer studies that
00:39:56.980 say it might be useful for reducing
00:39:59.500 ADHD symptoms.
00:40:02.660 But, hold on, you won't be
00:40:08.380 able to find this in the news.
00:40:12.660 Do you know why?
00:40:14.900 Why would you not be able to find
00:40:16.680 that there's an existing drug that
00:40:19.300 maybe, and by the way, you should not
00:40:21.080 take any medical advice from me.
00:40:22.940 I'm not, I'm not making the claim
00:40:24.420 that it works.
00:40:25.240 I'm not making the claim that it
00:40:26.580 works.
00:40:27.260 How would I know?
00:40:28.160 I'm not a doctor.
00:40:29.340 Do not take any drugs because you
00:40:31.480 heard it on this show.
00:40:33.480 All right.
00:40:33.780 Let's be clear.
00:40:35.380 I'm just spitballing here.
00:40:36.820 There's no medical advice.
00:40:39.340 However, there are at
00:40:44.080 least some doctors who work in the
00:40:45.560 field who are pretty sure that it's
00:40:47.380 having some effect.
00:40:49.000 Why is it that if you did a search
00:40:51.040 for it, it would be hard to find
00:40:52.100 that?
00:40:54.880 Could it be that there's some other
00:40:56.480 drug that makes a lot of money that
00:41:01.100 is the current thing that works for
00:41:03.780 ADHD?
00:41:05.400 Could it be there's some other drug?
00:41:07.520 So here's how you understand drug news.
00:41:13.680 If you do a search and it says,
00:41:16.000 hey, there's this one drug that
00:41:18.140 everybody should use for this situation,
00:41:20.100 it probably means that whoever sells that
00:41:22.480 drug did a good job of managing the
00:41:26.340 search engine optimization so that their
00:41:28.940 thing comes up and anything else doesn't
00:41:30.940 come up.
00:41:31.340 Now, as you know, I gave you a direct
00:41:34.240 example of this the other day where
00:41:36.420 there's something I have personal knowledge
00:41:38.200 of because I was in fact cured by surgery on
00:41:41.920 my throat.
00:41:42.440 But if you searched for my condition, good luck
00:41:46.160 going way down the search to find the known
00:41:50.080 cure.
00:41:51.860 Instead, you will find out that Botox is the
00:41:54.780 gold standard of treatment, which in fact is
00:41:59.020 by far a less good cure than the surgery.
00:42:06.080 Now, the surgery doesn't work for every person, so
00:42:08.680 there's some risk to it.
00:42:10.240 But the Botox doesn't work for every person
00:42:12.400 either.
00:42:13.620 So we do know with some certainty, I can
00:42:17.280 tell you from personal experience, that the
00:42:21.560 search engine optimization for drugs that work
00:42:24.780 doesn't work.
00:42:26.500 So that the search engines are all, they're all
00:42:31.260 owned.
00:42:32.400 They've essentially been neutered by the drug
00:42:34.780 companies.
00:42:35.600 So if there were a drug, and I'm not saying there
00:42:38.160 is, this is just one story, you
00:42:42.560 wouldn't know about it.
00:42:45.240 So your social media would prevent you from
00:42:47.920 knowing about cures that would totally change your
00:42:50.780 life, like really change your life.
00:42:53.420 Imagine having ADHD that wasn't properly
00:42:57.340 treated, and that you wouldn't know that there
00:43:00.740 was something out there that could help you.
00:43:02.180 Maybe.
00:43:02.740 Again, don't take my word for it.
00:43:04.540 I'm not a doctor.
00:43:06.600 But think about the level of evil that we're seeing,
00:43:10.240 because when it comes to medical stuff, regular
00:43:14.180 marketing shouldn't even be legal.
00:43:18.520 Because the way you market a normal
00:43:20.820 consumer product is you say, ours is the
00:43:23.400 best, and the other person sucks.
00:43:26.900 And that doesn't even necessarily need to be
00:43:28.860 true.
00:43:29.760 You might not have the best one.
00:43:31.740 It might be overpriced.
00:43:33.540 Your competition might have a better one.
00:43:35.320 But we're kind of used to that when it's just
00:43:36.940 some consumer good.
00:43:38.740 But when it comes to life and death, where
00:43:42.220 your quality of life is below the worth-living
00:43:45.640 point, but it could be above it if you simply
00:43:49.160 had a little information about what would
00:43:51.100 make you better,
00:43:51.720 the search engines will just bury that.
00:43:55.460 Not intentionally, perhaps.
00:43:57.060 Maybe it's just gaming by the people who
00:44:00.080 are trying to game it.
00:44:01.540 But it's just so unethical.
00:44:05.260 It's almost hard to keep it in your head.
00:44:07.880 Anyway, there's this new book on Musk.
00:44:13.500 It's a biography by Walter Isaacson.
00:44:16.780 And Musk himself must be happy about it because
00:44:19.600 I saw him sending a post in which he was
00:45:23.220 recommending an interview with Walter Isaacson
00:45:25.960 without even seeing it, because he said everything
00:44:29.160 that guy says is kind of interesting.
00:44:31.180 So they like each other, apparently.
00:44:34.100 So that, that's good to know.
00:45:37.120 Just when you're trying to evaluate
00:44:39.260 how true the biography is, it's good to know that
00:44:42.020 the biographer and the subject got along.
00:45:45.620 And, yeah, I guess Walter Isaacson was talking
00:45:49.240 to Lex Fridman
00:45:51.760 on a podcast, which would probably be quite
00:45:53.960 interesting, I would imagine.
00:44:54.960 But there's a few anecdotes coming out.
00:44:57.600 And one of the, one of them is about the relationship
00:45:00.120 between Elon Musk and Bill Gates, which you might
00:45:04.180 know is not ideal.
00:45:06.640 Now, the funniest thing about this was that they
00:45:09.420 didn't always know each other.
00:45:12.020 Isn't that weird?
00:45:13.120 That only a few years ago, they seem to have met
00:45:15.340 for the first time.
00:45:17.300 How is that even possible?
00:45:19.500 How did Elon Musk and Bill Gates not end up in
00:45:22.540 the same room just naturally a bunch of times
00:45:26.320 without even trying?
00:45:30.480 Yeah.
00:45:32.900 So, by the way, my Triggernometry podcast is up
00:45:36.480 today, apparently.
00:45:38.500 Triggernometry.
00:45:39.120 Look for that.
00:45:39.780 If you don't know it, it's one of the best podcasts.
00:45:44.620 So here are some things we learned from the biography.
00:45:48.180 That Musk was really mad at Gates for shorting Tesla
00:45:52.400 stock.
00:45:53.440 Now, if you're not an investor, a short is a bet
00:45:57.320 that the stock will go down.
00:45:59.960 And then you can make money by betting against the stock.
00:46:02.960 So you can either buy a stock and make money when it
00:46:05.860 goes up, or you can do what's called selling short,
00:46:09.480 where you make money when it goes down.
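To make the arithmetic of a short concrete, here's a minimal sketch in Python; the prices, share count, and the short_profit helper are illustrative assumptions, not figures from the show.

# Minimal sketch of short-sale arithmetic (illustrative numbers only).
# A short seller borrows shares and sells them now, hoping to buy them
# back cheaper later; profit is the price drop times the share count.
def short_profit(entry_price: float, exit_price: float, shares: int) -> float:
    """Profit (negative = loss) on a short, ignoring fees and borrow costs."""
    return (entry_price - exit_price) * shares

# Short 100 shares at $250 each.
print(short_profit(250.0, 200.0, 100))  # stock falls to $200: 5000.0 profit
print(short_profit(250.0, 300.0, 100))  # stock rises to $300: -5000.0 loss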
00:46:12.060 But that's considered putting pressure on a stock.
00:46:15.340 Like if somebody takes a big short position, it makes other people
00:46:18.480 say, well, what do you know that I don't know?
00:46:20.260 So it puts a big pressure on the stock price.
00:46:22.120 So it's kind of a messed up thing to do to the person
00:46:26.220 who owns a lot of stock in that company.
00:46:29.680 And so Musk apparently challenged him when Gates wanted
00:46:33.280 to bring one of his children to watch a launch.
00:46:37.680 Here's a subcategory, or part of the story that's also
00:46:40.780 interesting.
00:46:41.860 When Gates wanted to schedule with Musk to bring Gates' kid to
00:46:47.160 see a launch or see a rocket or something, Musk told him that he'd
00:46:51.560 fired his schedulers because he didn't like them having too much
00:46:54.900 control over his schedule.
00:46:56.260 So to schedule with him, you'd have to call him directly,
00:46:59.260 like, call his phone number.
00:47:03.540 So Bill Gates wanted to schedule it and have his secretary set it
00:47:07.060 up.
00:47:07.260 But he couldn't send his secretary to call Musk directly because that
00:47:12.680 would just sort of be inappropriate.
00:47:14.980 So he had to schedule it himself.
00:47:17.440 So Bill Gates called Musk and scheduled it.
00:47:21.720 So apparently when they got together, there was some disagreement
00:47:27.420 about whether there could ever be a Tesla semi-truck.
00:47:35.460 I just saw one.
00:47:37.500 So Gates said it would be impossible to use battery power to make
00:47:41.560 the big semi-trucks.
00:47:45.060 And I think Musk was saying that they were already running at the time
00:47:48.220 that Gates said they were impossible.
00:47:49.980 They were, like, already up and running.
00:47:52.080 And, you know, and now they're in lots of places.
00:47:54.960 I saw one yesterday or the other day.
00:47:57.420 So I guess Gates was completely wrong about the capability of the
00:48:03.920 batteries.
00:48:05.220 And it turns out that Musk knew a lot more about batteries than
00:48:10.000 Gates did.
00:48:10.820 Surprise.
00:48:12.020 And that Gates was just totally wrong about everything about Tesla
00:48:16.820 and its potential.
00:48:19.700 Now, Musk's problem is that Gates is not just a guy who's investing
00:48:24.180 to make money.
00:48:24.860 And so, reportedly, Musk asked Gates why he would have a short
00:48:30.020 on the one company that is doing the most to get us away from fossil fuels.
00:48:38.600 And Gates basically said he did it to make money.
00:48:42.900 He bet against the most important anti-fossil fuel company,
00:48:48.660 the one that would be doing the most to make fossil fuels unnecessary.
00:48:53.460 He bet against it in a way that could damage it, like, severely to make money.
00:49:01.960 And he said it directly.
00:49:05.880 Yeah.
00:49:06.520 He just did it to make money.
00:49:07.560 And then I guess the quote from Musk after learning this,
00:49:18.520 in a note to somebody, was:
00:49:20.600 at this point, I am convinced that he, meaning Gates,
00:49:23.800 is categorically insane, and then parenthetically,
00:49:28.400 and an asshole to the core.
00:49:29.860 And Musk added, I did actually want to like him, sigh.
00:49:37.640 So, in person, Elon Musk says he's either crazy
00:49:41.540 or just an asshole who doesn't care about anything.
00:49:45.500 So, now I know you're going to extend this,
00:49:59.960 and you're going to say, but Scott,
00:50:01.700 you said he doesn't invest just to make money.
00:50:05.080 You said he's, you know, using his fortune
00:50:07.120 to make the world a better place,
00:50:08.780 and haven't I been telling you that he's evil
00:50:11.600 and he's trying to destroy the world?
00:50:12.980 I don't think you can use this example to make that case.
00:50:17.200 Here's why.
00:50:18.820 You don't think that the previous richest person in the world
00:50:22.180 has any bad feelings about the current richest person in the world?
00:50:26.600 You don't think that Gates and Musk have a personal situation
00:50:31.080 that is primarily what's driving them,
00:50:34.620 and that has nothing to do with anything else
00:50:36.600 that Gates does anywhere else?
00:50:39.460 Yeah.
00:50:39.900 To me, it just looked personal.
00:50:41.160 It just looked like Gates lost his mojo to Musk.
00:50:46.200 Musk is now the better Gates, basically.
00:50:50.440 Because when I grew up, Bill Gates was the standard
00:50:53.380 of high-tech entrepreneurial success,
00:50:56.760 but he's not at the moment.
00:50:59.020 At the moment, that's Musk.
00:51:00.700 So, imagine going most of your life being the guy,
00:51:03.800 like you're the guy for technology,
00:51:05.840 and now there's another "the guy,"
00:51:07.440 and you're getting all this trouble
00:51:10.300 for your personal behavior, etc.
00:51:13.280 I don't know.
00:51:13.980 To me, it just looked personal.
00:51:15.800 I think that Gates would have been happy
00:51:17.760 to see a Tesla go down
00:51:19.640 because it would have made Gates look smarter.
00:51:23.940 But I don't think you could take that,
00:51:25.900 which is pretty ugly,
00:51:27.720 and extend it to say the reason he's making toilets in Africa
00:51:31.120 is to make money.
00:51:32.740 That's a pretty big stretch.
00:51:37.760 But people are complex,
00:51:40.180 so they're neither all good nor all bad.
00:51:46.320 Yeah.
00:51:47.280 Yeah, it just seems personal to me.
00:51:48.780 So, I'd say, I doubt he's insane,
00:51:53.480 but asshole to the core appears to be consistent
00:51:56.360 with what people have said about Gates from the first days.
00:51:59.980 I would never argue that he's not an asshole.
00:52:03.040 You didn't hear that, did you?
00:52:04.540 I've never made that case.
00:52:06.540 I'm not saying that if you were there in person,
00:52:09.860 he'd be a good guy.
00:52:12.480 I doubt it.
00:52:13.400 Last night I was watching
00:52:19.320 a special on Frank Sinatra.
00:52:22.660 And here's something I didn't know about Frank Sinatra.
00:52:27.020 In my opinion, his greatest accomplishments
00:52:29.520 we never talk about.
00:52:31.260 Like, if you think of Frank Sinatra,
00:52:32.880 you think, oh, you know, he's a great singer,
00:52:35.600 and he was in movies and successful and stuff.
00:52:38.260 But it turns out that his maybe most awesome accomplishments
00:52:42.900 had nothing to do with those things.
00:52:44.880 Those things allowed him to be Frank Sinatra
00:52:47.020 so he could do the other things.
00:52:49.040 For example, his little Rat Pack,
00:52:52.120 you know, his group of guys,
00:52:53.620 included Sammy Davis Jr.
00:52:58.680 Now, in the days, in the 50s,
00:53:01.540 Sammy Davis Jr. couldn't even stay at the hotels
00:53:04.480 where they performed.
00:53:07.500 He wasn't allowed to stay there
00:53:08.940 or to eat there or something.
00:53:10.940 You know, because the racism was that bad.
00:53:14.020 But because Sinatra was so powerful,
00:53:17.820 he would just go in and say,
00:53:19.240 yes, you are.
00:53:20.220 You're absolutely going to change that.
00:53:22.960 You know, no discrimination when I'm here.
00:53:25.100 Sammy is going to be treated like everybody else.
00:53:26.920 And then he would just force people to do it.
00:53:29.320 So he was one of the biggest forces
00:53:31.080 for, you know, realistic diversity
00:53:34.440 in entertainment, at least,
00:53:36.500 and also the world.
00:53:38.000 So he basically went to the mat
00:53:39.660 against discrimination.
00:53:43.520 But there was another story
00:53:44.680 that I hadn't heard until I watched this.
00:53:46.880 So apparently at the time,
00:53:48.400 Sinatra had moved to Palm Springs,
00:53:51.460 I think it was,
00:53:53.500 and other rich people
00:53:55.940 wanted to spend some time there too.
00:53:57.600 It was just a good place for rich people.
00:53:59.220 But a lot of the rich Jewish actors
00:54:03.900 and Hollywood people
00:54:05.340 couldn't even join
00:54:07.220 the big golf club in Palm Springs.
00:54:11.520 I don't know how many there were,
00:54:12.540 but there was one big one
00:54:13.860 that they would want to be a member of,
00:54:16.140 but they were not allowed.
00:54:17.960 So, since they were rich enough,
00:54:19.680 they decided to start their own golf club
00:54:22.340 so that all the Jewish members could play.
00:54:25.520 But it wouldn't be restricted.
00:54:26.760 So you didn't have to be Jewish,
00:54:28.880 it's just that they wouldn't restrict anybody.
00:54:30.620 So it ended up being, you know,
00:54:32.140 lots of Jewish members.
00:54:33.800 Well, Frank Sinatra, who was not Jewish,
00:54:36.480 actually moved his home
00:54:39.760 onto the golf course,
00:54:41.760 so he
00:54:44.120 was basically endorsing, you know,
00:54:47.340 this new arrangement
00:54:48.440 that was not discriminating
00:54:50.080 against his friends.
00:54:51.860 And apparently, especially
00:54:56.480 for the area,
00:54:57.460 it was a big statement.
00:54:59.200 Like, his statement was,
00:55:00.700 if this club isn't going to let my friends in,
00:55:03.320 you're not going to see me there.
00:55:05.460 And seeing Frank Sinatra
00:55:06.780 was sort of a big deal.
00:55:09.080 So he put the hammer down
00:55:11.560 on two very, you know,
00:55:14.280 notable situations
00:55:16.200 in which he would not tolerate
00:55:18.300 discrimination.
00:55:20.400 And to me,
00:55:21.140 those were his greater accomplishments.
00:55:23.760 But this is not even why
00:55:24.760 I was going to talk about him.
00:55:26.320 I was going to talk about him
00:55:27.580 because he was also
00:55:29.000 a miserable drunk.
00:55:30.940 And apparently he was just terrible
00:55:32.500 to people when he was drunk.
00:55:34.040 And he really liked to drink.
00:55:35.960 So it wasn't that uncommon.
00:55:38.680 But when you watch the people
00:55:40.020 talk about him,
00:55:40.760 they talked like they were
00:55:41.580 seeing a god.
00:55:42.900 It's like, oh my god,
00:55:43.700 he was so nice.
00:55:44.600 And apparently he did just an
00:55:46.020 incredible amount of charity work,
00:55:48.940 like for individuals
00:55:50.700 but also for organizations.
00:55:53.120 He did like just all kinds
00:55:55.620 of charity singing.
00:55:56.800 He said yes to everything.
00:55:58.220 So he was in fact
00:55:59.280 a genuine charitable guy
00:56:02.560 who did things which I think
00:56:04.380 could have changed history.
00:56:07.180 I mean, it might be that important.
00:56:10.800 But he could also be
00:56:12.480 a pretty mean guy.
00:56:14.460 So how do you handle the fact
00:56:17.080 that somebody is
00:56:18.200 sometimes terrible
00:56:19.380 and sometimes an angel?
00:56:22.060 He can be both guys.
00:56:24.500 He can be both people.
00:56:27.060 And I think you have to keep
00:56:27.960 that in mind
00:56:28.480 when you're looking at
00:56:29.340 anybody like Bill Gates.
00:56:32.700 So if you want to say
00:56:33.800 that he's like pure evil,
00:56:35.540 that doesn't feel likely to me.
00:56:37.920 That feels really unlikely.
00:56:39.620 But if you told me
00:56:40.460 that everything he does
00:56:41.460 is noble
00:56:42.060 and for the good of humanity,
00:56:44.320 I would say
00:56:44.780 clearly not.
00:56:47.340 Clearly not.
00:56:49.420 But he could be both.
00:56:51.300 He could be one of the most
00:56:52.440 useful people to society
00:56:53.940 and also an asshole.
00:56:56.540 There's no contradiction there.
00:56:58.460 He could be both.
00:57:00.480 I speak from personal experience.
00:57:04.140 I like to do things
00:57:05.380 that help the world.
00:57:06.720 But am I flawless?
00:57:09.180 No.
00:57:10.960 Obviously.
00:57:12.060 All right.
00:57:14.420 People are complex.
00:57:15.380 That's my only point.
00:57:19.220 Do you remember when Forbes,
00:57:21.120 the publication,
00:57:22.340 was a respectable publication?
00:57:26.040 Oh, my God.
00:57:27.220 I don't know what happened.
00:57:29.840 But, oh, my God.
00:57:32.300 So there's an article here
00:57:33.760 about how
00:57:34.540 the rich are more unethical
00:57:38.680 than the poor.
00:57:39.460 The rich are more unethical
00:57:42.820 and immoral
00:57:43.400 than the poor.
00:57:44.500 It's in Forbes.
00:57:46.800 It's in Forbes.
00:57:49.440 And here are the examples,
00:57:52.420 without any
00:57:54.960 believable data
00:57:56.720 to back them:
00:57:58.400 that the rich have
00:58:00.240 a lax attitude
00:58:01.140 toward the rules.
00:58:02.600 So they're more prone
00:58:03.780 to break laws
00:58:04.620 while driving,
00:58:06.360 hinting at a belief
00:58:07.260 that certain rules
00:58:08.120 might not apply to them.
00:58:11.540 Yeah,
00:58:12.180 because when I'm afraid
00:58:13.780 of a crime,
00:58:14.960 for example,
00:58:15.860 I don't want to be
00:58:17.000 around rich people.
00:58:19.100 Yeah.
00:58:19.720 No, if you want
00:58:20.400 to be free from crime,
00:58:21.400 go to the poorest
00:58:22.180 part of the country
00:58:23.040 because that's where
00:58:24.900 people are
00:58:25.440 morally strong.
00:58:28.060 You can leave
00:58:28.760 your door unlocked.
00:58:29.860 Just go to any inner city.
00:58:31.360 Just leave your door unlocked
00:58:32.400 because there are no rich people
00:58:33.820 there to take your stuff
00:58:34.760 because they're all immoral,
00:58:35.760 those rich people.
00:58:36.340 The rich also,
00:58:38.440 according to the Forbes article,
00:58:40.760 it's an opinion piece,
00:58:41.700 I guess,
00:58:43.080 have skewed ethical compasses.
00:58:45.280 Yeah,
00:58:45.620 because that's something
00:58:46.360 that the rich have
00:58:47.160 but not the poor.
00:58:48.740 They're inconsiderate,
00:58:50.280 yeah,
00:58:50.620 the rich are inconsiderate,
00:58:52.020 unlike the poor
00:58:52.800 who just,
00:58:55.380 they're so considerate.
00:58:57.680 And the rich have
00:58:58.680 something called
00:58:59.220 an overemphasis on winning,
00:59:01.620 whether in games or work.
00:59:04.680 Yeah,
00:59:05.980 those stupid rich people,
00:59:07.540 they have an overemphasis
00:59:08.600 on winning.
00:59:11.020 So you wouldn't want
00:59:11.980 any of that around you,
00:59:12.860 would you?
00:59:13.900 Let me get away
00:59:14.640 from those winners.
00:59:15.680 My God.
00:59:18.400 Now,
00:59:19.220 do I even need to
00:59:21.020 add commentary to this?
00:59:23.600 This is Forbes.
00:59:25.880 And they're basically saying,
00:59:27.220 indirectly,
00:59:28.120 kill rich people.
00:59:29.940 I mean,
00:59:31.540 all the rich people
00:59:32.900 are the bad people.
00:59:34.320 So if you did something
00:59:35.360 bad to a rich person,
00:59:38.560 well,
00:59:38.820 that would not even
00:59:39.720 be immoral,
00:59:40.380 would it?
00:59:41.080 Because they have it coming.
00:59:42.300 They're all bad people.
00:59:45.440 My God.
00:59:49.040 Do you think that
00:59:50.140 the left is trying
00:59:52.200 to kill Trump
00:59:53.100 and Musk,
00:59:55.080 like actually kill them?
00:59:56.140 No.
00:59:59.940 I don't know
01:00:02.320 if anybody's had
01:00:03.040 a meeting
01:00:03.520 where they said
01:00:04.180 those words,
01:00:05.580 but if you were
01:00:06.500 to judge by actions,
01:00:08.760 it looks like it.
01:00:11.120 It looks exactly
01:00:12.280 like it.
01:00:13.920 You know,
01:00:14.520 you want to hear it?
01:00:15.100 Here's my conspiracy theory.
01:00:17.660 I'm going to give you
01:00:18.680 the conspiracy theory
01:00:20.020 of all conspiracy theories.
01:00:21.780 I will not say
01:00:22.860 that I believe
01:00:23.460 my own conspiracy theory,
01:00:24.760 and therefore
01:00:25.580 you shouldn't.
01:00:26.240 But when a situation
01:00:28.800 is created,
01:00:30.700 we are led
01:00:31.600 to a certain set
01:00:32.420 of conclusions
01:00:33.060 whether you like it
01:00:33.920 or not.
01:00:35.720 And the thing
01:00:36.540 that's been bugging me
01:00:37.380 for a long time
01:00:38.180 is the UFO stories.
01:00:41.080 And I've told you
01:00:42.000 before that
01:00:42.640 if I were in charge
01:00:43.700 of the,
01:00:44.420 let's say,
01:00:44.880 some massive
01:00:45.440 brainwashing operation,
01:00:47.240 I would want to know
01:00:48.520 if I could test
01:00:49.580 when the public
01:00:51.320 is ready
01:00:51.840 for the big one.
01:00:53.080 And the UFOs
01:00:55.400 would be this
01:00:55.880 nice little silly story
01:00:57.220 that no matter
01:00:58.280 which way it went,
01:00:59.060 it would just be
01:00:59.520 kind of funny.
01:01:00.720 So it wouldn't hurt you.
01:01:02.380 So you put this
01:01:03.000 UFO story out
01:01:04.080 and then you wait
01:01:04.640 a few weeks
01:01:05.240 and you see
01:01:06.420 how many people
01:01:07.180 believed it.
01:01:08.640 Now,
01:01:09.180 I don't know
01:01:09.720 the answer to that.
01:01:10.500 Can you tell me
01:01:11.120 what percentage
01:01:11.820 of the public
01:01:12.320 believed the UFO story?
01:01:14.520 Less than half?
01:01:17.980 Not 25%.
01:01:19.240 But it was fewer
01:01:20.660 than half,
01:01:21.260 was it not?
01:01:22.320 Fewer than half
01:01:23.020 believed it was true?
01:01:25.480 I need a fact check
01:01:26.580 on that.
01:01:26.980 You're just guessing.
01:01:28.220 All right.
01:01:28.920 So,
01:01:29.940 here's my
01:01:30.840 conspiracy theory.
01:01:33.620 That if 75%
01:01:35.100 of the country
01:01:35.660 had believed
01:01:36.260 the UFO story,
01:01:38.000 then the brainwashers
01:01:39.740 in charge
01:01:40.200 would know
01:01:40.760 that we're ready
01:01:41.640 for the big one.
01:01:43.260 And the big one
01:01:43.920 would be assassination.
01:01:46.120 The big one
01:01:46.900 would be assassination.
01:01:49.340 And that
01:01:50.040 because the UFO story
01:01:51.580 was not bought
01:01:52.860 hook,
01:01:53.360 line,
01:01:53.600 and sinker,
01:01:54.100 they pulled
01:01:55.460 back on the plan
01:01:56.560 to kill Trump.
01:02:00.120 And they may have
01:02:01.020 the same plan
01:02:02.140 for Musk
01:02:03.360 for all I know.
01:02:04.500 But,
01:02:04.820 no,
01:02:05.120 I'm not saying
01:02:05.520 this is true.
01:02:06.820 Not saying
01:02:07.480 this is true.
01:02:08.420 I'm just saying
01:02:09.300 that if you remove
01:02:10.360 any trust in media,
01:02:13.060 we're all left
01:02:14.380 to our own devices.
01:02:16.100 So we're all
01:02:16.800 going to look
01:02:17.120 for patterns
01:02:17.720 and then we're
01:02:18.500 going to think
01:02:18.820 those patterns
01:02:19.380 mean something.
01:02:20.000 They don't.
01:02:22.060 A lot of times
01:02:22.940 the patterns
01:02:23.360 are just
01:02:23.800 accidental patterns.
01:02:27.340 But I'll tell you
01:02:28.180 the pattern I see
01:02:29.160 with the Forbes
01:02:31.480 thing.
01:02:32.960 This is basically
01:02:33.900 trying to
01:02:35.120 change the public's
01:02:37.140 opinion,
01:02:37.500 how they think
01:02:39.420 about rich people.
01:02:41.220 Now,
01:02:41.440 who do you think
01:02:41.980 of when you think
01:02:42.620 of rich people?
01:02:44.500 Trump
01:02:44.940 and Musk.
01:02:45.740 They're defined
01:02:48.000 by being rich people.
01:02:50.420 So if you can
01:02:51.060 make rich people
01:02:51.940 look like they're
01:02:52.960 the bad ones
01:02:53.700 and you can make
01:02:55.480 the public believe
01:02:56.620 anything,
01:02:57.260 including UFOs,
01:02:58.960 you can just
01:03:00.560 kill the rich people
01:03:01.520 if they're in your way.
01:03:04.080 And then you can
01:03:04.740 tell the public
01:03:05.340 that it was an accident
01:03:06.300 and 75%
01:03:08.320 will believe.
01:03:10.240 25%
01:03:11.240 won't believe
01:03:11.860 but it's not
01:03:12.400 enough to change it.
01:03:13.140 The 25%
01:03:14.940 was probably
01:03:15.940 how many believed
01:03:16.580 that Kennedy
01:03:17.080 was killed
01:03:18.260 by the CIA.
01:03:19.620 For most of my life,
01:03:21.500 probably 25%
01:03:22.660 of the public
01:03:23.140 thought the CIA
01:03:24.040 killed Kennedy.
01:03:27.560 Right?
01:03:29.160 So that was enough
01:03:30.260 that society
01:03:31.740 did not rise up
01:03:32.880 because 25%
01:03:34.240 just isn't enough.
01:03:35.760 75% said,
01:03:37.040 let's just move on.
01:03:38.280 Probably one gunman.
01:03:40.180 Right?
01:03:40.540 So that's what
01:03:41.240 scares me about
01:03:41.960 the UFO story.
01:03:43.680 It looks like
01:03:44.420 it's planted.
01:03:45.400 It didn't look
01:03:46.000 organic to me.
01:03:47.260 And it looked like
01:03:48.240 there would be
01:03:48.540 some point to it.
01:03:50.140 And the only other
01:03:50.880 point I could think
01:03:51.720 would be to scare
01:03:52.700 China or Russia
01:03:54.460 into thinking
01:03:55.560 we have some
01:03:56.140 technology that
01:03:57.220 they don't know about.
01:03:59.200 Maybe.
01:04:00.020 I mean,
01:04:00.420 that's another
01:04:01.020 plausible explanation.
01:04:03.700 But it feels
01:04:05.820 like a test
01:04:06.540 to see if we're
01:04:07.940 ready to believe
01:04:09.220 literally anything.
01:04:10.180 You would have
01:04:12.560 to get us
01:04:13.000 to Kennedy
01:04:13.700 levels of
01:04:14.720 trust in the
01:04:16.760 system
01:04:17.140 before you
01:04:18.620 could listen
01:04:19.020 to the system
01:04:19.600 and say,
01:04:19.960 oh, no problem,
01:04:20.820 it was one bullet.
01:04:21.860 It was a magic
01:04:22.520 bullet, but it
01:04:23.040 was just one.
01:04:25.520 Yeah.
01:04:26.360 Because remember,
01:04:27.220 the public believed
01:04:27.960 the magic bullet
01:04:28.900 for most of my life.
01:04:32.060 They believed
01:04:32.680 there was a magic
01:04:33.300 bullet that just,
01:04:34.180 like,
01:04:34.420 went through them,
01:04:36.720 and they still do.
01:04:43.840 Yeah.
01:04:44.420 I don't know
01:04:44.780 how many bullets
01:04:45.340 there were,
01:04:45.780 but maybe we'll
01:04:48.180 never know.
01:04:49.480 So that's what
01:04:50.020 scares me.
01:04:55.260 And the study
01:04:57.560 says all these
01:04:58.560 unethical tendencies,
01:05:00.080 according to the
01:05:00.760 psychologist,
01:05:01.860 are why the rich
01:05:03.060 can succeed in life
01:05:05.780 but fail in love.
01:05:07.720 So the rich
01:05:08.560 fail in love
01:05:09.420 because they're
01:05:09.940 not trusted.
01:05:11.960 That's not why
01:05:12.820 the rich fail in love.
01:05:14.980 Would anybody
01:05:15.560 like me to
01:05:16.240 explain to them
01:05:17.040 why the rich
01:05:17.740 fail in love?
01:05:21.960 Is there anybody
01:05:22.660 who doesn't know
01:05:23.320 the obvious
01:05:24.020 answer to that?
01:05:28.340 Because we can
01:05:29.140 afford it.
01:05:31.740 We can afford it.
01:05:34.300 Why is this
01:05:35.100 a mystery?
01:05:35.780 Rich people
01:05:39.780 have the most
01:05:40.520 options.
01:05:42.160 So you have
01:05:42.440 the most
01:05:42.800 temptation.
01:05:45.060 And, let's say,
01:05:47.720 if you take the
01:05:48.500 classic situation
01:05:49.580 of a richer
01:05:50.400 guy who
01:05:51.120 marries somebody,
01:05:52.900 a woman who
01:05:54.200 marries a rich
01:05:54.760 guy doesn't
01:05:56.000 have to put up
01:05:56.600 with any shit
01:05:57.280 because she's
01:05:58.720 going to get
01:05:59.060 enough money
01:05:59.600 even with a
01:06:00.160 prenup.
01:06:01.100 Even with a
01:06:01.860 prenup,
01:06:02.400 there's going
01:06:02.920 to be some
01:06:03.280 bag of money
01:06:03.960 involved.
01:06:04.360 So all you
01:06:07.140 do is change
01:06:07.700 the financial
01:06:08.340 incentives and
01:06:09.300 you would get
01:06:09.700 this result.
01:06:10.740 And the
01:06:10.960 psychologist thinks
01:06:11.920 it has something
01:06:12.360 to do with
01:06:12.840 trust.
01:06:14.380 It has nothing
01:06:15.180 to do with
01:06:15.600 trust.
01:06:16.700 It has to do
01:06:17.340 with follow the
01:06:18.200 money.
01:06:18.600 Follow the
01:06:19.020 money works
01:06:19.480 100% of the
01:06:20.280 time.
01:06:22.340 And the
01:06:22.780 money suggests
01:06:23.540 that getting
01:06:24.560 married to a
01:06:25.180 rich guy and
01:06:25.800 then divorcing
01:06:26.460 the rich guy
01:06:27.020 is a really
01:06:27.520 good deal
01:06:28.120 because then
01:06:28.940 you could have
01:06:29.360 money plus a
01:06:30.220 guy you like
01:06:30.700 better.
01:06:31.780 Why wouldn't
01:06:32.240 everybody do
01:06:32.900 it?
01:06:33.020 I don't know
01:06:33.380 how any rich
01:06:33.960 people stay
01:06:34.440 married.
01:06:36.640 To me it
01:06:37.280 would be a
01:06:37.980 miracle if
01:06:38.580 anybody ever
01:06:39.120 stayed married
01:06:39.740 to a rich
01:06:40.820 person.
01:06:43.520 Yeah.
01:06:43.940 Because the
01:06:44.520 money suggests
01:06:45.160 that you
01:06:45.440 shouldn't.
01:06:46.400 So they
01:06:46.800 don't.
01:06:47.060 It's as
01:06:50.040 simple as
01:06:50.440 that.
01:06:51.540 All right.
01:06:52.240 The Ukrainians
01:06:53.060 have, I
01:06:54.700 didn't know
01:06:55.000 this, but
01:06:55.360 they've
01:06:55.680 developed a
01:06:56.660 substantial
01:06:58.020 weapons
01:06:59.300 manufacturing
01:07:00.220 capacity.
01:07:01.960 Didn't see
01:07:02.640 that coming.
01:07:03.480 But apparently
01:07:03.900 the Ukrainians
01:07:04.820 can make a
01:07:05.880 lot of
01:07:06.180 weapons now
01:07:06.840 locally.
01:07:08.700 Big ones.
01:07:09.540 We're talking
01:07:09.820 about artillery.
01:07:11.240 Now they're
01:07:11.660 made out of
01:07:12.300 plastic and
01:07:13.140 wood and they
01:07:13.680 don't fire any
01:07:14.480 shells.
01:07:15.520 They're used
01:07:16.280 for decoys.
01:07:18.060 So Ukraine
01:07:18.820 has this
01:07:19.300 massive decoy
01:07:20.400 military
01:07:21.040 industrial
01:07:21.980 process where
01:07:24.520 they're making
01:07:25.080 decoys to
01:07:28.060 use up the
01:07:28.740 Russian artillery.
01:07:31.220 So they're
01:07:32.040 all happy
01:07:32.480 about their
01:07:32.920 fake radars
01:07:36.060 and stuff.
01:07:37.220 So they just
01:07:37.900 make it out of
01:07:38.440 wood and they
01:07:38.920 spray paint it
01:07:39.660 and they wait
01:07:40.360 for the Russians
01:07:40.920 to destroy it.
01:07:43.180 Now, on one
01:07:45.160 hand, I think
01:07:45.940 this is awesome.
01:07:48.060 Pretty good
01:07:48.840 strategy.
01:07:49.540 I'm sure it
01:07:49.920 works.
01:07:51.260 It's using up
01:07:51.980 their weapons
01:07:53.180 on the other
01:07:53.580 side.
01:07:54.320 On the other
01:07:54.780 hand, I don't
01:07:56.060 know if I would
01:07:56.720 bet on the
01:07:57.640 group that
01:07:58.460 needed to make
01:07:59.240 wooden weapons.
01:08:02.320 I'm just going
01:08:03.000 to say, if
01:08:03.720 there's any way
01:08:05.100 to determine
01:08:06.720 who's winning
01:08:07.520 and who's
01:08:08.060 losing a war,
01:08:09.520 I would look
01:08:10.400 to the one
01:08:10.940 that had to
01:08:11.580 make weapons
01:08:12.500 out of cardboard.
01:08:13.140 If one
01:08:15.580 side is making
01:08:16.200 weapons out of
01:08:16.760 cardboard
01:08:17.380 and plastic
01:08:18.060 bottles, they
01:08:20.180 are not going
01:08:20.780 to be the
01:08:21.160 winner.
01:08:23.860 Right?
01:08:25.080 Again, I
01:08:25.840 remind you, I'm
01:08:26.620 no military
01:08:28.200 analyst.
01:08:30.380 But if I
01:08:31.100 were a military
01:08:31.820 analyst, I
01:08:33.880 would not place
01:08:34.460 a large bet on
01:08:35.680 the people who
01:08:36.180 are making
01:08:36.500 weapons out of
01:08:37.160 wood.
01:08:37.340 That's just
01:08:39.820 me.
01:08:41.360 Yeah.
01:08:44.560 They did the
01:08:45.300 same thing
01:08:45.760 during World
01:08:46.300 War I and
01:08:46.920 II.
01:08:47.180 They did.
01:08:48.740 They did.
01:08:54.900 Yeah, I'm
01:08:55.820 not saying it's
01:08:56.400 a bad strategy.
01:08:57.700 I'm just saying
01:08:58.600 it doesn't sound
01:08:59.380 like the winner's
01:09:00.200 strategy.
01:09:01.580 It sounds like
01:09:02.360 an admission that
01:09:03.200 the other side
01:09:03.760 has better
01:09:04.300 weapons.
01:09:05.360 Because I
01:09:07.460 don't think
01:09:07.780 Russia is doing
01:09:08.400 that.
01:09:09.040 Do you think
01:09:09.360 Russia is making
01:09:10.020 a bunch of
01:09:10.400 wooden artillery?
01:09:11.640 They don't need
01:09:12.400 to because they
01:09:12.940 have so much
01:09:13.560 of the real
01:09:13.960 stuff.
01:09:15.860 I mean, it's
01:09:16.280 just an indicator
01:09:16.980 of what's
01:09:18.540 happening there.
01:09:20.800 All right, I
01:09:21.520 was asked on
01:09:23.140 the locals
01:09:23.540 platform,
01:09:24.180 somebody asked
01:09:24.620 me, what's
01:09:25.500 the difference
01:09:25.900 between affirmations
01:09:27.220 and wishful
01:09:27.900 thinking?
01:09:29.500 So an
01:09:29.820 affirmation is
01:09:30.580 when you're
01:09:31.020 imagining the
01:09:31.880 thing you want
01:09:32.440 to be true.
01:09:34.200 As an aid to
01:09:35.280 making it
01:09:35.620 true.
01:09:36.880 And wishful
01:09:37.440 thinking is
01:09:38.060 just when you
01:09:39.440 wish you had
01:09:39.860 something.
01:09:40.860 And my
01:09:41.340 answer is
01:09:41.760 this, wishful
01:09:43.420 thinking is
01:09:44.120 just wanting.
01:09:46.640 Affirmations are
01:09:47.460 closer to
01:09:48.080 deciding.
01:09:49.640 Because if you
01:09:50.600 can chant or
01:09:51.620 write down or
01:09:52.480 visualize this
01:09:53.340 goal every
01:09:54.160 single day,
01:09:55.680 you've probably
01:09:56.440 already decided it.
01:09:57.720 Because you
01:09:58.460 just wouldn't
01:09:59.100 do that unless
01:10:01.320 you'd really
01:10:02.040 committed yourself
01:10:02.840 to it.
01:10:03.420 Because the
01:10:03.840 affirmations are
01:10:04.560 sort of the
01:10:05.000 last thing
01:10:05.560 you do.
01:10:06.720 Sometimes the
01:10:07.380 first.
01:10:08.020 But typically,
01:10:09.100 you know, you're
01:10:09.560 already trying to
01:10:10.460 do something in
01:10:11.080 that regard.
01:10:12.080 You're trying to
01:10:12.560 make something
01:10:12.960 happen.
01:10:13.760 And then you
01:10:14.180 say, all right,
01:10:14.740 how about
01:10:15.040 affirmations?
01:10:16.500 So affirmations
01:10:17.340 are a pretty
01:10:17.960 good sign,
01:10:19.560 because you're
01:10:20.040 going to do it
01:10:20.440 every day, that
01:10:21.780 you're committed
01:10:22.280 to this thing.
01:10:23.580 Wishful thinking
01:10:24.320 is like me
01:10:25.320 thinking, well,
01:10:26.720 you know, if I
01:10:27.400 were Elon Musk,
01:10:28.520 what would I do
01:10:29.220 today?
01:10:30.340 Right?
01:10:30.660 But I'm not
01:10:31.080 trying to make
01:10:31.720 myself Elon
01:10:33.440 Musk.
01:10:34.600 It's just, oh,
01:10:35.680 wouldn't that be
01:10:36.200 fun?
01:10:36.720 And then you
01:10:37.140 watch a movie and
01:10:37.980 you see somebody's
01:10:38.540 having a good
01:10:39.040 life, and you
01:10:39.520 say, oh,
01:10:40.020 wouldn't that be
01:10:40.520 fun?
01:10:41.320 But that's just
01:10:41.880 wanting.
01:10:43.860 And there's no
01:10:44.700 faith involved in
01:10:45.540 either case.
01:10:46.600 Let me be clear.
01:10:48.460 Believing and
01:10:49.340 having some faith
01:10:50.300 has nothing to do
01:10:51.540 with any wishful
01:10:52.440 thinking.
01:10:53.180 There's no faith
01:10:54.060 required for
01:10:54.640 wishful thinking.
01:10:55.600 But there's also no
01:10:56.220 faith or belief
01:10:57.840 required for
01:10:59.000 affirmations.
01:10:59.700 The idea behind
01:11:01.780 affirmations, which
01:11:04.800 may not even be
01:11:04.800 true, is that the
01:11:06.880 simple process of
01:11:08.240 concentrating on a
01:11:09.180 goal makes it more
01:11:10.040 likely to happen.
01:11:11.300 There are several
01:11:12.320 ways that could be
01:11:13.120 true.
01:11:14.740 Way number one is
01:11:16.860 that there's some
01:11:19.220 backwards causation
01:11:20.420 happening.
01:11:21.820 And the way that
01:11:22.460 works is if you can
01:11:23.560 spend 15 minutes a
01:11:25.220 day focusing on
01:11:27.300 your objective, it
01:11:29.620 probably means you
01:11:30.340 already decided to
01:11:31.280 do whatever it
01:11:31.980 takes to get
01:11:32.500 there.
01:11:33.400 So it's the
01:11:34.100 deciding whatever it
01:11:35.160 takes to get there
01:11:35.820 that was probably the
01:11:36.700 operational part.
01:11:38.420 But by coincidence,
01:11:39.740 there's a correlation
01:11:40.540 between people who
01:11:42.280 are going to do
01:11:42.740 whatever it takes and
01:11:44.780 people who would also
01:11:45.540 do an affirmation.
01:11:47.240 So it could be the
01:11:48.120 affirmations are just
01:11:49.080 attached to people who
01:11:51.200 are going to do
01:11:51.720 whatever it takes so
01:11:53.100 that you would see a
01:11:53.860 correlation between
01:11:55.960 affirmations and
01:11:56.900 success.
01:11:56.900 And that would be the
01:11:57.600 least magical
01:11:58.420 explanation.
01:11:59.820 That would just be
01:12:00.680 pretty straightforward.
01:12:02.560 People who do one
01:12:03.780 thing are likely to do
01:12:04.960 the other thing as
01:12:05.720 well.
01:12:06.940 So it's not the other
01:12:07.760 thing that's predictive,
01:12:08.840 it's just associated.
01:12:10.800 So that could explain
01:12:11.940 it all.
01:12:13.180 I would be one of
01:12:14.200 those personalities.
01:12:15.920 So my personality is
01:12:17.140 once I've decided to do
01:12:18.300 something, I'm going to
01:12:19.720 do all of the things.
01:12:21.620 I'm going to do all of
01:12:22.680 the things.
01:12:23.280 And one of them might be
01:12:24.120 affirmations, whether I
01:12:25.980 think it works or not.
01:12:27.640 Because once I've
01:12:28.420 decided, I'm going to
01:12:30.000 do all of the things.
01:12:31.640 I'm not going to leave
01:12:32.500 out any of the things.
01:12:34.200 Like, if they might
01:12:35.060 work, all right, try it.
01:12:38.620 So that could be one
01:12:39.540 reason they appear to
01:12:40.420 work.
01:12:41.060 Another would be that
01:12:42.140 there's something
01:12:44.000 called reticular
01:12:45.680 activation.
01:12:47.040 I didn't make this up.
01:12:48.420 And it has to do with
01:12:49.280 the fact that you notice
01:12:50.140 things that are relevant
01:12:51.080 to you.
01:12:52.340 And repeating a goal
01:12:53.920 makes something more
01:12:54.800 relevant to you because
01:12:55.760 it's at the front of
01:12:56.800 your mind.
01:12:58.060 So whatever's at the
01:12:58.980 front of your mind tends
01:13:00.600 to be the filter where
01:13:02.360 you notice things in your
01:13:03.500 environment.
01:13:04.580 For example, if your
01:13:07.120 best friend bought a
01:13:08.380 certain color and model
01:13:09.720 of car, from that
01:13:11.960 moment on, the moment
01:13:13.340 after you saw your
01:13:14.100 friend bought this car,
01:13:15.300 a certain color model,
01:13:16.720 you'll see that car
01:13:17.760 everywhere.
01:13:18.880 You get on the
01:13:19.540 highway, there's one,
01:13:20.340 there's one, and you
01:13:21.180 never noticed before.
01:13:22.720 It's because your
01:13:23.560 friend's car moved to
01:13:25.300 the front of your mind
01:13:26.140 and it became your
01:13:26.860 filter, and now you're
01:13:28.260 seeing things that you
01:13:29.080 just wouldn't have
01:13:29.600 noticed before because
01:13:30.500 you changed your
01:13:31.080 filter.
01:13:31.920 That's how it works.
01:13:33.220 Now, suppose that you
01:13:35.080 did an affirmation about
01:13:36.180 becoming a famous
01:13:37.920 cartoonist, and you
01:13:40.400 didn't know how to do
01:13:41.100 it.
01:13:42.500 Would you notice some
01:13:44.200 opportunities that you
01:13:45.960 wouldn't have noticed
01:13:46.620 before?
01:13:47.320 Because you set that in
01:13:48.400 the front of your mind.
01:13:49.120 Well, that's what made
01:13:50.460 me a cartoonist.
01:13:51.980 I literally was doing
01:13:53.280 the affirmations but
01:13:54.380 didn't know how to
01:13:54.920 become one, and I
01:13:56.480 turned on the TV and
01:13:57.700 I noticed, would I have
01:13:59.640 noticed before?
01:14:01.100 Don't know.
01:14:01.940 I noticed there was a
01:14:02.840 TV show telling people
01:14:04.000 how to be a cartoonist.
01:14:05.900 Now, I missed the show
01:14:06.900 and I wrote a letter and
01:14:08.100 asked some questions and
01:14:09.100 that got me started.
01:14:10.700 But would I have even
01:14:12.740 noticed if I hadn't put
01:14:14.800 that filter in the front
01:14:15.780 of my mind to become a
01:14:17.240 cartoonist?
01:14:17.740 No, no.
01:14:18.940 So it could be just
01:14:19.980 filtering for opportunity.
01:14:22.780 There's a study by Dr.
01:14:24.580 Wiseman, who studied
01:14:26.700 luck.
01:14:28.060 He wanted to see if
01:14:28.940 there was such a thing
01:14:29.640 as luck.
01:14:30.920 There isn't.
01:14:32.040 So it turns out nobody
01:14:33.100 can guess things better
01:14:34.360 than other people.
01:14:35.700 But he did this study, I
01:14:36.920 talk about it all the
01:14:37.560 time, where he asked
01:14:39.040 people to sort themselves
01:14:41.820 by optimists and
01:14:43.380 pessimists.
01:14:43.920 And then he gave them
01:14:45.640 all the same task, which
01:14:46.960 was to count the number
01:14:47.900 of photographs in a
01:14:49.300 newspaper.
01:14:50.800 So the pessimists
01:14:52.280 counted the number of
01:14:53.760 photographs, and on
01:14:55.380 average they got the
01:14:56.100 right number.
01:14:56.760 Let's say that was 42,
01:14:58.660 just to make up a
01:14:59.480 number.
01:15:00.780 But it would take them
01:15:01.680 several minutes to go
01:15:02.860 through the newspaper and
01:15:03.800 count them all up, but
01:15:04.620 they'd get the right
01:15:05.160 answer.
01:15:06.500 The optimists would also
01:15:08.140 get the right answer, but
01:15:10.240 it would take them
01:15:10.800 seconds.
01:15:11.380 They were done in
01:15:12.940 seconds, but it took
01:15:14.900 the others minutes.
01:15:17.180 And the reason that the
01:15:18.000 optimists were done in
01:15:19.420 seconds, on average,
01:15:21.820 again, this is just on
01:15:22.760 average, is that the
01:15:24.300 optimists noticed that on
01:15:25.980 page two of every
01:15:26.920 newspaper, in big
01:15:28.320 writing, it said, stop
01:15:30.820 counting the photographs,
01:15:32.340 there are 42.
01:15:34.220 It was right there in big
01:15:35.440 writing.
01:15:36.300 Stop counting them,
01:15:37.180 there are 42.
01:15:39.100 Now, the pessimists
01:15:40.720 were not expecting any
01:15:42.620 luck.
01:15:43.760 That's what pessimists
01:15:44.700 are.
01:15:45.600 So they just said,
01:15:46.960 one, two, count the
01:15:49.040 photographs, three,
01:15:50.820 droopy dog, four, five,
01:15:53.220 42.
01:15:54.620 The optimists are always
01:15:56.220 looking for luck, all the
01:15:58.300 time.
01:15:59.420 Something good's going to
01:16:00.100 happen to me.
01:16:01.120 Something good's going to
01:16:01.800 happen.
01:16:02.580 And they're like, all
01:16:03.320 right, let's start this
01:16:04.040 task.
01:16:05.360 Do, do, do, hey, I guess
01:16:07.180 I can stop.
01:16:07.840 I got some luck.
01:16:08.960 I thought this was going
01:16:09.860 to be boring.
01:16:11.140 But look, I got lucky.
01:16:12.460 Page two, it says, stop
01:16:13.460 counting.
01:16:14.020 Ba-bom, boom, done.
01:16:17.660 So Dr. Wiseman
01:16:19.160 hypothesized that what's
01:16:21.580 happening here is that
01:16:22.940 people who expect luck are
01:16:25.740 actually adjusting their
01:16:27.380 filter to see more of it.
01:16:30.840 So is it possible that when
01:16:32.520 I was expecting good things
01:16:34.960 to happen, you know, to
01:16:36.160 become a cartoonist, that I
01:16:38.020 just set my filter for good
01:16:39.360 luck, and then I saw it.
01:16:41.620 And then I saw some good
01:16:42.540 luck.
01:16:43.620 Maybe.
01:16:45.660 The other possibility for
01:16:47.240 why affirmations could work
01:16:48.500 is it just makes you more
01:16:50.580 serious about your goal.
01:16:52.240 You basically talk yourself
01:16:53.740 into being more serious
01:16:54.840 about it by thinking about
01:16:56.100 it and repeating it every
01:16:57.640 day.
01:16:58.320 So it could be just that.
01:16:59.780 Maybe it makes you focus
01:17:00.700 better.
01:17:01.480 But the experience that
01:17:03.100 people have is almost
01:17:04.080 magical,
01:17:04.560 like there was some luck
01:17:06.740 that popped up.
01:17:08.800 Maybe they're just noticing
01:17:10.100 the luck.
01:17:11.320 But here's the weirdest
01:17:13.160 explanation.
01:17:15.120 That we live in a
01:17:16.440 simulation, that this
01:17:17.780 reality of ours is a
01:17:19.080 simulation, and that the
01:17:21.200 way you navigate the
01:17:23.000 simulation is by your
01:17:25.240 intentions.
01:17:27.440 Your intentions.
01:17:29.660 That's what an
01:17:30.420 affirmation is.
01:17:31.240 An affirmation is your
01:17:32.860 intention put into words.
01:17:35.900 When you put an intention
01:17:37.280 into words, what happens?
01:17:42.160 Words are the operating
01:17:43.700 system of your brain.
01:17:45.540 They're words.
01:17:46.520 Just words.
01:17:47.580 So as soon as your
01:17:48.340 intention turns into a set
01:17:50.120 of words, then you can
01:17:51.900 reprogram your brain.
01:17:53.020 The intention isn't good
01:17:54.020 enough.
01:17:55.080 You need to word it,
01:17:57.540 reinsert it, and repeat
01:17:58.700 it, and then it becomes the
01:17:59.920 code in your brain.
01:18:01.400 That's what a reframe is.
01:18:03.820 So my book teaches you how
01:18:06.560 to use words in a short
01:18:08.240 sentence to reprogram your
01:18:09.980 brain.
01:18:10.520 There's no effort involved.
01:18:11.900 You just read the words.
01:18:13.220 You're done.
01:18:14.120 Read it once, and that's
01:18:15.840 all you have to do.
01:18:17.080 That's the entire change.
01:18:18.420 Read one sentence once.
01:18:20.220 And there are 160 of those
01:18:21.600 sentences.
01:18:23.000 So one possibility in the
01:18:25.860 simulation is that you're
01:18:29.440 not just programming your
01:18:30.560 brain with words, which
01:18:31.920 would be good, but there
01:18:33.700 might be something even
01:18:34.580 weirder.
01:18:35.520 If we're a simulation, it
01:18:38.080 could be that's the way we
01:18:39.500 steer within the simulation.
01:18:41.980 In other words, maybe we
01:18:43.320 have lots of possibilities,
01:18:44.960 but when you focus on one,
01:18:47.000 it actually veers you into a
01:18:49.340 reality in which it's true.
01:18:51.320 So it's not that there's one
01:18:52.680 reality and there's one set of
01:18:54.620 things that could happen.
01:18:55.480 There are infinite dimensions
01:18:57.360 and you can, through your
01:18:59.200 affirmations, move your path
01:19:01.660 off of a dimension that you're
01:19:03.780 heading toward and toward a
01:19:05.340 different one.
01:19:06.260 So you're not making something
01:19:07.580 happen.
01:19:09.040 That's just the way it feels.
01:19:11.360 Affirmations feels like you made
01:19:13.900 your environment change.
01:19:15.740 But what if the environment's
01:19:17.180 never changed and all that
01:19:19.200 happened was you moved to a new
01:19:20.420 one?
01:19:20.580 That would be perfectly possible
01:19:23.860 if we're a simulation, but not
01:19:26.180 possible under a more normal
01:19:28.120 understanding of reality.
01:19:30.180 Now, I'm pretty sure
01:19:31.700 we are a simulation, because the
01:19:33.200 odds are a trillion to one that
01:19:35.220 we're anything but that, but that's a
01:19:37.580 longer discussion.
01:19:39.480 My understanding, my operating
01:19:42.120 understanding of affirmations is
01:19:44.340 that it's a steering mechanism to
01:19:47.300 steer toward a reality that's more to
01:19:49.960 my liking.
01:19:53.640 So, wishful thinking is just some
01:19:57.000 thinking about some good things.
01:19:59.680 There's no operational part to
01:20:01.420 that.
01:20:02.320 It's just thinking.
01:20:04.040 Once you have an intention, and
01:20:06.520 that intention turns into words, and
01:20:09.180 you either read it as a reframe, such
01:20:11.120 as my book, or you turn it into an
01:20:13.320 affirmation where you say, I, Scott
01:20:15.380 Adams, will have a best-selling
01:20:16.580 book.
01:20:17.880 I actually use that one on my
01:20:19.200 best-selling books.
01:20:20.900 And so that would give you all the
01:20:23.960 tools you need to change
01:20:25.000 dimensions.
01:20:29.160 All right, how many of you just had
01:20:30.420 your minds blown?
01:20:33.740 Did anybody's mind just blow up?
01:20:35.560 I love watching the comments when I, when I
01:20:41.380 know something's happening.
01:20:43.160 Yeah.
01:20:43.860 Yeah.
01:20:44.140 This is one of those moments when
01:20:46.280 you're saying to yourself, holy cow.
01:20:50.720 Yeah.
01:20:51.380 This is sort of a holy cow moment,
01:20:53.300 isn't it?
01:20:54.120 Now you understand my entire life.
01:20:57.160 The, the filter that I just explained to
01:20:59.260 you with affirmations and the simulation
01:21:02.400 and all that, that's what I live
01:21:04.360 every day.
01:21:06.300 That, that's my everyday
01:21:07.620 understanding of my environment,
01:21:09.840 is what I explained.
01:21:13.060 It's really nice.
01:21:14.900 It is nice to live in that frame,
01:21:16.860 whether, I don't know what's true.
01:21:18.180 Maybe truth is beyond us, but I can live
01:21:22.140 in that frame and it's really pleasant
01:21:24.140 where, because I feel like I'm changing
01:21:26.580 my environment, but really I'm just
01:21:28.920 changing which environment I choose to
01:21:31.060 experience.
01:21:31.640 And I would argue that there are
01:21:38.940 other forms of affirmations.
01:21:40.820 Somebody mentioned prayer.
01:21:43.180 If your frame is that there's
01:21:46.460 a supreme being, and that praying
01:21:51.140 to that supreme being is the mechanism that
01:21:53.400 causes the thing to happen in your real
01:21:55.020 life, that could be nothing but another way
01:21:58.440 to steer toward another dimension.
01:22:00.000 You know, one where the things are
01:22:01.860 happening that you want.
01:22:03.040 So it could be that the God model works
01:22:05.160 perfectly well, or, or as well as
01:22:08.060 affirmations, because it would be just a
01:22:10.100 steering mechanism in both cases.
01:22:14.780 Yeah.
01:22:15.640 But in my telling of things, the
01:22:18.500 faith would be
01:22:21.460 unrelated to the effectiveness.
01:22:23.620 You wouldn't need any faith.
01:22:25.160 That is just a mechanical process.
01:22:26.780 You could add faith, but it
01:22:31.180 would neither add to nor detract from the
01:22:34.760 outcome.
01:22:38.000 Somebody says God works best.
01:22:40.280 Um, I'm going to agree with you because
01:22:42.440 we're all different.
01:22:44.060 So I very much think that for some types
01:22:48.140 of people, the God prayer model is exactly
01:22:51.260 the right one.
01:22:51.900 Um, and for other types of people, maybe
01:22:54.000 some other model is the one that works
01:22:55.360 best.
01:22:57.880 But I don't think it's
01:22:59.640 universally true.
01:23:01.400 All right.
01:23:02.500 Faith is not even close to being
01:23:03.960 mechanical.
01:23:04.700 That's what I said.
01:23:06.100 All right.
01:23:06.620 You think you're disagreeing with me?
01:23:08.760 No, that's what I said.
01:23:10.400 Faith is not mechanical, but you would
01:23:13.800 only need the mechanical part to observe
01:23:16.380 benefits.
01:23:17.560 The faith might be additional, but,
01:23:20.180 like you said, they're not the
01:23:21.600 same.
01:23:22.020 Don't think that the mechanical process
01:23:24.220 and faith are related.
01:23:26.800 The faith is extra.
01:23:32.700 All right.
01:23:33.820 Well, that got everybody thinking, didn't
01:23:35.360 it?
01:23:35.700 I think I'll leave you on this.
01:23:37.620 Don't believe anything in the news.
01:23:39.200 Uh, and, uh, I will talk to you tomorrow.
01:23:46.500 So I'm going to say goodbye to the folks
01:23:48.400 on the X platform first, and then
01:23:52.560 YouTubers, I'll see you tomorrow.