Real Coffee with Scott Adams - June 26, 2024


Episode 2518 CWSA 06/26/24


Episode Stats

Length

1 hour and 43 minutes

Words per Minute

151.9

Word Count

15,727

Sentence Count

1,060

Misogynist Sentences

13

Hate Speech Sentences

30


Summary

What would it be like to live in a world where you thought the news was real? And how would you deal with it if you were stuck on a big screen all day long? And what would you do if you couldn't see the outside anymore?


Transcript

00:00:00.000 measure of the dopamine hit of the day, the thing that makes everything better.
00:00:04.540 It's called, and watch me do the Biden whisper,
00:00:08.380 the simultaneous sip. Go.
00:00:14.540 Mmm, delicious.
00:00:18.660 Well,
00:00:19.720 if you're subscribing to the Dilbert
00:00:24.520 Reborn comic, which you can only see if you're subscribing to it on X
00:00:28.720 under my profile, you can see the button, or on the
00:00:32.720 Locals platform where you get that plus a lot more, you would know that Ratbert,
00:00:36.720 who's working for a newspaper called The Washington Poop,
00:00:40.580 has been asked by his boss to be the liaison to
00:00:44.600 the CIA. So Ratbert's job will be to
00:00:48.880 write what the CIA tells him to write.
00:00:53.280 So, if you want to see how that works out,
00:00:56.040 it's in the Dilbert Reborn.
00:00:59.160 And I was thinking,
00:01:01.080 what would it be like
00:01:02.160 to live in a world in which
00:01:05.080 you thought the news
00:01:06.880 was real?
00:01:09.140 Because I haven't lived in that world
00:01:10.960 in a long time. I kind of remember
00:01:13.060 it, but
00:01:14.620 how disorienting would it be
00:01:16.960 to read, you know,
00:01:19.000 publications that are clearly not even
00:01:21.040 intending to be the news,
00:01:22.340 and imagine that they intended to be the news
00:01:25.160 and that they told you the actual news
00:01:27.380 and that there were
00:01:29.420 only a few places where there was
00:01:31.360 bad news.
00:01:33.520 And if you,
00:01:34.900 if you,
00:01:37.460 is there only one place,
00:01:39.580 I'm looking at some messages going by,
00:01:41.500 I'm sorry, just took me off my game for a moment.
00:01:43.420 Anyway, so
00:01:47.160 it'd be hard to live in a world
00:01:48.940 where you thought the news was real.
00:01:51.880 And
00:01:52.280 we'll get to more about that in a minute.
00:01:55.700 How many of you have
00:01:57.000 experienced something
00:01:59.380 that I'm starting to call
00:02:00.800 screen sickness?
00:02:03.620 Screen sickness.
00:02:05.160 Meaning that you get
00:02:06.320 stuck looking at screens,
00:02:08.660 your phone, your TV,
00:02:10.260 your computer screen on your desk,
00:02:11.660 and
00:02:12.800 you can't stop doing it.
00:02:16.100 And it might be different things
00:02:17.520 you're looking at,
00:02:18.120 but they're all on screens.
00:02:19.560 You just go from one screen to another.
00:02:22.240 And you start feeling,
00:02:23.760 a feeling of general malaise.
00:02:26.100 Has anybody had that?
00:02:27.940 Where you can't stop doing it,
00:02:29.800 but you don't feel good anymore
00:02:31.820 while you're doing it.
00:02:32.800 But at the same time,
00:02:33.860 you feel like doing something else
00:02:35.620 wouldn't feel as good.
00:02:37.860 What it feels like to me
00:02:39.240 is that I'm having some kind of
00:02:41.140 dopamine crisis.
00:02:43.740 And my dopamine
00:02:45.040 maker is like
00:02:46.880 operating at 10%.
00:02:48.100 And the only thing
00:02:50.220 that gives me a hit,
00:02:52.020 you know,
00:02:52.220 once my dopamine is depressed,
00:02:54.400 is just
00:02:55.080 looking through reels.
00:02:56.580 So I'm currently
00:02:58.800 completely
00:02:59.580 addicted
00:03:00.460 to the
00:03:01.400 short-form
00:03:02.660 algorithmically
00:03:04.800 perfect stuff
00:03:05.740 that is exactly
00:03:06.700 what I want to look at.
00:03:09.460 I can't tell you
00:03:10.580 how much I enjoy
00:03:11.400 looking at cats
00:03:12.400 and dogs
00:03:12.940 hugging each other.
00:03:13.980 I can watch it all day.
00:03:15.980 And there are days
00:03:16.720 when I probably spend
00:03:17.560 an hour,
00:03:18.560 you know,
00:03:18.900 collectively all through the day
00:03:20.180 looking at
00:03:21.360 animals hugging each other.
00:03:23.160 I'll never get tired of it.
00:03:25.780 But I do get sick.
00:03:28.020 I can actually feel
00:03:28.880 this general malaise
00:03:30.300 coming over me
00:03:32.420 that I'm just stuck
00:03:33.260 on these screens
00:03:33.980 and not seeing the outdoors.
00:03:36.740 So
00:03:37.020 I'm actually
00:03:39.220 struggling with it.
00:03:41.500 I was not at all
00:03:42.820 addicted to anything
00:03:44.220 like TikTok.
00:03:45.720 But when the TikTok model
00:03:47.400 moved over
00:03:48.320 to the other platforms,
00:03:49.740 so now even X,
00:03:50.880 if you look at one video,
00:03:53.640 you can scroll up
00:03:54.620 and it'll just
00:03:55.120 take you to the next
00:03:56.060 interesting video.
00:03:57.540 And it's all stuff
00:03:58.440 I want to see.
00:03:59.760 They're so,
00:04:00.920 so,
00:04:01.260 so good
00:04:02.100 at knowing
00:04:03.160 what I personally
00:04:04.100 want to see
00:04:04.820 that I just
00:04:06.240 can't even walk away
00:04:07.040 from it.
00:04:07.440 It's actually
00:04:08.120 a really dangerous
00:04:09.020 addiction.
00:04:10.240 And I'm trying to
00:04:10.900 think,
00:04:12.120 what would this
00:04:13.640 have been like
00:04:14.220 if I were a teenager?
00:04:17.060 Oh my God.
00:04:18.080 So I'm a big old
00:04:21.140 adult who didn't
00:04:22.060 grow up,
00:04:22.700 you know,
00:04:22.940 with a brain
00:04:23.400 that was compromised
00:04:24.580 by smartphones.
00:04:25.960 And I'm addicted.
00:04:27.620 Like,
00:04:27.920 let me just say,
00:04:29.520 today I'm actually
00:04:30.680 going to work on it
00:04:31.560 like,
00:04:32.060 like I had
00:04:32.980 an alcohol addiction.
00:04:34.820 The feeling
00:04:35.840 is making me sick
00:04:37.080 and I can't stop.
00:04:39.860 I mean,
00:04:40.280 that's an addiction.
00:04:41.620 There's no other
00:04:42.220 way around that.
00:04:43.340 And
00:04:43.840 I don't know
00:04:45.020 how to stop it.
00:04:45.780 And here's my
00:04:47.020 current problem
00:04:47.800 and I want to
00:04:48.440 get your advice.
00:04:51.120 What is the time,
00:04:52.920 what is the
00:04:53.720 appropriate time
00:04:55.000 of response
00:04:56.200 when you get
00:04:56.920 a text message?
00:04:59.000 In the modern world,
00:05:00.660 let's say it's
00:05:01.180 the middle of the day.
00:05:02.380 It's a work day.
00:05:04.000 You get a text message
00:05:05.200 from somebody
00:05:05.800 and it's always urgent.
00:05:09.060 How long can you
00:05:10.140 wait before you
00:05:11.600 respond
00:05:12.220 before it becomes
00:05:14.000 a problem?
00:05:14.500 And
00:05:16.320 then I want to
00:05:17.620 ask you
00:05:17.940 a follow-up
00:05:18.520 question.
00:05:20.280 How often
00:05:21.080 do you send
00:05:21.640 somebody a message
00:05:22.580 and then
00:05:23.980 within five
00:05:24.740 to ten minutes
00:05:25.420 you send
00:05:25.880 a follow-up
00:05:26.500 message
00:05:26.900 to ask
00:05:27.280 if they got
00:05:27.820 the first
00:05:28.280 message?
00:05:29.880 And if you
00:05:30.540 do that,
00:05:31.640 are you a man
00:05:32.620 or a woman?
00:05:35.060 Because I find
00:05:35.940 that
00:05:36.400 it's sort of
00:05:38.700 like there's
00:05:39.420 some things
00:05:39.860 that men
00:05:40.220 won't do
00:05:40.700 in traffic
00:05:41.240 that women
00:05:41.740 will
00:05:42.040 because men
00:05:43.080 don't want
00:05:43.460 to get killed.
00:05:45.380 Right?
00:05:45.880 So men
00:05:46.560 don't cut
00:05:47.260 off other
00:05:47.760 men in
00:05:48.200 traffic
00:05:48.660 intentionally
00:05:49.540 because it'll
00:05:50.460 be a fight.
00:05:51.740 But if a woman
00:05:52.640 cuts you off,
00:05:53.500 you're like,
00:05:53.800 oh,
00:05:54.500 well,
00:05:55.180 okay.
00:05:57.320 Because you're not
00:05:58.100 going to have a fight.
00:05:59.160 So you just go,
00:05:59.920 ah,
00:06:00.360 all right,
00:06:01.320 whatever.
00:06:02.840 So here's my
00:06:03.860 experience
00:06:04.440 that women
00:06:07.020 don't understand
00:06:08.120 why you don't
00:06:08.780 respond right away.
00:06:09.820 But men,
00:06:12.160 I don't know
00:06:12.900 if a man
00:06:13.520 has ever
00:06:14.060 texted me
00:06:15.000 twice.
00:06:16.960 Is that
00:06:17.600 your experience?
00:06:19.120 So if a man
00:06:20.160 texts me
00:06:20.760 and asks for
00:06:21.220 something,
00:06:21.880 I can't think
00:06:23.120 of one time
00:06:24.220 in my entire
00:06:25.180 life
00:06:25.880 that I ever
00:06:27.060 got a follow-up
00:06:27.880 text,
00:06:28.500 did you see
00:06:29.060 my first text?
00:06:30.520 Have you?
00:06:31.880 Is that
00:06:32.560 completely a
00:06:33.420 gender-specific
00:06:34.120 thing?
00:06:35.020 But with women,
00:06:36.260 it's very
00:06:36.780 common
00:06:37.280 that a response
00:06:38.780 not within the
00:06:39.540 first,
00:06:40.160 say,
00:06:40.280 30 minutes
00:06:40.960 will get you
00:06:41.660 the follow-up
00:06:42.240 text.
00:06:43.120 Hey,
00:06:43.520 where are you?
00:06:44.060 I don't know
00:06:44.600 if you saw
00:06:44.940 my message.
00:06:46.520 Now,
00:06:46.980 the reason I
00:06:47.600 would not
00:06:47.920 send a follow-up
00:06:48.860 text to a
00:06:49.460 text is
00:06:51.100 I don't want
00:06:51.520 to get beat
00:06:51.980 up.
00:06:54.700 Not really.
00:06:55.540 I mean,
00:06:55.840 you're not really
00:06:56.300 going to get
00:06:56.660 beat up.
00:06:57.440 But it
00:06:57.760 would feel
00:06:59.360 like an
00:06:59.820 insult
00:07:00.300 to another
00:07:02.220 man.
00:07:02.660 If I
00:07:04.120 texted
00:07:04.500 a man
00:07:05.080 and then
00:07:05.560 texted
00:07:05.900 again in
00:07:06.400 10 minutes,
00:07:07.040 I would
00:07:07.700 feel like
00:07:08.080 I was
00:07:08.320 insulting
00:07:08.700 him.
00:07:10.120 Do you
00:07:10.540 have the
00:07:10.780 same
00:07:10.960 feeling
00:07:11.240 or is
00:07:11.580 this
00:07:11.720 unique
00:07:12.100 to me?
00:07:14.560 So I'm
00:07:15.160 saying one
00:07:15.660 hour?
00:07:17.080 Because I'm
00:07:17.920 thinking of
00:07:18.440 instituting a
00:07:19.300 two-hour
00:07:19.820 rule.
00:07:21.180 The two-hour
00:07:22.200 rule is that
00:07:23.040 no matter how
00:07:23.980 important it
00:07:24.540 is,
00:07:25.380 you should
00:07:25.780 assume it's
00:07:26.360 going to be
00:07:26.660 two hours.
00:07:28.020 So if you
00:07:28.400 have a problem
00:07:28.940 that just
00:07:29.500 has to be
00:07:30.120 solved in
00:07:30.560 less than
00:07:30.880 two hours,
00:07:31.640 you better
00:07:32.240 start
00:07:32.580 right away
00:07:33.100 with somebody
00:07:33.620 else or
00:07:35.100 some other
00:07:35.580 solution.
00:07:36.460 But I don't
00:07:36.920 know how to
00:07:37.200 sell it.
00:07:38.220 I'm not sure
00:07:38.700 I could ever
00:07:39.120 sell that.
00:07:39.780 Everything
00:07:40.000 seems like
00:07:40.760 an emergency.
00:07:43.340 Anyway,
00:07:44.500 some of you
00:07:45.340 may know
00:07:45.920 that I had
00:07:46.360 an extended
00:07:47.000 conversation
00:07:47.700 almost a few
00:07:48.820 hours with
00:07:49.560 Michael Ian
00:07:51.020 Black.
00:07:51.740 You know
00:07:52.320 him from
00:07:52.980 TV.
00:07:54.100 He's written
00:07:54.800 books.
00:07:55.300 He's got a
00:07:55.800 Substack you
00:07:56.960 might want to
00:07:57.440 subscribe to.
00:07:58.600 And he's a
00:07:59.160 podcaster and
00:08:00.120 a stand-up
00:08:00.540 comedian.
00:08:01.880 Does lots
00:08:02.400 of things.
00:08:03.040 So he's
00:08:03.400 multi-talented
00:08:04.160 across many
00:08:05.280 domains.
00:08:06.920 And we
00:08:08.980 talked after
00:08:10.220 he had a
00:08:10.920 comment about
00:08:11.680 my observation
00:08:12.640 that it was
00:08:13.900 impossible to
00:08:14.660 have a
00:08:14.920 conversation with
00:08:15.880 someone who
00:08:16.340 thinks the
00:08:16.900 news is
00:08:17.500 real.
00:08:19.940 And he
00:08:20.640 wanted to
00:08:21.040 know, what
00:08:22.140 do you mean
00:08:22.620 by the
00:08:23.340 news isn't
00:08:23.920 real?
00:08:25.140 And I said
00:08:25.920 to myself,
00:08:26.540 or how do you
00:08:27.340 know the
00:08:27.680 news isn't
00:08:28.140 real?
00:08:28.520 Or what do
00:08:29.140 you mean by
00:08:29.500 that?
00:08:29.680 And I
00:08:30.820 thought, that's
00:08:31.500 actually a
00:08:31.940 really interesting
00:08:32.480 question.
00:08:33.920 Because those
00:08:35.560 of you who have
00:08:35.920 been watching
00:08:36.380 for a long
00:08:36.800 time, you
00:08:37.540 know that we
00:08:38.000 have developed
00:08:38.660 collectively a
00:08:40.380 set of tools for
00:08:41.700 determining what's
00:08:42.520 real.
00:08:42.820 For example,
00:08:44.580 the Gell-Mann
00:08:45.600 amnesia, the
00:08:47.320 Scott Alexander,
00:08:49.380 you know, I'll
00:08:50.760 call it the
00:08:51.400 man-bites-dog
00:08:52.700 idea:
00:08:53.140 the idea that
00:08:55.000 one anonymous
00:08:56.440 source is
00:08:57.980 usually BS.
00:08:59.520 So there's a
00:09:00.140 whole bunch of
00:09:00.640 tools, and
00:09:02.220 some of them,
00:09:03.780 you know, are
00:09:04.580 harder to
00:09:05.140 explain.
00:09:06.300 But I
00:09:06.840 thought, oh,
00:09:07.780 this could be
00:09:08.420 really interesting.
00:09:09.880 Because the
00:09:10.860 way that I
00:09:11.380 determine truth
00:09:12.300 is by these
00:09:13.840 tools.
00:09:14.840 You know, it's
00:09:15.220 a set of
00:09:16.000 tools for
00:09:16.640 knowing what's
00:09:17.300 true.
00:09:18.100 Another one
00:09:18.860 would be that
00:09:19.660 I've studied
00:09:20.320 mass hysterias.
00:09:21.840 So I
00:09:23.620 propose that
00:09:25.160 anybody who
00:09:25.800 understands,
00:09:26.680 you know, the
00:09:27.220 McMartin
00:09:28.540 preschool case,
00:09:30.960 anybody who's
00:09:31.620 studied mass
00:09:33.020 hysterias is in
00:09:34.480 a better position
00:09:35.200 to recognize a
00:09:36.040 new one when it
00:09:36.620 pops up.
00:09:37.640 Just basic
00:09:38.600 tools of
00:09:39.440 understanding your
00:09:40.200 reality.
00:09:41.040 So I thought
00:09:41.920 that it would
00:09:43.180 be useful to
00:09:44.180 have him on and
00:09:45.060 explain not
00:09:46.700 politics, you
00:09:47.700 know, not an
00:09:48.440 argument about
00:09:49.100 politics, because
00:09:49.880 we know how that
00:09:50.540 would go.
00:09:51.020 Oh, by the way,
00:09:51.480 he would be a
00:09:52.600 big supporter of
00:09:53.380 Biden over
00:09:54.140 Trump.
00:09:55.480 You might say
00:09:56.480 he has TDS, but
00:09:58.140 I'm not
00:09:58.680 concluding that.
00:10:00.660 I'm just saying
00:10:01.120 you would
00:10:01.460 probably say
00:10:01.900 that, as
00:10:05.560 you do about
00:10:06.080 everybody who's
00:10:06.680 got a strong
00:10:07.160 opinion about
00:10:07.900 Trump.
00:10:10.140 So, here's
00:10:11.440 what happened.
00:10:13.120 We agreed that
00:10:14.140 it wouldn't be a
00:10:14.880 political debate,
00:10:16.600 that rather I
00:10:17.460 would answer the
00:10:18.060 question, and I
00:10:19.800 told him to,
00:10:20.500 when we
00:10:21.240 started, I
00:10:22.060 invited him to
00:10:22.780 interrupt as
00:10:23.500 much as he
00:10:23.960 wanted, otherwise
00:10:25.540 it would be me
00:10:26.260 just explaining
00:10:27.040 things, and I
00:10:28.160 wanted to be
00:10:28.740 more interactive.
00:10:30.900 So if you
00:10:31.580 watched it, and I
00:10:32.540 saw a lot of
00:10:32.960 people say, hey,
00:10:34.280 he kept
00:10:34.700 interrupting you,
00:10:35.960 I invited that.
00:10:37.880 I said, please
00:10:39.200 interrupt, because I
00:10:40.800 don't want to be
00:10:41.240 just talking the
00:10:42.260 whole time.
00:10:43.500 Now, what did
00:10:45.520 you think you saw
00:10:46.540 if you watched
00:10:47.340 it?
00:13:47.500 Here's where it
00:10:48.040 gets fun.
00:10:49.600 I don't know
00:10:50.160 what happened.
00:10:51.940 I spent, you
00:10:53.020 know, 90
00:10:53.920 minutes or more
00:10:54.760 in a conversation,
00:10:56.300 and when I was
00:10:57.040 done, I
00:10:58.040 legitimately didn't
00:10:58.920 know what
00:10:59.220 happened.
00:11:00.540 Let me explain
00:11:01.140 that.
00:11:04.960 I've been
00:11:05.580 hearing for
00:11:06.480 maybe a few
00:11:07.460 years, almost
00:11:09.060 every day,
00:11:10.260 somebody would
00:11:10.860 say something
00:11:11.400 like this on my
00:11:12.520 comments online.
00:11:14.240 Maybe you've
00:11:14.820 heard it, too.
00:11:16.160 I had the
00:11:16.840 experience of
00:11:17.640 spending the
00:11:18.420 weekend talking
00:11:19.140 to a relative
00:11:19.900 or a friend
00:11:20.600 who's a big
00:11:22.400 Democrat, and
00:11:24.320 oh my God, my
00:11:25.220 head exploded.
00:11:26.600 They live in a
00:11:27.420 different world.
00:11:29.180 And I would
00:11:29.480 listen to this,
00:11:30.080 and I'd think,
00:11:30.540 oh, so basically
00:11:31.940 you just met
00:11:32.600 somebody who
00:11:33.080 disagrees on
00:11:33.740 politics, right?
00:11:35.400 But it didn't
00:11:36.060 really sound like
00:11:36.780 that.
00:11:37.780 When people
00:11:38.500 talked about it,
00:11:39.300 they would say,
00:11:39.820 there's something
00:11:40.500 going on.
00:11:41.080 It's like a
00:11:41.460 different world.
00:11:42.240 There's
00:11:43.460 no overlap in
00:11:45.400 what we know.
00:11:46.600 It's like a
00:11:47.320 whole different
00:11:47.940 world.
00:11:48.660 And I don't
00:11:49.080 know that it's
00:11:49.520 always been
00:11:49.940 that way.
00:11:50.940 In my opinion,
00:11:51.940 it's never been
00:11:52.900 that way.
00:11:53.400 It's more of a
00:11:54.220 modern development.
00:11:56.280 So people
00:11:58.340 signed on, and
00:11:59.200 they didn't know
00:11:59.780 what they saw.
00:12:01.180 And I'll give
00:12:01.580 you an idea of
00:12:02.520 some of the
00:12:03.160 things they saw.
00:12:03.740 First of all,
00:12:04.700 I'd like to
00:12:05.240 thank Michael
00:12:06.280 Ian Black for
00:12:07.180 crossing out of
00:12:08.600 his silo of
00:12:09.440 news and into
00:12:10.180 my silo and
00:12:11.640 taking a
00:12:12.140 chance.
00:12:13.160 Because it's
00:12:13.980 really brave to
00:12:15.720 put yourself in
00:12:16.760 front of what
00:12:17.700 he would know
00:12:18.280 would be an
00:12:19.660 adversarial audience
00:12:20.860 when he watched
00:12:21.500 it.
00:12:22.340 So very
00:12:23.520 brave.
00:12:25.660 And whatever
00:12:26.400 you thought
00:12:26.880 about it,
00:12:28.520 let's just
00:12:29.240 say, let's
00:12:29.680 give him that,
00:12:30.280 right?
00:12:31.920 All right.
00:12:33.820 Do me a
00:12:34.540 favor in the
00:12:35.040 comments and
00:12:35.760 don't give me
00:12:36.460 stock updates.
00:12:39.280 I know I
00:12:40.040 asked for them
00:12:40.580 earlier.
00:12:41.360 There's a
00:12:41.740 specific stock I
00:12:42.760 was looking at.
00:12:43.720 But don't give
00:12:44.240 me updates.
00:12:45.360 It'll take me
00:12:45.980 off my game
00:12:46.540 while I'm doing
00:12:47.400 this.
00:12:50.920 So here's what
00:12:51.820 happened.
00:12:52.400 There were
00:12:52.780 definitely two
00:12:53.400 movies on one
00:12:54.200 screen in a
00:12:55.560 way that I've
00:12:56.280 never experienced
00:12:57.100 before.
00:12:57.560 And what I
00:13:02.460 thought I was
00:13:03.000 doing is
00:13:03.620 explaining in
00:13:04.740 detail the
00:13:05.420 tools and
00:13:06.220 techniques I
00:13:06.880 used to
00:13:07.360 determine what's
00:13:08.040 true and
00:13:08.420 what's not
00:13:08.840 true.
00:13:09.880 What he said
00:13:10.720 was, no, you
00:13:11.740 haven't answered
00:13:12.220 the question and
00:13:12.920 you're not on
00:13:13.400 the topic.
00:13:14.520 And I would
00:13:15.160 say, well, let's
00:13:16.580 agree on the
00:13:17.080 topic and we
00:13:17.880 would.
00:13:18.940 And then I'd
00:13:19.400 say, that's what
00:13:19.960 I talked about.
00:13:21.040 And in fact,
00:13:21.820 it's the only
00:13:22.220 thing I talked
00:13:22.780 about was that
00:13:23.500 topic.
00:13:24.300 And he would
00:13:24.940 say, no, you
00:13:25.480 haven't.
00:13:26.520 And I would
00:13:27.120 say, it's the
00:13:27.500 only thing I've
00:13:28.040 talked about.
00:13:28.820 And I would
00:13:29.180 say why I
00:13:29.860 thought I was
00:13:30.220 talking about
00:13:30.640 the topic.
00:13:31.260 And he would
00:13:31.740 say, no, you
00:13:32.940 haven't even
00:13:33.340 addressed the
00:13:33.860 topic.
00:13:35.180 So we
00:13:35.800 couldn't even
00:13:36.380 find the
00:13:38.140 same movie
00:13:38.880 of a one
00:13:41.160 sentence
00:13:41.620 agreement of
00:13:42.400 what we were
00:13:42.840 there to talk
00:13:43.400 about.
00:13:44.480 And by the
00:13:45.180 way, we
00:13:45.460 never solved
00:13:45.980 it.
00:13:46.560 Because even
00:13:47.200 after we
00:13:47.740 talked about
00:13:48.260 it, and I
00:13:48.840 don't think
00:13:49.200 we are on
00:13:49.940 the same page
00:13:50.560 about what
00:13:51.760 we agreed on
00:13:52.560 or even what
00:13:53.340 happened.
00:13:54.320 It was a
00:13:55.020 complete
00:13:55.380 out-of-body
00:13:56.020 mind
00:13:58.460 eff that
00:13:58.980 I've never
00:13:59.340 experienced
00:13:59.840 before.
00:14:00.200 Well, once
00:14:00.640 before.
00:14:01.340 I've had one
00:14:01.960 experience like
00:14:02.760 it, and I'll
00:14:03.040 tell you about
00:14:03.740 that in a
00:14:04.100 minute.
00:14:05.660 All right.
00:14:06.240 So remember,
00:14:07.620 my claim was
00:14:08.640 that it's
00:14:09.040 impossible to
00:14:09.700 have a
00:14:09.920 conversation with
00:14:11.040 someone who
00:14:11.480 thinks the
00:14:12.000 news is real.
00:14:14.180 And I
00:14:14.760 proved it.
00:14:15.880 So here was
00:14:16.660 someone who
00:14:17.020 was completely
00:14:17.740 game to
00:14:18.860 give us his
00:14:19.760 time at
00:14:20.440 personal risk.
00:14:21.940 He was
00:14:22.200 willing to
00:14:22.920 give his
00:14:23.240 time.
00:14:23.600 And it
00:14:26.320 was a
00:14:27.340 nothing in
00:14:28.280 terms of
00:14:29.040 any kind
00:14:29.700 of useful
00:14:31.460 conversation.
00:14:32.520 But boy,
00:14:32.940 was it an
00:14:33.520 experience.
00:14:35.340 So on
00:14:37.300 some level,
00:14:37.880 it was a
00:14:38.260 train wreck.
00:14:39.560 But that's
00:14:40.060 why you
00:14:40.340 should watch
00:14:40.760 it.
00:14:41.480 It's pinned
00:14:42.120 to my
00:14:42.560 X post.
00:14:44.460 It's on
00:14:44.700 YouTube.
00:14:45.180 It's
00:14:45.280 everywhere.
00:14:46.620 Anyway,
00:14:47.080 so here's
00:14:47.460 what I
00:14:47.800 discovered.
00:14:50.180 First of
00:14:50.700 all, we
00:14:51.000 did not
00:14:51.460 have an
00:14:52.380 agreement
00:14:52.840 about
00:14:53.360 base
00:14:53.840 reality.
00:14:55.800 So there
00:14:56.140 are things,
00:14:57.100 of course,
00:14:57.380 that you
00:14:57.640 would disagree
00:14:58.280 politically,
00:14:59.400 but then
00:15:00.280 there are
00:15:00.520 other things
00:15:01.040 that you
00:15:01.320 would say,
00:15:01.880 well,
00:15:02.060 everybody
00:15:02.400 agrees on
00:15:03.000 this,
00:15:03.520 right?
00:15:05.000 The things
00:15:05.800 that I
00:15:06.300 thought were
00:15:07.360 not even
00:15:07.900 a political
00:15:08.560 question were
00:15:10.100 simply things
00:15:10.740 that both
00:15:11.180 networks reported
00:15:12.140 exactly the
00:15:12.880 same.
00:15:13.960 And as far
00:15:14.340 as I knew,
00:15:14.800 nobody even
00:15:15.200 had a question
00:15:15.700 about, but
00:15:16.860 he did.
00:15:18.140 So my
00:15:19.100 basic
00:15:19.620 assumptions
00:15:20.160 about reality,
00:15:21.220 the most
00:15:21.880 observable
00:15:22.540 parts of
00:15:23.080 reality,
00:15:23.500 I'll give
00:15:23.760 you specifics,
00:15:25.300 he did
00:15:26.120 not agree
00:15:27.080 were part
00:15:27.480 of reality.
00:15:29.740 And if
00:15:31.700 you've ever
00:15:32.100 experienced
00:15:32.740 gaslighting,
00:15:34.440 it's when,
00:15:35.540 I'll give you
00:15:36.240 the simplified
00:15:37.020 example,
00:15:38.020 if somebody
00:15:38.740 has a rock
00:15:39.420 in their
00:15:39.720 hand and
00:15:40.220 they say,
00:15:40.600 hey, look
00:15:40.960 at this rock
00:15:41.540 that's in
00:15:41.940 my hand,
00:15:42.800 and the
00:15:43.400 other person
00:15:43.940 looks at
00:15:44.400 it and
00:15:44.680 says,
00:15:45.000 there's no
00:15:45.360 rock in
00:15:45.780 your hand,
00:15:47.180 that's not
00:15:47.820 lying.
00:15:48.320 I don't
00:15:50.360 know what
00:15:50.620 that is.
00:15:51.980 But, you
00:15:52.460 know, we
00:15:52.660 call it
00:15:53.020 gaslighting
00:15:53.700 because if
00:15:55.920 it's just
00:15:56.260 you and
00:15:56.720 the other
00:15:56.960 person in
00:15:57.420 the room
00:15:57.740 and the
00:15:57.940 other person
00:15:58.400 swears,
00:15:59.040 look, you
00:15:59.600 are, I
00:16:00.300 don't know
00:16:00.520 what's wrong
00:16:00.920 with you,
00:16:01.220 but there
00:16:01.460 is no
00:16:01.820 rock in
00:16:02.300 your hand.
00:16:03.540 But you
00:16:04.100 have it in
00:16:04.560 your hand.
00:16:05.080 You can
00:16:05.340 feel its
00:16:05.720 weight, you
00:16:06.160 can touch
00:16:06.560 it, you
00:16:07.180 can drop
00:16:07.600 it, it
00:16:07.820 makes a
00:16:08.160 sound, and
00:16:08.620 you're like,
00:16:09.460 I don't
00:16:10.000 know, I
00:16:10.380 see the
00:16:10.780 rock, I'm
00:16:11.320 sure I
00:16:11.620 have a
00:16:11.880 rock in
00:16:12.240 my hand.
00:16:12.920 And then
00:16:13.180 the other
00:16:13.420 person will
00:16:18.320 go crazy.
00:16:19.460 And that's
00:16:20.200 what happened
00:16:20.600 to me.
00:16:21.240 I had a
00:16:21.880 complete break
00:16:22.780 with reality
00:16:23.540 during the
00:16:25.480 conversation.
00:16:28.040 Now that
00:16:28.760 complete break
00:16:29.560 in reality
00:16:30.020 you'll see
00:16:30.560 on the
00:16:30.920 video, and
00:16:31.800 you know,
00:16:32.140 most people
00:16:32.580 noticed.
00:16:33.580 And he
00:16:34.420 asked me
00:16:34.960 for proof
00:16:36.400 of what I
00:16:37.720 considered
00:16:38.180 the most
00:16:39.120 completely
00:16:40.320 universally
00:16:40.980 true thing
00:16:41.800 that I
00:16:42.240 understood.
00:16:43.500 Now here's
00:16:44.000 what he
00:16:44.280 asked me
00:16:44.620 to prove.
00:16:45.920 He did
00:16:46.320 not believe
00:16:46.900 that President
00:16:47.560 Biden
00:16:48.320 had ever,
00:16:49.880 not even
00:16:50.340 once,
00:16:51.540 suggested that
00:16:52.540 President
00:16:53.040 Trump had
00:16:54.800 ever called
00:16:55.480 the neo-Nazis
00:16:56.940 and racists
00:16:57.600 in Charlottesville
00:16:58.380 fine people.
00:17:02.620 Now just
00:17:03.380 hold that
00:17:03.880 in your
00:17:04.120 head.
00:17:05.160 He did
00:17:05.500 not believe
00:17:06.200 that that
00:17:07.180 had ever
00:17:07.480 happened.
00:17:07.960 And he's
00:17:08.220 a big
00:17:08.540 political
00:17:09.520 person.
00:17:10.920 He does
00:17:12.060 politics,
00:17:12.900 and he
00:17:13.020 talks about
00:17:13.480 it, and
00:17:13.680 he tweets
00:17:13.920 about it.
00:17:15.680 So I
00:17:16.060 said,
00:17:17.140 so you'll,
00:17:18.080 if you
00:17:18.460 watch it,
00:17:18.860 you'll see
00:17:19.160 a complete
00:17:19.580 break with
00:17:20.100 reality.
00:17:21.060 You'll see
00:17:21.640 my face
00:17:22.180 go,
00:17:22.620 what?
00:17:24.180 Because I
00:17:24.800 didn't know
00:17:25.120 what to
00:17:25.400 do.
00:17:26.060 And then he
00:17:26.740 challenged me
00:17:27.420 to find a
00:17:28.000 source,
00:17:29.020 any source,
00:17:29.820 even one,
00:17:31.280 that suggested
00:17:31.920 that Biden
00:17:32.640 had ever
00:17:33.780 said that
00:17:35.840 Trump called
00:17:36.880 the racists
00:17:37.480 in Charlottesville
00:17:38.300 fine people.
00:17:38.960 And I
00:17:40.540 said,
00:17:41.020 I've been
00:17:41.520 watching
00:17:41.900 compilation
00:17:42.560 clips of
00:17:43.220 it all
00:17:43.480 morning,
00:17:44.240 which I
00:17:44.620 had.
00:17:46.040 I'd
00:17:46.520 probably
00:17:46.740 seen 16
00:17:47.580 of the
00:17:47.940 clips within
00:17:49.320 an hour
00:17:49.760 before that
00:17:50.460 we talked.
00:17:51.720 But I'm
00:17:53.400 a boomer
00:17:54.040 who can't
00:17:54.460 find things
00:17:54.960 on phones,
00:17:55.760 especially
00:17:56.380 when you
00:17:56.900 search for
00:17:57.420 them, you
00:17:58.220 get too
00:17:58.560 many hits.
00:17:59.700 So I'm
00:18:00.100 trying real
00:18:00.860 time to
00:18:01.300 find a
00:18:02.340 source.
00:18:02.800 So it
00:18:05.540 made me
00:18:05.860 look like
00:18:06.300 I couldn't
00:18:07.960 prove it.
00:18:07.960 So he
00:18:08.800 was sort
00:18:09.180 of resting
00:18:09.620 his case
00:18:10.340 that there
00:18:11.440 was no
00:18:11.780 evidence in
00:18:12.380 the world
00:18:12.860 that President
00:18:14.260 Biden had
00:18:15.460 run on
00:18:16.320 the main
00:18:18.900 theme that
00:18:20.100 Trump had
00:18:20.540 called these
00:18:20.980 racists fine
00:18:21.700 people.
00:18:22.280 That never
00:18:22.840 happened.
00:18:24.220 Now that's
00:18:24.760 the, I
00:18:25.420 have a rock
00:18:25.920 in my hand.
00:18:26.700 No, you
00:18:27.000 don't.
00:18:27.720 No, I
00:18:28.120 swear it's
00:18:28.480 right here.
00:18:29.400 Except I
00:18:30.320 couldn't say
00:18:30.780 it's right
00:18:31.120 here because I
00:18:31.840 couldn't Google
00:18:32.380 it,
00:18:32.800 you
00:18:33.560 know, on
00:18:33.900 a live
00:18:34.260 stream because
00:18:34.860 I'm a
00:18:35.440 boomer who
00:18:35.860 can't Google
00:18:36.320 things that
00:18:36.760 are hard to
00:18:37.140 Google on
00:18:37.560 my phone.
00:18:38.680 And I've
00:18:38.980 got these
00:18:39.300 weird thumbs
00:18:39.980 where I
00:18:40.340 mistype
00:18:40.760 everything because
00:18:41.680 my thumb
00:18:42.180 isn't round.
00:18:43.720 It's distorted
00:18:44.640 from holding
00:18:45.140 the phone.
00:18:47.060 So eventually
00:18:47.920 somebody sends
00:18:48.700 me the
00:18:49.140 link.
00:18:50.300 So then I
00:18:51.120 got to play
00:18:51.620 the link.
00:18:52.760 They very
00:18:53.140 clearly had
00:18:54.120 President Biden
00:18:54.960 himself say
00:18:57.040 that he did
00:18:57.720 believe that,
00:18:59.020 you know,
00:18:59.660 that Trump had
00:19:00.320 called them fine
00:19:00.880 people.
00:19:01.640 So I
00:19:02.220 played it and
00:19:03.920 now in my
00:19:04.780 mind, here
00:19:05.900 it is, here's
00:19:06.480 the proof.
00:19:08.460 And then there
00:19:09.300 was a compilation
00:19:10.000 of the news
00:19:11.000 people saying
00:19:11.620 it as well
00:19:12.120 because he
00:19:13.020 had also
00:19:13.600 suggested that
00:19:14.480 the news had
00:19:15.080 never said
00:19:15.520 that.
00:19:16.360 So not only
00:19:16.880 had Biden
00:19:17.240 never said
00:19:17.720 it, but
00:19:18.000 nobody in
00:19:18.420 the news
00:19:18.800 had ever
00:19:19.120 suggested it.
00:19:20.740 Just think
00:19:21.380 about that,
00:19:21.920 that nobody
00:19:22.260 in the news
00:19:23.080 had ever
00:19:24.180 suggested that
00:19:25.900 Trump had
00:19:26.440 said the
00:19:27.700 racists were
00:19:28.240 fine people.
00:19:28.800 Now, he
00:19:32.260 listened to
00:19:32.800 it and
00:19:33.880 then he
00:19:34.160 said, well,
00:19:34.980 that's what
00:19:35.360 I said.
00:19:36.760 And I
00:19:37.260 said, no,
00:19:37.600 that's the
00:19:37.920 opposite of
00:19:38.380 what you
00:19:38.620 said.
00:19:39.300 Listen to
00:19:39.740 it.
00:19:39.920 Maybe you
00:19:40.200 didn't hear
00:19:40.520 it.
00:19:41.000 And we
00:19:41.420 played it
00:19:41.980 multiple
00:19:42.320 times.
00:19:43.660 And what I
00:19:44.460 was clearly
00:19:45.040 hearing that
00:19:45.760 could not
00:19:46.140 have been
00:19:46.340 more clear,
00:19:47.640 he was
00:19:47.980 hearing the
00:19:48.360 opposite.
00:19:49.880 Now you're
00:19:50.460 going to say
00:19:50.700 to me, wow,
00:19:51.500 what's wrong
00:19:51.900 with him?
00:19:53.480 Too fast,
00:19:54.580 too fast.
00:19:55.540 What's wrong
00:19:56.040 with me?
00:19:57.180 Why wouldn't
00:19:57.620 you say what's
00:19:58.100 wrong with
00:19:58.440 me?
00:19:59.620 Why is
00:20:00.160 your reality
00:20:00.700 the right
00:20:01.040 one?
00:20:02.220 We had two
00:20:02.900 complete
00:20:03.240 realities, but
00:20:04.660 if you think
00:20:05.180 you know which
00:20:05.800 one is right,
00:20:06.500 hold on.
00:20:07.600 Just hold
00:20:08.220 on.
00:20:09.200 Because it
00:20:09.820 won't be my
00:20:11.180 claim that
00:20:11.640 he had the
00:20:11.980 wrong reality.
00:20:13.860 I'm going to
00:20:14.420 surprise you
00:20:14.920 at the end.
00:20:17.100 So he
00:20:19.160 also didn't
00:20:20.800 seem to
00:20:21.400 understand what
00:20:23.280 I would call
00:20:23.720 normal
00:20:24.240 communication
00:20:25.020 patterns.
00:20:26.620 And I
00:20:27.200 didn't see
00:20:27.520 that coming.
00:20:28.920 Which is
00:20:29.840 that if I
00:20:30.560 used an
00:20:31.020 analogy, he
00:20:32.140 seemed to
00:20:33.100 not understand
00:20:33.720 how analogies
00:20:34.460 work.
00:20:35.300 And when I
00:20:35.980 used a
00:20:36.340 generality, he
00:20:38.220 seemed to
00:20:38.680 not understand
00:20:39.320 how generalities
00:20:40.260 work in
00:20:40.880 normal
00:20:41.200 conversation.
00:20:42.340 So almost
00:20:43.200 all of his
00:20:43.760 questions were
00:20:44.800 about what my
00:20:45.600 sentence meant
00:20:46.440 instead of,
00:20:48.600 you know, a
00:20:48.920 productive
00:20:49.280 conversation about
00:20:50.200 content.
00:20:51.080 And he had
00:20:51.840 disagreements such
00:20:52.800 as, that's not
00:20:54.060 the news, that's
00:20:55.060 a topic in the
00:20:55.920 news.
00:20:56.740 And I would
00:20:57.640 think, I
00:20:59.840 don't make a
00:21:00.360 distinction between
00:21:01.160 a topic, let's
00:21:02.220 say climate
00:21:02.680 change, and
00:21:03.820 the news, because
00:21:05.340 that topic is in
00:21:06.820 the news.
00:21:07.300 So I didn't even
00:21:09.480 understand that and
00:21:10.440 still don't, that
00:21:12.320 there's a difference
00:21:12.920 between a topic
00:21:14.300 that's in the
00:21:15.540 news and the
00:21:16.980 news.
00:21:17.320 So it turned
00:21:19.500 into this, uh, two
00:21:21.880 movies on one
00:21:22.740 screen, which I
00:21:23.480 always describe, in
00:21:24.800 which his reality
00:21:25.620 and mine never
00:21:26.720 really overlapped,
00:21:27.860 even though on the
00:21:29.620 surface we were
00:21:30.280 having something
00:21:30.880 like a conversation.
00:21:32.540 But then also there
00:21:33.900 were other movies
00:21:34.760 that were the
00:21:36.500 audience's movies.
00:21:37.980 So one of the
00:21:38.960 comments I got is
00:21:40.060 that I lost the
00:21:41.340 debate.
00:21:43.320 Do you know what I
00:21:44.300 said to that?
00:21:45.520 What debate?
00:21:46.300 We didn't agree
00:21:48.120 to a debate and
00:21:48.900 we didn't have a
00:21:49.460 debate.
00:21:50.540 There was no
00:21:51.380 debate.
00:21:52.320 It was a
00:21:52.920 conversation where
00:21:54.560 something happened
00:21:55.240 that I don't
00:21:55.780 understand and I
00:21:56.920 was definitely, I
00:21:58.520 definitely was
00:21:59.400 shocked into a,
00:22:01.840 uh, let's say a
00:22:02.620 cognitive condition
00:22:03.860 that was very akin
00:22:06.540 to a mushroom
00:22:07.380 trip.
00:22:09.060 Now I've had, you
00:22:10.860 know, one
00:22:11.280 productive mushroom
00:22:12.860 trip and one time
00:22:13.780 I took mushrooms
00:22:14.400 and just got sick
00:22:15.200 so that didn't
00:22:16.220 count.
00:22:16.980 But the one time
00:22:17.820 I did, what you
00:22:18.860 experience is that
00:22:19.860 you're living in a
00:22:20.500 world where the
00:22:21.160 stuff is familiar
00:22:22.200 but it doesn't
00:22:23.700 work the same.
00:22:25.660 So in other words,
00:22:26.960 you know, people are
00:22:27.860 still people and a
00:22:28.760 chair is still a
00:22:29.460 chair and you still
00:22:30.460 know how to sit in
00:22:31.080 it, but everything's
00:22:32.280 different.
00:22:33.580 So that's what I
00:22:34.460 experienced because,
00:22:36.080 because once my base
00:22:37.780 reality was in question,
00:22:39.220 where the most
00:22:40.500 observable things are
00:22:41.880 not true, and even
00:22:43.260 when I'm playing an
00:22:44.060 audio, that it's
00:22:46.900 not clear if the
00:22:47.700 audio is what I'm
00:22:48.500 hearing or if it's
00:22:49.380 the opposite of what
00:22:50.260 I'm hearing, because
00:22:51.700 that's what Michael
00:22:52.340 was hearing.
00:22:54.220 And then you were
00:22:55.100 watching it and some
00:22:56.140 of you thought you
00:22:56.760 watched the debate
00:22:57.820 and there wasn't any
00:22:59.180 debate.
00:22:59.720 It was a conversation
00:23:00.640 on one question that
00:23:02.880 kind of went a weird
00:23:04.300 direction.
00:23:05.580 So here's some of
00:23:06.600 the, some of the
00:23:10.180 observations from
00:23:12.660 people who watched.
00:23:13.880 Now these are not
00:23:14.600 ones I agree with.
00:23:15.940 I want to show you
00:23:16.640 how many different
00:23:17.240 interpretations there
00:23:18.240 were of this same
00:23:19.140 weird event.
00:23:21.120 Some people wondered
00:23:22.100 if I was high.
00:23:23.180 I wasn't.
00:23:26.240 Let's see.
00:23:27.640 Oh, Michael was
00:23:28.580 also not convinced
00:23:29.880 that money has a
00:23:32.160 substantial distortion
00:23:33.540 effect on experts.
00:23:36.600 So one of the base
00:23:37.380 realities that I think
00:23:38.540 is not really
00:23:39.800 questionable is that
00:23:41.740 if somebody is getting
00:23:42.500 paid to have a
00:23:43.260 specific opinion, and
00:23:44.960 there would be a
00:23:45.780 penalty if they didn't
00:23:46.880 and their bosses would
00:23:47.840 be mad and they
00:23:49.100 would lose their
00:23:49.600 reputation, that that
00:23:51.480 would really be
00:23:52.780 something that would
00:23:54.080 stop the truth from
00:23:54.980 coming out.
00:23:57.220 That's a base reality
00:23:59.280 difference where I
00:24:00.280 think it's not just a
00:24:02.620 thing that can happen.
00:24:04.160 It's the thing that
00:24:04.820 happens everywhere, all
00:24:05.820 the time, universally,
00:24:06.600 with no exceptions.
00:24:09.020 And he would say, I
00:24:10.360 don't know, I'm not
00:24:11.160 even sure if you're
00:24:11.740 going to see that
00:24:12.140 signal.
00:24:13.400 So, I mean, I don't
00:24:14.760 want to put words in
00:24:15.720 his mouth, but
00:24:16.380 effectively, we had a
00:24:17.700 different opinion on
00:24:18.420 how much money
00:24:19.160 distorts things.
00:24:21.940 So some of you said
00:24:23.200 he's just bad at
00:24:24.100 arguing, but that's
00:24:26.700 hard to explain when
00:24:28.020 you look at how
00:24:28.620 accomplished he is as
00:24:29.920 a communicator.
00:24:31.600 So here's somebody
00:24:32.580 successful, with multiple
00:24:34.280 books published.
00:24:35.200 He's been an actor, he's
00:24:37.500 a writer, stand-up.
00:24:40.160 So someone who
00:24:41.500 communicates that well
00:24:43.020 is not going to be just
00:24:44.720 bad at understanding
00:24:45.840 words.
00:24:47.160 So I don't buy that.
00:24:48.660 I don't buy that he's a
00:24:50.100 bad debater.
00:24:53.620 Some of you thought he
00:24:54.660 was playing a prank.
00:24:55.840 He was just trolling me
00:24:56.740 the entire time.
00:24:57.580 But I didn't get that
00:24:59.840 vibe because I talked
00:25:01.200 to him both before and
00:25:02.260 after.
00:25:03.580 And the before and after
00:25:05.100 that you didn't get to
00:25:06.000 see didn't really
00:25:07.360 suggest any kind of
00:25:08.320 prank or trolling.
00:25:10.060 There was actually
00:25:10.780 confusion on his side
00:25:12.260 as well about what
00:25:13.720 happened.
00:25:16.160 If he had known
00:25:17.220 exactly what happened,
00:25:18.720 then that would support
00:25:19.840 the idea that he
00:25:22.320 thought he was in just
00:25:23.180 a normal debate and
00:25:24.140 didn't do well or he was
00:25:25.500 trolling or as a prank.
00:25:26.720 But he was as
00:25:28.420 confused as I was.
00:25:30.760 So it's not
00:25:32.160 whatever obvious thing
00:25:33.240 you think it is.
00:25:36.080 And I don't even think
00:25:37.180 it's TDS.
00:25:38.740 You know, you want to
00:25:39.580 say, oh, Trump
00:25:40.200 derangement syndrome.
00:25:42.080 Maybe there's a touch of
00:25:43.320 it somewhere in the
00:25:44.040 story.
00:25:44.760 But TDS doesn't get you
00:25:46.160 to the point where you
00:25:47.960 disagree on basic facts
00:25:49.420 of reality.
00:25:51.300 Usually it's more like an
00:25:52.540 interpretation of
00:25:53.480 something.
00:25:54.420 But not the basic
00:25:56.140 facts
00:25:56.720 that CNN and Fox
00:25:58.260 News are both reporting
00:25:59.480 every day.
00:26:02.160 Not that.
00:26:03.640 That's not TDS.
00:26:04.700 That's something else.
00:26:06.300 So.
00:26:07.840 And other people thought
00:26:09.100 he was arguing in bad
00:26:10.180 faith.
00:26:11.740 I can't read his mind,
00:26:13.300 but I didn't see that
00:26:14.220 again because we had a
00:26:16.180 conversation before and
00:26:17.400 after.
00:26:18.160 I just have a little more
00:26:19.440 insight than you would
00:26:20.740 have if you just watched
00:26:21.680 it.
00:26:21.860 So I didn't think it was
00:26:23.360 bad faith.
00:26:24.540 And I thought that it was
00:26:25.540 brave and, you know,
00:26:27.520 he put himself out there
00:26:28.840 to even have the
00:26:29.440 conversation.
00:26:31.420 Some say he was
00:26:32.360 pretending to not
00:26:33.480 understand what I said
00:26:34.720 because it was so
00:26:36.100 pervasive.
00:26:36.580 It was like he didn't
00:26:37.520 understand normal sentences
00:26:38.820 and I don't know what
00:26:41.800 that was.
00:26:43.120 So I actually got in
00:26:45.060 trouble because I asked
00:26:46.440 him online if he was on
00:26:47.980 the spectrum because I was
00:26:50.200 trying to understand why
00:26:51.500 the communication, which
00:26:54.140 would be a normal pattern
00:26:55.180 of communication, was not
00:26:57.180 just failing once, but
00:26:58.360 consistently.
00:26:59.940 The things I thought were
00:27:00.960 just normal sentences, he
00:27:03.160 was interpreting in a way
00:27:04.340 that I thought was very,
00:27:05.880 you know, not
00:27:07.240 neurotypical.
00:27:08.840 Now, here's what I walked
00:27:10.080 into.
00:27:10.940 Apparently, if you ask
00:27:13.360 somebody if they're on the
00:27:14.680 spectrum, that's taken as
00:27:16.960 an insult.
00:27:18.400 How many of you think
00:27:19.380 that's an insult?
00:27:21.540 Because it never even
00:27:22.560 occurred to me that that
00:27:23.420 would be an insult.
00:27:24.840 And I'd like to apologize.
00:27:26.760 So I apologized to him.
00:27:28.960 But once people said,
00:27:30.580 hey, that's an insult.
00:27:31.480 Why are you
00:27:32.720 saying bad things about
00:27:33.640 people on the spectrum?
00:27:34.760 And I thought, when did I
00:27:36.320 do that?
00:27:37.500 If you ask somebody if
00:27:38.660 they're on the spectrum in
00:27:40.660 2024, generally, they just
00:27:43.640 say yes.
00:27:44.760 Oh, yeah, I'm on the
00:27:45.440 spectrum.
00:27:46.560 Elon Musk says he's on the
00:27:47.800 spectrum.
00:27:48.940 Do I have a bad feeling
00:27:50.000 about him?
00:27:51.420 No.
00:27:53.020 Something like, you know,
00:27:54.120 sometimes it feels like
00:27:55.320 half of all of my audience,
00:27:57.360 especially for Dilbert,
00:27:58.960 are on the spectrum.
00:28:00.540 I love those people.
00:28:02.940 Literally, my favorite
00:28:04.400 people, engineers, spectrum
00:28:06.480 people.
00:28:07.620 So it never occurred to me.
00:28:09.980 And by the way, I get asked
00:28:11.160 if I'm on the spectrum
00:28:12.180 about once a week, usually
00:28:14.220 privately.
00:28:15.580 Somebody will say, you know,
00:28:17.020 I don't know if you've
00:28:18.020 looked into this, Scott, but
00:28:19.220 I think you're on the
00:28:21.240 spectrum.
00:28:21.860 You're showing a little bit
00:28:22.760 of, and I always think it's
00:28:25.240 not an insult.
00:28:26.680 It's just an observation, and
00:28:28.380 it might be true, and it
00:28:29.220 might be false, but I
00:28:30.300 certainly don't think of it
00:28:31.080 as an insult.
00:28:31.800 So if anybody was insulted,
00:28:34.840 certainly I didn't mean it
00:28:36.000 that way.
00:28:36.980 And I'll tell you my view is
00:28:39.660 that it's no more interesting
00:28:40.960 than being gay in 2024.
00:28:43.540 Right?
00:28:44.100 If somebody said, I'm gay, it
00:28:46.660 would be the biggest nothing,
00:28:48.540 and it would just be nothing.
00:28:49.340 So if you ask somebody, oh,
00:28:52.460 by the way, are you gay?
00:28:54.100 In 2024, I wouldn't consider
00:28:56.080 that an insult.
00:28:57.660 It'd be weird if you did.
00:28:59.920 So to me, I thought it was
00:29:01.000 just an ordinary question.
00:29:02.140 But if anybody's insulted by
00:29:03.480 it, I apologize.
00:29:07.300 So it wasn't that.
00:29:09.000 I don't think he's on the
00:29:10.000 spectrum.
00:29:12.280 Then that leaves, I'll tell
00:29:14.020 you my remaining theories.
00:29:16.520 Either it was cognitive
00:29:17.840 dissonance.
00:29:19.340 But here's the fun part.
00:29:21.160 If it's cognitive dissonance,
00:29:22.820 there's no real way to know
00:29:24.160 if he had it or I had it or
00:29:25.560 we both had it.
00:29:27.460 You know that, right?
00:29:29.340 Because if I had it, you
00:29:30.480 probably had it too.
00:29:32.400 In other words, if there was
00:29:33.360 something that happened there
00:29:34.220 that triggered people who
00:29:36.160 think like me, well, that's
00:29:38.500 most of you.
00:29:39.680 You know, the people who
00:29:40.540 follow me and would have
00:29:41.580 watched that typically, you
00:29:44.100 know, have some agreement
00:29:45.100 with the way I think, similar
00:29:46.720 mindsets.
00:29:47.260 So if it triggered me into
00:29:48.720 cognitive dissonance, in
00:29:50.040 other words, I saw a world
00:29:51.040 that didn't make sense, so I
00:29:52.620 started hallucinating to make
00:29:54.420 it make sense.
00:29:55.120 That's what cognitive
00:29:56.000 dissonance is.
00:29:57.300 If it happened to me, I
00:29:58.540 wouldn't have a way to know.
00:29:59.980 And you wouldn't be able to
00:30:01.160 tell me because you'd be in
00:30:02.800 it too.
00:30:04.700 But my experience of it is
00:30:06.880 that my observation would be
00:30:09.340 that if he were in it, it
00:30:11.060 would match all of my
00:30:11.920 observations.
00:30:13.260 In other words, everything
00:30:14.080 would make sense under that
00:30:16.340 filter, but that doesn't mean it's
00:30:17.360 true, because remember, it
00:30:19.540 would also make sense if I
00:30:20.840 was the one experiencing it.
00:30:22.820 So the thing I try to teach,
00:30:24.900 but it's hard to even
00:30:25.640 remember it myself, is that
00:30:27.780 if you're in one of these
00:30:28.620 situations where one of you
00:30:30.080 at least is having some kind
00:30:32.260 of cognitive experience, you
00:30:34.180 never know which one.
00:30:35.920 Because the whole point of
00:30:36.940 cognitive dissonance is that
00:30:38.760 when you're in it, you're the
00:30:40.640 only one who can't tell.
00:30:41.600 And if I'm in it, you're in
00:30:44.580 it too.
00:30:45.820 It would be the same trigger
00:30:47.100 for all of us.
00:30:48.820 So the trigger was there.
00:30:50.900 So the trigger was there for
00:30:52.340 both of us, which is we were
00:30:53.640 presented with a world which
00:30:56.000 isn't the world we lived in.
00:30:57.740 And we were asked to accept a
00:30:59.260 world that we don't live in.
00:31:01.360 That should have triggered maybe
00:31:03.600 both of us.
00:31:05.360 Maybe Michael and I were both in
00:31:07.820 cognitive dissonance.
00:31:08.740 We wouldn't know and you
00:31:11.000 wouldn't be able to tell us.
00:31:12.480 That's the fun part because you
00:31:14.180 would be equally triggered on
00:31:15.540 each side if you were, you
00:31:16.800 know, thinking like him, you
00:31:18.600 think I was in cognitive
00:31:19.740 dissonance and vice versa.
00:31:21.840 So it would at least explain
00:31:23.440 the observation, but it
00:31:24.540 doesn't mean it's true.
00:31:25.860 I'm going to give you a more
00:31:28.320 fun explanation.
00:31:30.820 Here's my preferred one.
00:31:32.900 Remember I've told you that we
00:31:35.740 don't know what's true, but
00:35:37.140 some things predict better
00:31:38.700 than others.
00:31:40.340 And one of the things I've
00:31:41.320 predicted for a long time is
00:31:43.260 that we might find out we live
00:31:44.760 in a simulation by discovering
00:31:48.220 that the reality we live in
00:31:50.960 matches what you would build if
00:31:53.240 you had resource constraints for
00:31:55.880 your computer simulation.
00:31:58.640 So imagine a simulation that had
00:32:01.320 a hundred million people in it and
00:32:03.380 they all had to have memories that
00:32:04.820 were at least a little bit
00:32:05.720 compatible.
00:32:06.180 So if I said, hey, yesterday I
00:32:09.580 killed a wildebeest and you were
00:32:12.020 with me on the hunt, in order for
00:32:14.960 us not to be crazy, you would have
00:32:16.960 to remember that you killed a
00:32:18.080 wildebeest with me.
00:32:19.020 So we'd have to have compatible
00:32:21.020 memories.
00:32:21.500 Now think about a hundred million
00:32:23.220 people all interacting with each
00:32:25.280 other and they all have to have
00:32:27.420 compatible memories.
00:32:29.540 Eventually you just run out of
00:32:31.180 computing space because when you get
00:32:34.240 from a hundred million to eight
00:32:35.940 billion, the number of connections and
00:32:38.640 permutations and things that one
00:32:40.640 person influenced another person
00:32:42.200 becomes so large that it's hard to
00:32:45.740 imagine, you know, even an advanced
00:32:47.640 civilization being able to have that
00:32:49.140 much computing power.
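(To make the scale of that memory-consistency problem concrete, here is a rough, purely illustrative back-of-the-envelope sketch in Python. The population figures of 100 million and 8 billion come from the passage above; the pair-count formula n(n-1)/2 is standard combinatorics, not anything stated in the episode.)

```python
# Back-of-the-envelope illustration only: how many pairwise "shared memory"
# links would have to stay mutually consistent as the simulated population grows.

def pair_count(n: int) -> int:
    """Number of distinct unordered pairs among n people: n*(n-1)/2."""
    return n * (n - 1) // 2

for label, n in [("100 million people", 100_000_000),
                 ("8 billion people", 8_000_000_000)]:
    print(f"{label}: roughly {pair_count(n):.2e} pairwise links to keep consistent")
```

With these inputs the count jumps from roughly 5 x 10^15 pairs to roughly 3.2 x 10^19, which is the rough sense in which the bookkeeping grows far faster than the population itself.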
00:32:50.820 Now, to put it in perspective, I've
00:32:53.660 predicted that within one year we will
00:32:57.280 have built our own simulations in which
00:33:00.160 the people in it believe they're real and
00:33:02.580 live complete lives.
00:33:04.320 In one year, we'll have little AI
00:33:06.820 creatures in a game who don't know
00:33:10.060 they're being observed by the builder of
00:33:12.000 the game who will just live their
00:33:13.740 lives.
00:33:14.680 They'll live an entire life.
00:33:15.940 They'll have kids.
00:33:17.220 And if you ask them, they'd say they're
00:33:18.580 real.
00:33:19.640 Now, in our current world, we have all
00:33:22.000 kinds of resource constraints.
00:33:23.620 We've got electricity constraints, you
00:33:26.020 know, chip constraints, board
00:33:27.880 constraints.
00:33:28.680 So we can't have an infinitely big
00:33:30.920 computer.
00:33:32.780 Maybe eventually, but not in short
00:33:34.660 term.
00:33:35.640 So what we should see if we are a
00:33:38.240 simulation is that as our population
00:33:40.900 grows, our histories and our collective
00:33:44.440 understanding of what happened
00:33:45.920 yesterday should start to disassemble.
00:33:49.280 You should see a situation in which we
00:33:52.220 don't agree on anything because the
00:33:55.080 computing can't handle us agreeing.
00:33:57.960 All it can do is give us the illusion that
00:34:00.960 we had some kind of common world, but we
00:34:04.200 didn't.
00:34:04.580 So we would all be playing our own game
00:34:07.380 with our own simulation, subjective
00:34:10.460 reality.
00:34:11.400 But mine doesn't need to agree with
00:34:13.200 Michael Ian Black's or yours.
00:34:16.920 So in other words, because of resource
00:34:18.980 constraints, you should see more and more
00:34:21.960 people disagreeing on the rock that they're
00:34:24.980 holding in their hand, like really, really
00:34:26.800 basic stuff.
00:34:27.620 And that's what we're seeing.
00:34:29.520 We're seeing people, you know, questioning
00:34:32.040 the basic reality of just what happened
00:34:34.680 yesterday or even what's happening right
00:34:37.080 now while we're talking.
00:34:39.080 But here's the payoff.
00:34:41.820 If you were this advanced, let's say, advanced
00:34:46.040 program from an advanced society that maybe
00:34:48.760 is no more advanced than we are in one
00:34:50.440 year, you would have to do things to
00:34:55.140 manage the fact that you were running
00:34:56.460 out of memory and you might say we have
00:34:59.100 to move from a model in which I treat
00:35:02.460 every individual differently and manage
00:35:04.560 their memory so that it's compatible to
00:35:07.920 a silo where there are two groups of
00:35:11.280 people and they have completely
00:35:13.780 incompatible realities.
00:35:17.860 And that's what we see in order to save
00:35:20.500 memory.
00:35:21.620 It appears the simulation has divided us
00:35:24.820 into basically two creatures.
00:35:28.020 The creatures on the left are experiencing
00:35:30.540 an entirely different world and it never
00:35:33.160 has to be compatible with anybody on the
00:35:35.780 right.
00:35:36.560 So in other words, when we have a
00:35:38.080 conversation, we'll differ on all the facts
00:35:40.480 and all the interpretation and we'll say, my
00:35:43.260 God, you must have TDS.
00:35:45.300 And then the other side would say, my God, you
00:35:47.400 must be racist.
00:35:48.180 And so, instead of having common memories
00:35:51.880 and common facts and common experiences, because
00:35:55.400 that would be too much computing to keep it all
00:35:57.380 straight and consistent over time, we have
00:36:00.720 inconsistent memories, inconsistent experiences, and
00:36:05.080 inconsistent facts.
00:36:07.280 And it won't matter because we don't talk to each
00:36:11.940 other now.
00:36:13.100 This was the first experience in years, years, the
00:36:20.020 first one in which I had a meaningful conversation
00:36:23.080 with somebody who was not exactly on my
00:36:26.420 wavelength politically.
00:36:28.820 Years.
00:36:30.160 And I don't think he'd ever talk to anybody who had
00:36:33.000 my point of view, because when I said things
00:36:35.700 like, well, you know, Biden made this the center
00:36:41.640 of his campaign in 2020, how many other people
00:36:45.680 don't know that?
00:36:47.920 Now, other people might have completely different
00:36:50.000 views of reality, but my take is that it's
00:36:53.220 unexplained.
00:36:54.720 Neither Michael nor I know exactly what happened.
00:36:57.860 And that's what makes it interesting.
00:37:00.340 It looked to me like reality was just bifurcating
00:37:03.860 and you could watch it in real time.
00:37:06.040 That's what it looked like.
00:37:08.920 All right.
00:37:11.240 I also noted that, so Michael said he was no
00:37:14.020 expert on climate change, but seemed to be
00:37:16.500 inclined to believe the experts, which would be
00:37:19.420 a normal thing.
00:37:20.420 But one of the things that I think is very
00:37:22.240 common, let me test this.
00:37:24.100 Let me see if this is common knowledge in my
00:37:26.140 audience.
00:37:26.500 Those of you who are watching right now, how
00:37:30.020 many of you know, in the topic of climate
00:37:32.800 change, what the heat island effect is and why
00:37:37.220 that's important?
00:37:39.000 How many of you think, well, everybody knows
00:37:41.160 that?
00:37:43.580 Because I think it was the first time that
00:37:45.080 Michael heard it.
00:37:46.340 But how many of you think that's common
00:37:48.100 knowledge?
00:37:49.300 I don't know.
00:37:49.840 It might be maybe only 25% of you, perhaps.
00:37:52.700 I see one no.
00:37:57.120 I'm just going to look at the ratio of yes to no.
00:38:01.200 A lot of no's.
00:38:04.520 But the yeses look to me like 75%.
00:38:07.760 That's very unscientific.
00:38:10.920 But just looking at the comments, maybe 75% of
00:38:14.000 you are, yeah, I know what that is.
00:38:16.320 And then maybe 25%, could be more, could be half,
00:38:21.420 are not familiar with it.
00:38:24.600 So if there's that many not familiar, let me just
00:38:27.140 quickly say, if you're measuring the temperature
00:38:29.560 of the earth, the way that's done is there are
00:38:32.860 sensitive thermometers placed in various parts around
00:38:35.700 the world, and they made sure that they didn't put
00:38:38.720 them near any cities or big concrete areas, because
00:38:42.800 concrete attracts heat in a way that's not actually normal.
00:38:47.620 So the heat readings would be skewed if they were too
00:38:50.900 close to any big cities or anything.
00:38:53.300 So they made sure that they were nice rural places, but
00:38:57.400 then the cities grew.
00:38:59.340 So the thing they were avoiding became the thing that
00:39:03.100 they couldn't avoid, because the cities grew to where
00:39:05.240 the thermometers were.
00:39:06.700 So that means that we don't have a record that would
00:39:10.540 show us with the same kind of thermometers under the
00:39:13.480 same conditions, what the temperature was and what
00:39:16.680 it is now.
00:39:18.860 Now, if you'd never heard that, it wouldn't be unreasonable
00:39:23.200 to think that scientists could measure the temperature
00:39:25.940 of the earth.
00:39:27.120 But once you hear just a little bit about how they do it,
00:39:30.760 it immediately becomes obvious to anybody with experience
00:39:33.460 that it's not real.
00:39:36.900 Now, again, I don't know if the world is getting warmer or not.
00:39:39.880 I don't know if humans are contributing to it and to what
00:39:44.240 degree or if it matters in terms of the future.
00:39:48.100 But I can tell you for sure that we can't measure the temperature
00:39:52.980 of the earth.
00:39:54.440 And we will laugh at that someday.
00:39:56.900 Someday that will be like, you know, finding out if somebody's a
00:40:00.620 witch by seeing if they float.
00:40:02.800 It's going to feel like that.
00:40:04.660 It's like, seriously?
00:40:06.440 In 2024?
00:40:07.940 Are you telling me that three quarters of the planet thought that
00:40:11.520 you could measure the temperature of the earth?
00:40:13.240 We're not even close to being able to do that, in my opinion,
00:40:19.220 in my reality.
00:40:22.560 All right.
00:40:23.440 Cheap fakes seems to be the new hoax, as Mary Katharine Ham was
00:40:27.700 saying, that it's the new Russian disinformation.
00:40:31.820 So I guess cheap fakes is how the Democrats will tell you that the
00:40:36.440 thing you're seeing with your own eyes isn't happening.
00:40:39.060 No, Biden's fine.
00:40:41.760 Oh, no, he's not dead.
00:40:44.320 No, he's not dead.
00:40:46.000 No, look at him move.
00:40:47.280 He's not moving.
00:40:48.700 No, he's fine.
00:40:51.100 I'm pretty sure his limbs are stiff.
00:40:53.080 He's got rigor mortis.
00:40:54.060 He hasn't moved in days.
00:40:55.080 He's fine.
00:40:56.180 He's fine.
00:40:57.140 There's no problem.
00:40:58.720 You and your cheap fakes.
00:41:01.740 So that's happening.
00:41:02.700 The Daily Caller has a story that the military has turned into a vast
00:41:08.820 DEI bureaucracy, and their main thing is to make sure that we've got
00:41:13.380 lots of diversity.
00:41:15.140 This, of course, on paper, should be the end of American military
00:41:21.480 dominance.
00:41:22.060 Because, if you haven't noticed, white men are saying, why would we ever be in
00:41:29.220 the military?
00:41:30.320 And white men are about, I don't know, 90% of the actual shooting people.
00:41:38.380 You know, the people who actually have, like, a weapon and are killing people,
00:41:43.720 the ones knocking down doors, the special forces, you know, the special forces,
00:41:47.520 are, I think, 85% to 90% white guys, with some Hispanic and Asian, but mostly white
00:41:55.580 guys.
00:41:57.680 So, and I'm not saying that's good or bad.
00:42:00.000 It's just what it is.
00:42:01.180 So, if you take the people who are doing the bulk of the fighting, and you say,
00:42:07.740 well, there's no reason for you to do it now, because you're not fighting for what
00:42:11.000 you thought you were.
00:42:12.120 You're fighting to be a second-class citizen.
00:42:14.100 And the military will make sure that when you're in there, you know that your
00:42:18.340 buddies are not equal to you, they're superior, because they're going to get
00:42:21.380 that promotion, even if you're equally qualified.
00:42:24.040 And they'll tell you directly.
00:42:25.560 I mean, you're not, it's not an interpretation.
00:42:28.040 It's, you know, it's the entire thing.
00:42:31.500 So, no, if you're a white man in America, it would, to me, seem somewhat
00:42:36.000 ridiculous to join the military.
00:42:40.060 I would never have said that before DEI, of course.
00:42:44.100 Anyway, so that looks like a bad situation.
00:42:49.360 Imagine being those two astronauts that are trapped on the International Space Station,
00:42:55.160 and they don't have a plan or a timeline to get them back.
00:43:01.740 But this is what the bureaucracy back at their home company, Boeing, is saying, quote,
00:43:06.380 we are taking our time and following our standard mission management team process, he said.
00:43:11.460 We are letting the data drive our decision-making relative to managing the small helium system
00:43:16.960 leaks and thruster performance we observed during rendezvous and docking.
00:43:22.240 So, how would you like to be trapped in space?
00:43:25.680 And the people responsible for getting you back are talking like this.
00:43:31.640 We are taking our time, following our standard mission management.
00:43:35.100 Now, apparently, they're not going to run out of food or water.
00:43:42.180 But imagine the mindset of being trapped in space, and your bosses are like, well, we're
00:43:48.320 not in a super hurry, because, you know, we want to do it right.
00:43:52.800 But more importantly, we want to make sure that the effort to get this right is, you know,
00:43:59.900 properly diverse.
00:44:01.860 So, yes, they do want to save the lives of the astronauts.
00:44:06.560 Duh.
00:44:06.900 Of course, they want to save their lives.
00:44:08.820 It's not as important as diversity, but it's in the top two.
00:44:13.180 It's in the top two.
00:44:15.200 So, it would feel great to know that you were in the top two priorities for saving your life.
00:44:23.120 So, you may be curious, how is Boeing doing on the ESG?
00:44:27.440 You know, the ESG, environmental, social, and governance.
00:44:31.900 So, that would include whether they're promoting enough diverse candidates.
00:44:36.600 But it would also show if they're good for the environment.
00:44:40.540 So, ESG, of course, as you know, is in lots of different companies, and it's a very big
00:44:47.600 thing.
00:44:47.960 And if you're a big company and you don't have a big ESG, you know, push, you're going to
00:44:55.340 be evaluated by independent evaluators.
00:44:58.060 And then you're in trouble.
00:44:59.540 So, what did the independent ESG evaluators say about Boeing?
00:45:04.460 Well, I asked ChatGPT, and if it's correct, this time there were three big entities that
00:45:10.940 rated Boeing for the ESG.
00:45:13.160 One gave them an excellent. Good for them.
00:45:20.340 They're excellent on ESG.
00:45:21.740 The other rating agency gave them, oh, okay.
00:45:25.020 The other one gave them average.
00:45:27.680 So, one said they were excellent on ESG.
00:45:30.160 The other just said average.
00:45:31.500 Then the third one said they're high risk, which is way below average.
00:45:37.160 So, you've got three companies looking at their ESG.
00:45:43.360 One says they're excellent.
00:45:44.660 One says they're average.
00:45:45.540 One says they're high risk.
00:45:47.000 Now, under those circumstances, you might say to yourself, I'm not so sure this ESG stuff
00:45:52.380 is even real.
00:45:53.880 It would appear to me that ratings agencies have no way to even have a common understanding
00:46:00.520 of what ESG is, which is true.
00:46:03.740 It's almost like the entire thing is complete bullshit.
00:46:06.420 Which it is.
00:46:09.000 But Boeing has a big diversity problem, although they do have 27% women, and 61% of
00:46:17.800 their employees are white.
00:46:18.960 But they have a sizable Hispanic and pretty good Asian American population in their ranks.
00:46:29.340 But their percentage of black employees at Boeing is so low that I didn't even get a percentage
00:46:36.240 for it.
00:46:36.700 So, they have a big goal to get their diversity up for black employees specifically.
00:46:43.720 They want to increase it by 25%.
00:46:46.300 So, Boeing is working on improving their diversity specifically within the black community.
00:46:53.260 They're trying to get a 25% increase, but I think that would be 25% on 1%.
00:46:59.720 So, I feel like their goal is to get from 1% to 1.25% or something like that.
00:47:06.760 So, it's very small to also small.
00:47:10.680 But as I said, the good news is that saving those two astronauts is definitely in the top
00:47:17.380 two priorities.
00:47:19.040 So, they should feel good about that.
00:47:20.480 So, Google and a couple of smart people, I guess, are offering a $1 million prize for any AI
00:47:30.860 that can solve simple logic puzzles that humans can solve even when they're kids.
00:47:38.100 Now, apparently, AI is great for doing things where there's pattern recognition, but pattern
00:47:44.800 recognition is not logic.
00:47:46.180 And apparently, it fails completely, all of the models do, the LLMs.
00:47:51.740 They all fail at just even simple little puzzles of logic.
00:47:56.980 And so, there's a new test that's apparently a well-accepted test of logic, which a kid could,
00:48:04.140 you know, a small child could pass in many cases.
00:48:07.320 Not too small, but a child could pass.
00:48:10.020 And you get a million dollars if you can do it.
00:48:13.960 Now, if you spend a lot of time with AI, as I have, as just a user, you very quickly learn
00:48:20.220 that it doesn't have logic, which is weird.
00:48:24.460 Because it seems so smart, but it really doesn't have logic.
00:48:28.800 And you can beat it in an argument, it's almost trivially easy, because it'll spew around patterns.
00:48:36.460 But as soon as you get in a logic trap, it just falls apart and agrees with you or something.
00:48:43.500 So, yeah, it's very obvious that AI cannot do logic.
00:48:47.820 And the fact that they have to offer a million dollars suggests that nobody has any idea how to make it logical.
00:48:56.120 Now, I would like to remind you of something I've been telling you since the start.
00:49:04.340 And this prediction will just get better and better until you all agree.
00:49:09.520 The reason we'll never be able to make machines smart like us, let's say logical like us,
00:49:16.960 is because we only imagine we're logical.
00:49:22.080 And we would never agree even what logic looked like.
00:49:26.740 Now, these tests that I'm talking about, the million dollar challenge, these are just like puzzles.
00:49:32.120 So there's no narrative to them or anything.
00:49:34.460 They're as close as you could get to pure logic.
00:49:37.860 So it might be able to do that.
00:49:39.440 But imagine if it took its pure logic into anything else that mattered.
00:49:44.720 As soon as it did, you'd say, well, I guess it's not working.
00:49:49.920 You know, going back to the conversation I had with Michael Ian Black, suppose AI had logic.
00:49:57.440 And we were to say, you know what?
00:49:59.140 We've got some disagreement about what happened.
00:50:01.440 And some of it's a logical disagreement, let's say, hypothetically.
00:50:05.680 Let's have AI judge us.
00:50:07.460 And then the AI would say, I've looked at your conversation.
00:50:12.580 I've looked at the transcript.
00:50:13.960 And I judge that Michael is being logical and Scott is not being logical.
00:50:18.580 And here's why.
00:50:19.920 What do you think would be my reaction to that?
00:50:22.520 Wow.
00:50:23.680 I didn't realize.
00:50:25.480 I didn't realize I was being so illogical.
00:50:28.200 But now that the superior intelligence of AI has corrected me, I guess I have to rethink my entire life.
00:50:34.420 Here I thought I was logical, but I guess I'm not.
00:50:36.440 No, that would never happen.
00:50:39.740 Even my brain, even knowing that this happens, there's no protection.
00:50:44.640 Here's what I would say.
00:50:46.800 I would say, I guess the AI isn't working because it got the logic wrong.
00:50:51.680 So we can never have logical machines when we as humans can never agree on what is a logical argument.
00:51:00.300 Outside the narrow range of math and puzzles, as soon as you get into any human thing, we just won't agree on what's logical, and we never will.
00:51:10.800 All right.
00:51:12.800 All right.
00:51:14.540 The spies who lied.
00:51:16.340 New York Post had that headline.
00:51:17.880 I love it.
00:51:18.580 Talking about the 51 intelligence people who signed that laptop letter.
00:51:26.100 That Hunter's laptop was probable Russian disinformation when, of course, it wasn't, and it was real.
00:51:32.820 We're finding out that two of the 51 were allegedly, and there's some disagreement about that, actually contractors for the CIA when it happened.
00:51:43.260 And one of them was the organizer of the letter, and the claim is that the organizer of the letter was actively working as a contract employee for the CIA, and he was a former CIA acting director.
00:51:56.000 So the fake letter, allegedly, remember there's a disagreement here, allegedly, the former CIA acting director put it together knowing it was fake.
00:52:12.100 Or at least knowing that, you know, there was no reason to say it was Russian misinformation.
00:52:19.620 So he says that he was not under contract, but there's some document that would suggest he was.
00:52:26.000 Now, how do you interpret that?
00:52:28.440 Generally speaking, if somebody said, I did not work for them, and there was some document that says they did, I feel like I would believe the person over the document, because maybe the document was off on a date or something.
00:52:42.040 But in this case, since the CIA folks are actually allowed to lie, including domestically, if he says, I don't work for the CIA, that really doesn't mean anything.
00:52:54.080 It doesn't mean anything.
00:52:55.240 Because if he worked for the CIA, I'm pretty sure you're allowed to say you didn't.
00:53:00.880 That seems like basic CIA stuff.
00:53:03.520 Do you work for the CIA?
00:53:05.160 Nope.
00:53:05.640 So I'm not sure his denial has any credibility in this specific case.
00:53:12.340 Normally it would.
00:53:14.160 But who knows?
00:53:15.660 So, yes, the people who think that the CIA has been managing the information, let's say, atmosphere in the United States, this would be, let's say, circumstantial evidence to support that.
00:53:31.220 We've learned that among the migrants coming in, the many millions of them, allegedly, according to the Department of Homeland Security, there have been at least 400 migrants who were brought here by ISIS-affiliated smuggling networks.
00:53:48.780 So ISIS has their own smuggling network, and 400 of them, and the whereabouts of more than 50 of them are unknown.
00:53:57.040 But 150 of them have been arrested out of the 400, which means there's a lot of them that we know where they are, but they're not arrested.
00:54:04.900 But anyway, I think you see the size of the problem.
00:54:14.140 Yeah.
00:54:15.560 And I don't know if the Biden administration is taking seriously the threat of getting their pronouns wrong.
00:54:24.200 I mean, it's like it's not even, they're acting like it doesn't even matter or something.
00:54:27.360 And that's, I find that alarming.
00:54:30.340 There could be a lot of people coming in with ISIS affiliations that were misgendering just carelessly.
00:54:40.460 And I think that's got to be taken into account.
00:54:43.300 But I would disagree with how we label the people coming in that are ISIS-affiliated.
00:54:49.600 Let's not call them illegals, please.
00:54:52.960 It's insulting.
00:54:54.700 You know, they're people.
00:54:55.460 They're not illegals.
00:54:58.020 People can't be illegal.
00:55:00.260 There could be illegal acts.
00:55:02.200 But I think it's unfair to call the ISIS operatives coming into the country illegal.
00:55:07.620 I would prefer the term ISIS-positive.
00:55:11.320 They're ISIS-positive.
00:55:13.780 Yeah.
00:55:14.280 Let's take some of the, you know, emotion out of it and just treat them like regular citizens.
00:55:22.400 I hope to God they can vote.
00:55:25.460 All right.
00:55:25.880 You can't tell when I'm being serious, probably, if you're just coming in now.
00:55:29.120 But I'm not that serious.
00:55:32.140 All right.
00:55:32.660 Let's see.
00:55:34.180 Speaker Mike Johnson saying that the House is going to write an amicus brief for Steve Bannon,
00:55:40.680 for his Supreme, I guess, for the Supreme Court that's going to review his situation, where he did not agree, what is it, to talk to the Congress.
00:55:55.240 He refused them.
00:55:56.460 And I guess you could go to jail for that, as Peter Navarro is still in jail.
00:55:59.560 But I guess you can only go to jail for that if you're Republican in today's world.
00:56:05.340 Garland, of course, was in the same situation.
00:56:07.480 But nope, no problem for Garland, because he's not a Republican.
00:56:10.860 Peter Navarro, no such luck.
00:56:12.940 Anyway, an amicus brief just means they're arguing on behalf of Bannon.
00:56:18.760 So it's a friendly argument in favor of him.
00:56:24.640 Now, is that going to work?
00:56:26.100 Because the argument is that the January 6th committee was illegitimately formed.
00:56:31.540 And therefore, there should be no political, there should be no legal implications, or at least no risk of jail.
00:56:39.980 If the people asking him to do the thing, which is appear and testify, were not legitimately formed.
00:56:47.140 And since the House is the group that formed the committee in the first place, if the House concludes that it was illegitimately formed, I would think that would be persuasive to the Supreme Court.
00:56:59.240 I feel like that would matter.
00:57:01.740 But we'll see.
00:57:03.000 And I think this is being sped up because the Supreme Court needs to act pretty soon.
00:57:11.360 Are there any decisions that have come out yet?
00:57:13.420 I think there are some decisions pending that might come out, like, right now.
00:57:17.640 So we've got some big stuff pending.
00:57:20.780 Anyway, here's a little story.
00:57:25.360 Here's a tiny little story.
00:57:27.860 So small, it doesn't mean anything.
00:57:31.480 It's such a little trivial.
00:57:33.880 It's just a nothing of a story.
00:57:36.060 And you should not take anything from it except its complete irrelevance to everything.
00:57:41.560 Okay, just a tiny, tiny little thing.
00:57:44.380 But over in Fulton County in Georgia, there's the attorneys for Fulton County.
00:57:51.820 They're trying to argue.
00:57:54.240 They want to get rid of the injunction.
00:58:00.780 Let's see if I got the legal words right.
00:58:02.960 At the moment, there's a legal order to preserve a bunch of ballots that have been allegedly, allegedly, that witnesses say are fake ballots.
00:58:15.220 So if I understand this correctly, Rasmussen has been talking about this a lot.
00:58:22.280 If I understand it correctly, there's a room, a locked room, in which witnesses have said, we've seen those ballots and they're fake.
00:58:31.320 And all you'd have to do to know if they're fake is unlock it.
00:58:37.040 And the judge has ordered that it should be unlocked.
00:58:40.780 And I think for over a year, it hasn't been.
00:58:43.260 And there's no explanation for why it hasn't been.
00:58:46.140 And nobody's been arrested or threatened for not doing it.
00:58:50.880 Now, after, I don't know, a year or 18 months or whatever, or however long it's been, it's been a long time.
00:58:57.740 After a long time of the court saying it should be opened, now the state wants to make sure that it's never opened
00:59:05.180 and that whatever is in there is destroyed before it's evaluated.
00:59:10.020 So given all of the things that people need to do in their life, all the things that a lawyer could be doing,
00:59:23.220 all the things that a government could be doing, how important is it to have a legal case to destroy evidence
00:59:33.680 that might be the most important evidence that the election was either clean, which would be good to know,
00:59:41.140 or totally rigged, which would be good to know.
00:59:45.420 No matter whether the ballots show something irregular or not, we really need to know,
00:59:52.960 because we have witnesses that say they're fake, and that there would be enough of them that it would totally have changed the election.
00:59:58.860 Why would anybody spend their time, given all the other uses for time,
01:00:04.820 why would they spend it trying to destroy the evidence when all you had to do was unlock the door
01:00:11.960 and then people would look at them and say, oh, okay, these are real,
01:00:16.180 and then you could destroy them legally.
01:00:18.540 You wouldn't need any process.
01:00:20.500 Just unlock it and say, take a look.
01:00:23.000 If you see anything irregular, let us know.
01:00:25.620 So if those ballots were completely fine, why would anybody go to court to have them destroyed before they're looked at?
01:00:38.360 Can you think of any reason to destroy them before they're looked at when there's a credible claim
01:00:45.940 that it would change the entire understanding of America?
01:00:49.540 It wouldn't just change the election.
01:00:51.320 It would change the basic understanding of America.
01:00:57.440 If the witnesses are correct, and I don't know one way or the other.
01:01:01.760 I mean, most of the claims have turned out to be not correct.
01:01:04.520 If you were going to bet on it, you should bet against it.
01:01:08.200 Because really every claim has, you know, not quite proven that the whole election was rigged.
01:01:16.520 There are lots of claims of individual improprieties.
01:01:21.400 So I'm going to say that my assumption is guilt.
01:01:26.680 Remember my standard?
01:01:28.300 If it's an individual, a citizen, you are innocent until proven guilty, and I'm going to die on that hill.
01:01:35.460 But if you're the government, and their lawyers are working for the government, and you do something that's this suspicious,
01:01:44.660 your functioning, working assumption is that the election was rigged.
01:01:49.700 So in my opinion, this is confirmation.
01:01:53.100 I don't need to see anything else.
01:01:54.940 If they're putting this much attention into destroying the best evidence that there might be a problem,
01:02:02.040 instead of embarrassing the other side by showing that there's no problem,
01:02:07.060 especially since the court said they have to,
01:02:10.580 there's only one interpretation I can think of.
01:02:13.560 The working assumption is that the election was rigged,
01:02:16.840 and that it's now demonstrated to my satisfaction.
01:02:19.200 I would say that all my questions are answered by the fact that they can't let you look at these.
01:02:26.820 Would you agree with that?
01:02:28.620 Would you agree that if they're not willing to simply unlock a door,
01:02:33.360 the simplest thing you could ever do,
01:02:35.500 and the court told them to do it,
01:02:37.900 and they're trying to have them burned instead of that,
01:02:41.040 there is no second way to understand that, is there?
01:02:44.840 Can anybody come up with an alternate explanation of why that could be the case?
01:02:50.520 Unless it's exactly what you think it is.
01:02:53.220 I can't think of anything.
01:02:57.660 Well, Trump has regained what I'll call partial free speech.
01:03:02.600 How about that?
01:03:03.840 Let's celebrate that he got partial free speech.
01:03:07.020 So now that the trial is over,
01:03:09.020 that Judge Merchan stuff, he had some gag orders,
01:03:12.260 but now that it's over, I guess Trump can say stuff about Michael Cohen
01:03:17.100 and Stormy Daniels and the other witnesses at the trial.
01:03:20.760 So that's very nice of the judge to give an American citizen his fucking freedom.
01:03:29.880 Oh, thanks.
01:03:31.040 Thanks, Judge Merchan.
01:03:32.480 He was leaning against it, but he decided to do it.
01:03:35.960 He was leaning against it.
01:03:37.960 He's an American citizen.
01:03:39.360 Now, if you're in the actual court case,
01:03:44.220 I understand why, you know, that's a special situation.
01:03:47.840 But once it's done, once you have a result,
01:03:52.980 how in the world do you argue that he shouldn't have free speech
01:03:56.520 like everybody else in the world?
01:03:59.940 That's crazy talk.
01:04:01.300 So, yeah, so Trump gets partial free speech.
01:04:07.320 Thanks.
01:04:10.060 So the Republican-oriented legal entity,
01:04:15.880 America First Legal,
01:04:17.500 I think that's Stephen Miller's creation,
01:04:20.180 that's doing all of the legal responses
01:04:22.720 to all the many bad things that Democrats are doing,
01:04:25.620 continues to be a superstar.
01:04:27.960 To me, this is one of the best things happening in the world right now
01:04:31.400 is that there was a counterforce
01:04:33.540 that the Republicans put together
01:04:35.340 to challenge all of the legal fuckery
01:04:38.400 that was becoming monumental, really.
01:04:41.640 It was just out of control.
01:04:43.140 But now they're just pushing back on everything.
01:04:45.700 Just everything.
01:04:46.760 Just overwhelming pushback on everything you can push.
01:04:49.840 So here now they're sending a directive to all 50 states
01:04:54.840 on how they should prevent illegal aliens,
01:04:58.420 they call them, from voting
01:05:00.060 amidst reports that the illegal migrants
01:05:04.760 are being given voting materials
01:05:08.540 and would be allowed to vote if they don't get detected.
01:05:11.780 So, now, whether or not this works,
01:05:18.480 because one imagines that Democrat states
01:05:20.960 will just do what they want,
01:05:22.360 because they don't really care
01:05:23.520 that America First Legal says this,
01:05:25.640 but you also imagine
01:05:26.840 that if America First Legal tells them
01:05:29.720 that the law requires them
01:05:31.780 to prevent non-citizens from voting,
01:05:35.280 and then they go and they do it anyway,
01:05:37.220 I'm not enough of a lawyer,
01:05:40.240 I'm not a lawyer at all,
01:05:41.540 to know if that makes it a stronger case later
01:05:44.300 if they sue them for not following their own laws.
01:05:47.540 I think it does,
01:05:49.220 because then you could determine
01:05:50.340 that they definitely knew what the law was
01:05:52.220 and that they intended to,
01:05:54.760 you know, they obviously violated it
01:05:56.820 because they knew what it was
01:05:57.660 and they didn't act on it.
01:05:59.520 So, everything that America First Legal does,
01:06:03.560 you know, I retweet a lot of their stuff.
01:06:05.440 I love all of it.
01:06:08.440 Like, I'm not sure every case
01:06:09.800 is as strong as every other case,
01:06:11.640 but you've got to push back
01:06:13.500 on every single thing,
01:06:14.960 and it's got to be really expensive
01:06:16.500 for the other side.
01:06:17.800 And they've got to lose a few.
01:06:19.840 I mean, it's just necessary.
01:06:21.020 It's mutually assured destruction.
01:06:23.240 And my big question is,
01:06:25.620 is this America First Legal sufficiently funded?
01:06:29.820 Because Republicans should be putting
01:06:32.300 a billion dollars into this.
01:06:33.860 Like, I don't know what their budget is.
01:06:36.320 It's not a billion dollars,
01:06:37.740 but it should be.
01:06:39.600 I think the Republicans should think
01:06:41.500 in terms of a $1 billion budget
01:06:43.800 just for the legal stuff,
01:06:46.120 just for legal pushback.
01:06:47.520 Because the world has turned into,
01:06:49.100 that's the whole fight.
01:06:50.860 The elections are almost irrelevant.
01:06:53.400 It's more who wins the legal battles
01:06:55.320 about how the election will be held.
01:06:56.880 Because you know if there are legal battles
01:06:59.140 to say we'll have massive mail-in ballots
01:07:04.220 and no signature requirements.
01:07:06.700 Let's say that's a change before an election.
01:07:09.500 That's the election.
01:07:11.340 That's the whole election.
01:07:13.020 It's the rules.
01:07:14.060 So if the only thing that matters
01:07:16.300 to the outcome is the rules
01:07:18.280 and we get to change them before the election,
01:07:21.440 then the only thing that matters
01:07:23.300 is how big the budget is
01:07:24.760 for the legal entities fighting each other.
01:07:27.760 And so I think Republicans,
01:07:30.520 you need to think about $1 billion
01:07:34.000 as an annual budget
01:07:35.440 just for the legal push.
01:07:37.320 And I believe also that this should cover,
01:07:40.520 I think that that billion
01:07:41.620 should go to support people
01:07:43.060 like Peter Navarro and Bannon
01:07:44.660 and all the lawyers
01:07:46.520 who got destroyed financially
01:07:48.160 with the lawfare.
01:07:50.320 There should be a lawfare protection insurance
01:07:54.340 that is basically,
01:07:56.400 if the Democrats come after you
01:07:57.820 with lawfare that's bullshit,
01:07:59.300 like they did with all the January 6th lawyers
01:08:01.360 that were Republicans,
01:08:02.600 then there should be some gigantic protective fund
01:08:06.020 that says we're going to make this free
01:08:08.140 and maybe even pay to keep the family alive
01:08:13.120 because the person being sued
01:08:15.660 won't be able to work.
01:08:17.860 So $1 billion, I think, is the target.
01:08:22.420 And if you're thinking,
01:08:23.520 no, Scott, they could do it for $20 million.
01:08:28.280 Screw that.
01:08:29.700 $20 million is a loser number.
01:08:31.800 They need a $1 billion fund
01:08:34.720 a large part of that
01:08:37.400 to just protect Republicans
01:08:39.220 who are being law-fared, like Trump.
01:08:41.340 I think Trump's entire legal bill
01:08:43.080 should be covered by an entity
01:08:45.020 that's just protecting all the Republicans
01:08:47.040 who are just trying to live their life
01:08:48.620 and didn't know they were committing any crimes.
01:08:51.960 They should be getting
01:08:52.820 all the January 6th people out.
01:08:54.940 So $1 billion should be their budget.
01:08:57.480 And if you're listening,
01:09:00.660 tell your boss I said $1 billion should be your budget
01:09:04.660 or whoever you get your money from.
01:09:06.540 I don't know who you get your money from.
01:09:09.580 Well, cyborg technology is becoming big.
01:09:12.240 You've heard that there are organic,
01:09:14.040 like fake brains
01:09:15.300 that were just grown in a lab
01:09:17.460 that are used as computers.
01:09:18.860 They're like a million times better
01:09:20.420 on energy consumption.
01:09:22.100 I don't know how big that'll get,
01:09:23.660 but if it's a million times better on energy,
01:09:26.160 that's pretty impressive.
01:09:27.660 Now we have these bees
01:09:29.220 that are being hooked up
01:09:30.800 to sensors in their brains.
01:09:34.080 So apparently you can do brain surgery
01:09:36.200 on a bumblebee.
01:09:38.260 So I don't know if they're bumblebees.
01:09:40.040 Honeybees.
01:09:41.000 They're honeybees.
01:09:42.100 So honeybees have an intense ability to smell
01:09:44.940 and they can even smell lung cancer.
01:09:48.000 And you can't trust the bees
01:09:50.360 to smell it and then tell you
01:09:51.740 because bees are terrible with language.
01:09:54.100 You say, hey bee,
01:09:56.500 do you smell any lung cancer?
01:10:00.000 And the bee is going to go,
01:10:01.160 useless, totally useless.
01:10:05.180 So instead they hook up the bee
01:10:06.740 to some machines
01:10:07.640 that can detect
01:10:09.520 that it's detecting some lung cancer.
01:10:12.520 So the bee's brain
01:10:14.180 becomes part of the machine, basically.
01:10:15.820 It's like a cyborg device.
01:10:17.660 And so if you've got all that,
01:10:23.060 I feel like we're going to have
01:10:24.120 more and more cyborgs.
01:10:25.720 But the cyborgs are not necessarily
01:10:28.020 a human being plus a machine.
01:10:30.920 It could be a fake organic brain
01:10:33.760 plus machine.
01:10:35.480 It could be a fake honeybee's brain
01:10:38.220 plus machine.
01:10:39.760 So we're going to have this weird
01:10:42.240 cyborg world
01:10:45.300 where it's not a human plus machine.
01:10:47.580 We'll have that as well.
01:10:48.660 But it will be
01:10:49.700 all manner of organic things
01:10:52.400 plus machines,
01:10:53.420 at least for some period of history.
01:10:57.340 All right.
01:10:58.160 So we keep talking about the Ukraine war
01:11:00.660 pushing Russia and China together
01:11:02.460 and they'll be best friends
01:11:04.140 and Russia has resources
01:11:05.340 and China is buying 20% of their stuff
01:11:07.720 or 20% of what China buys
01:11:10.120 is from Russia
01:11:10.900 or something like that.
01:11:11.920 But China is buying
01:11:12.640 a lot of resources from Russia,
01:11:14.560 natural resources,
01:11:16.320 energy mostly.
01:11:19.060 And here's what I would say.
01:11:22.660 And this is one of those filters
01:11:24.540 that comes with age.
01:11:26.900 When you look on paper
01:11:28.380 and say, all right,
01:11:30.220 the things that NATO
01:11:31.180 and the United States
01:11:32.140 and Europe are doing
01:11:32.920 are pushing Russia toward China
01:11:34.840 and that's all bad
01:11:35.800 because they're both adversaries
01:11:37.940 in a way.
01:11:39.300 I'm going to add this filter.
01:11:41.120 This is not predictable.
01:11:43.580 None of this is predictable.
01:11:45.680 On paper,
01:11:46.580 it looks like a disaster.
01:11:48.440 It looks like
01:11:49.180 our adversaries,
01:11:50.920 we just made them friends.
01:11:52.680 But they were sort of already friends.
01:11:54.580 They have a common border.
01:11:55.920 I mean,
01:11:56.560 they're pretty intent
01:11:57.840 on avoiding war with each other
01:11:59.320 and they're very intent
01:12:00.920 on being trading partners
01:12:02.400 because they have a common border
01:12:03.820 and war would be crazy.
01:12:05.500 So it's not like a big change
01:12:07.860 that China and Russia
01:12:09.280 are becoming best friends.
01:12:11.100 But think of all the ways
01:12:12.420 this could go wrong.
01:12:14.380 And also think:
01:12:15.820 have we ever been in a situation
01:12:17.320 where our adversaries
01:12:19.140 were also our suppliers
01:12:20.440 and we were their customers
01:12:22.120 in a very big way?
01:12:24.520 China is such a weird situation
01:12:25.980 that we call them an adversary
01:12:27.440 while we're doing business
01:12:29.020 with them like crazy.
01:12:30.000 You can't really
01:12:32.200 get to a war
01:12:34.460 with somebody
01:12:36.300 who's your business partner.
01:12:38.400 And I'm wondering
01:12:39.100 if that's ever happened.
01:12:41.300 Has there ever been a war
01:12:42.860 between two countries
01:12:44.680 that had a really strong
01:12:46.200 trading situation?
01:12:48.740 I don't know enough
01:12:49.740 about history
01:12:50.240 to know if that has happened.
01:12:52.100 Because I feel like
01:12:53.000 usually what happens
01:12:54.080 is that war is about resources.
01:12:56.420 You know,
01:12:56.720 unless it's a crazy kind of war.
01:12:58.380 And that resources,
01:13:01.000 you would go after a country
01:13:03.200 that you didn't have
01:13:03.840 anything to do with
01:13:04.900 so you could get
01:13:06.480 the resources.
01:13:07.740 You know,
01:13:08.040 that would be colonization.
01:13:10.840 But why would you attack
01:13:12.060 somebody that you already have
01:13:13.500 a gigantic economic machine
01:13:16.060 that's working for both of you?
01:13:18.860 Has that ever happened?
01:13:20.720 You know,
01:13:20.980 you've heard
01:13:21.420 no two countries
01:13:22.900 that have McDonald's
01:13:23.760 have ever gone to war
01:13:24.540 with each other,
01:13:25.140 but I think that's happened
01:13:26.140 by now.
01:13:26.780 Maybe you,
01:13:27.140 yeah,
01:13:28.520 I think that's happened
01:13:29.320 by now.
01:13:30.520 But more than just
01:13:32.680 having a McDonald's,
01:13:34.180 when you have
01:13:35.380 a gigantic trading situation
01:13:37.780 that you both need
01:13:38.980 or want,
01:13:39.880 how do you ever
01:13:40.740 get into a war?
01:13:42.600 Now,
01:13:43.040 here's the other thing
01:13:43.820 that I don't think
01:13:45.360 is obvious.
01:13:46.520 If you imagine
01:13:47.500 that Russia and China
01:13:48.700 work together
01:13:49.440 and that it's good
01:13:50.640 for both of them,
01:13:51.880 I don't think
01:13:52.840 you've met China.
01:13:53.560 I don't think
01:13:56.180 China wants Russia
01:13:57.800 to do really well.
01:13:59.620 They just sort of
01:14:00.980 want to buy their stuff.
01:14:03.360 And the nature
01:14:04.620 of business
01:14:05.220 is that if you can do it,
01:14:06.940 you will dominate
01:14:07.680 your trading partners.
01:14:09.100 If you can do it.
01:14:10.620 If you can't do it,
01:14:11.400 you won't.
01:14:12.240 But if you can,
01:14:13.660 well,
01:14:13.900 you're definitely
01:14:14.300 going to do it.
01:14:15.540 So what happens
01:14:16.280 to Russia's autonomy
01:14:17.540 when China
01:14:19.260 is completely
01:14:20.420 responsible
01:14:21.300 for their survival?
01:14:22.300 It's not so good,
01:14:25.000 is it?
01:14:26.040 Not so good.
01:14:28.720 Now,
01:14:29.960 so here's
01:14:30.920 the unpredictable part.
01:14:33.140 The unpredictable part
01:14:34.540 is,
01:14:35.400 are we sure
01:14:35.880 that it's good
01:14:36.520 for Russia
01:14:37.140 that they're
01:14:38.640 becoming more
01:14:39.340 dependent on China?
01:14:41.260 I don't know.
01:14:42.960 I mean,
01:14:43.220 it's probably
01:14:43.540 good for China,
01:14:44.560 but, you know,
01:14:45.720 China was going
01:14:46.320 to trade
01:14:46.640 with Russia anyway.
01:14:47.680 I mean,
01:14:48.200 it's not like
01:14:48.920 there was something
01:14:49.380 preventing them
01:14:50.260 from doing trade
01:14:51.000 before.
01:14:51.400 I'm not even
01:14:52.500 sure there's
01:14:52.900 any difference
01:14:53.440 in how much
01:14:53.860 they're trading
01:14:54.360 unless they
01:14:55.760 got a price
01:14:56.200 advantage
01:14:56.760 or something.
01:14:58.240 So,
01:14:58.820 I guess
01:15:00.500 all I'm going
01:15:00.860 to say
01:15:01.100 is that
01:15:01.540 the thing
01:15:02.540 that you
01:15:03.080 can predict
01:15:03.920 about Ukraine
01:15:05.300 pushing Russia
01:15:06.060 and China
01:15:06.460 together
01:15:06.920 is that
01:15:07.920 it's unpredictable.
01:15:09.380 It could go
01:15:10.040 totally wrong
01:15:10.720 for Russia.
01:15:11.860 It could be
01:15:12.280 the worst thing
01:15:12.820 they've ever done.
01:15:14.180 You just don't know.
01:15:15.300 The other thing
01:15:16.020 I would offer
01:15:17.500 is that
01:15:18.820 we assume
01:15:19.740 that all three
01:15:20.620 countries
01:15:21.060 are ready
01:15:21.540 for war
01:15:22.100 at any time.
01:15:23.320 You know,
01:15:23.500 China,
01:15:24.600 totally big military,
01:15:25.960 biggest military.
01:15:26.960 They're building
01:15:27.700 out their naval
01:15:28.340 capacity.
01:15:29.120 They got nukes
01:15:29.980 like crazy.
01:15:31.220 Man,
01:15:31.700 could they go
01:15:32.400 to war.
01:15:33.560 United States,
01:15:34.580 well,
01:15:34.900 strongest military
01:15:35.700 in the world.
01:15:36.520 We've got everything.
01:15:37.500 We've got bombs.
01:15:38.420 We've got new tech.
01:15:39.220 Oh, man,
01:15:39.680 you wouldn't want
01:15:40.140 to mess with us.
01:15:41.680 Russia?
01:15:42.360 Oh, my goodness.
01:15:43.840 Look at Russia's
01:15:44.600 military.
01:15:45.520 They're just playing.
01:15:46.660 They're just toying
01:15:47.280 with Ukraine.
01:15:48.260 They could take them
01:15:48.980 over any time
01:15:49.540 they wanted
01:15:49.960 if they wanted
01:15:50.580 to take the casualties.
01:15:52.300 But they're holding
01:15:53.040 back a little bit.
01:15:53.940 You know,
01:15:54.160 NATO probably.
01:15:55.420 So we've kidded
01:15:56.620 ourselves into thinking
01:15:57.840 there are three tigers.
01:16:01.420 But that's largely
01:16:02.500 based on the lying
01:16:03.620 that comes from
01:16:04.260 each of those countries.
01:16:05.700 Because they're not
01:16:06.420 going to tell you
01:16:06.960 if they have any weakness
01:16:07.920 in their military.
01:16:09.700 And what I wonder
01:16:10.620 is if any of those
01:16:12.020 three militaries
01:16:12.820 could handle the war
01:16:14.020 after the first
01:16:15.980 two weeks.
01:16:17.900 I don't even know
01:16:18.700 if they can.
01:16:19.480 I have a feeling
01:16:20.560 that all three of them
01:16:21.840 are just full of shit
01:16:22.740 and none of us
01:16:23.740 are ready for war.
01:16:25.940 And that we'd all
01:16:26.800 run out of fuel
01:16:27.500 in about 10 minutes.
01:16:30.180 That's what I think.
01:16:32.440 That's why I think
01:16:33.460 that the future
01:16:34.020 is just drone war
01:16:35.200 and nothing else.
01:16:37.700 I saw somebody describe,
01:16:39.260 I wish I could remember
01:16:39.940 who said it
01:16:40.400 because it was so brilliant,
01:16:42.760 that in the future
01:16:44.060 there will be massive,
01:16:45.120 you know,
01:16:45.380 swarms of drones
01:16:47.100 and that would be
01:16:47.640 the main way of attack.
01:16:48.780 I think that's
01:16:49.220 obviously true.
01:16:50.660 But somebody said
01:16:51.880 it's like using
01:16:52.740 nuclear weapons
01:16:53.820 but without the radiation.
01:16:56.940 So imagine
01:16:57.780 that a gigantic swarm
01:16:59.280 goes over a city,
01:17:00.880 you know,
01:17:01.100 a city you wanted
01:17:02.620 to attack for some reason
01:17:03.740 and the drones
01:17:04.860 just kill
01:17:05.380 every living person.
01:17:07.380 But they keep
01:17:08.280 the infrastructure
01:17:09.000 and they leave
01:17:09.820 no radiation.
01:17:10.980 They just literally
01:17:11.980 kill anybody
01:17:12.620 who walks outside
01:17:13.340 and if they don't
01:17:15.040 walk outside,
01:17:15.720 they're going to
01:17:16.020 starve to death.
01:17:17.100 So it basically
01:17:18.260 just kills
01:17:18.820 all the people,
01:17:20.780 but leaves all
01:17:21.280 the buildings
01:17:21.840 and no radiation.
01:17:24.800 That's dangerous
01:17:25.800 because at least
01:17:28.280 with a nuclear attack,
01:17:30.160 you could be assured
01:17:31.400 that they would attack
01:17:32.260 back so you won't do it.
01:17:34.160 But when the drones
01:17:35.380 can do that
01:17:36.440 and it's guaranteed
01:17:37.460 that they will be able to
01:17:38.300 really soon,
01:17:39.720 what's to stop you
01:17:41.320 from using them,
01:17:42.020 if you were looking
01:17:44.940 at a country
01:17:45.440 that didn't have drones?
01:17:47.200 If they didn't have nukes
01:17:48.260 and they didn't have drones
01:17:49.220 and you could darken
01:17:50.600 their sky with drones
01:17:51.540 and basically kill
01:17:52.800 anybody you wanted to,
01:17:54.560 that's going to be
01:17:55.140 a tough thing
01:17:55.620 to keep in the box.
01:17:57.660 So look for that.
01:18:00.360 All right.
01:18:01.000 Well,
01:18:01.400 Biden has got
01:18:02.040 the all-important
01:18:02.640 Adam Kinzinger endorsement.
01:18:05.360 Did not see that coming.
01:18:06.820 So I've said
01:18:10.900 that there are no men,
01:18:12.780 no straight men
01:18:14.060 who are also Democrats.
01:18:16.920 And then Adam Kinzinger,
01:18:19.360 he endorses him.
01:18:22.920 So is that proving me wrong
01:18:26.180 or is that the exception
01:18:28.040 that proves the rule?
01:18:29.860 Well, it's not quite that,
01:18:31.620 but I have to say
01:18:32.720 it's an exception
01:18:33.340 if you get paid
01:18:34.360 for your opinion.
01:18:35.460 Correct me if I'm wrong,
01:18:37.580 but Adam Kinzinger
01:18:38.440 has found a way
01:18:39.180 to monetize
01:18:40.060 being Adam Kinzinger.
01:18:42.960 So if somebody
01:18:43.760 has found a way
01:18:44.320 to monetize
01:18:45.140 being the way they are
01:18:46.360 and then they give
01:18:47.280 an announcement,
01:18:48.100 I'm going to be the way
01:18:49.160 that I have monetized,
01:18:50.760 I think we should ignore it.
01:18:53.880 But I would like
01:18:54.720 to point out
01:18:55.300 that there are
01:18:56.620 other important
01:18:57.820 men
01:19:00.360 who may be endorsing Biden.
01:19:02.560 So Kinzinger
01:19:03.960 won't be the last.
01:19:05.460 For example,
01:19:06.520 there's still
01:19:06.980 the Sam Bankman-Fried
01:19:08.160 endorsement.
01:19:09.520 I don't know
01:19:10.040 if he's said anything yet.
01:19:11.800 He's in jail,
01:19:12.620 so it's hard to know.
01:19:14.400 And then there will be
01:19:15.220 some men
01:19:15.620 who are deeply uninformed.
01:19:18.380 So you've got Adam Kinzinger,
01:19:20.780 you've got Sam Bankman-Fried,
01:19:22.960 you've got people
01:19:23.680 who are paid consultants
01:19:26.340 and operatives,
01:19:27.600 and then men
01:19:30.540 who are not paying attention.
01:19:32.660 So he does have some men.
01:19:34.280 It's not like a zero.
01:19:36.240 He's got some.
01:19:39.140 All right.
01:19:40.160 Do you follow Simon Ateba
01:19:41.880 on X?
01:19:44.360 Highly recommended.
01:19:46.020 I think he should have
01:19:46.780 a million followers.
01:19:47.720 He's one of my favorite follows.
01:19:48.940 So he writes
01:19:50.240 for an African publication,
01:19:52.160 but he does a lot of
01:19:53.080 stuff you'd like
01:19:54.740 in what you would identify
01:19:57.800 as a pro-Trump format.
01:20:01.000 Anyway,
01:20:01.500 so he's a great poster.
01:20:03.880 And his theory yesterday,
01:20:07.260 he says,
01:20:08.480 this is what Simon says
01:20:10.560 about himself.
01:20:11.560 He says,
01:20:12.040 I may actually be
01:20:13.140 an unrecognized genius.
01:20:14.960 Now,
01:20:15.160 if that's not enough
01:20:16.060 to follow him,
01:20:18.000 just take a look.
01:20:20.100 Right?
01:20:20.540 He's just really interesting.
01:20:22.420 So,
01:20:22.720 I mean,
01:20:22.980 he's got a Trump-like personality.
01:20:24.780 I may actually be
01:20:25.600 an unrecognized genius.
01:20:27.220 He says,
01:20:27.720 Governor Doug Burgum
01:20:29.160 says he will be
01:20:30.760 at the debate on Thursday,
01:20:32.600 and Trump has said
01:20:33.440 his VP pick
01:20:34.300 would be at the debate.
01:20:37.140 So Simon is thinking
01:20:38.380 that the more he sees
01:20:39.820 of Doug Burgum,
01:20:40.720 the more he thinks
01:20:41.680 that he might be
01:20:42.300 a good VP pick,
01:20:43.540 but attending the debate
01:20:45.280 is not much of a tip-off
01:20:46.740 because Vivek said
01:20:49.020 he's also attending
01:20:49.780 the debate.
01:20:50.960 I would expect
01:20:51.860 that you would see
01:20:52.620 the other names
01:20:54.560 like Rubio
01:20:55.800 and maybe J.D. Vance,
01:20:57.260 but at least J.D. Vance,
01:20:58.980 also saying
01:20:59.720 that they'll attend.
01:21:01.560 By now,
01:21:02.800 I think all the folks
01:21:04.240 who are in the running
01:21:05.460 know who has been picked,
01:21:08.060 or at least they know
01:21:08.920 they haven't been.
01:21:10.140 And I think
01:21:10.740 they're all in on
01:21:11.640 the fact that
01:21:12.420 it's going to be fascinating
01:21:13.460 to have them all
01:21:14.880 in the audience
01:21:15.500 so that we can keep
01:21:16.740 speculating who it is.
01:21:18.340 So Trump,
01:21:19.560 Trump knows how
01:21:20.280 to put it on a show,
01:21:21.480 and he always teases you,
01:21:23.200 so he just makes it
01:21:24.180 like you're sitting
01:21:24.700 on the edge.
01:21:25.160 Oh, who is it?
01:21:26.340 Who is it?
01:21:27.020 I am so curious now
01:21:28.600 when normally
01:21:29.280 I wouldn't be
01:21:29.900 because he's just
01:21:31.320 keeping up
01:21:32.420 the mystery of it,
01:21:33.680 and he's leaving
01:21:34.320 little hints,
01:21:34.980 and he's just doing
01:21:36.100 that Trump thing
01:21:36.860 that nobody else can do.
01:21:38.380 Nobody can do this.
01:21:40.360 Let me say that clearly.
01:21:42.140 Nobody else can do this.
01:21:44.060 This is just purely
01:21:45.400 a Trump skill
01:21:47.060 that just nobody else has,
01:21:49.000 to put on the show,
01:21:50.860 and he's put on
01:21:51.320 a hell of a show.
01:21:52.220 So you should expect
01:21:53.500 at least most
01:21:55.320 of the candidates
01:21:56.540 that were in
01:21:57.220 the discussion
01:21:58.020 to attend.
01:21:58.580 And it doesn't even mean
01:22:01.720 that he's going
01:22:02.240 to pick one of them
01:22:02.960 because, remember,
01:22:04.140 it's all part of the show.
01:22:06.200 If he surprises you,
01:22:08.000 well, that's part
01:22:08.540 of the show, too.
01:22:09.940 He's not obligated
01:22:10.980 to pick one of the ones
01:22:11.820 that are on your top five.
01:22:13.500 Could be a total stranger
01:22:14.700 or somebody you don't know.
01:22:17.620 But I am going to
01:22:19.400 play along
01:22:21.540 just for fun.
01:22:23.500 All right?
01:22:24.620 I had told you
01:22:25.560 I've been dismissing
01:22:26.600 Doug Burgum
01:22:27.320 every time he comes up.
01:22:29.880 And usually I do it
01:22:30.960 without an argument.
01:22:32.220 I usually say,
01:22:32.980 well, it's not Doug Burgum,
01:22:34.100 and I talk about
01:22:34.920 the other ones
01:22:35.460 who are more interesting.
01:22:37.060 Here's what I've been
01:22:37.960 getting wrong.
01:22:39.920 And I think Simon
01:22:40.740 might be right.
01:22:42.700 Simon Ateba.
01:22:43.940 And I think he might be
01:22:45.160 an unrecognized genius.
01:22:47.420 Could be.
01:22:48.440 He might be.
01:22:50.200 Because here's the thing
01:22:51.700 we always get wrong
01:22:53.040 when looking at
01:22:53.740 the vice president.
01:22:55.300 You say to yourself,
01:22:56.420 well, Trump needs a,
01:22:57.380 you know,
01:22:57.680 photogenic,
01:22:59.380 exciting,
01:23:01.900 pro-Trump,
01:23:03.060 you know,
01:23:03.340 somebody who's really
01:23:04.100 a fighter.
01:23:05.040 You know,
01:23:05.300 you have all these
01:23:06.560 qualities
01:23:07.180 that you want
01:23:09.200 to sort of match
01:23:10.060 Trump,
01:23:10.880 but be the VP version.
01:23:13.280 That's not how
01:23:14.060 VPs work.
01:23:15.820 The reason it's not
01:23:17.060 going to be Vivek
01:23:18.060 is because of Vivek.
01:23:21.740 Did that make sense?
01:23:24.220 Vivek is way too strong.
01:23:26.960 So at one point,
01:23:28.520 it sort of sounded
01:23:29.160 like it made sense.
01:23:30.840 But the stronger
01:23:31.960 Vivek gets,
01:23:33.460 it makes less
01:23:35.300 and less sense.
01:23:36.500 Because you could
01:23:37.320 use him better
01:23:38.040 in any other capacity.
01:23:39.520 Vice president
01:23:40.060 can be wasted.
01:23:41.380 A vice president,
01:23:42.480 what you want,
01:23:43.300 and remember I said
01:23:43.940 this about Pence,
01:23:45.300 you want the most
01:23:46.840 boring person
01:23:47.940 who looks like
01:23:49.480 they could do the job.
01:23:50.540 That's what you want.
01:23:53.420 The most boring person
01:23:55.340 who looks like
01:23:56.800 they could do the job.
01:23:58.800 Name anybody
01:23:59.640 who fits that standard
01:24:00.900 better than
01:24:01.600 Doug Burgum.
01:24:05.500 Doug Burgum
01:24:06.600 is the ultimate
01:24:07.580 boring guy
01:24:08.520 who could totally
01:24:09.620 do the job.
01:24:10.880 And he's,
01:24:11.760 I guess he gets
01:24:12.180 along with Trump.
01:24:13.260 That's all you need.
01:24:14.820 Gets along with Trump,
01:24:16.760 seems, you know,
01:24:17.700 totally on board.
01:24:20.000 Could do the job.
01:24:22.140 And is boring as fuck.
01:24:25.260 That's sort of
01:24:26.100 the gold standard
01:24:27.060 for a vice president.
01:24:28.960 And I'm embarrassed
01:24:30.600 that I didn't see it earlier.
01:24:33.500 Now,
01:24:34.600 just for fun,
01:24:36.060 I'm going to predict
01:24:38.780 that Simon is right.
01:24:38.780 And that,
01:24:39.620 and that the pick
01:24:40.680 will be Burgum.
01:24:44.060 I know most of you
01:24:45.260 are saying,
01:24:45.800 but, but, but,
01:24:46.580 wouldn't it be
01:24:47.620 just free
01:24:48.460 to pick a black
01:24:51.000 vice president
01:24:52.440 and just sort of,
01:24:53.520 you know,
01:24:54.200 get some of that goodness
01:24:55.220 and get a little
01:24:56.400 goodwill
01:24:57.120 and make it harder
01:24:58.460 for them to call you
01:24:59.440 a racist and all that.
01:25:02.160 Normally, yes.
01:25:04.320 But in the,
01:25:05.420 in the context
01:25:06.360 of DEI,
01:25:07.720 I'm not sure
01:25:09.500 it is the right move.
01:25:11.680 Because here's the thing.
01:25:13.920 I think Tim Scott's
01:25:15.260 very solid.
01:25:16.880 But if he got picked
01:25:18.220 in the context
01:25:19.100 of other people
01:25:19.980 also being solid,
01:25:21.200 because there are
01:25:21.540 quite a few good choices,
01:25:23.620 wouldn't you say
01:25:24.360 to yourself
01:25:24.860 just a little bit,
01:25:27.180 it felt a little bit
01:25:28.540 DEI.
01:25:30.240 And that would be
01:25:31.180 terribly unfair
01:25:31.820 to Tim Scott,
01:25:32.860 who's had a very
01:25:33.840 successful career
01:25:34.740 and has every qualification
01:25:35.840 for president,
01:25:37.180 much less vice president.
01:25:38.760 So terribly unfair.
01:25:40.020 but it would feel like
01:25:42.300 maybe.
01:25:44.260 Right?
01:25:45.660 But if he picks
01:25:46.680 Burgum,
01:25:48.100 you're going to say
01:25:49.280 to yourself,
01:25:50.040 well, there is somebody
01:25:50.940 committed to picking
01:25:51.860 the right choice
01:25:52.680 and he didn't go
01:25:54.140 with the easy,
01:25:55.180 popular,
01:25:56.440 you know.
01:25:57.980 I saw a poll
01:25:58.840 that
01:25:59.160 Byron,
01:26:02.700 why am I forgetting
01:26:03.960 his name?
01:26:04.480 Byron,
01:26:05.020 you'll give me
01:26:05.480 his last name.
01:26:07.840 But he got
01:26:08.360 a pretty low,
01:26:09.840 he got a pretty low
01:26:11.980 rating from people
01:26:12.820 in terms of
01:26:13.380 vice president choice.
01:26:14.680 So that would be
01:26:15.440 a good enough reason
01:26:16.120 not to pick him.
01:26:17.880 I think
01:26:18.520 Dr. Carson,
01:26:20.600 Ben Carson,
01:26:22.620 he's never been
01:26:24.440 in politics.
01:26:26.020 Well,
01:26:26.500 not elected politics.
01:26:28.480 He hasn't been
01:26:29.200 an elected politician.
01:26:31.140 And I'm not sure
01:26:32.040 that's the right choice
01:26:33.000 for a VP.
01:26:33.780 I would make
01:26:35.740 an exception
01:26:37.000 for Vivek
01:26:37.000 because he's
01:26:39.000 shown that he can
01:26:39.980 pick up
01:26:40.440 basically anything
01:26:41.640 quickly.
01:26:42.980 And I think
01:26:43.720 Carson could pick up
01:26:44.620 anything quickly too,
01:26:45.540 but he's 72.
01:26:47.240 Like,
01:26:47.720 you don't really
01:26:48.180 pick the 72-year-old
01:26:49.680 non-politician
01:26:51.740 for vice president.
01:26:53.480 Could he do the job?
01:26:54.440 Probably.
01:26:55.100 Yeah,
01:26:55.380 I think he could
01:26:55.980 do the job.
01:26:56.980 But the most obvious
01:26:58.500 one is somebody
01:26:59.220 who's been in politics,
01:27:00.360 is really boring,
01:27:01.580 gets along with Trump,
01:27:02.520 and completely ignoring
01:27:04.100 any DEI aspects
01:27:05.480 to it.
01:27:07.860 I don't know.
01:27:09.480 Burgum's a possibility.
01:27:10.860 But let me tell you,
01:27:12.200 because I say this
01:27:13.060 all the time,
01:27:14.780 vice presidential
01:27:17.020 speculation is not
01:27:17.960 something I have
01:27:18.560 any skill for.
01:27:19.920 I don't have any
01:27:20.940 filters or background
01:27:22.200 that would make me
01:27:22.840 good at it,
01:27:23.540 and I've never got
01:27:24.520 one right yet.
01:27:25.240 So given that
01:27:27.020 I'm going to agree
01:27:28.460 with Simon,
01:27:29.220 think of it more
01:27:30.180 as Simon's prediction.
01:27:32.240 That way,
01:27:32.920 if it's wrong,
01:27:34.080 you can blame Simon.
01:27:35.780 And if it's right,
01:27:37.400 maybe he's right.
01:27:38.500 He might be
01:27:38.960 an unrecognized genius.
01:27:41.100 All right,
01:27:41.340 so follow him.
01:27:44.140 RFK Jr.
01:27:44.900 is going to hold
01:27:45.340 his own debate
01:27:46.540 at the same time
01:27:47.760 as the other debates,
01:27:49.140 except the way
01:27:50.100 he's going to do it
01:27:50.840 is he's going to
01:27:51.500 answer the questions
01:27:52.500 that are happening live.
01:27:54.520 So I think what he'll do
01:27:55.620 is listen to the question
01:27:56.680 and then turn it off,
01:27:58.680 and then he'll answer it
01:28:00.480 as if the question
01:28:01.260 had been asked to him.
01:28:03.320 Now,
01:28:03.860 I love that.
01:28:05.060 I absolutely love that
01:28:06.260 because he had to do
01:28:07.520 something,
01:28:08.680 you know,
01:28:08.940 not nothing,
01:28:10.040 and that's some
01:28:11.680 good counter-programming.
01:28:14.600 But here's what he said
01:28:15.960 about the likelihood
01:28:17.120 that Trump would win
01:28:18.140 the debate,
01:28:18.860 and I want you to listen
01:28:20.000 to the exact wording
01:28:21.100 and see how much
01:28:22.940 you appreciate this.
01:28:24.520 Because my take
01:28:26.940 on RFK Jr.
01:28:27.860 is that
01:28:28.260 I don't line up
01:28:29.400 with a number
01:28:30.860 of his important
01:28:31.700 policy preferences.
01:28:33.840 But I've never
01:28:34.860 disagreed with anyone
01:28:36.320 and liked them
01:28:38.000 more than him.
01:28:39.420 He is the most
01:28:40.320 likable fucking guy,
01:28:42.780 even when you
01:28:43.500 disagree with him.
01:28:44.660 And man,
01:28:45.160 I don't know
01:28:45.480 if that's just
01:28:45.920 a Kennedy thing
01:28:46.840 that they all
01:28:48.440 just learned it
01:28:49.080 at birth or something.
01:28:49.920 But here's what
01:28:51.140 he says about Trump
01:28:52.100 in this big
01:28:53.080 contentious world
01:28:54.280 where everybody's
01:28:55.420 got to be mean
01:28:56.180 and I'm the best
01:28:57.020 and you're shit
01:28:57.940 and all this.
01:28:58.980 Listen to what
01:28:59.680 he says about Trump.
01:29:01.540 He said,
01:29:02.340 quote,
01:29:02.740 I would predict
01:29:03.560 that Trump will win
01:29:04.420 because I really,
01:29:05.520 I think Donald Trump
01:29:06.560 is,
01:29:07.140 he could win a prize
01:29:08.040 for the greatest
01:29:09.580 debater
01:29:10.260 in modern American
01:29:12.020 history,
01:29:12.920 probably since
01:29:14.340 Lincoln Douglas.
01:29:16.420 He said that
01:29:17.160 in an interview
01:29:17.840 with Piers Morgan.
01:29:20.180 And there's more.
01:29:22.420 He said his conclusion
01:29:23.860 that Trump would win
01:29:24.840 comes from,
01:29:26.160 quote,
01:29:26.440 watching him run
01:29:27.540 through 16 Republicans
01:29:29.020 and easily outmatching
01:29:30.840 them in debates.
01:29:33.260 He also said
01:29:34.340 Trump has,
01:29:34.980 quote,
01:29:35.460 all of these
01:29:36.380 extraordinary techniques
01:29:37.860 and is,
01:29:38.880 quote,
01:29:39.180 extremely entertaining
01:29:40.340 to watch.
01:29:41.700 He said,
01:29:42.120 quote,
01:29:42.320 I don't think
01:29:42.920 it's possible
01:29:43.600 for President Biden
01:29:44.680 to beat him
01:29:45.320 in that debate.
01:29:46.060 Now,
01:29:48.840 I do think
01:29:49.680 it's possible
01:29:50.240 for Biden
01:29:50.740 to beat him
01:29:51.220 in the debate.
01:29:52.120 Let me be clear
01:29:52.820 about that.
01:29:53.620 So here again,
01:29:54.500 I'm not agreeing
01:29:55.300 with Kennedy.
01:29:56.660 I think Biden
01:29:57.740 has at least
01:29:59.500 a 50% chance.
01:30:01.320 And honestly,
01:30:02.280 I'm leaning
01:30:02.680 in his direction
01:30:03.420 because I think
01:30:04.960 Biden just has
01:30:05.740 to show up
01:30:06.500 and it's going
01:30:07.880 to look like
01:30:08.280 he killed it.
01:30:09.400 And I think
01:30:10.160 Trump,
01:30:10.940 if he's just Trump,
01:30:12.720 is going to say
01:30:13.360 things that are
01:30:14.180 perfectly clear
01:30:15.180 to you and me
01:30:16.540 and the CNN
01:30:18.400 will say
01:30:18.980 that he decided
01:30:20.040 he wants to
01:30:20.680 drain all our blood
01:30:21.860 and turn us
01:30:22.540 into vampires.
01:30:23.880 And we'll say,
01:30:24.600 what?
01:30:25.820 Where are you
01:30:26.460 getting that
01:30:26.940 from his ordinary
01:30:27.780 language and ordinary
01:30:28.900 things?
01:30:29.420 Well,
01:30:29.800 I don't know.
01:30:30.720 Let me play it
01:30:31.420 for you.
01:30:32.520 So if you say
01:30:34.640 that the winner
01:30:35.340 is based on
01:30:36.420 what the public
01:30:37.140 reaction,
01:30:38.660 I think the
01:30:39.360 mainstream media
01:30:40.120 is going to
01:30:40.940 declare Biden
01:30:41.660 the winner.
01:30:41.980 So I definitely
01:30:43.680 disagree that
01:30:44.600 Trump is going
01:30:45.800 to tear through
01:30:46.420 him.
01:30:47.180 I do agree
01:30:48.040 that you and I
01:30:48.880 will think he did.
01:30:50.480 But I love the fact
01:30:51.520 that he's running
01:30:52.060 against Trump
01:30:52.800 and he doesn't
01:30:54.100 have any qualms
01:30:54.960 whatsoever about
01:30:56.160 saying that he
01:30:56.700 might be the
01:30:57.160 greatest debater
01:30:57.880 in American history
01:30:58.820 and it's because
01:31:00.000 he has
01:31:00.540 extraordinary
01:31:01.320 techniques.
01:31:03.440 Extraordinary
01:31:03.960 techniques.
01:31:06.260 Persuasion.
01:31:08.440 Think back
01:31:09.260 to 2015
01:31:10.000 when I was
01:31:10.880 mocked
01:31:12.160 for saying
01:31:12.700 that he
01:31:13.100 was a
01:31:13.540 persuasion
01:31:14.100 master.
01:31:15.800 Can we
01:31:16.160 put that
01:31:16.500 to rest?
01:31:18.280 Can we
01:31:18.620 put it
01:31:18.880 to rest
01:31:19.360 when one
01:31:21.100 of your
01:31:21.420 main two
01:31:22.240 competitors
01:31:22.700 for the
01:31:23.140 presidency
01:31:23.660 says
01:31:24.560 unambiguously
01:31:25.420 best debater
01:31:26.840 in American
01:31:27.320 history
01:31:27.800 since
01:31:30.420 Douglas,
01:31:31.540 Lincoln
01:31:31.880 Douglas,
01:31:32.660 and that
01:31:33.460 is based
01:31:33.820 on skill.
01:31:36.140 I mean,
01:31:36.360 that's incredible.
01:31:37.400 I've never
01:31:37.680 liked anybody
01:31:38.780 so much
01:31:39.440 while disagreeing
01:31:40.360 on a lot
01:31:41.000 of points.
01:31:43.960 Anyway,
01:31:45.040 some people
01:31:46.560 are saying
01:31:47.000 that Biden
01:31:47.460 cleverly
01:31:48.160 outmaneuvered
01:31:49.060 Trump and
01:31:50.500 fooled him
01:31:51.020 into accepting
01:31:51.700 debate criteria
01:31:54.160 that will be
01:31:55.080 bad for Trump.
01:31:56.080 There's no
01:31:56.560 audience and
01:31:57.320 he didn't
01:31:57.720 get to pick
01:31:58.120 the side
01:31:58.620 and you
01:31:59.520 get to sit
01:32:00.240 in chairs
01:32:00.780 with the microphones
01:32:01.940 turned off.
01:32:02.820 So I'm
01:32:07.540 seeing in
01:32:07.920 the comments
01:32:08.480 a good
01:32:09.080 comment
01:32:09.500 that RFK
01:32:10.720 Jr.
01:32:11.140 is just
01:32:11.560 raising the
01:32:12.460 bar for
01:32:13.660 Trump
01:32:14.060 so that
01:32:15.680 he doesn't
01:32:17.000 cross the
01:32:17.460 bar.
01:32:18.320 That's a
01:32:18.700 good observation.
01:32:20.540 It could be
01:32:21.400 seen as a
01:32:22.260 purely political
01:32:23.040 statement
01:32:23.540 because it
01:32:24.440 does have
01:32:24.840 that effect.
01:32:25.880 But it's
01:32:26.600 also probably
01:32:28.100 true.
01:32:28.520 I think
01:32:30.600 he actually
01:32:31.120 believes that
01:32:31.960 Trump is
01:32:32.420 good at
01:32:32.740 debating but
01:32:34.020 that doesn't
01:32:34.760 mean he
01:32:35.160 doesn't want
01:32:35.660 to take a
01:32:36.920 little shine
01:32:37.440 off the
01:32:37.860 debate.
01:32:38.420 So that's
01:32:39.120 a good
01:32:39.360 comment.
01:32:39.840 I agree
01:32:40.120 with that.
01:32:41.440 However,
01:32:42.280 I still
01:32:43.300 think it's
01:32:43.660 true.
01:32:44.240 I think
01:32:44.480 it's his
01:32:44.780 actual
01:32:45.220 opinion.
01:32:48.360 Anyway,
01:32:49.800 so did
01:32:50.480 Biden win
01:32:51.500 by creating
01:32:52.240 this debate
01:32:52.820 situation that's
01:32:53.680 bad for
01:32:54.420 Trump?
01:32:55.960 I have a
01:32:57.200 competing
01:32:57.580 opinion:
01:32:58.520 that Trump
01:32:59.340 accepting a
01:33:00.480 debate in
01:33:01.140 the worst
01:33:01.620 conditions for
01:33:02.480 Trump is a
01:33:03.660 super strong
01:33:04.380 thing to do.
01:33:05.700 And even the
01:33:06.440 Democrats are
01:33:07.160 noting, wait a
01:33:08.300 minute, Trump
01:33:09.620 is going to
01:33:10.820 the Bronx?
01:33:12.900 Wait a
01:33:13.620 minute, Trump
01:33:14.840 is in
01:33:15.960 Detroit?
01:33:17.180 Like, why is
01:33:17.740 Trump in all
01:33:18.420 these places
01:33:18.980 that you don't
01:33:19.560 go and
01:33:20.620 walking away
01:33:21.300 victorious every
01:33:22.360 time?
01:33:23.580 Right?
01:33:23.980 Every time.
01:33:25.100 He's going
01:33:25.680 where he
01:33:26.020 shouldn't go
01:33:26.660 and showing
01:33:28.000 you that
01:33:28.500 even that
01:33:29.000 doesn't
01:33:29.320 stop him.
01:33:30.360 In my
01:33:31.000 opinion, agreeing
01:33:31.980 to a debate
01:33:32.740 in which
01:33:33.280 everything is
01:33:33.940 stacked against
01:33:34.640 you and
01:33:35.160 everybody can
01:33:35.740 see it, even
01:33:37.540 the Democrats
01:33:38.200 are bragging
01:33:38.800 about it, is
01:33:40.220 the ideal
01:33:40.980 Trump situation.
01:33:43.080 It's a trap
01:33:44.220 that he knows
01:33:46.300 how to beat.
01:33:48.660 And only he.
01:33:49.920 You know, it's
01:33:50.860 hard to imagine
01:33:51.420 another person
01:33:52.340 who would be
01:33:53.380 as capable of
01:33:54.660 walking into
01:33:55.240 this bad
01:33:55.800 situation and
01:33:56.640 just owning
01:33:57.240 it.
01:33:58.240 So, I've
01:33:59.220 told you the
01:33:59.740 Andre Agassi
01:34:00.740 tennis strategy
01:34:01.680 that you try
01:34:02.700 to beat the
01:34:03.260 players' best
01:34:04.880 part, you
01:34:05.940 don't go after
01:34:06.420 their weaknesses,
01:34:07.460 because if you
01:34:08.100 can quickly make
01:34:08.960 them doubt
01:34:09.420 their strengths,
01:34:11.020 let's say their
01:34:11.500 forehand is really
01:34:12.240 good, but you
01:34:13.120 make them miss
01:34:13.600 a few, then
01:34:15.300 you just run
01:34:16.240 the table.
01:34:17.620 Because once
01:34:18.020 you've shaken their
01:34:18.500 confidence, they're
01:34:20.300 dead.
01:34:21.400 So, Trump
01:34:22.260 can go in
01:34:22.860 there with
01:34:23.420 Biden maybe
01:34:24.040 feeling a little
01:34:24.660 confident that
01:34:26.160 Biden set up
01:34:26.860 the situation
01:34:27.480 perfectly, and
01:34:28.840 if Trump gets
01:34:29.700 a few good
01:34:30.440 jabs in, it's
01:34:33.340 not going to
01:34:33.800 look like that
01:34:34.360 worked.
01:34:35.360 And then
01:34:35.620 Biden's going
01:34:36.120 to be, oh,
01:34:36.620 shoot, I guess
01:34:37.360 I have to rely
01:34:38.180 on my skill, and
01:34:40.280 I'm against the
01:34:41.080 best debater in
01:34:41.880 the history of
01:34:42.440 the country.
01:34:45.200 We're literally
01:34:46.080 putting the
01:34:47.320 best debater
01:34:48.100 since Lincoln
01:34:48.820 Douglas against
01:34:50.340 a guy with
01:34:51.100 obvious dementia.
01:34:52.360 I mean, at
01:34:52.680 this point, that's
01:34:53.300 not really a
01:34:53.860 debatable point.
01:34:55.460 You might call
01:34:56.040 it some other
01:34:56.560 problem, but
01:34:57.540 it's a brain
01:34:58.480 problem.
01:35:00.080 And we don't
01:35:00.780 know what's
01:35:01.140 going to happen.
01:35:01.720 That's the
01:35:02.100 funny part.
01:35:02.820 We don't know
01:35:03.280 what's going
01:35:03.540 to happen.
01:35:04.560 Well, Matt
01:35:05.280 Gaetz is
01:35:05.900 introducing some
01:35:07.040 legislation that
01:35:08.400 would allow you
01:35:08.820 to pay your
01:35:09.260 taxes with
01:35:09.880 Bitcoin.
01:35:12.720 That seems
01:35:13.400 right.
01:35:14.920 And I think
01:35:15.880 Trump is getting
01:35:16.540 a lot of
01:35:17.500 credit for
01:35:18.580 being a
01:35:19.220 person of
01:35:19.720 a certain
01:35:20.040 age who
01:35:21.500 has listened
01:35:22.080 to people
01:35:22.540 who understand
01:35:23.220 Bitcoin and
01:35:25.040 embraced its
01:35:26.620 potential.
01:35:27.520 And it's a
01:35:28.180 big, big
01:35:28.560 deal.
01:35:29.060 And it's
01:35:29.360 also free
01:35:29.920 votes, because
01:35:31.140 the pro-Bitcoin
01:35:32.020 people are
01:35:32.660 really going to
01:35:33.280 be pro-Bitcoin.
01:35:34.780 And if one
01:35:35.600 person is, yeah,
01:35:36.840 let's do lots
01:35:37.460 of Bitcoin, and
01:35:38.120 the other is
01:35:38.520 not, those
01:35:40.160 are free
01:35:40.540 votes.
01:35:42.300 Free votes
01:35:43.060 with no real
01:35:43.960 cost, because
01:35:44.780 using Bitcoin
01:35:45.500 is probably a
01:35:46.100 great idea.
01:35:46.760 I think most
01:35:47.240 smart people
01:35:47.720 agree.
01:35:48.580 Now, here's
01:35:49.740 the thing that
01:35:50.460 I still think
01:35:51.300 is a terrible
01:35:51.940 idea, but I
01:35:53.860 don't know
01:35:54.280 why yet.
01:35:57.060 And I
01:35:58.400 know just
01:35:58.960 enough to
01:35:59.780 confuse
01:36:00.220 myself, right?
01:36:01.840 So here's the
01:36:02.560 question.
01:36:03.960 Hypothetically, and
01:36:04.780 I asked
01:36:05.060 ChatGPT to
01:36:06.020 tell me why
01:36:06.540 this wouldn't
01:36:07.020 work, and
01:36:07.500 it couldn't.
01:36:08.800 So ChatGPT
01:36:09.940 couldn't tell
01:36:10.660 me why this
01:36:11.280 wouldn't work,
01:36:12.340 which doesn't
01:36:12.920 mean it
01:36:13.200 works.
01:36:14.500 And my
01:36:14.740 idea was for
01:36:16.800 the United
01:36:17.160 States to
01:36:17.660 simply create
01:36:18.340 a crypto, not
01:36:19.840 one that's
01:36:20.260 already there,
01:36:20.740 just create a
01:36:21.300 crypto that's
01:36:22.540 pegged to the
01:36:23.080 dollar, and
01:36:24.220 you're going to
01:36:24.540 say, wait, wait,
01:36:25.460 they've already
01:36:25.880 done that, but
01:36:26.900 not to replace
01:36:27.660 cash, in
01:36:29.600 addition to
01:36:30.220 cash.
01:36:31.340 So they would
01:36:32.020 just be operating
01:36:32.940 like Matt
01:36:34.440 Gaetz wants to
01:36:35.180 do with
01:36:35.440 Bitcoin, which
01:36:36.480 by the way, I
01:36:37.900 think is step
01:36:38.500 one to what I'm
01:36:39.520 talking about.
01:36:40.060 I think that
01:36:41.360 Matt Gaetz
01:36:41.880 introducing the
01:36:42.800 idea that you
01:36:43.500 can pay your
01:36:44.020 taxes with
01:36:44.800 Bitcoin is
01:36:46.100 setting the
01:36:46.700 stage for
01:36:49.800 paying your
01:36:50.280 taxes with a
01:36:51.000 different kind
01:36:51.460 of crypto that
01:36:52.080 doesn't yet
01:36:52.540 exist.
01:36:55.340 Because here's
01:36:56.020 my hypothesis,
01:36:58.500 and by the
01:36:58.860 way, this is a
01:36:59.580 weak hypothesis,
01:37:01.320 I'm not going to
01:37:02.080 die on this
01:37:02.600 hill, so as
01:37:03.860 soon as it
01:37:04.260 looks like it's
01:37:04.760 not going to
01:37:05.120 happen, I'll
01:37:05.620 be like, oh,
01:37:06.100 okay, I was
01:37:06.380 wrong.
01:37:08.040 But here's what
01:37:09.300 it feels like.
01:37:10.060 It feels like
01:37:12.500 neither the
01:37:13.400 Democrats nor the
01:37:14.060 Republicans have
01:37:14.820 any kind of a
01:37:15.740 plan for the
01:37:17.080 deficit, and
01:37:18.780 the deficit will
01:37:19.660 clearly destroy
01:37:21.480 us, obviously.
01:37:23.460 There is no
01:37:24.360 way to survive
01:37:25.400 with just
01:37:26.740 growing faster
01:37:27.780 and having
01:37:28.520 more taxes.
01:37:29.480 There is no
01:37:30.360 path to
01:37:31.180 survival.
01:37:32.600 But that
01:37:34.000 just means that
01:37:34.740 if there's a
01:37:35.240 path to
01:37:35.640 survival, it
01:37:36.700 won't be the
01:37:37.220 normal path.
01:37:38.180 So it won't
01:37:38.780 be normal
01:37:39.360 growth
01:37:40.060 plus some
01:37:41.080 normal financial
01:37:43.200 manipulation.
01:37:44.160 It won't be
01:37:44.540 anything normal.
01:37:45.980 So we either
01:37:47.060 have to be very
01:37:48.700 innovative or
01:37:50.480 dead.
01:37:52.720 There isn't any
01:37:53.760 other choice.
01:37:54.980 We have to
01:37:55.860 innovate in a
01:37:56.600 way that we
01:37:57.000 couldn't even
01:37:57.420 imagine, literally
01:37:59.680 can't imagine, or
01:38:01.640 we're absolutely
01:38:02.400 fucking dead.
01:38:03.520 We do not have a
01:38:04.760 survival plan, and
01:38:06.200 neither Trump nor
01:38:06.960 Biden has
01:38:08.080 even suggested a
01:38:10.060 survival plan.
01:38:11.540 Neither of
01:38:11.980 them.
01:38:13.520 Neither of
01:38:13.940 them.
01:38:14.580 Kennedy doesn't
01:38:15.200 have one either.
01:38:16.060 There is no
01:38:16.620 survival plan for
01:38:17.900 the United States,
01:38:18.920 and we probably
01:38:20.400 have five years
01:38:21.140 at most.
01:38:24.700 Now, I do
01:38:25.560 think we'll be
01:38:26.100 fine.
01:38:26.940 So I didn't mean
01:38:27.680 to scare you.
01:38:28.840 It's just that
01:38:29.700 when you realize
01:38:31.060 that doing the
01:38:31.920 two normal things
01:38:33.200 that are the
01:38:33.660 only things
01:38:34.080 you've ever
01:38:34.440 done, fiddle
01:38:35.900 with taxes and
01:38:37.180 growth, when you
01:38:38.700 know that can't
01:38:39.420 work, then it
01:38:40.540 frees you to do
01:38:41.280 things you would
01:38:41.800 never even
01:38:42.280 consider under
01:38:43.400 normal situations.
01:38:44.680 In other words,
01:38:45.760 an emergency
01:38:46.480 clarifies everything.
01:38:49.620 Everybody gets
01:38:50.180 flexible in an
01:38:50.960 emergency.
01:38:51.660 You saw the
01:38:52.120 pandemic.
01:38:52.760 Everybody did
01:38:53.600 things that they
01:38:54.120 would never do
01:38:54.720 under normal
01:38:55.820 circumstances.
01:38:57.300 So once we
01:38:58.180 reach maybe a
01:38:58.880 little bit closer
01:38:59.580 to the emergency,
01:39:01.360 we've only got,
01:39:02.100 you know, six
01:39:02.680 months left to
01:39:03.320 live, I
01:39:05.740 wonder if we
01:39:06.460 could exchange
01:39:08.300 all of the
01:39:09.520 debt, the
01:39:10.640 $35 trillion,
01:39:12.060 for a crypto
01:39:13.540 that never
01:39:14.180 existed until we
01:39:15.180 made it out of
01:39:15.700 nothing.
01:39:17.700 Now, before you
01:39:18.600 say, Scott,
01:39:19.480 that's massive
01:39:20.420 inflation.
01:39:21.500 No, there's no
01:39:22.200 money being
01:39:22.680 added.
01:39:23.760 You're replacing
01:39:24.700 dollar for
01:39:25.340 dollar.
01:39:25.960 You're taking
01:39:26.340 the U.S.
01:39:27.260 dollar and
01:39:28.160 just say,
01:39:29.280 boop, we're
01:39:30.320 going to repay
01:39:30.820 all of the
01:39:31.500 national debt in
01:39:32.500 crypto that's
01:39:34.060 worth a
01:39:34.440 dollar.
01:39:35.320 So you'll
01:39:35.640 be made
01:39:36.060 whole.
01:39:37.180 Every one of
01:39:37.680 you will be
01:39:38.060 made whole.
01:39:39.560 But we might
01:39:40.480 be introducing
01:39:41.440 some extra
01:39:42.240 crypto in
01:39:43.040 there to pay
01:39:44.600 off some
01:39:45.040 extra stuff.
01:39:46.460 So it would
01:39:47.300 still be
01:39:47.640 inflationary because
01:39:48.980 you'd still have
01:39:49.480 to add some
01:39:49.880 extra, but
01:39:51.040 not $35 trillion
01:39:52.840 extra.
01:39:53.640 The $35 trillion
01:39:54.700 would just be a
01:39:55.400 one-for-one
01:39:55.900 replacement of
01:39:56.920 crypto for
01:39:57.620 U.S.
01:39:58.060 dollars.
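For anyone who wants to check the arithmetic on that one-for-one swap, here is a minimal back-of-the-envelope sketch in Python. The $35 trillion figure is the one discussed above; the extra-issuance amount, the function name, and everything else are illustrative placeholders, not anything proposed in the episode.

# Minimal sketch of the one-for-one debt-for-crypto swap described above.
# All figures except the ~$35 trillion debt are illustrative placeholders.

DEBT_USD = 35e12           # existing national debt, in dollars
EXTRA_ISSUANCE = 2e12      # hypothetical extra crypto issued on top (placeholder)

def swap_debt_for_crypto(debt_usd: float, extra_crypto: float) -> dict:
    """Replace dollar-denominated debt with a dollar-pegged crypto, one for one."""
    crypto_issued = debt_usd + extra_crypto  # units, each pegged at $1
    # The swap portion retires debt dollar-for-dollar, so the net new
    # claims created come only from the extra issuance, not the $35T.
    return {
        "crypto_issued": crypto_issued,
        "debt_retired": debt_usd,
        "net_new_claims": crypto_issued - debt_usd,
    }

result = swap_debt_for_crypto(DEBT_USD, EXTRA_ISSUANCE)
print(f"Crypto issued:  ${result['crypto_issued'] / 1e12:.0f} trillion")
print(f"Debt retired:   ${result['debt_retired'] / 1e12:.0f} trillion")
print(f"Net new claims: ${result['net_new_claims'] / 1e12:.0f} trillion")

With those placeholder numbers, only the extra issuance shows up as net new claims, which matches the point being made: the swap itself is not the inflationary part; only whatever is issued on top of it would be.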
01:39:58.340 And then
01:39:59.440 both of
01:39:59.860 them under
01:40:00.440 this theory
01:40:01.060 would be
01:40:01.840 accepted by
01:40:02.480 the United
01:40:02.900 States in
01:40:04.740 payment of
01:40:05.160 taxes, maybe
01:40:06.580 also in
01:40:07.580 payment of
01:40:08.040 tariffs.
01:40:09.220 So if the
01:40:09.920 United States
01:40:10.540 says, as a
01:40:11.400 country, we'll
01:40:12.320 accept tariff
01:40:13.160 payments, any
01:40:14.300 kind of
01:40:14.620 international
01:40:15.120 payments, any
01:40:16.460 kind of
01:40:16.800 domestic tax
01:40:17.640 payments, we'll
01:40:19.120 accept all
01:40:19.600 that in our
01:40:20.200 own crypto, it
01:40:21.320 immediately has
01:40:21.980 backing.
01:40:22.860 That's what
01:40:23.460 would back it.
01:40:24.020 You don't need
01:40:24.420 gold if the
01:40:25.660 government says
01:40:26.280 we'll take it
01:40:26.700 to pay your
01:40:27.060 taxes.
01:40:28.840 That would
01:40:29.520 be the
01:40:29.800 backing, the
01:40:30.820 fact that the
01:40:31.520 government accepts
01:40:32.160 it.
01:40:33.100 So, could
01:40:35.200 you use
01:40:37.420 crypto in
01:40:38.640 some way that
01:40:39.320 I don't quite
01:40:39.920 understand?
01:40:40.900 Now, if you're
01:40:41.620 smarter than I
01:40:42.520 am about both
01:40:43.280 crypto and
01:40:43.900 economics, you're
01:40:45.340 probably at home
01:40:46.120 screaming, the
01:40:47.760 obvious problem
01:40:48.660 with that, here's
01:40:49.840 the obvious
01:40:50.360 problem.
01:40:51.260 I don't know
01:40:52.040 what it is.
01:40:53.240 I'm not saying
01:40:53.900 there isn't an
01:40:54.600 obvious problem.
01:40:56.700 There probably
01:40:57.340 is.
01:40:58.300 I just don't
01:40:58.820 know what it
01:40:59.140 is.
01:41:00.080 And I also
01:41:02.040 present this as
01:41:02.940 what I call the
01:41:03.540 bad idea.
01:41:04.940 I've told you
01:41:05.680 before, one of
01:41:06.380 my contributions
01:41:07.940 to the country
01:41:09.000 is my inability
01:41:11.180 to be
01:41:11.800 embarrassed.
01:41:13.700 I've just
01:41:14.340 floated an
01:41:14.880 idea that I
01:41:16.080 fully expect
01:41:16.880 somebody who
01:41:17.540 knows more
01:41:17.980 than I do
01:41:18.540 to say, that
01:41:19.420 is the dumbest
01:41:20.080 stupid idea
01:41:20.860 I've ever
01:41:22.120 heard.
01:41:22.620 And let me
01:41:23.020 give you the
01:41:23.500 obvious reason
01:41:24.340 that you're
01:41:25.200 not seeing
01:41:25.800 why you're
01:41:26.880 so dumb
01:41:27.360 and this
01:41:27.760 could never
01:41:28.100 work.
01:41:30.120 But I've
01:41:32.420 taught you
01:41:32.720 the bad
01:41:33.240 idea plan.
01:41:34.320 The bad
01:41:34.800 idea strategy
01:41:35.920 is when
01:41:36.560 you're
01:41:36.760 brainstorming
01:41:37.520 and you're
01:41:38.540 all going to
01:41:38.960 die if you
01:41:39.420 don't come
01:41:39.720 up with a
01:41:40.080 better idea.
01:41:40.980 And we're
01:41:41.460 all going to
01:41:41.820 die if we
01:41:42.300 don't come
01:41:42.600 up with a
01:41:42.960 better idea.
01:41:43.760 Let me be
01:41:44.380 clear.
01:41:45.760 We're all
01:41:46.400 going to
01:41:46.660 die.
01:41:48.240 We're going
01:41:48.700 to fucking
01:41:49.060 starve to
01:41:49.660 death if
01:41:50.780 we don't
01:41:51.120 figure out how
01:41:51.720 to pay off
01:41:52.380 the debt
01:41:52.740 like right
01:41:53.220 away.
01:41:54.740 Five years
01:41:55.560 from now
01:41:55.860 we're all
01:41:56.180 dead.
01:41:57.040 We will
01:41:57.440 starve to
01:41:58.080 death if
01:41:58.900 we don't
01:41:59.200 figure out
01:41:59.640 an innovative
01:42:00.260 way to
01:42:01.260 handle it.
01:42:03.200 So I
01:42:04.800 throw this
01:42:05.360 bad idea
01:42:06.100 into the
01:42:06.580 mix so
01:42:08.260 that other
01:42:08.780 people who
01:42:09.280 are much
01:42:09.680 smarter than
01:42:10.260 I am about
01:42:11.080 everything crypto
01:42:12.100 and economic
01:42:12.780 can say,
01:42:13.820 you know,
01:42:14.060 that is a
01:42:14.540 bad idea,
01:42:15.440 but wait,
01:42:16.960 what if we
01:42:17.920 did this?
01:42:20.020 And that's
01:42:20.480 what I'm
01:42:20.760 going for.
01:42:21.280 So it's
01:42:23.040 a Hail
01:42:23.360 Mary pass
01:42:24.120 and the
01:42:25.260 Hail
01:42:25.460 Mary is
01:42:26.240 that my
01:42:26.720 bad idea
01:42:27.460 is big
01:42:28.980 enough and
01:42:29.460 bold enough
01:42:30.080 and now
01:42:30.720 with Matt
01:42:32.220 Gaetz setting
01:42:32.220 the stage
01:42:32.800 that crypto
01:42:33.460 could be a
01:42:33.960 payment for
01:42:34.400 taxes,
01:42:35.260 everybody will
01:42:36.260 understand that
01:42:37.900 crypto can be
01:42:38.700 a way to
01:42:39.120 pay taxes.
01:42:40.340 He's setting
01:42:41.020 the psychological
01:42:42.380 stage for
01:42:44.260 maybe a bigger
01:42:44.860 play.
01:42:45.960 Now right now
01:42:46.700 he's just in
01:42:47.220 crypto or he's
01:42:48.120 just in Bitcoin
01:42:48.720 only and that's
01:42:50.100 the safest place
01:42:51.020 to be because
01:42:51.720 enough people
01:42:52.280 have some sense
01:42:53.740 that that would
01:42:54.500 be a reasonable
01:42:55.100 thing to do.
01:42:59.300 I think Matt
01:43:00.260 Gaetz might be
01:43:01.000 setting this up
01:43:01.620 for a big
01:43:02.060 play.
01:43:04.160 It's just
01:43:04.900 we're so close
01:43:06.540 but I don't
01:43:07.540 know if we're
01:43:07.900 there.
01:43:09.180 Anyway,
01:43:09.660 so that's what
01:43:10.300 I got for you.
01:43:11.660 That is my
01:43:12.540 show for today.
01:43:13.880 Thanks for
01:43:14.220 joining.
01:43:14.620 I went way
01:43:15.180 too long
01:43:15.700 and if you
01:43:17.780 stayed with me,
01:43:18.620 I really
01:43:19.000 appreciate it.
01:43:20.640 Let's go look
01:43:21.440 to find out
01:43:22.120 what the
01:43:22.460 Supreme Court
01:43:22.920 is going to
01:43:23.240 do.
01:43:23.660 I'm going to
01:43:23.960 say hi to
01:43:24.640 the locals
01:43:25.260 people just
01:43:25.840 quickly and
01:43:27.500 say goodbye
01:43:28.200 to YouTube
01:43:29.280 and X
01:43:29.960 and Rumble.
01:43:31.220 Thanks for
01:43:31.700 joining.
01:43:32.620 Appreciate it
01:43:33.300 a lot.