Real Coffee with Scott Adams - November 23, 2023


Episode 2301 Scott Adams: CWSA 11/23/23 All The News Ripe For Mocking. Happy Thanksgiving!


Episode Stats


Length: 1 hour and 18 minutes
Words per minute: 149.18
Word count: 11,670
Sentence count: 830

Harmful content:
Misogyny: 16 sentences flagged
Hate speech: 59 sentences flagged


Summary

Summaries generated with gmurro/bart-large-finetuned-filtered-spotify-podcast-summ.

Elon Musk has the most social media followers of anybody in the world, and here he's tweeting some shit about me. And a UK museum says that the past Roman emperor was a transgender woman. And I don't believe it.

Transcript

Transcript generated with Whisper (turbo).
Misogyny classifications generated with MilaNLProc/bert-base-uncased-ear-misogyny.
Hate speech classifications generated with facebook/roberta-hate-speech-dynabench-r4-target.
00:00:00.000 Do-do-do-ro-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah-pah.
00:00:06.720 Good morning, everybody, and Happy Thanksgiving, all you colonizers.
00:00:12.220 And if you'd like to take this Thanksgiving experience up to a level that nobody can even understand with human words,
00:00:20.380 they would need AGI just to even have a chance.
00:00:23.160 Well, if you'd like to do that, all you need is a copper mug or a glass, a tankard, chalice, a stein, a canteen, jug, or flask,
00:00:29.680 a vessel of any kind.
00:00:31.540 Fill it with your favorite liquid.
00:00:33.360 I like coffee.
00:00:34.880 And join me now for the unparalleled pleasure.
00:00:37.580 It's the dopamine hit of the day,
00:00:39.000 the thing that makes everything better.
00:00:40.520 It's called the simultaneous sip,
00:00:42.340 and it happens now.
00:00:43.800 Go.
00:00:48.880 All right, tell me the truth.
00:00:52.040 There's nothing else on right now, right?
00:00:54.760 If you got up early
00:00:55.700 and you wanted to watch something live,
00:00:58.280 what are your options?
00:00:59.680 Really?
00:01:00.640 That Crowder?
00:01:01.700 I'll bet Stephen Crowder's taking the day off today.
00:01:04.020 Am I right?
00:01:05.140 Yeah.
00:01:05.480 Is Stephen Crowder working today?
00:01:06.980 I don't think so.
00:01:08.220 No.
00:01:10.400 Only the people who really care about you are working.
00:01:14.260 People like me.
00:01:16.260 That's right.
00:01:17.780 Well, let's talk about all the news.
00:01:20.820 So last night, Elon Musk posted this.
00:01:25.800 He said,
00:01:26.160 big companies steadily increase their Dilbert score over time,
00:01:29.760 like entropy.
00:01:33.120 Do you know how weird it is
00:01:34.600 that the...
00:01:36.720 I guess we found out today
00:01:38.080 that Elon Musk has the most social media followers
00:01:41.480 of anybody in the world.
00:01:43.180 He's got 164 million people
00:01:46.000 just on X.
00:01:48.500 and here he's tweeting some shit about me.
00:01:53.800 It's the weirdest experience.
00:01:56.100 You know,
00:01:56.300 I don't think I can explain to you
00:01:58.040 what it's like
00:01:58.760 that there's this giant world out there,
00:02:02.620 like there's 8 billion people in the world,
00:02:06.200 but there are not that many people in the news.
00:02:08.240 So when you read the news
00:02:10.860 and it's about you,
00:02:12.680 it's just the weirdest experience.
00:02:15.920 All right.
00:02:16.600 But we've got lots more Elon Musk news.
00:02:20.980 It seems like,
00:02:22.340 is it my imagination,
00:02:24.140 or is almost every news story
00:02:26.400 also an Elon Musk story?
00:02:29.420 I'm not imagining that, right?
00:02:31.780 Ukraine is also about Elon Musk.
00:02:35.040 Israel's also about Elon Musk.
00:02:36.920 AI is also about Elon Musk.
00:02:40.600 Climate change is also about Elon Musk.
00:02:44.020 Anything about Trump or free speech
00:02:46.080 is also about Elon Musk.
00:02:48.320 It's not my imagination, right?
00:02:50.620 He has literally made himself
00:02:52.280 the most important person
00:02:53.680 in every conversation,
00:02:56.060 which is kind of amazing in itself.
00:03:01.020 All right.
00:03:01.600 In no particular order,
00:03:02.680 these stories.
00:03:03.300 Thank you for Owen Gregorian
00:03:07.800 for all your great content.
00:03:11.040 Anyway, the UK museum says,
00:03:12.720 there's a UK museum says,
00:03:14.540 the past Roman emperor,
00:03:16.400 I guess most of them are past, right?
00:03:18.760 Was a trans woman.
00:03:21.680 Yeah.
00:03:21.840 So the Roman emperor,
00:03:23.760 Elagabalus,
00:03:25.140 Elagabalus,
00:03:26.940 would be a transgender woman.
00:03:28.380 Now, some of you are at home
00:03:29.580 and you're thinking,
00:03:31.140 can you say that again?
00:03:32.120 What is the proper pronunciation
00:03:34.320 of that name?
00:03:37.300 Elagabalus.
00:03:38.220 It's like that.
00:03:39.080 It's best if you say it like you're drunk,
00:03:41.600 because then nobody's going to ask
00:03:42.780 too many questions.
00:03:44.180 Which Roman emperor
00:03:45.220 were you talking about?
00:03:47.180 Or were you talking about
00:03:48.380 emperor Elagabalus?
00:03:51.020 Elagabalus.
00:03:51.840 Yeah.
00:03:52.060 It sounds better
00:03:52.540 when you do it that way.
00:03:53.980 Now, I don't know if that's true,
00:03:55.360 because there's also some,
00:03:57.260 there's a counter theory
00:03:59.700 that it was common for people
00:04:02.160 to insult their emperor
00:04:03.340 by suggesting they were really women,
00:04:06.180 which is sexist.
00:04:07.620 Am I right?
00:04:08.940 Sexist.
00:04:09.760 So it might have been just that,
00:04:11.180 but it might have been back then.
00:04:13.060 If you were the emperor,
00:04:14.540 you could have anything you wanted
00:04:16.220 so you didn't have to worry
00:04:17.840 about your barriers too much.
00:04:19.660 I'll have that one,
00:04:20.740 that one, and that one.
00:04:22.620 Have them scrubbed
00:04:23.540 and brought to my tent immediately.
00:04:26.840 All right.
00:04:27.600 Have you all heard of an entity
00:04:28.880 called Newsguard?
00:04:33.880 Newsguard is an entity
00:04:35.100 that is the greatest entity
00:04:38.320 of all time.
00:04:39.840 I respect them more than my own life.
00:04:42.780 I think they're awesome
00:04:43.900 and probably the most important
00:04:46.120 and credible and moral
00:04:48.220 and ethical entity
00:04:49.160 that has ever existed.
00:04:52.340 Some people say that crossing them
00:04:54.180 will get you in trouble
00:04:55.060 and they'll come after you.
00:04:56.640 But I don't believe that.
00:05:00.920 No.
00:05:01.580 No.
00:05:02.200 I don't believe it.
00:05:03.200 I don't believe that they're big,
00:05:04.380 scary entities.
00:05:05.660 And, you know,
00:05:06.520 these big old liars like,
00:05:08.240 you know,
00:05:09.620 Elon Musk, again.
00:05:10.800 Again, he's in the story.
00:05:12.580 Elon Musk says Newsguard is,
00:05:14.460 and he's wrong, of course.
00:05:15.860 He says it's a propaganda shop
00:05:17.900 that will produce any lies you want
00:05:20.100 if you pay them enough money.
00:05:21.280 He posted that.
00:05:23.080 Other people are saying
00:05:24.280 they're like some kind
00:05:26.580 of disinformation company.
00:05:29.240 Dave Rubin said
00:05:31.000 that he mentioned them once
00:05:32.180 on his show
00:05:32.840 and they said
00:05:33.700 they were going to investigate him.
00:05:37.180 So, all I know
00:05:39.320 is they're great.
00:05:42.380 Don't come after me.
00:05:44.280 Please don't cancel me.
00:05:45.700 Yeah.
00:05:48.280 Well, but seriously.
00:05:49.760 Seriously, folks.
00:05:51.240 If you don't understand
00:05:52.680 the disinformation architecture,
00:05:56.420 you really need to get
00:05:58.240 up to speed on that.
00:06:00.520 So, there are a number
00:06:01.520 of these, you know,
00:06:03.960 funded entities
00:06:05.020 that are not government entities
00:06:06.480 that purport to
00:06:08.740 hunt down disinformation.
00:06:11.400 They are themselves
00:06:12.540 disinformation purveyors,
00:06:14.700 but they want to make sure
00:06:16.120 their side's disinformation
00:06:17.840 wins over the other side's
00:06:19.460 information and disinformation.
00:06:22.020 So, they are not
00:06:23.300 credible entities
00:06:24.400 and I suppose
00:06:26.580 they'll be coming after me tomorrow,
00:06:27.980 but maybe I'm too small.
00:06:32.520 Henry, I'm great at sarcasm.
00:06:37.760 All right.
00:06:40.260 So, learn what NewsGuard is
00:06:42.180 because if you don't,
00:06:43.100 you will be confused.
00:06:44.700 About the way
00:06:46.440 the country works.
00:06:49.220 Well, there was a story
00:06:50.380 last night
00:06:51.180 in which literally
00:06:52.780 all the facts
00:06:53.540 seem to be wrong.
00:06:54.920 At first,
00:06:56.100 it sounded like
00:06:56.640 there was a terrorist
00:06:57.460 who blew up a bomb
00:06:59.360 on a bridge,
00:07:01.040 the rainbow bridge
00:07:01.860 between Canada
00:07:02.560 and the United States.
00:07:04.180 Turned out,
00:07:04.820 it was not a terrorist
00:07:05.840 as far as we know.
00:07:07.960 There was not a bomb
00:07:09.220 or any kind of explosives
00:07:10.480 as far as we know.
00:07:12.120 And it may not have been
00:07:13.780 coming from Canada.
00:07:16.240 It looks like it was just
00:07:17.520 some kind of a high-speed
00:07:19.320 car chase,
00:07:20.700 I think,
00:07:21.560 on the U.S. side
00:07:22.600 that ended up
00:07:24.220 hitting a toll booth
00:07:25.860 area on the bridge.
00:07:27.580 But it was before the bridge.
00:07:29.440 And it had nothing to do
00:07:30.520 with the bridge,
00:07:31.100 it was just,
00:07:31.560 that's where the accident was.
00:07:32.640 And it looks like
00:07:34.260 the explosion
00:07:34.920 might have been just
00:07:36.100 the high-speed explosion
00:07:37.960 and the gas tank
00:07:38.920 blowing up from the car
00:07:40.020 because it didn't seem
00:07:41.720 that big.
00:07:42.520 Now,
00:07:43.340 I'm going to
00:07:44.800 take criticism
00:07:46.540 as well as credit.
00:07:49.540 Some of you say,
00:07:50.660 but Scott,
00:07:51.400 why are you always
00:07:52.080 bragging about the things
00:07:52.980 you got right?
00:07:54.280 You don't talk about
00:07:55.020 the things you got wrong.
00:07:56.760 Well, I'm going to do both.
00:07:58.720 So,
00:07:59.480 I had two reactions
00:08:01.540 yesterday
00:08:02.000 when I first heard
00:08:02.720 the story.
00:08:03.820 One was stupidity.
00:08:05.820 That was my first reaction.
00:08:07.360 My own stupidity.
00:08:08.800 Do you know what I did?
00:08:10.420 I believed the first
00:08:11.840 version of the story.
00:08:15.740 Could you just
00:08:18.800 take a minute
00:08:19.320 to mock me?
00:08:20.960 Can you just
00:08:22.020 give me the mocking
00:08:23.560 I deserve?
00:08:25.100 When I heard
00:08:25.740 that a terrorist
00:08:26.640 had blew up a bomb
00:08:27.500 on the bridge,
00:08:28.980 I actually went
00:08:29.980 with that.
00:08:31.680 I mean,
00:08:31.960 just consider that.
00:08:33.460 Consider how many times
00:08:34.800 I've told you.
00:08:36.240 How many times
00:08:37.120 have I told you
00:08:37.940 don't believe
00:08:39.000 the first version
00:08:39.800 of a story
00:08:40.380 when it's a hot story?
00:08:42.800 Like a thousand times.
00:08:45.420 Do I take
00:08:45.980 my own advice?
00:08:47.280 No.
00:08:48.060 Like a freaking moron.
00:08:49.660 I believe the first
00:08:50.720 version of a story
00:08:51.900 in which there was
00:08:53.060 no way
00:08:53.820 we actually knew
00:08:55.040 what was happening
00:08:55.680 at that point.
00:08:56.680 No way.
00:08:57.500 And I believed it.
00:08:59.720 Do you know why?
00:09:02.080 Do you know why
00:09:02.920 I believed it?
00:09:04.060 The most classic
00:09:05.420 reason.
00:09:06.720 Because I wanted to.
00:09:09.100 That's the biggest
00:09:10.620 trap.
00:09:11.980 Right?
00:09:13.120 There's,
00:09:14.240 and I think
00:09:14.840 I've told you
00:09:15.320 this before.
00:09:16.420 I guess this is
00:09:17.200 sort of a,
00:09:19.440 I wish it wasn't
00:09:20.480 so loud outside,
00:09:21.600 but the garbage
00:09:22.180 trucks are
00:09:22.720 really loud.
00:09:23.900 that's an inside joke
00:09:26.740 that you don't
00:09:27.160 understand,
00:09:27.680 but the people
00:09:28.060 on the locals
00:09:28.600 platform,
00:09:29.180 they know
00:09:29.380 what I'm talking
00:09:29.840 about.
00:09:30.540 So trust me,
00:09:31.640 just go with it.
00:09:33.420 Anyway,
00:09:34.700 so I believe
00:09:35.620 that version
00:09:36.080 of the story
00:09:36.640 that it was
00:09:37.000 a terrorist
00:09:37.380 and it wasn't.
00:09:38.060 but to my credit,
00:09:41.900 so I'll take
00:09:42.360 a little bit
00:09:42.780 of credit,
00:09:44.200 by the time
00:09:45.000 I saw the video
00:09:46.260 of the car
00:09:47.600 explosion
00:09:48.140 from the security
00:09:49.060 cameras,
00:09:49.960 it was pretty
00:09:50.860 obvious it wasn't
00:09:51.580 a terrorist attack
00:09:52.520 because the size
00:09:54.780 of the explosion
00:09:55.340 was way too small
00:09:56.400 and it was
00:09:58.980 a weird place
00:09:59.620 for it to happen
00:10:00.380 unless they'd
00:10:01.380 been challenged
00:10:02.000 by security
00:10:03.340 or something,
00:10:04.020 but when you
00:10:05.560 saw the video,
00:10:06.320 it didn't kind
00:10:06.820 of make sense
00:10:07.820 with the story.
00:10:09.140 So I'll give
00:10:09.720 myself a little
00:10:10.440 bit of credit.
00:10:12.020 As soon as I
00:10:12.540 saw the video,
00:10:13.520 I was like,
00:10:13.900 no,
00:10:14.940 I don't think
00:10:15.620 that's a terrorist
00:10:16.820 attack,
00:10:17.440 but I actually
00:10:18.020 believed it at
00:10:18.620 first.
00:10:19.300 So I'm an idiot.
00:10:21.200 I will own
00:10:21.960 that.
00:10:24.580 Those Hamas
00:10:25.580 tunnels under
00:10:26.380 the hospital
00:10:27.000 in Gaza City
00:10:27.760 have been found
00:10:28.640 and they are
00:10:29.980 exactly as
00:10:31.120 Israel told us
00:10:32.000 they would be.
00:10:33.020 So it took
00:10:34.020 them a long
00:10:34.340 time to clear
00:10:34.900 them out
00:10:35.260 because a lot
00:10:35.740 of, I guess,
00:10:36.360 Hamas filled
00:10:37.620 them with sand
00:10:38.200 and tried to
00:10:38.860 block them
00:10:39.280 and hide them
00:10:39.800 and booby trap
00:10:40.800 them and everything
00:10:41.320 so it was
00:10:41.760 really, really
00:10:42.820 hard to clean
00:10:44.720 them out enough
00:10:45.420 so that you
00:10:45.900 could take a
00:10:46.560 camera down there
00:10:47.220 and show everybody.
00:10:48.260 But sure enough,
00:10:49.480 there were like
00:10:50.140 totally plumbed
00:10:51.320 bathrooms down there.
00:10:53.500 You know,
00:10:53.780 I didn't know
00:10:54.160 what you'd find
00:10:54.640 down there.
00:10:55.260 But it was
00:10:55.680 very sophisticated.
00:10:56.700 It was an
00:10:57.080 entire headquarters
00:10:58.240 network with
00:10:59.200 sleeping rooms
00:11:00.160 and air
00:11:00.500 conditioning.
00:11:01.880 Now, one of
00:11:02.400 the things,
00:11:03.020 one of the
00:11:03.360 ways that they
00:11:03.920 found the
00:11:04.400 tunnel is
00:11:05.700 that the
00:11:06.180 exterior air
00:11:07.300 conditioning
00:11:07.740 unit wasn't
00:11:10.060 attached to
00:11:10.640 any building.
00:11:12.500 So apparently
00:11:13.800 they found an
00:11:14.560 external air
00:11:15.460 conditioning unit
00:11:16.240 and they said,
00:11:16.760 I wonder what
00:11:18.180 this is attached
00:11:18.880 to.
00:11:19.520 And it was the
00:11:20.060 tunnel.
00:11:20.300 So that might
00:11:21.440 be one of the
00:11:21.880 ways they can
00:11:22.340 find the tunnels
00:11:23.020 is to look
00:11:24.820 for the
00:11:25.240 external AC
00:11:26.500 connections.
00:11:28.540 All right,
00:11:28.880 so that was
00:11:29.440 exactly what
00:11:30.000 they said.
00:11:30.580 I would say
00:11:31.060 that Israel's
00:11:32.680 situation is
00:11:35.260 looking good
00:11:36.060 in a bad
00:11:37.100 situation,
00:11:37.800 but looking as
00:11:38.420 good as it
00:11:38.800 can.
00:11:39.840 So the
00:11:40.620 ceasefire,
00:11:41.800 which of
00:11:42.440 course hit a
00:11:43.400 snag,
00:11:44.580 is anybody
00:11:44.960 surprised?
00:11:45.580 Is anybody
00:11:46.760 surprised that
00:11:47.600 the ceasefire
00:11:49.020 hostage exchange
00:11:50.200 hit a snag?
00:11:52.020 Well, maybe
00:11:53.220 that will get
00:11:53.640 worked out by
00:11:54.160 tomorrow.
00:11:54.740 They think it
00:11:55.220 might only be a
00:11:55.820 one-day snag.
00:11:57.120 I'm not so
00:11:57.900 sure.
00:11:59.120 I would say
00:11:59.900 the single
00:12:00.460 most predictable
00:12:01.440 thing about
00:12:02.140 this hostage
00:12:02.780 deal is that
00:12:04.120 there would be
00:12:04.480 more snags.
00:12:06.600 Having at
00:12:07.260 least one snag
00:12:08.180 is the most
00:12:08.840 predictable thing
00:12:09.560 in the world.
00:12:10.460 There might be
00:12:10.920 more.
00:12:11.620 So I'm not
00:12:12.380 entirely sure
00:12:13.040 it's going to
00:12:13.380 happen.
00:12:15.340 Are you?
00:12:16.580 If you had to
00:12:17.340 bet on it today,
00:12:18.920 do you think
00:12:19.380 that there will
00:12:19.840 be a clean
00:12:20.460 exchange of
00:12:21.300 the 50 for
00:12:22.000 the 150,
00:12:23.260 and then they'll
00:12:23.720 wait a few
00:12:24.100 days and go
00:12:24.660 back to war?
00:12:26.140 Does anybody
00:12:26.600 see it happening
00:12:27.260 like that?
00:12:28.340 Seems unlikely.
00:12:29.620 It's going to
00:12:29.920 be messy if
00:12:30.900 it happens at
00:12:31.400 all.
00:12:32.160 And it could
00:12:32.740 be a trick
00:12:33.340 by one side
00:12:34.540 or the other.
00:12:35.800 It could be
00:12:36.460 entirely fake.
00:12:37.760 There could be
00:12:38.120 no, you know,
00:12:39.440 Hamas might have
00:12:40.280 no intention
00:12:41.080 whatsoever.
00:12:42.780 Now, one of
00:12:43.160 the, quote,
00:12:44.120 snags I read,
00:12:45.520 I need a fact
00:12:46.260 check on this,
00:12:47.400 but one of the
00:12:47.940 snags was
00:12:49.500 the Red
00:12:50.120 Cross, or
00:12:51.580 maybe it's
00:12:52.020 the Crescent
00:12:52.960 Cross or
00:12:53.400 whatever it
00:12:53.700 is, they
00:12:56.560 wanted permission
00:12:57.280 to see the
00:12:58.280 remaining
00:12:59.160 hostages that
00:13:00.760 are not the
00:13:01.320 subject of the
00:13:01.960 exchange.
00:13:03.800 Do you
00:13:04.400 imagine there's
00:13:05.060 any chance of
00:13:05.780 that?
00:13:07.280 In what
00:13:08.020 world is
00:13:09.340 Hamas going
00:13:09.940 to show them
00:13:10.400 the other
00:13:10.840 hostages?
00:13:14.000 I mean, you'd
00:13:14.600 have to be high
00:13:15.280 to think that's
00:13:15.920 going to happen.
00:13:16.300 I mean, it's
00:13:16.860 good to ask
00:13:17.500 for it.
00:13:18.640 I mean, if I
00:13:19.520 were negotiating,
00:13:20.680 I would definitely
00:13:21.080 ask for it, but
00:13:22.480 really?
00:13:23.820 Does anybody
00:13:24.340 think that's
00:13:24.900 going to happen?
00:13:26.160 Got to ask.
00:13:26.920 Yeah, you got to
00:13:27.460 ask.
00:13:27.960 Because the way
00:13:28.740 they answer might
00:13:29.500 tell you something,
00:13:30.380 right?
00:13:31.060 The resistance they
00:13:32.420 put up might
00:13:33.380 actually be a clue
00:13:34.520 to them not being
00:13:36.160 in good shape or
00:13:36.960 something else.
00:13:38.800 No, they're never
00:13:39.540 going to let that
00:13:40.080 happen because that
00:13:40.820 would give access
00:13:41.920 to information about
00:13:43.160 where they're being
00:13:43.660 held, I suppose.
00:13:44.360 Even if you
00:13:45.660 blindfolded them or
00:13:46.720 whatever, they'd
00:13:47.860 still have some
00:13:48.600 idea where they
00:13:49.300 went.
00:13:51.740 Well, so we'll
00:13:53.200 see what happens.
00:13:54.180 All right, the
00:13:54.520 big sort of news,
00:13:56.200 it's either the
00:13:56.720 biggest news in the
00:13:57.540 world or not, is
00:13:59.720 that the current
00:14:01.340 belief about the
00:14:03.120 ChatGPT open AI
00:14:05.080 and Sam Altman
00:14:06.600 saga is that the
00:14:08.840 real issue was
00:14:11.140 that ChatGPT has
00:14:14.140 reached super
00:14:15.980 intelligence in the
00:14:17.820 lab, but not the
00:14:19.660 production one.
00:14:22.160 And that some say,
00:14:24.180 and I say this is
00:14:25.340 likely, when Sam
00:14:27.220 Altman said before
00:14:28.240 all this drama
00:14:29.280 started, Sam had
00:14:30.860 said at an event,
00:14:32.320 is this a tool we've
00:14:33.540 built or a creature
00:14:35.420 we've built?
00:14:37.020 Is it a tool or a
00:14:38.300 creature?
00:14:39.320 He was saying that a
00:14:41.160 week or so before all
00:14:42.780 this controversy blew
00:14:44.400 up?
00:14:45.740 And of course, that
00:14:46.740 caught people's
00:14:47.340 attention.
00:14:48.100 Like, hmm, is there
00:14:49.360 something you have in
00:14:50.280 the lab that we
00:14:52.100 haven't seen yet?
00:14:53.780 Like, because what
00:14:54.680 I've seen doesn't look
00:14:55.640 like a creature.
00:14:57.000 It looks like a
00:14:58.260 mimic, you know, like
00:14:59.300 a parrot or something
00:15:00.160 like that.
00:15:01.140 Well, I guess a parrot's
00:15:01.940 a creature.
00:15:02.680 But it looks mechanical
00:15:04.220 to me.
00:15:04.780 It doesn't come across
00:15:05.680 as human.
00:15:06.200 But maybe they had
00:15:08.860 something in the lab
00:15:09.700 that was a whole
00:15:10.200 different level.
00:15:11.300 And now we're hearing
00:15:12.980 probably.
00:15:15.320 And there's this
00:15:16.080 technology they don't
00:15:17.200 quite understand called
00:15:18.580 Q-learning, the letter
00:15:20.880 Q.
00:15:22.640 Can I join with all of
00:15:24.140 you for a moment and
00:15:25.320 say, why did it have to
00:15:26.860 be that letter?
00:15:28.540 Of all the letters, the
00:15:30.960 only ones it couldn't be
00:15:32.200 were X and Q, all
00:15:35.420 right?
00:15:35.700 That would still give
00:15:36.560 you a whole bunch of
00:15:37.460 letters that you could
00:15:39.480 have used that would
00:15:40.140 be untainted.
00:15:42.520 You know, even A is,
00:15:44.060 you know, adultery a
00:15:44.900 little bit.
00:15:45.360 But, you know, I'd use
00:15:46.620 an A.
00:15:47.600 A-learning, T-learning,
00:15:50.520 Z-learning, M-learning.
00:15:52.220 All good.
00:15:53.620 But Q, come on.
00:15:56.060 Come on.
00:15:57.200 Why do you have to mix
00:15:58.060 up Q with QAnon and
00:16:00.440 Q from Star Trek and
00:16:01.800 all this other stuff?
00:16:04.160 I think that letter is
00:16:05.160 overused.
00:16:05.880 Now, here's the first
00:16:08.240 thing I learned.
00:16:09.840 Apparently this idea,
00:16:11.440 this whole, what it is,
00:16:13.440 is a, it's an AI,
00:16:16.560 apparently well
00:16:17.440 understood process for
00:16:20.440 achieving AI.
00:16:22.120 So it's different than
00:16:23.200 the large language models
00:16:24.620 where they're just
00:16:25.240 looking for patterns in
00:16:26.560 human words.
00:16:28.080 This one would have
00:16:28.880 some higher level
00:16:29.740 reasoning skills.
00:16:31.800 But it's been around
00:16:32.580 since the 80s as a
00:16:33.980 concept.
00:16:35.360 It must be that, it
00:16:37.180 must be that there's
00:16:37.980 something happening.
00:16:41.440 It must be that there's
00:16:42.840 something happening
00:16:43.920 technologically that makes
00:16:46.680 this Q learning thing
00:16:47.900 practical.
00:16:49.320 You know, maybe it's the,
00:16:50.320 the speed of the
00:16:52.740 processors or the size of
00:16:54.280 the data set or something
00:16:55.120 like that.
00:16:55.460 So apparently this Q
00:16:57.500 learning, which was not
00:16:59.780 a big thing in our
00:17:00.840 current models, is going
00:17:02.060 to be the big thing in
00:17:03.360 the future.
00:17:05.240 Now, I don't quite
00:17:06.260 understand it.
00:17:07.760 I tried to do a little
00:17:09.140 bit of a deep dive so I
00:17:10.640 could get kind of the
00:17:11.660 general idea.
00:17:12.900 Here's the, here's the
00:17:13.980 closest I can get.
00:17:16.100 And just keep in mind,
00:17:17.180 I'm not going to do a
00:17:17.940 good job of this.
00:17:19.200 But just sort of to get
00:17:20.360 you in the same zip code
00:17:21.500 for what this is, because
00:17:22.520 it actually matters.
00:17:24.060 It's sort of, it's sort
00:17:25.060 of not just a big deal,
00:17:26.900 maybe the biggest deal
00:17:28.280 of all human history.
00:17:30.440 So that's why I'm going
00:17:31.620 into something that might
00:17:32.440 be a little more
00:17:32.940 technical and boring to
00:17:34.320 some of you.
00:17:35.100 It might be the biggest
00:17:36.180 thing in human history.
00:17:38.040 Probably not.
00:17:39.840 If I had to bet, I'd bet
00:17:41.660 against it.
00:17:42.980 But it's certainly in the
00:17:44.120 domain of it could be the
00:17:45.840 biggest thing that's ever
00:17:46.740 happened by far, like by
00:17:48.820 a million times, it could
00:17:50.240 be the biggest thing
00:17:50.920 that's ever happened.
00:17:52.520 But I guess the idea is
00:17:54.400 instead of just looking
00:17:55.420 for patterns that
00:17:56.440 already existed, the
00:17:58.340 Q-learning can kind of
00:18:00.720 try things and get
00:18:02.180 rewarded if it works, and
00:18:04.980 not get rewarded if it
00:18:06.320 doesn't work, and then
00:18:07.700 somehow rapidly iterate.
00:18:10.640 And I'm going to use
00:18:11.340 this term because I don't
00:18:12.840 think the technical people
00:18:14.820 would use this.
00:18:16.300 I think it imagines
00:18:17.800 scenarios.
00:18:18.820 You know how before you
00:18:21.540 decide to do something, you
00:18:23.520 imagine how it'll turn
00:18:24.780 out, and what you're doing
00:18:26.840 is you're using your
00:18:27.680 imagined future as the
00:18:30.040 basis for whether to do
00:18:31.340 it.
00:18:31.940 You imagine it turning out
00:18:33.480 well, okay, I'll do it.
00:18:35.880 Suppose you imagine it
00:18:37.200 turning out well, and then
00:18:38.800 you try something, whatever
00:18:39.820 it is, you try it, and it
00:18:41.880 doesn't work out quite the
00:18:43.280 way you imagined it.
00:18:44.140 What do you do?
00:18:44.660 Well, you reimagine it, you
00:18:47.620 know, the new way, and you
00:18:48.880 try a different thing, and
00:18:50.600 then you try to get your
00:18:51.540 imagination to match your
00:18:54.600 actual, and you're getting
00:18:56.260 rewarded when it matches, and
00:18:58.980 you're not getting rewarded
00:18:59.980 when it doesn't, so you very
00:19:01.140 rapidly iterate to the right
00:19:03.440 answer, and that looks
00:19:04.760 something like reason, I
00:19:06.120 guess, and it would look
00:19:08.600 like a whole level.
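[Editor's note: the loop described above — try an action, get rewarded or not, and fold the result back into your estimates — is roughly classic tabular Q-learning. A minimal sketch, assuming a toy 5-cell walk-to-a-goal environment; the environment and every name here are illustrative inventions, not anything from the episode or from OpenAI.]

```python
import random

# Minimal tabular Q-learning sketch of the "try, get rewarded, iterate" loop.
# Toy world: positions 0..4 on a line; reaching position 4 yields reward 1.

N_STATES = 5           # positions 0..4; position 4 is the rewarded goal
ACTIONS = [-1, +1]     # step left or step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.3

Q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q[state][action] value estimates

def step(state, action_idx):
    """Deterministic world: move, clamp to the line, reward only at the goal."""
    nxt = max(0, min(N_STATES - 1, state + ACTIONS[action_idx]))
    return nxt, (1.0 if nxt == N_STATES - 1 else 0.0)

random.seed(0)
for _ in range(500):                         # episodes of trying things
    s = 0
    while s != N_STATES - 1:
        # Mostly act on current beliefs, sometimes explore at random
        if random.random() < EPS:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        s2, r = step(s, a)
        # Core update: nudge the estimate toward reward + best estimated future
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2

# Greedy choice at each non-goal state after training (1 means "step right")
print([max((0, 1), key=lambda i: Q[s][i]) for s in range(N_STATES - 1)])
```

The one-line update is the whole trick: each estimate is repeatedly nudged toward the observed reward plus the best currently-estimated future, which is the "getting rewarded when it matches" iteration described above.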
00:19:10.500 Well, now, keeping in mind
00:19:13.920 that I might be doing the
00:19:15.020 worst job ever of explaining
00:19:16.740 this idea, but also the
00:19:19.280 people who understand it
00:19:20.700 maybe understand it at a
00:19:23.500 level that makes it
00:19:25.120 impossible to explain to
00:19:26.560 normies, you know what I
00:19:28.440 mean?
00:19:29.180 Like, if you really
00:19:30.040 understood it, you'd
00:19:31.420 probably say, you know,
00:19:32.260 honestly, I can't explain
00:19:33.380 this to you.
00:19:35.000 You're not smart enough, and
00:19:36.980 you don't have enough
00:19:37.580 background in the area, and
00:19:39.020 you don't have enough
00:19:39.680 math, and it would take
00:19:41.180 two days to do it.
00:19:42.800 Like, I feel like that's
00:19:43.880 the reality, that it just
00:19:45.720 can't be explained to
00:19:46.660 normal people, but it's
00:19:48.200 something like not looking
00:19:50.140 at patterns, that's the
00:19:51.720 current way, just patterns,
00:19:54.180 but it's more like imagining
00:19:56.280 and trying, imagining and
00:19:58.360 trying and matching and
00:19:59.580 getting rewarded and
00:20:00.640 imagining and trying.
00:20:01.980 Now, here's why this is
00:20:04.840 super interesting to me.
00:20:06.960 Does anybody remember,
00:20:08.180 probably for years now, I've
00:20:11.540 been telling you that we
00:20:13.540 already had the technology
00:20:15.060 to create consciousness in
00:20:17.280 AI?
00:20:18.560 Does anybody remember me
00:20:19.520 saying that?
00:20:21.000 I've been saying it for a
00:20:21.820 while, that we wouldn't
00:20:23.840 need to invent anything new
00:20:25.340 to get consciousness.
00:20:27.580 You would just have to
00:20:28.440 understand what consciousness
00:20:29.520 is.
00:20:31.240 So it's really about the
00:20:32.380 definition of what
00:20:33.220 consciousness is, because
00:20:35.640 under my definition of
00:20:37.260 consciousness, we've always
00:20:38.940 had the technology, well, not
00:20:40.000 always, but we've had the
00:20:41.080 technology for a while, maybe
00:20:43.260 five years or so.
00:20:44.840 And here's what I think it
00:20:46.100 is.
00:20:47.140 I think consciousness is
00:20:49.780 really not much more than
00:20:52.420 just imagining the future and
00:20:55.440 then comparing the actual to
00:20:56.980 it.
00:20:57.800 And that's it.
00:20:58.800 And if you can't imagine the
00:21:01.480 future and then see how your
00:21:03.160 actual turned out and then
00:21:04.740 adjust, you wouldn't have this
00:21:06.880 feeling called consciousness.
00:21:08.700 Because the large language
00:21:10.360 models, the ones we have now,
00:21:12.520 they're just looking at
00:21:13.220 patterns, you ask it a question
00:21:15.100 and it gives you a result,
00:21:18.040 and that's it.
00:21:19.380 It doesn't remember the question
00:21:20.740 and it doesn't remember the
00:21:21.760 result.
00:21:23.000 It has no consciousness.
00:21:24.300 It's simply like a cause and
00:21:26.280 effect, right?
00:21:28.160 But the AI, the new versions,
00:21:32.560 maybe, and again, by tomorrow
00:21:36.420 I'll probably, probably, in all
00:21:38.320 likelihood, tomorrow I'm going
00:21:40.120 to say, you know everything I
00:21:41.580 said about AI?
00:21:42.900 Somebody smart told me I got all
00:21:44.420 that wrong and just ignore all
00:21:45.780 that.
00:21:46.420 That might happen by tomorrow.
00:21:48.440 But at the moment, it looks like
00:21:50.940 the new version, the Q stuff, has
00:21:53.400 something like an imagination, the
00:21:56.900 way they explain it is that the
00:21:59.900 AI is training itself on AI's own
00:22:05.120 output.
00:22:06.980 Let me say that in a better way.
00:22:09.200 The current version of AI takes the
00:22:11.460 things that exist in the world, like
00:22:13.600 real live things, things that people
00:22:15.340 said, stuff on the internet, and it
00:22:17.960 uses those patterns.
00:22:19.840 AI can use all of that stuff, the
00:22:22.040 real stuff, plus it can kind of
00:22:24.860 imagine new things that had never
00:22:26.820 existed anywhere, and then take the
00:22:28.920 new thing it imagined and add it to
00:22:31.660 the body of stuff that exists and use
00:22:34.180 the combined exist plus imagined for
00:22:38.380 its new level of intelligence.
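[Editor's note: the "combined exist plus imagined" idea can be caricatured as a self-training loop: generate candidates from current beliefs, keep the ones that check out, and retrain on the enlarged corpus. A toy sketch, with every value and function an invented stand-in rather than anything from OpenAI's actual system.]

```python
import random

# Toy self-training sketch: start from real data, imagine new samples from
# current beliefs, keep the ones that "work out," fold them back in.
# All names and numbers here are illustrative assumptions.

random.seed(1)
real_data = [2.0, 4.0, 6.0]             # "things that exist in the world"

def worked_out(x):
    # Stand-in for checking an imagined thing against reality
    return abs(x - 4.0) < 1.0

corpus = list(real_data)
for _ in range(3):                       # a few imagine-and-check rounds
    mean = sum(corpus) / len(corpus)     # the model's current "beliefs"
    imagined = [mean + random.uniform(-2, 2) for _ in range(5)]
    corpus += [x for x in imagined if worked_out(x)]  # keep the successes

print(len(real_data), len(corpus))       # real samples vs. real + imagined
```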
00:22:40.620 Now, that's sort of the complicated,
00:22:42.740 mix-it-all-together way of saying it.
00:22:44.740 If you strip out what's different,
00:22:46.720 what's different is that it imagines
00:22:49.840 things and then sees how close it gets
00:22:52.340 to its imagination.
00:22:53.840 That's my own version of it.
00:22:56.280 That's consciousness.
00:23:00.380 Do you understand why they might have
00:23:02.280 been so crazy on the board?
Imagine you're the board of ChatGPT.
00:23:10.000 I don't know if this is true, by the
00:23:11.280 way.
00:23:11.580 This is just sort of fun speculation.
00:23:13.680 Imagine you're on the board.
00:23:14.760 You're an ordinary human being, one of
00:23:18.900 eight billion human beings.
00:23:20.720 You're just one of the many people who
00:23:23.160 have lived and died in the history of
00:23:24.840 human civilization.
00:23:26.260 You're not more important.
00:23:28.460 You're not less important than every
00:23:30.580 other human being.
00:23:32.440 And suddenly, it comes to your desk to
00:23:36.760 decide whether or not to authorize the
00:23:39.760 creation of a conscious super-entity.
00:23:45.220 How would you like that to be your
00:23:46.720 decision?
00:23:49.000 Are you good with that?
00:23:51.420 Everybody comfortable with that?
00:23:55.140 It's literally the most dangerous thing
00:23:58.100 that human civilization has ever seen, by
00:24:02.100 far.
00:24:02.840 Could be like a million times more
00:24:04.600 dangerous than the most dangerous thing
00:24:06.920 humanity's ever seen.
00:24:07.940 Yeah, and I'm talking about, you know,
00:24:10.680 meteors and, you know, everything.
00:24:14.760 How would you like to make that decision?
00:24:17.340 That would be pretty scary.
00:24:19.160 And one thing it would do is it would
00:24:21.040 make you crazy.
00:24:23.380 Am I wrong?
00:24:24.800 Think about the craziness that Donald
00:24:27.440 Trump creates in half of the country.
00:24:30.200 He makes people actually crazy.
00:24:32.020 It doesn't take that much to make people
00:24:35.960 just crazy.
00:24:38.920 So, yeah, it's like Oppenheimer times a
00:24:42.580 million, right?
00:24:45.980 So, have some, here's what I'd say.
00:24:49.200 Have some empathy for the fact that
00:24:53.340 your fellow human beings were tasked with
00:24:59.460 taking the responsibility of all of our
00:25:02.460 lives forever.
00:25:05.640 They actually have the responsibility for
00:25:08.420 safeguarding all of our lives forever.
00:25:13.340 The history of humankind.
00:25:15.860 Have you ever gone to a business meeting
00:25:17.540 where the literal history, the future of
00:25:21.140 humankind depended on what you did in the
00:25:24.060 meeting?
00:25:25.260 I mean, just imagine that.
00:25:26.240 You can't even imagine.
00:25:26.980 It's actually beyond imagination.
00:25:30.000 So, what would you predict if such a
00:25:33.020 thing came up?
00:25:34.460 I'll tell you what I'd predict.
00:25:36.680 Chaos.
00:25:38.180 I would predict something very much like
00:25:40.440 what we saw.
00:25:41.760 People would be out of their minds.
00:25:45.180 They would not be able to process the
00:25:48.240 set of problems in any kind of model that
00:25:51.260 they've ever dealt with before.
00:25:52.860 It wouldn't be like anything else.
00:25:54.880 It would be completely new.
00:25:58.540 And so, I have complete empathy for
00:26:01.300 everybody who was trying to figure it
00:26:04.060 out.
00:26:05.120 And you know what?
00:26:05.740 I would like to thank them all.
00:26:08.000 I would like to thank the slow, you know,
00:26:10.880 the so-called go slow people and the
00:26:13.780 so-called go fast people.
00:26:15.320 I don't know if that's as clean a
00:26:16.880 distinction as people say, but, you
00:26:19.080 know, the idea.
00:26:19.680 I'd like to thank them because if they
00:26:23.600 made a big fight out of it and
00:26:25.260 therefore made us think deeper about
00:26:27.020 it, perhaps Microsoft put some extra
00:26:29.420 controls on there.
00:26:30.540 Maybe Satya got involved and said, you
00:26:33.760 know, I like what you're doing, but you
00:26:35.020 better, you know, disconnect it from the
00:26:36.880 rest of the internet until we know.
00:26:38.380 Maybe something like that.
00:26:40.200 Now, I don't know if we're safe.
00:26:43.120 I don't know.
00:26:43.780 But, it's really interesting to think
00:26:47.680 about.
00:26:48.720 And I do think that if it's doing what
00:26:51.720 they say, it probably presents itself
00:26:54.860 like it's conscious.
00:26:56.920 It probably presents itself like it's
00:26:59.720 conscious.
00:27:01.020 Because, in a sense, it can dream.
00:27:05.620 And once this AI is connected to the
human beings, the human beings become
its, like, central nervous system.
00:27:15.520 So, the only thing that this
00:27:16.800 presumed, you know, super
00:27:18.480 intelligence that may or may not
00:27:19.960 exist behind the scenes, the only
00:27:22.160 thing it lacks is a central nervous
00:27:25.320 system.
00:27:27.060 And that's the rest of humanity.
00:27:29.500 Because as soon as we're connected to
00:27:31.100 it, then it can test anything it wants
00:27:33.840 to test using us.
00:27:36.820 So, it could say, for example, hey,
00:27:39.000 let's try out this way of thinking.
00:27:41.460 Let's say a reframe.
00:27:43.600 And then it sends it out.
00:27:45.480 And a million people immediately give a
00:27:47.220 reaction.
00:27:47.780 It goes, oh, that one didn't work.
00:27:50.180 Let's try this.
00:27:51.860 Because remember, the Q learning is a
00:27:54.160 reward, try and reward.
00:27:56.600 So, humanity would become its reward
00:27:58.800 mechanism.
00:28:01.600 So, if it tries something and humans act
00:28:04.020 in a way that gets a good result,
00:28:06.580 humans will feed back, oh, positive
00:28:08.660 reinforcement.
00:28:09.980 So, we become the central nervous
00:28:11.800 system of this entity, which is
00:28:16.080 effectively God.
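The "try and reward" loop being gestured at here is, in textbook form, the Q-learning update: each action's value is nudged toward the reward it produced plus the discounted best value reachable afterward. A minimal sketch of standard tabular Q-learning (not whatever the rumored "Q*" actually is):

```python
# Textbook tabular Q-learning update (a sketch; not the rumored "Q*" itself).
# Q maps state -> {action: value}. After trying `action` in `state` and
# observing `reward`, the value is pulled toward reward plus the discounted
# best future value -- positive feedback raises it, negative lowers it.
def q_update(Q, state, action, reward, next_state, alpha=0.5, gamma=0.9):
    best_next = max(Q[next_state].values()) if Q.get(next_state) else 0.0
    old = Q.setdefault(state, {}).setdefault(action, 0.0)
    Q[state][action] = old + alpha * (reward + gamma * best_next - old)
    return Q[state][action]
```

In the analogy above, human reactions would play the role of `reward`: a positive response nudges that behavior's value up, a negative one nudges it down.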
00:28:19.160 Now, here's the interesting part.
00:28:22.400 You know how some of the ancients
00:28:24.600 believed there were multiple gods?
00:28:27.000 You know, the god of thunder and the
00:28:28.800 god of this or that.
00:28:30.080 And others said, oh, there's only one
00:28:31.660 god.
00:28:31.940 But, in the next phase of human
00:28:36.160 existence, unless the first one destroys
00:28:39.260 everything so there can never be another
00:28:40.980 one, which I don't think will happen, it
00:28:43.160 won't be connected enough, won't we be
00:28:46.220 birthing multiple gods?
00:28:49.020 Will not each version of this have a god-like
00:28:52.600 ability?
00:28:53.680 And once it's connected to humans, it's
00:28:56.260 magnified?
00:28:56.920 We're going to have a multiple gods
00:29:00.700 situation.
00:29:02.140 And if you said to yourself, hey, these
AIs might not care about us, and they
00:29:08.300 might wipe us out without even caring, I
00:29:11.620 would say far more likely they would fight
00:29:13.660 each other.
00:29:15.680 Far more likely the AIs will be in a
00:29:17.900 battle with each other, because they
00:29:19.760 wouldn't see us as a threat.
We might develop some AIs that are the
police to try to catch the other AIs from
00:29:29.620 stealing your money.
00:29:31.620 So it's going to be a battle of the AI
00:29:33.400 gods, and I think people are going to be
00:29:35.420 just watching and hoping we win, you know,
00:29:38.060 or your side wins before the AI kills you
00:29:41.160 too.
00:29:41.720 All right.
00:29:42.560 No, it won't do that.
00:29:43.900 You'll be fine.
00:29:46.420 All right.
00:29:47.080 What else is happening?
00:29:47.760 Apparently, Google's version of AI, and this
00:29:52.660 is still the traditional AI, can now let
00:29:55.520 you talk to your YouTube videos.
00:29:58.820 Let me say that again.
00:30:00.740 You can now talk to a video.
00:30:04.380 And by that, I mean, if you wondered what
00:30:07.700 I said in this live stream, you could go to
00:30:10.900 YouTube afterwards, after it's posted in its
00:30:13.760 recorded form, and you could go to Google
00:30:16.580 Bard, and you could say, hey, what'd this
00:30:19.000 guy talk about?
00:30:20.540 And I'll give you a bullet point summary of
00:30:22.600 what I talked about.
00:30:23.960 Or you could say, hey, what was the time
00:30:26.460 stamp of when Scott was talking about Q
00:30:30.200 learning?
00:30:31.480 Boop.
00:30:33.500 Is that impressive?
00:30:35.420 That's crazy.
00:30:37.240 That is crazy.
00:30:40.660 It's just, it's like, I almost can't hold
00:30:42.880 that in my mind.
00:30:43.640 That is so cool.
00:30:45.320 And for me, it would be transformative.
00:30:47.420 Because there's tons of stuff I want to know
00:30:49.460 that people, how many times does this happen?
00:30:53.160 Now, because of what I do, people are always
00:30:55.100 sending me videos, just all the time.
00:30:58.180 And there's no way I could watch them all.
00:31:01.140 Because there's just more videos that people
00:31:02.820 think I should see than I have time for.
00:31:04.920 But what if Q could take every video that
00:31:07.980 somebody says I should see, and turns it into
00:31:10.300 three bullet points, so I could just like
00:31:13.220 scream through all of it?
00:31:15.300 My God.
00:31:17.460 That's just an amazing ability.
00:31:20.500 Some smart people say that Google will be
00:31:22.960 the winner in AI, because it's going to
00:31:26.060 build AI into the tools you already use.
00:31:28.960 And I think that's already happening right now.
00:31:31.560 So I believe everything from YouTube to Gmail
00:31:35.300 to your calendar will have an AI augmentation.
00:31:41.400 So you should be able to talk to your calendar,
00:31:44.800 basically just talk to all of your Google products.
00:31:48.980 Crazy.
00:31:50.540 Now, I think that the phase after this is that
00:31:54.600 there are no Google products.
00:31:56.060 I think email will go away, you know,
00:31:59.880 spreadsheets will go away, because of AI.
00:32:03.460 In other words, it won't be a spreadsheet app.
00:32:06.400 You'll just go to AI and say,
00:32:08.340 hey, AI, I want to figure out my budget.
00:32:10.620 Could you grab all of my banking information
00:32:12.840 and just put it in some kind of a budget form
00:32:15.480 so I could see it?
00:32:16.880 And that will be your spreadsheet.
00:32:18.740 And you'll have, you know, you'll never have to know
00:32:21.220 how to do a formula or how to add up a row or anything.
It'll just do it for you, because you wanted it to.
00:32:26.640 Same with email.
00:32:28.900 Does it make sense to have email?
00:32:31.200 Not if you have AI.
00:32:32.700 AI should look at all your incoming messages,
00:32:35.440 no matter what mechanism they come in,
00:32:38.100 and it should propose responses
00:32:40.260 and then just present them to you in summary form.
00:32:45.020 It's like, hey, somebody asked you this.
00:32:47.480 I'm going to tell them that.
00:32:49.000 Okay.
00:32:50.040 That sort of thing.
00:32:50.800 Yeah, you're not going to need email or spreadsheets
00:32:53.540 or, like, really any apps.
00:32:56.060 If you can talk to AI, you don't need apps.
00:32:58.780 I don't think...
00:32:59.540 I haven't heard other people say this yet.
00:33:01.620 Have you?
00:33:04.480 Have you heard other people say apps will go away?
00:33:08.320 To me, that's really obvious.
00:33:09.620 And it's the biggest thing that's going to happen in the industry
00:33:13.700 because we have an app-based everything.
00:33:16.540 Everything's an app.
00:33:18.020 So what happens when there's no apps?
00:33:20.600 We're going to find out really, really quickly.
00:33:22.760 Maybe a year from now, there won't be any apps.
00:33:24.940 That's possible.
00:33:25.640 That, by the way, I remind you, is why I sold my Apple stock.
00:33:30.820 Not because Apple won't one day be the best in AI
00:33:34.000 and have even better products than now.
00:33:37.160 There's a good chance they will.
00:33:38.840 But it takes it from a monopoly business
00:33:41.600 where you really can't get out of the Apple ecosystem once you get in.
00:33:46.160 It's just too easy to stay in the ecosystem.
00:33:48.780 It's like a monopoly.
00:33:49.580 But it's just going to be an open competition in about a year.
00:33:54.160 And then they just have to compete with everybody else
00:33:57.000 as if they were startups.
00:33:59.120 So that wasn't as comfortable to me, so I got rid of that.
00:34:02.760 I do not make economic or financial recommendations.
00:34:06.520 That is not a financial recommendation.
00:34:08.720 Do not follow my financial advice.
00:34:10.520 I'm very bad at financial advice.
00:34:12.960 All right.
00:34:13.140 Here's an interesting exchange, and Elon Musk is part of this again.
00:34:22.560 So Emily Chang on the X platform posted this, said,
00:34:29.380 one of the biggest tasks ahead for OpenAI,
00:34:32.700 that's Sam Altman's company that he's back to,
00:34:36.200 one of the biggest tasks ahead is building a new board,
one that includes women.
00:34:40.340 And then she goes on to name several women
00:34:43.480 whose names were floated for the board of OpenAI.
00:34:46.760 And some were considered and dismissed, et cetera.
00:34:50.600 So Elon Musk responds to that with a comment.
00:34:54.060 He says, what matters is having a director
00:34:56.760 who deeply understands AI and will stand up to Sam.
00:35:00.900 Human civilization is at stake here.
00:35:04.180 Okay.
00:35:05.000 Well, those are two points of view,
00:35:07.700 kind of two different points of view.
00:35:08.980 On one hand, it would be a good idea
00:35:11.840 to protect human civilization
00:35:13.760 from its greatest existential threat.
00:35:17.860 That's a pretty good point.
00:35:19.500 It would be a good idea to have just the best people
00:35:21.600 protecting civilization from its existential threat.
00:35:25.500 Yeah, it's a good point.
00:35:27.000 But on the other hand, and you have to weigh this equally,
00:35:29.680 on the other hand, having a board
with the right acceptable ratio of vaginas to penises,
00:35:37.080 well, that's important too.
00:35:38.120 So if you had to give up a little bit
00:35:41.840 on the protecting the existential risk
00:35:46.280 to human civilization,
00:35:48.640 you might want to give up a little bit on that
to get just the right ratio of vaginas to penises.
00:35:56.040 Have I ever told you that I have this theory
00:36:03.000 that although each individual is very different,
00:36:06.280 people are infinitely different and unique,
00:36:09.200 that there are some generalities that still hold?
You'd have to, you know, not be discriminatory...
00:36:16.820 You don't want to discriminate and act like everybody's in these generalities
00:36:19.820 because individuals are very different, as I say.
00:36:22.200 But generally speaking, it is my theory
00:36:26.080 that biologically and evolutionary-wise,
women are nurturers, mostly,
00:36:33.320 and men are hunters and protectors.
If you want something killed, call a man.
If you want something shared nicely among people, call a woman.
If you want somebody to kick somebody's ass, call a man.
00:36:51.760 If you want somebody to make people get along and have a nice meal
00:36:55.800 and, you know, live in a happy life, call a woman.
00:37:00.180 Now, of course, tons of exceptions, right?
00:37:03.540 There are women who can fight better than men.
00:37:05.860 There are men who can, you know, who are more nurturing than women.
And there are non-binaries all over the place.
00:37:12.540 But sort of as a general statement of our biological truth,
in general, women are superior nurturers
00:37:22.300 and men are better at protecting and killing stuff.
00:37:24.600 Generally speaking.
00:37:26.840 Here's a good example.
00:37:29.200 Here's a woman, Emily Chang,
00:37:33.860 who has a good instinct toward making the world a fairer,
00:37:40.640 you know, more collegial place.
00:37:43.180 Kind of nurturing.
00:37:44.680 And so from her point of view, it would be nice to have a board
00:37:47.280 that represented men and women
00:37:49.380 and might also give you some, you know, diverse opinion in that world
00:37:53.400 and might even be better.
00:37:55.580 But in Elon Musk, who's a male,
00:37:58.980 he's saying, and I would agree with his take,
00:38:01.760 this is one of those areas where you don't do that.
00:38:07.080 Because this is defending the entire population of Earth.
00:38:10.640 And no matter how many times I tell you,
00:38:15.020 I'm going to keep telling you,
00:38:17.020 you should never discriminate by sex or race or religion or ability,
00:38:24.740 you know, disability,
00:38:25.920 for a job.
00:38:29.020 You know, you don't want to do it in your personal life.
00:38:31.480 You don't want to deny somebody renting your building.
00:38:35.080 So for commerce and personal life and all of your individual in-person stuff,
00:38:41.300 don't discriminate.
00:38:43.840 It's immoral.
00:38:44.820 It doesn't help you.
00:38:45.680 It doesn't help the person you discriminate against.
00:38:47.740 It doesn't help society.
00:38:49.120 Everybody loses.
00:38:49.720 There's one exception.
00:38:53.400 Self-defense.
00:38:54.100 When it comes to self-defense,
00:38:56.860 you can discriminate as much as you want.
00:38:59.060 There's nothing immoral or even a little bit wrong with that.
00:39:03.620 For self-defense,
00:39:04.760 the only thing that matters is you live.
00:39:07.160 Or you protect your family or you protect your country.
00:39:10.600 Everything else is secondary.
So you do not count the vaginas to the penises when it's self-defense.
00:39:17.160 When you say to yourself,
00:39:18.740 you know,
00:39:19.360 should we concentrate our recruitment on men or women for special forces?
Probably concentrate on men
00:39:27.440 because you're going to get a better result on average.
00:39:30.540 Does that say there isn't a woman somewhere
00:39:33.300 who would be the best special forces person of all time?
00:39:37.200 No, that could be possible.
00:39:38.860 Nobody's saying that individuals are the same.
00:39:42.180 I'm saying that in general,
00:39:43.560 if your biggest mission is protecting the country and staying alive,
you don't fuck with this other stuff.
00:39:49.480 Ever.
There's a reason that our military doesn't take senior citizens
00:39:53.740 because they're not as good.
00:39:56.680 There's a reason that we do
00:39:58.180 allow people of all races in the military
00:40:00.600 because we've tested it.
00:40:02.660 And it works out.
00:40:03.940 If it didn't work,
00:40:04.920 we wouldn't do it.
00:40:06.340 You know,
00:40:06.620 even though it's racist,
00:40:07.780 we wouldn't do it.
00:40:08.580 If it didn't work.
00:40:09.980 So, I mean,
00:40:10.680 we've tested it.
00:40:11.680 It works.
00:40:14.340 I'm sure if we tested senior citizens,
00:40:16.400 we wouldn't get the same result.
00:40:18.480 And nobody has any problem with that.
00:40:20.620 There's no senior citizen group.
00:40:22.560 The AARP is not lobbying
00:40:25.880 to get more senior citizens in the military.
00:40:29.040 Nobody thinks it's a good idea.
00:40:31.680 So when Musk says,
00:40:33.400 you know,
00:40:34.280 some version of,
00:40:35.440 let's not make sure that
the vagina to penis ratio is the main thing.
00:40:39.740 How about the protecting humanity
00:40:41.640 is it.
00:40:43.480 And you do that the best you can.
00:40:45.440 And you don't worry about anything else.
00:40:47.920 Let's see if there's another story
00:40:49.580 in the news today
00:40:50.400 that reminds you of this one.
Turns out that the Dutch
00:40:56.020 over in the Netherlands there
00:40:58.020 have,
00:40:59.100 it looks like they've elected
00:41:00.000 what some call
00:41:02.000 right-wing Geert Wilders.
00:41:05.540 And he was being asked
00:41:07.100 about his views on mass immigration
00:41:08.920 and multiculturalism,
00:41:12.480 of which he's,
00:41:14.500 let's say,
00:41:15.240 has some provocative thoughts.
00:41:17.840 And he was asked by a journalist,
00:41:20.180 I was watching this on a video today,
00:41:22.420 and he says,
00:41:23.860 is it wise for a politician
00:41:25.200 to ostracize one group?
00:41:28.260 Because now the context is
00:41:29.780 that Geert Wilders was saying
00:41:33.000 that groups,
00:41:34.280 some of the immigrant groups
00:41:35.580 from some countries,
00:41:37.280 had a 20 times higher crime rate.
00:41:40.920 And that you couldn't ignore that.
00:41:43.340 So the journalist says,
00:41:44.420 is it wise for a politician
00:41:45.440 to ostracize one group?
00:41:47.600 And Wilders says,
00:41:49.240 I don't,
00:41:50.280 oh,
00:41:50.720 because,
00:41:51.100 and then the journalist says,
00:41:52.640 but couldn't the real problem
00:41:53.920 be economic?
00:41:56.300 That if you looked at people
00:41:57.440 who have,
00:41:58.040 you know,
00:41:58.300 low income,
00:41:59.400 there's a higher crime rate.
00:42:01.200 So is that right to say
00:42:02.720 that the Moroccans,
00:42:03.940 and they used the Moroccans
00:42:04.920 as an example,
00:42:05.940 says,
00:42:06.300 would it be right to say
00:42:07.220 that the Moroccans
00:42:08.080 are a problem?
00:42:08.600 Or is it really
00:42:09.960 more about economics?
00:42:12.200 That's a good question,
00:42:13.360 right?
00:42:14.200 That's exactly the right question.
00:42:16.660 Now I'm going to tell you
00:42:17.780 exactly the right answer.
00:42:19.580 Because it is the right question.
00:42:21.960 Here's the right answer
00:42:22.860 from Geert Wilders.
00:42:24.940 I don't care why
00:42:25.920 they're overrepresented.
00:42:29.720 And we're done.
00:42:31.720 Thank you.
00:42:33.240 Yeah,
00:42:33.680 because this is national defense
00:42:35.440 for the Dutch.
The Dutch have an existential threat
00:42:39.960 to their country.
00:42:41.460 And when you tell them,
are the Moroccans a problem
because of their economic situation?
Or are the Moroccans a problem
00:42:48.240 for some other reason
00:42:50.520 that would sound racist?
00:42:53.020 And Geert says,
00:42:53.840 I don't care.
00:42:56.180 That is exactly
00:42:57.580 the right answer.
00:43:00.840 Geert Wilders is a male.
00:43:03.140 He's a man.
00:43:03.740 So when you ask this man
00:43:06.040 who is presumably
00:43:08.920 one of many men
00:43:10.460 who evolved
00:43:11.100 for knowing how to protect you
and killing stuff
00:43:14.980 if he needs to,
00:43:16.500 when you ask him
00:43:17.500 how much of the wokeness
00:43:19.660 should we factor
00:43:20.480 into staying alive,
00:43:22.260 Geert Wilders gives you
00:43:23.920 the only right answer.
I don't care about wokeness.
00:43:27.880 It's not even in my conversation.
00:43:30.460 That's something
00:43:30.840 you could talk about.
00:43:31.640 But we're talking about
00:43:33.200 defending the country.
00:43:34.820 Sorry.
00:43:35.600 It's not part of the conversation.
00:43:37.920 I'm going to look at the numbers
00:43:39.720 and then I'm going to manage
00:43:41.380 to the numbers
00:43:42.060 that we're sure are right.
00:43:43.640 And, you know,
00:43:44.380 they're pretty good at crime numbers.
00:43:46.040 He says,
00:43:46.520 it's not my problem.
00:43:47.880 Don't care.
00:43:48.580 I'm not even going to get
00:43:49.320 into the conversation.
00:43:50.320 Why?
00:43:52.320 Thank you.
00:43:53.320 That is exactly
00:43:54.680 the right answer.
00:43:55.420 The better answer
00:43:56.760 is that we don't
00:43:59.140 consider those factors
00:44:00.820 when staying alive
00:44:03.100 is the question.
00:44:05.360 When staying a country
00:44:06.840 and not devolving into chaos
00:44:09.300 is the question.
00:44:11.060 You don't ask those questions.
00:44:13.700 But,
00:44:14.340 in order to sound
00:44:15.600 not like some kind of
00:44:17.380 Hitler, fascist, racist person,
00:44:19.780 you should make sure
00:44:20.820 that you say what I say.
00:44:22.040 It is immoral
00:44:24.840 and inefficient
00:44:26.340 and bad for everybody
00:44:27.600 if you discriminate
00:44:29.340 on any of those factors
00:44:30.820 for any individual
00:44:32.940 in person.
00:44:34.500 But if you're doing it
00:44:35.780 on a statistical level
00:44:37.540 and you're talking about
00:44:39.240 people who are not yet citizens
00:44:40.820 versus your citizens
00:44:42.360 you're trying to protect,
00:44:44.040 discriminate all you want.
00:44:46.000 There's no barrier
00:44:47.200 to discrimination
00:44:48.080 for your self-protection.
00:44:50.120 And, more importantly,
00:44:51.400 nobody gets to second-guess you.
00:44:54.480 Suppose we sat over here
00:44:55.880 and said,
00:44:56.200 you know what?
00:44:57.040 I think you Dutch
00:44:57.960 are overreacting.
00:44:59.840 Yeah, maybe there's
00:45:00.540 like a little problem
00:45:01.640 extra of the immigrants
00:45:02.700 but really,
00:45:03.640 but really,
00:45:04.240 we Americans think
00:45:05.080 you're overreacting
00:45:05.900 and you shouldn't do that.
00:45:07.480 Do you know what
Geert Wilders
00:45:08.900 should say to us?
00:45:10.520 I don't care.
00:45:12.440 I don't care
00:45:13.420 what your opinion is.
00:45:15.980 That's the right opinion.
00:45:16.940 The right opinion
00:45:18.060 is nobody second-guesses
00:45:20.160 self-defense.
00:45:22.860 Self-defense
00:45:23.500 is only,
00:45:24.900 only the business
00:45:26.040 of the individual
00:45:26.860 or in this case
00:45:28.060 the country.
00:45:29.240 It is nobody else's business.
00:45:31.460 Now this is the same attitude
00:45:32.540 that I take
00:45:33.220 to the Gaza situation.
00:45:36.900 In Gaza,
00:45:38.120 you can say to yourself,
00:45:39.720 hey,
00:45:40.380 in my opinion,
00:45:41.520 sitting safely
00:45:42.240 over here in America,
I think Israel
00:45:44.820 should not go so hard
00:45:45.960 on Gaza.
00:45:47.580 Do you know what
Israel should say
00:45:49.220 to that opinion?
00:45:51.360 I don't care.
00:45:53.440 They should have
00:45:54.260 no care
00:45:55.720 what our opinion is
00:45:57.820 of how they protect
00:45:58.680 themselves.
00:45:59.440 Now they of course
00:46:00.180 have to manage
00:46:00.820 public opinion.
00:46:02.420 So managing public opinion
00:46:03.800 does matter,
00:46:04.880 right?
00:46:05.200 That's got a cost
00:46:05.960 and a benefit to it.
00:46:07.440 So they have to do that.
00:46:09.700 But they should not
00:46:11.140 under any condition
00:46:13.220 do it because
00:46:14.680 we think it's a good idea.
00:46:17.640 We don't get a vote.
00:46:19.640 Nobody gets a vote
00:46:20.580 on their self-defense.
00:46:22.500 And that's why I say,
00:46:23.940 don't ask me
00:46:24.420 if it's right or wrong.
00:46:26.120 It's going to happen.
00:46:27.940 That's it.
00:46:29.700 Yeah.
00:46:29.980 You can talk about
00:46:30.880 what to do about it
00:46:31.920 after the fact,
but Israel's going to do
00:46:34.300 what they're going to do
00:46:35.020 and nothing's going to stop it.
00:46:39.840 All right.
00:46:40.460 And of course,
00:46:42.160 there's a trend
00:46:43.180 forming now.
00:46:44.680 Some people are calling
00:46:45.600 this,
00:46:46.280 well,
00:46:46.880 there's the Dutch Trump
00:46:49.040 because it largely
00:46:51.260 is immigration,
00:46:52.060 I guess.
00:46:52.900 And they're also calling
Milei,
the new head of Argentina,
the Argentine Trump.
00:46:59.580 So now the American Trump
00:47:01.500 has a dominant position
00:47:03.920 in the polls.
00:47:05.500 And we have
00:47:06.320 two key countries
00:47:08.380 that just went
00:47:09.840 full Trump.
00:47:14.700 I don't know.
00:47:16.360 Does it feel like
00:47:17.220 2016 all over again?
00:47:19.900 Do you remember
00:47:20.320 when Brexit passed
00:47:21.500 and people said,
00:47:22.660 oh,
00:47:23.160 this is telling us
00:47:24.180 Trump's going to get elected
00:47:25.620 and then it kind of did?
00:47:27.860 It feels like that.
00:47:29.020 Yeah,
00:47:29.140 it feels like the
00:47:29.940 pendulum has swung back.
00:47:32.320 All right.
00:47:37.520 Claire McCaskill.
What was she?
00:47:40.960 Was she a House
00:47:41.880 representative at one point?
00:47:43.720 Is she still,
00:47:44.400 she's not still in office,
00:47:45.600 is she?
Who's Claire McCaskill?
00:47:48.780 She was on MSNBC,
00:47:50.340 senator from Missouri.
00:47:51.760 Still a senator?
00:47:53.380 Is she still
or did she retire?
00:47:55.700 Ex-Missouri senator,
00:47:57.020 but now on MSNBC,
00:47:58.280 I guess.
00:47:59.320 Or at least she was
00:47:59.920 appearing there.
00:48:00.420 I don't know
00:48:00.820 if she's employed there.
00:48:02.260 But anyway,
00:48:02.680 she was saying
00:48:03.720 that Trump
00:48:05.700 is more dangerous
than Hitler.
00:48:10.460 All right.
00:48:11.700 I'm going to suggest
00:48:12.720 a way to handle this.
00:48:15.820 But first,
00:48:16.440 I'm going to explain
00:48:17.240 exactly what she said
00:48:18.340 so you can really feel it.
00:48:20.300 First of all,
00:48:20.920 you know that Trump
00:48:21.600 referred to,
00:48:22.840 I don't know
00:48:23.640 who he was talking about,
00:48:24.420 but he referred to vermin.
00:48:26.740 Remind me,
00:48:27.340 who was Trump
00:48:27.800 talking about
00:48:28.380 when he said vermin?
00:48:30.420 He was referring to,
00:48:35.180 it doesn't matter,
00:48:37.380 does it?
00:48:37.980 He wasn't referring
00:48:38.740 to American citizens,
00:48:39.780 was he?
00:48:44.660 MS13?
00:48:46.200 Oh, he was talking
about MS13 as vermin?
00:48:49.340 He's probably
00:48:49.960 used that before.
00:48:51.220 Here,
00:48:51.680 I'd like to test this out.
00:48:53.420 This didn't work
00:48:54.220 when I tested it
00:48:54.940 on Locals
00:48:55.480 before I went live
00:48:56.380 on YouTube.
00:48:57.400 But let me test it
00:48:58.120 with the YouTube people.
00:48:58.940 I grew up
00:49:00.400 in upstate New York
00:49:01.500 and when I went
00:49:02.620 to college,
00:49:03.280 there were a lot
00:49:03.580 of people
00:49:03.860 from the New Jersey,
00:49:05.000 New York area
00:49:05.740 in college.
00:49:07.680 So I was friends
00:49:08.620 with a lot of people
00:49:09.860 who were born
00:49:10.860 and raised in that area.
00:49:12.520 And in those days,
00:49:14.040 the word vermin
00:49:14.820 was somewhat common
00:49:17.440 from people
00:49:18.300 born in that area.
00:49:19.220 In other words,
00:49:19.640 it was an ordinary
00:49:21.240 insult they would use
00:49:22.240 to any variety
00:49:23.100 of people.
00:49:24.000 And it wasn't racist,
00:49:25.160 it was just people
00:49:26.300 you didn't like.
00:49:26.860 So it had nothing
00:49:27.960 to do with any
00:49:28.580 racial overtones,
00:49:29.620 it was just like
00:49:30.120 a general
00:49:30.620 you're a rat
00:49:31.840 kind of an insult.
00:49:34.300 Now,
00:49:35.020 am I right?
00:49:36.340 Am I right
00:49:37.120 that it's a regional
00:49:38.080 thing that Trump
00:49:39.160 picked up?
00:49:40.640 Is it regional
00:49:41.600 to call people
00:49:42.240 a vermin?
00:49:42.800 New Jersey,
00:49:43.540 New York?
00:49:45.940 I'm seeing one yes.
00:49:48.420 I'm seeing some
00:49:48.980 more yeses.
00:49:50.460 Yes,
00:49:50.840 I'm from upstate
00:49:51.580 New York.
00:49:54.640 Yes,
00:49:55.700 yes,
00:49:56.200 yes,
00:49:58.100 okay.
00:49:59.340 So I think
00:50:00.160 I've demonstrated
00:50:00.860 that it's more
00:50:02.380 than me
00:50:02.860 that believes
00:50:03.920 it's a regional
00:50:04.600 thing.
00:50:05.660 Now,
00:50:06.660 let me ask you
00:50:07.220 this.
00:50:08.100 Has anybody
00:50:09.100 in your news
00:50:10.420 universe
00:50:11.380 told you
00:50:12.780 that it's
00:50:13.200 a regional
00:50:13.940 phrase?
00:50:16.820 Has anybody
00:50:17.540 mentioned that?
00:50:18.940 And that Trump
00:50:19.600 routinely uses
00:50:20.940 you know,
00:50:22.300 phrasings from
00:50:23.040 the region
00:50:23.680 that he's most
00:50:24.360 influenced by
00:50:24.980 where he grew up?
00:50:26.800 A colloquialism,
00:50:27.840 yeah.
00:50:28.520 But if you
00:50:29.560 didn't know that,
00:50:30.840 and I don't think
00:50:31.500 anybody not from
00:50:32.460 that region would
00:50:33.180 even be aware
00:50:33.920 that it's a common
00:50:35.080 word there,
00:50:36.040 if you didn't know
00:50:36.920 that,
00:50:37.220 it's easy to sell
00:50:38.020 that as Hitler
00:50:38.820 because that's the
00:50:40.340 only association
00:50:41.520 you have in your
00:50:41.980 mind.
00:50:42.220 Oh,
00:50:43.040 Hitler called
00:50:43.640 people vermin.
00:50:44.260 So anyway,
00:50:46.820 here's McCaskill.
00:50:48.380 I'm going to read
00:50:49.100 exactly what she
00:50:49.980 says so you can
00:50:50.640 see the level of
00:50:51.360 craziness here.
00:50:52.760 Quote,
00:50:53.120 a lot of people
00:50:53.680 have tried to draw
00:50:54.380 similarities between
00:50:55.700 Mussolini and
00:50:56.500 Hitler and the
00:50:57.420 use of the
00:50:57.900 terminology like
00:50:58.800 vermin and the
00:51:00.760 drive that those
00:51:01.600 men had towards
00:51:02.380 autocracy and
00:51:03.240 dictatorship.
00:51:04.140 The difference,
00:51:05.480 though,
00:51:05.960 I think,
00:51:06.360 makes Donald
00:51:07.360 Trump even more
00:51:08.760 dangerous than
00:51:11.340 Hitler and
00:51:11.940 Mussolini.
00:51:13.980 Now,
00:51:14.500 she's saying
00:51:14.880 this with a
00:51:15.400 straight face,
00:51:17.340 that he's more
00:51:18.000 dangerous than
00:51:18.640 Hitler and now
00:51:19.880 she's going to
00:51:20.360 back it up with
00:51:21.000 her argument.
00:51:22.180 Okay,
00:51:22.460 here's the
00:51:22.840 argument.
00:51:25.340 And that is,
00:51:27.200 all right,
00:51:28.280 she goes,
00:51:29.140 quote,
00:51:29.680 he is not
00:51:30.460 trying to expand
00:51:31.320 the boundaries of
00:51:32.000 the United States
00:51:32.660 of America.
00:51:33.920 So unlike
00:51:34.620 Hitler and
00:51:35.160 Mussolini,
00:51:35.700 he's not trying
00:51:36.280 to expand our
00:51:37.000 boundaries.
00:51:37.960 And remember,
00:51:38.780 this is a
00:51:39.420 criticism of him.
00:51:41.940 So she starts
00:51:42.700 by saying he's
00:51:43.280 not trying to
00:51:43.880 expand the
00:51:44.380 boundaries.
00:51:45.340 Then she says,
00:51:46.540 he's not trying to
00:51:47.700 overcome a
00:51:48.200 neighboring country
00:51:48.880 like Putin is
00:51:49.780 in Ukraine.
00:51:52.120 Wait,
00:51:52.720 are you suggesting
00:51:53.440 that Putin is
00:51:54.660 better than
00:51:55.440 Trump because
00:51:57.080 Putin has an
00:51:57.920 objective of
00:51:58.640 conquering Ukraine
00:51:59.760 where Trump does
00:52:01.540 not try to
00:52:02.160 conquer any
00:52:02.660 other countries?
00:52:04.580 So just hold
00:52:06.920 in your head the
00:52:07.560 hot mess that
00:52:08.380 this is.
00:52:08.940 She goes on
00:52:10.780 talking about
00:52:12.020 Trump.
00:52:12.700 He is not
00:52:13.120 going to some
00:52:13.960 grandiose scheme
00:52:14.980 of international
00:52:15.740 dominance,
00:52:16.840 but I feel like
00:52:18.120 that's a good
00:52:18.580 thing.
00:52:19.400 And then she
00:52:19.700 goes,
00:52:20.360 all he wants
00:52:20.980 is to look
00:52:21.640 in the mirror
00:52:22.140 and see a
00:52:23.080 guy who is
00:52:23.520 president.
00:52:26.940 Okay.
00:52:28.960 So this is
00:52:29.940 mind reading,
00:52:30.560 right?
00:52:31.560 She's gone
00:52:32.220 into complete
00:52:32.820 mind reading.
00:52:34.100 All he wants
00:52:34.900 to do is see
00:52:35.600 himself as
00:52:36.700 president.
00:52:37.420 All he cares
00:52:38.220 about is
00:52:39.400 selfish self
00:52:40.420 promotion.
00:52:41.560 That's the
00:52:42.240 only philosophy
00:52:42.940 he has,
00:52:44.020 which makes
00:52:44.860 it even more
00:52:45.500 dangerous.
00:52:47.620 What?
00:52:49.260 Are you
00:52:50.060 telling me
00:52:50.500 that somebody
00:52:51.060 who likes
00:52:51.800 self promotion
00:52:53.060 and that's
00:52:54.580 their biggest
00:52:55.060 flaw is
00:52:56.720 more dangerous
00:52:57.680 than Hitler?
00:53:00.800 She's actually
00:53:01.840 saying that
00:53:02.580 directly and
00:53:04.400 clearly.
00:53:05.940 And then she
00:53:06.260 goes on.
00:53:06.960 Well,
00:53:07.160 there's a
00:53:07.400 reason because
00:53:08.660 Trump,
00:53:10.160 he has
00:53:10.820 actually said
00:53:11.440 out loud that
00:53:12.540 he would be
00:53:13.020 okay to
00:53:13.580 terminate the
00:53:14.260 Constitution to
00:53:15.220 keep him in
00:53:15.740 power.
00:53:17.000 Really?
00:53:19.160 Let's do a
00:53:19.880 fact check on
00:53:20.480 that.
00:53:21.620 Because I
00:53:22.360 watch the
00:53:22.700 news a lot.
00:53:23.900 I do not
00:53:24.400 remember the
00:53:24.940 time that
00:53:25.540 Trump said
00:53:25.980 we should
00:53:26.320 terminate the
00:53:26.980 Constitution.
00:53:28.180 Does anybody
00:53:28.840 remember that?
00:53:30.400 Is there
00:53:30.840 anybody who
00:53:31.640 saw that in
00:53:32.600 any kind of
00:53:33.280 news?
00:53:33.580 I'm going to
00:53:36.600 have to give
00:53:37.000 you the
00:53:37.280 Tapper tilt to
00:53:38.060 that one.
00:53:39.020 Are you
00:53:39.240 ready?
00:53:40.360 The Tapper head
00:53:41.440 tilt is when
00:53:42.580 you're listening
00:53:43.020 to somebody
00:53:43.560 that you're
00:53:43.980 trying to
00:53:44.400 discredit
00:53:44.920 without actually
00:53:45.780 saying any
00:53:46.220 words.
00:53:47.320 So,
00:53:48.340 you're saying
00:53:49.080 that he said
00:53:50.620 out loud he'd
00:53:51.160 like to
00:53:51.420 terminate the
00:53:51.980 Constitution to
00:53:52.900 keep him in
00:53:53.340 power?
00:53:53.620 There.
00:53:58.060 The Tapper tilt.
00:53:58.740 That's all
00:54:00.760 that needs to be
00:54:01.460 said.
00:54:03.220 Then she
00:54:03.800 goes on.
00:54:04.540 He said
00:54:05.080 this.
00:54:05.780 He actually
00:54:06.320 said those
00:54:06.940 words.
00:54:08.800 No, he
00:54:09.200 didn't.
00:54:11.200 Pretty sure
00:54:11.900 he didn't.
00:54:13.060 If he said
00:54:13.680 that, you
00:54:14.060 don't think
00:54:14.440 we would
00:54:14.760 see that
00:54:15.180 clip playing
00:54:16.060 on repeat
00:54:17.920 all day
00:54:18.680 long?
00:54:19.980 How would
00:54:20.600 anybody have
00:54:21.140 missed that?
00:54:23.100 The only way
00:54:23.820 you could miss
00:54:24.220 it is if it
00:54:24.700 didn't happen.
00:54:26.260 There's no
00:54:26.680 other way you
00:54:27.120 could miss
00:54:27.440 it.
00:54:28.740 And the
00:54:30.420 irony is
00:54:31.340 all these
00:54:31.780 supposedly
00:54:32.300 conservative
00:54:32.900 folks that
00:54:33.940 have populated
00:54:34.560 the Republican
00:54:35.180 Party all
00:54:36.260 stood around
00:54:36.920 with their
00:54:37.480 thumb in
00:54:38.580 their mouth
00:54:39.060 going,
00:54:39.860 well, yeah,
00:54:40.320 I guess so.
00:54:42.080 And then she
00:54:42.520 says it's so
00:54:43.080 bizarre.
00:54:47.740 All right.
00:54:48.860 I'm going to
00:54:49.500 add this to
00:54:50.020 my list of
00:54:51.100 why women
00:54:51.720 should not
00:54:52.200 be in
00:54:52.520 charge of
00:54:52.980 national
00:54:53.380 defense.
00:54:57.300 I believe
00:54:58.180 that she's
00:54:58.700 putting this
00:55:00.740 above the
00:55:02.500 country's
00:55:03.060 safety and
00:55:03.940 security.
00:55:06.420 This is
00:55:07.140 batshit
00:55:08.080 crazy.
00:55:10.660 Now, how
00:55:11.980 should you
00:55:12.540 respond to
00:55:13.220 this?
00:55:13.740 Because
00:55:13.940 obviously there's
00:55:14.660 going to be a
00:55:15.000 lot more of
00:55:15.440 it, right?
00:55:16.200 There'll be a
00:55:16.640 lot more.
00:55:18.080 The return of
00:55:18.980 fascism.
00:55:20.600 Because since
00:55:21.940 we know that
00:55:22.580 Joe Biden or
00:55:23.600 whoever runs
00:55:24.320 in this place,
00:55:25.300 they're not
00:55:25.720 going to be
00:55:26.020 able to
00:55:26.340 compete on
00:55:26.840 policy except
00:55:28.080 abortion.
00:55:29.280 And that's
00:55:29.740 going down to
00:55:30.300 the state
00:55:30.720 level for
00:55:31.260 decisions.
00:55:32.100 So even that
00:55:32.640 one's a little
00:55:33.100 bit off the
00:55:33.580 table because
00:55:34.140 it's become
00:55:34.600 more of a
00:55:34.980 state thing.
00:55:36.080 So if you
00:55:37.400 can't compete
00:55:38.220 on policies,
00:55:39.540 you have to
00:55:40.180 go to, you
00:55:41.800 know, insulting
00:55:42.480 your competition,
00:55:43.420 make it look
00:55:43.800 bad.
00:55:45.020 So they're
00:55:45.720 going to call
00:55:46.020 you fascist,
00:55:46.780 they're going
00:55:47.000 to call him
00:55:47.660 Hitler if you
00:55:49.440 support him.
00:55:50.480 So what do
00:55:51.060 you do about
00:55:51.500 it?
00:55:52.620 I absolutely
00:55:53.620 think there's
00:55:54.320 one and only
00:55:54.880 way to deal
00:55:55.740 with it
00:55:56.020 successfully.
00:55:57.560 You have to
00:55:58.480 mock it and
00:56:00.060 joke with it and
00:56:01.420 embrace it and
00:56:02.140 amplify it.
00:56:03.360 You have to
00:56:03.840 make it sarcastic,
00:56:05.700 you have to
00:56:06.340 laugh at them
00:56:06.960 every time they
00:56:07.520 do it, and
00:56:08.060 you never,
00:56:08.720 never can take
00:56:09.300 them seriously.
00:56:10.680 You have to
00:56:11.400 treat them like
00:56:13.160 children who
00:56:14.380 are afraid of
00:56:14.900 the dark.
00:56:16.240 They're afraid of
00:56:16.980 the fascists
00:56:17.560 under the bed.
00:56:18.200 And so
00:56:20.040 instead of
00:56:21.780 responding,
00:56:23.100 imagine you're
00:56:25.140 going to be a
00:56:25.760 politician and
00:56:26.460 somebody says,
00:56:27.640 so-and-so,
00:56:28.660 McCaskill, said
00:56:30.060 Trump is worse
00:56:31.000 than Hitler.
00:56:32.500 Do you say
00:56:33.200 something like,
00:56:34.320 well, I don't
00:56:34.780 know, Democrat
00:56:35.840 policies are
00:56:36.580 pretty bad too?
00:56:37.940 Or do you say,
00:56:39.060 no, he's not?
00:56:40.620 Those don't
00:56:41.120 work.
00:56:42.120 Because the
00:56:42.620 Hitler thing is
00:56:43.260 fear
00:56:44.000 persuasion.
00:56:46.620 Suppose you
00:56:47.320 had a little
00:56:47.680 kid,
00:56:48.200 who was afraid
00:56:49.520 of a monster
00:56:50.180 under the bed.
00:56:51.600 How do you
00:56:51.960 play it?
00:56:53.200 I'll give you
00:56:53.600 two different
00:56:53.980 ways.
00:56:54.480 Your little
00:56:54.920 kid says,
00:56:55.400 there's a
00:56:55.640 monster under
00:56:56.100 the bed,
00:56:56.440 I can't turn
00:56:56.900 off the lights
00:56:57.340 and go to
00:56:57.640 sleep.
00:56:58.680 Do you say,
00:57:00.560 there's no
00:57:01.920 monster under
00:57:02.520 the bed?
00:57:04.940 Don't worry
00:57:05.480 about it,
00:57:05.800 go to sleep.
00:57:07.200 Well, you
00:57:07.540 could, you
00:57:08.500 could.
00:57:09.280 Here's a
00:57:09.740 better way.
00:57:10.780 Now, all
00:57:11.760 kids go
00:57:12.780 through that.
00:57:13.560 I remember
00:57:14.200 when I thought
00:57:14.700 there was a
00:57:15.160 monster under
00:57:15.840 the bed.
00:57:16.880 When I was
00:57:17.480 your age,
00:57:17.820 I used to
00:57:18.180 look under
00:57:18.460 the bed
00:57:18.720 every day.
00:57:19.380 No,
00:57:19.800 there's no
00:57:21.020 monsters.
00:57:22.340 That's funny
00:57:23.100 because I
00:57:25.340 remember exactly
00:57:26.140 when I was
00:57:26.600 your age,
00:57:27.460 I had the
00:57:27.920 same conversation
00:57:28.660 with my
00:57:29.020 parents.
00:57:29.820 We were
00:57:30.080 sure there
00:57:30.440 were monsters,
00:57:31.060 but you know
00:57:31.320 what?
00:57:31.860 Everybody looks
00:57:32.540 under the bed
00:57:33.100 and nobody
00:57:33.500 ever finds a
00:57:34.080 monster.
00:57:35.120 Nobody's found
00:57:35.660 one yet.
00:57:36.960 Maybe you'll
00:57:37.520 find one that
00:57:37.960 likes you.
00:57:39.300 You just
00:57:39.660 treat it like
00:57:40.440 it's not
00:57:40.780 serious.
00:57:41.160 So all
00:57:43.240 of this
00:57:43.560 fascism
00:57:44.460 stuff,
00:57:45.360 if somebody
00:57:45.920 accuses you
00:57:47.320 of being a
00:57:47.780 fascist or
00:57:48.400 says that
00:57:48.840 Trump is,
00:57:50.080 you should
00:57:50.500 say,
00:57:51.560 what would
00:57:51.940 be an
00:57:52.200 example of
00:57:52.740 this fascism?
00:57:53.420 Now, what
00:57:55.300 they'd probably
00:57:55.760 say is that
00:57:56.480 he wants to
00:57:56.920 be a dictator.
00:57:58.740 And then you
00:57:59.200 say,
00:58:00.080 well, that's
00:58:00.840 not really
00:58:03.020 fascism,
00:58:03.640 is it?
00:58:04.420 Because I
00:58:04.800 thought fascism
00:58:05.520 was about the
00:58:07.020 government working
00:58:07.800 with the
00:58:08.280 companies to
00:58:10.380 basically have
00:58:10.980 control of
00:58:11.480 what the
00:58:11.720 companies do.
00:58:12.860 Sort of like
00:58:13.720 the Democrats
00:58:15.260 want to do
00:58:15.820 with free
00:58:16.860 speech.
00:58:18.280 So you
00:58:18.620 have actual
00:58:19.160 examples in
00:58:21.280 which the
00:58:21.620 Democrats are
00:58:22.400 acting in a
00:58:23.020 fascist nature
00:58:24.660 that the
00:58:25.740 other social
00:58:26.560 media companies
00:58:27.280 except for
00:58:27.800 X are bowing
00:58:29.420 to the
00:58:29.720 government's
00:58:30.400 will.
00:58:31.540 That's
00:58:31.900 actually
00:58:32.220 literally
00:58:32.640 fascism.
00:58:33.880 That fits
00:58:34.760 the definition
00:58:35.440 without any
00:58:36.420 hyperbole at
00:58:37.100 all.
00:58:38.660 But the
00:58:39.200 part about
00:58:39.760 staying in
00:58:40.540 office,
00:58:40.960 they're going
00:58:41.200 to use the
00:58:41.660 insurrection
00:58:42.140 thing.
00:58:42.920 You have to
00:58:43.580 mock the
00:58:44.040 insurrection.
00:58:45.840 The way to
00:58:46.640 deal with it
00:58:47.120 is, I
00:58:48.460 want you to
00:58:48.940 tell me to
00:58:49.420 my face
00:58:50.060 that you
00:58:50.880 believe
00:58:51.280 Republicans
00:58:51.940 try to
00:58:52.620 take over
00:58:53.020 the country
00:58:53.460 without
00:58:53.780 bringing
00:58:54.200 weapons.
00:58:55.560 Just say
00:58:55.980 that directly.
00:58:57.400 Well, I'm
00:58:58.120 the one
00:58:58.360 asking the
00:58:58.880 questions here.
00:58:59.480 I know
00:58:59.720 you are,
00:59:00.760 but I want
00:59:01.180 to understand
00:59:01.680 the context
00:59:02.440 of the
00:59:02.680 question.
00:59:03.740 Do you
00:59:04.040 believe that
00:59:04.640 in some
00:59:05.300 reasonable
00:59:05.840 world,
00:59:06.400 the
00:59:07.500 Republicans
00:59:07.980 try to
00:59:08.520 take over
00:59:08.880 the country
00:59:09.280 without
00:59:09.580 weapons?
00:59:11.180 Well, well,
00:59:11.740 no, I'm
00:59:12.600 the one
00:59:12.840 asking the
00:59:13.300 questions.
00:59:14.000 You see
00:59:14.260 where this
00:59:14.600 would go,
00:59:14.980 right?
00:59:15.940 You should
00:59:16.460 make them
00:59:16.900 look like
00:59:17.320 fucking
00:59:17.780 idiots for
00:59:19.260 asking that
00:59:19.840 question.
00:59:21.720 No, I'm
00:59:22.440 sorry.
00:59:23.200 If Republicans
00:59:23.980 ever decide to
00:59:24.860 take over a
00:59:25.440 country, look
00:59:26.860 for the
00:59:27.140 weapons.
00:59:28.700 And then
00:59:29.040 they're going
00:59:29.300 to be,
00:59:29.580 well, but
00:59:30.060 somebody found
00:59:31.120 one gun
00:59:31.800 that was
00:59:32.540 behind the
00:59:32.980 garbage can
00:59:33.660 a block
00:59:34.380 away.
00:59:35.720 And you
00:59:35.980 say, really?
00:59:37.680 Is that how
00:59:38.520 Republicans take
00:59:39.440 over the
00:59:39.760 country?
00:59:40.360 They make
00:59:40.800 sure they
00:59:41.160 got that
00:59:41.520 one gun
00:59:42.100 that's hidden
00:59:42.560 a block
00:59:42.980 away?
00:59:44.340 Was that
00:59:44.940 their battle
00:59:46.300 plan?
00:59:47.160 That one
00:59:47.620 gun that
00:59:48.000 was a block
00:59:48.440 away?
00:59:49.760 Just mock
00:59:50.540 the shit
00:59:50.900 out of it.
00:59:52.360 Just mock
00:59:53.000 it.
00:59:54.060 In fact,
00:59:55.260 you should
00:59:55.640 make them
00:59:56.120 talk about
00:59:56.720 it as long
00:59:57.240 as possible.
00:59:58.520 Because if
00:59:58.980 you mock
00:59:59.480 it right,
01:00:00.020 the longer
01:00:01.120 they talk
01:00:01.620 about it,
01:00:02.000 the worse
01:00:02.320 it is for
01:00:02.740 them.
01:00:03.620 But if
01:00:03.980 you don't
01:00:04.320 mock it,
01:00:05.140 the longer
01:00:05.600 they talk
01:00:06.080 about it,
01:00:06.480 the worse
01:00:06.800 it is for
01:00:07.200 you.
01:00:07.860 Does that
01:00:08.060 make sense?
01:00:09.940 If you
01:00:10.540 mock it,
01:00:11.180 you take
01:00:11.580 the high
01:00:11.940 ground,
01:00:13.040 and the
01:00:13.560 more they
01:00:14.020 see of
01:00:14.360 it, the
01:00:14.620 worse it
01:00:15.000 is for
01:00:15.260 them.
01:00:16.480 But as
01:00:17.020 long as
01:00:17.440 you just
01:00:17.920 say, no,
01:00:18.620 it's not
01:00:18.980 true, we
01:00:20.200 have good
01:00:20.600 policies, or
01:00:21.640 your side
01:00:22.460 does it
01:00:22.860 too, then
01:00:24.500 all that
01:00:24.960 people hear
01:00:25.840 is how
01:00:26.200 often you
01:00:26.640 were accused
01:00:27.020 of it.
01:00:28.260 They don't
01:00:28.600 really hear
01:00:28.960 the defense
01:00:29.480 part.
01:00:30.020 You have
01:00:30.460 to change
01:00:30.880 the frame.
01:00:32.140 The frame
01:00:32.640 should not
01:00:33.120 be, this
01:00:33.640 is a good
01:00:34.080 question, and
01:00:34.820 I'm answering
01:00:35.300 it.
01:00:36.200 No, don't
01:00:37.660 enter that
01:00:38.040 frame.
01:00:39.080 Take Vivek's
01:00:40.020 approach.
01:00:43.200 Vivek actually
01:00:43.940 says, okay,
01:00:44.540 what's happening
01:00:45.020 here is you're
01:00:45.860 being a fucking
01:00:46.440 idiot.
01:00:46.960 He doesn't
01:00:47.280 say that.
01:00:48.040 But what's
01:00:48.460 happening here
01:00:49.000 is you're
01:00:49.360 being an
01:00:49.700 illegitimate
01:00:50.300 member of
01:00:52.280 the press.
01:00:53.020 You're asking
01:00:53.480 me a ridiculous
01:00:54.420 question.
01:00:55.880 Nobody thinks
01:00:56.880 the Republicans
01:00:57.400 take over
01:00:57.960 countries without
01:00:59.120 bringing guns.
01:01:01.460 You're being
01:01:02.280 ridiculous right
01:01:03.100 now.
01:01:03.860 You realize how
01:01:04.420 silly you are.
01:01:05.660 You realize
01:01:06.020 you're being
01:01:07.220 crazy right on
01:01:08.120 camera.
01:01:09.240 Will you feel
01:01:09.880 good when this
01:01:10.460 plays back?
01:01:11.800 Can you defend
01:01:12.660 that question to
01:01:13.460 your friends?
01:01:15.080 That's the way
01:01:15.580 I'd go.
01:01:16.460 I'd go right
01:01:17.260 after the real
01:01:17.860 problem, which
01:01:18.540 is these are not
01:01:19.940 real questions.
01:01:21.080 These are not
01:01:21.460 real accusations.
01:01:22.200 This is just
01:01:23.420 pure bullshit.
01:01:26.540 And I don't
01:01:27.280 think Trump has
01:01:27.860 been good at
01:01:28.400 that.
01:01:29.280 I think he
01:01:29.900 has sort of
01:01:31.380 ignored it or
01:01:32.420 tried to stay
01:01:33.440 tough talking.
01:01:35.360 Oh, here's
01:01:36.820 another idea.
01:01:38.340 This will never
01:01:39.240 happen, but I
01:01:40.800 suggest it in
01:01:43.600 the spirit of
01:01:45.580 showing you
01:01:47.060 persuasion.
01:01:49.220 Imagine if
01:01:50.180 between now and
01:01:51.700 election day,
01:01:53.480 Trump decides to
01:01:54.680 go to lunch
01:01:55.320 once a week
01:01:56.020 with a green
01:01:57.720 haired Democrat,
01:01:59.520 progressive,
01:02:00.660 you know, a
01:02:01.000 different one.
01:02:02.360 And like the,
01:02:03.620 you know, the
01:02:04.020 most anti-Trump
01:02:05.640 person you can
01:02:06.280 imagine.
01:02:07.600 And all they're
01:02:08.360 going to do is
01:02:08.800 go to lunch and
01:02:09.820 he's just going
01:02:10.300 to listen to
01:02:10.700 them and have
01:02:12.600 a conversation.
01:02:13.940 Nobody's trying
01:02:14.540 to change
01:02:14.980 anybody's minds.
01:02:16.580 He's just going
01:02:17.100 to go to lunch.
01:02:18.300 Now here's the
01:02:19.140 thing that I know
01:02:20.060 that not all
01:02:21.620 of you know.
01:02:24.200 Trump loves a
01:02:25.340 pirate ship.
01:02:27.540 Trump loves a
01:02:29.000 pirate ship.
01:02:30.560 If you're smart,
01:02:32.900 he's going to find
01:02:33.540 a way to get
01:02:33.940 along with you.
01:02:35.520 And in person,
01:02:37.180 all of his scary
01:02:38.760 demeanor stuff that
01:02:39.760 he puts on when
01:02:40.700 he's talking tough,
01:02:42.080 you won't see any
01:02:43.020 of it.
01:02:43.260 If you meet him
01:02:45.660 in person,
01:02:46.980 he is the
01:02:47.560 warmest,
01:02:49.400 most sort of
01:02:50.840 generous,
01:02:52.340 accommodating,
01:02:54.100 charismatic
01:02:54.500 character you're
01:02:55.480 ever going to
01:02:55.800 run into.
01:02:56.820 Right?
01:02:57.280 You know, he
01:02:57.660 puts on kind of
01:02:58.480 an act when he's
01:02:59.360 talking.
01:03:00.700 But the real him
01:03:01.760 when he's just
01:03:02.320 relaxed and hanging
01:03:03.200 out, because I
01:03:03.840 had the experience
01:03:04.760 of, you know,
01:03:05.680 chatting with him
01:03:06.180 for a while,
01:03:07.460 his actual in-person
01:03:09.140 personality is
01:03:10.880 absolutely,
01:03:12.080 what did I
01:03:14.520 say?
01:03:17.480 It's
01:03:17.920 undeniable.
01:03:21.300 It's so strong
01:03:22.540 that you would
01:03:23.820 have a tough time
01:03:24.820 leaving the lunch
01:03:25.540 hating him.
01:03:26.860 You would leave
01:03:27.720 thinking,
01:03:28.320 damn it, I wish
01:03:28.880 he agreed with me
01:03:29.720 more.
01:03:30.660 But you wouldn't
01:03:31.460 hate him when he
01:03:32.040 left.
01:03:33.380 You would leave
01:03:34.480 thinking, you know,
01:03:35.800 okay, he's got a
01:03:36.540 point of view.
01:03:39.020 Yeah, disarming.
01:03:40.540 Disarming.
01:03:41.140 That's the right
01:03:41.680 word.
01:03:41.860 Thank you.
01:03:42.440 Thank you.
01:03:43.180 He is both
01:03:43.860 generous and
01:03:44.800 disarming in
01:03:45.760 person.
01:03:46.680 And you do not
01:03:47.820 get any sense of,
01:03:49.200 you know, evil.
01:03:51.140 Like, you don't get
01:03:51.760 any of that in
01:03:52.340 person.
01:03:53.720 And he will show
01:03:54.680 interest in you
01:03:55.720 that you didn't
01:03:57.140 think was possible.
01:03:58.780 Like, he'll ask
01:03:59.400 questions about your
01:04:00.280 situation and
01:04:01.500 actually listen to
01:04:02.400 you, and you feel
01:04:03.600 that he's actually
01:04:04.180 listening.
01:04:04.960 That was my
01:04:05.560 experience.
01:04:05.960 And I think it's
01:04:09.380 one of his
01:04:09.760 greatest assets.
01:04:11.340 But he keeps
01:04:12.220 that greatest asset
01:04:13.240 sort of, you
01:04:14.640 know, close.
01:04:15.860 You know, people
01:04:16.380 who know him
01:04:17.060 know it, but the
01:04:19.200 people who hate
01:04:20.440 him don't know
01:04:20.960 it.
01:04:22.060 Now, this would
01:04:23.000 probably work with
01:04:23.580 Biden as well,
01:04:24.560 right?
01:04:25.240 Because people at
01:04:26.020 that level can be
01:04:27.180 pretty charismatic.
01:04:28.460 So probably if you
01:04:29.460 were a big Biden
01:04:30.120 hater, but you went
01:04:31.340 to lunch with him,
01:04:32.060 you'd kind of leave
01:04:32.660 going, well, he
01:04:34.140 did have some
01:04:34.620 points.
01:04:36.180 So I think we
01:04:38.380 make the mistake
01:04:39.140 of thinking that
01:04:39.920 everybody is a TV
01:04:40.840 person, and they're
01:04:41.860 just images on the
01:04:42.780 screen, so you can
01:04:44.020 hate them and call
01:04:44.780 them vermin and do
01:04:45.920 anything you want
01:04:46.480 because they're not
01:04:46.900 real.
01:04:47.900 But if he just
01:04:49.000 every week went to
01:04:49.960 lunch with a
01:04:54.340 green-haired lesbian
01:04:55.460 and the next time
01:04:57.460 two non-binaries
01:04:59.420 and a member of
01:05:01.880 Antifa, and,
01:05:02.640 as long as he
01:05:03.080 had good security,
01:05:04.560 you know, two
01:05:05.420 members of Black
01:05:06.220 Lives Matter,
01:05:07.940 wouldn't you love
01:05:08.540 to see that?
01:05:09.860 And just do it
01:05:10.380 once a week.
01:05:13.600 Yeah, because,
01:05:14.660 you know, a lot
01:05:15.120 of people just want
01:05:15.820 to be heard.
01:05:17.140 Would you agree?
01:05:19.300 I think psychologists
01:05:20.540 would agree with
01:05:21.240 this.
01:05:21.620 A lot of people
01:05:22.200 want to know that
01:05:24.340 their version actually
01:05:25.600 got into the brain
01:05:26.760 of the person who
01:05:27.440 makes the decision,
01:05:28.680 and they rolled it
01:05:29.620 around, they compared
01:05:30.960 it to what they
01:05:31.500 were doing, and
01:05:32.600 then if they
01:05:33.140 still decide to
01:05:33.920 do what they
01:05:34.300 were already
01:05:34.680 doing, well, at
01:05:36.520 least you did
01:05:36.960 your job.
01:05:38.120 Like, you got in
01:05:38.940 there, you gave
01:05:40.680 the counterargument,
01:05:41.660 you would feel
01:05:42.140 completely differently,
01:05:43.040 even if the decision
01:05:44.660 goes the other way.
01:05:48.040 Podcast lunch with
01:05:48.900 Trump.
01:05:50.720 Yeah, no, I think
01:05:51.500 it should be private
01:05:52.220 if he had lunch.
01:05:54.400 So the other thing
01:05:55.260 I say, if somebody
01:05:55.880 calls you a fascist,
01:05:56.880 as I was trying to
01:05:57.420 do on X yesterday,
01:05:59.020 I mock them and
01:06:00.320 say, apparently you
01:06:01.340 ran out of words.
01:06:03.180 If you have to start
01:06:04.080 using words that even
01:06:05.080 you don't know the
01:06:05.740 meaning, it means you
01:06:06.980 ran out of words.
01:06:08.700 It doesn't mean
01:06:09.240 anything.
01:06:09.800 It's just a way to
01:06:11.180 dismiss it as silly.
01:06:15.800 All right.
01:06:17.420 Adrian Norman had a
01:06:18.860 fascinating podcast that
01:06:20.980 I reposted.
01:06:23.060 So if you're looking
01:06:23.620 at it, if you're
01:06:24.400 looking for it, look
01:06:25.780 for Adrian Norman,
01:06:28.840 D.C.
01:06:30.340 That's what he goes
01:06:31.020 by as his X name.
01:06:33.020 And he interviewed a
01:06:34.260 guy who seems to know
01:06:35.220 a lot about Hezbollah
01:06:36.880 sleeper groups in the
01:06:38.360 United States.
01:06:39.540 Something called Unit
01:06:40.720 910 has infiltrated
01:06:43.460 the United States and
01:06:44.540 other countries.
01:06:46.200 So apparently Hezbollah,
01:06:47.720 backed by Iran,
01:06:48.720 is far bigger than I
01:06:52.760 think most people
01:06:53.340 realized.
01:06:54.860 You know, I kind of
01:06:55.540 thought of them as this
01:06:56.840 little group of people
01:06:57.820 in Lebanon that
01:06:59.740 sometimes send rockets
01:07:00.840 in Israel.
01:07:01.920 I don't really think of
01:07:03.100 them as a global
01:07:04.020 organization, but you
01:07:06.060 should because they're a
01:07:07.820 global organization.
01:07:09.780 And apparently these are
01:07:10.700 the ones who figure out
01:07:12.560 how to, among other
01:07:15.020 things, you know, it's
01:07:15.700 basically a spy
01:07:16.480 organization, but among
01:07:17.540 other things they figure
01:07:18.380 out what to blow up in
01:07:19.340 the United States if
01:07:20.760 they need to blow
01:07:21.380 something up.
01:07:23.760 So in theory, a
01:07:25.860 country like Iran, and
01:07:28.440 I wrote a book about
01:07:29.480 this very idea, The
01:07:30.760 Religion War, it's a key
01:07:32.300 part of the book, is
01:07:33.600 that if you have a
01:07:34.760 country that can't
01:07:35.520 compete in the
01:07:36.460 traditional nuclear
01:07:38.620 way, the one thing you
01:07:40.620 could do is over time
01:07:41.660 infiltrate a lot of
01:07:42.800 people into the major
01:07:44.120 cities of your enemy
01:07:45.480 and get them set up so
01:07:47.560 they can do some
01:07:48.320 terrorist act that
01:07:50.160 would happen at the
01:07:50.940 same time in multiple
01:07:51.840 places.
01:07:53.340 So that you could
01:07:54.080 basically cripple the
01:07:55.280 country with, you know,
01:07:56.740 well-chosen, you know,
01:07:58.360 hit a power plant, knock
01:07:59.940 out some bridges, you
01:08:00.880 know, really go for,
01:08:01.600 like, some key
01:08:02.400 infrastructure, and you
01:08:04.000 just, the whole country
01:08:06.100 would shut down for
01:08:07.020 months.
01:08:08.520 So I think that it was
01:08:10.260 sort of an obvious
01:08:11.060 strategy, but I think it
01:08:12.300 looks like Hezbollah is
01:08:13.160 doing that.
01:08:13.580 Now, you might ask
01:08:14.720 yourself, why don't
01:08:15.380 they do it yet?
01:08:17.900 Why have there not
01:08:18.800 been any terrorist
01:08:19.540 things yet?
01:08:20.640 Because we've been at,
01:08:21.520 you know, Iran's neck
01:08:23.440 for a long time.
01:08:24.840 And I would ask you
01:08:26.160 this, how do you know
01:08:28.020 there weren't?
01:08:29.840 You know, you keep
01:08:30.700 hearing about energy
01:08:31.620 plants and food
01:08:33.640 processing plants having
01:08:35.900 all these explosions and
01:08:37.240 problems, and some people
01:08:39.280 said, are these all
01:08:40.400 natural?
01:08:40.820 Well, it feels like a
01:08:42.660 lot of meat processing
01:08:43.660 plants are having
01:08:44.400 trouble lately.
01:08:45.800 Now, I don't know,
01:08:46.700 maybe that's conspiracy
01:08:47.620 theory stuff, but it
01:08:50.760 wouldn't be surprising
01:08:51.780 if dirty tricks were
01:08:54.860 already happening, and we
01:08:56.040 just don't know it.
01:08:57.500 Wouldn't be surprising.
01:09:00.540 All right, Fox News
01:09:02.860 has a story about young
01:09:04.240 people turning on Joe
01:09:05.800 Biden, and some of them
01:09:08.060 are calling him
01:09:08.660 Genocide Joe.
01:09:09.560 I don't know how many
01:09:11.360 of them are calling him that.
01:09:11.960 I got a feeling that was
01:09:13.160 like one person they
01:09:14.020 talked to.
01:09:14.740 Yeah, we like to call him
01:09:15.660 Genocide Joe.
01:09:16.800 So that became the
01:09:17.500 headline.
01:09:18.320 I don't know how many
01:09:19.000 people are calling him
01:09:19.780 that, but they're turning
01:09:21.860 on him over his refusal
01:09:24.000 to back a ceasefire, even
01:09:26.900 though there's sort of
01:09:27.920 one, a temporary one
01:09:28.960 now.
01:09:32.060 But the experts say this.
01:09:35.520 Are they really going to
01:09:37.040 vote for Trump instead?
01:09:38.120 Yeah, when it comes right
01:09:44.840 down to a choice, are they
01:09:47.020 going to choose Trump?
01:09:49.340 So don't get too excited if
01:09:51.340 the polls say that the
01:09:52.540 young people don't like
01:09:53.740 Biden, because they might
01:09:55.680 dislike Trump more, and
01:09:57.160 that's going to be the
01:09:57.900 critical thing.
01:09:58.860 But it could make a big
01:10:00.480 difference if they stay
01:10:01.340 home, right?
01:10:02.140 So if they're not excited
01:10:03.540 about either one, that's a
01:10:05.600 good reason not to vote.
01:10:07.340 So you might see Trump
01:10:08.620 winning just because, you
01:10:10.820 know, maybe the ones who
01:10:11.520 did vote were already on
01:10:13.820 his side.
01:10:16.060 But it's weird to see how
01:10:18.180 dumb young people are, that
01:10:20.820 they somehow don't realize
01:10:22.160 that Trump would have also
01:10:23.400 backed Israel completely.
01:10:24.720 Well, he does.
01:10:26.680 So why would you hate, you
01:10:28.820 know, the one person when
01:10:30.220 the other person would do
01:10:31.160 exactly the same thing?
01:10:32.280 In fact, pretty much most of
01:10:35.240 our politicians would do the
01:10:36.440 same thing.
01:10:37.220 But the young, I don't know,
01:10:39.100 don't have the context, maybe.
01:10:41.720 Yeah.
01:10:45.020 As Mike Cernovich pointed out
01:10:47.940 today, or yesterday in the
01:10:49.720 post, that the plan to ship
01:10:52.660 Gazans to the West has been
01:10:54.400 endorsed by multiple
01:10:56.240 Knesset members.
01:11:00.300 And so that's Israel.
01:11:02.260 And that the current Israeli
01:11:03.500 Minister of Intelligence and
01:11:04.660 numerous others have planned
01:11:06.640 it.
01:11:07.520 And Mike Cernovich, in his
01:11:09.400 classic inimitable way, says,
01:11:14.380 and still low IQ American
01:11:15.900 conservatives will claim this
01:11:17.380 isn't the plan.
01:11:18.180 And one member, ex-member of
01:11:22.920 the Israel government, is
01:11:24.260 saying, each country should
01:11:26.000 take a quota.
01:11:27.420 We need all two million to
01:11:29.000 leave.
01:11:30.020 That is the solution for
01:11:31.260 Gaza.
01:11:32.100 Well, at least they're saying
01:11:33.540 it directly now.
01:11:35.460 Now, I think I've been telling
01:11:37.080 you since the start that
01:11:39.140 there's no scenario that makes
01:11:40.560 sense where Israel wreaks
01:11:43.460 havoc in Gaza and then says,
01:11:45.180 all right, rebuild.
01:11:45.820 That can't happen.
01:11:48.820 Under no scenario can that
01:11:50.560 happen.
01:11:51.220 They have to completely own it,
01:11:53.840 depopulate it, turn it into
01:11:56.220 some other thing.
01:11:57.120 It just can't be where the
01:11:59.320 Hamas and Palestinians ever
01:12:01.040 live again, probably.
01:12:02.500 Now, they might, if they're
01:12:04.000 smart, they're not going to
01:12:06.300 exclude all Palestinians.
01:12:08.400 But they would vet them so
01:12:10.060 carefully that almost nobody
01:12:11.320 would get through.
01:12:13.020 But they could still say,
01:12:14.400 well, look, we've got
01:12:15.040 Palestinians living here.
01:12:16.280 We've got Jews.
01:12:17.180 Everybody's living happily.
01:12:18.440 But it would be because they
01:12:19.280 vetted them so extremely that
01:12:22.320 they knew what they were
01:12:23.380 getting.
01:12:25.860 All right.
01:12:27.180 So I do believe this is an
01:12:29.140 existential risk to the United
01:12:31.300 States, that there will be more
01:12:34.100 of this.
01:12:35.400 And if we open up our borders to
01:12:37.040 too many of them,
01:12:39.780 this is the Moroccan question.
01:12:41.640 We're not saying all Moroccans are
01:12:43.120 criminals.
01:12:43.420 Nobody says that.
01:12:45.020 And we're certainly not saying
01:12:46.080 that all Palestinians are
01:12:47.420 dangerous to America.
01:12:50.500 Don't believe that.
01:12:52.500 In fact, I've known personally a
01:12:56.020 number of Palestinian types who are
01:12:59.020 just awesome people.
01:13:00.700 So it's not about all of them.
01:13:03.200 It's about managing your risk.
01:13:05.360 So here again, I say that if you
01:13:08.480 were trying to keep America safe,
01:13:11.060 you don't care about anything
01:13:13.280 except can you know for sure that
01:13:16.780 this population of people is going
01:13:18.180 to be safe in the long run?
01:13:19.760 And if you can't, that's it.
01:13:24.920 Michael says, Scott, there's more to
01:13:26.900 life than politics.
01:13:28.100 Christ.
01:13:28.460 Do you know what this live stream is
01:13:34.920 about?
01:13:36.080 Sort of a live stream about
01:13:37.660 politics.
01:13:39.620 Nobody told you that?
01:13:41.820 Did you think it was all going to be
01:13:42.940 Dilbert Comics?
01:13:44.640 I'm sorry.
01:13:47.580 All right.
01:13:49.340 Well, the most important thing today is
01:13:51.400 that it's Thanksgiving.
01:13:52.260 And for all of you lonely people,
01:13:55.940 I'll be doing a live stream
01:13:58.040 starting at 6 p.m. Eastern time,
01:14:01.840 3 p.m. my time.
01:14:03.720 First hour will be family friendly.
01:14:06.080 And then I might go private if we
01:14:07.600 want to have some adult
01:14:10.000 conversations and whatnot.
01:14:12.800 But if you're alone on Thanksgiving
01:14:15.380 and you like some company,
01:14:18.500 that's what I'll be for.
01:14:19.500 So I'm not going to be presenting
01:14:21.420 anything.
01:14:22.660 I'm just going to be hanging out
01:14:23.740 with you while I make some food.
01:14:26.560 So I'm going to be cooking and
01:14:27.800 prepping and chatting to you.
01:14:30.260 I have my official chef's hat and
01:14:32.240 apron.
01:14:33.380 And you can just put it on and feel
01:14:35.260 like you've got a friend.
01:14:36.760 I'll be your virtual friend.
01:14:38.780 As you know, the biggest, one of the
01:14:41.060 biggest health risks in the world
01:14:42.400 right now is loneliness because our
01:14:44.040 system is driving us to be lonely.
01:14:48.780 And so I'll try to do what I can to
01:14:50.920 fix that.
01:14:51.500 This will be a little test.
01:14:52.860 If it goes well, maybe I'll do more
01:14:54.580 of it.
01:14:55.880 We'll see if people like it.
01:14:58.080 But don't miss it.
01:15:00.200 If you're lonely, let me help.
01:16:03.480 And I would also like to take this
01:15:05.420 opportunity to show my gratitude
01:15:08.140 to every one of you and anyone who's
01:15:11.420 ever watched this all year.
01:15:13.140 As you know, I ran into a little bit
01:15:15.660 of trouble earlier this year.
01:15:16.980 I don't know if anybody heard.
01:15:18.780 A little thing called cancellation.
01:15:21.060 And at the moment, I'm doing great.
01:15:25.660 Happier than ever.
01:15:27.040 Literally happier than ever.
01:15:28.960 And it's because of you.
01:15:31.920 You're the ones who did not turn your
01:15:34.080 backs on me.
01:15:35.680 You're the ones who looked at the
01:15:37.060 proper context and said, do we care
01:15:39.780 about that or do we not care about
01:15:41.120 that?
01:15:42.000 And you're the smartest audience in
01:15:44.080 all of live streaming, which I
01:15:47.300 actually believe.
01:15:48.040 I believe that literally.
01:15:49.740 And I cannot tell you how much I
01:15:54.060 appreciate you because words would
01:15:57.060 fail.
01:15:58.120 Yeah.
01:15:58.360 My appreciation for you and what
01:16:00.760 you've done for me is extreme.
01:16:03.120 And I try to repay the favor as often
01:16:07.320 as possible, which can help bond us.
01:16:11.100 A number of you have been nice enough
01:16:13.580 to say that I've changed your lives in
01:16:16.520 a variety of ways.
01:16:17.840 Some of you have already bought your
01:16:19.720 copy of my new book, Reframe Your
01:16:21.820 Brain, which, by the way, this sounds
01:16:25.040 like marketing and it is, but it's also
01:16:27.660 true.
01:16:28.780 I wouldn't wait too long to order this
01:16:31.240 if you want to get it as a gift.
01:16:32.440 It's sort of the perfect gift book
01:16:34.660 because it works for everybody.
01:16:36.260 There's nobody, literally nobody,
01:16:39.220 who would not find this an
01:16:41.040 interesting book.
01:16:42.420 It's not like knowing somebody likes
01:16:43.920 certain kinds of books.
01:16:45.560 This one's guaranteed because it's
01:16:47.560 just easy little reframes that would
01:16:49.780 work for everybody.
01:16:52.180 So it's independently
01:16:54.360 published, which means that Amazon
01:16:56.180 publishes them based on the orders
01:16:58.500 that come in.
01:16:59.460 So they don't make them in advance
01:17:01.160 just in case you want them.
01:17:02.440 So I don't know that if you wait
01:17:05.640 until too close to Christmas, you
01:17:07.200 would get it on time.
01:17:09.240 I'm kind of expecting that they're
01:17:11.340 going to tell me any moment now, oh,
01:17:13.040 there's a two-week delay or something.
01:17:14.980 So if you're smart
01:17:18.160 and you want any copies of that, I would
01:17:20.100 get them in the following week.
01:17:20.620 That is marketing, of course, but I think
01:17:24.800 it's true.
01:17:25.380 I think there might be some supply
01:17:28.500 problems in the last two weeks.
01:17:32.120 All right.
01:17:32.660 Well, Annie, I'm glad I helped you during the pandemic.
01:17:37.420 And you certainly returned the favor.
01:17:40.760 All of you did.
01:17:41.960 So I'll see you, YouTube, either tomorrow or tonight.
01:17:47.700 By the way, usually the stream will not be on YouTube.
01:17:50.520 I think I didn't mention that.
01:17:51.500 It'll be on the Locals stream.
01:17:54.880 Usually it's private, but I'm going to make it public
01:17:58.820 so everybody can watch it.
01:18:00.900 That would be at scottadams.locals.com.
01:18:05.720 That's where you'd find me, and it'll go live
01:18:07.620 at 6 p.m. Eastern time.
01:18:10.340 I'll see you tomorrow or tonight.
01:18:13.020 Thanks for joining.