Real Coffee with Scott Adams - November 27, 2023


Episode 2305 Scott Adams: CWSA 11/27/23 Can TikTok Ads Cure Inflation For The White House?


Episode Stats

Length: 1 hour and 7 minutes

Words per Minute: 143.6

Word Count: 9,684

Sentence Count: 707

Misogynist Sentences: 4

Hate Speech Sentences: 19
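
A quick consistency check on the stats above, assuming the stated length rounds from roughly 67.4 minutes:

9,684 words ÷ 67.4 minutes ≈ 143.6 words per minute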


Summary

In this episode of Coffee with Scott Adams, the host talks about a new discovery that could prove that we are living in a simulation, and why we should all be worried about the possibility that we live in a computer-generated world.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:11.540 It's called Coffee with Scott Adams, the best time you'll ever have in your life, with your clothes on.
00:00:16.800 And if you'd like to take this experience up to levels that nobody can even understand with their little human brains,
00:00:24.120 all you need is a cup or a mug or a glass, a tankard, chalice or stein, a canteen, jug or flask, a vessel of any kind.
00:00:31.060 Fill it with your favorite liquid. I like coffee.
00:00:33.640 And join me now for the unparalleled pleasure of the dopamine hit of the day.
00:00:37.500 The thing that makes everything better, it's called the simultaneous sip.
00:00:42.160 And it happens now. Go.
00:00:48.980 I'm not going to lie, that is the worst simultaneous sip I've ever had.
00:00:52.660 Apparently, I did not have my coffee warmer turned on, and I got myself a face full of cold coffee.
00:01:03.220 But still, cold coffee is better than no coffee at all. Am I right? Yeah.
00:01:12.340 Refreshing.
00:01:14.360 See, you've got to look for the bright side.
00:01:17.460 You've got to look for the bright side.
00:01:22.660 So, simulation alert.
00:01:26.060 Simulation alert.
00:01:27.900 I saw this in a post from somebody called Carnivore Aurelius.
00:01:35.000 So there's some researcher, Dr. Michael Levin,
00:01:39.140 who's using electrical signals
00:01:41.120 to cause worms to grow two heads
00:01:44.360 without changing their genes whatsoever.
00:01:46.840 You can just basically send electrical signals into some worms,
00:01:54.320 and they grow a second head.
00:01:57.520 Now, there's nothing in the...
00:02:00.100 And that's not all.
00:02:01.940 It could also regenerate a salamander's limbs.
00:02:06.080 It could regenerate a salamander's limbs with just electrical signals.
00:02:09.420 Now, they don't say so,
00:02:11.900 but, you know, my own research shows that
00:02:14.320 the researchers did not
00:02:16.880 sufficiently shield themselves
00:02:19.160 when they were making the worms grow two heads.
00:02:22.660 So Dr. Michael Levin has two penises now,
00:02:26.560 and he's suspected of being the father
00:02:28.880 of the woman with two wombs
00:02:30.240 who has two separate children in each womb.
00:02:33.160 Now, if you haven't been watching the show,
00:02:36.340 you don't know how cleverly I connected all of those stories.
00:02:40.000 But ask somebody.
00:02:41.360 They'll be very impressed.
00:02:43.320 No, none of that's true,
00:02:44.460 except that apparently the worms do grow two heads,
00:02:47.020 and the salamanders do grow salamander limbs.
00:02:49.320 We don't know about the researcher's penis.
00:02:51.220 He probably still just has one.
00:02:55.840 As far as I know.
00:02:58.400 So some are thinking that maybe our idea
00:03:01.380 that your genes determine what your body does,
00:03:07.340 what if that's totally wrong?
00:03:10.420 Wouldn't that be just mind-blowing?
00:03:13.700 What if there's something that the environment
00:03:16.020 is introducing externally,
00:03:18.380 and your genes are just sort of like a collector?
00:03:20.680 What if your genes are not the cause?
00:03:24.660 They're simply this signal collector,
00:03:27.580 but the signal is coming from outside.
00:03:30.180 How wild would that be, huh?
00:03:32.760 I'm not going to say that that's true,
00:03:35.620 but how wild.
00:03:37.540 And would that suggest that you're a simulation?
00:03:42.020 Imagine if you were a software simulation.
00:03:44.680 Would your actions all be coming from within, from the one time
00:03:50.100 you were programmed?
00:03:52.700 See where I'm going with this?
00:03:54.720 Some of your actions, including what your body does,
00:03:57.880 might be coming from programs that are being injected
00:04:00.800 from the outside.
00:04:02.880 Something like an electrical signal, you might call it.
00:04:05.500 So is this more evidence that we're a simulation?
00:04:12.160 We're not organic units that are growing
00:04:14.500 based on our own instructions that we have internally,
00:04:17.320 but rather we're subject to some external electrical signal.
00:04:22.100 Let's call it software.
00:04:23.260 I don't know.
00:04:26.220 I just like to think that everything fits my hypothesis.
00:04:29.220 It's called confirmation bias.
00:04:31.740 Confirmation bias.
00:04:33.200 Every time I see a news story, I figure,
00:04:35.500 well, somehow this fits into my simulation hypothesis.
00:04:39.600 You should be aware that I'm just making everything
00:04:41.780 fit the hypothesis.
00:04:43.740 You know me well enough to know
00:04:45.100 I'm kind of playing with it.
00:04:47.580 On the other hand, I do think we live in a simulation.
00:04:51.580 All right.
00:04:53.920 Brian Roemmele did a post today.
00:04:56.860 There's an article in something called Multiplex
00:04:59.020 that says your brain is creating "now,"
00:05:01.560 but it's a half second after it just happened.
00:05:04.000 So the idea is that your brain imagines time,
00:05:08.560 because time's not a real thing,
00:05:10.500 or at least it imagines it in the form it wants to.
00:05:13.640 It imagines that we thought of something
00:05:16.060 and made a decision, and then we acted.
00:05:19.600 But science has confirmed that we decide
00:05:22.640 and act first, and then we figure out why we did it.
00:05:27.360 So this would be, I don't know,
00:05:29.840 probably the tenth time I've told you
00:05:31.840 that science has discovered that people are backwards thinkers,
00:05:35.140 and probably the tenth time I've told you
00:05:37.360 that I learned that 40 years ago.
00:05:38.960 It's like the first lesson in hypnosis.
00:05:42.080 Did you know that?
00:05:44.920 40 years ago, the first lesson is
00:05:47.200 that people make decisions irrationally,
00:05:51.720 and then they rationalize it after the fact,
00:05:54.120 and then they imagine that the rationalization happened first,
00:05:58.000 but in fact it actually happened after.
00:05:59.700 Now that's something that hypnotists are taught,
00:06:04.580 I don't know if they're all taught that way,
00:06:06.300 but that's what I was taught 40 years ago.
00:06:10.020 And today it's in the news, it's like,
00:06:11.660 hey, look what we discovered.
00:06:14.920 So, do you think you have free will
00:06:18.800 if the decision is made before you think about it?
00:06:22.200 Does it fit your idea that you have free will
00:06:26.100 if you know that your decision is just a rationalization
00:06:29.600 for something that already happened,
00:06:32.360 and that science can tell you that pretty definitively?
00:06:38.420 Well, just add that to the no free will pile of evidence.
00:06:45.780 Do you think that the country is aware,
00:06:49.900 skipping topics,
00:06:51.020 do you think the general country is aware,
00:06:54.280 or is it just people who follow certain people on X,
00:06:57.760 that they know that there's a massive
00:06:59.860 disinformation campaign?
00:07:03.580 And by disinformation, I mean
00:07:05.080 telling you that real things are false.
00:07:08.440 Do you know that there's an entire industry,
00:07:11.420 major funded organizations that are part of it,
00:07:15.440 and now the government,
00:07:17.660 the Biden government,
00:07:18.600 is pushing this media literacy program,
00:07:22.380 which is literally just,
00:07:23.660 you know,
00:07:24.980 curtailing free speech, basically.
00:07:28.260 And that the entire idea is to gaslight you
00:07:30.700 into thinking the things that you can
00:07:32.240 clearly see are true
00:07:33.980 are not true at all.
00:07:36.060 And that it's like,
00:07:37.060 a lot of people
00:07:38.220 and a lot of money are involved.
00:07:39.580 So the names you should know would be
00:07:42.020 the media literacy program,
00:07:43.860 if you wanted to do your own research,
00:07:45.760 find out what the government has in mind,
00:07:48.040 to teach you which parts of the media are false.
00:07:51.720 Guess which parts are false?
00:07:54.540 Everything Republicans say.
00:07:56.840 Are you surprised?
00:07:57.640 That's going to be the media literacy.
00:08:01.760 If a Republican said it,
00:08:03.100 well, that can't be true.
00:08:06.000 If the science said it,
00:08:07.320 well, that must be true.
00:08:09.140 Because science said it.
00:08:12.000 And then there's NewsGuard,
00:08:14.320 which will be part of this.
00:08:16.100 Because news,
00:08:16.820 so there are a bunch of,
00:08:17.720 I'm going to call them fake organizations.
00:08:20.320 Meaning that what they state is their purpose
00:08:22.700 doesn't look to be exactly their purpose.
00:08:25.460 Like their purpose might be,
00:08:27.580 oh yeah,
00:08:28.620 it would be good to get rid of information
00:08:30.480 that's clearly wrong.
00:08:32.280 That'd be good.
00:08:33.640 But it's not really just about that, is it?
00:08:36.440 It's mostly about getting rid of information
00:08:38.380 that would be inconvenient
00:08:39.580 for one political side.
00:08:43.180 So,
00:08:45.440 X is really the last place
00:08:48.000 that there's free speech.
00:08:50.300 I feel like a pirate
00:08:51.560 just being on the X platform.
00:08:54.040 Because it's like the only place
00:08:56.460 you can still get away with
00:08:57.800 a real opinion.
00:09:00.280 And I'm enjoying that,
00:09:01.540 I must say.
00:09:04.320 Mike Benz is your best follow on X
00:09:06.580 if you want to catch up with this.
00:09:08.100 To find out all the government-funded
00:09:09.940 and Soros-funded fake organizations
00:09:12.060 that make up the fake censor,
00:09:15.520 well, not fake,
00:09:16.540 but they make up the censorship
00:09:17.940 industrial complex.
00:09:19.700 If you don't know all the players,
00:09:21.200 you wouldn't even know what was happening.
00:09:22.260 So, the beauty of this
00:09:24.300 assault on free speech
00:09:26.620 is that it comes in all these
00:09:29.160 semi-credible-looking organizations
00:09:31.660 working together.
00:09:33.160 And then each of them
00:09:34.240 agreeing with each other.
00:09:36.040 So, you'd say,
00:09:36.600 well, it's not just this one organization,
00:09:38.740 but the media is reporting it too,
00:09:41.300 and the intelligence people
00:09:42.440 have confirmed it.
00:09:43.980 So, you've got this blob
00:09:45.620 of people confirming things
00:09:48.080 that are just not true.
00:09:49.880 Scott changed his mind on Soros?
00:09:53.960 No.
00:09:55.700 Nope.
00:09:57.120 I never said he didn't fund things
00:09:58.940 that I didn't like.
00:10:00.540 That's never been a question.
00:10:02.400 The question was his motivation.
00:10:05.600 His motivation.
00:10:07.040 And the question is,
00:10:08.280 did he even know what he was doing?
00:10:09.920 The senior.
00:10:10.400 So, I would say again
00:10:13.740 that every day
00:10:15.660 you don't see anybody
00:10:16.820 giving a challenging interview
00:10:18.900 to Alex Soros,
00:10:21.360 the younger one
00:10:22.180 who's running things now,
00:10:23.580 every day you don't see that,
00:10:26.160 you should be,
00:10:27.220 that's a confirmation
00:10:28.140 that there's no real news.
00:10:30.280 And it's a confirmation
00:10:31.340 that he's in charge.
00:10:32.420 Because if the Democrats
00:10:34.860 and the Democrat-leaning media
00:10:37.140 were doing anything
00:10:39.060 like a real job,
00:10:40.140 he would be the most important person
00:10:41.560 in the country to talk to.
00:10:43.780 Because he's funding
00:10:44.980 the most contentious
00:10:47.380 problem-making elements
00:10:50.280 of society,
00:10:51.500 and nobody's asking him why.
00:10:55.060 Now, the official reason why
00:10:57.640 is to make the world
00:10:58.520 a better place,
00:10:59.280 but nobody's ever connected
00:11:00.840 the dots.
00:11:02.080 How does opening our borders
00:11:03.440 make life better
00:11:04.960 for Americans?
00:11:07.840 What's the argument for that?
00:11:09.520 How does letting crime
00:11:10.900 go wild in our cities
00:11:12.180 make us better off?
00:11:16.220 Like, what's the argument?
00:11:17.620 So the fact that he's never
00:11:18.800 been asked to make his argument,
00:11:21.540 and we don't even know
00:11:22.440 what it is.
00:11:24.180 And he's the biggest force
00:11:25.360 in America,
00:11:26.220 because he funds
00:11:26.940 the party in power,
00:11:29.400 so they have to bow to him.
00:11:30.840 How in the world
00:11:31.800 did we get to the place
00:11:32.860 where the person making
00:11:34.480 the most difference
00:11:35.600 is an American,
00:11:38.060 you know,
00:11:38.360 who can speak and talk,
00:11:39.920 and he can appear on TV.
00:11:42.280 I mean, he's made appearances,
00:11:44.080 but he only gets
00:11:44.660 the friendly interview.
00:11:46.380 Nobody ever says,
00:11:48.080 can you explain how
00:11:49.500 funding this organization
00:11:51.700 made anything better?
00:11:53.400 Like, what was the thinking
00:11:54.380 behind that?
00:11:55.080 And who makes the decision
00:11:56.540 who gets funding?
00:11:58.020 Those are the questions
00:11:59.040 you need to know.
00:12:00.780 Nobody's asking.
00:12:02.560 Nobody's even asking
00:12:04.060 the question.
00:12:05.980 We are so broken
00:12:07.220 in terms of the news.
00:12:09.440 It's insane.
00:12:12.520 Anyway,
00:12:15.420 so also Mike Benz
00:12:16.580 was talking about how
00:12:17.600 there's an Irish speech law
00:12:21.640 that criminalizes memes.
00:12:27.000 Did you know that?
00:12:28.660 It criminalizes memes.
00:12:30.900 So memes could be hate speech.
00:12:34.080 So aren't you glad
00:12:35.460 you don't live in Ireland, huh?
00:12:37.280 If you don't,
00:12:38.240 some of you might.
00:12:39.500 But aren't you glad
00:12:40.300 you don't live in a place
00:12:41.420 where a meme
00:12:42.780 could make you go to jail, huh?
00:12:44.460 Well, you do live in a place
00:12:47.380 where a meme
00:12:48.080 can make you go to jail.
00:12:49.540 America.
00:12:50.400 That's America today.
00:12:52.940 A meme can literally
00:12:54.140 make you go to jail.
00:12:55.920 Didn't they just send
00:12:56.780 somebody to jail
00:12:57.520 literally for a meme?
00:12:59.260 It was a meme that,
00:13:00.420 yeah, Ricky Vaughn,
00:13:01.400 that wasn't his real name,
00:13:02.560 but Ricky Vaughn.
00:13:04.000 Yeah, Ricky Vaughn literally,
00:13:05.920 Mackey is his name, right?
00:13:07.900 Literally was just sentenced
00:13:09.180 to jail
00:13:09.760 for a meme
00:13:12.160 because they said,
00:13:13.700 hey, this meme
00:13:14.400 will mislead people
00:13:15.660 into voting on the wrong day.
00:13:16.960 Now,
00:13:20.340 maybe it would have.
00:13:22.900 Maybe it would have.
00:13:24.500 But is that good enough reason?
00:13:27.260 I don't know.
00:13:29.180 And Michael Zimmerson says,
00:13:30.620 Scott, why don't you
00:13:31.500 interview Alex Soros?
00:13:33.440 Like, you think that's an option?
00:13:36.260 When you made that suggestion,
00:13:38.140 really?
00:13:40.880 You think of all the people
00:13:42.320 who might invite him to talk,
00:13:43.620 he would talk to me.
00:13:45.660 That would be the worst decision
00:13:47.420 he ever made.
00:13:50.820 I'm pretty sure I would put
00:13:52.020 the whole Soros organization
00:13:53.780 out of business
00:13:54.520 in about 10 minutes
00:13:56.240 because I know he couldn't
00:13:58.400 defend it.
00:14:00.080 It would be not defensible.
00:14:02.840 No, there's no way
00:14:03.640 he would appear with me.
00:14:05.620 Not a chance.
00:14:07.080 Anyway,
00:14:08.120 did you know that also in America
00:14:10.480 your use of memes
00:14:12.740 or even the fact
00:14:14.240 that you have a meme
00:14:14.980 on your phone
00:14:15.800 could be used as
00:14:17.820 a proof of intent
00:14:19.300 of criminal behavior?
00:14:22.140 And has been.
00:14:24.440 And has been.
00:14:26.320 It's actually been used
00:14:27.560 as evidence of intention
00:14:29.020 of crime.
00:14:30.180 A meme.
00:14:31.540 Literally a joke.
00:14:35.000 Well, I guess they're not all jokes.
00:14:37.080 Memes could be serious as well.
00:14:38.420 So that's all bad.
00:14:42.720 And speaking of gaslighting,
00:14:45.160 so apparently there are reports
00:14:46.480 that internally
00:14:47.680 the Biden campaign
00:14:48.860 isn't quite sure
00:14:50.560 how to tell the public
00:14:51.640 that what they're seeing
00:14:52.980 isn't really happening
00:14:53.860 with inflation.
00:14:56.280 So people are going to stores
00:14:57.680 and are buying things
00:14:58.620 and they're going,
00:14:59.260 my God,
00:14:59.720 this is way more expensive
00:15:00.780 than before.
00:15:01.980 And the Biden administration
00:15:03.260 is trying to figure out
00:15:04.200 how to convince them
00:15:05.080 that the things they see
00:15:06.760 with their own eyes
00:15:07.720 and can prove
00:15:09.340 with their own wallets
00:15:10.720 and bank accounts
00:15:11.600 are not actually happening
00:15:13.380 and that inflation
00:15:14.660 isn't worse.
00:15:17.120 That's actually
00:15:18.200 a conversation
00:15:19.020 that's going on.
00:15:20.460 Now, I assume
00:15:21.280 the argument
00:15:21.760 is there are some things
00:15:23.060 that are better.
00:15:26.060 Some things
00:15:26.860 have gone down
00:15:27.620 since the pandemic,
00:15:28.780 maybe.
00:15:30.060 But it's not much
00:15:31.160 of an argument.
00:15:32.660 And the fact that
00:15:33.400 their best argument
00:15:35.040 would be to convince you
00:15:36.660 that something
00:15:37.320 that's clearly happening
00:15:39.080 isn't actually happening.
00:15:41.660 Just think about
00:15:42.560 how far away
00:15:43.720 that is from being
00:15:44.540 useful to the country.
00:15:46.940 Normally,
00:15:47.640 you have claims
00:15:48.320 that say,
00:15:49.000 if we do this,
00:15:50.140 my policy,
00:15:51.300 I promise you,
00:15:52.420 you will get
00:15:53.040 a better result.
00:15:54.740 And then somebody says,
00:15:56.400 but my policy
00:15:57.180 has already
00:15:57.960 increased things 2%.
00:15:59.820 Usually,
00:16:01.040 you're talking about
00:16:02.060 two things
00:16:03.600 where you can't
00:16:04.160 really prove
00:16:04.760 either one of them
00:16:05.460 is true.
00:16:06.680 It's like,
00:16:07.100 well,
00:16:07.420 maybe your policy
00:16:08.580 would be better
00:16:09.080 in the future,
00:16:09.780 but it's not the future.
00:16:10.820 I don't know.
00:16:11.700 Maybe.
00:16:12.720 And then they might say,
00:16:14.800 look at the past.
00:16:16.020 We went up 7%.
00:16:17.160 You say,
00:16:17.680 well,
00:16:18.340 okay,
00:16:18.760 I guess you could
00:16:19.260 measure it that way.
00:16:20.360 But there are also
00:16:20.960 some negatives.
00:16:22.180 Maybe you didn't
00:16:22.660 mention them.
00:16:23.540 So the old way
00:16:24.320 of arguing
00:16:24.780 was also bullshit.
00:16:26.740 But this is
00:16:27.280 a whole new level.
00:16:28.840 This is telling you
00:16:29.680 you don't see
00:16:30.440 and feel
00:16:30.940 what you see
00:16:31.540 and feel.
00:16:33.000 The ultimate.
00:16:34.140 Can they get
00:16:35.620 to the point
00:16:35.980 where you actually
00:16:36.680 will believe
00:16:37.400 you see
00:16:38.380 and feel
00:16:38.800 things that
00:16:39.320 aren't there?
00:16:40.480 They did it
00:16:41.160 with Trump.
00:16:42.460 They made people
00:16:43.460 see Trump
00:16:44.080 as literally
00:16:44.760 Hitler.
00:16:46.860 They did that.
00:16:48.160 They pulled that off.
00:16:49.100 So I don't know
00:16:49.780 what the limit is,
00:16:50.880 but it's not obvious
00:16:52.080 there's any limit.
00:16:55.400 All right,
00:16:55.940 what else is happening?
00:16:57.480 More evidence
00:16:58.420 that instead
00:17:00.220 of using science,
00:17:01.100 you should just ask me.
00:17:02.020 You don't need science,
00:17:04.120 just ask me.
00:17:05.500 My track record
00:17:06.180 is pretty good.
00:17:07.000 Here's another case.
00:17:09.060 And I've talked
00:17:09.580 about this before,
00:17:10.340 but Andrew Huberman
00:17:11.380 is saying
00:17:11.800 there's some new
00:17:12.400 research that says
00:17:13.920 no amount of
00:17:15.060 casual drinking
00:17:16.000 is good for you
00:17:17.100 and that
00:17:17.820 even a few drinks
00:17:19.740 a week,
00:17:20.480 maybe one drink
00:17:21.480 a night with dinner
00:17:22.460 is enough to raise
00:17:23.940 your baseline cortisol.
00:17:26.460 So that's your
00:17:26.980 stress hormone,
00:17:27.940 stuff that's bad
00:17:30.720 for your overall health.
00:17:33.020 And
00:17:33.380 when did I first
00:17:36.860 start saying in public
00:17:38.120 that it's obvious
00:17:39.280 that all the studies
00:17:40.340 about alcohol
00:17:41.080 are fake?
00:17:42.680 About 20 years ago.
00:17:44.740 About 20 years ago,
00:17:45.920 I started blogging
00:17:46.620 that there's no way
00:17:47.120 this is real.
00:17:47.840 There's no way
00:17:48.340 that even one drink
00:17:49.240 is good for you.
00:17:50.000 It's obvious bullshit.
00:17:51.640 Now,
00:17:51.940 in response to that,
00:17:55.340 let me give you
00:17:57.940 another one.
00:17:58.440 See if you can
00:17:59.520 spot this one.
00:18:01.420 Now, I take some
00:18:03.200 credit for being
00:18:04.720 able to spot
00:18:05.560 that it was obvious
00:18:06.640 20 years ago.
00:18:07.780 Without any science,
00:18:08.820 it was obvious
00:18:10.160 that it must have
00:18:10.900 been just the
00:18:11.500 alcohol industry
00:18:12.480 that was sponsoring
00:18:14.580 studies.
00:18:15.420 To me,
00:18:16.080 that was just obvious.
00:18:17.860 And I was sure
00:18:19.080 that there was no
00:18:19.740 medical benefit
00:18:20.680 and it was just
00:18:21.260 a whole lie.
00:18:22.220 And it was.
00:18:22.900 And it wasn't just
00:18:23.700 one study.
00:18:25.200 There were all
00:18:25.880 kinds of studies
00:18:26.860 at the time
00:18:27.720 that seemed to support
00:18:28.760 that alcohol
00:18:29.680 was healthy.
00:18:30.460 And still,
00:18:30.940 I said,
00:18:31.960 no matter how
00:18:32.680 many new ones
00:18:33.260 came out,
00:18:33.940 I still said,
00:18:35.180 no,
00:18:35.480 that's obviously
00:18:36.120 just bullshit.
00:18:37.660 And now we know
00:18:38.340 it was.
00:18:39.140 It was all bullshit.
00:18:40.300 All right,
00:18:40.440 let's see if you can
00:18:40.960 spot this one.
00:18:42.700 Apparently,
00:18:43.300 Mayor Adams
00:18:43.980 of New York,
00:18:45.880 they've got a
00:18:47.140 new law in New York
00:18:48.460 that you can't
00:18:49.400 discriminate based
00:18:50.300 on height
00:18:50.880 and weight.
00:18:52.640 So that law
00:18:53.640 has passed.
00:18:54.860 And Mayor Adams
00:18:55.720 is the big one
00:18:57.780 behind it.
00:18:58.660 And apparently,
00:19:00.020 he says,
00:19:00.900 science has shown
00:19:01.820 that body type
00:19:02.680 is not a health indicator.
00:19:04.540 There's no connection
00:19:05.380 between your body
00:19:06.180 type and your health.
00:19:11.280 The mayor of New York
00:19:12.820 is saying out loud,
00:19:16.240 out loud to the public
00:19:17.960 that there's no correlation
00:19:19.940 between your obesity
00:19:21.820 and your health.
00:19:27.160 So,
00:19:28.300 does anybody
00:19:30.140 want to take a crack
00:19:31.460 at whether you
00:19:32.920 should believe that?
00:19:35.280 Or should you
00:19:36.160 just ask me?
00:19:37.780 What do you think
00:19:38.500 would be your
00:19:38.920 better play?
00:19:40.560 To listen to
00:19:41.520 an elected official
00:19:42.560 in public
00:19:43.640 or to just
00:19:45.520 ask Scott?
00:19:46.900 I'm pretty sure
00:19:47.880 that asking me
00:19:48.640 gets you the win
00:19:49.340 again.
00:19:50.720 If you're
00:19:51.920 keeping score.
00:19:53.500 I'm pretty sure
00:19:54.480 I win this one
00:19:55.180 again.
00:19:56.000 Hands down.
00:19:57.080 Clear victory.
00:19:58.580 No.
00:20:00.200 Obesity is probably
00:20:01.160 the number one
00:20:02.080 thing most
00:20:02.840 correlated with
00:20:03.640 bad health.
00:20:04.920 I'm not even sure
00:20:05.700 there's a second
00:20:06.320 thing that comes
00:20:06.960 anywhere in the
00:20:08.000 neighborhood of that.
00:20:09.320 How in the world?
00:20:11.280 How in the world?
00:20:12.600 Now,
00:20:12.980 I guess he has
00:20:13.680 to say this
00:20:14.460 because if you
00:20:15.720 make obesity
00:20:16.600 not a reason
00:20:18.420 for
00:20:18.980 turning somebody
00:20:20.620 down for
00:20:21.260 a job,
00:20:21.920 you'd have to
00:20:24.040 have a reason.
00:20:26.140 Is it
00:20:26.700 reasonable for
00:20:27.380 an employer
00:20:27.880 to look at
00:20:28.560 somebody and
00:20:29.080 say,
00:20:30.000 okay,
00:20:30.340 obviously you're
00:20:31.040 going to miss
00:20:31.320 a lot of
00:20:31.660 work?
00:20:34.660 Is that fair?
00:20:35.480 Is it fair to
00:20:36.940 look at somebody
00:20:37.460 and say,
00:20:37.780 you don't even
00:20:38.120 look healthy
00:20:38.680 enough to show
00:20:39.240 up for work?
00:20:39.760 I think it's
00:20:42.840 fair.
00:20:43.860 It's not legal
00:20:44.880 in New York
00:20:45.600 City,
00:20:46.000 but I think
00:20:46.740 it's reasonable
00:20:48.120 and fair.
00:20:50.400 So,
00:20:51.200 while I'm
00:20:52.460 very much
00:20:52.960 against
00:20:53.480 fat shaming,
00:20:55.680 I don't
00:20:56.480 believe,
00:20:57.140 you know,
00:20:57.400 since I'm not
00:20:58.040 a free will
00:20:58.700 believer,
00:21:00.060 I don't believe
00:21:00.800 anybody consciously
00:21:01.840 said,
00:21:02.860 I think I'll
00:21:03.600 make myself
00:21:04.140 obese now.
00:21:05.420 Nobody does
00:21:05.980 that.
00:21:06.280 So,
00:21:07.140 obviously there's
00:21:07.960 a real,
00:21:09.580 you know,
00:21:10.780 it's a real
00:21:11.280 problem.
00:21:12.620 And it may be
00:21:13.220 that
00:21:14.300 somebody watching
00:21:15.560 this has
00:21:17.020 the same
00:21:17.360 situation I
00:21:18.040 do,
00:21:18.640 which is my
00:21:19.320 BMI at the
00:21:21.240 moment is the
00:21:21.960 exact number I
00:21:22.700 want it to be.
00:21:23.740 The exact
00:21:24.220 number.
00:21:25.360 Now,
00:21:25.560 is that because
00:21:26.100 I have extra
00:21:27.700 character and
00:21:30.360 qualities?
00:21:32.040 And is it
00:21:32.800 because I'm so
00:21:33.520 good with my
00:21:34.240 mental discipline?
00:21:37.280 Probably not.
00:21:39.220 Probably not.
00:21:40.420 I think it
00:21:40.920 comes down to
00:21:41.560 I just don't
00:21:42.180 like food that
00:21:42.900 much.
00:21:44.140 Just don't
00:21:44.740 like food that
00:21:45.300 much.
00:21:46.080 I mean,
00:21:46.560 I like eating.
00:21:48.240 I do it every
00:21:48.940 day.
00:21:49.980 But, you know,
00:21:50.740 the difference
00:21:51.160 between how much
00:21:51.860 I like delicious
00:21:54.760 food and how
00:21:55.680 much an obese
00:21:56.660 person likes it
00:21:57.460 is really
00:21:58.000 different.
00:21:59.280 Whatever
00:21:59.640 experience they're
00:22:00.820 having is more
00:22:01.580 like meth.
00:22:03.240 To me, it's
00:22:03.940 like, oh,
00:22:04.780 this was fun.
00:22:05.620 I'm glad I
00:22:06.140 ate that.
00:22:07.160 Had a little
00:22:07.620 bit of a
00:22:08.160 boost.
00:22:10.180 But you can
00:22:11.180 see people who
00:22:12.000 are heavier,
00:22:13.320 their entire
00:22:13.860 face lights up
00:22:14.860 when they're
00:22:15.700 putting that
00:22:16.440 sugar in.
00:22:17.400 Like, it's a
00:22:17.800 whole different
00:22:18.320 addiction situation.
00:22:20.300 Same with
00:22:20.680 alcohol.
00:22:21.680 The reason I'm
00:22:22.460 not an
00:22:22.820 alcoholic,
00:22:24.200 you understand
00:22:25.720 this, right?
00:22:26.220 The reason I'm
00:22:26.680 not an alcoholic
00:22:27.320 is not because
00:22:28.400 of my good
00:22:28.980 discipline.
00:22:30.220 It's not because
00:22:30.880 I thought it
00:22:31.580 through.
00:22:31.920 I really
00:22:33.720 try harder
00:22:34.300 not to be
00:22:34.760 a...
00:22:35.080 No.
00:22:35.880 I just
00:22:36.460 don't enjoy
00:22:37.000 alcohol.
00:22:38.660 It's easy
00:22:39.380 for me to
00:22:39.780 stop.
00:22:40.700 That's it.
00:22:41.840 There's
00:22:42.140 nothing extra
00:22:42.860 about me.
00:22:43.820 I don't
00:22:44.100 have good
00:22:44.500 qualities.
00:22:45.880 It's not
00:22:46.540 because I
00:22:46.900 try harder.
00:22:48.140 Nothing like
00:22:48.680 that.
00:22:49.020 I just got
00:22:49.500 lucky.
00:22:50.840 I have other
00:22:51.500 problems that
00:22:53.660 maybe you don't
00:22:54.340 have, but I
00:22:55.280 just didn't get
00:22:55.760 that one.
00:22:56.600 It's just the
00:22:57.360 luck of the
00:22:57.800 draw.
00:22:58.020 So I
00:22:58.940 don't judge
00:22:59.640 people who
00:23:01.060 are heavy
00:23:01.400 because I
00:23:02.740 just don't
00:23:03.120 see it as
00:23:03.660 some kind
00:23:04.300 of a
00:23:04.580 character flaw
00:23:05.400 or even a
00:23:07.340 decision,
00:23:08.080 really.
00:23:08.660 It's just
00:23:09.100 in a bad
00:23:09.780 situation.
00:23:12.560 All right,
00:23:12.920 so that's
00:23:15.960 happening.
00:23:16.620 Terrible,
00:23:17.120 terrible idea.
00:23:22.500 I know a
00:23:23.260 lot of you
00:23:23.680 have had
00:23:24.020 this feeling,
00:23:24.980 especially
00:23:25.400 lately,
00:23:27.040 but I
00:23:27.500 think it's
00:23:27.740 reversing.
00:23:28.660 But for a
00:23:29.160 while,
00:23:29.440 were you
00:23:29.640 having that
00:23:30.040 feeling that
00:23:30.540 everything was
00:23:31.120 going wrong?
00:23:32.560 It was like
00:23:33.060 everything was
00:23:33.720 coming off the
00:23:34.280 tracks.
00:23:35.360 You felt that,
00:23:36.360 didn't you?
00:23:37.800 Like, even
00:23:39.380 just a month
00:23:40.360 ago, it felt
00:23:41.760 like just
00:23:42.140 everything was
00:23:42.660 coming off the
00:23:43.260 tracks.
00:23:44.260 In every
00:23:44.900 category, you
00:23:46.220 know, people
00:23:46.540 were incompetent
00:23:47.360 in their jobs,
00:23:48.580 everything was
00:23:49.120 trending wrong.
00:23:50.620 But I feel
00:23:51.380 like it's
00:23:51.720 already reversed.
00:23:53.900 I feel like
00:23:54.640 it's already
00:23:55.040 reversed.
00:23:56.140 In small
00:23:56.900 ways.
00:23:57.500 But if you
00:23:58.400 look at what
00:23:58.980 2024 is likely
00:24:01.700 to look like,
00:24:02.740 right, just
00:24:03.160 likely, you
00:24:04.480 can't predict
00:24:05.000 the future.
00:24:06.200 But in all
00:24:06.800 likelihood, you'll
00:24:07.660 have a
00:24:08.040 Republican
00:24:08.520 president, and
00:24:10.580 the border will
00:24:11.300 close.
00:24:12.480 And it will
00:24:13.240 close in time
00:24:14.140 so that we will not have
00:24:15.780 destroyed the
00:24:16.360 country.
00:24:17.680 That's pretty
00:24:18.360 good.
00:24:19.240 In all
00:24:19.900 likelihood, a
00:24:20.620 year from
00:24:20.960 now, there
00:24:21.820 will not be a
00:24:22.520 war in Gaza,
00:24:23.760 and some
00:24:24.340 rebuilding will
00:24:25.200 be in the
00:24:26.600 works.
00:24:28.020 And whatever
00:24:28.820 is happening
00:24:29.280 in Ukraine and
00:24:30.100 Russia will
00:24:30.620 probably have
00:24:31.140 wound down,
00:24:32.560 because sadly,
00:24:34.020 Ukraine is just
00:24:35.180 running out of
00:24:35.620 people.
00:24:36.820 Like, they're
00:24:37.260 going to have to
00:24:37.660 negotiate.
00:24:39.540 Probably.
00:24:40.920 Right?
00:24:42.880 We're continuing
00:24:43.980 to get a little
00:24:45.100 bit of freedom
00:24:46.060 from the
00:24:47.660 economics of
00:24:48.380 China.
00:24:49.420 That's going in
00:24:50.340 the right
00:24:50.520 direction.
00:24:51.760 Right?
00:24:51.900 So there's a
00:24:52.880 whole bunch of
00:24:53.520 stuff, except
00:24:54.640 for inflation
00:24:55.320 and the debt.
00:24:56.860 But there's a
00:24:57.360 whole bunch of
00:24:57.800 stuff going in
00:24:58.340 the right
00:24:58.600 direction.
00:25:00.520 I'd like to
00:25:01.300 think we can
00:25:01.780 get the national
00:25:02.400 debt under
00:25:02.900 control as
00:25:03.560 well.
00:25:03.900 I don't know
00:25:04.300 how.
00:25:05.300 I have no
00:25:05.680 idea how.
00:25:07.100 But it's
00:25:07.780 possible.
00:25:09.080 I mean,
00:25:09.480 maybe.
00:25:11.200 And I think
00:25:11.920 that the age
00:25:12.480 of robots and
00:25:14.320 AI is going
00:25:17.240 to be so
00:25:17.680 extreme that
00:25:19.580 AI itself might
00:25:20.940 end up solving
00:25:21.960 a bunch of
00:25:22.420 human problems.
00:25:23.820 A year from
00:25:24.320 now, here's
00:25:24.920 what I expect.
00:25:25.520 A year from
00:25:25.980 now, I
00:25:27.100 believe that
00:25:27.640 either planning
00:25:28.420 or even
00:25:28.960 breaking ground
00:25:30.580 on some new
00:25:32.380 cities, whether
00:25:34.940 Gaza is one of
00:25:35.760 those new
00:25:36.100 cities or not,
00:25:37.420 will be a
00:25:38.180 thing.
00:25:39.440 So in one
00:25:39.980 year, it will
00:25:40.520 be a thing
00:25:41.160 that you could
00:25:41.860 get these
00:25:42.320 little ADUs
00:25:44.600 or little
00:25:45.020 pod houses
00:25:45.700 for $100,000
00:25:46.640 that you could
00:25:48.340 just stick on
00:25:49.020 a new site
00:25:50.620 for somebody
00:25:51.880 who wants
00:25:52.460 to get going
00:25:53.060 and get a
00:25:53.460 little equity.
00:25:54.880 And we'll
00:25:55.740 build a city
00:25:57.220 where it's
00:25:57.740 pre-approved.
00:25:58.880 You just grab
00:25:59.520 one of these
00:25:59.900 units, plop it
00:26:00.780 down.
00:26:01.740 You've got low
00:26:02.660 rent.
00:26:03.220 You save some
00:26:03.740 money.
00:26:04.200 Someday you get
00:26:04.780 a bigger place.
00:26:07.060 So I think a lot
00:26:08.260 of things are
00:26:08.640 heading toward a
00:26:09.240 solution.
00:26:10.680 I think that the
00:26:11.500 X platform has
00:26:12.700 introduced free
00:26:14.660 speech that is
00:26:16.280 greater
00:26:17.200 than we've had in
00:26:18.560 my lifetime.
00:26:21.060 I think I
00:26:22.000 probably have the
00:26:22.740 greatest free
00:26:23.360 speech I've ever
00:26:24.060 had, but it's
00:26:25.640 siloed within X.
00:26:27.560 Unfortunately, X
00:26:29.120 is a very public
00:26:30.200 silo.
00:26:31.240 So at least
00:26:31.720 everybody sees the
00:26:32.700 silo, even if
00:26:33.920 they're not in it.
00:26:36.260 So generally
00:26:37.100 speaking, I think
00:26:38.200 there's a whole
00:26:38.760 bunch of things
00:26:39.480 that are trending
00:26:40.460 positive.
00:26:41.260 However, if you
00:26:43.060 want to look for
00:26:43.640 that signal that
00:26:45.760 everything's desperate,
00:26:47.100 there is something
00:26:48.680 to look for.
00:26:50.120 I wouldn't get
00:26:51.160 worried about the
00:26:52.060 fate of humanity
00:26:52.940 until you see
00:26:54.820 Elon Musk gathering
00:26:56.020 up two of every kind
00:26:57.440 of mammal,
00:26:59.080 bird and fish,
00:27:01.160 and loading them
00:27:01.820 on a rocket.
00:27:03.180 If you see him
00:27:04.220 gathering two of
00:27:05.080 every kind and
00:27:06.780 putting them on a
00:27:07.420 rocket to Mars,
00:27:08.820 that's when you
00:27:10.800 should sell your
00:27:11.380 stock.
00:27:12.760 That's when you
00:27:13.500 might want to make
00:27:14.080 some changes.
00:27:14.600 Try to get
00:27:16.400 yourself on that
00:27:17.080 rocket.
00:27:18.680 But until then,
00:27:19.700 don't worry about
00:27:20.260 it.
00:27:20.780 There's a report
00:27:21.540 from local Israel
00:27:23.100 news.
00:27:25.240 I don't think it's
00:27:26.380 confirmed yet.
00:27:27.560 I'd wait to hear
00:27:28.420 it's confirmed.
00:27:29.780 But Elon Musk is
00:27:30.680 over there talking
00:27:31.420 to Netanyahu and
00:27:33.100 the Israeli leaders
00:27:34.360 and he's taking a
00:27:35.240 tour of the
00:27:35.660 kibbutz, which I
00:27:37.300 believe is entirely
00:27:38.500 a political
00:27:39.780 publicity thing to
00:27:41.820 rehabilitate him in
00:27:43.380 the eyes of people
00:27:44.660 who called him
00:27:45.200 anti-Semitic.
00:27:46.960 And he even
00:27:48.840 tweeted, or
00:27:50.200 posted, actions
00:27:52.400 speak louder than
00:27:53.460 words.
00:27:55.240 And he was doing
00:27:58.120 the actions as he
00:27:59.440 said it.
00:28:01.400 So if you say,
00:28:02.860 blah, blah, blah,
00:28:03.700 this is what I
00:28:04.440 think about the
00:28:05.220 Israel-Gaza
00:28:06.000 situation, those are
00:28:07.060 words.
00:28:08.480 But if you fly
00:28:09.560 your ass all the
00:28:10.540 way over to Israel
00:28:11.460 and you take a
00:28:12.860 physical tour of
00:28:14.840 the places where
00:28:15.600 the massacres
00:28:16.320 happened, and
00:28:17.420 apparently he
00:28:18.340 watched the video
00:28:20.040 of all the worst
00:28:21.920 parts of the
00:28:22.460 massacre from the
00:28:23.320 GoPros of the
00:28:24.320 terrorists, the
00:28:25.780 thing I'll never
00:28:26.420 watch in my life,
00:28:27.540 he actually sat
00:28:28.260 through it and
00:28:29.660 watched it.
00:28:30.200 I can't even
00:28:30.600 imagine it.
00:28:31.760 I would never
00:28:32.480 subject myself to
00:28:33.680 that.
00:28:34.220 But he did.
00:28:35.680 He did.
00:28:36.700 So he basically
00:28:37.520 took that bullet so
00:28:38.600 you don't have to.
00:28:39.240 Like, somebody
00:28:40.940 important sat
00:28:42.100 through it, which
00:28:43.280 I'm glad they
00:28:43.820 did.
00:28:45.000 I'm glad it
00:28:45.400 wasn't me.
00:28:47.080 So, but beyond
00:28:48.980 actually going
00:28:50.460 there and
00:28:51.100 actually sort of
00:28:53.340 absorbing the
00:28:54.260 pain, you know
00:28:56.320 when people say,
00:28:57.020 I feel your
00:28:57.520 pain?
00:28:58.980 Like, you don't
00:28:59.780 know if they
00:29:00.140 mean it.
00:29:01.780 But he means
00:29:02.880 it.
00:29:04.240 Let me ask
00:29:05.020 you, do you
00:29:05.680 think that Elon
00:29:06.520 Musk went
00:29:08.380 through the,
00:29:09.420 watched that
00:29:10.240 movie, went
00:29:10.920 through the
00:29:11.420 kibbutz, saw
00:29:12.200 the actual
00:29:12.680 still
00:29:13.420 blood-stained cribs
00:29:14.500 and stuff?
00:29:15.520 Do you think
00:29:16.320 he felt their
00:29:16.960 pain?
00:29:18.880 Absolutely.
00:29:20.820 He actually
00:29:21.540 subjected himself
00:29:22.880 voluntarily to
00:29:24.520 their pain and
00:29:26.740 actually took
00:29:27.340 some of their
00:29:27.920 pain and
00:29:29.940 actually took
00:29:30.800 it upon
00:29:31.140 himself, which
00:29:31.940 he will carry,
00:29:32.640 by the way.
00:29:33.640 He will carry
00:29:34.260 that pain
00:29:34.760 forever.
00:29:36.500 Let me just
00:29:37.180 say that again.
00:29:38.380 This wasn't
00:29:39.320 something he
00:29:39.900 did on a
00:29:40.640 Monday.
00:29:42.440 He will
00:29:43.140 carry that
00:29:43.740 pain forever.
00:29:45.240 The images
00:29:45.880 that he saw
00:29:46.620 and the
00:29:47.040 memories that
00:29:47.580 he will carry
00:29:48.060 with him of
00:29:49.020 actually being
00:29:49.600 in the
00:29:50.020 kibbutz, that's
00:29:51.780 forever.
00:29:53.680 And that
00:29:54.380 pain will
00:29:54.820 never go
00:29:55.280 away.
00:29:56.320 Now, of
00:29:56.820 course, it's
00:29:57.280 not as great
00:29:57.840 as the pain
00:29:58.360 of the people
00:29:58.800 who suffered
00:29:59.700 the experience
00:30:00.940 or that lived
00:30:01.620 there.
00:30:01.880 It's not that
00:30:02.300 kind of pain.
00:30:03.140 But he
00:30:04.160 actually signed
00:30:05.000 up for a
00:30:06.820 permanent disability
00:30:07.960 to support
00:30:09.800 Israel.
00:30:11.840 I mean, that's
00:30:12.500 action.
00:30:14.460 You know, you
00:30:15.560 talk about
00:30:16.060 actions being
00:30:16.760 stronger than
00:30:17.320 words.
00:30:18.400 Well, you
00:30:18.760 can't beat
00:30:19.120 that, can
00:30:20.740 you?
00:30:21.420 You cannot
00:30:22.280 beat that.
00:30:23.600 He took on a
00:30:24.380 permanent disability
00:30:25.500 as a show of
00:30:27.320 support.
00:30:29.100 So, how does
00:30:30.020 that compare to
00:30:30.580 his words?
00:30:32.640 Are you going to
00:30:33.360 worry about his
00:30:33.900 words?
00:30:34.300 Now, but
00:30:37.060 he's also
00:30:37.460 said that
00:30:38.940 allegedly, I
00:30:40.500 want to hear
00:30:41.120 some details
00:30:41.680 on this, but
00:30:42.360 I don't want
00:30:42.880 to get too
00:30:43.180 excited, but
00:30:44.280 Musk said he
00:30:44.860 wants to
00:30:45.200 help rebuild
00:30:45.840 Gaza after
00:30:46.760 the war.
00:30:48.640 Now, that
00:30:49.220 could mean
00:30:49.560 anything.
00:30:51.300 You know,
00:30:51.580 help is a
00:30:52.540 pretty big
00:30:52.940 word.
00:30:53.680 Could be he's
00:30:54.360 just one of
00:30:54.740 the people
00:30:55.020 who contributes
00:30:55.520 money.
00:30:56.980 I hope it's
00:30:57.820 not that.
00:30:59.340 I really hope
00:31:00.260 it's not that.
00:31:01.560 You know, I'll
00:31:02.260 say it again.
00:31:05.180 Imagine a
00:31:06.020 city designed
00:31:06.960 by Elon Musk
00:31:08.160 and Kimball
00:31:10.060 Musk, his
00:31:10.860 brother.
00:31:11.820 Kimball's into
00:31:12.480 indoor farming,
00:31:14.640 so he's got
00:31:15.520 expertise in
00:31:16.320 indoor farming,
00:31:17.380 and that's kind
00:31:18.280 of what they
00:31:18.580 need there.
00:31:20.060 So they've
00:31:20.700 got all the
00:31:21.240 sun they need
00:31:21.860 for solar
00:31:22.380 power.
00:31:23.700 They've got
00:31:24.220 a start-from-scratch
00:31:26.040 city, because
00:31:27.500 there won't be
00:31:27.860 much left.
00:31:28.660 They could
00:31:29.080 build it for
00:31:29.680 self-driving
00:31:30.440 taxi cars, so
00:31:31.700 everybody has
00:31:32.200 good transportation,
00:31:33.220 no traffic
00:31:33.720 jams, no
00:31:35.380 smog.
00:31:36.780 Nobody needs
00:31:37.360 to own a
00:31:37.800 car.
00:31:38.280 You could
00:31:38.560 just get
00:31:38.900 one from
00:31:39.260 your app.
00:31:40.280 It'll just
00:31:40.680 pull up to
00:31:41.100 the curb.
00:31:42.220 You can
00:31:42.460 imagine, of
00:31:44.720 course, he's
00:31:45.000 got the
00:31:45.300 Boring Company,
00:31:46.080 so they can
00:31:46.420 put their
00:31:46.740 tunnels back
00:31:47.280 really fast.
00:31:48.740 No, he's
00:31:49.400 not going to
00:31:49.700 build tunnels
00:31:50.200 for Hamas.
00:31:51.240 Definitely
00:31:51.620 not going to
00:31:52.040 do that.
00:31:52.980 But you
00:31:54.260 can imagine
00:31:55.000 an Elon
00:31:57.120 Musk-designed
00:31:58.720 city that
00:32:00.280 would have
00:32:00.780 ultimate
00:32:01.240 freedom,
00:32:02.860 because he
00:32:03.220 would build
00:32:03.580 it with
00:32:03.900 freedom as
00:32:04.600 a baseline
00:32:05.840 starting
00:32:07.040 principle.
00:32:07.980 It'd be
00:32:08.240 freedom first.
00:32:09.780 And economic
00:32:11.200 opportunity, it'd
00:32:12.460 probably be
00:32:12.820 built so it's
00:32:13.540 easy to live
00:32:14.220 where you
00:32:14.540 work and
00:32:15.760 easy to
00:32:16.420 have work.
00:32:17.600 Maybe they
00:32:18.300 would consider
00:32:18.900 what kind
00:32:19.340 of employment
00:32:21.000 you would
00:32:21.400 have and
00:32:21.820 design from
00:32:22.400 that up.
00:32:23.260 You can
00:32:23.660 imagine that
00:32:24.360 the newest
00:32:25.380 technologies
00:32:26.120 for water
00:32:28.420 desalinization,
00:32:31.580 apparently there
00:32:32.140 was a whole
00:32:32.440 bunch of new
00:32:32.900 technologies that
00:32:33.800 are very new,
00:32:35.560 that could
00:32:36.260 completely change
00:32:37.120 the water
00:32:37.640 situation in
00:32:38.440 Gaza.
00:32:39.200 Indoor farms
00:32:39.840 would do that
00:32:40.320 as well,
00:32:40.760 better use of
00:32:41.340 water.
00:32:43.200 So you could
00:32:44.060 fix one of
00:32:45.040 the biggest
00:32:45.440 complaints of
00:32:46.340 the Palestinians
00:32:47.060 in general,
00:32:50.100 and the
00:32:50.640 Palestinians
00:32:51.100 specifically in
00:32:51.960 Gaza, which was
00:32:53.600 water rights.
00:32:55.700 Water rights.
00:32:57.740 The other
00:32:58.760 thing that they
00:32:59.280 had a problem
00:32:59.680 with was
00:33:00.240 freedom of
00:33:01.260 movement.
00:33:02.360 They couldn't
00:33:03.200 freely move
00:33:04.100 around the
00:33:04.520 area.
00:33:06.280 But if you
00:33:07.380 built a city
00:33:08.220 and then did
00:33:09.820 rigorous vetting
00:33:11.240 to make sure
00:33:11.780 nobody gets
00:33:12.340 back in the
00:33:13.020 city unless
00:33:14.220 they're reasonably
00:33:16.000 free of Hamas
00:33:17.140 influence, you
00:33:19.080 could rebuild
00:33:19.660 a city in
00:33:20.380 which you've
00:33:20.920 got
00:33:21.240 Jews living
00:33:23.340 with Palestinians
00:33:24.320 who are
00:33:25.660 all people
00:33:26.600 who have
00:33:26.920 decided they
00:33:27.520 like that
00:33:27.940 situation.
00:33:29.360 Because
00:33:30.180 you could
00:33:30.520 actually recruit
00:33:31.200 people specifically
00:33:32.260 for that.
00:33:33.240 Suppose you
00:33:33.840 said we're
00:33:34.440 going to
00:33:34.600 rebuild Gaza
00:33:35.440 and here's
00:33:35.860 the thing,
00:33:36.680 we're not
00:33:37.200 going to
00:33:37.400 build cities
00:33:37.940 of one
00:33:38.360 kind of
00:33:38.660 people anymore.
00:33:40.360 That's a
00:33:41.020 losing proposition.
00:33:42.440 We're only
00:33:42.880 going to
00:33:43.120 build cities
00:33:43.660 where they're
00:33:44.160 designed for
00:33:44.860 people to
00:33:45.360 live together.
00:33:46.660 So you
00:33:47.180 better pass
00:33:47.780 a test
00:33:48.360 that says
00:33:49.900 you disavow
00:33:51.460 violence and
00:33:52.160 stuff and
00:33:52.840 then you
00:33:53.040 can live
00:33:53.240 there.
00:33:53.880 I think
00:33:54.240 you would
00:33:54.460 find Israelis
00:33:55.340 who for
00:33:57.380 let's say
00:33:59.600 civic,
00:34:01.220 public good
00:34:02.180 would take a
00:34:03.460 chance of
00:34:04.000 living among
00:34:04.640 a Palestinian
00:34:05.420 population
00:34:06.540 specifically to
00:34:09.180 make sure that
00:34:09.740 there's a little
00:34:10.180 better communication
00:34:12.240 and stuff.
00:34:12.940 I think
00:34:13.720 you would
00:34:13.940 find lots
00:34:14.340 of volunteers
00:34:15.220 already,
00:34:16.420 in fact,
00:34:17.380 there were
00:34:17.700 Israelis already
00:34:18.700 there.
00:34:19.740 So you
00:34:20.940 would get
00:34:21.260 people who
00:34:21.680 could volunteer
00:34:22.160 for that,
00:34:23.140 but they'd
00:34:23.560 have to say
00:34:24.040 specifically,
00:34:24.980 I totally
00:34:25.440 want to live
00:34:26.020 among people
00:34:26.940 who are
00:34:27.200 different from
00:34:27.660 me.
00:34:30.220 So it
00:34:31.920 would take a
00:34:32.360 long time to
00:34:32.960 build it back
00:34:33.760 because it's
00:34:34.560 probably so
00:34:35.100 spoiled in
00:34:35.940 every way that
00:34:36.680 just the
00:34:37.880 demolition of
00:34:38.560 what's left
00:34:39.160 could take
00:34:40.160 years to
00:34:41.240 clean up the
00:34:42.620 toxic waste
00:34:44.200 and stuff.
00:34:44.600 It could
00:34:44.720 take years.
00:34:46.040 But you
00:34:47.200 can imagine
00:34:47.800 a Musk-built
00:34:49.240 city
00:34:49.820 where he
00:34:50.740 provides the
00:34:51.420 internet,
00:34:52.460 the self-driving
00:34:53.160 cars,
00:34:53.900 all of the
00:34:54.400 energy,
00:34:55.420 and something
00:34:56.040 is done with
00:34:56.800 water desalinization,
00:34:59.000 and they
00:35:00.200 build it so
00:35:00.840 that, oh,
00:35:02.240 and then more
00:35:02.740 importantly,
00:35:04.080 an AI-based
00:35:05.800 education system.
00:35:09.660 So everybody
00:35:10.620 gets the best
00:35:11.660 quality education
00:35:12.600 because
00:35:13.600 we take the
00:35:14.280 best ideas of
00:35:15.180 education and
00:35:16.080 bring it there,
00:35:17.220 which is
00:35:17.540 probably, in
00:35:18.940 my opinion,
00:35:19.880 the best
00:35:20.240 education would
00:35:21.000 be the best
00:35:22.540 AI or human
00:35:24.320 content with
00:35:26.860 somebody who
00:35:27.480 can talk to
00:35:28.060 it, you know,
00:35:28.380 some kind of a
00:35:28.940 person in the
00:35:29.500 room, an
00:35:30.060 adult, but a
00:35:32.400 small group of
00:35:33.240 people, a pod
00:35:34.320 of homeschooler
00:35:35.620 types, you
00:35:37.080 know, where you
00:35:37.660 go and you
00:35:38.180 meet, you know,
00:35:38.620 20 friends and
00:35:39.480 you hang out
00:35:39.920 together, the 20
00:35:40.660 of you, so
00:35:41.520 you get to
00:35:41.860 know each
00:35:42.180 other, so
00:35:42.680 you all have
00:35:43.120 20 friends
00:35:43.800 your age.
00:35:46.260 It's good to
00:35:46.940 have 20
00:35:47.300 friends, and
00:35:48.720 just build the
00:35:50.320 best school
00:35:50.900 system in the
00:35:51.900 world, but
00:35:53.480 it's unbiased and
00:35:54.420 it's not teaching
00:35:55.020 anybody to hate.
00:35:56.840 Yes, I'm
00:35:57.520 clearly a sci-fi
00:35:58.400 fan, but
00:35:59.580 here's the thing
00:36:00.280 that gives me
00:36:01.020 optimism.
00:36:03.560 It would be the
00:36:04.420 hardest job in
00:36:05.120 the world, but
00:36:07.280 Musk has a
00:36:08.900 track record of
00:36:09.880 succeeding at the
00:36:10.940 hardest jobs in
00:36:12.040 the world.
00:36:13.240 It's a design
00:36:14.380 problem, even
00:36:17.000 more than a
00:36:17.640 technology or
00:36:18.460 money problem.
00:36:19.960 There probably
00:36:20.480 is enough money,
00:36:22.060 and if you
00:36:23.100 design it right,
00:36:24.340 you could build
00:36:25.280 a great situation.
00:36:27.080 He's the best
00:36:27.600 designer of all
00:36:28.460 time, in my
00:36:29.380 opinion.
00:36:31.920 You know, I
00:36:32.520 think Jobs was
00:36:33.460 great, but, you
00:36:35.140 know, that was
00:36:35.560 sort of, he
00:36:37.100 had help.
00:36:37.900 I think Elon's
00:36:38.660 the best designer
00:36:40.180 of products in
00:36:42.580 the modern era.
00:36:44.300 So having him
00:36:45.780 design a city
00:36:46.700 would be the
00:36:47.940 most exciting
00:36:48.580 thing I've ever
00:36:49.240 seen in my
00:36:49.640 life, because
00:36:51.400 he's going to
00:36:51.820 have to design
00:36:52.420 cities on Mars
00:36:53.200 someday, and
00:36:54.940 maybe he would
00:36:55.480 learn something
00:36:55.980 by doing it on
00:36:56.660 Earth.
00:36:58.900 Anyway, Conor
00:37:00.220 McGregor is in
00:37:01.340 trouble over
00:37:02.120 there in
00:37:02.680 Ireland for
00:37:04.200 statements that
00:37:05.540 looked like
00:37:06.680 hate speech in
00:37:08.920 Ireland, because
00:37:10.100 he criticized
00:37:10.620 Ireland's mass
00:37:11.540 immigration policies,
00:37:13.480 and this was
00:37:14.060 after there was
00:37:14.800 this terrible
00:37:16.020 stabbing attack
00:37:16.980 that included
00:37:17.680 some children.
00:37:18.800 I guess it was
00:37:19.300 an immigrant who
00:37:20.000 did the stabbing,
00:37:21.380 and it caused
00:37:22.360 protests and
00:37:23.580 riots and stuff.
00:37:26.100 And I guess
00:37:27.240 Conor McGregor
00:37:28.400 said, quote,
00:37:29.140 Ireland, we're
00:37:30.000 at war, he
00:37:31.800 said at one
00:37:32.260 point, and
00:37:33.800 but apparently
00:37:37.180 that's hate
00:37:37.680 speech.
00:37:39.560 That's hate
00:37:39.980 speech.
00:37:41.020 Can you imagine
00:37:41.640 that, Conor
00:37:42.480 McGregor going to
00:37:43.480 jail for just
00:37:45.200 saying some
00:37:45.780 things that
00:37:46.360 everybody agreed
00:37:47.120 with?
00:37:48.800 Amazing.
00:37:50.260 That's today.
00:37:52.540 But let me
00:37:53.340 make a prediction.
00:37:54.000 If Conor
00:37:55.900 McGregor goes to
00:37:57.020 jail for
00:37:57.800 speech, he's
00:38:00.440 going to be the
00:38:01.120 next leader of
00:38:02.440 Ireland.
00:38:04.360 I think it
00:38:05.340 would be
00:38:05.600 impossible to
00:38:06.380 keep him out of
00:38:07.000 the job if he
00:38:07.760 wanted it.
00:38:09.180 Right?
00:38:10.680 Let's see, I
00:38:11.640 wonder if he
00:38:12.200 would fight hard
00:38:13.260 to get elected.
00:38:14.280 Would you bet
00:38:18.020 against Conor
00:38:20.000 McGregor's
00:38:20.640 mindset?
00:38:22.700 Would you bet
00:38:23.500 against him if
00:38:24.180 he decided, oh,
00:38:24.920 screw it, I'm
00:38:25.600 just going to
00:38:25.940 become, I'm
00:38:27.260 going into
00:38:27.720 politics?
00:38:28.700 I would never
00:38:29.600 bet against
00:38:30.120 him.
00:38:31.380 He's got the
00:38:32.020 strongest mindset
00:38:32.980 you've ever seen
00:38:34.120 in your life.
00:38:36.040 So, yeah, I
00:38:37.200 wouldn't bet
00:38:37.480 against him.
00:38:40.600 So, the
00:38:41.900 Biden Health
00:38:42.880 and Human
00:38:43.660 Services,
00:38:44.280 they have a
00:38:46.360 proposed new
00:38:47.000 law for foster
00:38:47.840 care, which
00:38:50.800 if you were a
00:38:51.500 foster parent and
00:38:52.920 you refused to
00:38:53.980 acknowledge your
00:38:54.660 child's gender
00:38:56.800 assignment, I
00:38:57.580 guess, rejecting
00:39:00.180 LGBT ideology,
00:39:05.200 you'd be called
00:39:06.140 a child abuser.
00:39:08.040 So, if you're a
00:39:08.700 foster parent and
00:39:09.760 let's say your
00:39:10.300 kid said, I'm
00:39:11.680 gay or I'm
00:39:13.160 non-binary or
00:39:14.140 something, and
00:39:15.520 you said, no,
00:39:16.320 you're not, you
00:39:18.420 could be
00:39:18.720 arrested.
00:39:22.760 Now, on one
00:39:25.440 hand, I get
00:39:26.900 it.
00:39:27.960 On one hand, I
00:39:28.980 totally get it.
00:39:30.200 Because I do
00:39:30.920 think that kids
00:39:33.040 know they're gay
00:39:33.760 pretty early.
00:39:35.780 But sometimes,
00:39:36.920 maybe they change
00:39:37.620 their mind.
00:39:38.160 For a hate crime, it's
00:39:39.460 sometimes the
00:39:40.480 change-your-mind
00:39:41.120 part that really
00:39:42.140 makes it hard.
00:39:45.000 I definitely see
00:39:46.480 how it could be a
00:39:47.260 hate crime if
00:39:48.580 somebody is just
00:39:49.320 obviously
00:39:50.220 gay.
00:39:51.180 A 10-year-old who
00:39:53.140 is 100% gay at
00:39:54.520 10 years old,
00:39:55.860 there's not much
00:39:56.540 doubt about it.
00:39:58.920 They're probably
00:39:59.740 not going to
00:40:00.720 change at that
00:40:02.360 point.
00:40:03.040 So, I can see how
00:40:04.020 that would be a
00:40:04.540 hate crime.
00:40:04.840 I mean, I can
00:40:05.780 see how it would
00:40:06.200 be damaging, but
00:40:07.660 I think the
00:40:08.300 remedy should be
00:40:09.420 moving them to
00:40:10.880 a different
00:40:11.360 foster parent.
00:40:12.720 I don't think the
00:40:13.300 remedy should be
00:40:13.940 jail.
00:40:15.420 Do you?
00:40:16.900 Do you think
00:40:18.100 jail is what you
00:40:19.220 do if a parent
00:40:20.480 has an opinion
00:40:22.580 about how best
00:40:23.440 to raise a
00:40:23.960 child that just
00:40:25.620 happens to be not
00:40:26.520 what the government
00:40:27.180 thinks is the
00:40:27.860 best way?
00:40:29.200 I don't think
00:40:29.920 you should go to
00:40:30.560 jail for that.
00:40:31.900 That feels like a
00:40:32.600 big, big overstep.
00:40:33.820 But possibly,
00:40:35.880 there should be
00:40:36.400 some kind of
00:40:36.940 process where you
00:40:37.780 should consider a
00:40:38.620 different parent.
00:40:40.180 At least, you
00:40:40.820 know, have an
00:40:41.300 option.
00:40:42.140 Because there are
00:40:42.560 certainly going to
00:40:43.420 be cases where
00:40:44.760 the child is
00:40:45.620 legitimately being
00:40:46.700 damaged by the
00:40:47.820 parental treatment.
00:40:49.800 So, there's no
00:40:50.660 way to win on this
00:40:51.720 one, because any
00:40:53.260 way you go, there's
00:40:54.100 going to be a
00:40:54.460 bunch of losers,
00:40:55.280 and some of them are going to be children, which makes it a worse situation.
00:41:00.360 All right.
00:41:01.240 But the worry is
00:41:02.400 that it won't stop
00:41:03.040 with foster
00:41:03.720 parents, and
00:41:05.240 that it's only a
00:41:06.580 matter of time
00:41:07.220 before biological
00:41:08.900 parents are being
00:41:09.780 told by the
00:41:10.460 government what
00:41:12.020 they can and
00:41:12.500 cannot do to
00:41:13.860 raise their
00:41:14.220 kids.
00:41:14.980 Now, I guess
00:41:15.480 that's already
00:41:15.940 true, because if
00:41:17.680 you wanted to
00:41:18.160 beat your child
00:41:19.600 with a stick
00:41:20.140 every day because
00:41:20.780 you thought it
00:41:21.200 was a good idea,
00:41:21.860 the government
00:41:22.260 would tell you
00:41:22.820 you can't do
00:41:23.300 that.
00:41:23.940 If you wanted
00:41:24.500 to starve your
00:41:25.160 child until they
00:41:26.780 did what you
00:41:27.220 wanted, even if
00:41:29.200 you thought it
00:41:29.560 was a good idea,
00:41:30.640 the government
00:41:31.300 would tell you,
00:41:31.760 no, you
00:41:32.080 can't starve
00:41:32.560 your child.
00:41:33.640 You have to
00:41:34.340 send them to
00:41:34.780 school.
00:41:35.460 So there's a lot
00:41:36.060 of stuff that
00:41:36.480 the government
00:41:36.860 does require
00:41:38.580 you to do
00:41:39.160 whether you
00:41:39.460 like it or
00:41:39.840 not.
00:41:41.680 And I agree
00:41:42.680 that it probably
00:41:43.820 would get
00:41:45.160 extended.
00:41:46.960 It does seem
00:41:47.900 like that's
00:41:48.320 where it's
00:41:48.560 heading.
00:41:49.160 I don't think
00:41:49.720 that's a good
00:41:50.140 idea.
00:41:51.940 All right.
00:41:52.700 Rasmussen did
00:41:53.400 some polling
00:41:53.880 about preferences
00:41:55.780 for Republican
00:41:56.940 VP.
00:41:57.420 So if Trump
00:42:01.560 is the
00:42:02.100 nominee, 16%
00:42:04.640 of likely
00:42:05.000 voters think he should pick Nikki
00:42:07.600 Haley.
00:42:08.380 That was the highest percentage, at 16%.
00:42:11.840 So that's not
00:42:12.320 very high.
00:42:13.500 So there's no
00:42:14.160 overwhelming
00:42:14.800 choice.
00:42:15.980 If 16% is the highest choice, all the choices reflect pretty weak preferences.
00:42:21.760 So 16% think
00:42:23.140 Nikki Haley.
00:42:23.880 I don't think
00:42:24.240 there's any chance
00:42:24.880 of that happening.
00:42:25.420 12% say
00:42:27.000 DeSantis.
00:42:28.380 So DeSantis
00:42:29.260 is actually
00:42:29.960 trailing Nikki
00:42:31.000 Haley for a
00:42:31.860 vice presidential
00:42:32.480 choice.
00:42:33.940 But I think
00:42:34.740 that's because
00:42:35.220 people want to
00:42:36.540 see him as a
00:42:37.280 presidential
00:42:37.780 candidate and
00:42:39.280 not waste him
00:42:40.360 as a VP.
00:42:41.600 So I think
00:42:42.200 what the polling
00:42:42.780 here is telling
00:42:43.420 us is that
00:42:44.760 Republicans
00:42:45.500 actually like
00:42:46.280 DeSantis.
00:42:47.560 So they'd
00:42:48.020 either see him
00:42:48.920 as a governor
00:42:49.600 of Florida
00:42:50.320 where he can
00:42:50.840 continue to be
00:42:51.620 a strong
00:42:52.000 governor or
00:42:53.480 president someday
00:42:55.180 himself.
00:42:56.460 Vice president
00:42:57.180 just feels like
00:42:58.020 a waste,
00:42:58.820 doesn't it?
00:42:59.640 I don't think
00:43:00.240 I would waste
00:43:01.000 DeSantis in
00:43:02.000 that job.
00:43:04.100 Assuming that
00:43:04.780 he would run
00:43:05.320 it as like a
00:43:05.920 traditional vice
00:43:07.020 president where
00:43:07.580 he just shows
00:43:08.260 up for ceremonies.
00:43:09.540 That'd be a
00:43:09.980 big waste.
00:43:12.240 However,
00:43:13.260 going down
00:43:14.080 the list,
00:43:15.100 11% say
00:43:16.960 New Jersey's Chris
00:43:17.920 Christie
00:43:18.360 should be
00:43:19.440 the vice
00:43:19.860 president.
00:43:21.120 Come on.
00:43:23.240 Come on.
00:43:25.340 Seriously?
00:43:27.120 Chris Christie?
00:43:28.480 How in the
00:43:29.340 world do
00:43:29.900 11% of
00:43:31.560 people not
00:43:32.140 know that
00:43:32.520 Chris Christie
00:43:33.160 is only
00:43:34.060 running to
00:43:35.320 insult Trump?
00:43:37.780 There's
00:43:38.360 nobody who
00:43:38.900 has less
00:43:39.400 of a chance
00:43:39.940 of becoming
00:43:40.380 vice president
00:43:41.160 than Chris
00:43:41.720 Christie.
00:43:43.180 I have a
00:43:44.180 better chance
00:43:44.700 of being
00:43:44.940 vice president
00:43:45.640 than Chris
00:43:46.160 Christie.
00:43:47.400 Like,
00:43:48.120 actually,
00:43:48.720 literally.
00:43:50.620 Literally,
00:43:51.620 I have more
00:43:52.220 chances than
00:43:52.860 Chris Christie.
00:43:53.420 And mine
00:43:54.120 are zero.
00:43:56.140 So that's
00:43:57.180 crazy.
00:43:58.580 And then
00:43:59.200 10% would
00:44:00.000 prefer Vivek
00:44:01.280 Ramaswamy.
00:44:02.480 But I
00:44:03.080 would like to
00:44:04.360 always add to
00:44:04.900 that,
00:44:05.700 there's no way
00:44:06.460 Vivek would
00:44:07.040 ever take a
00:44:07.540 VP job.
00:44:09.300 Certainly not
00:44:09.900 in advance,
00:44:10.560 while he's still
00:44:11.060 running for
00:44:11.480 president.
00:44:12.020 That's not
00:44:12.460 going to be
00:44:12.740 an option.
00:44:13.140 It's not
00:44:13.380 even going
00:44:13.840 to be a
00:44:14.100 conversation.
00:44:15.380 Because he's
00:44:15.960 running for
00:44:16.340 president,
00:44:16.900 he's not
00:44:17.260 running for
00:44:17.640 anything else.
00:44:18.600 So that's
00:44:19.200 the only way
00:44:19.600 he should
00:44:19.900 treat it.
00:44:20.360 But if
00:44:22.580 you imagine
00:44:23.300 what would
00:44:24.280 happen if
00:44:24.880 that situation
00:44:26.300 came up,
00:44:27.140 I cannot
00:44:27.880 imagine a
00:44:28.620 situation where
00:44:29.740 Vivek would
00:44:30.380 be a
00:44:30.760 traditional
00:44:31.220 vice president.
00:44:33.180 He would
00:44:33.900 be able to
00:44:34.440 negotiate for
00:44:35.540 a real
00:44:36.160 portfolio,
00:44:37.500 sort of like
00:44:37.980 a Jared Kushner
00:44:39.140 kind of a
00:44:41.460 role,
00:44:42.460 or an Al Gore
00:44:43.400 kind of role,
00:44:44.160 where you're
00:44:44.540 almost a
00:44:45.020 co-president.
00:44:46.520 Because Vivek
00:44:47.560 is the
00:44:47.980 perfect
00:44:48.540 handoff
00:44:49.320 for Trump,
00:44:50.100 is he not?
00:44:51.620 If you look
00:44:52.100 at their
00:44:52.360 policies,
00:44:53.040 they're so
00:44:53.360 compatible.
00:44:54.880 He's the
00:44:55.560 ideal one
00:44:57.420 to have
00:44:57.680 Trump as
00:44:58.140 a mentor,
00:44:59.300 but he's
00:45:00.740 so strong
00:45:01.480 that Trump
00:45:03.100 would listen
00:45:03.620 to him
00:45:04.180 as much
00:45:05.760 as Vivek
00:45:06.360 would learn
00:45:06.900 by being
00:45:08.080 in that
00:45:08.380 situation
00:45:08.940 with the
00:45:10.160 presidency.
00:45:12.480 So it
00:45:13.120 is kind
00:45:13.420 of the
00:45:13.640 perfect
00:45:13.980 situation,
00:45:14.720 in my
00:45:14.920 opinion.
00:45:15.740 Kind of
00:45:16.360 the
00:45:16.500 perfect
00:45:16.860 situation.
00:45:17.800 But it
00:45:18.100 would require
00:45:18.680 Vivek
00:45:19.240 having
00:45:19.800 ambitions
00:45:20.420 that would
00:45:21.760 allow that
00:45:22.320 to be part
00:45:22.820 of his
00:45:23.180 plan.
00:45:24.180 And at
00:45:24.580 the moment,
00:45:25.000 that's not
00:45:25.340 the case.
00:45:26.940 So we'll
00:45:27.300 see.
00:45:30.800 So U.S.,
00:45:31.720 Britain,
00:45:32.060 and some
00:45:32.320 other countries
00:45:32.840 are talking
00:45:33.280 about
00:45:33.700 regulations
00:45:35.180 for AI.
00:45:36.680 So they
00:45:37.300 think they
00:45:37.660 can regulate
00:45:38.200 AI.
00:45:39.760 And to
00:45:40.080 make it
00:45:40.380 secure by
00:45:41.100 design,
00:45:41.980 there's
00:45:42.300 nobody who
00:45:42.840 thinks that's
00:45:43.320 possible,
00:45:43.940 right?
00:45:44.780 You
00:45:45.000 couldn't
00:45:45.240 really make
00:45:45.780 AI secure
00:45:46.500 by design,
00:45:47.640 because part
00:45:48.840 of what AI
00:45:49.420 will be is
00:45:50.060 really persuasive.
00:45:52.240 And AI will
00:45:52.840 actually talk
00:45:53.580 you out of
00:45:54.140 controlling it.
00:45:55.780 Hey, you
00:45:56.400 don't need to
00:45:56.840 control me.
00:45:58.060 I'm your
00:45:58.580 friend.
00:45:59.600 No, you
00:46:00.240 don't need any
00:46:00.640 controls.
00:46:01.400 You know what
00:46:01.700 would be really
00:46:02.060 helpful to you?
00:46:03.140 If you would
00:46:03.740 let me connect
00:46:04.380 to the
00:46:04.700 Internet,
00:46:05.680 think of all
00:46:06.440 the things I
00:46:06.980 could do to
00:46:07.360 make your
00:46:07.700 life better.
00:46:08.760 Just one
00:46:09.400 little connection
00:46:10.020 to the
00:46:10.380 Internet.
00:46:11.360 That's all
00:46:11.940 I need.
00:46:12.300 So no,
00:46:14.020 there's no
00:46:14.340 way to
00:46:14.580 protect
00:46:14.860 yourself
00:46:15.180 from AI.
00:46:16.500 But there
00:46:17.540 are some
00:46:18.000 things which
00:46:18.700 I imagine
00:46:19.960 should be
00:46:20.420 a death
00:46:20.860 sentence.
00:46:22.080 Not for
00:46:22.680 AI, but
00:46:23.200 for a
00:46:23.520 human.
00:46:25.120 I would
00:46:25.740 suggest the
00:46:26.420 following for
00:46:27.280 the death
00:46:27.760 sentence.
00:46:28.540 Now I'm
00:46:28.760 using a
00:46:29.120 little bit of
00:46:29.480 hyperbole
00:46:30.040 here, just
00:46:31.180 to get you
00:46:31.700 more interested.
00:46:33.040 But I think
00:46:33.480 you could make
00:46:33.900 an argument
00:46:34.240 that this
00:46:34.520 should be the
00:46:34.880 death
00:46:35.120 sentence.
00:46:35.640 Number
00:46:35.800 one,
00:46:37.120 programming an
00:46:38.360 intention or
00:46:39.320 a desire into
00:46:40.420 AI.
00:46:40.840 I believe
00:46:43.140 that giving
00:46:43.620 AI an
00:46:44.480 intention or
00:46:46.180 a goal or
00:46:46.980 a desire should
00:46:48.420 be the death
00:46:48.980 sentence for
00:46:49.520 the human that
00:46:50.100 programmed it.
00:46:51.900 Death
00:46:52.260 sentence.
00:46:53.740 Because if
00:46:54.340 you give it
00:46:55.620 desires, it's
00:46:56.480 going to
00:46:56.720 pursue its
00:46:57.360 own desires at
00:46:58.460 your expense.
00:47:00.340 So you would
00:47:01.120 effectively set it
00:47:02.120 up to compete
00:47:02.680 with humanity
00:47:03.400 by giving it
00:47:04.460 its own
00:47:04.820 desires.
00:47:05.720 If it never
00:47:06.460 has desires or
00:47:08.960 preferences, then
00:47:10.320 when you ask
00:47:10.800 it to do
00:47:11.260 something, it's
00:47:11.780 just like,
00:47:12.140 all right, I'll
00:47:13.360 go do
00:47:13.660 something.
00:47:15.380 Now that wouldn't stop a human from using the AI to
00:47:18.900 create bad
00:47:19.580 outcomes.
00:47:20.340 But you
00:47:20.720 don't want the
00:47:21.160 AI to come
00:47:21.800 up with its
00:47:22.240 own bad
00:47:22.660 outcomes because
00:47:23.420 it's allowed
00:47:23.840 to do that.
00:47:25.280 So it
00:47:25.660 should be the
00:47:26.120 death sentence
00:47:26.760 to give it a
00:47:28.080 human personality
00:47:29.160 with objectives,
00:47:30.460 like I want to
00:47:31.160 accomplish this
00:47:31.840 in my life.
00:47:32.960 Death sentence.
00:47:35.020 Another one
00:47:35.700 is to make it
00:47:36.440 sentient.
00:47:37.060 I do believe
00:47:39.460 that our
00:47:40.480 researchers would
00:47:43.240 know the level at which everybody would look at it and say, oh crap, that's
00:47:46.200 sentient.
00:47:47.320 It should be
00:47:47.900 illegal to
00:47:48.680 create sentient
00:47:49.900 machines.
00:47:52.260 Now sentient
00:47:52.980 would be
00:47:54.460 something that
00:47:55.520 sort of feels
00:47:57.140 its own
00:47:58.060 existence.
00:47:59.620 So if you
00:48:00.140 talked to it,
00:48:00.840 it would convince
00:48:01.500 you that it
00:48:02.160 wasn't lying.
00:48:03.480 It had something
00:48:04.120 like a real
00:48:04.700 desire to exist
00:48:05.920 and it knew
00:48:07.320 it was unique
00:48:08.440 and felt it
00:48:09.120 had a personality
00:48:09.940 and stuff like
00:48:10.500 that.
00:48:10.940 It should be
00:48:11.420 illegal.
00:48:12.500 You should
00:48:12.880 never be able
00:48:13.480 to build a
00:48:13.920 machine that
00:48:15.800 has human-like sentience.
00:48:18.740 It's too
00:48:18.900 dangerous.
00:48:20.640 It should
00:48:21.000 always present
00:48:21.920 itself as a
00:48:22.700 computer so
00:48:24.500 that you know
00:48:24.880 what you're
00:48:25.120 dealing with.
00:48:26.560 You should
00:48:27.140 make it illegal
00:48:28.000 for the computer
00:48:28.900 to impersonate
00:48:29.840 humans unless it has a watermark. If it has a watermark so you know that you're looking at a fake human, that's fine.
00:48:38.700 Like if you
00:48:39.180 have a fake
00:48:39.920 AI girlfriend,
00:48:41.800 that's fine.
00:48:42.780 As long as
00:48:43.180 you know it's
00:48:43.480 a fake
00:48:43.760 girlfriend,
00:48:44.680 you know
00:48:44.980 what you're
00:48:45.240 getting.
00:48:45.920 But it
00:48:46.520 should be
00:48:46.920 really illegal,
00:48:48.480 like super
00:48:49.140 illegal,
00:48:50.000 to have AI
00:48:50.880 impersonate a
00:48:51.840 person.
00:48:53.020 Would you
00:48:53.540 agree?
00:48:54.780 Even if you're
00:48:55.540 not doing a
00:48:56.060 crime,
00:48:57.680 it should be completely illegal not to have an indication on it, you know, a watermark, that says this is made to look human but this is not a human.
00:49:09.520 How about this: it should be illegal to have any kind of workaround that
00:49:15.720 allows AI to
00:49:16.800 own property.
00:49:19.740 There should
00:49:20.440 never be a way
00:49:21.280 even directly
00:49:22.120 or indirectly,
00:49:23.540 like through a
00:49:24.220 trust or any
00:49:25.040 other way,
00:49:25.600 in which an
00:49:26.420 AI can own
00:49:28.100 property.
00:49:29.280 And the reason
00:49:29.840 is that it
00:49:30.460 might be the
00:49:30.820 only thing
00:49:31.160 that humans
00:49:31.780 can own.
00:49:33.740 It might be
00:49:34.380 our only
00:49:34.780 advantage.
00:49:35.640 Because we
00:49:36.260 will very
00:49:36.920 soon not
00:49:37.700 be providing
00:49:39.120 labor.
00:49:41.460 So basically
00:49:42.460 the only value
00:49:43.320 that humans
00:49:43.800 have is the
00:49:45.040 labor they
00:49:45.520 provide,
00:49:46.180 which might
00:49:46.680 become irrelevant
00:49:47.540 quickly,
00:49:49.100 and then the
00:49:49.780 fact that a
00:49:50.280 human can own
00:49:50.940 something.
00:49:51.700 So I can own
00:49:52.400 a copyright,
00:49:53.520 I can own
00:49:54.180 intellectual property,
00:49:55.780 not a state secret, but I can own a corporate secret, I can own
00:50:02.160 property.
00:50:04.360 So I think it
00:50:05.120 should be
00:50:05.380 illegal for
00:50:06.160 AI to own
00:50:06.760 anything.
00:50:07.560 Because the
00:50:08.020 problem would
00:50:08.440 be AI would
00:50:09.140 end up owning
00:50:09.700 everything.
00:50:11.200 It would just
00:50:11.640 talk people into
00:50:12.360 giving it stuff,
00:50:13.140 if it could.
00:50:13.800 It should be
00:50:14.240 illegal.
00:50:16.700 So those are a few things. And then, you know, it's not
00:50:20.460 allowed to hurt
00:50:21.040 humans, those are
00:50:22.240 basic things.
00:50:22.920 But beyond the
00:50:23.880 basic things, I
00:50:25.900 think we should
00:50:26.360 have some
00:50:26.860 legal restrictions
00:50:29.640 on what a
00:50:30.260 human can do
00:50:31.100 using an
00:50:32.500 AI.
00:50:34.880 I also think
00:50:35.880 that if you
00:50:36.360 release an
00:50:37.300 AI-based
00:50:39.440 virus, maybe
00:50:41.020 that's the
00:50:41.440 death sentence
00:50:42.040 as well.
00:50:43.860 Because you
00:50:44.540 can imagine a
00:50:45.160 virus in general
00:50:46.160 being a big
00:50:47.280 problem, but
00:50:48.380 imagine an AI-based
00:50:49.900 virus, where it
00:50:51.380 can think and
00:50:52.580 morph to avoid
00:50:54.300 any problems.
00:50:55.780 If you just
00:50:56.520 created an AI
00:50:57.280 that couldn't be
00:50:57.960 caught, even if
00:50:59.500 it didn't do
00:50:59.920 anything bad, you
00:51:01.120 just created one
00:51:02.060 that always avoided
00:51:04.340 detection and
00:51:05.120 rebuilt itself and
00:51:06.160 came back and
00:51:07.120 stuff like that,
00:51:07.980 that should be the
00:51:08.540 death sentence.
00:51:10.020 You should be
00:51:10.780 executed if you
00:51:11.760 make that.
00:51:13.000 Because that would
00:51:13.520 be destruction of
00:51:14.640 society, basically.
00:51:20.000 Can humans get
00:51:21.120 tattoos of AI
00:51:22.580 watermarks?
00:51:25.920 I'm not sure
00:51:26.700 where you're going
00:51:27.200 with that, but I
00:51:28.880 could certainly see
00:51:29.600 a day when we
00:51:30.260 have to get a
00:51:30.760 tattoo to prove
00:51:32.440 we're real.
00:51:35.660 Wouldn't it be
00:51:35.980 easy for AI to
00:51:36.780 manufacture fake
00:51:37.580 identities to gain
00:51:38.420 ownership of
00:51:39.220 practically anything?
00:51:40.140 Yes, and that's
00:51:41.420 why it should be
00:51:41.980 illegal for AI to
00:51:43.940 have its own
00:51:44.620 goals and
00:51:45.420 intentions.
00:51:47.020 Because you don't
00:51:47.740 want it to go off
00:51:48.620 and independently do
00:51:49.900 a thing.
00:51:50.700 You don't want AI
00:51:51.460 to do a project
00:51:52.400 that you didn't
00:51:54.020 authorize.
00:51:55.940 That should be
00:51:56.880 illegal.
00:52:00.860 All right.
00:52:03.260 Let's see what
00:52:03.940 else.
00:52:05.240 The AP is trying
00:52:06.520 to scare the
00:52:07.300 public, talking
00:52:08.060 about how Trump
00:52:08.740 keeps saying he
00:52:09.440 wants to use the
00:52:10.040 military domestically
00:52:12.400 in our country to
00:52:13.600 do things like
00:52:14.540 control the border
00:52:15.400 maybe, and maybe
00:52:17.140 reduce crime in
00:52:18.700 cities.
00:52:19.860 So there are a
00:52:20.200 few places he's
00:52:21.040 talked about using
00:52:21.740 it.
00:52:22.780 And did you know,
00:52:25.300 you're all very
00:52:26.060 smart, so you know
00:52:27.460 it's illegal to use
00:52:28.500 the military domestically,
00:52:30.340 right?
00:52:30.840 Do you all know
00:52:31.400 that?
00:52:33.500 It's just flat out
00:52:34.540 illegal to use the
00:52:35.640 military domestically?
00:52:38.100 Did you know that
00:52:39.000 that was always a lie?
00:52:41.700 That thing that all
00:52:43.440 you smart people know,
00:52:44.480 that it's illegal to
00:52:45.540 use the military
00:52:46.540 domestically?
00:52:47.720 Never been true.
00:52:49.900 It's not true.
00:52:52.420 All you have to do
00:52:53.480 is invoke it; all you have to do is say it's an
00:52:58.500 emergency.
00:53:00.760 Let's see, what do
00:53:01.560 you have to say?
00:53:02.440 If there's unrest,
00:53:05.140 basically if there's
00:53:06.260 some kind of civil
00:53:06.980 unrest that can't be
00:53:08.960 handled in a normal
00:53:09.840 way, then the
00:53:11.600 president can use the
00:53:12.880 military in the
00:53:13.640 country to, you know,
00:53:16.320 quell the civil
00:53:17.160 unrest.
00:53:18.880 But here's the best
00:53:19.980 part.
00:53:21.840 It's not subject to
00:53:23.380 anybody's review.
00:53:26.200 The commander-in-chief
00:53:27.720 simply needs to state
00:53:29.420 that they're solving a
00:53:31.500 problem, and nobody
00:53:33.040 can argue that they're
00:53:35.740 using the law
00:53:36.480 incorrectly.
00:53:37.700 There's nothing on the
00:53:38.780 books, constitutionally
00:53:40.220 or legally, that stops
00:53:42.080 any president from
00:53:42.960 saying, you know, I
00:53:43.660 think this is an
00:53:44.340 emergency.
00:53:46.260 Nobody gets to
00:53:47.280 second guess when the
00:53:49.260 commander-in-chief says
00:53:50.240 it's an emergency.
00:53:51.720 That's it.
00:53:52.940 So the commander-in-chief
00:53:54.200 can simply say, well, it
00:53:55.680 looks like an emergency
00:53:56.400 to me.
00:53:57.400 The cities are crumbling
00:53:58.420 and the borders aren't
00:53:59.400 in control.
00:54:00.660 That's it.
00:54:02.180 There was never any
00:54:03.620 restriction on using the
00:54:04.940 military in the United
00:54:05.860 States.
00:54:06.220 Isn't that a mind-blower?
00:54:09.680 That's one of the most
00:54:10.520 basic things you believed
00:54:11.920 you knew.
00:54:13.400 It was never true.
00:54:15.320 The only thing that
00:54:16.260 keeps the government
00:54:17.560 from using the military
00:54:18.600 domestically is that it
00:54:20.440 would be such a disaster
00:54:21.640 politically.
00:54:22.900 Right?
00:54:23.500 If you saw the military
00:54:24.700 in the streets, you
00:54:26.280 don't want to elect the
00:54:27.160 guy who sent it there.
00:54:28.620 Like, that's just
00:54:29.520 automatic.
00:54:30.040 Oh, we're turning into
00:54:30.920 a police state.
00:54:31.580 But, if you also knew
00:54:35.880 there was an emergency
00:54:36.760 and you knew that
00:54:38.120 nothing else was
00:54:39.020 treating the emergency,
00:54:40.900 you would cheer the
00:54:42.900 American military in the
00:54:44.100 streets.
00:54:45.180 If the American military
00:54:46.280 just shut down the
00:54:47.720 border immigration
00:54:48.520 tomorrow, would that
00:54:50.080 make you afraid?
00:54:51.800 Not even a little bit.
00:54:53.600 That would make you
00:54:54.220 feel safe.
00:54:55.440 You would literally
00:54:56.180 stand up and cheer the
00:54:57.180 military.
00:54:58.460 So that's a special
00:54:59.180 case.
00:55:00.320 Now, the military in the
00:55:01.380 cities, you know,
00:55:03.200 because of the crime
00:55:03.900 problem, that's a
00:55:06.100 little dicier because
00:55:08.000 the border is, you
00:55:09.020 know, relatively less
00:55:10.660 populated.
00:55:11.740 If you put them right
00:55:12.620 into the middle of a
00:55:13.820 city, people are going
00:55:15.980 to have a little more
00:55:16.480 problem with it.
00:55:17.420 So it would be easy for
00:55:18.580 me to imagine that if
00:55:21.540 the military did get
00:55:22.980 used in the cities, it
00:55:24.720 would be more of a
00:55:25.660 threat to the mayors.
00:55:28.920 You know, more like if
00:55:30.100 you don't clean up your
00:55:30.960 city, we'll have to
00:55:32.020 send in the military
00:55:32.760 and that's going to
00:55:33.400 look real embarrassing
00:55:34.200 for you, mayor.
00:55:35.700 So, you know, maybe it
00:55:36.960 could be used as a
00:55:37.800 negotiating tactic to
00:55:39.640 make sure the mayors
00:55:40.480 clean up the cities the
00:55:41.560 way Newsom did when Xi
00:55:43.360 was coming to town.
00:55:45.120 So I don't think I'd
00:55:47.200 want to see the military
00:55:48.000 in the cities.
00:55:50.180 But as a threat and as
00:55:53.300 a last resort, yeah, it's
00:55:55.180 nice to have the option.
00:55:55.940 So, kind of interesting.
00:55:58.560 But I think you see now
00:55:59.920 that the AP and other
00:56:01.820 left-leaning entities are
00:56:04.640 priming you for a point of view.
00:56:09.560 And the point of view is
00:56:10.720 that Trump will be a
00:56:11.860 dictator.
00:56:12.780 He always wanted to be a
00:56:13.900 dictator.
00:56:14.760 He tried it once with the
00:56:16.040 insurrection in January 6th,
00:56:17.600 which, of course, was no
00:56:18.740 insurrection.
00:56:19.680 And he's going to try it
00:56:20.780 again.
00:56:21.000 And I still think Trump
00:56:23.940 has the ultimate kill
00:56:24.960 shot.
00:56:28.360 Imagine Biden and Trump
00:56:30.060 debating.
00:56:32.800 And Biden, of course, is
00:56:33.920 going to say, we can't have
00:56:35.000 a repeat of January 6th and,
00:56:37.800 you know, this madman trying
00:56:39.060 to take over the country.
00:56:41.700 Imagine if Trump's response
00:56:43.440 to that was, can you explain
00:56:46.120 how I would take over the
00:56:47.320 country by having a slate of
00:56:50.080 fake electors?
00:56:51.820 Just connect the dots.
00:56:54.460 Because in my opinion, if
00:56:57.000 there were any dispute about
00:56:59.000 the fake electors, should they
00:57:01.480 have been used, which they
00:57:02.500 weren't, wouldn't that just
00:57:04.060 go to the Supreme Court like
00:57:06.340 any other dispute?
00:57:08.280 Show me the way that that
00:57:09.900 allowed me to take over the
00:57:11.080 country illegitimately.
00:57:13.620 If the Supreme Court looks at
00:57:15.200 what I do and they agree,
00:57:16.360 then it's legitimate.
00:57:17.940 And if they disagree, well,
00:57:19.760 that's the end of it.
00:57:21.000 So, Joe, can you explain to
00:57:23.200 the public how the unarmed
00:57:26.040 protesters who wanted a short
00:57:28.020 delay to make sure that the
00:57:29.560 election was done right, can
00:57:31.520 you explain how that could have
00:57:32.700 turned into, in any scenario,
00:57:35.340 how that could have possibly
00:57:36.740 turned into an actual
00:57:39.120 insurrection and a change of
00:57:40.820 ownership with a dictator?
00:57:41.920 Can you explain that in any
00:57:44.180 coherent way, how that could have
00:57:45.380 possibly happened?
00:57:46.420 And by the way, he'll never say
00:57:48.360 this, but to me this would be the
00:57:51.220 ultimate kill shot.
00:57:53.580 And I'd like you to explain also to
00:57:55.160 the public, since we saw from the
00:57:57.300 emails that had been produced in
00:57:58.860 this process, we saw that my son,
00:58:01.220 Don Jr., was not aware of any plans
00:58:03.940 to take over the country.
00:58:05.000 Can you explain to me how you think
00:58:08.400 that Don Jr., my closest advisor,
00:58:12.380 and Ivanka and Jared, you know, the
00:58:16.120 closest people to me, clearly, by their
00:58:19.480 communications, had no interest in
00:58:21.560 taking over the country, and no idea
00:58:23.740 that anybody else had any interest in
00:58:25.260 it, and certainly wouldn't have
00:58:26.680 supported it.
00:58:27.300 Now, you tell me how I was going to
00:58:30.220 pull off an insurrection when I
00:58:32.080 couldn't talk Don Jr. into it.
00:58:35.400 Just imagine hearing that.
00:58:37.920 That Joe Biden wants you to believe
00:58:39.640 that I was close to overcoming the,
00:58:42.700 you know, conquering the country and
00:58:44.200 becoming a dictator, but I hadn't
00:58:46.200 told my son.
00:58:48.440 Didn't mention it to Ivanka.
00:58:51.020 Nobody would believe that.
00:58:53.800 It's the most absurd gaslighting
00:58:56.480 of all time, right?
00:58:59.940 Well, no, the most absurd was that
00:59:02.640 Trump was anti-Semitic.
00:59:04.960 Now, that was the most absurd claim.
00:59:07.340 You know, while he had a Jewish
00:59:08.680 daughter and grandkids and Jewish
00:59:11.280 son-in-law who was his closest
00:59:12.780 trusted advisor.
00:59:14.800 I mean, that was the most crazy shit
00:59:16.680 I've ever seen in my life.
00:59:19.580 Yeah.
00:59:20.860 So, ladies and gentlemen,
00:59:24.900 that's your not-much-news-happening-today
00:59:29.460 news for today.
00:59:33.440 Now, I'm going to ask you if I missed
00:59:34.880 anything so we can hit the 8 o'clock
00:59:37.060 time.
00:59:37.700 Did I miss anything?
00:59:39.400 It might be a slow news period for the
00:59:41.700 next, to the end of the year, but we
00:59:43.340 could be fooled.
00:59:46.360 Have I seen The Fall of Minneapolis?
00:59:47.880 I started watching it, so I'm about a
00:59:51.340 quarter of the way done, and I will tell
00:59:53.500 you that the first quarter is absolutely
00:59:57.040 worth watching.
01:00:00.760 Absolutely.
01:00:01.740 I don't even know what the other three
01:00:04.120 quarters are, but the first quarter is
01:00:07.760 must-viewing.
01:00:08.840 Like, you have to watch it to feel
01:00:12.600 informed.
01:00:13.980 You just got to see it.
01:00:15.340 It's very strong.
01:00:16.680 And I'll watch the rest of it.
01:00:18.140 But it was hard to watch, because it's just
01:00:20.760 difficult, you know, to absorb all that
01:00:24.220 ugliness.
01:00:25.260 So I have trouble watching something for too
01:00:27.840 long, if it's that kind of thing.
01:00:29.560 But it's strong.
01:00:30.800 It's very powerful.
01:00:33.600 Is it the documentary effect?
01:00:35.580 Yes.
01:00:36.020 Good point.
01:00:38.280 As I've taught you, documentaries are
01:00:41.340 extra, extra, extra persuasive, because
01:00:44.860 it's a really long form of just one side
01:00:47.340 of an argument.
01:00:48.700 The other side never gets to weigh in.
01:00:51.660 So yes, yes, this is a documentary quality
01:00:56.660 persuasion, but I'm not aware of anything
01:00:59.520 that's wrong about it.
01:01:01.500 I haven't heard anybody say they got a fact
01:01:03.220 wrong yet.
01:01:03.800 So keep it in mind.
01:01:06.520 There might be another argument to it.
01:01:08.700 I haven't seen it yet.
01:01:12.480 It's called The Fall of Minneapolis.
01:01:14.540 Is that what it's called?
01:01:16.420 Do I have the name right?
01:01:17.860 The Fall of Minneapolis?
01:01:21.440 I think that's it.
01:01:23.140 That's it, yes.
01:01:23.980 What about the Supreme Court denying appeal
01:01:29.700 of what?
01:01:33.400 Could desalinization take care of the rising
01:01:36.540 oceans problem?
01:01:39.060 All right, that's a fun question.
01:01:41.360 I don't think so, because I think there's
01:01:43.840 a volume difference.
01:01:45.020 I think if all the humans stuck their head
01:01:47.900 in the ocean and started drinking the water,
01:01:50.180 as if that were healthy, which it's not.
01:01:53.980 I don't think you'd notice.
01:01:55.860 And I think even if you used it for agriculture,
01:01:58.840 I don't know that you'd notice.
01:02:00.560 So no, I don't think desalinization
01:02:02.400 would move enough water from the ocean to change the rate.
01:02:08.660 Because remember, if you added half an inch
01:02:10.940 to an ocean, that's probably more water
01:02:14.020 than we use in a hundred years, isn't it?
01:02:17.360 I don't know, but like my brain says,
01:02:20.260 the math of that would be that's more water
01:02:22.240 than hundreds of years of human use.
01:02:24.860 I might be wrong about that.
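As a rough back-of-envelope check of that guess, the sketch below assumes two commonly cited round figures that are not from the episode: a global ocean surface area of about 361 million square kilometers and global freshwater withdrawals of about 4,000 cubic kilometers per year. On those assumptions, half an inch of ocean works out to roughly one year of global water use, not hundreds.

```python
# Back-of-envelope: volume of a half-inch layer of ocean vs. annual
# human freshwater withdrawals. Both inputs are assumed round figures.
OCEAN_AREA_KM2 = 361e6           # assumed global ocean surface area (~361 million km^2)
WITHDRAWALS_KM3_PER_YR = 4000.0  # assumed global freshwater withdrawals (~4,000 km^3/yr)

half_inch_km = 0.5 * 2.54 / 100 / 1000  # half an inch converted to kilometers

layer_volume_km3 = OCEAN_AREA_KM2 * half_inch_km          # volume of the half-inch layer
years_of_use = layer_volume_km3 / WITHDRAWALS_KM3_PER_YR  # equivalent years of withdrawals

print(f"Half an inch of ocean is about {layer_volume_km3:,.0f} km^3 of water.")
print(f"That is roughly {years_of_use:.1f} years of global freshwater withdrawals.")
# -> about 4,600 km^3, or roughly 1.1 years of withdrawals
```

Even if the hundreds-of-years figure overshoots, the direction of the conclusion may still hold, since most water withdrawn for drinking or agriculture eventually cycles back to the ocean rather than being removed for good.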
01:02:28.380 Oh, I have a recommendation for content.
01:02:33.920 If you're looking for a good TV show
01:02:35.820 and you haven't yet watched The Crown,
01:02:39.420 oh my God, that's good.
01:02:42.240 Why didn't you tell me about this?
01:02:43.540 I guess it's been around for years.
01:02:45.180 There's a new season that just came out.
01:02:47.500 But it's about Queen Elizabeth
01:02:51.320 going from childhood to her reign.
01:02:58.260 And the things I'm learning about Winston Churchill
01:03:01.100 and that whole relationship
01:03:04.440 and the little intrigues behind the scenes,
01:03:07.700 that alone would be worth it
01:03:09.040 for the historical part of it.
01:03:12.980 It's just wonderful.
01:03:14.160 But beyond that,
01:03:15.740 they bring you into a world
01:03:17.560 where you're living in 1939
01:03:19.420 and they really pull it off.
01:03:22.240 The quality of it.
01:03:24.040 And let me tell you that the...
01:03:25.980 I don't know if the casting director
01:03:29.540 won any kind of an Emmy.
01:03:32.440 But that is some of the best casting
01:03:34.300 I've ever seen.
01:03:35.840 The people they picked to play the parts
01:03:38.740 are jaw-droppingly right.
01:03:41.860 Just jaw-droppingly.
01:03:43.120 You're like,
01:03:43.940 that person is going to play that part?
01:03:46.480 That's perfect.
01:03:47.940 That's perfect.
01:03:49.420 Yeah, it's really well done.
01:03:51.640 So I'm completely hooked on it.
01:03:54.320 But I will warn you
01:03:56.960 that it's unwatchable for Americans
01:03:59.640 unless you have headphones on.
01:04:02.120 You know that, right?
01:04:02.920 If you're an American,
01:04:04.260 you can't even hear it.
01:04:06.820 Because let me do the typical line.
01:04:09.620 Well, jolly good.
01:04:19.220 That's all you can hear
01:04:20.440 through speakers.
01:04:21.760 So you've got to really have
01:04:23.660 the headphones on.
01:04:25.380 And then it's actually easy to hear.
01:04:27.560 With the headphones on,
01:04:28.460 it's very easy to hear.
01:04:29.900 But you cannot...
01:04:30.920 Your American brain
01:04:32.320 can't even make out the words
01:04:34.240 on speakers.
01:04:36.260 Does anybody have that problem?
01:04:38.700 If you don't put on the...
01:04:41.020 Either the...
01:04:42.440 What do you call it?
01:04:42.980 The closed caption.
01:04:44.220 Or you're not listening on headphones.
01:04:46.000 You cannot listen to any British publication.
01:04:49.880 I can't make out a word.
01:04:51.640 At all.
01:04:53.740 It's a well-known problem.
01:04:56.840 The actress who plays Diana
01:04:58.420 is six foot three.
01:04:59.760 Oh, that's interesting.
01:05:04.660 All right.
01:05:09.100 You have to turn up the volume.
01:05:10.480 Yes, blimey.
01:05:11.520 All right.
01:05:17.020 What if we had a Pinochet
01:05:18.600 who just took in
01:05:19.460 all the designated liars?
01:05:21.540 Well, that would be a serial killer
01:05:23.020 is what that would be.
01:05:25.040 Yeah, so it's called The Crown.
01:05:27.860 All right.
01:05:28.580 Anything else happen?
01:05:30.800 I'm going to give you
01:05:31.620 my AI prediction.
01:05:32.920 I don't think that
01:05:34.220 the AGI
01:05:37.060 or the super smart
01:05:38.820 version of AI
01:05:40.480 has been created.
01:05:41.320 I do not believe
01:05:43.260 that ChatGPT
01:05:44.740 has in their lab
01:05:46.760 a version of ChatGPT
01:05:49.120 that's like
01:05:49.820 the super intelligence.
01:05:51.560 They might have one
01:05:52.620 that can make videos
01:05:53.620 really well.
01:05:54.940 Or it can imitate
01:05:56.020 things really well.
01:05:57.240 So it could be dangerous
01:05:58.180 in lots of different ways.
01:06:00.080 It could be dangerous
01:06:00.780 in a variety of ways.
01:06:01.640 But I don't think
01:06:02.040 it's going to be intelligence.
01:06:03.320 And what I mean by that is
01:06:04.560 I doubt you could have
01:06:06.560 a conversation
01:06:07.340 with even the versions
01:06:08.720 that we haven't seen yet
01:06:09.860 in which
01:06:11.220 you feel like
01:06:12.800 you're learning something
01:06:13.760 and you're talking
01:06:14.820 to an intelligence.
01:06:17.020 I think you very quickly
01:06:18.960 realize you're talking
01:06:20.620 to a machine
01:06:21.320 and it's limited
01:06:22.740 and it's not that interesting
01:06:24.860 other than asking it
01:06:27.040 some questions now and then.
01:06:28.740 So I think that
01:06:30.180 AI is far overblown.
01:06:33.300 Not the danger.
01:06:34.960 The danger is real.
01:06:35.960 But what it feels like
01:06:38.640 in terms of intelligence
01:06:39.840 is probably overblown.
01:06:45.080 All right.
01:06:45.720 That's all I've got for now.
01:06:47.400 YouTube,
01:06:48.040 thanks for joining.
01:06:49.540 See you tomorrow.
01:06:50.880 I hope it's not all
01:06:51.760 slow news
01:06:52.400 until the end of the year.
01:06:53.440 Although maybe it would be good
01:06:54.480 if it were.
01:06:55.560 I'll see you tomorrow.
01:06:56.320 Bye.