Real Coffee with Scott Adams - November 01, 2023


Episode 2279 Scott Adams: CWSA 11/01/23, Solving US Debt Problem And More


Episode Stats

Length

1 hour and 16 minutes

Words per Minute

147.7

Word Count

11,235

Sentence Count

809

Misogynist Sentences

10

Hate Speech Sentences

25


Summary

A bunch of optimistic news about the future of the world, including a new invention that could revolutionize the way we live, a new way to get a good night's rest, and other things that are going to change the world.


Transcript

00:00:00.000 Bom-bom-bom-bom, ra-ba-bop-bom, la-da-da-da-da-da-da, ra-ba-bo.
00:00:10.700 Good morning, everybody, and welcome to the highlight of human civilization, be that as it may.
00:00:17.780 If you'd like your experience today to go up to levels which nobody could even imagine with the best imaginations in the world,
00:00:25.480 well, then all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, jug or flask, a vessel of any kind,
00:00:32.780 fill it with your favorite liquid. I like coffee. And join me now for the unparalleled pleasure,
00:00:38.540 the dopamine at the end of the day, the thing that makes everything better.
00:00:42.880 It's called the simultaneous sip and a half a snack, go.
00:00:48.980 Oh, my God.
00:00:50.700 I don't know if any of you have ever tried this.
00:00:52.860 It's something, I don't know when it was the last time I ever tried it, but I tried it yesterday, and it's amazing.
00:01:04.060 I got almost enough sleep.
00:01:07.340 Has anybody ever done that?
00:01:09.540 I mean, it wasn't like a ton, probably six hours, but six hours is like, you know, that's like nine to me.
00:01:16.600 So I'm like raring to go, let me at it.
00:01:20.940 My God, you should try sleeping.
00:01:22.860 I really recommend it.
00:01:25.240 I'll tell you my two biggest sleeping tricks.
00:01:28.640 Number one, if I exercise that day, perfect correlation.
00:01:34.840 One to one.
00:01:36.200 Exercise, good sleep.
00:01:38.380 Period.
00:01:39.780 No exceptions.
00:01:41.300 Never once, not once in my life, has that not worked.
00:01:44.640 Exercise equals good sleep.
00:01:46.320 So you just got to get to sleep on time.
00:01:49.240 That basically, if you've exercised that day, there's nothing except getting to bed on time.
00:01:54.520 That's it.
00:01:55.280 And then you've solved the biggest problem in your life.
00:01:57.780 Right there.
00:01:59.180 Yeah, you can take your melatonin.
00:02:00.820 You can do all those things.
00:02:02.180 But I guarantee you, if you exercise that day, you're going to fall asleep.
00:02:06.780 You don't need any of that stuff.
00:02:09.020 All right.
00:02:10.000 Here's some, would you like some optimistic news?
00:02:13.440 Because it looks like the world's falling apart.
00:02:15.060 Today, I'm going to give you the optimistic news.
00:02:19.640 A whole bunch of it.
00:02:20.800 Just a whole bunch of optimistic news.
00:02:23.740 And of course, a number of stories about where I'm right about everything.
00:02:27.000 You know, the usual.
00:02:29.280 Well, MIT has come up with a discovery that they can make water evaporate far more efficiently
00:02:37.440 with light than heat.
00:02:43.020 Does that sound like a big deal?
00:02:45.060 It doesn't, really.
00:02:47.040 It's not a big deal.
00:02:48.740 But you know that apparently the light without any heat, under only the ideal conditions,
00:02:54.880 has to be some kind of edge condition that they can create, the light without heat will
00:03:01.340 evaporate water far more efficiently, like way more efficiently, than heat.
00:03:08.100 Do you know what that means?
00:03:09.780 It could mean that desalinization just became cheap.
00:03:16.660 It could go down by a factor of hundreds of percents.
00:03:21.300 And that changes a lot.
00:03:25.340 Because imagine if you could have water anywhere.
00:03:30.920 What's the biggest limiter to where people can live and how they can grow food?
00:03:38.080 Probably water as much as anything, right?
00:03:40.140 Climate, of course, number one.
00:03:41.980 But water is going to be the big one.
00:03:43.400 So imagine if there's a technology in five years where pretty much anybody can live off grid
00:03:51.160 and just have a little device and it makes all the water they want.
00:03:56.760 Now, there have been a whole bunch of breakthroughs in producing water out of the atmosphere.
00:04:00.700 So between the atmosphere that they can now suck the water out of, that already exists,
00:04:07.120 and then this thing, which might be able to desalinate at scale,
00:04:11.740 it could be that our real estate options just increase by 100, right?
00:04:16.620 There's just more places you could live without having a city bring a pipe to you.
00:04:22.620 So that's one thing.
00:04:24.480 Here's another one that is just, this is just so perfect.
00:04:30.700 You know, this is one of those things that makes you think the simulation is real
00:04:34.440 and that the life we're living is at least partly scripted like a sitcom.
00:04:40.220 You know, there'll be little things that happen that you say,
00:04:41.980 how did that happen by coincidence?
00:04:44.340 That had to be in a script somewhere.
00:04:46.120 There's no way this happens by chance.
00:04:48.420 All right, here's another one.
00:04:51.040 Big improvement in steam engines.
00:04:55.360 You say to yourself, Scott, why do we need improvement in steam engines?
00:05:00.180 Are you going to be running your car on a steam engine, Scott?
00:05:02.960 No, no, no, no.
00:05:04.800 Turns out the steam engines have been used for, I don't know, how many decades to produce electricity.
00:05:12.000 So if you've got some big source of energy, like a falling waterfall,
00:05:17.680 it could drive a generator which could drive steam engines.
00:05:22.060 I don't know.
00:05:23.220 Now that's the wrong example.
00:05:24.560 I think it's nuclear, right?
00:05:25.480 Is it nuclear that drives the steam engines?
00:05:29.100 Yeah, nuclear power steam.
00:05:30.800 Anyway, so if you want to make a lot of steam to make energy from nuclear power,
00:05:36.740 the way we've done it traditionally is with some kind of steam engines
00:05:41.720 in power plants worldwide.
00:05:46.660 But it turns out that technology is from the 19th century.
00:05:49.600 We're using steam turbines from the 19th century.
00:05:55.400 So they found a way to run the steam turbines without water.
00:06:01.140 What would you guess is the replacement for the water that gives it a huge increase in energy production?
00:06:10.160 CO2.
00:06:12.400 Liquefied CO2.
00:06:13.700 I guess you could put CO2 under pressure and liquefy it.
00:06:18.200 And it makes your steam engine, or some modified steam engine, way, way more efficient.
00:06:24.180 It looks like it's all doable.
00:06:25.840 Like there's nothing that would stop them from doing it.
00:06:28.680 So here's something that could have a huge impact someday on our energy costs.
00:06:35.120 At the same time, desalinization could have a huge impact.
00:06:38.880 Now, if you've got cheap water, or at least ubiquitous, and you've got way cheaper energy,
00:06:46.860 that's a good world, isn't it?
00:06:50.460 Because then all you need is robots and AI, and those are both coming.
00:06:56.120 More on that later.
00:06:58.360 All right.
00:06:59.280 You probably saw that Joe Rogan talked to Elon Musk and made about 10 pieces of news.
00:07:04.100 You know, one of them is that Joe Rogan shot his bow and arrow at the side of a Cybertruck,
00:07:10.220 and it did not penetrate it.
00:07:12.380 But I'm told, if it had been a normal car, that that bow, from the distance, it was pretty close,
00:07:20.620 would have gone right through a door of a normal car, which is kind of impressive.
00:07:25.880 I had no idea that an arrow could be that powerful,
00:07:28.300 but apparently it's as powerful as a gun at a short distance.
00:07:32.020 Now, the interesting thing is that the glass windows are not bulletproof.
00:07:40.260 So, as Elon says, you're going to have to duck.
00:07:44.740 If somebody starts shooting at your car, you're still going to have to duck,
00:07:48.500 because it will break the window.
00:07:50.420 Now, they could have bulletproof glass,
00:07:53.760 but it would be so thick that Elon says it's too hard to make it go up and down.
00:07:58.960 So that's not an option, really.
00:08:00.560 I mean, it is an option if you don't want to ever put your windows down.
00:08:04.920 But the most interesting thing, in my opinion, that Musk said was about George Soros.
00:08:11.660 And he speculated that George Soros had a troubling childhood,
00:08:16.180 and that might contribute to the fact that, at least by his actions, says Musk,
00:08:22.800 it would appear that he hates humanity because his actions are so anti-human
00:08:29.580 that maybe it has something to do with his upbringing.
00:08:34.460 And to which I say, I like the line of thinking,
00:08:41.160 but there's a problem with it, which may be a solvable problem.
00:08:45.700 But why is his son exactly on the same page?
00:08:49.720 Because his son would have been raised in a privileged upbringing.
00:08:53.500 So if somebody's bad childhood is what makes them hate humanity,
00:08:58.340 wouldn't, is it Alex Soros, the guy who's in charge at the moment,
00:09:02.620 the son of George Soros,
00:09:04.640 wouldn't the son have had more like an ideal upbringing
00:09:08.160 and have no interest in destroying humanity?
00:09:11.440 He looks like he's having a good time.
00:09:13.060 So whatever the truth is here, and this might be right.
00:09:20.060 I mean, Musk might be exactly right.
00:09:21.780 He just, he's a broken man with a hatred of humanity.
00:09:26.240 But I usually think that's not the case.
00:09:31.860 My best, I've told, and by the way,
00:09:35.080 Musk also says he thinks Soros is senile.
00:09:38.200 So I think that's the better explanation.
00:09:43.060 I see one of, I see a couple of different possible explanations
00:09:46.780 for what appears to be anti-humanity actions.
00:09:52.080 The example would be funding prosecutors
00:09:54.560 who don't want to prosecute crime.
00:09:57.000 And my first, my first, this would be just a speculation.
00:10:03.480 Or it would be a theory, I guess.
00:10:07.380 So it's not based on any facts.
00:10:09.520 There could be some intelligence,
00:10:12.120 some country that has an intelligence group
00:10:15.100 that controls Soros.
00:10:17.620 So maybe he's just controlled by some foreign entity
00:10:20.580 that wants the United States to be destroyed,
00:10:23.740 in which case he's just doing what he has to do
00:10:26.320 for whatever reason,
00:10:27.480 to stay alive or to stay rich.
00:10:29.980 Maybe they have some blackmail on him.
00:10:33.260 I don't know.
00:10:33.900 But he acts like somebody who's controlled by an enemy
00:10:36.960 because what he does is so dangerous to America.
00:10:41.080 That's one possibility.
00:10:42.760 The other possibility is that
00:10:44.540 he doesn't exactly know what he's doing
00:10:48.120 and that he has a sprawling empire
00:10:50.860 and he hasn't been paying attention.
00:10:52.160 And when people talk to him,
00:10:55.400 when people talk to him,
00:10:57.080 they say some version of,
00:10:58.840 everything's great,
00:11:00.120 your money is going to these great causes,
00:11:01.960 you're saving the world.
00:11:03.680 And then he doesn't read maybe newspapers
00:11:05.920 that have bad articles about him.
00:11:08.280 So maybe he just never sees the other argument
00:11:10.280 and he's told by his people
00:11:12.160 who are receiving his money
00:11:13.520 and want to receive more of it,
00:11:16.160 hey, you're doing great.
00:11:17.600 Do some more of this.
00:11:18.820 And then he gets lots of pats on the back
00:11:21.460 and he feels like he's really helping the world.
00:11:24.380 But maybe he lives such a small world at the moment
00:11:28.400 because he's a certain age
00:11:29.520 that he just kind of doesn't know what's going on.
00:11:32.920 It seems like it.
00:11:34.220 It seems to me like he's not aware of it.
00:11:37.240 Now, that would also explain
00:11:38.460 why his son would not be able to change the boat
00:11:41.940 because the dad presumably still has his full authority
00:11:46.120 to fire his son.
00:11:48.820 It could be that the son's just waiting for the dad to die
00:11:52.900 and then he's going to reverse
00:11:55.300 some of the worst parts of the father's philosophy.
00:11:59.540 But Alex might know.
00:12:01.440 Again, this is just speculation.
00:12:02.920 I have no way of knowing who's thinking what.
00:12:06.740 But there's either something about blackmail,
00:12:10.300 something about senility,
00:12:12.420 something about what information he receives
00:12:14.600 versus what you receive,
00:12:16.940 something like that.
00:12:18.820 But I don't quite buy the bad childhood explanation
00:12:22.280 because it would seem just as likely
00:12:26.580 that the bad childhood would want him to be a celebrity
00:12:30.680 and do the right things.
00:12:33.460 Unless he thinks he is,
00:12:35.300 which would be the information problem.
00:12:38.300 So we don't know.
00:12:39.260 But it was an interesting speculation
00:12:40.720 and I would love to see somebody set his son down
00:12:44.640 and ask the following questions.
00:12:47.940 What's going on?
00:12:48.880 It looks like your father is trying to destroy civilization.
00:12:52.860 How can you explain it?
00:12:55.140 I'd love to see that interview.
00:12:56.600 That's never going to happen.
00:12:57.440 All right.
00:13:00.000 As Musk pointed out,
00:13:01.800 the two biggest donors to Democrats were
00:13:04.360 George Soros,
00:13:06.620 who arguably looks like he's trying to destroy civilization.
00:13:12.080 No joke.
00:13:13.640 That's like a serious conversation by serious people
00:13:17.460 that he may want to destroy civilization.
00:13:20.240 That's their biggest donor.
00:13:23.180 Now, I don't know that he wants to destroy civilization,
00:13:25.940 but it's a serious question.
00:13:28.580 That's pretty scary.
00:13:31.620 Second biggest donor was Sam Bankman-Fried,
00:13:35.840 a con man, basically.
00:13:38.540 So those are their two biggest donors.
00:13:43.180 So if you're wondering,
00:13:44.640 hey, why is everything fucked up?
00:13:48.100 Well, maybe it's because the two biggest donors are,
00:13:52.640 one seems to be an evil genius crook
00:13:55.180 and the other one seems to be
00:13:57.200 a demented evil genius billionaire.
00:14:02.220 Maybe if you follow the money,
00:14:04.080 it gets you to exactly where we are,
00:14:05.980 which is a completely corrupted government
00:14:08.520 that's of no use to the people.
00:14:12.520 All right.
00:14:13.060 Some new science in the category of,
00:14:16.560 hey, you should have just asked Scott.
00:14:17.960 You ought to save some time.
00:14:20.780 Turns out, two separate studies,
00:14:22.520 one on gratitude and one on empathy.
00:14:25.360 So I guess empathy could reduce depression.
00:14:28.260 That's what one study says.
00:14:30.280 So thinking about other people
00:14:31.980 and caring about them
00:14:34.080 could make you less depressed.
00:14:37.220 Do you know who else knew that?
00:14:40.400 I did.
00:14:42.540 Didn't you all know that?
00:14:44.660 I thought it was sort of obvious.
00:14:48.120 I've always, in fact,
00:14:49.820 I may have told you this in the past,
00:14:52.400 but for,
00:14:54.300 I guess, all of my Dilbert-y career,
00:14:57.140 when I was getting more
00:14:59.580 than I needed for myself,
00:15:01.420 whenever I had a bad day,
00:15:03.340 I would just randomly do something
00:15:05.540 awesomely good for somebody else.
00:15:08.540 And I always cheered up.
00:15:11.120 A hundred percent of the time it worked.
00:15:13.300 I just picked somebody and say,
00:15:14.720 you know what?
00:15:15.500 Today's your lucky day.
00:15:16.920 And just made somebody happy.
00:15:18.840 Like in some substantial way.
00:15:21.140 And, oh my God, it feels good.
00:15:23.420 Yeah.
00:15:23.540 So, they could have saved a lot of money
00:15:25.740 in this study.
00:15:26.500 Just ask me.
00:15:28.200 I've been doing it for years.
00:15:29.540 It works every time.
00:15:30.600 And I've recommended it to people.
00:15:32.420 And when they try it,
00:15:33.400 it works every time.
00:15:35.260 Right?
00:15:35.540 You really didn't need to study that.
00:15:37.300 Here's another one.
00:15:39.160 Gratitude can make you healthier.
00:15:41.940 Again,
00:15:43.480 ask me.
00:15:44.720 I could have told you that.
00:15:46.620 In fact,
00:15:47.220 most self-help gurus
00:15:49.160 could have told you that.
00:15:50.720 That gratitude is a very healthy,
00:15:52.720 mindset.
00:15:54.520 It's going to make everything work better.
00:15:56.760 And it does.
00:15:58.500 So,
00:15:59.000 what do gratitude and empathy
00:16:01.640 have in common?
00:16:05.100 Gratitude and empathy.
00:16:06.940 It's about getting you out of your own head
00:16:09.440 and making you think about
00:16:11.200 the well-being of other people
00:16:12.580 and the value they have in your life.
00:16:15.040 The gratitude part.
00:16:16.300 So, getting out of your own head
00:16:18.060 is the trick for not being depressed.
00:16:21.480 Now,
00:16:22.340 why is there a gigantic
00:16:23.420 mental health problem
00:16:24.680 in the United States
00:16:25.400 at the moment?
00:16:26.800 Well,
00:16:27.140 I would say,
00:16:27.840 again,
00:16:28.280 you don't need to do
00:16:29.380 a big expensive study.
00:16:31.220 Let me explain it to you.
00:16:33.760 It's your screens.
00:16:36.280 When I was a child,
00:16:38.280 human beings were human beings.
00:16:40.800 And if I watched television,
00:16:42.960 you know,
00:16:43.240 that was just television.
00:16:44.540 That wasn't really my life.
00:16:46.040 It was just a thing.
00:16:46.840 I did something.
00:16:47.320 But if your life
00:16:48.760 is mostly online,
00:16:50.320 your friends are online,
00:16:52.240 everybody's online,
00:16:53.360 are those people real?
00:16:55.800 Kind of.
00:16:57.360 I mean,
00:16:57.940 you know that they're real people.
00:17:00.120 But do they have
00:17:00.780 the same effect on you
00:17:01.980 in terms of,
00:17:03.240 let's say,
00:17:03.720 gratitude and empathy
00:17:04.840 as somebody in the real world?
00:17:08.020 My guess is no.
00:17:09.920 My guess is that
00:17:10.820 our social life
00:17:11.740 has turned into
00:17:12.940 a social life
00:17:13.800 in which gratitude
00:17:14.780 and empathy
00:17:15.460 are hard to express
00:17:17.120 because you're just
00:17:18.660 people on screens.
00:17:20.100 The people who are
00:17:21.000 in the room with you,
00:17:22.460 you can have gratitude
00:17:23.420 and empathy
00:17:23.960 just naturally and easily.
00:17:25.860 So it could be
00:17:26.600 something about
00:17:27.340 how much we see
00:17:28.080 people in person.
00:17:29.900 That makes sense.
00:17:31.140 And we know that
00:17:31.980 loneliness, of course,
00:17:33.400 works against
00:17:34.220 your happiness.
00:17:35.580 So a lot of it
00:17:36.480 is just very obviously
00:17:38.200 screens.
00:17:39.060 You can see that
00:17:39.680 the graph of mental illness
00:17:41.220 spikes exactly
00:17:42.960 when smartphones
00:17:43.660 came into the market.
00:17:45.420 It's pretty obvious
00:17:46.300 at this point.
00:17:47.360 That's what it is.
00:17:48.800 But more generally,
00:17:50.060 having a good attitude
00:17:51.060 about life
00:17:51.820 is good for your health
00:17:53.100 and your mental well-being.
00:17:55.780 So no surprises there.
00:17:58.540 Well, it's almost
00:18:00.000 one year ago.
00:18:01.400 God, it seems longer
00:18:02.500 since Elon Musk
00:18:05.060 bought the Twitter platform
00:18:06.640 and now turned it into X.
00:18:09.600 Think about what people
00:18:11.100 said when he bought it.
00:18:13.820 It's the last day
00:18:14.980 of Twitter.
00:18:15.660 Twitter's going to fail.
00:18:17.140 He's destroyed
00:18:17.900 this valuable asset.
00:18:20.160 What's the latest update?
00:18:22.260 The latest update
00:18:23.420 is it might be
00:18:24.500 cash positive
00:18:25.340 in 2024.
00:18:28.460 And the engagement
00:18:29.680 is higher than ever.
00:18:30.860 And he's driven out
00:18:34.020 the government
00:18:35.240 oversight
00:18:36.400 on the platform
00:18:37.920 which was essentially,
00:18:39.260 not essentially,
00:18:40.180 it was directly
00:18:40.880 government
00:18:42.120 blocking your free speech.
00:18:45.280 And when Musk
00:18:46.500 talks casually
00:18:47.620 about how bad
00:18:49.140 Twitter was
00:18:49.740 when he bought it
00:18:50.580 and that basically
00:18:52.480 it was a government organ
00:18:53.620 and he says
00:18:55.120 that the other
00:18:55.780 entities,
00:18:56.900 at least the social media,
00:18:58.500 he says
00:18:58.840 are just government organs.
00:19:00.860 They can only do
00:19:01.820 what the government
00:19:02.420 tells them they can do.
00:19:03.920 There's no free speech there.
00:19:05.800 So he claims,
00:19:07.340 and I think this is true,
00:19:09.060 that the only place
00:19:10.180 free speech exists
00:19:11.300 is on X.
00:19:13.720 I think that's
00:19:14.760 exactly true.
00:19:16.400 The only place
00:19:17.600 free speech exists
00:19:18.580 is on X.
00:19:20.520 And, you know,
00:19:20.940 and that's limited by
00:19:21.940 you can't go too far,
00:19:23.360 but I don't think
00:19:24.380 that's hurting us.
00:19:26.460 All right.
00:19:26.840 So that's kind of amazing
00:19:30.120 and I like to
00:19:31.440 take a little bit
00:19:32.460 of every day
00:19:33.080 to just thank
00:19:33.880 Elon Musk
00:19:34.560 for what I believe
00:19:36.320 was primarily
00:19:37.380 an unselfish act.
00:19:40.940 You know,
00:19:41.460 you can always
00:19:42.000 make some argument,
00:19:42.840 well, you know,
00:19:43.600 you can always say
00:19:44.280 with rich people,
00:19:45.760 well,
00:19:46.060 it's for their ego
00:19:46.880 or whatever.
00:19:48.600 I don't know.
00:19:49.160 He's a different
00:19:49.620 kind of cat.
00:19:50.920 It does look like
00:19:51.900 he did this for the world
00:19:53.020 and it worked.
00:19:55.420 And I do believe
00:19:57.240 that, well,
00:19:58.500 you know,
00:19:58.660 history is strange
00:19:59.580 because the winners
00:20:00.380 write it and all that.
00:20:02.420 But I think
00:20:03.700 this is one of the
00:20:05.860 biggest events
00:20:06.560 in American history.
00:20:09.260 Musk
00:20:09.780 getting the government
00:20:11.340 out of X.
00:20:13.140 To me,
00:20:13.740 this is on a level
00:20:15.520 with the Boston Tea Party.
00:20:16.840 You know,
00:20:18.520 it's right up there
00:20:19.460 with the
00:20:20.100 most important things
00:20:22.020 that have ever happened
00:20:22.800 in the Republic.
00:20:24.960 So,
00:20:25.900 thank you.
00:20:27.100 Thank you,
00:20:27.600 Elon Musk.
00:20:28.800 He had an interesting
00:20:29.840 explanation of why
00:20:31.520 Twitter became
00:20:32.640 a toilet.
00:20:35.000 And he says
00:20:36.000 it's geographical.
00:20:38.460 That San Francisco
00:20:39.880 has always been
00:20:40.860 sort of an oddball
00:20:42.100 culture.
00:20:45.040 But usually
00:20:45.840 it was geographically
00:20:47.100 constrained.
00:20:48.420 You could go
00:20:48.960 to San Francisco
00:20:49.700 and there would be
00:20:50.960 some weird people there.
00:20:52.020 And then you would
00:20:53.040 go home.
00:20:54.560 And it didn't bother
00:20:55.680 anybody.
00:20:56.700 They could be weird
00:20:57.640 in San Francisco
00:20:58.440 and you could go
00:20:59.360 to your house
00:20:59.840 and not have any of it.
00:21:02.680 But once
00:21:03.440 Twitter
00:21:04.320 started hiring
00:21:06.200 locally,
00:21:06.900 which
00:21:07.140 everybody does,
00:21:09.200 they turned
00:21:10.640 this niche
00:21:11.380 or niche
00:21:12.160 philosophy
00:21:13.460 into a worldwide
00:21:14.920 phenomenon
00:21:15.600 through their control
00:21:16.820 of the algorithm.
00:21:19.260 And that's where
00:21:20.200 it all went to hell.
00:21:21.620 So in the old days
00:21:22.460 they would have
00:21:23.920 just stayed localized
00:21:24.780 and nobody
00:21:25.680 would have known
00:21:26.080 the difference.
00:21:26.960 It's a fascinating
00:21:27.900 way to explain it.
00:21:30.980 All these things
00:21:31.780 could have ten
00:21:32.320 different explanations
00:21:33.200 depending on
00:21:34.060 what frame
00:21:35.420 you decide to look at.
00:21:36.880 But that's a really
00:21:37.540 interesting frame
00:21:38.340 that technology
00:21:40.580 allowed pockets
00:21:42.480 of craziness
00:21:43.420 in this weird way
00:21:44.980 just because
00:21:45.560 the company
00:21:46.040 was centered there
00:21:46.780 to become worldwide.
00:21:49.320 It's a good explanation.
00:21:55.080 Let's see.
00:21:57.380 Here's news
00:21:58.120 about TikTok.
00:21:59.680 Apparently
00:22:00.120 Republicans
00:22:00.780 are renewing
00:22:01.640 calls for a TikTok
00:22:02.680 ban
00:22:03.240 because it turns out
00:22:04.620 that, oh,
00:22:05.740 didn't see this coming.
00:22:07.740 Oh, total surprise.
00:22:09.340 Who could have
00:22:10.080 ever seen this coming?
00:22:11.140 That as soon
00:22:12.400 as something
00:22:12.840 is really important
00:22:13.780 let's say
00:22:15.020 the Gaza situation
00:22:16.300 that TikTok
00:22:17.840 was
00:22:19.020 I guess
00:22:20.700 this study
00:22:21.140 showed
00:22:21.500 that the
00:22:22.440 pro-Israel
00:22:23.640 messages
00:22:24.480 were about
00:22:25.760 one-tenth
00:22:26.740 of the scale
00:22:28.220 of the pro
00:22:29.000 or the anti-Israel
00:22:31.000 messages.
00:22:32.240 It was
00:22:32.860 enormously
00:22:33.620 unbalanced
00:22:34.540 against Israel
00:22:35.460 which is
00:22:36.160 against the
00:22:36.680 United States
00:22:37.340 in this
00:22:38.300 particular case.
00:22:39.160 enormously
00:22:40.780 enormously.
00:22:42.080 Now, let me ask you.
00:22:43.240 Do you think
00:22:43.600 that happened
00:22:44.020 on its own?
00:22:46.520 Do you think
00:22:47.120 the TikTok
00:22:47.780 owning
00:22:48.360 Chinese people
00:22:49.780 said,
00:22:50.680 you know,
00:22:50.980 let's just let
00:22:51.520 this one play out?
00:22:53.140 Oh, wow.
00:22:53.860 Surprise.
00:22:54.740 Lots of different
00:22:55.940 people like
00:22:56.580 the Palestinian side.
00:23:00.220 No.
00:23:01.020 Ladies and gentlemen,
00:23:02.080 it's exactly
00:23:02.720 what I've been
00:23:03.340 warning Congress
00:23:04.540 about for five years.
00:23:06.820 For five years
00:23:07.720 I've been saying,
00:23:08.880 stop talking
00:23:09.620 about the
00:23:10.060 data security
00:23:10.840 problem.
00:23:11.860 That is a
00:23:12.440 diversion.
00:23:13.160 It might also
00:23:13.680 be a problem,
00:23:14.800 but it's a
00:23:15.280 diversion.
00:23:16.440 The real problem
00:23:17.180 is the heat
00:23:17.780 button that
00:23:19.140 China,
00:23:19.980 or TikTok,
00:23:21.100 owned by China,
00:23:22.260 can push one
00:23:23.240 button that is
00:23:24.060 literally labeled
00:23:25.400 heat,
00:23:26.800 which they admit
00:23:27.600 they have,
00:23:28.860 that can boost
00:23:29.640 any message.
00:23:31.200 You don't think
00:23:31.840 they pushed
00:23:32.240 that button?
00:23:34.800 Do you think
00:23:35.560 that China sat
00:23:36.220 there and thought
00:23:36.760 to themselves,
00:23:37.100 you know,
00:23:38.520 if we did
00:23:39.060 push this
00:23:39.560 button,
00:23:40.540 it would be
00:23:41.080 really,
00:23:41.580 really bad
00:23:42.180 for America?
00:23:44.200 And they
00:23:44.800 probably wouldn't
00:23:45.480 do anything
00:23:45.880 about it,
00:23:47.460 and it would
00:23:48.300 be perfect
00:23:49.080 crime because
00:23:50.200 nobody knows
00:23:51.260 you pushed
00:23:51.660 the button.
00:23:53.300 And then you
00:23:54.020 think that they
00:23:54.520 had that
00:23:54.900 conversation with
00:23:55.660 themselves and
00:23:56.760 decided to not
00:23:57.740 push the button.
00:23:59.680 Is that what
00:24:00.100 you think?
00:24:00.760 Do you think
00:24:01.200 they didn't
00:24:01.620 push the button
00:24:02.280 because, oh,
00:24:02.940 that would be
00:24:03.360 biasing things?
00:24:04.700 No.
00:24:05.880 No.
00:24:06.540 It's exactly
00:24:07.460 what I told
00:24:08.200 you would
00:24:08.520 happen.
00:24:09.180 If you
00:24:09.720 allow TikTok,
00:24:11.420 it is the
00:24:12.100 user interface
00:24:13.100 to young minds
00:24:15.300 in America
00:24:15.860 and other
00:24:16.440 places.
00:24:17.520 But the
00:24:17.900 American minds,
00:24:18.880 the young
00:24:19.200 ones under,
00:24:19.980 you know,
00:24:20.440 some age,
00:24:21.400 are completely
00:24:22.380 controlled by
00:24:23.080 TikTok.
00:24:24.580 And so
00:24:25.040 China built
00:24:26.300 a user interface
00:24:27.400 for our young
00:24:28.540 people's brains,
00:24:29.580 you know,
00:24:29.900 the future of
00:24:30.600 the world,
00:24:31.300 future of our
00:24:31.900 country anyway.
00:24:32.440 And then
00:24:33.460 they put a
00:24:34.840 button on
00:24:35.240 there and
00:24:36.100 they even
00:24:36.500 told us,
00:24:37.640 we have a
00:24:38.100 button that's
00:24:38.920 the user interface
00:24:39.780 to all the
00:24:40.360 young people's
00:24:40.960 brains in
00:24:41.480 America.
00:24:42.300 But, you
00:24:42.840 know,
00:24:42.940 we don't like
00:24:43.380 to push it
00:24:43.940 for any bad
00:24:44.480 reasons.
00:24:45.280 Oh, oh,
00:24:46.220 fine.
00:24:47.400 Thank goodness
00:24:48.140 you have that
00:24:49.340 button that you
00:24:49.860 could simply
00:24:50.360 push and it
00:24:51.500 would change
00:24:52.180 the world
00:24:53.360 in a way
00:24:54.940 that's good
00:24:55.340 for you and
00:24:55.820 bad for your
00:24:56.320 enemies.
00:24:57.260 But you're not
00:24:58.020 going to push
00:24:58.420 it, right?
00:25:00.100 Do you promise?
00:25:01.740 Pinky swear.
00:25:03.020 Don't push
00:25:03.540 it.
00:25:05.340 That's,
00:25:05.740 amazingly,
00:25:07.040 that's the
00:25:07.380 situation we're
00:25:08.100 in.
00:25:08.840 And now
00:25:09.220 there's some
00:25:09.660 Republicans
00:25:10.100 saying,
00:25:11.020 hey,
00:25:11.960 hey,
00:25:12.800 I wonder,
00:25:14.000 I wonder if
00:25:14.840 this is a
00:25:15.320 problem.
00:25:16.420 Oh,
00:25:16.720 I wonder.
00:25:19.640 Again,
00:25:20.520 may I say,
00:25:22.840 they should
00:25:23.580 have asked
00:25:23.960 Scott for
00:25:25.960 pretty much
00:25:26.600 all of the
00:25:27.140 science and
00:25:28.620 politics.
00:25:29.700 You should
00:25:30.020 have just
00:25:30.280 asked me.
00:25:30.740 I could
00:25:31.760 have told
00:25:32.060 you this
00:25:32.360 was coming.
00:25:33.980 And it's
00:25:34.260 not,
00:25:34.900 and I also
00:25:35.580 make a further
00:25:36.180 prediction that
00:25:37.820 TikTok has
00:25:38.660 sufficient control
00:25:39.840 of our
00:25:40.200 Congress and
00:25:41.520 sufficient control
00:25:42.660 of the
00:25:42.980 Democrats and
00:25:44.180 sufficient control
00:25:45.120 of our news
00:25:45.840 sources that
00:25:47.240 this won't
00:25:47.780 change.
00:25:49.040 My prediction
00:25:49.940 is Congress
00:25:51.320 will not ban
00:25:52.960 TikTok because
00:25:54.260 TikTok already
00:25:55.020 owns Congress.
00:25:55.760 They lost.
00:25:57.780 That's what it
00:25:58.340 looks like to
00:25:58.800 me.
00:25:59.560 To me,
00:25:59.860 it looks like
00:26:00.300 it was a
00:26:01.160 battle and
00:26:01.820 Congress lost.
00:26:03.580 And there's
00:26:03.960 no way they
00:26:04.360 can reverse
00:26:04.840 it because
00:26:05.980 the tentacles
00:26:08.680 are already
00:26:09.280 around them.
00:26:12.440 All right.
00:26:15.620 I controlled myself, and I can't believe I did it. But when Matthew Perry tragically died in his hot tub, it took every bit of my self-control not to tweet the word fentanyl.
00:26:31.400 I mean, I had to just pry my phone out of my own hand, because the obvious thing that makes you die suddenly if you're an addict is you got some fentanyl that you didn't know you were getting.
00:26:46.440 But it turns out he had none in his system. There was no fentanyl in his system.
00:26:50.680 So I'm chastising my imaginary self that couldn't control itself and said it when it would have been really inappropriate and tacky to say it. But God, I wanted to. So that's bad on me.
00:27:07.100 But so we don't know. If I had to guess, well, I won't guess. I guess I'll let the process play out. So I guess there's still a little bit of mystery about what it might have been.
00:27:19.480 Over in Switzerland, they've built what they call suicide pods. It's like a coffin-sized pod with a glass top.
00:27:32.420 And you can get in this if you're ready for assisted suicide. And they fill it with nitrogen gas, which rapidly lowers oxygen levels, causing its user to die.
00:27:44.620 I assume this is painless. Otherwise, it wouldn't exist.
00:27:50.680 Now, of course, the opponents say, my God, this will be abused.
00:27:59.260 Would you agree that it will be abused? And it won't just be people who are in desperate end-of-life situations, but that it would naturally slide down to people who are just having a bad time in life.
00:28:12.760 Yeah, probably. So, I'll tell you, I have very mixed feelings about it.
00:28:18.640 Number one, for 30 years, I have imagined the invention of this product, pretty much exactly the way it exists, except I didn't know what gas would be involved.
00:28:31.060 Although, I thought it would be used for murder. I thought it would be more of a murder device.
00:28:37.000 Here, we'll tie your hands and put you in this pod, and next thing you know, you'll be dead.
00:28:41.700 And so, I have mixed feelings about it because, on one hand, it looks like it will definitely be abused. No doubt about it.
00:28:52.680 But you know what else is abused? Everything. Food is abused.
00:28:59.560 Like, the most innocent thing in the world is our food. We all abuse our food until we're, like, dying from eating too much food. So, there's nothing we don't abuse.
00:29:09.340 So, on one hand, you could say, hey, it's going to lead to all these abuses.
00:29:15.660 On the other hand, you could say, well, if it's a free country, and at the moment this is only in Switzerland, but if it were here, I can easily say, yeah, it's totally going to get abused.
00:29:27.260 So are guns, so is food, so is alcohol, so are cigarettes. All legal, all abused.
00:29:34.340 So, the fact that something is abused to the point of death is not a stopper. I mean, if we were going to be objective about it, it wouldn't be a stopper for any product.
00:29:45.180 So, would it be abused? Yes. Especially, you know, if things get worse in the future, economically, et cetera.
00:29:54.300 But, on the other hand, I really, really want this option for myself.
00:29:58.960 Although, I don't like the claustrophobia right before you die of being in a little casket thing. I don't like that part.
00:30:05.460 But, I'd certainly like the option to be able to go out on my own terms. So, I've got mixed feelings about that.
00:30:15.380 Here's another one where you should have asked Scott. Climate scientist Roger Pielke Jr., who has long been on the side of, let's say, skepticism.
00:30:29.500 So, skepticism, and I don't believe he's saying that there's no connection between CO2 and warming. I don't think he's that kind of skeptic. He's a skeptic on the models.
00:30:43.020 So, that's the part I know he's a skeptic about, so I have less knowledge about what else he might be skeptical about.
00:30:49.940 But, I guess he gave a speech or something recently, and he talked about all the problems with the models, but here's the one that makes the most difference.
00:31:02.300 So, apparently there are several models that people could use to do their additional science and say, if this changed, what would happen? We studied this variable. If you add this variable to the existing model, what happens?
00:31:16.060 So, I guess they've got some low, medium, and high models that are most often used. You know, the worst case, best case, average case.
00:31:25.800 And which one do you think they usually use to show the effects of their science? If you said they use the worst-case scenario, you'd be right.
00:31:37.780 Do you know why they do that? Because if you make a paper or discovery which shows the most cataclysmic eventual outcome, you're more likely to get funded, because that makes it more important, and you're more likely to get attention, and you're more likely to get promoted, and you're more likely to get more grants.
00:31:57.840 So, basically, the system is designed to incentivize people to always chase the worst-case scenario.
00:32:07.900 And the worst-case scenario could happen, but it's probably the least likely outcome. Somewhere in the middle is the usual.
00:32:19.560 So, there it is again. Now, I hate to brag. No, I love to brag, but everybody who ever had any experience doing financial modeling or any kind of modeling of the future, we all knew this.
00:32:37.120 Like, we all knew that all models that predict the future are just financially driven, period. There's no exception to that.
00:32:46.520 If it comes out of a corporation, they're showing you something because they want to make money.
00:32:51.880 If I produced a number as part of my analysis of what the financial long-term effect of some decision would be, which was my job, I knew I was just doing what my boss needed me to do to support what he already wanted to do.
00:33:06.580 Like, everybody knows that. That's just common knowledge.
00:33:10.640 So, did I need to be told by an expert who knows a lot about it, Roger Pielke Jr., did he need to tell me that the financial models are being gamed and are not any kind of indication of reality? He did not need to tell me that.
00:33:25.580 That was obvious, super obvious from day one. I've never had a different opinion.
00:33:32.120 The first time I ever heard that there were long-term data predictions, I mean, I just laughed.
00:33:39.720 There's nobody who does this work, or any version of this work, who thinks you can even do that. There's no such thing as predicting the future about anything.
00:33:54.480 Anyway, so just ask me next time.
00:33:59.560 So, the Wall Street Journal has a story that says the rate of babies dying in the U.S. rose significantly for the first time in two decades. Wow!
00:34:10.500 Increasing 3% from 2021 to... wow!
00:34:13.960 So, just recently, up to 2022, a big increase in babies dying in the United States, according to the latest federal government data.
00:34:24.920 So, does that sound like real news to you? Do you think it's real news that the rate of babies dying in the U.S. rose significantly for the first time in two decades?
00:34:36.960 Well, that headline was directly over the graph that shows the rate of death, and the graph showed that it's one of four times recently that it's upticked.
00:34:49.480 So, the graph proved that the headline was false.
00:34:54.920 And then there was still an article about it after the graph, because I don't think I misread the graph. Can somebody confirm? I mean, I just need a fact check on this.
00:35:07.160 This direction is down, right, where I'm pointing? Can you confirm that? That's down, right? Am I confused about that? Yeah, okay, that's down.
00:35:15.960 Now, here's the part I'm confused about. Is this direction up? I think it is.
00:35:22.400 So, if they had a graph that had substantial portions of it that were pointing in this direction, is that up or down? I'm forgetting again. Oh, it's up. It's up.
00:35:34.760 And was it in the last 20 years? Yes, all four of the upticks were in the last 20 years.
00:35:40.800 What kind of story is this? Am I confused, or did they literally write a headline that didn't match the graph, their own data? I don't know what's going on. I'm just actually confused by it.
00:35:53.780 But don't believe data. Don't believe news. Don't believe data.
00:36:01.340 Here's what I think happened. You ready for this?
00:36:05.720 I think it's cognitive blindness, where smart people, even groups of smart people, can be blinded to the obvious because there's something else going on.
00:36:16.740 Basically, that's how all magic tricks work. A magic trick takes your brain to the place where you can't solve the trick. Your brain is just working on the wrong problem.
00:36:27.180 Here's what I think, and this is just wild speculation: that we've been served up for three weeks now stories about dead babies because of the Gaza situation. Right?
00:36:39.140 Every hour on the hour for three weeks, I've heard about a dead baby.
00:36:45.520 So then some data comes out that says there's an uptick in dead babies in the United States.
00:36:52.260 Could that blind the people involved to the point where they could only see the data as more dead babies? Because they've been fed more dead babies, more dead babies, highest rate of dead babies, so many dead babies, all dead babies, dead babies, dead babies.
00:37:06.300 Oh, here's some data. Sure enough, uptick in dead babies. Write the story. Write the story.
00:37:13.920 And did they actually write the story while looking at the data that showed that there was no uptick that was unusual? There is an uptick, but not an unusual one.
00:37:23.400 That's actually possible in the hypnotist world, if you had a hypnotist filter on the world. I'm not saying that's what happened. I'm saying it would be perfectly within the normal realm of behavior that somebody would be cognitively blinded and not even notice that the graph didn't match the headline.
00:37:45.780 Now, I'll be embarrassed by tomorrow, because by tomorrow the Wall Street Journal will explain why I read the graph wrong, but at the moment I'm just confused.
00:37:54.760 So, you should wait for the other side of this story. This is sort of the documentary problem.
00:38:00.220 You're hearing me make a claim, but you've not heard the Wall Street Journal say, Scott, you're looking at the graph wrong, or maybe you interpreted it wrong or something.
00:38:09.820 So, just consider I could be wrong about this. Wait until you hear the other side, if there is one.
00:38:16.760 All right.
00:38:18.080 I saw a lawyer for some of the January 6 people, somebody who goes by shipwrecked crew on the X platform, and is apparently a lawyer with some J6 clients, and he makes this claim. He or she makes this claim.
00:38:34.440 I have two new clients charged with J6 crimes. Neither went inside the Capitol, and there are no allegations of any interference with the police.
00:38:45.020 I'm not aware of any previous cases where defendants were charged just for being present outside the building and watching. If anyone knows of such a case, mention it in the comments.
00:38:57.860 If this is a new standard for the Department of Justice, there are thousands of potential defendants still uncharged.
00:39:04.640 Now, question number one, is this an accurate description of the situation? Well, remember, this is coming from a lawyer who is an advocate for his clients. So the lawyer is going to put it in the most minimalistic description of what happened.
00:39:23.820 But let's say that the main claim is true, that they didn't go inside and they didn't interfere with police. What could they have been doing that would get them charged?
00:39:39.360 You can't be charged just for being in the general area of other people doing a crime. Inciting, maybe, because they were saying things? Well, if it's inciting, it's going to be everybody.
00:39:50.860 I'll tell you, my interpretation of this is that Democrats, mostly, are looking for any excuse to increase the number of Republicans they're putting in jail, which I call hunting.
00:40:05.580 This, to me, looks like lawfare hunting, where they're literally looking for Republicans that they can jail to make sure that Republicans stay in line in the future.
00:40:16.460 I think it's not so much about punishing anybody. I think it's about creating a standard of fear so that Republicans will know that if they get anywhere near a line, they're going to fucking jail.
00:40:33.180 Do you know what I consider my biggest risk in my life? I consider my biggest risk being jailed for a crime I did not commit. It's 2023 in America.
00:40:49.700 Literally, my biggest risk, and there are plenty of things that can, you know, be risky, but my biggest risk is going to jail for a crime I didn't commit. That's the reality of America in 2023.
00:41:05.180 And, you know, if you've watched Mike Cernovich tweeting lately, he would be an obvious target for the bad guys to want to invent a crime and charge him with it.
00:41:16.700 I think they would be sorry if they did that. That would be way more pushback than they understand.
00:41:26.040 But I think we're all at risk at this point. Anybody who's talking about the news in a way the Democrats don't like, I think we're all at risk. That's America 2023.
00:41:37.820 All right.
00:41:41.040 I asked on the X platform, can you describe any way in the world that the national debt could be paid off and we're not running as fast as we can towards certain doom? And nobody could do it.
00:41:57.640 But I saw one suggestion that made me think it might be possible.
00:42:03.460 Now, I'm going to go to the whiteboard. Whiteboard. I'm going to explain maybe the best-case scenario for paying off the debt.
00:42:14.360 Now, there are things that you can do when the debt is small that you can't do when the debt is enormous and growing at an enormous rate.
00:42:23.740 The normal way, if you've got a crushing debt, or just debt in general, is you either have to create money from nothing or inflate it away.
00:42:48.840 And there are basically three ways to get rid of a debt.
00:42:56.200 And by the way, this is my first draft, sort of a first take. So if anybody knows more than I do about this topic, you know, tell me to add something or subtract something, right? So we're just getting the conversation going.
00:43:09.160 The reason I'm doing this, by the way, is I've seen no serious person describe how you even could possibly, like even in the best-case scenario, how could we get off of this train to doom, which is the debt?
00:43:24.600 I've heard nobody do it. Have you? Has anybody heard anybody serious describe any scenario where we're not doomed from debt? Right? You haven't heard it.
00:43:35.860 So I'm going to give you what I think is the first possibility. I think it's possible, but boy, would it take some things to go right. I think they might, actually.
00:43:48.640 But here are the options. There could be some kind of weird crypto trickery. By trickery, I mean something that I can't quite imagine, and I wouldn't know how it works.
00:43:58.900 But the basic idea is that crypto can create an artificial value from nothing. Right? So that's the only thing I'm starting with.
00:44:08.420 We've got this huge debt, and we don't have the money to pay it. But what if we just invented money?
00:44:16.520 Now you say to yourself, but Scott, that creates 10 new problems. Maybe.
00:44:23.000 But I'm just saying, is there any play, not one that I can quite describe, but is there anything you could do that would, one time only, now here's the key, one time only, create a bunch of money, and pay off the debt? Is there any way to do it?
00:44:41.000 Now before you say, ha ha, it's obvious that it won't work, I would bet that you know less than me, not more. If you believe it's obvious there's no way to do it, I can't take you seriously at all.
00:44:57.160 I'm not saying I know how to do it. I'm saying that crypto literally creates money that people will accept out of nothing. So maybe there's a play there. I just don't know how I would do it.
00:45:10.900 All right, here are the other things. I believe the debt is too big to inflate away.
00:45:15.600 Meaning you could try to inflate it away, but remember it's growing two trillion a year. How much inflation would you need to inflate it away? I believe the answer is too much.
00:45:27.500 Meaning that that rate of inflation would just destroy the country, basically. Your standard of living would just go to crap.
00:45:35.140 Now, so can we all agree that inflating it away, if that's the only thing you're doing, won't work? Do you think that's true?
00:45:45.060 Now in the past, a smaller debt you could actually inflate away. In fact, that's where we were a few years ago. This can't be inflated away.
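[Editor's note: the "too big to inflate away" point can be sanity-checked with a toy calculation. The only figure taken from the episode is the roughly $2 trillion per year of new borrowing; the ~$33T starting debt and ~$27T GDP are the editor's rough 2023-era assumptions, and the whole thing is an illustration, not a forecast.]

```python
# Toy sketch: does inflation alone shrink the debt-to-GDP ratio when the
# deficit keeps adding ~$2T a year? The ~$33T debt and ~$27T GDP starting
# points are rough 2023-era assumptions; the $2T/yr figure is from the episode.

def debt_to_gdp_after(years, inflation, real_growth=0.0,
                      debt=33.0, gdp=27.0, deficit=2.0):
    """Evolve nominal debt and nominal GDP year by year (all in $ trillions).

    The deficit is assumed to inflate along with everything else, which is
    exactly why mild inflation doesn't shrink the ratio.
    """
    for _ in range(years):
        debt += deficit                       # new borrowing this year
        gdp *= (1 + inflation + real_growth)  # nominal GDP grows
        deficit *= (1 + inflation)            # next year's deficit inflates too
    return debt / gdp

start = 33.0 / 27.0
mild = debt_to_gdp_after(10, inflation=0.03)   # ordinary inflation
harsh = debt_to_gdp_after(10, inflation=0.20)  # the standard-of-living killer

print(f"start: {start:.2f}, after 10y at 3%: {mild:.2f}, after 10y at 20%: {harsh:.2f}")
```

Under these assumptions the ratio keeps rising at 3% inflation and only falls at inflation rates high enough to wreck living standards, which is the point being made.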
00:45:54.800 So the other way to go would be a massive improvement in GDP.
00:46:02.020 Normally our GDP grows, what, two to four percent a year. That is nowhere, nowhere near enough to pay down the debt. Not even in the same universe.
00:46:14.460 You would need something like, and this is just sort of the economist off the top of my head, it feels like you would need something like a very rapid 40% increase in our GDP.
00:46:29.320 How many of you would accept that as sort of a ballpark for just talking about the topic? A 40% increase. Is there anything in the world that would make the economy increase by that rate?
00:46:44.640 War is the wrong answer, because war creates more debt. Yeah, war creates a lot of activity, but it also creates more debt.
00:46:55.460 Here's what I think. And this is partly a borrowed idea.
00:47:02.800 Suppose you get a President Trump or a President Ramaswamy. I'll just take two Republicans as my example.
00:47:11.580 Both dealmakers. Both understand capitalism. Both know how to get rid of regulations. Both know the value of fossil fuels in the short term. So they're basically the right people for the job.
00:47:28.100 Imagine them coming in and doing something so radical that the cost of energy drops to almost trivial. Is that possible? Absolutely.
00:47:39.100 You just remove the prohibitions against small nuclear reactors. That's day one.
00:47:46.160 You just say, look, we're not even going to have state standards. There's going to be a federal standard, and it's going to be friendly to the industry, because we don't have an option now.
00:47:56.320 So you build a bunch of nuclear reactors. Is there a risk? Probably. But it's way better than the risk of just spending yourself into a certain doom.
00:48:06.880 If those are your two options, you're going to take a little extra risk on your nuclear development, because even if you get a meltdown on a small reactor, it's going to be way less of a problem than spending yourself into oblivion.
00:48:20.180 So, given that energy is such a large portion of our total expenses, if you could just subtract that from our expenses, suddenly everybody has more money. Now, that's a reverse-inflation, pro-GDP thing.
00:48:37.340 Now, it's not just that it reduces your current expenses. If you could take energy from its current level to closer to zero, then there's a whole bunch of businesses that become economical that could not have been done before, because they would be too energy-intensive.
00:48:56.160 One of them would be desalinization. Imagine the number of companies that would immediately go into the small nuclear reactor plus desalinization business.
00:49:08.060 I mean, that alone would be a trillion-dollar business, and it wouldn't be practical until you had that power source.
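[Editor's note: a rough illustration of why cheap power changes the desalination math. The ~3.5 kWh per cubic meter energy intensity of reverse osmosis is an industry-typical figure assumed by the editor, not something cited in the episode; the electricity prices are round numbers for illustration.]

```python
# Illustrative only: the energy line item of reverse-osmosis desalination.
# The ~3.5 kWh/m^3 intensity is a rough assumed figure, not from the episode.

KWH_PER_M3 = 3.5  # assumed energy needed to desalinate one cubic meter

def energy_cost_per_m3(electricity_usd_per_kwh):
    """Dollars of electricity to desalinate one cubic meter of seawater."""
    return KWH_PER_M3 * electricity_usd_per_kwh

grid = energy_cost_per_m3(0.10)   # typical grid power -> $0.35/m^3
cheap = energy_cost_per_m3(0.01)  # the near-free-energy scenario -> $0.035/m^3

print(f"${grid:.3f}/m^3 at grid prices, ${cheap:.3f}/m^3 with cheap power")
```

The energy bill scales one-for-one with the electricity price, so driving power cost toward zero removes a large share of the cost of fresh water, which is the pairing being described.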
00:49:15.040 So, imagine drilling like crazy, getting rid of regulations, building nuclear, building.
00:49:23.560 Elon Musk says if you build a 100-mile by 100-mile solar facility, you could power the entire United States.
00:49:32.260 Does anybody want to check his work? Does anybody want to take a chance that he got the numbers wrong? I don't think I'm going to go down that path.
00:49:44.220 Actually, I do think there's an argument
00:49:46.240 there.
00:49:48.780 If you listen to some of the critics,
00:49:50.340 they'll say you can't get there with solar.
00:49:52.040 But I'd like to have that conversation.
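[Editor's note: since the episode invites listeners to check Musk's numbers, here is a hedged back-of-envelope sketch. Every input below (panel output per square meter, capacity factor, average US electricity demand) is an assumed ballpark figure, not a number from this episode.]

```python
# Rough sanity check of the "100 mi x 100 mi solar farm powers the US" claim.
# All constants are assumed ballpark values for illustration only.

MILE_M = 1609.34                 # meters per mile
side_m = 100 * MILE_M            # 100 miles expressed in meters
area_m2 = side_m ** 2            # land area of the hypothetical farm

PANEL_W_PER_M2 = 200             # assumed peak panel output per m^2 (~20% efficient panels)
CAPACITY_FACTOR = 0.20           # assumed average/peak ratio (night, clouds, sun angle)

avg_power_w = area_m2 * PANEL_W_PER_M2 * CAPACITY_FACTOR

US_AVG_ELECTRIC_W = 460e9        # assumed ~460 GW average US electricity demand (~4,000 TWh/yr)

print(f"Farm average output: {avg_power_w / 1e9:.0f} GW")
print(f"US average demand:   {US_AVG_ELECTRIC_W / 1e9:.0f} GW")
print(f"Ratio: {avg_power_w / US_AVG_ELECTRIC_W:.1f}x")
```

Under these assumptions the farm's average output comes out a bit over twice average US electricity demand, so the claim at least survives a first-pass check for electricity (total energy use, including fuels, is a much larger number).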
00:49:53.580 But I'd also like to see robots build
00:49:56.900 facilities.
00:49:59.360 So, because the robots can do
00:50:00.900 it cheaper.
00:50:01.460 So, if you've got cheap electricity
00:50:03.180 and energy, suddenly robots are way
00:50:07.280 more practical.
00:50:09.460 Because the cost of your robot is probably
00:50:11.220 going to be half energy cost and half
00:50:14.820 material cost, you know, at some point.
00:50:17.860 If you could take the half of the energy
00:50:20.000 cost closer to zero, suddenly I'm
00:50:23.860 buying two robots.
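[Editor's note: the "buying two robots" arithmetic can be sketched directly. The 50/50 energy-versus-material cost split is the episode's own assumption, not an established figure.]

```python
# If a robot's cost is assumed to be half energy and half materials,
# driving the energy term toward zero halves the total cost,
# so the same budget buys twice as many robots.
energy_share = 0.5      # assumed share of cost that is energy (from the transcript)
material_share = 0.5    # assumed share of cost that is materials
original_cost = 1.0                                   # normalized robot cost
cheap_energy_cost = material_share * original_cost    # energy term driven to ~0
robots_per_budget = original_cost / cheap_energy_cost
print(robots_per_budget)
```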
00:50:25.740 Elon Musk said that he thought the
00:50:27.260 Tesla robots, that already can be taught
00:50:31.880 skills, you can already teach them to
00:50:33.940 fold the laundry and stuff, that he might
00:50:37.660 sell more of those than anything else
00:50:39.460 he's doing.
00:50:40.240 Bigger than Tesla, bigger than SpaceX.
00:50:43.180 I don't know if it would ever be bigger
00:50:44.400 than SpaceX, but...
00:50:46.400 So, robots
00:50:49.040 plus AI
00:50:50.600 plus cheap energy
00:50:53.480 creates an industrial
00:50:56.320 type revolution
00:50:57.440 greater than anything we've ever
00:50:59.560 experienced yet.
00:51:01.400 And there are probably
00:51:02.520 two candidates who could give this world
00:51:05.360 to you,
00:51:06.640 and they're both Republicans.
00:51:08.360 There is no chance that a Democrat could
00:51:10.360 ever pay off the debt.
00:51:11.940 Because Democrats are limited in their
00:51:14.120 options.
00:51:15.280 They're not going to remove
00:51:16.300 regulations.
00:51:17.940 They're not going to do anything that
00:51:19.200 hurts one blade of grass in the
00:51:21.240 environment.
00:51:22.780 And they're not going to do anything
00:51:24.360 that their base reflexively doesn't
00:51:26.460 like, such as nuclear power.
00:51:30.340 So, there's no way that a Democrat can
00:51:32.400 get you to anything that's not total
00:51:35.260 doom.
00:51:37.240 And there is a way, and I think it's a
00:51:41.220 stretch.
00:51:41.220 Honestly, none of this is easy.
00:51:43.680 None of it is predictable.
00:51:45.220 But it's there.
00:51:47.580 It's there.
00:51:48.520 Yeah, we could all have solar roof tiles
00:51:50.360 and be powering our own homes
00:51:52.060 and making our own water.
00:51:53.940 Yeah, the only thing left is
00:51:55.280 good waste treatment,
00:51:57.360 which I think will turn into
00:51:59.060 some kind of, like,
00:52:00.460 laser beam that
00:52:01.540 turns all of your waste into CO2
00:52:04.080 that gets used in a steam engine
00:52:05.840 or something.
00:52:08.500 But that's coming.
00:52:13.100 That's coming.
00:52:14.580 All right.
00:52:15.620 Is this the best explanation
00:52:17.560 of debt repayment options
00:52:19.740 you've ever seen?
00:52:22.580 It is, isn't it?
00:52:27.220 I saw one person say no.
00:52:29.700 If there's somebody who did a better
00:52:31.420 job of explaining the options,
00:52:33.220 can you send me a link?
00:52:35.160 Because I've never seen one.
00:52:36.940 I've never seen anybody get close.
00:52:39.200 Now, when you say it's not the best
00:52:40.860 description, is it because of
00:52:41.960 the crypto part?
00:52:44.160 Which part are you complaining about?
00:52:45.580 Just the crypto part?
00:52:46.640 Because the rest is very ordinary.
00:52:48.540 Everybody would agree that we can't
00:52:50.000 inflate it away.
00:52:50.900 Everybody would agree that more GDP
00:52:52.420 would work.
00:52:53.600 Everybody would agree that more energy
00:52:55.220 would be good.
00:52:55.840 Everybody would agree that robots
00:52:57.080 are good.
00:52:58.260 Right?
00:52:58.820 So it's really the crypto part
00:53:00.940 that you don't like.
00:53:01.780 Right?
00:53:01.920 Crypto equals tulips.
00:53:07.960 Well, here's something I think
00:53:09.300 you don't understand about crypto.
00:53:12.020 The United States could make
00:53:13.780 any crypto immediately
00:53:15.460 valuable by saying they would
00:53:18.460 accept it for tax payments.
00:53:21.180 On day one, they can make
00:53:22.920 any crypto valuable.
00:53:24.360 Because the only thing that makes
00:53:25.480 a crypto valuable is somebody
00:53:26.940 who's willing to take it
00:53:27.940 as their payment.
00:53:29.700 If the United States says
00:53:31.640 we'll accept it for tax payments,
00:53:33.460 you're there.
00:53:34.940 Day one.
00:53:36.000 It has permanent value
00:53:37.280 and it won't become less.
00:53:40.220 Or I guess it could.
00:53:43.240 So for those of you who think
00:53:45.060 the crypto option is magic and crazy,
00:53:48.760 you might be right.
00:53:50.220 But you cannot deny
00:53:54.060 that it is a way to create money
00:53:56.460 out of nothing.
00:53:58.000 It might have some other
00:53:59.300 problems with it.
00:54:02.240 All right.
00:54:03.040 Well,
00:54:04.000 I thought I'd take a run at that.
00:54:06.720 Remember, this is
00:54:07.400 the first draft.
00:54:09.440 One of the things I think
00:54:10.520 I can add as value
00:54:12.400 to the public conversation
00:54:14.140 is when the public
00:54:15.860 is not being served,
00:54:18.160 I think I do a good job
00:54:19.500 of explaining
00:54:20.120 what we haven't been told.
00:54:23.300 I always think
00:54:24.080 that's my greatest value
00:54:25.140 is that I do a good job
00:54:27.080 of telling you
00:54:27.540 what they haven't told you
00:54:28.560 and how that's important
00:54:29.840 and you better find that out.
00:54:31.860 So with the debt,
00:54:34.180 we are letting
00:54:35.280 all of our politicians
00:54:36.420 off the hook.
00:54:37.860 Let me ask you this question.
00:54:39.900 This is going to blow
00:54:40.680 your fucking mind.
00:54:41.880 Watch your head
00:54:42.560 just explode.
00:54:44.320 Watch this.
00:54:47.440 Number one,
00:54:48.520 would you agree
00:54:49.260 that our debt problem
00:54:50.460 probably is our biggest one?
00:54:53.480 Would you agree
00:54:54.060 with that, first of all?
00:54:57.480 You see some no's
00:54:58.460 because you might have
00:54:59.180 some other things
00:54:59.820 that you think
00:55:00.240 are bigger problems.
00:55:01.240 Well, if you think no,
00:55:03.320 then that would suggest
00:55:04.260 you know that there's
00:55:05.100 some way to pay it off.
00:55:06.900 But you don't.
00:55:08.740 So there's one problem
00:55:09.760 that we have
00:55:10.240 where we can't conceive
00:55:11.340 of a solution.
00:55:13.160 There's a bunch of problems
00:55:14.340 we have that could
00:55:15.280 become really bad,
00:55:17.000 but we can totally
00:55:18.000 conceive of solutions.
00:55:19.400 For example,
00:55:20.460 the border is open.
00:55:22.000 Can you conceive
00:55:22.720 of a solution?
00:55:24.280 Let me see.
00:55:27.220 Close the border?
00:55:29.180 Yes.
00:55:30.240 We might have a world war.
00:55:32.080 Can you conceive
00:55:32.800 of some way to avoid it?
00:55:35.800 Yes.
00:55:37.200 Negotiating
00:55:37.680 and the usual stuff.
00:55:39.460 Yes.
00:55:40.700 Now, I would agree
00:55:41.700 with you that
00:55:42.300 some surprise thing
00:55:43.840 like a nuclear war
00:55:44.840 could happen suddenly.
00:55:47.260 Anything could happen.
00:55:48.500 But there's only
00:55:49.300 one thing we have
00:55:50.300 where the straight line
00:55:51.500 is to doom.
00:55:53.280 And that's the debt.
00:55:55.400 The debt
00:55:56.280 is going up
00:55:57.000 two trillion a year.
00:55:58.980 If the debt
00:55:59.660 were just stable,
00:56:01.380 then I wouldn't
00:56:02.180 worry about it.
00:56:03.640 We'd just inflate it away.
00:56:04.900 But it's also going up
00:56:06.340 two trillion a year.
00:56:07.960 So it's unsustainable
00:56:09.600 and growing quickly.
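[Editor's note: a minimal sketch of that trajectory. The ~$33 trillion starting debt and the 3% average interest rate are assumed ballpark figures not stated in the episode; the two-trillion-a-year growth is from the transcript.]

```python
# Project the debt path: assumed starting debt plus $2T/year of growth,
# with the annual interest bill shown at an assumed average rate.
debt = 33.0            # trillions; assumed starting national debt (late-2023 ballpark)
annual_growth = 2.0    # trillions per year, from the transcript
rate = 0.03            # assumed average interest rate on the debt

for year in range(10):
    debt += annual_growth

interest = debt * rate
print(f"Debt after 10 years: ${debt:.0f}T; interest at {rate:.0%}: ${interest:.2f}T/yr")
```

Even this simple linear model puts the interest bill alone in the neighborhood of today's entire annual deficit within a decade, which is the "straight line to doom" point.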
00:56:11.840 That's definitely
00:56:12.660 the biggest risk.
00:56:14.100 By far,
00:56:15.020 that's the biggest risk.
00:56:16.760 All right.
00:56:17.320 That's my opinion.
00:56:19.080 All right.
00:56:19.260 Here's the part
00:56:19.700 that's going to blow your mind.
00:56:21.720 Now,
00:56:22.840 remember all the news
00:56:24.080 you've watched
00:56:24.740 and try to remember
00:56:27.180 any time that
00:56:28.240 a top politician
00:56:29.580 was asked this question.
00:56:30.780 How would you
00:56:32.820 handle the debt?
00:56:36.100 I don't remember it once.
00:56:38.260 I can't think of it
00:56:39.600 once.
00:56:41.000 Can you?
00:56:42.300 Do you know why
00:56:43.000 they don't ask that question?
00:56:45.900 They don't ask it
00:56:46.980 on the right
00:56:47.580 and they don't ask it
00:56:48.660 on the left.
00:56:50.020 Do you know why?
00:56:53.140 Probably because
00:56:54.100 the reporters
00:56:55.920 don't understand
00:56:56.840 the topic.
00:56:58.420 Your average reporter
00:56:59.600 would not really
00:57:00.700 quite understand
00:57:01.700 how debt works
00:57:02.580 because some people
00:57:04.280 make the mistake
00:57:04.960 of saying national debt
00:57:06.200 is like personal debt.
00:57:08.360 You know,
00:57:08.540 if anybody makes
00:57:09.260 that analogy,
00:57:10.040 like if you bought
00:57:11.080 this house
00:57:11.660 and, you know,
00:57:12.640 the interest rate
00:57:13.260 was this
00:57:13.740 and you had made
00:57:14.360 this much money,
00:57:15.500 what would you do?
00:57:16.280 That's not the right model.
00:57:18.020 Anybody who thinks
00:57:18.820 that a mortgage
00:57:19.560 of your house
00:57:20.400 is a good thing
00:57:21.760 to learn about
00:57:22.540 how to handle
00:57:23.340 the national debt,
00:57:24.560 you should immediately
00:57:25.720 stop talking to them
00:57:26.860 because they don't
00:57:27.840 understand the basics
00:57:28.720 of anything.
00:57:29.600 So you've got
00:57:31.460 reporters,
00:57:32.300 if they're not
00:57:32.940 the business reporters,
00:57:34.180 the regular,
00:57:34.780 you know,
00:57:35.040 generic reporters,
00:57:36.020 they don't know
00:57:36.600 how to ask the question
00:57:37.500 or they wouldn't
00:57:38.960 understand the answer
00:57:39.900 because here's the answer
00:57:41.240 a politician
00:57:41.860 would give you.
00:57:44.200 Mr. Politician,
00:57:45.820 let's say,
00:57:46.340 Mr. Republican,
00:57:47.160 how are you going
00:57:47.660 to pay off the debt?
00:57:49.420 Well,
00:57:50.060 we've got to get
00:57:50.940 these omnibus
00:57:52.040 spending bills
00:57:52.960 under control.
00:57:53.700 And then what
00:57:55.600 does the reporter say?
00:57:57.160 Well,
00:57:57.400 how will you do that?
00:57:58.140 Well,
00:57:59.280 you know,
00:57:59.680 we'll try to break
00:58:00.540 them up,
00:58:01.100 the Speaker of the House,
00:58:02.200 blah, blah, blah.
00:58:03.160 And then the reporter
00:58:04.040 goes to the next question.
00:58:07.080 Do you think
00:58:07.980 that breaking up
00:58:08.720 our bills
00:58:09.120 into individual
00:58:09.940 spending bills
00:58:10.860 is going to solve
00:58:11.940 the national debt?
00:58:13.840 Not even close.
00:58:16.160 It's not even
00:58:17.300 in the zip code
00:58:19.040 of a solution.
00:58:20.140 It's simply better
00:58:21.220 than what we were doing.
00:58:22.180 It's a small tweak
00:58:24.920 to improve
00:58:26.400 the quality
00:58:27.040 of the budgeting.
00:58:28.740 It's not going to
00:58:29.660 solve the national debt
00:58:30.860 or even close
00:58:32.760 and nobody serious
00:58:34.560 believes it would.
00:58:36.400 But I'll bet you
00:58:37.140 the Republicans
00:58:37.760 get away with saying that.
00:58:40.080 I'll bet the reporter
00:58:41.100 stops asking questions
00:58:42.300 when they say,
00:58:43.080 oh,
00:58:43.480 we've got to cut
00:58:44.420 our budget.
00:58:45.720 There's no way
00:58:46.480 to cut the budget
00:58:47.320 enough to pay off
00:58:48.120 this debt.
00:58:49.380 You'd have to cut
00:58:50.480 it like 70%
00:58:51.400 or something.
00:58:52.040 You'd have to give up
00:58:52.900 the military.
00:58:54.340 You're not going to
00:58:55.120 cut the budget
00:58:55.740 enough at all.
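[Editor's note: a rough version of that budget-cut math. Revenue, spending, debt, and the payoff horizon below are all assumed ballpark figures for illustration, not numbers from the episode.]

```python
# How much would federal spending have to be cut to retire the debt
# over an assumed 30-year horizon? Assumed FY2023-ish figures.
revenue = 4.4      # trillions; assumed annual federal revenue
spending = 6.4     # trillions; assumed annual federal spending
debt = 33.0        # trillions; assumed national debt
payoff_years = 30

needed_surplus = debt / payoff_years        # annual surplus needed to retire the debt
max_spending = revenue - needed_surplus     # spending level that leaves that surplus
cut_fraction = 1 - max_spending / spending  # required cut, ignoring interest costs
print(f"Required cut: {cut_fraction:.0%}")
```

Ignoring interest, this comes out near half of all spending; adding interest on the remaining debt pushes the required cut higher, toward the 70% neighborhood mentioned above, and well past anything politically plausible.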
00:58:58.000 So we have reporters
00:58:59.680 who don't know
00:59:00.300 how to ask the question.
00:59:01.960 Would you agree
00:59:02.580 with that?
00:59:03.640 Our reporters
00:59:04.220 do not know
00:59:05.060 how to ask the question.
00:59:06.680 And you have politicians
00:59:07.920 who on the Democrat side
00:59:09.920 apparently have
00:59:10.720 never been asked.
00:59:12.940 Let me say it again.
00:59:14.860 I don't think
00:59:15.380 a Democrat's
00:59:15.980 ever been asked.
00:59:18.640 What would they say?
00:59:20.320 Because Democrats
00:59:21.180 can't say
00:59:22.080 we're going to
00:59:23.300 reduce social services.
00:59:26.820 And if they say
00:59:27.740 we're going to
00:59:28.120 reduce the military,
00:59:29.680 well,
00:59:30.360 just ask them
00:59:31.060 to show you the math
00:59:31.980 because that doesn't
00:59:32.720 work out either
00:59:33.360 if you want
00:59:34.680 to actually have
00:59:35.440 a national defense.
00:59:38.480 So we have
00:59:39.240 a reporter problem
00:59:40.600 primarily.
00:59:41.860 The reporters
00:59:42.500 refuse to know
00:59:43.600 what is the
00:59:44.120 important stuff.
00:59:45.640 So with TikTok,
00:59:46.620 they don't know
00:59:47.140 the influence is
00:59:47.820 the important thing,
00:59:48.780 as opposed
00:59:49.340 to data security.
00:59:51.600 So because
00:59:52.060 our reporters
00:59:52.700 can't tell
00:59:54.020 the story right,
00:59:55.580 it gives Congress
00:59:56.440 a way to ignore it
00:59:57.680 because our reporters
00:59:59.080 don't tell the story right.
01:00:01.100 Right?
01:00:01.720 It gives them
01:00:02.140 a free pass.
01:00:03.140 If nobody's
01:00:03.780 reporting it correctly,
01:00:04.860 they just can ignore it.
01:00:06.260 That's what they do
01:00:06.820 with TikTok.
01:00:07.720 So nobody reports
01:00:08.800 the national debt story
01:00:10.260 accurately
01:00:10.760 or even close.
01:00:12.460 Nobody knows
01:00:13.060 how to even ask
01:00:13.700 the question
01:00:14.180 of a politician.
01:00:14.920 And if they did,
01:00:16.760 they would allow
01:00:17.440 the politician
01:00:18.040 to get away with,
01:00:19.360 oh, yeah, yeah,
01:00:20.240 we're going to have
01:00:21.540 to have a national
01:00:22.160 conversation
01:00:22.800 about the spending.
01:00:26.100 What?
01:00:27.200 How's that going to help?
01:00:28.780 Yes, we're going to have
01:00:29.660 to cut one of these
01:00:31.120 big expenses.
01:00:32.240 It won't help.
01:00:34.360 Yeah.
01:00:34.820 It's a complete
01:00:36.000 abdication
01:00:37.340 of responsibility
01:00:38.420 by the press.
01:00:41.520 And it allows
01:00:41.520 the politicians
01:00:42.220 to just kick the can.
01:00:44.260 So here's what
01:00:44.800 I'd like to see.
01:00:46.600 I'd like to see
01:00:47.660 Vivek
01:00:48.340 come up
01:00:49.780 with a specific plan,
01:00:51.960 even if it's
01:00:52.600 a long shot,
01:00:53.980 for solving the debt.
01:00:56.360 And I think
01:00:56.900 he could make
01:00:57.460 a story,
01:00:58.500 Vivek could,
01:00:59.720 that would say
01:01:01.240 what we're doing
01:01:01.820 with energy
01:01:02.580 and what we're doing
01:01:03.380 with new tech
01:01:04.000 and robots
01:01:04.620 and AI
01:01:05.200 and quantum computing.
01:01:06.320 And he's going
01:01:08.160 to have to make
01:01:08.900 an argument
01:01:09.300 that we need
01:01:10.060 a second
01:01:10.580 industrial revolution.
01:01:12.240 We need to bring
01:01:13.200 our manufacturing
01:01:13.980 back from China
01:01:15.040 in the smartest way.
01:01:16.960 You know,
01:01:17.100 not in a complete way,
01:01:18.200 but in the smartest way
01:01:19.300 so that we're
01:01:20.140 manufacturing
01:01:20.740 the things
01:01:22.460 that we care about,
01:01:23.860 pharmaceuticals
01:01:24.960 in particular.
01:01:26.140 And we need to figure
01:01:27.240 out how to build
01:01:27.800 our own chip factory
01:01:29.440 so we're not
01:01:30.100 depending on
01:01:31.260 the one
01:01:31.820 Taiwanese factory.
01:01:34.700 By the way,
01:01:35.140 do you know
01:01:35.400 the story
01:01:35.760 that the biggest
01:01:36.820 chip maker,
01:01:39.340 is it Taiwan
01:01:40.000 semiconductor
01:01:40.740 in Taiwan,
01:01:42.760 the one that we're
01:01:43.280 totally dependent on?
01:01:44.960 The entrepreneur
01:01:45.760 for that
01:01:46.440 tried to build it
01:01:47.620 in Texas
01:01:48.080 and was rejected.
01:01:51.600 He couldn't
01:01:52.420 build it in Texas
01:01:53.260 so he went
01:01:54.260 to where he could
01:01:54.780 build it
01:01:55.160 and he went
01:01:55.540 to Taiwan.
01:01:56.960 Man,
01:01:57.500 is that on us.
01:01:59.240 That is so,
01:02:00.960 so on us.
01:02:03.160 God,
01:02:03.720 what a fuck up.
01:02:04.400 That's like the
01:02:05.960 fuck up
01:02:06.460 of the last
01:02:07.540 200 years.
01:02:09.560 Letting that guy
01:02:10.360 go start
01:02:11.080 the big
01:02:11.820 essential
01:02:12.780 chip company
01:02:13.840 in another country
01:02:14.640 that we have
01:02:15.460 to protect,
01:02:16.240 I guess.
01:02:17.300 It couldn't have
01:02:18.020 been a more
01:02:18.660 fucked up situation.
01:02:21.740 If it had gone
01:02:22.840 to any other
01:02:23.600 country,
01:02:24.060 it would have
01:02:24.360 been suboptimal.
01:02:25.780 But it went
01:02:26.480 to Taiwan.
01:02:28.720 Right?
01:02:29.460 I mean,
01:02:29.840 it might have
01:02:30.180 gone to Gaza.
01:02:31.460 Like,
01:02:31.660 is there any
01:02:32.060 other way
01:02:32.460 we could make
01:02:32.920 that more
01:02:33.300 fucked up?
01:02:35.160 So,
01:02:35.760 there are a lot
01:02:36.320 of things
01:02:36.660 that a Vivek
01:02:37.800 could make
01:02:39.000 a good story
01:02:39.860 for a solution.
01:02:41.540 Trump could do
01:02:42.120 it too.
01:02:43.380 Trump likes
01:02:44.240 to keep things
01:02:44.920 a little simpler.
01:02:46.380 So,
01:02:46.640 I don't know
01:02:47.020 that this is
01:02:47.660 his ideal
01:02:48.300 approach.
01:02:51.900 And Trump
01:02:52.660 also is a
01:02:53.640 little less
01:02:54.220 facile
01:02:56.280 about the
01:02:57.020 new technology.
01:02:58.900 So,
01:02:59.120 you know,
01:02:59.320 Vivek is
01:02:59.880 going to
01:03:00.120 understand
01:03:00.660 that AI
01:03:01.820 plus quantum
01:03:02.640 computing
01:03:03.220 changes everything.
01:03:06.060 Trump,
01:03:06.620 you know,
01:03:08.020 might have
01:03:08.440 heard that
01:03:08.840 story,
01:03:09.640 but I doubt
01:03:10.220 that would
01:03:10.600 become part
01:03:11.180 of his
01:03:11.440 narrative.
01:03:12.540 You know,
01:03:12.700 it's a little
01:03:13.420 bit more
01:03:14.040 futuristic
01:03:15.580 than Trump
01:03:16.380 usually talks.
01:03:17.720 But,
01:03:18.080 to be fair,
01:03:19.380 when Trump
01:03:19.820 talks about
01:03:20.340 building these
01:03:21.040 new cities
01:03:22.140 from scratch,
01:03:23.560 I think
01:03:24.520 that the
01:03:25.580 new technologies
01:03:26.440 plus the
01:03:27.460 new cities
01:03:28.040 could be
01:03:29.260 the biggest
01:03:30.800 boom in
01:03:31.740 building and
01:03:32.620 industrial
01:03:33.100 everything,
01:03:34.280 could solve
01:03:34.840 housing by
01:03:35.600 bringing the
01:03:36.040 cost down.
01:03:37.520 There's just
01:03:38.280 a whole bunch
01:03:39.080 of ways
01:03:39.560 that Trump
01:03:40.560 could use
01:03:41.260 a real estate
01:03:42.120 focus
01:03:44.600 to fix
01:03:45.800 the economy.
01:03:46.880 But he
01:03:47.300 would fix
01:03:47.660 energy first.
01:03:48.820 Trump would
01:03:49.200 fix energy
01:03:49.860 because that's
01:03:50.380 a big part
01:03:50.800 of the
01:03:51.140 building and
01:03:51.980 construction
01:03:52.480 puzzle.
01:03:53.900 Yeah,
01:03:56.380 Trump is
01:03:56.780 actually very
01:03:57.300 good about
01:03:57.740 talking about
01:03:58.400 the golden
01:03:59.180 future,
01:04:00.920 but I don't
01:04:02.180 know if he
01:04:02.760 could quite
01:04:03.760 get AI
01:04:04.600 robots and
01:04:05.480 quantum computing
01:04:06.440 right.
01:04:07.300 But they
01:04:07.560 could.
01:04:10.160 There's a
01:04:10.880 reason I'm
01:04:12.100 biased toward
01:04:14.240 the young.
01:04:14.240 Right?
01:04:15.680 It makes
01:04:16.540 sense to be
01:04:17.080 a little biased
01:04:17.800 toward the young
01:04:18.360 at the moment.
01:04:19.140 Speaking of
01:04:22.460 that, do
01:04:23.640 you know
01:04:24.180 that the
01:04:24.680 politicians
01:04:25.460 who are
01:04:26.420 driving up
01:04:27.000 our debt,
01:04:27.560 the most
01:04:27.880 powerful
01:04:28.280 politicians,
01:04:29.460 are the
01:04:29.820 ones who
01:04:30.200 are about
01:04:30.520 a year
01:04:30.820 away from
01:04:31.220 being dead
01:04:31.680 themselves?
01:04:33.320 If you
01:04:33.900 are a year
01:04:34.480 away from
01:04:34.880 being dead,
01:04:36.400 do you
01:04:36.660 think you
01:04:37.000 should be
01:04:37.280 in charge
01:04:37.760 of driving
01:04:39.080 up the
01:04:39.500 debt for
01:04:39.920 the United
01:04:40.260 States for
01:04:40.920 the next
01:04:41.260 hundred
01:04:41.500 years?
01:04:42.680 And you're
01:04:43.260 going to be
01:04:43.520 dead in a
01:04:44.020 year?
01:04:44.980 And you're
01:04:45.380 in charge of
01:04:45.900 deciding what
01:04:46.400 the rest of
01:04:46.960 us will do
01:04:47.500 for the next
01:04:48.560 hundred years?
01:04:49.140 Now, I
01:04:51.140 saw some
01:04:51.540 pushback on
01:04:52.180 that, because
01:04:52.540 I said it
01:04:53.100 on X, and
01:04:54.760 the pushback
01:04:55.320 was, but
01:04:55.900 Scott, all
01:04:57.320 of these
01:04:57.680 ancient people,
01:04:59.080 they've got
01:04:59.540 kids, they've
01:05:00.100 got grandkids,
01:05:00.960 of course they
01:05:01.520 care about
01:05:01.900 them, so
01:05:02.860 they care
01:05:03.220 about the
01:05:03.540 future as
01:05:04.000 much as
01:05:04.340 anybody does.
01:05:05.600 To which I
01:05:06.520 replied, have
01:05:08.340 you met
01:05:08.720 people?
01:05:10.300 Let me
01:05:10.800 explain something
01:05:11.480 about people.
01:05:15.360 They really
01:05:16.240 do make
01:05:16.880 selfish decisions,
01:05:18.020 and they
01:05:18.640 really do
01:05:19.160 follow the
01:05:19.680 money, even
01:05:20.660 if it's
01:05:21.020 bad for
01:05:21.400 their
01:05:21.560 grandkids.
01:05:23.100 That's
01:05:23.580 people.
01:05:24.700 Now, I'd
01:05:25.200 love to
01:05:25.520 tell you that
01:05:26.120 there are
01:05:26.480 people making
01:05:27.040 the grandkid
01:05:27.980 decision, and
01:05:29.360 there are, but
01:05:31.420 I don't think
01:05:32.560 that's what
01:05:32.940 drives the
01:05:33.420 politicians.
01:05:34.280 I think they
01:05:34.820 say to
01:05:35.140 themselves, I
01:05:36.520 could try to
01:05:37.340 solve an
01:05:37.760 unsolvable
01:05:38.320 problem and
01:05:39.020 fail, which
01:05:40.020 is how to
01:05:40.400 get the debt
01:05:41.300 down, or I
01:05:42.520 could just
01:05:42.920 vote for one
01:05:43.580 more omnibus
01:05:44.440 bill, and I'm
01:05:45.740 going to be
01:05:46.040 dead in a few
01:05:46.640 years anyway.
01:05:47.260 It's never
01:05:48.400 going to come
01:05:48.780 back to bite
01:05:49.260 me.
01:05:50.420 And then
01:05:50.900 they'd say,
01:05:51.380 well, grandchildren
01:05:52.320 will work it
01:05:53.000 out.
01:05:54.560 Life's never
01:05:55.160 been easy.
01:05:55.980 Grandkids will
01:05:56.500 work it out,
01:05:57.080 one way or the
01:05:57.600 other.
01:05:58.480 So I think it
01:05:59.460 does make a big
01:06:00.120 difference that
01:06:01.220 the people voting
01:06:02.000 on the debt are
01:06:02.700 100 years old.
01:06:03.860 I suggest the
01:06:05.100 following fix, that
01:06:07.020 the only members of
01:06:07.980 Congress who can
01:06:08.780 vote on debt,
01:06:10.420 long-term debt,
01:06:11.780 have to be under
01:06:13.260 50 years old.
01:06:15.120 And if you're
01:06:15.820 over 50, you
01:06:16.680 can't vote on
01:06:17.740 debt or any
01:06:18.740 kind of spending
01:06:19.280 bill.
01:06:21.600 And then I
01:06:22.320 would say,
01:06:23.160 public, you
01:06:24.860 are free to
01:06:25.540 elect anybody
01:06:26.280 you want.
01:06:27.500 If you'd like
01:06:28.380 your representative
01:06:29.160 to be over
01:06:29.940 50, just know
01:06:31.360 that your state
01:06:32.060 will not be
01:06:32.620 involved in any
01:06:33.300 of the budget
01:06:33.700 decisions.
01:06:35.340 They can talk
01:06:36.140 about it, but
01:06:36.820 they don't get
01:06:37.200 to vote.
01:06:38.280 Because letting
01:06:38.980 people who
01:06:39.700 won't be here
01:06:40.440 to pay the
01:06:40.960 debt decide how
01:06:42.280 much debt you
01:06:43.000 have is a
01:06:44.420 fucking stupid
01:06:45.180 system.
01:06:46.960 And if you
01:06:47.620 think that
01:06:48.340 them having
01:06:49.540 grandkids is
01:06:50.460 going to save
01:06:51.180 you, that's
01:06:52.300 not how people
01:06:52.920 work.
01:06:54.880 They just
01:06:55.580 don't.
01:06:56.440 I wish they
01:06:56.940 did, but
01:06:57.880 they don't.
01:06:59.980 So that'll
01:07:00.960 never happen,
01:07:01.620 but might be
01:07:02.800 the only way
01:07:03.300 to survive.
01:07:05.860 There's a new
01:05:06.540 study about
01:05:07.280 the things that
01:05:09.400 predict a
01:05:10.420 marriage lasting.
01:05:11.340 There are
01:05:13.140 two factors
01:05:13.680 psychologists
01:05:14.200 found.
01:07:15.340 One is that
01:07:16.480 the man has
01:07:17.940 high verbal
01:07:18.660 intelligence.
01:07:20.840 Does that
01:07:21.600 surprise you?
01:07:22.360 If the man
01:07:23.360 has high
01:07:23.940 verbal
01:07:24.320 intelligence,
01:07:25.480 what goes
01:07:26.920 along with
01:07:27.460 high verbal
01:07:28.080 intelligence?
01:07:30.180 Is there
01:07:30.860 anything else
01:07:31.360 correlated with
01:07:32.200 that besides
01:07:34.980 long marriages?
01:07:37.280 I see the
01:07:42.000 suggestion
01:07:42.400 a huge
01:07:43.140 penis, but
01:07:44.000 I don't
01:07:44.380 think that's
01:07:44.860 what I was
01:07:45.140 going for.
01:07:48.300 Well, I
01:07:48.920 don't know.
01:07:49.480 Let's see if
01:07:50.020 there's anything
01:07:50.700 else about the
01:07:51.480 second thing
01:07:52.340 that suggests
01:07:53.460 a long
01:07:53.860 marriage.
01:07:55.000 See, the
01:07:55.400 second thing
01:07:56.040 is that the
01:07:57.640 man in the
01:07:58.280 marriage is
01:07:59.040 able to buy
01:07:59.780 new and more
01:08:00.660 expensive cars.
01:08:02.620 That also
01:08:03.500 signals longevity.
01:08:04.920 So what
01:08:06.340 would having
01:08:08.220 the ability
01:08:10.180 to buy
01:08:10.660 new expensive
01:08:11.500 cars have
01:08:13.300 in common
01:08:13.840 with a man
01:08:14.640 who has
01:08:15.040 high intellect
01:08:16.040 and verbal
01:08:17.080 abilities?
01:08:18.920 Could it
01:08:19.760 be that
01:08:21.900 money makes
01:08:22.600 people stay?
01:08:24.960 The person
01:08:26.140 with the
01:08:26.500 high IQ
01:08:27.040 and the
01:08:27.460 high verbal
01:08:28.280 ability probably
01:08:29.140 has a good
01:08:29.540 job.
01:08:31.040 That's what
01:08:31.560 that gets you.
01:08:32.880 If they're
01:08:33.240 able to buy
01:08:33.820 new cars,
01:08:34.420 probably got
01:08:35.740 money.
01:08:36.580 Good job.
01:08:38.780 And women
01:08:39.840 like to be
01:08:40.780 in stable
01:08:41.420 situations,
01:08:42.640 of course.
01:08:43.280 Everybody
01:08:43.580 does.
01:08:44.620 But the
01:08:46.380 researchers said
01:08:47.220 at least the
01:08:47.700 car part has
01:08:49.100 something to
01:08:49.680 do with
01:08:49.880 signaling a
01:08:50.820 higher social
01:08:51.500 status.
01:08:53.980 Money.
01:08:55.460 The car is
01:08:56.560 just a signal
01:08:57.180 of money.
01:08:58.180 It's not the
01:08:58.740 car.
01:08:59.780 Although women
01:09:01.800 do see
01:09:02.540 their mate
01:09:03.740 and the
01:09:05.120 mate's car
01:09:05.860 as their
01:09:06.980 accessories.
01:09:08.240 By the way,
01:09:09.040 this is one
01:09:09.540 of the best
01:09:09.960 ways to
01:09:10.380 think about
01:09:10.800 this.
01:09:11.960 I know
01:09:12.560 people,
01:09:13.080 and I used
01:09:13.440 to be one.
01:09:14.860 That's how dumb
01:09:15.280 I was when
01:09:15.980 I was young.
01:09:16.800 I used to
01:09:17.420 think that I
01:09:18.320 would not be
01:09:18.860 judged by the
01:09:19.540 quality of my
01:09:20.300 automobile.
01:09:22.620 I know.
01:09:23.640 Pretty stupid.
01:09:25.280 I used to
01:09:25.960 think women
01:09:26.700 aren't going
01:09:27.120 to care.
01:09:28.100 They're going
01:09:28.420 to care about
01:09:28.900 my wonderful
01:09:29.660 personality,
01:09:31.320 my empathy,
01:09:32.780 my conversation
01:09:33.600 style,
01:09:34.800 my ability
01:09:35.320 to engage
01:09:36.380 things,
01:09:37.580 maybe my
01:09:38.900 clever thoughts,
01:09:41.860 my haircut.
01:09:43.440 I thought
01:09:43.960 those are the
01:09:44.380 things that
01:09:45.040 really influenced
01:09:45.780 women.
01:09:46.440 But it
01:09:46.680 turns out
01:09:47.120 it's your
01:09:47.620 car.
01:09:52.100 I've told
01:09:52.740 you the
01:09:53.020 story of
01:09:53.440 how I
01:09:53.700 learned that
01:09:54.120 the hard
01:09:54.460 way.
01:09:55.420 When I
01:09:55.760 was in
01:09:56.060 my 20s,
01:09:57.280 I guess,
01:09:57.600 I had
01:09:58.660 the worst
01:09:59.360 little
01:09:59.740 you'll never
01:10:00.420 get laid
01:10:00.960 car in
01:10:01.500 the world.
01:10:02.200 I mean,
01:10:02.420 I'm not
01:10:02.680 even going
01:10:02.960 to describe
01:10:03.420 it,
01:10:03.680 but it
01:10:03.880 was sad.
01:10:05.040 Nobody
01:10:05.480 would ever
01:10:05.880 get in
01:10:06.160 that car
01:10:06.660 and think,
01:10:07.520 you know,
01:10:08.620 I sure
01:10:09.080 want to
01:10:09.380 have sex
01:10:09.840 with the
01:10:10.140 owner of
01:10:10.500 this car.
01:10:11.920 I guarantee
01:10:12.560 nobody
01:10:13.020 ever had
01:10:13.400 that thought.
01:10:14.360 So one
01:10:14.820 day,
01:10:15.280 when things
01:10:16.100 were going
01:10:16.540 a little
01:10:16.800 bit better
01:10:17.180 in my
01:10:17.560 career,
01:10:18.040 this was
01:10:18.340 before
01:10:18.720 Dilbert,
01:10:19.320 but my
01:10:20.100 corporate
01:10:20.500 career was
01:10:21.180 advancing on
01:10:22.200 time,
01:10:22.840 and I
01:10:23.640 bought a
01:10:24.160 very old
01:10:25.120 BMW.
01:10:27.600 But it
01:10:28.100 was still
01:10:28.360 a BMW.
01:10:29.780 And one
01:10:30.620 day I
01:10:30.980 pulled it
01:10:31.320 into the
01:10:31.620 parking garage,
01:10:32.360 it was brand
01:10:32.820 new,
01:10:33.240 and I
01:10:33.460 wanted to
01:10:33.760 get the
01:10:34.040 opinion of
01:10:34.680 some other
01:10:35.300 people.
01:10:36.620 And my neighbor had a couple of girls who were twins who happened to be hanging out in the parking lot.
01:10:42.740 And I knew them pretty well because they lived right across the hall.
01:10:46.180 And I said to them, they noticed the new car and they said something, I said, all right, what do you think of the car?
01:10:52.200 And they're like, oh, we love it, we love it.
01:10:53.660 And I said, do you think that women will like me better because I have this car?
01:11:00.560 I remember these are like nine-year-old twin girls.
01:11:04.340 And the nine-year-olds said, oh, yeah, oh, yeah, yeah.
01:11:11.060 And like their eyes got big, like for sure, no doubt about it.
01:11:15.380 And I'm thinking, it starts that early.
01:11:19.120 The nine-year-old girls are completely aware that if I don't have a good car, I'm not going to get any action.
01:11:25.840 At nine years old, they knew that.
01:11:28.820 And once you understand that your car is a reflection of the woman who's in it, like she's not going to want to be painted with the embarrassment of your automobile that the paint is falling off.
01:11:43.380 But if you got a good car, it's like a good accessory.
01:11:46.520 And the same for how you dress and how tall you are and if you have hair and all that stuff.
01:11:51.400 Yes, people are shallow, but you'll get used to it.
01:11:57.140 All right.
01:11:59.360 What is it that links the various groups on the left?
01:12:03.140 The pro-Palestine people with the climate people, with the gay rights people, with the indigenous rights people, et cetera?
01:12:14.680 Apparently it's some kind of intersectionality.
01:12:17.940 And I was reading some tweets by Michael Naina.
01:12:21.240 But one thing they have in common, all of those groups that you would think would be completely different groups, is a victim narrative.
01:12:32.000 And the oppressor is always the same person.
01:12:34.460 It's always somebody white.
01:12:35.680 So everybody who is not a white person with money thinks that what all of their groups have in common is this: if only those white people would do something differently, things would be better for all the rest of us.
01:12:56.560 So that's what they have in common.
01:12:59.000 So to say that they are illegitimate would be an understatement.
01:13:06.140 All right.
01:13:07.920 I guess the U.S. is telling Qatar to give up the Hamas leader who lives in luxury in Qatar.
01:13:14.920 I would like to suggest the following technique for getting that guy.
01:13:20.680 I think Israel and/or the United States in conjunction should say to Qatar, this guy is dead.
01:13:27.620 The only thing you have to decide is where he is when it happens.
01:13:30.680 We're going to kill him in Qatar if we don't have a choice, and we're going to do it on Tuesday.
01:13:38.000 I think that's the way you have to play it, because as long as Qatar thinks that this is a point of negotiation, or that maybe it won't happen, they're going to act that way.
01:13:49.740 You have to take away from them any notion that this guy isn't going to die on Tuesday.
01:13:55.020 You have to say Tuesday he dies.
01:13:57.540 The only question that we're asking you is where you want him to be.
01:14:01.420 If you want him to be in one of your nicest high-rise buildings, we are going to take him out there.
01:14:07.100 It's going to be Tuesday, and there will be casualties.
01:14:11.040 I know, Qatar, you're going to have to do everything you can to punish us back in whatever way you can, and we're going to take that punishment.
01:14:18.740 The only thing that's not going to happen is he's going to be alive on Wednesday.
01:14:22.760 The only thing we can guarantee you is he's going to be dead.
01:14:26.040 You only get to decide where it happens and when.
01:14:29.580 Not even when, you get to decide where.
01:14:31.960 So we'd like you to give him up, but if you want to just walk him out to the desert where we can kill him, that would be great.
01:14:38.200 But if you want him to be in one of your very expensive high-rises in the middle of your main city, we'll do that too.
01:14:44.160 On Tuesday.
01:14:45.940 On fucking Tuesday.
01:14:48.840 I don't see any other way to handle it, do you?
01:14:51.520 Now this is something that you couldn't do under any other context than the October 7th horrors.
01:14:59.480 After those horrors, everybody knows you're not kidding.
01:15:03.360 If they thought you were bluffing, they'd play it that way.
01:15:08.800 Do you think Israel's bluffing?
01:15:10.920 Would you think Israel was bluffing if they said we're going to take out two stories of that building, and we're going to do it on Tuesday, and there's nothing you can do about it?
01:15:20.760 And whatever you do to us, we'll take it.
01:15:23.780 No.
01:15:24.520 They would mean it.
01:15:25.300 And I think they could do it.
01:15:29.620 So that's the way I play it.
01:15:31.240 I wouldn't act like maybe Qatar has a choice.
01:15:34.120 They only have a choice of where.
01:15:38.500 And ladies and gentlemen, this concludes the best live stream you'll see today.
01:15:45.480 We'll talk lots more about other stuff at other times.
01:15:48.700 I hope this was useful to you.
01:15:51.140 And what?
01:15:55.300 Have MBS make the call.
01:15:57.240 Yeah, maybe.
01:15:59.860 All right.
01:16:00.780 Thanks for joining, YouTube.
01:16:02.280 I'll talk to you tomorrow.