Real Coffee with Scott Adams - September 01, 2023


Episode 2218 Scott Adams: I Tell You How To Use The Designated Liars To Deduce What Is True


Episode Stats

Length

1 hour and 10 minutes

Words per Minute

149.374

Word Count

10,487

Sentence Count

818

Misogynist Sentences

6

Hate Speech Sentences

16


Summary

In this episode, Scott Adams talks about the best book in the world, The Whiteboard Presentation, and the weirdest thing he's heard in a long time. Plus, an update on the magic mushrooms trial, and a story about a robot that thinks it can do math.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:08.160 It's called Coffee with Scott Adams, now streaming to you live all over the world
00:00:13.680 through YouTube and through X, live on X at the moment,
00:00:18.600 and, of course, my beloved locals community who get to see the good stuff.
00:00:25.240 Oh, you wouldn't believe what they heard before you came on. Wow.
00:00:28.020 But if you'd like to take this experience up to levels that nobody dreamed were possible,
00:00:34.420 including two whiteboard presentations, two, two, wait for that.
00:00:40.980 All you need is a cup or mug or a glass, a tank or chalice or stein,
00:00:44.140 and a canteen jug or flask, a vessel of any kind, fill it with your favorite liquid.
00:00:49.340 I like coffee.
00:00:50.440 And join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:54.060 the thing that makes everything better.
00:00:56.340 It's called the simultaneous sip.
00:00:58.860 And it happens now. Go.
00:01:03.900 Ah.
00:01:06.020 So good, so good.
00:01:07.960 Today's episode is brought to you by the best book in the world.
00:01:11.380 It's called Reframe Your Brain, and it might actually be the best book in the world.
00:01:15.600 I say that because people say, come on.
00:01:18.640 Come on.
00:01:19.480 Now, I don't mean religious books.
00:01:22.040 Religious books are their own category.
00:01:24.780 But if you want a book that will almost certainly change your life in some positive way,
00:01:30.800 probably a number of ways.
00:01:33.380 Apparently, people are having their lives changed, even now.
00:01:36.940 It's available, even though Amazon is making it as hard as possible.
00:01:42.040 Let me ask you this.
00:01:42.900 Of all the books on Amazon, you can imagine, there are tens of millions of books.
00:01:49.420 How many of those books, if you went to the page and then tried to click on it to buy it,
00:01:55.020 how many of those would Amazon not let you buy?
00:01:59.380 I don't know if there are any others, but mine's pretty hard to get.
00:02:02.820 So the hardcover got listed, but it got listed as not yet available.
00:02:08.000 But if you go to the hardcover page, it doesn't show the softcover option.
00:02:14.760 So if you thought you were buying a book, and you're like, oh, hardcover's there,
00:02:18.680 you'd see it's not available, and then you wouldn't see that you could buy the softcover
00:02:23.120 that's been available for a couple weeks.
00:02:25.620 Now, we live in a zero-trust environment.
00:02:30.320 Under normal conditions, I would say to myself,
00:02:33.300 there's probably a little technical problem.
00:02:35.160 We'll get that worked out.
00:02:36.160 But how many times have I told you there's a technical problem?
00:02:40.560 Not the first one.
00:02:42.280 So in a zero-trust environment, I'm going to assume guilt until proven innocent.
00:02:48.040 Not that it's true, but I wouldn't assume it's an accident at this point.
00:02:52.380 So if you can buy that book, if you can find a way to actually purchase it,
00:02:56.980 you'll be ahead of the game.
00:02:58.900 All right, here's some updates.
00:03:01.000 Yet another trial of using mushrooms for depression.
00:03:06.160 It turns out that the magic mushrooms made a huge difference.
00:03:11.740 So basically, every time they study it, I think, correct me if I'm wrong,
00:03:16.920 but 100% of the times they study depression and mushrooms, there's a big improvement.
00:03:23.500 I mean, there's something really big, big, big that's coming down the pike.
00:03:30.280 And I don't know what it's going to do.
00:03:34.380 Remember, you know, I always remind you that I told you that when Trump rose in 2015, 2016,
00:03:40.220 I told you it was going to change more than politics.
00:03:43.660 It would change the way you saw reality itself.
00:03:45.900 Well, if you thought that was a change, wait till you try mushrooms for your depression.
00:03:53.660 Talk about a change in reality.
00:03:56.780 So toward the end of today's show, I'm going to blow your minds like your brain is just going to come off.
00:04:05.940 So, and almost certainly, almost certainly, you're going to say, what?
00:04:13.440 So wait for that.
00:04:14.540 That's the whiteboard presentation.
00:04:16.760 All right.
00:04:17.220 I saw an article that apparently a lot of employees are pretending they have AI knowledge
00:04:26.120 because their boss doesn't know one way or the other
00:04:28.720 so that they can feel like they're the valuable employees.
00:04:32.720 And I thought to myself, well, there's a whole week of Dilbert Reborn comics,
00:04:38.820 which are available by subscription only on Twitter and on the Locals platform,
00:04:44.380 Locals, scottadams.locals.com.
00:04:47.740 But did you see that coming?
00:04:52.560 You knew that people were going to pretend that they knew AI without knowing AI.
00:04:57.240 You knew that was going to happen.
00:04:59.300 That's a Wally thing.
00:05:00.860 Well, unemployment is inching up from 3.5 to 3.8.
00:05:05.160 Let's pretend we can measure that.
00:05:08.280 Yeah.
00:05:08.380 Let's pretend that we can measure it in a meaningful and accurate way
00:05:12.680 such that the change between 3.5 and 3.8 is meaningful.
00:05:18.320 Probably not.
00:05:20.240 Probably not.
00:05:21.900 So, but if the numbers that are official make it look like inflation is inching up,
00:05:28.640 which is hard to imagine, why would it be going up?
00:05:33.140 Just too much money supply?
00:05:36.260 I don't know.
00:05:36.740 Seems like you should be going down, but that would be, I would guess,
00:05:41.500 one of the most important predictors of the next election.
00:05:47.120 Should it be a fair and free election?
00:05:50.620 I would say, I would think inflation is the biggest number that would change people's vote.
00:05:55.080 What do you say?
00:05:55.700 If everything else were the same and inflation looked like it was inching up on election day,
00:06:03.080 I think that's game over, at least in terms of the public.
00:06:07.320 The voting is a separate question.
00:06:10.120 All right, here's my opinion on Mitch McConnell.
00:06:12.540 And you all know he's had health problems and he's been freezing up.
00:06:16.720 Actually, he just goes blank like he can't talk in public.
00:06:20.600 It's happened a few times.
00:06:21.740 We all feel bad for him.
00:06:23.720 Don't want to, you know, I don't want to mock somebody for age or infirmity.
00:06:28.100 So I'm going to mock other people instead.
00:06:31.360 I think you could ignore everything Republicans say on any topic,
00:06:36.840 as long as they let that guy keep his job.
00:06:41.220 Now you could say to me, but, but, but the other Republicans can't,
00:06:44.720 you know, they have no say over whether he retires.
00:06:48.000 Sure they do.
00:06:49.120 Of course they do.
00:06:50.820 Yeah.
00:06:51.280 If they all said you got to retire, he'd retire because it would just be too embarrassing not to.
00:06:56.940 They could just not take his calls, you know, just ignore his office.
00:07:00.760 They could treat him like he's not there.
00:07:02.840 They could get him out in a day.
00:07:05.580 And I'm sure they have their reasons.
00:07:09.540 But if you're not going to tell me the reasons,
00:07:11.500 I'm not going to listen to you about anything.
00:07:13.380 You have no credibility if the leader of your party is incapacitated
00:07:20.380 and you're just letting it happen.
00:07:23.060 Why?
00:07:23.620 Because Biden is also incapacitated, so that's okay.
00:07:27.600 Or, you know, Schumer's close to it.
00:07:30.520 No, Schumer's doing fine.
00:07:32.300 But, yeah.
00:07:34.340 How can you make fun of the fact that Democrats keep Fetterman?
00:07:38.520 You can't.
00:07:39.100 How can you make fun of Biden's declining cognitive ability?
00:07:43.420 You can't.
00:07:44.460 Not if you're trying to be a serious person.
00:07:46.880 So if you want to be a serious person, they all have to go.
00:07:50.020 You know, Feinstein's got to go, but McConnell's got to go at the same time.
00:07:53.760 So Republicans, could you pull it together just a little bit?
00:07:57.680 I mean, this is just free money sitting on the table.
00:08:01.120 I realize it's awkward, it's uncomfortable, it's not the thing you want to be known for.
00:08:06.220 But you've got to do it.
00:08:08.200 You've got to do it.
00:08:10.020 There's nobody in the world who thinks this is right.
00:08:12.860 Nobody.
00:08:13.880 Zero people think this is okay.
00:08:16.100 And you can't do this?
00:08:17.620 The simplest fucking thing you could ever do in your life?
00:08:20.480 You can't do this.
00:08:22.200 Just help a guy out.
00:08:24.540 Help him retire with a little bit of dignity, maybe.
00:08:28.180 Crazy.
00:08:29.260 Yeah, just ignore everything Republicans say from now on.
00:08:32.540 If you can't get that right.
00:08:34.080 All right.
00:08:35.600 Here's a service I'm thinking seriously of doing.
00:08:38.640 One, you've seen me give my description of how to know if the news is credible.
00:08:46.620 For example, I won't go through the whole thing.
00:08:49.280 But I did a whole list of, you know, what is a credible story versus a non-credible story.
00:08:54.700 And one of them, just to give you an example, is if the only source is an anonymous source
00:09:00.380 and it's only being reported by the part of the media that hates the person who's being narced on,
00:09:06.660 that's never true.
00:09:07.460 It's just never true.
00:09:09.880 Right?
00:09:10.140 And then you could rate the other things for how often they're true.
00:09:13.620 And then you'd have a good little guide to look at the news.
00:09:17.380 But there's another thing I want to add, and I'm just starting to build the list.
00:09:21.540 It would be the list of what I call the designated liars.
00:09:25.960 Now, this is a little tricky because there's some nuance to it.
00:09:29.400 If you miss the nuance, then the beauty of the idea disappears.
00:09:33.900 All right?
00:09:34.120 So I'm not talking about people who lie.
00:09:37.420 Right?
00:09:37.880 In a moment, you're going to suggest people who are simple liars.
00:09:42.540 That's not what this is about.
00:09:44.520 We're not talking about simply lying.
00:09:47.900 Because that would be a lot of politicians.
00:09:50.820 I'm talking about designated liars.
00:09:53.940 Right?
00:09:54.120 If you miss the designated part, that would be like missing the difference between climate change is a hoax versus the climate agenda is a hoax.
00:10:03.580 Right?
00:10:03.780 If you miss that designated word, which you will, not all of you, but somebody on YouTube is going to say, but other people lie.
00:10:13.500 You know they're going to do that.
00:10:15.400 Whoever does that is an NPC.
00:10:16.960 Right?
00:10:17.400 So it's not about liars.
00:10:21.700 It's not about liars.
00:10:23.440 It's about designated liars.
00:10:26.260 And what I mean by that is there's a group of Democrats.
00:10:31.400 Now, you could make an argument that it happens on the right.
00:10:34.420 We'll talk about that as well.
00:10:35.520 But on the left, if you see any of these people be the chief character in a story, now the chief character would be somebody who wrote the big article that everybody's talking about, or somebody who's always on the news talking about it.
00:10:50.420 If you see Phil Bump of the Washington Post making a claim that other people are saying, hmm, I'm not so sure that's true.
00:11:00.380 Phil Bump is an absolute signal that it's a fake story.
00:11:04.140 Like, he's almost synonymous with fake news.
00:11:08.540 Now, he would deny that, of course.
00:11:11.860 And he denied it quite a bit because he's under fire in the news.
00:11:15.800 I saw Jonathan Turley and Miranda Devine.
00:11:22.340 Is it Devine or Divine?
00:11:23.280 Divine?
00:11:26.360 Miranda Devine or Divine?
00:11:26.360 Somebody fact-check me.
00:11:28.000 I'll say it correctly when you do.
00:11:30.140 Vine.
00:11:30.680 Devine.
00:11:31.200 Okay, it's Devine.
00:11:32.000 So Miranda Devine has written an article.
00:11:35.220 And basically, it's just mocking Phil Bump for being a ridiculous character.
00:11:40.360 Now, in terms of full disclosure, Phil Bump does work for the Washington Post, which in the Dilbert Reborn comic is where Ratbert works as a writer.
00:11:53.180 Ratbert, I guess I should tell you, in his current incarnation is a writer for the Washington Poop.
00:12:01.040 So in the Dilbert Reborn comic, you can only see it by subscription.
00:12:05.220 It's the Washington Poop.
00:12:07.280 And Ratbert basically plays Phil Bump.
00:12:11.940 Now, I don't use the name.
00:12:14.440 But in your mind, if you see it, just tell yourself it's Phil Bump and it's funnier.
00:12:19.760 All right.
00:12:20.100 So Phil Bump is one of my mascots.
00:12:22.640 So full disclosure.
00:12:24.240 He's one of the people who comes after me in public.
00:12:26.200 And he was dancing on my grave when I got canceled, primarily by his newspaper, which started the rest of the newspapers.
00:12:35.720 So when the Washington Post canceled me, that allowed everybody else to do it at the same time.
00:12:40.080 So just know that I'm not objective, but Phil Bump's a good signal for fake news.
00:12:45.800 The other ones you know, Adam Schiff, Eric Swalwell, Jamie Raskin, Jerry Nadler, Dan Goldman, Blumenthal, Brennan, Clapper.
00:12:57.920 Now, they're the ones that I call the designated liars.
00:13:02.140 Here are the names that are not on the list.
00:13:04.380 Karine Jean-Pierre.
00:13:08.260 Do you understand why she's not on the list?
00:13:11.760 Because she's not a signal of fake news.
00:13:15.040 She's somebody who lies and spins everything all the time.
00:13:18.480 But she'll spin a thing that's real.
00:13:21.380 She'll spin a thing that's not real.
00:13:22.980 You can't really tell.
00:13:24.560 So she's not a signal because she's just out there spinning all the time.
00:13:28.020 The other people that are not a signal would be leadership.
00:13:31.380 So Schumer, Biden, Pelosi, and Harris.
00:13:34.380 Because they're going to talk about everything all the time because they're in leadership.
00:13:38.420 Right?
00:13:39.160 So if you see them saying something that's true or not true, it's not really a signal.
00:13:44.080 They're just talking about everything all the time.
00:13:46.160 Same as Jean-Pierre, whatever.
00:13:50.480 Karine Jean-Pierre.
00:13:51.960 So I don't consider them designated liars.
00:13:55.220 They would just be ordinary politicians who don't always tell the truth and spin a lot.
00:13:59.480 Now, the Republicans have a lot of those.
00:14:03.080 Republicans have people who are wrong.
00:14:05.940 You know, they believe things that are not true.
00:14:08.300 They have people who are under-informed.
00:14:11.160 Let's say not banning TikTok, for example.
00:14:14.740 Might not understand it.
00:14:16.020 So they might have a bad opinion there.
00:14:17.940 There are people who maybe just have different philosophies.
00:14:20.580 There are people who maybe they know they're bending or spinning or something.
00:14:24.980 And probably there's some people who just know they're lying on the Republican side.
00:14:30.280 But here's what's different.
00:14:32.280 There is not an identifiable squad of liars on the right that I'm aware of.
00:14:38.260 If I'm missing it because I have some bias, somebody should suggest it to me.
00:14:43.380 Now, if you say to me, but Scott, here's somebody who lied.
00:14:47.600 Again, it's not about lying.
00:14:50.100 They're all going to be lying sooner or later, except for Thomas Massie.
00:14:55.520 I have to throw Thomas Massie in there every time I call Congress liars,
00:14:59.240 because he's so obviously not one that I just feel shitty when I don't...
00:15:04.760 Yeah, Rand Paul's not a liar.
00:15:06.000 There are others, too, right?
00:15:07.600 There are others that I do trust are not lying.
00:15:11.440 But Republicans have a different set of credibility problems.
00:15:16.800 I don't think they have designated liars.
00:15:20.100 But if they do, well, so somebody's saying,
00:15:23.440 what about Crenshaw on Ukraine?
00:15:28.080 That doesn't seem to me like a designated liar.
00:15:31.060 That seems like somebody who has an opinion you don't have.
00:15:34.160 Because Crenshaw is not identifiable as the guy you stick forward
00:15:38.720 to say the things that are never true.
00:15:41.300 He's just somebody who disagrees with you deeply on a big issue.
00:15:44.220 And I probably disagree with him as well.
00:15:48.120 But that doesn't make him a liar.
00:15:49.600 And it certainly doesn't make him a designated liar.
00:15:54.140 Adam Kinzinger, there's something going on with Adam Kinzinger
00:15:56.940 that has more to do with the Ukraine war.
00:16:00.360 I don't see Adam Kinzinger as sort of the designated liar.
00:16:04.460 I think he's...
00:16:05.560 Honestly, it looks like he has some mental difficulties,
00:16:08.500 is how it projects.
00:16:09.900 Now, I don't know that.
00:16:11.520 I'm not a doctor.
00:16:13.080 But when I see Adam Kinzinger, I don't see mental health.
00:16:16.920 He doesn't display mental health
00:16:19.160 in the way that I normally would recognize it.
00:16:23.700 So if you're displaying something that looks like
00:16:27.200 maybe there's something you're working on on your own,
00:16:30.040 that's just its own category.
00:16:33.660 All right.
00:16:36.200 So would that be helpful?
00:16:37.500 Tell me the truth.
00:16:40.520 If you could, for a moment, be unbiased.
00:16:44.280 I know it's impossible.
00:16:46.140 Would this be useful?
00:16:47.480 If you were teaching somebody how to look at the news,
00:16:50.260 would it be useful to have a list
00:16:51.740 of these 10 people are designated liars?
00:16:55.100 You know they're lying.
00:16:56.580 That's the only reason they're on TV.
00:16:59.260 That would be useful, right?
00:17:02.080 Yeah.
00:17:03.800 All right.
00:17:04.400 All right.
00:17:07.500 Here's some fake science.
00:17:13.580 All right.
00:17:14.240 Here's the thing.
00:17:15.040 I believe I'm going to explain this to you correctly.
00:17:19.260 But if later you say to me,
00:17:20.700 Scott, you've got that story completely wrong,
00:17:23.240 well, I'll change it.
00:17:24.760 I'll change my mind.
00:17:26.740 But I'm going to tell you what I think I know.
00:17:29.320 All right.
00:17:29.640 So, you know what a placebo is, right?
00:17:37.720 You're all smart, educated people.
00:17:40.380 You know that a placebo is a fake pill
00:17:42.740 that if you have a pill that you think will be real
00:17:46.500 and will solve some problem,
00:17:48.520 then you give somebody the fake pill
00:17:50.460 and you see how they do.
00:17:52.300 You compare it to the real drug.
00:17:54.260 Now, if the real drug improves people's condition
00:17:58.660 more than the placebo,
00:18:03.840 then it's likely to get approved,
00:18:05.680 you know, if there are no side effects
00:18:06.820 or if there are minimal side effects.
00:18:10.700 Now, this is largely proven, wouldn't you say,
00:18:14.240 because whenever they do this study,
00:18:16.500 you always get this effect.
00:18:18.720 The people who take the fake pill,
00:18:20.980 very predictably,
00:18:22.160 not 100% of the time,
00:18:24.160 but predictably,
00:18:25.640 they'll have, you know,
00:18:28.080 a substantial kind of a benefit.
00:18:31.100 So would you agree
00:18:32.660 that the placebo effect
00:18:34.980 is one of the most studied
00:18:37.420 and guaranteed to be real effects
00:18:40.280 you've ever seen in science?
00:18:42.540 How many would say that?
00:18:43.800 That's guaranteed to be real
00:18:46.120 because nothing,
00:18:47.260 I don't think anything's been studied this much.
00:18:50.000 Has anything ever been studied as much as this?
00:18:51.940 Because every time they do a study,
00:18:54.300 this shows up.
00:18:56.160 Like, you don't even have to be looking for it
00:18:58.040 and it's everywhere.
00:18:59.200 I mean, the placebo effect is just everywhere.
00:19:01.640 So it's real.
00:19:03.720 In the comments,
00:19:05.560 can you at least acknowledge
00:19:06.900 that the placebo effect is real
00:19:09.120 so that I can go on to my next point?
00:19:11.060 It's real, right?
00:19:12.600 Everybody knows?
00:19:13.960 It's totally real?
00:19:15.100 Yeah, totally real.
00:19:16.280 All right.
00:19:16.520 Um, maybe not.
00:19:21.200 Maybe not.
00:19:22.700 Do you know what they don't study?
00:19:24.780 Because they don't have to?
00:19:26.820 What they don't study
00:19:28.000 is somebody who took no pill at all.
00:19:32.140 Do you know what would happen if they did?
00:19:33.900 Suppose every test
00:19:37.300 was placebo in one group,
00:19:40.260 real drug in another group,
00:19:42.140 and then the thing that they don't study,
00:19:43.960 but what if they did?
00:19:45.620 No pill at all,
00:19:46.840 and you don't even know you're in the study.
00:19:48.820 So they'd have to not even know
00:19:50.500 they're in the study.
00:19:52.320 What do you think that would be?
00:19:53.560 Just take a guess.
00:19:58.560 What do you think
00:19:59.260 the don't do anything
00:20:00.380 and you're not even in this study
00:20:01.720 would look like?
00:20:16.440 That's right.
00:20:17.500 The people who did absolutely nothing,
00:20:20.180 they improve about the same
00:20:22.000 as the placebo.
00:20:23.560 Do you know what that means?
00:20:26.240 It means everything you've ever heard
00:20:28.200 about the placebo was bullshit.
00:20:30.620 It's always been bullshit.
00:20:32.980 It was easy to prove.
00:20:35.880 Easy to prove.
00:20:38.100 And for your entire life,
00:20:40.200 the people that you trust,
00:20:42.040 the scientists,
00:20:43.160 told you they did some science,
00:20:45.540 and then they told you
00:20:46.700 they used their statistical genius
00:20:48.460 to prove that the thing called
00:20:50.780 the placebo effect is a real thing.
00:20:53.560 Now, I'm seeing somebody say
00:20:57.420 this is incorrect.
00:20:58.960 So I'm open to correction
00:21:00.700 because I'm not completely sure
00:21:03.280 this is true.
00:21:05.280 I'll tell you that, you know,
00:21:07.060 it's just something I ran across
00:21:08.280 on the internet recently,
00:21:09.200 but as soon as I saw it,
00:21:10.680 I thought to myself,
00:21:11.480 well, I'm pretty sure they don't study
00:21:14.060 the person who wasn't in the study.
00:21:17.280 I'm pretty sure that's not a normal thing.
00:21:19.760 And I do know
00:21:20.860 that in the normal course of things,
00:21:24.020 most people improve if you do nothing.
00:21:27.060 Or at least they say they improved.
00:21:28.760 Maybe they just got used to it,
00:21:30.120 so they said they improved.
00:21:31.220 Who knows?
00:21:32.440 Might be some of that.
00:21:33.320 But, I don't know.
00:21:36.420 I'm open for fact-checking.
00:21:38.480 I'm open for fact-checking.
00:21:40.340 But remember,
00:21:41.300 we live in a zero-trust environment
00:21:43.200 where science is mostly bullshit.
00:21:49.860 Mostly.
00:21:51.380 The vast majority of the things
00:21:53.340 you call science are bullshit.
00:21:55.320 And apparently always have been.
00:21:58.780 From the, you know,
00:22:00.080 the so-called nutrition,
00:22:02.240 you know,
00:22:02.540 the nutrition triangle.
00:22:04.800 That was bullshit from day one.
00:22:07.360 It was never even attempted to be true.
00:22:09.800 But if the placebo effect
00:22:12.160 isn't real,
00:22:15.060 what can you believe?
00:22:18.260 Want to see another one?
00:22:20.060 Do you think I could do this again?
00:22:22.680 Let's see.
00:22:25.320 You probably know that RFK Jr.
00:22:41.720 says that solar might be
00:22:43.880 a good green technology,
00:22:45.560 but it's not economical
00:22:46.940 if you include all of the other costs.
00:22:51.400 You know,
00:22:51.540 because everything has, you know,
00:22:52.940 closed down costs
00:22:54.080 and maybe social costs
00:22:56.180 and then there's
00:22:56.880 how long it takes
00:22:58.000 to get approval
00:22:58.680 and all that stuff.
00:23:00.540 Now, also with solar power,
00:23:03.480 people will say,
00:23:04.660 hey, you forgot to include
00:23:06.100 the recycling costs,
00:23:08.620 you know,
00:23:09.160 blah, blah, blah, blah, blah.
00:23:10.780 Right?
00:23:11.760 So let me ask you the question.
00:23:13.680 You all watch the news.
00:23:16.280 You know,
00:23:16.580 this is the most informed group
00:23:18.160 I've ever seen.
00:23:19.300 Literally.
00:23:19.820 It's the most informed group
00:23:20.980 of news watchers.
00:23:21.800 So you all watch the news
00:23:23.340 and you all know
00:23:24.880 that climate change,
00:23:26.440 no matter what you think
00:23:27.480 of the reality of it,
00:23:28.740 you would agree
00:23:29.320 it's maybe the biggest issue
00:23:31.860 outside of Ukraine,
00:23:33.220 I guess.
00:23:34.180 Maybe the biggest issue, right?
00:23:36.320 Yeah.
00:23:37.600 So,
00:23:38.520 but at least you're recycling
00:23:40.860 your plastic, right?
00:23:43.100 Or is everybody recycling
00:23:44.640 their plastic,
00:23:45.440 at least trying to help?
00:23:46.860 As Michael Schellenberger
00:23:50.760 informed you recently,
00:23:52.580 do you know that
00:23:53.220 the plastic recycling
00:23:54.420 has never been real?
00:23:56.840 It's never been real.
00:24:00.040 They don't recycle
00:24:01.440 the plastic that you separate.
00:24:03.540 They throw it in the garbage
00:24:05.080 and they ship it to Asia
00:24:06.780 and it ends up in the ocean
00:24:08.560 and your water supply.
00:24:11.360 And always has.
00:24:15.060 And always has.
00:24:16.380 Recycling isn't real.
00:24:20.160 Plastic.
00:24:20.920 I think maybe metal cans is real.
00:24:24.360 Maybe cardboard is real.
00:24:25.900 I don't know.
00:24:27.060 But plastic recycling
00:24:28.260 has never been real.
00:24:31.300 How many of you knew that?
00:24:33.660 How many of you knew
00:24:34.760 that plastic recycling
00:24:35.800 was never real?
00:24:38.700 A lot of you did.
00:24:40.540 How many Democrats
00:24:41.700 do you think know that?
00:24:44.340 Not so many.
00:24:45.640 Not so many.
00:24:47.240 All right.
00:24:48.340 So what is the answer?
00:24:50.020 You all watch the news.
00:24:51.960 So the most important question,
00:24:53.880 I would say,
00:24:54.880 see if you agree with me,
00:24:56.060 the most important question
00:24:57.920 about climate change
00:24:59.060 in terms of what
00:25:00.620 we're going to do about it,
00:25:01.980 because even if you think
00:25:03.260 there's no risk,
00:25:05.040 you need more energy, right?
00:25:06.800 So the people who don't believe
00:25:08.340 in climate change
00:25:09.440 as a risk,
00:25:11.340 the ones who do believe it,
00:25:13.400 you all need more energy.
00:25:14.980 So the biggest question is,
00:25:16.800 which one's more economical?
00:25:19.380 So tell me,
00:25:20.220 which one's more economical?
00:25:22.340 Solar or nuclear?
00:25:24.720 And I'm going to tell you
00:25:25.740 the real answer
00:25:26.500 after you tell me your answers.
00:25:28.460 There's an absolute real answer.
00:25:30.360 I can give you
00:25:31.720 complete certainty
00:25:32.460 on this question.
00:25:34.300 A lot of people
00:25:34.780 saying nuclear.
00:25:37.700 A lot of people
00:25:38.540 saying nuclear.
00:25:40.220 All right.
00:25:41.440 Both.
00:25:42.140 That's an interesting answer.
00:25:43.140 Both make sense
00:25:46.080 from a you-need-all-the-energy-you-can-get
00:25:48.220 perspective.
00:25:49.320 But one of them
00:25:50.220 is going to be better.
00:25:53.000 All right.
00:25:53.600 Do you want the absolute,
00:25:55.480 guaranteed,
00:25:56.220 correct answer?
00:25:59.400 Nobody knows.
00:26:01.780 Are you kidding me?
00:26:03.640 You think somebody knows
00:26:04.820 the answer to this question?
00:26:06.660 No.
00:26:07.220 This is completely unknowable.
00:26:09.420 Now here,
00:26:10.000 we happen to be
00:26:10.640 in my domain of expertise.
00:26:13.140 You know,
00:26:14.100 I know it doesn't seem like it.
00:26:16.340 But I do have
00:26:17.360 many years
00:26:18.120 of corporate experience
00:26:19.360 trying to analyze
00:26:20.800 what costs more,
00:26:22.580 both initially
00:26:23.540 and also in the long run.
00:26:25.120 Because it's not
00:26:25.820 your initial cost.
00:26:26.880 It's, you know,
00:26:27.280 you have to put it
00:26:28.760 in decommissioning
00:26:29.680 and everything else there.
00:26:31.300 So,
00:26:32.020 I can guarantee you
00:26:33.980 that the level
00:26:35.340 of complexity involved
00:26:36.880 in nuclear
00:26:37.740 as well as solar
00:26:39.220 guarantees
00:26:40.620 that nobody knows
00:26:41.780 the answer.
00:26:42.140 It's a guarantee.
00:26:44.700 Here's what else
00:26:45.380 they don't know.
00:26:47.020 What would be
00:26:47.600 the economics
00:26:48.320 if, let's say,
00:26:49.240 you got a capable
00:26:50.220 president,
00:26:51.500 let's say,
00:26:52.140 a Vivek Ramaswamy,
00:26:54.020 and he said,
00:26:55.420 hey,
00:26:55.660 this nuclear stuff
00:26:56.640 would be good
00:26:57.460 if you could make it
00:26:59.120 easier to build
00:27:00.080 and easier to get approved.
00:27:02.920 So,
00:27:03.480 at the moment,
00:27:04.040 it's nearly impossible
00:27:04.980 to get a nuclear power plant
00:27:06.440 built because of all
00:27:07.700 the environmentalists
00:27:08.700 and blah, blah, blah.
00:27:10.480 But could you imagine
00:27:11.700 that somebody
00:27:13.060 who is smart
00:27:13.840 and a president
00:27:14.540 could say,
00:27:16.040 hey,
00:27:16.700 you states
00:27:17.400 don't get a vote
00:27:18.240 because this is
00:27:20.460 too close
00:27:21.460 to the homeland security.
00:27:23.100 You can't really
00:27:23.740 have a country
00:27:24.420 that has a national defense
00:27:26.700 unless you're also
00:27:28.100 a strong economy.
00:27:29.660 So,
00:27:30.080 you need your basics,
00:27:31.320 your energy production.
00:27:32.480 That's the most basic thing.
00:27:34.000 You got to get
00:27:34.480 your energy production
00:27:35.400 and then maybe
00:27:36.320 transportation
00:27:36.980 would be next.
00:27:38.540 But if you don't get
00:27:39.380 your energy production
00:27:40.220 right,
00:27:41.220 you might as well
00:27:41.760 disband your army.
00:27:43.960 Your army is useless
00:27:45.300 if you don't have energy
00:27:47.200 and you don't have
00:27:48.360 a good economy
00:27:49.700 to equip them.
00:27:51.180 Right?
00:27:51.760 So,
00:27:52.220 somebody like Vivek
00:27:53.640 could make the argument
00:27:55.120 that nuclear energy
00:27:57.060 is not just a choice
00:27:58.660 of which energy to use,
00:28:00.540 but it's a requirement
00:28:01.980 for the sustainability
00:28:04.220 of the United States
00:28:05.520 defensively
00:28:06.900 and in every other way
00:28:08.040 that we have
00:28:08.920 a robust,
00:28:10.560 efficient nuclear energy
00:28:12.800 game.
00:28:15.800 Now,
00:28:16.280 could the federal government
00:28:17.400 just say,
00:28:18.260 hey, environmentalists,
00:28:20.180 go away?
00:28:22.300 And,
00:28:22.940 hey, Democrats,
00:28:23.980 it doesn't matter
00:28:24.440 what you want
00:28:25.020 because it's national security.
00:28:27.300 So,
00:28:27.620 I'm going to get rid
00:28:28.200 of all the little
00:28:29.500 state and local ordinances
00:28:31.000 and I'm going to say
00:28:32.260 there's just one
00:28:33.080 set of federal approvals
00:28:36.840 and they won't be that hard,
00:28:38.160 you know,
00:28:38.760 they'll be optimized.
00:28:40.480 So,
00:28:40.980 you can get your approval
00:28:41.940 and I'm just going to say
00:28:43.940 you states,
00:28:44.540 you just don't get a vote
00:28:45.460 in this
00:28:45.860 because you're driving
00:28:46.880 the country into ruin
00:28:50.180 and that's a defense problem.
00:28:52.480 So,
00:28:53.040 could that happen?
00:28:55.100 I don't know.
00:28:56.440 But you don't either.
00:28:58.300 That's my point.
00:28:59.660 My point is
00:29:00.420 if you don't know,
00:29:02.220 could Vivek do that
00:29:03.480 or could Trump do that?
00:29:05.040 He didn't do it
00:29:05.540 the first time
00:29:06.080 so I suspect he can't.
00:29:08.220 But I think Vivek
00:29:09.200 could do it
00:29:09.600 because do you know,
00:29:10.860 do you know what
00:29:11.360 it would take
00:29:12.180 to remove
00:29:13.580 all of those regulations?
00:29:15.520 The minimum
00:29:16.200 it would take
00:29:16.740 for a president
00:29:17.420 to do that?
00:29:20.080 They would have
00:29:20.720 to understand
00:29:21.460 the topic.
00:29:24.740 How many people
00:29:25.660 who have ever run
00:29:26.640 for that office,
00:29:27.600 a president,
00:29:28.120 do you think
00:29:29.440 you could introduce
00:29:30.420 to the topic
00:29:31.220 of nuclear regulation
00:29:33.660 and have them
00:29:35.420 like do a deep dive
00:29:36.580 and then come out
00:29:38.180 with a usable opinion,
00:29:40.460 a usable opinion
00:29:41.600 of which things
00:29:43.020 could be tweaked
00:29:43.780 and modified
00:29:44.320 and who's lying
00:29:45.620 to them
00:29:46.000 about what
00:29:46.480 you can't change?
00:29:49.060 We've only had
00:29:49.940 one candidate
00:29:50.720 who was capable,
00:29:51.900 well,
00:29:52.840 Jimmy Carter,
00:29:53.540 I guess.
00:29:54.420 But,
00:29:54.700 yeah,
00:29:55.280 Jimmy Carter,
00:29:55.980 right.
00:29:56.160 But at the moment
00:29:57.180 we have one candidate
00:29:58.120 who I think has that capability
00:29:59.700 and I happen to think
00:30:01.580 we have some smart candidates,
00:30:03.240 right?
00:30:04.020 RFK Jr.,
00:30:05.240 super smart.
00:30:06.980 Trump,
00:30:07.560 I believe,
00:30:08.020 is super smart
00:30:08.840 in his,
00:30:09.720 you know,
00:30:10.000 within his domain.
00:30:13.360 But you haven't seen
00:30:14.460 Vivek before.
00:30:16.440 Vivek is the one person
00:30:18.000 who's running
00:30:18.500 for president
00:30:19.160 in the longest time
00:30:20.920 who you could say,
00:30:22.360 can you look at this mess
00:30:23.540 and try to untangle this?
00:30:26.260 You know,
00:30:26.420 you don't have to be
00:30:26.960 the biggest expert
00:30:27.900 in each part of it,
00:30:29.180 but at least you could
00:30:30.080 understand the landscape.
00:30:31.500 I don't think anybody else
00:30:32.520 could even see the landscape.
00:30:34.180 It's just way too complicated.
00:30:37.360 So,
00:30:37.980 if you don't know
00:30:38.880 the political part,
00:30:40.320 you also don't know
00:30:41.200 the base economics.
00:30:42.700 If you don't know
00:30:43.540 that we could approve,
00:30:45.420 for example,
00:30:47.460 modular designs
00:30:48.800 and then let people
00:30:50.400 build as many as they want
00:30:51.680 so long as it's
00:30:52.460 the same design.
00:30:54.500 So,
00:30:55.000 there's a whole bunch
00:30:55.900 of things you could do
00:30:56.640 to lower the price
00:30:58.040 or cost of nuclear.
00:30:59.960 You can store
00:31:00.840 the waste on site,
00:31:03.540 which is now
00:31:04.200 the normal way to do it.
00:31:05.240 So,
00:31:05.540 you don't have
00:31:05.860 the transportation,
00:31:06.940 you don't need
00:31:07.320 a different facility.
00:31:08.900 You just keep it on site.
00:31:10.280 Put it in a little barrel
00:31:11.160 every once in a while.
00:31:13.180 So,
00:31:13.760 the cost of waste
00:31:15.420 is pretty low
00:31:16.740 at this point.
00:31:18.260 The risk of a meltdown
00:31:21.000 is what causes you
00:31:22.160 to be uninsurable.
00:31:23.940 But the newer models
00:31:25.780 have never had a meltdown.
00:31:27.900 Generation three,
00:31:30.260 and we're getting close
00:31:31.900 to generation four,
00:31:33.180 those have never had
00:31:34.100 a meltdown.
00:31:34.980 Generation four
00:31:35.660 can't have one.
00:31:37.120 It's built so that,
00:31:37.800 by design,
00:31:40.660 a failure causes it
00:31:41.420 to just stop.
00:31:43.200 The current designs,
00:31:44.600 including generation three,
00:31:45.960 you have to keep
00:31:46.720 the energy going.
00:31:48.260 If the energy is lost,
00:31:49.680 you could get
00:31:50.580 a meltdown.
00:31:52.000 So,
00:31:52.440 you don't build
00:31:53.000 something like Fukushima
00:31:54.140 where you put
00:31:55.420 your backup generators
00:31:56.580 below the ocean line.
00:32:00.140 It was below sea level.
00:32:01.780 They actually put
00:32:02.740 their backup generators
00:32:04.360 below sea level
00:32:06.660 in a tsunami zone.
00:32:10.460 Now,
00:32:11.040 is that a problem
00:32:11.700 with nuclear power?
00:32:13.460 Would you say,
00:32:14.460 and therefore,
00:32:15.100 nuclear power
00:32:15.780 is dangerous?
00:32:16.320 I mean,
00:32:18.120 that's a pretty big leap.
00:32:19.720 How about we just
00:32:20.400 don't put our backup
00:32:21.300 generators underwater?
00:32:23.420 That would be
00:32:24.020 like a start.
00:32:25.880 But,
00:32:26.360 you know,
00:32:26.520 that's not the only problem.
00:32:28.200 If you get to generation four,
00:32:29.740 you don't need
00:32:30.180 the power at all.
00:32:31.000 It just stops working
00:32:31.960 when the other
00:32:33.020 electricity goes off.
00:32:35.040 Yeah,
00:32:35.240 it's a dumb design.
00:32:38.500 Snoopy Boob says,
00:32:39.620 wow,
00:32:40.260 Scott's a nuclear scientist.
00:32:42.180 Which part of this
00:32:43.220 did I need to be
00:32:44.200 a nuclear scientist for?
00:32:46.320 Was there something
00:32:47.840 I forgot I said
00:32:48.920 that was sort of
00:32:50.080 science-y?
00:32:52.500 All right.
00:32:56.600 Jamie Raskin,
00:32:58.300 you recognize him
00:32:59.340 from the People
00:33:00.620 Who Signal Fake News.
00:33:02.400 He wants to investigate
00:33:04.200 Jared Kushner's
00:33:05.540 Saudi investments.
00:33:08.180 Now,
00:33:08.840 that's a really good play
00:33:10.180 from the Democrats.
00:33:12.080 In a political sense,
00:33:13.300 it's good.
00:33:14.040 And here's why it's good.
00:33:14.920 The general public
00:33:16.580 can't tell the difference
00:33:18.040 between a publicly
00:33:20.120 announced investment fund
00:33:23.840 that is publicly investing
00:33:26.100 in things with
00:33:27.020 well-known
00:33:28.100 public management,
00:33:29.880 and
00:33:32.800 literally a money-washing
00:33:35.420 slash bribery
00:33:37.620 influence scheme
00:33:38.860 with lots of shell
00:33:40.360 corporations
00:33:40.980 run by Hunter Biden.
00:33:42.300 The public
00:33:44.420 isn't going to know
00:33:45.060 the difference.
00:33:46.020 To them,
00:33:46.360 it's going to look like
00:33:47.020 two sketchy things.
00:33:48.780 Is it financial?
00:33:50.680 Yes.
00:33:52.460 Okay,
00:33:52.980 so there's money involved
00:33:53.860 and it's like
00:33:55.140 a politician?
00:33:56.500 Yes.
00:33:57.500 But is it a politician
00:33:58.400 who ran for office?
00:33:59.460 No.
00:34:00.540 Jared didn't run for office
00:34:01.560 and neither did Hunter.
00:34:02.980 Oh,
00:34:03.260 so it's similar,
00:34:04.480 you're saying.
00:34:05.540 Oh,
00:34:06.300 so if Jared is innocent,
00:34:08.700 well,
00:34:09.060 therefore,
00:34:10.040 logically,
00:34:12.320 Hunter did nothing wrong.
00:34:14.820 So it's a brilliant play
00:34:16.380 because most of the public
00:34:18.380 doesn't know
00:34:19.280 anything that's going on
00:34:21.080 with either story.
00:34:22.660 But if you tune in
00:34:23.880 and you're a casual viewer
00:34:24.900 and somebody says,
00:34:26.060 sure,
00:34:26.380 they're going after Hunter,
00:34:27.460 but look what Jared did.
00:34:29.220 You're a casual consumer
00:34:31.560 of the news.
00:34:32.800 That sounds like
00:34:33.420 a pretty good point,
00:34:34.180 doesn't it?
00:34:35.040 Oh,
00:34:35.320 they're both doing it.
00:34:36.020 Okay,
00:34:36.420 I guess I can forget
00:34:37.780 about that as a topic
00:34:38.820 because everybody does it.
00:34:40.440 Everybody's doing it.
00:34:41.720 No,
00:34:41.980 not everybody is setting up
00:34:43.200 shell corporations
00:34:44.200 and lying about it.
00:34:47.140 Jared,
00:34:47.760 Jared did an announcement.
00:34:50.140 He did a press release.
00:34:51.640 If you do a press release
00:34:53.180 about what you're doing
00:34:54.220 and,
00:34:55.720 you know,
00:34:55.980 the investments presumably
00:34:57.080 will be known
00:34:58.440 at some point
00:34:59.080 because there could be
00:34:59.780 big ones.
00:35:01.860 I don't know.
00:35:02.220 if you say
00:35:04.380 you're not comfortable
00:35:05.220 with Jared,
00:35:07.000 let's say,
00:35:07.780 capitalizing on his
00:35:08.840 connections he made
00:35:09.700 during office,
00:35:10.620 I get that.
00:35:12.600 But what is better
00:35:14.140 than doing it
00:35:14.820 transparently?
00:35:17.360 And are we better off
00:35:18.980 or worse off
00:35:19.720 if Saudi Arabia
00:35:21.200 and Jared Kushner
00:35:23.640 have a close relationship?
00:35:24.920 Are we better off
00:35:26.880 or worse off?
00:35:27.500 He was the architect
00:35:29.840 of the Abraham Accords.
00:35:32.020 We're 100% better off.
00:35:34.720 It's not even close.
00:35:36.540 You want the Saudis
00:35:38.160 to have a good
00:35:39.780 working relationship
00:35:40.760 with some prominent Americans
00:35:42.600 so that we work
00:35:44.360 better with them.
00:35:45.220 There's a reason
00:35:45.940 the Saudi,
00:35:46.960 you know,
00:35:47.780 has played well
00:35:48.620 with the Trump administration
00:35:50.060 and now Jared
00:35:50.920 because they played
00:35:52.360 well with them.
00:35:53.460 That's how it works.
00:35:55.260 And we'll need them
00:35:56.120 in the future.
00:35:56.980 So I don't mind
00:35:58.760 that we have connections
00:35:59.640 with them
00:36:00.240 as long as it's all public.
00:36:03.280 Did I talk about this?
00:36:06.240 Vivek dismantling
00:36:07.520 Harvey Levin on TMZ
00:36:10.480 and his co-host
00:36:11.840 whose name I can't remember,
00:36:12.860 Charles?
00:36:13.400 Charles, I think.
00:36:16.280 Now, I'm not going
00:36:17.720 to replay it for you
00:36:18.700 or describe it.
00:36:19.760 I'll just tell you
00:36:20.360 how he did it.
00:36:22.420 Here's how most Republicans
00:36:24.580 have argued about
00:36:26.100 climate change
00:36:27.060 and what happens, right?
00:36:28.500 Here's most Republicans.
00:36:30.900 Climate change
00:36:31.600 isn't even real.
00:36:33.420 Well, are you denying
00:36:34.360 all of science
00:36:35.240 because you're an idiot?
00:36:36.700 Well, yesterday it was cold.
00:36:39.760 Okay, you know that
00:36:40.680 temperature and climate
00:36:42.160 are really not the same
00:36:43.340 and anecdotally
00:36:44.620 that doesn't really
00:36:45.320 prove anything.
00:36:46.700 It snowed yesterday.
00:36:48.160 Okay, I don't even feel
00:36:49.320 like we're talking
00:36:49.840 about the same thing.
00:36:50.760 It's cold.
00:36:54.240 And then the Democrats
00:36:56.160 declare that the Republican,
00:36:57.840 whoever it is,
00:36:58.920 is a big old dope
00:37:00.000 who doesn't believe
00:37:01.420 in science, right?
00:37:02.980 That's the way
00:37:03.380 it goes every time.
00:37:04.720 Now, enter Vivek.
00:37:06.860 It took him about
00:37:08.080 one minute
00:37:09.100 to demonstrate
00:37:09.960 that everything
00:37:10.780 those two,
00:37:12.040 Harvey and Charles,
00:37:13.440 knew about climate change
00:37:14.820 was way less
00:37:16.240 than Vivek knew.
00:37:17.640 Okay, that was
00:37:20.980 his starting point.
00:37:23.240 You know,
00:37:23.440 he very clearly
00:37:24.300 told them
00:37:24.900 that he was more
00:37:26.620 well-read
00:37:27.220 and understood
00:37:28.420 the topic
00:37:28.940 at a more detailed level.
00:37:31.640 Next thing he did right
00:37:33.020 is he debunked
00:37:34.860 their, you know,
00:37:36.080 climate hoax thing
00:37:37.320 and said,
00:37:37.720 no, the agenda,
00:37:39.480 the "what we're doing
00:37:40.440 about it" part
00:37:41.120 is the hoax.
00:37:41.940 In other words,
00:37:42.460 we're not doing
00:37:43.040 the right stuff.
00:37:44.460 I'm not arguing
00:37:45.880 whether CO2
00:37:47.080 is a greenhouse gas.
00:37:49.380 I am instead arguing
00:37:50.740 that you have not
00:37:52.000 included all of the
00:37:53.260 costs and the benefits
00:37:54.160 in your analysis.
00:37:55.920 And that's when
00:37:56.720 he was done, right?
00:37:59.540 That's, you know,
00:38:00.340 when he started
00:38:00.960 to tell them
00:38:01.620 you've simply
00:38:02.540 not included
00:38:03.380 enough in your analysis,
00:38:05.820 you know,
00:38:06.020 you have to look
00:38:06.880 at the number
00:38:07.560 of people dying.
00:38:09.020 You have to look
00:38:09.860 at how well
00:38:10.400 we remediate.
00:38:11.880 You have to look
00:38:12.680 at how many people
00:38:13.460 die from cold,
00:38:14.660 which is way more
00:38:16.140 than the ones
00:38:16.600 who die from warmth.
00:38:19.100 And, you know,
00:38:20.040 once you've included
00:38:21.020 all of those things,
00:38:23.360 your best analysis
00:38:26.020 for keeping people alive
00:38:27.560 would be to go hard
00:38:29.320 with fossil fuels today
00:38:30.860 because that's what
00:38:31.920 keeps people alive today,
00:38:34.300 but also work hard
00:38:36.020 to get your nuclear
00:38:36.940 and your, you know,
00:38:38.040 all forms of energy up
00:38:39.260 so that you can transition
00:38:40.740 because you don't
00:38:41.400 want to pollute.
00:38:41.980 Now, Vivek doesn't
00:38:44.660 want to pollute.
00:38:46.380 So you don't have
00:38:47.480 to ask him this question
00:38:48.600 because it's obvious.
00:38:50.040 Would you rather have
00:38:51.000 a new nuclear power plant
00:38:52.400 or more coal?
00:38:55.400 You don't have
00:38:56.120 to ask him that.
00:38:57.480 Of course he wants
00:38:58.280 a nuclear power plant.
00:38:59.340 He says it directly, right?
00:39:01.240 So Vivek showed
00:39:03.740 that the difference
00:39:04.840 between their analyses
00:39:06.040 was that he had included
00:39:07.920 the value of human life
00:39:11.180 in his.
00:39:13.980 I'm not making this up.
00:39:16.740 Vivek showed that
00:39:17.600 he was including
00:39:18.380 the value of human life,
00:39:21.560 literally keeping people alive,
00:39:23.880 and that, you know,
00:39:25.180 as bad as the pollution
00:39:26.980 is from fossil fuels,
00:39:29.420 it is how you keep them alive.
00:39:31.460 And it's not even close.
00:39:33.500 If you take people's energy away,
00:39:35.220 things don't work out at all.
00:39:36.460 So the funny thing about it,
00:39:40.660 if you get a chance
00:39:41.300 to look at it,
00:39:41.940 you can just Google it,
00:39:43.180 look for Vivek and TMZ,
00:39:45.040 it'll pop right up.
00:39:46.440 You have to watch
00:39:47.300 the reactions,
00:39:49.100 the physical reactions
00:39:50.060 of the two hosts,
00:39:51.540 Harvey and Charles,
00:39:52.600 and they start getting animated,
00:39:54.820 and they're triggered
00:39:56.780 into cognitive dissonance.
00:39:58.700 Because I'll tell you what,
00:39:59.880 nobody expects to lose.
00:40:02.320 If you're Harvey and Charles,
00:40:03.900 these are well-informed
00:40:05.620 public hosts,
00:40:09.060 you know,
00:40:09.220 these are high-functioning people
00:40:10.980 who pay attention
00:40:11.860 to the news.
00:40:12.800 It's their business, right?
00:40:14.560 So when they get
00:40:16.420 into a conversation
00:40:17.380 about climate change,
00:40:20.240 they expect to be
00:40:22.200 on top of the mountain
00:40:23.100 and just pissing on the ants
00:40:25.120 that are running below,
00:40:26.320 because that's how
00:40:27.020 it's supposed to work.
00:40:28.160 Because the only people
00:40:29.160 they've ever talked to
00:40:30.160 who disagreed with them
00:40:31.480 were idiots.
00:40:32.540 If I'm being honest,
00:40:35.420 they've only talked to idiots.
00:40:37.280 Because the idiot view is,
00:40:38.800 oh, it snowed today,
00:40:40.480 so no climate change.
00:40:42.800 You're not going to win with that.
00:40:45.260 But as soon as Vivek came in
00:40:46.760 and said, you know,
00:40:47.860 why don't you count
00:40:48.800 keeping people alive?
00:40:51.540 How about that
00:40:52.420 as your best metric?
00:40:53.740 Keeping people alive.
00:40:55.580 And that just shut them down.
00:40:57.520 It just made them
00:40:58.200 look like idiots.
00:40:58.880 Because what was
00:41:00.740 their argument?
00:41:02.420 Well, I guess I would rather
00:41:03.820 kill ten times
00:41:04.660 as many people,
00:41:06.060 but, you know,
00:41:06.760 I really like solar.
00:41:09.040 There's nothing there.
00:41:10.900 Once you've made the case
00:41:12.140 that more people
00:41:12.900 will clearly die
00:41:14.220 with the current
00:41:15.420 set of policies,
00:41:17.400 then
00:41:18.680 they don't even know
00:41:20.000 how to respond to that.
00:41:21.140 So they were both
00:41:21.800 in deep cognitive dissonance,
00:41:23.500 and it was wonderful.
00:41:24.140 So if you'd like to see
00:41:25.540 what cognitive dissonance
00:41:26.700 looks like,
00:41:28.280 that's a real good example.
00:41:29.880 You have to watch them
00:41:30.760 flip out.
00:41:31.620 They started just spewing things
00:41:33.160 and trying to talk
00:41:34.060 over each other,
00:41:34.920 and they fell apart.
00:41:36.900 So it was wonderful.
00:41:40.000 All right.
00:41:41.500 I saw this joke
00:41:42.720 from Wokasaurus Rex
00:41:44.320 on X.
00:41:46.240 He says,
00:41:47.060 he says,
00:41:49.140 new game.
00:41:50.380 All you have to do
00:41:51.380 is add the quote,
00:41:52.640 I've never seen
00:41:53.560 anything like it
00:41:54.540 to any statement
00:41:55.740 about anything
00:41:56.480 to prove your point
00:41:57.340 about how bad climate change
00:41:58.560 is affecting the world.
00:41:59.740 You don't need any evidence.
00:42:01.000 The phrase itself is enough.
00:42:02.340 So he gives an example.
00:42:03.980 It rained today.
00:42:05.520 I've never seen
00:42:06.280 anything like it.
00:42:09.020 And I laughed for 10 minutes
00:42:10.980 because that's exactly
00:42:11.880 what the news coverage is.
00:42:13.820 It's like,
00:42:14.480 the wind
00:42:15.060 was very strong.
00:42:17.920 We've never seen
00:42:18.700 anything like it.
00:42:19.400 The five late show hosts
00:42:24.160 minus the good one,
00:42:26.040 Greg Gutfeld.
00:42:27.020 If you saw the podcast
00:42:28.280 with the five of them,
00:42:30.940 they're like,
00:42:32.100 it rained so hard.
00:42:34.340 I've never seen
00:42:34.860 anything like it.
00:42:35.920 Anyway,
00:42:36.600 to me that was funny.
00:42:38.160 RFK Jr.
00:42:39.000 has these stats,
00:42:40.920 which,
00:42:41.740 if these are true,
00:42:42.780 I don't know
00:42:45.860 what to say
00:42:46.360 about how Biden
00:42:47.780 could even be
00:42:48.800 polling anywhere
00:42:50.960 near equal.
00:42:52.020 But this is what
00:42:52.760 RFK Jr.
00:42:53.600 says about his own party.
00:42:55.640 Right?
00:42:56.020 President Biden
00:42:56.820 justified his
00:42:58.040 open border policy,
00:42:59.740 so even RFK Jr.
00:43:00.980 calls it
00:43:01.380 an open border.
00:43:05.060 Remember,
00:43:05.700 he's a Democrat.
00:43:07.420 And he's just,
00:43:07.900 he was there.
00:43:09.420 He did a,
00:43:09.940 he did a little documentary.
00:43:11.300 It's an open border.
00:43:12.780 It's because
00:43:13.420 it's an open border.
00:43:14.900 Let me say that again.
00:43:16.360 Do you know why
00:43:16.980 RFK Jr.
00:43:18.020 calls it
00:43:18.420 an open border?
00:43:20.540 Because he's not
00:43:21.360 a fucking liar.
00:43:22.700 That's why.
00:43:23.900 Right?
00:43:24.780 He's not a liar.
00:43:26.540 So he couldn't
00:43:27.620 not call it that
00:43:28.520 because that's
00:43:29.280 obviously what it is.
00:43:31.020 So all credit to him
00:43:32.320 for, you know,
00:43:33.180 bringing some truth
00:43:34.040 to the topic.
00:43:35.320 But look at
00:43:36.180 these numbers.
00:43:39.020 He says the Trump
00:43:40.180 era border patrol
00:43:41.160 had 2,600
00:43:42.480 quote,
00:43:43.700 children in cages
00:43:44.840 as they like to say
00:43:46.560 or his critics
00:43:48.060 like to say.
00:43:49.320 And RFK Jr.
00:43:50.240 says today
00:43:50.760 there are 12,000
00:43:52.100 children in cages
00:43:53.620 plus,
00:43:55.480 and here's the part
00:43:56.200 that I can't even
00:43:57.400 process,
00:43:58.840 85,000 children
00:44:00.160 have disappeared.
00:44:04.000 Now,
00:44:04.740 disappeared means
00:44:05.600 we just don't know
00:44:06.820 where they went.
00:44:07.440 It doesn't mean
00:44:07.940 that something bad
00:44:08.660 happened to them
00:44:09.360 necessarily.
00:44:11.680 But out of
00:44:12.420 85,000 missing
00:44:13.660 children,
00:44:14.820 which a lot of them
00:44:15.780 I assume
00:44:16.140 were unaccompanied
00:44:17.000 minors
00:44:17.500 and were probably
00:44:19.240 trafficked
00:44:19.740 intentionally,
00:44:21.080 I don't know
00:44:21.840 what percentage
00:44:22.500 of the 85,000
00:44:24.380 were being trafficked
00:44:26.320 and abused,
00:44:27.000 but it's not zero.
00:44:28.080 I don't know
00:44:29.560 what it is,
00:44:30.180 but it's going
00:44:31.620 to be some
00:44:32.020 shocking percentage.
00:44:33.500 And is it fair
00:44:37.320 to say that
00:44:37.840 that makes Biden
00:44:38.580 the biggest
00:44:39.140 child trafficker
00:44:40.540 in the world?
00:44:42.260 Because it's
00:44:43.720 entirely his
00:44:44.620 decision whether
00:44:45.340 this happens
00:44:45.920 or not.
00:44:47.160 He's the only
00:44:47.880 one who decides
00:44:48.840 whether this
00:44:49.460 is going to
00:44:49.860 happen.
00:44:50.780 Nobody else.
00:44:52.000 Just one guy.
00:44:53.820 And that one guy
00:44:54.660 apparently is
00:44:55.660 responsible for
00:44:56.460 12,000 children
00:44:57.500 in cages
00:44:58.060 and 85,000
00:44:59.660 missing,
00:45:00.560 of which
00:45:01.120 what would be
00:45:03.800 a small guess?
00:45:05.000 20,000 of them
00:45:06.060 are sexually abused
00:45:07.020 and still are
00:45:08.720 like at this
00:45:09.340 minute.
00:45:10.260 What's your guess?
00:45:11.740 It's probably
00:45:12.180 something in that
00:45:12.860 range.
00:45:14.280 This is the
00:45:15.240 deepest level
00:45:16.860 of evil
00:45:17.580 that I've seen
00:45:18.740 in America
00:45:19.700 since
00:45:21.620 slavery
00:45:22.880 or Jim Crow
00:45:23.760 or something.
00:45:26.060 Right?
00:45:26.640 I mean,
00:45:26.980 this is sort of
00:45:27.740 epic evil.
00:45:29.720 This is evil
00:45:30.580 on a scale
00:45:31.200 that honestly
00:45:32.620 I had no idea.
00:45:34.220 I had no idea
00:45:35.000 it was that big.
00:45:36.300 I knew it was
00:45:37.400 hard to track
00:45:37.880 people once they
00:45:38.500 got in,
00:45:38.880 but 85,000
00:45:40.340 kids that we
00:45:41.620 don't have a
00:45:42.040 good idea
00:45:42.440 even where
00:45:42.820 they are?
00:45:44.200 That is
00:45:45.100 not good
00:45:46.200 dad action.
00:45:48.140 Right?
00:45:48.540 You need a
00:45:49.260 dad.
00:45:50.480 You're going to
00:45:51.060 need somebody
00:45:51.620 in that office
00:45:52.620 who cares about
00:45:54.500 children.
00:45:55.480 Now the good
00:45:56.080 news is a
00:45:56.780 number of
00:45:57.160 candidates fit
00:45:57.960 that description.
00:45:59.860 But wow.
00:46:03.400 All right.
00:46:06.100 Vivek,
00:46:06.860 who makes
00:46:07.660 news so
00:46:08.540 well.
00:46:09.420 Vivek is the
00:46:10.180 best earned
00:46:11.180 media guy
00:46:12.220 of all time.
00:46:13.600 Trump got
00:46:14.200 tons of
00:46:14.760 earned media,
00:46:15.920 but often
00:46:16.580 they were just
00:46:17.080 criticizing him.
00:46:19.200 So his
00:46:19.720 earned media
00:46:20.260 was working
00:46:20.780 against him
00:46:21.340 as much
00:46:21.660 as it was
00:46:22.080 for him.
00:46:22.840 Vivek's is
00:46:23.460 more of a
00:46:25.020 60-40,
00:46:25.700 like, it's
00:46:27.160 overwhelmingly
00:46:27.780 positive,
00:46:28.940 but of course
00:46:29.500 critics will
00:46:30.280 try to turn
00:46:30.820 it into
00:46:31.220 something it
00:46:31.760 isn't.
00:46:32.540 But here's
00:46:32.880 something he
00:46:33.260 said today,
00:46:33.880 another perfect
00:46:34.760 thing to
00:46:35.300 highlight.
00:46:36.420 He says,
00:46:36.900 under General
00:46:37.680 CQ Brown's
00:46:38.960 leadership,
00:46:39.440 the Air
00:46:39.680 Force is
00:46:40.120 trying to
00:46:40.560 reduce
00:46:41.020 white male
00:46:42.200 pilots from
00:46:42.880 86% of
00:46:43.920 flyers down
00:46:44.600 to 43%
00:46:45.820 amidst a
00:46:47.080 major
00:46:47.400 recruiting
00:46:48.080 crisis.
00:46:48.560 So there's
00:46:50.000 a recruiting
00:46:50.580 crisis,
00:46:51.940 can't get
00:46:52.640 enough people,
00:46:53.960 but they're
00:46:54.700 going to make
00:46:55.120 it much,
00:46:55.760 much harder
00:46:56.140 to get
00:46:56.500 good people
00:46:57.000 by discriminating
00:46:58.500 against,
00:46:59.500 I assume,
00:47:00.340 straight white
00:47:00.920 males.
00:47:01.780 I assume if
00:47:02.480 you're gay and a
00:47:03.200 white male,
00:47:03.760 maybe you're
00:47:04.160 still good,
00:47:05.720 but if you're
00:47:06.320 a straight white
00:47:07.380 male, they're
00:47:07.840 going to tell
00:47:08.160 you, pretty
00:47:10.620 much, we're
00:47:11.080 looking for
00:47:11.480 somebody else.
00:47:13.280 Now,
00:47:15.340 why would a
00:47:17.560 straight white
00:47:18.100 male join
00:47:18.760 the military?
00:47:20.720 If you were
00:47:21.600 a straight white
00:47:22.920 male and one
00:47:23.580 of your options
00:47:24.600 was to join
00:47:25.240 the military,
00:47:26.040 I would advise
00:47:26.980 against it.
00:47:28.520 Now, I
00:47:28.920 typically would
00:47:29.580 not advise
00:47:30.220 against joining
00:47:31.440 the military,
00:47:32.340 even though it's
00:47:32.980 clearly a risky
00:47:34.160 proposition by its
00:47:35.360 nature, and I
00:47:36.760 don't usually
00:47:37.640 advise people to
00:47:38.660 take risky
00:47:39.280 actions.
00:47:40.780 But, at least
00:47:41.860 there's a payoff,
00:47:43.560 right?
00:47:43.900 At least you get
00:47:44.860 benefits and you
00:47:46.060 learn a skill,
00:47:47.200 maybe.
00:47:47.400 There's a lot
00:47:47.860 to gain.
00:47:49.380 So, you know,
00:47:50.120 it's a personal
00:47:50.620 decision.
00:47:51.840 But at this
00:47:52.500 point, I think I
00:47:53.260 will weigh in
00:47:53.780 and say, if you're
00:47:54.380 a straight white
00:47:55.220 male, the
00:47:56.300 military just
00:47:57.060 said you're not
00:47:58.040 going to do
00:47:58.420 well.
00:47:59.220 So you should
00:47:59.820 probably look for
00:48:00.300 a different plan.
00:48:02.020 So I can't
00:48:02.680 recommend the
00:48:03.520 U.S.
00:48:03.880 military to a
00:48:04.720 straight white
00:48:05.360 male under the
00:48:06.120 current conditions.
00:48:07.980 This was about
00:48:08.720 the Air Force,
00:48:09.380 but you assume
00:48:09.840 it's the same
00:48:10.340 everywhere.
00:48:10.700 All right.
00:48:13.800 So Vivek did a
00:48:14.700 video in which
00:48:15.320 he's debunking
00:48:16.300 a number of
00:48:17.900 hoaxes against
00:48:18.580 him.
00:48:19.440 One of the
00:48:19.980 hoaxes was that
00:48:21.140 he made a lot
00:48:21.860 of money on a
00:48:22.500 failed Alzheimer's
00:48:23.660 drug.
00:48:24.580 How many of you
00:48:25.080 think that's
00:48:25.540 true?
00:48:26.860 That he made a
00:48:27.560 lot of money on
00:48:28.280 a failed
00:48:28.920 Alzheimer's drug?
00:48:31.040 Because that's
00:48:31.640 one of the main
00:48:32.220 claims about him.
00:48:33.560 No, it's not
00:48:34.160 true.
00:48:35.020 He did have a
00:48:35.980 failed Alzheimer's
00:48:36.800 drug.
00:48:37.080 As he points
00:48:37.860 out, over
00:48:38.780 99% of all
00:48:40.140 Alzheimer's drug
00:48:41.600 attempts have failed.
00:48:43.560 He was one of
00:48:44.440 them.
00:48:44.940 And it was in a
00:48:45.520 subsidiary of his
00:48:46.580 company, and they
00:48:47.960 never sold any
00:48:48.720 stock in the
00:48:49.360 subsidiary.
00:48:50.740 So the stock
00:48:51.660 was worth a
00:48:52.840 zillion dollars,
00:48:53.660 and then it was
00:48:54.020 worth zero, but
00:48:55.260 he rode it all
00:48:55.980 the way.
00:48:56.680 He rode it to
00:48:57.300 the top, and
00:48:57.740 he rode it to
00:48:58.160 the bottom, which
00:48:59.400 was the ethical
00:49:01.040 thing to do.
00:49:02.480 Because there was
00:49:03.140 a point where he
00:49:03.720 could have sold
00:49:04.120 his stock at the
00:49:04.800 top before he
00:49:06.320 knew if the
00:49:06.740 drug worked.
00:49:07.080 He could have
00:49:08.720 done that.
00:49:09.840 Decided not to.
00:49:11.120 Decided to let
00:49:12.020 his investment
00:49:12.700 follow the actual
00:49:14.420 result of the
00:49:15.340 trial.
00:49:16.020 The trial said it
00:49:17.020 didn't work, and
00:49:17.660 then that was
00:49:18.220 the end of it.
00:49:21.480 So that's his
00:49:23.340 version.
00:49:24.140 Let me say that
00:49:25.040 the only thing I
00:49:26.040 know for sure is
00:49:26.800 that that was his
00:49:27.800 explanation.
00:49:29.280 I wasn't there,
00:49:30.420 but it sounded
00:49:31.160 right to me.
00:49:33.660 There's a study
00:49:34.620 out of CU Boulder
00:49:36.040 that
00:49:37.060 says that opposites
00:49:38.220 don't really
00:49:38.740 attract.
00:49:40.980 So aren't you
00:49:41.920 glad you studied
00:49:43.000 that?
00:49:44.180 You know, I've
00:49:44.660 been wondering why
00:49:45.420 Lizzo wasn't
00:49:46.420 returning my
00:49:47.100 calls, but
00:49:48.920 science has now
00:49:49.700 answered this.
00:49:50.320 Apparently, opposites
00:49:51.880 don't attract.
00:49:53.760 So no call
00:49:55.720 coming from me.
00:49:57.260 Was there
00:49:57.800 anybody who
00:49:58.520 needed science
00:49:59.720 to tell them
00:50:00.420 that people
00:50:02.440 like to have
00:50:02.920 something in
00:50:03.460 common, you
00:50:04.220 know, the
00:50:04.420 important stuff
00:50:05.120 in common
00:50:05.700 with their
00:50:06.180 mates?
00:50:07.080 You know, they
00:50:07.460 like to have
00:50:07.940 the same
00:50:08.300 religion.
00:50:09.840 Usually they
00:50:10.520 like to have
00:50:10.900 the same, at
00:50:11.400 least, political
00:50:12.280 leaning.
00:50:14.260 They like to be
00:50:15.100 somewhere in the
00:50:15.840 same age, usually,
00:50:17.520 unless one of them
00:50:18.200 is rich.
00:50:20.300 And, yeah, I
00:50:23.380 feel like they
00:50:24.100 didn't need to
00:50:24.640 study this so
00:50:25.360 much, huh?
00:50:26.380 Do you think
00:50:27.000 they wasted a
00:50:27.640 little money
00:50:28.000 studying this?
00:50:29.380 Feels like it.
00:50:30.300 Feels like it,
00:50:30.840 yeah.
00:50:33.140 Well, let's
00:50:33.740 talk about
00:50:34.420 Ukraine, and
00:50:36.780 then we're
00:50:37.040 going to talk
00:50:37.240 about the
00:50:37.580 double slit
00:50:38.100 experiment, if
00:50:40.080 you haven't had
00:50:40.560 enough science.
00:50:42.080 All right,
00:50:42.480 Ukraine,
00:50:43.420 allegedly, don't
00:50:44.840 believe anything
00:50:45.380 coming out of
00:50:45.920 Ukraine.
00:50:46.940 Don't believe
00:50:47.580 anything coming
00:50:48.240 out of Ukraine.
00:50:48.820 This is just
00:50:49.280 the story.
00:50:50.620 We're told
00:50:51.220 that Ukraine
00:50:52.700 has made a
00:50:53.500 small puncture
00:50:54.700 in the
00:50:55.200 Russian lines,
00:50:57.220 but I have
00:50:58.620 to admit, I
00:50:59.820 was trying to
00:51:00.760 visualize what
00:51:02.200 difference that
00:51:02.880 makes.
00:51:04.200 Because it
00:51:04.540 seems to me,
00:51:05.160 if you punch
00:51:05.700 through a line,
00:51:07.280 then you're
00:51:07.760 right in the
00:51:08.120 middle of the
00:51:08.920 strongest part
00:51:09.780 of the Russian
00:51:10.360 military, which
00:51:12.000 is their side
00:51:13.020 of the line.
00:51:14.400 So it feels
00:51:14.960 like it could
00:51:15.800 be a trap,
00:51:17.220 you know,
00:51:17.540 bringing all
00:51:19.060 the good
00:51:19.460 assets, oh,
00:51:20.220 we finally got
00:51:20.820 through, put
00:51:21.580 our best
00:51:21.960 assets through
00:51:22.560 the hole, and
00:51:23.680 then they get
00:51:24.500 destroyed.
00:51:25.200 So I was
00:51:26.660 trying to
00:51:27.240 sort of
00:51:27.760 visualize,
00:51:29.160 you know,
00:51:30.360 how that
00:51:31.480 works.
00:51:33.000 But I saw
00:51:34.920 an explanation
00:51:35.540 that I liked
00:51:36.100 a lot, which
00:51:36.620 is that the
00:51:37.660 Russian forces,
00:51:38.660 this is according
00:51:39.300 to Dave
00:51:41.040 DeMauro, a
00:51:42.320 retired U.S.
00:51:42.940 Army non-commissioned
00:51:44.020 officer, who
00:51:45.140 was a front-line
00:51:46.080 military intelligence
00:51:47.220 person during the
00:51:48.180 Cold War and in
00:51:49.100 Iraq.
00:51:49.780 So he knows what
00:51:50.600 he's talking about.
00:51:51.540 And he says that
00:51:52.220 the Russians don't
00:51:53.060 know how to fight
00:51:53.740 in all directions.
00:51:56.020 In other words,
00:51:56.960 you've got a lot
00:51:57.640 of, let's say,
00:51:58.700 artillery batteries
00:51:59.700 that are designed to shoot artillery, not bullets, in one direction.
00:52:06.140 Now, obviously,
00:52:06.800 they could turn it
00:52:07.460 around, but it's
00:52:08.700 not meant for
00:52:09.480 up-close fighting.
00:52:11.200 Right?
00:52:11.640 So if you can get
00:52:12.860 a small, you know,
00:52:14.640 heavy machine gun
00:52:15.700 kind of truck,
00:52:17.100 you can just pull
00:52:18.180 up behind the
00:52:18.980 artillery battery and
00:52:20.040 wipe them out,
00:52:20.700 because they're
00:52:21.640 not really designed
00:52:22.940 for defending from
00:52:24.140 behind.
00:52:25.220 So the idea is that
00:52:26.800 if you get a few,
00:52:27.860 you know, a few
00:52:29.180 assets through the
00:52:30.020 line, you can get
00:52:31.420 behind some of
00:52:32.220 their stuff, totally
00:52:33.840 mess up their, you
00:52:35.100 know, coordination
00:52:35.780 because they don't
00:52:36.500 know what's going
00:52:36.980 on because you're
00:52:37.620 shooting them from
00:52:38.180 both sides.
00:52:39.520 And then there's
00:52:41.440 chaos and then
00:52:42.260 somehow you can take
00:52:43.140 advantage of that.
00:52:44.220 So that's the idea.
00:52:45.520 So that's not my
00:52:47.020 prediction or anything.
00:52:48.480 Just so you wondered
00:52:49.340 how that works.
00:52:51.040 The deal is to get behind people who are not good at defending themselves from behind.
00:52:57.120 So there's more to
00:52:58.160 it, but that's one thing that wasn't obvious to me.
00:53:01.780 So I like that
00:53:02.720 explanation.
00:53:04.100 How many of you are
00:53:04.760 aware of the physics
00:53:06.140 experiment called the
00:53:07.280 double slit experiment?
00:53:10.380 I'll tell you what it
00:53:11.280 is, but how many of
00:53:12.540 you are familiar with
00:53:13.340 it already?
00:53:13.940 A lot of you,
00:53:14.700 right?
00:53:15.020 It's a famous
00:53:15.680 experiment in which it
00:53:18.440 gets to the question
00:53:19.580 of whether light is
00:53:21.220 a particle or a
00:53:22.280 wave.
00:53:23.080 Have you ever heard
00:53:23.620 that?
00:53:24.160 If you're like a
00:53:24.960 science nerd, you've
00:53:26.700 heard, oh, sometimes
00:53:27.880 light is a particle,
00:53:29.720 like a photon, and
00:53:31.540 sometimes it's a
00:53:32.340 wave.
00:53:33.440 What the hell does
00:53:34.540 that mean?
00:53:35.900 What's a wave?
00:53:37.420 I never understood
00:53:38.620 that until finally
00:53:39.620 somebody explained that
00:53:41.700 a wave is just
00:53:42.680 probability.
00:53:45.120 Why don't they just
00:53:46.060 say that?
00:53:46.560 A wave?
00:53:49.240 And then they say
00:53:50.040 the wave is
00:53:51.020 collapsed.
00:53:52.300 So the probability
00:53:53.400 is collapsed into a
00:53:55.620 particle when you
00:53:56.500 measure it.
00:53:59.100 You know what would
00:53:59.900 be another way to
00:54:00.580 explain that?
00:54:05.000 There was always a
00:54:06.080 photon.
00:54:06.920 We just didn't know
00:54:07.740 where it was.
00:54:08.920 And then when we
00:54:09.580 measured it, we knew
00:54:10.360 where it was, and
00:54:10.980 now we know where
00:54:11.520 it is.
00:54:11.860 So there's this
00:54:14.220 whole, like, big
00:54:15.120 science-y experiment
00:54:16.380 that's supposed to
00:54:17.080 tell you about the
00:54:17.660 nature of reality, and
00:54:19.940 all it is is it's
00:54:20.760 hard to find out
00:54:21.360 where a photon is.
00:54:22.720 It's all bullshit.
00:54:24.260 It's like one of the
00:54:24.940 most basic things that
00:54:26.240 they try to use to
00:54:27.180 explain how scientists
00:54:28.940 understand the
00:54:29.920 universe, but you
00:54:31.060 don't.
00:54:31.940 No, they just don't
00:54:32.760 know how to use words
00:54:33.580 clearly.
00:54:34.540 If they use words
00:54:35.700 clearly, they would
00:54:36.680 say, well, there are
00:54:38.040 photons.
00:54:39.140 We don't know
00:54:39.860 where they are
00:54:40.280 until we look, but
00:54:41.820 once we look, there
00:54:43.240 they are.
00:54:44.960 And they call that
00:54:45.860 collapsing the wave function.
00:54:48.820 How about tell me
00:54:49.500 you didn't know
00:54:49.920 where it was, but
00:54:50.500 then when you
00:54:50.980 checked, there it
00:54:51.500 was.
00:54:53.240 Now, there's an
00:54:55.040 oddity to it about
00:54:58.060 whether you're
00:54:58.920 measuring it or
00:54:59.600 not.
00:55:00.640 So what the
00:55:01.400 double-slit
00:55:03.460 experiment tries to
00:55:04.940 show, but I think
00:55:05.740 it's all bullshit
00:55:06.300 actually, is that
00:55:07.780 there's something
00:55:08.520 about observation
00:55:09.800 that turns things
00:55:11.880 real.
00:55:13.460 Now, suppose you
00:55:14.820 don't believe what
00:55:15.540 I just said, and
00:55:17.120 instead you believe
00:55:17.960 the scientists.
00:55:19.440 So the scientists
00:55:20.220 would say that the
00:55:21.180 things don't become
00:55:22.180 real until they're
00:55:23.600 looked at.
00:55:25.240 And there was a
00:55:25.820 study that I tweeted
00:55:27.140 today, in May, I
00:55:28.880 guess, or it was
00:55:30.820 written up in May
00:55:31.480 of this year, that
00:55:33.520 there is no
00:55:35.320 preferred reality,
00:55:36.480 it's all subjective,
00:55:37.780 and that the two
00:55:39.900 movies on one
00:55:40.800 screen is actually
00:55:41.620 literally what's
00:55:42.460 happening.
00:55:43.580 That is to say, my
00:55:45.040 subjective impression
00:55:46.120 of what's happening
00:55:46.900 is no more or less
00:55:49.360 true than your
00:55:51.220 subjective impression,
00:55:52.780 and there isn't any
00:55:53.880 base reality.
00:55:55.540 So science is sort
00:55:56.940 of heading in that
00:55:57.500 direction.
00:55:58.260 So that's two movies
00:55:59.080 on one screen, which
00:56:00.020 I've been telling you
00:56:00.660 forever, that reality
00:56:02.400 is obviously subjective,
00:56:03.660 in my opinion.
00:56:04.200 So, here's my
00:56:07.640 additive tying
00:56:11.640 together of two
00:56:12.600 things that shouldn't
00:56:13.380 be tied together to
00:56:15.040 blow your mind.
00:56:16.360 It's coming.
00:56:17.800 I haven't said it
00:56:18.420 yet.
00:56:20.880 If it's true that
00:56:23.340 you can collapse
00:56:24.280 reality by observation
00:56:26.720 and also measurement,
00:56:28.460 if it's true, that's
00:56:31.240 what the scientists
00:56:31.780 say, could it also
00:56:33.820 be true, since
00:56:35.420 observation is sort
00:56:37.460 of a weird thing, it
00:56:38.340 has to do with
00:56:38.880 consciousness, I guess
00:56:40.420 if your eyes were
00:56:41.300 closed, it wouldn't
00:56:42.580 happen, the scientists
00:56:44.280 would say.
00:56:45.120 If you were standing
00:56:45.940 right in front of it
00:56:46.640 with your eyes closed
00:56:47.560 and you couldn't tell
00:56:48.720 it was happening, the
00:56:50.540 particle wouldn't really
00:56:51.860 be anywhere.
00:56:52.400 But as soon as you
00:56:54.060 open your eyes,
00:56:55.400 boop, it pops
00:56:56.700 immediately into one
00:56:57.620 position.
00:56:58.860 Now, I'm simplifying
00:56:59.700 because you can't see
00:57:00.400 a photon, but let's
00:57:01.320 say you were using
00:57:01.840 equipment to look at
00:57:02.760 it.
00:57:03.780 All right.
00:57:04.620 So, that's what the
00:57:05.880 actual scientists say
00:57:06.840 is real.
00:57:08.160 So, if you were to
00:57:09.160 take their belief
00:57:11.100 that that's real,
00:57:12.520 I'm going to extend
00:57:13.660 that now to a thing
00:57:14.640 called affirmations
00:57:15.880 and positive thinking.
00:57:19.420 Affirmations are
00:57:20.320 visualizing what you
00:57:21.840 want to happen in
00:57:22.680 the future, as if
00:57:24.900 it's a reality.
00:57:27.020 Now, if being
00:57:29.000 conscious in the
00:57:29.900 present can collapse
00:57:31.680 a wave function, as
00:57:32.580 they say, and make
00:57:33.260 something real, is it
00:57:35.300 possible that you can
00:57:37.340 collapse reality by
00:57:40.900 imagining it really
00:57:42.000 clearly?
00:57:45.340 Because if you told
00:57:46.280 me that consciousness
00:57:47.160 can collapse reality,
00:57:48.500 I would say, well,
00:57:49.140 then you're telling me
00:57:49.740 reality is subjective.
00:57:50.700 Because if I'm not
00:57:52.200 there, it's not
00:57:52.720 collapsing.
00:57:54.420 Or if I'm not
00:57:55.340 measuring it.
00:57:54.420 But, and this is anecdotal, my personal observation
00:58:04.660 over a lifetime is that
00:58:06.460 the more clearly I can
00:58:07.880 see a specific future,
00:58:09.380 the more likely it
00:58:10.260 happens.
00:58:11.620 Whether it's good or
00:58:12.560 bad.
00:58:13.920 So, I've told you that
00:58:14.760 I have this ongoing
00:58:16.000 problem with water
00:58:17.020 leaks.
00:58:17.360 You know, no matter
00:58:19.960 where I go, whatever
00:58:21.040 house, it has nothing
00:58:22.060 to do with the quality
00:58:22.900 of the construction.
00:58:24.420 It's just everywhere I
00:58:25.560 go, there are massive
00:58:26.440 water leaks.
00:58:27.940 Yesterday, I'm writing a
00:58:29.180 check for my, you
00:58:30.420 know, handyman slash
00:58:31.720 builder kind of guy who
00:58:33.740 does a lot of work in
00:58:34.520 my house.
00:58:35.640 And he looks up and he
00:58:36.840 goes, uh-oh.
00:58:38.820 And I'm like, I'm not
00:58:41.420 looking up.
00:58:41.900 And he keeps looking
00:58:43.680 up.
00:58:43.920 He's like, oh, wow.
00:58:46.060 And I'm like, no, don't
00:58:46.960 look up, Scott.
00:58:48.020 That ceiling was just
00:58:49.780 redone, just repainted.
00:58:51.920 There's no problem up
00:58:53.340 there.
00:58:53.600 And finally, I did look
00:58:55.460 up and there's this big
00:58:57.520 water bulge directly
00:58:58.920 above our heads.
00:59:00.600 You know, where the
00:59:01.540 paint starts to bulge
00:59:02.620 down just before it
00:59:04.140 pops and turns into a
00:59:05.740 shower.
00:59:08.320 Yeah.
00:59:09.060 Now, do you think that
00:59:10.800 that was going to happen
00:59:11.500 on its own, or do you
00:59:13.120 think the fact that
00:59:13.980 every time I walk to
00:59:15.100 the house, I look at
00:59:16.120 the ceiling and say,
00:59:17.060 well, where's my next
00:59:18.080 leak?
00:59:19.100 Because I expect them
00:59:20.200 to be there.
00:59:21.620 That leak was exactly
00:59:22.920 where I expected it to
00:59:24.000 be.
00:59:24.240 I mean, in the kitchen.
00:59:26.460 Sure enough, there it
00:59:27.300 was.
00:59:28.040 Do you know how many
00:59:28.580 times I've been in a
00:59:29.500 kitchen that rained
00:59:31.280 where actually water was
00:59:34.040 coming from the
00:59:34.640 ceiling?
00:59:36.000 I don't know the exact
00:59:37.200 number, but I think
00:59:37.900 it's about a dozen.
00:59:39.600 A dozen times.
00:59:40.600 Different houses.
00:59:42.440 Yeah.
00:59:44.120 Does that happen to
00:59:45.000 you?
00:59:45.680 No.
00:59:46.500 But I also spend a lot
00:59:47.880 of time visualizing it
00:59:49.060 accidentally, because I
00:59:50.320 don't want it to happen,
00:59:51.120 but I think about it all
00:59:52.060 the time, because of the
00:59:53.120 history.
00:59:54.280 So I've got a feeling that may be one of the reasons somebody has repetitive problems. People have different sets of problems, but they have the same one all the time. I've got this one, and it happens all the time.
01:00:06.680 It just never stops.
01:00:07.940 And by the way, this is
01:00:10.080 not interpreting the
01:00:13.360 past.
01:00:14.040 I've been telling my
01:00:14.900 followers for years that
01:00:16.860 I have this continuous problem, and then when I have one, I report it so they can see it themselves.
01:00:21.760 I take pictures of it.
01:00:22.720 It's real.
01:00:23.060 I have continuous water
01:00:25.300 leak problems.
01:00:26.920 Some people don't.
01:00:28.980 So could it be that
01:00:31.220 literally everything that
01:00:32.640 happens is some function
01:00:34.200 of our imagination, and
01:00:36.500 if you're not imagining
01:00:37.580 anything specific, then
01:00:39.880 it's random.
01:00:41.380 It just happens to you.
01:00:43.380 But if you visualize it, it
01:00:44.820 happens.
01:00:48.780 Yeah.
01:00:49.000 So, that's my prostate,
01:00:52.540 okay.
01:00:53.480 That's funny.
01:00:54.200 It took me a while to
01:00:54.880 figure that out.
01:00:57.600 All right, so my mind
01:00:59.540 breaking reframe is that
01:01:04.700 it's possible that the
01:01:07.060 thing that collapses the
01:01:08.460 wave and makes something
01:01:10.560 real is how clearly you
01:01:13.060 imagine it.
01:01:18.060 What do you think?
01:01:19.000 Do you think it's a
01:01:23.820 coincidence that Elon
01:01:25.160 Musk says, and I say it
01:01:27.380 as well, that the most
01:01:28.900 likely outcome is the
01:01:30.260 most entertaining?
01:01:32.780 That the most entertaining
01:01:33.980 one is the most likely, is
01:01:35.100 a better way to say it.
01:01:36.260 Right.
01:01:36.720 Because don't you often
01:01:37.740 think about the most
01:01:38.700 entertaining outcome?
01:01:40.840 Like, when you think of
01:01:42.080 the next presidential
01:01:43.220 election, or let's say
01:01:44.400 2016, did anybody have
01:01:46.820 any doubt, any doubt
01:01:48.620 whatsoever, that the most
01:01:50.100 entertaining outcome would
01:01:51.420 be Trump winning the
01:01:52.300 presidency?
01:01:53.440 Of course.
01:01:54.600 Of course.
01:01:55.520 Then when he ran the second
01:01:57.480 time, it wasn't really that
01:01:59.720 entertaining.
01:02:00.700 It was just something that
01:02:01.860 was going to happen or not
01:02:02.820 happen.
01:02:03.460 You had a preference or not
01:02:04.580 a preference.
01:02:05.880 But it wasn't entertaining.
01:02:07.560 In fact, it was the
01:02:08.220 opposite.
01:02:09.040 Because we were sort of not
01:02:10.440 being entertained by all the
01:02:12.020 division.
01:02:13.160 And now time is going by, and I
01:02:15.360 keep telling you, the most
01:02:16.640 entertaining outcome would be
01:02:19.280 if Trump had a third act
01:02:20.820 revival, managed to somehow
01:02:23.360 against all odds prove there
01:02:24.920 was some problem in the
01:02:25.960 election, which would be the
01:02:27.400 ultimate, and then goes to the
01:02:30.380 presidency and has a good
01:02:31.480 presidency.
01:02:32.480 That would be the most
01:02:33.440 entertaining.
01:02:34.340 And the thing is, everybody
01:02:35.660 knows that.
01:02:37.220 We all know it.
01:02:38.680 Every time he opens his mouth
01:02:40.100 about some new evidence about
01:02:41.860 election problems, you say to
01:02:44.360 yourself, I don't think he's
01:02:46.360 going to prove it, but that sure
01:02:49.380 would be entertaining.
01:02:51.200 That would be entertaining.
01:02:54.880 So, we'll see.
01:02:58.080 And what am I thinking of?
01:02:59.800 I'm seeing this book as the
01:03:02.500 number one book in the entire
01:03:05.300 world, except for religious
01:03:06.920 books.
01:03:08.980 And the more clearly I see it,
01:03:12.820 the more likely it will happen.
01:03:13.780 So, if you want to do this
01:03:15.220 experiment to see if our
01:03:17.200 imaginations can make something
01:03:19.460 happen, since you have no
01:03:21.740 reason not to, you're probably
01:03:23.340 not competing against me for a
01:03:25.400 best-selling book, just join me.
01:03:29.180 Just imagine this, the number
01:03:30.540 one book.
01:03:32.200 Just imagine it.
01:03:33.620 That's all.
01:03:35.060 We'll see what happens.
01:03:37.000 And, oh, yes, I don't know the
01:03:40.660 details, but Joe Biggs, one of
01:03:42.620 the January 6th people, got 17
01:03:44.880 years?
01:03:46.400 Is that right?
01:03:48.680 I don't know the details of his
01:03:50.340 case.
01:03:51.180 I would just say that on the
01:03:52.980 surface, that doesn't look like
01:03:56.400 justice to me.
01:03:57.880 And I would say that that's
01:03:59.040 another example of why you pretty
01:04:01.280 much have to elect Trump or
01:04:03.500 somebody like Vivek.
01:04:05.320 I wouldn't vote for
01:04:07.500 anybody who didn't promise to
01:04:08.800 pardon the January 6th people.
01:04:11.160 To me, that's bottom line.
01:04:14.380 By the way, has DeSantis said he
01:04:16.120 would or would not?
01:04:17.560 What has DeSantis said?
01:04:21.480 DeSantis is quiet on it or he
01:04:23.380 says he won't?
01:04:24.380 He's just quiet about it, right?
01:04:26.700 Yes, he would.
01:04:27.420 He's not talking about pardoning
01:04:31.160 Trump, though.
01:04:34.680 All right.
01:04:35.580 So here's the thing.
01:04:37.720 Any candidate who doesn't say it
01:04:39.580 directly, I will pardon these
01:04:41.320 people, I think they're
01:04:42.920 disqualified.
01:04:44.500 Do you agree?
01:04:46.560 If you can't say it out loud,
01:04:48.280 that's disqualifying.
01:04:50.180 Pence can't say it.
01:04:51.420 He's disqualified.
01:04:52.200 I think that's got to be the
01:04:57.860 ticket to the show.
01:04:59.380 If you can't say it directly,
01:05:02.320 I don't even want to hear
01:05:03.520 anything else you have to say.
01:05:04.660 I don't care about your
01:05:05.400 policies.
01:05:06.300 Don't care about your
01:05:07.160 character.
01:05:08.640 Don't care about your
01:05:09.480 history.
01:05:09.900 Don't care about anything.
01:05:12.900 If you're not willing to do
01:05:14.080 that simple thing, then you're
01:05:15.920 willing to let your party burn
01:05:17.520 if the other side decides they
01:05:20.480 want to.
01:05:20.840 That is unacceptable.
01:05:23.400 All right, ladies and
01:05:24.220 gentlemen, I believe that is
01:05:28.040 what we wanted to talk about
01:05:30.140 today.
01:05:30.700 I would like to mention a
01:05:31.920 reframe that I heard on the
01:05:33.900 Joe Rogan show.
01:05:35.260 I think I'll mention reframes
01:05:36.460 even when they're not my own.
01:05:38.520 This one comes from Jocko
01:05:40.160 Willink.
01:05:41.340 And I won't explain it as well
01:05:42.900 as he does.
01:05:43.480 If you want to see the video
01:05:44.380 where Jocko explains it, it's
01:05:46.000 much better.
01:05:47.080 But he's got this thing he
01:05:49.280 says when bad news happens.
01:05:50.840 He says, good.
01:05:56.020 The first time I heard it was on that show, and Joe Rogan says he now uses that technique: when something bad happens or something's hard, he says, good.
01:06:06.200 And I thought to myself, that
01:06:07.940 couldn't possibly work, right?
01:06:09.660 Because it's a little bit
01:06:10.960 opposite of what a hypnotist
01:06:12.580 would recommend.
01:06:14.220 Generally, you don't want to
01:06:15.200 recommend saying something
01:06:16.380 that's observably not true.
01:06:18.440 You'd like to stick with things
01:06:21.500 that you actually feel are
01:06:22.620 true.
01:06:23.220 But if you say that bad news is
01:06:25.140 good, then your brain is like,
01:06:27.340 well, but is it?
01:06:28.620 You know, so it's a little
01:06:29.580 unclear messaging to yourself.
01:06:31.600 But I tried it.
01:06:33.260 I tried it yesterday.
01:06:35.060 So when I got the leak in my
01:06:36.360 ceiling, I'm like, because it
01:06:38.420 wasn't just a few hours ago I'd
01:06:40.400 heard about that reframe.
01:06:41.760 And I thought, well, I'll give
01:06:43.000 it a try.
01:06:44.100 So I look at the ceiling and I
01:06:45.560 go, good.
01:06:48.640 I swear to God it worked.
01:06:52.340 It worked.
01:06:54.080 It completely changed my
01:06:55.980 connection to the problem.
01:07:00.900 What the hell?
01:07:04.220 It worked.
01:07:05.280 It worked instantly.
01:07:06.480 It actually worked instantly.
01:07:08.600 Now, that one's as weird as a reframe I do have in
01:07:14.040 the book, in which the
01:07:17.020 reframe is if you have some
01:07:18.180 big problem and it's just
01:07:19.420 bugging you, you say that the
01:07:21.440 problem has as much of a right
01:07:23.000 to exist as you do.
01:07:25.320 It doesn't make any sense,
01:07:26.520 right?
01:07:27.200 It's the same as saying that
01:07:28.120 your problem is good or that
01:07:29.960 anything is good about it.
01:07:31.300 Now, in Jocko's telling, it had more of an explanation.
01:07:34.300 It had something to do with,
01:07:35.500 you know, another challenge to
01:07:37.720 overcome, you know, you'll
01:07:40.540 learn something, maybe there's
01:07:41.740 an opportunity that comes out
01:07:42.980 of the bad news, because bad
01:07:44.160 news often kicks up
01:07:45.340 opportunities.
01:07:46.360 So there was a little, you know, explanation around it,
01:07:49.040 but I'm not even sure you
01:07:50.580 needed it.
01:07:51.800 I think the word itself
01:07:53.080 carried the power.
01:07:55.060 You just associate it with a
01:07:56.460 positive word and suddenly it
01:07:57.680 changed how you feel about it.
01:07:59.300 It was that easy.
01:08:01.020 Yeah.
01:08:01.440 So if you have a problem, just say, well, the problem has
01:08:03.520 a right to exist and it just
01:08:06.080 won't bother you as much.
01:08:07.620 You'll still, you know, work
01:08:08.820 on it if you can solve it, but
01:08:11.140 if you just put a different
01:08:12.240 word on it.
01:08:13.120 Now, again, I'll remind you about the power of reframes. And by the way, the way I can tell that my critics who give me one-star reviews haven't read the book is they say it's a book of advice.
01:08:28.120 It's not a book of advice.
01:08:30.220 It's a book of words that
01:08:31.540 change your brain.
01:08:33.520 Like, good, in that context.
01:08:35.620 It's just a word.
01:08:37.200 The word itself has the power.
01:08:38.660 It's like a little program.
01:08:39.980 So a reframe is like a little
01:08:41.340 program that you put into your
01:08:43.080 head to optimize something.
01:08:45.260 It doesn't have to be true.
01:08:46.900 It doesn't have to be logical.
01:08:49.780 It just has to work.
01:08:51.360 And that's what good does.
01:08:52.780 It just works.
01:08:54.100 So if you want to spend all
01:08:55.540 your time figuring out why it
01:08:56.660 works, you can.
01:08:57.940 Maybe that would be
01:08:58.600 interesting.
01:08:59.740 But it doesn't matter.
01:09:01.360 It works.
01:09:02.060 Try it.
01:09:03.660 So Jocko, good job on that.
01:09:05.740 That was really useful.
01:09:09.300 So when I tell people that, you
01:09:12.120 know, my 160 reframes will
01:09:13.960 change your life, if the only one
01:09:17.040 you'd ever heard was Jocko's,
01:09:20.660 that's a pretty big change.
01:09:22.160 It immediately made all the
01:09:23.600 problems for the rest of my life a
01:09:25.000 little bit less bothersome.
01:09:26.840 Because I'll just do that again.
01:09:28.300 It looks like it works.
01:09:29.120 So imagine how big the changes
01:09:32.340 are that you can make just by a
01:09:34.140 little tuning of the words in
01:09:35.700 your head.
01:09:36.740 All right, ladies and gentlemen,
01:09:38.440 that's all for now.
01:09:39.580 Thanks for joining us on YouTube
01:09:40.960 for the best live stream you've
01:09:43.200 ever seen.
01:09:43.940 Come back tomorrow.
01:09:44.780 Bye.