Real Coffee with Scott Adams - October 31, 2022


Episode 1913 Scott Adams: Elon Musk Looks At Internal Twitter Communications And It's Glorious, More


Episode Stats

Length

1 hour and 44 minutes

Words per Minute

146.7

Word Count

15,383

Sentence Count

1,164

Misogynist Sentences

18

Hate Speech Sentences

19


Summary

In this episode of the podcast, I talk about some of the funniest things that have happened in the world in the past 24 hours, and then I do a deep dive into the question of whether or not cholesterol is real.


Transcript

00:00:00.200 Why am I in such a good mood? Well, it's because the news is funny.
00:00:11.880 When the news is funny, that's when I come alive.
00:00:15.740 Today we've got lots of funny news.
00:00:18.000 So if you'd like to take your already beginning to be a good mood up to a great mood to start the day,
00:00:25.840 I've got exactly what you need.
00:00:27.640 And all you need to join me is a cup or mug, a glass, a tank or chalice or stein, a canteen, jug, or flask, a vessel of any kind.
00:00:34.060 Fill it with your favorite liquid. I like coffee.
00:00:36.520 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:39.520 the thing that makes everything better.
00:00:42.560 It's called the simultaneous sip.
00:00:45.300 And it happens right now.
00:00:47.760 Go.
00:00:52.560 Oh, so good.
00:00:55.360 Take a moment to savor it.
00:01:01.320 Goes good with oxygen.
00:01:04.660 Well, here are some stories that I'm going to update you as I go.
00:01:09.820 Remember I told you I had problems with my blood pressure meds,
00:01:12.580 and then when I got off them, I felt great?
00:01:14.340 So this caused me to do a semi-deep dive into the question of,
00:01:20.380 and I don't want to, I want to be very careful here,
00:01:23.260 nothing I say about medicine or health should be taken seriously.
00:01:28.640 Do not use anything on this live stream to make, like, major health decisions.
00:01:36.520 We're just speculating, okay?
00:01:38.220 Imagine being me, okay?
00:01:41.540 Just put yourself in my head for a moment.
00:01:43.780 I just went through the entire vaccination,
00:01:48.300 ivermectin, hydroxychloroquine,
00:01:51.560 you know, past two or three years, right?
00:01:54.220 So that's my context.
00:01:56.560 And then I find out that my blood pressure medicine
00:01:58.840 has bad side effects.
00:02:01.000 But I say to myself, well, that's not a big deal
00:02:03.380 because there are so many different ones.
00:02:05.400 There's probably one that, you know, will suit my needs.
00:02:09.140 And then I said to myself, you know what?
00:02:11.680 I should probably just spend five minutes Googling,
00:02:15.860 which I've never done,
00:02:17.380 to find out, just ask the question,
00:02:21.080 is it possible that blood pressure medicine
00:02:23.220 is total bullshit or partial bullshit?
00:02:26.240 I'm sure it's not total bullshit.
00:02:28.020 But is it possible that it's bullshit
00:02:29.860 and it was never good for you in the first place?
00:02:31.960 And I'm not sure.
00:02:38.320 I actually can't tell.
00:02:40.380 I will tell you that all the signals
00:02:42.340 that were sort of there when we found out
00:02:46.020 we couldn't trust science on other things,
00:02:48.520 all the signals are there that it's not real.
00:02:51.740 They're all there.
00:02:53.280 Now let me be careful again.
00:02:55.240 I'm not saying you should not take blood pressure medication.
00:02:58.440 That would be very dangerous, right?
00:03:01.540 Talk to your doctor.
00:03:03.160 Do what you need to do.
00:03:04.240 You're an adult, probably.
00:03:06.440 Don't stop taking any medications
00:03:08.120 because you think I said something clever.
00:03:10.860 Right?
00:03:11.060 That's a bad idea.
00:03:12.820 But honestly, I'll tell you where my head is at.
00:03:15.080 I don't think it's real.
00:03:17.340 I think that if you have a serious problem,
00:03:19.420 it probably does keep you alive.
00:03:21.240 But I think maybe the blood pressure medicine,
00:03:23.920 the science behind it has never been real.
00:03:25.800 That's what I think.
00:03:28.860 And again, I'm way, way different from science here.
00:03:33.760 As far as I know, I haven't found it, there's no large body of skeptics saying what I'm saying.
00:03:42.100 I don't believe there are even too many rogue doctors.
00:03:44.620 I saw a few.
00:03:45.800 But I think it's bullshit.
00:03:49.740 Now when I say I think it's bullshit,
00:03:51.440 I mean at the low end.
00:03:53.360 People like me.
00:03:54.180 So if I'm untreated,
00:03:57.820 I'm going to start to get up into 140 over 100.
00:04:03.000 If I spend a little more time and, you know, trouble on lifestyle, I'm down to the mid-130s over roughly 85.
00:04:11.740 Here's the question.
00:04:13.060 If I went to any doctor in the world and I said,
00:04:16.040 my blood pressure is consistently with good lifestyle,
00:04:19.820 135 over 90,
00:04:22.120 should I treat it?
00:04:24.880 Most, I think, would say yes.
00:04:27.300 I think.
00:04:28.820 I think most would say yes.
00:04:30.800 Because that's right at the edge where you treat it.
00:04:33.620 And I'm not sure that there's any science to support that.
00:04:37.920 So, yeah, so he says research cholesterol.
00:04:43.900 Here's another question.
00:04:46.040 Is high cholesterol real?
00:04:49.080 In the sense that it's as dangerous as science says?
00:04:53.860 I don't know that it is, right?
00:04:56.200 Didn't we learn that our understanding of it was different?
00:04:59.540 I don't know if it debunked it or not.
00:05:01.060 But our understanding of cholesterol is now in question, right?
00:05:05.020 Whether that was a big deal or not.
00:05:08.220 Man, I'll tell you, these last two years have made everything up for grabs, right?
00:05:13.640 So, what would you do if you were me?
00:05:15.960 I'm not going to follow your advice.
00:05:19.220 But put yourself in my situation.
00:05:22.000 You just watched proof positive that big medicine is corrupt.
00:05:27.980 Would you agree?
00:05:29.360 Would you say that we have proof, no question about it,
00:05:33.000 that big medicine is corrupt?
00:05:35.560 That's no longer in question, right?
00:05:37.740 There's a question of how much and in what specific ways,
00:05:41.440 but there's no question about the larger question.
00:05:43.140 Would you take blood pressure meds, if you were me?
00:05:48.180 Because you know now that the doctors sort of have to do
00:05:51.280 what the institutions, their bosses tell them,
00:05:55.040 and the bosses have to do what the big pharma tells them to do.
00:06:03.040 Yeah.
00:06:03.800 And somebody says, check SSRIs. SSRIs, what are they?
00:06:09.800 Yeah, I think it's the same thing.
00:06:11.100 I think Tom Cruise was the smartest person in America,
00:06:16.700 and he got there early.
00:06:19.020 That's what I think.
00:06:20.720 You know, maybe not for all the right reasons, but he got there.
00:06:24.640 So, I don't know what to do.
00:06:26.620 I have not decided to not take blood pressure meds.
00:06:31.140 So, I've not decided to stop them forever.
00:06:33.480 I'm just off of them at the moment.
00:06:35.440 But I'll keep you informed.
00:06:36.480 Funny story of the week, Kim Kardashian went to a Halloween costume party
00:06:43.280 dressed as Mystique from the X-Men.
00:06:47.080 You know, very elaborate costume.
00:06:50.320 She found when she got there, it was not a costume party.
00:06:53.540 It was just a birthday party.
00:06:54.760 So, Kim Kardashian was the only one in a costume,
00:07:00.580 except for the two people she went with.
00:07:03.500 Now, here's the question I ask you.
00:07:08.160 Do you think that would have happened if she were still married to Ye?
00:07:11.860 Do you think Ye would have asked, you know, is it really a costume party?
00:07:18.880 Or maybe somebody would have checked it out?
00:07:21.420 Maybe it would have.
00:07:22.720 Maybe it would have.
00:07:24.200 Now, here's the great thing about this story.
00:07:27.920 Do you think that Kim Kardashian is embarrassed by the story?
00:07:32.200 What do you think?
00:07:33.260 Do you think she feels embarrassed by the story?
00:07:36.360 No, of course not, because it works in her favor.
00:07:38.260 So, here's a different way to interpret the same story.
00:07:42.980 Somebody gave a boring party,
00:07:44.700 and there was only one really interesting thing about it.
00:07:47.960 Kim Kardashian, right?
00:07:49.960 She was the only interesting thing that happened.
00:07:52.540 And that's her job.
00:07:54.280 You know, her job is being interesting.
00:07:57.640 Roughly speaking, right?
00:07:59.000 Her job is to be interesting.
00:08:00.600 Boy, was she interesting.
00:08:02.600 So, weirdly, although it was an accident, apparently,
00:08:06.400 it was probably one of the best accidents anybody ever did.
00:08:11.380 I mean, it was sort of brilliant.
00:08:12.920 It just worked exactly in her favor.
00:08:15.020 And she also looked great, right?
00:08:17.800 If you saw the pictures, it was an interesting kind of a great
00:08:22.720 because she was sort of poured into a skin-tight outfit,
00:08:26.620 and she wore it well, and the makeup worked and everything.
00:08:29.600 So, she looked great.
00:08:30.560 But she's also in that weird category of sexy whatever.
00:08:37.180 You know, whatever costume you wear, it's the sexy version.
00:08:40.460 And let me ask you this.
00:08:42.620 When you think of Kim Kardashian,
00:08:45.500 is the first thing you think of her as, you know, the young, single Kardashian?
00:08:51.680 Or do you think of her as a mother?
00:08:53.100 Because I don't know if it's just because I'm listening to too much Ye,
00:08:57.860 but I think of her as a mom now.
00:08:59.940 Don't you?
00:09:02.360 And so, when I see her, I think of her as a mom,
00:09:05.180 and then I see her in the skin-tight outfit,
00:09:07.840 which she totally pulls off, by the way.
00:09:09.620 She did a great job on it.
00:09:13.780 I thought, I don't know, it's a little uncomfortable now.
00:09:17.660 At the same time, she looks terrific, right?
00:09:19.920 You know, if I looked like that, I'm sure I'd show it off too.
00:09:23.540 But does anybody else have the same reaction to it?
00:09:28.380 Like, I've actually adopted Ye's point of view effortlessly,
00:09:34.560 which suggests he's pretty persuasive, right?
00:09:38.620 It could be just because, you know, we're paying attention to him,
00:09:41.920 so you get that frame.
00:09:43.520 But man, I just frame her as a mother now,
00:09:45.740 and now it looks different to me.
00:09:46.960 Like, it's just, it's not on point anymore.
00:09:50.720 But she had a good day, in a weird way.
00:09:56.080 Speaking of Ye, he's speaking out about his latest bad news
00:10:00.280 getting canceled by all of his, what would you call them, partners,
00:10:05.620 Adidas, et cetera.
00:10:06.440 And he says that it's God humbling him
00:10:11.000 because, you know, he was bragging about being the richest black man,
00:10:15.940 and so he thinks, you know, God is knocking him down
00:10:19.240 to get him back in humble territory.
00:10:22.400 To which I ask this valuable question.
00:10:25.200 Does God really care if you're humble?
00:10:27.120 Is there a commandment about humility?
00:10:34.000 Is it one of the ten? I don't know all my ten commandments.
00:10:37.180 Which one is about humility?
00:10:39.940 Which commandment?
00:10:41.740 Can you, oh, pride is, no, that's not in the ten commandments, right?
00:10:46.440 It has to be in the ten commandments.
00:10:48.600 Give me a fact check.
00:10:50.260 I know pride is bad in general, but is it a ten commandment?
00:10:57.220 I'm seeing some no's, somebody looks smart.
00:11:01.080 It's not actually, it's not actually a commandment, right?
00:11:06.600 We accept humility as a virtue, yes, sort of socially.
00:11:11.860 And pride is bad, religiously speaking, and envy too.
00:11:16.320 But is humility, it's not in the ten commandments, though, right?
00:11:24.420 Oh, it's a deadly sin.
00:11:26.320 It's one of the seven deadly sins.
00:11:28.540 All right.
00:11:29.820 All right.
00:11:30.400 So we'll take that.
00:11:32.820 So that's his interpretation.
00:11:36.180 But he feels that God has humbled him now, so.
00:11:40.280 And then, some people say he walked back what he said about George Floyd, because
00:11:48.040 he said George Floyd was killed by fentanyl, not the police officer.
00:11:52.080 But here's what he said.
00:11:53.800 You tell me if this sounds like he walked it back.
00:11:57.760 So remember, he said that Floyd died of fentanyl, so it was kind of fake news.
00:12:04.100 So see if he walked it back with a statement.
00:12:06.000 When the idea of Black Lives Matter came out, says Ye, it made us come together as a people.
00:12:13.220 So I said that, and I questioned the death of George Floyd.
00:12:17.500 It hurt my people.
00:12:19.420 It hurt the black people.
00:12:21.760 So I want to apologize for hurting them, because right now God has shown me what Adidas is doing.
00:12:28.840 Then he talks about being humbled by God.
00:12:31.860 So did he walk it back?
00:12:33.060 No, he did not.
00:12:36.700 He didn't do anything like walking it back.
00:12:39.220 Not even close.
00:12:41.160 He simply said he's sorry it hurt people.
00:12:44.420 He's sorry that his opinion, which I imagine he thinks is the truth, he's sorry that it
00:12:49.980 hurt people.
00:12:51.300 Now, do you think that that's a good kind of an apology?
00:12:57.180 Is that a good apology?
00:12:58.400 On one hand, some would say you should never apologize if you still hold the opinion.
00:13:06.520 It sounds like he still holds the opinion, so he's not apologizing for the opinion.
00:13:10.920 He's just apologizing that it hurt you.
00:13:13.560 Sorry it hurt you.
00:13:14.920 Yeah, it's sort of a half-apology.
00:13:16.800 But it's also as good as you can do if you want to be honest.
00:13:23.320 The best you can do if you want to be honest, but also show some empathy, is what he did.
00:13:30.040 It's not good.
00:13:31.440 It's simply the best that could be done under that circumstance.
00:13:34.720 You can't do better than that unless you lie.
00:13:36.560 Because he's not going to change his opinion, nor should he.
00:13:41.160 There's nothing that would change it.
00:13:43.120 But he didn't want to lie, so he just said, sorry, sorry it made you feel bad.
00:13:47.280 Best you can do.
00:13:48.920 This is a funny story.
00:13:50.200 All the stories are interesting today.
00:13:51.640 So Arizona, the governor has decided to use a bunch of empty shipping containers, piled two high, to fill in spots of the border wall where it was too easy for people to get across and there was no fence.
00:14:08.500 And I saw the, quote, fence or the wall or whatever.
00:14:13.080 It looked pretty good.
00:14:14.060 But it turns out that if you put two containers, number one, they're built to stack, so that's good.
00:14:22.240 They're actually made so that you can put them on top of each other.
00:14:27.080 Somebody says it's very old news.
00:14:28.540 I understand that.
00:14:30.020 But what's new about it is he's being asked to remove them.
00:14:33.500 So it's a current topic as well.
00:14:36.160 But I'd never seen the picture.
00:14:38.680 And it looks like it actually works.
00:14:40.920 But then it made me think of the following.
00:14:45.420 Don't you think that containers would be way more useful for reuse if you could easily remove all of the walls, right?
00:14:55.960 Imagine if it were easy to remove, let's say, even just the front and the back.
00:15:00.660 You could just flap them down.
00:15:02.440 If you could remove easily the front and the back, you could build a tunnel, right?
00:15:07.080 You have the boring, you know, that boring machine that just builds a tunnel.
00:15:11.920 It's a machine that makes tunnels.
00:15:14.500 If you could make the tunnel, you know, just shaped right, you could just drop a shipping container in there and, you know, just slap down the front and the back and connect them together, right?
00:15:25.820 Couldn't you build all kinds of useful things?
00:15:28.240 Somebody says that's already been done.
00:15:30.560 Yeah, maybe.
00:15:31.160 But it makes me wonder how useful these shipping containers are.
00:15:37.980 Because one of the problems is you don't want to ship back an empty container, which apparently happens a lot.
00:15:45.040 It would be easier if sometimes you just say, oh, these are old ones.
00:15:48.660 Just leave them where they are and recycle them.
00:15:51.020 So I just love this idea.
00:15:53.200 Now, what are all of the things you could do with an easy-to-build tunnel?
00:15:58.800 Are you ready?
00:15:59.240 Here are the things you could do if you had an easy-to-build tunnel.
00:16:03.820 Building communities, right?
00:16:06.560 You build a community so that you have underground cabling and underground, you know, fiber optics and stuff like that.
00:16:15.680 But also for heating and cooling.
00:16:18.520 Now, I don't know how far down the boring machine goes, how deep it could go, but it goes pretty deep, right?
00:16:27.220 Because I know it goes below the earthquake level.
00:16:30.820 That's why they're safe from earthquakes, because it goes so deep that all the earthquaking is above the tunnel.
00:16:36.500 So if you could go that deep, can you not somewhat easily build heating and cooling tunnels where the natural difference in temperature below the ground is all you need to keep your above-ground homes reasonably, you know, around 56 degrees, right?
00:16:56.860 Earthquakes can be very deep, but my understanding is that when the Boring Company was going to build that underground tunnel in Los Angeles, I think, right, which they've already built, that they do go down so deep that it's below the earthquake line.
00:17:12.680 So check that out.
00:17:15.000 I think that's, you know, an important part of the process.
00:17:18.220 No, obviously not the approach.
00:17:21.820 The approach would, you know, have to be high at some point.
00:17:27.220 Right?
00:17:28.080 But maybe, if there is any risk, it's got to be much lower.
00:17:32.540 I'm pretty sure of that.
00:17:34.900 It can be hundreds of miles before you get under the earthquake line.
00:17:38.880 Maybe it depends on the specific area, huh?
00:17:44.060 So maybe we know that some places the earthquakes are never deep.
00:17:47.680 Is that something that we would know?
00:17:50.460 It could be that maybe there is some place you can't build these tunnels.
00:17:54.100 Yeah.
00:17:55.180 I don't know.
00:17:55.700 All right.
00:17:56.040 Well, that's something we don't know.
00:17:57.560 But I think this is bigger than you think, even though it looks like a weird little story.
00:18:02.320 Tunneling and then easily building walls for your tunnel is civilization transforming.
00:18:09.200 Because you would do everything differently.
00:18:11.640 Let me give you another example.
00:18:13.780 Underground farming.
00:18:16.800 There you go.
00:18:18.620 I'm not sure if you could do this, but if you could do a light tube.
00:18:22.920 Do you know what a light tube is?
00:18:24.260 It's like a skylight that takes the sunlight and then distributes it inside the house.
00:18:31.600 But if you could do light tubes, would that give you the right kind of light for growing underground?
00:18:36.140 Because once you're growing underground, you don't have to worry about the bugs, the weather, any of that stuff, right?
00:18:47.360 And I don't even know about fertilizer.
00:18:51.540 I think fertilizer, you still need it.
00:18:54.000 But I believe you could fertilize way more efficiently if you're doing hydroponic underground.
00:18:58.960 Can somebody tell me if that's true?
00:19:02.000 I'll bet.
00:19:02.500 All right, we'll use the...
00:19:03.540 This is another example of the collaborative brain, right?
00:19:07.660 So I don't know the answer to this, but some of you do.
00:19:09.680 If you're using hydroponic growing and the fertilizer is added to the liquid, the liquid, I guess, that you water the plants with,
00:19:22.140 is it true that that is way more efficient for use of fertilizer, which is now in tight supply?
00:19:29.940 Is that true?
00:19:30.400 I'm saying yeses, but I don't know if it's 10% true or, like, you know, 100 times better.
00:19:41.300 Does anybody have an estimate?
00:19:43.280 I'm saying all yeses.
00:19:44.460 Okay, thank you.
00:19:45.480 Thank you for the yeses.
00:19:48.220 Does anybody know how big a difference it is?
00:19:51.540 Is it like a 10 to 1 difference or like a 20% difference?
00:19:56.180 Does anybody have a sense for that?
00:19:57.260 I'll give you my guess, like, just as a person who lives in the world, I feel like just the fact that the liquid suspension would get it there better would mean, yeah, at least somewhere in the 40% range, right?
00:20:14.300 If you didn't know anything at all, but you had to put a guess on it just based on the few things that you do know, wouldn't you say around 40%?
00:20:24.820 Does that feel about right?
00:20:27.260 Any engineers here?
00:20:28.620 Because the liquid suspension gets it to places better than the dirt would.
00:20:33.340 You know, the dirt is going to prevent your fertilizer from getting to the roots, right?
00:20:39.320 But the liquid would do less?
00:20:41.280 I don't know.
00:20:43.400 Anyway, I think it's a big deal.
00:20:45.760 Here's a report from Mexico.
00:20:48.200 Mexico is doing great.
00:20:49.420 You know, we don't hear much about civilization in Mexico, even though it's so close.
00:20:55.580 But here's a little public interest story.
00:20:58.940 Residents of Zacatecas, Mexico, they witnessed a dog carrying a human head in its mouth as it ran down the streets.
00:21:07.600 Okay, that's not so good, I guess.
00:21:10.740 Yeah.
00:21:11.700 Okay, change that.
00:21:13.720 I thought it was more of a feel-good story.
00:21:15.880 But it's more of a dog with a human head walking down the street story, which in many ways is completely different.
00:21:23.320 Okay, so I had that one wrong.
00:21:24.640 Speaking of a human head in a dog's mouth, let's talk about the teachers' unions.
00:21:34.860 Now, was that the best segue you've ever seen in your life?
00:21:39.100 Can we take a moment to just savor that?
00:21:42.640 Okay, savor it.
00:21:44.060 Good.
00:21:44.500 Now we can go on.
00:21:46.620 So the American Federation of Teachers, you know, Randi Weingarten's organization,
00:21:52.540 which many, including me, blame for the children being kept out of school and their progress being stilted.
00:22:04.860 Is that a word?
00:22:06.100 Stymied or something.
00:22:09.180 But Randi Weingarten is trying to gaslight it away, saying this.
00:22:13.980 The bottom line is everyone suffered in the pandemic.
00:22:18.040 So the first thing you need to know is it wasn't just tough on the children.
00:22:24.260 So Randy is trying to...
00:22:25.740 It was tough on all of us.
00:22:28.080 So the bottom line is everyone suffered in the pandemic.
00:22:31.800 Because of the pandemic, blah, blah, blah.
00:22:34.680 The disruption was everywhere.
00:22:36.760 And it was bad regardless of whether schools were remote or in person.
00:22:41.320 We were focused...
00:22:42.760 We were focused on the urgent need to help kids, blah, blah, blah.
00:22:45.580 So it wasn't a case of Randi Weingarten and her union making the wrong decision for the children.
00:22:54.720 It was a case of things were bad everywhere.
00:22:58.480 Things were bad everywhere.
00:23:01.400 Now, how much more fucking evil can this organization be?
00:23:06.000 On one hand, I actually respect the fact that they understand their mission.
00:23:15.800 The mission of the union is to take care of the teachers.
00:23:18.740 And I'm not really hearing the teachers complain.
00:23:21.800 Has anybody heard the teachers complain?
00:23:23.540 I mean, ones and twosies.
00:23:25.280 But I don't think the teachers complained.
00:23:27.060 So if the teachers' union is doing what the teachers seem to be okay with, they're not complaining,
00:23:34.580 then you have a competitive, you know, free market system, I guess.
00:23:39.240 Except that we can't just say, well, we don't like that, so we'll take our business elsewhere.
00:23:45.700 That's the problem, right?
00:23:47.980 You can't take your business elsewhere.
00:23:49.840 So I'm perfectly okay with a competitive system, which would include unions.
00:23:58.160 Except that the side they're competing against can't compete.
00:24:03.000 Because what they're competing against is parents.
00:24:06.160 Where is the parents' union?
00:24:08.740 Parents don't have a union.
00:24:11.380 So it's not really market competition.
00:24:13.840 It's sort of one side has a hammer and one side is Paul Pelosi.
00:24:17.560 That story is so fucking funny.
00:24:26.480 Now, I hate that he was injured.
00:24:30.340 And, you know, we all have genuine concern that he recovers.
00:24:34.540 Apparently he'll recover.
00:24:35.820 But, you know, I don't think you're ever really recovered from that, right?
00:24:40.360 You never really recovered from that.
00:24:42.300 Somebody says I nailed it.
00:24:44.320 All right.
00:24:44.560 The hammer puns are just, well, I can't resist.
00:24:48.780 I can't resist hammer puns.
00:24:51.240 It's not going to happen.
00:24:55.480 Did you know that the leading cause of death for people between 18 and 45 is?
00:25:01.560 What's the leading cause of death?
00:25:03.960 People between 18 and 45.
00:25:08.020 Fentanyl.
00:25:09.140 Fentanyl.
00:25:10.200 It's the leading cause of death.
00:25:11.340 And do you think that our government is treating it like the leading cause of death?
00:25:18.440 Well, not according to Republicans.
00:25:21.140 McCarthy was making this point yesterday.
00:25:23.760 Nope.
00:25:25.220 Do you want to know how much worse it is than that?
00:25:27.680 If the only thing I told you was, the leading cause of death from 18 to 45 is fentanyl, and
00:25:36.460 we're not doing enough about it, that would be terrible, right?
00:25:42.180 I'm going to make this so much worse.
00:25:44.700 You ready?
00:25:46.440 Do you remember when the pandemic was raging, and people over 80 were dropping like flies?
00:25:52.740 And while those were all tragedies, those were all human tragedies, we were also in a triage
00:26:00.640 emergency situation, and we were able to say, for the first time, young people are worth
00:26:07.620 more than old people.
00:26:09.700 We can say that out loud.
00:26:11.420 Because the old person had their life, and the next, you know, year or two of their existence
00:26:18.560 wasn't going to be that good.
00:26:20.980 And if an 80-year-old dies and loses three years of life, you cannot compare that to a
00:26:27.260 10-year-old dying and losing 100 years of life.
00:26:30.060 I'm assuming they have longer lifespans.
00:26:32.640 The 10-year-old will lose 100 years of life.
00:26:36.280 The 82-year-old today will lose one or two.
00:26:38.760 So there's a difference of like 50 times.
00:26:44.040 So let's say there's a 50 times worsened difference between somebody ready to die anyway, who goes
00:26:50.640 early, and somebody in the prime of their life.
00:26:54.060 So now we take these 18 to 45-year-olds, and do you say that one of those is equal to one
00:27:00.840 senior citizen?
00:27:03.200 Only if you're an idiot.
00:27:04.460 One 18-year-old is not worth the same as one 80-year-old.
00:27:11.860 I'm sorry.
00:27:13.220 We can say that out loud now, because these are emergencies, right?
00:27:17.140 In, let's say, easy times, you would never let those words leave your lips, because it
00:27:24.500 sounds too much like we're not valuing senior citizens, and nobody wants that.
00:27:28.660 But the truth is, when you lose people who are between 18 and 45, you're losing, what,
00:27:38.360 50, 70 years of life per person.
00:27:44.260 So one of these people dying is equal to, what, 50 old people or something, depending on who
00:27:49.780 they are.
00:27:50.040 So when I said that the United States is ignoring the biggest cause of death between 18 and 45,
00:27:58.420 that doesn't come close.
00:28:00.960 That doesn't come close to telling you how big the problem is.
00:28:05.820 Because if you were to measure it in life years, like productive, good life years, this is the
00:28:13.200 biggest disaster we've ever had, by far.
00:28:17.740 It's bigger than the pandemic, by far.
00:28:21.480 It's bigger than World War II, the deaths, by far, right?
00:28:27.720 And World War II killed a lot of young people.
00:28:31.240 There's nothing like this.
00:28:33.180 There's nothing close.
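[Editor's note: the life-years comparison above amounts to simple arithmetic. Here is a minimal sketch of it; the lifespan figures are the rough assumptions from the discussion (a long assumed lifespan for the young, one or two remaining years for an 82-year-old), not actuarial data.]

```python
# A back-of-envelope sketch of the life-years comparison above.
# All numbers are illustrative assumptions, not mortality-table data.

def life_years_lost(age_at_death: int, assumed_lifespan: int) -> int:
    """Remaining years of life lost when someone dies at a given age."""
    return max(assumed_lifespan - age_at_death, 0)

# A 10-year-old, assuming a 110-year lifespan, loses ~100 years.
young_loss = life_years_lost(10, 110)

# An 82-year-old with about two years left loses ~2 years.
old_loss = life_years_lost(82, 84)

# The gap between the two cases: the "difference of like 50 times".
ratio = young_loss / old_loss
print(young_loss, old_loss, ratio)  # 100 2 50.0
```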
00:28:34.320 And have I ever told you what the biggest problem is with people who are not good at analysis?
00:28:44.200 It's always the same.
00:28:45.500 What's the number one problem of people who are not good at analyzing things?
00:28:49.900 What mistake do they always make?
00:28:52.200 Go.
00:28:52.960 Let's see if you've learned.
00:28:53.860 Can't compare, right?
00:28:55.460 They compare it to the wrong thing.
00:28:57.960 Classic example.
00:28:58.860 Even the Republicans, who are on the correct side of this argument, because they're strong
00:29:05.140 about fighting fentanyl, they don't have a policy, so they're worthless in terms of what
00:29:10.280 to do about it, just to be clear.
00:29:14.560 Just to be clear.
00:29:15.700 We'll remove this asshole.
00:29:18.280 Goodbye, asshole.
00:29:21.600 But this is the classic problem.
00:29:23.400 So even the Republicans, who have every incentive to make the problem seem as big as possible,
00:29:28.860 are not even in the ballpark.
00:29:33.800 When McCarthy says it's the leading cause of death between 18 and 45, he is wrong by 20
00:29:41.600 times?
00:29:42.860 10 times?
00:29:44.600 He is so wrong, he could not be more wrong, because he is comparing wrong.
00:29:51.360 It is implicitly saying that a person is a person, a death is a death, right?
00:29:55.960 He doesn't say that, but that's embedded in the statement.
00:30:00.480 What he should be comparing is not, you know, well, you know what to compare.
00:30:05.280 He should compare the number of years of productive, good life that are lost.
00:30:10.360 And if he did that, maybe you could make this something that people care about.
00:30:17.580 Maybe.
00:30:17.880 And when we watch the problems with the school kids who lost that year of school, does everybody say, well, they had one bad year, but now it's over?
00:30:33.960 No, we do not.
00:30:35.420 So here you're going to see a case of narrative bleed, where one narrative influences the other,
00:30:42.180 even though they're different.
00:30:42.960 So the narrative about these school kids losing one year of school, how do we measure that?
00:30:51.020 Do we say, well, that kid was going to live to 90, he only lost one year or two, whatever
00:30:57.260 you're counting.
00:30:58.340 You say, well, he lost one out of 90, so one out of 90 is bad, but not that bad, right?
00:31:04.320 No.
00:31:05.320 We say that that one year that they lost will fuck up the rest of their whole life, don't
00:31:10.080 we?
00:31:10.220 We all say that that one year is not one year.
00:31:14.560 We are correctly, correctly comparing it to the rest of their life, aren't we?
00:31:21.100 So that's an example where we're making the correct comparison.
00:31:26.140 Now, what happens when everybody makes the correct comparison?
00:31:30.040 Have you noticed?
00:31:31.860 What happened when both the left and the right used the correct comparison?
00:31:37.440 What happens to these kids for the rest of their life?
00:31:41.580 We fucking agreed.
00:31:43.540 We fucking agreed.
00:31:45.380 How about that, huh?
00:31:46.780 When we compared the right things for the first time, it's never been done before, Democrats
00:31:52.700 and Republicans all said, oh shit, at the same time, and we identified the correct evil
00:31:58.380 as not each other.
00:32:00.460 Am I right?
00:32:01.260 How many of you blame Democrats, sort of in general, for the school closings?
00:32:07.520 I don't.
00:32:09.000 Do you?
00:32:10.900 I mean, you could argue they should have done more about it or something like that, but
00:32:14.620 I don't feel like they even had the power.
00:32:17.420 Okay, some of you do.
00:32:18.360 I know you're highly political.
00:32:19.720 But do I make my point?
00:32:24.160 The simple act of knowing what to compare was all it took to take two groups that were
00:32:31.200 at each other's throats to say, oh yeah, well, that's true.
00:32:36.020 Let's try to keep these schools open.
00:32:39.160 That's all it took.
00:32:40.780 Now, take that to fentanyl.
00:32:42.980 With the schools, we knew to make the correct comparison, and that allowed at least the public, not the
00:32:52.040 teachers' unions yet, but at least the public to start getting on the same page.
00:32:55.880 With fentanyl, we feel like we're on the same page, but we're not.
00:33:00.460 We're not on the same page.
00:33:01.880 Because even the people who say, that's terrible, they think it's terrible because people between
00:33:08.140 18 and 45, it's the leading cause of death.
00:33:11.560 If that's the only thing they think, that's not nearly enough energy to get something done.
00:33:18.760 But if we can get them to understand it's quality life years, is what's being taken from us,
00:33:26.200 then you can get people on the same page.
00:33:30.540 All right.
00:33:30.800 Here's a, I'm going to take another run at why analogies usually fail for thinking.
00:33:42.620 I did my list of hoaxes, and I had somebody say, compare that to a known lie.
00:33:56.660 And it was just this weird comparison.
00:34:01.700 And I was accused of making people think past the sale, because I said, how many
00:34:08.600 of these hoaxes do you still believe?
00:34:11.360 And the criticism was that I've made them think past the sale of whether the hoaxes are real
00:34:18.260 by saying, how many do you still believe?
00:34:20.740 Now, of course, that's true.
00:34:23.420 But there's also something you know about the list.
00:34:34.760 There's no such thing as making you think past the truth.
00:34:41.040 It's not a sale.
00:34:43.620 It's just the truth.
00:34:45.280 So I can refer to the truth any way I want, and it's not gaslighting, because it's just
00:34:50.140 the truth.
00:34:54.580 And then asking people how many of the hoaxes they believe was compared
00:34:59.340 to somebody saying to me, hey, Scott, why did you stop beating your wife?
00:35:05.440 To which I say, you just compared things which are known to be true to something that
00:35:11.040 is just speculation.
00:35:12.400 That's the opposite of how you should use an analogy.
00:35:16.320 Do you get that?
00:35:19.800 They're not both a case of making you think past the sale.
00:35:23.520 That's not the important part.
00:35:25.180 The important part is one is a bunch of true stuff, and one is just speculation.
00:35:30.340 You can't compare those.
00:35:31.820 So analogies fail almost every time, the way we use them.
00:35:35.800 The only way to use an analogy is, I'm convinced, is this way.
00:35:39.540 You give your analogy, and then you state the one and only thing that you can learn from
00:35:45.920 the analogy.
00:35:47.360 Well, and the analogy shows you that, you know, you can never promise something you don't
00:35:51.620 deliver, or something like that, right?
00:35:54.200 If that's the only point you're making, then good.
00:35:56.460 But otherwise, people will just argue all of the other parts of the analogy, which were
00:36:00.980 never part of your point.
00:36:02.340 I'm always surprised that people read my tweets.
00:36:09.340 I know it's a weird thing, but because you tweet when you're alone, or at least I do,
00:36:14.900 I'm totally alone, it's just me and my phone, I'll like, boop, boop, boop.
00:36:17.740 And then I'll tweet about some famous character, because I think it's just me and my followers
00:36:24.940 talking to each other.
00:36:26.400 And then the famous person responds.
00:36:28.280 I'm like, oh, shit, I didn't know he'd actually read it.
00:36:32.640 Yeah, even the size of my Twitter account, it still surprises me.
00:36:37.760 So here's an example of that.
00:36:38.940 So Max Boot was tweeting today, and he says, don't accept the GOP framing of the assault
00:36:48.500 on Paul Pelosi as evidence of a problem plaguing, quote, both sides of the aisle.
00:36:54.500 Political violence in America is being driven primarily by the far right, not the far left.
00:36:59.840 And then he refers to a Washington Post link.
00:37:02.780 And so I saw that, and I said to myself, hmm, Max Boot.
00:37:10.380 Does everybody know Max Boot by reputation?
00:37:15.460 Does everybody here need an explanation of who he is?
00:37:19.920 I'll give it to you anyway.
00:37:21.400 All right.
00:37:22.000 So he's as Democrat as you can be, meaning that for people who are in the tank for Democrats,
00:37:28.840 you know, ride or die.
00:37:30.480 He's sort of a ride or die Democrat.
00:37:32.200 But he's also been associated with some of the most insane Democrat narratives that have
00:37:38.900 been, you know, let's say, discredited.
00:37:41.880 Now, I'm not going to say who's right or wrong about anything.
00:37:44.320 I'll just say that, you know, in many people's opinions, his views have been discredited.
00:37:50.860 That's fair.
00:37:51.540 You can say that about me as well.
00:37:56.440 So I tweeted this.
00:37:58.280 I said, once you know who the players are, meaning in this case, if you knew who Max Boot
00:38:03.480 is, I said, everything looks different.
00:38:06.660 Democrats will see this tweet as their new narrative, which is sort of what Max Boot is for.
00:38:13.880 I would say that Democrats know to look at Max Boot and others, a small group of people,
00:38:21.000 to know what their narrative will be, in the same way that the right might look at Tucker
00:38:26.480 Carlson.
00:38:27.680 So people on the right would wait for Tucker Carlson to have an opinion, for example, not
00:38:31.500 just him.
00:38:32.380 And then, you know, they'd form their narrative.
00:38:34.140 So Max Boot is one of those narrative-hardening characters, that once he's put out a good
00:38:41.080 narrative that others can adopt, you know, they see it, and just the fact that they see
00:38:46.800 him talking about it, they go, okay, that's a safe, it's got a link to it, it's got a little
00:38:52.260 meat to it, we can jump on that.
00:38:54.180 So, if you know that Max Boot is a narrative booster, or even narrative creator, then everything
00:39:02.620 looks different, doesn't it?
00:39:04.360 Because once you know that there's some people who are not in the business of even trying
00:39:08.420 to give you a balanced opinion, then you can see them for what they are.
00:39:14.480 And so I tweeted that, you know, everything is different once you know who the players are,
and Max Boot apparently saw my tweet and responded back to me, and he said, always
00:39:24.160 nice to hear from an expert in gaslighting.
00:39:27.000 Now, you know why this was interesting for me?
00:39:31.900 Because Max Boot told me what the Democrat narrative about me is.
00:39:37.320 I wasn't, I mean, I'd heard that about myself a lot, but this is the first time I got, like,
00:39:42.740 a confirmation that what the Democrats will lie about me is that I'm a gaslighter.
00:39:49.420 And I thought, oh, shit, that's good to know.
00:39:51.500 Now, obviously, I've seen people on Twitter call me a gaslighter, because everybody calls
00:39:56.140 everybody a gaslighter.
00:39:57.800 It's just sort of a universal thing everybody says.
00:40:01.120 But hearing it from him suggests that the only way they can
00:40:06.820 stunt my influence is by saying I'm a gaslighter.
00:40:12.820 So if they can frame the narrative that I'm the person who always makes stuff up, then
00:40:18.380 whatever I say, you can ignore it.
00:40:20.780 Oh, it's that makes stuff up guy, right?
00:40:24.060 So it's sort of the same way I treat John Brennan.
00:40:29.300 When John Brennan's on, I try to tell all of you, hey, hey, he's their official gaslighter.
00:40:35.280 So it doesn't matter what he says, because he's not in the business of telling you the
00:40:40.220 truth.
00:40:41.040 He's in the narrative business.
00:40:43.500 So once you know who's in the narrative business, you can just say, oh, there's a gaslighter.
00:40:49.800 So apparently Max Boot would be at least one person who thinks that of me.
00:40:54.860 And so when he said, always nice to hear from an expert in gaslighting, I agreed and
00:41:01.660 amplified.
00:41:03.940 I am, in fact, an expert on gaslighting.
00:41:07.640 And as I tweeted back to him, it's totally true.
00:41:11.580 I'm a trained hypnotist, but my expertise in gaslighting is limited to the spotting of
00:41:17.900 it because of ethics.
00:41:21.540 I don't think I could.
00:41:22.940 Have I ever done this?
00:41:25.680 Let me ask you.
00:41:26.740 Let me do an audit of my own ethics, okay?
00:41:31.060 So I need you to audit my ethics.
00:41:33.520 Have you ever seen me create a story that was intentionally fake?
00:41:39.660 Have I done that?
00:41:41.700 Have I ever created any narrative that was intentionally fake?
00:41:48.440 Because I'm trying to think if I ever did it just for fun or I was playing a hoax or anything.
00:41:53.560 I can't think of one.
00:41:56.700 Oh, yeah.
00:41:57.560 Okay.
00:41:57.960 When it was a joke, for sure.
00:41:59.300 Yeah, my IQ of 185.
00:42:01.620 So when it's an overt prank, yes.
00:42:06.320 But have I ever done it for, like, serious reasons?
00:42:10.380 I've said things that didn't turn out to be true.
00:42:14.220 We all do that.
00:42:16.120 But I thought they were true.
00:42:18.000 For example, I said that, you know, when Hillary was running, I thought she looked ill and I
00:42:23.160 predicted that she'd have a lot of problems.
00:42:25.500 So maybe she had fewer problems than I predicted, so that would just make me wrong.
00:42:31.680 But I believed it.
00:42:33.160 I wasn't telling you something I didn't believe.
00:42:36.020 That was my opinion.
00:42:37.040 Interesting.
00:42:41.040 All right.
00:42:41.520 Rick Wilson, who you know as someone who used to be identified as Republican, but now he's
00:42:47.280 transitioned into something like a Republican disliker, or at least mostly a Trump disliker,
00:42:55.540 I think, tweeted today that the Pelosi attacker, even though he was definitely a crazy
00:43:04.900 person, that it's the rhetoric of the right which activated him.
00:43:10.140 So he really was weaponized to attack Paul Pelosi by all of the, you know, anti-Nancy Pelosi
00:43:17.260 rhetoric.
00:43:17.920 And I decided to agree and amplify.
00:43:26.260 So I agreed about this point of using rhetoric to get crazy people to kill people.
00:43:31.640 That's a real problem.
00:43:33.420 All right.
00:43:34.020 When Rick Wilson says the way we talk about politics could be activating crazy people to
00:43:40.120 do stuff, that's real.
00:43:42.220 Yeah, we're on the same page on that.
00:43:44.040 That's real.
00:43:44.760 I don't know if there's anything you can do about it.
00:43:47.920 Like, within the limitations of, you know, free speech and political discussion, I don't
00:43:54.180 think you can fix it.
00:43:56.000 But it's true.
00:43:57.320 And it's a real problem.
00:43:58.940 But, so I agreed and amplified.
00:44:00.580 And I said, it's a strong point by Rick Wilson, because it is.
00:44:04.020 And I said, we've watched Rick Wilson trying for years to indirectly kill Trump by using
00:44:09.040 this method of riling up left-wing nutjobs with conspiracy theories about Russia and whatnot.
00:44:14.700 And the practice is as dangerous as he notes in his video.
00:44:19.880 So, agree and amplify.
00:44:21.960 I'll model it as many times as I can.
00:44:26.080 But do you see how often that works?
00:44:28.680 Have I sold you yet?
00:44:30.980 Watch me making you think past the sale.
00:44:34.120 Have I sold you yet?
00:44:35.420 Because I will.
00:44:37.400 That agree and amplify just kind of always works.
00:44:41.860 But it only works for ridiculous points.
00:44:46.220 So, if somebody makes a point that's just sort of ridiculous, then it almost always works.
00:44:51.380 There may be some exceptions, but, I mean, I have great success with it.
00:44:56.540 All right.
00:44:57.000 Here's a little glimpse of the future.
00:44:58.380 I've told you that I saw VR before you did.
00:45:03.220 You meaning most of you, not all of you.
00:45:05.900 So, but that now is, how long ago was it when I was first telling you I got a VR set at home?
00:45:12.320 Three years, something like that.
00:45:14.660 So, imagine three years in virtual reality, how the technology has improved.
00:45:20.000 Well, the VR that we usually see is the consumer type.
00:45:27.240 So, the kind that Meta would be making, you know, the kind that you strap on and you pay, let's say you pay several hundred dollars.
00:45:35.940 But it turns out that there's a real high-end company whose name I forgot to write down.
00:45:42.360 But there's one company that's doing the highest-end virtual reality goggles, and they're aiming toward a market of, like, medical stuff and maybe technical stuff where the degree of resolution and the reality that you see has to be super high.
00:45:59.400 It's not Oculus, and I think they use a technique where the focus of what you're looking at, like the center of where you're concentrating, is super crisp.
00:46:09.640 But the things that would be your peripheral vision are a little less crisp.
00:46:15.840 But that's how your actual eye works.
00:46:19.080 So, your brain interprets everything you see as being clear.
00:46:23.380 But the truth is, there's only this little space right in the middle where you're actually focusing that's clear.
00:46:29.720 And everything else is fuzzy.
00:46:32.080 But you don't care, and you don't see it that way because you're not looking at it.
00:46:36.220 You're just sort of aware that there's some peripheral vision.
00:46:38.680 So, that's one of the tricks they used, is to make it just like an eye.
00:46:42.980 So, they don't use too much resources for the peripheral.
00:46:45.740 It's just there, and your brain thinks it would be clear if you looked at it.
00:46:49.540 Because if you do, it's clear.
00:46:52.220 So, if you do look at it, it's incidentally clear wherever you look.
00:46:55.820 But it's just not clear until you look.
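[Editor's sketch] The trick described above is what graphics programmers call foveated rendering: spend full resolution only near the gaze point, and let the periphery be "approximately there." Here is a minimal illustrative sketch; the function name, the 5-degree fovea size, and the falloff numbers are invented assumptions, not values from any real headset.

```python
# A rough sketch of "crisp where you look, fuzzy elsewhere"
# (foveated rendering). All thresholds below are illustrative
# assumptions, not specs from an actual device.

def sample_density(angle_from_gaze_deg: float) -> float:
    """Fraction of full resolution to spend at a given angular
    distance from where the eye is currently pointed."""
    FOVEA_DEG = 5.0  # assumed size of the crisp central region
    if angle_from_gaze_deg <= FOVEA_DEG:
        return 1.0   # full resolution right where you're looking
    # Resolution falls off with eccentricity, with a 10% floor so
    # the periphery still exists, just at low fidelity.
    return max(0.1, FOVEA_DEG / angle_from_gaze_deg)

# Wherever the gaze lands, that spot gets density 1.0, so the
# world is "incidentally clear wherever you look."
print(sample_density(0.0))   # center of gaze
print(sample_density(20.0))  # peripheral vision
```

The point of the design is resource budgeting: the renderer never pays for detail the eye cannot resolve anyway.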
00:46:58.220 Now, do you think I'm teaching you something about virtual reality?
00:47:04.920 No.
00:47:05.320 What you're going to learn from virtual reality and AI is about your actual reality.
00:47:13.980 You're going to learn a lot about virtual reality, but it won't come close to the gigantic mindfuck you're about to have.
00:47:24.100 Which is, everything you think about reality is going to be disproved.
00:47:29.920 And very soon, like really basic stuff is going to be all disproved.
00:47:35.980 And the way it's going to be disproved is that we'll learn so much from virtual reality that it would allow you to question the really, the basics of your reality.
00:47:46.060 May I give you an example?
00:47:47.240 I think we live in a simulation, that this is actually a software construct, and that I only believe I'm real, but there's somebody who created me out of bits.
00:47:59.740 And it might have been me.
00:48:01.000 I might have created myself, and I'm living in my own avatar.
00:48:05.940 That's possible.
00:48:06.720 It's possible I created a virtual afterlife for myself after my organic body died.
00:48:14.400 I mean, it's possible that I'm the, you know, the legacy of the organic person who programmed me.
00:48:20.800 But here's the point.
00:48:22.600 If we are a simulation, the simulation would not add equal detail everywhere.
00:48:29.680 It would only put detail where you were looking.
00:48:33.400 Just like virtual reality.
00:48:35.500 The only way virtual reality could have the, let's say, the power to make you think you're living in a different world
is that they had to cheat on all the stuff that you're not looking at at the moment.
00:48:48.940 All of that had to be approximately there.
00:48:51.980 Otherwise, there wouldn't be enough resources to compute it all.
00:48:55.380 Every time you see the virtual reality world run into a physical block,
00:49:02.860 you're going to notice that the same physical limitation exists in what you thought was your real world.
00:49:10.460 Once you learn that 100% of the programming obstacles for a virtual world are identical to the ones in your real world,
00:49:22.020 that's the moment you'll know you're living in a simulation.
00:49:26.880 And here's where the proof will be.
00:49:28.860 If you build a virtual reality and you want your person to go for a walk,
00:49:36.380 and you say, all right, you're going to go for a walk in a forest,
00:49:40.300 and you can walk as far as you want,
00:49:42.800 would the virtual reality create an infinite forest, just in case?
00:49:48.020 Or would it fill in the forest as you walked?
00:49:51.620 And the answer is, of course, it would fill it in as you walked.
00:49:54.180 It would just stay a little bit ahead of you and harden it as you walked.
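[Editor's sketch] The fill-it-in-as-you-walk idea is what programmers call lazy, on-demand generation: nothing is computed until it is first observed, and once observed it is cached ("hardened") so it stays consistent. A toy sketch, with the class name, method names, and chunk contents all invented for illustration:

```python
import random

# A toy on-demand forest: a chunk of world is generated only the
# first time somebody looks at it, then cached ("hardened") so it
# is identical on every later visit. All names here are invented.

class LazyForest:
    def __init__(self, seed: int = 0):
        self._seed = seed
        self._chunks = {}  # position -> hardened chunk

    def observe(self, position: int) -> str:
        """Return the chunk at `position`, creating it on first look."""
        if position not in self._chunks:
            # Deterministic, position-specific RNG: the world is
            # consistent without precomputing anything in advance.
            rng = random.Random(self._seed * 1_000_003 + position)
            self._chunks[position] = rng.choice(["pine", "oak", "clearing"])
        return self._chunks[position]

forest = LazyForest(seed=42)
first_look = forest.observe(7)          # generated right now
assert forest.observe(7) == first_look  # hardened: same on revisit
assert len(forest._chunks) == 1         # the rest of the forest doesn't exist yet
```

An infinite forest costs nothing up front; only the walked path ever gets computed, which is exactly the economy argument in the prediction above.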
00:49:59.480 We are going to discover, here's my prediction,
00:50:03.180 that our actual reality does the same thing.
00:50:06.220 We will discover in our lifetime, because of virtual reality, I think,
00:50:10.540 that our actual reality doesn't exist until you need it.
00:50:15.960 It's on demand.
00:50:18.780 It's basically Schrodinger's cat writ large, right?
00:50:22.760 It's just Schrodinger's cat.
00:50:24.180 If you don't observe it, it doesn't need to be there.
00:50:27.140 And we...
00:50:28.440 Yeah, Chris, you're ahead of me.
00:50:31.640 So Chris, let me talk to you.
00:50:33.280 So Chris says we're already there.
00:50:35.480 I agree with you 100%.
00:50:37.200 What I'm telling you is that everybody else isn't there yet, Chris.
00:50:42.080 So you're there, and I'm there.
00:50:44.720 I'm totally there.
00:50:45.940 And a lot of you are there as well.
00:50:47.560 But the general public is very much not there.
00:50:50.680 But they will be.
00:50:51.460 Okay, you want to get a little more evidence of this?
00:50:55.600 So I tweeted, you can see it in my Twitter feed, highly recommended.
00:51:00.420 If you want to know what the next five years looks like, highly, highly recommended that you look in my Twitter feed today
00:51:07.840 and look for my tweet, there's a YouTube video where a guy who's sort of an expert on VR tried out these high-end glasses.
00:51:16.660 And here's what he describes as his experience.
00:51:19.340 Somewhat lifelike human characters would be in the virtual world.
00:51:28.940 The first thing you need to know is that in his virtual world, he could not distinguish between real and virtual objects.
00:51:36.400 So they put him in a room that had some real furniture and some virtual furniture.
00:51:41.280 He couldn't tell.
00:51:44.960 There was no difference.
00:51:46.680 He couldn't tell if the table he was looking at was real or had been put there.
00:51:50.720 He had to feel it.
00:51:52.140 If his hand went through it, it wasn't real.
00:51:54.300 If he touched it, it was real.
00:51:55.880 You ready for super weirdness?
00:51:59.420 This one will just blow your mind.
00:52:00.880 So then they also put him in a room where there were human-like people who looked as much as virtual reality can, like a real person.
00:52:09.480 That real person walks up to you like a real person and stands right in front of you.
00:52:14.900 Here's the weird part.
00:52:16.620 They're virtual, so you could put your hand right through them.
00:52:21.380 But you don't.
00:52:23.340 You touch them, and you can feel them.
00:52:27.720 You ready?
00:52:28.360 You can feel them.
00:52:30.880 And they're not there.
00:52:33.240 You put your hand on them, you can feel their warmth.
00:52:36.320 You can feel their shoulder.
00:52:38.380 It's not there.
00:52:41.560 Now, this is something we already know about.
00:52:44.040 It's called phantom.
00:52:45.200 You know, people lose a limb, can still feel their limb.
00:52:49.260 That's a very well-known thing, right?
00:52:51.480 Phantom limb.
00:52:53.440 How many of you have phantom cell phone buzzing?
00:52:57.500 Do you have the buzz in the pocket where you carry your phone?
00:53:02.740 A lot of you, right?
00:53:03.720 I have that.
00:53:04.220 I have that all the time.
00:53:05.740 I feel my phone buzzing, and it's not buzzing.
00:53:09.120 It's a phantom buzz.
00:53:12.240 Right?
00:53:12.340 So, wait till you find out that even in your real reality, the things you think you're touching
00:53:20.500 and feeling, you might not be.
00:53:23.760 You might not be.
00:53:25.060 You might not be touching anything.
00:53:27.380 You might not be feeling anything.
00:53:28.740 And I'm going to give you a super messed up thing now.
00:53:35.000 If you, in the real world, if you were to, say, jump off, let's say, a 10-foot height,
00:53:42.820 you would feel your stomach go, like, right?
00:53:46.060 You know if you're in an elevator, and the elevator comes to a stop, and you feel your guts changing?
00:53:50.520 Now, go into a virtual reality game in which you're made to seem as though you're also moving.
00:54:01.680 What happens to your guts?
00:54:04.420 Same fucking thing.
00:54:06.300 Same thing.
00:54:07.760 Your guts will react as if you're actually moving.
00:54:10.860 Or you'll feel like they do.
00:54:12.260 They don't actually move.
00:54:13.760 But you'll have the gut feeling.
00:54:16.660 Yeah.
00:54:16.800 Now, I know you can't see it yet.
00:54:23.460 Chris can see it.
00:54:25.420 So Chris may have spent some time in virtual reality.
00:54:28.280 You have to spend time in virtual reality to understand this.
00:54:31.760 I don't think I could understand it if I hadn't spent some time doing it.
00:54:35.860 And I did just the beginning, early stuff, right?
00:54:38.180 It's just the earliest stuff.
00:54:39.860 But wow.
00:54:42.240 Wow.
00:54:42.760 Do you have a mind fuck coming?
00:54:44.940 Wait till you see what's coming.
00:54:47.640 Speaking of Twitter, the New York Times tweeted a headline that said, quote,
00:54:56.020 Elon Musk, in a tweet, shares link from site known to publish false news.
00:55:04.080 Elon Musk retweeted that headline and said, this is fake.
00:55:09.400 I did not tweet out a link to the New York Times.
00:55:17.500 Is it just me, or does the air feel cleaner?
00:55:28.060 I feel more free.
00:55:31.680 Hold on.
00:55:32.820 Hold on.
00:55:33.300 Yep, it's a song.
00:55:37.220 There's a song in my body.
00:55:41.040 I can't even turn it off.
00:55:46.140 Is that a butterfly?
00:55:48.060 Okay, I swear I see a rainbow.
00:55:52.120 Unicorn?
00:55:53.360 Unicorn?
00:55:54.640 I don't know.
00:55:55.220 Just everything seems better.
00:55:57.420 Just everything seems better today.
00:55:59.360 Maybe this is the golden age.
00:56:03.780 Maybe this is the golden age.
00:56:06.100 Because honest to God, I have not awakened feeling so good.
00:56:12.840 I don't know.
00:56:13.780 It's hard to remember the last time.
00:56:15.540 It's been years.
00:56:16.940 But for the last several days, I just wake up feeling great.
00:56:20.160 Just feeling great.
00:56:21.020 Can't wait for the day.
00:56:21.760 It gets better.
00:56:24.380 I'm just building up to the best story.
00:56:26.720 We're not even at the best story yet.
00:56:29.980 It's coming.
00:56:33.240 All right, so when I saw that Musk had originally tweeted that link,
00:56:39.940 which I knew immediately was not a credible link,
00:56:43.680 my first thought was, oh my God, I wonder if he'll get canceled.
00:56:49.260 I actually had that thought.
00:56:50.640 I wonder if he's going to get canceled for that.
00:56:54.520 And then I thought, oh, hey, he owns Twitter.
00:57:00.640 He's the first person who can't be canceled.
00:57:04.400 He's the first one, right, in the world.
00:57:08.020 Correct me if I'm wrong, but he's the first person in the world who can't be canceled.
00:57:13.120 Am I wrong about that?
00:57:16.320 Because I can be canceled.
00:57:18.560 Let me put it in different terms.
00:57:19.760 I've told you before that I have fuck you money, right?
00:57:23.060 Meaning that I can say things you can't say, because if I get canceled, well, I'm not going
00:57:28.140 to be poor.
00:57:29.120 So that's fuck you money.
00:57:31.320 But what Elon Musk has is whatever is the level above fuck you money.
00:57:36.700 What he has, I'm going to coin a new phrase.
00:57:39.700 He has fuck your spouse right in front of you money.
00:57:42.440 That's different money, right?
00:57:46.000 That's different.
00:57:46.580 It's like, you know, I say I'm rich, but I'm not private jet rich, meaning I wouldn't waste
00:57:53.240 my money on a private jet.
00:57:54.440 Like, I don't have that much money.
00:57:56.700 All right.
00:57:57.400 But I have fuck you money, but I don't have fuck your spouse right in front of you money.
00:58:02.280 Elon Musk has fuck your spouse right in front of you.
00:58:04.560 Elon Musk, he's a free man.
00:58:08.360 He's as free as you can get.
00:58:12.480 But here's the best part.
00:58:14.400 If you haven't heard this story yet, watch how happy this makes you.
00:58:20.200 Are you ready for this?
00:58:21.800 And I hope some of you, is there anybody who hasn't seen the news yet?
00:58:24.920 I hope there's somebody here who hasn't watched the news, because I want to be the one who
00:58:29.040 tells you.
00:58:30.200 Can somebody?
00:58:31.080 Oh, good.
00:58:31.860 Good.
00:58:33.280 Good.
00:58:33.820 Good.
00:58:34.460 So some of you don't know this news from today, right?
00:58:37.480 Because it's brand new today.
00:58:39.260 Oh, this is so beautiful.
00:58:42.540 I get to tell you.
00:58:44.400 I get to tell you.
00:58:46.000 Here's the news.
00:58:47.900 I had not thought about it, but it turns out that Elon Musk, now owning Twitter,
00:58:54.320 has access to all of the internal communications of the staff.
00:59:01.800 How do you feel now?
00:59:04.680 And the first thing he found, he has tweeted, which is that, oh, I hope I wrote it down.
00:59:18.280 He found an internal communication in which the person who is in charge of the sort of
00:59:26.600 Twitter censoring stuff was accusing somebody else within the company of hiding data.
00:59:36.160 Not only hiding the data, but hiding it exactly as Musk is accusing them of doing.
00:59:42.380 And he threatened that he was going to basically be a whistleblower.
00:59:48.880 So I saw a lot of people come down hard on that, what's his name, Yoel something, who is
00:59:55.900 the head of, what was his actual job?
01:00:00.260 But he was head of some kind of, he was in charge of something important to, you know,
01:00:05.040 tracking down bots and stuff, right?
01:00:08.760 So Elon Musk has the actual communication that proves that he was right in his accusations.
01:00:16.500 It's just the beginning.
01:00:19.580 Security?
01:00:20.420 He wasn't a security guy, was he?
01:00:22.080 It must have been a wing of security.
01:00:24.660 Oh, here it is.
01:00:25.580 Here we go.
01:00:26.680 Somebody was nice enough.
01:00:27.740 It's Yoel Roth, and the thing you sent me, I can't open, unfortunately, but Yoel Roth.
01:00:38.080 So when people saw it, they were like, why is this Yoel Roth guy still working?
01:00:43.760 Because it looked like he might have
01:00:48.760 been part of the process of hiding information.
01:00:50.880 But if you read it more carefully, you can see that Yoel was the guy who
01:00:57.500 was criticizing the person hiding the information and was threatening to, you know, take it public
01:01:04.000 or at least public within the company.
01:01:07.140 And then interestingly, Musk actually tweeted his support for Yoel and says he has his trust.
01:01:17.460 And what's interesting is people found the past tweets of this individual, Yoel, and found
01:01:25.160 that he was, you know, pretty anti-Trump and very anti-Republican.
01:01:31.180 And Musk, to his credit, says that's just a political opinion, you know, but as an employee,
01:01:38.960 he has his trust.
01:01:40.780 And I thought, damn, damn, that is exactly what you want to see, isn't it?
01:01:48.180 The very thing you wanted Musk to do was to conspicuously keep somebody who hates Trump,
01:01:56.780 but does a good job.
01:01:58.380 That's exactly what I wanted to see.
01:02:00.520 And that's exactly what he served up.
01:02:02.840 I've told you about the new CEO move, right?
01:02:06.060 And Musk is just killing it.
01:02:08.660 The new CEO move is what you do in the first week.
01:02:11.060 Because you end up believing your first impression, it's hard to shake it, right?
01:02:17.780 So whatever the first impression is becomes your thing.
01:02:20.860 When Trump was first elected, but before he was sworn in, he and Pence started traveling
01:02:26.360 around and trying to make sure that their companies were hiring American and staying in America.
01:02:31.740 And that became your first impression of Trump.
01:02:35.300 Very good form.
01:02:37.500 You know, to give your first impression before he's sworn in, that's just great stuff.
01:02:42.600 And I called it out at the time.
01:02:44.000 But this is even better.
01:02:50.720 Because everything that Musk is doing is just what you wanted to see.
01:02:54.920 Now, here's the best part of the story.
01:02:56.840 It's interesting that he caught this one example of maybe some bad behavior.
01:03:04.320 We'll have to hear on the other side.
01:03:06.460 But he has access to all of the internal communication.
01:03:13.580 Everything that Twitter has done, he's going to find out.
01:03:20.800 Oh, my God.
01:03:22.440 Now, this is yet another warning I would like to give you.
01:03:25.820 Don't ever put anything in a digital message that you can't live with being public.
01:03:32.580 Do you all get that now?
01:03:35.280 Like, it used to be that you thought, well, I'm an average citizen.
01:03:39.180 I'm not doing anything.
01:03:40.320 Nobody's even going to look anyway.
01:03:42.380 Don't put it in any kind of communication.
01:03:44.560 Don't put it in a DM.
01:03:46.340 Don't put it in, and especially, especially, don't put it in an encrypted app.
01:03:51.480 If you think you're safe because an app is encrypted, and that's its whole point, the
01:03:56.980 whole point is that nobody can see it, you are not safe.
01:04:00.400 You're not even close.
01:04:01.940 Because you know what people really want to get at?
01:04:04.640 Your encrypted apps.
01:04:05.740 So they can read it on your phone before it gets sent.
01:04:09.680 The person you send it to, maybe their phone is hacked, so it can be read from the phone.
01:04:15.680 Maybe the person himself is not trustworthy.
01:04:19.060 There are a million ways that that message is going to get out.
01:04:21.480 So do not write down anything you don't want somebody to read, period, ever.
01:04:27.560 And I ask myself if I've followed that advice, and the answer is almost.
01:04:34.360 And by that I mean the only reason I'm safe is that I don't have any sense of embarrassment
01:04:39.180 and I have fucking money.
01:04:40.980 So I can't think of anything that could hurt me that would be in any of my messages.
01:04:45.440 I mean, there might be things that would titillate and excite you, and you'd be like,
01:04:48.900 whoa, didn't know that.
01:04:50.540 It could be that kind of stuff.
01:04:52.280 But nothing that would really hurt me.
01:04:54.460 So be like that.
01:04:56.500 This is one case where you should be like me.
01:04:59.340 And I almost never tell you to do that.
01:05:01.960 Because being like me is usually a recipe for a disaster.
01:05:06.080 But this one time, be like me.
01:05:08.680 Don't write anything in a message that you don't want others to see.
01:05:16.700 Here's a typical tweet from the left.
01:05:20.540 Somebody who's mad because there's this hashtag, Pelosi gay lover going around.
01:05:26.760 We'll talk about Paul Pelosi in a moment.
01:05:29.380 And a lefty account said, yeah, Twitter advertisers, time to bail.
01:05:35.240 So somebody who uses Twitter, who is encouraging the advertisers to leave.
01:05:40.500 Have I ever mentioned that Democrats don't have a firm understanding of economics?
01:05:48.960 Have I ever mentioned that?
01:05:52.880 I've seen no better case.
01:05:55.040 Here's somebody who thinks that they can enjoy their Twitter experience.
01:05:58.840 Here, I'm reading between the lines, but I think that's safe.
01:06:01.260 I think that if they're a Twitter user, they obviously like Twitter, right?
01:06:07.200 If you're using it, and you like it enough to use it.
01:06:10.880 Here's somebody who likes it, but thinks maybe they can still use it if the advertisers go away.
01:06:15.880 Well, maybe.
01:06:20.100 Maybe Musk will pay for it all himself.
01:06:23.100 But I wouldn't encourage the advertisers to go away if you want the model to continue.
01:06:30.760 I think, didn't Jack Dorsey say that the advertising model was the mistake?
01:06:36.700 I thought I saw that, right?
01:06:37.720 So even Jack Dorsey knew that advertising was sort of the road to ruin.
01:06:43.600 And maybe Twitter will become a non-advertising model.
01:06:50.660 What would you pay to use Twitter?
01:06:56.100 So Twitter has how many users?
01:06:58.720 Are there a billion users?
01:07:01.840 How many users does Twitter have?
01:07:04.460 Give me that quickly.
01:07:05.260 Quickly, how many users does Twitter have?
01:07:08.600 It's under a billion, right?
01:07:11.300 700 million or something.
01:07:13.520 I'm seeing guesses from 200 million?
01:07:18.860 Yeah, somebody who actually knows.
01:07:20.860 I'm seeing a number of people say 300 million.
01:07:23.720 Is that really the right number, 300 million?
01:07:28.000 Nobody knows.
01:07:30.120 All right.
01:07:31.020 Well, if you're a blue check account, you would pay more.
01:07:36.260 Because having a blue check experience on Twitter is just a better experience.
01:07:40.720 I can tell you that having a blue check just makes everything different, right?
01:07:44.340 I have a completely different experience than you do.
01:07:47.240 I would pay for that.
01:07:49.500 Five dollars a month, easily.
01:07:51.780 Easily.
01:07:52.360 Five dollars a month.
01:07:54.160 But there aren't that many blue checks compared to other people.
01:07:57.280 So would the rest of you who are just using it to read content pay a dollar?
01:08:02.940 Dollar a month?
01:08:06.500 You know what kind of advertisement I would accept?
01:08:11.560 I've never seen this.
01:08:12.880 I wonder if anybody's tested it.
01:08:14.120 I would accept challenge-out advertisement.
01:08:18.800 Challenge-out.
01:08:20.500 So imagine if you want to see a video, instead of having to sit through a minute of an ad,
01:08:26.860 which seems like forever, suppose they just slapped up a multiple-choice test.
01:08:32.220 And it says, let's say, Toyota Prius.
01:08:35.540 Gas mileage is 20 miles per gallon, 30 miles per gallon, 75.
01:08:42.720 I don't know what the real number is.
01:08:44.680 But if you correctly guess the right gas mileage, it goes away and you're on with your content.
01:08:50.540 Do you know how quickly you could answer a multiple-choice?
01:08:55.240 Like three seconds, right?
01:08:58.140 Prius, boop.
01:08:59.140 You don't even need to get the right answer, because it'll show you the right answer if you get the wrong answer.
01:09:04.340 You don't even get weighed down by that.
01:09:07.820 Now, suppose it asked me a question I didn't know the answer to.
01:09:11.480 I would actually be interested.
01:09:13.880 Let's say it was some kind of laundry soap product, which typically would not be served to me,
01:09:20.260 because I wouldn't see that kind of ad.
01:09:22.060 But let's say it did.
01:09:22.740 And it said something like, laundry soap is also flammable under the following condition.
01:09:32.260 I don't even know if that's a thing, but it might be.
01:09:35.340 Now, I would be interested in that.
01:09:37.440 And I would say, what?
01:09:39.040 And I would actually stop.
01:09:40.860 I would remember the name of the company.
01:09:43.240 And it might actually bind me to them a little bit.
01:09:45.940 And it would still, it wouldn't slow me down.
01:09:48.540 It would be a little bit of a distraction before I watched my video, but still, five seconds.
01:09:53.200 Five seconds.
01:09:55.520 And if you ask me, that would be better for the advertiser.
01:10:00.100 All right, here's how I would do it.
01:10:02.160 Here would be my advertisement.
01:10:04.440 Pop.
01:10:04.820 Pops on the screen.
01:10:06.720 You could read Dilbert in newspapers, but where else?
01:10:11.140 And then it would say, newspaper websites?
01:10:14.740 No, it wouldn't say that.
01:10:15.940 It would say, in the New York Times, somewhere on TV, and then at Dilbert.com.
01:10:25.400 And then, of course, I would want you to know, yeah, there's some other answers.
01:10:30.160 But I would want you to know that the real answer that I'm trying to lead you toward is Dilbert.com.
01:10:35.000 So I would have informed you that there's a place you can see my comics.
01:10:39.460 It would have taken you no more than five seconds.
01:10:42.520 And then you'd go on with it.
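The flow described here is simple enough to sketch: show one multiple-choice question, accept any answer, reveal the correct one either way, and release the content. This is purely a hypothetical illustration of the idea, not any real ad system's API; the class and field names are made up.

```python
from dataclasses import dataclass


@dataclass
class ChallengeAd:
    """Hypothetical 'challenge-out' ad gate: one quiz question instead of a timed ad."""
    brand: str
    question: str
    choices: list[str]
    answer_index: int

    def gate(self, chosen_index: int) -> str:
        # Any answer unlocks the content; a wrong pick just reveals the right
        # answer, so the viewer is never weighed down by failing.
        correct = self.choices[self.answer_index]
        if chosen_index == self.answer_index:
            return f"Correct: {correct}. Enjoy your video."
        return f"Actually: {correct}. Enjoy your video."


# Using the Dilbert example from above as the sample question.
ad = ChallengeAd(
    brand="Dilbert",
    question="You could read Dilbert in newspapers, but where else?",
    choices=["Newspaper websites", "Somewhere on TV", "Dilbert.com"],
    answer_index=2,
)
print(ad.gate(0))  # wrong pick: still unlocks, but shows "Dilbert.com"
```

Either branch takes the viewer a few seconds and leaves them with the answer the advertiser wanted them to see, which is the whole point of the design.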
01:10:44.680 Now, that's better than paying for it, isn't it?
01:10:46.480 Your ad block is keeping you from seeing Dilbert?
01:10:53.800 Well, all right.
01:10:58.720 Let me ask you this.
01:11:02.500 If advertisers left Twitter, would Twitter be a safer place or a less safe place?
01:11:10.120 Go.
01:11:11.540 So this is, again, the left not understanding economics.
01:11:14.400 If advertisers left, it would be safer, you say.
01:11:23.540 Let me offer you the devil's advocate counterpoint.
01:11:29.440 If every time you tweeted, you didn't have to worry about anybody,
01:11:35.680 would you tweet the same as if you did have to worry about at least advertisers wanting you to get kicked off?
01:11:42.200 I feel like the advertisers create a guardrail that can be both good and bad.
01:11:50.300 Would you agree with that?
01:11:51.940 It definitely creates bad.
01:11:53.660 So that part we can stipulate.
01:11:55.300 We all agree.
01:11:56.340 The advertisers can cause bad distortions.
01:12:00.180 100% agreement on the bad part, right?
01:12:02.140 But is there not a balancing effect that it keeps us all within a socially appropriate channel for the most part?
01:12:13.860 Because the thing about social media is that we act like monsters in ways we wouldn't in person.
01:12:21.340 But the advertisers are sort of like a conscience, you know, with a bite to keep you acting a little bit closer to how you would in person.
01:12:30.300 I don't know.
01:12:30.620 I'm not a big fan of the advertising model, but I'm not sure I understand totally how it would play out if you took it away.
01:12:44.900 Imagine being a left-leaning political person, and you have these two choices now that Musk owns Twitter.
01:12:52.960 If you stay there, you're going to be exposed to, let's say, painful freedom.
01:13:04.880 Painful freedom.
01:13:06.260 It's actually going to hurt.
01:13:08.420 If you leave, then you're not part of the game.
01:13:12.500 You lose your power.
01:13:14.480 Because Democrats have a lot of power because of their influence on Twitter.
01:13:18.100 I've argued that Twitter is the tail that wags the entire media dog, and it goes like this.
01:13:26.020 Even though you say to yourself, no, Twitter is just one more outlet.
01:13:30.240 No, Twitter is not just one more outlet.
01:13:33.380 Twitter is where every professional journalist goes to figure out what they can and cannot say.
01:13:40.660 It's way bigger than just one outlet, right?
01:13:44.340 Twitter's influence on the people who can be influenced, the ones who make a difference, the influencers, is everything.
01:13:52.840 And I don't think Facebook has that, not Instagram, not LinkedIn.
01:13:57.000 It's only Twitter.
01:13:59.200 Twitter is the tail that's wagging all of the dog, and Musk now owns that.
01:14:04.160 So he owns the thing that controls other people in a way that nobody else ever has.
01:14:14.520 And so now the Democrats have this weird choice.
01:14:19.520 They can either stay on Twitter and be exposed to painful freedom because it's going to hurt,
01:14:25.880 or they can leave and give up their power.
01:14:28.540 Because if you get off of Twitter, you've really given away your power.
01:14:33.240 And I mean the Democrats' power, not just an individual's power of influence, but the Democrats as a whole.
01:14:38.280 The fewer Democrats there are on Twitter, the less influence Democrats have.
01:14:44.240 Would you agree with that?
01:14:45.700 The fewer Democrats on Twitter, substantially, the less power they will have in the real political world.
01:14:52.160 Because Twitter is where all power emanates, and from there, all things happen.
01:14:59.140 But if Twitter says you can't say that, they're not going to say it.
01:15:04.120 And if Twitter says it's true, then they'll say it.
01:15:07.400 That's what I think.
01:15:08.580 Because Twitter is where you go to get mocked for a bad job if you're a journalist, right?
01:15:14.460 If you're a journalist, let's say that the journalist who interviewed Fetterman, I can't remember her name.
01:15:19.900 Don't you think she loved that day when all the people were saying,
01:15:24.340 great job, you know, you were honest, you finally told the truth.
01:15:27.700 Like, that was a huge dopamine hit.
01:15:30.160 But then also people on her side were saying, damn you for hurting our candidate.
01:15:34.900 Yeah.
01:15:35.300 I don't think people could quit.
01:15:37.240 So the Dems have got a bad choice.
01:15:43.880 I saw somebody on the left say that Musk is obviously a sociopath.
01:15:49.900 Musk is a sociopath.
01:15:56.080 That would conflict with about 100% of everything you observe about him.
01:16:04.680 I don't think...
01:16:06.360 I mean, to me, that is the most absurdly wrong comment I've ever heard in my life.
01:16:10.720 His entire Mars thing is to spread the light of consciousness.
01:16:16.560 Like, there's nobody who cares about humans more demonstrably, you know, working on it every day, putting his heart and soul, his blood into it.
01:16:28.720 Nobody's working harder than he is on behalf of humanity.
01:16:32.160 I mean, literally, just count the hours he works trying to save the planet from climate change as he sees it, trying to save the planet from, you know, energy shortages as he sees it, trying to save us from a planet that can't sustain us forever.
01:16:46.600 Honestly, how much of that sounds like a sociopath?
01:16:51.840 It's like the word lost all meaning.
01:16:54.380 It lost all meaning.
01:16:55.360 He's got nine kids because he apparently loves kids.
01:17:00.060 He jokes like a person who's definitely not a sociopath.
01:17:05.440 And I have real questions about his claim that he's Asperger's.
01:17:11.280 Like, I'm not seeing it.
01:17:13.020 I don't see it at all.
01:17:15.820 I don't know.
01:17:17.280 I'm no expert, but I don't see it.
01:17:24.100 All right.
01:17:24.740 I saw an interesting couple of articles by a fellow named Will Lockett, who writes in an online publication called Predict.
01:17:35.860 I don't know either the person or the publication, but he had some really interesting claims I want to run past you.
01:17:43.220 Did you know that your thermonuclear stockpile has to be continuously refreshed with tritium?
01:17:52.060 Is that right?
01:17:52.660 Tritium?
01:17:53.020 Let me make sure I got the right material here.
01:18:00.500 No, not tritium.
01:18:04.060 Lithium or something.
01:18:06.000 No, tritium.
01:18:07.120 Yes, tritium is the right word.
01:18:09.240 So tritium apparently doesn't last long.
01:18:12.920 All right.
01:18:13.500 Tritium doesn't last long.
01:18:15.400 So if you want your thermonuclear weapons to work, you have to continually be adding to them and topping them off.
01:18:23.020 The ability to continually produce this stuff, the tritium, is really, really expensive.
01:18:37.200 It's like $30,000 per gram. I didn't see how many grams you need, but it must be enough to matter.
01:18:43.000 Now, you can't stockpile it.
01:18:44.840 This is important.
01:18:45.580 Because there would be no point in stockpiling the very thing that, you know, decays over time.
01:18:52.800 So you can't stockpile it.
01:18:54.120 So you have to make it as you need it.
01:18:55.480 So if Russia is running out of money, which is suggested by the fact that they've asked their soldiers to buy their own equipment in some cases,
01:19:08.040 there are clear indications that Russia has got a budget problem.
01:19:12.560 Don't know how much.
01:19:14.040 It could be no big deal.
01:19:15.200 But as Will speculates, there's a non-zero chance that their thermonuclear weapons already don't work.
01:19:24.340 Now, we should not make any serious decisions based on that being true, because you'd hate to be wrong.
01:19:29.520 I'd be really surprised if they don't have a few that are working.
01:19:32.720 You know, it seems to me that they would keep a few fresh, right?
01:19:35.660 So I don't think they're all broken.
01:19:37.980 But I do wonder if Putin knows which ones work, because it seems to me that the worst thing you could do is attempt to fire a nuke and it doesn't work.
01:19:53.480 You'd be really in trouble then, because you don't know if the American nukes work, but I'll bet they do.
01:20:01.780 Not all of them, but I'll bet they do.
01:20:03.620 So I don't know that we can make anything in terms of a decision about this,
01:20:11.060 but it's interesting to know that if you don't have a huge budget,
01:20:15.580 your nuclear weapons become obsolete fairly soon.
01:20:20.640 I think maybe within a year or so.
01:20:22.720 Does anybody have it?
01:20:23.580 How long does it take for your tritium to wear out?
01:20:31.260 Eight days?
01:20:33.620 Twelve-year half-life?
01:20:36.280 But is the half-life what we're looking at, or just the point where it's decayed too much to work?
01:20:42.940 All right, well, we don't know.
01:20:45.500 People are guessing.
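For what it's worth, the half-life question has a clean answer in the arithmetic of exponential decay. Here's a minimal sketch, assuming tritium's published half-life of roughly 12.3 years; the 90% threshold is purely illustrative, not a real weapons specification.

```python
import math

HALF_LIFE_YEARS = 12.3  # tritium's half-life, approximately


def fraction_remaining(years: float, half_life: float = HALF_LIFE_YEARS) -> float:
    """Fraction of the original tritium left after `years` of decay."""
    return 0.5 ** (years / half_life)


def years_to_fraction(target: float, half_life: float = HALF_LIFE_YEARS) -> float:
    """Years until the remaining fraction drops to `target`."""
    return half_life * math.log(target) / math.log(0.5)


print(round(fraction_remaining(1), 3))    # ~0.945 left after one year
print(round(years_to_fraction(0.90), 2))  # ~1.87 years to drop to 90%
```

So the half-life alone doesn't set the service interval; what matters is how much decay a warhead can tolerate before it stops working, which would put the top-off schedule well inside the 12.3-year figure, roughly consistent with the "within a year or so" guess.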
01:20:47.920 Here's another thing that Will Lockett says.
01:21:49.800 There's a recent paper out of the University of New South Wales
01:21:53.020 where researchers figured out how to recycle solar panels.
01:20:58.540 You know how solar panels are a big polluting problem?
01:21:03.200 Because once you make them, if they go bad, once they wear out, there's nothing to do with them.
01:21:08.700 Well, some company cleverly figured out how to, what do they do?
01:21:12.460 They remove the aluminum frame, and then they've got some electrostatic process
01:21:17.140 that separates all the valuable materials that they can reuse.
01:21:21.700 So instead of being mostly toxic garbage,
01:21:25.560 they become mostly a source for valuable materials.
01:21:31.100 Now, remember, it's just a paper.
01:21:33.800 Just a paper doesn't mean that it can actually work in the real world.
01:21:37.040 But it's nice to know somebody's working on that problem.
01:21:40.040 Well, I'd be surprised if it doesn't get solved eventually.
01:21:44.420 So that's the good news.
01:21:47.140 Now, that, ladies and gentlemen,
01:21:50.360 brings us to approximately the end of what has to be
01:21:55.640 the most entertaining live stream you've ever seen in your life.
01:22:00.560 And I know.
01:22:05.440 It's really surprising that I can do it every single time.
01:22:08.820 It's hard to top myself every day, but so far, it just keeps getting better.
01:22:14.240 Yes, and it's Halloween.
01:22:17.580 I will be handing out candy at my house.
01:22:21.340 If anybody wants to come say hi, that would be an excellent time to do it.
01:22:26.540 So if you live anywhere in my driving distance,
01:22:29.400 it isn't hard to find my house.
01:22:34.440 You can find it pretty easily.
01:22:36.280 So come to my house.
01:22:37.180 I'll give you some candy.
01:22:38.820 But I might run out, so.
01:22:44.660 Let's declare a pandemic amnesty, in The Atlantic.
01:22:47.560 Even The Atlantic.
01:22:49.020 All right.
01:22:52.200 I think I'm going to put on a green T-shirt and go as Zelensky.
01:22:58.540 Did I talk about Pelosi?
01:23:00.160 Oh, good catch.
01:23:01.100 I didn't really talk about Pelosi, did I?
01:23:02.740 All right.
01:23:03.840 Let's talk about Pelosi.
01:23:04.640 Let's talk about Pelosi.
01:23:06.340 So here are the mysteries so far.
01:23:09.380 And you can update me in case any of these got solved in the last hour.
01:23:13.440 Number one.
01:23:14.540 Oh, his 911 call came out.
01:23:17.100 Number one.
01:23:18.280 Why did he know the name of his assailant?
01:23:22.180 I don't find that too strange because he may have asked the name.
01:23:28.240 He may have just asked him his name.
01:23:30.020 Maybe he just told him.
01:23:31.560 Because I think there was some interaction no matter what.
01:23:34.800 If he said his friend, that may be because he knew he was dealing with a crazy person.
01:23:41.160 It may be because he knew he was dealing with a crazy person.
01:23:44.240 It may be because it was his friend.
01:23:46.980 Right.
01:23:47.480 So we don't know.
01:23:48.800 The question of the security cameras.
01:23:53.440 That's a pretty big question, isn't it?
01:23:55.780 And the no staff and no security.
01:23:59.020 I'm going to tell you something that I probably shouldn't, but it's relevant to this story.
01:24:08.420 In many cases, the people you think should have security don't.
01:24:14.060 Because having security is a giant pain in the ass.
01:24:17.520 Even if you can afford it.
01:24:19.120 It's a giant pain in the ass.
01:24:21.160 So I do think that there are some people, especially like a spouse of a...
01:24:29.020 Politician.
01:24:30.280 Probably they could.
01:24:31.520 They could afford it and just don't.
01:24:34.320 Yeah.
01:24:34.920 So I think there's a whole lot of people...
01:24:37.360 So here's what I don't find completely surprising.
01:24:40.760 I'm not surprised he knew his name.
01:24:42.920 That doesn't tell you everything.
01:24:44.960 I'm not surprised that he didn't have security.
01:24:49.960 I know you are, but I'm not surprised by that.
01:24:54.000 I'm also not surprised if his security camera didn't work.
01:24:59.020 Has anybody ever owned security cameras at your house?
01:25:04.360 Let me give you my experience.
01:25:08.540 All right.
01:25:09.180 I've owned maybe five expensive security systems for various homes.
01:25:16.420 None of them worked.
01:25:18.700 Let me say it again.
01:25:20.240 I've had probably five expensive security systems for different homes at different times.
01:25:24.920 None of them worked.
01:25:27.940 I actually have a redundant system now.
01:25:30.760 So I have two security systems because the odds of one of them working are less than 50%.
01:25:36.380 Let me say that again.
01:25:38.760 I have two security systems, two completely redundant systems, because the odds of any one of them working is low.
01:25:46.140 Probably half of the time, maybe at best.
01:25:48.720 Probably half of the time.
01:25:49.780 Let me say that again.
01:25:52.260 It works half of the time.
01:25:55.700 All right.
01:25:57.740 My current security system, set up by professionals for whom I paid a lot, if I open it up on my app, it just crashes.
01:26:07.040 And I ask them to fix it, and they're busy.
01:26:11.180 For months.
01:26:12.040 Now, I have an entirely second system that is working at the moment.
01:26:19.300 So I do have security, but it's my blink camera.
01:26:24.500 Like, if I didn't buy a blink camera system, I wouldn't have security right now, for all practical purposes.
01:26:34.320 Have I ever told you I have this problem?
01:26:37.680 Well, all right.
01:26:38.400 Let's go back to Paul Pelosi.
01:26:39.340 So I'm not surprised if security cameras don't work.
01:26:42.660 That's actually normal.
01:26:44.640 And when you heard that Epstein's camera was off, did you all say to yourself, my God, the only way a security camera could not be working is if somebody intentionally turned it off.
01:26:55.620 And do you know what I said?
01:26:57.020 No, it's about a 50% chance.
01:27:00.020 I'll bet if you went into any prison right now, you'd have problems with their security cameras somewhere.
01:27:05.840 Somewhere.
01:27:06.860 Yeah, in a prison.
01:27:07.540 Yes, in a prison.
01:27:10.460 Even in a prison.
01:27:11.940 That's right.
01:27:13.840 Because, you know, I'm a perfect example of somebody who should never be without a security system.
01:27:19.540 But lots of times I would be, if I didn't have two.
01:27:22.020 If I didn't have two, I wouldn't have a security system right now.
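The redundancy arithmetic here is worth spelling out. Assuming the roughly 50% per-system reliability described above, and assuming the two systems fail independently, doubling up raises the odds that at least one is working:

```python
def availability(p_each: float, n: int) -> float:
    """Probability that at least one of n independent systems is working."""
    return 1 - (1 - p_each) ** n


print(availability(0.5, 1))  # 0.5  -> one system: a coin flip
print(availability(0.5, 2))  # 0.75 -> two systems: three times out of four
```

Two coin-flip systems still leave a one-in-four chance of having no working camera, which is why an unexplained dead camera, whether at Epstein's prison or the Pelosi house, is a lot less suspicious than it sounds.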
01:27:24.300 All right.
01:27:28.560 Paul Pelosi.
01:27:30.400 Now, what do you feel about the fact that Elon Musk tweeted something that clearly was a non-credible source, which he probably knew?
01:27:42.320 We don't know.
01:27:42.820 I love the fact that here's what I think.
01:27:46.920 I think when he tweeted the non-credible source, it was more like a proxy for just the hypothesis that there's something more to the story.
01:27:56.880 That's how I took it.
01:27:57.700 I didn't take it that that story was necessarily credible, but rather he was just sort of teasing them that there might be more to the story.
01:28:05.840 It seemed more playful than serious.
01:28:08.540 And so that I just loved because it bothered them so much.
01:28:14.780 I just loved it because it bothered them, I guess.
01:28:16.680 I can't be proud of that.
01:28:21.640 So now, if Paul Pelosi had been gay, we'd presume that didn't start overnight.
01:28:29.340 If that's true, do you think you wouldn't know about that already?
01:28:33.680 What do you think of the odds you wouldn't know about that already?
01:28:38.540 Well, okay, you'd call it bi.
01:28:41.180 Yeah, what are the odds you wouldn't know about that already?
01:28:43.760 Or what are the odds you wouldn't have heard some confirming stories by now?
01:28:48.340 Like, by now, you already would have heard of the guy who said,
01:28:52.220 yeah, I met him at the bar that time.
01:28:54.660 You know, he's been bi or gay for years.
01:28:57.580 Where are all those people?
01:28:59.460 Because you'd know that by now they would come out of the, right, yeah, they'd be coming out of the woodwork.
01:29:04.180 Because if it were true, and I think probably not, but if it were true that he had some relationship with this intruder,
01:29:15.260 why would he pick that guy?
01:29:18.120 And if he did pick that guy because it was a mistake,
01:29:21.680 don't you think there would be a whole, like, a history of other guys?
01:29:25.260 Unless you don't think one of those people would want to cash in?
01:29:31.360 If he had some history of, let's say, you know, sex with marginal people who could barely support themselves,
01:29:41.960 if that were true, one of those wouldn't take a $10,000 payday?
01:29:45.560 You don't think somebody would want to get the $10,000 payday for saying, oh, yeah, I was with him that time?
01:29:52.580 Of course, of course they would.
01:29:54.640 So the longer we don't hear of any confirmation, the less likely the story is.
01:30:01.120 Now, is there any good reason they would both be in their underwear?
01:30:06.080 Yes.
01:30:07.640 One of them just woke up, and the other one is a known nudist who walks around in his underwear.
01:30:14.540 Yeah, there is.
01:30:16.900 Is there any reason that the glass that's broken would only be broken outside?
01:30:23.720 Yes.
01:30:24.440 Yes, there is.
01:30:25.580 Because he got in some other way, and then a fight happened, and something went through a window from the inside out.
01:30:33.220 Easily.
01:30:34.200 Yeah.
01:30:34.880 So everything that you think is proving that he had a gay relationship has another explanation that's far more likely.
01:30:43.740 Right?
01:30:43.900 Far more likely.
01:30:45.920 Now, can I rule out that there was some kind of something going on?
01:30:50.900 No.
01:30:51.600 No.
01:30:52.360 But I'm not willing to believe it.
01:30:55.000 We have a confirmation there was only one hammer.
01:30:58.460 So it was a fake news that there was another person present.
01:31:03.460 At one point, somebody else answered the door.
01:31:06.920 But now we hear that that wasn't true.
01:31:09.080 There were only two people there.
01:31:10.100 So here are all the things that can easily be explained away.
01:31:14.600 Security systems routinely don't work.
01:31:17.320 Very normal.
01:31:18.040 People you think would have security often don't.
01:31:23.500 It would surprise you, but they often don't.
01:31:27.000 Being in their underwear is explained.
01:31:29.340 Broken window can be easily explained.
01:31:31.340 Knowing his name can be explained.
01:31:35.520 Knowing his name can be explained.
01:31:35.640 He probably just asked him.
01:31:37.700 Probably just asked him his name.
01:31:40.620 Calling him a friend can be explained.
01:31:42.200 Because he might want to keep him calm.
01:31:45.280 Right?
01:31:45.820 Because that would actually be a good technique.
01:31:48.780 If you have a crazy guy who looks dangerous come to you, try that technique.
01:31:55.040 Try the Paul Pelosi technique.
01:31:57.360 So my friend is here.
01:31:59.640 Hey, my friend.
01:32:01.080 Hey, friend.
01:32:01.600 Let's talk.
01:32:02.780 What's bothering you, friend?
01:32:03.760 So you automatically went to the worst assumption about a very smart guy.
01:32:12.440 Right?
01:32:13.240 You went to the worst assumption that something was going on there.
01:32:16.820 If you say he's a smart person who still has his faculties, then he has game.
01:32:22.880 And if he has game, calling this guy a friend when he's dangerous is exactly the right play.
01:32:29.560 Persuasion-wise, that's right on target.
01:32:31.640 I would have done the same thing.
01:32:33.520 Yeah.
01:32:33.760 If somebody comes in who's dangerous but crazy, you treat them like a friend.
01:32:39.280 You got that?
01:32:41.620 And it wasn't until the hammer came into the question.
01:32:46.700 Remember, Pelosi brought the hammer.
01:32:49.360 It wasn't until the hammer was introduced that the guy flipped out.
01:32:53.740 So Pelosi had actually controlled the situation by calling him a friend up until that point.
01:33:00.760 All right.
01:33:00.960 Here's the other bullshit part of the story.
01:33:02.460 Was it Biden who said this?
01:33:05.240 That the crazy guy was chanting January 6th slogans.
01:33:11.140 Did you hear that?
01:33:12.700 And then did you hear what the slogan was?
01:33:15.360 Where's Nancy?
01:33:16.220 Because some of the January 6th people were saying, where's Nancy?
01:33:19.860 Where's Nancy?
01:33:20.440 Now, what is the most ordinary question you'd ask if you were, let's say, looking for Nancy?
01:33:28.820 If you were looking for Nancy, what words would you choose?
01:33:36.440 I would have chosen, where is Nancy?
01:33:40.000 And I might even use a contraction and say, where's Nancy?
01:33:43.020 And if he didn't answer, I might ask again.
01:33:46.500 Because we do assume that he broke into the house to confront her, not him, right?
01:33:51.380 So if you confront somebody's spouse in the home in which you were trying to talk to the other spouse, what would be the normal question you would ask?
01:34:02.840 Where's Nancy?
01:34:03.520 How many times would you ask the question?
01:34:06.760 As many times as you needed to.
01:34:08.980 Where's Nancy?
01:34:09.640 No, where's Nancy?
01:34:10.560 Where's Nancy?
01:34:11.440 Where's Nancy?
01:34:12.880 And then Joe Biden says he was chanting January 6th stuff.
01:34:16.100 That fucking piece of shit.
01:34:18.880 I mean, Joe Biden is really just a piece of shit.
01:34:21.200 He really is.
01:34:22.600 Remember when, I remember when I used to think, well, I disagree with his policies and, you know, maybe he's misguided.
01:34:28.640 And, you know, oh, he makes some gaffes and stuff.
01:34:31.980 But he's just a piece of shit.
01:34:34.040 He's like just a piece of shit.
01:34:36.100 He really is.
01:34:39.120 And this was a perfect example.
01:34:41.460 All right.
01:34:42.000 Now, can you name anything else that I cannot explain away this easily?
01:34:46.200 Anybody?
01:34:47.200 And while you're naming that, now that I've explained away all the suspicious things, did I miss anything?
01:34:56.040 Is it right?
01:34:56.740 Both in their underwear makes sense.
01:34:58.960 One just woke up, two in the morning, and one is a known nudist.
01:35:03.340 Totally explained.
01:35:07.320 It was not a wellness check.
01:35:09.240 It was not a wellness check.
01:35:10.560 That's fake news.
01:35:12.120 Pelosi called 911 and they came.
01:35:14.640 Because he snuck into the bathroom and used his phone and called 911.
01:35:17.940 So we've explained the wellness check, the underpants, the broken window, all of it.
01:35:22.920 All of it's explained.
01:35:23.780 And all of the explanations I gave you are the ordinary ones.
01:35:29.180 The extraordinary one would be if this rich guy was gay and the best he could do to work out a gay encounter was this guy.
01:35:38.400 Let me toss you a "really," okay?
01:35:43.400 Let me, we'll back up and we'll take a running start at a good "really."
01:35:47.820 This is a "really" persuasion, all right?
01:35:49.660 So, you're telling me that Paul Pelosi, a successful business person who by all accounts has all of his faculties, very high operating individual, super rich, super connected, and he lives in the gayest city in the entire solar system, San Francisco.
01:36:12.020 Are you telling me that this highly capable person couldn't find a better gay partner than a homeless guy who's crazy?
01:36:25.380 Really?
01:36:26.880 Really?
01:36:27.440 This guy who can make hundreds of millions of dollars, I know you think he did insider trading, but even before then, this hugely successful, high-operating, brilliant guy in full possession of his faculties, and the best he could do for a gay tryst was that guy.
01:36:44.820 Really?
01:36:46.100 Really?
01:36:47.380 And he would just let him in his house?
01:36:49.980 Really?
01:36:53.400 All right.
01:36:54.440 Well, then you throw on top of that, he has a drinking issue, maybe.
01:36:59.300 I don't know if he was just drinking and driving that one time or if he has an issue, but I don't know.
01:37:06.280 I don't buy the gay lover story.
01:37:10.060 How many of you think it's real?
01:37:11.180 Now, one of the things I tweeted today is that I don't believe there's a single Republican who cares where Paul Pelosi parks his pecker.
01:37:22.380 Does anybody care where Paul Pelosi parks his pecker?
01:37:25.860 No, right?
01:37:26.640 We don't care.
01:37:27.500 It's a complete non-story, except that it's funny.
01:37:31.480 Complete non-story.
01:37:32.440 And I would like to defend Paul Pelosi to the right.
01:37:42.520 Well, I know you're having fun with it, and I don't begrudge you the fun, actually, as long as he's going to live and recover.
01:37:49.800 I think we can live with a world where people have naughty senses of humor.
01:37:55.420 So I'm going to support him.
01:37:56.640 I think that the speculation is fun, but don't take it seriously.
01:38:04.600 The evidence strongly suggests it's exactly what it looks like.
01:38:08.960 Strongly suggests.
01:38:12.000 All right.
01:38:12.920 We're still guessing.
01:38:14.180 Now, and let me also say that we're in the fog of war of this, right?
01:38:17.640 The fog of war.
01:38:18.120 So everything that I've said could be completely wrong.
01:38:21.980 But if you play the odds.
01:38:25.440 All right.
01:38:25.900 Now, having heard me explain away every element that you found suspicious.
01:38:31.140 Having done that, did you learn anything about confirmation bias?
01:38:35.840 It's the best confirmation bias story you'll ever see.
01:38:39.100 It should be like a case study in confirmation bias.
01:38:42.520 The moment you heard, wait, it might be gay, did you notice that everything confirmed that, or it seemed to?
01:38:52.680 Now, as I explained, it didn't.
01:38:54.660 It did the opposite of confirming it.
01:38:56.660 But you had to take the least likely explanation of every piece of evidence.
01:39:02.520 Did you not?
01:39:03.660 In order to get to that conclusion, you had to take the least likely explanation of every element.
01:39:09.740 And you did.
01:39:10.280 Not you, but people did.
01:39:13.000 People took the least likely of like 10 pieces of evidence.
01:39:22.140 Two on the nose?
01:39:25.580 Was it two on the nose?
01:39:28.480 So, you know, I promote that standard for detecting bullshit.
01:39:32.980 I don't know.
01:39:34.380 Was it two on the nose?
01:39:35.460 Because it's pretty ordinary that people are having, you know, affairs and gay relationships and stuff.
01:39:46.620 This seems pretty ordinary.
01:39:48.700 I don't know.
01:39:50.060 I wouldn't call that one a two on the nose.
01:39:54.180 The wellness check is debunked, yes, because it was a 911 call from Pelosi using his own phone from the bathroom.
01:40:01.680 So, we know that.
01:40:04.240 I think the, I think, I think the, oh, here, here we go.
01:40:10.340 I think the police call it a wellness check when there's not a specific immediate danger that's mentioned.
01:40:18.520 How about that?
01:40:19.360 If you call the police and say, there's somebody in my house and, you know, I'm a little concerned.
01:40:27.980 That's a wellness check, right?
01:40:29.940 Because at that point, that was before the hammer came out.
01:40:32.700 So, before the hammer came out, if Paul Pelosi said, all right, there's a guy here, here's his name, you know, we're having a conversation, but there's clearly something off about this guy and he shouldn't be in my house.
01:40:43.700 Can you, can you stop by and make sure everything's okay here?
01:40:48.060 And then they say, okay, that's a wellness check because there's no crime reported, no specific crime.
01:40:53.320 Because probably, it's possible he didn't know how he got in the house either.
01:40:57.840 Like, there might have been some point where Paul Pelosi was saying, I don't know, maybe I left the door unlocked.
01:41:04.480 Who knows?
01:41:05.580 Like, maybe it wasn't so much breaking and entering as it was just opening a door.
01:41:14.000 Yeah.
01:41:14.820 If you have a small house, it is probably your practice to make sure the doors are locked when you go to sleep.
01:41:21.580 Am I right?
01:41:23.080 Do you check all your doors to make sure they're locked before you go to sleep?
01:41:28.300 Depends where you live, I guess.
01:41:29.860 Yeah.
01:41:30.660 How many doors do you think Paul Pelosi had and windows?
01:41:34.180 How many doors do you think he has?
01:41:37.140 More than an 82-year-old will check before he goes to sleep.
01:41:41.400 Do you think an 82-year-old, if he was living there alone, let's say for the weekend, do you think he checked all his doors?
01:41:47.040 Do you think that people who work there, probably during the day, do you think that service people were in and out of his house all day, using various different doors?
01:41:58.540 Of course they were.
01:42:00.180 And do you think he checked them before he went to sleep?
01:42:02.960 He's 82.
01:42:03.520 Probably not.
01:42:06.320 Probably not.
01:42:07.940 So he may not have known at that point.
01:42:10.300 We don't know what he knew.
01:42:11.260 But he may have thought, oh, shit, I just left a door open and this street person wandered in.
01:42:15.940 So I just have a situation here.
01:42:21.720 All right.
01:42:22.240 No, not intentionally.
01:42:28.100 Did anybody hear me say leaving their doors open intentionally in San Francisco?
01:42:33.040 Nobody does it intentionally.
01:42:34.580 I'm just saying he has lots of doors and he's not the one using them all.
01:42:38.780 If you had all your doors locked and you only ever used your front door, well, probably you check it before you go to bed.
01:42:46.660 Well, I'll bet he doesn't walk the entire perimeter of his mansion, check the garage door.
01:42:52.240 I don't think so.
01:42:53.900 And there wasn't anybody else there, so it would have had to be you.
01:42:57.580 All right.
01:42:58.660 So let me ask you again.
01:43:01.200 Now that some of you are coming in late, but trust me, I just debunked all of the evidence that he was having a gay affair
01:43:07.380 with a far more likely explanation of the same facts.
01:43:11.320 How many of you, forget about the fun of it, right?
01:43:14.720 Forget about the fun of it.
01:38:17.060 Tell me if you believe or do not believe he had a gay relationship.
01:43:20.640 Go.
01:43:20.740 Did he or did he not have a gay relationship with that man?
01:43:28.100 Mostly no's, but some yes's.
01:43:32.400 Mostly no's, but some yes's.
01:43:39.020 I think we're about 90% no.
01:43:41.700 Maybe 80 to 90% no.
01:43:44.880 If you said it's possible, I'm with you.
01:43:46.860 I'm totally with you on it's possible.
01:43:51.520 Everybody good with that?
01:43:53.100 Because there's nothing that debunked it.
01:43:55.580 It wasn't debunked.
01:43:57.360 It just doesn't quite fit the facts as well as the ordinary explanation.
01:44:01.840 The most ordinary explanation fits everything.
01:44:04.780 The gay explanation has problems.
01:44:07.740 Like, why that guy?
01:44:08.840 You know, just all of it has a problem.
01:44:14.360 And if they were there for sex, don't you think the sex would have happened first?
01:44:22.820 Does the hammer come out before the sex?
01:44:24.400 Maybe at your house.
01:44:28.660 Maybe at your house the hammer comes out before the sex.
01:44:31.920 I don't know.
01:44:34.520 And yes, it was a ball peen hammer.
01:44:36.880 I just assumed it was.
01:44:38.120 It's just funnier.
01:44:38.860 All right.
01:44:42.180 That's all for me.
01:44:43.360 I told you it would be the best live stream you've ever seen.
01:44:45.880 And I'm sure I didn't disappoint you.
01:44:48.220 Wait till tomorrow.
01:44:49.920 Tomorrow's going to be...