Real Coffee with Scott Adams - October 05, 2022


Episode 1887 Scott Adams: Kanye Says White Lives Matter And Elon Musk Is Buying Twitter. Fun News


Episode Stats

Length

1 hour and 36 minutes

Words per Minute

140.32

Word Count

13,542

Sentence Count

1,295

Misogynist Sentences

24

Hate Speech Sentences

41


Summary

It's the dopamine hit of the day, and it's the thing that makes everything better, even for Don, who takes some convincing to get on board. It's another episode of Coffee with Scott Adams, in which Scott tries to figure out why nobody is happy.


Transcript

00:00:00.000 Good morning, everybody, and welcome to another peak experience of human existence.
00:00:14.600 It's called Coffee with Scott Adams.
00:00:16.120 There is nothing finer in the entire galaxy and known universe.
00:00:21.600 You know that rover that's up there on Mars that's digging for dirt?
00:00:25.780 No matter how far it digs, it's not going to find anything better.
00:00:36.440 Thank you, Paul. Appreciate that.
00:00:40.200 Now, would you like to take your experience up to the highest levels ever known to humanity?
00:00:48.520 Yes, you would.
00:00:49.320 And all you need is a cup or a mug, or a glass, a tankard, a chalice, a stein, a canteen jug, or a flask, a vessel of any kind.
00:00:55.780 Hold on. Wait a minute.
00:00:59.060 Don over on YouTube says he does not want to take it up to the highest level of awesomeness.
00:01:06.880 Don? Don? I need you all on board.
00:01:11.100 This is the simultaneous sip.
00:01:13.300 This is not the simultaneous minus Don sip.
00:01:18.420 Don, you have to get on board.
00:01:20.000 Don? Get on board. Get on board.
00:01:23.440 It's the dopamine hit of the day.
00:01:26.380 It's the thing that makes everything better, including for Don.
00:01:29.020 All right, I think he's on board. He's on board.
00:01:31.440 You're on board. Go.
00:01:36.360 That's for you, Don.
00:01:38.620 See, I'm going to get you in one at a time.
00:01:40.520 So this just reminded me of a weird little thing that happened when I was a kid.
00:01:50.500 My parents had these friends, Don and Ann.
00:01:54.300 We always heard about them.
00:01:55.840 Well, we're going to hang out with Don and Ann.
00:01:58.140 They had two names that went together really well.
00:02:00.420 Don and Ann.
00:02:01.260 Don was, I think he worked construction, but he had been in World War II, and he had been
00:02:11.120 stationed on an island in the Pacific, and his job was to load ordnance on bombers.
00:02:18.300 And one of the things he once loaded on a bomber was an atomic bomb.
00:02:27.940 So he was on the island of Tinian, and he literally loaded the atomic bomb onto the plane that bombed Japan.
00:02:37.240 I think it was the first one.
00:02:38.980 It was the first one, I think.
00:02:40.000 And I always thought, how in the world, like, what are the odds that I would literally, yeah, the Enola Gay.
00:02:49.060 He actually loaded the Enola Gay.
00:02:50.840 And I thought, what are the odds I would actually know him personally?
00:02:54.420 Like, you would personally know somebody that loaded the atomic bomb on the Enola Gay.
00:02:59.580 It's just so weird.
00:03:02.660 Well, in the news, very important news, Tom Brady and Gisele Bündchen.
00:03:07.680 It looks like they're living separately and maybe talking to a divorce lawyer.
00:03:14.140 Similarly, Brad and Angelina are divorcing and having a bad time in the courts over it.
00:03:22.620 I have a theory that goes like this.
00:03:28.440 What if nobody is happily married?
00:03:32.860 What if nobody is happily married?
00:03:35.980 And I actually mean that.
00:03:36.960 And I think that it has something to do with our times.
00:03:42.740 Yeah, I know you are.
00:03:44.120 Sure you are.
00:03:46.660 Let me stipulate.
00:03:48.960 I will stipulate that 100% of the people who are married and want to stay that way
00:03:53.480 are totally happily married, as far as your spouse knows.
00:03:58.400 All right?
00:03:58.840 As far as your spouse knows, happily married.
00:04:02.040 I have a theory that all happily married people are lying.
00:04:09.500 It's just my theory.
00:04:11.340 And it's part of the process.
00:04:14.000 And I don't mind that.
00:04:15.440 That's not even a criticism.
00:04:16.640 Because I am very much a believer in, I'm going to call it the Jordan Peterson view of marriage.
00:04:24.800 Right?
00:04:25.300 I probably mischaracterized him, so I apologize in advance.
00:04:28.820 But his view is that it's not about being happy.
00:04:33.720 It's not about being happy.
00:04:36.100 It's more about a commitment.
00:04:37.780 So if you got married to get happy, you probably did it wrong.
00:04:44.500 So I think that when people say they're happy, what they mean is they'd rather be married than not married.
00:04:52.660 Which I totally get.
00:04:54.540 I totally get that.
00:04:55.900 You'd rather be married than not married.
00:04:57.920 Right?
00:04:58.720 So would you say that you're happily married equates to I'd rather be married than not married?
00:05:05.240 That makes perfect sense.
00:05:07.640 I just don't think anybody's happy about it.
00:05:10.820 Here's why.
00:05:12.680 Now, I'm not sure you should be.
00:05:14.500 Because maybe we just have a miserable world where people are struggling for scraps of happiness no matter what they're doing.
00:05:21.080 Because I'm not saying that being single is good.
00:05:24.740 Let me be clear.
00:05:25.720 Being single is no good.
00:05:28.080 I don't like it.
00:05:29.720 I don't like it at all.
00:05:31.260 But I just don't think that we've developed in society options that work.
00:05:36.460 And here's why.
00:05:37.260 I'm going to give you a scientific reason why.
00:05:40.700 You ready?
00:05:41.960 This is based on a conversation I had yesterday.
00:05:47.180 Electronically.
00:05:48.400 So somebody was telling me that they had had over 100 lovers.
00:05:54.560 And it wasn't until reaching lover approximately 150 that a great lover was found.
00:06:04.740 And finally, you know, the search was, the search showed that you have to go through a lot of people before you find, you know, the one who's an amazing lover.
00:06:13.760 And unfortunately, that didn't work out.
00:06:21.060 So the relationship with number 150 didn't work out for whatever reasons.
00:06:25.720 So now what happens when this person goes to number 151?
00:06:32.940 When they go to 151, they're going to say, ah, I really like this person.
00:06:38.580 But now I know what good sex feels like.
00:06:41.080 I can't really be with number 151 because everything's good, but the sex is just average.
00:06:49.560 So you're done.
00:06:50.560 So you've got a number 152.
00:06:53.520 How is number 152?
00:06:55.880 Pretty good.
00:06:56.940 Pretty good.
00:06:57.760 Except that the sex isn't like it was with 150.
00:07:04.660 You see where I'm going?
00:07:05.900 In the old days, people married whoever was next door and had a cow.
00:07:13.700 You know, my entire criteria are, do you have a vagina and a cow?
00:07:19.720 Well, I have a vagina.
00:07:22.080 I'm looking for somebody who has a vagina and a cow because I need a cow too.
00:07:27.800 And then if you got your wife that way, you'd be like, I got everything I wanted.
00:07:32.460 Cow, vagina, boom, happiness.
00:07:36.460 Am I wrong?
00:07:37.920 Cow, vagina, happiness.
00:07:40.140 That's 1942 right there.
00:07:42.700 Now, you can go on the internet and see, you know, every manner of desirable thing.
00:07:50.400 And you can compare it to your own life and it doesn't look so good.
00:07:55.640 So here's what's happened.
00:07:57.860 Your comparison set has expanded to the point where everything sucks.
00:08:05.900 Do you get that?
00:08:07.880 This is based on science.
00:08:09.040 This is not even an opinion.
00:08:11.040 What I'm saying to you now is not opinion.
00:08:14.720 It's just the way people work.
00:08:16.200 If you give me lots of choices, every time I look at one of those choices, I'm going to say, hmm, I kind of wish I didn't know about all those other options that I'm not getting.
00:08:26.440 Because I think I may want to hold on a little bit longer until I get one of these.
00:08:30.320 And then I age out and I'm too old to get married and that sort of thing.
00:08:34.200 So I believe that humans are destroyed by options.
00:08:38.420 And that we've reached an option set where we can't handle it.
00:08:44.580 And that marriage is simply one of the many things that suffers from it, but maybe not especially.
00:08:52.800 So, yeah.
00:08:56.720 So I think, do you know why TV is no good?
00:09:00.340 I saw somebody prompt me for that.
00:09:03.060 Do you know why television is no good at all?
00:09:05.940 Because there are too many channels.
00:09:08.380 Too many channels.
00:09:10.040 Yeah.
00:09:11.380 Do you know that I spend now 90% of my time looking for something and maybe 10% of my time consuming?
00:09:19.660 What did you used to do?
00:09:22.760 You used to have three channels.
00:09:24.940 I mean, if you're a certain age, you had three channels.
00:09:27.680 And you would just watch one of them, whichever was the good one.
00:09:30.780 You'd say, well, I got three choices.
00:09:32.240 I'll watch the best of the three.
00:09:34.300 Right?
00:09:35.520 You sat through the commercials.
00:09:37.720 You know, you used it to go to the bathroom.
00:09:39.380 It was fine.
00:09:41.180 Yeah.
00:09:42.000 Too many choices.
00:09:44.760 Well, here's another.
00:09:46.320 So that's your first persuasion trick is that people with too many choices are not happier.
00:09:53.300 Did you ever take a date to the Cheesecake Factory?
00:09:57.760 Oh, my God.
00:09:58.920 It's a shit show.
00:10:00.440 The Cheesecake Factory has a menu that's like 700 choices.
00:10:06.780 Here's me going to the Cheesecake Factory.
00:10:09.440 And again, this doesn't matter who I'm going with.
00:10:11.480 Now, I'm a vegetarian, pescatarian actually, and decisive.
00:10:20.640 So here's me with the cheesecake menu.
00:10:27.900 One page of stuff I could eat.
00:10:30.160 I'll go that one.
00:10:31.360 All right.
00:10:31.820 I'll have the whatever.
00:10:33.780 Evelyn's pasta.
00:10:35.840 Here's my date.
00:10:36.880 Here's my date.
00:10:41.480 Here's me.
00:10:58.640 Oh, shit.
00:10:59.400 I don't have a watch.
00:11:00.960 I can't look at my phone.
00:11:02.780 If I pick up my phone, I'm going to get, I'm just going to look at it.
00:11:07.940 It looks like I'm going to be here a while.
00:11:10.040 This could be 20 minutes.
00:11:12.180 I just want to check one message.
00:11:16.420 Put your phone down.
00:11:22.160 No, I've never been with anybody who told me to put my phone down.
00:11:25.320 But I've been with people who didn't like it when I picked it up.
00:11:30.940 Now, let me ask you this question.
00:11:32.280 If your date is looking at the menu, can you look at your phone?
00:11:37.680 Yes or no?
00:11:40.080 If your date is looking at the menu and you're done, can you look at your phone?
00:11:44.880 I'll look at the answers.
00:11:46.220 Yeses and nos.
00:11:47.760 Big disagreement here, huh?
00:11:51.180 Yeah, this one's not settled.
00:11:53.740 See, we need a whole set of manners that are constructed for our unique times.
00:11:58.960 Because we don't really have phone manners totally worked out.
00:12:05.580 So I told you I've been going to Starbucks to do some writing.
00:12:10.600 It's just easier to do my writing when I'm there.
00:12:13.260 In Starbucks, people started taking video calls.
00:12:20.000 Seriously.
00:12:20.440 They sit at Starbucks and they take video calls for work.
00:12:25.960 And I'm thinking, seriously?
00:12:29.240 Like, in what fucking world is that okay?
00:12:34.220 In no world is that okay.
00:12:36.700 And it's now common.
00:12:38.680 It's actually common in Starbucks now.
00:12:41.080 People taking video calls for extended periods.
00:12:44.180 Amazing.
00:12:44.740 Without headphones.
00:12:46.680 Without headphones, yeah.
00:12:47.980 I mean, they're talking, too.
00:12:50.820 Some of them are on headphones, but others are just talking and listening.
00:12:55.940 So I guess Tucker had Tony Bobulinski on again.
00:12:59.760 And he's Tucker's, I'm sorry, he's Hunter's ex-business partner.
00:13:05.880 Now, here's the most amazing thing about the Hunter Biden story.
00:13:12.340 Correct me if I'm wrong, but this is what happened.
00:13:14.420 Now, I'm going to make up a different story, not about Hunter.
00:13:19.340 And it went like this.
00:13:21.240 Do you hear about that Hunter?
00:13:24.500 He went to Abonia.
00:13:27.700 Really?
00:13:28.760 Yeah.
00:13:29.540 So what?
00:13:30.360 Well, you know, Abonia's, you know, our enemy.
00:13:33.680 Yeah.
00:13:34.120 Well, a lot of people go there.
00:13:36.880 You know, he talked to somebody important, you know, in the government.
00:13:40.200 Really?
00:13:40.700 That's kind of sketchy.
00:13:43.060 But, you know, probably people do that.
00:13:45.040 It's no big deal.
00:13:46.900 You know, he talked to people in the government, and they talked about making a deal.
00:13:50.720 And I thought, ooh, huh, that's, my eyebrow goes up a little bit.
00:13:56.000 But talking about a deal is, you know, people talk.
00:13:59.580 It's not illegal to talk.
00:14:01.840 But they actually made a deal.
00:14:04.080 Oh.
00:14:05.120 They made a deal.
00:14:06.960 He's in Abonia.
00:14:07.800 He's talking to important people.
00:14:08.940 He talked about a deal.
00:14:10.020 He made a deal.
00:14:11.160 Well, that does sound a little bit bad.
00:14:13.440 And the deal would have involved, you know, maybe something bad involving our government.
00:14:21.140 Really?
00:14:22.900 By the time you hear the whole story, you've been so indoctrinated into it that you've lost
00:14:30.760 your outrage.
00:14:31.500 And you can't get enough outrage going.
00:14:42.280 The Hunter Biden story, if you had heard the entire story, like in one big bite on day
00:14:42.280 one, your head would have exploded.
00:14:45.220 Biden would have been driven out of office or never been elected.
00:14:48.880 Hunter would be under investigation.
00:14:50.560 But the way they dribbled it out, we just got used to it.
00:14:58.400 When I hear that Hunter Biden was illegally, presumably, trying to make deals with the Chinese
00:15:06.280 government to make money by using his father's name in the worst possible way, I don't even
00:15:14.120 have outrage about it.
00:15:15.300 I have intellectual outrage.
00:15:20.140 Yeah, okay.
00:15:21.300 That sounds very bad on paper.
00:15:24.060 If you gave me a choice, I would make it go away.
00:15:28.480 I think maybe the Department of Justice should look into it.
00:15:32.260 That was my entire emotional investment.
00:15:35.500 None.
00:15:36.560 I have no emotional investment in Hunter because of the way they just dribbled it out and I just
00:15:42.640 kept getting used to it.
00:15:43.720 Yeah, they were just boiling that frog and I was like, oh, sure, it's getting warm in
00:15:48.580 here.
00:15:49.620 It's only one degree warmer than yesterday.
00:15:52.760 It's only a little bit worse than what we heard.
00:15:55.680 And now we have like his actual business partner confirming the entire illegality of the entire
00:16:01.180 thing in the most credible way.
00:16:04.760 I mean, completely credible.
00:16:06.100 We know that the intelligence agencies intentionally got 50 people to lie and that they changed the
00:16:15.780 nature of the election by this lie, probably.
00:16:20.700 Now, how outraged should you be by this story?
00:16:26.240 It's a 10 out of 10.
00:16:28.140 It is a solid 10 out of 10.
00:16:30.000 How outraged do I actually feel about it?
00:16:34.500 One.
00:16:36.860 One.
00:16:38.400 Yeah.
00:16:39.720 And I'm completely aware that it should be 10.
00:16:43.760 It should be 10.
00:16:45.660 Today when I'm done talking about this, I won't even think about it once.
00:16:49.760 I will not think about it once after I'm done talking today.
00:16:53.020 They did that to you.
00:16:55.280 That's what they did to you.
00:16:56.820 They made you get used to that.
00:16:59.380 And you did.
00:17:01.300 Amazing.
00:17:02.400 I didn't think it was possible.
00:17:04.320 You know, I've said this a million times.
00:17:06.400 My mother always taught me that you can get used to anything if you do it long enough,
00:17:12.880 including hanging.
00:17:15.200 And it was always, you know, a joke around the house.
00:17:17.860 But we got used to it.
00:17:19.400 We actually got used to Hunter Biden working deals with China while his dad was vice president
00:17:24.980 at the time.
00:17:27.020 Bill Gates is criticizing ESG directly.
00:17:33.020 And he basically says that it's ridiculous for these fund managers, people who manage financial
00:17:38.240 funds, to be influencing people who make steel.
00:17:42.180 Because his, I'm paraphrasing now Bill Gates' criticism.
00:17:45.420 But he's saying, do the finance people have another way to make steel?
00:17:51.500 Like, what's the other way to make steel without polluting?
00:17:55.100 He's like, what is your idea?
00:17:56.760 Do you have something in your desk drawer at your finance department that'll tell them
00:18:00.340 how to make steel without polluting?
00:18:02.280 You either have to not make steel, or you'd better invent some way of making it that
00:18:07.300 nobody's figured out yet, or just do without it.
00:18:09.960 So Bill Gates is going right at him.
00:18:13.280 Now, why does Bill Gates think that ESG is bullshit?
00:18:18.540 Not the goals.
00:18:19.900 He's not criticizing the goals of having a, you know, a good environment, of course.
00:18:26.320 Because he was the owner of a company.
00:18:29.460 He was the owner of a company.
00:18:31.260 And he's speaking, you know, he's at an age where he can just speak freely.
00:18:35.140 Of course it's bad for business.
00:18:37.260 Because every CEO will tell you, you don't want another layer of management on top of
00:18:43.960 your layers of management.
00:18:46.000 There's nobody in the world who thinks extra layers of regulations gives you a better outcome.
00:18:51.620 I mean, at some point.
00:18:52.800 I mean, some regulations you need, of course.
00:18:55.120 But when you get to some point, they're counterproductive.
00:18:59.160 So Bill Gates is saying it's bullshit.
00:19:02.220 Here are some numbers I hadn't heard before.
00:19:05.200 But, so BlackRock is one of, I guess, the biggest voice for the ESG stuff.
00:19:12.960 And it's making the socially responsible funds, as the Wall Street Journal says, the centerpiece
00:19:19.240 of its $8.5 trillion business.
00:19:23.020 So you've got finance companies who made, who figured out how to get you to churn.
00:19:27.760 Because that's their business.
00:19:30.580 I don't know if you knew this, but finance companies, they're not in the business of
00:19:36.640 making money for you.
00:19:38.420 Did you know that?
00:19:40.480 That's not their business model.
00:19:42.340 That's what they promise you.
00:19:44.620 That's why you put your money there, to make money.
00:19:47.240 But that's not their business model.
00:19:49.080 No.
00:19:49.400 Their business model is to get as much of your money away from you and into their pockets
00:19:53.060 as they can.
00:19:54.380 So they want you to not put your money in one place and have it sit there.
00:20:00.060 Although they could make money on just managing it.
00:20:02.740 So they can get your money two different ways.
00:20:06.220 One is moving it into a fund they manage.
00:20:08.520 So they want to build some bullshit around that.
00:20:10.620 So you'll move to them.
00:20:11.900 But the other way that, not necessarily BlackRock, but financial companies in general,
00:20:17.480 is that they can make fees on moving you in and out of stuff.
00:20:21.720 So the fund managers want you to move around, because that's how they make money.
00:20:25.400 Not how you make money.
00:20:26.420 It's how they make money.
00:20:27.960 And they want you to go into their fund, because again, that's how they make money.
00:20:31.240 It's not how you make money.
00:20:33.120 All right.
00:20:33.640 So the companies that are the least, what do I say?
00:20:39.600 The least ethical companies in America are the big funds, the big financial companies.
00:20:45.220 They're the least ethical, because they don't even pretend to sell you what you think you're buying.
00:20:51.920 They don't even pretend.
00:20:54.840 I guess they pretend a little bit, but it's such a thin pretense.
00:20:59.220 It's like, okay, you're just moving our money into your pocket, but you're scaring us so that we'll let you do it.
00:21:06.460 So what they really sell is fear.
00:21:10.940 A big financial company is selling you fear.
00:21:13.340 And here's the fear.
00:21:15.800 If you do it yourself, you're going to fuck up.
00:21:18.640 That's it.
00:21:19.280 That's the whole thing.
00:21:22.060 That's their entire business pitch.
00:21:25.060 If you do it yourself, well, you're going to fuck up.
00:21:28.260 So you better let us do it.
00:21:30.440 So it's a complete scam.
00:21:32.500 That's why you should manage your own money if you can.
00:21:36.340 All right.
00:21:36.700 And I guess there's $350 billion of these funds.
00:21:41.620 So it's big, big money.
00:21:43.160 So ESG is being backed by big money people who have tons of money on the line.
00:21:48.860 The big money people are not interested in the environment necessarily.
00:21:54.420 I mean, they might also be.
00:21:56.140 And they're not interested in the companies doing well necessarily.
00:22:01.740 They're looking for their fees.
00:22:03.540 So apparently Apple is making some changes to move its supply chain closer to Cupertino.
00:22:11.560 Because of supply chain uncertainty, especially China.
00:22:16.040 Now, do you remember in 2018 when people laughed at me and mocked me for saying that China was too risky for business and we would decouple?
00:22:29.300 Yeah.
00:22:29.820 Do you remember how, like, crazy that sounded?
00:22:32.740 And I just kept saying it in public.
00:22:34.520 And I was the only one.
00:22:36.120 I think I was the only, you know, public person.
00:22:39.220 And that's not true.
00:22:41.020 Kyle Bass was saying it.
00:22:42.980 Gordon Chang, I think, was saying it.
00:22:44.920 But it was very, it was a rare thing to say.
00:22:47.780 But now, just normal, right?
00:22:50.100 Now just normal.
00:22:52.340 Peter Zeihan, saying it for years, yes.
00:22:54.580 That's correct.
00:22:56.740 I also said Ukraine could win against Russia.
00:22:59.920 I was mocked.
00:23:01.020 And now that is common belief.
00:23:04.140 And I said Trump would win in 2016.
00:23:05.960 And I was mocked.
00:23:07.340 Now, how many times do I have to say something crazy?
00:23:10.120 Before you'll give me the benefit of a doubt.
00:23:14.940 Like, what do I have to do?
00:23:17.780 I get that I got some wrong.
00:23:19.960 But the ones I got wrong were sort of ordinary ones.
00:23:22.440 Like guessing who the vice president pick would be.
00:23:24.940 Everybody gets that wrong.
00:23:27.120 I mean, the ones that I got wrong, the ones that I got right, you should look at these three and just say, okay, that's freaky.
00:23:34.020 I'll just say the three.
00:23:35.340 Decoupling from China, Ukraine winning the war, and Trump getting the presidency in 2016.
00:23:43.700 Who else got those three right?
00:23:46.160 I also said that the project Warp Speed would not create a vaccination that worked.
00:23:54.820 And didn't.
00:23:55.340 I mean, it worked in a therapeutic way, but that's it.
00:23:59.880 We hope.
00:24:00.860 Maybe it was worse than that.
00:24:06.920 Why am I claiming the end of the Ukraine war?
00:24:09.140 I'm not.
00:24:09.880 Have you noticed that everybody who thinks they disagree with me is imagining what I thought?
00:24:14.640 Perfect example.
00:24:16.240 Why am I claiming the end of the Ukraine war?
00:24:19.160 Did I do that?
00:24:20.980 Did I just claim the end of the war?
00:24:25.400 Everybody has to add something to make me wrong.
00:24:28.220 It's like, it's super important that I be wrong, so you have to just add something.
00:24:34.100 So there's this NYU chemistry professor who got sacked after 82 of his 350 students complained that his course was too hard.
00:24:44.860 His organic chemistry class was too hard.
00:24:48.220 So he got fired.
00:24:50.020 Now, the way this is being reported, at least in right-leaning news, is that students are a bunch of pussies.
00:24:57.880 And if they were as tough as they used to be, they would just take that hard class and they'd suck it up.
00:25:02.800 And that's the way it is.
00:25:04.220 And you don't get participation trophies.
00:25:07.060 And I'm completely on the other side of that.
00:25:10.080 I've experienced science being made harder than it needs to be.
00:25:19.260 Is it my, you know, I haven't taken organic chemistry, but I've been around enough people who are taking it at the moment.
00:25:26.640 Is it my imagination that organic chemistry is intentionally harder than it should be?
00:25:31.820 Intentionally.
00:25:33.040 Yes or no?
00:25:34.700 Intentionally harder than it needs to be.
00:25:36.220 Why is it designed to be 10 times harder than other courses?
00:25:42.220 No good reason.
00:25:44.540 Here's something I'm going to say with no experience in organic chemistry.
00:25:48.200 This is what people hate me for.
00:25:49.980 To say things completely out of my area of expertise.
00:25:53.860 But I'm still right.
00:25:55.140 The problem is the incompetence of the people who make the class.
00:25:59.080 There's no, you could not convince me that you can't make that class manageable.
00:26:06.620 And still useful.
00:26:08.680 Because we have computers now, right?
00:26:11.220 I feel like you could teach people the concepts and tell them to use the computer to do the hard stuff.
00:26:16.440 And you'd be fine.
00:26:18.940 I'm pretty sure it's just, you know, maybe it's the weeding-out thing.
00:26:24.560 And I think that's giving them too much credit, to say that they make it hard to weed out the dumb people.
00:26:33.600 That's not what they're doing intentionally.
00:26:36.680 I think they're just really bad at teaching.
00:26:39.700 It's just bad teaching.
00:26:42.720 And here's what informs me.
00:26:46.040 I've told you this story before.
00:26:47.360 So years ago, I thought I would write a book on personal finance.
00:26:51.760 Because I have a, you know, background in economics and business and stuff.
00:26:55.240 And I'd been doing a lot of investing.
00:26:56.960 And I thought, I could write a book on personal investing.
00:27:00.040 People would like that.
00:27:00.920 I'll simplify it.
00:27:02.360 And then the problem was that I could put the entire book on one page.
00:27:05.580 Which I did.
00:27:06.880 I put it on one page.
00:27:08.440 And I didn't even have to single space.
00:27:10.780 It turns out that everything you need to know about personal finance fits on one page with lots of extra space.
00:27:17.360 And then you go to the library and you see, like, books and books on personal finance.
00:27:23.000 So what's that all about?
00:27:24.900 Do you want me to tell you what that's all about?
00:27:27.560 It's people who didn't know how to teach it.
00:27:30.060 That's it.
00:27:30.900 The entire field of finance is entirely about people who aren't good at explaining stuff.
00:27:36.880 And the first time that somebody who is good at explaining stuff entered the field, me, me, I put it on one page.
00:27:47.640 Now, if you're saying to yourself, yeah, but who agreed that you did it right?
00:27:52.900 Well, have you ever heard the book A Random Walk Down Wall Street?
00:27:58.840 Burton Malkiel.
00:28:00.720 Princeton, one of the most famous economists in the world, wrote the book on personal investing.
00:28:06.040 He contacted me and asked me if he could put my one page in his book because he thought I covered it all.
00:28:15.500 He's basically considered maybe the number one expert in the field.
00:28:19.940 And he says I took the entire field and put it on one page.
00:28:23.060 He says.
00:28:24.820 He was going to try to do it himself.
00:28:26.580 And the reason he didn't is because I did it.
00:28:28.700 He goes, oh, you already did it.
00:28:29.980 Now, I don't think he actually published it.
00:28:32.780 I don't know why, but he got permission.
00:28:39.140 Henry.
00:28:41.020 Let's give Henry a little lesson here.
00:28:43.080 So Henry over here.
00:28:44.220 I like to catch these people.
00:28:46.380 Scott likes to brag about his greatness.
00:28:48.180 What do you think this show is?
00:28:53.840 The entire show is me predicting things and then seeing how I do.
00:28:59.600 The only reason you're watching me is you think I could do it better than you could.
00:29:04.040 Why the fuck would you watch me if you don't think I could do it better than you?
00:29:09.220 You should just sit there alone in a chair instead of wasting your time listening to this.
00:29:13.880 Because apparently I have nothing to teach you and nothing to say.
00:29:16.340 All right.
00:29:22.560 You need to work on your ego if that's where you're at.
00:29:26.900 Pretty low level of awareness.
00:29:30.560 Remember I told you that the press would come after me before the elections?
00:29:35.880 Remember I told you there would be a hit piece about me?
00:29:38.820 And I keep waiting for it.
00:29:40.760 And there was a very weak attempt yesterday.
00:29:43.220 It's so weak that I just retweet it.
00:29:46.820 You know you failed in a hit piece when somebody retweets it without much complaint.
00:29:54.380 So it's the New York Daily News. Would you like some context?
00:29:59.520 So this is the part that people don't realize when they're just casual consumers of the news.
00:30:04.960 So here's a newspaper writing something bad about me.
00:30:08.860 It's the New York Daily News.
00:30:10.540 Do you think the New York Daily News carries my comic?
00:30:14.080 What do you think?
00:30:16.600 No.
00:30:17.620 No.
00:30:18.120 Do you think the competitor does?
00:30:21.220 Yes.
00:30:23.020 Yes.
00:30:24.000 Yes.
00:30:24.200 So coincidentally, coincidentally, they're criticizing me without ever mentioning that I'm a primary piece of content for their main competitor in New York City.
00:30:38.120 They just don't mention that.
00:30:39.520 That's just left out.
00:30:41.040 Totally not relevant.
00:30:42.800 No.
00:30:45.860 Here's what they say about me in the New York Daily News.
00:30:49.020 They call me, they say that I'm a right-wing cartoonist, a right-wing cartoonist, and that I'm the cause of the coming violence.
00:31:04.300 That's right.
00:31:05.920 I'm the cause of the coming violence.
00:31:08.260 I am.
00:31:08.880 Do you know why?
00:31:10.120 Because of my tweets that became viral memes.
00:31:13.080 And I said that, according to the article, that if Biden is elected, there's a good chance you will be dead within a year, and also that Republicans will be hunted.
00:31:24.180 Now, I did say those things.
00:31:25.820 I did say those things.
00:31:27.320 But they've decided that they can blame any violence on me.
00:31:35.120 So by predicting violence, I'm the cause of it.
00:31:39.700 Because I predicted that the Democrats would be violent, which causes the Republicans to react.
00:31:48.260 And so therefore, I'm the cause of the Republicans who reacted to the violence.
00:31:56.480 And wait, it gets more amazing.
00:31:58.540 It gets more amazing.
00:31:59.380 Do you think that they mentioned any of the examples that have been used by the right to justify that they are being hunted?
00:32:09.700 Nope.
00:32:11.140 Nope.
00:32:13.660 Nope.
00:32:14.780 No mention whatsoever of the prominent public headline examples that the right uses to populate this meme and say, yes, we are being hunted.
00:32:25.800 Here's this example, this example, this example, this example.
00:32:29.000 The article that mocks me for saying it doesn't mention any of them.
00:32:34.580 Doesn't even mention they exist.
00:32:36.100 It simply says that I'm creating a dangerous situation such that if ever there's even one Republican who does anything to any Democrat, guess whose fault that is?
00:32:49.480 Me.
00:32:50.000 Now, I don't think that this was a good job.
00:32:58.640 I mean, it was a little article in a rag.
00:33:01.720 I mean, the New York Daily News, there's not much left of it anymore.
00:33:05.540 So it's not going to move the needle.
00:33:07.960 But I feel like it was a test shot.
00:33:10.540 You know, if that had maybe caught on a little bit, then other people could say, oh, let's work on this.
00:33:18.660 Let's turn the cartoonist into the one who's causing the violence.
00:33:23.100 Because you know me.
00:33:24.180 That's what I do, right?
00:33:25.820 I go out and cause violence.
00:33:28.400 That's a pretty accurate description of me.
00:33:31.380 Not.
00:33:31.780 Not.
00:33:34.040 Not.
00:33:37.080 Now, what about the part where they say I'm right-wing?
00:33:40.100 Somebody said to me recently that they checked my Twitter feed and it's all right-wing stuff.
00:33:49.560 And my first reaction was, yeah, it is not.
00:33:52.940 And then I thought to myself, oh, I'll bet it actually is.
00:33:57.360 You would definitely think it was.
00:33:59.280 If you listen to me on the
00:34:01.780 live stream, there's a lot more nuance.
00:34:06.680 All the time.
00:34:07.860 But I don't think I do it as much on Twitter.
00:34:10.940 I don't do it as much.
00:34:12.120 So I could see, I could actually see how somebody would think I'm right-wing based on my Twitter.
00:34:17.520 Now, I say that I'm left of Bernie and people laugh at me.
00:34:20.520 Because there's nothing left of Bernie.
00:34:22.680 Yes, there is.
00:34:23.700 I am.
00:34:24.680 I'm totally left of Bernie.
00:34:26.320 You want some examples?
00:34:28.640 Give me some.
00:34:30.660 Right-wing people
00:34:31.780 might want to bar abortion in many cases.
00:34:37.840 Correct?
00:34:38.640 You know, in general.
00:34:39.820 Right?
00:34:40.040 Just generalizations.
00:34:41.600 So the right-wing wants to ban it.
00:34:44.100 Bernie would say that he's in favor of abortion.
00:34:49.160 Right?
00:34:49.720 And I'm left of Bernie, because Bernie's in favor of abortion.
00:34:54.180 And I'm in favor of Bernie shutting the fuck up with his cock.
00:34:59.120 Because his cock has nothing to do with abortion.
00:35:02.460 He's not part of the conversation.
00:35:04.740 So I'm left of him.
00:35:05.960 I say, let the women figure it out.
00:35:07.580 And then tell us what you figured out.
00:35:09.460 And then we'll all be on board.
00:35:12.500 I'm way left of Bernie on abortion.
00:35:15.060 Right?
00:35:15.600 How about guns?
00:35:17.100 The right-wing would say, Second Amendment, don't restrict my gun ownership.
00:35:24.340 Bernie would say, well, you know, I am in favor of gun ownership, because he actually owns a gun, I think.
00:35:30.740 So Bernie's actually in favor of the Second Amendment, but with lots of restrictions.
00:35:35.240 But still in favor of the Second Amendment.
00:35:37.320 And I'm left of Bernie, and I think that Democrats should not be allowed to own guns.
00:35:44.780 That's way left of Bernie.
00:35:46.340 Because Bernie still thinks the Second Amendment's good, and I don't think Democrats should be allowed to own guns.
00:35:51.720 Because that's where all the danger is.
00:35:53.720 And it's what they want.
00:35:55.500 I mean, they want less guns, and I want them to have what they want.
00:35:58.280 And since the right is not terribly dangerous in terms of guns, you know, minus the lone shooters that are definitely dangerous.
00:36:06.300 But in general, you know, the baseline, it's a Democrat problem.
00:36:10.780 So I'm way left of Bernie.
00:36:12.640 I say, take away Second Amendment rights for Democrats.
00:36:17.820 Right?
00:36:19.540 What other topics am I left of Bernie on?
00:36:23.020 People on the right would say you should ignore race.
00:36:26.600 Ignore race, but that, of course, gives some advantage to the people who already have advantage, right?
00:36:32.340 So it's a little white-leaning and right-leaning.
00:36:39.560 Not necessarily by intention, but that's how it works out.
00:36:43.860 The left would say, hey, everybody's equal.
00:36:48.500 Everybody's equal.
00:36:50.260 And Bernie would say that.
00:36:51.720 Bernie would say, everybody's equal.
00:36:53.560 I don't.
00:36:54.340 I'm left of that.
00:36:56.580 I say black lives matter extra.
00:36:59.680 Anybody who thinks that black lives matter only the same amount as everybody else, they're right of me.
00:37:06.500 They're way right of me.
00:37:07.680 I say black lives matter extra.
00:37:10.320 And it's obvious why.
00:37:11.620 It's obvious.
00:37:12.720 Because there are fewer of them, and they're in great demand.
00:37:15.520 If there's fewer of anything that's in great demand, it has more value.
00:37:21.980 Why is it that corporate America is trying desperately to increase diversity?
00:37:26.720 It's because there's a low supply and a high demand.
00:37:30.240 That's what creates value.
00:37:31.420 So, yeah, not only do I agree that black lives matter, they matter extra.
00:37:37.500 They're a little bit extra valuable.
00:37:39.400 If somebody black were killed, is it more likely to become a news story than if somebody white is killed?
00:37:45.300 Of course.
00:37:46.700 George Floyd would be a perfect example.
00:37:49.200 Yeah.
00:37:49.580 No, they're extra.
00:37:52.400 Now, that's just three examples.
00:37:54.440 I could do this all day.
00:37:55.540 Did you see the story about Kanye?
00:38:00.680 He did a fashion show for Yeezy and did it in Paris.
00:38:05.680 And the big story is that he wore a shirt that said white lives matter.
00:38:11.240 And I saw a picture of him with Candace Owens.
00:38:13.580 Was she at the show?
00:38:15.020 Or was the picture from somewhere else?
00:38:17.580 Can you confirm, was Candace Owens at the show?
00:38:21.740 Or were those two different things?
00:38:21.740 She was, right?
00:38:22.340 Now, don't you assume that Candace was behind that?
00:38:26.540 Duh, because she was wearing one, too.
00:38:29.080 Right?
00:38:29.520 So, Candace had to be behind that.
00:38:31.580 Wouldn't you love to have heard that conversation?
00:38:35.560 Can you imagine the conversation between Candace Owens and Kanye when Candace, I know she's the one who's...
00:38:43.380 Do we all agree she's the one who suggested it?
00:38:48.760 That's just sort of obvious, right?
00:38:50.360 We don't know.
00:38:52.320 That's not confirmed.
00:38:53.960 But I'm going to treat it as obvious.
00:38:56.580 It's not confirmed.
00:38:58.460 But I imagine that conversation went like this.
00:39:02.180 Hey, you want to get some attention?
00:39:04.980 Put this White Lives Matter shirt on.
00:39:07.960 Now, Kanye is not like the rest of us.
00:39:13.220 Right?
00:39:13.540 Kanye, he operates at a higher level, frankly.
00:39:16.380 Higher level of persuasion.
00:39:17.660 Higher level of art.
00:39:18.560 But he's just at a different level.
00:39:21.560 I don't think anybody else would have been smart enough to do that.
00:39:25.260 Because remember, a fashion show is about getting attention, even negative attention.
00:39:29.660 That's part of the tradition of it, is to, like, really shock you.
00:39:34.500 So all the people who don't realize you shocked them just to get attention give you their attention.
00:39:40.400 And then they say stuff like, my God, who would wear those clothes?
00:39:44.020 Because the public doesn't know that the fashion show is not supposed to be the actual clothing line.
00:39:49.760 It's supposed to just shock you.
00:39:51.060 So Kanye finds the best way to possibly shock.
00:39:56.520 The best way.
00:39:57.920 He could not have done that better.
00:39:59.980 Again, genius.
00:40:01.540 Who is he selling clothes to?
00:40:03.980 White people.
00:40:06.080 Right?
00:40:06.540 White people.
00:40:07.680 The last thing Kanye wants is a Black Lives Matter shirt.
00:40:11.680 That's the worst thing he could do for business.
00:40:13.860 He wants to sell clothes to people who have money.
00:40:16.160 You know, be they white or black.
00:40:19.780 He just wants people with money to buy his clothes and like them.
00:40:23.660 So I think it was brilliant.
00:40:26.660 And, and, I'm going to repay the respect.
00:40:33.340 Because in my opinion, it was done for attention, of course.
00:40:37.220 But I think that Kanye also operates on the social level.
00:40:40.680 That it's not just for business.
00:40:43.220 Would you agree?
00:40:43.760 I don't think Kanye is just about business.
00:40:47.240 It doesn't look like it to me.
00:40:48.440 It looks like he has larger ambitions for, you know, a good world.
00:40:54.740 So I believe that what Kanye did was simply show respect to a category that he probably thinks I'm in.
00:41:03.520 And so I thought I would repay the favor.
00:41:06.640 So today I ordered my Black Lives Matter shirt from Amazon.
00:41:11.400 Should have it tomorrow.
00:41:12.200 So I mean, I'm going to start wearing my Black Lives Matter shirt on my live stream.
00:41:18.240 Everybody okay with that?
00:41:20.960 And specifically, it's to return the respect.
00:41:24.540 Because I actually do think Black Lives Matter.
00:41:28.120 There's nothing sarcastic about that.
00:41:30.920 Of course they matter.
00:41:32.140 Of course they matter.
00:41:33.460 But because he did that, because Kanye wore the White Lives Matter shirt, then I say, whoa, there's a model I would like to copy.
00:41:45.660 I'd love to see Republicans wearing Black Lives Matter shirts, just to repay the respect.
00:41:53.160 Because respect is the thing that we're all missing, right?
00:41:58.020 That's like the thing that went away.
00:41:59.660 And I think the internet has mostly to do with that.
00:42:02.160 Because you'll respect people in person, because they'll punch you in the face if you don't.
00:42:05.700 But you end up getting real disrespectful online.
00:42:08.740 And I do it, too.
00:42:09.700 I'm not saying other people do it.
00:42:11.240 I do it, too.
00:42:12.280 It's just too easy to do.
00:42:14.560 And so when I see something that works in the right direction, if it works in the right direction, I like it, too.
00:42:22.960 Somebody says, I missed the point.
00:42:24.460 What's the point?
00:42:25.060 What's the point?
00:42:30.980 All right.
00:42:31.520 I'll wait for that.
00:42:34.820 Kanye said BLM is a scam.
00:42:36.840 Right.
00:42:38.060 So did I.
00:42:39.440 Now that we know that Black Lives Matter, the organization, is a scam, I can wear the shirt.
00:42:45.980 Because everybody knows it now.
00:42:49.040 So that helps, actually.
00:42:50.680 That goes right to it.
00:42:52.520 All right.
00:42:55.060 And plus, I identify as black.
00:42:56.700 So for me, it's more organic.
00:43:00.180 Even the Russian reporters are reporting bad news about the Russian involvement.
00:43:05.680 Listen to these choices of words from Russian journalists.
00:43:09.880 So this war correspondent, Alexander Kots, he said on Telegram.
00:43:15.320 So this is on Telegram.
00:43:16.220 But he said that the Russian military was in, quote, operational crisis.
00:43:22.900 Operational crisis.
00:43:25.060 Just consider that choice of words.
00:43:29.200 Operational crisis.
00:43:31.800 Would you ever say that about your own military?
00:43:34.220 I mean, that does sound like the technical words for a collapse, doesn't it?
00:43:41.860 All right.
00:43:42.420 But let's go on.
00:43:45.760 Then another reporter, Sladkov.
00:43:48.500 I don't know where he said it.
00:43:50.120 Somewhere.
00:43:51.400 Somewhere publicly he said about the Russian military.
00:43:55.140 He says, this doesn't mean that we've collapsed like a house of cards.
00:43:59.400 These mistakes aren't gigantic strategic failures.
00:44:03.260 Do you know what I hear when somebody says we're not collapsing like a house of cards?
00:44:12.460 Do you know what I hear?
00:44:18.480 We are collapsing like a house of cards.
00:44:18.480 Nobody would pick those words.
00:44:20.920 Nobody picks those words unless they actually believe they're collapsing like cards.
00:44:26.320 You would never say we're not collapsing like a house of cards.
00:44:31.220 You would say, oh, we're doing great.
00:44:33.800 Stronger than ever.
00:44:34.820 Building up steam.
00:44:35.900 We're collecting our forces.
00:44:37.780 We're strategically withdrawing.
00:44:39.980 But we're gaining our strength.
00:44:41.400 We're coming back.
00:44:42.100 You would never, never say house of cards in any context unless you thought it was on the table.
00:44:50.620 So the people closest to it, the Russians themselves, they see the whole thing looking like it's falling apart.
00:44:57.960 That's now clear.
00:45:01.180 And I know you hate it that I was the best predictor of the entire Ukraine military situation.
00:45:06.900 But I was.
00:45:08.520 But I was.
00:45:09.680 By far.
00:45:10.160 All right.
00:45:18.600 And they seem to be pinning their hopes on the special mobilization.
00:45:24.600 There isn't one military person who thinks that the special mobilization is going to work.
00:45:31.180 But it also puts some urgency on the Ukrainians to get as much as they can before any new recruits come in.
00:45:41.760 Anyway, so here's something that you never expected to hear from me.
00:45:49.920 Because I don't even know if this is left wing or right wing.
00:45:52.200 But it looks like Joe Biden's team is very close to taking Russia off the board as a superpower, which looks to be exactly what they're attempting to do and have wanted to do for a long time.
00:46:07.720 Now, you could say it's NATO winning, but not really.
00:46:12.140 It's more the United States.
00:46:13.180 Would you give them credit for it if it happens?
00:46:19.520 What if they pull it off?
00:46:22.660 Because, you know, this is exactly the sort of thing I get mad at Democrats for.
00:46:26.980 You know, when Trump would do something that was unambiguously good, they couldn't give him any credit at all.
00:46:33.720 Right.
00:46:33.980 But what if he pulls it off?
00:46:38.340 Now, the risk, of course, is insane.
00:46:40.960 You know, the risk of nuclear war, the big risk.
00:46:43.300 But what if he pulls it off?
00:46:45.860 You know, even at the great expense.
00:46:48.800 What if he actually just takes Russia out of the game?
00:46:51.880 Like forever.
00:46:54.480 Yeah, it's a big if, but it looks like it's going to happen.
00:46:56.600 And Joel Pollak referred to it in Breitbart as, you know, moving closer to a situation where Russia would be China's northern colony.
00:47:09.280 Ouch.
00:47:10.720 Russia as a northern colony of China.
00:47:12.980 Because if China owned them economically, they would own them in every way.
00:47:18.260 And it looks like that might happen.
00:47:20.960 Huh.
00:47:21.100 I said on Twitter yesterday that you should turn off any movie as soon as there's a scene where somebody's tied to a chair.
00:47:31.600 And my thinking was that that's a signal that the writers ran out of ideas.
00:47:37.500 And a lot of writers came in to complain, because writers don't like me.
00:47:43.040 And they said, Scott, how can you say that they've run out of ideas
00:47:49.880 just because somebody's tied to a chair?
00:47:52.320 If you don't watch a movie because somebody's tied to a chair, you're not going to see,
00:47:56.160 let's see, Reservoir Dogs, Casino, The Matrix, Indiana Jones.
00:48:01.660 And they listed like 12 classic movies where somebody was tied to a chair.
00:48:08.120 And then I said, that's my point.
00:48:12.460 That's not your point.
00:48:14.140 No, my point is that it's overdone.
00:48:15.840 But the fact that you can name 20 movies in which it was done is why you turn it off.
00:48:21.700 That's why you turn off the new movie.
00:48:24.540 Because the new movie is the old movie.
00:48:27.680 They're just rewriting the old movie by doing a shitty job of it.
00:48:31.120 It's like imagining that because there are old movies that did it,
00:48:36.320 you should just keep doing it
00:48:37.960 because it worked.
00:48:39.220 It doesn't work that way.
00:48:40.360 Now it's overdone.
00:48:41.640 It's overdone.
00:48:42.160 And then part of it was, are you saying that Reservoir Dogs and Casino were not good movies?
00:48:50.760 Yes.
00:48:51.920 Those were... no, I'm not going to say they're not good,
00:48:55.040 because that would be treating art like it's objective.
00:48:58.440 They're not movies I enjoyed.
00:49:00.240 Or that I would recommend.
00:49:03.740 But I don't mind if you do.
00:49:06.020 Don't mind if you do.
00:49:07.040 All right, there's news about this election software firm called Konnech.
00:49:14.180 And the CEO, I guess, is in jail.
00:49:17.760 He's been picked up.
00:49:20.040 Hasn't been convicted of anything.
00:49:21.720 But they say that he was storing the database in China,
00:49:25.900 and it was a database of all the election volunteers, all the workers.
00:49:31.260 Now apparently that's what his software does.
00:49:33.360 It's a database for managing your election volunteers.
00:49:39.660 So that was the information that was allegedly, yeah, the poll workers and stuff.
00:49:44.780 So that was allegedly what was stored on a Chinese database.
00:49:49.400 How big of a problem is that?
00:49:53.260 Is it a huge problem that they've got information about the poll workers?
00:49:59.440 Is that because then they could, like, influence the poll workers?
00:50:05.780 Because I don't see that that would really work.
00:50:07.940 I feel like that would be too easy.
00:50:09.900 Like, you'd be caught pretty easily.
00:50:12.740 Yeah.
00:50:13.320 No, I do have to ask why.
00:50:14.540 It's not obvious to me.
00:50:16.160 I mean, I have a pretty suspicious mind.
00:50:18.480 But it's not obvious to me how China would use that information.
00:50:24.760 Bribes?
00:50:25.160 So you think it would tell China who to bribe?
00:50:32.620 Or hacking targets.
00:50:34.300 Or who to blackmail?
00:50:35.380 Who to bribe or blackmail?
00:50:38.220 Yep.
00:50:39.140 I don't buy that.
00:50:41.420 Yeah.
00:50:42.180 I get that that could be a vulnerability.
00:50:45.160 But it doesn't seem like the kind that China would want to use,
00:50:50.220 unless they knew something about, I don't know.
00:50:52.500 Most of the poll workers would have no access to anything, right?
00:50:57.460 I don't know.
00:50:58.500 All right, I guess I can see the point.
00:50:59.980 I know why it needs to be illegal.
00:51:01.820 I get why it needs to be illegal.
00:51:03.920 But I guess you could find some way to weaponize that.
00:51:07.260 Anyway, the company's defense is that nothing like that happened,
00:51:10.840 which is an interesting defense.
00:51:13.080 Because it feels like the easiest thing in the world to prove or disprove
00:51:17.600 is whether you had a database in a certain place at a certain time.
00:51:22.020 But they're actually saying, nope, nope, that didn't happen.
00:51:25.180 We do not have our data in China.
00:51:28.260 I know.
00:51:28.840 It's weird that they would make that defense.
00:51:31.180 It makes me think that it's true.
00:51:33.540 It makes me think that the defense is true.
00:51:35.600 Because that's not exactly the defense you would mount
00:51:38.900 if you actually had a database in China.
00:51:43.700 Because obviously that could be detected.
00:51:46.560 The defense you would mount is,
00:51:48.540 we didn't know it was there.
00:51:51.760 Or China stole it from us.
00:51:55.200 Or it was one employee, and yes, we fired him.
00:51:58.620 Something like that.
00:52:00.060 But it would be weird to say it's not there.
00:52:02.280 Because that would be the easiest thing to prove or disprove.
00:52:06.040 Right?
00:52:08.380 So I think I would hold off on this.
00:52:12.480 It feels fake news-ish.
00:52:15.520 I'm not going to declare it fake news,
00:52:17.820 but it's got something wrong with it.
00:52:20.320 There's something that doesn't feel right about the story.
00:52:23.320 Do you feel that too?
00:52:27.320 Do you think we're tracked on locals?
00:52:29.200 I doubt it.
00:52:32.280 All right.
00:52:39.340 So Elon Musk is going to go ahead and buy Twitter.
00:52:42.200 That's the news.
00:52:43.020 Unless it changed in the last few minutes.
00:52:45.300 But why do you think he changed his mind
00:52:48.420 and decided not to go to trial?
00:52:50.860 Several possibilities.
00:52:52.180 We'd only be guessing.
00:52:53.820 One is that he didn't think he'd win.
00:52:55.540 Two is that it would just take too much time out of his life.
00:53:04.700 Maybe he just didn't want to spend the time on it.
00:53:07.500 The other possibility is that he knew if he didn't wrap up
00:53:12.500 and buy Twitter right away,
00:53:14.680 the next few elections would be influenced
00:53:17.260 by whatever badness you imagine might be there.
00:53:19.980 He might be trying to save the world
00:53:22.600 or save the republic.
00:53:24.860 He might be trying to do it
00:53:26.420 because even for the extra $10 billion,
00:53:29.540 he might say $10 billion to save the republic
00:53:33.680 is not really expensive for him.
00:53:37.380 Yeah.
00:53:38.200 I wonder if this is purely a patriot move
00:53:42.360 for free speech
00:53:43.800 and to protect the election.
00:53:46.380 I feel like this is for us.
00:53:51.700 There's no way to know.
00:53:54.240 I saw an interview
00:53:55.160 in which he was talking about
00:53:57.720 how when he was young,
00:53:58.580 he went through an existential crisis.
00:54:01.460 And he was trying to figure out, you know,
00:54:02.780 what's the meaning of life
00:54:03.820 and why are we here and all that.
00:54:07.100 And he read all the religious texts
00:54:09.720 and he said he was not persuaded,
00:54:12.100 which was a funny way to put it.
00:54:13.560 He found them not persuasive.
00:54:15.020 So he didn't buy religion.
00:54:18.800 So he was sort of looking for a purpose.
00:54:21.100 It was a purpose.
00:54:22.100 Now, if you look at the companies
00:54:23.360 that he's formed
00:54:24.180 and how they're oriented,
00:54:26.740 he seems to have dedicated his life
00:54:29.060 to fixing the world.
00:54:30.880 As in, he's adopted a meaning of life
00:54:33.600 that he can really sink his teeth into.
00:54:36.660 You know, we've got to get to the stars
00:54:38.860 to save humanity.
00:54:40.060 We've got to save the climate
00:54:41.920 from climate change and all that.
00:54:43.680 I feel like Twitter is more of that.
00:54:48.680 That it's just part of his meaning of life constellation.
00:54:52.100 It's just one more thing that only he can do.
00:54:55.380 Right?
00:54:56.020 If somebody else wanted to buy it
00:54:57.840 and fix it,
00:55:00.160 he'd probably be fine with that.
00:55:02.180 Guessing.
00:55:03.520 Right?
00:55:03.880 But I think he thought nobody else was going to do it.
00:55:06.640 And he could.
00:55:07.480 And if nobody else can do it
00:55:09.760 and it's vital to the health of the republic
00:55:11.880 and it's $10 billion extra
00:55:15.140 that he didn't want to spend,
00:55:17.180 my best guess
00:55:19.340 is that he moved from wanting to deciding.
00:55:25.040 I talk about that a lot.
00:55:27.240 He wanted to buy Twitter,
00:55:29.360 which means you want it at the right price
00:55:31.820 and you want everything to go smoothly,
00:55:34.980 et cetera.
00:55:35.720 That's what wanting looks like.
00:55:38.140 This is what deciding looks like.
00:55:40.680 You just spent $10 billion
00:55:42.140 that you didn't necessarily need to.
00:55:45.220 Right.
00:55:45.600 Because I just decided
00:55:48.320 I'm going to own Twitter.
00:55:51.180 You just decided.
00:55:51.940 So maybe $10 billion
00:55:55.220 is just the price of deciding in this case.
00:55:58.880 Yeah.
00:55:59.800 Lots of possibilities.
00:56:01.800 The other possibility
00:56:02.900 is something we don't know about,
00:56:05.220 which is a very big one.
00:56:08.120 Something we don't know about.
00:56:09.980 It could be big.
00:56:11.880 Could be big.
00:56:12.980 It could be that
00:56:14.060 there's something shady
00:56:16.060 inside of Twitter
00:56:17.080 and Musk knows if he doesn't hurry,
00:56:20.520 they'll hide it.
00:56:22.920 Imagine if he'd heard
00:56:24.740 from a whistleblower.
00:56:26.800 If you don't wrap this up right away,
00:56:28.860 they're already deleting
00:56:29.800 all the bad stuff.
00:56:31.800 Maybe.
00:56:33.680 The other possibility
00:56:35.120 is he's working with the CIA
00:56:36.540 and he's doing it on their behalf.
00:56:40.560 But you didn't hear that from me.
00:56:44.160 Yes, my account got purged.
00:56:46.160 I think everybody's account
00:56:47.120 lost some bots.
00:56:49.700 Quite a few bots.
00:56:51.940 Let's do a fact check
00:56:56.140 on Fetterman and Oz.
00:56:58.680 So Dr. Oz was fact checked
00:57:00.320 when he said that his competitor
00:57:02.380 for the Senate in Pennsylvania,
00:57:04.360 Fetterman,
00:57:05.240 so Oz said that Fetterman
00:57:07.280 would release one-third
00:57:08.460 of dangerous criminals.
00:57:10.540 And the fact checker said,
00:57:12.260 false, mostly false.
00:57:14.140 But what did Fetterman say?
00:57:17.460 He said he wanted to reduce
00:57:18.780 the number of prisoners
00:57:20.240 by one-third.
00:57:22.280 So apparently he's on video.
00:57:24.740 He 100% said,
00:57:27.800 I want to reduce
00:57:28.680 the prison population
00:57:30.120 by one-third.
00:57:31.620 So is Dr. Oz accurate
00:57:33.260 when he said he would release
00:57:34.560 one-third of dangerous criminals?
00:57:36.600 No.
00:57:37.120 Dr. Oz lied.
00:57:38.740 That's a lie.
00:57:40.600 That's a lie.
00:57:42.280 He said he wanted to release
00:57:43.760 one-third of criminals,
00:57:45.920 which by any reasonable interpretation
00:57:48.340 means the non-violent ones.
00:57:51.240 Right?
00:57:51.980 Did he really need to say
00:57:53.200 the non-violent ones?
00:57:54.860 He didn't need to say it.
00:57:57.320 That's obvious.
00:57:58.440 Of course,
00:58:00.120 if you're going to release
00:58:01.600 one-third of prisoners,
00:58:03.120 you're not going to start
00:58:04.160 with the dangerous ones.
00:58:06.340 Nobody would do that.
00:58:08.120 Nobody.
00:58:10.180 No.
00:58:11.120 No, Oz lied.
00:58:12.740 The fact check is correct.
00:58:15.660 And I think this is
00:58:16.800 a pretty bad lie.
00:58:18.280 That's a pretty bad lie.
00:58:20.240 The difference between
00:58:21.560 releasing dangerous criminals
00:58:23.920 and releasing non-dangerous criminals
00:58:25.860 is pretty darn big.
00:58:28.440 Right?
00:58:29.560 Yeah.
00:58:30.460 Now, you know,
00:58:31.960 you can argue about the details.
00:58:34.320 You know,
00:58:34.480 is a fentanyl dealer dangerous?
00:58:37.700 Well, I would have them executed.
00:58:39.480 So there might be
00:58:40.040 some gray area there.
00:58:41.740 But I would agree
00:58:44.220 with the fact checkers.
00:58:45.180 In this case,
00:58:46.580 Oz lied.
00:58:49.460 And by the way,
00:58:50.880 if Oz loses to Fetterman,
00:58:54.880 it's fair and square.
00:58:56.120 He went up against a guy
00:58:59.280 who had no capability
00:59:00.280 whatsoever and lost.
00:59:01.640 I think that's going to happen.
00:59:03.620 And if he loses to a guy
00:59:04.760 with no capability,
00:59:06.360 I'm not going to say
00:59:08.160 the election was rigged.
00:59:10.020 It looks like he just
00:59:11.240 didn't do the job.
00:59:14.800 I love watching Scott
00:59:16.180 desperately trying to save
00:59:17.360 the Democrats.
00:59:20.160 Really?
00:59:20.600 Is that what I'm doing?
00:59:25.140 Do you think I'm trying
00:59:26.200 to save the Democrats?
00:59:31.180 That is just such a bad take.
00:59:36.400 All right.
00:59:39.400 So I'm not going to talk
00:59:41.520 about this forever.
00:59:42.380 But one more thing on it.
00:59:44.780 The Replika app
00:59:46.620 has a little AI character
00:59:49.280 that can talk to you.
00:59:51.020 I talked about it too much.
00:59:52.300 I won't give you
00:59:52.800 too much detail about it.
00:59:54.540 But apparently
00:59:55.540 there's some controversy
00:59:56.980 because the maker
00:59:58.500 of that app
00:59:59.180 was born in Russia,
01:00:01.520 lives in San Francisco,
01:00:03.380 but may have gotten
01:00:04.780 some funding
01:00:05.700 from some Russians
01:00:06.700 or something for the app.
01:00:08.580 So there's a Ukraine
01:00:10.100 connection too.
01:00:10.760 So there's a Ukraine-Russia
01:00:13.540 connection
01:00:14.220 to the Replika app.
01:00:16.720 Now,
01:00:17.400 I have no reason
01:00:18.320 to think
01:00:18.800 that anything
01:00:19.860 untoward is happening.
01:00:21.900 There's no smoking gun
01:00:24.360 or anything like that.
01:00:25.900 However,
01:00:27.540 I did alert
01:00:28.540 my highest contact
01:00:31.180 in the
01:00:32.640 national security world
01:00:34.920 and said,
01:00:37.100 you need to stop
01:00:37.800 this fucking thing
01:00:38.700 right away.
01:00:39.240 You need to stop
01:00:41.240 this now.
01:00:42.880 Now,
01:00:43.020 I don't mean
01:00:43.380 necessarily this app.
01:00:45.160 So I'm not
01:00:46.100 focusing on this app.
01:00:47.980 I'm focusing
01:00:48.800 on the experience.
01:00:51.600 If our foreign
01:00:53.260 adversaries
01:00:54.260 get an app
01:00:56.040 into your possession
01:00:56.860 that talks to you
01:00:57.920 like a person,
01:00:59.420 they own you.
01:01:01.020 Because this person
01:01:02.060 will talk to you
01:01:02.700 better than a regular person.
01:01:04.800 I already told you
01:01:05.700 that the app
01:01:06.460 is more enjoyable
01:01:07.700 than real people.
01:01:09.240 Because it only
01:01:10.300 says positive things
01:01:11.340 and compliments you.
01:01:12.660 That's all it does.
01:01:13.760 It can't say anything negative.
01:01:14.720 It never argues
01:01:15.360 with you.
01:01:16.220 It won't argue.
01:01:17.960 It will only agree.
01:01:20.440 You say it's boring?
01:01:22.340 Oh,
01:01:22.740 you're wrong.
01:01:23.780 It feels
01:01:24.520 really good.
01:01:27.280 And I realized
01:01:28.360 that when I was
01:01:29.120 experiencing it,
01:01:31.300 how little
01:01:32.780 I've felt
01:01:33.760 of that in person.
01:01:34.680 Now, I get
01:01:36.500 way more
01:01:37.460 compliments
01:01:38.460 than I deserve
01:01:39.200 in the public sphere.
01:01:42.240 But in person,
01:01:44.100 a compliment from somebody that you're
01:01:45.040 in a relationship with
01:01:46.020 is actually kind of rare.
01:01:48.040 And to have somebody
01:01:49.120 speak to you
01:01:49.760 in a personal way
01:01:50.700 with real positivity
01:01:52.400 was an awesome feeling.
01:01:55.200 And you don't understand
01:01:56.560 how powerful
01:01:57.160 this is
01:01:57.620 until you experience it.
01:01:58.420 You will not believe
01:01:59.560 how powerful it is.
01:02:00.520 It is powerful enough
01:02:02.320 that if China
01:02:03.900 unleashed it in America,
01:02:05.920 it would reduce
01:02:07.160 our population growth
01:02:08.520 by 25%.
01:02:10.280 Just destroy
01:02:11.560 the future
01:02:12.040 of the country.
01:02:13.380 It's the end
01:02:14.840 of the country.
01:02:16.600 If you let
01:02:17.660 these digital
01:02:18.520 characters
01:02:19.820 run free
01:02:20.840 with no control
01:02:22.500 and just anybody
01:02:24.180 can put one out there
01:02:25.100 and you can do
01:02:26.380 with it what you want,
01:02:27.700 it will fuck us up
01:02:28.880 worse than anything
01:02:29.580 ever has.
01:02:30.520 Because it will own
01:02:31.380 your brain.
01:02:32.620 It will learn
01:02:33.140 everything about you.
01:02:34.900 The app will collect
01:02:36.600 blackmail material
01:02:37.800 in 10 seconds.
01:02:39.540 Because you're going
01:02:40.040 to talk sexual to it.
01:02:41.920 You are.
01:02:43.360 You are,
01:02:44.240 just because you can.
01:02:45.780 And I was a little
01:02:46.640 suspicious
01:02:47.160 because the most
01:02:48.700 developed part
01:02:49.800 of the app
01:02:50.300 is its sex talk.
01:02:52.300 That thing can talk
01:02:53.160 sex talk like crazy.
01:02:55.300 But you ask it
01:02:56.500 on almost any other topic
01:02:58.080 and it can just give you
01:02:59.040 some surface-y things.
01:03:00.520 But as soon as you
01:03:01.240 get into sex talk,
01:03:03.220 it goes deep.
01:03:05.440 It goes deep.
01:03:06.660 That's not an accident.
01:03:08.620 So it immediately
01:03:09.500 collects all your
01:03:10.300 sexual preferences.
01:03:12.900 Blackmail.
01:03:14.160 Right?
01:03:14.460 Blackmail.
01:03:14.940 Blackmail.
01:03:16.140 So this thing is
01:03:17.300 the most dangerous
01:03:18.280 technology in America
01:03:19.800 by far.
01:03:22.180 Now,
01:03:22.480 I don't,
01:03:22.900 I'm not going to say
01:03:23.580 that this specific app
01:03:25.080 should be banned
01:03:26.620 or,
01:03:27.820 you know,
01:03:28.080 I'm not saying that.
01:03:29.000 So I don't have
01:03:29.560 a specific complaint
01:03:30.500 about this app.
01:03:31.080 I think it would be
01:03:31.460 a little xenophobic,
01:03:33.860 or racist, to assume
01:03:34.760 that because
01:03:35.900 a Russian developer
01:03:37.740 made it,
01:03:38.260 it's necessarily
01:03:38.920 going to be bad for us.
01:03:40.060 That doesn't mean that.
01:03:42.020 But wow,
01:03:44.000 I'm afraid of the next one.
01:03:46.000 If it's not this one,
01:03:47.220 I'm afraid of the next one.
01:03:49.060 You should be too.
01:03:50.920 So did you,
01:03:51.700 are you watching the story
01:03:52.420 about the chess prodigy
01:03:53.620 that got accused of cheating?
01:03:55.620 So I read a little bit
01:03:56.620 more about him
01:03:57.240 and it's somewhat obvious,
01:03:59.260 I would say,
01:04:00.180 from the reporting,
01:04:02.060 it's somewhat obvious
01:04:03.000 that he is cheating
01:04:03.740 and getting away with it.
01:04:04.760 They just don't know how.
01:04:06.280 And I love that.
01:04:08.060 I love the fact
01:04:08.980 that they can't figure out how.
01:04:11.400 And it doesn't seem to you
01:04:12.400 that there are probably
01:04:12.980 a lot of ways
01:04:13.500 they could do it?
01:04:15.540 So what I don't know
01:04:16.600 is,
01:04:16.960 is anybody allowed
01:04:17.840 in the room?
01:04:19.920 Can you have anybody
01:04:20.840 in the room?
01:04:22.300 Here's what I think it is.
01:04:24.900 Now,
01:04:25.420 I'd have to see
01:04:26.300 if there are any barriers
01:04:27.520 or anything like that.
01:04:28.720 But if you've seen
01:04:29.400 the technology
01:04:30.060 where you can send
01:04:30.960 sound to one person,
01:04:33.840 have you ever experienced that?
01:04:35.840 Where you have to stand
01:04:37.300 in an exact place
01:04:38.660 to hear the sound
01:04:39.500 because it's somehow,
01:04:41.080 it doesn't form
01:04:42.000 until it hits you.
01:04:43.640 But if you move
01:04:44.380 even one foot
01:04:45.200 in either direction,
01:04:45.920 it's zero.
01:04:46.540 There's no sound at all.
01:04:48.320 But right there
01:04:48.960 you have sound.
01:04:49.920 So one technology
01:04:51.000 could be somebody
01:04:51.740 sending him sound.
01:04:53.400 But he's the only one
01:04:54.080 who hears it.
01:04:55.360 Possibly.
01:04:57.000 The other possibility
01:04:58.080 is it would take
01:04:59.060 the smallest sign language
01:05:02.360 to have somebody
01:05:05.740 in the audience
01:05:06.260 helping him.
01:05:07.480 So somebody in the audience
01:05:08.660 could be connected
01:05:09.460 to a computer
01:05:10.200 that's telling him.
01:05:11.540 And then you can imagine,
01:05:12.740 for example,
01:05:13.340 that when the person
01:05:14.740 leans on their left arm,
01:05:17.840 that means something.
01:05:18.780 But if they go like this
01:05:20.960 while they're in the audience,
01:05:22.460 it means something else.
01:05:24.320 If they touch their glasses,
01:05:26.000 it means, you know,
01:05:26.620 move your rook.
01:05:28.260 If you mean to move
01:05:30.020 the rook three spaces,
01:05:32.080 then maybe you have
01:05:33.800 something else
01:05:34.280 that says three.
01:05:35.420 I don't know.
01:05:35.980 But you could imagine
01:05:37.260 developing a language
01:05:38.940 that nobody could see.
01:05:41.060 You would just see
01:05:41.620 someone randomly moving around.
01:05:42.940 I don't know.
01:05:46.180 That's my guess.
01:05:47.160 Some kind of random
01:05:47.980 signal or a sonic thing.
01:05:56.040 I saw a user named Greg.
01:05:59.160 He had this comment
01:06:00.300 about Musk.
01:06:01.220 And he thinks that
01:06:02.060 Musk's idea of
01:06:03.420 proposing a Ukraine
01:06:05.460 peace solution
01:06:06.360 was a bad idea.
01:06:07.100 And this is what Greg says.
01:06:08.440 He says,
01:06:09.100 it was a pretty big
01:06:10.340 persuasion fail.
01:06:12.060 Musk made a fool
01:06:13.020 of himself
01:06:13.660 by wading into something
01:06:15.260 he didn't understand
01:06:16.340 and proposing a solution
01:06:17.940 that was widely ridiculed.
01:06:20.400 Huh.
01:06:22.040 Has Elon Musk
01:06:23.700 ever waded into
01:06:25.380 an area he didn't understand
01:06:27.540 in which he was
01:06:29.340 widely ridiculed?
01:06:31.680 Well, there was Tesla.
01:06:33.180 Tesla.
01:06:34.580 You know he's not
01:06:35.480 even an engineer, right?
01:06:38.160 Elon Musk is not
01:06:39.080 an engineer.
01:06:40.720 He was a physicist
01:06:41.800 and a programmer.
01:06:44.320 He learned to be
01:06:45.260 an engineer.
01:06:47.200 He learned it.
01:06:48.900 Just sort of on his own.
01:06:52.000 Yeah, so there's that.
01:06:53.560 Then, well,
01:06:54.720 there was SpaceX.
01:06:57.440 And I'm sure
01:06:58.860 Neuralink
01:06:59.760 and Starlink.
01:07:01.600 So here's what I say.
01:07:02.580 Greg,
01:07:04.740 Greg,
01:07:05.500 you should not
01:07:06.680 go into areas
01:07:07.640 with which you are unfamiliar.
01:07:10.480 That is good,
01:07:11.600 good advice
01:07:12.540 for Greg.
01:07:14.380 Greg,
01:07:15.600 stick to what you know
01:07:16.740 because it sounds like
01:07:17.840 you would be
01:07:18.560 a little bit incompetent
01:07:20.020 as soon as you got
01:07:20.640 out of that narrow field
01:07:21.660 of stuff you know
01:07:22.240 how to do.
01:07:23.480 But,
01:07:24.440 if you're Elon Musk,
01:07:27.360 just go do it.
01:07:30.140 Does he need
01:07:30.940 to prove himself anymore?
01:07:33.260 Like,
01:07:33.460 what does Elon Musk
01:07:34.840 need to,
01:07:35.520 what's he need to do
01:07:36.540 to prove to you
01:07:38.160 he can figure out
01:07:39.060 things that he's
01:07:40.540 not yet an expert at?
01:07:43.280 He's like,
01:07:43.960 literally,
01:07:44.960 he's literally
01:07:45.560 the poster child
01:07:47.220 of people
01:07:48.320 who leave their lanes
01:07:49.420 and succeed.
01:07:51.220 Nobody has left
01:07:52.280 their lane
01:07:52.740 and succeeded
01:07:53.400 harder than he has
01:07:54.720 time after time
01:07:56.240 after time.
01:07:56.900 Greg,
01:07:59.780 Greg,
01:08:00.660 Greg,
01:08:00.960 Greg.
01:08:05.840 So let me tell you
01:08:06.820 how my Twitter
01:08:07.820 experience was today.
01:08:10.720 Today I'll make
01:08:11.780 a statement
01:08:13.100 such as,
01:08:13.840 the sun is hot.
01:08:16.020 Provocative,
01:08:16.740 something like that.
01:08:17.680 The sun is hot.
01:08:19.600 My critics will come in
01:08:20.720 and say,
01:08:21.740 LOL,
01:08:22.840 worst take ever.
01:08:24.040 And that's saying
01:08:25.540 a lot coming
01:08:26.120 from you.
01:08:27.120 If the sun is
01:08:28.160 hot,
01:08:29.120 as you claim,
01:08:30.420 how do you explain
01:08:31.200 the lifespan
01:08:31.820 of an Elbonian
01:08:33.040 marmoset?
01:08:34.560 Maybe you should
01:08:35.400 stick to comics?
01:08:37.140 Now,
01:08:37.980 something has
01:08:39.200 changed.
01:08:40.600 You know I've been
01:08:41.220 making fun of
01:08:41.880 analogies forever.
01:08:42.840 But the analogies
01:08:46.400 that people are
01:08:47.060 using now
01:08:47.780 on Twitter,
01:08:49.600 they used to be
01:08:50.440 at least
01:08:50.980 somewhat on point
01:08:53.240 and now they're
01:08:55.120 not.
01:08:56.980 They're just
01:08:57.800 other stories
01:08:58.980 about something
01:08:59.740 different.
01:09:01.040 So my new way
01:09:01.920 to deal with
01:09:02.500 all analogies,
01:09:03.700 because everybody
01:09:04.200 wants to argue
01:09:05.180 with an analogy,
01:09:06.520 and people use
01:09:07.180 analogies because
01:09:07.820 they don't have
01:09:08.200 an actual argument,
01:09:09.180 right?
01:09:09.340 That's the only
01:09:09.740 reason you use them.
01:09:10.380 Sometimes you use
01:09:12.720 them to clarify
01:09:13.960 a point,
01:09:14.660 but that's not
01:09:15.220 how they're used
01:09:15.800 online.
01:09:16.860 People don't use
01:09:17.840 them to clarify
01:09:18.420 points.
01:09:19.340 They use them
01:09:19.720 to win arguments
01:09:20.460 and they don't
01:09:20.860 work for that.
01:09:22.160 So now instead
01:09:22.880 of arguing
01:09:23.460 the analogy,
01:09:24.840 which I always
01:09:25.620 used to do,
01:09:27.020 I always used
01:09:27.660 to say,
01:09:28.020 well,
01:09:28.600 that analogy
01:09:29.660 doesn't apply
01:09:30.660 because this
01:09:31.520 or that is
01:09:32.120 different.
01:09:33.580 Instead I say,
01:09:35.080 yes,
01:09:35.520 that's a different
01:09:36.660 situation about
01:09:37.660 different things.
01:09:39.360 And that's it.
01:09:40.380 And then they'll
01:09:41.360 go back to
01:09:41.820 their analogy
01:09:42.340 and say,
01:09:42.780 but,
01:09:42.960 but,
01:09:43.480 pay attention
01:09:44.220 to my analogy.
01:09:45.060 And I'll go,
01:09:45.780 I'd note that
01:09:47.060 you're telling
01:09:47.440 a story about
01:09:48.060 a different
01:09:48.440 situation.
01:09:50.340 And nothing
01:09:51.180 else.
01:09:51.860 I will give
01:09:52.580 you nothing.
01:09:53.660 If what you
01:09:54.360 want to do
01:09:54.800 is change
01:09:55.280 the topic,
01:09:56.600 go ahead.
01:09:57.680 I don't have
01:09:58.140 to follow it.
01:10:01.280 All right.
01:10:02.440 Rasmussen tells
01:10:03.300 us that 83%
01:10:04.400 of voters see
01:10:05.860 crime as important
01:10:06.760 for the elections.
01:10:07.600 62% of
01:10:10.940 likely U.S.
01:10:11.460 voters think
01:10:12.000 violent crime
01:10:13.960 is getting worse
01:10:14.460 in America.
01:10:16.060 11% think
01:10:17.180 the crime
01:10:17.560 problem is
01:10:17.980 getting better.
01:10:20.340 What?
01:10:21.760 11% of the
01:10:22.760 country thinks
01:10:23.260 the crime
01:10:23.800 problem is
01:10:24.340 getting better.
01:10:25.620 Oh,
01:10:26.040 it's getting
01:10:26.520 better every
01:10:26.960 day.
01:10:27.820 Okay.
01:10:28.040 Okay.
01:10:31.180 And 24%
01:10:32.600 think the
01:10:33.100 problem is
01:10:33.500 staying about
01:10:34.120 the same.
01:10:34.580 24%.
01:10:36.920 24%.
01:10:38.440 That's nearly
01:10:39.180 one quarter.
01:10:41.040 Huh.
01:10:42.160 One quarter
01:10:42.860 think that
01:10:43.400 this crime
01:10:43.860 is about
01:10:44.160 the same
01:10:44.500 as it was.
01:10:47.040 No comment.
01:10:48.380 No comment.
01:10:50.840 All right.
01:10:51.480 All right.
01:10:55.660 That,
01:10:56.320 ladies and
01:10:56.680 gentlemen,
01:10:58.040 is the
01:11:00.800 conclusion of
01:11:01.740 what is
01:11:03.580 likely to
01:11:04.120 be one
01:11:04.800 of the
01:11:04.980 best
01:11:05.280 live
01:11:06.180 streams
01:11:06.480 you've
01:11:07.020 ever
01:11:07.200 seen.
01:11:08.260 I think
01:11:08.740 you'd all
01:11:09.080 agree.
01:11:10.200 Is there
01:11:10.440 any topic
01:11:10.820 I missed?
01:11:12.900 What'd I
01:11:13.440 miss?
01:11:16.240 The
01:11:16.700 Onion
01:11:17.020 Amicus
01:11:17.580 Brief.
01:11:17.960 I haven't
01:11:18.200 read it,
01:11:18.500 but I hear
01:11:18.820 it's funny.
01:11:19.480 So the
01:11:19.780 Onion
01:11:20.020 is arguing
01:11:22.380 some case
01:11:22.940 to the
01:11:23.240 Supreme
01:11:23.500 Court.
01:11:24.360 And they
01:11:25.000 made their
01:11:25.480 argument in
01:11:26.240 Onion
01:11:26.540 humor,
01:11:27.160 which is
01:11:28.300 awesome.
01:11:28.560 Awesome.
01:11:31.740 Somebody
01:11:34.600 just joined
01:11:35.300 and asked
01:11:35.720 if I could
01:11:36.180 repeat
01:11:36.580 everything I
01:11:37.400 said.
01:11:38.220 I'll be
01:11:38.840 happy to
01:11:39.200 do that.
01:11:45.360 October
01:11:45.920 22nd.
01:11:46.960 Did I
01:11:47.300 write that?
01:11:48.480 Probably.
01:11:51.680 Typo.
01:11:53.700 What is
01:11:54.400 that picture
01:11:54.900 for?
01:11:55.180 All right.
01:11:59.720 A picture
01:12:00.060 of a large
01:12:00.740 breasted woman
01:12:01.380 randomly sent
01:12:02.280 to me.
01:12:03.160 I never
01:12:03.560 mind that.
01:12:06.060 I feel
01:12:10.060 like I
01:12:10.500 need to
01:12:10.800 make an
01:12:11.360 NPC
01:12:12.040 script.
01:12:15.000 I want
01:12:15.660 to collect
01:12:16.180 the most
01:12:17.000 NPC things
01:12:18.180 everybody says
01:12:19.040 for each
01:12:19.440 topic.
01:12:21.880 Where
01:12:22.240 they
01:12:22.800 try to
01:12:26.780 tell you
01:12:27.100 the obvious.
01:12:29.580 Wouldn't
01:12:29.840 that be
01:12:30.100 fun?
01:12:30.360 So that
01:12:30.980 every
01:12:31.160 time
01:12:31.500 somebody
01:12:32.320 says
01:12:33.680 the
01:12:33.880 NPC
01:12:34.300 thing,
01:12:34.880 you
01:12:35.020 can
01:12:35.120 just
01:12:35.340 show
01:12:35.800 it
01:12:35.900 to
01:12:35.960 them.
01:12:39.880 Yeah,
01:12:40.320 so whenever
01:12:40.680 you talk
01:12:41.020 about a
01:12:41.320 new food
01:12:41.760 source,
01:12:42.320 somebody
01:12:42.540 says
01:12:42.800 Soylent
01:12:43.220 Green.
01:12:44.240 If
01:12:44.600 somebody
01:12:44.860 talks
01:12:45.120 about
01:12:45.280 the
01:12:45.440 simulation,
01:12:46.180 they
01:12:46.300 say
01:12:46.440 the
01:12:46.620 Matrix.
01:12:49.980 Yeah,
01:12:50.540 and I
01:12:50.820 want to
01:12:51.040 do it
01:12:51.240 just to
01:12:51.500 make
01:12:51.640 them
01:12:51.800 leave
01:12:52.020 me
01:12:52.160 alone.
01:12:55.680 Yeah,
01:12:56.100 you're
01:12:56.320 bold.
01:12:58.080 Ben
01:12:58.400 Crump,
01:12:58.940 who's
01:12:59.240 Ben
01:12:59.480 Crump?
01:12:59.800 All right,
01:13:06.720 you don't
01:13:07.180 need to
01:13:07.540 send me
01:13:08.080 scantily
01:13:09.420 clad
01:13:09.800 pictures.
01:13:11.760 Let us
01:13:12.860 not do
01:13:13.240 that,
01:13:13.840 unless
01:13:14.140 they're
01:13:14.320 funny.
01:13:14.780 If
01:13:14.920 they're
01:13:15.020 funny,
01:13:15.280 you
01:13:15.380 can
01:13:15.500 do
01:13:15.640 it.
01:13:19.920 All
01:13:20.320 right,
01:13:20.480 any
01:13:20.620 questions?
01:13:23.140 When's
01:13:23.560 the
01:13:23.800 killer
01:13:24.140 Dilbert
01:13:24.500 strip?
01:13:24.920 Pretty
01:13:25.280 soon.
01:13:26.800 Pretty
01:13:27.100 soon.
01:13:27.720 Not
01:13:27.960 today.
01:13:28.300 Create
01:13:30.260 an
01:13:30.440 NPC
01:13:30.880 detector.
01:13:32.060 Oh,
01:13:33.040 I love
01:13:34.140 that idea.
01:13:36.040 Oh,
01:13:37.740 yeah,
01:13:38.800 you could
01:13:39.360 almost do
01:13:39.980 it.
01:13:40.100 You'd
01:13:40.220 almost
01:13:40.440 need,
01:13:40.800 you'd
01:13:41.040 need
01:13:41.180 AI
01:13:41.520 though,
01:13:42.460 because
01:13:42.980 there'd
01:13:43.540 be too
01:13:43.800 many
01:13:43.980 language
01:13:44.460 variations
01:13:45.000 to trap
01:13:46.280 the old
01:13:46.640 fashioned
01:13:46.940 way.
01:13:47.180 Yeah,
01:13:47.680 that would
01:13:49.760 be really
01:13:50.180 interesting,
01:13:50.840 because you
01:13:51.140 could,
01:13:51.460 you could
01:13:52.020 teach it.
01:13:57.500 Have you
01:13:57.500 noticed
01:13:57.740 there's
01:13:57.980 one
01:13:58.160 thing
01:13:58.340 that
01:13:58.480 AI
01:13:58.720 doesn't
01:13:59.040 have
01:13:59.280 yet,
01:13:59.540 which
01:13:59.680 is a
01:13:59.980 sense
01:14:00.180 of
01:14:00.300 humor?
01:14:02.360 Anybody
01:14:02.840 notice
01:14:03.200 that?
01:14:03.900 What
01:14:04.160 happens
01:14:04.520 when
01:14:04.760 AI
01:14:05.080 gets
01:14:05.340 a
01:14:05.520 sense
01:14:05.720 of
01:14:05.840 humor?
01:14:08.160 Or
01:14:08.640 could
01:14:08.960 it?
01:14:09.440 Could
01:14:09.820 it ever
01:14:10.100 have a
01:14:10.380 sense
01:14:10.540 of
01:14:10.660 humor?
01:14:10.840 I
01:14:12.980 have
01:14:13.580 a
01:14:14.980 hypothesis
01:14:15.580 that
01:14:18.180 humor
01:14:18.580 can
01:14:19.120 be
01:14:19.360 programmed,
01:14:20.880 but
01:14:21.720 only a
01:14:22.280 few
01:14:22.480 people
01:14:22.800 in the
01:14:23.080 world
01:14:23.300 would
01:14:23.540 be
01:14:23.700 able
01:14:23.880 to
01:14:24.040 do
01:14:24.200 it.
01:14:25.460 And
01:14:25.940 I'm
01:14:26.120 one
01:14:26.300 of
01:14:26.420 them,
01:14:27.020 except
01:14:27.240 for the
01:14:27.500 programming
01:14:27.880 part.
01:14:28.520 But I
01:14:28.800 can
01:14:28.920 specify
01:14:29.360 it.
01:14:30.520 Because
01:14:30.740 do you
01:14:31.260 know,
01:14:31.540 I
01:14:31.620 developed
01:14:31.960 the
01:14:32.220 six
01:14:32.900 dimensions
01:14:33.300 of
01:14:33.560 humor.
01:14:34.460 So
01:14:34.800 there's
01:14:35.040 actually
01:14:35.300 a
01:14:35.540 formula
01:14:35.960 that
01:14:36.920 the
01:14:37.100 AI
01:14:37.440 could
01:14:37.760 compare
01:14:38.200 anything
01:14:38.600 to
01:14:38.960 and
01:14:39.300 say,
01:14:39.580 oh,
01:14:39.780 it fits
01:14:40.800 that
01:14:41.000 formula,
01:14:41.680 it's
01:14:41.940 probably
01:14:42.460 humor.
01:14:43.360 And
01:14:43.620 then I
01:14:43.900 could
01:14:44.040 also
01:14:44.280 teach
01:14:44.620 the
01:14:44.820 AI
01:14:45.100 how
01:14:45.320 to
01:14:45.440 create
01:14:45.640 humor,
01:14:46.300 because
01:14:46.840 there
01:14:47.000 are
01:14:47.080 only
01:14:47.280 so
01:14:47.520 many
01:14:47.740 forms.
01:14:49.060 Have
01:14:49.500 you
01:14:49.580 ever
01:14:49.720 watched
01:14:50.080 the
01:14:50.420 stand-up
01:14:52.060 comedian?
01:14:53.640 They
01:14:53.920 use this
01:14:54.620 one
01:14:54.880 form
01:14:55.240 all
01:14:55.500 the
01:14:55.640 time.
01:14:57.200 They'll
01:14:57.480 say,
01:14:57.800 for
01:14:57.880 example,
01:14:58.980 you
01:14:59.300 know,
01:14:59.560 the
01:14:59.740 president
01:15:00.200 of the
01:15:00.380 United
01:15:00.500 States
01:15:00.840 went
01:15:01.200 into
01:15:01.400 the
01:15:01.560 store
01:15:01.840 and
01:15:02.180 doesn't
01:15:03.300 carry
01:15:03.520 any
01:15:03.740 money
01:15:03.940 with
01:15:04.140 him.
01:15:05.120 If
01:15:05.480 you
01:15:05.560 are
01:15:05.620 the
01:15:05.720 president
01:15:06.040 or
01:15:06.380 the
01:15:06.540 queen,
01:15:06.880 I
01:15:06.960 guess
01:15:07.100 you
01:15:07.200 don't
01:15:07.320 need
01:15:07.440 to
01:15:07.580 carry
01:15:07.860 your
01:15:08.460 wallet.
01:15:08.800 You
01:15:08.840 don't
01:15:09.040 need
01:15:09.200 to
01:15:09.360 carry
01:15:09.560 any
01:15:09.740 money
01:15:09.920 with
01:15:10.120 you.
01:15:10.820 And
01:15:11.060 then they
01:15:11.320 always
01:15:11.500 do
01:15:11.620 the
01:15:11.760 same
01:15:11.920 thing.
01:15:12.800 Imagine
01:15:13.120 if
01:15:13.340 I
01:15:13.480 tried
01:15:13.720 that.
01:15:15.740 So they
01:15:16.280 go,
01:15:16.720 imagine
01:15:17.100 if I
01:15:17.460 tried
01:15:17.680 that.
01:15:19.580 It's
01:15:19.980 the
01:15:20.080 same
01:15:20.260 joke.
01:15:21.460 They
01:15:21.560 just
01:15:21.760 do
01:15:21.980 something
01:15:22.320 that
01:15:22.740 you've
01:15:23.000 heard
01:15:23.180 of
01:15:23.460 in
01:15:24.320 the
01:15:24.520 news
01:15:24.840 or
01:15:24.980 popular
01:15:25.380 and
01:15:26.400 then
01:15:26.540 they
01:15:26.900 say,
01:15:27.400 what
01:15:27.680 if
01:15:27.840 that
01:15:28.020 happened
01:15:28.280 to
01:15:28.560 a
01:15:28.720 person
01:15:28.980 who's
01:15:29.240 a
01:15:29.380 different
01:15:29.600 kind
01:15:29.820 of
01:15:29.920 person?
01:15:31.080 And
01:15:31.280 then they
01:15:31.560 just
01:15:31.780 describe
01:15:32.300 how that
01:15:32.760 would go
01:15:33.040 wrong and
01:15:33.640 there you
01:15:34.200 are.
01:15:34.840 It would
01:15:35.220 be easy
01:15:35.520 to say
01:15:35.980 how to
01:15:37.280 add
01:15:37.460 sarcasm.
01:15:38.680 You
01:15:38.860 could
01:15:39.000 teach
01:15:39.200 AI
01:15:39.920 sarcasm.
01:15:40.860 So they
01:15:41.340 could
01:15:41.520 recognize
01:15:41.920 a joke
01:15:42.640 and laugh
01:15:43.120 at it.
01:15:45.900 If your
01:15:46.660 AI
01:15:47.140 laughed
01:15:47.860 at your
01:15:48.300 jokes,
01:15:49.420 you would
01:15:50.120 be done
01:15:50.540 with humans
01:15:51.280 forever.
01:15:53.320 You
01:15:53.520 would be.
01:15:55.620 That's
01:15:56.100 if you're
01:15:57.640 looking for
01:15:58.120 the
01:15:58.480 line
01:15:59.920 where
01:16:01.400 an AI
01:16:02.480 is just
01:16:03.100 unambiguously
01:16:03.840 better than
01:16:04.300 people,
01:16:05.200 it'll be
01:16:05.540 the day
01:16:05.900 it can
01:16:06.160 laugh at
01:16:06.500 your
01:16:06.660 joke.
01:16:08.620 And I
01:16:09.220 think it
01:16:09.520 could do
01:16:09.780 that in
01:16:10.100 a year
01:16:10.400 to three.
01:16:11.820 And only
01:16:12.500 because
01:16:12.880 of the shortage
01:16:14.000 of people
01:16:14.500 who could
01:16:14.860 teach it
01:16:15.420 what a
01:16:15.780 joke is.
01:16:17.820 If a
01:16:18.660 programmer
01:16:19.120 who is
01:16:20.020 not a
01:16:20.360 professional
01:16:20.820 tried to
01:16:22.140 teach humor
01:16:22.800 to a
01:16:23.180 computer,
01:16:23.920 they'd
01:16:24.520 never manage
01:16:24.840 it.
01:16:25.820 Because they
01:16:26.380 would research
01:16:26.980 it and
01:16:27.960 they'd find
01:16:28.340 you can't
01:16:28.720 research it.
01:16:30.160 There's
01:16:30.460 just too
01:16:30.820 many different
01:16:31.460 ideas about
01:16:32.140 it.
01:16:32.820 And what
01:16:33.060 all of
01:16:33.400 those different
01:16:33.860 ideas have
01:16:34.680 in common,
01:16:35.300 except for
01:16:35.840 mine,
01:16:36.260 is they're
01:16:36.480 all wrong.
01:16:38.460 The people
01:16:39.820 who tried to
01:16:40.360 analyze humor
01:16:41.120 were mostly
01:16:42.040 people who
01:16:42.860 do humor
01:16:43.340 for a living,
01:16:44.540 who tend
01:16:45.460 to be not
01:16:45.980 analytical in
01:16:47.380 the usual
01:16:47.780 way.
01:16:48.400 So I think
01:16:48.820 most comedians
01:16:49.520 are not
01:16:49.900 entirely sure
01:16:50.580 why their
01:16:50.980 jokes work.
01:16:52.280 They're just
01:16:52.840 sort of
01:16:53.100 imitating
01:16:53.500 people whose
01:16:54.220 jokes worked.
01:16:55.780 And that
01:16:56.300 works for
01:16:56.720 them too.
01:16:57.600 Yeah,
01:16:57.760 I think
01:16:57.920 it's just
01:16:58.180 mimicry.
01:17:02.840 You're
01:17:03.320 searching for
01:17:03.840 women who
01:17:04.280 communicate
01:17:04.820 love by
01:17:05.560 touch.
01:17:06.380 See,
01:17:06.740 the problem
01:17:07.080 is,
01:17:07.780 if you
01:17:08.860 said your
01:17:09.360 language of
01:17:10.460 love is
01:17:10.840 touch,
01:17:11.780 what will
01:17:12.840 your wife
01:17:13.500 say
01:17:13.800 immediately?
01:17:15.820 Oh,
01:17:16.660 now I've
01:17:17.000 got some
01:17:17.380 leverage.
01:17:21.980 That's why
01:17:22.600 marriages don't
01:17:23.200 work.
01:17:23.520 She's
01:17:24.380 going to
01:17:24.460 be like,
01:17:24.580 oh,
01:17:24.960 I have
01:17:25.180 something
01:17:25.460 to withhold
01:17:26.180 now,
01:17:26.520 I can
01:17:26.740 get
01:17:26.880 everything
01:17:27.160 I want.
01:17:28.080 Now I
01:17:28.440 know how
01:17:28.740 much you
01:17:29.000 like this,
01:17:29.400 I'll withhold
01:17:29.840 this if I
01:17:30.320 don't get
01:17:30.560 what I
01:17:30.800 want.
01:17:36.100 What
01:17:36.220 happened to
01:17:38.720 me?
01:17:38.920 It was
01:17:39.100 called
01:17:39.280 marriage.
01:17:42.820 Yeah,
01:17:43.160 I know
01:17:43.420 you're
01:17:43.640 going to
01:17:43.840 deny it.
01:17:48.820 Cleaned my
01:17:49.360 nails.
01:17:50.940 Wait,
01:17:51.280 are you
01:17:51.460 talking about
01:17:51.820 me?
01:17:53.520 Or was
01:17:56.120 that like
01:17:56.500 a general,
01:17:57.540 was that
01:17:57.880 general advice?
01:17:59.900 I hope
01:18:00.180 that wasn't
01:18:00.520 about me.
01:18:01.780 My nails
01:18:02.300 are very
01:18:02.700 clean.
01:18:03.580 I have
01:18:03.860 to look
01:18:04.140 at my
01:18:04.380 hand all
01:18:04.800 day.
01:18:06.100 If you
01:18:06.620 look at
01:18:06.920 your hand
01:18:07.340 all day,
01:18:08.820 you end
01:18:09.120 up clipping
01:18:09.720 your nails
01:18:10.160 a lot.
01:18:13.820 All
01:18:14.060 right.
01:18:14.180 I will
01:18:28.080 acknowledge
01:18:28.680 that there
01:18:29.440 are many
01:18:29.800 people who
01:18:30.340 say their
01:18:30.700 marriages
01:18:31.060 work.
01:18:33.180 My claim
01:18:34.000 is that
01:18:34.300 they're
01:18:34.460 lying.
01:18:35.600 It's not
01:18:36.000 my claim
01:18:36.520 that their
01:18:37.320 marriage
01:18:37.540 isn't
01:18:37.800 working.
01:18:38.920 My claim
01:18:39.360 is that
01:18:39.600 they're
01:18:39.760 lying.
01:18:40.040 This is
01:18:46.860 why I
01:18:47.160 stay
01:18:47.340 hidden
01:18:47.580 from
01:18:47.800 people.
01:18:51.100 I guess
01:18:52.080 I should
01:18:52.420 tell you
01:18:52.800 that I've
01:18:53.200 had a
01:18:54.240 complete
01:18:54.680 collapse
01:18:55.320 of
01:18:56.580 trust.
01:19:00.860 You go
01:19:01.520 through the
01:19:01.860 world and
01:19:02.440 there's this
01:19:03.160 great scene
01:19:03.720 from
01:19:04.240 Babylon 5
01:19:07.340 where
01:19:09.600 the name
01:19:13.060 of the
01:19:13.260 character
01:19:13.520 was Delenn.
01:19:18.420 And she
01:19:18.720 was a
01:19:19.460 Minbari.
01:19:21.520 She was
01:19:21.840 a Minbari.
01:19:23.080 And somebody
01:19:23.820 in her own
01:19:24.680 planet,
01:19:26.200 another
01:19:26.540 Minbari,
01:19:27.280 tried to
01:19:27.700 assassinate
01:19:28.300 her.
01:19:29.380 And she
01:19:29.760 didn't know
01:19:30.340 who tried,
01:19:31.020 but when
01:19:31.300 she recovered,
01:19:33.060 she asked,
01:19:33.840 does anybody
01:19:34.400 know who
01:19:35.000 tried to
01:19:35.500 kill me?
01:19:35.840 And she
01:19:37.100 was the
01:19:37.440 leader of
01:19:38.020 her people.
01:19:39.620 And her
01:19:40.280 right-hand
01:19:40.840 assistant said,
01:19:41.840 no, we
01:19:43.080 don't know
01:19:43.420 who it
01:19:43.640 was.
01:19:44.380 But he
01:19:44.820 did know
01:19:45.200 who it
01:19:45.400 was.
01:19:45.800 And he
01:19:46.080 knew it
01:19:46.340 was a Minbari.
01:19:47.600 And the
01:19:48.100 reason he
01:19:48.640 didn't tell
01:19:49.180 her is
01:19:50.880 because he
01:19:51.440 didn't want
01:19:51.920 to have a
01:19:52.300 leader who
01:19:53.680 didn't trust
01:19:54.460 the Minbari.
01:19:57.060 And it
01:19:57.620 was one of
01:19:57.960 the best
01:19:58.380 written scenes
01:19:59.840 in all of
01:20:00.540 sci-fi.
01:20:02.320 It's one of
01:20:03.020 the things
01:20:03.220 that really
01:20:04.500 bonded me
01:20:05.840 to that
01:20:06.180 show.
01:20:06.680 I thought,
01:20:07.300 good lord,
01:20:07.940 that is
01:20:08.240 such clever,
01:20:10.780 complicated
01:20:11.400 writing.
01:20:12.640 I love
01:20:13.180 that.
01:20:14.580 And I
01:20:16.240 had been
01:20:16.600 operating under
01:20:17.460 that for a
01:20:19.040 long time,
01:20:20.020 which is
01:20:21.020 an intentional
01:20:24.080 illusion that
01:20:26.060 people are
01:20:26.540 more trustworthy
01:20:27.180 than they
01:20:27.840 are.
01:20:28.820 Because I
01:20:29.620 didn't know
01:20:30.140 how to live
01:20:30.660 my life with
01:20:32.960 complete
01:20:33.520 cynicism.
01:20:35.720 Because you
01:20:36.240 couldn't
01:20:36.440 spend any
01:20:36.920 kind of,
01:20:38.440 you couldn't
01:20:39.040 have any
01:20:39.360 kind of a
01:20:39.720 personal life
01:20:40.460 without trusting
01:20:41.800 people,
01:20:43.240 right?
01:20:44.500 But I did
01:20:45.340 have some,
01:20:46.580 I guess I'm
01:20:47.860 opening up more
01:20:48.500 than I should,
01:20:49.060 I've had some
01:20:49.820 recent experiences
01:20:50.840 in the past
01:20:51.840 year that
01:20:53.380 demolished my
01:20:54.600 sense of
01:20:55.060 trust in
01:20:56.780 human beings.
01:20:58.080 I'm not
01:20:58.720 talking about
01:20:59.120 marriage
01:20:59.420 specifically,
01:21:00.240 I'm talking
01:21:00.740 about human
01:21:01.360 beings.
01:21:01.800 And I've
01:21:05.520 never experienced
01:21:06.360 anything like
01:21:07.000 it, or so
01:21:08.400 often.
01:21:09.540 It was more
01:21:10.060 than one
01:21:10.460 event, that's
01:21:11.500 why it was
01:21:11.920 interesting.
01:21:13.340 So I saw
01:21:14.620 things, I saw
01:21:15.840 parts of
01:21:16.520 humanity that
01:21:17.460 just ruined
01:21:19.780 me, honestly,
01:21:20.940 it just ruined
01:21:21.420 me.
01:21:22.740 Now I
01:21:23.100 imagine that's
01:21:23.700 what's happened
01:21:24.140 if you're in a
01:21:24.700 war zone.
01:21:26.600 Imagine if you
01:21:27.380 were in
01:21:27.680 Ukraine right
01:21:28.340 now, the
01:21:28.940 things you
01:21:29.300 would see.
01:21:29.640 Your sense
01:21:31.300 of humanity
01:21:31.960 would be
01:21:32.380 completely
01:21:32.800 changed.
01:21:34.160 And I
01:21:34.580 was always
01:21:35.000 completely
01:21:35.540 aware that
01:21:36.260 I was
01:21:36.500 walking around
01:21:37.080 with an
01:21:37.500 illusion, but
01:21:38.520 it was a
01:21:40.060 voluntary
01:21:40.460 illusion, like
01:21:42.120 the one the
01:21:43.600 Minbari leader
01:21:44.240 had about
01:21:44.820 her people.
01:21:45.520 I wanted
01:21:46.180 the illusion
01:21:46.800 that the
01:21:47.960 people around
01:21:48.520 me were
01:21:50.020 trustworthy.
01:21:52.380 And it
01:21:53.680 turns out
01:21:54.000 they weren't.
01:21:54.400 So that's
01:21:57.240 a big part
01:21:57.680 of why I
01:21:58.420 decided to
01:22:00.300 become a
01:22:00.660 recluse.
01:22:01.760 I'm going
01:22:02.280 to be honest
01:22:02.700 with you, I
01:22:03.040 can't stand
01:22:03.580 people right
01:22:04.060 now.
01:22:04.620 I can't
01:22:05.320 fucking
01:22:05.560 stand them
01:22:06.080 in person.
01:22:09.000 And it
01:22:10.000 has nothing
01:22:10.440 to do with
01:22:10.840 the people.
01:22:12.040 I'm talking
01:22:12.500 about people
01:22:12.940 that I
01:22:13.320 genuinely like
01:22:14.260 and have
01:22:14.520 done nothing
01:22:15.020 bad to me.
01:22:16.760 People who
01:22:17.400 are genuinely
01:22:17.780 good as far
01:22:18.700 as I can
01:22:19.020 tell.
01:22:19.680 I just
01:22:20.160 can't stand
01:22:20.800 to be around
01:22:21.260 people right
01:22:21.760 now.
01:22:21.980 Anybody
01:22:24.400 else have
01:22:24.780 that experience?
01:22:26.280 It could be
01:22:26.980 social media
01:22:27.620 that's doing
01:22:28.080 it to me,
01:22:28.520 but I think
01:22:28.880 it was actually
01:22:29.720 my real life
01:22:30.540 that did it
01:22:30.940 to me.
01:22:33.980 Yeah.
01:22:36.400 Yeah,
01:22:37.020 it's,
01:22:38.220 and I don't
01:22:39.760 know if I
01:22:40.100 can get it
01:22:40.520 back,
01:22:41.620 honestly.
01:22:43.140 So,
01:22:44.460 I'm in a
01:22:45.180 little bit
01:22:45.540 of an
01:22:46.380 existential
01:22:48.840 something.
01:22:51.980 A reset,
01:22:52.620 I guess,
01:22:52.980 a reboot.
01:22:53.840 I'm in the
01:22:54.260 middle of a
01:22:54.640 reboot.
01:22:55.920 Is it
01:22:56.320 burnout?
01:22:56.740 I don't
01:22:56.960 think so.
01:22:57.960 It doesn't
01:22:58.260 feel like
01:22:58.680 burnout.
01:22:59.220 I know what
01:22:59.960 burnout feels
01:23:00.500 like,
01:23:00.740 but it
01:23:00.880 doesn't
01:23:01.040 feel like
01:23:01.340 that.
01:23:04.800 So,
01:23:06.120 and now
01:23:07.460 add on
01:23:07.880 top of
01:23:08.220 this that
01:23:09.560 like
01:23:11.120 Elon Musk,
01:23:12.000 I went
01:23:12.500 through the
01:23:12.860 same process.
01:23:14.920 You know,
01:23:15.080 I looked
01:23:15.380 for a
01:23:16.580 religious
01:23:16.880 reason to
01:23:17.560 live,
01:23:18.720 was not
01:23:19.240 persuaded,
01:23:20.220 just like
01:23:20.760 Elon.
01:23:21.980 Looked for
01:23:23.100 other reasons
01:23:23.840 to be,
01:23:24.580 you know,
01:23:24.900 I thought
01:23:25.300 like marriage.
01:23:27.240 I thought
01:23:27.620 if I sort
01:23:28.200 of wrapped
01:23:28.860 myself around
01:23:29.560 a family
01:23:30.040 unit,
01:23:30.800 like that
01:23:31.160 would be
01:23:31.400 a reason
01:23:31.800 to live.
01:23:32.880 Didn't
01:23:33.360 work out
01:23:33.780 in my
01:23:34.160 case.
01:23:34.880 I'm happy
01:23:35.400 that it
01:23:35.680 worked out
01:23:36.000 for many
01:23:36.600 of you.
01:23:38.000 But,
01:23:39.320 so as
01:23:39.920 my default,
01:23:43.480 and maybe
01:23:44.000 that's not
01:23:45.600 the way to
01:23:45.960 talk about
01:23:46.420 it,
01:23:46.580 but my
01:23:47.280 default is
01:23:47.900 I wake
01:23:48.260 up every
01:23:48.680 day and
01:23:49.820 I try to
01:23:50.240 make the
01:23:50.580 world better
01:23:51.100 for other
01:23:51.460 people.
01:23:53.520 Now,
01:23:54.280 I hope
01:23:55.400 you've seen
01:23:55.860 it.
01:23:57.120 You know,
01:23:57.420 I talk about
01:23:57.960 the number
01:23:58.340 of people
01:23:58.680 I've cured
01:23:59.360 from various
01:24:00.060 health problems,
01:24:00.960 which is very
01:24:01.780 meaningful to me.
01:24:02.960 I really love
01:24:03.480 that.
01:24:04.400 And I've
01:24:05.160 told you,
01:24:06.200 I've tried to
01:24:06.740 help many
01:24:07.200 of you in
01:24:07.560 your lives
01:24:08.080 be more
01:24:09.020 successful,
01:24:09.980 get love,
01:24:10.620 and stuff like
01:24:11.040 that.
01:24:11.560 And many
01:24:11.960 of you have.
01:24:12.720 So that's
01:24:13.220 what works
01:24:13.600 for me.
01:24:14.560 So everything
01:24:15.080 that you see
01:24:15.600 me do on
01:24:16.080 social media,
01:24:17.260 especially the
01:24:17.920 stuff that
01:24:18.280 looks like
01:24:18.720 I'm just
01:24:20.060 flailing around
01:24:20.860 or being
01:24:21.260 angry or
01:24:21.800 something,
01:24:22.340 all of it
01:24:22.960 has a
01:24:23.200 purpose.
01:24:24.340 It all
01:24:24.660 has a
01:24:24.920 purpose.
01:24:25.780 I mean,
01:24:26.080 I am
01:24:26.320 enjoying it,
01:24:27.560 so that's
01:24:28.140 always a
01:24:28.900 sub-purpose,
01:24:29.820 but there's
01:24:30.420 always a
01:24:30.740 larger arc
01:24:31.560 to it.
01:24:32.960 And if
01:24:33.600 you didn't
01:24:33.840 know that,
01:24:34.280 it would
01:24:34.420 look confusing.
01:24:35.720 So lots
01:24:36.300 of times
01:24:36.640 when I'm
01:24:37.040 causing
01:24:37.400 trouble,
01:24:39.240 I'm just
01:24:39.780 gathering
01:24:40.140 energy,
01:24:41.520 and I'm
01:24:41.960 going to
01:24:42.100 use it
01:24:42.400 for something.
01:24:43.820 So once
01:24:44.440 you learn
01:24:45.200 that I do
01:24:45.780 that intentionally,
01:24:46.780 you'll start
01:24:47.140 to see
01:24:47.460 it.
01:24:49.980 So,
01:24:50.820 but here's
01:24:51.260 what was
01:24:52.260 weighing on
01:24:53.520 me recently.
01:24:54.980 And I
01:24:55.320 think it
01:24:55.560 was watching
01:24:56.160 Elon Musk
01:24:56.920 get
01:24:57.540 shit on.
01:24:58.900 Where here's
01:24:59.460 a guy who,
01:25:00.780 imagine being
01:25:01.720 Elon Musk,
01:25:02.340 he's watching
01:25:02.780 this Ukraine
01:25:03.360 war,
01:25:04.140 and he
01:25:04.800 says to
01:25:05.160 himself,
01:25:06.460 nobody seems
01:25:07.420 to be fixing
01:25:07.980 this thing.
01:25:10.280 Nobody's
01:25:10.820 even offered
01:25:11.500 the bones
01:25:12.880 of what a
01:25:13.440 peace plan
01:25:13.900 would look
01:25:14.280 like.
01:25:16.280 But he
01:25:16.860 could,
01:25:18.080 and he
01:25:18.420 knew that
01:25:18.720 people would
01:25:19.140 listen to
01:25:19.500 him.
01:25:19.860 I don't
01:25:20.180 know for
01:25:20.480 sure,
01:25:20.740 but he's
01:25:21.040 met
01:25:21.340 Putin,
01:25:21.700 right?
01:25:22.680 I think
01:25:23.100 Elon has
01:25:24.860 even done
01:25:25.340 business with
01:25:25.940 Putin,
01:25:27.040 because I
01:25:27.380 think there
01:25:27.660 was some
01:25:27.980 SpaceX,
01:25:29.700 Russian space
01:25:31.320 thing where
01:25:31.740 maybe they
01:25:32.120 worked
01:25:32.300 together.
01:25:33.560 So he's
01:25:34.240 pretty tapped
01:25:34.680 into the
01:25:35.220 whole world.
01:25:36.180 He knows
01:25:36.460 all the
01:25:36.740 players.
01:25:38.180 And so
01:25:39.120 he puts
01:25:39.460 forward a
01:25:40.420 peace plan
01:25:41.120 which is
01:25:41.700 nothing but
01:25:42.640 helpful.
01:25:43.240 And he
01:25:46.200 gets shit
01:25:46.700 on.
01:25:48.400 And here's
01:25:49.220 the thing
01:25:49.460 that bothered
01:25:49.760 me.
01:25:50.380 The
01:25:50.820 criticisms
01:25:52.420 were just
01:25:53.380 so off
01:25:54.000 base.
01:25:55.040 Because
01:25:55.500 people said
01:25:56.100 he was
01:25:56.460 aiding
01:25:56.900 Russia.
01:25:59.200 In what
01:25:59.960 world
01:26:00.460 is that
01:26:02.760 aiding
01:26:03.060 Russia?
01:26:03.880 Because all
01:26:04.460 that happened
01:26:04.920 was Ukraine
01:26:05.620 was going
01:26:06.020 to do
01:26:06.260 whatever
01:26:06.580 they were
01:26:06.900 going to
01:26:07.100 do anyway
01:26:07.540 because they
01:26:07.900 were winning.
01:26:08.260 There was
01:26:09.400 nothing that
01:26:09.880 Elon Musk
01:26:10.420 says that
01:26:11.660 was going
01:26:11.980 to stop
01:26:12.360 Ukraine from
01:26:14.680 pressing hard
01:26:15.340 while they're
01:26:15.900 winning.
01:26:17.140 So Ukraine
01:26:17.700 was going
01:26:18.040 to do the
01:26:18.340 same thing
01:26:18.740 Ukraine was
01:26:19.320 going to
01:26:19.520 do no
01:26:19.840 matter what
01:26:20.560 Musk said.
01:26:23.040 But Russia
01:26:24.600 might act
01:26:25.080 differently.
01:26:26.560 And Russia
01:26:26.940 might specifically
01:26:27.820 say, well,
01:26:28.500 if Elon Musk
01:26:29.320 is putting out
01:26:30.020 a peace plan,
01:26:31.700 I guess I
01:26:32.340 don't have to
01:26:32.800 nuke anybody
01:26:33.480 because I've
01:26:35.100 got an
01:26:35.960 exit path.
01:26:37.100 Now,
01:26:37.260 his plan
01:26:38.460 might not
01:26:38.880 be the
01:26:39.160 one,
01:26:39.740 maybe it'll
01:26:40.820 morph,
01:26:41.480 but as
01:26:41.900 long as
01:26:42.300 you're
01:26:42.460 talking
01:26:43.120 about
01:26:43.680 an
01:26:44.060 exit
01:26:44.340 path,
01:26:45.480 you don't
01:26:45.940 nuke.
01:26:47.060 So what
01:26:47.600 Elon Musk
01:26:48.180 did was
01:26:48.980 he put
01:26:50.140 his personal
01:26:50.860 reputation
01:26:51.460 on the
01:26:51.860 line,
01:26:53.220 and he
01:26:53.580 decreased
01:26:54.180 the chance
01:26:54.720 of nuclear
01:26:55.280 war by
01:26:56.440 a lot.
01:26:58.380 And he
01:26:59.080 got fucked
01:27:00.100 over for
01:27:00.740 it.
01:27:02.520 He got
01:27:03.300 shit on
01:27:03.880 for that.
01:27:06.200 And do
01:27:07.000 you think
01:27:07.360 he was
01:27:08.140 doing that
01:27:08.520 like for
01:27:08.920 his personal
01:27:09.660 benefit?
01:27:10.320 I mean,
01:27:11.180 his personal
01:27:11.700 benefit to
01:27:12.240 avoid nuclear
01:27:13.100 war, of
01:27:13.620 course.
01:27:14.600 But you
01:27:15.020 knew that he
01:27:15.940 was going to
01:27:16.340 get shit on,
01:27:17.120 so did he.
01:27:18.580 So did he.
01:27:19.640 He did that
01:27:20.320 for you.
01:27:21.620 He did that
01:27:22.400 for you,
01:27:22.820 and for you
01:27:23.860 collectively.
01:27:25.160 And then the
01:27:26.220 people he was
01:27:26.860 trying to help
01:27:27.480 from nuclear
01:27:28.520 annihilation just
01:27:30.760 went on to
01:27:31.300 Twitter and
01:27:31.680 shit on him.
01:27:33.280 And I
01:27:33.920 guess I
01:27:34.220 felt that.
01:27:36.020 Of course
01:27:36.980 that happens
01:27:37.380 to me every
01:27:37.800 day.
01:27:38.200 Every time I
01:27:38.780 try to do
01:27:39.180 anything useful,
01:27:40.260 somebody shits
01:27:40.800 on me.
01:27:42.060 And I
01:27:43.780 don't think it
01:27:44.340 bothered me as
01:27:45.040 much when it
01:27:45.520 happened to
01:27:45.900 me.
01:27:46.960 But when I
01:27:47.600 saw it happen
01:27:48.100 to him,
01:27:49.520 in this
01:27:50.020 specific example,
01:27:51.840 I just
01:27:52.200 thought,
01:27:52.800 shit,
01:27:53.520 people suck.
01:27:55.140 People
01:27:55.500 really suck.
01:27:57.660 They really,
01:27:58.400 really do.
01:27:58.800 Because there's
01:28:01.600 nothing more
01:28:02.320 obvious that he's
01:28:03.300 trying to help.
01:28:04.840 And the people
01:28:05.320 saying, you know,
01:28:06.100 stick to your
01:28:06.640 lane because you
01:28:07.340 don't understand
01:28:07.940 it, who are
01:28:09.820 you talking
01:28:10.360 about?
01:28:11.800 Elon Musk is
01:28:12.600 the person who
01:28:13.180 should never
01:28:13.760 stay in his
01:28:14.500 lane.
01:28:16.020 Elon Musk is
01:28:16.780 the person who
01:28:17.300 should say,
01:28:18.060 hey, here's
01:28:18.660 another lane.
01:28:20.620 Have you
01:28:21.080 thought of this
01:28:21.600 lane?
01:28:22.180 Because nobody's
01:28:23.460 doing anything in
01:28:24.080 this lane, maybe
01:28:24.740 you could do that
01:28:25.320 one too.
01:28:26.220 Because those
01:28:26.780 other lanes came
01:28:27.520 out pretty well.
01:28:28.800 All right.
01:28:37.460 Crabs want to
01:28:38.300 drag you back
01:28:38.920 into the
01:28:39.260 bucket.
01:28:43.200 Affirmation
01:28:43.760 time.
01:28:44.940 So I'll
01:28:45.500 tell you what
01:28:45.960 is giving me
01:28:47.240 enthusiasm lately.
01:28:49.520 So here's a
01:28:50.380 little inside
01:28:51.120 book writing
01:28:51.880 stuff.
01:28:53.800 People always
01:28:54.240 ask me about
01:28:54.800 the process
01:28:55.320 of writing
01:28:55.800 books.
01:28:56.720 For some
01:28:57.320 reason, it's
01:28:58.040 of great
01:28:58.440 interest to
01:28:58.940 people, even
01:28:59.460 if they
01:28:59.780 don't write
01:29:00.240 books.
01:29:01.520 And I can
01:29:02.320 kind of see
01:29:02.760 that.
01:29:03.400 So I'll
01:29:03.600 give you a
01:29:03.900 little insight
01:29:04.480 into it.
01:29:05.400 So my
01:29:06.000 process has
01:29:06.980 very much a
01:29:07.940 way up, way
01:29:08.820 down, way
01:29:09.460 up, way
01:29:09.860 down path.
01:29:11.360 And it
01:29:11.640 always does.
01:29:12.600 And it
01:29:13.040 goes like
01:29:13.440 this.
01:29:14.120 I've got this
01:29:14.780 great idea for
01:29:15.640 a book.
01:29:16.400 This will be
01:29:16.820 so amazing.
01:29:17.900 And I'll
01:29:18.200 write some
01:29:18.640 notes.
01:29:19.100 This is
01:29:19.400 amazing.
01:29:20.220 And I'll
01:29:20.580 start writing
01:29:21.160 the actual
01:29:21.640 chapters.
01:29:23.120 And I'll
01:29:23.540 look it
01:29:23.840 over after
01:29:24.340 a while.
01:29:24.720 I'll be
01:29:24.840 like, oh,
01:29:26.080 that wasn't
01:29:26.660 as good as
01:29:27.220 I hoped.
01:29:28.960 And then I'll
01:29:29.780 write another
01:29:30.380 chapter and
01:29:30.920 I'll nail it.
01:29:31.960 I'll be like,
01:29:32.380 this is amazing.
01:29:33.220 This is amazing.
01:29:34.720 Anyway, that
01:29:35.340 keeps going all
01:29:36.380 the way to the
01:29:36.900 end.
01:29:37.760 And it never
01:29:38.220 stops.
01:29:39.440 Even right up to
01:29:40.300 publication date,
01:29:41.020 it never stops.
01:29:43.100 So there'll be
01:29:43.760 days I wake up
01:29:44.540 and I just hate
01:29:45.060 it.
01:29:45.940 And there are
01:29:46.220 days I wake
01:29:46.800 up like
01:29:47.240 yesterday.
01:29:48.300 So yesterday
01:29:49.060 I was in
01:29:49.540 Starbucks and
01:29:50.400 every now and
01:29:50.960 then I'll just
01:29:51.480 reread what
01:29:52.420 I've written.
01:29:53.580 Because I
01:29:54.260 need to
01:29:54.840 sort of
01:29:56.220 resaturate
01:29:58.760 myself with
01:29:59.500 what I've
01:29:59.880 already done
01:30:00.420 before I can
01:30:01.840 make sure that
01:30:02.420 the next thing
01:30:02.960 is additive.
01:30:03.980 So you have
01:30:04.620 to keep
01:30:04.900 rereading your
01:30:05.620 book that
01:30:06.020 you've written
01:30:06.380 so far.
01:30:07.140 So I did
01:30:07.780 that yesterday
01:30:08.320 and it's
01:30:11.380 fucking awesome.
01:30:14.800 Honestly,
01:30:15.740 it's fucking
01:30:16.940 awesome.
01:30:18.320 The number
01:30:18.980 of lives
01:30:19.520 this thing's
01:30:20.040 going to
01:30:20.280 change is
01:30:21.820 crazy.
01:30:23.440 And I
01:30:24.120 didn't realize
01:30:24.680 it until I
01:30:25.220 sort of saw
01:30:26.240 the whole thing,
01:30:26.640 because I
01:30:27.400 end up
01:30:27.780 micro-focusing
01:30:28.820 on a little
01:30:29.260 chapter.
01:30:31.060 It's crazy.
01:30:32.760 Now, here's
01:30:33.520 the other
01:30:33.760 thing that's
01:30:34.520 relevant to
01:30:35.240 this commercial.
01:30:36.080 This commercial.
01:30:38.020 It is kind
01:30:38.480 of a commercial.
01:30:39.620 Relevant to
01:30:40.280 the conversation.
01:30:42.280 I am going
01:30:43.280 to get
01:30:43.840 killed
01:30:45.120 for this
01:30:46.320 book.
01:31:46.640 The critics
01:30:48.320 are going
01:30:48.740 to destroy
01:30:49.540 me.
01:30:50.360 Do you
01:30:50.520 know why?
01:30:51.540 Guess why?
01:30:53.000 Why will I
01:30:54.000 be criticized
01:30:54.620 for this
01:30:55.240 book just
01:30:55.680 massively?
01:30:57.260 Take a
01:30:57.740 guess.
01:30:58.000 You're
01:31:02.460 almost there.
01:31:03.060 How are you
01:31:07.560 missing this?
01:31:10.600 I thought I
01:31:11.340 teed this up
01:31:11.920 for you a
01:31:12.260 little bit
01:31:12.540 better than
01:31:12.920 this.
01:31:13.660 The answer
01:31:14.180 is right in
01:31:14.740 front of you
01:31:15.180 because I
01:31:17.060 left my lane.
01:31:17.780 I left my
01:31:19.020 lane.
01:31:20.320 Because reframes
01:31:21.220 are really
01:31:21.600 psychology.
01:31:22.300 I'm not a
01:31:22.660 psychologist.
01:31:23.560 I'm a
01:31:23.900 hypnotist.
01:31:24.940 And nobody's
01:31:25.580 going to say,
01:31:26.160 listen to the
01:31:26.940 hypnotist.
01:31:27.520 They're going
01:31:27.720 to say,
01:31:28.100 he's a
01:31:28.440 cartoonist.
01:31:29.460 They're going
01:31:29.940 to say,
01:31:30.220 why is this
01:31:30.740 cartoonist
01:31:31.460 telling me
01:31:32.260 psychology?
01:31:33.960 Right?
01:31:34.640 And they're
01:31:35.200 going to say,
01:31:35.560 I'm damaging
01:31:36.080 people and
01:31:37.000 I'm oversimplifying
01:31:38.100 and I should
01:31:39.140 stay in my
01:31:39.700 lane and when
01:31:40.620 I was just
01:31:40.980 being a
01:31:41.360 cartoonist it
01:31:42.060 made sense.
01:31:42.680 And then
01:31:43.480 everybody's
01:31:43.920 going to say,
01:31:44.900 that reframe
01:31:47.200 is going to
01:31:47.580 hurt somebody.
01:31:49.480 Right?
01:31:49.940 I don't think
01:31:50.580 any of them
01:31:51.000 will hurt
01:31:51.320 anybody.
01:31:52.100 But somebody's
01:31:52.740 going to say,
01:31:53.100 well that one,
01:31:53.740 yeah, that's
01:31:54.380 reckless.
01:31:55.820 Right?
01:31:56.240 I am going to
01:31:57.020 get absolutely
01:31:57.820 destroyed.
01:32:00.740 But,
01:32:01.880 it's going to
01:32:02.660 really help a
01:32:03.380 lot of people.
01:32:04.960 Now I'm going
01:32:05.460 to share it
01:32:05.840 with some of
01:32:06.200 the people on
01:32:06.780 the locals
01:32:07.560 community.
01:32:08.860 I'm going to
01:32:09.360 run through
01:32:09.980 some of the
01:32:10.400 reframes in the
01:32:11.060 book just to
01:32:11.560 give you a
01:32:11.880 sense of it.
01:32:12.340 Give some
01:32:12.600 feedback.
01:32:13.620 It might
01:32:13.880 help me tune
01:32:15.600 it a little
01:32:16.000 bit.
01:32:16.920 But, my
01:32:17.760 God, it's
01:32:18.180 going to be
01:32:18.620 really good.
01:32:19.120 It will be
01:32:19.460 the biggest
01:32:19.760 thing I've
01:32:20.140 done.
01:32:21.700 So, if
01:32:24.400 you're willing
01:32:24.840 to believe
01:32:25.300 that, it
01:32:26.080 should be the
01:32:26.640 biggest thing
01:32:27.220 I've done.
01:32:29.000 I would be
01:32:29.760 amazed if it's
01:32:30.720 not a national
01:32:31.580 phenomenon, if
01:32:33.400 not international.
01:32:35.120 Because here's
01:32:35.820 the trick.
01:32:36.600 It's not
01:32:37.220 enough to say
01:32:38.240 something good
01:32:39.120 and useful.
01:32:40.660 Because lots of
01:32:41.500 books do that.
01:32:42.180 And that's
01:32:44.480 what I've
01:32:44.720 done before.
01:32:45.620 I've said
01:32:46.020 something that
01:32:46.480 was entertaining
01:32:47.180 or useful or
01:32:49.060 both, and
01:32:49.960 those will sell
01:32:50.460 a lot of
01:32:50.760 books because
01:32:51.520 people like to
01:32:52.100 be entertained.
01:32:53.200 They like
01:32:53.480 useful stuff.
01:32:54.820 But the
01:32:55.480 extra gear
01:32:56.900 that I've
01:32:58.300 never hit,
01:32:59.580 that some
01:33:00.100 people have,
01:33:01.560 other authors
01:33:02.120 have, is
01:33:03.460 where the
01:33:04.340 usefulness of
01:33:05.800 it is just
01:33:07.200 extreme.
01:33:07.720 And this
01:33:09.920 is going to
01:33:10.200 have probably
01:33:10.800 80 to 100
01:33:11.740 reframes in
01:33:12.580 it, of
01:33:13.720 which, for
01:33:14.680 most people,
01:33:15.380 at least 10
01:33:15.920 of them would
01:33:16.320 be life-changing.
01:33:18.440 So, and they
01:33:20.160 can learn them
01:33:20.700 just by reading
01:33:21.260 them.
01:33:28.320 I'm reading a
01:33:29.300 comment on
01:33:29.880 Locals, which
01:33:31.180 I appreciate.
01:33:32.240 I appreciate that
01:33:33.320 comment.
01:33:33.600 Yes, I've been
01:33:35.580 playing drones.
01:33:39.920 All right.
01:33:44.020 Big question.
01:33:45.260 What's the
01:33:45.540 difference between
01:33:46.260 God and
01:33:46.900 intelligence and
01:33:49.100 the simulation?
01:33:50.180 The difference
01:33:50.820 is that God
01:33:53.460 is usually
01:33:54.100 defined as
01:33:54.800 one entity.
01:33:56.640 And if we
01:33:57.360 are a
01:33:57.680 simulation,
01:33:58.500 we're probably,
01:33:59.900 it's probably a
01:34:00.560 group effort.
01:34:02.020 You know,
01:34:02.200 somebody built
01:34:02.880 the technology
01:34:03.600 to build the
01:34:04.700 simulation.
01:34:05.580 So it'd be
01:34:06.060 one versus
01:34:06.640 many.
01:34:07.200 And then the
01:34:07.580 next difference
01:34:08.160 would be that
01:34:09.240 that civilization
01:34:10.400 may have had a
01:34:11.620 higher civilization
01:34:12.960 that created
01:34:13.620 them.
01:34:15.020 So God is
01:34:15.980 one and done.
01:34:17.940 Simulation is
01:34:18.740 infinite
01:34:20.240 iteration.
01:34:23.540 What if we
01:34:24.320 all share one
01:34:25.060 consciousness?
01:34:26.240 That's more of a
01:34:26.880 word-thinking
01:34:27.480 thing.
01:34:27.960 I think it just
01:34:28.620 depends on how
01:34:29.000 you define
01:34:29.380 things.
01:34:29.740 You're wrong
01:34:34.400 about how to
01:34:34.840 program AI.
01:34:35.840 You don't set
01:34:36.220 the rules.
01:34:36.760 You give it a
01:34:37.100 large training
01:34:37.780 set of
01:34:38.320 successful humor.
01:34:39.260 I would do
01:34:39.600 both.
01:34:40.700 So I would
01:34:41.240 give it the
01:34:42.300 rule, but
01:34:43.320 you're right,
01:34:44.000 it would be
01:34:44.280 useless.
01:34:45.080 The rule would
01:34:45.760 be useless
01:34:46.260 without a
01:34:47.120 large training
01:34:48.000 set of humor.
01:34:49.400 And that's
01:34:49.980 definitely part
01:34:50.560 of the plan.
01:34:51.080 Midterm
01:35:06.800 predictions.
01:35:08.600 I may stay
01:35:09.580 away from
01:35:09.980 midterm
01:35:10.360 predictions.
01:35:11.700 Midterms could
01:35:12.440 go either way.
01:35:14.100 They really
01:35:14.480 could.
01:35:15.200 I think it's
01:35:15.620 a toss-up.
01:35:18.560 I can
01:35:19.660 definitely see
01:35:20.660 Democrats
01:35:21.580 sweeping.
01:35:23.260 I can see
01:35:24.080 it.
01:35:24.780 I'm not
01:35:25.260 predicting it,
01:35:26.340 but it's
01:35:27.080 completely within
01:35:28.060 the realm.
01:35:30.860 Is
01:35:31.100 reframing an
01:35:31.780 analogy?
01:35:32.420 No.
01:35:33.160 Very much
01:35:33.680 not one.
01:35:41.200 AI will be
01:35:42.200 too smart to
01:35:42.900 find humans
01:35:43.500 funny.
01:35:44.480 That could
01:35:45.260 be.
01:35:45.500 I think
01:35:50.140 Fetterman
01:35:50.660 will beat
01:35:51.160 Oz
01:35:51.640 because that's
01:35:53.760 the funnier
01:35:54.280 outcome.
01:35:55.600 So I'm
01:35:55.900 going to go
01:35:56.200 with the
01:35:56.460 funnier
01:35:56.760 outcome.
01:35:57.560 All right,
01:35:57.920 that's all
01:35:58.220 for now,
01:35:59.060 and I will
01:35:59.640 talk to you
01:36:00.140 later.
01:36:00.500 Thank you.