Real Coffee with Scott Adams - April 05, 2022


Episode 1704 Scott Adams: What Elon Musk Will Do At Twitter, Russian Atrocities, And Other Stuff


Episode Stats

Length: 1 hour
Words per Minute: 146.7
Word Count: 8,820
Sentence Count: 693
Misogynist Sentences: 12
Hate Speech Sentences: 42


Transcript

00:00:01.000 One moment. It looks like my printer might work after all. Hold on. This will be exciting.
00:00:16.000 Well, I have to tell you that, like the technical stud I am, I changed my mesh network yesterday,
00:00:28.000 which means that all my devices don't know what my network is, but they're learning.
00:00:35.280 So, today we'll be going for the backup plan, but not until the simultaneous sip.
00:00:41.680 And all you need for that is a cup or a mug or a glass, a tankard, chalice, or stein,
00:00:43.860 a canteen, jug, or flask, a vessel of any kind.
00:00:48.480 Fill it with your favorite liquid. I like coffee.
00:00:52.160 And join me now for the unparalleled pleasure.
00:00:54.820 It's the dopamine hit of the day, and it's called the simultaneous sip.
00:00:59.380 It's going to happen now. Go.
00:01:05.700 Oh, yeah. Oh, yeah.
00:01:09.200 Let's take a vote.
00:01:10.680 How many of you would like to see me raise my printer above my head
00:01:13.940 and throw it vigorously onto the ground on the tile floor,
00:01:17.780 breaking into a million shards,
00:01:22.340 which I will then have to walk through with my bare feet?
00:01:29.840 No? I'm seeing lots of no's.
00:01:31.860 This actually tells you a lot about my audience.
00:01:40.260 Half of you want me to destroy an expensive piece of equipment in front of your eyes,
00:01:44.980 and half of you say, no, we don't want to start the morning that way.
00:01:49.500 Well, let's see.
00:01:51.560 Let's leave it to fate.
00:01:55.640 Oh, good.
00:01:56.260 My backup plan actually works, so the printer has one more chance.
00:02:02.660 Now, I'm not going to blame the printer this time, because it could be user error.
00:02:06.720 Normally, you have to teach the technology how to behave,
00:02:11.860 because that really sets the lesson for the other technology.
00:02:15.880 You know, a lot of you believe that if you break one piece of technology
00:02:19.240 in front of the other devices, it has no effect,
00:02:24.000 because you think to yourself, well, those other devices, they're not as sentient.
00:02:28.100 They have no idea that you've just destroyed another piece of equipment.
00:02:32.140 How is that going to affect them?
00:02:34.540 Well, you don't understand how the simulation works.
00:02:38.200 The simulation will rewrite itself if you make it.
00:02:44.080 All right.
00:02:45.520 Here's a story that I think is interesting.
00:02:47.480 So, Michael Schellenberger, as you know, is running for governor,
00:02:50.340 and interestingly, he's running as an independent.
00:02:54.040 Now, the first thing you should say to yourself is,
00:02:56.980 independents never win.
00:02:59.000 Am I right?
00:03:00.200 I mean, they have, Jesse Ventura being one, but it's very unusual.
00:03:04.140 Now, have you ever noticed that there's a theme or a commonality
00:03:10.920 to independent candidates?
00:03:14.440 What is it that independent candidates generally have in common with each other?
00:03:21.360 In the comments, what do you know?
00:03:23.260 What do they usually have in common with each other?
00:03:27.360 They're not very good candidates.
00:03:28.980 Can you tell me one, give me an example of one independent candidate
00:03:36.420 who had the full stack?
00:03:41.420 Well, Jesse Ventura came pretty close, and he got elected.
00:03:46.020 And, yeah, Trump.
00:03:47.240 Well, you could say that Trump was sort of an outlier.
00:03:49.440 So here's the pattern you should look at.
00:03:52.460 If you're looking at the pattern that independents don't win,
00:03:56.700 that's the wrong pattern.
00:03:58.680 The pattern you should look at is that good candidates always win.
00:04:02.980 That's the pattern.
00:04:04.620 The best candidate almost always wins.
00:04:08.580 Or at least they're, you know, neck and neck.
00:04:11.160 They might lose by 1%.
00:04:12.860 But the best candidate doesn't come in last.
00:04:16.420 It doesn't matter what party they run in.
00:04:18.300 The best candidate is going to make a big showing.
00:04:20.780 So I think what's different about Schellenberger is
00:04:24.100 he's the first full stack political candidate I've ever seen.
00:04:30.560 Now, do you know what I mean by a full stack?
00:04:33.520 It comes from the programming world.
00:04:35.880 There are programmers who know not just one programming language,
00:04:40.900 but they might know everything from, you know,
00:04:43.280 the hardware to user interfaces.
00:04:46.000 Just every component of development.
00:04:51.080 If somebody has all of those skills, which is sort of rare,
00:04:54.240 finding a, quote, full stack developer is sort of like finding a unicorn.
00:05:00.460 But if you find one, they're worth their weight in gold because they're so rare.
00:05:05.640 Now, in the political world, what would a full stack candidate look like?
00:05:10.400 Not Ross Perot.
00:05:11.360 Ross Perot had the problem that he sounded like, I don't know,
00:05:18.880 a small rodent who'd been caught in some kind of a deadly trap.
00:05:26.680 Right?
00:05:28.600 He was short.
00:05:30.460 He was unattractive.
00:05:32.200 And he had a weird voice.
00:05:34.900 He wasn't really a full stack candidate.
00:05:36.900 He was just a candidate who had some solidly good ideas.
00:05:39.600 In retrospect, they were good ideas.
00:05:42.420 And now take Schellenberger.
00:05:45.980 Good looking.
00:05:47.960 Perfect hair.
00:05:50.760 And he's the only person I know, candidate-wise,
00:05:54.440 who's coming into the process,
00:05:57.380 having never been a political official,
00:06:00.820 he's coming into the process with books full of specific solutions
00:06:05.060 for the biggest problems in California.
00:06:06.960 He's actually looked at all the things you can do
00:06:10.180 and figured out what are the good ones.
00:06:13.000 Who the hell else has ever done that?
00:06:15.660 I mean, seriously.
00:06:17.340 Who else can communicate so well
00:06:19.580 that they can write best-selling books all day long,
00:06:23.080 can understand the technology
00:06:25.680 and the complexity of very complicated things,
00:06:28.660 and simplify it to the point
00:06:30.980 where the public can understand it perfectly,
00:06:32.740 and knows the most important policies or issues?
00:06:39.120 He's not spending his time...
00:06:40.360 I don't think I've heard him speak...
00:06:42.800 Correct me if I'm wrong.
00:06:44.360 All right?
00:06:44.700 Give me a fact check.
00:06:46.700 Have you ever heard Michael Schellenberger
00:06:48.800 weigh in on any wokeness stuff?
00:06:52.360 Not pro or con, right?
00:06:54.500 I mean, I'm sure he'll have to answer it
00:06:56.480 because he's a candidate.
00:06:57.140 But I feel like it's his lowest priority
00:06:59.860 because it doesn't seem to be anywhere
00:07:02.580 near the top of anything he talks about.
00:07:06.480 Can you make...
00:07:07.520 If you were going to design a perfect candidate on paper,
00:07:11.340 this is the first...
00:07:13.180 I'm going to say this,
00:07:14.480 and I don't know that it's true,
00:07:16.500 but if anybody has a better example of it,
00:07:18.480 I'd love to hear it.
00:07:20.040 My contention is that Schellenberger
00:07:22.600 is the first full-stack candidate.
00:07:27.260 The first one.
00:07:28.920 He's even a full-stack in terms of
00:07:30.900 a little bit Democrat and a little bit Republican,
00:07:34.060 meaning that he was a registered Democrat for years,
00:07:37.520 decided that some of their solutions
00:07:39.080 didn't kind of work,
00:07:40.860 wasn't really willing to be a Republican,
00:07:43.420 but clearly understands
00:07:44.840 that some of the Republican opinions make sense,
00:07:47.420 nuclear energy being chief among them.
00:07:50.100 So where else have you ever gotten a candidate
00:07:54.360 who had everything from full detailed policy
00:07:58.520 understanding, to the ability to communicate it,
00:08:01.280 to the physicality of a candidate,
00:08:04.200 because people are influenced by just how tall you are
00:08:07.560 and how your hair looks.
00:08:09.600 It's just true.
00:08:10.660 I mean, I'm not saying it's a good thing.
00:08:12.500 It's just true.
00:08:15.120 Yeah, so the only thing he lacks
00:08:17.320 is maybe finance, finances.
00:08:20.300 Because that's the problem being independent.
00:08:23.040 But who are we talking about?
00:08:25.440 Who's the other person running for governor?
00:08:29.800 Name the other person running for governor.
00:08:32.220 Any other person.
00:08:33.980 Name one other person running for governor in California.
00:08:38.120 Am I right?
00:08:39.540 Does he need money?
00:08:41.880 Does he need money
00:08:42.880 if you don't even know the name
00:08:43.940 of the person he's running against
00:08:45.140 or names of people?
00:08:47.820 Is it Newsom?
00:08:49.420 You don't know that, do you?
00:08:51.660 Are you sure Newsom's going to run for governor
00:08:53.520 and not president?
00:08:56.200 I don't think we know any of that.
00:08:59.060 But if it came down to,
00:09:01.300 let's say,
00:09:01.880 Newsom versus Schellenberger,
00:09:03.800 would he need money?
00:09:06.640 Because the publicity alone would be insane.
00:09:09.100 I mean, it's the perfect matchup
00:09:10.940 of the sort of empty suit governor
00:09:13.480 versus the full-stack candidate.
00:09:16.320 Don't you want to see the empty suit
00:09:18.380 versus the full-stack?
00:09:20.300 Everybody wants to see that.
00:09:22.040 That's like the best show ever.
00:09:24.100 I would watch their debates
00:09:25.940 like I would watch entertainment.
00:09:28.760 To me, that would just be like
00:09:29.760 an entertaining show.
00:09:30.580 All right, Biden says
00:09:33.840 now he wants to try
00:09:35.060 Putin as a war criminal,
00:09:37.040 taking it to the next level.
00:09:38.380 Not just saying he is one, sort of,
00:09:40.340 but getting serious about it.
00:09:43.700 Now, I saw Joel Pollack's opinion
00:09:46.300 that that could be making things worse
00:09:49.700 if you ever hope to have
00:09:51.540 a negotiated outcome.
00:09:54.180 It's pretty hard to have
00:09:55.020 a negotiated outcome
00:09:55.920 with somebody you've labeled
00:09:57.120 as a war criminal
00:09:58.020 that you plan to put on trial.
00:10:01.620 Now, there's no real risk
00:10:02.900 that Putin will be on trial
00:10:05.200 because I don't think
00:10:06.300 Russia buys into the process
00:10:09.220 any more than the United States does.
00:10:11.500 But it's good persuasion
00:10:15.040 unless it causes Putin to dig in.
00:10:20.660 But here's the real question.
00:10:23.120 Isn't Putin already all in?
00:10:25.680 Is there any more in he could be?
00:10:28.020 I mean, he's fighting for his life
00:10:29.540 in a sense.
00:10:31.680 And I think Putin's all in.
00:10:33.860 I think he's just going to do
00:10:34.880 whatever it takes.
00:10:36.140 And as long as it takes
00:10:37.180 and however many people it kills.
00:10:39.400 And I don't think that
00:10:40.520 anything will happen to him
00:10:41.880 war crime-wise.
00:10:43.420 So this is the sort of thing
00:10:45.640 you could imagine
00:10:46.500 would move the needle
00:18:47.520 and make him a lot more dangerous.
00:10:50.680 But my personal view
00:10:52.560 is that Putin's already
00:10:53.880 a 10 out of 10
00:10:54.900 for danger.
00:10:57.320 I don't think he's going to go to nuclear
00:10:59.220 because that doesn't work for him.
00:11:01.840 And he's still winning.
00:11:03.740 Winning in this sense.
00:11:06.460 I think that Putin
00:11:07.360 will still be in charge of Russia
00:11:08.860 when this is all done.
00:11:11.060 Russia's economy
00:11:11.960 might be quite degraded by then.
00:11:14.820 But I'll bet his popularity
00:11:16.060 will be high.
00:11:17.440 His personal life will be great.
00:11:19.420 He'll be seen as a hero eventually,
00:11:21.240 if not right away,
00:11:22.020 because he can control the media.
00:11:25.300 So to me it looks like
00:11:26.960 he's nowhere near losing.
00:11:29.560 So I'm not too worried
00:11:30.680 about the war crimes thing
00:11:31.800 because all that really does
00:19:33.980 is say it's going to be hard
00:11:34.900 for him to travel, I guess,
00:11:36.580 and get any respect in the world.
00:11:39.000 I do think that Putin is done
00:11:40.580 as a respected leader,
00:11:42.560 if he ever was.
00:11:43.600 I suppose that's a matter of opinion.
00:11:46.200 But don't you think
00:11:46.880 that he will no longer
00:11:48.200 be invited to anything
00:11:50.180 unless they need his energy,
00:11:52.820 in which case,
00:11:53.940 if they need his energy,
00:11:54.980 then they'll go right back
00:11:55.740 to normal as soon as they can.
00:11:57.280 Yeah, yeah, we'll buy your energy.
00:12:01.120 Well, speaking of Russian atrocities,
00:12:05.080 so it does appear
00:12:06.400 there's more and more evidence
00:12:07.920 that the Russians
00:12:08.540 are committing war crimes.
00:12:11.660 But let me ask you this.
00:12:14.540 Name the war
00:12:15.460 that didn't have any war crimes
00:12:17.000 on both sides.
00:12:19.020 Do you think
00:12:19.380 there's ever been one?
00:12:21.100 Do you believe
00:12:21.900 there's ever been a war,
00:12:23.860 like a serious war,
00:12:25.560 where there weren't war crimes
00:12:26.880 on both sides?
00:12:29.320 Do you remember
00:12:29.940 all the war crimes
00:12:30.880 that the Americans
00:12:31.620 committed in World War II?
00:12:34.760 You don't, do you?
00:12:37.500 I can tell you
00:12:38.680 that I personally
00:12:39.780 spoke to someone
00:12:42.380 who's no longer with us.
00:12:44.600 You know, he's passed away.
00:12:45.460 But I personally
00:12:46.580 spoke to someone
00:12:47.340 who was involved
00:12:48.160 in a war crime,
00:12:50.400 a big one,
00:12:51.680 where they lined up
00:12:52.860 the Germans
00:12:53.580 and just killed them all.
00:12:56.240 Out of revenge,
00:12:57.140 they were mad.
00:12:58.880 That really happened.
00:13:00.500 Now, I'm not going
00:13:00.960 to give you the details
00:13:01.680 because I don't want you
00:13:02.640 to know who it was.
00:13:03.860 But a real person
00:13:05.440 who was really there
00:13:06.400 during the push
00:13:09.980 across Europe
00:13:12.100 to end Germany's run,
00:13:14.840 so he was there.
00:13:16.060 He was in the middle
00:13:16.620 of the action
00:13:17.100 and he was personally
00:13:18.220 involved in a massacre
00:13:19.820 of captured German soldiers
00:13:22.460 at the end of the war.
00:13:24.540 It was the end of the war.
00:13:26.520 And they killed them anyway.
00:13:28.620 Now, did you ever hear
00:13:29.420 about that war crime?
00:13:31.760 No.
00:13:32.400 Because do you know why?
00:13:34.400 Do you know why
00:13:35.100 the Americans
00:13:35.780 murdered the Germans?
00:13:37.740 Because they discovered
00:13:40.080 that the Germans
00:13:40.720 had committed a war crime.
00:13:43.400 They found literally
00:13:45.880 hot evidence,
00:13:47.060 meaning the bodies
00:13:47.760 were still warm.
00:13:49.420 So the Germans
00:13:50.340 had just massacred
00:13:51.260 a bunch of captured prisoners.
00:13:53.720 And then the Americans
00:13:54.820 captured the people
00:13:55.760 who had just murdered
00:13:56.500 the captured American prisoners.
00:13:58.800 How do you think that went?
00:14:01.140 The bodies were still fresh.
00:14:02.660 They had just been murdered.
00:14:06.160 And that's when
00:14:06.680 the Americans
00:14:07.320 came upon the situation
00:14:09.160 and captured all the Germans.
00:14:12.080 Do you know the
00:14:12.900 most chilling sentence
00:14:17.060 I've ever heard?
00:14:18.760 It came from this individual.
00:14:21.120 And I asked him one time,
00:14:22.380 in World War II,
00:14:23.800 did you take prisoners?
00:14:25.500 Because to me,
00:14:26.120 it seemed unlikely.
00:14:27.860 To me, it seemed that
00:14:28.680 taking prisoners
00:14:29.620 when you're marching
00:14:30.820 across Europe
00:14:31.680 would slow you down,
00:14:33.260 wouldn't it?
00:14:34.460 And why would they
00:14:35.340 want to be slowed down?
00:14:37.060 My assumption was
00:14:38.020 they just murdered
00:14:38.680 all their prisoners.
00:14:41.660 I don't know that.
00:14:43.420 But I'm just thinking
00:14:44.180 from a practical perspective,
00:14:45.840 they probably murdered
00:14:46.980 their prisoners.
00:14:48.500 And so I asked him
00:14:49.560 about his experience
00:14:50.880 and I said,
00:14:51.580 you know,
00:14:51.820 did you actually capture
00:14:53.400 Germans
00:14:54.360 and then take care of them
00:14:55.800 and like everything was fine?
00:14:58.380 And he said,
00:14:59.040 oh, yeah,
00:14:59.480 yeah, yeah, yeah.
00:15:00.200 He told the story
00:15:01.180 about capturing
00:15:02.460 one particular German soldier
00:15:04.260 and the German soldier
00:15:05.980 was friendly, basically,
00:15:08.300 because he was caught
00:15:09.320 and, you know,
00:15:11.500 the war was winding down
00:15:13.340 and he wasn't posing
00:15:15.040 any danger
00:15:15.640 and they just had
00:15:17.040 sort of a friendly
00:15:17.840 interaction with him.
00:15:19.920 Right?
00:15:20.540 I think he was an officer.
00:15:22.020 But they had a somewhat
00:15:23.360 professional interaction,
00:15:25.600 just took him prisoner.
00:15:26.580 He said, yeah,
00:15:27.020 yeah, no problem.
00:15:28.020 We just took him prisoner.
00:15:29.480 There was nothing.
00:15:31.540 Then I kept asking him
00:15:32.980 because I was a little skeptical
00:15:34.360 that they really took prisoners.
00:15:36.800 And then he told me this story
00:15:38.480 about liberating
00:15:40.440 a German prisoner
00:15:41.860 of war camp
00:15:44.220 and they looked for the prisoners
00:15:45.220 and they couldn't find them.
00:15:46.600 And they said,
00:15:47.360 where are all the prisoners?
00:15:48.260 And finally,
00:15:49.660 somebody pointed to a train.
00:15:51.900 I guess there was a train
00:15:53.100 that went through the,
00:15:54.000 or next to the camp.
00:15:55.260 And they opened up the doors
00:15:57.480 of the train
00:15:58.800 and all of the
00:16:01.660 recently machine-gunned
00:16:03.140 prisoners fell out dead.
00:16:06.580 They had put them in the trains
00:16:08.060 to take them away
00:16:09.120 because the, you know,
00:16:10.520 the American forces were coming.
00:16:12.420 They were going to remove them,
00:16:13.580 but they didn't have time.
00:16:15.080 So instead,
00:16:16.840 they machine-gunned them.
00:16:17.820 And so,
00:16:21.300 as the person I was talking to
00:16:22.560 described this,
00:16:23.340 he said,
00:16:23.960 and we opened up,
00:16:24.720 you know,
00:16:24.920 the train cars
00:16:25.800 and the bodies of the Americans
00:16:27.280 just fell out.
00:16:28.700 They'd been machine-gunned.
00:16:30.420 And then I heard
00:16:31.460 the most chilling sentence
00:16:33.120 I've ever heard.
00:16:35.280 And he looked at me
00:16:36.020 and he said,
00:16:37.380 that day,
00:16:38.260 we didn't take prisoners.
00:16:42.360 That day,
00:16:43.400 we didn't take prisoners.
00:16:45.980 That was World War II.
00:16:47.820 You know,
00:16:48.420 the good one
00:16:48.900 where everybody was awesome
00:16:50.100 and there was good and evil
00:16:51.480 and we were the good guys
00:16:53.280 and they were the bad guys.
00:16:55.640 It's war.
00:16:57.180 It's fucking war.
00:16:58.880 Right?
00:16:59.700 There's no such thing
00:17:00.680 as an atrocity in a war
00:17:02.000 because it's all atrocity.
00:17:04.100 Here's something
00:17:04.640 that really fucking pisses me off.
00:17:07.820 That you can watch scenes
00:17:08.900 of adult men
00:17:10.320 ripped apart by shrapnel
00:17:11.960 and God knows what
00:17:13.020 and you go,
00:17:14.020 oh, that's just war.
00:17:15.220 That's just war.
00:17:15.840 But then you see
00:17:16.880 a civilian
00:17:17.880 dead on the street.
00:17:19.620 That's an atrocity.
00:17:21.280 That's an atrocity.
00:17:22.660 No,
00:17:23.120 it's all fucking atrocity.
00:17:25.200 It's atrocity
00:17:25.980 from top to bottom.
00:17:27.200 There's no non-fucking atrocities
00:17:29.600 in war.
00:17:31.060 We just have people
00:17:32.080 that we don't give a shit about.
00:17:34.440 It's called adult men.
00:17:36.520 We don't give a shit
00:17:37.900 about adult men.
00:17:39.700 No matter how many of them die,
00:17:41.680 we say,
00:17:42.180 well,
00:17:42.400 that was a war.
00:17:43.160 How'd the war turn out?
00:17:44.100 Who won the war?
00:17:46.380 But as soon as somebody
00:17:47.420 who's not an adult
00:17:48.360 fucking man dies,
00:17:50.080 gets a goddamn splinter,
00:17:52.240 it's a fucking war crime.
00:17:55.760 Not cool.
00:17:57.520 Not cool.
00:17:59.180 It's either all war crimes
00:18:00.480 or there's no war crimes.
00:18:01.960 Fuck you
00:18:02.620 if you think
00:18:03.260 that some of them
00:18:03.920 are good deaths
00:18:04.680 and some of them
00:18:05.820 are bad ones.
00:18:07.340 Fuck all of you
00:18:08.160 who think that.
00:18:09.200 Like,
00:18:09.480 I don't even care
00:18:10.380 about the fucking bodies
00:18:11.740 piling up
00:18:12.660 because they're just
00:18:13.580 an example
00:18:14.220 of a larger evil.
00:18:16.880 The reason I don't care
00:18:17.980 is not because
00:18:18.480 I don't have empathy.
00:18:19.400 The opposite.
00:18:20.420 It's the opposite.
00:18:21.640 My empathy
00:18:22.140 is much greater than that
00:18:23.440 because I have an empathy
00:18:24.760 for the fucking soldiers.
00:18:26.720 What about the soldiers?
00:18:28.480 We don't care
00:18:29.200 about the fucking soldiers,
00:18:30.340 right?
00:18:33.300 We talk about,
00:18:34.300 oh,
00:18:34.440 there were a thousand
00:18:35.020 civilian deaths.
00:18:36.280 A tragedy.
00:18:37.440 A tragedy.
00:18:38.460 Probably undercounted.
00:18:39.640 But we're talking
00:18:41.480 about tens of thousands
00:18:42.520 of adult men
00:18:43.740 getting ripped apart.
00:18:45.280 Even the ones
00:18:45.840 who don't die
00:18:46.420 are never going
00:18:46.920 to be the same.
00:18:48.000 The ones who
00:18:48.680 don't even get wounded
00:18:49.560 are never going
00:18:50.120 to be the same.
00:18:51.240 Every fucking man
00:18:52.440 in that battle
00:18:53.060 is destroyed.
00:18:54.920 They're all wounded.
00:18:56.580 All of them.
00:18:57.220 They're all wounded.
00:18:58.660 They're not coming back.
00:19:00.400 They're not coming back
00:19:01.420 the same.
00:19:01.880 And there are millions
00:19:02.540 of them.
00:19:03.680 Fucking millions
00:19:04.540 of them.
00:19:06.560 Yeah,
00:19:06.760 let's talk about
00:19:07.380 the picture
00:19:08.440 of the little toddler
00:19:10.100 who died on the beach.
00:19:11.440 Tragedy.
00:19:12.740 Not minimizing it.
00:19:14.720 But why do we
00:19:15.840 pick and choose?
00:19:17.000 Like, oh,
00:19:17.400 that's an atrocity.
00:19:18.860 That's a war crime.
00:19:20.200 Oh,
00:19:20.440 that's just a battle.
00:19:22.600 That's just a fucking battle.
00:19:24.620 All right,
00:19:24.800 here's what the Soviets
00:19:25.620 have done
00:19:26.120 and the Russians
00:19:26.680 have done,
00:19:27.260 according to Peter Bergen
00:19:28.260 and CNN.
00:19:30.260 These are just
00:19:30.940 some of the things
00:19:31.600 they've done.
00:19:32.100 And the Soviet Union
00:19:34.600 killed over a million
00:19:35.940 Afghan civilians.
00:19:37.040 A million.
00:19:39.840 The Soviet Union
00:19:40.860 killed a million
00:19:41.880 civilians
00:19:42.700 in Afghanistan.
00:19:45.320 Russia's first war
00:19:46.400 in Chechnya
00:19:46.940 in 1994.
00:19:48.660 25,000 civilians
00:19:50.560 died
00:19:51.020 in just two months
00:19:52.220 of fighting
00:19:52.640 in Grozny,
00:19:53.720 the capital.
00:19:54.780 During the second
00:19:55.460 Russian war
00:19:56.040 in Chechnya,
00:19:57.820 Russian soldiers
00:19:58.940 summarily executed
00:20:00.020 at least 38 civilians
00:20:01.420 in Grozny.
00:20:03.020 And on February 5,
00:20:03.940 2000,
00:20:04.600 Russian soldiers
00:20:05.300 summarily executed
00:20:06.300 at least 60 civilians.
00:20:09.020 Now,
00:20:09.200 this,
00:20:09.420 of course,
00:20:09.720 is according
00:20:10.360 to various
00:20:11.600 human rights groups,
00:20:12.880 so I really don't know
00:20:14.860 how to assess
00:20:15.860 the validity of them.
00:20:18.580 But Peter Bergen
00:20:19.600 thinks they're real.
00:20:20.560 It's on CNN.
00:20:21.720 The International
00:20:22.240 Federation of Human Rights
00:20:23.240 found the Russians
00:20:23.900 in Chechnya
00:20:24.540 engaged in summary
00:20:26.300 executions,
00:20:27.340 murders,
00:20:27.760 physical abuse,
00:20:28.860 tortures,
00:20:29.460 et cetera.
00:20:31.700 To which I say,
00:20:33.880 it was a war.
00:20:34.900 It was a war.
00:20:37.540 Do you know
00:20:37.940 who's left out?
00:20:40.140 How about the adult
00:20:41.520 fucking men
00:20:42.320 who died?
00:20:43.780 It's not even
00:20:44.800 a fucking statistic.
00:20:47.480 Not even a statistic.
00:20:49.780 It's not even
00:20:50.420 in the fucking story.
00:20:52.540 Don't even care.
00:20:54.220 Oh,
00:20:54.500 adult men?
00:20:55.420 Yeah,
00:20:55.800 just go die.
00:20:57.340 And we'll make news
00:20:58.380 out of it.
00:20:59.060 We'll get some content
00:21:00.120 out of it.
00:21:00.600 I didn't expect
00:21:06.500 to be so angry
00:21:07.220 this morning,
00:21:07.820 but I think
00:21:08.480 it was my printer.
00:21:09.740 I think my printer
00:21:10.720 got me off
00:21:11.220 to a bad start.
00:21:14.780 So the U.S.,
00:21:16.760 all these financial
00:21:18.680 sanctions stories
00:21:19.720 get a little complicated.
00:21:21.020 Let's see if I can
00:21:21.500 make this simple.
00:21:22.840 But Russia has,
00:21:25.160 you know,
00:21:25.400 lots of rubles
00:21:26.360 which is its own currency,
00:21:27.600 but it also has
00:21:28.900 to hold on
00:21:29.440 to a reserve
00:21:30.000 of American dollars
00:21:31.380 because some things
00:21:33.260 need to be paid
00:21:34.120 for in American dollars.
00:21:35.820 One of those things
00:21:36.800 is a whole bunch
00:21:37.360 of debt
00:21:37.920 that Russia has
00:21:40.460 and America
00:21:41.480 somehow,
00:21:42.480 with their financial
00:21:43.300 engineering,
00:21:44.680 made it impossible
00:21:45.540 for them to repay
00:21:46.780 their debt
00:21:47.220 with dollars
00:21:47.820 because I guess
00:21:49.340 we somehow
00:21:50.600 blocked their access
00:21:53.080 to the dollars
00:21:53.940 that they owned.
00:21:55.220 Don't know
00:21:55.540 how we did that;
00:21:56.160 through the banking system,
00:21:57.740 presumably.
00:21:59.120 And so that's going
00:22:03.600 to push Moscow
00:22:04.560 into either defaulting
00:22:05.840 on their loans,
00:22:07.520 which is a pretty
00:22:08.180 big problem,
00:22:09.440 or spending down
00:22:11.420 their limited U.S. dollars,
00:22:13.340 which apparently is
00:22:14.000 how they buy weapons.
00:22:15.840 So Russia can't fund
00:22:17.320 its war,
00:22:18.080 not weapons so much,
00:22:19.340 I guess they make
00:22:19.940 their own weapons,
00:22:20.900 but in terms of funding
00:22:21.960 their war,
00:22:22.800 they need U.S. dollars.
00:22:24.460 Is that weird?
00:22:26.160 Do you believe
00:22:27.420 that Russia can't
00:22:28.280 fund their own war
00:22:29.880 when they're a
00:22:31.180 weapons manufacturing
00:22:32.200 country and they've
00:22:33.040 got their own currency,
00:22:34.040 that they can't do it
00:22:35.140 without the dollar?
00:22:36.780 I don't know.
00:22:37.220 I'd like to hear
00:22:37.820 some details on that,
00:22:39.160 but there must be
00:22:39.820 something that they
00:22:40.660 buy internationally
00:22:41.640 that is required
00:22:44.240 to keep the army
00:22:45.040 running.
00:22:46.100 What is that?
00:22:47.180 And who's selling
00:22:47.860 it to them?
00:22:48.660 And why do they
00:22:49.280 only take dollars?
00:22:50.780 I don't know.
00:22:51.160 I have some questions
00:22:52.120 about that.
00:22:52.640 All right,
00:22:55.480 the most important
00:22:56.040 thing that's
00:22:56.440 happening today
00:22:57.000 is, as you know,
00:22:57.740 Elon Musk bought
00:22:58.680 over 9% of Twitter,
00:23:01.180 which makes him
00:23:01.740 by far the biggest
00:23:02.640 stockholder,
00:23:04.000 and today he was
00:23:05.220 granted a seat
00:23:06.160 on the Twitter
00:23:06.960 board.
00:23:09.420 Now,
00:23:10.620 let's look at
00:23:11.720 some context.
00:23:13.140 I always thought
00:23:14.000 it was smart
00:23:14.540 when Jeff Bezos
00:23:15.480 bought the
00:23:16.040 Washington Post,
00:23:17.740 because if you're
00:23:18.480 a billionaire,
00:23:19.860 you want to be
00:23:20.620 able to control
00:23:21.420 the narrative,
00:23:23.320 right,
00:23:23.560 because big news
00:23:24.520 affects you
00:23:25.140 because you're
00:23:25.680 a big player
00:23:26.440 in the world.
00:23:27.840 And so it just
00:23:28.740 makes sense that
00:23:29.580 if a billionaire
00:23:30.480 can buy a
00:23:31.880 media organ,
00:23:33.700 they should do it.
00:23:35.120 They should do it
00:23:35.820 because it gives
00:23:36.300 them some control
00:23:37.000 over the narrative.
00:23:38.020 And the Washington
00:23:38.520 Post, like the
00:23:40.000 New York Times,
00:23:41.260 are the papers
00:23:42.240 of record,
00:23:42.940 so to speak.
00:23:43.840 So if it's in
00:23:44.500 one of those papers,
00:23:45.340 then it's real.
00:23:46.640 If it's in some
00:23:47.400 smaller publication,
00:23:48.560 well, it might be
00:23:49.120 real, it might not
00:23:49.780 be, but yeah.
00:23:51.420 So, and I also
00:23:53.640 love the competition
00:23:54.680 between Elon Musk
00:23:57.720 and Jeff Bezos.
00:23:58.900 So they're both,
00:23:59.960 they both got
00:24:00.560 rocket companies,
00:24:02.080 both trying to be
00:24:03.240 the richest person.
00:24:04.840 And so far,
00:24:06.040 Elon's rockets
00:24:07.000 are better than,
00:24:09.120 is it Blue Origin?
00:24:10.720 It's better than
00:24:11.360 Bezos' rockets.
00:24:12.740 But I think Musk
00:24:14.220 just topped him
00:24:15.060 on media control
00:24:16.300 as well,
00:24:17.400 because Bezos
00:24:18.620 bought the
00:24:19.000 Washington Post,
00:24:19.740 which is sort
00:24:21.120 of a, I don't
00:24:22.680 know, a dying
00:24:23.500 industry, in a way.
00:24:25.820 Whereas, I would
00:24:27.180 argue that having
00:24:28.300 some kind of
00:24:29.160 influence on Twitter,
00:24:30.580 and I think Musk
00:24:31.280 will have a lot,
00:24:32.960 is a bigger play.
00:24:34.960 It's a better play.
00:24:37.080 So I think that in
00:24:38.540 the battle of the
00:24:39.920 billionaires,
00:24:42.040 I think Elon Musk
00:24:43.360 has got a better play.
00:24:44.580 Because if you're
00:24:46.120 going to try to
00:24:46.720 control the
00:24:47.420 consciousness of
00:24:48.260 the world,
00:24:49.920 Twitter is how
00:24:50.780 you get to all
00:24:51.460 the reporters.
00:24:53.180 All the journalists
00:24:54.020 are on Twitter,
00:24:55.320 but not everybody
00:24:56.280 reads the Washington
00:24:56.980 Post.
00:24:58.200 So I feel like he
00:24:59.400 found the right
00:25:00.500 lever, the right
00:25:01.700 choke point for
00:25:02.660 influence.
00:25:04.080 Now, given that
00:25:05.320 Elon Musk is not
00:25:06.260 so much about
00:25:08.260 his own
00:25:10.360 message as about
00:25:11.060 free speech,
00:25:12.460 and he seems pretty
00:25:13.120 consistent about
00:25:13.840 that, I think
00:25:17.360 that you're going
00:25:17.740 to see a free
00:25:18.420 speech model
00:25:18.980 coming here.
00:25:19.820 It'll be
00:25:20.340 interesting.
00:25:20.920 Now that he has
00:25:21.380 a board seat,
00:25:22.640 what do we
00:25:23.700 expect?
00:25:25.140 Well, first of
00:25:26.900 all, if I worked
00:25:27.740 in the marketing
00:25:28.400 department of
00:25:29.240 Twitter, I would
00:25:30.540 be very worried
00:25:31.400 right now.
00:25:33.460 Because Elon
00:25:34.740 Musk is now
00:25:35.940 replacing everything
00:25:37.260 that the marketing
00:25:38.020 department does
00:25:39.220 for Twitter.
00:25:40.340 Am I right?
00:25:42.120 The Twitter stock
00:25:43.300 went up 20%,
00:25:43.840 27% or something?
00:25:45.940 Well, that's what
00:25:46.420 marketing is supposed
00:25:47.200 to do, right?
00:25:48.440 You're supposed to
00:25:48.940 get that excitement
00:25:51.640 up.
00:25:52.780 And so I think,
00:25:55.340 you know, Elon
00:25:55.840 just tweeting and
00:25:57.060 being on the board
00:25:58.100 and buying stock
00:25:59.140 and doing whatever
00:26:00.520 he's going to do
00:26:01.160 next, which will
00:26:01.920 also be interesting,
00:26:03.000 I'm sure, means you
00:26:04.580 don't even need
00:26:05.080 marketing anymore.
00:26:06.120 You just fire the
00:26:06.820 whole marketing
00:26:07.280 department.
00:26:10.260 Here's what I
00:26:11.080 want to see more
00:26:11.940 than anything.
00:26:12.540 And we might
00:26:13.560 actually see this.
00:26:15.460 If Elon Musk
00:26:16.580 gets enough
00:26:17.280 influence at
00:26:18.600 Twitter, and a
00:26:19.280 board member is
00:26:20.020 pretty influential,
00:26:21.340 especially if you're
00:26:21.980 the biggest shareholder
00:26:24.260 too, do you think
00:26:26.180 that Elon Musk
00:26:26.980 could get a meeting
00:26:28.360 with the top
00:26:30.300 engineer in charge
00:26:31.320 of the algorithm
00:26:32.080 at Twitter?
00:26:33.660 Well, of course he
00:26:34.460 could.
00:26:35.020 Of course he could.
00:26:35.980 How much would
00:26:38.120 you like to see
00:26:38.720 that broadcast
00:26:40.060 or live streamed?
00:26:42.840 A lot, right?
00:26:44.740 Because here's the
00:26:45.660 thing that's different
00:26:46.240 about Elon Musk.
00:26:47.680 Imagine any
00:26:49.160 ordinary board
00:26:51.220 member talking to
00:26:52.760 an engineer about
00:26:53.580 the algorithm.
00:26:54.760 It would go like
00:26:55.440 this.
00:26:56.300 Hey, engineer, can
00:26:57.640 you tell me what's
00:26:58.280 in that algorithm?
00:27:00.180 Then the engineer
00:27:01.220 says, blah, blah,
00:27:02.840 blah, something
00:27:03.440 technical.
00:27:04.000 And then the
00:27:05.320 board member says,
00:27:06.140 well, I'm not
00:27:06.660 getting that.
00:27:07.700 Can you simplify
00:27:08.260 that?
00:27:09.320 Blah, blah, blah,
00:27:10.100 something even
00:27:10.740 more technical.
00:27:12.220 And eventually the
00:27:12.960 board member goes,
00:27:13.660 I can't understand
00:27:14.680 it.
00:27:14.940 I guess I give up.
00:27:16.760 Right?
00:27:17.680 That's probably the
00:27:18.400 only way it could
00:27:19.300 go.
00:27:19.920 Because nobody can
00:27:20.820 really get under
00:27:21.520 the hood unless
00:27:22.340 the engineer wants
00:27:23.260 them to.
00:27:23.840 Otherwise, you're
00:27:24.640 just going to hear
00:27:25.660 some blah, blah, blah
00:27:26.440 and go away not
00:27:27.220 knowing what
00:27:27.640 happened.
00:27:28.520 But imagine Elon
00:27:29.420 Musk talking to
00:27:30.480 the top engineer
00:27:31.280 and the engineer
00:27:32.360 gives him some
00:27:33.080 bullshit.
00:27:34.000 How does that
00:27:35.420 go?
00:27:37.440 I mean, really.
00:27:39.120 Really.
00:27:39.840 Imagine Elon Musk
00:27:40.820 talking to the
00:27:41.440 person who
00:27:42.020 best understands
00:27:43.140 Twitter's algorithm
00:27:44.540 and asking him to
00:27:46.200 explain, you know,
00:27:47.880 how it works,
00:27:48.880 et cetera.
00:27:49.420 And then imagine
00:27:50.200 that engineer
00:27:50.740 trying to
00:27:51.220 bullshit him.
00:27:53.740 How long is
00:27:54.480 that going to
00:27:54.840 last?
00:27:55.740 Right?
00:27:56.100 He's like the
00:27:56.680 only person you
00:27:57.320 couldn't bullshit
00:27:58.060 on this question.
00:27:59.060 Well, there are
00:27:59.840 others, but there
00:28:01.040 are not others who
00:28:02.000 would actually buy
00:28:02.780 the company and
00:28:03.480 get a board
00:28:03.860 seat and start
00:28:05.440 digging into it.
00:28:06.940 But the scariest
00:28:08.000 thing about Elon
00:28:08.760 is how much he
00:28:09.380 understands.
00:28:11.200 That's the scary
00:28:11.940 part.
00:28:12.340 It's not even the
00:28:13.020 billions.
00:28:13.900 It's the fact that
00:28:14.640 if they let him
00:28:15.600 under the hood,
00:28:16.200 he can see what's
00:28:16.820 going on.
00:28:17.920 Maybe you and I
00:28:18.660 couldn't.
00:28:19.720 Right?
00:28:19.920 We don't necessarily
00:28:20.840 have those skills.
00:28:21.600 But the skill that
00:28:22.960 Musk brings to
00:28:23.860 this, I think he
00:28:25.400 can look at the
00:28:25.960 algorithm and
00:28:26.560 figure out what's
00:28:27.120 going on.
00:28:28.320 Now, I do think,
00:28:29.460 here's my prediction,
00:28:31.060 that the algorithm's
00:28:32.760 too complicated for
00:28:33.880 anybody to
00:28:34.420 understand.
00:28:36.120 But I'll bet that's
00:28:37.100 going to change.
00:28:39.020 And I think that
00:28:41.260 Musk is likely to
00:28:42.880 implement Jack
00:28:43.880 Dorsey's plan that
00:28:46.380 for whatever reason
00:28:47.220 Jack Dorsey didn't
00:28:48.060 get done.
00:28:48.460 I imagine there
00:28:49.740 was internal
00:28:50.460 dissent.
00:28:52.500 But I think that
00:28:57.680 Musk could probably
00:28:58.340 push it through.
00:28:59.140 And that would be to
00:28:59.860 have a user choice
00:29:01.620 on which filter or
00:29:03.040 algorithm you use.
00:29:04.400 If you choose the one
00:29:05.220 that gives you
00:29:05.580 everything, you get
00:29:06.260 everything.
00:29:07.600 So I think that's
00:29:08.680 where it's going to
00:29:09.100 go.
00:29:09.420 Because that's really
00:29:10.560 the most rational
00:29:11.800 way to go.
00:29:13.400 And I would expect
00:29:14.720 Elon to go a
00:29:15.480 rational direction.
00:29:16.340 Now, what about
00:29:18.780 the question of
00:29:19.820 Twitter amnesty?
00:29:21.800 What do you think
00:29:22.300 of that?
00:29:23.140 Do you think that
00:29:23.960 Twitter should
00:29:24.760 consider some
00:29:25.580 process for
00:29:27.080 bringing back
00:29:27.780 people who have
00:29:28.360 been banned for
00:29:29.120 life?
00:29:33.520 I think so.
00:29:34.820 I think so.
00:29:35.720 Now, I'm not
00:29:36.740 opposed to banning.
00:29:38.920 I just think there
00:29:39.880 needs to be a time
00:29:40.520 limit.
00:29:41.680 Because if you do
00:29:42.580 something mildly
00:29:44.020 naughty, maybe
00:29:46.320 you give 48 hours
00:29:47.540 time out or
00:29:48.260 something.
00:29:48.480 Just enough to get
00:29:49.880 your attention.
00:29:50.780 Doesn't really have
00:29:51.380 any effect on your
00:29:52.080 business, no effect
00:29:53.580 on your brand,
00:29:54.520 nothing.
00:29:55.040 It just gets your
00:29:55.620 attention.
00:29:56.560 48 hours.
00:29:57.940 But then there are
00:29:58.760 things that are just
00:29:59.520 worse, right?
00:30:00.940 You know, calling for
00:30:01.700 violence in some
00:30:02.520 credible way,
00:30:03.440 something like that.
00:30:04.480 But those people,
00:30:05.760 too, I think, should
00:30:07.060 have access to social
00:30:08.140 media, eventually.
00:30:10.060 But maybe you give
00:30:10.820 them a five-year
00:30:11.520 time out.
00:30:12.580 Or a three-year.
00:30:13.940 Whatever is the right
00:30:14.800 number.
00:30:15.620 And just say,
00:30:16.200 all right, you've
00:30:16.740 got three years.
00:30:17.420 If you come back,
00:30:18.180 we're going to watch
00:30:20.120 you
00:30:20.580 carefully, because you
00:30:21.380 had trouble.
00:30:22.500 If you're fine for
00:30:23.980 the next few months,
00:30:25.940 you know, we'll stop
00:30:26.560 watching you.
00:30:27.120 Something like that.
00:30:28.460 Now, don't you think
00:30:29.100 they could have some
00:30:29.760 kind of an amnesty
00:30:30.480 policy that just had
00:30:32.620 some kind of clarity
00:30:34.080 on it that you could
00:30:35.120 get out of jail
00:30:35.800 eventually?
00:30:36.940 I think so.
00:30:37.740 What's that?
00:30:44.420 When you're convicted
00:30:45.520 because what you
00:30:46.560 posted was actually
00:30:47.440 illegal.
00:30:48.940 Was that true?
00:30:50.560 All right.
00:30:53.340 So maybe that'll
00:30:54.300 happen.
00:30:54.680 All right.
00:30:54.900 The Shanghai situation
00:30:56.120 is getting pretty bad.
00:30:58.000 25 million people in
00:30:59.380 lockdown.
00:31:00.640 Some are tweeting
00:31:01.360 that they have no
00:31:02.060 access to food or
00:31:03.120 water in some cases
00:31:04.380 because the distribution
00:31:05.940 to 25 million people
00:31:07.300 locked up is pretty
00:31:08.740 challenging, even for
00:31:09.820 China, who's pretty
00:31:10.880 good at stuff.
00:31:12.660 And I saw a tweet
00:31:16.020 that usually there are
00:31:18.140 100 ships backed up
00:31:19.500 in Shanghai at the
00:31:20.360 ports, but now there
00:31:21.180 are 300.
00:31:23.240 Uh-oh.
00:31:24.580 The two biggest
00:31:25.480 problems in the world
00:31:26.300 are probably China
00:31:29.620 locking down and
00:31:31.000 affecting the supply
00:31:32.600 chain,
00:31:32.960 and secondly,
00:31:37.720 fertilizer.
00:31:39.400 I think fertilizer is
00:31:40.540 going to be the big
00:31:41.120 problem because
00:31:41.980 apparently you need
00:31:43.000 petrochemicals to
00:31:44.340 make fertilizer.
00:31:46.140 And if you don't
00:31:46.640 have petrochemicals,
00:31:48.500 you can't do
00:31:49.100 anything.
00:31:49.700 You can't make
00:31:50.060 plastic, fertilizers.
00:31:52.980 Civilization will
00:31:53.780 start to decline.
00:31:56.280 So those are the
00:31:57.820 things I worry about.
00:31:58.900 I worry about supply
00:32:00.080 chain and fertilizer.
00:32:01.500 I worry less about
00:32:03.980 availability of energy.
00:32:05.520 I think we could
00:32:06.020 work that out.
00:32:07.160 But the cost of it is
00:32:09.100 going to be crazy.
00:32:11.220 So maybe what will
00:32:12.700 come out of this is
00:32:13.420 some maybe alternative
00:32:14.960 way to fertilize.
00:32:16.800 Maybe.
00:32:18.700 Maybe indoor growing is
00:32:20.140 how you get away with
00:32:20.840 less fertilizer.
00:32:22.120 Is that one way to do
00:32:22.940 it?
00:32:23.260 I don't know.
00:32:23.640 Well, you could get
00:32:26.080 away with less
00:32:26.580 pesticides, I guess.
00:32:28.520 Fewer pesticides.
00:32:31.640 Jon Stewart, as you
00:32:33.260 know, has come back
00:32:34.400 in the scene and he's
00:32:35.620 doing a sort of a
00:32:37.200 serious show about,
00:32:38.800 you know, the big
00:32:39.280 problems in the world.
00:32:42.860 But he went hard at
00:32:46.080 the question of
00:32:47.240 systemic racism with,
00:32:48.520 I guess, Cory Booker.
00:32:49.580 He couldn't even get
00:32:50.540 Cory Booker to go as
00:32:51.640 far as he was going
00:33:53.800 in hating white
00:32:53.800 people for their
00:32:55.600 racism.
00:32:59.160 So here's the thing.
00:33:01.700 I agree with Jon
00:33:03.000 Stewart that systemic
00:33:04.200 racism exists.
00:33:07.020 But unless he's talking
00:33:08.420 about what to do
00:33:09.200 about it, why do we
00:33:11.720 need to hear it?
00:33:13.480 What's the point?
00:33:14.920 There are lots of
00:33:15.540 problems that exist that
00:33:16.820 you just can't do a
00:33:17.520 damn thing about.
00:33:18.840 We don't talk about
00:33:19.680 them to death because
00:33:20.580 there's not a damn
00:33:21.800 thing you can do
00:33:22.420 about it.
00:33:23.880 But if Jon Stewart
00:33:26.680 is not suggesting
00:33:27.700 specific fixes for
00:33:29.600 stuff, all he's
00:33:31.620 doing is insulting us
00:33:32.700 for a paycheck, I
00:33:34.620 think.
00:33:35.380 I mean, I need some
00:33:36.600 solutions.
00:33:37.960 And I don't know
00:33:38.400 what exactly he has
00:33:40.520 recommended for
00:33:43.000 fixing systemic
00:33:44.680 racism.
00:33:45.160 All right.
00:33:49.680 Wall Street Journal
00:33:50.460 has an article that
00:33:51.380 the rate of
00:33:52.120 depression of
00:33:53.100 middle-aged women
00:33:53.880 is through the roof,
00:33:55.120 but part of it might
00:33:56.180 be sexism.
00:33:59.160 That's my own
00:34:00.000 interpretation.
00:34:01.000 Their interpretation
00:34:01.600 is that some of the
00:34:03.400 people who are getting
00:34:04.080 antidepressants, the
00:34:05.240 women, might actually
00:34:07.240 be suffering from
00:34:08.100 hormone fluctuations
00:34:09.280 that are somewhat
00:34:11.000 normal as they
00:34:11.940 approach menopause.
00:34:13.040 So we might be
00:34:14.740 giving antidepressants
00:34:16.380 to women who just
00:34:17.420 have temporary,
00:34:20.320 normal, natural
00:34:22.200 problems with their
00:34:23.360 hormones as they
00:34:25.320 reach a certain age.
00:34:27.200 That's pretty scary
00:34:28.400 shit.
00:34:29.900 That's pretty scary.
00:34:31.700 Yeah.
00:34:32.280 And the sexism part is
00:34:34.100 that I imagine that
00:34:35.260 women are being
00:34:35.920 treated like men,
00:34:36.840 basically.
00:34:38.080 You know, if men get
00:34:38.920 an antidepressant, it's
00:34:39.920 probably because they're
00:34:40.500 depressed, if women act
00:34:42.660 sad, it could be because
00:34:44.040 they're depressed in the
00:34:45.320 same way as men, but
00:34:46.900 also it could be a
00:34:48.360 hormone thing.
00:34:49.780 And if they just get
00:34:50.560 treated like men, they're
00:34:51.980 just going to get
00:34:52.520 antidepressants no matter
00:34:53.700 what they're complaining
00:34:54.420 about, as long as it's
00:34:56.020 depression-related.
00:35:00.080 So I think my instinct is
00:35:03.280 that this is right.
00:35:05.020 And if you've ever talked
00:35:07.200 to anybody who had
00:35:07.880 hormone replacement,
00:35:09.580 they're pretty happy
00:35:11.100 with it.
00:35:12.640 Let me see in the
00:35:13.420 comments.
00:35:14.140 Maybe my experience is
00:35:16.040 unique.
00:35:18.320 But the only time I've
00:35:19.840 heard of people getting
00:35:20.760 hormone replacement,
00:35:21.980 they were delighted.
00:35:23.820 And I understand that the
00:35:25.200 risk if you're over 50 is
00:35:26.540 actually low.
00:35:27.940 Yeah, so I'm
00:35:36.380 not seeing anybody
00:35:37.060 disagree.
00:35:39.220 Oh, somebody else says that
00:35:40.160 is not true.
00:35:42.160 I believe the cancer risk
00:35:43.880 is an issue under 50, but over 50,
00:35:46.280 I just read this today,
00:35:47.720 over 50, it actually makes
00:35:49.260 it safer, not less safe.
00:35:52.320 So if you think the cancer
00:35:54.080 risk is the issue, do an
00:35:56.120 update on that, because I
00:35:57.720 think they found out it is
00:35:58.940 an issue under a certain
00:35:59.960 age, but over an age, it's
00:36:01.200 actually a benefit.
00:36:02.600 But do a fact check on that,
00:36:04.100 because I'm not so sure
00:36:04.780 about that.
00:36:07.920 Here's the ratio.
00:36:09.920 So for women between the
00:36:12.660 ages of 40 and 59, one in
00:36:17.160 five women are on
00:36:18.380 antidepressants.
00:36:19.900 Does that sound right to
00:36:20.940 you?
00:36:22.000 Do you know any women
00:36:22.900 between the ages of 40 and
00:36:24.240 59?
00:36:25.520 And would you say that only
00:36:27.060 20% of them are on
00:36:28.460 antidepressants?
00:36:30.120 20%?
00:36:32.100 That doesn't sound real to
00:36:33.500 me, does it?
00:36:34.480 I feel like it's higher.
00:36:36.820 I'm not sure, but maybe it's
00:36:38.460 a California thing, but it
00:36:39.960 feels like it's higher.
00:36:41.380 But let me add something to
00:36:43.040 this.
00:36:44.560 So the Wall Street Journal
00:36:45.720 says one in five women in
00:36:48.160 that age range are on
00:36:49.120 antidepressants.
00:36:49.920 In my opinion, it's 100%,
00:36:53.100 but not antidepressants per
00:36:56.520 se.
00:36:57.800 In my experience, 100% of
00:36:59.920 adults are medicating.
00:37:01.500 They're self-medicating.
00:37:02.920 They're just doing it
00:37:03.860 differently.
00:37:04.940 Some are drinking, some are
00:37:06.240 smoking, some are doing
00:37:07.480 weeds, mushrooms, God
00:37:09.180 knows what.
00:37:10.240 But I don't know any adults
00:37:11.660 who aren't self-medicating,
00:37:13.080 one way or another.
00:37:14.660 So I think this is super
00:37:16.940 misleading when you say one
00:37:18.320 in five are using
00:37:20.240 antidepressants.
00:37:21.600 The other four out of five
00:37:23.240 just don't use those words.
00:37:26.360 Do you think they're drinking
00:37:27.260 and doing drugs because they're
00:37:28.480 so happy all the time without
00:37:29.740 them?
00:37:30.740 No.
00:37:32.040 They're doing drugs and
00:37:33.060 drinking and smoking cigarettes
00:37:34.580 because they're happier with
00:37:35.840 those things.
00:37:37.040 Otherwise, they wouldn't do it.
00:37:38.140 Or at least they think they
00:37:40.580 are.
00:37:44.560 All right.
00:37:45.140 What else is going on?
00:37:46.760 I've come to the opinion that
00:37:49.280 we need to reframe the climate
00:37:51.100 change question because it used
00:37:53.760 to be there are climate change
00:37:55.200 believers who were following the
00:37:57.860 science, and then there were
00:37:59.220 climate change deniers who were
00:38:02.000 not following the science.
00:38:04.680 What is it now?
00:38:05.740 I think everybody agrees on the
00:38:09.180 science now.
00:38:10.680 Not everybody, of course.
00:38:12.380 But at this point, I think the
00:38:14.560 question of climate change is
00:38:16.360 completely transformed into the
00:38:18.900 following categories.
00:38:20.000 No longer do we have people who
00:38:21.580 believe and people who deny.
00:38:24.060 Now we have people who understand
00:38:25.840 economics and people who don't.
00:38:29.540 And people haven't realized the
00:38:31.000 shift yet.
00:38:32.000 And I think the nuclear energy
00:38:33.920 shift really highlights that.
00:38:37.300 Anybody who understood both
00:38:39.520 nuclear energy and economics was
00:38:42.020 in favor of nuclear power for a
00:38:43.680 long time.
00:38:44.920 But it's only recently that
00:38:47.260 Democrats are starting to be a
00:38:49.200 little bit more vocal about the
00:38:51.700 same thing.
00:38:52.920 So we now have bipartisanship on
00:38:55.360 nuclear energy.
00:38:57.120 And that's almost the whole game.
00:39:00.700 Right?
00:39:00.760 Because nobody likes pollution.
00:39:04.640 Everybody likes nuclear energy.
00:39:06.700 I mean, I'm exaggerating, but on the
00:39:08.440 left and the right, we have plenty of
00:39:09.880 support for it now, where we didn't
00:39:11.420 before.
00:39:12.460 And I think at this point, we just have
00:39:13.820 to understand that the people who are
00:39:16.300 for, let's say, aggressive climate
00:39:20.340 change mitigation are not climate
00:39:22.920 change believers.
00:39:25.240 They're also that, but that's not the
00:39:26.800 important part.
00:39:27.560 The important part is that they're not
00:39:28.920 good at economics.
00:39:30.760 So what you have is really just two
00:39:32.880 groups, people who understand
00:39:34.240 economics and people who don't.
00:39:36.800 We don't have climate deniers and
00:39:38.700 climate believers.
00:39:41.340 I mean, we sort of do.
00:39:42.640 But you can see the shift has really
00:39:44.100 moved from believing versus not
00:39:46.320 believing the science to do you even
00:39:49.540 know how to do economics?
00:39:51.420 Because how many of the climate change
00:39:53.260 people thought that we would run out of
00:39:55.180 food if we stopped using petrochemicals?
00:39:59.140 Do you think if I talked to Greta and I
00:40:01.300 said, Greta, did you know that we'd run
00:40:03.720 out of food if we get too aggressive on
00:40:07.720 climate change and that that's worse than
00:40:11.320 even whatever catastrophe we get from
00:40:13.860 a warmer or unstable climate?
00:40:17.520 Yeah.
00:40:18.280 So I don't think she understood that.
00:40:20.060 How many of you knew that petrochemicals
00:40:22.380 were actually a raw
00:40:24.960 ingredient to make fertilizer and
00:40:26.980 there's not really a better way to do
00:40:28.500 it?
00:40:29.480 How many knew you had to have
00:40:31.120 petrochemicals to make fertilizer?
00:40:34.900 Yeah.
00:40:36.420 I mean, I knew you needed it to
00:40:38.180 transport it and to mine it, but I
00:40:40.260 didn't know that part.
00:40:41.600 I actually didn't know until today that
00:40:43.320 you directly chemically change the
00:40:45.720 petrochemicals and gas in particular
00:40:47.980 natural gas into fertilizer components.
00:40:53.820 And then when you add in the fact
00:40:56.220 that the developing countries can't
00:40:58.060 develop, and the fact that we'll
00:41:00.000 probably only be saved by
00:41:02.760 some high-tech solution,
00:41:04.860 and you don't get much high tech if
00:41:07.000 you don't have a good economy, and you
00:41:08.400 don't get a good economy unless you've
00:41:09.780 got cheap energy.
00:41:11.500 I don't think that the people who wanted
00:41:13.820 climate change action understood what it would
00:41:17.520 cost.
00:41:19.220 Had no idea what it would cost.
00:41:21.320 So I think we just have to stop talking
00:41:23.120 about climate denial and climate
00:41:25.620 follow the science and just say some
00:41:28.540 people understand economics and some
00:41:30.580 people don't.
00:41:32.060 And that's it.
00:41:33.540 Let me ask you this.
00:41:34.600 When was the last time you saw an
00:41:36.000 economist on MSNBC or CNN saying, you
00:41:42.060 know, from an economic perspective, it
00:41:44.320 would be a good idea to get rid of
00:41:45.640 fossil fuels as quickly as possible?
00:41:48.940 Have you ever seen it?
00:41:51.700 No, you see scientists, right?
00:41:53.920 You always see a scientist.
00:41:56.560 And the science is a little bit closer
00:41:58.480 to settled than it ever has been.
00:42:00.540 Because remember, there are tons of
00:42:02.340 Republicans now who are willing to say,
00:42:05.360 yeah, it looks like there's some
00:42:07.060 climate change.
00:42:07.700 But the real question is what you do
00:42:09.100 about it.
00:42:10.060 How aggressive?
00:42:10.720 What do you do?
00:42:12.200 And I think once everybody's on the same
00:42:14.120 page that, yeah, something's happening,
00:42:15.760 it's really just a question of what you do about it.
00:42:17.680 Then you have to move it to the
00:42:19.060 economists.
00:42:20.200 So we've got to get climate change away
00:42:22.060 from the scientists because they did a
00:42:24.980 good job.
00:42:26.200 You don't want to hear that, right?
00:42:28.340 But the entire basis of what I'm saying
00:42:30.540 is that because the scientists did a
00:42:33.640 good job.
00:42:34.880 They figured out a problem and they
00:42:36.760 communicated until enough Democrats and
00:42:40.020 Republicans believe that the climate is
00:42:42.080 getting warmer and that humans are part of
00:42:44.540 it.
00:42:44.940 Now, I know there are a lot of people who
00:42:46.440 don't believe it in my audience.
00:42:48.880 But within the professional class, you know, the
00:42:50.860 politicians, I think the
00:42:54.680 Republicans largely believe in climate
00:42:57.040 change now.
00:42:57.940 Wouldn't you say?
00:42:58.460 Give me a fact check of, let's say, Republicans
00:43:04.600 in the government.
00:43:07.440 Let's just say Congress.
00:43:09.100 What percentage of Republicans in Congress
00:43:11.600 believe that humans are causing the climate to
00:43:16.240 warm?
00:43:17.240 What percentage of Republicans would you say?
00:43:19.280 I'm seeing 25 percent, 60, 30.
00:43:23.740 I would say 60 to 70.
00:43:28.960 That's my guess.
00:43:31.000 Somebody says 90 and I don't know if you're
00:43:32.880 wrong.
00:43:33.900 I wouldn't disagree with 90, actually.
00:43:36.500 I wouldn't disagree with 90.
00:43:39.000 I don't know that it's right.
00:43:40.660 I'd say at least 60 percent.
00:43:43.180 I think a solid majority.
00:43:44.820 Now, if you still disagree, just remember
00:43:49.660 you're disagreeing with your own side now.
00:43:52.300 Your own side has moved fairly
00:43:55.140 dramatically in the last few years.
00:43:57.360 And I think some of it has to do with the
00:43:58.880 fact that if nuclear energy can be seen as
00:44:02.020 a solution, then I think a lot of Republicans
00:44:04.680 are willing to say there was a problem.
00:44:07.580 You know what I mean?
00:44:09.700 If everybody's willing to
00:44:11.620 go nuclear, then I think the Republicans
00:44:14.100 are going to say, all right, well, maybe
00:44:15.460 there's a little bit of a problem.
00:44:17.080 Because at least we're doing something smart.
00:44:20.180 At least that's something.
00:44:27.980 Yeah.
00:44:28.600 So I think that among Republicans,
00:44:30.780 they've decided climate change is a real thing.
00:44:33.600 All right.
00:44:33.840 What else we got going on today?
00:44:37.100 Not much.
00:44:38.740 I think we're into bonus territory.
00:44:40.300 What's your side, Scott?
00:44:44.980 I don't know.
00:44:47.920 Podcast on racism?
00:44:49.900 I'm kind of tired of racism.
00:44:52.240 Aren't you?
00:44:54.940 One of the topics you may notice that I'm
00:44:57.680 avoiding like the plague is the Disney thing
00:45:00.560 and "Don't Say Gay" and everything.
00:45:02.680 I don't know.
00:45:04.640 There's just something about that topic
00:45:06.560 that's, first of all, too easy to talk about.
00:45:11.220 So everybody is.
00:45:12.420 And I don't have anything to add.
00:45:13.900 It's just pure opinion.
00:45:15.280 And if your opinion is different than my opinion,
00:45:17.040 well, what can I do about that?
00:45:18.960 So the Disney thing
00:45:24.600 just doesn't have any traction for me.
00:45:28.880 Like, I guess it matters.
00:45:32.240 But it just doesn't matter to me.
00:45:34.960 Now, have you been to Disney lately?
00:45:37.440 I can't figure out
00:45:38.660 how they survive.
00:45:43.220 Is it just because there are new stupid children
00:45:45.940 born every day who don't know that Disney is lame?
00:45:49.960 Because the last time I went to Disney,
00:45:52.740 I was struck by how old it looks.
00:45:55.160 Like, it just looks like pre-internet.
00:46:00.140 Like, it all just looks pre-internet.
00:46:02.000 And I'm thinking,
00:46:02.680 how is this going to last in the long run,
00:46:05.920 a pre-internet business,
00:46:07.580 where it's just physical rides
00:46:09.380 and, you know, It's a Small World After All.
00:46:11.640 I mean, one of the famous things
00:46:13.820 is just puppets singing
00:46:15.080 "It's a Small World" while you ride a boat past it.
00:46:18.880 That's not exactly cutting edge, is it?
00:46:21.420 I'm not sure if I'm talking about Disneyland
00:46:24.800 or Disney World or what,
00:46:28.320 but I think they both have the same situation.
00:46:35.220 For very young children, yeah, it makes sense.
00:46:37.480 But I still see lots of adults
00:46:38.820 who can't wait to go.
00:46:40.740 All the time I hear adults say,
00:46:42.300 oh, can't wait to go to Disney.
00:46:43.900 And they use their kids as an excuse.
00:46:48.720 Nostalgia, yeah.
00:46:49.820 Epcot is better?
00:46:53.260 Probably.
00:46:54.740 All right.
00:46:57.180 Is there anything I haven't mentioned
00:46:58.580 that you desperately want me to?
00:47:00.840 Are there any topics?
00:47:03.260 Oh, let me just put a bow on the atrocity thing.
00:47:09.680 I don't believe the video of the atrocities from Bucha,
00:47:14.600 the people who were left on the street
00:47:17.160 with their hands tied.
00:47:18.380 So I don't believe that's real.
00:47:20.260 I also don't think it makes any difference.
00:47:23.420 Because what I think is happening
00:47:24.720 is that atrocities are being faked
00:47:29.720 because the real ones don't have any video.
00:47:33.880 That's what I think.
00:47:35.260 I think the reason for faking atrocities
00:47:38.460 is, number one, to get an advantage, right?
00:47:41.280 Political advantage.
00:47:42.160 But number two,
00:47:45.360 there probably, almost certainly,
00:47:46.880 are real atrocities,
00:47:48.480 but nobody has a picture of it.
00:47:50.340 So if you can't get a picture of the real ones,
00:47:53.640 is it really unethical to fake one?
00:47:59.900 What do you think?
00:48:00.700 Suppose they knew for sure there were real atrocities,
00:48:04.220 but they can't prove it.
00:48:05.900 So they fake one that's video ready,
00:48:08.940 and then the world can see it.
00:48:10.060 Is that unethical?
00:48:12.660 I mean, it's a war.
00:48:15.280 I'm going to say no.
00:48:17.260 I'm going to say no.
00:48:18.520 It's a lie.
00:48:19.620 If you want to say all lies are unethical,
00:48:21.560 then okay.
00:48:22.720 I mean, I would accept that point of view.
00:48:24.940 But in my personal opinion,
00:48:26.560 it's war.
00:48:27.860 It's war.
00:48:29.640 It's all atrocity.
00:48:31.500 It's all inappropriate.
00:48:33.160 All of it.
00:48:35.620 So I think it is ethical.
00:48:37.640 It's ethical within the context of war.
00:48:40.520 Because it is true,
00:48:42.800 you have to use a lie
00:48:44.100 to convey the truth.
00:48:46.980 Now,
00:48:47.940 that's what hyperbole is sometimes too, right?
00:48:50.780 Hyperbole is a lie
00:48:52.360 that, if you do it right,
00:48:54.160 and you're doing it in a somewhat ethical way,
00:48:56.600 convinces the public
00:48:58.480 of something that's useful to them,
00:49:00.200 something they should know,
00:49:01.460 and the world's better off.
00:49:05.720 So I'm actually,
00:49:07.560 I'm sort of in favor of the faked atrocities,
00:49:11.100 so long as we know that there are real ones.
00:49:15.680 And I think we do know that.
00:49:17.760 Because it's war.
00:49:19.060 Now, do you remember all the video you saw
00:49:21.220 of the Ukrainian military atrocities?
00:49:26.760 Did you see any?
00:49:29.960 I saw one video
00:49:31.320 of some Russians claiming
00:49:33.700 that the Ukrainians shelled something,
00:49:36.680 like a residential building.
00:49:38.080 But no evidence.
00:49:39.560 It didn't look real to me.
00:49:41.860 Yeah.
00:49:42.460 So what are the chances
00:49:43.700 that the Ukrainians...
00:49:46.000 Oh, the leg shooting.
00:49:47.660 That's right.
00:49:48.000 Yes, there was the leg shooting video.
00:49:50.720 You're right.
00:49:52.640 So that looked pretty darn real.
00:49:55.460 I don't know how you could have faked that one.
00:49:57.780 That looked really real.
00:50:00.880 But I also imagine
00:50:02.320 that that sort of thing
00:50:03.300 is because they've seen
00:50:04.200 some bad stuff on the other side.
00:50:06.500 You know, there's a little bit of that.
00:50:07.620 The raped and murdered woman
00:50:14.200 with a swastika
00:50:15.780 painted in her own blood.
00:50:19.220 That doesn't sound real.
00:50:21.360 I mean,
00:50:22.480 that's a little bit too on the nose.
00:50:24.680 That doesn't sound real.
00:50:27.020 And I'm not doubting
00:50:28.100 that there are Ukrainian Nazis.
00:50:32.660 The Azov are shooting them in the nuts.
00:50:35.420 Is that something that's happened?
00:50:36.480 Yeah.
00:50:38.800 I don't know what I would do
00:50:40.240 if I were in an actual war
00:50:41.640 and I captured somebody
00:50:42.760 who had been trying to kill me.
00:50:45.880 I don't know that I could
00:50:47.380 take a prisoner.
00:50:50.000 I don't know.
00:50:57.920 And apparently there's some
00:50:59.460 effort by the Ukrainians
00:51:01.380 to do massive phone calling,
00:51:03.240 you know, cold calling
00:51:04.020 into Russia
00:51:04.980 to try to convince them
00:51:07.560 to quit the war.
00:51:11.780 So your mind changes in war, yeah?
00:51:20.340 Hoaxing war crimes
00:51:21.480 is pure manipulation.
00:51:23.120 It's manipulation, for sure.
00:51:25.140 But it also might be
00:51:27.720 weirdly honest.
00:51:36.020 How to handle dog owners
00:51:37.820 whose dogs bark and snarl
00:51:39.680 at your dog.
00:51:41.920 Well, I can tell you,
00:51:43.080 once at the dog park
00:51:44.320 there was a bigger dog
00:51:45.800 that attacked my dog.
00:51:47.040 And it took me about half a second
00:51:53.620 to go primitive.
00:51:56.760 And you don't want to get in a fight
00:51:58.320 with a dog,
00:51:59.460 but I did.
00:52:01.320 I got in a fight with a dog.
00:52:03.880 The dog lost.
00:52:04.720 And then the owner came over.
00:52:08.440 And I thought I was going to have
00:52:09.580 to fight the owner next.
00:52:13.140 But it turns out
00:52:14.120 if you beat up an owner's dog
00:52:15.540 in front of him,
00:52:16.280 he doesn't want to fuck with you.
00:52:18.540 Because he was next.
00:52:20.660 If that guy had made
00:52:21.640 one move toward me
00:52:22.920 for taking his dog out,
00:52:24.940 I mean, the dog survived.
00:52:26.560 It'll be fine.
00:52:27.920 But I did kick the shit
00:52:29.600 out of his dog
00:52:30.240 to get it off my dog.
00:52:32.120 So I did go.
00:52:33.140 I just went primitive on it.
00:52:36.600 And the owner comes over
00:52:38.140 and I'm still kicking
00:52:39.020 the shit out of his dog.
00:52:40.820 And I just turned
00:52:41.720 and looked at the owner
00:52:42.580 with the "you're next" look.
00:52:45.120 You know what
00:52:45.760 the "you're next" look is, right?
00:52:47.540 You're kicking the shit
00:52:48.360 out of the guy's dog.
00:52:49.320 And the dog was trying
00:52:51.000 to kill my dog.
00:52:52.580 Right?
00:52:52.840 So, I mean,
00:52:53.820 it was self-defense.
00:52:56.200 But if you're in the middle
00:52:57.300 of taking a guy's dog out
00:52:59.200 and you turn and look
00:53:00.620 at him in the eyes,
00:53:02.020 he's not going to move.
00:53:03.860 He's not going to make
00:53:04.720 a move against you.
00:53:06.320 He's going to take care
00:53:07.180 of his dog,
00:53:07.740 which he did.
00:53:10.940 So, no,
00:53:12.340 it wasn't a pit bull.
00:53:13.260 It was just bigger
00:53:13.960 than my dog.
00:53:17.620 Sometimes being a psycho
00:53:18.740 is justified.
00:53:20.220 True enough.
00:53:21.180 And by the way,
00:53:21.680 I love dogs.
00:53:23.000 So I would never hurt a dog
00:53:25.760 except in self-defense.
00:53:30.960 Would you bring a dog
00:53:31.980 like that to the dog park?
00:53:33.560 Well,
00:53:34.040 that might be part
00:53:34.840 of the conversation
00:53:35.560 I had with the owner.
00:53:37.660 You might not be surprised
00:53:39.860 that the question of
00:53:41.520 should this dog
00:53:42.260 be in the dog park,
00:53:43.460 it came up.
00:53:44.660 It came up.
00:53:46.340 I believe,
00:53:47.340 if I recall,
00:53:48.060 I told him never
00:53:48.840 to come back.
00:53:50.500 I believe I banned him
00:53:51.520 from the dog park.
00:53:52.420 I don't know
00:53:55.900 if he ever came back,
00:53:57.120 but I'm pretty sure
00:53:57.860 I banned him.
00:53:59.120 I didn't have the power
00:54:00.200 to ban him
00:54:00.680 from the dog park,
00:54:01.560 but I don't think
00:54:04.180 he'd want to have
00:54:04.680 a second round with me.
00:54:09.520 Keep your dog
00:54:10.460 out of my effing mouth.
00:54:13.080 Keep your dog's name
00:54:14.280 out of your mouth.
00:54:15.820 Dogbert actually
00:54:23.340 caused a volcanic
00:54:24.220 eruption in real life.
00:54:25.940 Was there a volcanic
00:54:26.720 eruption somewhere?
00:54:27.980 I did a comic
00:54:28.740 about Dogbert
00:54:30.080 trying to create one.
00:54:35.420 The Philippines'
00:54:36.660 second most active
00:54:37.500 volcano,
00:54:38.420 Taal,
00:54:39.560 erupted on Thursday.
00:54:40.700 And then somebody
00:54:47.920 tweeted that volcano
00:54:49.340 with my comic.
00:54:51.300 I'll read you
00:54:51.720 the comic.
00:54:52.520 It was Dogbert
00:54:53.080 talking to Dilbert.
00:54:55.660 Dogbert says,
00:54:56.380 you didn't believe me
00:54:57.020 when I said I hacked
00:54:57.880 reality and acquired
00:54:58.980 godlike powers,
00:55:00.300 so I will demonstrate
00:55:01.420 my powers by making
00:55:03.840 a long-dormant volcano
00:55:03.840 erupt in Elbonia's
00:55:05.220 biggest city.
00:55:06.620 And then Dilbert says,
00:55:07.700 couldn't you use your
00:55:08.420 powers to do something
00:55:09.340 good instead?
00:55:10.700 And Dogbert says,
00:55:11.760 too late.
00:55:12.460 This is on you.
00:55:14.640 So it wasn't Elbonia,
00:55:16.040 but...
00:55:21.380 All right.
00:55:25.720 Do I get royalties
00:55:26.980 when Harvard Business Review
00:55:28.960 uses Dilbert in its
00:55:30.280 magazines?
00:55:32.240 Probably yes.
00:55:33.440 So the process is
00:55:34.600 that a publication
00:55:36.340 would contact
00:55:37.140 my syndication company
00:55:38.300 and they would make
00:55:39.620 a deal that wouldn't
00:55:40.380 involve me.
00:55:41.220 So they would say
00:55:42.260 yes or no
00:55:42.740 without my direct
00:55:43.920 involvement.
00:55:45.100 But an organization
00:55:46.200 like the Harvard
00:55:47.200 Business Review
00:55:48.040 would be fairly careful
00:55:50.360 about copyrights.
00:55:52.580 So while I can't
00:55:53.320 answer the question
00:55:54.000 specifically,
00:55:55.420 I would say the odds
00:55:56.560 of it not being
00:55:57.720 licensed are low
00:55:58.820 because that's the
00:55:59.980 kind of publication
00:56:00.800 that would make sure
00:56:01.480 they did it right.
00:56:03.720 How much have you
00:56:04.680 made with Dilbert?
00:56:05.900 Well, you want
00:56:06.420 a dollar amount?
00:56:07.020 Twitter's at 52?
00:56:16.160 Ten and two-year rates.
00:56:18.060 Yeah, it's not
00:56:19.000 looking good.
00:56:20.500 All right.
00:56:21.080 I think we've said
00:56:22.160 everything we need
00:56:22.760 to say here.
00:56:23.860 And did you see
00:56:25.860 my Madonna joke?
00:56:28.200 Apparently, my
00:56:28.960 Madonna tweet
00:56:30.040 made it into
00:56:30.880 the Daily Wire.
00:56:31.640 There was an article
00:56:33.540 in the Daily Wire
00:56:34.300 about people's
00:56:36.100 reactions to Madonna.
00:56:37.760 And she did
00:56:38.260 the TikTok video
00:56:39.420 where she gets
00:56:40.800 close to this camera
00:56:41.820 and her big old
00:56:44.020 lips are out
00:56:45.600 like a kiss.
00:56:47.260 And I tweeted
00:56:49.160 that Madonna
00:56:50.120 is turning into
00:56:50.920 Jar Jar Binks
00:56:51.920 and nobody's
00:56:52.560 talking about it.
00:56:53.840 Now, I tried
00:56:54.620 that joke out
00:56:55.340 with some people
00:56:55.960 who didn't know
00:56:56.620 who Jar Jar Binks
00:56:57.600 is.
00:56:57.880 This joke
00:56:59.740 only goes two ways.
00:57:01.560 Number one,
00:57:02.180 if you don't know
00:57:02.900 who Jar Jar Binks
00:57:03.800 is, nothing.
00:57:06.200 Nothing.
00:57:07.260 If you do know
00:57:08.520 who Jar Jar Binks
00:57:09.520 is, it's pretty funny
00:57:11.260 if you've seen
00:57:11.860 the TikTok video.
00:57:13.280 Trust me,
00:57:14.420 trust me,
00:57:15.020 if you've seen
00:57:15.540 the Madonna video
00:57:16.400 and you know
00:57:17.540 who Jar Jar Binks
00:57:18.440 is, you're going
00:57:19.500 to be laughing.
00:57:20.840 If you don't,
00:57:21.840 not so much.
00:57:22.480 All right.
00:57:26.440 Scott, are you
00:57:27.200 a libertarian?
00:57:28.140 I'm not.
00:57:28.880 No.
00:57:29.380 I used to say
00:57:30.340 that I was a libertarian
00:57:31.480 but without the
00:57:32.300 crazy parts.
00:57:33.960 And then people
00:57:34.480 would say,
00:57:34.840 well, what are
00:57:35.200 the crazy parts?
00:57:36.060 And I'd have to
00:57:36.860 admit it was all
00:57:37.460 of it.
00:57:40.020 That's what I used
00:57:40.780 to say before
00:57:41.380 I started saying
00:57:41.960 I'm left of Bernie,
00:57:42.740 because I keep
00:57:44.180 trying to make
00:57:44.900 a space for myself
00:57:46.920 that doesn't exist
00:57:47.900 so that nobody
00:57:49.400 knows what I am.
00:57:50.540 I mean,
00:57:51.040 I do it intentionally.
00:57:51.740 So when I say
00:57:52.980 I'm a libertarian
00:57:53.780 but without the
00:57:54.440 crazy parts,
00:57:55.780 it's similar to
00:57:56.620 when I say
00:57:57.200 I'm left of Bernie
00:57:59.120 but better at math.
00:58:00.880 It literally
00:58:01.540 doesn't mean anything.
00:58:02.720 That's what it's
00:58:03.260 supposed to mean.
00:58:04.020 It's supposed to
00:58:04.740 just make you say,
00:58:05.760 I don't know.
00:58:07.120 What does he believe?
00:58:08.740 That's exactly
00:58:09.400 where I want to be.
00:58:12.840 Yeah, there are
00:58:13.480 a lot of people
00:58:13.880 who inhabit a space
00:58:14.700 that doesn't exist
00:58:15.540 politically but they
00:58:16.500 don't do what I do
00:58:17.280 in public.
00:58:21.740 All right.
00:58:24.800 And YouTube,
00:58:25.880 thanks for joining
00:58:26.780 and hit that
00:58:28.780 subscribe button
00:58:29.620 if you haven't.
00:58:31.060 I probably should
00:58:31.760 say that more often.
00:58:33.540 I feel bad that,
00:58:34.600 what's his name,
00:58:37.380 Russell Brand
00:58:38.060 has like millions
00:58:39.060 of subscribers.
00:58:41.560 Don't I deserve
00:58:42.660 at least a million?
00:58:43.420 YouTube?
00:58:45.980 Come on.
00:58:47.480 I deserve
00:58:48.280 at least a million.
00:58:50.020 Yeah, Russell Brand
00:58:50.620 has 4.5 million
00:58:52.060 subscribers.
00:58:52.880 At least a million.
00:58:54.640 Just one million.
00:58:55.620 That's all I ask.
00:58:56.680 Because I believe
00:58:57.360 my content is better
00:58:58.340 than Russell Brand's
00:58:59.340 except his is
00:59:00.240 more entertaining.
00:59:01.740 He does have
00:59:02.260 that over me.
00:59:03.380 But I think
00:59:04.840 mine is smarter.
00:59:08.720 According to me.
00:59:10.200 All right.
00:59:10.580 That's all for now.
00:59:12.060 Yeah, he is better
00:59:12.960 than me.
00:59:13.420 I'll give him that.
00:59:14.960 Yep.
00:59:15.860 I hate to admit it.
00:59:17.100 I hate it when
00:59:17.960 somebody does
00:59:18.480 the same job I do
00:59:19.440 and I just have
00:59:20.120 to admit they're better.
00:59:21.380 It's like
00:59:21.720 Bill Watterson,
00:59:24.760 cartoonist
00:59:25.440 for Calvin and Hobbes.
00:59:27.560 I want to be able
00:59:28.640 to argue that
00:59:29.260 I am or was
00:59:30.620 a better cartoonist.
00:59:32.520 But I'm not.
00:59:34.100 And I won't be.
00:59:35.360 He's just better.
00:59:36.900 So sometimes
00:59:37.520 you just have
00:59:38.040 to accept it.
00:59:39.660 That's right, Erica.
00:59:40.540 I'm just different.
00:59:41.840 I'm different.
00:59:42.520 I'm not worse.
00:59:43.000 I'm just different.
00:59:47.360 He described
00:59:48.340 Larry Flynt
00:59:49.040 as "be healed,
00:59:50.240 be wheeled."
00:59:52.600 His verbal skills
00:59:53.860 are crazy,
00:59:54.580 just off the charts.
00:59:58.680 All right.
01:00:03.740 And that's all
01:00:04.520 for now, YouTube.
01:00:05.580 And I'll talk to you
01:00:06.460 tomorrow.
01:00:07.020 I love you all.
01:00:08.260 I love you all.