Real Coffee with Scott Adams - March 18, 2024


Episode 2417 CWSA 03/18/24


Episode Stats

Length

1 hour and 17 minutes

Words per Minute

144.58359

Word Count

11,218

Sentence Count

878

Misogynist Sentences

17

Hate Speech Sentences

44
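(Editor's note, not part of the episode: the three stats above can be checked against each other, since words per minute is just word count divided by running time. A minimal Python sketch:)

```python
# Sanity-check the episode stats: word count divided by the listed
# words-per-minute figure should reproduce the listed running time.
word_count = 11_218
wpm = 144.58359  # words per minute, as listed above

minutes = word_count / wpm
hours, rem = divmod(minutes, 60)
print(f"implied length: {int(hours)} hour and {int(rem)} minutes")
# -> implied length: 1 hour and 17 minutes
```

The implied length matches the "1 hour and 17 minutes" figure, so the stats block is internally consistent.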


Summary

A man responds to his wife's text with a thumbs-up emoji. Apple and Google are in talks to bring AI to the iPhone. A joke about women texting a lot, and why it's so funny. Plus, a look at how fake news has changed the way we see the world.


Transcript

00:00:00.000 A stein, a canteen, jug, or flask, a vessel of any kind.
00:00:03.860 Fill it with your favorite liquid.
00:00:05.680 I like coffee.
00:00:06.840 Join me now for the unparalleled pleasure, the dopamine hit of the day,
00:00:10.080 the thing that makes everything better.
00:00:11.260 It's called the simultaneous sip.
00:00:14.220 It happens now.
00:00:15.900 Go.
00:00:20.260 Oh, so good.
00:00:23.440 Delightful.
00:00:25.260 All right.
00:00:25.620 Let's make sure that the Locals people's comments are up and good.
00:00:32.160 They're good.
00:00:34.240 All right.
00:00:36.360 Hey, whoop, that's me.
00:00:38.220 I don't like to hear me.
00:00:41.100 I don't know why you do.
00:00:43.920 All right.
00:00:44.920 You might not be surprised that the news is funny and ridiculous today.
00:00:50.660 Funny and ridiculous news, that's our specialty.
00:00:56.160 Here's an opening thought for you before we get into the fun stuff.
00:01:01.380 Do things seem a lot different than, say, four or five years ago in terms of how we frame the news?
00:01:09.600 Now we understand the news to be completely fake.
00:01:14.040 And we understand all the ops.
00:01:16.900 And we see the gears of the machine.
00:01:18.820 And have you noticed that when the bad guys were running their hoaxes in the last cycle, that we were treating it more like, oh, it's a serious thing.
00:01:32.900 And I think you got some of those facts wrong.
00:01:35.040 And now we treat it like entertainment.
00:01:39.240 Have you noticed that?
00:01:41.340 It's completely different.
00:01:43.060 Because once you can see the entire mechanism of how they run the hoaxes, and then which entities support it like it's real, it's actually just kind of funny.
00:01:52.540 And we, meaning the people who oppose the regime's brainwashing, we're having fun with it this time.
00:02:01.700 It's actually hilarious.
00:02:03.180 And it's driving them crazy because we're mocking them and calling them out in their exact technique.
00:02:10.260 And you're watching the people on the left get angry that we notice they're all lying.
00:02:18.080 Have you seen that yet?
00:02:19.920 Look at their faces.
00:02:21.740 They're really mad that everybody knows they're lying.
00:02:24.480 They don't know what to do about it.
00:02:25.820 So they're getting angrier and angrier.
00:02:27.760 Or, ah, ah, sure.
00:02:30.440 Well, we'll talk about that more.
00:02:32.000 But that's the big theme for today.
00:02:35.960 I'd like to start with a Babylon Bee joke.
00:02:40.020 They showed a little image there.
00:02:41.900 But the caption was, man thoughtfully responds to wife's nine-paragraph text about her day with a thumbs-up emoji.
00:02:50.480 Now, that was the whole joke.
00:02:54.460 But here's the better part of the joke.
00:02:57.760 So the editor-in-chief of the Babylon Bee, Kyle Mann, he puts in the first comment under it.
00:03:03.860 He says, the joke is women text a lot.
00:03:08.040 Now, I like the joke.
00:03:10.600 But I laughed for 10 minutes when I saw his explanation of the joke.
00:03:14.140 And I'm not exactly sure why.
00:03:16.800 I don't know why the explanation of the joke was funnier than the joke.
00:03:20.760 Because the joke itself is great.
00:03:22.680 It's actually a great joke.
00:03:24.020 But why is the explanation of it funnier?
00:03:28.660 And as a person who writes jokes for a living, I feel like I'm seeing one I haven't seen before, which is really weird.
00:03:36.180 Because there are only 100 jokes in the world.
00:03:38.680 And we just, you know, reuse them with different characters, basically.
00:03:42.060 I don't know if I've seen this one.
00:03:44.360 Or maybe it's a rare one or something.
00:03:46.320 But what exactly is, what is it that makes that funny?
00:03:49.900 That he's explaining it to us without being asked.
00:03:53.180 The joke is women text a lot.
00:03:54.920 Why is that funny?
00:03:57.260 Does anybody know?
00:03:58.580 I actually have no idea.
00:04:00.640 Like, I've laughed at it over and over again.
00:04:02.660 And I don't know what makes that funny.
00:04:04.960 That is a damn interesting thing if you do this for a living.
00:04:09.540 Anyway, Apple is announcing, or at least Bloomberg has a little scoop here, it looks like.
00:04:14.720 Apple is in talks with Google to bring Google's Gemini AI onto the iPhone.
00:04:25.600 Now, I want you to hold in your head for a moment.
00:04:30.640 Do you think that Steve Jobs would have given up on making AI within Apple and would have just meekly used Google's AI?
00:04:41.740 Which no doubt will be in Google's own phones.
00:04:46.100 It feels like the biggest losing play of all time.
00:04:51.000 That they just sort of, it's like they surrendered on the key technology of the future.
00:04:58.560 Now, I'm sure it's a practical decision because they decided they couldn't compete or it wasn't cost effective or something.
00:05:06.460 But it's kind of shocking.
00:05:09.380 I mean, to me, it looks like a complete surrender.
00:05:12.420 It just doesn't seem like the Apple of old.
00:05:15.760 It just feels like they're a cash cow and they're just going to milk their cash as long as they can until they go out of business.
00:05:23.280 That's what it looks like when they're using their competitor's AI, to me.
00:05:27.840 Now, this is not a done deal.
00:05:29.820 So it could change.
00:05:31.360 And then I saw somebody speculating that Siri would still be the main thing you're talking to because it's already integrated with everything.
00:05:41.680 But that when Siri can't handle the question, it'll pass it off to somebody else.
00:05:46.200 I would have no interest in that product.
00:05:49.680 Do you know why?
00:05:50.220 Because the first part of it, the S-I-R-I that I won't say out loud again because it's triggering all your machines.
00:05:58.740 It doesn't understand anything I say.
00:06:02.080 Unlike AI, which usually does.
00:06:05.460 You can have a conversation with AI and, you know, let's say OpenAI, ChatGPT.
00:06:11.060 It'll understand everything you say.
00:06:13.540 You talk to S-I-R-I on your phone and it'll be like a completely different conversation.
00:06:21.180 You think it's talking to somebody else.
00:06:23.640 If there's any noise in the room, you're dead.
00:06:25.640 So that doesn't even seem like a good idea if it worked, which is weird.
00:06:33.300 Well, here's an interesting thing.
00:06:35.020 Maybe it's more interesting to me.
00:06:37.180 But Bry AI, who you might know as the Prince of Fakes, used the Claude 3 Opus AI to read my book, Reframe Your Brain,
00:06:48.820 The User Interface for Happiness and Success, which you should all buy.
00:06:51.500 By the way, reframe your brain is just killing it.
00:06:55.640 In terms of the difference it's making for the readers.
00:06:58.620 Every day I'm hearing from people, that reframe changed my life.
00:07:02.540 Like every day.
00:07:03.760 Somebody actually tells me their life was changed by one sentence in the book.
00:07:08.540 It's a different sentence for each person.
00:07:11.000 But they've all found a reframe that changed their lives.
00:07:14.340 Anyway, so Bry.ai asked it to give him three reframes to help with the imposter syndrome he was having.
00:07:23.800 Because he applied to a gazillion jobs.
00:07:26.980 He did get one.
00:07:28.560 But only one out of, you know, 1,500 he applied to or something.
00:07:32.780 So he was feeling that he had imposter syndrome for the one job he did get and wondered if he could get some reframes.
00:07:39.620 And sure enough, I won't go through the reframes.
00:07:41.540 But the AI did accurately read my book and pull out three reframes that are effectively in my voice but reworded that are actually useful.
00:07:56.140 And so the question I ask myself is, as a creator of that material, do I get to license it?
00:08:06.960 Or can anybody just do what Bry did and say, hey, you're an app now.
00:08:12.980 Give me advice from that book.
00:08:14.380 Why can this AI use my material and then present it, like, as an app, and I wouldn't get paid for that?
00:08:25.860 I think that's the current situation.
00:08:28.180 But on the other hand, if everybody the AI cribbed from got paid, it would be an unmanageable situation.
00:08:36.100 So I'm not sure how to solve this, but we're right on the brink of a world where writing a book doesn't make any sense.
00:08:45.580 Because buying a book won't make any sense.
00:08:49.320 Why would anybody buy my book if they could just have it summarized by AI?
00:08:54.880 Right?
00:08:56.200 For nonfiction books, it's a real problem.
00:08:58.940 For a fiction book, you're reading it for the pleasure of each sentence, so the summary doesn't make a difference.
00:09:04.300 But why would you write a nonfiction book if the market value will plunge to zero worldwide upon production?
00:09:16.100 I mean, it's not there yet, because people still have the habit of reading books.
00:09:20.220 But we're probably five years away from the point where buying a book doesn't make sense.
00:09:25.460 Because you can just ask your phone and it'll tell you everything that matters in the book.
00:09:30.200 So what's that do to authors?
00:09:32.500 I really don't know.
00:09:33.140 Well, what should happen is that any AI that's using it in a vertical way, as opposed to just looking at it as part of the context for a larger question, which I think is fair.
00:09:44.660 To me, fair use would be, it's aware of what's in my book, and it incorporates it in answers.
00:09:51.120 That's fair.
00:09:52.140 It's the way a human brain works.
00:09:53.480 But if you're actually taking from my book specifically, and that's the source of your advice, and it's like an app, I feel like they need to pay me for that.
00:10:03.320 Do you agree?
00:10:05.200 I mean, yeah, there's no should in this world.
00:10:08.200 It's just what you can get away with.
00:10:09.660 But it feels like that would be a stable system, that the author gets some kind of taste if they're using the work like a little silo.
00:10:19.900 So anyway, well, here's the shocker.
00:10:24.220 Putin won his election.
00:10:26.420 I'll be damned.
00:10:28.820 Did anybody see that coming?
00:10:30.280 I don't know if the polling was suggesting he would win.
00:10:33.120 But Putin wins again in his election.
00:10:35.680 Well, thank goodness we don't live in a country like that where your elections are predetermined and fake.
00:10:42.060 Am I right?
00:10:44.420 Yeah.
00:10:45.260 Thank God we don't live in a place where you can't trust your elections, and maybe it's all predetermined and fake.
00:10:51.240 I'd hate to live in one of those countries.
00:10:53.440 Are you with me?
00:10:55.420 Yeah.
00:10:58.100 Here's one of the reasons that I think Putin is popular in his country.
00:11:02.400 And if it had been a free and fair election, I think he probably would have won.
00:11:10.120 He's pro-Russian.
00:11:13.760 Isn't that weird?
00:11:16.220 He's popular because he's pro-Russia.
00:11:19.160 Now, you might not like a lot of things he does because he's sort of doing pro-Russian stuff.
00:11:26.260 But isn't it weird that America is not pro-America?
00:11:30.480 Isn't that kind of striking?
00:11:34.440 We're literally not pro-America.
00:11:37.200 I mean, we're pro-Ukraine.
00:11:38.900 We're pro-Israel.
00:11:40.400 We're pro-migrants from every part of the world.
00:11:44.400 But we're not exactly pro-America.
00:11:47.180 Because if we were, we'd have a big old wall around us, and we'd keep our money, and we wouldn't give it away.
00:11:52.160 So, maybe this is a lesson to our politicians.
00:11:58.040 If the elections are rigged anyway, wouldn't it be nice if you were pro-American at the same time?
00:12:04.020 Now, we do have a candidate this time who's pro-American.
00:12:06.720 In all likelihood, our intelligence people will have them killed, or something, or rig the election, or something.
00:12:17.820 Anyway, I have no proof that any of that's going to happen.
00:12:21.240 It's just pattern recognition.
00:12:23.040 All right, did you know that the number of young people who plan to have kids ever is at a new low?
00:12:33.640 Were you aware of that?
00:12:35.100 The number of young people who say, yeah, I'm never going to have kids is at an all-time high.
00:12:39.580 Now, a lot of that is financial, obviously.
00:12:41.980 But since when is it ever a good financial idea to have children?
00:12:47.080 Has that ever been a good idea?
00:12:49.440 I mean, it was easier than before, but here's what I would suggest.
00:12:54.800 I believe that TikTok is the primary cause of that, or maybe half of the cause.
00:13:00.680 I think that if you go to TikTok, you will see endless videos of dinks, dual-income, no-kids people,
00:13:09.080 saying how awesome it is to have no children, cause they can go anywhere they want,
00:13:13.160 they can spend the money any way they want, and it's great, and they're really happy.
00:13:17.220 Now, do you think that that's influential?
00:13:20.140 Do you think that TikTok is literally training Americans not to reproduce?
00:13:27.460 It is.
00:13:28.960 It's probably measurable.
00:13:31.220 I'll bet you can measure this, cause all you'd have to do is say,
00:13:34.000 all right, people who watch TikTok versus people who don't,
00:13:37.620 what do you feel about having children?
00:13:40.680 That's it.
00:13:41.260 Real easy to study.
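(Editor's note: the study he's describing is a plain two-group comparison. As a sketch only, with invented counts that do not come from any real survey, the gap between the groups could be tested with a two-proportion z-test:)

```python
import math

# Hypothetical survey counts (illustration only, not real data):
# of n respondents in each group, k say they plan to have kids someday.
tiktok     = {"n": 500, "k": 180}   # heavy TikTok users
non_tiktok = {"n": 500, "k": 260}   # non-users

def two_proportion_z(a, b):
    """Two-proportion z-test: is the gap between the groups
    larger than chance alone would explain?"""
    p1, p2 = a["k"] / a["n"], b["k"] / b["n"]
    pooled = (a["k"] + b["k"]) / (a["n"] + b["n"])
    se = math.sqrt(pooled * (1 - pooled) * (1 / a["n"] + 1 / b["n"]))
    return (p1 - p2) / se

z = two_proportion_z(tiktok, non_tiktok)
print(f"plan-to-have-kids rate: TikTok {tiktok['k']/tiktok['n']:.0%}, "
      f"non-TikTok {non_tiktok['k']/non_tiktok['n']:.0%}, z = {z:.1f}")
# -> plan-to-have-kids rate: TikTok 36%, non-TikTok 52%, z = -5.1
```

With these made-up numbers the z-statistic is far outside the usual ±1.96 threshold, which is the sense in which such a comparison would be "real easy to study." A real study would also have to control for confounders (age, income, urban/rural) before attributing any gap to TikTok.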
00:13:43.260 Do you wonder if TikTok can turn off human reproduction in America?
00:13:49.040 Because I've been saying this for some time.
00:13:51.980 I've been saying publicly, I don't think you understand.
00:13:56.600 China now has the power, literally, there's no hyperbole here,
00:14:00.960 to turn off human reproduction in America, and apparently they're doing it.
00:14:08.300 Now, are they doing it intentionally?
00:14:10.800 Well, there's no way to know, is there?
00:14:12.920 How would you know?
00:14:14.580 It just happens to be very compatible with their overall Chinese goal of being the dominant country,
00:14:21.420 because their biggest problem, too, is also demographic,
00:14:24.800 and I'm sure they'd like to share that problem with their biggest rival.
00:14:29.100 So, yes, China is literally turning off the mating instinct in America.
00:14:39.500 Anybody worried about that?
00:14:42.700 Is that a problem?
00:14:44.540 Yeah, it's probably just about the biggest problem.
00:14:47.620 I doubt we have a bigger problem than that.
00:14:50.240 You know, debt looks pretty bad.
00:14:51.380 I don't know how to solve that.
00:14:52.860 But the end of reproduction or insufficient reproduction is definitely the end of your country.
00:14:59.280 Yeah.
00:15:00.080 So, it's the most critical thing we have to get right,
00:15:04.540 and it's under direct attack by a rival.
00:15:09.460 And there are some people who say, hey, free speech.
00:15:11.760 Free speech.
00:15:12.480 Let them go.
00:15:15.600 Elon Musk had a conversation on spaces about the size of his testicles.
00:15:21.380 I think he was talking to Gad Saad.
00:15:24.780 And Gad was talking about how entrepreneurs who have more testosterone are more successful.
00:15:34.420 Have you ever heard that?
00:15:36.540 That the more testosterone you have, the more risk-taking you have.
00:15:41.960 And the people who take risks are going to fail the most but also succeed the most.
00:15:47.300 They'll have the most wildly different outcomes.
00:15:51.640 So, Elon was asked about this, and he was joking that it's not just metaphorical that you need huge balls to be an entrepreneur.
00:16:00.540 You must actually have huge balls.
00:16:03.080 Now, I don't think the size of the balls necessarily correlates to how much testosterone you have.
00:16:08.700 Is that even a thing?
00:16:09.760 I don't know that that's a thing, but it's pretty funny.
00:16:14.680 So, apparently, testosterone gives you better outcomes for entrepreneurial stuff.
00:16:22.680 Now, are you worried about the falling testosterone?
00:16:28.200 You should be.
00:16:29.760 The less testosterone there is, the less entrepreneurship there will be.
00:16:33.980 So, yeah, that's a pretty big problem, like an existential problem.
00:16:40.680 Yep.
00:16:41.800 Now, what does this suggest about the differences between male and female entrepreneurial success?
00:16:47.740 Well, as I've said before, everything is a reflection or a ripple from our mating instinct.
00:16:57.520 And the people with the most testosterone probably have the strongest mating instinct, and you'd expect that they would become entrepreneurs, the men anyway.
00:17:07.100 So, I would expect that we would never see a situation in which women are as successful as entrepreneurs unless the level of testosterone becomes even among the two groups.
00:17:19.860 Because it's just cause and effect.
00:17:24.480 Testosterone causes risky behavior.
00:17:27.080 Risky behavior causes the most death and destruction, but also every now and then you hit a home run.
00:17:35.240 So, those are the big wins.
00:17:38.880 So, testosterone is more than a health thing.
00:17:43.120 It is an economic force.
00:17:46.440 Have you ever thought of it that way?
00:17:47.620 Have you ever thought of testosterone as a key to your economy?
00:17:52.840 It is.
00:17:54.200 It definitely is.
00:17:57.700 All right.
00:17:59.660 If you're not following the ongoing forever stories of people who claim that they have found evidence of rigging in the 2020 election,
00:18:10.620 here's the update.
00:18:13.600 The people who are closest to it believe they have the goods now.
00:18:17.620 Now, I'm not going to tell you they have the goods because, you know, my general warning to you is that 95% of all claims at minimum, it could be 100%,
00:18:27.480 but at least 95% of all claims about election rigging will be BS.
00:18:33.560 Which doesn't mean the election was clean.
00:18:37.780 It just means that most of the theories, overwhelmingly, most of them will turn out not to be true.
00:18:44.620 Even if it's true, most of the theories about it will be untrue.
00:18:48.820 But there are people, I'll just tell you from sort of behind the curtain,
00:18:52.740 there are now people that I consider well-informed and serious and show their work who believe they have proof that the 2020 election was rigged.
00:19:02.240 Now, you haven't heard it because it's not in the mainstream and maybe never will be.
00:19:08.560 It may never cross that barrier.
00:19:11.180 But there are serious people who think they have the goods.
00:19:13.720 I can't tell you that because I can't verify what they're saying.
00:19:16.820 I have no way to verify it.
00:19:18.280 But I will tell you that serious people are saying they have the goods.
00:19:24.240 We'll see.
00:19:25.880 We'll see.
00:19:27.080 That would be the Trump third act, if it happens.
00:20:02.220 Well, the Associated Press is worried that the Germans are showing nationalism.
00:20:07.500 So, yeah, they're worried that Germans like their country.
00:20:19.120 Okay.
00:20:19.920 I don't even know if I want to say anything about that.
00:20:22.180 Now, obviously, they're tying it back to their, quote, Nazi past.
00:20:26.440 But I don't think that nationalism was the problem with Hitler.
00:20:31.540 Am I wrong?
00:20:32.480 Does nationalism lead you naturally to Hitler?
00:20:37.680 I don't think they're necessarily connected.
00:20:42.020 But it's funny that in today's day and age that playing for your team, your nation,
00:20:49.320 is automatically considered killer-ish.
00:20:54.580 How'd that happen?
00:20:57.380 Probably TikTok.
00:20:58.620 Anyway, the Biden administration is looking to finalize their plan to get rid of gas engines
00:21:08.220 in the United States over time.
00:21:11.340 At the same time, they're doing everything they can to destroy Elon Musk and Tesla.
00:21:18.780 The same time.
00:21:20.800 The same time they're trying to destroy the gas car industry.
00:21:25.200 They're trying to destroy the biggest dominant electric car company.
00:21:30.960 Now, you might say, that's not true.
00:21:32.740 They're only going after Elon Musk.
00:21:34.800 Okay.
00:21:36.020 All right.
00:21:36.580 I'm not going to argue that difference.
00:21:39.540 Yeah.
00:21:40.360 They're targeting the biggest electric car company and all the gas car manufacturers too.
00:21:48.200 I'll tell you, people, when it comes to the car industry, which is the only thing I'm talking about
00:21:55.400 right now, the car industry, when it comes to only the car industry and no other topic,
00:21:59.680 it's going to be a bloodbath.
00:22:01.300 But only the blood, only, only, regarding the economy of cars.
00:22:08.700 No, no, no.
00:22:10.380 I'm not talking about anything else.
00:22:11.840 Bloodbath of just automobile-related topics and nothing else.
00:22:20.740 All right.
00:22:21.520 There's a new study that says alcohol deaths in the U.S. are up 29%.
00:22:27.560 They're calling it excessive alcohol use.
00:22:30.920 And, of course, that would affect all age groups.
00:22:33.180 Is that enough to explain all excess mortality?
00:22:41.480 It feels like it.
00:22:43.680 Because if drinking, if excess, they're talking about excess, right?
00:22:49.440 People are drinking way too much.
00:22:51.200 It's up 29% in five years.
00:22:54.600 And then what they did was they studied deaths that are not obviously directly related to alcohol.
00:23:00.160 So a drunk driving death would be directly related.
00:23:05.340 But now they look at diseases which they know are exacerbated by alcohol.
00:23:11.260 And if you look at the diseases that would be exacerbated by alcohol, apparently they're up.
00:23:18.840 So is this the entire explanation for mortality, excess mortality?
00:23:23.140 Because it is one thing that would affect all age groups.
00:23:25.760 The thing that was mysterious is the excess mortality was in all age groups.
00:23:33.420 So if you had 29% more drinking, excessive drinking, not just 29% more people having a sip with dinner,
00:23:43.660 but 29% more excessive alcohol in five years,
00:23:48.500 are you telling me that wouldn't completely explain the excess deaths?
00:23:53.680 Now remember, the fentanyl deaths don't have the same quality,
00:23:58.340 where if you're taking fentanyl and you haven't died yet,
00:24:01.920 it doesn't have the quality that I know of,
00:24:04.680 of weakening your other organs, you know, the way that being a long-term drinker does.
00:24:10.220 It might, but I haven't seen that signal.
00:24:13.880 So the fentanyl deaths, we can kind of count.
00:24:19.500 So if it was just that, it wouldn't be affecting older people so much, right?
00:24:24.380 If fentanyl were the cause, it would be a little bit concentrated in the younger people.
00:24:29.840 But if alcohol is the cause,
00:24:32.220 because the excess mortality is across all age groups,
00:24:36.340 that makes sense.
00:24:37.720 It might actually just be alcohol.
00:24:41.940 Now, I know you want me to say,
00:24:44.020 but it could be the vaccinations, and it could be.
00:24:48.160 But this one is just right in front of you.
00:24:50.980 You don't have to do a whole bunch of extra, you know, research,
00:24:54.700 if this is true.
00:24:56.500 Now, of course, any study is questionable these days,
00:24:58.840 so I'd like to see confirmation that drinking really went up that much.
00:25:03.160 There's a new statistic that says 90% of OnlyFans users are married.
00:25:09.080 The average age is 29.
00:25:12.840 90% of OnlyFans users are married.
00:25:18.620 So, what's that tell you?
00:25:26.760 Remember I told you that marriage,
00:25:29.100 marriage, by its design,
00:25:32.240 makes the wife the sort of the jailer for the husband's sexuality, right?
00:25:40.160 So, your wife is a person who prevents you from having sex.
00:25:43.520 That's the primary purpose of a wife,
00:25:45.580 to prevent you from having sex.
00:25:48.520 Now, not just with other people,
00:25:51.240 but necessarily because schedules are busy,
00:25:54.480 and people are not always feeling good and healthy,
00:25:56.780 also with one person.
00:25:58.260 So, if you have a situation where the system
00:26:02.220 is to deny men the one thing they want the most,
00:26:07.380 biologically, a sexual outlet,
00:26:10.260 it's the one thing we want the most,
00:26:12.180 biologically, not intellectually,
00:26:15.160 but biologically it's the one thing we want the most.
00:26:17.820 And if you deny it to them with a system
00:26:19.940 which guarantees they'll have less than they want,
00:26:22.700 apparently you get OnlyFans,
00:26:25.400 it's worth $18 billion,
00:26:27.320 they're making over a billion dollars a year,
00:26:29.140 and 90% of them are married,
00:26:31.300 exactly like you'd expect it to be.
00:26:34.080 If I told you,
00:26:35.380 if I described the system of marriage
00:26:37.820 as a thing that prevents men from having sex,
00:26:41.960 and then I said,
00:26:42.800 but there is this tool that a man could use somewhat secretly
00:26:46.740 to fill in some of the gap,
00:26:50.280 how would that tool do?
00:26:52.860 You would probably predict it would do well.
00:26:55.940 Yeah.
00:26:59.460 As Glenn Greenwald points out,
00:27:01.560 and it's a real head shaker for me too,
00:27:04.380 that did you know that the most underreported story of 2023
00:27:08.260 is that the courts have proven that,
00:27:12.260 or the courts have decided,
00:27:13.680 that it's a true thing
00:27:14.820 that the Biden White House and the FBI
00:27:17.680 were going after the First Amendment
00:27:21.020 by coercing big tech to censor dissent online.
00:27:26.820 Literally, the First Amendment
00:27:28.440 was violated by our government
00:27:32.180 in the most egregious, grossest, biggest possible way.
00:27:36.020 Not a trivial thing.
00:27:37.960 No, no, not a trivial thing.
00:27:39.820 A direct stake in the heart to free speech.
00:27:44.820 And we just kind of let that go.
00:27:48.560 You know, the Twitter files and all the reporting.
00:27:51.480 We, you know, we all saw it.
00:27:53.900 It's a fact.
00:27:54.800 The courts have confirmed it.
00:27:56.920 And, you know, we just moved on.
00:28:00.560 How do we move on from that?
00:28:03.120 Well, the answer is that the media is controlled.
00:28:06.020 By the bad guys.
00:28:07.160 And they just decided not to make a story
00:28:09.240 about the fact that they're the bad guys.
00:28:12.500 Apparently, if you are the bad guys,
00:28:14.680 you don't have to report on it.
00:28:17.640 And then nobody will act on it.
00:28:18.920 Like, it doesn't matter.
00:28:21.100 Mike Benz was the subject of a big New York Times hit piece.
00:28:25.900 I think it was a big Sunday feature piece,
00:28:28.260 like, right on the front of one of the sections.
00:28:31.020 And he reported today on X.
00:28:34.800 It was so uninteresting that he forgot to read it.
00:28:38.880 There was a New York Times hit piece about himself.
00:28:41.440 And he knew it was there, but he was like, eh, got busy.
00:28:44.740 Didn't look at it.
00:28:45.560 Went to bed.
00:28:47.320 That's how important they seem to many of us.
00:28:50.160 All right.
00:28:53.160 The funniest part of the day is the bloodbath hoax.
00:28:58.360 And so let's take a victory lap, shall we?
00:29:03.480 Do you remember what things were like five, seven years ago?
00:29:07.060 When the fake news would do one of their plays that was just fake news,
00:29:13.800 and people weren't really up to speed on the fact that this was even a thing.
00:29:20.360 So even the people on the right treated it like it was real news,
00:29:25.560 but just it was wrong.
00:29:28.000 Right?
00:29:29.020 We treated the fine people hoax like, no, the news has a fact wrong.
00:29:34.100 And that didn't really work at all, did it?
00:29:38.220 Because then the news would say, no, we got all the facts right.
00:29:41.180 And then we'd say, no, but the fact is wrong.
00:29:43.300 Here's why.
00:29:44.380 And then they'd say, no, we got all the facts right.
00:29:46.900 And it made no dent.
00:29:48.640 You couldn't make a dent by telling them the news that they got facts wrong
00:29:52.640 or left something out.
00:29:54.160 They just said they didn't.
00:29:55.780 And that would be the end of it.
00:29:57.060 Because their people looked at their truth as the truth.
00:30:00.680 But much has changed since then, would you agree?
00:30:06.480 One of the biggest things that's changed is that,
00:30:12.220 how do I say this?
00:30:15.640 With the most humble,
00:30:18.160 how do I say this with the most fake humility
00:30:21.440 so it doesn't sound obnoxious?
00:30:23.240 In around 2016, I came on the scene and told you
00:30:28.180 that you were ignoring persuasion
00:30:29.900 as the dominant force in politics.
00:30:33.740 It wasn't just a thing to talk about now and then.
00:30:36.780 It's the main thing.
00:30:38.920 And that everything you saw
00:30:40.360 should be seen in the persuasion filter.
00:30:44.160 Where are we now?
00:30:46.240 And now we're there.
00:30:48.320 Today, every one of you,
00:30:50.820 every person watching and most of the people who are on X,
00:30:55.120 we now see everything as a persuasion play.
00:30:58.760 And we also see all the gears of the machine.
00:31:02.540 We know that one of the plays, for example,
00:31:05.360 is the wrap-up smear.
00:31:08.120 Now, Nancy Pelosi actually explained it.
00:31:10.900 And then we got to watch it in the wild.
00:31:13.020 So we all learned what it was.
00:31:14.740 The wrap-up smear is when you leak something to a reporter
00:31:18.520 that's not true, the reporter reports it,
00:31:22.200 and then you refer to the news
00:31:24.280 as your validation that it's true.
00:31:27.220 So first you give them the fake news,
00:31:29.500 and then you use their reporting
00:31:31.300 as your independent check that it's real.
00:31:33.940 But it was just your fake news that you gave to them.
00:31:37.160 Now, Nancy Pelosi explained that
00:31:38.960 as a common political play.
00:31:41.280 But you can generalize from that,
00:31:44.660 and you can see all the other ways
00:31:46.320 they use a similar thing.
00:31:48.240 For example, how many of you noticed on your own
00:31:51.900 that when the bloodbath thing came out,
00:31:56.300 that number one, everybody understood its form.
00:32:00.840 It's a Rupar edit.
00:32:02.560 It's where they take something out of context,
00:32:05.060 and then they pretend they didn't take it out of context
00:32:08.440 to create a whole story.
00:32:10.400 But as soon as they did it this time,
00:32:13.180 everybody yelled,
00:32:14.280 it's a Rupar, it's a fake edit,
00:32:16.020 and they were mocked.
00:32:18.080 They were mocked just viciously.
00:32:21.320 You know what's different?
00:32:23.620 We didn't mock them for the fine people hoax.
00:32:28.080 We just tried to correct it.
00:32:30.620 That didn't work.
00:32:32.660 But mockery, oh, that's working.
00:32:36.300 Have you noticed how well it's working?
00:32:37.700 Have you seen how many people have backed off the hoax already?
00:32:43.020 Let's see.
00:32:43.540 I saw Stephen Smith say,
00:32:48.320 yeah, this is a hoax.
00:32:50.820 So Ian Bremmer, who's no fan of President Trump,
00:32:54.040 but tries to call balls and strikes,
00:32:56.700 he said, yeah, yeah, it's a hoax.
00:32:59.200 I don't think Bill Maher's going to believe it.
00:33:04.480 And there's a report that even Morning Joe backed off it when Elon called it out as a hoax.
00:33:10.660 But Morning Joe's back at it because he's not a legitimate news guy.
00:33:15.300 So I would say that what's different is we've learned to mock the machinery of the hoaxes
00:33:25.340 because we can see all the moving parts.
00:33:28.120 So first they take it out of context.
00:33:30.260 Then they put it all over social media.
00:33:32.980 Once it's all over social media, that makes it a story.
00:33:36.080 So now the story is about how all these right-wing people are trying to say it didn't happen,
00:33:42.820 take it out of context and stuff.
00:33:44.600 So now you've got a story for the mainstream,
00:33:46.840 and then the mainstream can continue lying.
00:33:49.400 Now the way the mainstream media lies is they have guests who lie for them.
00:33:56.960 Do you think that the regular news people want to say out loud,
00:34:01.820 I believe that the bloodbath thing is not taken out of context.
00:34:06.080 No, they don't want to do that because we all know that's what they're doing.
00:34:09.940 So instead they'll have some crazy batshit person come on to say that,
00:34:15.340 oh, it's real.
00:34:16.180 He really meant it.
00:34:18.240 Now, here's the best part.
00:34:22.640 This is the most mockable part.
00:34:25.040 And I would like to invite you to join me in mocking this.
00:34:29.500 This actually happened in the real world.
00:34:31.880 I swear to God I'm not making it up.
00:34:33.840 Jen Psaki and a number of other people, including George Conway,
00:34:40.760 when caught turning the bloodbath thing into fake news,
00:34:47.180 did they say, okay, you're right, you caught us, it's fake news.
00:34:52.700 It's just out of context.
00:34:54.180 Ah, you got us.
00:34:55.380 Did they say that?
00:34:56.200 No, they didn't.
00:34:58.620 They got caught red-handed with their hands in the cookie jar up to their fucking shoulders
00:35:04.060 and they doubled down.
00:35:06.760 But here's the funny part.
00:35:08.860 Do you know how they doubled down to say they were right when they're obviously wrong?
00:35:13.240 It goes roughly like this, and it's going to sound like I'm just making this up.
00:35:17.480 This is real, people.
00:35:19.860 This is real.
00:35:22.440 They said, and I'm sort of summarizing several of them.
00:35:28.380 They said, well, maybe the bloodbath was only talking about the automobile industry.
00:35:37.140 But what you don't understand is the larger pattern, and it must be seen in the larger pattern.
00:35:45.560 Now, it gets better, but just take a moment to delight in the fact that they're saying,
00:35:53.440 we know it's a lie, but it's still true because.
00:35:58.580 Whatever happens after the word because is going to be so fucking funny
00:36:03.220 that you're going to be laughing about it for a day.
00:36:06.680 So we know it's not true, but it is true
00:36:11.260 because it matches the pattern of our other hoaxes.
00:36:18.960 Now, they don't say it that way.
00:36:20.440 They say it matches the pattern of January 6th,
00:36:24.440 which was their other hoax.
00:36:29.420 It also matches the pattern of the fine people hoax of him saying things
00:36:38.680 that you shouldn't say, you know, that are divisive.
00:36:43.860 Let's see.
00:36:45.680 So this is how Jen Psaki says it.
00:36:48.680 Quote, we did not miss the full context.
00:36:53.160 This was not an off-message comment.
00:36:55.380 This is his message, Jen Psaki explained, about the bloodbath thing.
00:37:02.500 And it's the exact same message he wants MAGA cult to hear.
00:37:06.420 So they're literally saying, yes, it's not true, but it is true.
00:37:14.720 Now, that's funny.
00:37:16.860 They're actually using their other hoaxes to create what is what I'm going to name
00:37:23.820 the wrap-up hoax.
00:37:27.520 See, wrap-up news reporting is when you put out a fake story
00:37:32.500 and pretend it's true.
00:37:34.240 But in this case, they actually know that their hoax is a hoax
00:37:37.760 and they're still selling it as a little bit true
00:37:41.280 because it fits the pattern of their other hoaxes.
00:37:48.320 And that's happening right in front of us.
00:37:51.500 Here's George Conway.
00:37:54.400 And so they knew they were in trouble, which you can tell in their faces.
00:37:58.860 I'd like to give you the face of somebody who's telling you the truth.
00:38:03.260 All right.
00:38:03.480 Now my impression of a face of somebody telling you the truth.
00:38:06.880 Well, it turns out that that bloodbath thing was in a context
00:38:11.160 and we put it in context.
00:38:12.480 It was probably more about the automobiles and the economy.
00:38:16.280 See that face?
00:38:17.600 That's what the truth looks like.
00:38:19.480 Now I'd like to show you the face of somebody who's lying,
00:38:23.300 but they don't know that you know they're lying yet.
00:38:27.120 So this is different, right?
00:38:28.820 I'm getting to Jen Psaki and Conway.
00:38:31.300 But it's lying, but they still think they can get away with it.
00:38:35.300 That's when the eyes go up.
00:38:36.880 Oh, yes.
00:38:40.420 It might have been a little bit about the car industry,
00:38:43.880 but it's really eyes open.
00:38:47.240 It's really about a larger pattern.
00:38:49.960 Now that's if they think they can get away with the lie.
00:38:54.180 You're seeing a whole different thing.
00:38:56.680 Watch Morning Joe and Jen Psaki videos and stuff.
00:38:59.940 They know they got caught lying.
00:39:01.820 So they're like children who are angry they got caught lying.
00:39:07.620 Or a spouse who is angry they got caught lying.
00:39:10.840 So look at their faces.
00:39:13.800 They're tortured faces.
00:39:15.480 It's like, but they don't understand that it's really,
00:39:20.700 it's not about the fact that we made up a total lie.
00:39:23.780 No, no.
00:39:25.120 They're on the wrong point.
00:39:27.260 Why are you on the point about my total lie?
00:39:29.900 Why can't you see it in the context of all of my lies?
00:39:34.440 So their faces are like, they become like a, you know, a tortured golem face.
00:39:40.100 So look for the tortured golem faces.
00:39:42.620 It's hilarious.
00:39:43.140 So what would a lawyer say?
00:39:48.080 Here's George Conway.
00:39:49.960 What matters.
00:39:51.520 Now the what matters is after the point.
00:39:54.000 He's sort of acknowledging without acknowledging that the bloodbath thing is a hoax.
00:39:58.960 So when he uses the language, what matters is, he's allowing us to, oh, okay.
00:40:05.360 Well, that was hoax-y.
00:40:07.420 You know, maybe it was hoax-y, but what really matters, says the lawyer.
00:40:13.140 What really matters is that Trump consistently uses apocalyptic and violent language in an indiscriminate fashion as a result of his psychopathy and correlative authoritarian tendencies, and because he's just plain evil.
00:40:32.640 So it's part of this big pattern of apocalyptic and violent language.
00:40:36.660 So George Conway, being a lawyer, tries to find some lawyerly argument that being completely wrong and lying about what bloodbath meant is, if really, in a technical way, if you were to look at the full context of the Constitution,
00:40:58.700 and, and, you know, oh, I'm in a pretzel, I'm in a pretzel, I can't get out.
00:41:06.420 So it's pretty funny.
00:41:10.180 Do you know what's funnier?
00:41:12.920 Oh, it gets funnier.
00:41:14.140 So, not long after George Conway talked about Trump using this consistently apocalyptic and violent language, and my God, he's a psychopath and plain evil,
00:41:29.720 Biden posts on X that Trump must be planning another January 6th, the way he's talking.
00:41:36.920 You know, what they call a violent insurrection.
00:41:44.240 Some would call that consistent use of apocalyptic and violent language, because in the last election, Biden accused him of siding with the racists at Charlottesville.
00:41:57.160 Now, that's funny.
00:42:01.220 Watching them twist when their entire game has been exposed is just entertaining.
00:42:09.080 Am I wrong?
00:42:10.900 Are you, are you, let me see, let me just back up and see if my main theme is coming through.
00:42:17.440 Do you understand that this is completely different than 2016?
00:42:21.640 In 2016, we were also having fun, but it really seemed like there was just some problem with the news coverage, you know, it was a little bit biased, or maybe they missed something, you know, maybe there was just some herd instinct, you know, we couldn't figure out what was going on.
00:42:38.320 But now we see the whole thing as an op.
00:42:41.080 It's a very organized, it repeats, it's the same players using the same techniques, it even has a name.
00:42:49.420 It's called a Rupar.
00:42:51.640 Because it's done so often, that it needed a name so we could refer to it.
00:42:57.020 Now, that's funny.
00:42:58.880 Every time there's a Rupar, it's funny.
00:43:03.560 All right.
00:43:04.520 So, I would like to also add to this, that the, there's a new force that grew since 2016.
00:43:12.320 I'm going to call them again, the internet dads.
00:43:14.380 But it's the independent people who have credibility on the internet.
00:43:20.960 And the, the internet dads, and that includes the, you know, the, the men and women.
00:43:26.220 So that's just to keep this less sexist than it sounds.
00:43:29.920 It's just the credible people who, uh, called out the hoax and mocked it.
00:43:37.000 And we didn't have that before.
00:43:39.260 There, there was not before a large group of credible people who all at once would say,
00:43:44.960 nope, that's not true.
00:43:47.880 But that happened.
00:43:48.900 And there was a large group of men, mostly, who, who said at the first moment, that's not true.
00:43:59.200 And that really made it definitely, um, Steve, um, yeah, that is true.
00:44:05.940 All right.
00:44:06.440 So lots of examples of that.
00:44:08.060 And it's a very positive force.
00:44:11.300 And the other, the other thing that Jen Psaki wanted to bring in, besides saying it's, uh,
00:44:16.180 the bloodbath thing correlated with the January 6th insurrection hoax that they created,
00:44:20.880 it also created, it also matches his other hoax, according to them, of, uh, I don't know if you call them people.
00:44:30.060 So guess what they did?
00:44:32.460 They took another statement out of context where he was talking about the criminals.
00:44:38.280 He was specifically talking about the criminals.
00:44:40.500 I don't know if you'd call them people.
00:44:42.800 Perfectly acceptable.
00:44:44.680 Everybody talks like that.
00:44:46.180 Oh, this criminal was so bad.
00:44:47.560 They're like a monster.
00:44:48.960 Normal language.
00:44:50.140 They take it out of context.
00:44:51.500 So you think it means all migrants.
00:44:53.360 And then she used that as evidence of the pattern.
00:44:57.180 So let me show you this again.
00:44:58.560 The pattern is that the Rupar they did about bloodbath matches the Rupar they did about, I don't know if you call them people,
00:45:08.020 which is also consistent with their long-term hoax about January 6th.
00:45:14.500 So, sure, it's not true that any one of them really happened.
00:45:18.920 But you have to look at the fact that there are three things that didn't happen, and then you can see the pattern.
00:45:26.640 That's actually what they're selling to us.
00:45:28.480 And if you don't think that's funny, well, you're dead on the inside.
00:45:34.940 All right.
00:45:35.160 Let me see if I can reboot my comments here on locals.
00:45:40.560 I'm going to refresh those bad boys here.
00:45:43.440 One moment.
00:45:44.120 There we go.
00:45:49.860 All good.
00:45:51.440 All right.
00:45:52.780 So mockery is the best weapon of this election, and you should use it more.
00:45:57.680 That's why I'm so happy that Biden is staying in the race.
00:46:01.320 If you put somebody good in the race against Trump, it would be a whole different situation.
00:46:09.700 But could you imagine the next few months with Biden walking around like a robot and falling apart in front of us?
00:46:20.860 So did anybody see my clever post?
00:46:24.020 I will paraphrase.
00:46:25.220 I've been seeing online a lot of conspiracy theorists, conspiracy theorists, who are saying that Biden died a while ago, has been replaced with a body double, some say, or others say an actual robot.
00:46:44.580 Because he walks like a robot, and he is no smarter than ChatGPT.
00:46:50.840 And he hallucinates a lot of things.
00:46:54.920 A lot of things he says are not true, just like AI.
00:46:58.920 So a lot of people are saying, oh, he's AI.
00:47:03.120 He's already dead.
00:47:04.660 People.
00:47:06.340 People.
00:47:08.820 I hate to be the one to tell you, but the technology to make a person from a robot with AI,
00:47:19.240 AI, you think that's here?
00:47:22.440 You think that technologists could actually make a real looking and talking and walking robot that you wouldn't know was a robot,
00:47:33.160 and that they could pass it off as the president of the United States in front of everybody?
00:47:38.580 My God, people.
00:47:40.360 That kind of technology isn't going to be available for weeks.
00:47:46.520 Yeah.
00:47:49.240 Literally.
00:47:54.720 What I really mean is, we could do it today.
00:47:58.340 Now, I don't think Biden is a robot, but he could be.
00:48:04.520 If you think you could tell by looking at him, or by the way he acts, I say you're wrong.
00:48:12.040 I say you're wrong.
00:48:13.100 There is nothing he does or says that could not be reproduced basically today.
00:48:20.780 Basically.
00:48:21.720 You could make AI sound like him.
00:48:24.400 You could make AI talk like him.
00:48:27.080 Now, let me be a little bit careful here.
00:48:30.280 I'm not saying that you could make AI reproduce a regular person.
00:48:34.880 Biden's not regular.
00:48:35.940 He's so degraded at this point that he could mumble anything and it would just sound like Biden.
00:48:43.260 All right.
00:48:43.820 Here's my robot impression that's doing a bad impression of Biden that would be perfectly good.
00:48:49.800 Well, Ukraine, I love ice cream.
00:49:01.520 Now, if he said that, the news would report he really loves ice cream and he's passionate about Ukraine.
00:49:10.080 And we're so trained that nothing he says necessarily makes any sense that we'd be fine with it.
00:49:19.200 Now, do you think you could make a robot that could pretend to be Trump?
00:49:26.060 Not with current technology.
00:49:29.200 No.
00:49:30.360 No, because it could never be funny.
00:49:33.180 The robot could never be as funny as he is.
00:49:36.040 Never be as provocative.
00:49:37.640 It could never be as unexpected.
00:49:39.220 But could you make an AI version of Biden right now?
00:49:45.000 Yes.
00:49:46.340 All you'd need.
00:49:47.800 Remember, also, Biden's face doesn't move, right?
00:49:51.140 Biden's face is frozen.
00:49:55.740 He's got a frozen face.
00:49:57.560 How hard is it to put Hollywood makeup on a robot body if the only thing that moves is the mouth and only one side of it?
00:50:06.520 He doesn't even have a full mouth.
00:50:07.860 No.
00:50:08.880 No.
00:50:10.020 Trump.
00:50:11.140 Well, that.
00:50:12.180 Well, that's it.
00:50:13.360 He's got like half a mouth and his eyes have all the expression of open or closed.
00:50:20.560 You know how eyes are the window to the soul?
00:50:24.520 Normally, your eyes are saying a lot.
00:50:27.840 You can see, you know, passion.
00:50:30.120 You can see love.
00:50:31.680 Not Biden.
00:50:32.980 Two modes.
00:50:34.640 Open, open like he's got dementia.
00:50:38.540 Where am I?
00:50:40.980 Oh, what do I do now?
00:50:43.720 Or closed like he's evil.
00:50:46.760 Trump, January 16th, I reckon it's going to be a bloodbath.
00:50:52.600 Right?
00:50:53.380 How hard is it to make that robot?
00:50:56.220 The entire face is half a mouth and two modes of eyes.
00:51:00.840 And then obviously sometimes they change out the ears.
00:51:05.680 You've all noticed the photos.
00:51:08.280 Some photos of Biden have the earlobe distinct,
00:51:11.400 and some of them are, it's connected to his head.
00:51:14.920 So there are at least two robots with a little ear difference,
00:51:18.460 or perhaps they put different ears on the robots.
00:51:22.260 Okay, I'm just joking about the robot part.
00:51:25.260 But I'm not joking about the fact that you could recreate Biden
00:51:30.020 with a robot, with AI, today.
00:51:35.020 He's just the only person you could do it with.
00:51:39.300 Am I right?
00:51:41.040 Does anybody disagree?
00:51:42.780 He's just the only person you could do it with.
00:51:45.300 AI is nowhere near enough to do Trump or RFK Jr.
00:51:50.380 or Larry Elder, or just pick anybody who was running for president.
00:51:54.680 AI couldn't do anybody else.
00:51:56.760 Right?
00:51:57.280 They're all complicated people.
00:51:59.980 Not Biden.
00:52:01.480 I could tell you what Biden will do next month.
00:52:06.800 Nothing about it.
00:52:09.400 Can your eyes change color?
00:52:11.920 Did Biden's eyes change color?
00:52:15.680 I'm seeing a comment about eye color changing.
00:52:21.100 Huh.
00:52:21.540 All right.
00:52:27.340 So,
00:52:28.460 that's happening.
00:52:34.480 There's a study that says 75% of women want a,
00:52:38.480 they prefer a dad bod on their guy.
00:52:42.260 So women don't like their guy to be too muscular.
00:52:45.740 75% of them said they prefer sort of a little bit of meat,
00:52:49.920 a little bit of a dad bod.
00:52:52.420 But,
00:52:53.000 so we're all believing that,
00:52:54.200 right?
00:52:55.800 We all believe that?
00:53:00.120 No.
00:53:02.160 Yeah.
00:53:03.060 Don't believe that.
00:53:06.540 I've even experimented with that.
00:53:09.140 I've experimented,
00:53:10.380 you know,
00:53:10.760 letting myself gain some weight,
00:53:12.640 but also seeing what it looks like when it's off.
00:53:15.740 I can tell you that people respond to me completely differently.
00:53:20.000 If my fitness is high,
00:53:21.600 you get a completely different reaction from people.
00:53:24.220 If you didn't know that,
00:53:25.600 it's because you've never tried it.
00:53:27.640 You should try getting in really good shape,
00:53:31.480 enough so that other people would notice it when they meet you.
00:53:34.680 All right.
00:53:34.860 Now at different times in my life,
00:53:37.360 when people would meet me,
00:53:39.920 it would be one of the first things they noticed that I was in good shape.
00:53:44.280 There was a,
00:53:44.780 an author came to my house one day to write a hit piece about me.
00:53:49.420 Back when I was dumb enough to let an author spend the day with me to write a,
00:53:52.800 write a book.
00:53:54.500 So author was going to write a book and I was going to be part of it.
00:53:57.520 So he spends the day with me.
00:53:58.540 And when he writes about me,
00:54:02.000 his first,
00:54:02.640 his first part of his description is cartoonists must work out a lot.
00:54:09.120 Cause that was his first impression.
00:54:10.720 I was unusually fit for a cartoonist,
00:54:13.200 I guess.
00:54:14.120 So now everybody will respond to you in a better way.
00:54:18.580 If you're in better shape,
00:54:19.620 that's male or female.
00:54:22.060 And I don't believe there are any exceptions,
00:54:24.060 but here's what there are.
00:54:26.840 There are definitely liars.
00:54:29.540 They're definitely liars.
00:54:31.500 And why would a woman say that she likes a dad bod if maybe she doesn't?
00:54:36.460 Well,
00:54:36.880 there are a few reasons.
00:54:38.060 Number one,
00:54:38.640 they might be with a person who has a dad bod,
00:54:40.980 at which point you say,
00:54:42.780 oh yeah,
00:54:43.080 that's kind of like,
00:54:44.400 because you don't want to say you don't like it.
00:54:46.220 It's your partner.
00:54:47.400 The other thing is you might've gotten together when you didn't have both dad
00:54:51.700 bods.
00:54:52.500 Maybe both of you have a dad bod eventually,
00:54:55.180 but maybe you got together when you didn't.
00:54:57.680 And maybe that was your initial attraction.
00:54:59.760 And you don't want your guy to get too good looking because it'll leave you.
00:55:04.900 How much of it is that?
00:55:06.420 How much of it is the women don't want their guy to be better looking because they
00:55:10.940 are the jailers of their sexuality and they can't be good wardens of the jail
00:55:15.700 and keep you non-sexual.
00:55:18.760 If you're looking so good that other women are throwing themselves at you.
00:55:22.140 No,
00:55:22.920 it's probably just to keep the guy from being mate-able.
00:55:28.060 I'm definitely sure that's a thing.
00:55:29.660 But on top of that,
00:55:33.360 I have met women who legitimately did like guys with a dad bod.
00:55:39.300 So I've,
00:55:40.640 I had a friend,
00:55:41.640 I'm not going to name names,
00:55:42.720 but you know,
00:55:43.600 long ago,
00:55:44.280 female friend who,
00:55:46.280 if you were to rate her on a scale of one to 10,
00:55:48.140 would probably be a 10.
00:55:48.920 And she would tell me that she liked guys with dad bods.
00:55:55.240 And I would say,
00:55:56.540 well,
00:55:56.880 that's funny because the guy you're with is like super,
00:56:01.460 super ripped,
00:56:02.940 but you like the dad bods,
00:56:05.400 even though the guy you're with is super ripped.
00:56:10.040 And you know,
00:56:10.360 so I'm not believing it.
00:56:11.320 Right.
00:56:12.360 But then I watched her dating history going forward and she did literally
00:56:18.560 select men for dad bods.
00:56:22.020 She actually did.
00:56:23.640 So it was real.
00:56:25.720 When she was still a 10,
00:56:28.320 she selected dad bod guys.
00:56:31.440 It was the damnedest thing.
00:56:32.580 And I think she ended up marrying one.
00:56:34.680 She ended up marrying a dad bod and happily,
00:56:37.940 totally happily.
00:56:38.980 Now I think in that case,
00:56:40.400 that might be somebody who doesn't want to lose power.
00:56:45.220 I think that she wanted to have the most power in the relationship.
00:56:51.100 And if she was the beautiful one and he wasn't,
00:56:54.360 she'd have more power.
00:56:56.140 You know,
00:56:56.320 he could never do better basically.
00:56:59.580 So I always suspected that that's part of the psychology.
00:57:02.380 All right,
00:57:02.680 let's get some really provocative stuff here.
00:57:05.380 There's another study out of Finland that the more woke you are,
00:57:11.080 the less happy you are.
00:57:13.800 Is anybody surprised that the more woke you are,
00:57:17.340 the less happy you are and the more mental illness you have?
00:57:21.840 No,
00:57:22.420 we're not surprised.
00:57:23.220 But do you think that the wokeness causes your unhappiness?
00:57:28.320 Because it takes away maybe your sense of agency in the sense that if you
00:57:34.100 think that your problems are caused by the environment,
00:57:36.960 there's nothing I can do about it.
00:57:38.960 That's probably not good for your mental health.
00:57:43.920 But if you think that you do have complete control over your situation,
00:57:48.900 sure,
00:57:49.220 there might be some systemic racism,
00:57:51.600 but I'll slice through that.
00:57:52.920 It's just one more little problem I've overcome.
00:57:56.740 Then you're probably going to be happier because you feel like it's up to
00:57:59.420 you.
00:57:59.800 You know,
00:58:00.100 you're taking charge of your life and you're not letting the little things
00:58:03.300 bug you.
00:58:05.380 So definitely there's a causation in that direction.
00:58:08.320 Being woke makes you less happy,
00:58:10.340 but I would suggest that mental illness is the beginning point more often.
00:58:16.200 In other words,
00:58:16.940 that mentally ill people are woke, and people who are not mentally ill are not.
00:58:23.360 What would be a good definition of mental illness?
00:58:27.580 Well,
00:58:28.020 I'll give you one.
00:58:29.420 Well,
00:58:29.700 let me give you an example first.
00:58:31.240 Let me give you two situations.
00:58:32.500 I wake up every day and I think that things are going to go better for me than observation
00:58:41.600 would suggest.
00:58:42.900 In other words,
00:58:43.400 I'm unusually optimistic about my own future and always have been.
00:58:48.320 Is that a mental illness?
00:58:49.880 Go.
00:58:50.180 It's an unrealistic view of the world.
00:58:53.860 Is it a mental illness?
00:58:57.440 It's not based on science.
00:59:00.900 It's just how I feel.
00:59:03.520 Is it a mental illness?
00:59:04.880 The answer is no,
00:59:06.800 because it has a positive effect on my life.
00:59:10.140 Right?
00:59:10.980 So the definition is,
00:59:12.520 does it hurt you?
00:59:14.380 So you could be weird in any one of a billion ways because everybody is.
00:59:20.780 If I haven't mentioned this,
00:59:22.800 everybody's weird.
00:59:25.020 Did you know that?
00:59:26.840 I hope you're not one of these people who still thinks there are normal people,
00:59:30.740 and that you're not one of them,
00:59:32.460 that you're weird.
00:59:33.840 No,
00:59:34.480 you're just young.
00:59:36.740 That's the problem.
00:59:37.820 As soon as you're older,
00:59:38.840 you realize everybody's fucked up.
00:59:41.300 Some people hide it better.
00:59:43.200 Some people's,
00:59:43.960 you know,
00:59:44.200 weirdness is more socially acceptable,
00:59:46.940 but no,
00:59:47.460 we are all weirdos.
00:59:49.040 I mean,
00:59:49.400 we're all weird as fuck.
00:59:52.660 Right?
00:59:52.840 So once you get that,
00:59:54.060 nobody's weird.
00:59:55.700 So the reason I don't judge people,
00:59:57.520 uh,
00:59:59.040 for their idiosyncrasies,
01:00:00.840 idiosyncrasies,
01:00:03.840 yeah.
01:00:04.180 The reason I don't judge people for being weird is that I'm weird.
01:00:07.540 You're weird.
01:00:08.160 We're all weird.
01:00:09.160 And the fact that we're weird in different ways,
01:00:11.360 I don't know how to judge that.
01:00:13.660 I don't know how to,
01:00:14.880 I don't know how to judge you as the bad one.
01:00:17.840 If we're all just weird,
01:00:19.540 like how do you even rank it?
01:00:21.080 So I have a natural resistance to bigotry.
01:00:25.420 I know coming from me,
01:00:26.900 that sounds weird,
01:00:27.560 right?
01:00:28.180 Because if you believe the,
01:00:29.280 if you believe the news,
01:00:30.500 I'm the biggest bigot in the country.
01:00:32.300 But internally I don't have it.
01:00:34.580 And the reason I don't have it is that I don't see anybody's difference as
01:00:39.540 rankable.
01:00:41.420 Does that make sense?
01:00:42.980 I see the differences,
01:00:44.640 but I don't know how you'd rank them.
01:00:47.020 Now the exceptions would be if somebody is doing something illegal or doing
01:00:50.700 something bad to me,
01:00:51.840 right?
01:00:52.160 Then I can rank that.
01:00:53.400 I just don't like it,
01:00:54.560 but I don't rank you for being weird.
01:00:57.420 That we're all weird.
01:00:58.700 All right.
01:01:00.560 So here's my point.
01:01:02.080 In my opinion,
01:01:03.080 wokeness is a form of self-harm.
01:01:07.060 Wokeness is a form of self-harm.
01:01:11.340 Let me give you an example.
01:01:12.440 I would like to protect the borders and not let too many people in,
01:01:19.480 but you know,
01:01:20.120 I like the immigration and I like immigrants and I'd like high quality
01:01:24.020 immigrants with skills who can add to the country and all that.
01:01:28.500 Right?
01:01:29.040 So I consider that good mental health because I'm,
01:01:34.440 because I'm in favor of something that's good for me and good for,
01:01:38.180 you know,
01:01:38.780 people I love.
01:01:40.340 That's good mental health.
01:01:42.200 But if you want 10 million people to come through,
01:01:45.280 because it's good for the 10 million people,
01:01:48.540 even though you can be pretty sure it's bad for you,
01:01:52.580 or statistically it could be bad for you.
01:01:54.860 That's self-harm.
01:01:57.320 Isn't it?
01:01:59.440 That's self-harm.
01:02:01.220 That's somebody choosing the benefit of strangers over their own well-being.
01:02:05.720 I don't know how that's not self-harm.
01:02:09.600 Suppose you wanted to have DEI in your business and you're white.
01:02:16.120 That is something that is designed to be good for people who are not you.
01:02:20.980 And it's designed,
01:02:22.320 and you know it,
01:02:23.500 we all know it,
01:02:24.620 that you will be worse off in the short run,
01:02:27.280 at least in the short run,
01:02:28.520 probably in the long run,
01:02:29.980 but at least in the short run.
01:02:31.840 It's self-harm.
01:02:32.560 When people say,
01:02:35.200 yeah,
01:02:36.100 white people are the oppressors,
01:02:39.040 and everybody else is a victim,
01:02:42.160 that's self-harm.
01:02:45.060 I'm the bad person.
01:02:47.160 Harm me.
01:02:48.080 It's okay.
01:02:49.340 You know,
01:02:49.620 I deserve it.
01:02:50.980 It's all self-harm.
01:02:52.300 So here's our big mistake.
01:02:53.780 We treated wokeness like it's a philosophical opinion,
01:02:59.580 when it's definitely not.
01:03:01.500 It is just mental illness,
01:03:03.320 and we've decided that,
01:03:04.640 hey,
01:03:04.820 free speech,
01:03:05.980 and then the next thing you know,
01:03:07.840 women were embracing it,
01:03:09.960 all the self-harm.
01:03:11.060 To me,
01:03:11.420 it's like cutting.
01:03:13.520 Wokeness and cutting are basically the same thing.
01:03:16.740 It's,
01:03:17.280 you think there's something wrong with you,
01:03:18.660 and you've got to hurt yourself to feel better.
01:03:20.140 So we need to stop treating wokeness
01:03:26.160 like it's a political opinion.
01:03:28.580 It is primarily a mental health problem.
01:03:33.780 Now,
01:03:34.380 what about men?
01:03:36.300 Well,
01:03:36.700 it turns out that there's a seven-to-one difference in wokeness.
01:03:40.980 So women primarily are the woke ones,
01:03:43.580 and men are,
01:03:45.120 especially young men,
01:03:46.240 are far more likely to not be woke.
01:03:48.760 Far more likely.
01:03:51.260 Do you know why?
01:03:52.920 Because those men don't want to hurt themselves.
01:03:55.800 That's why.
01:03:57.180 They have good mental health.
01:03:59.540 You show me a teenage boy with good mental health,
01:04:03.040 and I'll show you somebody who's not woke.
01:04:05.880 You show me a teenage boy in bad mental health,
01:04:09.000 and low testosterone,
01:04:11.000 and can't get laid,
01:04:11.940 and he might be simping with the mentally unhealthy people
01:04:16.900 just because he thinks it's a good mating strategy.
01:04:20.020 But it's all mental health.
01:04:21.920 This is not like liberal versus conservative.
01:04:25.780 It's just mental health.
01:04:28.640 And the longer we treat it like it's something other than cutting,
01:04:32.600 the worse it will be.
01:04:33.800 Now,
01:04:34.000 the reason we can't talk honestly and plainly about it is,
01:04:37.600 or at least you can't,
01:04:38.580 because you haven't been canceled like I have.
01:04:42.360 Once you get as canceled as I am,
01:04:45.020 free speech,
01:04:45.800 baby.
01:04:47.580 How do you like me now?
01:04:50.180 So I get to do the things you can't do,
01:04:52.960 and test the room to see if it's safe.
01:04:55.680 That's what I'm doing.
01:04:56.480 You can tell that,
01:04:57.320 right?
01:04:58.020 You can tell that what I'm doing is not just content for the show.
01:05:01.840 What I'm doing is a bigger play.
01:05:03.320 It's a reframe that's pretty sticky and it could grow.
01:05:08.680 And I'm going to,
01:05:09.280 I'm going to go into this room that nobody's allowed in.
01:05:12.220 And I'm going to say,
01:05:13.360 no,
01:05:13.520 the problem is it's mostly women who are batshit crazy.
01:05:17.820 I always like to add,
01:05:19.540 I'm not saying all women are batshit crazy.
01:05:21.740 That would be batshit crazy.
01:05:23.680 Can we agree?
01:05:25.280 I'm not saying all women are batshit crazy,
01:05:28.000 because if I did,
01:05:30.160 well,
01:05:30.400 that would mean I'm batshit crazy,
01:05:32.100 because obviously that's not true.
01:05:34.560 I'm just saying that the wokeness is coming almost entirely from mentally ill women
01:05:40.020 and the men who can't say no.
01:05:43.120 So you need more men who are willing to say,
01:05:45.420 you know what?
01:05:46.540 I will not get laid so that I can tell you the truth.
01:05:52.280 Now,
01:05:52.780 I don't know if it'll make any difference in my sex life,
01:05:55.020 but I don't care.
01:05:57.220 I don't care.
01:05:58.960 If it does,
01:05:59.780 it does.
01:06:00.960 I am.
01:06:01.880 I have no fucks left to give.
01:06:04.220 I'm just trying to be useful.
01:06:07.560 By the way,
01:06:08.360 can you tell?
01:06:09.340 I don't know if you can tell.
01:06:10.660 I am actually just trying to help.
01:06:12.720 I think the country has a serious problem,
01:06:15.380 and it's all because we've framed mental health problems
01:06:19.080 that involve self-harm
01:06:20.760 as some kind of positive,
01:06:23.420 noble thing.
01:06:24.240 It's the opposite of noble.
01:06:25.720 It's just a medical problem.
01:06:27.820 And again,
01:06:29.020 if you're going to say,
01:06:30.080 am I judging them?
01:06:31.660 Not really.
01:06:32.360 Not really.
01:06:33.060 Not really.
01:06:34.080 Let me be as clear as possible.
01:06:35.940 A mental health problem is a health problem.
01:06:38.600 I wouldn't judge you if you had a sprained ankle.
01:06:41.820 I wouldn't judge you if you had cancer.
01:06:45.120 Right?
01:06:46.200 And I'm not going to judge you
01:06:47.320 if you have a mental health problem
01:06:48.780 that causes you to cut
01:06:50.140 or to commit suicide
01:06:52.340 or to take drugs
01:06:54.000 because you're trying to kill yourself
01:06:55.180 or you're looking for a reframe
01:06:57.720 because you've got imposter syndrome.
01:07:01.040 Am I judging Brian
01:07:03.560 because he has imposter syndrome?
01:07:05.920 Nope.
01:07:06.920 I don't do that.
01:07:08.040 How in the world can you rank
01:07:10.160 people's different oddities?
01:07:13.500 I got plenty of them.
01:07:15.760 I don't want to rank yours
01:07:17.820 because then I'd end up ranking mine.
01:07:21.500 You know what I mean?
01:07:22.500 Or I'd give weight
01:07:23.320 to your ranking of mine.
01:07:24.600 As long as I live in a world
01:07:26.120 in which I can't rank you,
01:07:28.660 then when I hear you rank me
01:07:30.600 and say,
01:07:30.920 oh, that's a bad thing you did there,
01:07:32.620 it just goes,
01:07:33.540 I just don't have a model to hold that.
01:07:37.040 Nope.
01:07:37.360 There are just differences.
01:07:39.740 Just differences.
01:07:41.020 Nobody's better.
01:07:41.720 Nobody's worse.
01:07:42.960 But we do need to treat mental illness
01:07:44.880 like mental illness
01:07:45.780 or we can't be a civilization.
01:07:51.100 All right.
01:07:54.300 What about Ukraine funding?
01:07:56.660 Is Ukraine funding
01:07:57.880 because it's good for America
01:07:59.560 or is it a form of self-harm?
01:08:02.620 Well, that's more of a gray area.
01:08:04.720 But Senator Michael Bennet
01:08:07.220 is threatening
01:08:08.060 that there'll be
01:08:08.620 a government shutdown
01:08:09.640 unless there's more aid
01:08:13.480 approved for Ukraine.
01:08:15.340 So in other words,
01:08:16.360 the American government
01:08:17.600 will stop functioning
01:08:18.760 if we don't give our money
01:08:21.360 to another country
01:08:22.300 that would stop functioning
01:08:23.340 if we don't give them money.
01:08:25.900 Now, some people said,
01:08:27.240 well, this is a mistake.
01:08:29.040 You know,
01:08:29.660 you should take care
01:08:30.540 of the American government
01:08:31.520 and not fund Ukraine.
01:08:33.260 Some say.
01:08:33.680 I say it's a win-win.
01:08:37.080 If we can find a way
01:08:38.380 to not fund our government
01:08:39.900 and not fund Ukraine,
01:08:43.340 I'm only being
01:08:46.500 a little bit unserious
01:08:48.140 because I do think
01:08:50.060 our government needs
01:08:50.940 to shut completely
01:08:51.980 because it just doesn't function.
01:08:54.820 It's not helping us.
01:08:56.460 If the government
01:08:57.420 were doing things
01:08:58.280 that made me better off,
01:08:59.560 I'd be like,
01:09:00.140 yeah, keep that government open.
01:09:01.520 But if you keep them open,
01:09:03.180 they're going to give
01:09:03.640 my money to Ukraine.
01:09:06.860 They're going to run up my debt.
01:09:09.240 Why do I want that?
01:09:11.280 They're going to start a war
01:09:12.320 if they have more money?
01:09:16.060 No.
01:09:18.360 All right.
01:09:19.100 I'd like to give you
01:09:19.980 the wisdom of Mike Goodwin,
01:09:23.040 who's, he passed away,
01:09:25.160 but he was my old boss
01:09:27.000 at Pacific Bell.
01:09:29.120 And he was one
01:09:31.760 of the most interesting characters
01:09:33.200 because he had a way
01:09:35.380 of summarizing things
01:09:36.640 better than I've ever
01:09:37.640 heard anybody summarize.
01:09:39.240 And he had a voting philosophy
01:09:41.660 that went like this.
01:09:44.140 If it's going to cost me
01:09:45.460 extra money, I vote no.
01:09:47.800 And I heard that.
01:09:48.800 I was like, come on.
01:09:50.460 That is so,
01:09:51.600 that's like abdicating thought.
01:09:54.320 Clearly, there are things
01:09:56.120 that are worth
01:09:56.620 a little extra money, right?
01:09:58.360 If you've got an emergency,
01:10:00.540 you know, there's a war.
01:10:01.980 I mean, obviously,
01:10:03.560 obviously, I'd say to him,
01:10:06.480 you know, sometimes
01:10:07.360 you just need more money.
01:10:09.980 And he would say,
01:10:11.100 they already have enough.
01:10:14.820 And I would say, well,
01:10:16.920 would they ask for more
01:10:18.480 if they had enough?
01:10:20.280 Can you tell how young I was?
01:10:22.700 Guess from the story
01:10:23.680 how young I was.
01:10:24.580 Why would they ask for more
01:10:26.340 if they have enough?
01:10:28.200 I would say,
01:10:29.400 with my idiot brain.
01:10:32.700 And then he would say,
01:10:34.160 they have enough.
01:10:36.800 If they have different priorities,
01:10:38.640 they can just change
01:10:39.960 their budget around.
01:10:41.780 But they could make it work
01:10:43.060 if they had to.
01:10:43.740 Do you know what it took
01:10:48.620 for me to completely
01:10:49.900 embrace his philosophy
01:10:51.120 that they already
01:10:52.460 have enough?
01:10:54.700 Age.
01:10:55.880 That's it.
01:10:57.120 Experience.
01:10:58.520 Because my job
01:10:59.480 at the time
01:11:00.000 was to make budgets
01:11:01.000 for my own group
01:11:03.640 within the phone company.
01:11:06.080 So I was the budget guy
01:11:07.400 for the, you know,
01:11:08.120 larger entity
01:11:09.400 that I was in.
01:11:10.020 And I would come up
01:11:12.440 with a budget
01:11:12.920 and all the individual
01:11:13.900 managers would say,
01:11:14.980 we can't live
01:11:15.740 unless we get 20% more.
01:11:18.300 And I'd take it to my boss
01:11:19.440 and everybody said
01:11:20.240 they wanted 20% more.
01:11:21.700 So the overall budget
01:11:22.780 was 20% more
01:11:23.720 and they all had good reasons.
01:11:25.380 So I took it to my boss
01:11:26.580 and I said,
01:11:27.480 you know,
01:11:27.780 it looks like it's going
01:11:28.340 to be 20% more.
01:11:29.340 That's what you're going
01:11:30.000 to have to ask for
01:11:30.920 when you ask your boss.
01:11:32.720 And the boss would say,
01:11:33.860 nah, I'll never get that.
01:11:35.580 Tell them to cut it 10%.
01:11:37.020 And I'd be like,
01:11:39.900 wait, what?
01:11:40.500 No, that's not even close.
01:11:42.240 They need 20% more
01:11:44.820 to do the basic things
01:11:46.620 you've asked them to do.
01:11:47.740 They can't do the project.
01:11:49.540 They can't hire the people.
01:11:50.700 They can't do anything.
01:11:52.080 They'll basically be dead in the water
01:11:53.640 unless you give them 20% more
01:11:55.620 because those were their arguments.
01:11:57.120 I was just taking forth
01:11:58.140 the arguments
01:11:58.660 from the people I talked to.
01:12:00.560 And then the executive
01:12:01.640 looked at me and he goes,
01:12:02.820 tell everybody to cut 10%.
01:12:05.140 And then I said,
01:12:07.360 okay, that's just crazy.
01:12:08.880 I didn't say it that way,
01:12:09.880 of course,
01:12:10.200 but in my mind,
01:12:10.860 I was thinking that.
01:12:11.840 I go,
01:12:12.560 clearly,
01:12:13.300 you should be able
01:12:14.620 to pick and choose.
01:12:16.040 Like maybe this one
01:12:16.940 should be cut 20%.
01:12:18.040 Maybe this one
01:12:18.860 should be getting
01:12:19.480 a little extra.
01:12:20.460 You really should be
01:12:21.380 making decisions
01:12:22.240 based on the individual need.
01:12:25.580 And then the executive
01:12:26.660 said to me,
01:12:27.640 tell them to cut 10%.
01:12:29.320 It'll be fine.
01:12:31.860 So I had to go back
01:12:33.180 and tell all these people
01:12:35.060 the completely
01:12:36.000 illogical orders
01:12:38.240 that they were all
01:12:39.560 going to cut 10%
01:12:40.460 no matter what
01:12:41.200 their business was
01:12:42.040 or what their needs were.
01:12:44.020 How do you think
01:12:44.640 that went over?
01:12:46.160 Well,
01:12:46.540 it was hard.
01:12:47.380 They didn't like it at all,
01:12:48.420 but it wasn't,
01:12:49.120 you know,
01:12:49.320 for me,
01:12:49.680 it was from the boss
01:12:50.420 so they had to deal with it.
01:12:52.680 And then I tracked
01:12:53.580 the budget for the year.
01:12:55.220 How do you think we did?
01:12:58.740 Under budget.
01:13:01.080 That's right.
01:13:01.620 After all of that,
01:13:04.140 we came in under budget.
01:13:07.260 Do you know what projects
01:13:08.600 failed because we didn't
01:13:09.600 have enough money?
01:13:11.680 None.
01:13:13.400 They pretty much all
01:13:14.460 failed for other reasons.
01:13:16.060 You know,
01:13:16.220 most of them failed.
01:13:17.440 Most things fail.
01:13:19.360 So,
01:13:19.860 yes,
01:13:20.640 I have learned
01:13:21.780 that you don't need
01:13:22.780 to give your government
01:13:23.560 more money.
01:13:24.820 You can just insist
01:13:26.120 that they use what they have
01:13:27.240 and they'll figure it out.
01:13:29.200 Could they have done that
01:13:30.160 with Ukraine?
01:13:31.700 Probably.
01:13:33.020 Because what would be different?
01:13:35.020 If you hadn't given
01:13:36.040 your government
01:13:36.860 more money for Ukraine,
01:13:38.140 what would have happened?
01:13:39.720 They would have negotiated
01:13:41.280 a peace
01:13:41.920 a long time ago.
01:13:45.360 See how this works?
01:13:47.620 All we had to do
01:13:48.900 was not give them money
01:13:51.040 and we would have gotten
01:13:52.400 a better result.
01:13:55.060 Not giving people money
01:13:56.540 is a really,
01:13:57.740 really smart thing
01:13:58.580 to do in some situations.
01:14:01.260 This is one of them.
01:14:03.440 Now,
01:14:03.960 do I have, you know,
01:14:04.840 empathy for Ukraine?
01:14:06.100 Do I want Putin
01:14:06.860 to kill them all?
01:14:08.600 No.
01:14:09.680 But I don't think
01:14:10.580 that's ever the choice.
01:14:12.220 I think we're still
01:14:13.260 at a point where
01:14:13.800 they could just make a deal
01:14:14.640 if they wanted to.
01:14:16.040 And it'd be better
01:14:16.600 than waiting.
01:14:19.040 All right,
01:14:19.620 Israel's Netanyahu
01:14:21.540 said,
01:14:22.400 here's three things
01:14:23.560 he wants.
01:14:24.580 He said,
01:14:24.940 there won't be
01:14:25.620 snap elections.
01:14:26.960 So despite what
01:14:28.080 Biden might want,
01:14:29.740 he's not going
01:14:30.220 to call elections.
01:14:31.480 He said that
01:14:32.280 their military
01:14:32.920 will enter
01:15:33.540 and control Rafah,
01:14:34.600 which is what
01:14:35.120 a lot of people
01:14:35.900 around the world
01:14:36.440 don't want to happen
01:14:37.280 because it would be,
01:14:38.320 they say,
01:14:38.900 too much of a
01:14:39.980 civilian death toll.
01:14:42.340 And he says
01:14:43.260 that Israel will fight
01:14:44.280 until, quote,
01:14:45.140 total victory.
01:14:46.580 Total victory
01:14:47.160 means they're not
01:14:47.900 going to just
01:14:49.620 let Hamas
01:14:50.880 go back in
01:14:51.440 and do what they want.
01:14:52.100 So all of Hamas
01:14:54.560 will be dismantled
01:14:55.740 and dead
01:14:56.400 and jailed
01:14:57.140 and whatever they do.
01:15:00.040 And they're currently,
01:15:01.720 there might be
01:15:02.280 an update on this,
01:15:03.400 but they were
01:15:04.100 doing some kind
01:15:05.520 of operation
01:15:05.980 at the Shifa hospital
01:15:08.020 because they believe
01:15:09.540 there are a bunch
01:15:09.980 of Hamas leadership
01:15:11.980 at the hospital.
01:15:14.960 Probably true.
01:15:16.620 All right.
01:15:17.800 Ladies and gentlemen,
01:15:19.640 the reason I don't
01:15:21.640 give you much opinion
01:15:22.460 about Israel
01:15:23.320 is that Israel
01:15:25.320 just told you
01:15:25.920 what it's going to do
01:15:26.680 and then that's
01:15:28.160 what they're going to do.
01:15:29.480 And it doesn't matter
01:15:30.200 what you think.
01:15:31.580 Now, suppose we say,
01:15:33.160 let's,
01:15:33.480 what if we don't
01:15:34.340 give them weapons
01:15:35.420 or don't sell them weapons?
01:15:37.880 Do you think
01:15:38.600 it will stop them?
01:15:39.740 No.
01:15:41.300 What if they didn't
01:15:42.260 have the right weapons
01:15:43.300 for the job?
01:15:44.360 Do you think
01:15:44.700 that would stop them?
01:15:46.140 No.
01:15:47.280 No.
01:15:47.840 So if we don't
01:15:48.700 give them any funding
01:15:49.680 for weapons,
01:15:50.940 will it make any
01:15:52.000 difference to the outcome?
01:15:55.620 I don't know.
01:15:56.980 It might make a difference
01:15:58.360 on the number
01:15:59.220 of military people
01:16:00.820 that get killed
01:16:01.540 and injured
01:16:02.080 versus maybe
01:16:03.960 the number
01:16:04.500 of civilians,
01:16:06.360 but I don't know
01:16:07.500 what the net would be
01:16:08.360 in terms of human life.
01:16:10.060 So I don't even
01:16:11.060 have a reason
01:16:11.600 to believe
01:16:12.000 the extra funding
01:16:12.800 would give you
01:16:13.520 a better outcome.
01:16:14.240 So my impulse
01:16:16.500 is to not do it.
01:16:18.760 Now,
01:16:19.400 to be clear,
01:16:21.080 I am
01:16:21.520 completely
01:16:22.660 supportive
01:16:23.760 of Israel
01:16:24.440 doing what it needs
01:16:25.280 to do
01:16:25.600 to defend itself
01:16:26.520 and I don't
01:16:27.640 take my analysis
01:16:28.480 before October 7th.
01:16:30.940 So everything
01:16:31.580 you say
01:16:31.980 before October 7th,
01:16:33.400 I'll just nod,
01:16:34.680 say,
01:16:34.920 yep,
01:16:35.700 yep,
01:16:36.040 that's true.
01:16:37.120 I just don't care
01:16:38.380 because if your country
01:16:40.300 got attacked like that,
01:16:41.400 you would not be
01:16:42.660 looking at the
01:16:43.320 historical precedents
01:16:44.540 that led up to it
01:16:45.440 and whether or not
01:16:47.060 it was a good idea
01:16:47.940 and whether or not
01:16:49.200 they were justified,
01:16:51.340 you would do
01:16:52.220 what you had to do
01:16:52.840 and that's what
01:16:53.420 they're doing.
01:16:55.180 And there's no
01:16:56.160 situation
01:16:58.060 in which they're
01:16:58.800 going to put up
01:16:59.460 with reconstructing
01:17:01.060 Gaza with their own money
01:17:02.940 so it becomes
01:17:04.000 their biggest problem again
01:17:05.240 and anybody
01:17:06.320 who imagines
01:17:07.060 that they would
01:17:07.560 even consider that,
01:17:09.240 I don't know
01:17:09.560 what planet you're on
01:17:10.380 because that's not
01:17:11.420 going to happen.
01:17:12.920 Anyway,
01:17:13.780 that's all I got
01:17:15.020 for today's show.
01:17:16.320 I'm going to say
01:17:16.860 goodbye to
01:17:17.580 all the platforms
01:17:19.260 and then I'm going
01:17:20.040 to separately
01:17:20.740 close the locals
01:17:23.140 feed
01:17:24.980 but then I'm going
01:17:25.560 to open another
01:17:26.160 one for the post show
01:17:27.560 that only the subscribers
01:17:28.740 get to see.
01:17:29.920 Thanks for joining.
01:17:31.040 Another great
01:17:32.320 coffee with Scott Adams.
01:17:33.840 See you tomorrow.
01:17:34.980 Same place.