Real Coffee with Scott Adams - July 05, 2023


Episode 2160 Scott Adams: Let's Talk About All The Fake News, False Flags And Absurd Narratives


Episode Stats

Length: 1 hour and 3 minutes
Words per Minute: 142.29
Word Count: 8,988
Sentence Count: 808
Misogynist Sentences: 14
Hate Speech Sentences: 34


Summary

Trump looks better today than he ever has before, and it's all down to one thing: the environment. The environment is just serving him up another President Trump, it looks like. And that's the theme for today: the environment, just by chance, is serving you up another Trump.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the highlight of human civilization.
00:00:07.940 Happy 5th of July. I hope you all survived the fireworks.
00:00:11.420 Maybe you've got a sore foot or a sore hand, but I hope you've all survived with all your digits.
00:00:17.540 If you need any extra fingers or toes, I recommend AI. It has lots of extra fingers.
00:00:24.020 I don't know why.
00:00:25.200 But if you'd like this experience to go up a level that nobody could ever anticipate,
00:00:28.920 all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, jug, or flask, a vessel of any kind.
00:00:35.880 Fill it with your favorite liquid.
00:00:38.300 I like coffee.
00:00:39.740 And join me now for the unparalleled pleasure, the dopamine, the day, the thing that makes everything better.
00:00:45.380 It's called the simultaneous sip, and it happens now. Go.
00:00:52.620 Ah. Delightful.
00:00:55.580 Well, Ben and Jerry's is, as you know, a very activist kind of an organization,
00:01:03.680 and Ben and Jerry themselves are very active.
00:01:06.960 And they tweeted,
00:01:07.800 It's high time we recognize that the U.S. exists on stolen indigenous land and commit to returning it.
00:01:19.340 Learn more and take action now.
00:01:21.520 They give a link.
00:01:23.960 Ashley St. Clair saw that tweet and tweeted it herself with this comment.
00:01:29.720 You stole the milk from cows to make your ice cream. Checkmate.
00:01:44.280 You stole the milk from cows to make your ice cream. Checkmate.
00:01:49.780 That's as close as you can get to a perfect joke.
00:01:57.440 It's a perfect joke.
00:02:00.300 And what makes it perfect is it walks right up to the line of reality.
00:02:07.160 Because you know that they're probably vegetarians, right?
00:02:11.940 I'm just guessing. I mean, I don't know.
00:02:14.160 But Ben and Jerry probably like to protect the animals.
00:02:16.520 And here they've been taking the animals' milk and selling it for profit.
00:02:26.080 That's just the best joke of the day.
00:02:31.040 All right.
00:02:31.900 I'm going to ask you if you feel what I feel.
00:02:36.420 Now, not in every way, of course.
00:02:38.800 But this is a zeitgeist test.
00:02:42.420 All right.
00:02:43.420 The zeitgeist is that thing that we all feel, but we haven't talked about it.
00:02:47.960 It's the thing you see forming.
00:02:50.280 There's an energy forming, and it's in a certain direction, but you haven't...
00:02:54.420 Nobody's quite talking about it yet, but it's in all of our minds.
00:03:00.240 The other day I tweeted just a little video of Trump insulting Joe Biden in a speech.
00:03:07.000 Just a little tweet.
00:03:09.800 2.9 million views.
00:03:13.580 Just a tweet.
00:03:16.260 Did you hear how many people are showing up at his rallies?
00:03:20.080 The rallies are just lit.
00:03:22.780 They're just insane, apparently.
00:03:26.140 Yep.
00:03:27.440 Can you feel it?
00:03:30.500 Can you feel it?
00:03:31.960 It's happening again.
00:03:36.300 That motherfucker is doing it again.
00:03:39.380 He's just slowly climbing up the...
00:03:42.020 You know, climbing up your mental...
00:03:44.520 I don't know.
00:03:46.420 Your mental menu until he's the only thing on the menu.
00:03:50.440 I don't know how he does it exactly.
00:03:52.540 I mean...
00:03:53.380 But some of it...
00:03:54.400 Some of it is the environment is just serving him up.
00:03:57.780 And that's going to be the theme for today.
00:03:59.500 The theme for today is the environment, just by chance, is serving you up another President
00:04:09.060 Trump, it looks like.
00:04:10.640 That's what it looks like.
00:04:11.780 It just feels like the entire energy of the country is starting to form around it, and
00:04:17.500 it's now becoming obvious.
00:04:19.280 I'll give you some examples as we go.
00:04:21.340 All right.
00:04:21.820 We'll just talk about the headlines, but watch how this thesis comes together.
00:04:26.480 Well, here are some areas where I think you would all agree that Trump looks better today
00:04:33.300 than he ever did, right?
00:04:35.900 Immigration.
00:04:37.940 As France is burning, do you think Trump is looking good on immigration?
00:04:44.720 Never better.
00:04:46.500 Never better.
00:04:48.220 He is at the all-time peak of looking right on immigration.
00:04:52.920 He has never been this high on immigration before, not even close, in my opinion.
00:05:01.480 How about the January 6th?
00:05:05.000 What about the stain of January 6th that was over him?
00:05:08.980 Did that work out just the way the Democrats hoped?
00:05:11.920 It did not.
00:05:13.640 It did not.
00:05:14.660 Because we live in a country where we don't trust that the people who were rounded up really
00:05:20.120 did something that deserved it.
00:05:21.440 We saw Hunter Biden getting a sweetheart deal, while January 6th people got the opposite.
00:05:29.040 Everybody noticed.
00:05:31.180 Everybody noticed.
00:05:32.860 You think the Democrats didn't notice?
00:05:35.100 No, they noticed.
00:05:36.760 Even the Democrats.
00:05:38.560 Well, actually, I have a...
00:05:40.180 I believe I have a Rasmussen poll on this.
00:05:42.740 67% of voters believe it is likely that Hunter Biden received favorable treatment from the prosecutors
00:05:51.040 because of his father.
00:05:53.520 Two-thirds of voters.
00:05:55.480 Two-thirds of voters said Hunter got a sweetheart deal.
00:05:59.000 At the same time, the January 6th people, probably some of them still rotting in jail for not much.
00:06:05.880 Now, of course, some of them did terrible things.
00:06:07.800 Nobody ignores that.
00:06:12.840 So, let's see.
00:06:13.880 How many people do you think don't think Hunter got a favorable treatment?
00:06:18.780 And this time, let's see.
00:06:20.480 Let's see if you can get it within three points.
00:06:25.640 Within three.
00:06:26.640 Let's see if you can get it within three.
00:06:27.940 Try again.
00:06:31.240 You have to get it within three.
00:06:34.380 28.
00:06:34.960 That is right.
00:06:35.740 28.
00:06:36.720 Hey, how did you do that?
00:06:38.860 That's amazing.
00:06:40.140 So many of you got the right...
00:06:41.140 It's 28%.
00:06:42.040 Good guess.
00:06:44.360 Yeah, 28% think it was a perfectly reasonable deal.
00:06:47.060 All right, if you're new to the live stream, there's a running joke that we've been doing for months
00:06:55.140 that for any important question, 25% of the respondents will have the dumbest fucking answer you've ever seen in your life.
00:07:03.820 Or roughly 25.
00:07:05.320 For anything.
00:07:06.140 It doesn't even matter the topic.
00:07:07.600 It's not the same 25%.
00:07:09.160 It's not the same people.
00:07:11.440 I don't know why it's true.
00:07:13.320 I just don't know why.
00:07:14.380 It's just very consistent that 25% are just wrong about everything.
00:07:19.180 Different 25.
00:07:21.540 All right.
00:07:22.960 Let's see what else is going on.
00:07:25.200 Have you noticed that the climate is not...
00:07:28.060 The climate is not cooperating with the CO2?
00:07:35.000 Which, to me, is one of the funniest things that's happening.
00:07:37.780 Because even nature is fucking with the Democrats now.
00:07:40.980 It's like they lost nature.
00:07:42.760 You know, it's bad enough that you lost 67% of the general public.
00:07:48.240 They just lost nature.
00:07:51.920 Remember when nature was on their side?
00:07:54.300 And the Democrats would say, hey, it's getting warmer because of the CO2.
00:07:57.940 And then you'd check the temperature.
00:08:00.340 And you'd be like, damn.
00:08:02.400 These Democrats got something going on here.
00:08:04.460 This is warmer.
00:08:05.140 And then the next year, they'd say it again.
00:08:09.460 And you're like, well, you can't be right.
00:08:10.680 Oh, wow, you're right again.
00:08:12.600 Shoot.
00:08:13.760 It's getting warmer.
00:08:15.940 And then the last few years didn't really serve up what people were expecting.
00:08:21.260 If you follow the hashtag climate scam on Twitter, you're going to see a whole bunch of people who purport to have data claiming that there's no such warming.
00:08:37.920 But at the same time, and I say this so I don't get demonetized, there are experts still claiming at the same time that the evidence shows warming exactly like they expect.
00:08:50.640 So there it is.
00:08:52.020 There's your totally reliable news of the day that if you look at the data and you listen to the people who really have looked into it, they'll have completely opposite opinions of what even the data says.
00:09:04.860 Forget about the hypothesis you put on top of it about warming.
00:09:09.700 Forget about that.
00:09:11.280 The experts, or at least the Twitter talking experts, can't even agree if it's getting warmer.
00:09:20.240 You'd think that'd be pretty basic.
00:09:22.020 Is it getting warmer?
00:09:23.660 Well, we don't know.
00:09:25.040 Well, how about something easier?
00:09:26.580 Are the glaciers melting, staying the same, or getting bigger?
00:09:33.080 Well, as of this morning, the people who looked into it on the Internet can't agree.
00:09:38.400 They can't even agree if the glaciers are getting bigger or smaller or anything.
00:09:43.680 So I still think that CO2 probably adds to warming.
00:09:49.500 I just don't think the projections about it are necessarily credible.
00:09:53.760 However, this situation is very Trump-friendly, isn't it?
00:09:59.080 Imagine if we go another year without anything that looks like significant warming.
00:10:05.600 By the way, let me say it carefully, that would not disprove climate change.
00:10:10.100 You could have five or even ten years of not much warming or even a little cooling, and that would not, I want to be very careful about that, that would not disprove climate change.
00:10:21.960 It would certainly make you scratch your head, right?
00:10:25.700 It would make you look a little more closely at the other numbers.
00:10:28.700 It would certainly raise a red flag.
00:10:30.740 But it doesn't get quite to the level of disproving it, because you would need a much longer time period to do that.
00:10:40.100 Because even the pro-CO2 people, the people who are saying it's behind the warming, even they don't say it's the only factor.
00:10:50.080 Nobody says that.
00:10:51.020 So the other factors could bounce it around, but the idea would be over time it's definitely going to trend up, even if a few years is down.
00:11:00.880 I don't know what's true, but in the context we're talking about it today, I will say that that situation is more friendly to Trump than at any time in the past.
00:11:12.960 Would you agree?
00:11:13.500 That there's never been a moment in modern American history, let's say the last 20 years, there's never been a moment when the external facts were friendlier to skepticism.
00:11:26.940 Is that true?
00:11:28.320 Would you say that the data is supporting the skeptics, albeit temporarily and albeit not proof of anything?
00:11:37.100 Not proof of anything.
00:11:38.380 But it just happens to be moving in their direction at the moment.
00:11:41.040 So, how about Ukraine?
00:11:45.500 How about the situation in Ukraine?
00:11:46.940 We'll talk about the false flag.
00:11:49.220 But does that look like it's friendlier to a Trump administration?
00:11:53.320 Yes.
00:11:54.320 Yes, it is.
00:11:55.480 Because people actually do believe he could end it in a day.
00:11:59.980 Do you think the Democrats believe that?
00:12:02.400 Do you think the Democrats believe that Trump could end the Ukraine war in a day?
00:12:06.660 Now, I will say, if you remove the hyperbole, he doesn't literally mean a day.
00:12:16.380 Right?
00:12:17.120 He doesn't mean 24 hours.
00:12:18.820 But he could.
00:12:19.900 He could.
00:12:20.700 I think it's actually possible.
00:12:23.280 It's within the realm of possibility that he could do it in one day.
00:12:26.500 Because you just have to agree to a ceasefire and then work out the details.
00:12:32.060 But I don't think so.
00:12:34.160 I mean, if I had to bet on it, I wouldn't bet anything happens in one day ever.
00:12:38.860 You know, we don't live in the one day world.
00:12:41.300 But do you think the Democrats doubt?
00:12:45.500 Do you think they doubt that Trump could end it?
00:12:47.560 I don't think they do.
00:12:50.900 I think that they grudgingly, and not all of them, of course, right?
00:12:56.240 Some are so dug in that they would never say he could do anything.
00:12:59.560 But there's got to be a solid 25% of Democrats who are saying to themselves,
00:13:06.000 well, Biden definitely is not getting us out of this war.
00:13:09.680 That seems clear to everybody.
00:13:11.720 Biden is not getting us out of any wars.
00:13:14.740 But would Trump?
00:13:15.580 You don't know.
00:13:18.480 He might.
00:13:20.220 He would try.
00:13:21.540 Nobody's trying right now.
00:13:23.720 So you don't think that 25% of Democrats kind of prefer peace?
00:13:29.560 No matter how hypnotized or brainwashed they are about Ukraine.
00:13:33.440 You don't think 25% are solidly holding out with independent thought,
00:13:38.700 and they have an anti-war bias, and they're thinking,
00:13:41.540 I hate Trump.
00:13:43.040 I freaking hate Trump.
00:13:45.580 But he might stop this war.
00:13:47.680 Everything's going his way right now, like in a very substantial way.
00:13:55.580 How about Hunter's laptop and the corruption that we all saw there?
00:14:00.920 It just proves everything about Trump was right, right?
00:14:03.860 The Hunter laptop story wildly helps Trump.
00:14:06.580 How about the cocaine that was just found in the White House?
00:14:09.860 All right, we'll talk about that more.
00:14:11.440 Does that help Trump?
00:14:12.340 Do you think Trump looks better now that they found cocaine in the White House?
00:14:18.200 Yeah, of course.
00:14:19.480 It's one of those weird, not really important stories.
00:14:23.020 It's probably the least important thing that's happening in the world,
00:14:25.940 is a bag of cocaine in the White House.
00:14:28.360 Nothing's less important than that.
00:14:29.980 But in our minds, it just makes the non-drinking, non-drug-taking Trump look like a superstar.
00:14:40.220 I mean, really?
00:14:40.880 You let a bunch of degenerates in the White House with their cocaine?
00:14:44.980 That's what people are going to say, not me.
00:14:47.760 All right.
00:14:48.040 And then, of course, Biden himself is decomposing right in front of our eyes.
00:14:57.920 So everything's looking friendly for Trump at the moment.
00:15:00.540 I think on the economy, the economy might be a jump ball.
00:15:06.420 I think on the economy, people will just retreat to their parties.
00:15:09.640 Because there is an argument for Biden being good on the economy.
00:15:14.340 That's not my opinion.
00:15:15.360 Again, the data simply suggests that.
00:15:18.960 Because we're coming, you know, even inflation is slightly coming down.
00:15:23.640 But directionally, directionally, things will look like they're improving.
00:15:28.160 Now, we do have a potentially insolvent insurance and banking industry,
00:15:33.160 but I think we'll work our way through it.
00:15:35.860 I think.
00:15:37.600 But anyway, the situation, I think, is ambiguous enough.
00:15:41.140 You could argue that Trump could do better in X ways.
00:15:45.280 And you could argue that Biden actually did pretty well, given the situation.
00:15:49.620 You could argue that neither president has anything to do with anything.
00:15:53.560 We just have a strong free market, and it handled it.
00:15:57.780 So it wasn't Biden.
00:15:59.480 Just the free market did what it does.
00:16:02.020 So I think that economics is going to be a jump ball.
00:16:05.020 So that's not going to favor Trump substantially.
00:16:10.760 All right, here's a story that feeds into this theme.
00:16:15.320 The Wall Street Journal is talking about how there's a growing Republican consensus
00:16:20.280 about using the military against the cartels.
00:16:23.720 Here is a sentence from the Wall Street Journal reporting today.
00:16:28.040 Quote,
00:16:28.440 There is a simple reason the idea of military intervention keeps cropping up.
00:16:34.060 It is popular, and not just with the Republicans.
00:16:37.900 In an NBC poll taken in late June, sending troops to the border to stop drugs
00:16:41.860 was the single best-liked of 11 GOP proposals tested with Republican primary voters.
00:16:51.920 Hold on.
00:16:53.080 Hold on.
00:16:53.780 And it was the only one that gained support from a majority of all registered voters.
00:17:04.340 All registered voters.
00:17:11.720 I have two words to say about that.
00:17:16.300 Two words.
00:17:19.160 You're welcome.
00:17:20.180 You're welcome.
00:17:23.780 How do you like me now?
00:17:28.120 Now, of course, it could turn into a horrible debacle, depending on how it gets executed, if it does.
00:17:36.720 But there you go.
00:17:42.420 So we'll see if it happens.
00:17:43.980 I mean, it still requires a Republican president.
00:17:46.840 So by no means does it mean any of this is going to happen.
00:17:51.220 By no means.
00:17:52.240 But the public is on the side.
00:17:56.160 The public's on the side.
00:17:58.360 Here's a true anecdote.
00:18:01.940 Recently, a friend told me that another friend of theirs died from fentanyl overdose.
00:18:09.280 How many of you have heard that story recently?
00:18:12.680 Somebody you know died of a fentanyl overdose.
00:18:15.080 Tell me.
00:18:15.740 Just somebody you know.
00:18:17.240 Personally.
00:18:17.640 Yep, I have.
00:18:19.680 Yes, yes, yes.
00:18:21.560 Some no's.
00:18:22.700 But a lot of yes's.
00:18:25.220 Have we reached a point where most of the country knows somebody who died of fentanyl?
00:18:30.840 All right.
00:18:31.240 You want to have your brain broken?
00:18:33.700 There's a second part to the story.
00:18:38.560 So I said, how many people do you know who have died of fentanyl overdose?
00:18:43.620 Like, for sure, died of fentanyl overdose.
00:18:45.400 A person said, approximately 10.
00:18:53.500 10.
00:18:55.860 And I said, 10?
00:18:57.980 In what period of time?
00:18:59.600 Two years.
00:19:03.680 Two years.
00:19:06.120 10 people this person knows well.
00:19:09.600 10.
00:19:11.100 10 people, two years.
00:19:17.480 All right.
00:19:21.020 Next topic.
00:19:22.040 So I guess a U.S. district court just ruled, and this might be a big deal,
00:19:28.840 that governments can't coerce social media to censor news.
00:19:34.840 Now, I don't know how this will actually play out in the real world.
00:19:39.680 Because it's a little like affirmative action.
00:19:42.580 I'm not sure the government has to twist any arms.
00:19:47.020 Am I right?
00:19:48.120 Do they really need to twist any arms?
00:19:50.700 They don't really.
00:19:51.380 They just need to whisper it.
00:19:53.760 Hey, Mark, you know, it'd really be nice if your platform, you know, just had a little
00:20:02.840 less of this or that.
00:20:05.040 And then Mark says, oh, that's what my side wants.
00:20:09.560 My side wants less of this or that.
00:20:12.480 Is that the government coercing people?
00:20:14.500 Or is that just somebody knowing what team he's on?
00:20:17.700 And then he finds out what the team wants.
00:20:19.300 Is that coercion?
00:20:22.340 Not really.
00:20:24.420 But yes.
00:20:26.020 But not really.
00:20:27.680 But it totally is.
00:20:29.680 But not really.
00:20:30.700 I'm not sure that this ruling makes any difference.
00:20:34.920 Do you think the ruling will make any difference at all?
00:20:38.420 Maybe in the form of the coercion.
00:20:41.300 But not in the coercion itself.
00:20:42.900 The coercion should be exactly the same.
00:20:45.720 We'll see.
00:20:46.400 We shall see.
00:20:48.560 We shall see.
00:20:54.000 So where's Prigozhin?
00:21:00.160 Prigozhin.
00:21:00.780 He's in Belarus?
00:21:02.480 Belarus?
00:21:03.140 Is he partying it up in Belarus?
00:21:05.240 Is anybody having good selfies?
00:21:06.840 Has he published a lot of selfies lately?
00:21:12.620 How many of you have figured out that you need to listen to me to figure out what the news really is?
00:21:17.840 Well, he might be alive, but he's definitely under Putin's control.
00:21:25.520 Putin would have been silly to kill him right away.
00:21:28.780 That would have been the wrong move.
00:21:30.360 That's not good dictatoring.
00:21:31.680 Good dictatoring is you capture him, you interrogate him, you find out who's working with him.
00:21:37.020 And when you're completely sure that you've tortured him to the point where he's really given you everybody,
00:21:41.680 then you kill him as a mercy.
00:21:44.500 That would be good dictatoring.
00:21:46.280 So we assume that Putin is a high-level operator in the dictatoring.
00:21:51.220 So his dictatoring probably included this very obvious move.
00:21:54.660 Have you seen the news embarrassingly admit that they had been fooled into thinking he went to Belarus
00:22:01.780 when obviously that never happened?
00:22:04.200 You've seen all those stories?
00:22:06.260 Correction.
00:22:07.540 We said he went to Belarus, but it turns out it was only his plane, as far as we know.
00:22:13.940 You've seen all the corrections, right?
00:22:16.020 No.
00:22:17.000 No.
00:22:17.340 No corrections.
00:22:17.840 How long do we have to wait, without seeing a picture of him, before we can admit he's not in Belarus?
00:22:26.140 How long?
00:22:28.820 All right.
00:22:29.560 Well, you heard it here first.
00:22:32.980 All right.
00:22:33.780 You probably saw in the news that there's a Ukraine-built nuclear power plant that Russia has control of now.
00:22:42.720 So they control that territory and they control the plant.
00:22:45.060 And the Ukrainians say that the Russians have planted explosives on the top of it
00:22:50.400 and they're planning to blow it up or to missile it or drone it.
00:22:56.260 It's called the Zaporizhzhia nuclear power plant.
00:23:00.840 Oh, you all believe that, right?
00:23:03.080 Do you all believe that Russia will attack a nuclear power plant on the territory it controls?
00:23:11.200 Its own territory.
00:23:11.980 It's going to blow up a multi-billion dollar asset of their own, one that they use on their own territory that they've captured.
00:23:25.440 Here's what's different in 2023.
00:23:29.920 In 2023, we all know it's a false flag.
00:23:32.960 Well, literally all of us know that.
00:23:36.480 We've all been trained.
00:23:38.360 We finally learned what it looks like, how they do it, and how the news will cover it.
00:23:44.780 And the hilarious thing about this is that everybody spotted this at the first minute.
00:23:50.560 It was like Zelensky's on TV.
00:23:52.240 It's like, well, the Russians, they seem to have put explosives on the top of the nuclear power plant.
00:24:00.240 No, they didn't.
00:24:01.280 You fucking troll.
00:24:02.900 You goddamn fucking bastard.
00:24:05.060 They did not do that.
00:24:06.200 You did that.
00:24:07.120 We know you did that.
00:24:08.520 It's the Nord Stream all over again.
00:24:10.700 We've seen this play.
00:24:12.100 Don't lie to us, you little fucking troll.
00:24:14.920 You know they're not going to blow up their own nuclear power plant.
00:24:19.840 They didn't blow up their own pipeline.
00:24:21.980 They're not going to blow up their own nuclear power plant.
00:24:25.420 Is it a fucking coincidence that that would be the red line?
00:24:30.200 Oh, what a coincidence.
00:24:32.180 In order to give the Ukrainians massive NATO support, all it would require is for Putin to do something careless or stupid with a nuclear asset.
00:24:43.100 Yes, that's all it would take.
00:24:45.060 Oh, surprise, surprise.
00:24:46.920 All right, let me just lay down some probabilities.
00:24:57.280 If this thing goes the way it looks like it's going to go, and there actually is a false flag attack that we can all see, like we just assume it's a false flag, it's President Trump.
00:25:12.800 I give you President Trump.
00:25:18.220 Just put him in office tomorrow.
00:25:21.120 You don't even have to wait for the election.
00:25:23.240 Because, first of all, the war is over.
00:25:27.600 If Ukraine tries to do that to the United States, give us this false flag thing, or maybe it's our military.
00:25:34.880 Maybe it's NATO.
00:25:35.780 I don't know.
00:25:36.120 Whoever it is.
00:25:36.820 But whoever does this, the war is over.
00:25:40.000 The war is over.
00:25:40.660 Right?
00:25:42.800 One way or another, the American support will disappear.
00:25:49.280 Now, you could say, wait a minute.
00:25:52.680 That's exactly Putin's plan.
00:25:55.760 Putin's going to do this because they know that we'll think it's a false flag, and then they'll reduce support.
00:26:04.320 So maybe it's a Putin plan.
00:26:07.200 Chess player Putin.
00:26:08.860 You never know.
00:26:09.660 Well, I do think we will know.
00:26:13.060 That's what I think.
00:26:14.300 I think we'll know.
00:26:16.040 Just like we know the pipeline.
00:26:18.440 You know, maybe not on day one.
00:26:20.120 On day one, we might not know.
00:26:22.100 But we'll know.
00:26:23.740 We're going to figure it out, and we're going to hold responsible anybody who was behind it.
00:26:28.640 You're not going to get away with this.
00:26:32.200 Like, maybe in the short run in some way, but no way.
00:26:36.200 Let me just say, if there's anybody that's allegedly on our side, allegedly on our side who does that, they have to die.
00:26:46.480 I mean, that's got to be a death sentence.
00:26:49.720 You start a war on fake pretense.
00:26:52.460 I don't care if you're the president of the United States.
00:26:54.280 That's a death sentence.
00:26:56.400 It should be.
00:26:57.660 It should be.
00:26:58.260 I mean, with the courts, not talking about a revolution.
00:27:01.780 I think the courts and the law, they don't, but they should support executing anybody who would do a false flag to start a nuclear war.
00:27:12.040 I mean, that seems basic.
00:27:14.580 All right.
00:27:17.300 The reparations task force in California continues to amuse, and they've added to their claims that they would like to drop the law.
00:27:28.760 They want to get rid of the law that makes it a crime to urinate in public.
00:27:35.340 So they would like millions of dollars per person, and they would like to urinate in public.
00:27:43.240 Now, we're not quite there, but you can see the logical endpoint here, can't you?
00:27:49.100 The logical endpoint, we're almost there, is that the reparations task force would like to recommend,
00:27:55.760 and I feel like they're just sort of crawling up to it, you know, cat's on the roof, they don't want to break it to us all at once.
00:28:02.120 But what they really want is for Californians to pay black Californians a massive amount of money,
00:28:10.280 and then let the black Californians piss on them.
00:28:14.780 So you want to combine those two things.
00:28:16.600 So it's not good enough that you just give money.
00:28:19.380 You want to give money and then have them piss on you in return.
00:28:22.760 And that's apparently what they're asking for.
00:28:24.980 So right now it's just they want to urinate on sidewalks, I guess.
00:28:28.540 But, you know, obviously that's going to be, you know, scaling up to eventually they'll just be whipping it out
00:28:35.520 and spraying all passersby.
00:28:37.720 Thanks for the reparations.
00:28:39.360 Hey, everybody.
00:28:40.360 Thanks for the reparations.
00:28:42.080 Yeah.
00:28:42.720 Come on by.
00:28:43.760 Thanks for the reparations.
00:28:44.860 So it's going to be like that.
00:28:46.740 I think that's the end point.
00:28:50.460 All right.
00:28:51.920 I saw a video from Brian Roemmele.
00:28:57.740 He shows an example of AI making a Pixar-style movie just from a script.
00:29:04.840 Now, not from Pixar's script, I don't think, but just from some other script.
00:29:10.000 And the animation looked Pixar quality.
00:29:17.220 Now, I don't think they're quite at the point where you could feed a script into it
00:29:22.040 and a movie comes out the other end that you'd want to watch.
00:29:24.340 But we're very close to a Dilbert movie that requires me to do little or no work.
00:29:33.500 Now, I don't know if anybody would watch it, but the technology to make it just from a script is very high.
00:29:42.240 Very high right now.
00:29:43.040 So maybe that'll happen.
00:29:45.400 If I had to guess, that's a year away.
00:29:49.360 Maybe less.
00:29:50.420 Maybe less than one year, AI will be able to make a full movie with hitting all the beats
00:29:56.040 and having basically a fully edited movie.
00:30:00.100 Maybe less than a year.
00:30:01.420 I continue with my prediction that humans will not want to watch AI-generated art of any kind.
00:30:14.780 Is that contrarian?
00:30:16.840 I don't know if anybody else is saying that.
00:30:19.040 And the reason is that we enjoy art made by humans because of our knowledge that it came from humans.
00:30:25.660 If you take away your knowledge that a human made it, it'll lose all interest.
00:30:29.360 Because when we're reading it, we're not just reading it for entertainment.
00:30:33.640 There's some connection with the creators that's important to the process.
00:30:38.360 Like a mental connection.
00:30:39.740 You know that somebody wrote it, somebody acted in it.
00:30:42.940 As soon as you take away all the human elements, you would just be watching a computer's daydream.
00:30:50.420 No interest whatsoever.
00:30:52.880 What I mean is, if I see a movie that's like, ah, this is the best movie,
00:30:57.340 I think about the director, I think about the writers, I think about the skill of the actors,
00:31:03.560 I even think about the costume people.
00:31:07.860 So when I see a great movie, I'm thinking about the talent of the humans who put it together,
00:31:13.100 and that's what thrills me.
00:31:14.920 I'm thrilled by the human talent.
00:31:17.560 Even though I don't necessarily conceptualize it that way when I'm watching it,
00:31:21.000 I'm just sort of enjoying the jokes.
00:31:22.600 So I've been re-watching, in my opinion, the best television show of all time called Modern Family.
00:31:29.680 Ran for many years.
00:31:31.320 So I'm starting from the beginning to re-watch it.
00:31:34.220 And I'm having that experience.
00:31:36.240 As I'm watching Phil Dunphy say the funniest things you've ever seen on television,
00:31:41.860 I'm thinking about the writers.
00:31:44.380 The human writers.
00:31:45.540 I'm thinking, how did they come up with that?
00:31:48.280 And that's what makes me, like, really engaged.
00:31:53.280 If an AI had written the same joke and had a CGI that looked just like Phil Dunphy, the character,
00:32:01.260 I don't know that I would enjoy it.
00:32:05.500 I think it would just be flat in my mind.
00:32:08.560 Even if it was just the same, I think it would be flat.
00:32:11.640 And I think that would be true with music, humor, movies, and probably books.
00:32:20.420 And I don't know if that's going to change.
00:32:23.700 It might be that we just lose interest in all of those forms of entertainment
00:32:27.300 because people stop doing it and computer-based stuff isn't interesting.
00:32:32.040 We just might just stop being entertained in the same way.
00:32:36.720 But we'll see.
00:32:37.700 You know, the most likely place this will end up is that the same creators that are creating now,
00:32:45.320 like me, will just use the tools.
00:32:48.180 For example, instead of replacing me to make Dilbert,
00:32:52.860 it's far more likely that I will use an AI tool with my art director
00:32:58.120 and we will slightly animate the comic.
00:33:02.100 So I should be able to, very soon, say,
00:33:06.280 Wally's at the table with Dilbert,
00:33:08.820 Wally's back is to us,
00:33:11.540 and he says the following things.
00:33:13.560 And then watch it come alive.
00:33:14.980 And then Wally talks, you hear it,
00:33:17.600 and then he turns around, he gesticulates.
00:33:20.840 So within a year, for sure,
00:33:23.600 I should be able to create comics with my art director.
00:33:27.280 But there's still going to be lots of trial and error.
00:33:31.160 Try this, try this, adjust that, adjust that.
00:33:33.580 So you can't take the humans out
00:33:35.000 because the humans are the only thing telling you
00:33:37.040 if you've done it right so far.
00:33:39.460 Right?
00:33:40.440 The human can say, try this, try this, try this,
00:33:42.820 but only the human can tell you if it's working
00:33:45.560 because they can feel it.
00:33:47.600 The computer can't do that.
00:33:50.200 All right.
00:33:50.780 I saw that there's still an effort
00:33:54.160 to pack the Supreme Court.
00:33:57.940 And I thought to myself,
00:34:01.020 all right, Biden already had a task force
00:34:03.560 on asking that question,
00:34:05.640 and Biden's own task force said no.
00:34:08.740 Did you know that?
00:34:10.180 That Biden actually had a task force some time ago.
00:34:13.460 And the task force came back and said,
00:34:15.160 no, don't do it, don't pack the court.
00:34:18.100 So it's been asked and answered.
00:34:20.780 But that doesn't mean it's done.
00:34:22.820 So I wanted to see who is dumb enough
00:34:25.380 and worthless enough and terrible enough
00:34:29.060 that they'd be working on trying to pack the court even today,
00:34:33.000 even after Biden's own people said don't do it.
00:34:36.720 Who would actually do that in our government?
00:34:40.840 Jerry Nadler.
00:34:42.880 Jerry Nadler's on that team trying to push.
00:34:46.120 I mean, there's some others as well.
00:34:47.260 But I'm genuinely curious about that guy.
00:34:53.760 Is he even, does he wake up in the morning
00:34:57.500 and try to do the people's work?
00:35:02.860 Like, I don't really even know what's going on with him.
00:35:05.480 Because he doesn't even seem slightly interested
00:35:08.420 in the well-being of the country.
00:35:09.840 You know, you could easily,
00:35:12.440 you could defend him by saying,
00:35:15.000 well, he's just being a good Democrat
00:35:16.340 and he's doing what a lot of Democrats want him to do,
00:35:19.840 so he's just serving his base.
00:35:22.760 Maybe.
00:35:23.840 But it doesn't look like that, does it?
00:35:25.880 It looks like there's something else going on with him
00:35:28.820 that's not like other people.
00:35:32.220 It's like he's intentionally picking
00:35:34.020 the worst possible thing to support.
00:35:37.220 And I can't believe that his brain
00:35:40.520 is so dysfunctional he doesn't know
00:35:42.720 he's intentionally picking the worst possible things.
00:35:46.360 He acts as though he's in the pocket
00:35:49.160 of a foreign adversary.
00:35:51.100 Am I wrong?
00:35:53.060 And I'm not saying that about AOC.
00:35:56.120 That's not something I just say about Democrats.
00:35:59.100 I'm not just like insulting Democrats.
00:36:01.360 I don't think AOC works for China.
00:36:03.800 I think AOC works for herself.
00:36:05.280 And I think actually she tries to do things
00:36:08.160 that her people want.
00:36:09.180 And, you know, she's at least semi-believable,
00:36:12.000 even if you don't like her policies.
00:36:13.860 But Nadler, he doesn't look like a real politician.
00:36:19.320 Right?
00:36:20.580 He looks, he looks like he's not trying.
00:36:24.620 AOC looks like she's trying.
00:36:26.680 You know, and she can get silly sometimes
00:36:28.620 with her arguments.
00:36:29.760 But he doesn't look like he's trying.
00:36:33.180 He looks like he wakes up trying to hurt the country.
00:36:36.740 I mean, it just looks like that.
00:36:38.220 I'm not reading his mind.
00:36:40.220 And I'm not even accusing him of anything.
00:36:42.260 I'm saying he acts like somebody
00:36:43.700 who's not even trying
00:36:44.720 to be on the side of the Democrats or anybody.
00:36:48.260 Just anybody.
00:36:49.320 He looks like he's on China's side.
00:36:51.620 Just trying to, like,
00:36:52.580 screw everything over a little bit.
00:36:53.960 All right.
00:36:56.360 Well, it just looks like that.
00:36:57.900 Anyway, I saw a tweet by,
00:37:01.640 oh, Zion Lights.
00:37:08.540 That's her name, Zion Lights.
00:37:12.760 And she said,
00:37:14.640 oh, hold on.
00:37:18.180 I pasted the wrong thing in there,
00:37:19.620 but I remember what it was.
00:37:20.600 So if you look at the UAE,
00:37:24.320 the country, UAE,
00:37:26.740 apparently they ramped up
00:37:29.420 their nuclear energy program really fast,
00:37:32.180 just a few years.
00:37:33.760 And in just a few years,
00:37:35.700 their nuclear energy production
00:37:38.100 has surpassed other European countries
00:37:41.520 doing green energy for, you know,
00:37:43.920 decades and decades.
00:37:45.600 And so the argument is
00:37:46.980 that if the UAE
00:37:49.740 can rapidly
00:37:51.480 and apparently economically
00:37:53.340 ramp up a nuclear power industry,
00:37:56.540 we should be able to do it too.
00:37:58.700 And I think the key is
00:37:59.960 that we always look at
00:38:00.740 how it was done in the past
00:38:02.000 and that was done wrong.
00:38:04.420 But I think if you stick
00:38:05.500 with a design that's proven
00:38:06.920 and, you know,
00:38:08.460 been used,
00:38:09.840 let's say pick a design
00:38:11.160 that France has been using
00:38:13.080 for 20 years.
00:38:13.900 You don't have to wonder
00:38:16.020 too much if it works.
00:38:17.860 It's the one that's been working
00:38:19.160 for 20 years.
00:38:20.800 So it looks like that's the secret.
00:38:24.460 You have to, you know,
00:38:25.760 it might help to have
00:38:26.480 a dictatorish, you know, leader
00:38:28.760 so you can get rid of a lot of red tape.
00:38:31.180 So I think if you could get rid
00:38:32.640 of our red tape,
00:38:33.780 standardize on some known designs
00:38:36.420 that work,
00:38:37.980 should be fairly quick.
00:38:40.800 I mean, there's no reason
00:38:42.520 that a strong president
00:38:43.600 who really cared about
00:38:44.620 nuclear energy,
00:38:45.560 there's no reason
00:38:46.400 that that strong president
00:38:47.520 couldn't just make this work,
00:38:49.400 you know,
00:38:49.600 just sort of force it through.
00:38:52.200 But I don't see that energy
00:38:53.780 from Trump.
00:38:55.580 I never saw him being pro-nuclear
00:38:57.420 except throwing it
00:38:58.740 in a speech or two.
00:39:00.700 And certainly don't see it
00:39:02.360 with Biden.
00:39:04.840 So I don't know
00:39:07.320 what president it would take
00:39:08.580 to push nuclear energy harder,
00:39:10.400 but we don't have,
00:39:12.760 we don't have,
00:39:13.720 at least the top two candidates
00:39:15.140 at the moment,
00:39:16.040 polling-wise,
00:39:17.100 don't seem to be the right ones.
00:39:19.520 Neither of them are the right ones
00:39:20.720 for nuclear energy.
00:39:22.220 On the other hand,
00:39:23.400 I would say that both,
00:39:24.380 that the energy department
00:39:25.660 under Trump
00:39:27.320 as well as under Biden
00:39:28.480 seems to be very activist
00:39:30.640 and doing the right things
00:39:33.240 to spur nuclear energy growth.
00:39:36.700 So I think the Department of Energy
00:39:38.320 seems like it's been on point
00:39:39.780 through two different administrations,
00:39:42.000 but somehow without
00:39:43.140 the support of the top.
00:39:46.260 I don't know what that's all about.
00:39:49.880 All right.
00:39:51.040 Let's talk about cocaine
00:39:52.280 in the White House.
00:39:53.900 You saw the news.
00:39:55.820 Cocaine was discovered
00:39:56.940 in the White House.
00:39:57.820 And let me tell you
00:39:58.420 the order of how the news broke.
00:40:01.320 It went like this.
00:40:05.060 Cocaine has been found
00:40:06.320 in the library
00:40:07.600 of the White House.
00:40:10.920 Step two,
00:40:12.900 social media reports
00:40:14.820 that everybody knows.
00:40:16.420 There are security cameras
00:40:17.820 in the library.
00:40:19.300 All right.
00:40:19.460 Here's the first part.
00:40:21.140 Cocaine was found
00:40:21.960 in the library.
00:40:23.900 Security cameras
00:40:24.860 should tell us for sure
00:40:25.920 who was using it.
00:40:27.800 Step three,
00:40:29.100 oh, it totally wasn't
00:40:29.940 in the library.
00:40:30.880 It was in the working area.
00:40:32.700 A working area.
00:40:33.900 A working area.
00:40:36.180 I'm just going to go out
00:40:37.320 on a limb and say,
00:40:40.000 I'll bet there are
00:40:40.980 no security cameras
00:40:42.940 in the working area.
00:40:46.860 Just a guess.
00:40:48.600 I'll bet there are no cameras
00:40:49.520 in the working area.
00:40:50.360 Or could that be
00:40:51.720 where the cameras don't work?
00:40:53.780 Aw.
00:40:54.820 Aw.
00:40:55.880 And would it be possible
00:40:57.280 that whoever looks
00:40:59.160 at the cameras,
00:40:59.960 if there is anybody doing that,
00:41:01.100 they fell asleep.
00:41:01.980 Probably fell asleep.
00:41:03.900 Does any of this
00:41:05.400 sound familiar?
00:41:08.360 Now,
00:41:09.720 my own opinion
00:41:11.860 is that
00:41:12.520 the cocaine
00:41:13.940 did not belong
00:41:14.780 to Hunter.
00:41:16.820 Had it been a laptop,
00:41:18.860 I'd say Hunter.
00:41:20.860 But all evidence suggests
00:41:22.840 that Hunter Biden
00:41:24.600 has never left a house
00:41:25.900 that still had cocaine in it.
00:41:30.020 Anybody?
00:41:31.340 Anybody?
00:41:31.640 Anybody?
00:41:31.700 Never left a house
00:41:33.880 that still had cocaine in it.
00:41:37.080 All right, well,
00:41:38.140 that's all I got for you there.
00:41:40.400 Yeah, I'm sure it was
00:41:41.280 Dr. Jill who's been
00:41:42.740 dipping in.
00:41:48.200 So let me ask you
00:41:49.280 this question.
00:41:50.600 Do you think that
00:41:51.640 authorities don't know
00:41:54.040 who it belongs to?
00:41:57.140 Do you think they don't know?
00:41:58.160 It was in a baggie.
00:42:01.820 Do you think that
00:42:02.660 whoever handled the baggie
00:42:03.860 used a glove?
00:42:08.220 You don't think
00:42:09.280 there are fingerprints
00:42:09.920 all over that baggie?
00:42:11.780 You don't think
00:42:12.500 every staffer
00:42:13.500 in the White House
00:42:14.140 has been fingerprinted?
00:42:17.180 How many people
00:42:18.160 would be able
00:42:19.760 to use a workspace
00:42:20.880 in the White House,
00:42:22.020 just have access to it,
00:42:23.440 who had not been
00:42:24.440 fingerprinted?
00:42:26.240 Anybody?
00:42:26.620 Just about zero.
00:42:29.260 I think the answer is zero.
00:42:31.280 So they got the bag.
00:42:32.780 They've got the fingerprints.
00:42:34.680 They know exactly
00:42:35.780 who it belongs to.
00:42:37.820 Have you heard
00:42:38.500 the news yet?
00:42:40.980 Oh, interesting.
00:42:41.880 It's not in the news,
00:42:42.740 is it?
00:42:43.560 How long does it take
00:42:44.440 to get a fingerprint?
00:42:46.600 Weeks?
00:42:49.040 Ten minutes?
00:42:50.400 About ten minutes, right?
00:42:52.080 It's about ten minutes.
00:42:53.140 And once you've got
00:42:56.000 your fingerprint,
00:42:57.460 you can tell us
00:42:59.120 if that fingerprint
00:43:00.200 belongs to
00:43:01.160 Hunter Biden or not.
00:43:03.340 And if it's somebody else,
00:43:04.600 you can tell us that too.
00:43:06.740 Do you think
00:43:07.420 you're going to hear that?
00:43:09.100 No.
00:43:09.960 Do you think
00:43:10.560 you're going to see
00:43:11.040 the videotape
00:43:13.100 of the crime?
00:43:15.780 No.
00:43:16.580 No.
00:43:17.780 No.
00:43:20.420 No, no, no.
00:43:21.460 You're not.
00:43:21.920 No.
00:43:23.140 Do you think
00:43:24.320 this story
00:43:24.920 favors Trump?
00:43:28.380 Yeah.
00:43:30.340 Yep.
00:43:32.220 Yes, it does.
00:43:34.100 Now, let me be clear.
00:43:36.720 I assume
00:43:38.060 that every
00:43:39.160 administration
00:43:40.480 has its,
00:43:42.740 you know,
00:43:43.000 a healthy share
00:43:43.900 of regular
00:43:44.860 drug doers.
00:43:46.400 Is that fair to say?
00:43:47.840 Don't you assume
00:43:48.460 every administration
00:43:49.280 has some
00:43:49.820 pretty, you know,
00:43:51.280 cranked up people in it?
00:43:52.420 So I'm not
00:43:53.760 going to tell you
00:43:54.800 that a Trump White
00:43:56.520 House would be
00:43:57.320 drug-free
00:43:58.060 and this Biden
00:43:59.420 White House
00:44:00.000 is full of these
00:44:01.040 degenerate drug users.
00:44:02.700 I don't think
00:44:03.100 that's the case.
00:44:04.040 It's just that
00:44:04.640 this specific story
00:44:06.260 massively favors
00:44:08.360 one side.
00:44:09.980 Massively.
00:44:10.980 It just makes you
00:44:11.960 feel a certain way.
00:44:13.300 And if I've taught you
00:44:15.600 anything,
00:44:16.520 the way you feel
00:44:17.420 about it
00:44:17.920 is going to
00:44:18.860 determine your opinion.
00:44:20.660 You know,
00:44:20.820 the facts are
00:44:21.560 somewhat irrelevant.
00:44:23.000 And the way you feel
00:44:24.380 about Hunter Biden
00:44:25.640 having been cleared
00:44:27.520 of all those
00:44:28.440 other charges,
00:44:29.780 or at least
00:44:30.300 having a plea deal
00:44:31.360 that could get
00:44:32.660 accepted,
00:44:33.760 if this happened,
00:44:36.460 well,
00:44:36.800 there are two
00:44:37.320 things that are bad.
00:44:37.920 One would be
00:44:38.840 we found out
00:44:39.520 for sure
00:44:39.900 it was Hunter Biden
00:44:40.780 and he brought
00:44:41.400 cocaine into
00:44:42.040 the White House.
00:44:44.960 That would be
00:44:45.720 really,
00:44:46.200 really bad.
00:44:47.260 But you know
00:44:47.660 what would be
00:44:48.020 almost as bad?
00:44:50.240 Never getting
00:44:51.080 an answer
00:44:51.600 of whose
00:44:52.840 it was.
00:44:54.660 That would be
00:44:55.500 assuming it's
00:44:58.580 Hunter's.
00:44:59.520 You'd still
00:45:00.040 make the assumption.
00:45:01.160 And then you
00:45:01.740 assume that the
00:45:02.640 administration
00:45:03.080 is corrupt.
00:45:03.800 There's no way
00:45:05.840 they win this.
00:45:06.860 They're either
00:45:07.140 going to throw
00:45:08.320 Hunter under
00:45:08.920 the bus,
00:45:09.500 which can't
00:45:10.100 happen,
00:45:11.380 or they're
00:45:13.740 going to throw
00:45:14.080 the administration
00:45:14.620 under the bus
00:45:15.380 and say it's
00:45:16.100 corrupt.
00:45:17.540 So,
00:45:19.260 am I wrong
00:45:22.180 that things
00:45:22.800 are shaping up
00:45:23.940 to just hand
00:45:24.780 it to Trump?
00:45:28.100 Does it look
00:45:28.860 like that?
00:45:30.040 Now remember,
00:45:30.700 I'm endorsing
00:45:31.780 Ramaswamy.
00:45:34.200 But you
00:45:36.240 can't deny
00:45:36.820 what you see
00:45:37.520 and what you
00:45:38.380 feel,
00:45:39.300 and it looks
00:45:40.480 to me like
00:45:41.040 Trump is rising.
00:45:42.860 To me,
00:45:43.560 it looks like
00:45:43.980 Trump is rising.
00:45:45.920 Do any of you
00:45:46.660 feel it?
00:45:47.540 Do you feel
00:45:47.920 the energy
00:45:48.360 shift?
00:45:52.300 Because I'm
00:45:52.980 interested if I'm
00:45:53.940 just hallucinating
00:45:54.680 here.
00:45:56.460 Had anybody
00:45:57.240 noticed it
00:45:57.940 before,
00:45:58.660 before I
00:45:59.320 mentioned it?
00:46:00.440 Because I
00:46:01.040 might be
00:46:01.360 influencing you.
00:46:03.800 So some
00:46:06.040 say they
00:46:06.440 hadn't noticed
00:46:07.060 it before.
00:46:08.760 And Trump
00:46:09.440 just needs
00:46:09.880 to shut
00:46:10.280 up.
00:46:12.980 Yeah,
00:46:13.640 when Trump
00:46:14.240 is doing
00:46:14.800 nasty nicknames
00:46:16.780 and stuff,
00:46:18.240 he's definitely
00:46:19.400 the funniest
00:46:20.200 candidate.
00:46:21.780 But let's
00:46:22.520 see if he'll
00:46:22.840 stay out of
00:46:23.300 trouble.
00:46:23.900 We'll see.
00:46:24.480 I doubt it.
00:46:25.380 All right.
00:46:31.140 This is how
00:46:31.960 we read the
00:46:32.520 news in
00:46:33.020 2023.
00:46:35.360 Ready?
00:46:36.300 So I'm
00:46:36.680 looking at
00:46:37.040 the headlines.
00:46:37.780 I'm like,
00:46:38.580 all right,
00:46:39.020 it looks like
00:46:39.420 a false flag
00:46:40.680 developing in
00:46:41.360 Ukraine.
00:46:42.960 We've got
00:46:43.620 the cocaine
00:46:44.240 cover-up
00:46:44.940 that's going
00:46:45.420 to happen
00:46:45.820 at any
00:46:47.080 moment now.
00:46:48.560 We've got
00:46:49.240 the climate
00:46:49.740 scam that's
00:46:50.600 falling apart.
00:46:51.880 It's like,
00:46:52.340 when I read
00:46:52.780 the news
00:46:53.160 now, not
00:46:55.200 only do I
00:46:55.780 assume it's
00:46:56.380 fake, but I
00:46:57.940 know its
00:46:58.280 very nature.
00:46:59.840 I know the
00:47:00.580 nature of the
00:47:01.180 fakeness, and
00:47:01.740 even before it
00:47:02.460 happens.
00:47:03.520 Like the
00:47:03.900 false flag,
00:47:04.640 we know the
00:47:05.100 nature of the
00:47:05.980 fakeness, and
00:47:07.280 we can also
00:47:07.840 predict what's
00:47:08.440 going to
00:47:08.600 happen.
00:47:09.520 Now, I do
00:47:10.380 imagine that
00:47:11.580 everybody
00:47:12.060 immediately seeing
00:47:13.160 it as a
00:47:13.640 false flag,
00:47:14.980 that could
00:47:15.540 ruin the
00:47:16.000 false flag-ness.
00:47:17.760 Right?
00:47:18.240 Like if you
00:47:18.940 were actually
00:47:19.960 trying to do
00:47:20.640 a false flag
00:47:21.440 in Ukraine,
00:47:22.380 and everybody
00:47:23.440 saw it before
00:47:24.200 it happened,
00:47:25.180 everybody, we
00:47:26.480 all see it, and
00:47:28.220 then you'd go
00:47:28.660 ahead and do
00:47:29.080 it, I'm not
00:47:30.980 sure that's a
00:47:31.440 good idea.
00:47:32.860 I mean, the
00:47:33.280 fact that we
00:47:33.940 all see it
00:47:34.600 might actually
00:47:35.220 stop it.
00:47:36.700 Might actually
00:47:37.300 stop it.
00:47:37.940 Because I'm
00:47:38.480 definitely not
00:47:39.200 in favor of
00:47:40.160 throwing NATO
00:47:41.960 in there if
00:47:42.920 there's a
00:47:43.260 nuclear event.
00:47:45.560 Right?
00:47:45.840 I'm not, no.
00:47:47.980 I mean, not
00:47:50.520 at all.
00:47:52.380 All right.
00:47:56.940 I would
00:47:57.500 like to say
00:47:58.220 something so
00:47:59.220 provocative now
00:48:00.740 that the odds
00:48:01.520 of me being
00:48:02.080 kicked off of
00:48:02.680 YouTube are
00:48:03.240 pretty good.
00:48:04.320 Pretty good.
00:48:05.300 Anybody up for
00:48:06.000 that?
00:48:07.540 Anybody want
00:48:08.100 to go on a
00:48:09.660 dangerous journey?
00:48:10.440 All right.
00:48:12.480 Now, this
00:48:13.220 dangerous journey,
00:48:14.160 I believe,
00:48:18.200 I'm going to
00:48:18.800 present it to you
00:48:19.540 with the hope
00:48:20.700 that it's
00:48:22.440 signaling something
00:48:23.700 positive.
00:48:25.100 Right?
00:48:25.440 So there's
00:48:25.960 something positive
00:48:27.020 happening that I
00:48:28.760 also sort of feel
00:48:29.800 in the zeitgeist,
00:48:31.080 but I need a
00:48:32.120 confirmation.
00:48:33.660 And it goes
00:48:34.340 like this.
00:48:35.500 I feel like our
00:48:36.620 feelings about
00:48:37.900 race are
00:48:40.060 improving.
00:48:40.720 Now, you
00:48:44.820 might not feel
00:48:45.420 like that because
00:48:46.060 the news is
00:48:46.780 serving you up,
00:48:47.700 the news is
00:48:48.300 serving you up
00:48:48.840 the opposite,
00:48:49.480 right?
00:48:49.780 If you looked at
00:48:50.440 the news, it
00:48:50.900 looks like
00:48:51.220 everything's
00:48:51.680 getting worse.
00:48:52.620 But here's
00:48:53.060 what I'm
00:48:53.360 seeing.
00:48:54.640 There's a
00:48:55.240 willingness to
00:48:56.000 joke about
00:48:57.280 race that's a
00:48:59.040 little bit more
00:48:59.700 than I've seen
00:49:00.280 before.
00:49:01.280 There's a little
00:49:02.060 bit more
00:49:02.480 willingness to
00:49:03.240 joke.
00:49:04.220 And I think
00:49:05.040 that until you
00:49:06.120 can get to
00:49:06.840 full joking,
00:49:08.660 you're not a
00:49:09.240 family, right?
00:49:11.780 You're not a
00:49:12.640 team.
00:49:13.480 You're not a
00:49:14.240 team until you
00:49:15.080 can joke with
00:49:15.680 each other.
00:49:16.420 You're not a
00:49:16.920 family until you
00:49:18.000 can, you know,
00:49:18.960 criticize and
00:49:19.660 joke with each
00:49:20.260 other.
00:49:20.620 And it never
00:49:21.220 leaves that
00:49:21.760 frame.
00:49:22.720 It never
00:49:23.440 leaves the
00:49:23.880 frame that
00:49:24.380 we're a
00:49:24.700 family joking
00:49:25.320 with each
00:49:25.720 other, right?
00:49:27.000 I'm going to
00:49:27.660 give you two
00:49:28.380 jokes that I
00:49:29.460 saw on Twitter
00:49:30.440 today.
00:49:31.440 One, you
00:49:32.460 could consider
00:49:33.080 racist against
00:49:33.960 white people.
00:49:35.200 Yikes.
00:49:36.540 The other, you
00:49:37.820 could consider
00:49:38.460 racist against
00:49:39.400 black people.
00:49:40.120 Yikes.
00:49:41.180 Yikes.
00:49:42.360 I'm going to
00:49:42.920 read you both
00:49:43.440 jokes.
00:49:45.040 Yeah, I'm going
00:49:45.800 to do that right
00:49:46.300 in front of you.
00:49:47.920 But the context
00:49:48.920 is this.
00:49:49.920 I think they're
00:49:50.540 both funny.
00:49:51.960 And that's the
00:49:52.720 end of the
00:49:53.100 story.
00:49:54.400 Now, one of
00:49:55.180 them is pretty
00:49:56.340 brutal.
00:49:57.520 You'll figure
00:49:58.020 out which one
00:49:58.600 pretty quickly.
00:49:59.280 One of them is
00:49:59.760 kind of brutal.
00:50:01.000 But it's still
00:50:02.740 funny.
00:50:04.460 Even though it's
00:50:05.160 brutal, it's
00:50:05.900 funny.
00:50:07.180 So, I
00:50:08.700 want to test
00:50:09.740 this with you.
00:50:11.720 Could you
00:50:12.280 accept these
00:50:13.000 jokes as just
00:50:13.860 funny?
00:50:15.280 Because if you
00:50:16.040 can, we've
00:50:17.400 really come
00:50:17.880 somewhere.
00:50:18.560 Like, I would
00:50:19.120 say that's
00:50:19.500 progress.
00:50:20.900 So, the first
00:50:21.900 one was, and
00:50:23.040 I'll just explain
00:50:23.780 it to you so
00:50:24.380 you don't need to see
00:50:24.760 it.
00:50:25.240 There was a
00:50:25.680 video of a
00:50:26.700 white guy
00:50:27.420 dancing poorly
00:50:29.160 in, I think,
00:50:31.060 Times Square or
00:50:31.940 something that
00:50:32.380 looked like
00:50:32.720 that.
00:50:32.900 So, there's
00:50:33.480 this dumpy
00:50:33.960 little white
00:50:34.640 guy.
00:50:35.700 He's dancing
00:50:36.520 like Elaine
00:50:37.160 from Seinfeld,
00:50:38.700 sort of, not
00:50:39.480 too elegantly.
00:50:40.980 And there
00:50:41.400 was a black
00:50:42.900 influencer who
00:50:44.660 retweeted that
00:50:45.640 and added the
00:50:46.700 note, White
00:50:47.840 History Month.
00:50:49.800 Instead of
00:50:50.440 Black History
00:50:50.960 Month.
00:50:51.600 And this
00:50:51.940 little nerdy
00:50:52.900 white guy
00:50:53.320 dancing like
00:50:54.000 a nerd.
00:50:55.460 And I
00:50:55.760 thought to
00:50:56.040 myself, okay,
00:50:56.780 that was really
00:50:57.260 funny.
00:50:59.480 And it's just
00:51:00.000 funny.
00:51:00.280 When I see
00:51:02.240 a black
00:51:03.200 influencer making
00:51:04.240 fun of white
00:51:05.040 people for
00:51:06.220 dressing poorly
00:51:07.260 or for
00:51:08.780 dancing
00:51:09.440 inelegantly,
00:51:11.780 to me, that's
00:51:12.300 just funny.
00:51:14.020 So, I had
00:51:14.820 no negative,
00:51:16.060 not a single
00:51:16.780 negative feeling
00:51:17.640 about that.
00:51:18.200 It was just
00:51:18.520 funny.
00:51:20.380 All right, here's the brutal one.
00:51:22.560 All right, I saw this one; maybe you've heard it before. This one's pretty brutal.
00:51:27.320 All right, so I'm not telling you this because of the joke, or because I'm agreeing with it, right? I want to see where we are. I'm just testing the temperature. All right?
00:51:36.760 So this one was on Twitter, and it wasn't banned on Twitter.
00:51:40.740 It said, well, you've heard people say that the argument for reparations is based on the fact that the slaves and black Americans built America. You've heard that?
00:51:53.520 Now, of course, that's hyperbole. But part of the argument for reparations is that black Americans, specifically the slaves, built America.
00:52:02.800 Well, this joke sort of pushes back on that idea quite brutally and says, blacks built America like cows built McDonald's.
00:52:14.160 Okay, that's just funny. It's brutal. It's racist.
00:52:19.600 Just like the other joke, you know, White History Month. It's just racist. But it's also funny.
00:52:25.620 Are we at the point where we can laugh at that?
00:52:31.160 Did anybody get hurt? Was anybody injured?
00:52:35.380 Do you have a, you know, somehow lesser opinion of black people because of this joke?
00:52:41.660 I don't.
00:52:42.980 It doesn't change anything about my real opinions in the real world. It's just clever. It's just funny.
00:52:48.740 Like, so how do we get here? Like, how do we do more of this?
00:52:55.320 And again, certainly there would be racial jokes that are beyond the pale, right? But some of them are just silly and, you know, they're just clever.
00:53:07.520 Could we not laugh about that? Couldn't we have a good time together, black and white, and just still laugh at that?
00:53:14.140 I feel like we could.
00:53:16.540 And I feel like there's something that happened with the affirmative action ruling, because the affirmative action ruling did something beyond changing the precedent and the law.
00:53:28.340 What it did was educate the country about what was happening to Asian Americans.
00:53:34.820 And I don't think people who don't follow Twitter and don't follow the news knew about it. Do you?
00:53:40.680 I mean, I didn't know. I mean, I knew there was a thing there, and I knew it was bad.
00:53:46.460 I had no idea how extreme it was. No idea. Until I saw the data.
00:53:52.900 And then I said, whoa, things have gone too far. That's just too far.
00:53:58.380 And the Supreme Court agreed. They said, that's too far.
00:54:02.260 But I feel like that somehow changed the way all of us are thinking about everything in terms of race.
00:54:11.780 It changed everything about how you're thinking, but not in such an obvious way that you're necessarily conscious of it.
00:54:20.100 I think it just, it was just like a wake-up call. It's just like a little slap in the face of the country.
00:54:25.440 It's like, wake up, wake up. You can't possibly think it's good to massively discriminate against Asian Americans. Wake up, wake up. Hey, hey, wake up. That can't be good. Wake up. Oh.
00:54:36.580 And like suddenly the country woke up at the same time and looked at that situation and said, what?
00:54:46.060 I don't want to live in a country where Asian Americans are just massively discriminated against for nothing except their ethnicity.
00:54:55.940 Who wants to live in that country? Nobody. Literally nobody.
00:55:00.720 So I feel like that was like a wake-up call where people understood that good intentions can go too far.
00:55:09.460 That's the one that solidified it.
00:55:11.340 Because we live on stories. We don't live on concepts.
00:55:14.960 The concept that things had gone too far didn't really solidify in your head. But once it's a story, you know, it's the Asians getting into college story, now it's a story.
00:55:28.040 So you'll remember that, and it'll be influential forever.
00:55:31.160 So I think something changed. I think something changed.
00:55:37.000 And I think that we're moving toward an understanding that what black Americans need more than anything is strategy.
00:55:46.660 Because with the Asian American example, if you're racist, you're going to say, oh, it's because they're smart. Right?
00:55:54.700 If you're racist, you say, oh, the Asians are smart. That's why they do so well.
00:55:58.700 Maybe they are. But you don't need that assumption. You could look at the strategy.
00:56:05.620 You don't say culture. As soon as you say culture, you get in trouble. That's why I say strategy.
00:56:10.940 Look at the strategy. Stay out of jail. Stay off drugs. Study hard. Try to get a job that you know will pay well. Start a family. Stay married. Raise the kids. Get them into college.
00:56:27.000 That's a strategy.
00:56:28.740 Are you telling me that black Americans can't use that same strategy? Why?
00:56:33.120 Because they started behind the ball, or because of systemic racism?
00:56:38.380 That might make it harder. But unless you're doing those things, nobody wants to hear from you.
00:56:46.600 I think that's the big takeaway from the affirmative action ruling.
00:56:52.920 Unless you're doing what the Asian Americans are doing, doing it the way they're doing it, we don't want to hear from you.
00:57:02.600 Didn't that feel like exactly what you were thinking?
00:57:05.940 If you can't do what the Asian Americans are doing, I don't even want to have a conversation. Honestly. I'm just not interested anymore.
00:57:15.340 If black Americans were trying hard to do what the Asian Americans are doing that's working, and it wasn't working for black Americans, then I'm completely engaged. Let's fix this. What's wrong here? Let's fix it.
00:57:29.700 But if you're not doing the strategy that everybody knows works, everybody knows it, everybody knows it, if you're not trying to do that strategy, we don't want to hear from you.
00:57:42.000 Don't make it my problem. Definitely not my problem.
00:57:47.500 So something has definitely changed, and I think some of it is positive.
00:57:52.040 In other words, we're sort of framing things as more of a strategy deficit.
00:57:56.200 And once you prove that the Asian Americans, basically all of them, followed the same strategy and it worked... I'm exaggerating. It's not all of anybody. But it's undeniable.
00:58:11.220 We've gotten to the point where you can't make an argument that the people who do the right things get bad outcomes. You just can't make that argument. It would sound ridiculous in 2023.
00:58:22.140 But I think in the past people made that argument, that even if you're black and you did everything right, it still wouldn't work out for you.
00:58:29.060 Does anybody believe that?
00:58:30.800 Do any of you believe that if you're a black American and you do everything right, stay out of jail, study hard, build a skill, then unless you have terrible luck, like you're the victim of a drive-by shooting or something, unless you have terrible luck, don't every one of you expect that that would work?
00:58:50.920 It would probably work really well. Not even a little bit, just a lot. It works pretty much every time.
00:58:57.580 So there does seem to be some willingness, and I think the affirmative action thing caused it, a willingness to say, get your strategy right, or we can't even tell how much systemic racism matters.
00:59:11.800 I take it as a given that systemic racism is a big variable. But how big? How big?
00:59:19.700 I have no idea, because I'm not in that situation.
00:59:22.740 But if I looked at people whose strategy was solid, they're doing the right strategy, and they're still not getting it done, then I would say, whoa, whoa, whoa, then you've got to look pretty hard at the systemic racism.
00:59:36.000 But then you've sized it, right? Then you know the size of it.
00:59:40.120 It could also be something else. But unless you get the strategy the same, we don't want to hear from you.
00:59:47.780 All right.
00:59:51.580 That, ladies and gentlemen, is my always risky but positive take on the world.
00:59:58.460 And how much do you love that I have free speech?
01:00:04.380 How much do you love it?
01:00:06.660 How many of you would like to hear me read the Dilbert Reborn comic today?
01:00:13.380 And now, some of you pay for it, you're subscribers, so if you don't mind, I'm going to read this one.
01:00:23.560 All right. So this will give you an idea of what you're missing.
01:00:33.700 Why can't I find it in my own feed? There it is.
01:00:37.120 All right, so it starts out with Dilbert and Dogbert watching the news, and this is a crossover, where the news that they're watching is Robots Read News, my other comic.
01:00:47.280 So this is the first crossover of my two comics.
01:00:51.000 So they start out watching the news, and then the robot's talking from the news.
01:00:56.100 So the news says, Roomba started offering what some are calling an Uber for cats.
01:01:02.480 And you see the robot, and he says, the way it works...
01:01:07.040 And then the robot looks down and sees that the chyron says, put your pussy on it.
01:01:12.300 And then the robot says, I knew it was a mistake to hire the chyron guy from Fox News.
01:01:17.880 And then the chyron says, dipshit robot.
01:01:21.880 Anyway, dipshit robot.
01:01:26.140 Now there's something I couldn't do in newspapers.
01:01:28.660 So this brings together the headlines and two separate comics.
01:01:36.460 You know, I'll tell you, I could not be happier creatively.
01:01:43.160 And let me ask you this: do you think AI could have made that comic? Do you think there's any time in the future AI could make that comic?
01:01:51.060 No. Because AI can't take the risks that I just took. Right? AI would never do what I just did.
01:01:59.700 Part of being a humorist is that you're putting yourself in mortal danger in front of other people.
01:02:09.300 When you mock things, you do open yourself up to revenge. And, you know, I get plenty of that.
01:02:16.680 But it's worth it. I think it's worth it.
01:02:19.040 And that, ladies and gentlemen, concludes my amazing presentation.
01:02:23.180 Thanks for joining me on YouTube. I think it'll be even better tomorrow.
01:02:27.300 If you want to see the subscription version of Dilbert Reborn that's spicy like that one, you can do that by joining Locals, scottadams.locals.com, and you would see all the political stuff as well, as well as Robots Read News.
01:02:43.020 But if you only want the Dilbert Reborn comic without the politics, you can do that on Twitter. Just hit the subscribe button on my profile.
01:02:54.480 And that is all we need to know.
01:02:56.340 And if you're on YouTube, it would help me if you hit the subscribe button.
01:03:00.240 If you're feeling generous, hit the subscribe button, because I think that's good for me somehow.
01:03:06.360 And I will talk to you tomorrow.
01:03:09.680 Bye, you