Real Coffee with Scott Adams - March 08, 2023


Episode 2041 Scott Adams: Closing Mexico, J6 Lies, Reparations, A Persuasion Lesson And More


Episode Stats

Length

1 hour and 18 minutes

Words per Minute

146.2

Word Count

11,539

Sentence Count

848

Misogynist Sentences

14

Hate Speech Sentences

18


Summary

In this episode of the podcast, I talk about the latest news about China's nuclear energy buildout, how the U.S. risks falling behind with inferior energy sources, and why we should be worried about that. And I talk a little bit about the Nord Stream pipeline and why blowing it up might have been a smart security move.


Transcript

00:00:00.000 Do-do-do-do, do-do-do-do, do-do-do-do.
00:00:06.980 Good morning, everybody, and welcome to the highlight of civilization.
00:00:12.820 Wow, it's good to be alive, isn't it?
00:00:15.300 Seems like we're in the middle of all this swirling chaos of badness.
00:00:20.880 But maybe that's just our perception.
00:00:23.020 Maybe things are going great.
00:00:25.480 Let's talk about that.
00:00:26.460 But first, we need to prepare our minds.
00:00:30.700 If you have something at home that will prepare your mind better than coffee, well, go nuts.
00:00:37.040 But for the rest of us, all you need is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen, jug or a flask, a vessel of any kind.
00:00:44.760 Fill it with your favorite liquid.
00:00:47.520 I like coffee.
00:00:49.280 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better.
00:00:54.640 It's called the simultaneous sip.
00:00:56.620 It's famous around the world, and maybe a little bit more famous this week.
00:01:01.980 And it happens now.
00:01:03.780 Go.
00:01:03.960 Oh, yeah, that's good.
00:01:10.500 Savor it.
00:01:11.300 Savor it.
00:01:15.820 Well, let's talk about all the things.
00:01:18.060 There are some funny things and some other things.
00:01:21.340 I heard it's International Women's Day.
00:01:24.260 Is that right?
00:01:25.700 Today?
00:01:26.060 Or is it like a month or something?
00:01:30.360 All right.
00:01:30.820 Well, we'll talk about that.
00:01:32.500 So I saw a tweet from John Quakes.
00:01:36.280 He said that as of December 2021, at least according to one source, China is building or has planned more nuclear capacity than the entire rest of the world put together.
00:01:48.800 And apparently the USA is like this little dot.
00:01:54.440 How in the world can the United States compete in the future with inferior energy sources?
00:02:02.700 I'm not positive that what I'm going to say is 100% true.
00:02:07.420 But I'll bet it is.
00:02:08.940 You know, if I had to bet, I'll bet it's true.
00:02:10.940 Has any country ever lost a war when they had the most abundant energy sources in modern times?
00:02:21.740 In modern times, the one with the most energy sources wins everything, right?
00:02:27.840 I think so.
00:02:29.320 And I would think that nuclear would be a big part of making you a safe country.
00:02:36.000 Here's the thing the United States, maybe everybody, does wrong.
00:02:39.980 But maybe China does it right, actually, because they have more of a comprehensive view of war.
00:02:47.440 You know, I've heard, I'm no China expert, but I've heard, you know, that China has the concept of total war, that, you know, the economy is war, influence is war, all that stuff is war.
00:02:59.100 But in the United States, because the way we like to lump things and report things, you know, literally the news has somebody who covers the military stuff, and then a different person covers the economy.
00:03:15.040 Am I right?
00:03:16.260 The way our news is organized is that the economy is separate from military.
00:03:20.180 It's just, it's easier to talk about it.
00:03:23.460 And although it's easier to talk about it, it also gives you a completely misleading idea of what defense is.
00:03:32.440 Military defense is economy.
00:03:35.340 Military defense is energy.
00:03:38.960 Because energy is basically your economy.
00:03:41.500 You know, that's an oversimplification.
00:03:44.380 You could call it hyperbole.
00:03:45.980 I've been known to use it.
00:03:49.260 But nobody would disagree, who knows anything about the world, that the economy is basically your military.
00:03:56.640 You know, I mean, it's almost a one-to-one correspondence.
00:03:59.580 And that energy is pretty much your economy.
00:04:02.420 So energy is your homeland security.
00:04:05.520 Yeah.
00:04:06.000 And that's the thing that Trump got right.
00:04:08.520 Right?
00:04:09.380 Trump was the one who said, you've got to get rid of that pipeline from Russia to Germany.
00:04:14.620 Because that's a military problem, essentially.
00:04:18.440 He was the first one I remember framing it correctly.
00:04:21.980 I might be wrong.
00:04:25.600 And I was listening to Russell Brand talk about the fact that blowing up the Nord Stream pipeline
00:04:33.800 was more about economics than any kind of, you know, military security.
00:04:40.120 To which I said, hmm, Russell Brand, you can't separate those things.
00:04:46.600 Now, his point that some people would have a purely financial reason for it, for sure.
00:04:58.160 I mean, that's completely right.
00:04:59.780 Some people in the larger drama would have a purely financial interest.
00:05:04.740 Surely those people exist.
00:05:06.480 In power, too.
00:05:07.560 But as soon as you say that you're doing it for one reason, you've really lost, you've sort of lost the bigger picture.
00:05:15.900 The bigger picture is that taking Europe away from dependence on energy from, you know, a potential adversary
00:05:24.740 was the smartest security thing you can do, even if it's really expensive.
00:05:29.800 In the long run, you've got to get your energy under control and not under your, not under your adversary's control.
00:05:38.860 That's just the dumbest thing you could ever do.
00:05:41.640 So, I just want to add that frame.
00:05:44.700 I guess that would be called a reframe.
00:05:47.000 So, instead of thinking that the military and the economy are two separate questions, they're always the same.
00:05:52.320 And then the third thing is that energy equals economy.
00:05:55.040 They end up basically being a proxy for the other.
00:06:01.100 Well, let's challenge your IQs here.
00:06:07.800 Let's see if you can get the answer before I ask the question.
00:06:11.100 Go.
00:06:13.760 People in the, there you go.
00:06:15.600 That is the correct answer.
00:06:17.220 Very good.
00:06:17.820 Very good.
00:06:18.220 This is the only audience that can answer a question accurately before the question is asked.
00:06:27.500 Am I right?
00:06:28.560 Have you ever seen any audience that can get the answer before the question is asked?
00:06:33.540 And here was the question.
00:06:35.680 According to Rasmussen, what was Congress's performance, or approval, let's say, in December?
00:06:44.620 They have an update.
00:06:45.600 But as a, oh, you're right, 25%.
00:06:48.720 25%, that's right.
00:06:50.600 So, Congress had a 25% approval in December.
00:06:53.440 But Rasmussen's update is it's up to 28%.
00:06:56.900 28%.
00:06:58.700 That's roughly, that's roughly 25%.
00:07:03.460 So, a quarter of the country thinks that Congress is killing it.
00:07:10.020 Oh, I love Congress.
00:07:12.300 I like what they're doing now.
00:07:13.700 Do you ever wonder how that conversation goes?
00:07:18.420 You know, the conversations we're most used to are smart people talking to each other.
00:07:23.660 Because you see them on TV.
00:07:25.260 You see the pundits arguing.
00:07:26.940 But in the real world, the 25% of the public, the voting public.
00:07:31.420 This is voting public.
00:07:33.220 Likely voters.
00:07:34.960 Just to make it a little worse.
00:07:37.440 This isn't the general public.
00:07:39.000 This is the elite part of the public that votes.
00:07:43.360 And the 25% of them think Congress is, they're really putting up some good numbers.
00:07:49.740 Do you wonder how that conversation goes?
00:07:53.300 Like, well, I feel like it goes like this.
00:07:56.760 You know, Carl, I've been noticing that Congress has really been killing it lately.
00:08:10.220 Why is that so, Eric?
00:08:13.040 You know, like what would be an example of that?
00:08:14.840 Well, there was that time they approved the budget.
00:08:26.000 Did they approve the budget?
00:08:30.740 I don't know.
00:08:35.940 But other things, a lot of other things they did, they were pretty darn good.
00:08:41.260 Yeah, a lot of other things.
00:08:42.760 A lot of other things.
00:08:45.580 Pretty darn good.
00:08:47.260 I feel like it went like that.
00:08:52.340 Like there wasn't a lot of depth of the conversation.
00:08:56.820 See what I'm saying?
00:08:57.820 Not a lot of depth.
00:08:59.360 There were not layers.
00:09:00.940 What I'm saying is there were not layers upon layers of complexity.
00:09:05.580 Probably not.
00:09:08.440 All right.
00:09:09.040 Well, that's good.
00:09:10.520 Approval of Congress is up.
00:09:11.700 I am enjoying watching CNN try to get something embarrassing out of the personal, not personal, but the communications within Fox News about Dominion during the aftermath of the election.
00:09:29.280 And there are lots of emails that you could consider maybe embarrassing.
00:09:35.100 I'm not even sure that's the right word because they're being sort of presented as embarrassing.
00:09:42.580 But when I read them, they're trying to say, look, you know, they knew the coverage was wrong and that they knew the election wasn't rigged.
00:09:57.460 But they said otherwise.
00:09:58.580 And then they show the email and I read it and it says, I don't see that.
00:10:03.700 I see them wanting to serve their audience, to give their audience the news that the audience is most interested in as citizens of the United States.
00:10:14.140 Is that nothing?
00:10:16.640 Is it nothing that there's an enormous audience that has an intense interest in a specific story?
00:10:23.260 That's not nothing.
00:10:25.360 Now, even no matter what your personal feelings were, as long as you were talking about the facts, you know, and opinions around facts, I think it's perfectly appropriate to give the audience the news they crave the most.
00:10:39.460 They're the country.
00:10:40.260 The audience is the country, you know, half of it or a third of it or something.
00:10:46.000 So, like, how is that embarrassing?
00:10:48.820 But they try to make it.
00:10:50.060 But here's the newest one.
00:10:53.060 Apparently, Tucker Carlson said in some email to somebody or other that he would be basically glad that Trump was out of the scene because he was sick of reporting on him and that he, quote, passionately hates Trump.
00:11:08.220 He passionately hates him.
00:11:11.080 Now, keep in mind that the context was immediately right in the middle of Trump complaining about the election, which sort of put Fox News in a bad situation.
00:11:22.800 Now, here's the thing.
00:11:27.120 Is that embarrassing?
00:11:29.580 Is it embarrassing that, you know, maybe the most, you know, prominent Fox News opinion person...
00:11:38.540 Let me just finish the reframe here.
00:11:41.520 Here's my reframe.
00:11:43.560 Somehow I didn't know that.
00:11:46.180 I didn't know that.
00:11:47.320 I watched Tucker Carlson during that entire period, and I did not know that he had a bias against Trump like a personal bias.
00:11:57.900 And I guess Tucker went on to say that he hadn't accomplished much.
00:12:02.700 Now, how is that an insult?
00:12:05.760 How is that embarrassing?
00:12:07.880 In what world is it embarrassing that his audience couldn't even tell?
00:12:12.200 And I couldn't tell.
00:12:13.280 I didn't see a bias.
00:12:14.160 I thought he reported Trump right down the middle, you know, as his opinion, you know, matched up with the facts.
00:12:23.760 I feel like that was almost a compliment.
00:12:28.060 The fact that the audience couldn't tell that he had a very serious bias.
00:12:35.200 How do you do better than that?
00:12:37.500 Like, what's the level above that?
00:12:39.320 I mean, to me, they just reported he's the pinnacle of objective reporting.
00:12:47.420 You know, not objective about his own opinion, because he's an opinion guy, but objective about who was the president of the United States.
00:12:58.840 I don't know.
00:12:59.800 To me, that's, like, worthy of applause.
00:13:02.300 And that's the best that CNN could come up with.
00:13:04.400 All right, I'm going to give you, oh, and here's another one.
00:13:11.120 I think they were trying to embarrass Murdoch or something.
00:13:13.960 Murdoch must have testified.
00:13:16.100 And I guess the lawyer said, quote, you've never believed that Dominion was involved in an effort to delegitimize and destroy votes for Donald Trump, correct?
00:13:24.840 That was a Dominion lawyer asking Rupert Murdoch.
00:13:27.500 And Murdoch says, quote, I'm open to persuasion, but no, I've never seen it.
00:13:35.480 Okay, that's why he's rich.
00:13:38.200 Is that not the perfect answer?
00:13:41.760 How do you beat that answer?
00:13:44.280 Am I right?
00:13:45.400 That is the perfect answer.
00:13:47.100 See, he starts the answer with, you know, probably assuming that this would get out.
00:13:54.060 He starts the answer by respecting his, by respecting his audience.
00:13:59.520 That is respectful of the audience.
00:14:01.840 I'm open to persuasion.
00:14:04.060 But I haven't, but I haven't seen the evidence.
00:14:06.880 You can't beat that.
00:14:08.780 And they're, they're reporting it like it's, maybe it's a little embarrassing or something.
00:14:13.160 No, if Murdoch were my boss, I'd be pretty darn happy about that.
00:14:19.660 That's a good answer.
00:14:21.820 Especially, so this is a persuasion lesson as well.
00:14:26.080 Saying you're open to persuasion doesn't even say, you know, it's necessarily going to be facts that change people's minds.
00:14:33.720 That's like the highest level you can talk about.
00:14:36.840 It's like, we can be persuaded, but I haven't seen anything that persuades me.
00:14:41.260 What a perfect answer that is.
00:14:42.840 That's the way you should answer questions like that.
00:14:45.660 So respect the, you know, respect who you need to respect.
00:14:49.560 Say you're open-minded.
00:14:51.300 Talk about the evidence.
00:14:52.700 Can't beat it.
00:14:54.920 All right, here's your next persuasion lesson.
00:14:59.200 This one comes to us courtesy of Newsom, Governor Gavin Newsom in California.
00:15:08.760 Now, I've talked about this before, but I'm going to add a little, add a little flavor to it.
00:15:13.640 So apparently, there's a, so as you know, reparations is a big question.
00:15:21.240 And Gavin Newsom told the people interested in reparations to form a little committee and come back with a recommendation.
00:15:27.880 Now, I've told you already that that's a brilliant way for any bureaucracy or any boss to make an idea go away without saying no.
00:15:37.980 You just make them go away into a committee.
00:15:40.320 All right, so that's the first persuasion lesson, but I'm going to go deeper.
00:15:43.400 So if the only thing that Gavin Newsom did was tell the reparations people, yeah, I'm open to persuasion, I'm open to persuasion, right?
00:15:54.720 Just like Rupert Murdoch.
00:15:56.660 I'm open to persuasion.
00:15:58.700 Can you go show me the facts?
00:16:01.060 You know, give me something I can say yes or no to.
00:16:04.900 See?
00:16:05.620 That's good technique.
00:16:06.760 He's respecting his public, saying, you know, go form a committee.
00:16:13.500 And they're like, yes, finally we're being taken seriously.
00:16:17.000 Good technique.
00:16:18.000 But here's the brilliant part.
00:16:20.760 It's not just about the fact that it will die in bureaucracy and in fighting.
00:16:26.680 It's about the fact that in the end, they have to put it in writing.
00:16:32.900 They have to put it in writing.
00:16:35.300 Here's your persuasion lesson.
00:16:37.920 If somebody has an idea that just doesn't hang together and couldn't possibly work,
00:16:44.960 and I think reparations is one of those, no matter what you think about, you know, the morality of it,
00:16:52.140 as a practical matter, we're way beyond the point where you could insert that into current society
00:16:59.140 and, you know, not cause a revolution or something.
00:17:03.020 I mean, it can't work.
00:17:04.800 So, and you also can't price it.
00:17:06.960 You can't figure out who should get it.
00:17:08.580 You can't figure out who should not get it.
00:17:10.120 So here's what I would do if I were in charge of this and I wanted to persuade it out of existence.
00:17:18.300 I would respect my audience and say, all right, you have a moral argument,
00:17:24.240 and it's not unlike reparations that have been paid for other things in the past.
00:17:29.880 So that would be respecting the audience, right?
00:17:32.580 There are people who care about it.
00:17:34.240 You've got a moral argument.
00:17:35.560 It's not unlike things we've talked about before.
00:17:39.320 Let's talk about it.
00:17:41.260 But then I would go further and say, you need to come up with a number,
00:17:46.020 but because this is so racially charged, you need to break it down by race,
00:17:53.820 not just in terms of who would receive it, but how the tax burden would be distributed.
00:18:00.060 So I'd like to know, for example, if you're recommending,
00:18:03.340 I think they went from recommending $220,000 per black Californian,
00:18:09.520 they've upped that to $360,000 per black Californian.
00:18:15.560 And keep in mind there was no slavery in California, by the way.
00:18:21.080 But that doesn't mean that people didn't come from places where there were.
00:18:25.500 Here's how I would ask them to do it.
00:18:30.500 I'd say, I want you to come up with a number, and so let's say that's $360,000.
00:18:35.280 Figure out how much that is per year or what that does to the budget.
00:18:40.200 And then figure out what each ethnic group will be contributing to it.
00:18:45.680 So you'd say, okay, white people, you make a lot of money,
00:18:48.260 so you'd be paying, I don't know, 40% of it, 60%.
00:18:52.880 And it'd be like, Hispanic Americans, this would be your share.
00:18:57.300 And then Asian Americans, here's how much you're going to pay for black reparations.
00:19:05.040 And then you publish it, and you say,
00:19:07.340 here's the recommendation from our reparations group.
00:19:10.580 And then you let the public do the rest.
00:19:14.360 All you have to do is ask somebody to write it down.
00:19:17.580 Just write it down.
00:19:18.640 And if you let them write it down on their own, of course they would leave out anything embarrassing to their case.
00:19:23.780 You just say, all right, you've got to do both the pros and the cons.
00:19:27.060 I don't want to put the cons on top of your idea.
00:19:30.100 You're the experts.
00:19:31.460 So give us both the pros and the cons.
00:19:34.020 Tell us who benefits, what it costs, who's paying.
00:19:37.360 And just really map out some details here.
00:19:40.600 So it might come out that the average Hispanic immigrant who came across the border on Tuesday
00:19:48.420 would maybe be on the hook for some of those reparations, too.
00:19:55.100 And then you'd have to deal specifically with,
00:19:57.960 what about people who live here but are not residents?
00:20:03.780 What about that?
00:20:04.860 What if somebody's not a resident of California, but they do live here?
00:20:08.220 You know, they might not have changed their residency yet.
00:20:11.480 Do they get paid?
00:20:12.880 Because if they do, then what about people who move here just temporarily to get some reparations?
00:20:20.520 Do they get some reparations?
00:20:22.720 How about that?
00:20:24.180 So you simply ask the people to describe their plan in a little more detail,
00:20:29.640 make sure that they've got the pros and the cons, so there's an argument in both.
00:20:32.940 But here's the argument you would be sure to ask them to include,
00:20:36.020 and I saw this today from equal opportunity activist Ward Carnally.
00:20:41.940 And so he was at some California board.
00:20:44.560 And he got up and he spoke and he said that there's only one way to stop all the crime.
00:20:53.160 And that's to, here's it.
00:20:55.500 He said, there's only one thing that would stop our children from busting into these liquor stores.
00:21:00.260 There's only one thing that would stop our kids from busting into these jewelry stores,
00:21:04.700 stealing watches and jewelry, and that's reparations.
00:21:09.680 So I would say, make sure you include that argument, like right up in the summary.
00:21:14.860 You want to put that right up top?
00:21:16.420 And not only that, but I would highlight this, and have you ever heard me say, embrace and amplify?
00:21:28.600 I would embrace this, and I would amplify it.
00:21:33.280 Because there might be a lot of other places we could use this concept of paying large amounts of money to criminals
00:21:40.320 so that they would be disincentivized from committing crime.
00:21:45.440 What's that? There's a word for that.
00:21:47.680 So there's a word for that.
00:21:48.800 What's the word for when you pay somebody money not to do something bad to you?
00:21:55.540 What's, what's it?
00:21:58.140 Oh, extortion. Extortion.
00:22:00.360 Right.
00:22:00.620 So the plan here is to sort of morph from the crimes which none of us want.
00:22:08.740 I mean, none of us want anybody breaking into jewelry stores and liquor stores and stuff.
00:22:12.640 We'd like that to end.
00:22:13.940 But if we could just convert that into more of an extortion kind of a model,
00:22:19.380 that might be something that we could use in other ways.
00:22:21.500 For example, we keep talking about using the military in Mexico,
00:22:27.920 but that was before I heard this idea.
00:22:30.920 We probably could just pay the cartels not to do crimes.
00:22:36.700 Now, it would be more expensive, but so is war.
00:22:39.760 So is war.
00:22:40.800 So why don't we use the Ward Carnally idea of paying reparations to black Californians
00:22:46.880 as a way to stop the attacks on stores?
00:22:51.620 Because once they had some money, they would have no reason to do it.
00:22:54.760 So sort of one plus one equals two.
00:22:57.480 Basic logic.
00:22:58.340 But you could just extend that to all crime, really.
00:23:02.780 Why would we stop it there?
00:23:05.380 One of the things I like is if you try something small and it works,
00:23:09.020 hey, let's pay criminals extortion so they don't rob us.
00:23:14.060 And we should just get rid of prisons.
00:23:18.240 There is some amount of money.
00:23:19.600 If we save all that prison money, we'll just give all of our money to the criminals,
00:23:23.880 and then they're going to let us go.
00:23:25.780 And I think that would be one way to defund police, is to just pay the criminals directly
00:23:31.980 until they have no reason to rob a liquor store.
00:23:36.720 Now, they still might rob a liquor store because it's easier than taking out your wallet,
00:23:41.880 and apparently there's no penalty for it.
00:23:43.840 So it might, you know, the convenience factor still has to be factored in,
00:23:48.220 because it might be just more convenient to pick up the liquor and just walk directly out the door,
00:23:52.920 especially if there's like one or two people in line.
00:23:55.420 Don't you hate that?
00:23:56.840 Like you want to pay for something, you plan to pay for it,
00:23:59.520 but there's two people in line, and you're like,
00:24:01.380 what time is it?
00:24:02.460 I want to go and drink my liquor.
00:24:05.500 So just cut the line, walk out the door.
00:24:08.680 We don't have law in California.
00:24:10.800 So that would be a good idea.
00:24:12.380 But anyway, the point is that all of that should be in the reparations recommendation,
00:24:16.200 because you don't want people to think you didn't think it through.
00:24:23.280 You know, people need to think you thought it through.
00:24:27.440 So that's your persuasion tip for today.
00:24:32.460 Also, there's a big unreported thing.
00:24:37.220 I'm a little bit disgusted with all of the videos that we see that are sort of lopsided, you know, race-wise.
00:24:45.640 You know, especially in Fox News and on the right side of social media,
00:24:50.460 you see a lot of videos of, it seems like they're focusing only on the few videos of black people hitting non-black people.
00:24:59.240 You know, white or Asian-American or stuff like that.
00:25:03.920 And if you think about all the videos that they're suppressing,
00:25:08.720 it really makes you wonder about these algorithms.
00:25:12.000 For example, I've never seen a video of an Orthodox Jew beating up a black person.
00:25:18.520 You know they exist, but it's probably an algorithm thing.
00:25:24.620 So the other thing is, when I watch those videos, I always have the same feeling.
00:25:32.400 Do you?
00:25:33.680 And I know we get so biased because it's only one type we're seeing.
00:25:37.320 It's just one type we're seeing, one type, one type.
00:25:39.260 And there's no way that doesn't affect your brain and make you more biased.
00:25:43.560 And when I watch them, I just think the same thing every time.
00:25:49.040 Whoever talks about all the hand injuries to the attackers?
00:25:53.780 Every time I see one of them, I see people using their bare hands and punching people into these hard skulls.
00:25:59.700 And they're usually hitting, like, the hardest part of the body.
00:26:02.020 They're not even hitting soft parts.
00:26:03.300 Like, the soft parts are usually using their boots.
00:26:07.120 But they're hitting barehanded.
00:26:09.900 There's a reason that boxers wear those big gloves.
00:26:12.700 Did you know that?
00:26:13.900 That's like to protect their hands.
00:26:15.460 And they're professionals.
00:26:17.620 Professional boxers are protecting their hands with these big things.
00:26:21.420 So if you're an amateur and you just sort of spontaneously get into one of these fights,
00:26:26.840 there have to be a lot of hand injuries.
00:26:30.100 And nobody's talking about that.
00:26:31.520 Nobody's talking about that.
00:26:34.540 So I thought we should talk about that.
00:26:37.720 All right, Vivek, here's some Vivek Ramaswamy persuasion about January 6th.
00:26:45.800 So he's using an analogy here.
00:26:49.100 He says, if you're prosecuted for an alleged bank robbery,
00:26:52.560 you get to see all the video footage of what happened.
00:26:56.420 Not just the time your face is caught on camera at the site.
00:26:59.540 That's basic constitutional law and criminal procedure.
00:27:04.020 No one should ever be convicted of a crime without seeing all potential exculpatory evidence.
00:27:10.680 This is not a right-wing or left-wing issue.
00:27:13.260 Justice demands it.
00:27:14.940 High ground maneuver.
00:27:18.380 That's the high ground maneuver.
00:27:19.520 So I love the fact that he changes the frame to really a constitutional thing that every person would agree with, basically.
00:27:30.620 Literally nobody would disagree, right?
00:27:33.020 The thing that makes it a high ground maneuver is that there is no argument against it.
00:27:39.860 Did you notice that?
00:27:40.720 See, there are tons of professional politicians who will make an argument when there's an argument against it.
00:27:50.520 Now, sometimes you can't avoid that, right?
00:27:52.220 But if you have a choice, and you can make your argument in a way that nobody can argue,
00:27:57.340 well, that's the best you can do.
00:28:00.720 You can't beat that, like, by definition.
00:28:03.880 You can't beat something that just shuts everybody down.
00:28:07.360 Who exactly is going to argue that the accused should not have access to the evidence?
00:28:14.880 Most basic thing in America, right?
00:28:17.220 Can't get more basic than that.
00:28:19.620 So, yeah, that's what Ramaswamy brings to the game,
00:28:23.120 and I think he's going to make all of the Republicans better.
00:28:27.900 Because he, once again, once again, this is like, how many times have we seen it?
00:28:33.100 He set the standard for how to talk about it.
00:28:37.100 And I think that Republicans have done a poor job in the past in finding frames that are the high ground.
00:28:44.700 They go to the partisan stuff.
00:28:48.700 Now, the partisan stuff is how you won stuff in the past,
00:28:51.900 but nobody's ever tried to use a high ground where everybody could be happy.
00:28:58.860 Politicians typically don't know how to do it.
00:29:01.040 It's actually a rare skill.
00:29:03.300 So if you think this is an accident, this isn't an accident.
00:29:08.260 Ramaswamy actually knows how to do this.
00:29:11.540 Right?
00:29:11.960 You know who else was good at it?
00:29:13.420 For a while.
00:29:15.120 Obama was good at it for a while.
00:29:17.280 Yeah, he got a little more partisan, but...
00:29:19.740 For a while he was good at it.
00:29:21.900 All right, so Tucker talked to a security guard who was working on January 6th,
00:29:27.020 and everything about January 6th is disgusting.
00:29:34.800 Am I right?
00:29:36.260 Like, everything about it is not just wrong or inaccurate,
00:29:40.340 not just fake news, not partisan.
00:29:43.040 It's disgusting.
00:29:45.000 It's just disgusting.
00:29:46.700 And I'd have to say, that should be the one thing we can agree on, right?
00:29:50.160 Left and right.
00:29:51.440 That everything about this is just disgusting.
00:29:55.060 How happy am I that violent people...
00:29:58.600 There were some violent people, right?
00:30:00.860 We don't know the percentage.
00:30:02.160 But how happy am I that people that I would identify...
00:30:05.380 I would have identified as roughly on my team at the time.
00:30:09.100 And they go and, like, ruin things for all of us.
00:30:14.020 I'm disgusted by that.
00:30:16.080 The violence.
00:30:16.900 Disgusted.
00:30:18.160 Right?
00:30:18.460 Violence is bad enough.
00:30:19.780 All violence is terrible.
00:30:21.360 But this is violent and disgusting.
00:30:25.660 Right?
00:30:25.760 The way the news treated it is disgusting.
00:30:30.900 The way Congress, you know, did their special hearing,
00:30:34.580 you know, at least some part of it was total bullshit,
00:30:37.320 is disgusting.
00:30:39.420 They were sending Americans to jail
00:30:41.780 knowing that they were...
00:30:44.240 Apparently, if you were to take the current reporting at face value,
00:30:49.100 it does look like there's a good argument for evidence was withheld.
00:30:54.600 I mean, I guess that's a fact, right?
00:30:57.360 I think we could say it's a fact
00:30:58.760 that the defendants did not have access to this video.
00:31:03.120 So that's just a fact, right?
00:31:04.780 That's not an opinion.
00:31:06.560 So that's the end of the story.
00:31:09.240 That should be the end of the story, right?
00:31:12.640 Why in the world is anybody keeping these people in jail?
00:31:16.920 At this point, whether or not they committed a crime
00:31:20.060 can't be the question.
00:31:22.020 That we cannot reframe this as did they commit a crime.
00:31:26.320 And again, I'm only talking about the nonviolent ones.
00:31:29.460 Everybody understands that, right?
00:31:31.480 A hundred percent of us, I think,
00:31:35.100 oppose the violence, whether you're left or right.
00:31:38.160 You know, close to a hundred percent of us.
00:31:41.380 But, you know, the left wants to make it,
00:31:44.820 whatever the violence is,
00:31:46.540 characterize the entire thing,
00:31:47.820 which is totally disgusting.
00:31:50.800 Just less disgusting.
00:31:52.560 But, it might be no more disgusting
00:31:55.880 than the way, you know, the right characterized Black Lives Matter.
00:31:59.740 So here's a question for you,
00:32:01.420 the question of the day.
00:32:02.940 What percentage of a crowd of protesters
00:32:05.460 would have to be violent
00:32:08.400 and or destroying property, let's say,
00:32:10.840 before you would say it's a violent protest?
00:32:13.820 What percent of actual participants were violent?
00:32:19.180 Because it doesn't need to be that big.
00:32:22.320 You know, and I think this is where the bias comes in.
00:32:25.160 I think if you're on the right
00:32:27.120 and you see Black Lives Matter
00:32:28.780 and one percent of them are violent,
00:32:31.080 it's going to feel like 30 percent
00:32:32.420 because that's what was on the news.
00:32:34.380 And you're going to say, that's too much.
00:32:35.960 But you wouldn't know what percentage it was.
00:32:38.600 I have no idea.
00:32:39.520 If you said, Scott, gun to your head,
00:32:43.160 you must come up with an estimate
00:32:45.600 of what percentage of the Black Lives Matter rioters
00:32:49.600 were actually destroying things or violent.
00:32:52.600 And I'd have to say, first of all, I have no idea.
00:32:56.240 Do you?
00:32:57.480 Like, I couldn't even guess.
00:33:00.140 If you put the gun to my head, I'd say one percent.
00:33:05.760 Gun to head, one percent.
00:33:07.200 And that's maybe something in that neighborhood
00:33:11.880 if we round.
00:33:13.660 If you round, it's probably around one percent
00:33:15.660 for these protesters as well.
00:33:18.160 What would you say?
00:33:20.880 Some are saying, at its highest, 25 percent
00:33:22.960 for Black Lives Matter.
00:33:26.520 If 25 percent of those crowds
00:33:28.800 were violent and destroying things,
00:33:31.660 the rate of destruction would be
00:33:33.660 way beyond what we've seen, I think.
00:33:37.200 But that's without data.
00:33:39.240 That's just sort of living in the world.
00:33:41.340 It has that feeling about it.
00:33:42.800 You know what I mean?
00:33:43.980 Sort of my collective experience
00:33:46.880 of the world says that if 25 percent
00:33:48.740 of those crowds were destroying things,
00:33:50.840 there would be no cities left.
00:33:53.440 I think it's one percent.
00:33:55.000 But I don't know.
00:33:55.580 So before you decide that Black Lives Matter
00:33:59.320 was or was not violent in their protests
00:34:01.880 or the January Sixers were or were not violent,
00:34:05.400 you're going to have to figure out
00:34:06.480 what percentage you would say makes them violent.
00:34:09.700 And then also, it would be fair
00:34:12.100 if you treated both sides roughly the same.
00:34:15.200 That would be fair.
00:34:17.240 In your thinking, it would be fair.
00:34:19.620 So I'd love to know that.
00:34:21.000 But anyway, so Tucker talks to the security guard
00:34:24.940 and it's important to the story
00:34:27.160 that you know he was a black guy.
00:34:29.200 He was a Biden voter, Biden voter.
00:34:31.800 And the way he was treated is disgusting.
00:34:36.940 Now, we may be missing some facts,
00:34:39.980 but the facts that are reported based on his story
00:34:43.140 is that he was there that day.
00:34:45.140 He was a Capitol Police guy.
00:34:46.680 He says they were not informed
00:34:49.820 that the protest was going to be as big as it was.
00:34:53.740 So he thought that his management failed him
00:34:56.060 for reasons he doesn't know.
00:34:57.940 And what happened was, he says,
00:35:00.200 that as he was walking through the crowd,
00:35:03.820 somebody put a MAGA hat on his head.
00:35:07.420 And he quickly realized
00:35:09.480 that that was the safest thing he could do
00:35:11.460 to walk through the crowd.
00:35:13.820 He must have still had his uniform on,
00:35:15.820 but once he had the MAGA hat on,
00:35:18.640 and he's black, right?
00:35:20.740 So if you see a black guy in a Capitol uniform
00:35:24.040 with a MAGA hat in the middle of the protest,
00:35:28.180 I'm guessing that makes you completely safe.
00:35:33.100 Would you agree?
00:35:34.940 Like that hat would be like a force field.
00:35:38.440 So he puts on this hat
00:35:39.700 in the middle of this dangerous chaos, right?
00:35:43.220 Very unpredictable, dangerous chaos,
00:35:45.720 and violence was happening.
00:35:47.580 And he puts on the hat to keep himself safe.
00:35:50.260 Clearly not a supporter.
00:35:52.560 Clearly not a supporter.
00:35:54.300 Just thinking fast and being smart.
00:35:58.280 I mean, how would you just...
00:35:59.880 I can't describe that in any way other than smart.
00:36:03.100 That's the only description, right?
00:36:05.660 He got fired for it.
00:36:07.740 He got fired.
00:36:09.500 Because there's a picture taken of him in the hat.
00:36:12.020 And then he wasn't interviewed by the January 6th people.
00:36:20.280 I think...
00:36:20.720 Oh, I'm sorry.
00:36:21.540 Yeah, I'm getting the story a little bit wrong.
00:36:24.320 Somebody's correcting me here.
00:36:25.900 He was put on some kind of leave indefinitely,
00:36:29.240 and then he quit during the leave,
00:36:32.160 which cost him his pension or something like that.
00:36:34.900 Suspended.
00:36:36.260 Suspended, put on leave, something like that.
00:36:38.040 Yeah, so basically he got a penalty
00:36:42.500 for doing a smart thing.
00:36:45.120 And I don't know if there's a counter-argument.
00:36:47.560 I just feel like everybody turned into a turd
00:36:50.080 at the same day.
00:36:51.840 Like the press, you know,
00:36:53.800 everybody talking about it practically.
00:36:55.900 We all turned bad one way or another.
00:36:58.600 But it's like everything was bad from top to bottom.
00:37:01.520 Like it was this moment of, you know,
00:37:04.240 just extraordinary, disgusting ugliness
00:37:07.840 that swept over everybody for a little while.
00:37:10.680 It was like a mass hysteria
00:37:12.480 that took too many people in.
00:37:15.280 I'd just like to forget the whole thing.
00:37:18.760 You know, I guess we can't forget it.
00:37:20.000 We have to figure out what happened.
00:37:21.220 But the faster we get past that, the better.
00:37:25.620 All right.
00:37:26.000 Of all the many terrible things,
00:37:33.100 this one bothered me the most
00:37:34.640 because he got punished for doing something smart.
00:37:38.420 Like that just, that hurts my head so hard.
00:37:41.420 As the creator of the Dilbert comic, right?
00:37:45.320 Like that's right in my feels.
00:37:48.660 I hate that.
00:37:49.280 All right.
00:37:52.120 And then I guess there's video now, Tucker says,
00:37:55.400 that the narrative about Brian Sicknick
00:37:58.880 by some people,
00:38:01.560 and I don't know who had it right
00:38:02.800 and who had it wrong in the media,
00:38:04.620 was wrong.
00:38:06.500 I feel as if the media was a little more accurate
00:38:09.460 that he was not killed by the protesters.
00:38:12.960 But I feel that the politicians
00:38:14.740 kept conflating his death
00:38:16.420 as if the protesters did it.
00:38:18.840 Does that feel right to you?
00:38:21.820 New York Times tells the fake story.
00:38:24.340 So on Twitter, there was a community note
00:38:26.320 put up on a tweet
00:38:27.400 that said the Washington Post
00:38:29.480 and I think somebody else, Reuters maybe,
00:38:32.100 reported it correctly,
00:38:34.140 which would mean some of the media got it right.
00:38:37.280 But I haven't confirmed that.
00:38:40.220 My best guess is that the media,
00:38:43.200 some got it right, some got it wrong.
00:38:44.700 And as long as some of the media got it wrong,
00:38:47.600 that gave the politicians cover
00:38:49.660 to intentionally get it wrong.
00:38:52.440 I think that's what happened.
00:38:54.300 Now, what do you think of that?
00:38:56.560 Now, I don't mind so much
00:38:59.060 that the media got a story wrong.
00:39:01.480 You have to live in a world
00:39:02.820 where stories are wrong sometimes.
00:39:06.980 Case in point.
00:39:09.600 But, I don't know,
00:39:12.000 this is just hideous.
00:39:13.700 Hideous behavior by Congress.
00:39:16.980 They should be in jail.
00:39:18.980 The January 6th people
00:39:20.640 who perpetrated this hoax.
00:39:23.300 And I'm going to call it a hoax
00:39:24.360 because they left out,
00:39:26.500 they did not provide the exculpatory
00:39:29.400 or potentially exculpatory evidence.
00:39:31.480 I think with the QAnon shaman,
00:39:33.780 the video is highly exculpatory.
00:39:37.620 Highly.
00:39:38.080 I don't know how much it is
00:39:39.800 for the other people,
00:39:40.740 but at least a little bit probably.
00:39:43.400 So yeah, this is just disgusting.
00:39:45.540 All right, so I'm revising my opinion
00:39:47.140 about the cartel attack
00:39:50.780 on a car of four Americans
00:39:52.320 who went into Mexico.
00:39:54.340 Some cartel members shot them up,
00:39:55.900 killed two, tragically.
00:39:57.700 My first thing was that
00:40:00.500 they knew they were Americans.
00:40:02.300 So in the initial reporting,
00:40:04.620 I felt like they knew they were Americans
00:40:06.380 and they took them hostage.
00:40:08.180 That didn't happen.
00:40:10.000 It looks like they were mistaken
00:40:11.700 for a Haitian gang.
00:40:13.620 It looks like, because they were black,
00:40:15.280 they mistook them.
00:40:17.280 They might have avoided,
00:40:19.600 maybe avoided a checkpoint
00:40:20.760 after they went through or something.
00:40:22.640 So there's something that didn't look right
00:40:24.600 and the cartel guards who,
00:40:28.140 apparently there's two checkpoints.
00:40:30.180 You get across the border legally
00:40:32.180 and then as soon as you're over,
00:40:33.600 the cartel checks you a second time.
00:40:35.880 Did you know that?
00:40:37.300 Did you know that the cartel
00:40:38.680 has its own guard post at the border?
00:40:42.820 Yeah.
00:40:43.820 If you didn't know that.
00:40:47.280 So this is the sort of thing
00:40:50.560 that might spark a war,
00:40:52.960 but my take was
00:40:54.300 if they would brazenly kidnap Americans,
00:40:56.880 you'd just have to turn up the heat to 100.
00:40:59.600 Not 100, but you've got to turn it up
00:41:02.240 to laser quality.
00:41:04.080 But apparently they did not.
00:41:07.040 However, there do seem to be
00:41:09.380 lots of actual other American kidnapping cases.
00:41:12.720 There are probably cartel cases too.
00:41:14.460 So I wouldn't make my case
00:41:17.760 for attacking Mexico
00:41:19.240 based on this event,
00:41:20.980 as tragic as it was,
00:41:22.720 but it does make everybody's brains think
00:41:26.600 in a more aggressive fashion, I think.
00:41:30.020 So Lindsey Graham's getting serious
00:41:31.620 about this now.
00:41:33.900 The Congress seems to be changing.
00:41:36.720 But I was watching The Five yesterday,
00:41:39.020 and they had a graphic
00:41:39.760 about how many cities in the U.S.
00:41:42.140 the cartels have already set up shop.
00:41:45.780 Don't ever look at that graphic
00:41:47.460 if you have a choice.
00:41:50.600 It will mess up your brain.
00:41:53.120 I don't know the real size of the problem,
00:41:55.840 because it could be, you know,
00:41:57.200 five people in a city
00:41:58.540 count as MS-13 or something.
00:42:00.920 I don't know what it takes
00:42:02.240 to count as having a foothold.
00:42:04.500 But it's a lot of cities.
00:42:08.080 It looked like 100 cities.
00:42:10.540 It's taken, like,
00:42:11.540 the bottom two-thirds of the country, basically.
00:42:14.820 And, yeah,
00:42:17.800 but I don't know if they're influencing
00:42:19.540 the police departments yet.
00:42:22.480 Presumably, if they grow, they would.
00:42:24.620 You know, there's some rumor
00:42:25.820 that they're taking over from other gangs.
00:42:27.820 I don't know the extent of that.
00:42:32.040 But it's...
00:42:33.500 I think the country's getting pretty serious
00:42:35.800 about doing something about it for a change.
00:42:39.020 However, I do take counsel
00:42:40.800 from Geraldo Rivera,
00:42:43.280 who's, you know, anti...
00:42:44.820 sounds like he's anti-using military in Mexico.
00:42:48.400 And here's the argument against it.
00:42:52.600 Wars never work, and they never end.
00:42:54.480 So that would be the argument against it.
00:42:58.920 It's just another way
00:42:59.940 for the military-industrial complex
00:43:02.320 to make money.
00:43:03.340 It'll never end,
00:43:04.640 and it'll never work.
00:43:07.060 Now, that's not terrible.
00:43:09.820 That's not a terrible opinion.
00:43:11.840 I actually found myself persuaded
00:43:14.080 because there's such a long history
00:43:16.000 of wars not working,
00:43:17.500 for America anyway.
00:43:19.380 And so I think you have to take that seriously.
00:43:22.900 However, as Greg Gottfeld pointed out
00:43:27.000 on the same show,
00:43:28.420 they're killing 100,000 a year now.
00:43:31.820 How much worse could it be?
00:43:34.400 Would it be worse?
00:43:36.160 I don't know.
00:43:37.540 It might be worse in a different way
00:43:39.040 because you'd expect some pushback.
00:43:41.800 But I don't know.
00:43:43.320 My instinct is that if you do nothing,
00:43:46.240 the 100,000 gets bigger.
00:43:47.540 And it's already, you know,
00:43:50.860 completely out of any reasonable range
00:43:54.760 of tolerance.
00:43:55.840 I mean, it's so far away
00:43:57.140 from anything you would tolerate.
00:43:59.060 But let's consider the alternative.
00:44:04.880 Full legalization.
00:44:08.640 Would that just make the cartels
00:44:10.860 own a legal business?
00:44:13.180 And that's the only change?
00:44:14.340 They would just figure a way
00:44:15.800 to make it legal
00:44:16.560 and then they'd still be in business?
00:44:18.980 Or would it only work
00:44:20.780 if the government provided
00:44:21.900 the drugs for free?
00:44:23.880 Or at the same price, let's say.
00:44:26.640 Could you drive the cartels
00:44:28.080 out of business
00:44:28.680 by taking away their source?
00:44:33.600 The trouble is,
00:44:34.520 we only have ideas that can't work.
00:44:37.760 So here are the three ideas
00:44:39.360 that can't work.
00:44:40.820 Do nothing.
00:44:42.380 That can't work.
00:44:43.120 Go to war.
00:44:46.200 It might work,
00:44:47.020 but, you know,
00:44:47.680 the odds are not as good
00:44:49.120 as you'd like.
00:44:51.080 Or legalize it,
00:44:52.600 which would probably,
00:44:53.780 in the short run,
00:44:54.800 cause way more deaths.
00:44:57.080 But it would put the cartels
00:44:58.520 out of business
00:44:59.220 after we've addicted more people.
00:45:02.940 I don't know.
00:45:04.680 You know,
00:45:04.980 does the supply and demand
00:45:06.580 actually cause more people
00:45:08.720 to be addicted
00:45:09.380 if it's easier
00:45:12.880 and safer to get?
00:45:14.940 Or is it just that
00:45:16.100 the people who want
00:45:16.780 to be addicted
00:45:17.300 already have access
00:45:19.000 and it wouldn't make
00:45:19.980 any difference at all?
00:45:21.760 So it feels like
00:45:23.100 every path doesn't work.
00:45:26.360 Doesn't it?
00:45:27.280 It just feels like
00:45:28.060 all of them don't work.
00:45:29.660 But doing nothing
00:45:30.540 seems like the dumbest.
00:45:31.640 You know,
00:45:33.780 if the war makes it worse,
00:45:35.220 I suppose we could
00:45:35.880 stop doing it.
00:45:36.880 But we never do.
00:45:39.500 So,
00:45:40.280 I don't know.
00:45:41.160 I'd be tempted
00:45:41.880 to annex Mexico.
00:45:42.880 I mean,
00:45:49.260 I suppose you could try
00:45:50.620 getting permission
00:45:51.800 to use special forces
00:45:53.000 and stuff.
00:45:54.240 But,
00:45:54.600 I don't know.
00:45:56.020 Would that make
00:45:56.500 any difference?
00:45:58.920 We'll see.
00:46:00.100 I think military
00:46:01.240 is inevitable
00:46:01.840 at this point.
00:46:03.280 Here's what
00:46:03.940 the world needs.
00:46:05.180 The world needs
00:46:05.800 what I call
00:46:06.240 a Zoom government
00:46:07.220 or a government
00:46:08.360 in a box
00:46:09.140 for situations
00:46:10.780 in which a country
00:46:11.980 would be temporarily
00:46:13.620 without a government,
00:46:15.480 usually because of
00:46:16.480 a war or revolution
00:46:17.520 or something.
00:46:18.700 So,
00:46:19.460 wouldn't it be good,
00:46:20.680 and I'll just use
00:46:21.640 the Swiss
00:46:22.120 as my universal
00:46:23.400 neutral country,
00:46:24.820 imagine you had
00:46:25.680 a Swiss entity
00:46:27.400 that was,
00:46:28.640 like,
00:46:28.940 organized as already
00:46:30.040 a government.
00:46:31.460 And they would go in
00:46:32.600 and they would act
00:46:33.240 as a government
00:46:33.840 for a six-month period
00:46:35.660 for any country
00:46:37.040 that temporarily
00:46:38.520 didn't have one.
00:46:40.080 They would be,
00:46:40.820 let's say,
00:46:41.120 under UN supervision,
00:46:43.080 something like that,
00:46:43.780 you know,
00:46:43.940 just so there's
00:46:44.400 a little bit
00:46:44.740 of credibility.
00:46:45.740 But the deal is
00:46:46.580 they have to leave
00:46:47.680 in six months.
00:46:49.320 They have to.
00:46:50.580 There's just no option.
00:46:52.260 They've got to leave
00:46:52.820 in six months,
00:46:53.520 even if they haven't
00:46:54.240 fixed anything,
00:46:55.540 right,
00:46:55.760 even if they haven't
00:46:56.440 fixed anything.
00:46:57.240 Because if you can't
00:46:57.940 get something going
00:46:58.560 in six months,
00:46:59.440 probably you never will,
00:47:01.760 right?
00:47:02.480 So,
00:47:03.380 I think we need
00:47:05.380 something like that.
00:47:06.140 I saw a bunch of people
00:47:06.940 yelling,
00:47:07.320 no,
00:47:08.340 what's the argument
00:47:11.800 against it?
00:47:13.680 You could make it
00:47:14.600 an international group.
00:47:16.280 Just people who are
00:47:17.280 just a safe,
00:47:19.000 competent,
00:47:19.680 maybe they're older,
00:47:20.840 it would be good
00:47:21.200 if they're older
00:47:21.720 so they don't want
00:47:22.640 to stay there forever.
00:47:24.320 They just take over
00:47:25.120 for a while,
00:47:25.860 keep the lights on,
00:47:27.820 and then you phase
00:47:28.640 them out willingly.
00:47:30.980 If they're well paid,
00:47:32.520 it would be not too big
00:47:34.200 of a risk
00:47:34.580 that they would try
00:47:35.120 to stay.
00:47:36.200 And I don't think
00:47:37.080 that a foreign power
00:47:38.640 could control
00:47:39.480 the military very easily,
00:47:41.800 right?
00:47:42.580 So I wouldn't worry
00:47:43.460 about the government
00:47:44.440 in a box coming in
00:47:45.600 and then taking over
00:47:47.740 the country,
00:47:48.380 because they wouldn't
00:47:49.200 have the loyalty
00:47:49.800 of the military.
00:47:51.200 The military,
00:47:52.280 at best,
00:47:53.440 would say,
00:47:53.960 all right,
00:47:54.220 we've got a problem here,
00:47:55.820 see if you can work
00:47:56.580 it out in the next
00:47:57.180 six months
00:47:57.660 and then turn it
00:47:58.260 back over to us.
00:48:01.540 Economic collapse?
00:48:02.760 Well,
00:48:02.980 it would be better
00:48:03.600 than no government
00:48:04.840 at all.
00:48:08.760 Let's check in
00:48:09.600 on the Biden competency
00:48:11.080 for handling
00:48:11.780 this Mexico fentanyl
00:48:15.080 problem and MS-13.
00:48:17.240 To check competency,
00:48:18.600 we'll check in
00:48:19.100 with Jean-Pierre,
00:48:21.220 see what she said,
00:48:22.040 spokesperson.
00:48:23.180 She said that
00:48:23.980 fentanyl is currently
00:48:25.340 at historic lows,
00:48:27.520 that seizures are at historic levels,
00:48:28.560 under the Biden presidency.
00:48:30.680 All right.
00:48:30.840 So the Biden administration,
00:48:34.520 according to the
00:48:34.960 spokesperson,
00:48:36.380 can't tell the difference
00:48:38.060 between how much
00:48:40.740 they catch
00:48:41.460 and how much
00:48:42.500 is getting in.
00:48:44.520 They actually
00:48:45.560 can't tell the difference
00:48:47.500 between measuring
00:48:49.640 how much you catch
00:48:50.720 and knowing
00:48:52.500 how much actually
00:48:53.380 got in
00:48:53.780 that you didn't catch.
00:48:55.740 As if they can't
00:48:57.100 tell the difference.
00:48:57.920 And that's
00:49:00.340 who's in charge
00:49:00.940 of the...
00:49:01.800 Right.
00:49:02.600 Now,
00:49:03.920 if you say to me,
00:49:05.260 Scott, Scott,
00:49:05.780 that's the spokesperson.
00:49:07.460 That's just the spokesperson.
00:49:08.780 She sometimes
00:49:09.560 has a gaffe.
00:49:11.140 She's been saying
00:49:11.880 this for a long time.
00:49:14.000 It wasn't just yesterday.
00:49:16.140 Am I right?
00:49:17.520 She's been saying
00:49:18.340 the same thing
00:49:19.080 for a long time.
00:49:19.960 They act as though
00:49:22.360 they're not just lying
00:49:24.240 that they can't
00:49:25.080 tell the difference.
00:49:26.420 That they actually
00:49:27.420 can't tell the difference.
00:49:29.340 Like, actually.
00:49:31.100 That's what it looks like.
00:49:32.520 I mean,
00:49:32.840 you could say,
00:49:33.380 yeah,
00:49:33.580 it's just spin,
00:49:34.200 but
00:49:34.380 it doesn't look like it.
00:49:37.380 It looks like
00:49:37.880 they can't tell
00:49:38.400 the difference.
00:49:40.720 All right.
00:49:41.560 Don't know.
00:49:42.100 I don't want to read
00:49:42.640 her mind.
00:49:43.120 Maybe she can't
00:49:43.640 tell the difference.
00:49:45.100 That would be
00:49:45.680 even worse.
00:49:46.160 Has anybody seen
00:49:48.460 this new beauty filter
00:49:49.620 on TikTok
00:49:50.180 where all you do
00:49:51.820 is turn on the filter
00:49:52.680 and in real time
00:49:53.900 you look like
00:49:55.200 a beautiful version
00:49:56.220 of yourself?
00:49:58.580 It's scary.
00:50:00.700 Oh, and there's
00:50:01.280 a pedo one
00:50:02.520 where a man
00:50:04.100 could be a beautiful woman
00:50:05.180 and stuff like that.
00:50:06.820 So basically
00:50:07.740 you can turn
00:50:08.440 into anything
00:50:09.060 and you can't
00:50:10.000 tell anymore.
00:50:11.340 What's different
00:50:12.100 about it is
00:50:12.840 you can't tell.
00:50:14.640 You actually
00:50:15.280 can't tell.
00:50:16.960 And
00:50:17.360 there's like
00:50:19.280 good news
00:50:19.760 and bad news.
00:50:20.880 The good news
00:50:21.600 is
00:50:21.980 I'm going to
00:50:24.720 look a lot
00:50:25.180 younger
00:50:25.540 in about a year
00:50:26.280 because it seems
00:50:28.840 to me that
00:50:29.380 Zoom
00:50:29.760 and all
00:50:31.620 of these services
00:50:32.260 at the very least
00:50:33.620 they would have
00:50:34.120 a makeup option.
00:50:36.120 Am I right?
00:50:37.200 So there's somebody
00:50:37.840 like me
00:50:38.240 who doesn't want
00:50:38.700 to put makeup
00:50:39.220 on to do a live stream.
00:50:40.760 I would just
00:50:41.300 hit a button
00:50:41.820 and it would
00:50:42.600 just give me
00:50:44.500 a look
00:50:44.980 as if I had
00:50:45.560 TV makeup
00:50:46.180 on.
00:50:47.180 And you
00:50:47.560 wouldn't be
00:50:47.880 able to tell
00:50:48.240 the difference.
00:50:48.800 There wouldn't
00:50:49.740 be any
00:50:50.180 pixels floating
00:50:51.420 or anything.
00:50:51.840 It would just
00:50:52.160 look exactly
00:50:52.840 like me
00:50:53.400 but better.
00:50:55.180 And I could
00:50:56.220 make myself
00:50:57.040 younger.
00:50:58.180 I could remove
00:50:59.000 wrinkles.
00:51:00.180 Anything I want
00:51:01.000 apparently.
00:51:02.360 So what's that
00:51:03.280 going to do
00:51:03.680 to podcasting
00:51:05.280 when you're
00:51:06.360 looking at
00:51:06.780 somebody who's
00:51:07.300 a perfect
00:51:07.720 reproduction
00:51:08.280 but the
00:51:09.080 better version?
00:51:09.700 I guess you
00:51:11.700 could argue
00:51:12.060 that's what
00:51:12.480 movies and
00:51:13.000 TV have
00:51:13.620 always been
00:51:14.080 right?
00:51:15.100 If you see
00:51:15.760 the real
00:51:16.140 movie star
00:51:16.740 they don't
00:51:17.940 quite look
00:51:18.440 like they
00:51:19.140 look in
00:51:19.420 the movies.
00:51:22.180 Although I heard one exception to that.
00:51:26.800 So when Angelina Jolie was in her, let's say, movie-making prime. I don't want to say her prime. I'll say her movie-career prime.
00:51:36.180 Did you see how cleverly I danced around that? Don Lemon. Take a lesson, Don Lemon.
00:51:42.140 So when she was in her movie-making prime, I was photographed by somebody who had just photographed her not too long ago.
00:51:51.960 And I asked him about that.
00:51:54.060 Apparently, I think she showed up alone, like at sort of the height of her fame. She showed up to a photo shoot alone. Didn't need anybody.
00:52:02.720 And he said that she was the most stunning person, in person, he'd ever seen.
00:52:08.900 So apparently her movie charisma perfectly translated into a one-on-one private situation.
00:52:18.300 He couldn't stop talking about what it was like to be in the same room.
00:52:23.880 And he was a celebrity photographer, so he'd done all the actresses and models and stuff.
00:52:30.540 But she was the one he said, yeah, it's the same in the room.
00:52:35.340 That was interesting.
00:52:37.340 All right.
00:52:40.360 Here's a thing that I just figured out today. And maybe some of you already knew this.
00:52:47.500 Forever I've been asking you, what's the deal with everybody hating Soros? Right?
00:52:53.700 And everybody gets mad at me. But people wouldn't explain why.
00:52:58.820 Now of course there's the vague Jewish thing. So I think, oh, it's anti-Semitic, right? It's just an anti-Semitic trope.
00:53:06.060 But I couldn't get any more knowledge or information about where it comes from.
00:53:12.700 Why him specifically? You know, what's this business?
00:53:16.860 And I finally went down the conspiracy theory rabbit hole to figure out what's going on.
00:53:23.700 How many of you knew the following thing?
00:53:27.320 That when, not everybody, right? Most of my generalizations are not everybody.
00:53:34.100 But when people on the right, not all the people on the right, some of the people on the right, when they talk about Marxism, they're really talking about Jewish people trying to take over the world.
00:53:46.800 Do I have that right? But nobody's willing to say that out loud.
00:53:50.320 That's the conspiracy theory, right?
00:53:52.440 It comes from some document I'm not going to name, from like 1920, a fake document that said that Marxism was really a cover for some kind of Jewish takeover of the world.
00:54:07.320 Now is that why you're trying to not tell me? Because you don't want to say that out loud?
00:54:11.200 By the way, there's no evidence of that. It would be crazy.
00:54:15.520 Now let me tell you, if there's anybody on here... there are a few yeses.
00:54:20.220 Most of you are saying no, but there are some yeses.
00:54:24.640 So just look at the other comments. There are some yeses.
00:54:27.640 And that means that some people were aware of this.
00:54:31.060 I just heard this theory yesterday, actually, and finally pieced it together.
00:54:39.000 So I was always confused why people would use the word Marxist.
00:54:44.520 Do you know why?
00:54:46.160 Because I don't think it's a persuasive word.
00:54:49.780 And I kept wondering, why does everybody say she's a Marxist, they're a Marxist, BLM is Marxist?
00:54:57.260 I never understood it.
00:54:59.460 Because if you're calling somebody Marxist, when I hear that, I go, is it a different economic theory? Like, why are you using that word?
00:55:13.060 So I just wondered how much of an anti-Semitic variable is built into that word when people use it.
00:55:21.480 Because I don't think I'd ever use that word.
00:55:24.080 I'm not saying that there's something going on.
00:55:29.180 All right. So here's my take on conspiracy theories.
00:55:32.860 Here's how to tell what is not a conspiracy theory.
00:55:36.000 When too many people are allegedly involved, that's never a conspiracy theory.
00:55:41.800 Never. Never. This is never a conspiracy theory.
00:55:45.340 So this whole worldwide Marxist thing, the idea that it's really a Jewish plan to take over the world, imagines that there are plotters everywhere and nobody's talking about it, but they're all connected.
00:55:59.680 That's never a thing. That is never a thing. I guarantee you that's not a thing.
00:56:07.240 But if you told me that 50 Intel people who knew each other signed a document and conspired to lie about Hunter's laptop, I'd say 50 is a lot.
00:56:19.820 But if they were all Intel people who knew each other, I could see that.
00:56:24.120 So that's not too many people, given that they're Intel people who know each other.
00:56:28.660 But as soon as you say worldwide, there's nothing like that.
00:56:36.240 Nobody can maintain a worldwide 100-year plan.
00:56:43.200 Now, that doesn't mean that there's nobody who ever said it 100 years ago.
00:56:48.840 It doesn't mean there aren't people who actually have that belief.
00:56:54.880 But there's somebody who has every belief.
00:56:59.200 So I feel like the whole we-don't-like-Marxism thing, even if it's the exact correct word, is not persuasive.
00:57:08.860 And it has this tinge of anti-Semitism on it that doesn't seem persuasive.
00:57:15.160 Like, why would you use a word that's already slimed by anti-Semitism?
00:57:21.040 You wouldn't want to use that word. It's not persuasive.
00:57:24.020 So I don't know what the alternative is, because the word Marxism is being used to describe a larger thing, but that larger thing is not really Marxism.
00:57:34.040 Let me speculate a few things.
00:57:40.700 If you changed it to systems over goals, it gets closer.
00:57:46.460 Like, marriage, having a traditional family, is a system, isn't it?
00:57:51.800 Whereas Marxism might have a goal of everybody being treated equally.
00:57:59.380 Free markets are a system. Whereas Marxism, of course both have systems, but Marxism is goal-oriented, like to get to a place where, I don't know, the government controls everybody, if you have that point of view, or to get to a place where everybody's equal, or something like that.
00:58:17.720 So instead of Marxism versus, I don't know, free markets or capitalism, I would go with something like systems that work and systems that don't.
00:58:30.220 And the ones that don't work always fail for the same reason.
00:58:33.080 The systems that don't work are either focused backwards, you know, my big point of the week, backwards-looking at victimization, or they don't take into account incentives.
00:58:47.600 So, you know, you could almost say systems with incentives versus systems without.
00:58:56.980 You could say people who have systems that are well-designed versus systems that are designed to fail.
00:59:05.960 Maybe that's the way to go.
00:59:07.420 We favor systems that are designed to work, and have always worked. They're based on human incentives and everybody getting fair access.
00:59:17.280 That's a system that works.
00:59:20.020 A system that doesn't work is one that removes incentives.
00:59:24.640 You know, maybe there's another way to say that.
00:59:26.140 But here's my big persuasion thing.
00:59:28.560 As long as Republicans and conservatives are using the word Marxist, their persuasion, on a scale of 1 to 10, is a 1.
00:59:38.260 That's my opinion.
00:59:39.600 On a scale of 1 to 10, saying Marxists, they're all Marxists, that's just a 1.
00:59:46.460 It's probably hurting you more than it's helping you. It's so bad.
00:59:50.440 But there have to be better ways to do that.
00:59:54.560 There's a story about Musk mocking a disabled employee.
00:59:58.120 Do you believe that happened? Do you believe that headline? Musk mocked a disabled employee?
01:00:05.220 Is your first instinct that that's exactly right and there's no context that needs to be added?
01:00:12.220 Would it change your mind if you knew that Musk had no idea that the person was disabled, nor did anybody else, because they had the conversation in public on Twitter?
01:00:24.960 He found out later.
01:00:27.500 And when he found out later, and he also made some other assumptions about the guy, he apologized in public.
01:00:35.560 And people were saying, how dare you be so insulting to the disabled guy? My God.
01:00:42.040 Here's my standard for behavior.
01:00:46.120 Do I judge people for making mistakes? Nope.
01:00:51.860 I never have, because everybody makes mistakes. That's a ridiculous standard.
01:00:57.280 This is a reframe as well. Judging people by their mistakes, and this is in the book that just got unpublished, making mistakes is what everybody does.
01:01:07.300 I try as hard as possible not to judge people by their mistakes, but rather to judge them by how they handle their mistakes.
01:01:15.420 Because more thought is put into that, and more character is exhibited.
01:01:22.020 So Musk learned what the real facts were, and apologized in a completely adequate way, in my opinion.
01:01:32.360 That's as good as it gets.
01:01:33.640 Again, he's being criticized for behavior that was, at most, a mistake.
01:01:44.240 It's just kind of dickish to condemn it.
01:01:48.300 It's not so out of the range of things that we've all done and then thought, oh, I better go fix that.
01:01:56.320 It's not like he ate a baby or something.
01:01:59.340 So it was pretty funny. I like that he defaults to the funniest approach in everything. Don't you?
01:02:05.460 I feel like part of Elon Musk's operating system is that if there are two things you can do, and they look sort of equally risky in terms of risk and reward, he'll always take the funny one.
01:02:20.580 And I think he said something that suggests maybe that's actually in his mind.
01:02:25.740 All right.
01:02:29.000 I saw a good theory today from some person who had inside knowledge about Putin: that Putin has a young mistress who's basically like a wife, and has kids, or a kid. I think kids.
01:02:43.320 And so he has a young family, and he has, you know, great mansions and everything he wants.
01:02:49.320 And so the argument is he's not crazy.
01:02:51.620 So the odds are against him launching a nuclear war, which would kill his young family.
01:02:58.440 You know, I saw pictures of him looking at the woman who was apparently confirmed as his mistress.
01:03:04.120 He looks in love.
01:03:05.960 Like the way he's looking at her in the pictures is an I'm-not-going-to-lose-this look.
01:03:11.720 And she's not, like... I mean, she just looks like somebody who had a genuine connection with him, just based on a few photographs.
01:03:21.280 So that's a pretty good argument, and that was the argument that I made without the details: that it wouldn't be rational for him to start a nuclear war. It would never be good for him, personally.
01:03:32.860 All right, the problem with statistics.
01:03:36.120 Do you know why statistics is even a thing that people have to learn, depending on their career? Why was statistics even invented?
01:03:46.620 Well, let me answer that question for you. It's because our common sense fools us routinely.
01:03:54.380 Common sense is so opposite of statistical truth that we routinely get it wrong. You saw it in the last week, without getting into details again: there was a bunch of argument about whether a poll was accurate or not.
01:04:07.440 The people who criticized the poll said, oh, it's such a small sample, therefore it cannot tell us anything.
01:04:14.300 And then I would say, well, it only has an 8% margin of error, even at that small sample size, assuming that the sample was collected appropriately. Really.
01:04:26.060 And I would just get, like, stunned silence, because 8% wouldn't have changed any conclusion from it.
01:04:33.700 So that's not obvious.
01:04:37.420 And what people would say is, how could 130 people represent 100 million people?
01:04:42.740 And I would sort of have to just hold my tongue, because I wanted to answer sarcastically.
01:04:48.320 How can a small sample represent a large population? That's called statistics and polling, and that's exactly the description of it.
01:05:01.560 And, yeah, then the confidence interval tells you how confident you should be based on how small your sample size is.
01:05:10.140 So really, really basic stuff people don't know about statistics.
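To make that concrete, here is a minimal sketch of the arithmetic in Python. It assumes a simple random sample and the usual normal approximation for a proportion; the sample size of 130 and the roughly 95% confidence level are just the numbers from the example above.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a sample proportion.

    n: sample size
    p: assumed proportion (0.5 is the worst case, giving the widest margin)
    z: z-score for the confidence level (1.96 is roughly 95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# A sample of 130 people at the worst-case proportion:
print(f"{margin_of_error(130):.1%}")  # about 8.6%, close to the 8% cited

# The margin shrinks with the square root of the sample size,
# which is why a modest sample can stand in for a huge population:
for n in (130, 500, 1000):
    print(n, f"{margin_of_error(n):.1%}")
```

Note that the population size, whether 100 million or otherwise, never appears in the formula, which is exactly why a small sample can represent a large population.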
01:05:15.920 But this brings me to Charles Barkley disagreeing with a, let's see, an ESPN commentator named Kendrick Perkins.
01:05:28.000 And Kendrick Perkins suggested that there was racial bias in the MVP votes, because since 1990, who are the only people who won MVP without being in the top 10 in scoring?
01:05:42.720 Now, that sounds a little suspicious on its surface, doesn't it? That somebody could ever be the MVP of the entire league and not be in the top 10 in scoring?
01:05:53.780 Doesn't that sound suspicious to you?
01:05:55.820 And the only times it's ever happened, it was with three white players.
01:06:01.820 But, no, honestly, don't you think scoring would be, sort of, five-to-one in importance compared to all of the other things?
01:06:10.460 Because remember, it's fans voting, right? Or, no, is it fans voting, or is it the professionals voting? Who votes for the MVP?
01:06:20.760 Oh, fans and the press who cover it? So, sports writers, not fans.
01:06:27.960 I'm getting mixed messages here.
01:06:31.960 Sports writers. So, we think it's sports writers who do it.
01:06:36.140 So, doesn't it seem suspicious to you that the only three times the winner wasn't in the top 10 in scoring, they were all white guys?
01:06:47.720 And there's another one that's up for it, I guess, or won it.
01:06:54.540 And then Barkley says, you can't tell me, because the numbers don't make sense.
01:07:01.740 Does he know how many voters are white, actually? Or did he pull 80% out of his ass?
01:07:09.000 So, I guess Kendrick must have said 80% of the voters on that are white.
01:07:13.780 My point is, if only five white guys have won MVP in the last 30 years, that makes zero sense. His argument, zero sense. Because if that was the case, we'd have a lot more white MVPs.
01:07:26.680 Wouldn't the numbers be way, way worse?
01:07:29.280 So, Barkley's sort of statistical, you know, instinct is that if 80% of the voters were white, and racial bias were in it, we'd see, like, mostly white winners.
01:07:47.120 Does that make sense to you?
01:07:51.700 Does that make sense to you?
01:07:54.240 Neither of these arguments makes any damn sense. Neither side makes any sense, right?
01:08:00.940 Because neither of them has any data. Neither of them has any data.
01:08:06.040 And if they did, would the data tell you anything? Probably not.
01:08:10.600 Let me tell you why the data wouldn't tell you anything.
01:08:14.740 And I'm going to surprise you. I'm going to side with Kendrick Perkins. I'm not going to side with Barkley.
01:08:23.900 I like what Barkley's doing. I think what Barkley's doing is trying to just take race out of it, which I love.
01:08:34.080 Which is why Charles Barkley is, like, way at the top of my list of people I would want to vote for if he ran for office. Like, he would be way at the top of my list.
01:08:43.720 I just love that he's lived a life where he can sort of laugh about racial stuff, but you never think he takes it too seriously.
01:08:51.120 That's, like, such a good place to start.
01:08:55.060 So I like his instinct to try to take the energy out of it and stuff like that.
01:08:58.840 But here's what I think.
01:09:00.760 Let's say most of them were white. Nobody knows if it's 80%.
01:09:05.900 But if most of the writers voting every year were white, do you think that white people would have the following thought?
01:09:15.340 And I'm going to speak as one white person who would think the following way, but I'm not going to say that they all do.
01:09:22.660 All right. So I'm not going to say that I somehow represent white people.
01:09:29.300 Here's what I would have done if I were in that group.
01:09:31.380 I'd say to myself, what's good for the game? And then I would have voted appropriately.
01:09:39.180 If I were a sports writer, especially a white one, but maybe also a black one, I'm not sure why it would be different, I would vote for what's good for the game.
01:09:51.220 And sometimes what's good for the game is that a few white people win sometimes.
01:09:58.240 Because maybe if you only went by the top scorers, it would be nothing but black winners. And you probably have a way more white audience.
01:10:08.620 And to me, it seems like the writers were somewhat subconsciously balancing it out so that it would sort of look like something closer to balanced.
01:10:24.220 And it might be that, you know, maybe it's been since Larry Bird that, like, a white guy should have won. I don't know if that's true.
01:10:33.500 But I would have to say it does look a little suspicious to me that the white guys are the ones who win without being in the top ten in scoring.
01:10:42.460 That's not a bad point, is it? Is that a bad point?
01:10:46.620 But I don't think it's exactly the way he's describing it.
01:10:50.560 I think you could allow for two types of racism. One type of racism is the bad kind, where people are just voting based on race.
01:11:00.280 The other kind also incorporates race, but is more about trying to find a balance, of just what's good for the game.
01:11:08.580 It would be good if some white people won once in a while, because there are a lot of white fans.
01:11:13.380 It would be the same sort of thinking if the situation were reversed.
01:11:18.800 Now, that's probably how I would vote, I have to admit.
01:11:23.180 If I thought there was some unfairness or inequity, I might vote in a way that would fix it.
01:11:31.700 I can imagine that.
01:11:33.200 I don't know if I would just be like, oh, who had the best stats? Because it's not really even about the best stats, is it?
01:11:41.540 Because beyond the stats, some players consistently make their team win when they're playing.
01:11:48.940 That's actually the best stat. The best stat is how the team scores when that one person is playing.
01:11:54.260 That's the one I like the best.
01:11:56.960 All right. So I guess we don't know.
01:11:58.780 I guess my only point is that without data, and without being statisticians, and none of us in this story are statisticians, I don't think Kendrick Perkins' idea is crazy.
01:12:10.840 I would just say there might be more of a good purpose to it, or at least good intention, than bad intention. But probably there's a racial component to that.
01:12:22.080 A woke agenda might be banned in Iowa. So the House of Representatives is looking at something to ban the DEI bureaucracies in our institutions of higher education.
01:12:33.060 Do you think that'll pass?
01:12:36.620 Iowa's solidly Republican, are they? No, they're not.
01:12:41.100 Is Iowa solidly Republican in their legislature? They're not, right?
01:12:47.620 Yeah. So does it have a chance?
01:12:50.560 Wouldn't you need a solid Republican legislature to have a chance with something like that?
01:12:56.360 So I don't know what the odds are. I guess I shouldn't have brought it up.
01:13:03.060 So here's just the latest update on me, but I'm not going to go into any detail on this now.
01:13:11.280 So the Chris Cuomo interview I did about my so-called racist rant, TM, trademark.
01:13:19.580 Should I try to get a trademark on racist rant? No. Probably too soon.
01:13:24.160 But nearly half a million people have viewed it.
01:13:29.420 And the best criticism that came out of it was from Dan Abrams, who said that I can't have it both ways. I can't say that my statement was hyperbole, but also out of context.
01:13:43.660 Yes, I can.
01:13:45.540 The statement I made was a sentence of hyperbole, and the reason that I did it was the context that was left out.
01:13:54.240 It's pretty easy to do both of those at the same time.
01:13:57.320 Now, consider the fact that that was the best objection.
01:14:01.220 There were other objections, but they were based on things I didn't even say, or didn't even feel, or were just mind reading, right?
01:14:09.520 The best objection was that you can't have it both ways, when obviously you can.
01:14:15.000 Very easily, obviously, you can have it both ways. It's just two things. It's not two opposite things. It's just two things.
01:14:24.680 I can have an apple and an avocado at the same time.
01:14:29.120 Do you know what I can't have? I can't have an apple and not an apple at the same time.
01:14:34.260 That would be a contradiction. You know, if Dan Abrams had made that point and said, he says he can have an apple and not an apple at the same time, and he just can't do that, fine.
01:14:44.020 But no, Dan, I can have an avocado and an apple. Same time. There's no conflict.
01:14:51.180 So, you tell me your opinion. I've asked this on the Locals platform. I'm going to talk to them privately in a minute.
01:14:58.420 But, on YouTube, has the narrative about my little drama shifted in my direction?
01:15:08.820 Once you realize that all the agitators, you know, the people who just like to make trouble, the click-whores and the trolls, once they get bored with it, does it look like things started moving my way, or no?
01:15:23.560 So, I'll ask both of you.
01:15:27.480 So, people are seeing different things.
01:15:30.380 So, what you should see is that some people see it and some people don't, which is what it looks like.
01:15:36.360 So, best case scenario, because it's not possible to persuade everybody. Not everybody sees everything.
01:15:42.280 But, for the people who have been exposed to the context, everything looks pretty good.
01:15:52.440 So, here's a little media tip, if I didn't give you this one already.
01:15:57.520 The reason I'm leaving that one interview up there, and I'm turning down others, is that as soon as you have more than one, it turns into a court case. You know what I mean?
01:16:08.500 So, the Cuomo interview hit all the points I needed to hit. So, I was just leaving it there as one consistent thing.
01:16:18.720 Because the moment I talk to somebody else, somebody is going to illegitimately say, you said two different things, even if I didn't.
01:16:27.740 More likely, they're going to miss the context, and because they missed the context on one, they're going to say, well, you're lying, because you said this in this one, but you said that in that one. And it won't be true.
01:16:41.160 Both of them will be completely consistent, but the news only has to tell you they're inconsistent, and you go down.
01:16:48.940 So, I just leave it, you know, leave it at one story as long as possible.
01:16:54.400 And the next thing, the next part of the play, is I want to see if there's any, let's say, semi-legitimate news organization that decides to tell the whole story. Like, as a story.
01:17:08.540 So, in other words, would, I don't know, the New York Times or something say, actually, there's a way more interesting story here that connects to a larger trend? Which is the whole thing I was trying to do: connect it to a larger trend.
01:17:21.860 Will anybody tell the whole story of why I did it, how I felt about it from my perspective, not guessing, you know, no mind reading, and then describe my strategy for handling it, my intention for doing it, and also how I reframed it?
01:17:40.300 Once I had your attention, I reframed from all the backwards-looking strategies for success to forward-looking ones, and then also reframed from, you know, looking at the big picture of systemic racism, which you have to work on, to focusing on individual success. So even if you don't fix systemic racism, you can just slice through it, like a hot poker through butter.
01:18:08.060 And there's a reason I always repeat that one: because it's visual, and it's repetition.
01:18:14.060 So, you'll know if I won.
01:18:16.520 Here's how you'll know if I won.
01:18:18.480 If you hear anybody in the media refer to a strategy as a looking-backwards strategy, a rear-view-mirror strategy, or planning for the past, and then compare it to looking forward, and how can we work together, how can you learn the tools of success, et cetera.
01:18:38.040 If you see anybody in the public eye who starts using that frame, that would probably be from me, because it's sticky, and it's the high-ground maneuver.
01:18:52.000 All right, that's all for now.
01:18:53.240 YouTube, I will talk to you tomorrow.
01:18:55.300 Bye for now.