Real Coffee with Scott Adams - March 07, 2023


Episode 2040 Scott Adams: J6 Narrative Dissolves, Cartel Kidnaps, Persuasion Lesson, Cuomo Interview


Episode Stats

Length

1 hour and 15 minutes

Words per Minute

146.99

Word Count

11,118

Sentence Count

834

Misogynist Sentences

6

Hate Speech Sentences

34


Summary

Scott argues that having Vivek Ramaswamy in the primaries makes Trump a better candidate, declares the death of the belief "it must be true because I saw the video with my own eyes" in light of Tucker Carlson's January 6th footage, weighs whether continuing excess deaths could be narrowed down to the vaccinations, calls the cartel kidnapping of four Americans in Mexico an act of war, and previews a persuasion breakdown of his interview with Chris Cuomo.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the Highlight of Civilization.
00:00:04.940 It's private over there on the Locals platform,
00:00:07.700 where we do extra special things behind the curtain.
00:00:11.840 And eventually, on March 13th,
00:00:15.380 it'll be the only place you can see the new Dilberts,
00:00:18.360 which will be called Dilbert Reborn.
00:00:22.060 It's at Locals.
00:00:23.340 If you want to look for it, just look at my Twitter profile for the link,
00:00:27.520 or scottadams.locals.com.
00:00:31.260 It's a subscription site.
00:00:33.320 All right, well, if you'd like to take your experience up,
00:00:35.900 and I promise you, today will be not only entertaining,
00:00:40.400 you'll laugh, you'll cry,
00:00:43.300 but probably only because of your own problems.
00:00:45.520 Nothing I'm going to say will make you cry.
00:00:47.580 But you'll probably laugh, and you'll probably cry at your own stuff.
00:00:51.920 But have a good cry if you need it.
00:00:54.540 I mean, I'm just saying if you need it.
00:00:55.740 And you'll probably be educated in a way that,
00:01:00.240 well, it's hard to even describe.
00:01:03.180 Or, as somebody we know says,
00:01:05.540 like nobody's ever seen before.
00:01:08.360 But first, to take your experience up,
00:01:10.780 all you need is a cup or a mug or a glass,
00:01:12.280 a tankard or a chalice or a stein,
00:01:13.520 a canteen, a jug or a flask,
00:01:14.960 a vessel of any kind.
00:01:17.060 Fill it with your favorite liquid.
00:01:19.520 I like coffee.
00:01:20.920 And join me now for the unparalleled pleasure,
00:01:23.180 the dopamine hit of the day,
00:01:25.180 the thing that makes everything better.
00:01:27.520 It's called the simultaneous sip,
00:01:28.820 and it happens now.
00:01:29.720 Go.
00:01:36.960 So good.
00:01:37.800 Well, toward the second half of this,
00:01:44.460 I will talk about my interview last night with Chris Cuomo,
00:01:48.380 which was really interesting.
00:01:50.800 And I'm going to talk about the communication techniques
00:01:54.440 and the persuasion techniques that you can see happening.
00:02:00.700 What I attempted to do was move the frame,
00:02:04.720 and then we'll talk about whether I did that.
00:02:07.420 Now, that'll be later.
00:02:08.760 We'll talk about the news first.
00:02:11.600 I tweeted just the other day
00:02:13.740 that having Vivek Ramaswamy in the primaries
00:02:21.100 makes Trump a better candidate.
00:02:24.620 Would anybody disagree with that at this point?
00:02:27.020 Because Trump came out saying he wants to disband
00:02:30.720 the Department of Education,
00:02:32.700 which Vivek had said just the other day.
00:02:36.080 Now, once Vivek said it,
00:02:38.540 and sort of focused on it for a little while,
00:02:41.140 Trump kind of had to play catch-up, didn't he?
00:02:43.880 He had to catch up.
00:02:45.360 So he had to at least match it.
00:02:48.060 And how happy are you about that?
00:02:51.600 You know, I always tell you that I like to look at things
00:02:54.220 from a systems perspective.
00:02:55.580 The goal perspective would be,
00:02:58.700 I want this one candidate to win.
00:03:00.460 That would be a goal.
00:03:02.020 A system perspective is,
00:03:04.000 we are so much better off with Vivek in the debate.
00:03:08.360 You know what nobody can say anymore?
00:03:12.440 Republicans are dumb.
00:03:15.100 Or Republicans are too old.
00:03:18.120 Or, you know, they're not reading the room.
00:03:21.360 Or, like, he just eliminates a whole bunch of arguments.
00:03:25.940 Right?
00:03:26.200 Or even that conservatives, you know,
00:03:29.320 won't back somebody who's a person of color,
00:03:31.780 you know, even though he's clearly not making that a point of anything
00:03:35.560 in his campaign, which is, of course, brilliant.
00:03:38.740 Which is, of course, why he's, you know, gets attention.
00:03:42.240 Because he's brilliant.
00:03:43.220 Like, wouldn't it be nice to have somebody, like, in charge of the country
00:03:48.960 who you legitimately said,
00:03:51.480 well, I might not agree with everything, but that guy's brilliant.
00:03:54.400 Like, just once.
00:03:56.020 Right?
00:03:56.360 Now, this is the reason that I like Bill Clinton.
00:03:58.880 I'm far less party-bound than maybe my reputation would suggest.
00:04:07.420 I like Bill Clinton because he was frickin' brilliant.
00:04:10.880 And I didn't care who he banged on the side
00:04:13.040 any more than I care what Trump did or did not do with Stormy Daniels.
00:04:17.840 Completely irrelevant.
00:04:19.460 You know, give me smart, give me capable, I'm good.
00:04:24.280 But, so, good job for Vivek, making things just better.
00:04:30.260 You know, no matter what happens, I think he made things better.
00:04:32.540 And it looks like that'll continue.
00:04:34.520 I would like to announce a death.
00:04:37.280 I know that's a bummer, but sometimes things die.
00:04:40.760 And it should be, I think we should take a moment of silence for it.
00:04:44.500 But here's what happened.
00:04:46.300 You've probably heard this quote.
00:04:48.980 Quote, it must be true because I saw the video with my own eyes.
00:04:54.280 Well, that absurd belief was born around 1951 with the advent of video.
00:05:00.760 And it had a long life.
00:05:02.920 But in recent years, it started struggling with a bad illness.
00:05:07.020 And it finally died yesterday, March 6, 2023.
00:05:12.600 And RIP, rest in peace, it must be true because I saw the video with my own eyes.
00:05:22.040 Yes, that's how all hoaxes are created, making you think there's no other interpretation than the one you saw in the video.
00:05:32.260 If you learn nothing this year, learn that all video is a lie.
00:05:38.420 All of it.
00:05:40.020 All video is a lie.
00:05:42.260 Every time.
00:05:43.000 Even if it's not edited wrong, it's still focusing your visual persuasion on one point, almost always to the exclusion of whatever the other point of view is.
00:05:54.280 Because if you're looking at it, you're believing it.
00:05:56.780 If you're hearing it, you're like, oh, concept, concept.
00:05:59.760 Retreat to my team's views.
00:06:01.260 I didn't hear what you said.
00:06:02.120 But a video is establishing the argument, right?
00:06:07.880 All video lies.
00:06:10.140 All video lies.
00:06:11.420 All the time.
00:06:12.660 It can't do anything else.
00:06:14.580 I don't even know if there's a possibility for it to not lie.
00:06:18.820 I mean, think about it.
00:06:20.040 It's always going to focus you on one thing, at the exclusion or at least the diminishment of the other things.
00:06:26.900 It's always a lie.
00:06:29.320 At least in part.
00:06:32.120 So, yes, we're talking about Tucker Carlson got a hold of the January 6th footage.
00:06:38.400 And his take, and most of the people who are looking at it from the right, their take seems to be that it proves the January 6th videos that the so-called select January 6th committee showed to the world to build their case were a disgusting lie.
00:06:56.580 And although not illegal, because I believe Congress, and people do a fact check on me, I believe Congress is allowed, in terms of it being legal, they're legally allowed to lie to you as much as they want, are they not?
00:07:14.140 Legally.
00:07:15.320 Am I right?
00:07:15.820 Yeah, there's no law against it.
00:07:18.620 It's actually...
00:07:19.120 So we watched Congress frame a bunch of American citizens.
00:07:26.160 You might call them patriots.
00:07:28.580 And I wouldn't argue with that.
00:07:30.040 We saw our own Congress frame them because they have this, apparently, some kind of loophole where they can destroy lives by lying in public in a way that an ordinary citizen couldn't possibly do the same thing.
00:07:45.860 Now, like I say, it's not illegal.
00:07:47.600 But if it were illegal, what would the penalty be?
00:07:52.600 Pretty serious jail time, wouldn't it?
00:07:55.340 Now, I don't think there's any practical way to make it illegal, so I understand why it's not illegal, because everybody's lying all the time.
00:08:01.900 Like, you couldn't even have a government.
00:08:03.580 If you went to jail for lying, everybody would be in jail.
00:08:06.440 So I get why it's not illegal.
00:08:08.500 But if it were illegal, this would have been 25 years in jail.
00:08:14.380 Allowing, what, dozens or hundreds?
00:08:18.300 How many people went to jail for effectively trespassing?
00:08:24.620 A lot, right?
00:08:25.980 If you send dozens or hundreds of your own citizens to jail, and you know you're lying, or at least you know you're showing things out of context,
00:08:36.620 it's probably...
00:08:37.320 That should be like a 25-year jail term.
00:08:40.320 Does anybody disagree with that?
00:08:41.700 It's hard for me to think of anything worse than that, that's, you know, not actually murder, or rape, or something, right?
00:08:50.360 Pedophilia, I suppose.
00:08:51.260 There are a few things that are worse.
00:08:53.160 But it's one of the worst things I've ever seen.
00:08:55.860 One of the worst things I've ever seen.
00:08:57.520 Unless, unless, unless, did anybody have the suspicion that you just went from one misleading set of videos to another?
00:09:12.580 Did anybody say to themselves, how do I know that Tucker's videos are giving me the right story?
00:09:18.600 How quickly did you buy into the narrative that the other narrative was completely wrong, and that it was all a big ol' op?
00:09:32.020 Pretty quickly, right?
00:09:33.840 Pretty quickly.
00:09:34.740 Because that's what agreed with your preconceived notions, didn't it?
00:09:40.180 Didn't it?
00:09:40.940 Yeah, you kind of expected that you'd see that the narrative had been false.
00:09:49.640 So you saw what you expected.
00:09:52.460 How much should you trust your own, let's say, rational senses,
00:09:58.100 if you see something that totally agrees with what you thought was going to be true?
00:10:04.580 You ought to be a little bit cautious.
00:10:07.760 Here's the thing I'd like to see.
00:10:09.460 I'd like to see somebody who does not agree with Tucker Carlson about anything,
00:10:15.340 have access to the whole catalog, give them a little time, and then give them a full hearing, right?
00:10:23.880 This is the sort of thing where you need to hear the other side.
00:10:27.360 So now Tucker's shown a counterpoint to the January 6th people, and I think that was a huge service.
00:10:33.980 Would you agree?
00:10:34.600 I think what Tucker Carlson is doing, somebody said it, maybe Cernovich said it,
00:10:41.480 that it's maybe one of the most useful, important things anybody ever did in journalism.
00:10:49.480 It's like right at the top of important things, in my opinion.
00:10:53.660 So what Tucker's doing is like real good work, real good work.
00:11:00.960 But you as a smart consumer, and I'll bet Tucker wouldn't disagree with this.
00:11:05.780 I'll bet if he were sitting here he'd say, yeah, that's true.
00:11:08.540 Which is, until you hear what other people say, if they also have access to the full catalog of the videos,
00:11:15.860 just see if they can come up with a narrative that makes his narrative weaker.
00:11:20.480 I don't know if they can.
00:11:23.000 Let me tell you how every hoax was done that fooled other people, all right?
00:11:27.960 Here's how those hoaxes were done.
00:11:30.340 Here's a video, or a series of videos.
00:11:34.680 Here's the video.
00:11:36.020 How could you possibly interpret this any other way, right?
00:11:40.700 That's how all the hoaxes are done.
00:11:42.300 I can't imagine any other way to interpret this video.
00:11:46.280 There's only one way to interpret this video.
00:11:49.100 As long as they can get you to not imagine any other way to interpret it, you won't.
00:11:57.180 You probably just won't imagine there's any other possibility.
00:12:01.800 But I would say, imagine there might be.
00:12:05.600 I kind of trust Tucker on this story.
00:12:08.700 So if I had to guess, I don't think anybody will be able to refute his narrative in any important way.
00:12:17.580 Like, there's always ways to pick at the edges of anything.
00:12:20.240 But I don't think anybody's going to take the heart out of his narrative.
00:12:23.740 But, that's exactly what you say before you get fooled.
00:12:28.120 I can't imagine anybody doing that.
00:12:31.400 Right?
00:12:31.580 So, I'm actually describing myself in the same blind spot I'm warning you not to be in.
00:12:38.360 It's almost impossible to avoid it.
00:12:41.320 All right.
00:12:41.740 So, let's talk about some of the details.
00:12:43.400 So, it looks like the horn guy, the Q shaman, right?
00:12:49.300 QAnon shaman, they call him.
00:12:50.560 The videos of him are so clearly indicating he needs to be released from prison, like right now.
00:13:00.980 Like right, right now.
00:13:02.980 Now, if somebody else has other video that shows something we didn't see, maybe.
00:13:08.160 But he was actually hanging out in a friendly way with a number of, I don't know, guards or law enforcement, whatever they were.
00:13:17.300 One of them opened a door for him.
00:13:20.400 It seemed like they enjoyed him.
00:13:23.380 I mean, they weren't smiling or anything.
00:13:25.220 But it looked like they were either amused, certainly not threatened.
00:13:29.180 Certainly not threatened.
00:13:30.840 It was just a guy in a fun outfit walking through a hallway.
00:13:34.040 And they opened a door for him.
00:13:36.420 Right?
00:13:36.660 And they were walking with him at one point.
00:13:39.060 Yeah, when you see that video, again, it's impossible to imagine
00:13:44.540 that what that one person was doing was in any way a four-year criminal sentence.
00:13:51.900 Like, I'm not entirely sure what he got convicted of,
00:13:55.180 but wasn't it something like protesting or interfering with official proceedings or something?
00:14:02.760 Yeah, and as somebody said, I think it was Tom Fitton said on Spaces,
00:14:10.220 there can be things that are technically illegal that are not appropriate to prosecute.
00:14:17.120 Would you agree?
00:14:18.120 That the world is full of things that are technically illegal,
00:14:22.080 but you do not get a better world by prosecuting them.
00:14:26.400 Now, he's a perfect example of that.
00:14:28.340 Did he do things that were technically illegal?
00:14:31.260 Probably.
00:14:32.540 Probably.
00:14:34.320 Is anything he did worth prosecuting in the sense that it would make the world a better place?
00:14:40.660 Would it prevent him from doing it in the future?
00:14:43.980 I don't think that's a risk.
00:14:46.260 Would it prevent anyone else from doing something similar in the future?
00:14:51.320 If it did, I don't think it was worth putting him in jail.
00:14:54.040 So you could be technically breaking a law, and still the right thing is not to prosecute.
00:15:01.860 There are probably infinite examples like that.
00:15:06.220 All right.
00:15:08.080 So the general tone of Tucker's narrative, I guess,
00:15:14.520 is that the evidence shows something like patriotic protesters
00:15:19.960 who clearly were not looking to hurt anybody.
00:15:24.040 Everybody agrees there were dangerous people there who definitely had some bad ideas.
00:15:30.360 All of those people are condemned.
00:15:32.800 If they go to jail, I don't care.
00:15:35.140 But that wasn't the character of the crowd, and it wasn't the point of it.
00:15:40.040 It certainly wasn't an insurrection.
00:15:41.900 So the insurrection narrative is dissolving,
00:15:47.080 and Tucker Carlson gets to be right again.
00:15:51.320 And every time he's right about this, I'll tell you what I'm talking about.
00:15:54.900 Every time he's right about this, it pisses me off.
00:15:59.540 Because for maybe a few years,
00:16:02.720 I would just shake my head when I heard him say it,
00:16:05.720 because it just sounded so just like team play narrative stuff.
00:16:10.820 And you'd always say that whatever the Democrats are accusing you of,
00:16:14.300 you can be sure that's what they're doing themselves.
00:16:17.120 Here it is.
00:16:18.620 Here it is.
00:16:20.080 That they were literally trying to do something just incredibly disreputable.
00:16:30.060 It wasn't illegal, but certainly unethical, certainly unwise,
00:16:35.700 certainly bad for the country.
00:16:38.340 Now, it wasn't exactly what they were accusing people of,
00:16:42.220 but the accusations did seem like a cover-up for their own behavior in a way.
00:16:49.640 So, I mean, I hate it when Tucker keeps being right about that over and over again.
00:16:54.420 Poor Josh Hawley, who was, there was one video of him seemingly, you know,
00:17:03.180 skipping fast or running or something,
00:17:05.560 which the Democrats used to say,
00:17:08.140 that guy, he encouraged things and then he ran away like a coward, like a coward.
00:17:13.800 Now, I didn't see the, I couldn't see the video when I was listening to it,
00:17:17.920 but apparently that was debunked.
00:17:19.880 When you see the greater context, apparently it doesn't look weird or cowardly if you see the context.
00:17:28.400 Surprise! Surprise!
00:17:30.540 A hoax, it was a hoax within a hoax.
00:17:33.420 So the, you know, the first hoax being that it was an insurrection,
00:17:36.660 obviously it was a protest, not an insurrection.
00:17:39.740 Second hoax within, a hoax within the hoax,
00:17:42.720 was that Hawley was running like a pussy.
00:17:45.180 I think that's what people said.
00:17:46.420 I only used that word because I think it was actually a congressperson who used that word.
00:17:53.960 And then that turned out to be a Rupar edit, as we call it,
00:17:57.920 where you cut out the context.
00:18:01.200 Have we ever seen this before?
00:18:02.680 Is this the first time we've seen a hoax,
00:18:06.140 and then a hoax within the hoax?
00:18:08.620 Have we gone to hoax squared?
00:18:11.840 It's sort of like the Russian egg of hoaxes.
00:18:14.840 Oh, what's inside here?
00:18:16.540 Oh, lovely.
00:18:17.640 But what's inside here?
00:18:19.320 Oh, another hoax.
00:18:20.880 But wait, what's inside here?
00:18:22.520 Oh!
00:18:24.140 And repeat.
00:18:26.840 All right, so what do you do about all that?
00:18:31.920 What do you do about any of it?
00:18:34.040 Do you think anybody will pay,
00:18:36.460 either politically or legally or lawsuits?
00:18:40.300 I don't even think you can sue them, right?
00:18:42.800 I think Congress is exempt from being sued, as far as I know.
00:18:50.500 So here's sort of a side issue.
00:18:53.960 Again, Mike Cernovich is going hard at Trump
00:18:57.160 for having the opportunity to preemptively pardon the nonviolent January Sixers
00:19:04.180 before he left office, and he didn't do it,
00:19:08.000 which Cernovich properly points out needs some explaining.
00:19:15.360 Now, that's my take.
00:19:17.200 My take is it needs some explaining,
00:19:19.380 because, again, I haven't heard Trump's argument.
00:19:22.600 Can you imagine any argument that Trump would make
00:19:25.300 that would make you happy?
00:19:26.740 Like, what in the world could Trump say
00:19:30.100 that would make you satisfied
00:19:32.140 having not pardoned those people before he left?
00:19:36.360 You can't think of one, right?
00:19:38.100 You can't even think of one.
00:19:40.040 All right.
00:19:41.280 Suppose he had one.
00:19:43.860 Suppose he had a reason that's not obvious,
00:19:47.740 because I don't see any reason.
00:19:49.760 I mean, the only reason that's obvious
00:19:51.220 would be something like cowardice or not caring.
00:19:54.460 Those would be the obvious ones.
00:19:55.720 But what if there's another reason?
00:20:00.100 Don't you think there could be some other explanation
00:20:02.380 that's just not obvious?
00:20:04.080 Let me give you one.
00:20:06.100 All right?
00:20:06.900 I'm going to give you an explanation
00:20:08.220 that I'm not saying is the accurate one.
00:20:12.440 All I'm doing is stretching your imagination a little bit, right?
00:20:15.180 This is just an imagination stretcher,
00:20:17.480 not a claim of truth.
00:20:20.520 Imagine being Trump,
00:20:22.340 and you're being accused of running an insurrection.
00:20:25.720 It looks like Trump had a private army.
00:20:28.760 Do you remember what people were saying?
00:20:30.540 It's Trump's private army.
00:20:32.940 And all the MAGA people basically are insurrectionists,
00:20:35.960 and they don't love the country.
00:20:37.760 They're definitely not patriots.
00:20:39.220 They're all criminals,
00:20:40.160 and they're running a coup.
00:20:41.640 And then he pardons them.
00:20:47.080 Think about it.
00:20:48.840 Just put yourself in Trump's shoes.
00:20:51.360 He's got to make a decision that's good for himself,
00:20:54.300 but also not damage the country.
00:20:57.640 Not damage the country.
00:20:58.960 If you put me in that position and you say,
00:21:01.840 Scott, the ethical thing to do,
00:21:04.820 the moral thing to do,
00:21:06.460 is to pardon those innocent people right away.
00:21:09.480 Would you agree that the ethical and moral thing to do
00:21:12.440 was to pardon those innocent people before you left office?
00:21:15.460 Yes, I would.
00:21:17.040 That would be the moral and ethical thing to do.
00:21:20.420 However, would it be good for the country?
00:21:25.140 Would it be good for the country?
00:21:27.740 Because it would confirm the false narrative
00:21:30.400 that he had formed a private army.
00:21:32.920 Because if he just says,
00:21:34.440 I'm going to ignore the law,
00:21:36.240 I'm going to ignore the legal process.
00:21:38.060 Now, keep in mind,
00:21:40.220 he also didn't know how long they would be held
00:21:42.360 or what they would be charged with exactly, right?
00:21:46.320 Do you think that Trump knew that today
00:21:49.320 they'd still be in jail
00:21:50.640 for basically nothing that matters?
00:21:54.320 Nothing that matters?
00:21:56.240 No.
00:21:56.660 He couldn't have known how bad it would be,
00:21:59.280 but even still,
00:22:00.820 I'm going to agree hard with Cernovich.
00:22:03.680 This is a hard agreement
00:22:04.880 that he didn't put it this way,
00:22:07.060 but I think he would agree with the following statement,
00:22:09.760 that it was unethical and immoral,
00:22:13.860 frankly immoral,
00:22:15.360 to not help them
00:22:16.860 when he knew that they were just good patriots.
00:22:20.900 And I would have said the same thing
00:22:22.100 if a bunch of Democrats had been picked up
00:22:24.240 on sketchy charges.
00:22:25.980 Like, in both cases,
00:22:27.540 it would be immoral to leave them there.
00:22:29.640 But number one,
00:22:30.760 he could not have known how bad it would get
00:22:32.880 because he was only in office, what,
00:22:34.840 two more weeks, right?
00:22:37.060 But he could have suspected
00:22:38.820 that it would have hardened the narrative
00:22:40.940 that Republicans are nothing
00:22:42.780 but an insurrectionist party
00:22:44.500 and the MAGA movement
00:22:46.660 has to be uprooted by its roots.
00:22:50.100 But playing it this way,
00:22:53.700 again,
00:22:55.600 John, if you're coming in late,
00:22:57.940 this is not spinning it.
00:22:59.440 This is just helping you imagine
00:23:01.400 that no matter who it is,
00:23:03.200 whether it's Trump or Satan,
00:23:05.320 there might be something more to the story.
00:23:07.760 Every time any of us have been fooled
00:23:09.880 by a news story,
00:23:10.940 and that's all of us,
00:23:12.200 I remember the Covington kids thing
00:23:13.760 fooled me for a little less than a day,
00:23:16.320 but it definitely fooled me.
00:23:18.160 We all have the same blind spot.
00:23:21.040 You can't imagine
00:23:22.280 what you can't imagine.
00:23:24.600 Does that make sense?
00:23:27.460 You can't understand
00:23:29.540 what you can't imagine,
00:23:30.900 and it could be just
00:23:31.760 there's more to the story.
00:23:34.980 Look at the quality of this comment.
00:23:36.840 Scott is a liar.
00:23:39.680 I'm a liar?
00:23:41.320 I've not even made a claim of fact.
00:23:44.040 All I've made is a claim
00:23:45.400 that you should expand
00:23:46.280 your imagination.
00:23:48.980 You're having a little trouble
00:23:50.100 with this NPC.
00:23:50.800 You can tell the NPC
00:23:52.400 is like the first level.
00:23:54.780 Can't get past.
00:23:56.980 Anyway, so I do not defend Trump,
00:24:00.240 but just as I try to be consistent
00:24:04.080 with every story,
00:24:06.020 if you haven't heard the other side,
00:24:09.180 if you haven't heard the other side,
00:24:10.940 you do owe that to Trump.
00:24:13.780 You owe Trump his explanation.
00:24:17.760 Because I think there's a possibility
00:24:19.600 that he had to choose
00:24:21.420 between destroying the country
00:24:23.700 and doing something
00:24:24.740 clearly unethical,
00:24:26.620 which is keeping them in jail.
00:24:28.160 Or immoral.
00:24:29.360 Pick your term.
00:24:32.300 With that choice,
00:24:34.860 I know,
00:24:35.940 it might have been the choice.
00:24:38.260 Just saying.
00:24:41.000 All right.
00:24:43.040 How about that lab leak theory?
00:24:44.820 I haven't talked about that too much.
00:24:48.560 You know,
00:24:48.740 the fact that now
00:24:49.840 the consensus of the experts
00:24:52.160 is it must have come from the lab.
00:24:54.940 But it doesn't feel like news to me.
00:24:58.040 Like, the news is that
00:24:59.340 the government is no longer
00:25:02.180 denying you the obvious explanation.
00:25:05.560 But doesn't it feel like
00:25:07.080 we've known it for so long
00:25:08.580 that the fact that the government
00:25:10.740 had been lying
00:25:11.660 or was useless
00:25:13.480 in terms of helping us understand,
00:25:16.260 that feels like business
00:25:17.440 as usual today,
00:25:18.440 doesn't it?
00:25:19.460 Like, the fact that
00:25:20.240 the government
00:25:21.400 was lying or misleading
00:25:22.980 once again,
00:25:25.080 once again,
00:25:26.920 I couldn't generate
00:25:28.760 any outrage at all.
00:25:30.940 I've been trying for days
00:25:32.480 to generate some outrage
00:25:33.760 over the fact that
00:25:35.220 we were denied,
00:25:36.160 you know,
00:25:36.440 the obvious explanation
00:25:37.700 that came from the lab.
00:25:39.020 And I'm just,
00:25:40.880 it's now just baseline
00:25:42.180 government
00:25:43.400 weaselness
00:25:45.480 that I'm so frickin' used to it,
00:25:47.760 it didn't even register
00:25:48.800 in my outrage orb.
00:25:51.240 So I guess I need,
00:25:52.140 you know,
00:25:52.420 more.
00:25:53.120 I need more
00:25:53.720 to get me outraged.
00:25:54.900 I mean,
00:25:55.140 it's an outrageous story.
00:25:56.620 I just,
00:25:57.100 I'm just so used to it.
00:26:00.860 Here's a logic question.
00:26:06.920 Let's say
00:26:07.660 everybody knows
00:26:08.760 there's some excess deaths
00:26:09.960 that are continuing
00:26:11.080 long after the pandemic peak.
00:26:14.820 And there's a lot of question
00:26:15.880 about what is causing
00:26:16.800 the excess deaths.
00:26:20.200 If the COVID itself
00:26:22.300 starts fizzling out,
00:26:24.100 or at least the,
00:26:25.320 you know,
00:26:25.800 the most dangerous forms
00:26:27.020 of COVID fizzle out,
00:26:29.020 let's say the hospitalizations
00:26:30.400 go to practically nothing.
00:26:32.240 And it looks like
00:26:32.800 that could happen
00:26:33.400 in the next,
00:26:34.040 I don't know,
00:26:34.320 six months or so.
00:26:35.040 But let's say
00:26:37.060 if the excess deaths
00:26:38.260 still continued climbing
00:26:41.080 after the COVID itself
00:26:43.140 had gone away,
00:26:45.320 what would be
00:26:45.980 the logical
00:26:46.900 theory?
00:26:50.600 Somebody say vaccinations.
00:26:52.420 Vaccinations.
00:26:53.440 Now that,
00:26:53.900 that was the theory
00:26:54.720 I heard,
00:26:55.240 but I want to test
00:26:55.880 the logic with you.
00:26:56.780 At some point,
00:27:00.380 most adults
00:27:01.320 are going to be
00:27:01.940 vaccinated.
00:27:03.160 Most,
00:27:03.800 not you necessarily.
00:27:05.200 They'll have at least
00:27:05.940 two,
00:27:06.640 two boosters.
00:27:08.180 Most.
00:27:09.380 And they will also
00:27:10.740 have been exposed
00:27:11.540 to COVID.
00:27:13.980 So if you've got
00:27:14.980 both the vaccination
00:27:16.020 in you
00:27:16.660 and the COVID,
00:27:18.660 let's say
00:27:19.840 one of those
00:27:20.560 had a long-term effect.
00:27:22.620 Let's say
00:27:23.020 both of them
00:27:23.580 stay in your body
00:27:24.360 forever.
00:27:25.160 I'm not saying
00:27:25.840 that's true.
00:27:26.780 But let's say
00:27:27.540 it did.
00:27:28.200 Let's say
00:27:28.540 the spike protein
00:27:29.640 stays in your body
00:27:30.560 forever
00:27:30.900 and does bad stuff.
00:27:33.300 And let's say
00:27:33.800 that the COVID
00:27:34.380 maybe stays in your body
00:27:36.280 and does some bad stuff.
00:27:37.620 Or,
00:27:38.720 maybe either one
00:27:39.860 of them
00:27:40.200 weakened some
00:27:41.400 key system
00:27:42.560 in your body
00:27:43.180 so that you
00:27:44.420 don't die right away,
00:27:46.040 but that excess death
00:27:47.180 just stays up there
00:27:48.140 because people
00:27:49.060 are just generally
00:27:49.740 weakened
00:27:50.220 from one or the other
00:27:51.720 or from some
00:27:52.700 third or fourth effect.
00:27:54.620 Do you think
00:27:56.940 that the decrease
00:27:59.140 of COVID
00:27:59.700 with the increase
00:28:02.320 of excess deaths
00:28:03.680 or even staying
00:28:05.300 the same excess deaths,
00:28:07.840 do you think
00:28:08.220 that narrows it down
00:28:10.040 to the vaccinations?
00:28:12.580 Because
00:28:13.140 that would only
00:28:15.060 make sense to me
00:28:16.020 if the only kind
00:28:17.520 of harm
00:28:17.960 was short-term harm.
00:28:19.500 Now,
00:28:20.980 most harm
00:28:21.480 is short-term,
00:28:22.420 I think.
00:28:23.620 But everything
00:28:24.140 about this virus,
00:28:25.300 well,
00:28:25.500 not everything,
00:28:26.120 a lot about this virus
00:28:27.140 has surprised us.
00:28:29.220 So,
00:28:29.840 if you've got
00:28:30.280 a surprising
00:28:30.920 and engineered virus,
00:28:32.900 is it possible
00:28:34.000 that it could be
00:28:35.500 eating at your systems
00:28:37.020 in a slow way
00:28:38.040 and then
00:28:39.380 either the vaccination
00:28:40.480 or the COVID
00:28:41.400 because both of them
00:28:42.900 have some long-term,
00:28:44.160 at least hypothetical,
00:28:45.820 risk?
00:28:46.140 Is it possible
00:28:48.000 that people
00:28:48.840 could be dying
00:28:49.580 at greater numbers
00:28:50.660 in the future
00:28:51.380 only because
00:28:52.740 they had been
00:28:53.220 so degraded
00:28:54.260 by one
00:28:55.440 or both of those things
00:28:56.500 or three or four
00:28:57.420 other things?
00:28:59.040 So,
00:28:59.360 can't you get,
00:29:00.580 because I think
00:29:01.100 the math still works.
00:29:03.080 I believe the math
00:29:04.220 works to keep
00:29:05.060 excess deaths
00:29:06.920 even if COVID
00:29:09.280 goes down.
00:29:11.880 I think it does.
00:29:13.820 But I wouldn't
00:29:14.540 bet my life on it.
00:29:15.500 And by the way,
00:29:16.700 the more likely
00:29:17.660 explanation
00:29:18.320 would be exactly
00:29:20.860 the theory
00:29:22.380 that it's the vaccinations.
00:29:24.880 So,
00:29:26.560 that's just sort
00:29:27.680 of a logic question
00:29:28.680 whether there's
00:29:29.480 any scenario
00:29:30.140 in which
00:29:30.760 you could not
00:29:33.200 narrow it down.
00:29:34.680 But it would definitely
00:29:35.680 be a strong,
00:29:36.580 strong signal
00:29:37.260 that the vaccinations
00:29:38.760 were a problem.
00:29:39.920 We'll see.
00:29:40.780 And apparently
00:29:41.300 we won't know for sure
00:29:42.400 until maybe May.
00:29:44.000 All right.
00:29:45.380 So I saw this
00:29:46.000 from the ethical skeptic
00:29:47.380 tweets.
00:29:50.960 You heard about
00:29:51.880 four Americans
00:29:52.600 kidnapped
00:29:53.180 by a cartel
00:29:54.600 in Mexico.
00:29:57.020 That's war.
00:29:59.280 That's war.
00:30:01.340 How many Americans
00:30:02.700 need to be kidnapped
00:30:03.960 by a cartel
00:30:04.880 in Mexico
00:30:05.400 before it's war?
00:30:07.540 One.
00:30:08.280 One carful.
00:30:09.120 I think it was
00:30:10.760 four people.
00:30:13.500 Somebody says
00:30:14.260 two are dead?
00:30:16.960 Oh, shit.
00:30:18.360 Yeah, I figured.
00:30:19.680 No surprise.
00:30:20.820 Two are dead.
00:30:23.460 That's war.
00:30:25.860 Yep.
00:30:26.300 That's war.
00:30:28.100 You don't walk
00:30:29.640 away from that.
00:30:30.780 That's war.
00:30:31.440 Now, do you think
00:30:33.820 having Vivek Ramaswamy
00:30:36.080 in the race
00:30:36.800 now makes sense?
00:30:38.900 Yes.
00:30:39.960 Because he said
00:30:40.620 go into Mexico.
00:30:42.000 First, try to work
00:30:43.060 with the Mexican government.
00:30:44.280 But if that doesn't work,
00:30:45.620 we take care of it ourselves.
00:30:48.320 Trump also said that.
00:30:50.240 Trump said it first,
00:30:52.260 which I think
00:30:52.920 also raises
00:30:53.780 Vivek's bid.
00:30:56.300 So now you have
00:30:57.140 two strong candidates,
00:30:59.060 one of them
00:30:59.580 the one
00:30:59.960 who's leading
00:31:00.600 in the polls,
00:31:01.440 saying a war
00:31:02.820 with Mexico.
00:31:04.480 Like, actually invade.
00:31:06.000 Just the cartels,
00:31:07.020 not the government.
00:31:08.960 Yeah.
00:31:10.480 So,
00:31:11.980 these deaths
00:31:13.340 should be the turning point
00:31:14.920 if we're a serious country.
00:31:17.360 It should be the reason
00:31:18.360 to vote
00:31:18.880 Republican.
00:31:21.720 You know all of your other reasons
00:31:23.160 to not vote Republican?
00:31:24.880 Don't care.
00:31:26.660 Don't care.
00:31:27.520 I'm a single-issue voter.
00:31:29.220 If we can't fix this one thing,
00:31:31.440 like, what's the point
00:31:33.100 of a government?
00:31:34.360 You can't do this one thing.
00:31:36.140 Just give me one thing
00:31:37.120 and I'll vote for you.
00:31:40.360 Marsha Blackburn,
00:31:41.480 Republican,
00:31:41.960 has an idea on fentanyl.
00:31:43.720 She wants to increase
00:31:44.560 sanctions on China
00:31:45.560 because that's the source
00:31:46.600 of the precursors.
00:31:48.140 I think we all know
00:31:49.580 that even if China
00:31:50.780 shut down
00:31:51.360 the precursor business,
00:31:53.280 it would probably just move
00:31:54.060 out of China
00:31:54.660 to Mexico
00:31:56.080 or someplace else,
00:31:57.580 some other country.
00:31:58.740 So,
00:31:59.460 I'm definitely
00:32:00.280 in favor
00:32:00.840 of this plan,
00:32:02.980 but it's not
00:32:03.560 the end of fentanyl.
00:32:04.980 Right?
00:32:05.200 It might make a difference.
00:32:06.520 It might make a big difference,
00:32:07.840 but it wouldn't be
00:32:08.400 the end of fentanyl.
00:32:09.520 So,
00:32:10.100 her idea is to increase
00:32:11.440 the sanctions
00:32:12.600 on China
00:32:13.560 until they shut down
00:32:15.360 their business.
00:32:16.800 Now,
00:32:17.440 the way China
00:32:18.080 games the system
00:32:19.320 is they say,
00:32:20.840 we do make it illegal
00:32:21.940 and if anybody does it,
00:32:23.840 we do pick them up
00:32:24.620 and put them in jail.
00:32:26.020 But the criminals
00:32:26.740 simply change the formula
00:32:28.300 a little bit
00:32:28.860 of the precursors
00:32:29.820 and then it doesn't
00:32:31.240 technically
00:32:32.080 trigger the law.
00:32:34.800 So,
00:32:35.200 as long as
00:32:35.840 the Chinese government
00:32:36.900 allows them
00:32:37.620 to tinker
00:32:38.100 with the formula
00:32:38.880 every time the law
00:32:39.800 is specified,
00:32:41.180 it's like doing nothing.
00:32:43.400 Right?
00:32:43.960 China knows
00:32:44.720 how to stop it.
00:32:45.600 So,
00:32:46.960 that's why
00:32:47.380 I'm in favor
00:32:47.920 of sanctions.
00:32:50.020 But,
00:32:50.400 here's what doesn't work.
00:32:53.240 Doing the sanctions
00:32:54.480 that you haven't announced.
00:32:56.860 Like,
00:32:57.500 oh,
00:32:57.720 we're going to give you
00:32:58.240 some sanctions
00:32:58.880 because you've done this.
00:33:00.900 I don't like that.
00:33:02.420 I like putting it
00:33:03.300 on a schedule.
00:33:05.220 I like publishing
00:33:06.920 a schedule.
00:33:08.380 All right?
00:33:09.340 On this day,
00:33:10.720 we're going to do this
00:33:11.880 if the fentanyl
00:33:13.080 is still coming in.
00:33:13.860 Which it will be.
00:33:15.720 On this day,
00:33:16.400 we're going to do
00:33:16.840 this additional thing.
00:33:17.940 On this day,
00:33:18.500 we're going to do
00:33:18.880 this additional thing.
00:33:20.040 And just show them
00:33:20.920 the menu
00:33:21.360 and you see that
00:33:22.760 by the end
00:33:23.300 it's full destruction
00:33:24.240 of their economy.
00:33:26.820 Now,
00:33:27.260 they can go as far
00:33:28.240 down that list
00:33:29.020 as they want to
00:33:29.920 as long as
00:33:31.380 we don't stop going.
00:33:33.280 Right?
00:33:33.540 At some point,
00:33:34.540 they're going to say,
00:33:35.580 you know,
00:33:36.040 after three of the things
00:33:37.060 on this list,
00:33:37.920 I don't want to see
00:33:38.960 number six
00:33:39.680 because this is starting
00:33:41.160 to sting a little bit.
00:33:42.140 But I think
00:33:44.040 the entire threat
00:33:45.800 is useless
00:33:46.900 unless it's completely
00:33:48.600 mapped out
00:33:49.180 on a schedule.
00:33:50.680 This is what
00:33:51.360 we're going to do
00:33:51.840 Tuesday.
00:33:52.560 This is what
00:33:53.240 we're going to do
00:33:53.700 Wednesday.
00:33:54.320 And then you've got
00:33:54.920 to do it.
00:33:56.060 Right?
00:33:56.400 No matter how
00:33:57.620 brutal it gets.
00:34:00.900 The Matt Hancock story.
00:34:02.660 I don't know that story.
00:34:04.780 The Matt Hancock story.
00:34:07.200 Does that ring a bell
00:34:08.300 with anybody?
00:34:08.860 He's British.
00:34:13.460 Okay.
00:34:15.720 Anyway,
00:34:16.360 so that's the start.
00:34:20.820 Have you heard about
00:34:21.700 this is like
00:34:22.280 the worst story
00:34:23.160 in Iran
00:34:24.500 for,
00:34:26.520 I guess,
00:34:26.960 over a year
00:34:27.440 there were
00:34:28.780 these poisoning
00:34:29.600 attacks
00:34:30.080 in girls'
00:34:31.160 schools.
00:34:32.240 Over 50 schools
00:34:33.620 and more than
00:34:34.160 400 schoolgirls
00:34:35.660 in 21 provinces
00:34:36.780 across Iran
00:34:37.680 have been poisoned
00:34:39.100 in school
00:34:41.060 like intentionally.
00:34:43.900 I guess it's obvious
00:34:45.000 that it's intentional
00:34:45.780 but they don't know
00:34:46.920 who's doing it.
00:34:47.620 They assume
00:34:48.160 the best guess
00:34:49.560 is some religious
00:34:50.380 extremist
00:34:51.100 who doesn't like
00:34:51.680 girls' education
00:34:52.500 which is a pretty
00:34:53.660 good guess.
00:34:55.060 But
00:34:55.360 I can't even
00:34:57.600 imagine that.
00:34:59.300 Can you?
00:35:00.660 It's like it
00:35:01.300 defies imagination.
00:35:03.080 It's like a level
00:35:03.980 of evil
00:35:04.460 that your brain
00:35:05.160 can't
00:35:06.300 even wrap
00:35:07.100 its head around.
00:35:09.000 So,
00:35:09.500 you know,
00:35:09.720 this is one
00:35:10.680 thing where
00:35:11.340 I guess I can
00:35:11.860 say I agree
00:35:12.460 with the Ayatollah.
00:35:13.940 The Ayatollah
00:35:14.600 says if they
00:35:15.200 find who's
00:35:15.660 doing it
00:35:16.000 they're going
00:35:16.280 to execute
00:35:17.120 them.
00:35:18.120 And
00:35:18.420 I hate to
00:35:20.340 agree with
00:35:20.700 the Ayatollah
00:35:21.320 but yeah,
00:35:23.020 that would be
00:35:24.740 a big yes.
00:35:25.400 You need to
00:35:26.700 execute whoever
00:35:27.460 that is.
00:35:28.600 Wow.
00:35:31.000 President Xi
00:35:31.900 was getting
00:35:33.000 a little more
00:35:33.460 pointed about
00:35:34.160 America,
00:35:35.260 no surprise.
00:35:36.300 And he's
00:35:37.520 saying directly
00:35:38.460 in his speeches
00:35:39.140 that the U.S.
00:35:40.220 is hurting
00:35:40.720 China's economy.
00:35:42.680 And I don't
00:35:43.440 know how much
00:35:43.960 is the U.S.
00:35:45.100 hurting it
00:35:45.600 with sanctions
00:35:46.360 and how much
00:35:47.920 is just the
00:35:48.420 natural situation
00:35:49.260 but here's
00:35:49.740 some of the
00:35:50.020 bad economic
00:35:50.720 news for
00:35:51.260 China.
00:35:52.160 No surprise.
00:35:53.360 China's exports
00:35:54.040 fell 7%
00:35:55.400 in the January
00:35:57.360 to February
00:35:57.900 period.
00:35:59.380 I don't know
00:35:59.800 if that's a
00:36:00.260 big deal.
00:36:01.320 Could be
00:36:01.660 seasonal.
00:36:02.340 I don't know
00:36:02.600 what that's
00:36:02.920 about.
00:36:03.840 Extending a
00:36:04.560 decline from
00:36:05.140 previous months.
00:36:06.800 Okay,
00:36:07.080 that's even
00:36:07.420 worse.
00:36:08.260 As global
00:36:08.980 demand weakened
00:36:09.780 and imports
00:36:10.620 dropped 10%.
00:36:12.060 China's exports
00:36:14.180 to the U.S.
00:36:14.720 plunged 15%.
00:36:16.140 There we go.
00:36:28.860 Good work,
00:36:29.480 America.
00:36:31.020 Yeah.
00:36:31.480 China's exports
00:36:32.260 to the U.S.
00:36:32.780 plunged 15%.
00:36:33.840 Now, here's
00:36:35.520 something that
00:36:36.580 you wouldn't
00:36:37.680 necessarily know
00:36:38.620 if you're not
00:36:39.300 an economist
00:36:39.820 or you don't
00:36:40.940 study this stuff.
00:36:41.720 you don't
00:36:43.540 have to take
00:36:44.100 100% of
00:36:45.060 anything to
00:36:46.020 destroy it.
00:36:48.600 Things have
00:36:49.480 a breaking
00:36:50.000 point and
00:36:52.040 you don't
00:36:52.740 have to get
00:36:53.140 down to
00:36:53.500 zero exports
00:36:55.800 to America
00:36:56.520 before China's
00:36:57.540 in serious
00:36:58.600 trouble.
00:36:59.520 15% sounds
00:37:00.840 like a
00:37:01.240 catastrophe to
00:37:02.080 me.
00:37:03.700 Doesn't it
00:37:04.220 to you?
00:37:05.460 A 15%
00:37:06.360 decline in
00:37:07.180 exports to
00:37:08.660 the U.S.?
00:37:09.160 To me,
00:37:09.660 that sounds
00:37:10.000 like a
00:37:10.380 catastrophe.
00:37:11.720 It's not
00:37:12.900 an end-of-the-economy
00:37:15.200 catastrophe,
00:37:16.300 but what if
00:37:16.860 it hits 25%
00:37:17.820 or 30%?
00:37:19.940 If it hits
00:37:20.620 30%,
00:37:21.580 it's probably
00:37:23.600 game over
00:37:24.200 and we're
00:37:25.700 heading in
00:37:26.020 that direction.
00:37:27.440 Now,
00:37:27.960 that's always
00:37:28.680 hyperbole,
00:37:29.540 too.
00:37:30.120 Have I ever
00:37:30.800 used any
00:37:31.240 hyperbole
00:37:31.760 before?
00:37:32.940 Or is this
00:37:33.260 the first
00:37:33.720 time?
00:37:34.440 I think
00:37:34.780 maybe I've
00:37:35.440 possibly done
00:37:36.000 that before.
00:37:36.860 I have
00:37:37.180 some vague
00:37:37.800 memory where
00:37:38.280 I've done
00:37:38.540 that before.
00:37:39.700 But no,
00:37:40.240 it doesn't
00:37:40.560 mean that
00:37:40.820 China's
00:37:41.140 going to
00:37:41.440 collapse,
00:37:42.120 but it
00:37:42.360 would be
00:37:43.380 a serious
00:37:44.240 adjustment
00:37:45.540 they'd have
00:37:46.000 to make
00:37:46.260 if they
00:37:46.500 lost,
00:37:47.400 if their exports
00:37:48.440 to the U.S.
00:37:48.760 declined 30%.
00:37:49.880 15% is
00:37:51.100 pretty big.
00:37:52.680 All right.
00:37:53.420 Who would
00:37:53.780 like to hear
00:37:54.520 about my
00:37:55.280 interview last
00:37:55.900 night with
00:37:56.420 Chris Cuomo
00:37:57.060 on News
00:37:57.760 Nation?
00:37:58.860 Only available
00:37:59.860 initially on
00:38:01.480 your cable if
00:38:02.640 they carried
00:38:03.120 it, but now
00:38:03.780 it's available
00:38:04.320 on YouTube.
00:38:05.520 So if you
00:38:06.840 go to my
00:38:07.360 pinned tweet
00:38:08.740 on YouTube
00:38:10.660 today, you'll
00:38:11.660 get a link
00:38:12.160 to the full
00:38:12.600 interview.
00:38:13.900 And I
00:38:14.700 wanted to
00:38:15.160 talk about
00:38:15.780 it, but
00:38:16.840 also give
00:38:17.640 you a
00:38:18.040 little bit
00:38:18.380 of a
00:38:18.800 sort of
00:38:20.420 a lesson.
00:38:22.060 Would
00:38:22.300 anybody like
00:38:22.800 a media
00:38:23.640 communication
00:38:24.680 persuasion
00:38:25.680 lesson?
00:38:27.460 Because it's
00:38:28.160 coming up.
00:38:29.600 All right.
00:38:30.520 But first,
00:38:31.360 let's go to
00:38:31.740 the white
00:38:31.940 board.
00:38:32.180 All right.
00:38:32.260 here's my
00:38:37.500 overview of
00:38:38.680 what life
00:38:39.560 is like for
00:38:40.380 me in the
00:38:41.680 center of a
00:38:43.020 scandal.
00:38:45.000 There appear
00:38:45.960 to be two
00:38:46.620 very different
00:38:47.900 worlds.
00:38:49.560 There's a
00:38:50.060 real world,
00:38:51.700 like the
00:38:52.340 one when I
00:38:52.860 walk outside,
00:38:54.800 and everybody
00:38:56.620 I see is
00:38:58.100 kind and
00:38:58.740 happy.
00:39:00.980 And they
00:39:01.760 have genuine
00:39:02.420 good feelings
00:39:04.160 for other
00:39:04.600 people.
00:39:07.040 I don't
00:39:07.680 know what
00:39:07.960 your experience
00:39:08.680 is in
00:39:09.140 life, but
00:39:10.120 I don't
00:39:10.580 really run
00:39:11.120 into any
00:39:11.580 ugly people
00:39:12.260 in terms of
00:39:13.000 attitude, not
00:39:13.760 looks.
00:39:14.700 But almost
00:39:16.120 everybody I
00:39:16.860 encounter in
00:39:17.820 real life is
00:39:19.580 happy to talk
00:39:20.720 to you if
00:39:21.180 you're happy
00:39:21.540 to talk to
00:39:22.060 them.
00:39:23.100 They can
00:39:24.360 easily like
00:39:25.220 you, very
00:39:26.280 easily.
00:39:26.900 You just
00:39:27.260 have to be a
00:39:27.720 nice person.
00:39:29.220 The only
00:39:29.840 problem with
00:39:30.360 the real
00:39:30.640 world is you've
00:39:31.240 got a few
00:39:31.600 Karens, am
00:39:32.820 I right?
00:39:33.940 I'm using
00:39:34.380 that sort of
00:39:35.100 generically for
00:39:35.860 you always have
00:39:37.140 some bad
00:39:38.140 characters and
00:39:38.940 criminals and
00:39:39.620 whatnot.
00:39:40.600 But basically
00:39:41.300 the 99%
00:39:43.820 experience of
00:39:45.180 life, if you
00:39:46.400 don't count
00:39:46.840 that co-worker
00:39:47.720 who's a
00:39:48.540 psychopath,
00:39:49.600 everybody has
00:39:50.120 a co-worker
00:39:50.620 who's a
00:39:50.980 psychopath,
00:39:52.180 right?
00:39:52.640 But not
00:39:53.600 counting that,
00:39:54.440 most of your
00:39:54.940 interactions are
00:39:55.600 just with cool
00:39:56.200 people having a
00:39:56.860 good time.
00:39:57.580 And it works
00:39:58.560 across race,
00:39:59.600 age, gender,
00:40:01.420 religion,
00:40:03.200 sexual orientation,
00:40:05.160 you name it.
00:40:06.100 That's the real
00:40:06.720 world.
00:40:07.660 But we've been
00:40:09.200 hypnotized into
00:40:10.100 thinking the
00:40:10.700 screen world,
00:40:11.960 I call it the
00:40:12.720 screen world,
00:40:14.060 the thing that's
00:40:14.720 only happening on
00:40:15.680 my screens.
00:40:17.260 So I'm in the
00:40:18.520 middle of one of
00:40:19.200 the biggest
00:40:19.840 scandals,
00:40:21.600 dramas,
00:40:22.200 cancellation,
00:40:22.840 whatever you call
00:40:23.420 it, in the
00:40:24.420 entire history of
00:40:26.040 America.
00:40:26.520 it's one of
00:40:27.760 the big ones.
00:40:28.660 I don't know
00:40:29.140 what would be
00:40:29.700 the big S,
00:40:30.860 but I think
00:40:31.600 you'd agree
00:40:32.040 that my
00:40:32.560 cancellation is
00:40:33.280 one of the
00:40:33.720 biggest.
00:40:35.500 I have no
00:40:36.420 real world
00:40:37.080 feeling about
00:40:37.960 it.
00:40:39.680 Like the
00:40:40.620 penalty went
00:40:42.040 from the
00:40:42.440 screen world
00:40:43.120 over to the
00:40:44.200 real world,
00:40:44.940 and it got
00:40:45.600 cancelled,
00:40:46.340 so it affects
00:40:46.920 my economics,
00:40:48.560 but it
00:40:49.500 comes down
00:40:50.140 80%.
00:40:50.900 If your
00:40:52.920 income goes
00:40:53.620 down 80%,
00:40:54.400 you definitely
00:40:55.100 feel it.
00:40:55.660 no matter
00:40:57.160 who you
00:40:57.500 are,
00:40:58.200 you definitely
00:40:58.840 feel it.
00:41:00.260 But the
00:41:00.920 screen world
00:41:01.500 isn't any
00:41:02.100 real people.
00:41:04.520 You know,
00:41:04.800 there are real
00:41:05.180 people who
00:41:05.660 interact,
00:41:06.960 but everybody
00:41:07.680 turns into
00:41:08.280 their worst
00:41:08.840 self.
00:41:10.680 In person,
00:41:11.940 people like to
00:41:12.540 put on their
00:41:13.080 best self.
00:41:14.340 Why wouldn't
00:41:14.700 you?
00:41:15.320 But on the
00:41:15.900 screen,
00:41:16.260 everybody turns
00:41:16.900 into like a
00:41:17.540 monster.
00:41:18.580 So you've
00:41:18.880 got mostly
00:41:19.320 trolls,
00:41:20.200 bots,
00:41:21.300 grifters,
00:41:21.800 narcissists,
00:41:22.500 white knights,
00:41:23.120 click-whores,
00:41:24.240 peacocks,
00:41:24.720 the angry
00:41:25.400 uninformed.
00:41:26.220 They're my
00:41:26.520 favorite.
00:41:27.500 We love the
00:41:28.140 angry uninformed.
00:41:29.720 And then
00:41:30.360 political hacks
00:41:31.160 who are just
00:41:31.760 working for
00:41:32.980 clicks or
00:41:33.500 attention or
00:41:34.000 money.
00:41:35.220 None of this
00:41:35.980 is real.
00:41:36.420 I didn't even
00:41:38.440 get cancelled
00:41:39.040 in the real
00:41:39.760 world.
00:41:40.820 It's just that
00:41:41.380 the cancellation
00:41:42.040 bled into a
00:41:42.840 different world.
00:41:44.280 Like,
00:41:44.440 usually it
00:41:44.880 doesn't jump.
00:41:46.580 Yeah,
00:41:46.840 usually you can
00:41:47.780 keep the screen
00:41:48.500 world in the
00:41:49.240 screen world.
00:41:49.880 but it
00:41:51.040 broke out
00:41:51.460 of its,
00:41:52.200 like,
00:41:52.560 it was like
00:41:53.820 a portal
00:41:54.300 from hell
00:41:54.860 opened.
00:41:55.920 It was like
00:41:56.260 and once
00:42:01.640 the portal
00:42:02.720 from hell
00:42:03.220 connected these
00:42:04.460 two worlds,
00:42:05.800 all hell
00:42:06.380 broke loose.
00:42:07.660 All right,
00:42:07.980 I've got to
00:42:08.260 draw that.
00:42:12.200 This has to
00:42:12.980 be immortalized.
00:42:14.960 If I can reach
00:42:15.920 without pulling
00:42:16.520 my microphones
00:42:17.120 off.
00:42:17.560 this will be
00:42:22.020 the portal
00:42:25.020 from hell.
00:42:34.220 Portal
00:42:34.820 from hell.
00:42:40.660 All right,
00:42:41.460 so that's
00:42:42.120 sort of the
00:42:42.500 big picture.
00:42:46.280 Portal
00:42:46.880 from hell.
00:42:47.560 Take a good
00:42:48.000 look at that.
00:42:48.440 It's beautiful.
00:42:50.160 I think you
00:42:50.900 can tell from
00:42:51.500 this that I'm
00:42:52.440 a professional
00:42:53.080 cartoonist.
00:42:54.580 Like,
00:42:54.780 I don't like
00:42:55.460 to brag and
00:42:56.240 stuff,
00:42:57.060 but,
00:42:57.520 I mean,
00:42:57.740 look at this.
00:42:58.820 It's beautiful.
00:43:00.440 Can you tell
00:43:00.940 that Karen
00:43:01.400 looks exactly
00:43:02.260 like a human
00:43:03.000 female?
00:43:04.060 Yeah,
00:43:04.440 yes,
00:43:04.700 you can.
00:43:05.280 Or possibly
00:43:06.060 a stain
00:43:07.400 on a napkin.
00:43:09.400 It's one of
00:43:09.900 those two things.
00:43:11.080 I do them
00:43:11.820 both very well.
00:43:13.640 All right,
00:43:13.940 let's talk about,
00:43:15.060 here's the big
00:43:15.600 picture.
00:43:15.900 So last
00:43:16.760 night I did
00:43:17.460 one hour
00:43:18.020 for Chris
00:43:19.300 Cuomo.
00:43:20.720 And boy,
00:43:21.440 do I appreciate
00:43:22.320 Chris Cuomo.
00:43:24.240 All right,
00:43:24.660 so I think I
00:43:25.320 told you before,
00:43:26.600 he was one of
00:43:27.500 the people who
00:43:27.900 contacted me,
00:43:28.620 but a lot of
00:43:29.260 people contacted
00:43:29.960 me for interviews
00:43:31.100 and comments,
00:43:31.780 and I said no
00:43:32.520 to almost all
00:43:33.540 of them.
00:43:34.420 But when he
00:43:35.160 contacted me,
00:43:35.960 it was clear he
00:43:36.840 had done his
00:43:37.620 homework,
00:43:38.720 so he actually
00:43:39.400 understood the
00:43:40.080 full context.
00:43:40.820 amazing.
00:43:44.060 Secondly,
00:43:44.740 he promised
00:43:45.200 to give me
00:43:45.680 close to a
00:43:46.760 full hour
00:43:47.280 so that I
00:43:48.460 wouldn't be
00:43:49.020 timed out
00:43:49.880 before I
00:43:50.400 made my
00:43:50.760 point.
00:43:51.960 Thank you.
00:43:52.920 That's exactly
00:43:53.600 what this
00:43:54.060 required.
00:43:55.400 So I said
00:43:56.100 yes.
00:43:57.180 And he
00:43:58.040 delivered.
00:43:59.000 He totally
00:43:59.480 delivered.
00:44:00.580 Now,
00:44:00.920 News Nation,
00:44:01.680 I was not
00:44:02.120 totally familiar
00:44:02.820 with,
00:44:03.160 but I think
00:44:03.660 they're trying
00:44:04.080 to,
00:44:04.760 I believe
00:44:05.380 they're trying
00:44:05.780 to frame
00:44:06.280 themselves
00:44:06.720 as the
00:44:07.460 non-crazy
00:44:09.180 news,
00:44:10.100 you know,
00:44:10.360 without the
00:44:10.960 super spin.
00:44:12.380 There's always
00:44:12.920 a little bit
00:44:13.340 of bias in
00:44:13.820 everything,
00:44:14.240 but I think
00:44:14.900 they're trying
00:44:15.260 to find the
00:44:15.920 middle,
00:44:16.900 and I think
00:44:17.880 they did.
00:44:19.000 I think they
00:44:19.540 actually found
00:44:20.060 the middle
00:44:20.400 of this story.
00:44:21.620 So he asked
00:44:22.240 hard questions.
00:44:23.640 I'll talk in
00:44:24.320 more detail,
00:44:25.040 and I'll give
00:44:25.340 you some media
00:44:26.480 lessons.
00:44:27.320 So he asked
00:44:27.760 hard questions.
00:44:29.240 And by the
00:44:29.540 way, you can
00:44:29.900 see the link
00:44:30.540 to the full
00:44:30.980 thing in my
00:44:31.860 pinned tweet
00:44:32.780 on Twitter.
00:44:34.500 He pushed
00:44:35.420 on the hard
00:44:36.120 questions,
00:44:36.800 which is what
00:44:37.320 I wanted him
00:44:37.880 to do,
00:44:38.360 because I
00:44:38.740 don't want
00:44:39.000 to do
00:44:39.220 interview with
00:44:39.800 a friendly,
00:44:41.080 like somebody
00:44:42.040 who just
00:44:42.760 agrees with
00:44:43.360 me.
00:44:44.580 That's not
00:44:45.160 really going
00:44:45.620 to move
00:44:45.860 anything,
00:44:46.520 talking to
00:44:47.000 people who
00:44:47.360 agree with
00:44:47.740 me.
00:44:48.700 Not really
00:44:49.340 a good use
00:44:49.820 of time.
00:44:50.960 So I knew
00:44:53.180 that he would
00:44:53.680 push, but I
00:44:54.640 also knew that
00:44:55.400 he knew the
00:44:55.820 full context,
00:44:56.920 and I knew
00:44:57.380 that he'd
00:44:57.700 give me time
00:44:58.360 to say what
00:44:59.420 I wanted to
00:44:59.900 say, and he
00:45:00.520 delivered all
00:45:01.340 three.
00:45:02.580 So, you
00:45:03.960 know, revise
00:45:04.940 your CNN-era
00:45:06.120 opinions of
00:45:07.340 Chris Cuomo,
00:45:08.360 because that
00:45:08.860 was a solid
00:45:09.820 contribution to
00:45:11.820 journalism, I
00:45:13.200 think.
00:45:14.220 Now, my
00:45:16.820 objective was
00:45:19.340 to reframe
00:45:21.180 things and
00:45:22.460 move the
00:45:22.940 window.
00:45:24.340 So those of
00:45:25.420 you who
00:45:25.660 watched it can
00:45:26.940 either confirm
00:45:27.620 or deny, but
00:45:29.180 those of you who
00:45:29.600 have not watched
00:45:30.160 it and plan to,
00:45:30.980 do, if
00:45:31.620 you're interested
00:45:32.180 to see these
00:45:32.880 little, let's
00:45:34.680 say, communication
00:45:36.000 techniques that I'm
00:45:36.960 going to talk
00:45:37.400 about, if you
00:45:38.140 want to see them
00:45:38.640 in practice, just
00:45:39.380 watch the video
00:45:39.980 after I explain
00:45:40.900 them.
00:45:41.940 All right, so
00:45:43.600 the first thing
00:45:44.560 that this is like
00:45:45.280 a media lesson
00:45:46.040 to, the first
00:45:47.420 thing you want to
00:45:47.920 avoid if you find
00:45:48.800 yourself in a
00:45:49.400 scandal is going
00:45:50.880 on a show with
00:45:51.580 a four-minute hit,
00:45:53.860 because they're
00:45:54.260 just going to
00:45:55.000 yell at you for
00:45:55.880 three and a half
00:45:56.500 minutes.
00:45:57.640 You'll get half
00:45:58.660 a minute to do
00:45:59.540 something that
00:46:00.040 takes ten minutes,
00:46:01.080 and then you'll
00:46:01.820 run out of time.
00:46:03.440 So that's
00:46:03.920 basically just, you
00:46:05.100 know, inviting you
00:46:06.660 into a trap.
00:46:07.840 But if somebody
00:46:08.440 says, I'll give
00:46:09.060 you an hour, and
00:46:10.320 it'll be live, and
00:46:11.460 the live part's very
00:46:12.460 important, right?
00:46:14.140 That's what Chris
00:46:15.260 Cuomo offered, was
00:46:16.600 live.
00:46:17.300 You don't want it
00:46:18.240 recorded, that's the
00:46:19.740 60 Minutes trick,
00:46:21.580 right?
00:46:21.880 The 60 Minutes trick
00:46:22.980 is to say, oh, it's
00:46:24.760 a high, respectable,
00:46:26.920 thing, you sure
00:46:27.600 want to be on
00:46:28.080 here, and then
00:46:28.540 they can cut the
00:46:29.180 video any way
00:46:29.780 they want.
00:46:30.800 You've seen a lot
00:46:31.420 of people complain
00:46:32.060 about that, right?
00:46:33.120 So the first media
00:46:34.120 trick is don't go
00:46:34.900 into anything that's
00:46:35.660 short.
00:46:37.180 Don't go to
00:46:38.300 anything that's
00:46:39.580 recorded.
00:46:41.640 You get that
00:46:42.460 right, and you
00:46:42.980 get a chance,
00:46:43.880 right?
00:46:44.240 And then also,
00:46:45.320 this is important,
00:46:46.720 you want to pick
00:46:47.400 somebody that the
00:46:48.260 audience will think
00:46:49.760 is going to go at
00:46:50.560 you hard.
00:46:51.840 That's important.
00:46:53.200 If it looks too
00:46:54.060 friendly, then you
00:46:54.900 lose all credibility.
00:46:55.860 You want somebody
00:46:56.380 who genuinely is
00:46:58.080 going to challenge
00:46:58.840 you on whatever
00:47:00.480 allegedly you did
00:47:01.740 wrong.
00:47:02.900 And Cuomo did
00:47:03.860 that.
00:47:05.780 All right, so
00:47:06.800 here's what I
00:47:07.380 wanted to do.
00:47:07.980 Number one, I
00:47:08.520 wanted to put my
00:47:09.320 point of view on
00:47:10.600 record without any
00:47:12.560 of those limitations
00:47:13.440 I just detailed.
00:47:15.620 Did I do that?
00:47:17.300 Did I accomplish
00:47:18.280 putting on video
00:47:20.080 one full record of
00:47:22.440 my complete thinking?
00:47:23.460 Yes, yes.
00:47:25.620 I was happy with
00:47:26.800 the time he gave
00:47:27.500 me, the questions
00:47:28.640 he asked, and he
00:47:29.920 gave me all the
00:47:30.780 time in the world
00:47:31.540 to fill in a lot
00:47:33.620 of context, which
00:47:34.440 made a difference.
00:47:36.720 So I got my
00:47:37.760 viewpoint on record.
00:47:40.160 Did you see
00:47:40.920 anybody disagree
00:47:42.120 with my major
00:47:44.140 reframes, which
00:47:45.140 were the point of
00:47:45.980 the entire offense?
00:47:47.400 As I described, I
00:47:51.080 intentionally used
00:47:52.020 hyperbole to draw
00:47:53.120 energy toward me to
00:47:56.360 make a point and to
00:47:57.740 reframe the race
00:47:58.580 conversation.
00:48:00.160 So my objective
00:48:01.160 was just to put out
00:48:03.620 the idea into the
00:48:04.420 universe that a lot
00:48:06.820 of the race-related
00:48:07.860 training from CRT to
00:48:09.840 ESG to DEI, they're
00:48:12.240 all backwards
00:48:13.140 philosophies or
00:48:14.520 backwards strategies.
00:48:15.500 Nobody in life
00:48:17.620 goes forward by
00:48:19.460 looking backwards.
00:48:20.620 You can have
00:48:21.200 small gains, like
00:48:22.880 you can make
00:48:23.280 somebody guilty and
00:48:24.100 they'll give you
00:48:24.440 some money in the
00:48:24.960 short run.
00:48:25.540 But in the long
00:48:26.340 term, no person,
00:48:28.140 no individual, black,
00:48:29.440 white, any other
00:48:30.040 color, no company,
00:48:31.280 no organization can
00:48:32.820 thrive unless it's
00:48:34.140 focused forward.
00:48:36.320 If you take a
00:48:37.160 driving lesson, they
00:48:39.340 will teach you to
00:48:39.960 make sure you check
00:48:40.760 your mirrors, always
00:48:42.880 be aware of history,
00:48:44.240 don't forget your
00:48:44.960 history.
00:48:45.860 Make sure you're
00:48:46.480 really clear on
00:48:47.200 everything that
00:48:48.300 happened in black
00:48:48.940 history, et cetera.
00:48:50.540 But your focus has
00:48:52.060 to be on the road.
00:48:53.420 Am I right?
00:48:54.700 I'm not ignoring my
00:48:55.900 rearview mirrors.
00:48:57.020 That would be bad
00:48:57.860 driving.
00:48:59.000 I'm looking at my
00:48:59.920 history.
00:49:00.820 I'm looking at what's
00:49:01.540 behind me, but my
00:49:02.700 focus is straightforward.
00:49:04.560 Now, that was the
00:49:05.800 main thing I wanted to
00:49:06.780 get across, and that
00:49:08.380 there are a set of
00:49:09.840 tools that can make
00:49:12.100 any individual
00:49:12.820 successful as long as
00:49:14.840 we work together to
00:49:15.840 fix schools, which
00:49:16.820 are completely broken
00:49:17.740 at the moment.
00:49:18.860 Schools are broken
00:49:19.540 for black kids, white
00:49:20.660 kids.
00:49:21.000 They're just broken
00:49:21.920 completely.
00:49:23.180 And so that's the
00:49:24.100 common ground.
00:49:25.760 If you look forward,
00:49:27.400 you could say to
00:49:28.020 yourself, hey, how can
00:49:29.800 we work together to
00:49:30.960 fix this thing?
00:49:32.100 If you look backwards,
00:49:33.240 you just argue about
00:49:34.960 who owes who.
00:49:35.620 I mean, it just can't
00:49:37.620 work.
00:49:38.920 Take my exact situation.
00:49:42.580 What I offered was a
00:49:44.040 set of success tools,
00:49:45.780 because a lot of the
00:49:46.320 audience didn't know
00:49:47.120 that I had more impact
00:49:48.920 in the field of personal
00:49:50.680 success.
00:49:52.000 That's what I write
00:49:52.820 about and teach online,
00:49:54.420 et cetera.
00:49:54.940 So I have way more
00:49:55.780 impact on the world,
00:49:57.240 on that domain.
00:49:58.240 The audience was not
00:49:59.180 aware of that.
00:50:00.260 But here I am offering a
00:50:01.660 set of tools to make
00:50:03.140 anybody more successful
00:50:04.420 and would be especially
00:50:06.480 useful to anybody
00:50:08.260 who's suffering from
00:50:09.300 systemic racism.
00:50:11.100 Because systemic racism
00:50:12.280 is a real drag on
00:50:13.760 success.
00:50:15.100 But if you learn the
00:50:16.540 right tools of success,
00:50:17.960 you can slice through
00:50:19.120 it like a hot steel rod
00:50:21.080 through butter.
00:50:22.660 Butter is still a
00:50:23.640 barrier.
00:50:24.840 But if you have a hot
00:50:26.100 rod to stick through
00:50:27.760 it, that was not a
00:50:29.280 sexual reference.
00:50:30.080 It just sounded like
00:50:30.800 it.
00:50:31.400 Don't stick your hot
00:50:32.300 rod through butter.
00:50:33.760 You can.
00:50:34.940 I just don't
00:50:35.540 recommend it.
00:50:38.900 So look at my
00:50:40.320 exact situation.
00:50:42.120 People are mad at me
00:50:43.180 for what I said.
00:50:46.640 Are they mad at me for
00:50:47.600 something I said in
00:50:48.400 the future?
00:50:49.340 No, that's not
00:50:50.180 possible.
00:50:50.820 Are they mad at me for
00:50:52.360 something I'm saying
00:50:53.100 right now, like in the
00:50:54.740 present?
00:50:55.900 Nope.
00:50:56.980 Nope.
00:50:57.660 They're mad at me for
00:50:58.980 the past.
00:51:00.980 Now look what you
00:51:02.280 miss if that's your
00:51:04.040 focus.
00:51:05.080 If your focus is mad
00:51:06.720 at me for the past,
00:51:08.480 does that ignore the
00:51:09.420 fact that I'm the most
00:51:10.580 persuasive person in the
00:51:12.900 realm of personal
00:51:13.720 success and I've just
00:51:15.420 told you how to be
00:51:16.320 personally successful in
00:51:18.260 a way that is very
00:51:20.280 likely to work for just
00:51:22.220 about everybody?
00:51:23.900 I'm offering for free.
00:51:26.020 You don't have to buy
00:51:26.920 my book.
00:51:27.980 I'll tell you everything
00:51:28.780 you want for free.
00:51:30.900 Now, somebody's going to
00:51:32.420 buy the book because it's
00:51:33.720 packaged in a way, but
00:51:35.180 that's not the point.
00:51:36.260 In fact, if you tried to
00:51:37.520 buy my book, you
00:51:38.200 couldn't.
00:51:39.280 You couldn't buy my
00:51:40.260 books.
00:51:41.400 Try.
00:51:42.760 They're all banned at the
00:51:43.840 moment, so I think they're
00:51:44.660 out of stock.
00:51:46.160 No, so it's not about
00:51:47.240 selling the books.
00:51:48.240 I'd love to sell some
00:51:49.640 books.
00:51:50.280 I usually do things for
00:51:51.400 more than one reason.
00:51:52.740 So if somebody bought my
00:51:53.700 books, that'd be great.
00:51:54.380 But I will give it all
00:51:56.220 to you for free, and I
00:51:57.840 do it all the time.
00:51:59.220 How many times do I
00:52:00.280 describe what's in the
00:52:01.160 book for free on
00:52:02.840 live stream?
00:52:03.740 So if you look at me as
00:52:05.460 somebody who said
00:52:06.120 something you didn't
00:52:06.840 like in the past, you
00:52:08.420 would be blind to the
00:52:09.800 fact that I just opened
00:52:10.820 up this trove of
00:52:13.880 useful things that I
00:52:15.240 want to give you for
00:52:16.020 free that absolutely
00:52:17.600 will change your life in
00:52:18.780 a positive way.
00:52:20.260 Now, if that sounds
00:52:21.820 absurd to anybody who's
00:52:22.940 new to me, I'm going
00:52:24.360 to do an exercise that
00:52:25.420 I often do.
00:52:27.200 Can I deliver that?
00:52:29.440 The people who know
00:52:30.400 me, you've been around
00:52:31.200 a while, can I deliver
00:52:32.460 on giving you tools that
00:52:34.480 would make you more
00:52:35.200 successful?
00:52:35.880 Based on your own
00:52:36.640 experience, can I
00:52:38.260 deliver?
00:52:40.800 Locals all say yes,
00:52:42.140 because they've watched
00:52:43.040 me the longest.
00:52:44.860 YouTube, the people who
00:52:45.900 know me, say yes.
00:52:47.200 This is a real asset.
00:52:50.200 All you have to do is
00:52:51.480 remove your past
00:52:52.660 blindness.
00:52:54.400 Stop looking at the
00:52:55.600 rear view mirror and
00:52:57.240 look at me.
00:52:57.740 I'm standing in the
00:52:58.560 road right in front of
00:52:59.400 your car with a big
00:53:00.820 barrel of cash.
00:53:02.800 Who saw it?
00:53:04.660 In this entire drama,
00:53:07.320 who was able to see me
00:53:09.560 standing in the front of
00:53:10.440 the highway with a big
00:53:12.120 barrel of cash and it was
00:53:13.400 for free?
00:53:14.500 It's free cash.
00:53:15.260 Now, that was the frame
00:53:19.160 that I was trying to
00:53:20.480 reframe.
00:53:21.620 I was trying to reframe it
00:53:22.920 from looking at the
00:53:23.680 past to looking at
00:53:26.220 the future, which
00:53:27.560 always works if you
00:53:30.120 have the right tools.
00:53:31.440 But also, you need to
00:53:32.840 fix the systemic racism
00:53:34.180 in the schools, because
00:53:35.880 if you don't get the
00:53:36.880 schools right, nothing
00:53:38.580 else works, basically.
00:53:40.060 It's pretty hard.
00:53:41.120 All right.
00:53:42.840 So, I believe I put
00:53:44.420 into the world the
00:53:45.340 idea that you should
00:53:46.160 stop pursuing anything
00:53:48.700 that's a backwards
00:53:49.580 looking strategy for
00:53:51.100 success.
00:53:52.920 Now, that's called the
00:53:54.340 high ground maneuver,
00:53:55.620 which I discuss as the
00:53:57.100 most powerful persuasion
00:53:59.780 technique.
00:54:01.560 Literally, no one can
00:54:02.760 disagree with looking
00:54:04.380 forward instead of
00:54:05.520 backwards.
00:54:06.220 Because if you said it
00:54:07.460 out loud, you would
00:54:08.200 sound stupid.
00:54:09.800 You would sound stupid.
00:54:11.120 If you said, you
00:54:11.980 know, I hear what
00:54:12.940 you're saying, Scott,
00:54:13.660 but I really do like
00:54:15.260 focusing on the past.
00:54:17.300 Nobody could even say
00:54:18.380 that in public, because
00:54:19.400 it sounds so stupid.
00:54:20.940 Right?
00:54:21.340 And yet, everybody was
00:54:22.560 doing it.
00:54:24.420 Not just black people.
00:54:26.400 Everybody was doing
00:54:27.440 it.
00:54:28.460 Everybody was focusing
00:54:29.600 on the past.
00:54:30.820 And when you deal with
00:54:31.940 me, you're focusing on
00:54:32.880 the past, too, if all
00:54:33.800 you can bitch about is
00:54:35.260 something I said in
00:54:36.080 the past.
00:54:37.080 Recent past.
00:54:38.300 Recent past.
00:54:39.880 Still past.
00:54:41.740 The recent past is
00:54:43.680 still the past.
00:54:46.760 It's different.
00:54:49.660 All right.
00:54:50.340 So, that should act
00:54:52.900 like an earworm.
00:54:54.600 You know how earworms
00:54:55.740 are?
00:54:56.000 You hear music and you
00:54:57.060 can't get it out of
00:54:57.520 your head?
00:54:59.660 The people who heard
00:55:00.860 you should not focus on
00:55:01.940 the past are going to
00:55:04.180 have a hard time
00:55:04.800 forgetting that.
00:55:05.980 Would you agree?
00:55:06.620 It's a reframe that, and
00:55:09.080 this is how reframes
00:55:09.880 work.
00:55:10.640 Part of the magic of
00:55:11.620 reframes is that a good
00:55:13.060 one you only have to
00:55:14.540 hear once.
00:55:15.980 Such as, alcohol is
00:55:17.740 poison.
00:55:19.460 Just thinking of alcohol
00:55:20.660 as poison is a reframe
00:55:22.460 that actually makes it
00:55:23.360 easier to stop drinking.
00:55:24.960 That's been proven many
00:55:26.100 times.
00:55:27.420 So, yeah, a good reframe
00:55:29.160 gets in your mind and it
00:55:30.620 can't get out.
00:55:31.120 Especially if it's a
00:55:32.320 high ground maneuver.
00:55:34.400 Zero people.
00:55:35.320 Now, you saw at the end
00:55:36.620 of the, if you watch the
00:55:37.900 Cuomo interview, he brought
00:55:39.580 in a few guests to get a
00:55:41.700 counterpoint, which I
00:55:42.840 thought was good
00:55:43.360 technique.
00:55:44.420 Now, you could argue that
00:55:45.480 I should have also been on
00:55:46.620 to counterpoint the
00:55:47.520 counterpoint, but at some
00:55:48.800 point, you know, at some
00:55:50.300 point it becomes too much.
00:55:52.340 So, I was happy with that.
00:55:54.200 Even not having a response
00:55:57.040 to the responses, that was
00:55:58.600 fair because, you know, it's
00:56:00.360 the real world.
00:56:01.820 But, did you see any of the
00:56:03.560 people say that you should
00:56:05.300 have a backwards focus?
00:56:07.280 Did anybody disagree with my
00:56:08.900 primary purpose and reframe?
00:56:11.040 I don't think so, right?
00:56:14.180 So, here, here are the
00:56:17.300 things that, that I think
00:56:20.980 were not focused on.
00:56:24.780 Did, and you saw, I didn't
00:56:26.480 see the guest talk, I only
00:56:27.620 heard some quotes, so you'll
00:56:29.120 have to fact check me on
00:56:30.260 this.
00:56:30.840 Did the guests he had on
00:56:32.020 later, I guess it was Dan
00:56:33.260 Abrams and somebody, Eric
00:56:36.760 Dyson, and then another
00:56:39.120 gentleman whose name I can't
00:56:40.400 remember.
00:56:41.320 Anyway, did any of them
00:56:43.020 accuse me of being a right
00:56:44.420 wing MAGA or a racist?
00:56:48.100 Did any of them say that?
00:56:50.200 No.
00:56:51.100 No.
00:56:51.380 Because once you heard the
00:56:52.220 context, that no longer made
00:56:54.620 sense.
00:56:55.760 So, was I successful, at least
00:56:57.540 in the interview, not in the
00:56:58.760 world, but in the interview,
00:57:00.360 was I successful in reframing
00:57:02.180 myself as not a right wing
00:57:04.300 crazy?
00:57:05.540 Yes or no?
00:57:07.560 Yes.
00:57:08.340 Right?
00:57:08.600 So, that would be a big
00:57:09.600 success, communication-wise,
00:57:11.680 wouldn't it?
00:57:12.320 Because that would be an
00:57:13.220 objective.
00:57:16.300 Remember how in the first days
00:57:18.000 of my scandal, everybody was
00:57:19.600 focused on the quality and
00:57:21.160 usefulness of the poll, the
00:57:23.200 Rasmussen poll?
00:57:24.680 By the end of me giving the
00:57:26.960 context, was anybody still
00:57:29.320 complaining about depending on
00:57:31.120 the poll, knowing that I didn't
00:57:33.200 depend on the poll?
00:57:35.640 I don't think they talked about
00:57:37.260 it, did they?
00:57:38.500 So, that would be a successful
00:57:39.780 communication.
00:57:44.080 How about their focus on, you
00:57:46.700 had to know that there would be
00:57:48.780 trouble?
00:57:49.120 You know, everybody's been
00:57:50.520 saying the same thing, you had
00:57:51.600 to know, you had to know, and
00:57:54.120 then you suffered the
00:57:55.340 consequence.
00:57:56.800 Did you see that I agreed with
00:57:58.200 that, and have from the
00:58:00.080 beginning?
00:58:00.900 You had to know.
00:58:02.340 Yeah, of course I knew it was a
00:58:03.820 risk.
00:58:04.340 It was a calculated risk.
00:58:05.940 But, why did you do it if you
00:58:08.000 had to know?
00:58:09.260 Well, I was using hyperbole.
00:58:11.320 But you had to know.
00:58:12.980 Yes, I had to know.
00:58:14.220 That's why I did it.
00:58:15.760 I knew that it would cause
00:58:16.960 trouble.
00:58:17.720 That's the reason I did it.
00:58:19.120 But you had to know it would
00:58:20.540 cause trouble.
00:58:22.160 Okay, how many times do I have
00:58:23.760 to say I knew it would cause
00:58:24.760 trouble before you'll agree
00:58:26.760 with me?
00:58:27.560 I knew it would cause trouble.
00:58:29.180 That's why I did it.
00:58:31.280 Okay, I hear you, but you had to
00:58:33.960 know it would cause trouble.
00:58:37.140 Yeah, I don't know if you
00:58:38.000 noticed that a lot of the
00:58:39.080 conversations are starting to
00:58:40.360 take that form.
00:58:41.000 But, yeah, did you notice that?
00:58:46.640 Right.
00:58:48.260 So, but then there was, why am I
00:58:51.800 complaining?
00:58:52.580 Remember, people kept saying, why
00:58:53.900 are you complaining?
00:58:55.240 Because you knew there was, you
00:58:56.860 know, the risk.
00:58:58.100 And I say, when did I complain?
00:59:01.180 I've described, but I've never
00:59:03.680 complained once.
00:59:04.500 I think they accepted that because
00:59:08.760 they have no, I mean, there's no
00:59:10.680 counterfactual.
00:59:12.140 Nobody's seen me complain.
00:59:14.020 I haven't complained privately,
00:59:16.140 believe it or not.
00:59:17.360 I mean, you, there's no way to know
00:59:19.400 that.
00:59:19.660 But even privately, I have not
00:59:22.240 complained because I'm not, I'm not
00:59:24.660 processing it as a complaint.
00:59:26.760 Do you know why I don't process it
00:59:28.540 as a complaint?
00:59:30.240 Guess why?
00:59:31.720 Why do I not process what I'm doing
00:59:33.960 as a, why don't I feel like I
00:59:35.940 should complain?
00:59:37.180 Do you know why?
00:59:39.100 Because I'm not looking at the
00:59:40.520 past.
00:59:42.060 I'm looking at the future.
00:59:43.620 And do you know what the future
00:59:44.500 kicked up?
00:59:45.740 The future just kicked up me as the
00:59:48.400 most prominent voice on an
00:59:50.880 important trend and put me exactly
00:59:54.660 where I wanted to be.
00:59:56.360 If you had asked me in advance,
00:59:58.160 Scott, this is going to cost you,
01:00:00.660 here's the number, it's going to
01:00:02.960 cost you this many dollars, but you
01:00:05.960 get to be exactly where you are now.
01:00:07.900 Would you take the deal?
01:00:11.040 Yup.
01:00:12.360 Yeah, I would.
01:00:14.220 I would.
01:00:16.160 Now, I know that doesn't make sense
01:00:17.760 to a normal person.
01:00:20.600 And I've never claimed I'm normal.
01:00:22.980 So I get that you wouldn't make the
01:00:24.820 choice.
01:00:25.680 I get that.
01:00:26.920 But I'm pretty comfortable with it.
01:00:29.260 Pretty comfortable.
01:00:31.460 All right.
01:00:32.060 What else?
01:00:34.180 I made the claim, and Chris allowed me
01:00:37.220 to interrupt him to make this point,
01:00:41.200 because for a moment it looked like
01:00:42.320 we were going to go to break or
01:00:43.220 something before I made an important
01:00:44.840 point.
01:00:45.100 I claimed that everyone who understood
01:00:47.620 the context agreed with me.
01:00:50.640 But for a minute it looked like maybe
01:00:52.420 that point was getting lost, and there
01:00:55.140 was some, let's say, disagreement that
01:00:57.940 quote, everybody agreed with me.
01:01:00.140 And then I got a chance, and this is why
01:01:02.060 you want an hour, right?
01:01:03.720 This is exactly why offering me an hour
01:01:07.200 is exactly the right thing to do.
01:01:09.140 So I interrupted him, because I had time.
01:01:13.160 Right?
01:01:13.360 If I didn't have time, I couldn't have done it.
01:01:15.300 But I had time to interrupt and say, no,
01:01:17.100 I just want to clarify, it's only the people
01:01:19.580 who have seen the context are okay with it.
01:01:22.560 And I got to, like, really focus on that.
01:01:27.080 In the end, I think Chris Cuomo disagreed
01:01:30.800 with me being cancelled.
01:01:33.120 I couldn't hear at that point, but I think
01:01:34.720 that happened, right?
01:01:36.080 If you saw it.
01:01:38.360 Would you call that a, let's say,
01:01:42.800 a communication goal achieved?
01:01:47.320 Because if the host actually says you
01:01:49.720 shouldn't be cancelled after hearing the
01:01:52.640 whole context and really listening to it,
01:01:54.940 that's about as good as it gets, right?
01:01:58.920 Now, I never wanted him to agree
01:02:01.900 with the way I said it.
01:02:05.080 I never asked him to agree with the hyperbole.
01:02:08.320 Because the whole point of the hyperbole
01:02:09.860 is that it makes you mad.
01:02:12.320 Now, if you saw it, there was a point
01:02:14.000 where he asked a really good question,
01:02:15.980 and I think it was based on a caller's concern.
01:02:19.880 You know, what do you say to the caller
01:02:21.960 who was hurt?
01:02:23.500 But, like, her feelings were hurt
01:02:25.740 by, you know, my statements.
01:02:29.680 And under many normal circumstances,
01:02:33.860 that would be a normal apology situation.
01:02:36.640 But I think I would hold an apology
01:02:38.580 for this world.
01:02:42.640 Anybody in this world, the real one,
01:02:44.840 who wants an apology,
01:02:46.440 I'd probably give it to them.
01:02:48.200 I'd probably give it to them.
01:02:49.180 And I wouldn't even, I don't think I'd hesitate.
01:02:50.820 If somebody was standing in front of me,
01:02:53.500 said, you know,
01:02:55.260 hey, you made me feel bad,
01:02:56.580 I'd probably say something like this.
01:02:58.400 Well, I'm sorry I made you feel bad.
01:03:00.180 But I had a purpose in mind.
01:03:02.680 And all change is painful.
01:03:05.120 I hope you can get over it.
01:03:06.360 And hey, how about let's not focus on the past.
01:03:09.080 I've got some really good tools for you
01:03:10.740 that would make you more successful.
01:03:12.200 Are you interested?
01:03:12.700 So, let's see, what else?
01:03:20.040 So when the, oh, let me say that
01:03:23.540 I did have a blind spot,
01:03:25.320 which I think explains a lot.
01:03:28.780 Here's my blind spot,
01:03:30.740 which I think fills in the biggest mystery.
01:03:33.340 Because when people said you had to know
01:03:35.180 how it would happen,
01:03:36.540 that sounds incomplete.
01:03:40.140 Because I, you know,
01:03:43.080 I say I was surprised that it was that big.
01:03:46.060 And people say,
01:03:47.020 how can you possibly be surprised
01:03:48.820 that the reaction was that big?
01:03:52.940 And that's a,
01:03:56.500 wouldn't you like to know that?
01:03:57.780 Why was I surprised
01:03:59.000 that the reaction was that big?
01:04:00.580 Doesn't that seem like a blind spot?
01:04:01.880 Because I was surprised.
01:04:03.740 I was genuinely surprised.
01:04:05.120 I knew there'd be a reaction.
01:04:06.360 That was the point.
01:04:07.760 All right, here's why.
01:04:10.260 It is impossible
01:04:11.240 to sit here and talk to
01:04:14.220 two computer screens
01:04:17.240 and feel my impact on the world.
01:04:21.900 You can't feel it.
01:04:23.920 There's a total illusion
01:04:25.100 when you're famous
01:04:25.880 that you're not famous.
01:04:28.160 Did you know that?
01:04:29.060 I speculate that one of the reasons
01:04:32.740 that somebody like Michael Jordan
01:04:34.780 talks about himself
01:04:35.920 in the third person,
01:04:37.580 you know, like,
01:04:38.460 Michael Jordan would never do that,
01:04:39.940 or Michael Jordan isn't going to
01:04:41.580 let you lose the game
01:04:42.460 in the last second,
01:04:43.740 like he's a different person.
01:04:45.800 It's because, you know,
01:04:46.980 your public person
01:04:47.680 and your real person
01:04:48.460 are completely different.
01:04:52.860 So, did I underestimate
01:04:55.060 my own, let's say,
01:04:57.320 I won't say importance.
01:05:01.820 Did I underestimate
01:05:02.580 my impact
01:05:05.280 on other people's minds?
01:05:09.200 And I think I did.
01:05:10.420 Or influence, maybe.
01:05:11.580 Yeah, reach.
01:05:12.840 Reach, maybe.
01:05:13.920 So I underestimated
01:05:15.000 my reach and my influence.
01:05:17.320 Now, the reach, of course,
01:05:18.380 was accelerated by the fact that,
01:05:20.260 as I talked about on the show,
01:05:22.000 and I didn't see anybody disagree,
01:05:23.260 that we've monetized outrage.
01:05:27.940 So I definitely underestimated
01:05:30.280 the effectiveness
01:05:32.240 of the monetization of outrage
01:05:34.500 because I didn't think
01:05:35.740 they would get to my choke points, right?
01:05:38.160 I did think some newspapers
01:05:39.580 might bail out.
01:05:41.140 That was a risk.
01:05:42.360 But I didn't think they'd get
01:05:43.540 to my actual distributor.
01:05:44.640 If you could turn
01:05:46.740 the distributor off,
01:05:47.880 then all the newspapers
01:05:48.700 turn off at the same time,
01:05:50.040 which is what happened.
01:05:51.040 And the same with my books.
01:05:52.080 You know, it's not like
01:05:53.760 they had to cancel a book.
01:05:55.400 They went after the publisher.
01:05:56.960 And the publisher had
01:05:57.860 my entire backlog of books
01:05:59.580 plus the new one.
01:06:01.740 So with one cancellation,
01:06:03.260 they can get my whole catalog of books.
01:06:05.300 So that's what happened.
01:06:06.700 Now, I didn't...
01:06:07.920 That was not something
01:06:08.780 I saw coming.
01:06:10.340 So if you want to call me dumb
01:06:12.360 or say I have a blind spot
01:06:13.840 because I didn't realize
01:06:15.120 that people cared enough
01:06:17.000 about my opinion.
01:06:18.080 I mean, honestly,
01:06:18.720 I didn't think people cared enough
01:06:20.040 about my opinion.
01:06:21.160 And I really didn't.
01:06:23.380 That was sort of a surprise.
01:06:25.360 But I'm not sure
01:06:26.440 they cared about the opinion
01:06:27.520 so much as the outrage machine
01:06:30.040 is so well-oiled
01:06:31.000 and everybody gets a payday.
01:06:33.400 Click, click, click.
01:06:34.560 Click, click, click.
01:06:35.420 Everybody gets a payday.
01:06:36.800 So I think I had not estimated
01:06:38.880 how efficient
01:06:40.560 the cancellation machine
01:06:41.980 is in February 2023.
01:06:45.440 So that's on me.
01:06:47.900 Would you agree?
01:06:49.740 That's on me.
01:06:51.560 So because one of the things
01:06:52.700 that people want to say is
01:06:53.860 you're not taking responsibility.
01:06:56.320 No, I'm trying to take it
01:06:57.200 as hard as possible.
01:06:58.900 I'm trying to take as much responsibility
01:07:01.560 as I possibly can.
01:07:03.080 If there's anything I'm leaving out,
01:07:06.180 remind me,
01:07:06.940 and I'll take that responsibility too.
01:07:08.460 I'm at 100% responsibility
01:07:11.620 for what I did.
01:07:14.120 Now, I'm not responsible
01:07:15.520 for the fact
01:07:16.260 that we have a backwards-looking world
01:07:18.880 and there's a cancellation machinery
01:07:21.560 that operates efficiently.
01:07:22.820 that's for other people to work on.
01:07:25.960 But I'm completely responsible
01:07:27.760 for any risk management decision I make
01:07:30.620 if it goes wrong.
01:07:33.300 And then other people
01:07:34.300 just don't know how to do statistics.
01:07:38.740 Well, let me get to this point.
01:07:40.440 So one of the guests who came on,
01:07:42.400 they did this little mind reading.
01:07:44.240 So one of the guests had two complaints
01:07:45.820 which he emphasized on Twitter today.
01:07:48.380 He was surprised that I was baffled
01:07:52.180 by the outcome.
01:07:53.460 That was his words.
01:07:54.960 I was baffled by the cancellation.
01:07:59.120 Does that word fit anything that happened?
01:08:03.360 Because what I thought happened was
01:08:05.060 it was a calculated risk
01:08:07.180 and one of the outcomes
01:08:10.180 was complete cancellation.
01:08:12.480 Of course it was.
01:08:13.740 It wasn't my expectation,
01:08:16.600 but of course it was an obvious
01:08:18.220 risk.
01:08:19.420 And he read my mind
01:08:20.940 and after all of this
01:08:22.340 decided that I was baffled
01:08:24.560 that I got cancelled.
01:08:26.680 Now baffled is one of those
01:08:28.300 words you use
01:08:29.240 when you don't have an argument.
01:08:31.420 If you don't have an argument,
01:08:32.640 you read somebody's mind
01:08:33.760 and then you characterize it
01:08:35.780 in a weird way
01:08:36.800 and then you criticize
01:08:37.940 your own characterization
01:08:39.160 of the thing you mind read.
01:08:42.760 That's called winning
01:08:44.240 the conversation for me.
01:08:46.660 If I could get somebody's
01:08:48.600 best argument
01:08:49.880 to be that they imagined
01:08:51.820 I was baffled
01:08:52.740 when the evidence suggests
01:08:55.000 exactly the opposite.
01:08:57.060 But I did have a blind spot.
01:08:59.460 That's true.
01:09:00.140 But I wasn't baffled
01:09:01.120 that that was always a possibility.
01:09:04.440 Then his other was
01:09:05.400 he says he disagrees
01:09:08.000 that I said it was the
01:09:09.960 quote only way
01:09:10.960 to fight racism.
01:09:12.460 Did I say anything like that?
01:09:15.220 Did anybody hear me say
01:09:16.580 that getting cancelled
01:09:18.000 was the only way
01:09:18.780 to fight racism?
01:09:20.440 Nothing like that.
01:09:22.100 So when somebody has to
01:09:23.700 literally make up
01:09:25.360 an opinion for you,
01:09:26.900 assign it to you,
01:09:28.200 and then argue
01:09:28.940 you shouldn't have
01:09:29.560 that opinion,
01:09:30.600 who won the conversation?
01:09:32.300 That's me winning
01:09:34.500 as hard as you can win.
01:09:36.660 That's like game over.
01:09:40.460 All right.
01:09:42.020 Dan Abrams argued,
01:09:44.800 and keep in mind
01:09:46.600 that he was an attorney, right?
01:09:49.360 So he's somebody
01:09:50.620 who's really good at arguing
01:09:51.720 and breaking down logic
01:09:53.520 and stuff.
01:09:54.460 All right, so the guy
01:09:55.160 who's a professional,
01:09:56.700 logical communicator
01:09:58.860 had this to say.
01:10:00.340 He said,
01:10:01.400 I can't have it both ways,
01:10:03.740 claiming that
01:10:04.400 I was using hyperbole,
01:10:06.080 but I was also
01:10:07.100 taken out of context.
01:10:09.240 So he lawyered me
01:10:10.560 and he said,
01:10:10.980 oh, I found a technical
01:10:12.840 problem with the argument.
01:10:14.640 You can't use hyperbole
01:10:16.220 or say you used hyperbole,
01:10:17.920 but also
01:10:18.820 a completely different excuse
01:10:20.960 that you were taken
01:10:22.300 out of context.
01:10:23.960 To which I say,
01:10:26.420 the hyperbole
01:10:27.400 was the specific sentence;
01:10:29.740 the context
01:10:31.560 is why I said it.
01:10:33.960 Yes, Dan Abrams,
01:10:35.560 you can use hyperbole
01:10:37.240 and be taken out of context.
01:10:40.480 There's absolutely
01:10:41.640 no conflict
01:10:42.980 between those two things.
01:10:44.940 They work together
01:10:45.880 really well.
01:10:47.180 Yes,
01:10:48.340 hyperbole.
01:10:49.520 Yes,
01:10:50.200 also taken out of context.
01:10:53.160 And that was his best,
01:10:54.340 I think that was
01:10:55.120 his best criticism.
01:10:56.200 Keep in mind
01:10:58.300 that none of the points
01:11:00.100 I made
01:11:00.660 were involved
01:11:01.880 in the criticism.
01:11:03.920 Right?
01:11:04.660 The criticism
01:11:05.420 is that you can't
01:11:06.660 have it both ways
01:11:07.700 when obviously
01:11:09.180 you can
01:11:09.820 quite easily
01:11:11.320 and it doesn't take much
01:11:12.680 to convince you
01:11:13.600 it's true.
01:11:14.700 That was the best he had.
01:11:17.180 He's a lawyer.
01:11:18.720 He's a smart guy.
01:11:20.440 He does this
01:11:21.320 for a living.
01:11:22.160 He does this
01:11:24.260 for a living
01:11:24.820 and that was
01:11:25.800 his best take.
01:11:27.860 That's all he had.
01:11:30.320 Can I win
01:11:31.220 any harder than that?
01:11:33.620 Seriously.
01:11:34.800 Would it be
01:11:35.360 even possible
01:11:36.140 to win harder than that?
01:11:37.660 I don't think so.
01:11:40.140 And then there was
01:11:40.920 some professor guy
01:11:42.520 who
01:11:43.700 somebody said
01:11:44.820 that he was doubting
01:11:45.740 that a
01:11:46.760 qualified
01:11:47.800 black applicant
01:11:48.760 would have
01:11:49.260 an advantage
01:11:49.800 in a big corporation
01:11:51.220 in America
01:11:51.860 that is desperate
01:11:53.240 to increase
01:11:53.880 their diversity.
01:11:55.400 Did that actually happen?
01:11:57.300 Did he actually
01:11:57.880 doubt the fact
01:11:58.980 that
01:12:00.760 a qualified
01:12:02.500 black employee
01:12:03.820 would have
01:12:04.280 an advantage
01:12:04.780 in corporate America?
01:12:05.640 I didn't know
01:12:07.880 that anybody
01:12:08.340 questioned that.
01:12:10.200 But if there are
01:12:11.500 other people
01:12:12.440 in black America
01:12:13.280 who believe
01:12:13.760 that's true
01:12:14.380 think of the opportunity
01:12:15.780 that just opened up.
01:12:17.760 Imagine
01:12:18.320 if that's
01:12:19.360 a common opinion.
01:12:21.220 And imagine
01:12:22.240 some number
01:12:23.240 of black Americans
01:12:23.980 watched the show.
01:12:25.880 Because there were
01:12:26.340 a number of call-ins
01:12:27.000 so I know he's got
01:12:27.740 black watchers,
01:12:29.420 viewers.
01:12:30.440 So imagine
01:12:31.000 you're a black American
01:12:32.620 and you heard me say
01:12:33.820 no,
01:12:35.120 if you get a good education
01:12:36.480 you're going to actually
01:12:37.480 be at the top
01:12:38.300 of the hiring list.
01:12:39.720 Imagine you'd never
01:12:40.640 heard that before.
01:12:42.100 Because I've actually
01:12:42.740 heard other
01:12:43.220 black Americans
01:12:44.300 say that.
01:12:45.080 They said
01:12:45.500 no,
01:12:45.820 that's not true.
01:12:46.740 There's no favoritism
01:12:47.840 in corporate America.
01:12:48.660 And I have to
01:12:50.160 explain the long
01:12:50.920 story of
01:12:51.480 it is so true.
01:12:54.040 It's like the
01:12:54.920 truest thing
01:12:55.720 that's ever been
01:12:56.400 true.
01:12:57.140 Like there's no
01:12:57.900 gray area.
01:12:58.800 This is the
01:12:59.380 truest of all
01:13:00.080 true things.
01:13:01.020 It just doesn't
01:13:01.960 apply to small
01:13:02.780 companies.
01:13:03.840 So I think that's
01:13:04.780 where the confusion
01:13:05.620 is.
01:13:06.160 If your experience
01:13:07.020 is with small
01:13:07.800 companies,
01:13:08.820 yeah,
01:13:09.000 they're probably
01:13:09.340 pretty racist.
01:13:10.680 So that would
01:13:11.340 be a good strategy
01:13:12.160 to not get a job
01:13:13.420 at the Korean
01:13:13.980 supermarket.
01:13:14.580 Have I ever
01:13:17.120 advised a young
01:13:18.100 black man to
01:13:19.580 get an education
01:13:20.380 and while he's
01:13:21.900 working through
01:13:22.320 school,
01:13:23.140 go work at the
01:13:24.000 Korean grocery
01:13:24.780 store?
01:13:26.360 I don't know
01:13:27.200 if they're going
01:13:27.640 to discriminate,
01:13:29.240 but I'd worry
01:13:30.400 about it.
01:13:31.560 Go where you
01:13:32.820 have an advantage
01:13:34.860 in bias.
01:13:36.860 Because the
01:13:37.380 world is going
01:13:37.840 to give you
01:13:38.200 both bias
01:13:39.220 advantages and
01:13:40.020 bias disadvantages.
01:13:41.340 Go where you
01:13:41.840 have an advantage,
01:13:43.300 black or white.
01:13:44.580 Go where you
01:13:45.000 have an advantage.
01:13:46.100 And that would
01:13:46.400 be corporate
01:13:46.840 America if
01:13:47.440 you're black.
01:13:48.260 If you're white,
01:13:49.200 start your own
01:13:49.660 business.
01:13:51.200 If you're a
01:13:51.940 white man in
01:13:52.920 America, I
01:13:54.140 would not get
01:13:54.700 into the
01:13:55.080 corporate world.
01:13:56.720 Unless it's a
01:13:57.520 special case,
01:13:58.780 it's like all
01:13:59.880 you've lived for
01:14:00.720 all your life
01:14:01.240 or something,
01:14:02.220 but it's really
01:14:02.920 not the best
01:14:04.720 statistical.
01:14:07.620 Statistically,
01:14:08.260 it's not your
01:14:08.660 best strategy.
01:14:09.600 Your best strategy
01:14:10.280 would be start
01:14:11.360 your own business
01:14:11.960 if you can.
01:14:12.540 All right.
01:14:14.920 And by the
01:14:15.320 way, that
01:14:15.560 would be a
01:14:16.340 great strategy
01:14:17.080 for black
01:14:17.800 Americans, but
01:14:19.200 a good way to
01:14:19.800 do it is to
01:14:20.400 get some
01:14:20.780 corporate experience
01:14:22.100 and connections
01:14:22.800 and build up
01:14:23.860 some money and
01:14:24.360 stuff and then
01:14:25.160 start a business.
01:14:25.960 That's not a
01:14:26.400 bad way to do
01:14:26.960 it.
01:14:27.820 So two ways
01:14:28.420 to do it.
01:14:30.900 All right.
01:14:32.500 That, ladies
01:14:33.180 and gentlemen,
01:14:33.960 is my review
01:14:35.200 of it.
01:14:36.180 If you haven't
01:14:37.240 seen it, go
01:14:38.700 take a look and
01:14:39.440 see if my
01:14:40.260 characterization of
01:14:41.200 that feels fair
01:14:42.080 to you.
01:14:43.500 Again, big
01:14:44.200 thank you to
01:14:44.960 Chris Cuomo.
01:14:46.220 I think that
01:14:46.700 was a solid
01:14:47.380 contribution to
01:14:48.880 journalism, frankly.
01:14:52.300 It's the best
01:14:53.100 thing that's
01:14:53.620 happened so
01:14:54.340 far, except
01:14:56.140 Hotep was
01:14:57.140 great.
01:14:58.860 The Hotep
01:14:59.340 Jesus interview
01:15:00.000 I thought was
01:15:00.560 terrific, and I'm
01:15:01.980 very appreciative
01:15:02.660 for that.
01:15:04.420 All right.
01:15:05.620 Did you learn
01:15:06.420 anything?
01:15:06.680 Did you learn
01:15:07.180 anything about
01:15:09.760 media communication?
01:15:11.200 Hope so.
01:15:15.840 And somebody
01:15:16.440 says no.
01:15:17.760 There's always
01:15:18.320 a no.
01:15:20.220 All right.
01:15:20.880 That's all for
01:15:21.300 now.
01:15:21.560 I'm going to go
01:15:21.960 talk to the
01:15:22.480 locals, people
01:15:23.040 privately.
01:15:23.980 Thanks for
01:15:24.440 joining everybody
01:15:25.320 on YouTube.
01:15:26.320 See you
01:15:26.660 tomorrow.
01:15:26.880 Let's go.