Real Coffee with Scott Adams - March 28, 2021


Episode 1327 Scott Adams: Paying Artists in SF, Kitler the Cat, Vaccination Passports, and Mostly Fun Tonight


Episode Stats

Length

53 minutes

Words per Minute

142.94724

Word Count

7,707

Sentence Count

580

Misogynist Sentences

10

Hate Speech Sentences

24
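
Note on the stats above: the words-per-minute figure appears to be the word count divided by the running time taken from the final transcript timestamp (00:53:54.840, about 53.91 minutes) rather than the rounded 53-minute length. A minimal Python sketch of that calculation, assuming this is how the stat was produced:

    def words_per_minute(word_count, last_timestamp):
        # Parse an "HH:MM:SS.mmm" timestamp into total minutes.
        hours, minutes, seconds = last_timestamp.split(":")
        total_minutes = int(hours) * 60 + int(minutes) + float(seconds) / 60
        return word_count / total_minutes

    print(words_per_minute(7707, "00:53:54.840"))  # ~142.95, matching the stat above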


Summary

Scott Adams opens with a flashback to the emergency food he stockpiled at the start of the pandemic lockdowns, then connects the Suez Canal blockage to a string of other worldwide problems. He also covers the on-air apology over Kitler the cat, vaccination passports, Sharyl Attkisson's search for Georgians who want to vote but cannot get an ID, San Francisco's guaranteed income for artists, his plan to re-record the audiobook of How to Fail at Almost Everything and Still Win Big, and Dr. Birx's comments on the pandemic response. Scott Adams is the creator of Dilbert and the host of the daily livestream "Coffee with Scott Adams."


Transcript

00:00:00.000 Hey everybody, I hope you're as prepared as I am for Coffee with Scott Adams, the best
00:00:10.420 part of the day, every single time, and sometimes, not every time, but sometimes it's even better
00:00:17.660 than that.
00:00:18.460 This might be one of those days.
00:00:20.680 Man, you'll be sorry if you miss it.
00:00:22.880 Oh, wow, will you be?
00:00:24.780 Well, happy Palm Sunday.
00:00:26.280 It doesn't mean what you think it means, so slow your roll there.
00:00:32.460 If you'd like to enjoy Palm Sunday to its ultimate potential, well, with your other hand, what
00:00:40.400 you want to do is grab a cup or mug or a glass, a tank or a chalice or a stein, a canteen, a
00:00:44.580 jug or a flask, a vessel of any kind, fill it with your favorite liquid.
00:00:48.180 I like coffee.
00:00:49.620 And join me now for the unparalleled pleasure, the dopamine hit of the day.
00:00:54.620 The thing that makes everything better.
00:00:57.320 You know it does.
00:00:58.540 Come on, you know it does.
00:01:00.220 And it happens right now.
00:01:01.220 It's called the simultaneous sip.
00:01:03.500 Join me now.
00:01:04.860 Go.
00:01:08.960 Ah.
00:01:11.660 Yeah, that's good.
00:01:14.060 So I'd like to give you a little flash from the past.
00:01:18.480 You ready for it?
00:01:19.320 I was looking in my closet the other day, upstairs, and I was just looking for something, and I
00:01:26.940 found this.
00:01:30.720 This gigantic emergency rice.
00:01:34.820 And I want to take you back to where your heads were one year ago, roughly one year
00:01:48.380 ago, when the lockdowns were just beginning, and everybody was buying their secret stores.
00:01:56.660 Not only did I buy secret stores of foodstuffs, but, and by the way, why do they call it foodstuffs?
00:02:07.000 How about just food?
00:02:08.680 But I actually hid it, so that when the roving band of armed, I'm sorry, so when the roving
00:02:18.680 band of bandits came to rob my house and kill me and steal my food, they wouldn't be able
00:02:25.060 to find it.
00:02:26.240 Now, do you remember how scared you were a year ago?
00:02:31.540 When all the toilet paper was gone, it looked like maybe the entire economy of the world
00:02:37.200 would, you know, plunge into darkness.
00:02:40.080 Do you remember just how scary that was?
00:02:44.140 I mean, just think about it.
00:02:45.700 Because, you know, we're all, we're all bitching about our vaccinations, right?
00:02:49.300 Like, oh, my arm's going to hurt, and, you know, maybe, maybe it's not good for me.
00:02:55.000 But compared to what it was a year ago, oh, my God.
00:03:02.480 I'm going to tell you a story that would be better if I could tell you the person, but
00:03:07.140 just for privacy, I won't.
00:03:09.000 So when the pandemic began, and I was telling people, you know, don't worry too much, it
00:03:14.440 won't be that bad, I got a call at home one day from somebody who is really, really good
00:03:22.200 at understanding the world and predicting what's going to happen next, and I've never
00:03:28.060 been so frightened in my life.
00:03:31.260 Now, when I say I've never been so frightened in my life, I mean that literally.
00:03:36.900 Last year, at a time when I was telling all of you, it'll be fine, you know, just prepare,
00:03:42.100 do everything you can, but don't, don't worry yourself to death, you know, we'll work through
00:03:46.960 it, et cetera.
00:03:48.060 And by the way, I was right, wouldn't you say?
00:03:51.400 Wouldn't you largely say that my characterization, that we would get through it, we wouldn't run
00:03:58.460 out of food, you know, we wouldn't die, it would just be really, really inconvenient and,
00:04:04.620 you know, bad economically for a lot of people, but we would get through it.
00:04:08.580 But I'll tell you, at the same time I was hearing that, I was hearing from somebody whose opinion
00:04:16.340 I really respect, that we were doomed, that we were in big trouble.
00:04:23.520 And I was trying not to tell you that, because I didn't buy it, right?
00:04:28.700 So first of all, you know, my optimism was legitimate, I thought we would be fine.
00:04:33.040 Fine in the sense that we would come out of it about the way we are, right?
00:04:38.240 That's not fine if half a million people are dead.
00:04:40.840 But I thought we would come out of it about the way we did.
00:04:44.660 I heard a story that was so scary that I've just never been that scared in my life.
00:04:52.600 It was just the most frightening thing I've ever heard.
00:04:55.300 And I was not going to tell you, I'll tell you, you are not going to hear that story.
00:05:03.640 Anyway, I'm glad it didn't go to the worst case scenario.
00:05:07.980 So, you know, as we're watching this story about the Suez Canal, and how that little choke
00:05:15.580 point in our economy will affect us all, I was starting to see a pattern here.
00:05:20.680 Here's an interesting pattern.
00:05:27.600 If you look at the Suez Canal thing, the story is that it's going to affect the world economy,
00:05:32.680 right?
00:05:32.900 We're all affected by it.
00:05:34.860 At the same time, we're in the middle of the pandemic, which is sort of a unique problem
00:05:40.400 in the sense that it's worldwide.
00:05:43.500 We're also talking about, you know, climate change, which is worldwide.
00:05:48.020 And we're talking about the rise of China and what that means to the world, which is,
00:05:54.380 again, a global problem.
00:05:56.480 Have we ever had this many global problems?
00:06:01.160 It's kind of new, isn't it?
00:06:03.060 I mean, we've had world wars, but they seem like special cases.
00:06:07.400 But now it seems that we're such a connected world that you can't do anything without affecting
00:06:13.140 everybody else.
00:06:13.960 So I feel as if it's like the simulation or God, if you prefer, I feel like there's a message
00:06:23.340 being sent to us and we're not getting it.
00:06:26.060 Do you feel that way?
00:06:28.400 It feels like, let's take the assumption that there's a God.
00:06:34.200 We'll just take that model of life for a moment.
00:06:36.980 It feels like God just keeps tapping us on the shoulder and saying, hey, hey, you can't
00:06:44.280 really just take care of yourself anymore because we're all connected.
00:06:49.140 Do you get it?
00:06:50.100 I'll send you a pandemic.
00:06:51.320 See if you get it.
00:06:52.600 Wait, wait, you're not getting it yet, right?
00:06:54.900 All right, let's try again.
00:06:56.320 I'm going to move this tanker sideways and watch what happens.
00:07:00.420 Entire world will be affected.
00:07:02.780 Tap, tap, tap.
00:07:03.320 You get it now, right?
00:07:04.580 Do you see that?
00:07:05.200 You don't get it yet?
00:07:07.540 Well, watch this.
00:07:09.500 And it feels like one thing after another is just trying to tell us the same message.
00:07:14.280 You know you're all on the same team.
00:07:16.600 You know that, right?
00:07:17.780 Team human being, same side.
00:07:22.400 But we're not getting it yet.
00:07:24.700 Here's my favorite story in the news.
00:07:26.980 It's about a cat named Kitler.
00:07:29.180 Now, poor little Kitler has some markings that look like a Hitler mustache, and so the owners
00:07:38.600 quite humorously named their Hitler-looking cat Kitler.
00:07:43.680 Now, that was fine.
00:07:45.740 Everything was fine with that because everybody knew that it was just a joke.
00:07:50.720 But there was a local Fox News weather person who made the terrible, terrible mistake of running a cute picture of Kitler on her show where I guess she was showing some cute animal stuff.
00:08:04.880 And she got a lot of pushback.
00:08:08.040 Turns out that a lot of people were offended by Kitler.
00:08:12.440 And so she apologized.
00:08:14.740 Now, I want to play her apology because remember I told you that the apologies for all the wokeness mistakes, the apologies are becoming hilarious.
00:08:25.460 Because the distinction between parody and the real world has just disappeared.
00:08:33.800 You can't tell the difference between a joke and somebody literally trying to be sincere.
00:08:39.360 You can't tell the difference.
00:08:41.440 Let's see if you can tell the difference.
00:08:44.060 Is she really apologizing or is this a joke?
00:08:48.960 I did make a mistake during our Caturday segment.
00:08:52.100 I used a submitted photo of a cat with an inappropriate name.
00:08:54.800 I don't want to use the name here.
00:08:56.220 But I never intended to hurt or offend anyone by using that picture that was actually just given to me.
00:09:02.020 I understand my mistake and I am deeply sorry.
00:09:04.880 And in the future, I will absolutely be more diligent with this content to ensure it never happens again.
00:09:10.880 15 degrees below zero.
00:09:13.240 All right.
00:09:14.440 So the punchline, the punchline is I'll be more diligent with this type of material to make sure it never happens again.
00:09:22.460 Because, you know, because, you know, because, you know, the big risk here, oh, we, you know, we can accept that it happened once.
00:09:31.380 But, man, you don't want this to happen again, do you?
00:09:36.680 Because how can we go on?
00:09:39.320 So I would like to join in in solidarity with this weather person who has now apologized.
00:09:48.800 And I would like to say, too, that I'm going to learn from this.
00:09:53.600 I'm going to learn from this.
00:09:55.080 Some of you'd never learn.
00:09:57.340 But I learned from this.
00:09:58.560 And my plans of showing a picture of a cat that looks like Hitler, I've changed them.
00:10:07.080 I was going to show a picture of a cat that looked like Hitler, but now I know that would be insensitive.
00:10:13.340 So I'm not going to do it.
00:10:14.860 I learned.
00:10:16.180 Maybe you don't.
00:10:18.160 Well, in New York, and I guess in other places, they're going to have a vaccination passport, it looks like.
00:10:23.420 So there'll be some kind of an app that you can show if you're going to certain kinds of events that says, I've been vaccinated.
00:10:31.340 What do you think of that?
00:10:33.300 I don't really understand the resistance to this.
00:10:40.480 Now, I understand there is a lot of resistance, and people really, really care about it.
00:10:44.960 And it looks like, I don't know, it looks like we're all going to be branded and tattooed and sent to the concentration camps or something.
00:10:51.180 Somebody is saying, is it a HIPAA violation?
00:10:55.340 Well, not if you do it voluntarily.
00:10:58.320 Or maybe you mean some other point.
00:11:00.120 I'm not sure.
00:11:01.600 But all of you who said, you know, I knew it was coming, are you really worried about this?
00:11:09.160 Of all the things that you have to worry about, this might be the smallest problem in the world.
00:11:15.440 Well, no.
00:11:16.120 The smallest problem in the world is Kitler.
00:11:18.800 Kitler is the smallest problem in the world.
00:11:21.180 This might be the second smallest.
00:11:24.300 Do you really think this is going to go wrong?
00:11:27.440 In what way?
00:11:29.260 What exactly is the argument that this is going to go wrong?
00:11:33.580 Somebody says it's a freedom violation.
00:11:36.420 Is it?
00:11:40.360 I'm seeing people are just going nuts in the comments.
00:11:43.040 Over on YouTube, they're swearing at me with the F word.
00:11:46.680 Even for just bringing up the topic.
00:11:48.260 My freedom.
00:11:50.580 My freedom.
00:11:52.420 You know, literally everything you do for your health affects your freedom.
00:11:58.000 Almost every law we make affects your freedom.
00:12:02.980 Everything you do to be polite affects your freedom.
00:12:06.740 We don't do anything that doesn't affect your freedom.
00:12:09.000 All of our choices are trade-offs of, well, I'll give up a little freedom to get a little bit more of this.
00:12:16.760 Let me ask you this.
00:12:17.820 If you had two choices, there are no mass events, or if you would like to attend, optionally.
00:12:27.660 Nobody's forcing you to do it.
00:12:29.040 You don't have to go to the event.
00:12:30.640 But if you want to attend, you'll show a little thing that says you're safe.
00:12:35.200 Is that a big problem?
00:12:37.940 I mean, seriously.
00:12:38.780 I get the point that maybe something could get out of control or whatever.
00:12:43.840 But if you were going to rank this on your top hundred problems, is this in the top hundred?
00:12:50.800 Because I can't even generate a little bit of caring about it.
00:12:55.320 Like, I'm not even close to being mad at it.
00:12:58.900 I can't even care about it.
00:13:00.480 It doesn't even seem like an issue.
00:13:02.460 Am I missing this completely?
00:13:03.960 Nobody's going to lose their freedom.
00:13:07.440 You're gaining freedom.
00:13:09.620 Because if you don't have this thing, you'll never be able to have the event.
00:13:14.520 The event won't happen.
00:13:16.880 So doesn't that give you more freedom?
00:13:19.500 Because now you can go to a thing that you couldn't go to before.
00:13:23.240 Okay, now maybe you would say, but how about we just have the big events and everybody can go.
00:13:28.680 We'll get there.
00:13:30.300 Do you think this is permanent?
00:13:32.680 I don't think it's permanent.
00:13:34.540 I don't worry about anything that's temporary.
00:13:37.420 Let me give you a philosophy that might help you a little bit.
00:13:42.960 Don't worry about anything that's temporary.
00:13:45.800 That's it.
00:13:47.280 That's some of the best advice you'll ever hear in your life.
00:13:50.940 Don't worry about anything that's temporary.
00:13:54.980 Worry about the big stuff.
00:13:57.680 Now, I hear it's a slippery slope and they're boiling a frog and they're getting us used to all this control
00:14:03.320 and all that, but do you think they don't already know where you are on your phone?
00:14:08.180 I mean, they're already tracking you on your phone.
00:14:10.780 They can listen to you if they want.
00:14:13.480 Your privacy is largely gone.
00:14:15.220 And so he says it's segregation.
00:14:21.640 Here's the thing.
00:14:23.040 All of your complaints about this are just sort of like weird conceptual complaints.
00:14:28.300 I just don't think any of it's real.
00:14:30.980 I think this is your smallest problem.
00:14:32.760 But should I be wrong about that, I know you'll let me know in the future.
00:14:42.420 So Saturday Night Live is starting to, let's say, gently mock the Biden administration.
00:14:48.760 Michael Che had a really good one on, he did the weekend update.
00:14:55.480 And as he was reading the fake news, he said Biden was asked if he plans to run for re-election in 2024,
00:15:01.060 which is probably the nicest way to ask if he plans to be alive in three years.
00:15:06.360 Now, what's great about this joke is that's exactly what I was thinking when I heard the question.
00:15:11.060 I thought, are you really asking if you think you're going to be alive in three years?
00:15:16.300 And, of course, Biden had to say, you know, his presumption was that he would run for re-election.
00:15:23.360 But there's no way he thinks that.
00:15:25.700 There isn't the slightest chance that Biden thinks he's running for re-election.
00:15:30.920 Do you think that?
00:15:32.240 There's no chance of that.
00:15:36.500 Here's a story that made me laugh.
00:15:40.200 And let's see if you laugh for the same reason.
00:15:42.160 So, journalist Sharyl Attkisson, you probably are all familiar with her.
00:15:49.180 She tweeted that she's looking for some story assistance.
00:15:53.360 And she tweets, for a story, I'm looking for a fairly large group of Georgians who want to vote but cannot get an ID.
00:16:01.840 Please DM me if you have any leads.
00:16:05.340 Now, what do you think is going to...
00:16:07.160 Now, here's what made me laugh.
00:16:10.040 We haven't done that yet?
00:16:12.460 One of the biggest stories in the country is the Georgia election law changes.
00:16:18.340 And the biggest part of that is requiring identification.
00:16:22.400 Because if you didn't have it, you wouldn't be able to vote, at least for mail-in stuff.
00:16:26.560 And Sharyl Attkisson is the first person to say, maybe we could find one of these people.
00:16:37.640 These alleged people who have no ID but want to vote.
00:16:42.020 Do they exist?
00:16:44.840 Here's my prediction.
00:16:47.040 I'll be surprised if you can find three in the whole state.
00:16:52.860 I'll be surprised.
00:16:54.000 Now, that doesn't mean there are fewer than three.
00:16:57.380 I'm saying that there are probably so few that if you could find three, it would be a little bit of a miracle.
00:17:04.680 Now, there are two problems here.
00:17:05.940 One is, I don't know that any exist.
00:17:09.800 If we get one, we'll find out.
00:17:12.480 But how hard is it to find someone who doesn't have an ID
00:17:16.980 and also meets the second criterion, that they wanted to vote but they couldn't?
00:17:21.820 How do you even find people who don't have an ID?
00:17:25.880 Like, are they on Twitter?
00:17:28.040 If they are, how?
00:17:29.840 I mean, I guess you could be somehow.
00:17:32.520 But, yeah.
00:17:34.800 And so here's what made me laugh.
00:17:36.640 Why is Sharyl Attkisson the first person to ask this question?
00:17:42.200 Hey, can you help me find some of these people?
00:17:44.820 The entire, the biggest story in the country is based on the assumption that there are lots of them.
00:17:52.900 And nobody asked to find one until Sharyl Attkisson asked.
00:17:56.940 Hey, I don't know where they are.
00:18:00.240 Never seen one.
00:18:01.820 Never met one.
00:18:03.860 Does anybody have one?
00:18:05.800 Can anybody find one?
00:18:09.360 What's it tell you that only one journalist even asked the question?
00:18:16.720 What's that tell you?
00:18:17.840 It tells you there's no interest in knowing if it's even real.
00:18:22.700 Except for Sharyl Attkisson.
00:18:25.100 And I, of course, will be very interested in reading what she comes up with on this.
00:18:29.320 But it just amazes me that nobody did this.
00:18:37.120 And let me give you another example of this.
00:18:39.280 Do you remember when after the Charlottesville fine people hoax thing happened?
00:18:43.820 Do you remember all the press who went and talked to the people who attended to find out for sure,
00:18:51.120 just by talking to the attendees, whether there were people that could be described as fine people?
00:18:57.940 Wouldn't you want to know that?
00:18:59.960 Don't you think that the news organizations should have done exactly what Sharyl Attkisson did
00:19:04.520 and say, hey, does anybody know anybody who meets that description?
00:19:09.540 Because the entire story is about whether they exist or not.
00:19:13.820 And nobody checked to see if they exist.
00:19:18.560 Just think about that.
00:19:19.840 Just hold that thought for a moment.
00:19:21.940 It was a gigantic story and still is.
00:19:25.640 And the entire thing boiled down to were there people there
00:19:28.760 who a reasonable person would say, no, you're not a racist.
00:19:32.360 You had other reasons to be there.
00:19:35.380 As far as I know, only one person in the world did that.
00:19:41.120 Me.
00:19:42.140 I did that.
00:19:43.180 I tweeted, hey, was anybody there who would describe themselves as a fine person or something like that?
00:19:49.500 And people contacted me.
00:19:51.320 And I interviewed them and found out that there were indeed people who, in my opinion,
00:19:56.660 did a good job of explaining that they don't have racial biases in this particular way
00:20:02.160 and didn't like the racists who were there or did not agree with them at all, disavowed them completely.
00:20:11.900 I'm the only one who checked.
00:20:13.980 I'm the only one who checked.
00:20:16.240 In the biggest story of the year.
00:20:17.460 That's not, I'm not making that up.
00:20:20.700 I'm the only one who checked.
00:20:22.960 Right?
00:20:23.240 And now Sharyl Attkisson is literally the only person in the world asking the biggest question about the biggest story.
00:20:32.140 Are there any?
00:20:34.140 Have you ever met one?
00:20:35.360 All right.
00:20:39.180 So it made me laugh.
00:20:41.240 Speaking of laughs, so San Francisco has decided to start paying $1,000 a month guaranteed income
00:20:49.700 to artists in San Francisco, and especially those whose artistic practice is rooted in a historically marginalized community.
00:21:04.360 So if you're not white, it looks like they really want to give you some money.
00:21:11.140 And this works out for me.
00:21:12.600 A lot of you are offended by that.
00:21:14.000 But since I started identifying as black, I like this idea now.
00:21:19.220 Because I'm an artist, kind of.
00:21:23.020 I'm sort of an artist.
00:21:24.260 And also, now that I've identified with, I've self-identified as a member of a historically marginalized community,
00:21:33.280 I think my people need this money.
00:21:37.100 So yes, I think San Francisco should start giving money to my people,
00:21:41.800 people, the artists from the historically marginalized communities.
00:21:46.300 And doesn't it make you wonder if they're still trying?
00:21:56.660 Are they still trying, the government of San Francisco?
00:22:00.200 Or is it now just sort of like performance art?
00:22:03.040 They're trying to figure out what's the most ridiculous thing you'll go along with.
00:22:07.540 Watch this.
00:22:08.640 Watch this.
00:22:09.160 We're going to tell people that our artists are essential and see if we can just like give them money.
00:22:18.460 And somebody would say, they're not going to buy that.
00:22:22.160 Nobody's going to think artists are essential in a pandemic and that you should give them money because they're essential.
00:22:29.280 Nobody's going to believe that.
00:22:31.660 Watch this.
00:22:33.060 Watch this.
00:22:34.360 I'm going to sell this.
00:22:35.320 The difference between a government doing what a government should do and a practical joke has completely evaporated.
00:22:48.260 The actual government decisions, you can't tell anymore if they're practical jokes.
00:22:54.760 Now, there should be some version of, let's say, the Turing test for government.
00:23:03.220 Now, if you're familiar with the Turing test, T-U-R-I-N-G, named after Turing.
00:23:09.480 And it was the test to see if your artificial intelligence could fool somebody into thinking that it's a real human being.
00:23:19.280 And the test, at least conceptually, is that behind a curtain, there's somebody talking.
00:23:25.480 You don't know if it's a computer or a person or they're typing either way.
00:23:29.960 And you ask them questions, you have a conversation, and if you can't tell, yeah, it's Alan Turing, thank you for the first name.
00:23:37.460 If you can't tell that it's a computer on the other side, you think it might be human, then you've passed the Turing test.
00:23:44.240 But if you can tell, oh, that's obviously a computer, then you failed.
00:23:48.520 But I feel like there's some kind of Turing test going on with our government right now, where you hear a new policy out of San Francisco, and you say to yourself,
00:24:00.940 I'm not so sure.
00:24:05.580 Was that a real one, or was that a prank?
00:24:10.340 I can't tell.
00:24:12.160 And if you fool me, you've passed the equivalent of the Turing test.
00:24:16.900 I'm going to call it the Adams test.
00:24:20.140 Or should I call it the Dilbert test?
00:24:23.320 I think Dilbert is more famous.
00:24:25.080 So let's do the Dilbert test.
00:24:27.520 So the Dilbert test is an updated version of the Turing test.
00:24:32.720 And the Dilbert test, I guess you could apply this to a company as well as a government.
00:24:40.440 Yeah, actually, let's do that.
00:24:41.720 Since it's a Dilbert test, let's apply it to corporations or organizations of any kind, government or not.
00:24:49.180 If you can't tell if it's a joke or real, you've passed the Dilbert test.
00:24:58.880 All right.
00:25:04.180 Next week, I'm going in to re-record my audiobook of How to Fail at Almost Everything and Still Win Big.
00:25:10.380 If you don't know the backstory of that, back in 2006, I believe, I lost my ability to speak.
00:25:18.540 So I couldn't talk.
00:25:20.760 My vocal cords would clench when I tried to make certain sounds.
00:25:24.760 So I could make noise, but it would sound like, you know, basically people couldn't understand sentences.
00:25:31.500 And I had just finished my book, How to Fail at Almost Everything and Still Win Big.
00:25:35.680 And although it was years after that, after surgery and years of recovery of my voice,
00:25:42.360 I hired a voice artist to do that book for me because I couldn't get through a whole book.
00:25:50.320 My voice wouldn't have lasted.
00:25:53.000 But if there's one thing you know about me by now is that I am one stubborn motherfucker.
00:26:01.960 And I fucking hate to lose.
00:26:04.780 And I don't know, I will wait forever to get revenge.
00:26:11.740 I will wait forever to fix something that wasn't right.
00:26:16.900 Like, I will chew through a fucking concrete wall to get something done that just needed to get done.
00:26:23.900 If you don't know anything else about me, you should know that.
00:26:27.760 And I have never been more bugged by the fact that I couldn't read my own book.
00:26:34.780 But, as you've noticed, I have now spent at least one hour a day, every day, for several years now,
00:26:46.780 talking in public.
00:26:49.460 Until I can do this.
00:26:52.820 Finally.
00:26:54.140 I can finally do this.
00:26:56.320 It took me years.
00:26:59.300 But I don't like to give up.
00:27:00.960 And so, tomorrow I'm going in the studio and I will re-record and then we'll re-release the book,
00:27:07.300 How to Fail at Almost Everything and Still Win Big.
00:27:11.120 Because, if you write a book called How to Fail at Almost Everything and Still Win Big,
00:27:17.020 and you give up before you record it properly,
00:27:20.460 well, you have not lived up to your own book.
00:27:23.140 So, I'm going to get this done.
00:27:25.600 It's not going to be easy.
00:27:26.940 Because the recording is always unpleasant.
00:27:29.400 It takes days.
00:27:30.780 But I have rebuilt my voice to the point where not only can I re-record this book,
00:27:37.120 but I think I'll be happy with how it will come out.
00:27:42.920 So, that's just a little personal update for you.
00:27:45.220 Clarification.
00:27:48.220 I thought that Vice President Kamala Harris would be assigned to the border control situation,
00:27:56.000 but apparently she's being assigned just to the international affairs part,
00:28:00.800 where she'll deal with the Central American countries
00:28:03.360 to see if we can make them more attractive for staying there.
00:28:08.600 And that tells us a little bit more about the future.
00:28:12.880 If they had put Kamala Harris in charge of border security,
00:28:18.660 do you remember what I said about that?
00:28:20.500 It's like, uh, are they trying to get rid of her?
00:28:23.900 Because that's like a suicide mission.
00:28:26.460 Nobody's ever going to succeed with the actual border itself.
00:28:31.760 You can't succeed.
00:28:33.440 Half the country is just going to hate your guts no matter what.
00:28:36.180 Maybe three quarters.
00:28:37.040 But you probably could succeed doing something that would maybe help the Central American countries,
00:28:44.940 you know, do a little better job of being an attractive place to stay.
00:28:48.980 And that's more of a presidential kind of a job, isn't it?
00:28:52.900 So, instead of giving her this suicide mission,
00:28:56.460 she's really going to have a very high-level job.
00:28:59.420 And if she were to succeed in that at all,
00:29:02.000 it would put her in a good position for running for president.
00:29:05.040 So, I would say at this point, that is helping her future.
00:29:10.280 So, Christina and I went out to eat last night
00:29:13.220 in my San Francisco East Bay neighborhood.
00:29:18.460 And there was the most fun thing.
00:29:22.120 You know what?
00:29:26.020 We're all looking for these little signs of life,
00:29:28.240 you know, that the world is getting back to normal
00:29:30.420 and the economy is going back to where it was.
00:29:32.900 And let me tell you what I saw when we went out to dinner.
00:29:37.740 I've never seen so many senior citizens out to dinner, right?
00:29:43.520 I'm not counting myself necessarily in this group.
00:29:47.300 I'm on the bubble.
00:29:48.280 But I'm talking about 70 and older.
00:29:52.840 The restaurants are crushed with vaccinated seniors.
00:30:01.880 The restaurants are crushed.
00:30:04.780 They're crushed.
00:30:06.080 Couldn't get in.
00:30:07.800 I mean, we did get in.
00:30:08.920 But finally, you know, we found a place.
00:30:11.200 But you go there, and it's over 70, over 70, over 70.
00:30:15.480 And, oh, my God, it's just great to see.
00:30:18.280 It is great to see.
00:30:20.980 I have to admit it.
00:30:22.840 It affected me.
00:30:25.900 So, and we ate indoors for the first time.
00:30:30.860 Well, the first time in a while.
00:30:32.900 Ate indoors.
00:30:34.580 But the funny thing is,
00:30:35.920 it's like the night of the living vaccinated.
00:30:39.760 Because, you know, obviously, you go out to dinner.
00:30:42.140 There'll always be, you know,
00:30:43.120 a good dose of senior citizens anywhere you go out to dinner.
00:30:46.440 But not like this.
00:30:50.300 You go out to dinner,
00:30:51.660 and as soon as you get into the restaurant areas,
00:30:55.080 it's like night of the living vaccinated.
00:30:57.260 You know, I got my second shot.
00:31:00.900 Give me restaurant food.
00:31:03.720 Let me in.
00:31:05.320 I got two shots.
00:31:06.420 So, I think I'm eligible.
00:31:10.220 I believe my eligibility kicks in this week.
00:31:14.380 So, sometime this week, I might get a first shot
00:31:17.160 if I can find access to it somewhere.
00:31:20.180 Dr. Birx is saying that
00:31:24.520 we handled things wrong in this country,
00:31:28.700 and if we had handled it more like, let's say, Germany,
00:31:31.960 all the deaths over the first 100,000
00:31:34.620 might have been mitigated or reduced.
00:31:38.320 To which a doctor, I don't know his name,
00:31:41.120 who was on TV, said,
00:31:44.400 yeah, that's great, Dr. Birx.
00:31:45.960 It was your job to make sure that didn't happen.
00:31:49.920 Is that fair?
00:31:52.040 So, Birx is saying we handled it wrong,
00:31:54.480 and all these people died,
00:31:56.140 and then somebody else is saying,
00:31:58.080 that's on you.
00:32:00.380 That's on you.
00:32:02.140 If you didn't talk the government
00:32:04.320 into doing what you knew was right,
00:32:06.840 why are you complaining?
00:32:08.620 Because your job is not just to tell them what to do.
00:32:11.860 Your job is to talk them into it.
00:32:13.500 It's not just your job to know what they should have done.
00:32:18.660 Your job was to make them do it.
00:32:22.980 And let me be as clear as possible.
00:32:25.680 If you couldn't make them do it,
00:32:28.280 your job was to fucking quit.
00:32:32.160 Really publicly.
00:32:34.320 Right?
00:32:34.900 Let's just be as clear about this as we can.
00:32:37.740 Her job was to make this happen,
00:32:40.460 since she says she knows this is what we should have done,
00:32:43.500 her job was to make it happen,
00:32:45.580 or fucking quit.
00:32:47.540 Right in front of us.
00:32:49.160 So that we know who the problem is.
00:32:52.240 If the problem is you couldn't talk some idiot
00:32:54.400 into doing what you want,
00:32:56.360 let us know.
00:32:57.880 That's your fucking job.
00:33:00.500 Your job is not to complain about it
00:33:02.820 after you're out.
00:33:04.660 That was not your job.
00:33:06.880 You did not do your job, lady.
00:33:10.200 Do it right or quit,
00:33:12.340 so that we know what the problem was.
00:33:15.120 Who the problem was.
00:33:18.180 All right.
00:33:18.580 So I issued a challenge on Twitter
00:33:25.520 to see if somebody could win Sidney Powell's case,
00:33:30.580 the lawsuit in which Dominion is suing Sidney Powell
00:33:33.200 for libel, I guess,
00:33:35.100 defamation or libel or whatever the proper legal word is.
00:33:39.060 And I said,
00:33:40.100 try to win her case in one sentence.
00:33:42.740 You know,
00:33:43.000 just the first sentence.
00:33:44.060 Try to win the whole case.
00:33:45.580 And I gave you my take on this.
00:33:49.000 So here's a sentence
00:33:49.920 that I believe Sidney Powell's lawyer
00:33:53.660 could say one thing
00:33:56.020 and then just sit down
00:33:57.140 and just be done.
00:33:59.560 And we'll see if you agree
00:34:00.500 that this is one thing
00:34:01.420 that would win the case.
00:34:04.400 And it goes like this.
00:34:05.600 So imagine the Sidney Powell defense attorney
00:34:07.860 looking at the jury,
00:34:09.960 I guess it's a jury,
00:34:11.540 and saying,
00:34:12.500 quote,
00:34:13.800 or potential quote,
00:34:15.580 I'm a lawyer advocating for my client,
00:34:18.040 Ms. Powell,
00:34:19.060 and I'm wondering,
00:34:20.580 how many of you think
00:34:21.660 everything I say about Dominion today
00:34:24.060 should be assumed to be a fact?
00:34:28.580 And then you just sit down
00:34:29.840 and say,
00:34:30.960 the defense rests.
00:34:33.780 There's nothing else you'd have to say.
00:34:36.320 Now, of course they would.
00:34:38.480 But think about this point.
00:34:40.920 Do you think that anybody in the jury
00:34:43.500 would look at the defense attorney
00:34:45.780 and say,
00:34:46.740 oh yeah,
00:34:47.120 it is my plan.
00:34:49.460 I plan to believe everything you tell me.
00:34:53.420 Nobody.
00:34:54.780 Nobody would say that.
00:34:56.140 Because everybody in the jury understands,
00:34:58.620 because they're adults
00:34:59.500 and they were at least capable enough
00:35:01.420 to be chosen for jury duty.
00:35:03.420 So you have to show a little bit
00:35:05.280 of human capability
00:35:07.040 or you're not on the jury.
00:35:09.300 Every person on the jury
00:35:10.660 would listen to that and say,
00:35:12.340 well no,
00:35:12.700 I'm not going to believe it
00:35:13.440 just because you say it.
00:35:15.080 You're a lawyer.
00:35:16.720 You're advocating for a client.
00:35:18.880 In this context,
00:35:20.280 we only believe evidence.
00:35:22.700 If you show me evidence,
00:35:24.700 I may or may not believe the evidence,
00:35:27.460 but I'm sure as hell
00:35:28.500 not going to believe it
00:35:29.320 because it came out of your stupid mouth.
00:35:31.680 Your job is to persuade me.
00:35:33.920 Your job is not to be true.
00:35:37.200 Your job is not to tell me the truth
00:35:39.060 if you're a defense attorney.
00:35:40.560 Your job is to do the best you can
00:35:42.700 to get your client free.
00:35:45.440 That's it.
00:35:46.580 Now, let me tell you,
00:35:47.980 how many adults don't understand that?
00:35:51.320 None.
00:35:52.500 Every single adult
00:35:53.880 understands that
00:35:55.740 an attorney advocating for a client
00:35:58.420 is going to say things
00:36:00.300 that you need to check.
00:36:01.320 Everybody knows
00:36:04.300 if you can't check it,
00:36:07.140 you shouldn't think
00:36:08.100 it's necessarily true.
00:36:12.140 The weird part
00:36:14.140 is that the defense attorney
00:36:15.680 would be in exactly the same position
00:36:17.460 as Powell herself was
00:36:19.400 because she was a defense attorney
00:36:20.960 or maybe more of an offense
00:36:22.880 in this case,
00:36:23.960 but the analogy is perfect.
00:36:27.440 It's not even an analogy.
00:36:28.640 It's basically the same thing.
00:36:31.320 And this is an example
00:36:32.700 if you're following
00:36:33.640 my persuasion lessons
00:36:34.860 of what I call
00:36:35.580 the high ground maneuver.
00:36:37.720 Do you know how often
00:36:38.920 the high ground maneuver,
00:36:40.360 which is a persuasion trick
00:36:41.740 I'll explain a little bit more
00:36:42.920 in a moment,
00:36:43.780 do you know how often
00:36:44.580 the high ground maneuver works?
00:36:47.700 Every time.
00:36:48.980 It's probably the only
00:36:50.300 persuasion method
00:36:51.320 that works every time.
00:36:54.220 Because there's a,
00:36:55.900 and there's a reason for this.
00:36:57.020 People don't care
00:36:59.260 about facts, right?
00:37:01.220 You understand that, right?
00:37:02.620 So if I'm trying
00:37:03.340 to persuade you
00:37:04.180 with my better reasons
00:37:05.160 and my facts,
00:37:05.960 we observe that people
00:37:06.920 just harden their opposition.
00:37:09.540 The facts don't persuade anybody.
00:37:12.140 But everybody
00:37:13.980 wants to avoid
00:37:16.220 looking foolish.
00:37:18.400 It's universal.
00:37:20.260 So if you're in a meeting
00:37:21.320 or in your business situation,
00:37:23.080 and there's something
00:37:24.320 that would make you
00:37:25.100 look foolish,
00:37:26.440 you're going to change
00:37:27.380 your mind immediately.
00:37:29.100 Because you don't want
00:37:30.220 to look foolish.
00:37:31.800 So regardless of what you think
00:37:33.280 about the truth of the world,
00:37:34.960 you're going to protect
00:37:35.880 yourself first.
00:37:37.500 So the high ground maneuver
00:37:39.120 creates a situation
00:37:41.280 where the person
00:37:42.060 you're trying to persuade
00:37:43.120 has to protect themselves.
00:37:46.520 This is the key.
00:37:48.080 They're protecting themselves
00:37:49.560 as opposed to
00:37:51.660 protecting their argument.
00:37:54.000 If you give somebody
00:37:55.080 a choice of
00:37:55.920 I'm going to let you
00:37:56.560 protect your argument
00:37:57.600 or I'm going to let you
00:37:59.300 protect yourself,
00:38:01.060 they'll choose themselves
00:38:02.500 every time.
00:38:05.060 That's why it works
00:38:06.520 every time.
00:38:07.400 Now you don't always
00:38:08.180 have the opportunity
00:38:09.060 to use it, right?
00:38:10.580 So not every situation
00:38:11.980 will lend itself
00:38:13.000 to a high ground technique.
00:38:15.840 It has to be there
00:38:17.100 and you have to recognize it.
00:38:18.820 So it has to naturally
00:38:20.580 sort of be in the situation.
00:38:22.780 And the high ground
00:38:24.360 in this case
00:38:25.120 is that you adults
00:38:27.740 in the jury
00:38:28.400 are certainly smarter
00:38:30.020 than the public
00:38:30.780 in general, right?
00:38:32.080 You're not so gullible
00:38:33.500 that you would believe
00:38:35.100 that somebody
00:38:35.640 whose job
00:38:36.460 is to persuade you
00:38:37.840 is also always
00:38:40.140 telling you the truth.
00:38:41.480 You're well above that,
00:38:42.680 are you not?
00:38:43.560 Because the person
00:38:45.400 sitting next to you
00:38:46.220 is above it
00:38:46.860 and the person
00:38:48.640 behind you,
00:38:49.860 they're above that.
00:38:51.820 They know how lawyers work.
00:38:54.540 They know the difference
00:38:55.700 between an honest person
00:38:57.440 and an advocate.
00:38:59.260 But you don't
00:39:00.340 because, you know,
00:39:02.100 all the other people
00:39:03.240 seem to understand this.
00:39:06.000 Right?
00:39:07.220 So as soon as you
00:39:08.400 create the situation
00:39:09.320 where if you stay
00:39:11.160 with your opinion,
00:39:11.960 you will look like
00:39:14.340 the only person
00:39:15.140 who doesn't understand
00:39:16.240 something that
00:39:16.860 everyone understands.
00:39:18.700 Because everyone
00:39:19.600 understands lawyers
00:39:20.540 or advocates are
00:39:21.260 not a source
00:39:23.020 of truth.
00:39:24.540 Everyone understands that.
00:39:26.080 The moment
00:39:26.880 you've painted
00:39:28.020 that picture,
00:39:30.180 everybody goes
00:39:31.300 to the high ground
00:39:32.020 with you
00:39:32.520 because they can't
00:39:33.520 stay here.
00:39:34.580 They look like
00:39:35.380 idiots.
00:39:37.060 Right?
00:39:37.560 So you're not making
00:39:38.520 them defend
00:39:38.960 their argument.
00:39:41.180 You're making them
00:39:41.980 defend themselves.
00:39:43.860 Do you want to sit
00:39:44.840 here being the only idiot
00:39:45.960 who doesn't understand
00:39:46.940 that lawyers
00:39:47.480 are not supposed
00:39:48.320 to tell the truth?
00:39:49.740 Well, I said that wrong.
00:39:51.000 They're supposed
00:39:51.400 to tell the truth.
00:39:52.620 But nobody expects it.
00:39:54.740 No reasonable person
00:39:56.340 expects it.
00:39:58.140 That's it.
00:39:59.240 That's the end
00:39:59.920 of the trial.
00:40:02.500 All right.
00:40:04.380 I'm oversimplifying,
00:40:05.720 of course.
00:40:06.280 Trials are more
00:40:06.940 complicated than that.
00:40:07.780 But if you wanted
00:40:08.720 to understand
00:40:09.260 the high ground
00:40:09.940 maneuver,
00:40:10.480 this is just
00:40:10.900 the best example.
00:40:12.240 You couldn't get
00:40:12.980 a better example
00:40:13.620 than this.
00:40:17.160 Twitter user
00:40:18.060 John Katz,
00:40:19.640 K-A-T-Z.
00:40:20.640 I believe he has
00:40:21.180 a podcast.
00:40:22.720 So he's
00:40:23.660 somebody in the public.
00:40:25.520 He had a tweet
00:40:26.420 that I just love.
00:40:27.800 This is just
00:40:28.240 the greatest tweet.
00:40:29.480 And when I read
00:40:30.180 this to you,
00:40:31.260 you're going to say,
00:40:32.080 why did it take
00:40:32.740 somebody so long
00:40:33.600 to say this?
00:40:35.460 All right.
00:40:36.000 Here's his tweet.
00:40:36.920 Quote,
00:40:37.780 you think
00:40:38.920 your AR-15
00:40:39.880 will work
00:40:40.560 against a government
00:40:41.420 with tanks?
00:40:42.700 So he's mocking
00:40:44.120 the people
00:40:44.920 on the left
00:40:45.480 who have two opinions
00:40:46.540 that don't seem
00:40:47.860 to fit together.
00:40:48.900 All right.
00:40:49.460 So here are
00:40:49.980 their two opinions.
00:40:51.540 That they think
00:40:52.120 an AR-15
00:40:52.960 will work...
00:40:54.560 I'm sorry.
00:40:56.300 Let me stop
00:40:56.780 screwing this up.
00:40:57.900 He's mocking
00:40:58.600 the people
00:40:59.200 who are saying
00:41:01.360 that having
00:41:02.600 AR-15s
00:41:03.600 will be no use
00:41:04.740 against a government
00:41:05.620 with tanks.
00:41:06.280 All right.
00:41:06.980 So one of the
00:41:07.460 arguments for
00:41:08.000 owning guns
00:41:08.920 is that it
00:41:09.580 protects you
00:41:10.120 in case your
00:41:10.780 government
00:41:11.120 turns against you.
00:41:12.620 And the Democrats
00:41:13.580 say,
00:41:14.560 you fool.
00:41:16.000 You fool.
00:41:17.140 Are you telling me
00:41:18.420 that you think
00:41:20.420 that a bunch
00:41:21.900 of hillbillies
00:41:22.700 with their AR-15s
00:41:24.420 are going to,
00:41:25.880 like,
00:41:26.680 protect against
00:41:27.340 the government
00:41:27.940 with tanks?
00:41:29.740 Seriously?
00:41:30.480 You think that?
00:41:31.120 Now,
00:41:32.740 the same people
00:41:33.340 who are asking
00:41:33.960 you that,
00:41:34.820 as John Katz
00:41:35.820 points out,
00:41:37.240 are also telling
00:41:38.140 you that
00:41:38.660 these same
00:41:39.340 hillbillies,
00:41:40.040 I'm just using
00:41:40.740 hillbillies to be
00:41:41.560 provocative,
00:41:42.820 almost took over
00:41:44.000 the government
00:41:44.480 with bear spray.
00:41:47.660 Both of those
00:41:48.600 messages
00:41:49.060 are out there
00:41:50.740 at the same time.
00:41:52.440 Don't be ridiculous.
00:41:54.120 You're not going
00:41:54.580 to be able
00:41:55.220 to defend the public
00:41:56.300 with just your
00:41:56.960 little rifles
00:41:57.860 against tanks
00:41:59.320 and nukes.
00:42:01.120 At the same time,
00:42:02.800 you know,
00:42:03.220 if you've got
00:42:03.620 some bear spray,
00:42:05.120 you could pretty
00:42:06.200 much take out
00:42:06.860 the government.
00:42:08.020 And it was
00:42:08.500 this close.
00:42:10.260 The bear spray
00:42:10.940 people
00:42:11.340 almost got it
00:42:13.180 done.
00:42:14.160 Almost got it
00:42:15.000 done.
00:42:15.920 And both of
00:42:16.840 those messages
00:42:17.500 are out there
00:42:18.140 completely.
00:42:21.760 Seriously.
00:42:23.340 Like,
00:42:23.680 nobody's even
00:42:24.280 embarrassed by that.
00:42:26.380 Nobody's embarrassed
00:42:27.320 that they hold
00:42:27.960 both of those
00:42:28.460 positions.
00:42:29.840 And a lot of
00:42:30.360 people do.
00:42:30.740 Now,
00:42:31.320 usually,
00:42:32.160 I saw this
00:42:33.180 comment on
00:42:34.160 Twitter as well,
00:42:35.520 that something
00:42:36.200 like 90%
00:42:37.000 of Twitter
00:42:37.520 arguments is
00:42:38.440 somebody imagining
00:42:39.280 somebody who
00:42:39.860 doesn't exist
00:42:40.520 and being mad
00:42:41.180 at them.
00:42:42.140 You know,
00:42:42.380 the person who
00:42:42.900 holds this
00:42:43.320 opinion,
00:42:44.200 but also,
00:42:45.180 you know,
00:42:46.340 hypocritically
00:42:46.980 holds this
00:42:47.500 other opinion.
00:42:48.880 And usually,
00:42:49.540 that person
00:42:49.880 doesn't exist.
00:42:51.800 It'd be hard
00:42:52.300 to find somebody
00:42:52.940 who actually
00:42:53.620 held those
00:42:54.180 literally two
00:42:55.180 opposing opinions.
00:42:56.420 But this is
00:42:57.100 one where this
00:42:58.600 is real.
00:42:59.040 I'm pretty
00:43:00.260 sure that
00:43:02.060 this describes
00:43:02.920 almost the
00:43:03.580 entire Democratic
00:43:04.380 Party.
00:43:05.560 I believe that
00:43:06.340 most of
00:43:07.740 Democrats would
00:43:08.560 agree with
00:43:09.080 both of these
00:43:09.800 sentences that
00:43:11.880 conflict.
00:43:13.300 And I don't
00:43:13.960 think that they
00:43:14.440 have any
00:43:14.820 problem with
00:43:15.400 it.
00:43:15.680 Like,
00:43:15.900 they're not,
00:43:16.600 there's no
00:43:17.100 interior conflict
00:43:18.740 with thinking
00:43:20.340 that both of
00:43:21.420 these could be
00:43:21.900 true.
00:43:22.120 And of
00:43:24.080 course,
00:43:24.380 Trump caused
00:43:24.980 trouble by
00:43:25.860 referring to
00:43:26.560 the Capitol
00:43:27.400 riots as
00:43:28.140 having,
00:43:28.540 you know,
00:43:28.900 basically zero
00:43:30.180 danger.
00:43:34.320 What he
00:43:34.980 meant, of
00:43:35.420 course, was
00:43:35.840 as an
00:43:36.300 overthrow to
00:43:36.980 the government.
00:43:38.520 Now,
00:43:38.920 or at least
00:43:39.540 I hope.
00:43:40.020 I hope he
00:43:40.580 meant there
00:43:41.180 was zero
00:43:41.560 danger in
00:43:42.340 terms of an
00:43:42.820 overthrow to
00:43:43.360 the government.
00:43:44.300 Obviously,
00:43:44.820 there was
00:43:45.060 physical danger
00:43:46.020 and plenty
00:43:46.940 of it.
00:43:47.360 90% of
00:43:53.040 baseball is
00:43:53.680 half mental.
00:43:55.980 Is that
00:43:56.300 a Yogi
00:43:57.000 Berra?
00:43:59.240 Yeah,
00:43:59.560 mind reading.
00:44:00.160 I don't
00:44:00.340 want to
00:44:00.540 mind read,
00:44:01.080 so I'll
00:44:01.640 back up on
00:44:02.240 that point.
00:44:03.320 You know,
00:44:03.580 the mind
00:44:04.200 reading flaw
00:44:05.120 is just the
00:44:07.220 stickiest thing,
00:44:07.980 isn't it?
00:44:08.900 And once you
00:44:09.480 start seeing it
00:44:10.380 in other people,
00:44:11.280 you can recognize
00:44:12.200 that they're
00:44:12.740 involved in
00:44:13.340 mind reading.
00:44:14.720 But it
00:44:15.880 doesn't help
00:44:16.440 you not do
00:44:17.020 it yourself.
00:44:17.960 It's really,
00:44:19.300 really hard to
00:44:19.800 not fall into
00:44:21.240 the mind reading
00:44:22.000 trap where you
00:44:22.880 confidently imagine
00:44:24.620 you know what
00:44:25.140 other people are
00:44:25.700 thinking,
00:44:26.180 because you
00:44:26.400 never do.
00:44:27.260 You're terrible
00:44:28.120 at that.
00:44:29.240 Your accuracy
00:44:30.640 is just terrible
00:44:31.400 at that.
00:44:35.820 He said
00:44:36.640 they were
00:44:36.920 kissing the
00:44:37.560 Capitol Police.
00:44:39.040 Maybe some
00:44:39.900 of them were.
00:44:45.700 There's another
00:44:46.480 carjacking.
00:44:47.340 You know,
00:44:48.700 let me ask you
00:44:50.040 this.
00:44:52.300 Our entire
00:44:53.500 media is
00:44:54.760 now obsessed
00:44:55.620 lately with
00:44:56.440 the violence
00:44:58.060 against Asian
00:44:59.120 Americans.
00:45:00.080 Let me start
00:45:00.760 by saying we
00:45:01.660 don't want any
00:45:02.200 violence against
00:45:03.000 Asian Americans.
00:45:04.480 So I think we're
00:45:05.460 all on the
00:45:05.860 same side that
00:45:06.780 violence against
00:45:08.200 anybody's bad.
00:45:09.520 Violence against
00:45:10.320 an ethnic group
00:45:11.180 in particular
00:45:11.800 takes something
00:45:13.240 that's already
00:45:13.800 really, really
00:45:14.400 bad and adds
00:45:15.360 that extra
00:45:15.860 badness.
00:45:16.460 We're all on
00:45:17.020 the same side.
00:45:17.880 Nobody likes
00:45:18.320 violence.
00:45:20.700 But let me
00:45:21.440 ask you this.
00:45:22.940 Do you believe
00:45:23.900 that the
00:45:24.420 mainstream media
00:45:25.380 would be
00:45:26.280 reporting about
00:45:27.140 all this
00:45:27.620 anti-Asian
00:45:28.540 bias if
00:45:31.260 China were not
00:45:32.020 making it
00:45:32.440 happen?
00:45:34.960 Yeah, think
00:45:35.840 about it.
00:45:36.720 Is it a
00:45:37.200 coincidence that
00:45:39.080 China has
00:45:40.320 advanced AI
00:45:41.760 to the point
00:45:43.480 where they can
00:45:44.120 influence the
00:45:45.920 conversation and
00:45:47.040 the argument in
00:45:47.760 the United States?
00:45:48.740 And there
00:45:49.980 would be no
00:45:51.020 thing that
00:45:51.600 would be
00:45:51.920 better for
00:45:52.400 China than
00:45:53.680 to create
00:45:54.220 as a national
00:45:55.100 story anti-Asian
00:45:57.560 (in this case
00:45:58.220 mostly Asian
00:45:58.900 American)
00:46:00.880 discrimination as
00:46:02.100 the top headline
00:46:03.320 story.
00:46:04.680 Do you think
00:46:05.760 that this story
00:46:06.700 is originating
00:46:07.600 in America?
00:46:11.380 It might.
00:46:13.460 It might.
00:46:14.920 Right?
00:46:16.340 I don't think
00:46:17.460 so.
00:46:17.760 I'm going to
00:46:18.940 tell you
00:46:19.300 something that
00:46:21.600 might scare
00:46:22.700 the shit out
00:46:23.600 of you.
00:46:24.600 We no longer
00:46:25.780 live in a
00:46:26.340 world where
00:46:27.240 the default
00:46:28.000 assumption is
00:46:29.340 that this is
00:46:29.960 a natural
00:46:30.460 story.
00:46:31.960 The default
00:46:33.080 assumption is
00:46:33.960 that it came
00:46:34.380 from China
00:46:34.940 and that
00:46:36.200 they're
00:46:36.600 manipulating
00:46:37.360 the United
00:46:37.820 States through
00:46:39.200 a variety of
00:46:39.920 means to
00:46:40.480 make this
00:46:40.920 the main
00:46:41.360 story.
00:46:43.060 Now, do
00:46:44.300 I know?
00:46:45.460 Do I know
00:46:46.360 that?
00:46:46.860 No.
00:46:47.040 No, I
00:46:47.800 don't know
00:46:48.160 it.
00:46:48.780 I can't
00:46:49.440 give you
00:46:49.740 evidence of
00:46:50.400 it, etc.
00:46:51.980 Do I think
00:46:52.800 that this
00:46:53.680 would be a
00:46:54.260 good play?
00:46:56.040 Yes.
00:46:57.580 If China
00:46:58.500 were to do
00:46:59.060 this and
00:46:59.620 if they
00:47:00.180 pulled it
00:47:00.640 off, it
00:47:02.260 would be a
00:47:02.840 really good
00:47:03.480 play because
00:47:04.640 it's exactly
00:47:05.380 the kind of
00:47:06.000 thing that
00:47:06.940 would prevent
00:47:07.460 the United
00:47:07.900 States from
00:47:08.520 going hard
00:47:09.140 at China
00:47:09.720 because
00:47:10.720 somebody's
00:47:11.380 going to
00:47:11.580 say, hey,
00:47:12.540 are you
00:47:12.940 going hard
00:47:13.420 at China
00:47:13.900 because you
00:47:15.100 need to,
00:47:16.120 or is it
00:47:16.800 really sort
00:47:17.440 of a racist
00:47:18.040 thing?
00:47:19.120 Because I'm
00:47:19.880 sure China
00:47:20.460 noticed that
00:47:21.640 we have
00:47:21.980 trouble dealing
00:47:22.680 with our
00:47:23.120 southern border
00:47:23.840 because,
00:47:25.460 because,
00:47:27.140 the internal
00:47:28.220 politics are
00:47:29.260 that we're
00:47:29.600 fighting over
00:47:30.240 whether that's
00:47:30.880 really about
00:47:31.460 racism.
00:47:32.880 That's why
00:47:33.580 we can't
00:47:34.060 deal with
00:47:34.440 our own
00:47:34.760 border.
00:47:35.080 What would
00:47:36.380 China like
00:47:36.960 to do in
00:47:37.740 terms of
00:47:38.120 how the
00:47:38.360 United States
00:47:38.920 deals with
00:47:39.580 China's
00:47:40.780 aggressive
00:47:41.900 growth?
00:47:43.580 China would
00:47:44.180 like the
00:47:44.500 United States
00:47:45.080 to be arguing
00:47:45.760 about whether
00:47:46.360 that's a
00:47:46.820 racist policy.
00:47:48.620 And they
00:47:49.400 succeeded.
00:47:50.620 In the sense
00:47:51.600 that it's
00:47:51.960 happening.
00:47:53.200 All we're
00:47:53.620 talking about
00:47:54.160 is the
00:47:55.040 country racist
00:47:55.820 against Asians
00:47:56.640 and Asian
00:47:57.140 Americans.
00:47:58.280 If that's
00:47:59.180 the top
00:47:59.620 thing you're
00:48:00.040 talking about,
00:48:01.480 what's the
00:48:02.460 default assumption
00:48:03.600 in a world
00:48:04.360 of AI
00:48:04.860 and social
00:48:05.620 media and
00:48:07.120 intelligence
00:48:07.700 agencies really
00:48:08.920 behind most
00:48:09.820 of the big
00:48:10.240 stuff?
00:48:11.600 The default
00:48:12.340 intelligence,
00:48:13.460 I'm sorry,
00:48:14.660 the default
00:48:15.440 assumption should
00:48:17.120 be that this
00:48:17.680 is coming from
00:48:18.300 China.
00:48:19.880 That doesn't
00:48:20.680 mean it is.
00:48:22.420 But the
00:48:23.160 default assumption
00:48:24.000 should be if
00:48:24.900 you don't
00:48:25.360 know, if you
00:48:26.880 can't tell one
00:48:27.820 way or the
00:48:28.260 other, the
00:48:30.080 default assumption
00:48:30.880 is that it's
00:48:31.440 China.
00:48:32.540 And if you
00:48:33.000 don't get
00:48:33.480 that,
00:48:33.880 you're a
00:48:35.840 few years
00:48:36.400 behind.
00:48:38.160 Because in
00:48:38.920 2021, that
00:48:40.680 is the
00:48:41.120 default assumption.
00:48:42.560 And if you
00:48:42.960 don't know
00:48:43.460 that, you
00:48:44.620 think the
00:48:45.080 news is
00:48:45.520 real.
00:48:46.640 It might
00:48:47.140 be.
00:48:47.840 I can't
00:48:48.380 rule that
00:48:48.800 out.
00:48:49.780 I'm just
00:48:50.260 saying that if
00:48:51.000 you don't
00:48:51.540 think China
00:48:52.200 is behind
00:48:52.780 this being
00:48:53.720 the headlines
00:48:54.300 in this
00:48:54.700 country,
00:48:55.520 you are
00:48:56.480 probably naive.
00:48:58.160 Probably naive.
00:49:02.480 So here's
00:49:04.800 something to
00:49:05.500 look for.
00:49:06.760 See how
00:49:07.120 many times
00:49:07.780 our internal
00:49:08.800 news is
00:49:10.240 focused on
00:49:10.880 a story that
00:49:11.620 coincidentally
00:49:12.500 would have
00:49:13.600 some kind of
00:49:14.320 a benefit to
00:49:15.340 one of our
00:49:16.320 foreign
00:49:17.680 adversaries or
00:49:18.760 even allies.
00:49:20.440 How many of
00:49:21.300 our internal
00:49:21.920 stories look
00:49:22.880 like they
00:49:23.360 were seeded
00:49:24.100 by another
00:49:24.660 country?
00:49:25.120 Because
00:49:25.860 coincidentally
00:49:26.600 that would
00:49:27.600 be really
00:49:28.000 good for
00:49:28.420 that other
00:49:28.820 country,
00:49:29.840 be it an
00:49:31.420 ally or
00:49:33.220 an adversary.
00:49:34.120 Because remember,
00:49:34.820 our intelligence
00:49:35.380 agencies are
00:49:37.040 putting influence
00:49:38.660 on both
00:49:39.260 adversaries and
00:49:40.540 allies.
00:49:41.820 So there's
00:49:42.980 no reason to
00:49:43.560 think that our
00:49:45.240 adversaries are
00:49:46.100 not doing the
00:49:46.660 same thing to
00:49:47.220 us, just in
00:49:47.880 different ways
00:49:48.500 for different
00:49:48.940 interests.
00:49:49.360 So you
00:49:55.880 can fill in
00:49:56.500 the names of
00:49:57.940 those other
00:49:58.520 countries as
00:49:59.380 easily as I
00:50:00.020 can.
00:50:01.560 All right.
00:50:04.240 As a
00:50:04.840 black American,
00:50:05.720 will you be
00:50:06.280 making any
00:50:06.920 edits to How to
00:50:07.720 Fail at Almost
00:50:08.200 Everything?
00:50:08.820 Well, I
00:50:09.340 don't think
00:50:09.820 that the
00:50:11.820 book needs
00:50:12.300 to be edited
00:50:12.940 for that
00:50:13.640 reason.
00:50:16.560 Should we
00:50:17.220 suspect
00:50:17.660 foreigners or
00:50:18.480 Democrats?
00:50:19.360 Well, here's
00:50:20.240 why, if
00:50:21.360 China is
00:50:21.880 behind these
00:50:22.780 headlines, the
00:50:24.820 Democrats would
00:50:25.600 be, let's
00:50:27.060 say, so
00:50:27.920 interested in
00:50:28.760 the same
00:50:29.140 story that
00:50:30.700 they would
00:50:31.280 easily allow
00:50:32.860 it, if not
00:50:34.420 being a
00:50:35.140 participant.
00:50:41.920 Thank you.
00:50:43.560 Yes.
00:50:44.660 Now that I
00:50:45.520 have an
00:50:45.840 interracial
00:50:46.360 marriage, I
00:50:47.840 feel that I
00:50:48.460 understand things
00:50:49.300 a little bit
00:50:49.780 better.
00:50:51.220 I didn't
00:50:51.800 have one
00:50:52.180 until recently
00:50:52.800 when I
00:50:53.480 started
00:50:54.960 identifying as
00:50:55.780 black.
00:50:58.760 And by the
00:50:59.420 way, the
00:51:00.040 great thing
00:51:00.680 about identifying
00:51:01.400 as black is
00:51:03.480 that I get to
00:51:04.100 be on the
00:51:04.440 winning team
00:51:04.980 for a while.
00:51:07.080 You know, and
00:51:08.000 it's funny
00:51:08.640 because I
00:51:10.620 actually feel
00:51:11.460 that.
00:51:11.780 The moment I did, I
00:51:14.440 started looking
00:51:18.000 at the news
00:51:18.520 differently.
00:51:21.300 People are
00:51:21.900 so automatically
00:51:22.700 team-oriented
00:51:23.860 that if you
00:51:24.640 just say you're
00:51:25.220 on a team,
00:51:26.200 suddenly you
00:51:26.780 start rooting
00:51:27.240 for that team.
00:51:28.040 It's just
00:51:28.360 automatic.
00:51:29.220 You can't
00:51:29.760 turn it off.
00:51:30.900 And suddenly I'm
00:51:31.520 looking at the
00:51:31.980 news differently,
00:51:32.740 and I swear to
00:51:33.220 God it's true.
00:51:34.080 I'm looking at the
00:51:34.760 news differently,
00:51:35.560 and I'm thinking,
00:51:36.520 all right,
00:51:37.360 looks like my
00:51:37.920 team's doing
00:51:38.380 pretty well
00:51:38.840 today.
00:51:39.980 They're doing
00:51:42.000 really well
00:51:42.500 today.
00:51:45.740 Somebody says
00:51:46.500 I'm not black.
00:51:47.380 I think you're
00:51:48.060 a racist.
00:51:49.540 So there's some
00:51:50.360 racists on here
00:51:51.120 saying that I'm
00:51:51.820 not black.
00:51:52.840 Now I understand
00:51:53.820 that visually it
00:51:54.980 seems like that,
00:51:55.860 but I think we're
00:51:56.460 well past the
00:51:59.140 visual part being
00:52:00.400 your identity,
00:52:01.220 aren't we?
00:52:02.580 Aren't we all
00:52:03.120 past that?
00:52:03.960 It doesn't matter
00:52:04.660 what I look like.
00:52:06.240 That's no longer
00:52:07.860 a criterion and
00:52:08.720 should not be.
00:52:09.900 I'm fully on
00:52:10.860 board with that
00:52:11.420 no longer being
00:52:12.220 a criteria.
00:52:13.300 Because you
00:52:13.560 wouldn't want to
00:52:14.060 say, who's
00:52:15.520 one of my
00:52:17.580 brothers, the
00:52:18.320 black activist
00:52:19.400 Shaun, what's
00:52:21.320 his name?
00:52:23.220 Shaun, somebody
00:52:24.520 fill in the last
00:52:25.180 name.
00:52:26.100 But he's a
00:52:26.880 famous black
00:52:28.040 activist that
00:52:28.920 people say is
00:52:29.700 not physically
00:52:30.620 black enough.
00:52:31.720 And I say
00:52:32.080 that is offensive.
00:52:34.080 Shaun King,
00:52:34.740 thank you.
00:52:35.120 So I defend
00:52:38.120 Shaun King
00:52:38.920 because racists are
00:52:40.060 saying that
00:52:40.800 he's not
00:52:41.620 black enough
00:52:42.360 so he can't
00:52:43.300 self-identify
00:52:44.120 as black.
00:52:44.960 That's purely
00:52:45.700 racist.
00:52:47.100 And I
00:52:48.820 will defend
00:52:49.900 my brother
00:52:52.200 Shaun King
00:52:52.200 as much as
00:52:53.220 I would defend
00:52:53.760 myself because
00:52:55.100 we're in the
00:52:55.560 same situation.
00:52:57.000 Not the same.
00:52:57.880 Nothing's the
00:52:58.440 same, but
00:52:58.940 everybody's
00:53:00.000 different.
00:53:00.300 But I
00:53:01.880 feel it.
00:53:04.240 I feel it.
00:53:05.580 All right.
00:53:06.240 That's all I've
00:53:06.980 got for now and
00:53:07.980 I will talk to
00:53:08.900 you all
00:53:09.460 tomorrow.
00:53:11.800 Tomorrow.
00:53:12.620 Yeah.
00:53:13.040 We'll see you
00:53:13.460 tomorrow.
00:53:14.520 Bye for now.
00:53:19.540 All right.
00:53:21.640 YouTubers, I
00:53:22.420 got you for
00:53:22.840 another minute
00:53:23.420 here.
00:53:23.720 Somebody who's
00:53:28.480 famous on
00:53:29.000 TikTok.
00:53:30.320 Well, good
00:53:30.940 for you.
00:53:35.540 Why are you
00:53:36.180 laughing?
00:53:42.080 Zaijian, is
00:53:42.740 that a word?
00:53:43.220 What's that
00:53:43.500 mean?
00:53:43.640 All right.
00:53:52.280 That's enough
00:53:52.720 for now and
00:53:53.460 I will talk
00:53:54.360 to you tomorrow.
00:53:54.840 Bye.