Real Coffee with Scott Adams - May 05, 2020


Episode 956 Scott Adams: Come Sip the News


Episode Stats

Length

1 hour and 8 minutes

Words per Minute

152.0

Word Count

10,429

Sentence Count

789

Misogynist Sentences

18

Hate Speech Sentences

17


Summary

It's Cinco de Mayo and Scott Adams is here with the hit of the day: The simultaneous sip. He talks about the new coronavirus death model, and what it means for the chances of catching the killer bug.


Transcript

00:00:00.000 Hey everybody, come on in here.
00:00:13.620 It's time for a special Cinco de Mayo episode of Coffee with Scott Adams
00:00:19.980 featuring the simultaneous sip, which I would say in Spanish if I could, but I can't.
00:00:26.280 So let's just call it the simultaneous sip.
00:00:30.660 It's not going to take much.
00:00:32.520 Hey, you Walnut Creek in the house.
00:00:35.240 Good to see you.
00:00:35.940 Good morning, everybody.
00:00:37.960 There will be no cursing this morning.
00:00:41.540 I mean, unless things completely go off the rails.
00:00:44.720 No cursing.
00:00:45.520 I did my cursing last night.
00:00:46.860 That will be my evening, evening cursing.
00:00:50.760 It's better to curse in the evening.
00:00:52.260 I think it just fits better.
00:00:53.960 This morning, I overslept.
00:00:56.380 So I'd love to tell you I'm really prepared.
00:00:58.440 But I'm not.
00:01:01.480 What do you do when you're not prepared?
00:01:04.440 There's one thing you can always do.
00:01:06.560 It doesn't take much.
00:01:08.300 All it takes is a cup or a mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask, a vessel of any kind.
00:01:16.060 Fill it with your favorite liquid.
00:01:18.280 I like coffee.
00:01:19.420 And join me now for the unparalleled pleasure of the dopamine hit of the day.
00:01:23.360 The thing that makes everything better, including the damn pandemic.
00:01:28.060 And it's called the simultaneous sip.
00:01:35.480 And it happens now.
00:01:37.740 Go.
00:01:37.940 Infection rates falling.
00:01:47.620 I can feel it.
00:01:49.420 Hello, New York.
00:01:50.240 All right.
00:01:53.620 It turns out that I've developed two different fan bases, I think.
00:01:59.240 One that likes it when I swear.
00:02:01.780 One that prefers that I not.
00:02:04.580 So it's like I've split into two characters.
00:02:07.680 Somebody likes some.
00:02:09.340 Some like the other.
00:02:10.380 All right.
00:02:10.640 Let's see what's going on here.
00:02:11.720 So the new coronavirus death model, that's right.
00:02:18.240 I said a death model, has been revised upwards, of course, because we're reopening some of the country.
00:02:27.580 So what do you call it when things go exactly the way everybody assumed they would go?
00:02:35.080 You call that news.
00:02:36.620 That's news, my friends.
00:02:37.680 And if things go exactly the way everybody said they should go, will go, can't really go any other way, that's called news.
00:02:50.120 And so it's been jacked up to 134,000.
00:02:55.320 A new model.
00:02:58.380 So I guess 134,000 looks like it's not just the minimum.
00:03:05.100 I'm looking at CNN's text.
00:03:07.680 I think that it's maybe where they actually think it will be.
00:03:12.340 It's a little unclear if that's the new minimum or that's just the prediction.
00:03:18.780 Either way, it's a little absurd.
00:03:21.200 But anyway.
00:03:22.940 And apparently it's predicted to kill 3,000 people a day by June 1st.
00:03:33.740 It's nearly double the current number.
00:03:37.380 Now, let's just game this out.
00:03:39.840 3,000 people a day for June, July, and August.
00:03:48.480 3,000 people a day for 90 days.
00:03:52.780 How does that equal 134,000?
00:03:57.440 Feels like it could be twice that.
00:04:01.500 Now, that depends.
00:04:02.840 Of course, the model probably takes into account the summer months and something magic about summer that makes us get less of the virus.
00:04:14.220 But what is that exactly?
00:04:16.980 So I don't know if 134 is really the low end, the middle, or the high end anymore.
00:04:24.240 But it's about where I thought it would be.
00:04:27.160 So I would say it's pretty close to where I just assumed it would be, right?
00:04:32.340 That's the whole point.
00:04:33.160 You go back to work, people die.
00:04:35.300 So what does CNN do about this decision?
00:04:40.800 Well, of course, they're trying to cram it down the president's throat so that he will be responsible personally for every death that happens because we reopened the economy.
00:04:52.120 And as I watch this form, you can see it just sort of coming together.
00:04:58.620 You know that what's going to happen, for sure, is that governors will open some states.
00:05:06.240 But CNN doesn't really mock governors because it's not a good model for television, I guess.
00:05:13.560 They prefer to mock the head of the federal government.
00:05:17.420 It's better TV.
00:05:18.120 So Trump is going to get blamed for everything, no matter what the governors do.
00:05:23.320 You know that, right?
00:05:24.880 And watching this take shape is so disgusting to me.
00:05:29.580 I think I've reached a new level of just contempt for CNN's coverage.
00:05:37.740 I'll just say for their coverage rather than make it personal.
00:05:41.640 And here's the thing.
00:05:44.280 You know, and I'll say this a million more times.
00:05:46.560 Nobody knows how to make the right decision here.
00:05:50.180 I don't even know if there is a right decision.
00:05:52.760 There are just two ways that people are going to die.
00:05:55.140 Is there a right decision when there's just two ways people are going to die and you've got to pick which one?
00:06:00.660 To put pressure on the leaders, let's say the governors and all the decision makers,
00:06:08.060 throw Fauci and Birx in there too, they're leaders and decision makers in this context,
00:06:14.220 to put that pressure on them that these deaths are sort of going to be on their hands is so low and so contemptible.
00:06:25.260 Now, I wouldn't say that they've actually done that.
00:06:29.880 You just see it taking form because what else are they going to talk about?
00:06:34.360 They're going to talk about who the leaders killed, right, with their decisions.
00:06:39.940 Well, he made this decision and now Aunt Sally's gone.
00:06:43.800 And here's the people on this page and here's all the names and their faces.
00:06:49.180 And they all died because of the decisions made by these Republicans or whatever.
00:06:56.300 And I'll say it again.
00:06:58.480 There's nothing more important than that we as a country understand that there's no right decision.
00:07:04.240 So if you're hamstringing your leaders to make any decision because you're ready to just eviscerate them no matter what,
00:07:15.180 you're not really helping.
00:07:16.740 You're not helping.
00:07:18.160 You know, I think everybody has to get a pardon in advance for whatever happens.
00:07:23.380 I'm talking about the governors, the experts, the politicians.
00:07:26.940 I mean, we just have to find some way to be okay with decisions that can't be good.
00:07:34.400 They can't be good by their nature.
00:07:36.440 It's just two bad decisions.
00:07:37.780 They're going to have to pick one.
00:07:39.400 So somebody's going to die and our leaders have the very unenviable task of doing it.
00:07:45.200 I'm glad I don't.
00:07:47.000 I'm glad you don't.
00:07:48.300 Are you?
00:07:48.900 Are you glad it's not your decision?
00:07:50.980 I mean, you could make a decision.
00:07:52.700 I could make a decision if I had to.
00:07:54.340 You know, if it were my job, I'd do it.
00:07:56.980 But I'm glad I don't because whoever makes it is going to be accused of murder for just doing their job.
00:08:04.240 Just trying to help the country, trying to get the country to a better place.
00:08:08.740 No matter what anybody does, no matter what their intentions are, no matter how smart they are, even if they make all the right choices, they're going to be accused of murder.
00:08:18.760 Murder.
00:08:20.100 Not just regular murder, but like mass murder.
00:08:23.140 So that's the situation we put our leaders in, and then we ask them to make good decisions for us.
00:08:29.180 Make us some good decisions.
00:08:30.940 And by the way, we are going to accuse you of murder.
00:08:35.560 Well, I take that back.
00:08:36.920 We're going to accuse you of mass murder, no matter what you do.
00:08:42.380 I just don't want to live in a country like that.
00:08:47.140 Sean Hannity, surprisingly, is actually requesting that the armed protesters in Michigan reconsider the military garb.
00:09:01.180 And Sean Hannity's argument, I think, is completely solid, somewhat unexpected, which is the fun part of the story, but solid argument.
00:09:13.340 And his argument goes like this, that if you put a show of force on against police officers, that's a very dangerous situation.
00:09:25.040 Because the police, they kind of have to maintain their own control, right?
00:09:30.500 That's part of the job.
00:09:32.080 It's not just standing around in a uniform.
00:09:34.860 The police have to exert authority, right?
00:09:38.640 They've got to exert that they are the ultimate force of control.
00:09:42.220 That's what keeps everybody safe.
00:09:44.780 The last thing you want is people saying, well, police, yes.
00:09:47.820 Police, no.
00:09:48.800 I don't have to do what they say.
00:09:50.400 I've got a gun.
00:09:51.220 They've got a gun.
00:09:52.020 My gun's bigger.
00:09:53.220 I don't know.
00:09:53.780 Maybe I don't have to do what they say.
00:09:55.400 So you could easily see it sliding in the wrong direction.
00:09:58.880 And so I think Sean Hannity is making a very, it's a hard call.
00:10:03.820 Because even Dan Bongino came on his show right after and said,
00:10:07.160 I don't know, Second Amendment, freedom of speech.
00:10:12.220 Everything they're doing is legal.
00:10:15.380 Do you tell people to stop doing completely legal things in this country?
00:10:20.320 And so you can see both arguments, can't you?
00:10:23.460 I don't think Dan Bongino is wrong.
00:10:25.960 You know, we got laws.
00:10:27.480 What gives anybody the right to throw the laws away?
00:10:31.600 Like, when did that happen?
00:10:33.120 Right?
00:10:33.280 So that's a good argument.
00:10:34.860 Dan Bongino's argument, completely solid.
00:10:37.580 Follow the Constitution.
00:10:39.760 They're following the Constitution.
00:10:41.880 As long as they follow the Constitution, why shouldn't we?
00:10:44.880 Right?
00:10:46.380 But Hannity's argument is also solid.
00:10:48.460 Why would you want to put the police at risk?
00:10:50.920 And I think Hannity's got a good, you know, long history of supporting law enforcement.
00:10:56.780 And so he's being consistent on that and confesses that it's a very hard call to do anything that would be even slightly anti-Second Amendment.
00:11:06.440 You know, he's got a concealed carry himself, as he reminds us.
00:11:09.500 So, here's my opinion on it.
00:11:14.640 And I don't think anybody's opinion should sway you, because this is so subjective.
00:11:20.480 Right?
00:11:20.760 You know, I'm not going to tell you my opinion is right or not right.
00:11:25.220 But under these very, very specific conditions, meaning that the next time something comes up, don't ask me to be consistent with this.
00:11:36.320 Because this is just a one-off.
00:11:38.000 So, anything I say about this should not ever be generalized in the future to any other situation.
00:11:44.180 It would just be different.
00:11:46.080 The people with the guns are there to support the Constitution.
00:11:53.240 That's my understanding.
00:11:54.860 They're not there to overthrow the government.
00:11:57.300 They're, in fact, there to support the government.
00:11:59.740 They're literally there to support the document, if you will, the Constitution, that binds us all together.
00:12:08.340 It's the thing that pays the police.
00:12:10.560 You know, ultimately, the form of our government defined by the Constitution allows a structure in which the police can get paid.
00:12:17.700 So, are the protesters dangerous, or are they protecting the system?
00:12:27.880 It's really tough, isn't it?
00:12:31.160 And who said this the other day?
00:12:35.020 Oh, somebody smart on television said that when you've got a gray area, somebody was saying that their father gave them this advice.
00:12:43.240 Maybe one of you saw it.
00:12:44.660 It was somebody wise that was saying this.
00:12:46.440 That when you have a gray area, and, you know, one direction is maybe safety, and the other direction is maybe freedom, but it's a tough call, that you should always bias toward freedom.
00:13:01.620 And that, in the long run, you end up better that way, if you bias toward freedom.
00:13:06.380 So, here's my take.
00:13:09.600 I've never seen a situation where the armed militia types have ever fired.
00:13:15.320 Have you?
00:13:16.440 Because they show up on a lot of stuff.
00:13:19.180 If you look at all the times that the same types of people, the ones who, you know, are into that lifestyle, the militia, the guns, etc., they're very anti-shooting.
00:13:31.340 Sort of super anti-shooting, I would say.
00:13:34.120 Because they're not there to shoot.
00:13:36.360 They didn't come there for that.
00:13:37.380 They didn't come to shoot anybody.
00:13:40.620 What's the last thing any of them with a gun actually wants to happen?
00:13:43.900 The last thing.
00:13:44.760 The last thing any of them want to happen is to shoot anybody.
00:13:49.360 They don't want to get shot.
00:13:50.760 They don't want to shoot anybody.
00:13:52.200 It's the last thing they want.
00:13:54.000 So, it's hard to get into the minds of strangers, which I just did, and so that, you know, I probably shouldn't have.
00:14:01.480 But if I were to say, okay, let's treat them as mindless, we don't know what they're thinking, yeah, I'd say you don't want guns and police officers in the same place.
00:14:10.260 So, if you ignore their internal mental state, and maybe that's the smartest play.
00:14:16.980 Maybe Hannity has nailed this completely.
00:14:19.160 It's just, just look at the variables.
00:14:21.600 Lots of guns, police, put them in the same place, nothing good can happen.
00:14:25.080 That's a pretty adult opinion, and I would respect him for that opinion.
00:14:30.960 But I've got to say, I'm leaning in the other direction.
00:14:34.360 I'm leaning, I'm leaning Bongino in this one.
00:14:38.200 I'm leaning, if they're there to protest for their freedom, I don't know, if the whole point of it is freedom,
00:14:47.500 then, how do you take their freedom away when they're protesting for freedom?
00:14:54.380 Legally.
00:14:56.020 Legally protesting for freedom in legal ways?
00:14:59.200 And you take some more of their freedom away while they're protesting for freedom?
00:15:02.700 How's that going to go?
00:15:04.200 What's more dangerous?
00:15:07.020 What's more dangerous?
00:15:07.760 Well, I mean, nobody's tried to take their guns away.
00:15:09.680 That would be, of course, crazy.
00:15:11.620 But, I guess I would lean toward letting them do their thing.
00:15:15.340 And, here's another story.
00:15:18.200 You probably saw this.
00:15:20.300 Apparently, there's a Chinese drone maker, of the smaller drones, the kind that law enforcement would use.
00:15:27.960 And this gigantic Chinese company just dropped their price or donated some of them.
00:15:33.520 And they're putting their drones in lots of different local police forces.
00:15:37.900 And part of what the drones are doing is flying around and looking for social distancing violators.
00:15:43.180 Do you feel comfortable with that?
00:15:46.040 Do you feel comfortable that there's basically a Chinese-made computer?
00:15:53.840 Because, you know, each of the drones has a little intelligence in it.
00:15:58.100 There's a little computer in those drones from China that's flying around and collecting information on our citizens.
00:16:06.860 What kind of information is it collecting?
00:16:08.720 What does it know just by flying around and being part of the police force?
00:16:14.580 Could it listen in on the police force?
00:16:17.200 I don't know.
00:16:18.580 You know, could the drones be modified to collect information we don't know they're collecting?
00:16:24.540 I don't know.
00:16:26.140 Could anybody ever take advantage of any of the, I don't know, administrative information or data that gets sent up to headquarters?
00:16:34.140 Does that even happen?
00:16:35.100 Does anything get sent to headquarters?
00:16:36.700 I don't know.
00:16:39.200 But does it matter?
00:16:41.220 Does it matter?
00:16:43.000 You've got a Chinese-made computer that they're giving away to local law enforcement?
00:16:51.340 Hmm.
00:16:53.060 No.
00:16:54.380 No, you can't let that company into the United States.
00:16:57.700 Apparently, Homeland Security has already banned them from Homeland Security.
00:17:01.940 But the local police force is like, hey, free drone.
00:17:06.160 Of course I want a free drone.
00:17:08.060 Do I want a free drone?
00:17:09.740 Yeah.
00:17:10.600 Give me a free drone.
00:17:11.560 That's what the local police say.
00:17:13.300 So they're accepting all these Chinese spy drones.
00:17:16.920 They're not made exactly for spying, but you know what I mean.
00:17:23.300 And let me ask you this.
00:17:24.860 Do you think the United States has any companies that make some drones?
00:17:29.540 I think so.
00:17:30.540 I think the United States has companies that make drones.
00:17:34.940 Yes.
00:17:35.880 Lots of them.
00:17:37.260 Why the hell are we buying Chinese drones?
00:17:40.560 We need to get rid of that immediately.
00:17:44.060 In fact, let's just get rid of everything we buy from China.
00:17:46.760 If there's any substitute in the United States, I don't care if it costs more.
00:17:52.220 Of course, that's easy for me to say.
00:17:53.900 I know.
00:17:54.360 I know.
00:17:55.600 All right.
00:17:57.360 So that's going on.
00:17:59.480 There's a Massachusetts golf course owner who's going to defy the state's lockdown and
00:18:04.680 reopen for business.
00:18:05.800 And this is a way more interesting story than it sounds like as a headline, because the family
00:18:12.960 that's reopening, I guess it's family-owned golf courses, the family that's doing this
00:18:17.940 in defiance of the state just lost a family member to coronavirus, just lost a grandmother
00:18:26.000 in a tragic situation.
00:18:28.700 So whoever the representative of the family was saying, believe me, we get it.
00:18:33.440 Like, we're not in any, you know, we're not confused about how deadly this is.
00:18:41.020 Grandma just died from it.
00:18:42.920 And we still want to open up the golf course, you know, partly because it's something you
00:18:46.880 could social distance a little bit easier than most, but also because there's no choice.
00:18:52.780 There's just no choice.
00:18:54.400 You know, people need to eat.
00:18:56.800 Got to get going.
00:18:58.160 So I would say I support this Massachusetts golf course.
00:19:01.600 If it opened in my state, I would go golf.
00:19:05.220 The police would come, probably, probably.
00:19:09.060 And maybe I'd have to go home.
00:19:10.860 But if they reopened the next day, I'd try to go golf again.
00:19:15.280 When the police would come, they'd send me home.
00:19:17.940 And then maybe some other businesses would try to reopen.
00:19:20.620 Maybe there would be too many businesses for the police.
00:19:23.540 Maybe.
00:19:24.240 Don't know.
00:19:25.360 Am I suggesting civil disobedience?
00:19:29.420 I'm not suggesting it.
00:19:32.200 I'm saying it's coming.
00:19:34.920 I don't need to suggest it.
00:19:38.140 Civil disobedience is guaranteed.
00:19:41.060 This is America.
00:19:42.660 Have you heard?
00:19:44.340 Have you heard of Americans?
00:19:46.980 You've probably heard of us, right?
00:19:48.600 Most of you are American.
00:19:49.640 We like to protest.
00:19:53.180 So, I mean, we'd protest a ham sandwich.
00:19:56.140 So do you think anybody's going to have some civil disobedience about closing down the economy for another month?
00:20:01.920 Oh, yeah.
00:20:02.660 Oh, yeah.
00:20:03.040 It's coming.
00:20:03.400 So I put some questions, or I put a tweet on Twitter and told people to ask me questions, and I would answer them.
00:20:21.980 And, by the way, are you seeing that every day we get conflicting data about hydroxychloroquine?
00:20:29.000 I feel as though every day somebody says, hey, it's great, this trial.
00:20:33.880 Well, it's not conclusive, but it looks good.
00:20:36.700 And then 10 minutes later, yeah, you know, here's all the reasons why it probably doesn't work.
00:20:42.160 And I feel like we can't get any good information.
00:20:45.980 What was the last thing we learned about that was actually true?
00:20:49.580 Have we learned anything that's true about the coronavirus?
00:20:52.840 Just fact-check me on this.
00:20:56.940 Is there even one thing about the coronavirus, one scientific or other data fact, that actually turned out to be true?
00:21:06.580 From the viral amount of it to what it was, the spreading to humans, do masks work?
00:21:13.180 Are ventilators good or bad?
00:21:15.400 Think of that.
00:21:16.480 Name one thing we got right.
00:21:18.700 I can't.
00:21:19.700 I can't think of the one thing we got right.
00:21:21.520 We didn't have enough of stuff, and then we had too much of stuff.
00:21:25.120 Name one thing we got right.
00:21:26.860 Nothing.
00:21:27.840 Not a single thing.
00:21:29.500 Now, I'm not too harsh about getting things wrong in the context of a pandemic,
00:21:35.360 because your first actions were, you know, just guessing, really.
00:21:41.000 All right.
00:21:42.180 I'm looking for my own tweet in which I ask people to ask me questions.
00:21:47.300 If you would like to ask me a question, that's where you should have put it.
00:21:50.100 There are 200 questions.
00:21:52.980 And so, will Trump debate Biden in person?
00:21:56.200 I'm feeling like the odds are no.
00:22:04.820 If I had to bet on it right now, I'd say 70% no would be my bet.
00:22:11.320 And there would be two reasons for that.
00:22:13.300 Number one, I think that the Democrats will do everything they can to not have a debate.
00:22:18.260 They'll use the coronavirus excuse, etc.
00:22:20.620 Trump will probably say he wants to do it.
00:22:24.780 Biden might also say he wants to do it.
00:22:27.260 But gosh, if only I could.
00:22:30.300 No, I don't think it'll happen.
00:22:32.180 And I think that the Democrats know that that's a losing proposition.
00:22:36.120 So, they'll use the coronavirus to avoid it, I think.
00:22:41.780 Why are healthy people being quarantined?
00:22:44.300 Because they could become carriers.
00:22:46.780 Why are we still asking that question?
00:22:48.520 How did we get to May?
00:22:55.080 And there's somebody, and I know this user, because we interact a lot on Twitter.
00:23:00.140 But this is somebody who's like really following.
00:23:03.260 And I'm not making fun of you.
00:23:04.940 I'm just honestly curious.
00:23:07.300 How could you get all the way to May and not know at least the argument for why healthy people are being locked down?
00:23:14.780 The argument is that they won't be healthy.
00:23:17.260 They'll go out and get it and bring it back and kill grandma.
00:23:20.480 Now, you could argue that that's not a good enough reason.
00:23:23.380 You could say that the costs are greater than the benefits.
00:23:26.940 That would be an argument.
00:23:28.200 But to ask why are healthy people being quarantined?
00:23:32.160 Don't we all know that now?
00:23:34.580 I mean, I'm not saying it's the right answer.
00:23:36.260 But we should know the reason.
00:23:37.620 All right, and so we're seeing people criticizing Trump on the 60,000 expected deaths.
00:23:51.380 And you saw it on Worldometer.
00:23:54.640 Well, yeah, the numbers are all over the place, and everybody speaks imprecisely about them.
00:24:01.800 And you saw me do it just a moment ago.
00:24:03.800 So I was looking, I was reading CNN, looking at it, and I realized that the CNN report doesn't tell me if that's the low estimate, the likely estimate, or the high, or what.
00:24:13.280 I mean, what good is that?
00:24:14.760 When you've got a vast range, and they give me one number, I don't know.
00:24:19.840 Is that the low, the middle?
00:24:21.440 It's all useless information.
00:24:22.640 What do I think about the 1619 Project?
00:24:28.800 So that's the New York Times Project, where they were going to write a long series about institutional racism and how it all came from slavery, etc.
00:24:37.760 And I guess they got the Pulitzer Prize for that, which Pulitzer Prize is like six people sitting in the living room who saw 0.001% of the creative stuff that got created that year.
00:24:54.500 And they said, oh, we like this one.
00:24:56.240 The most useless, stupid, completely non-prestigious award, the Pulitzer Prize.
00:25:01.660 It's just ridiculous.
00:25:02.920 It'd be like you just making up a prize.
00:25:04.720 I'd like to give the Carl and Jane Prize for literature.
00:25:12.160 It's just me, you know, just me, Jane and me.
00:25:15.280 We just sat in their living room, and we decided we'd like this book better than this one.
00:25:19.460 So we're going to give it the Carl and Jane prestigious award.
00:25:23.760 That is exactly as valuable as the Pulitzer Prize.
00:25:29.360 No difference.
00:25:30.180 It's just some people in their living room, probably got together on Zoom, and said, did you read these seven books that were submitted?
00:25:40.180 Yeah, which one do you like?
00:25:42.020 I don't know.
00:25:42.600 They're all pretty good.
00:25:43.460 How about this one?
00:25:44.300 Yeah, all right.
00:25:45.000 Pulitzer Prize.
00:25:46.340 Useless.
00:25:48.520 All right.
00:25:48.880 And then I guess historians complained that some of the articles that came out of that 1619 project were inaccurate.
00:25:58.400 I would go on.
00:25:59.700 I would go further and say that they were written for naked political purposes and that we shouldn't take any of it too seriously.
00:26:09.620 Andres asks, how have you stayed motivated before Dilbert became a success?
00:26:18.820 Excellent question.
00:26:19.760 And the answer is, for my entire life as a young person, I expected to be successful.
00:26:31.260 And successful on some kind of a level where the rest of the world noticed, you know, not just sort of successful privately, but successful in some way that the public knew about it.
00:26:44.360 And I've never not felt it.
00:26:48.280 I think I was born with that feeling.
00:26:50.680 I mean, I felt it from my earliest memories.
00:26:53.320 I always thought, I'm going to be famous someday.
00:26:56.640 Just always.
00:26:58.000 And I've told this story before that adults would tell me that too when I was a kid.
00:27:03.620 They would say, someday you're going to be really rich and famous.
00:27:07.920 And I never knew why they were saying that.
00:27:10.180 I mean, I knew I thought it myself, but I also didn't know if everybody thought that.
00:27:16.020 I kind of wonder.
00:27:17.140 How many of you, let me ask in the comments while I'm finishing my story here.
00:27:21.560 How many of you, when you were a kid, thought that you would be famous someday?
00:27:28.480 I'm just wondering.
00:27:29.520 Because I always thought I would.
00:27:31.360 Now, obviously that causes you to make certain choices in your life that make it more likely it's going to happen.
00:27:37.360 So there's a causal element there.
00:27:40.520 But how many of you thought you would be famous when you were kids?
00:27:43.380 I'm just curious.
00:27:44.440 Because when I was a kid, I didn't know if it was just the way everybody felt.
00:27:48.120 I thought maybe everybody, you know, was optimistic and dreamed about good things.
00:27:53.660 But I didn't dream about it.
00:27:56.140 I expected it.
00:27:58.080 It just was a different feeling.
00:27:59.720 I can tell the difference when I'm just dreaming about stuff.
00:28:03.500 I'm looking at people's answers.
00:28:05.420 This is fascinating.
00:28:06.140 I didn't know where this would go.
00:28:07.840 There were a lot of yeses.
00:28:08.680 I see some no's.
00:28:11.020 Oh, my goodness.
00:28:12.680 That's very common.
00:28:15.020 I didn't know.
00:28:16.180 I had no idea which way that question was going to go.
00:28:18.660 Apparently, it's universal.
00:28:21.780 I mean, not universal because I see some no's.
00:28:24.580 But if I were just based on the answers, which is very non-representative of the sample, it's just the people who wanted to answer.
00:28:31.540 Wow, there's a little, I don't know what percentage, but we can conclude from the comments that a lot of people thought that they would be famous someday.
00:28:41.400 Interesting.
00:28:42.420 So I guess I should take it to be nothing, you know, because it felt like it was almost a premonition.
00:28:47.360 But if everybody had the same premonition and it didn't work out for most people, I'd have to say it's just how kids feel, I guess.
00:28:54.420 So to finish this answer, how do I stay motivated?
00:28:59.020 I've never not been motivated.
00:29:00.480 So I don't know what it's like to feel unmotivated.
00:29:05.440 I don't even know what that feels like.
00:29:07.200 Because I've always felt like, almost like there was a problem with myself that I needed to fix.
00:29:16.120 You know, when we talk about success and motivation and things, we tend to, because we're America and, you know, we're a certain type of people, we tend to put the most positive spin on success.
00:29:28.800 And we say motivation, ambition, we tend to put a positive frame on those things, because we like to encourage people to do those things.
00:29:37.940 It makes sense.
00:29:39.300 But when you're actually that person, and you're the person who has that ambition and has that motivation, it doesn't always feel like it's a positive.
00:29:49.980 It feels like a flaw that you're trying to fix.
00:29:53.600 Now, I don't know how many other people would back me up on that feeling.
00:29:58.460 That may be highly individual.
00:30:00.120 I don't know.
00:30:01.140 But what I feel like is a continuous weight on me that I should have done more.
00:30:08.000 And I can't shake it.
00:30:09.840 I wake up thinking I should have done more.
00:30:12.340 And I go to bed thinking I should have done more that day.
00:30:15.440 And I can't wait to wake up.
00:30:18.400 I hate sleep.
00:30:19.860 I just dislike the whole idea of sleeping, because it takes me away from my day.
00:30:25.460 And my day is when I can do stuff.
00:30:27.640 And I can get stuff done.
00:30:29.080 I can accomplish things.
00:30:31.040 Right?
00:30:31.300 I can make a difference.
00:30:32.560 When I'm sleeping, sleeping is like suspended animation.
00:30:36.780 Sleeping is like practice for being dead.
00:30:38.780 And sleeping has no place in my world, except, unfortunately, it's necessary for good health.
00:30:44.860 So you should do everything you can to sleep a healthy amount.
00:30:48.560 I don't want to convince anybody to sleep less.
00:30:50.500 But I'm just telling you how I feel.
00:30:53.700 So to answer Andres's question, I've never felt unmotivated.
00:30:56.860 So I suspect some of that is just baked into your nature.
00:31:02.220 I don't know.
00:31:03.260 It's always been there.
00:31:04.220 It's never going away.
00:31:06.080 And it feels like an itch that can't be scratched.
00:31:09.840 But I spend my whole life scratching anyway.
00:31:13.020 Is that good?
00:31:14.700 I don't know.
00:31:15.740 Because I don't know if I have the capability to be, let's say, retired.
00:31:20.940 You know, I'm at that age where everybody thinks seriously about it.
00:31:25.480 And, you know, there are many times I've thought, you know, wouldn't it be great to be retired?
00:31:32.080 What if I could wake up and just do anything I wanted?
00:31:36.200 What if I didn't have to work?
00:31:38.220 What if I could sleep as long as I want, do anything I want when I wake up, not worry about money, and just live out my days?
00:31:47.020 You know, like a perfect free human that nobody ever gets to be.
00:31:51.700 Like nobody ever gets to be free.
00:31:54.160 Just wake up, do what you want, not run out of money.
00:31:57.060 Who gets to do that?
00:32:00.460 I have no interest in that.
00:32:03.060 I have no interest.
00:32:04.840 Because that's available to me now.
00:32:07.260 I could stop working.
00:32:09.180 You know, I know you hate me for it, because especially now when people are having a tough time.
00:32:14.080 But it's just the truth.
00:32:16.100 I could stop working.
00:32:17.420 I could just wake up and just eat and play around and just have a good time for the rest of my life.
00:32:24.700 I could.
00:32:25.600 I have no interest in it.
00:32:27.780 It sounds like hell to me.
00:32:29.820 Because it doesn't scratch my itch.
00:32:32.380 It doesn't make me feel like I've done anything useful.
00:32:35.280 And I don't mean useful for myself, because I already got what I need.
00:32:38.760 I'm almost entirely externally focused at this point in my career.
00:32:42.680 Because if it's not good for you, not interested.
00:32:46.480 If it doesn't help someone else, no interest at all.
00:32:51.320 Not even a little bit.
00:32:52.780 So my ambition, to sort of further answer Andres's question,
00:32:57.500 my ambition has changed over the years from a pure personal ambition to, you know,
00:33:03.900 I've got to get somewhere personally, to, oh, now what do I do?
00:33:08.200 Because once I got it, what do I do?
00:33:13.760 What am I going to do with that?
00:33:15.800 And so almost immediately upon getting what I thought I wanted,
00:33:20.280 my ambition changed to external.
00:33:23.600 And I thought to myself, well, you know, could I run for public office?
00:33:28.100 And I thought, what would be a bigger waste of my talents than to put me in a meeting?
00:33:33.060 You know, if you've watched me long enough, you say to yourself, okay, what is it that you're good at?
00:33:40.860 What is it that you could contribute to the world?
00:33:43.580 And now you've got to go raise money and attend meetings.
00:33:48.380 It's just been the biggest waste of time ever for my specific set of talents.
00:33:52.580 So I try to find ways such as this in which I could do something that would be potentially useful,
00:33:58.980 which is why I've started to put the micro lessons on the locals.com platform.
00:34:05.860 So right now there's a lesson on how to be funny.
00:34:08.040 I think my next lesson will either be personal finance,
00:34:11.300 very, very short lesson on that for those who just don't know anything about it,
00:34:15.320 or maybe design.
00:34:17.120 I'm going to probably do one of those today, maybe.
00:34:19.280 All right, so I'm always motivated.
00:34:23.340 It's just the nature of it changed.
00:34:26.820 Shata says, what are some techniques to wake up and start the day off great?
00:34:33.460 And the answer is, if you don't drink coffee, I don't know what the answer to that question is.
00:34:39.860 One of the things that I teach, in fact, just yesterday, somebody said that this lesson changed their life.
00:34:46.660 So what I'm going to tell you now, this will be the short version of it,
00:34:52.260 but somebody just told me yesterday, it changed their life to hear this idea,
00:34:56.880 that we're moist robots.
00:34:59.540 I wrote about this in my How to Fail at Almost Everything and Still Win Big book,
00:35:03.360 you see on the shelf back there.
00:35:05.300 And the idea of a moist robot is that it just responds to inputs.
00:35:10.120 So if you put an input into a computer, you'll get an output.
00:35:15.140 And then if you treat yourself like that, instead of some kind of mental creature,
00:35:20.040 if you treat yourself as a mind, you don't know what to do with it.
00:35:24.140 Because you don't quite know how to manage a mind.
00:35:26.640 It's more of a concept.
00:35:28.080 But if you manage your physical body, and you manage it right,
00:35:31.860 you can produce the right kind of thoughts and actions and stuff that your mind likes.
00:35:35.620 So I teach people to reprogram their mind by programming their physical environment
00:35:44.740 and their physical body, and also associating rewards with things they want their physical body to do.
00:35:53.220 So one of the things that I want to do is wake up and be productive.
00:35:57.240 Who doesn't, right?
00:35:58.020 If it were free and easy and didn't take any work,
00:36:02.500 wouldn't you all want to wake up and be productive?
00:36:05.400 You might want to wake up at different times.
00:36:07.800 But everybody wants to be productive.
00:36:09.800 So I use the moist robot technique to train myself like a dog.
00:36:15.040 And the way I do that is I say,
00:36:17.400 Hey, if you do this trick, you'll get a treat.
00:36:19.940 And the treat is this protein bar, this specific one, because it's one that I like.
00:36:27.900 I'm not saying you'll like it.
00:36:28.980 I'm just saying I like it.
00:36:30.720 And delicious cup of coffee, which when combined in the same bite,
00:36:36.560 a bite of the chocolate peanut butter protein bar,
00:36:39.960 with the sips of coffee,
00:36:42.340 is really, really good.
00:36:44.940 And the coffee wakes me up, it gives me a buzz,
00:36:48.740 and my body registers the buzz, right?
00:36:51.340 It's a little shot of energy,
00:36:52.660 and your body recognizes that as a treat.
00:36:56.220 And then you do the taste treat,
00:36:58.820 and maybe you were a little hungry when you woke up anyway,
00:37:01.980 and then those two tastes go together,
00:37:04.140 and it's like a taste explosion of awesomeness.
00:37:07.600 Now, it doesn't matter that you don't like these things,
00:37:10.240 because if you were doing it, you would pick your own treat.
00:37:12.680 Your treat might be, I don't know, watch a TV show.
00:37:16.900 Your treat might be take a walk in the morning.
00:37:19.420 Maybe it's whatever you like.
00:37:21.200 But the trick is, for whoever said,
00:37:23.720 how do you wake up in the morning,
00:37:25.540 is you should pair your wake-up routine,
00:37:28.480 and you should turn it into a routine,
00:37:30.380 so it's not different every day,
00:37:31.920 with a treat.
00:37:33.900 A really, really good one.
00:37:35.760 And let me tell you,
00:37:36.940 if you heard this and said to yourself,
00:37:39.200 a protein bar and a cup of coffee,
00:37:41.300 that's not much of a treat.
00:37:44.480 That's just food and beverage.
00:37:46.260 You could have food and beverage all day.
00:37:48.340 Why is that a treat?
00:37:49.940 Well, the thing you might miss is how much I like it.
00:37:53.340 So if you don't like it,
00:37:54.980 find something that you do.
00:37:56.600 I really, really like
00:37:58.140 the first two hours of my day.
00:38:00.840 By far, they're extraordinary.
00:38:03.360 The first two hours of every day for my life
00:38:06.920 are extraordinary pretty much every time.
00:38:11.880 There's almost nothing else that's as good
00:38:13.720 the rest of the day unless, you know,
00:38:15.860 unless Christina's involved.
00:38:19.760 How popular will pop-up drive-in movies be this summer?
00:38:23.620 Well, I saw that somebody's got some
00:38:26.200 pop-up movie theaters that already happened.
00:38:29.220 So I predicted that that would happen.
00:38:32.680 And I know it's happened at least one place,
00:38:34.700 maybe more.
00:38:35.840 And there have been the drive-in
00:38:37.380 church services as well.
00:38:41.880 And I guess the people just parking in a parking lot
00:38:46.080 and listening to the pastor on their car radio
00:38:50.660 turned out to be too dangerous,
00:38:52.160 which is ridiculous.
00:38:53.980 If I had to guess,
00:38:56.100 you'll see more of that in the summer,
00:38:58.140 but not much more.
00:39:03.580 Practical Bob says this.
00:39:06.200 Is KB,
00:39:07.260 I assume he's referring to my fiancée,
00:39:10.680 Christina,
00:39:12.020 as hot-looking in person as pictures?
00:39:14.800 No.
00:39:15.640 No, she is not.
00:39:17.740 Christina is not as hot-looking in person.
00:39:20.560 She is way hotter.
00:39:22.000 Way hotter in person.
00:39:25.040 There is no photograph that could ever capture
00:39:27.140 her total beauty.
00:39:29.140 So no,
00:39:30.200 I'm sad to say
00:39:31.240 that she does not look as good as her pictures.
00:39:33.700 She looks way better than her pictures.
00:39:36.520 Way better.
00:39:39.440 That's true, by the way.
00:39:41.660 And
00:39:42.100 the 15-minute COVID tests,
00:39:44.960 can it be done
00:39:48.020 multiple
00:39:48.980 at a time,
00:39:50.060 or is it one at a time?
00:39:51.160 Well,
00:39:52.140 that's a good question.
00:39:53.600 You know,
00:39:53.740 do you have to wait 15 minutes
00:39:55.120 before you do the next test?
00:39:56.480 And the answer is,
00:39:57.860 I doubt it.
00:39:59.500 I doubt,
00:40:00.060 I would assume that they can batch them up
00:40:01.820 and just test, test, test,
00:40:03.160 and then everybody's just waiting for results.
00:40:06.060 Or at the very least,
00:40:07.220 you've got more than one test,
00:40:09.620 test station.
00:40:11.020 So yeah,
00:40:12.100 there might be ways to make that faster.
00:40:15.620 The New York Times calling for the DNC
00:40:18.060 to investigate Tara Reade's complaint.
00:40:22.640 How can they suggest such a thing
00:40:24.440 with a straight face?
00:40:25.440 It does make you wonder.
00:40:28.760 Yeah,
00:40:29.100 there's so much happening in the news
00:40:30.660 that you just look at it and you say,
00:40:32.320 do they really mean that?
00:40:35.420 Or are they just saying that
00:40:37.520 because they know it sounds good?
00:40:39.860 You really can't tell anymore,
00:40:41.160 can you?
00:40:41.540 You really can't tell
00:40:42.820 if people mean what they say
00:40:45.200 in any
00:40:45.780 in any
00:40:48.480 real way.
00:40:50.620 Do I follow
00:40:51.320 brilliant
00:40:51.940 geopolitical writer
00:40:53.520 Peter Zeihan?
00:40:55.340 I don't
00:40:55.940 I don't know if I follow him.
00:40:57.860 I've read his stuff,
00:40:58.780 but I'm not too familiar.
00:41:02.600 I answered about the armed protesters.
00:41:06.780 Sweet Caroline says,
00:41:08.440 do you think most grown people
00:41:10.000 will do what's proper
00:41:12.140 and if they want to work,
00:41:13.460 they should be able to?
00:41:14.860 And the answer is,
00:41:15.620 yes,
00:41:16.040 most people will do
00:41:17.500 most things properly,
00:41:19.500 but 20% will not.
00:41:22.240 If 20% don't do
00:41:24.060 what they should do,
00:41:25.940 is that good enough?
00:41:27.200 Will the other 80% be okay
00:41:29.260 if the 20% don't do
00:41:32.160 what they're supposed to do?
00:41:33.240 And the answer is,
00:41:33.800 they won't be.
00:41:34.920 When the 20% don't
00:41:36.860 do their social distancing,
00:41:39.640 et cetera,
00:41:40.480 it will be a problem
00:41:42.500 and it will extend
00:41:43.480 to other people,
00:41:44.500 but there also
00:41:45.800 isn't another way.
00:41:47.980 So I don't like to
00:41:49.140 complain about things
00:41:50.460 that can't change
00:41:51.640 as a general rule.
00:41:52.820 I probably do,
00:41:53.740 but it's a good rule
00:41:54.820 not to complain about
00:41:55.660 things that can't change.
00:41:57.560 And one thing
00:41:58.680 that can't change
00:41:59.620 is that you're not
00:42:01.040 going to get 100%
00:42:01.960 of adults
00:42:02.640 to act like adults.
00:42:04.120 There's no practical way
00:42:06.400 that that could ever happen.
00:42:07.860 So it shouldn't be part
00:42:09.320 of our planning
00:42:10.000 to assume that it would.
00:42:11.940 So assuming that
00:42:12.900 100% of people
00:42:14.700 or anything close to it
00:42:15.980 would obey social distancing
00:42:18.300 would be,
00:42:19.040 I think,
00:42:21.120 probably unwise.
00:42:22.680 20% failure rate
00:42:28.460 could be catastrophic,
00:42:30.060 but we don't know.
00:42:30.900 So again,
00:42:31.340 I'll throw this
00:42:31.880 in the category of
00:42:32.820 is there any expert
00:42:34.180 who can answer
00:42:34.760 this question?
00:42:35.880 No.
00:42:37.080 No,
00:42:37.480 there's no expert
00:42:38.160 who can answer that question.
00:42:39.240 It's just one of our
00:42:40.020 many unknowns.
00:42:40.700 Let's see.
00:42:47.060 The research professor
00:42:50.460 in Pittsburgh
00:42:51.060 that was on the verge
00:42:51.940 of making
00:42:52.340 significant findings
00:42:53.940 on the coronavirus was just found
00:42:56.680 murdered in his home.
00:42:58.720 What?
00:43:02.460 Is this new?
00:43:04.380 He was a coronavirus
00:43:05.400 researcher,
00:43:06.840 dead?
00:43:08.120 Huh.
00:43:10.700 Looks like a murder-suicide?
00:43:12.480 I don't know.
00:43:13.640 Questions?
00:43:14.380 I got questions.
00:43:15.180 Who knows?
00:43:17.140 What do your notes
00:43:18.120 look like
00:43:18.680 for these periscopes?
00:43:20.480 That's a funny question.
00:43:22.360 The answer is
00:43:23.180 I just print them out
00:43:24.940 before I start.
00:43:28.060 And if it looks like
00:43:28.840 there's a lot of text
00:43:29.660 on the page,
00:43:30.380 I didn't type that.
00:43:31.480 I just cut and paste
00:43:32.560 from news headlines
00:43:34.600 and then I
00:43:35.380 talk about
00:43:37.860 the headline.
00:43:38.820 So usually
00:43:40.820 I have
00:43:41.280 two pages
00:43:42.300 of notes
00:43:43.420 that are just
00:43:44.520 cut and paste
00:43:45.040 from what's happening.
00:43:46.600 Some tweets
00:43:47.160 and stuff.
00:43:51.060 Does the
00:43:52.020 Communist Party
00:43:53.060 of China
00:43:54.180 provide levels
00:43:55.120 of remuneration
00:43:56.280 to social media
00:43:57.160 giants
00:43:57.640 that should
00:43:58.140 concern us?
00:44:00.600 Well,
00:44:01.460 you know,
00:44:01.840 it's funny.
00:44:02.760 In every situation
00:44:04.800 that we know of
00:44:05.660 in life,
00:44:06.220 whoever has
00:44:07.660 the most money
00:44:08.580 in that situation
00:44:09.600 ends up
00:44:10.900 controlling it.
00:44:12.300 Now,
00:44:12.640 that's worked out
00:44:13.280 pretty well
00:44:13.820 for the United States
00:44:14.740 for a long time
00:44:15.600 because the United States
00:44:16.940 always had the most money.
00:44:18.800 So we could have,
00:44:19.880 you know,
00:44:20.060 the most influence
00:44:20.900 over institutions
00:44:21.980 because we funded them,
00:44:23.400 the most influence
00:44:24.280 over other countries
00:44:25.380 because we supported
00:44:26.680 them militarily.
00:44:27.840 So it's just a fact
00:44:29.280 that whoever has
00:44:30.680 the most money,
00:44:32.220 be they the billionaire
00:44:33.340 in the room
00:44:34.140 or the rich country,
00:44:36.680 they influence
00:44:37.680 things more.
00:44:38.380 There's no way
00:44:38.800 to stop that
00:44:39.500 because it's a power.
00:44:41.260 They have it.
00:44:41.880 They can use it.
00:44:42.620 It's legal.
00:44:43.220 It's going to happen.
00:44:44.500 The problem is
00:44:45.340 that China,
00:44:46.660 just simply by its size
00:44:48.080 alone,
00:44:49.780 will come to have
00:44:50.880 more money
00:44:51.420 than the United States.
00:44:53.100 So if the United States
00:44:54.100 and China both did,
00:44:55.260 let's say,
00:44:55.680 similarly well
00:44:56.660 with their economy,
00:44:57.860 just because of the
00:44:59.020 number of people
00:44:59.740 in China,
00:45:00.260 they would be,
00:45:02.060 is it four times
00:45:03.860 as big?
00:45:04.660 Three times as big?
00:45:05.700 Somebody do the math
00:45:06.460 for me.
00:45:07.240 Three times as big?
00:45:08.700 Three to four times
00:45:09.560 as big?
00:45:10.320 So China should have
00:45:11.740 something like
00:45:12.320 three to four times
00:45:13.280 more money
00:45:14.000 should they even
00:45:15.440 pull even with us
00:45:16.400 in economics,
00:45:17.500 you know,
00:45:17.840 per capita,
00:45:18.440 so to speak.
00:45:19.620 So in theory,
00:45:21.520 China will dominate
00:45:22.260 the world.
00:45:23.360 It's just math.
00:45:24.200 There's no way
00:45:24.880 it can not happen.
00:45:26.300 The only way
00:45:26.820 China could not
00:45:28.080 dominate the world
00:45:29.000 in the future
00:45:29.620 is if their economy
00:45:30.860 doesn't grow
00:45:31.720 as well as ours does.
00:45:34.440 And so in order
00:45:36.600 for the United States
00:45:37.620 to continue
00:45:38.740 its probably
00:45:40.060 oversized dominance
00:45:41.480 of a lot of things
00:45:43.100 that we don't even
00:45:43.680 know about,
00:45:44.840 you know,
00:45:45.020 you have to assume
00:45:45.680 that the United States
00:45:46.540 does what China is doing,
00:45:48.180 which is to use
00:45:48.900 their power and influence
00:45:49.860 and everything else
00:45:50.500 to, you know,
00:45:51.840 bias everything
00:45:52.820 that we can bias
00:45:53.720 in our direction.
00:45:55.300 In order to keep that up,
00:45:57.000 China has to do
00:45:57.940 less well in the long run.
00:46:00.460 One way they could
00:46:01.420 do less well
00:46:02.120 is we could
00:46:02.760 send them back
00:46:04.040 their stupid
00:46:05.340 spy drones
00:46:06.520 and we could
00:46:07.360 not buy stuff
00:46:08.220 in China
00:46:08.620 in the future.
00:46:09.620 So that would help.
00:46:12.260 So yes,
00:46:12.940 we should be very
00:46:13.680 worried about
00:46:14.420 their monetary
00:46:15.360 influence on social media
00:46:16.780 as well as
00:46:17.480 every other part
00:46:18.480 of our existence.
00:46:20.160 What will happen
00:46:24.100 to office politics
00:46:25.300 in the age of
00:46:26.480 mostly staying home,
00:46:29.080 I guess?
00:46:31.800 That's a good question
00:46:32.920 because I do wonder
00:46:36.380 what is the value
00:46:37.680 of all the in-person
00:46:39.260 interaction in the office?
00:46:41.540 You know,
00:46:41.760 the thinking has always been
00:46:43.220 that all of this
00:46:44.540 casual interaction
00:46:45.560 in the office
00:46:46.180 leads to positive
00:46:47.660 but unplanned benefits.
00:46:50.800 You know,
00:46:51.000 you cross-pollinate
00:46:51.980 with ideas
00:46:52.680 and you find somebody
00:46:54.360 to work with
00:46:54.980 and you get your answers
00:46:55.940 quickly and all that.
00:46:58.140 But nobody's ever
00:46:59.080 tested that.
00:47:00.620 I'm not aware
00:47:01.540 of any scientific
00:47:02.700 evidence
00:47:04.580 that would suggest
00:47:05.460 that being in the same office
00:47:06.680 gives you a better result.
00:47:08.340 We feel it
00:47:09.420 intuitively
00:47:10.080 like it's true
00:47:11.080 because you can
00:47:12.520 influence somebody
00:47:13.680 personally
00:47:15.020 in a way
00:47:15.500 that you can't
00:47:16.080 influence them
00:47:16.740 as well
00:47:17.240 over video.
00:47:18.580 So does that
00:47:18.980 make a difference?
00:47:19.960 We don't know.
00:47:21.400 We only know
00:47:21.920 it'll be different.
00:47:22.840 So it could be
00:47:23.300 that the politics
00:47:24.080 will decrease,
00:47:25.640 the time wasters
00:47:26.580 will decrease,
00:47:27.980 but there might be
00:47:29.040 other problems.
00:47:29.980 The work-at-homers
00:47:31.380 will start just not
00:47:32.320 answering their phone.
00:47:33.820 That would be my guess.
00:47:35.780 I think where it's all going
00:47:37.080 is that when it's new
00:47:38.320 and everybody's working
00:47:39.940 at home
00:47:40.320 and you're not used
00:47:41.780 to it yet,
00:47:42.840 you answer your phone
00:47:44.180 when it rings
00:47:44.760 and you answer your Zooms
00:47:46.860 and your Skypes.
00:47:48.420 But something tells me
00:47:49.520 that as we do
00:47:50.660 more and more work
00:47:51.500 at home,
00:47:52.000 everybody will learn
00:47:53.000 that the smart people
00:47:54.440 don't answer their phone.
00:47:56.860 And then what happens?
00:47:57.980 I don't know.
00:47:59.720 Will Biden be the
00:48:01.960 candidate in November?
00:48:03.500 I'd say 50-50.
00:48:05.500 But whether or not
00:48:06.440 he's actually the
00:48:07.580 candidate on the ticket
00:48:08.940 in November,
00:48:09.900 I do think people
00:48:11.020 will be looking
00:48:11.500 at the vice president.
00:48:13.960 Speaking of which,
00:48:15.740 oh my God,
00:48:16.660 did you see what
00:48:17.280 the president did
00:48:18.360 last night?
00:48:20.720 I think it was last night
00:48:21.900 on an interview.
00:48:22.660 Trump said that he thinks
00:48:27.620 Biden should pick
00:48:29.840 Elizabeth Warren
00:48:31.260 because she deserves it.
00:48:37.440 And I tweeted,
00:48:39.180 nothing makes me happier
00:48:40.920 than watching Trump
00:48:42.720 give Biden bad advice.
00:48:44.880 It is so diabolical.
00:48:47.280 It is so funny.
00:48:48.320 You know he does it
00:48:49.800 with like a twinkle
00:48:52.000 in his brain.
00:48:53.260 Like he's not letting on
00:48:54.400 that this is a joke.
00:48:56.420 But it is so diabolical
00:48:58.220 the way he gets in
00:48:59.180 the heads of competitors.
00:49:01.080 I would hate to golf
00:49:02.020 with him.
00:49:03.000 Have you ever thought
00:49:03.940 to yourself,
00:49:04.780 what would it be like
00:49:06.040 to just golf
00:49:07.860 with Trump?
00:49:09.540 Let's say you were
00:49:10.180 pretty good.
00:49:11.400 You were a good golfer.
00:49:14.180 Can you imagine
00:49:14.960 how much he would get
00:49:15.900 into your head
00:49:16.460 before you even hit
00:49:18.080 the first golf ball?
00:49:20.720 I can't imagine
00:49:21.740 he ever loses at golf
00:49:23.120 even if he plays
00:49:24.380 people who are better
00:49:25.180 because he probably
00:49:26.400 so messes with
00:49:27.300 their psychology
00:49:28.020 that by the time
00:49:28.800 they try to swing,
00:49:30.120 they're hitting
00:49:30.580 the ball backwards.
00:49:32.700 Anyway, here's the
00:49:33.340 beauty part
00:49:34.460 of suggesting that
00:49:35.440 Biden owes it
00:49:37.900 to Warren.
00:49:40.260 She deserves it.
00:49:42.120 Because when you use
00:49:42.960 language like that,
00:49:44.660 it's like emotional
00:49:45.660 language.
00:49:46.880 So he's trying,
00:49:47.740 of course,
00:49:48.180 to get the Democrats
00:49:49.860 spun up to fight
00:49:51.140 with each other,
00:49:52.180 which is so funny
00:49:53.400 that, like,
00:49:55.040 I saw that tweet,
00:49:56.580 I'm like,
00:49:56.940 oh, you just have
00:49:57.780 to keep doing this.
00:49:59.160 Because I think
00:49:59.980 it's the second time
00:50:00.740 now he's given
00:50:01.540 bad advice to Biden
00:50:03.160 but made it look
00:50:04.380 like he was serious.
00:50:06.080 Yeah, I got your back
00:50:06.960 on this whole
00:50:07.600 Tara Reade thing.
00:50:08.960 I think you should
00:50:09.580 go out there
00:50:09.980 and defend yourself.
00:50:10.720 And when you hear
00:50:12.900 the president's
00:50:15.000 bad advice to Biden
00:50:16.300 and you know
00:50:17.300 he's only doing it
00:50:18.240 because Biden
00:50:18.920 is mentally degraded,
00:50:21.340 let's face it,
00:50:22.520 he's mentally degraded,
00:50:24.120 these things
00:50:24.840 wouldn't work
00:50:25.580 with somebody
00:50:26.760 who is more capable
00:50:27.700 but Biden
00:50:28.800 is not that person.
00:50:30.120 So the president
00:50:31.000 can just totally
00:50:32.240 mess with the
00:50:32.940 psychology of their
00:50:33.880 situation,
00:50:35.240 do it with a
00:50:35.840 straight face,
00:50:36.620 and the news
00:50:37.700 doesn't even know
00:50:38.260 how to report this
00:50:39.100 because they can't
00:50:39.780 tell, I don't know,
00:50:41.260 is he kidding?
00:50:42.940 Was that a joke?
00:50:44.780 Is he trying
00:50:45.260 to mess with him?
00:50:46.800 Or does he really
00:50:47.580 think that Elizabeth
00:50:48.720 Warren would be,
00:50:49.900 what's going on here?
00:50:51.580 And of course
00:50:52.060 he's messing with him.
00:50:53.920 It should be
00:50:54.380 pretty obvious.
00:50:55.000 I don't have to
00:50:55.440 read his mind
00:50:55.940 to know that.
00:50:56.500 I would call that
00:50:57.680 an obvious one.
00:51:01.420 But here's the
00:51:02.280 beauty of it.
00:51:04.640 First of all,
00:51:05.420 Elizabeth Warren
00:51:06.400 would be the
00:51:06.940 worst choice
00:51:07.960 because she's
00:51:09.360 stronger than
00:51:10.120 Biden.
00:51:11.140 That's rule number
00:51:12.340 one of picking
00:51:13.980 your vice president.
00:51:15.500 You want the
00:51:16.320 public to say,
00:51:17.160 okay, of these
00:51:18.080 two people,
00:51:19.040 it's obvious which
00:51:19.980 one is the
00:51:20.500 presidential candidate
00:51:21.580 and it's obvious
00:51:23.040 which one is the
00:51:23.680 vice president
00:51:24.260 because they're
00:51:24.640 like a weaker
00:51:25.960 version of the
00:51:26.820 president, right?
00:51:27.580 If you see
00:51:28.520 Donald Trump
00:51:29.960 and Mike Pence,
00:51:32.320 you know who
00:51:32.920 is the president,
00:51:34.600 right?
00:51:34.860 I always say
00:51:37.220 good things
00:51:37.580 about Pence.
00:51:38.380 I don't agree
00:51:39.080 with him on a
00:51:39.560 lot of stuff,
00:51:40.700 you know,
00:51:41.140 on the religious
00:51:43.580 stuff, etc.
00:51:44.600 We don't see
00:51:45.380 eye to eye.
00:51:46.520 But Pence is a
00:51:47.580 good egg,
00:51:48.340 I think.
00:51:49.480 He's just a good,
00:51:51.080 solid citizen.
00:51:53.000 And he's a perfect
00:51:54.020 vice president
00:51:54.660 because he just
00:51:55.200 doesn't make too
00:51:55.800 many mistakes.
00:51:57.500 He didn't wear
00:51:58.120 that mask that
00:51:58.720 one time,
00:51:59.120 but that's no
00:51:59.500 biggie.
00:51:59.760 But if you
00:52:02.860 take Elizabeth
00:52:03.920 Warren and
00:52:05.320 put her side
00:52:06.000 by side with
00:52:06.640 Biden,
00:52:07.160 everybody's going
00:52:07.720 to say,
00:52:08.280 you see what's
00:52:10.700 wrong here,
00:52:11.280 right?
00:52:12.200 Elizabeth Warren
00:52:13.020 is like twice
00:52:13.920 as capable as
00:52:15.340 Biden.
00:52:15.720 It's not even
00:52:16.120 close.
00:52:17.700 Mentally,
00:52:19.200 if you were
00:52:19.820 to,
00:52:20.220 let's say you
00:52:21.160 were to
00:52:21.600 give them both
00:52:24.000 an SAT test
00:52:25.240 or an
00:52:26.320 LSAT
00:52:26.940 or an
00:52:28.180 IQ test,
00:52:29.580 Elizabeth
00:52:30.040 Warren versus
00:52:31.200 Biden,
00:52:32.240 that's not
00:52:33.700 going to be
00:52:34.080 close,
00:52:35.260 right?
00:52:35.640 Say what you
00:52:36.360 will about
00:52:36.980 your Elizabeth
00:52:37.600 Warrens,
00:52:38.680 and I have,
00:52:40.300 but she's
00:52:40.900 super smart.
00:52:42.560 Can't say the
00:52:43.180 same for Biden,
00:52:44.300 right?
00:52:44.720 So if you put
00:52:45.600 her as the
00:52:45.920 vice presidential
00:52:46.440 choice,
00:52:48.060 it just ruins
00:52:48.620 everything,
00:52:49.440 which is why
00:52:50.580 it's hilarious
00:52:51.160 that he would
00:52:51.660 say that she's
00:52:52.860 earned it.
00:52:54.080 He can't even
00:52:54.880 say no,
00:52:55.460 she's earned
00:52:55.900 it.
00:52:56.900 So if he
00:52:57.860 doesn't pick
00:52:58.540 her,
00:52:59.580 the president
00:53:00.180 has put it in
00:53:00.900 their heads
00:53:01.540 that he
00:53:02.880 picked somebody
00:53:03.480 who didn't
00:53:03.860 earn it.
00:53:07.000 So let me
00:53:08.340 further my
00:53:09.380 long-held
00:53:12.080 prediction by
00:53:13.560 saying this.
00:53:14.880 If you
00:53:15.600 picture Elizabeth
00:53:16.980 Warren and
00:53:17.560 Biden,
00:53:18.240 just in your
00:53:18.940 mind,
00:53:19.380 your mind's
00:53:19.880 eye,
00:53:20.160 picture them
00:53:20.560 side by
00:53:20.960 side.
00:53:21.680 As I said,
00:53:22.820 Warren looks
00:53:23.400 stronger.
00:53:23.880 Now picture
00:53:25.380 Kamala Harris
00:53:27.360 side by
00:53:28.580 side with
00:53:29.960 Biden.
00:53:31.020 Ah,
00:53:31.700 now you
00:53:32.400 see it,
00:53:32.780 don't you?
00:53:33.900 Now you
00:53:34.640 see it.
00:53:35.800 Kamala Harris
00:53:36.560 is close
00:53:38.300 to,
00:53:39.520 sort of
00:53:40.080 similar to
00:53:41.060 Biden.
00:53:41.540 She didn't
00:53:41.860 get far
00:53:42.380 enough in
00:53:42.840 the primaries
00:53:44.260 to be sort
00:53:45.400 of Elizabeth
00:53:46.160 Warren level.
00:53:47.420 She's one
00:53:47.960 level down.
00:53:48.580 She's just
00:53:50.620 about equivalent
00:53:51.640 with Biden,
00:53:53.740 but she's
00:53:54.520 strong enough
00:53:55.360 because she
00:53:55.920 went through
00:53:56.280 the process
00:53:56.840 of running
00:53:57.400 for president.
00:53:58.460 She's strong
00:53:59.300 enough that
00:54:00.160 if she were
00:54:00.860 to become
00:54:01.360 president,
00:54:02.900 people would
00:54:04.140 say,
00:54:04.660 ah,
00:54:05.060 that's
00:54:05.400 legitimate.
00:54:06.360 She was in
00:54:07.120 the primaries.
00:54:08.060 She made a
00:54:08.980 dent in the
00:54:09.540 primaries.
00:54:10.200 She's a
00:54:10.580 legitimate
00:54:11.020 presidential
00:54:11.800 candidate.
00:54:13.080 People took
00:54:13.680 her seriously.
00:54:14.540 She had a lot
00:54:14.960 of supporters.
00:54:15.480 She didn't
00:54:16.420 make it
00:54:16.720 that far
00:54:17.180 in the
00:54:17.440 primaries,
00:54:18.540 but she's
00:54:19.180 a serious
00:54:19.680 candidate.
00:54:20.320 Yes,
00:54:20.660 we accept
00:54:21.220 her as
00:54:21.680 our new
00:54:22.600 president if
00:54:23.360 that's the
00:54:23.760 way it's
00:54:24.040 going to
00:54:24.260 go.
00:54:25.780 So Kamala,
00:54:26.840 she checks
00:54:28.060 off every
00:54:28.560 box.
00:54:31.440 And somebody
00:54:32.280 says,
00:54:33.080 Scott is still
00:54:33.860 trying to
00:54:34.460 persuade his
00:54:35.300 prediction into
00:54:36.220 reality.
00:54:37.300 Well,
00:54:37.660 let me ask
00:54:38.060 you this.
00:54:38.780 Do you think
00:54:39.220 the Democrats
00:54:39.860 are listening
00:54:40.360 to me?
00:54:41.460 Do you think
00:54:42.120 there's anything
00:54:42.640 that I'll say
00:54:43.580 that could
00:54:44.800 persuade the
00:54:45.460 Democrats?
00:54:46.840 My assumption
00:54:47.980 is that they're
00:54:48.920 going to end
00:54:49.360 up there on
00:54:50.020 their own
00:54:50.560 because that's
00:54:51.660 just water
00:54:52.260 traveling downhill.
00:54:53.800 In other
00:54:54.160 words,
00:54:55.120 the path of
00:54:56.300 least resistance
00:54:57.220 isn't something
00:54:58.680 I created.
00:55:00.000 I'm not the
00:55:00.740 person who
00:55:01.180 created the
00:55:02.580 path.
00:55:03.540 I'm simply
00:55:04.280 describing it.
00:55:05.240 I'm just
00:55:05.520 standing there
00:55:05.980 saying,
00:55:06.280 look,
00:55:06.440 there's two
00:55:06.820 paths.
00:55:07.840 One of them,
00:55:08.380 the Elizabeth
00:55:09.040 Warren path,
00:55:09.560 is a brick
00:55:10.000 wall.
00:55:11.040 You can see
00:55:11.700 it.
00:55:12.520 I'm not
00:55:12.960 making the
00:55:13.720 brick wall
00:55:14.200 there.
00:55:14.520 I'm just
00:55:14.740 saying you
00:55:15.080 can see
00:55:15.400 it too.
00:55:16.600 I didn't
00:55:16.980 put that
00:55:17.400 brick wall
00:55:17.920 there.
00:55:18.980 Now look
00:55:19.440 at Kamala
00:55:19.840 Harris.
00:55:20.260 It's a
00:55:20.500 different
00:55:20.720 path.
00:55:21.820 Yeah,
00:55:22.080 there's
00:55:22.300 nothing
00:55:22.600 in that
00:55:23.020 path.
00:55:24.120 There's
00:55:24.580 nothing
00:55:25.160 stopping
00:55:25.620 her
00:55:25.900 from being
00:55:27.360 president.
00:55:28.700 If Biden
00:55:29.920 stays in,
00:55:31.920 picks her
00:55:32.180 as VP,
00:55:33.420 wins the
00:55:33.920 election,
00:55:34.420 I think
00:55:34.700 that's
00:55:34.940 unlikely.
00:55:36.200 But that
00:55:38.240 would be
00:55:38.500 the path.
00:55:39.020 Stacey
00:55:41.220 Abrams,
00:55:41.840 I think,
00:55:42.500 just can't
00:55:43.540 be taken
00:55:43.920 seriously.
00:55:46.980 Think about
00:55:47.980 the bad
00:55:48.680 feelings you
00:55:49.280 have for
00:55:49.740 Kamala
00:55:50.120 Harris.
00:55:51.380 Think about this.
00:55:52.340 Since you're
00:55:53.040 mostly Trump
00:55:54.240 supporters on
00:55:54.920 this Periscope,
00:55:55.860 think to
00:55:56.280 yourself,
00:55:57.520 think about
00:55:58.220 your emotional
00:55:58.840 feeling about
00:55:59.600 Kamala Harris.
00:56:00.520 And now ask
00:56:01.040 yourself,
00:56:01.760 how different
00:56:02.660 is that from
00:56:03.900 the emotional
00:56:04.520 feeling that
00:56:06.840 Democrats felt
00:56:07.880 when Trump
00:56:09.020 started becoming
00:56:10.400 a serious
00:56:11.100 candidate?
00:56:12.260 Think about
00:56:12.860 how they
00:56:13.320 felt about
00:56:13.900 him.
00:56:14.720 It was
00:56:14.880 like this
00:56:15.240 visceral,
00:56:16.820 it was
00:56:17.120 almost like
00:56:17.700 you disliked
00:56:19.060 him before
00:56:19.460 you had a
00:56:19.920 reason.
00:56:20.460 There was
00:56:20.560 something about
00:56:20.940 his personality,
00:56:22.120 his braggadocio,
00:56:23.880 his playing
00:56:25.440 loose with
00:56:25.940 the facts.
00:56:26.560 It was
00:56:26.660 like,
00:56:26.960 ah,
00:56:27.980 ah.
00:56:29.160 Now,
00:56:30.100 maybe you
00:56:30.680 and I
00:56:30.920 didn't feel
00:56:31.340 that,
00:56:32.360 but you
00:56:32.880 certainly
00:56:33.220 observed that
00:56:34.500 the Democrats
00:56:35.120 were having
00:56:35.520 this just
00:56:36.540 contempt
00:56:37.480 feeling,
00:56:38.660 this like,
00:56:39.200 you know,
00:56:39.560 almost a
00:56:40.140 bodily hatred
00:56:41.080 for Trump.
00:56:42.440 Now,
00:56:42.880 ask yourself,
00:56:43.540 did that
00:56:43.840 stop him
00:56:44.280 from becoming
00:56:44.800 president?
00:56:45.920 No.
00:56:47.000 Because that
00:56:47.720 thing that
00:56:48.260 made people
00:56:48.780 hate Trump
00:56:49.820 is just
00:56:51.400 the other
00:56:51.860 side of
00:56:52.560 the thing
00:56:52.900 that made
00:56:53.260 people fall
00:56:53.920 in love
00:56:54.260 with him.
00:56:55.480 And there
00:56:56.320 are people
00:56:56.920 who support
00:56:57.620 Trump who
00:56:58.100 are,
00:56:58.460 you know,
00:56:58.900 almost cult-like.
00:57:00.180 They just
00:57:00.460 love him.
00:57:01.300 And this
00:57:01.540 is true of
00:57:02.020 Obama and
00:57:02.640 other people
00:57:03.080 as well.
00:57:03.520 It's not
00:57:03.780 just Trump.
00:57:04.500 But he,
00:57:05.900 Trump causes
00:57:08.360 your emotions
00:57:09.340 to catch
00:57:09.980 on fire.
00:57:11.240 And you
00:57:11.960 either love
00:57:12.660 him or
00:57:14.060 you hate
00:57:14.500 him,
00:57:14.960 but there's
00:57:15.580 not a lot
00:57:16.040 in between,
00:57:17.100 which is
00:57:17.620 actually pretty
00:57:18.280 predictive of
00:57:19.120 somebody who
00:57:19.600 could become
00:57:20.060 president.
00:57:21.020 The one you
00:57:21.780 don't want
00:57:22.420 is where
00:57:22.840 everybody says,
00:57:23.440 well,
00:57:23.580 I don't
00:57:23.880 love him.
00:57:25.320 I don't
00:57:25.700 really hate
00:57:26.400 him.
00:57:27.240 He's just
00:57:27.960 in this
00:57:28.380 middle ground
00:57:29.120 somewhere.
00:57:29.960 That's where
00:57:30.280 Biden is.
00:57:31.120 That's why
00:57:31.480 Biden is so
00:57:32.680 weak,
00:57:32.980 because he
00:57:34.080 doesn't
00:57:34.360 make you
00:57:34.700 love him.
00:57:35.720 He doesn't
00:57:36.120 make you
00:57:36.460 hate him.
00:57:37.340 He just
00:57:37.800 exists.
00:57:39.280 That's
00:57:39.520 usually not
00:57:40.160 a good
00:57:40.400 sign of
00:57:40.840 future
00:57:41.160 leadership.
00:57:42.860 Kamala
00:57:43.200 Harris,
00:57:44.260 the moment
00:57:45.040 I mention
00:57:45.760 her name,
00:57:46.920 some of
00:57:47.440 you just
00:57:47.840 go enraged.
00:57:49.300 Yeah,
00:57:49.520 that's too
00:57:49.900 strong.
00:57:50.420 But you
00:57:50.920 have an
00:57:51.400 emotional
00:57:52.000 reaction to
00:57:52.880 her.
00:57:53.660 People
00:57:53.960 are saying,
00:57:54.440 ah,
00:57:54.760 she slept
00:57:55.340 her way
00:57:55.640 to the
00:57:55.900 top.
00:57:56.180 She put
00:57:58.120 people in
00:57:58.520 jail.
00:57:59.080 She lied
00:58:00.720 about some
00:58:01.340 racist stuff
00:58:02.100 she accused
00:58:02.700 somebody of.
00:58:03.480 She'll do
00:58:04.300 anything.
00:58:04.960 She's a
00:58:05.420 schemer.
00:58:06.640 Think about
00:58:07.700 what you
00:58:08.140 feel about
00:58:09.120 her.
00:58:11.340 Don't you
00:58:11.900 feel a
00:58:13.420 stronger
00:58:13.960 emotion,
00:58:14.820 negative,
00:58:16.120 about Kamala
00:58:16.840 than some
00:58:17.420 other people?
00:58:18.620 Ask yourself,
00:58:19.660 do you have
00:58:20.020 the same
00:58:21.000 feeling about
00:58:21.600 Kamala?
00:58:22.640 Just examine
00:58:23.840 your internal
00:58:24.440 feeling as
00:58:25.580 you did
00:58:25.960 for, say,
00:58:26.860 Cory Booker.
00:58:28.700 You might
00:58:29.360 have said,
00:58:29.800 oh, I
00:58:30.040 don't think
00:58:30.460 Cory Booker
00:58:31.000 should be
00:58:31.360 president.
00:58:32.180 But do you
00:58:32.680 have the
00:58:32.960 same visceral
00:58:33.980 feel for
00:58:35.300 Cory Booker?
00:58:36.300 I'll bet
00:58:36.560 you don't.
00:58:37.900 I'll bet
00:58:38.220 you say,
00:58:38.600 you know,
00:58:38.960 Cory Booker
00:58:39.420 looks like a
00:58:39.920 good guy.
00:58:40.720 He's just
00:58:41.060 not the
00:58:41.380 guy I'd
00:58:41.700 want for
00:58:42.020 president.
00:58:42.800 How about
00:58:43.100 Andrew Yang?
00:58:44.560 Does anybody
00:58:45.260 hate Andrew
00:58:46.460 Yang?
00:58:46.900 Probably not.
00:58:47.980 Probably
00:58:48.320 basically nobody.
00:58:50.220 But do you
00:58:50.720 get, like,
00:58:51.380 super excited?
00:58:52.340 Well, you
00:58:53.400 get excited in
00:58:54.220 a certain way
00:58:55.020 with Yang.
00:58:55.940 You know,
00:58:56.140 he's interesting
00:58:56.740 and stuff.
00:58:57.620 But he
00:58:57.900 doesn't cause
00:58:59.500 you to
00:58:59.860 love him
00:59:00.840 like some
00:59:02.080 people do.
00:59:03.220 You don't
00:59:03.520 hate him.
00:59:04.960 You don't
00:59:05.560 love him.
00:59:06.520 He's
00:59:06.820 interesting.
00:59:07.540 But he's
00:59:08.120 interesting in
00:59:08.900 the middle.
00:59:11.420 So, here's
00:59:12.480 my overall
00:59:15.280 statement of
00:59:15.840 that.
00:59:16.120 The fact that
00:59:16.840 you have such
00:59:17.300 a negative
00:59:17.780 feeling about
00:59:18.420 Kamala Harris
00:59:19.160 tells you there's
00:59:20.520 something about
00:59:21.280 her that's
00:59:22.260 reaching you
00:59:23.040 emotionally.
00:59:24.200 If it's
00:59:25.080 true that
00:59:26.300 there's another
00:59:26.820 side to that,
00:59:28.140 she might
00:59:28.540 actually be
00:59:29.180 able to
00:59:29.540 move Democrats
00:59:31.000 emotionally as
00:59:32.780 well.
00:59:33.580 So, look for
00:59:34.240 the person
00:59:34.700 who can make
00:59:36.380 you feel
00:59:37.060 something.
00:59:38.200 She definitely
00:59:39.020 makes you feel
00:59:39.820 something.
00:59:40.920 I don't know
00:59:41.480 if there's a
00:59:41.940 positive side of
00:59:42.780 that yet.
00:59:43.140 I haven't really
00:59:43.600 seen it.
00:59:44.460 But she
00:59:44.700 definitely
00:59:45.020 makes you feel
00:59:45.980 the negative
00:59:46.440 if you're on
00:59:47.220 the other
00:59:47.460 team.
00:59:48.620 So, follow
00:59:49.300 the power
00:59:50.140 of emotion.
00:59:51.320 Same with
00:59:52.060 AOC.
00:59:53.080 You can hate
00:59:53.760 everything about
00:59:54.360 AOC, but
00:59:55.860 just examine
00:59:56.680 the depth
00:59:57.300 of your
00:59:57.680 feeling.
00:59:58.740 That's the
00:59:59.300 thing that
00:59:59.780 is predictive.
01:00:01.080 It's the
01:00:01.440 depth of it,
01:00:03.320 not your
01:00:03.940 intellectual
01:00:04.460 opinions.
01:00:06.100 What is
01:00:06.780 your favorite
01:00:07.260 movie?
01:00:08.700 At least
01:00:09.100 at the
01:00:09.400 moment.
01:00:10.180 I dislike
01:00:11.020 movies as
01:00:11.840 an art form
01:00:12.600 because in
01:00:13.380 2020, they
01:00:14.160 just take
01:00:14.580 too long.
01:00:15.600 They're too
01:00:15.880 self-indulgent.
01:00:16.660 It's more
01:00:17.140 about the
01:00:17.560 director.
01:00:18.780 And they
01:00:19.000 put ridiculous
01:00:20.280 hackneyed
01:00:21.440 crap in
01:00:21.940 every movie
01:00:22.480 now.
01:00:23.360 If I see
01:00:23.880 one more...
01:00:24.460 By the way,
01:00:25.140 I have a
01:00:25.500 rule.
01:00:26.540 Let me tell
01:00:27.320 you my
01:00:27.640 rule for
01:00:28.480 watching
01:00:28.860 drama.
01:00:30.020 I will
01:00:30.600 turn on
01:00:30.960 a movie.
01:00:31.460 Okay, got
01:00:32.040 good ratings,
01:00:32.920 lots of
01:00:33.220 action.
01:00:33.920 I can watch
01:00:34.400 that sometimes.
01:00:35.540 As soon as
01:00:36.480 the star or
01:00:37.440 any part of
01:00:38.000 the movie is
01:00:39.340 tied to a
01:00:40.120 chair, I
01:00:41.420 turn it
01:00:41.800 off.
01:00:42.900 That's my
01:00:43.380 rule.
01:00:43.700 I've been
01:00:43.880 following this
01:00:44.380 for years.
01:00:45.300 Do you know
01:00:45.600 how many
01:00:45.880 movies I've
01:00:46.580 turned off
01:00:47.280 because at
01:00:48.460 some point
01:00:49.100 somebody is
01:00:50.200 tied to a
01:00:50.820 chair?
01:00:52.420 Yeah, 75%.
01:00:54.460 75% of
01:00:56.440 all action
01:00:57.140 movies I
01:00:58.280 turn off
01:00:59.000 when they
01:00:59.820 get to the
01:01:00.300 tied to the
01:01:00.880 chair part.
01:01:02.140 Because first
01:01:02.960 of all, I
01:01:03.400 don't want to
01:01:03.880 see it.
01:01:04.960 Everything that
01:01:05.680 happens with
01:01:06.220 that tied to
01:01:06.800 the chair part
01:01:07.460 is going to
01:01:08.300 be unpleasant.
01:01:09.060 I don't want
01:01:09.400 to think about
01:01:10.020 it.
01:01:10.600 I don't want
01:01:11.040 to imagine
01:01:11.420 it even as
01:01:12.100 a fiction.
01:01:13.040 I don't want
01:01:13.580 it in my
01:01:13.960 head.
01:01:15.020 I just
01:01:15.520 need to
01:01:15.860 fast forward
01:01:16.460 past that
01:01:17.000 part, but
01:01:18.040 I'm not
01:01:18.460 going to.
01:01:19.220 The fact that
01:01:19.900 you put that
01:01:20.500 in there and
01:01:21.120 you thought that
01:01:21.680 I wanted to
01:01:22.260 watch that, it's
01:01:23.680 just hackneyed,
01:01:24.600 it's boring,
01:01:25.900 it's redundant,
01:01:27.540 and if that's the
01:01:28.500 best you can do
01:01:29.220 for your movie,
01:01:30.000 I'm out.
01:01:30.880 If that's your
01:01:32.100 symbol of
01:01:32.640 creativity, "why
01:01:33.960 don't I do
01:01:34.280 what everybody
01:01:34.700 else does and tie
01:01:35.480 somebody to a
01:01:36.060 chair," then I'm
01:01:36.620 out.
01:01:36.780 All right, I'm
01:01:40.020 a little bit
01:01:40.500 over time here,
01:01:41.700 so I don't
01:01:42.060 want to go
01:01:42.300 too far.
01:01:44.420 Let's see if
01:01:45.040 there's one
01:01:45.380 more question.
01:01:47.400 How much
01:01:47.740 testing is
01:01:48.440 needed to
01:01:48.920 accomplish
01:01:49.360 something useful
01:01:50.300 in opening
01:01:51.000 the country?
01:01:52.160 My answer
01:01:52.840 is, and I'm
01:01:54.780 pretty confident
01:01:56.100 to this, let's
01:01:57.400 put my
01:01:57.760 confidence at
01:01:58.540 75, 80%.
01:02:00.460 So, you
01:02:01.480 know, a good
01:02:01.800 solid 20, 25%,
01:02:04.380 I might be
01:02:04.900 wrong about
01:02:05.420 this.
01:02:05.720 But my
01:02:06.500 current thinking
01:02:07.120 about testing
01:02:07.700 is we can't
01:02:08.380 get there at
01:02:10.400 all, that
01:02:11.160 there's no way
01:02:11.960 to test enough
01:02:13.700 that we will
01:02:15.020 consider in the
01:02:15.760 end when we
01:02:16.280 look back at
01:02:16.920 it, oh, thank
01:02:17.560 goodness we
01:02:18.780 had enough
01:02:19.200 testing and
01:02:20.480 the right kind
01:02:21.100 of testing and
01:02:21.880 we tested the
01:02:22.500 right people in
01:02:23.180 the right way
01:02:23.820 that we really
01:02:24.960 got a handle
01:02:25.540 on this thing.
01:02:26.700 My prediction
01:02:27.300 is there's
01:02:27.820 nothing that
01:02:28.260 looks even
01:02:28.720 remotely like
01:02:29.680 that's going
01:02:30.080 to happen,
01:02:30.800 not even a
01:02:31.380 little bit.
01:02:32.360 We're nowhere
01:02:32.960 near being able
01:02:34.880 to test in
01:02:35.600 the right
01:02:35.900 way, fast
01:02:36.680 enough, etc.
01:02:38.460 Now, I have
01:02:39.500 optimism in
01:02:40.700 humanity in
01:02:41.860 the sense that
01:02:42.420 maybe something
01:02:42.980 will get
01:02:43.440 invented, but
01:02:45.180 somebody would
01:02:45.600 have to invent
01:02:46.480 something that
01:02:47.840 could be scaled
01:02:48.540 up quickly that
01:02:49.860 doesn't yet
01:02:50.420 exist.
01:02:51.700 Is that going
01:02:52.460 to happen in
01:02:52.900 three months or
01:02:54.480 one month when
01:02:55.300 we need it?
01:02:55.980 I don't think
01:02:56.480 so.
01:02:57.360 Seems pretty
01:02:57.940 unlikely to me.
01:02:59.020 Could happen.
01:03:00.060 You know, I'd
01:03:00.420 like to keep
01:03:00.880 that solid
01:03:01.740 20%, maybe
01:03:03.460 it'll happen.
01:03:04.860 And I do
01:03:05.540 think, generally
01:03:06.660 speaking, that
01:03:07.320 humanity will
01:03:08.160 rise to the
01:03:08.820 challenge.
01:03:09.860 And as we're
01:03:10.240 looking at this
01:03:10.900 crushing amount
01:03:11.960 of death that
01:03:12.940 we're looking at
01:03:13.540 for going back
01:03:14.160 to work, I
01:03:16.060 would bet, if I
01:03:16.680 had to, that
01:03:18.240 we're going to
01:03:18.780 be super, super
01:03:20.020 clever in the
01:03:21.060 next 30 days,
01:03:22.360 and that we
01:03:23.360 will just
01:03:23.740 devise things,
01:03:25.840 you know,
01:03:26.080 mechanisms,
01:03:27.060 systems,
01:03:28.260 technologies,
01:03:29.080 inventions.
01:03:30.060 I mean, it's
01:03:30.540 going to look
01:03:30.940 crazy in the
01:03:32.080 next 30 days of
01:03:33.140 how much human
01:03:34.760 innovation just
01:03:36.440 pops out of
01:03:36.980 this.
01:03:37.460 Some of it
01:03:38.140 might be
01:03:38.980 important and
01:03:39.920 make a difference.
01:03:40.640 I don't think
01:03:41.060 it'll be the
01:03:41.440 testing.
01:03:42.280 If you had to
01:03:42.920 guess, testing
01:03:44.320 is just one of
01:03:45.020 many variables
01:03:45.920 that could make
01:03:46.740 a difference.
01:03:47.620 So what are
01:03:48.340 the odds it's
01:03:48.880 that one single
01:03:49.600 one?
01:03:50.000 I'm going to
01:03:50.340 bet against it.
01:03:51.340 So I'm going
01:03:51.700 to say the
01:03:52.080 testing will
01:03:52.640 not be your
01:03:53.100 answer.
01:03:53.580 And I say
01:03:53.960 the same
01:03:54.280 thing about
01:03:54.680 vaccines.
01:03:55.640 I don't
01:03:55.820 think the
01:03:56.160 vaccine hope
01:03:57.880 is real,
01:03:58.560 frankly.
01:03:59.060 I don't
01:03:59.240 think it's
01:03:59.540 real.
01:04:00.440 I think
01:04:00.680 that's just to
01:04:01.100 make you
01:04:02.560 feel good.
01:04:04.280 It might
01:04:05.000 work.
01:04:05.880 But if we
01:04:06.400 can come up
01:04:06.900 with a
01:04:07.160 coronavirus
01:04:07.560 vaccine for
01:04:08.540 the first
01:04:08.940 time in
01:04:09.300 human history,
01:04:10.380 well,
01:04:10.660 wouldn't that
01:04:11.020 be surprising
01:04:12.220 if you know
01:04:12.960 what I mean?
01:04:15.240 All right.
01:04:18.220 Does it
01:04:18.760 make sense
01:04:19.180 for Bill
01:04:19.540 Gates to
01:04:20.080 speak in
01:04:20.600 favor of
01:04:21.300 the
01:04:23.340 Chinese
01:04:23.340 government's
01:04:23.880 handling of
01:04:24.860 the plague?
01:04:26.380 And the
01:04:26.920 answer is,
01:04:27.580 I'm going
01:04:27.980 to back
01:04:28.460 Bill Gates
01:04:29.300 every time
01:04:30.740 he says
01:04:31.300 something that
01:04:32.840 is true,
01:04:34.120 even if you
01:04:35.260 don't like
01:04:35.720 it.
01:04:37.500 So is it
01:04:38.440 true that the
01:04:39.380 Chinese government
01:04:40.720 acted aggressively
01:04:41.920 and decisively
01:04:43.640 in a way that
01:04:45.000 only they could
01:04:45.920 and probably
01:04:47.540 made a big
01:04:48.540 difference in
01:04:49.220 the plague?
01:04:50.180 I would
01:04:50.420 say yes.
01:04:51.460 I would
01:04:51.800 agree with
01:04:52.540 Bill Gates
01:04:54.860 that as far
01:04:55.360 as we can
01:04:55.800 tell, we
01:04:57.260 might find
01:04:57.760 out something
01:04:58.180 later.
01:04:59.300 But as far
01:04:59.820 as we could
01:05:00.240 tell, it
01:05:01.440 looks like
01:05:01.940 China
01:05:02.400 effectively
01:05:04.760 handled it,
01:05:06.000 even if you
01:05:06.580 don't like the
01:05:07.120 way they did
01:05:07.660 it.
01:05:08.080 Even if you
01:05:08.640 don't like the
01:05:09.160 fact that they
01:05:09.680 locked people
01:05:11.440 in their houses
01:05:12.500 until they
01:05:12.900 starved or
01:05:13.440 whatever the
01:05:13.800 hell happened.
01:05:15.060 If you don't
01:05:15.440 like the fact
01:05:17.280 that... there's lots
01:05:17.280 of stuff
01:05:18.200 not to
01:05:18.620 like.
01:05:19.880 But what I
01:05:20.620 like about
01:05:21.160 Bill Gates
01:05:21.720 is he'll
01:05:22.080 give you the
01:05:22.640 difficult answer
01:05:23.520 if it's also
01:05:24.260 true.
01:05:25.620 It's a difficult
01:05:26.640 answer.
01:05:27.840 Because do I
01:05:28.940 want to say
01:05:29.540 in public,
01:05:30.260 yeah, I got
01:05:32.020 my criticisms,
01:05:33.160 but if you
01:05:33.900 looked at the
01:05:34.360 big picture,
01:05:35.760 the Chinese
01:05:36.340 government took
01:05:37.280 care of
01:05:37.580 business,
01:05:38.420 maybe not in
01:05:39.220 the way their
01:05:39.580 citizens wanted
01:05:40.420 them to,
01:05:41.540 probably a lot
01:05:42.640 of victims
01:05:43.180 out of that,
01:05:43.740 but you
01:05:45.020 got to say
01:05:45.840 it looks like
01:05:47.640 they got
01:05:48.000 past it,
01:05:48.960 maybe.
01:05:50.040 So I
01:05:51.060 favor Bill
01:05:51.840 Gates saying
01:05:52.460 things that
01:05:52.920 are honest.
01:05:54.620 That's one
01:05:55.240 of his
01:05:55.560 primary
01:05:56.640 benefits to
01:05:58.300 the country,
01:05:58.760 I think,
01:05:59.120 is that he
01:05:59.900 does tell
01:06:00.340 you what
01:06:00.720 actually is
01:06:02.640 true.
01:06:04.500 So he's
01:06:08.260 also the
01:06:08.720 largest donor
01:06:09.640 to the
01:06:10.040 World Health
01:06:10.440 Organization
01:06:11.040 and has
01:06:11.780 conflicts of
01:06:12.460 interest.
01:06:12.740 Well, here's
01:06:13.480 the thing.
01:06:14.640 Tell me, if
01:06:15.240 you will, what
01:06:15.820 you think
01:06:16.180 Bill Gates'
01:06:16.980 intention is.
01:06:18.860 If I thought
01:06:19.860 that Bill Gates
01:06:20.500 was trying to
01:06:21.120 make money
01:06:21.820 or influence
01:06:24.480 politics, I
01:06:26.660 would have a
01:06:27.160 very different
01:06:27.840 opinion of him.
01:06:29.000 But you know
01:06:29.960 that's not
01:06:30.520 true, right?
01:06:32.340 You know
01:06:33.120 Bill Gates is
01:06:33.840 not in this
01:06:34.420 for the money,
01:06:35.420 don't you?
01:06:36.720 I mean, really?
01:06:37.720 Don't you know
01:06:38.380 that?
01:06:38.700 That he's not
01:06:39.140 in this for the
01:06:39.720 money?
01:06:40.320 Because it
01:06:40.780 kind of wouldn't
01:06:41.240 make sense for
01:06:42.360 him to be in
01:06:42.840 it for the
01:06:43.160 money?
01:06:44.040 I mean, not
01:06:44.520 this.
01:06:45.580 You think he
01:06:46.060 decided to be
01:06:46.660 the richest guy
01:06:47.280 in the world
01:06:47.720 and go spend
01:06:48.360 a lot of
01:06:48.700 time working
01:06:49.240 on African
01:06:49.960 toilet design,
01:06:51.860 which he
01:06:52.500 does, because
01:06:54.120 he's trying to
01:06:54.540 make some
01:06:54.820 money on
01:06:55.360 toilets?
01:06:56.420 No.
01:06:57.340 No.
01:06:59.020 Bill Gates
01:06:59.740 is the real
01:07:00.380 deal.
01:07:01.780 And maybe the
01:07:02.880 reason I can
01:07:03.420 see it, in
01:07:05.560 my opinion,
01:07:06.340 more clearly,
01:07:07.060 is because
01:07:09.100 I'm the
01:07:09.860 smallest version
01:07:11.560 of him,
01:07:12.780 meaning that
01:07:13.380 I've experienced
01:07:14.700 making my
01:07:15.580 own money
01:07:16.140 and then
01:07:17.000 running out
01:07:17.480 of things
01:07:17.840 to do.
01:07:19.380 I just
01:07:20.000 talked about
01:07:20.460 it earlier.
01:07:21.280 I took care
01:07:21.960 of myself,
01:07:23.280 now what
01:07:23.800 do I do?
01:07:24.940 And it's
01:07:25.320 quite natural
01:07:26.120 that you're,
01:07:27.220 because you're
01:07:27.660 a human living
01:07:28.520 in a human
01:07:29.420 society,
01:07:30.200 that the next
01:07:31.180 thing you say
01:07:31.740 is, oh,
01:07:32.740 well, if I took
01:07:33.260 care of myself
01:07:33.900 and my family,
01:07:35.020 let's see what I
01:07:35.940 can do for the
01:07:36.420 world.
01:07:37.440 So, I
01:07:38.520 guarantee it.
01:07:40.440 There are very
01:07:40.900 few things I
01:07:41.480 would say with
01:07:41.940 100% certainty.
01:07:43.740 This will be
01:07:44.340 one of them.
01:07:45.340 I guarantee
01:07:46.460 it, 100%,
01:07:48.500 that Bill Gates
01:07:49.700 is the real
01:07:50.320 deal, meaning
01:07:51.600 that he's only
01:07:52.500 doing it to
01:07:53.180 help the
01:07:53.520 world, period.
01:07:55.060 Nothing else.
01:07:56.180 No, you
01:07:56.640 could say, oh,
01:07:57.340 it's good for
01:07:58.740 him, too, because
01:07:59.700 people will feel
01:08:01.560 good about his
01:08:02.140 legacy or
01:08:02.700 whatever.
01:08:03.080 Okay, sure.
01:08:04.660 I mean, that's
01:08:05.080 part of it,
01:08:05.580 too.
01:08:05.740 But it's not
01:08:06.500 the reason.
01:08:07.660 It's not the
01:08:08.120 reason.
01:08:08.640 I don't do
01:08:10.080 anything for a
01:08:10.660 legacy, because I
01:08:12.420 expect to be
01:08:13.000 dead.
01:08:14.160 I don't think
01:08:14.680 Bill Gates is a
01:08:16.440 believer.
01:08:17.780 I don't think he
01:08:18.620 believes in the
01:08:19.080 afterlife.
01:08:19.700 I'm not positive
01:08:20.260 about that.
01:08:20.880 I don't know if
01:08:21.280 he's talked about
01:08:21.860 it.
01:08:22.880 But I don't
01:08:25.180 think he's playing
01:08:27.060 for his legacy.
01:08:28.640 I think he's
01:08:29.340 playing for right
01:08:30.000 now.
01:08:31.300 I think he's the
01:08:31.980 real deal.
01:08:32.360 That's it.
01:08:33.540 I'll talk to you
01:08:34.380 tonight.
01:08:35.480 You know where,
01:08:36.560 you know when.