Real Coffee with Scott Adams - August 04, 2020


Episode 1081 Scott Adams: The Difference Between Mental Illness and an Opinion on Coronavirus Is Shrinking, Biden Decomposes, Axios, More


Episode Stats

Length

49 minutes

Words per Minute

149.1

Word Count

7,453

Sentence Count

485

Misogynist Sentences

2

Hate Speech Sentences

11


Transcript

00:00:00.000 Bum bum bum bum bum bum, ba dum bum bum, ba bum bum bum bum, ba bum bum bum bum, ba bum bum bum bum,
00:00:09.620 Good morning, everybody. It's time for Coffee with Scott Adams, the best time of the day,
00:00:17.200 except for any other times that I come on Periscope, which is the other best time of the day.
00:00:23.420 And what better way to get the day going? We've got all kinds of fun stuff to talk about.
00:00:30.560 Sure, there are tragedies in the world, but you can't spend all your time thinking about tragedy.
00:00:36.900 Sometimes, you've got to get a little relief.
00:00:40.980 You've got to give your brain a break.
00:00:42.800 That's why you're here, to give your brain a break.
00:00:45.760 We're going to talk about the fun and stupid and funny parts of the world.
00:00:50.060 But first, in order to enjoy it fully, I recommend that you find yourself a cup or a mug or a glass, a tank or a chalice or a stein, a canteen jug or a flask, a vessel of any kind.
00:01:04.760 Fill it with your favorite liquid.
00:01:06.700 I like coffee.
00:01:08.340 And join me now for the unparalleled pleasure, the dopamine hit of the day, the thing that makes everything better.
00:01:16.340 It's called the simultaneous sip, and it happens now.
00:01:19.200 Go!
00:01:20.060 It feels as though there's still the same amount of stupidity in the world, but it's not bothering me as much.
00:01:32.480 That's what the simultaneous sip does for you.
00:01:35.320 Yeah.
00:01:36.140 Let's talk about some fun things.
00:01:37.940 There's a company that has developed a face mask that can translate into other languages when you talk.
00:01:47.020 How cool is that?
00:01:48.460 Don't you want a face mask that can translate into other languages while you talk?
00:01:54.540 I would never talk English again.
00:01:57.720 I'd just walk around talking in other languages and see if other people knew who I was.
00:02:02.620 But I think this mask that can translate into other languages probably is the beginning of the cyborg age where we just cover up all of this stuff, all this head stuff.
00:02:16.560 You know, once you've got something over your eyes, which will be your augmented reality glasses, you've got something shoved in your ears, which will clearly be your hearing devices.
00:02:27.440 You've got some kind of a hat because of the sun.
00:02:29.880 You've got some sunscreen on you, which in a way is sort of a chemical modification.
00:02:35.440 And then you've got your face mask on, you're good to go.
00:02:39.740 Full Android capability.
00:02:41.420 The other thing I would like to see is I'd like to see a mask with echo cancellation so that I could have a conversation with my mask on that only goes through my phone and nobody around me can hear it, even if they're sitting right next to me.
00:02:57.740 Is that possible?
00:02:58.740 Probably not, but I'd like to have it.
00:03:00.820 And how about a mask that lets you speak commands into the mask to operate anything that's voice controlled, including your phone?
00:03:11.420 Yes.
00:03:12.560 Full cyborg.
00:03:15.000 It's coming.
00:03:17.000 Well, yesterday the president announced that they would make permanent the telehealth regulations.
00:03:24.140 I guess they made them during the pandemic, which we're still in.
00:03:29.700 They made telehealth legal across state lines.
00:03:33.980 There may have been some other things that they did, but now that's permanent.
00:03:39.960 Now, how hard would it have been to make telehealth permanent across state lines if we had not had a pandemic?
00:03:49.700 It hadn't happened yet.
00:03:51.840 It was the most obvious thing you could ever do.
00:03:56.300 I mean, it's obviously good for the public.
00:03:58.440 It's obviously going to lower health care costs.
00:04:02.040 It's obviously going to make health care more available.
00:04:05.220 But without the pandemic and the opportunity that a crisis created, I don't know if it would have happened, or at least not so soon.
00:04:15.220 It's a pretty big deal.
00:04:16.480 So you see the president now sort of assembling the parts of something that would look sort of like a health care, I don't want to call it a plan, but more of a health care series of initiatives and executive orders and whatnot that are all designed collectively to create more competition, more transparency, and fewer regulations.
00:04:45.160 Now, if you said to me, can we get to universal health care, or something like it, by just taxing everybody, I'd say, oh, I don't know.
00:04:54.320 Sounds pretty expensive.
00:04:55.760 Plus, it might not ever pass.
00:04:57.280 But can we lower the cost of health care by making it a more competitive industry, by doing a variety of things that just remove competitive roadblocks?
00:05:08.200 To which I say, apparently so.
00:05:11.540 Apparently so.
00:05:12.660 I don't know that this gets us to full coverage.
00:05:15.520 But if you don't get the cost of health care down, how could you ever talk about covering everybody?
00:05:21.260 I mean, you know, if the president, let me put it in stark terms, prior to the pandemic, and let's say we get back to there in a year or so, I think something like 9% of the public was not covered by health care.
00:05:39.860 And although this next thing I'm going to say doesn't make perfect sense, it just gives you a sense of the magnitude, that's all I'm trying to do.
00:05:49.840 If we could cut the cost of health care by 20% for everybody, that kind of, in just a conceptual way, frees up money that would be enough to cover everybody.
00:06:00.780 Now, if I save money on my health care, that doesn't mean I'm going to pay more taxes to cover somebody else.
00:06:07.940 But you can see that it is probably more important to bring the total cost of health care down first to have any chance, any chance of covering everybody.
00:06:19.040 And I do think we should cover everybody.
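
As a sanity check on that order-of-magnitude claim, here is the back-of-the-envelope arithmetic in a few lines of Python. The numbers are hypothetical round figures, not real budget data, and the same-per-person-cost assumption is doing a lot of the work.

```python
# Back-of-the-envelope check of the claim above. All figures are
# hypothetical round numbers, not real budget data.
total_spend = 100.0      # index current national health spend at 100
uncovered_share = 0.09   # ~9% of the public uncovered (from the episode)
cost_cut = 0.20          # hypothetical 20% across-the-board cost reduction

savings = total_spend * cost_cut
# Covering the uncovered 9% at the same per-person cost as the covered 91%:
cost_to_cover = total_spend * uncovered_share / (1 - uncovered_share)

print(f"freed up: {savings:.1f}, cost to cover everyone: {cost_to_cover:.1f}")
# -> freed up: 20.0, cost to cover everyone: 9.9 (on this toy scale)
```

On those made-up numbers the savings more than cover the gap, which is all the "conceptual" version of the argument needs.
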
00:06:20.520 Let me ask you this.
00:06:27.520 You know, there's a big question, a conversation, about whether illegal citizens, in other words, undocumented people, should get free health care,
00:06:41.760 because apparently they can walk into the emergency room and get it.
00:06:45.240 And I suppose if they don't have any health insurance, then maybe the hospital doesn't get paid.
00:06:53.160 So what if there was some kind of deal where any kind of undocumented person who got health care had to give a DNA sample?
00:07:04.540 And the DNA sample, you could anonymize it, so you don't necessarily have to know it came from this person.
00:07:11.680 Oh, but maybe you do.
00:07:12.880 Yeah, let's say you do.
00:07:13.660 You do know where it came from.
00:07:16.180 And let's say that's the cost of free health care.
00:07:19.080 Yeah, we'll give you free health care, but we've got to get a sample of your DNA.
00:07:23.740 Now, there would be two benefits from that.
00:07:25.660 One, crime in general would be much reduced the more DNA we have from people.
00:07:34.500 But also, if we have massive DNA samples, we can find who is more or less susceptible to coronavirus.
00:07:46.300 Who is more or less susceptible to this or that.
00:07:49.540 So the health care outcomes, I'm just going to, this is just pure speculation.
00:07:56.640 I won't make this a claim.
00:07:58.360 It's just something to think about.
00:07:59.780 Is the opportunity for improving not only crime solutions but also health care big enough, if you had such a large group of DNA that you were collecting from non-citizens, and it was just sort of their part of the deal?
00:08:17.300 You know, it was, in a sense, they would give up privacy in this one way in return for helping the outcomes of all the other undocumented immigrants.
00:08:28.360 Because whatever health outcomes were good for the country and the world would be good for everybody.
00:08:35.640 I'll just put it out there.
00:08:36.860 Because, you know, you don't always have to pay money for a service.
00:08:40.440 Perhaps you could pay in terms of your DNA.
00:08:44.340 Because it does have a pretty, pretty large economic value.
00:08:49.860 But obviously you're not going to give it up unless you have to.
00:08:53.420 Or unless you volunteer to.
00:08:55.660 Just put it out there.
00:08:59.740 I was watching a clip.
00:09:01.380 Do you remember when Biden had this televised event where he just talked to Obama?
00:09:09.060 So it was just Obama and Biden together having a conversation for a campaign event.
00:09:15.600 And I was listening to that yesterday.
00:09:18.020 One of the things that Obama said was so telling.
00:09:22.980 Just listen to this.
00:09:24.840 I wrote it down.
00:09:26.420 I think I got it approximately right.
00:09:28.600 And Obama said, the thing I'm confident about is your heart.
00:09:32.240 So he was talking to Biden.
00:09:34.140 The thing I'm confident about is your heart.
00:09:36.720 Now, that's an interesting choice of words.
00:09:40.180 And I think I've told some of you before that hypnotists learn that people reveal their hidden thoughts in their choice of words.
00:09:50.740 So if you look at the choice of words as opposed to what the sentence says,
00:09:55.400 you can often get a meaning opposite to the one on the surface.
00:09:59.100 In this case, can you think of any situation in which Obama was confident in general about Biden?
00:10:08.480 Just confident about everything.
00:10:10.420 Confident about his decision making.
00:10:12.780 Confident about his health.
00:10:14.580 Confident about his policies.
00:10:16.240 Confident about his heart.
00:10:18.060 Would he use this choice of words?
00:10:20.740 The thing I'm confident about is your heart.
00:10:23.240 Because I think if I were confident about everything, I'd say something like,
00:10:29.560 you know, I've never been more confident in a candidate to be the right choice.
00:10:34.280 Right?
00:10:34.960 You'd say something like that.
00:10:37.260 I feel like Obama is signaling as clearly as you can that he's not confident in Biden's brain.
00:10:45.560 That seems really, really clear if you read between the lines.
00:10:50.600 Of course, we can't know we're right, but I know I'm right.
00:10:58.960 Isn't that the way?
00:11:00.240 The American way is to have no way to know that you're right,
00:11:03.740 but you still feel completely confident.
00:11:06.620 Just the same.
00:11:07.960 You should adjust your confidence and my confidence by knowing that I couldn't know what anybody's thinking.
00:11:15.400 I can't read his mind.
00:11:16.400 But it is a generally useful thing to look at choice of words.
00:11:22.100 And you can definitely beat the averages if you're guessing that you know what's going on.
00:11:28.000 All right.
00:11:29.220 The American Journal of Political Science published a correction.
00:11:34.020 So that's not too unusual, right?
00:11:35.720 I think half of published papers end up not being true.
00:11:38.840 So they published a correction this year saying that a paper from 2012,
00:11:44.620 oh, that was a long time ago, 2012, has an error.
00:11:48.920 And here's what the error was.
00:11:51.120 They had done a study and they decided that conservatives were ranked higher on psychoticism.
00:11:59.040 Now, I'm not sure exactly what psychoticism is, but it doesn't sound good.
00:12:05.320 And it turns out that when somebody reviewed their work,
00:12:09.260 they had some kind of a math or analytical error, and it was actually the opposite.
00:12:15.720 So since 2012, there have been 45 different citations and articles or whatever
00:12:21.140 saying that conservatives have been shown to have higher psychoticism,
00:12:27.780 but they had actually just flipped it.
00:12:30.220 It was actually liberals who had more psychoticism.
00:12:34.120 And since 2012, that study has been used as something that tells you something is true.
00:12:41.700 It was just reversed.
00:12:43.200 This is a subset of my theory that all data are wrong.
00:12:48.560 Now, I like to say all data is wrong.
00:12:51.140 Because as a professional writer, I'm one of the people who is responsible
00:12:55.640 for putting things into common usage.
00:13:00.400 Proper English, data is plural, so you'd say the data are wrong.
00:13:05.760 It just sounds like a douchebag.
00:13:07.960 I'm sorry, you just sound like a douchebag when you say the data are wrong.
00:13:13.460 If you say the data is wrong, you don't sound like a douchebag,
00:13:19.320 but you are technically incorrect.
00:13:21.140 Now, it's my responsibility, as I said, as a professional writer,
00:13:25.640 to give cover to the rest of you.
00:13:29.780 So this professional writer is going to start saying the data is wrong
00:13:35.020 because it just sounds better.
00:13:37.440 I'm sorry, it just sounds better.
00:13:39.260 And I get to make that choice.
00:13:41.880 You get to tell me I'm wrong, but just understand I'm doing it intentionally.
00:13:47.160 That rule has to change.
00:13:50.360 So I'll go first.
00:13:52.680 The data is wrong.
00:13:54.540 Anyway, my point was, it's basically true that all of our data for public decisions is wrong.
00:14:02.880 Let me say that again.
00:14:05.400 It's essentially true that all of the data we use for public decisions is wrong.
00:14:13.320 And not just wrong in a little way.
00:14:15.880 Wrong as in the reverse.
00:14:18.580 Wrong as in it makes you look on the entirely wrong planet for a solution.
00:14:22.480 I mean, wrong in the most fundamental way anything can get wrong.
00:14:27.540 And it's pretty much everything.
00:14:29.920 Almost everything.
00:14:30.840 And you keep that frame in mind because we're coming from a world only months ago
00:14:38.620 where we thought, okay, obviously sometimes data is wrong.
00:14:44.380 But most of the time, smart people are looking at it and the critics have looked at it
00:14:50.320 and there's peer review and you've got respected journalists.
00:14:54.960 The New York Times is reporting it.
00:14:57.600 So yeah, I get it.
00:14:58.940 But sometimes information can be wrong.
00:15:02.260 But not most of the time.
00:15:04.260 I mean, most of the time it's right.
00:15:06.960 Right?
00:15:08.420 No.
00:15:09.580 It's wrong all the time.
00:15:11.540 It's always wrong.
00:15:12.940 And until you realize that the data is always wrong for public decisions.
00:15:19.180 Sometimes for science too.
00:15:20.860 But for the public decisions, it's always wrong all the time.
00:15:23.700 And the reason is that you never see data unless it gets filtered through a person.
00:15:28.940 And that person is either incapable or unwilling to tell you the truth,
00:15:33.720 but they will certainly tell you what they want you to believe.
00:15:37.280 And if you hear my dog sleep barking, that's what that little yelping is.
00:15:41.300 So just keep that in mind.
00:15:45.340 It's very similar to the hypnotist inversion, if I could call it that.
00:15:52.300 When I first took hypnosis, I went into the class assuming a similar thing,
00:15:58.220 which is that 90% of the time people are rational and they're using their logical facilities.
00:16:04.420 And maybe 10% of the time there's maybe one issue that will make one of us crazy.
00:16:10.880 But in general, we're these rational, clear-thinking people.
00:16:15.360 Just sometimes we get a little crazy.
00:16:17.820 Once you learn hypnosis, you realize it's the reverse.
00:16:21.360 That we're 90% irrational.
00:16:24.240 And that little speck of time that we're rational,
00:16:27.260 it's only because we don't care about it.
00:16:28.880 But we're only rational when we don't care.
00:16:32.480 It's just an emotion-free decision.
00:16:35.820 Oh yeah, 2 plus 2 does equal 4.
00:16:38.160 I'll take 4.
00:16:40.360 So just keep that in mind, that the data is always wrong.
00:16:44.440 And we learned that in 2020, but it's a tough lesson.
00:16:48.440 We'll talk more about that in a minute.
00:16:49.660 I've decided that it's getting harder and harder to treat obvious mental illness as an opinion.
00:16:58.720 The Antifa people, we talked about this before,
00:17:02.400 there may be some people who are technically sane.
00:17:05.980 But the people getting arrested, the troublemakers,
00:17:09.260 a lot of the most active, that's just mental illness.
00:17:12.420 Can we stop saying that they have a different political opinion?
00:17:15.440 Can we stop saying they have a plan?
00:17:17.920 You know, chaos and anarchy are not really a plan.
00:17:25.040 That's mental illness.
00:17:27.480 Until somebody can describe how anarchy gets you to a better place for anybody,
00:17:33.800 including the anarchist,
00:17:35.320 it's not really a difference of opinion.
00:17:38.460 It's just mental illness.
00:17:41.480 And there's sort of a temporary version of this,
00:17:45.280 you know, that's Trump derangement syndrome.
00:17:49.020 I don't think we have the luxury of thinking of Trump derangement syndrome anymore
00:17:55.800 as just a difference of political opinion.
00:17:59.140 It really isn't.
00:18:00.900 There's something else going on.
00:18:03.260 Now, that doesn't mean that those people who have Trump derangement syndrome
00:18:06.480 have an organic mental problem.
00:18:09.120 But they might have a mental problem in the same way that PTSD can scramble your brain,
00:18:16.860 the same way that some kind of fear or abuse, I guess, gets you to PTSD.
00:18:23.380 But the point is that your lived experience can make you crazy about some topics.
00:18:28.480 And I really think that when we talk about this as differences of political opinion
00:18:35.340 or differences in who can analyze things better
00:18:38.580 or who has better data, which isn't really a thing,
00:18:42.620 I think that we just do a disservice because we're just looking in the wrong place.
00:18:48.020 These are genuine mental health problems.
00:18:50.100 And I just think it helps to think of it that way.
00:18:58.540 I feel a bit angry at all the people who know what President Trump should be doing
00:19:06.320 and the plan that he should be implementing that would fix all this coronavirus stuff
00:19:11.800 because according to the Democrats,
00:19:14.380 there is such a thing as a plan that they have in their minds
00:19:19.860 as the standard of how to do it right.
00:19:23.180 And if only the president would use that plan that's in their heads,
00:19:28.260 things would be a lot better.
00:19:31.420 And why don't they tell us the plan?
00:19:35.900 Now, you say to yourself,
00:19:37.300 Scott, Scott, Scott, they have.
00:19:39.140 In so many ways, look at this link.
00:19:41.060 And then you'll look at the link and it'll be
00:19:43.320 some other country had a better result.
00:19:47.280 So therefore, if we do what that other country did,
00:19:50.320 we'll get the same result.
00:19:52.760 If you think that,
00:19:54.680 you should not make decisions in public
00:19:56.880 because all the other countries are so fundamentally different
00:20:01.260 in so many different ways,
00:20:03.100 we have no idea what worked and didn't work.
00:20:06.800 None.
00:20:07.400 We have no idea what worked.
00:20:08.800 We also don't know if their numbers are true.
00:20:11.780 And we also don't know how the game ends
00:20:14.040 because we're at halftime.
00:20:17.600 Did Australia really, really do a great job on the coronavirus?
00:20:23.080 Hard to say because they're having a major flare-up right now.
00:20:26.440 If you have a flare-up,
00:20:28.220 doesn't that mean you didn't do a good job?
00:20:30.680 Or does that mean you did do a good job
00:20:32.680 because you only had one flare-up?
00:20:34.420 And now if they tamp this back down,
00:20:36.000 you'll say, oh, okay, good job again.
00:20:37.360 How do you even evaluate these things?
00:20:40.460 Because if we don't have a way to stop the virus,
00:20:43.860 there's no plan that looks anything like something that could work
00:20:49.100 besides what we're doing,
00:20:50.440 which is to keep the economy healthy
00:20:52.180 and take our losses but try to keep them as low as possible.
00:20:56.380 So every time you hear somebody say,
00:21:00.480 why doesn't the president have a plan?
00:21:03.380 First of all, they're lying because the plan is crystal clear.
00:21:06.700 I've explained it many times.
00:21:08.420 Every part of the plan is public and publicized.
00:21:12.400 It's the most clear plan you could ever have.
00:21:18.440 Now, of course, there are discussions about opening up the schools, etc.,
00:21:23.560 but the federal government is very clear.
00:21:27.280 You can't say that the federal government,
00:21:29.000 but Trump doesn't know exactly what he wants.
00:21:32.340 He wants schools to open.
00:21:33.200 We knew that we were trying to save the health care system from collapsing,
00:21:38.980 and we succeeded.
00:21:40.780 And we knew that we were going to try to buy time
00:21:43.940 until vaccines or therapeutics.
00:21:47.180 And as long as we need an economy,
00:21:53.680 we don't have the option of closing it.
00:21:57.220 Now, let me explain to you what the Democrats seem to explain as their plan.
00:22:03.880 And I'll have to call in a Democrat to explain this.
00:22:07.140 Dale?
00:22:08.200 Dale, can you come over here?
00:22:10.800 We want to learn about the Democrat plan for fixing everything.
00:22:15.760 Because we know that Mr. Trump, he's doing everything wrong.
00:22:20.720 So, Dale, can you come over here and explain how to do it right?
00:22:27.820 Sure.
00:22:28.940 Sure.
00:22:29.940 All you have to do is wear masks.
00:22:34.880 Well, Dale, that's it.
00:22:37.980 You just have to wear masks.
00:22:39.100 Because most people are wearing masks in all the places that seem to matter.
00:22:43.180 Well, maybe if the president wore a mask.
00:22:48.320 I don't think that's it.
00:22:50.400 That's it?
00:22:51.500 That would be your plan?
00:22:53.380 Your plan would just be to be the president and wear a mask,
00:22:57.260 and then everything would be okay?
00:22:59.420 No, no, no.
00:23:01.200 You also have to close the economy.
00:23:03.640 You've got to close the economy.
00:23:04.840 Okay, Dale, so, if you close the economy,
00:23:10.320 does that have any costs associated with it as well?
00:23:16.480 Dale, say something.
00:23:20.200 Would any people die if you crushed your economy in the long run?
00:23:27.080 Dale?
00:23:27.740 Dale, say something.
00:23:28.820 No, we're talking, looking at the big picture.
00:23:33.900 We understand that too many people have died.
00:23:37.900 But I'm trying to understand your plan,
00:23:40.340 in which you close the economy.
00:23:42.200 Are there no costs associated with that,
00:23:45.100 in terms of long-term well-being of the people and even lives?
00:23:49.740 Can you say more about that?
00:23:54.260 Dale, you're going to have to talk.
00:23:56.740 Say something.
00:23:57.400 Say something on the topic of the economy closing
00:24:03.200 and what that does to people's lives and their health,
00:24:08.640 their mental health, and their ongoing well-being.
00:24:11.720 Can you please speak to that?
00:24:14.600 La, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la, la.
00:24:19.900 Should have worn masks.
00:24:22.180 Thank you, Dale.
00:24:22.960 Well, so we have this absurd situation in which people who don't know how to compare anything, don't know how to analyze anything, are informing the public.
00:24:36.720 Let me give you an example.
00:24:37.820 So there's big news today.
00:24:42.680 Axios did an interview, one-on-one. I guess Swan was talking to the president.
00:24:52.120 And there's one part of that that's getting a lot of play,
00:24:57.340 which is the president shuffling some papers with some graphs
00:25:00.780 and trying to make his case about the statistics.
00:25:04.920 Now, this did not go well.
00:25:12.160 If you saw the clip, I think you'd agree with me.
00:25:16.300 Didn't go well.
00:25:17.560 Didn't go well for the president.
00:25:18.940 But here's what I would recommend.
00:25:24.820 Never put the president of the United States
00:25:27.540 in a chair with no desk.
00:25:30.920 See, the no desk part is important.
00:25:33.680 Never put the president of the United States
00:25:35.720 in a chair on camera
00:25:38.740 and give him multiple pieces of paper
00:25:41.780 with complicated data on it
00:25:44.520 and ask him to speak to it.
00:25:46.260 Don't ever put the president in that position.
00:25:50.320 If the president had said,
00:25:52.660 hey, can somebody print out, you know, those statistics?
00:25:55.600 So when I'm talking to this reporter,
00:25:58.520 you should have said no
00:26:01.660 and figured out something else to do.
00:26:05.580 Because let me tell you,
00:26:07.920 this is not a good look.
00:26:10.760 I've got some data
00:26:12.400 here on, let's see,
00:26:17.720 about infections or the rates
00:26:20.640 and other countries.
00:26:23.040 Do not put your president in that situation.
00:26:26.500 All right.
00:26:27.900 Now, I'm sure, you know,
00:26:29.480 if I knew the details,
00:26:30.900 you probably wanted to be in that situation
00:26:34.000 because you probably wanted the data in his hands.
00:26:36.640 But we don't want we, the public.
00:26:40.780 I think I can speak for everybody in the public.
00:26:43.960 We, the public,
00:26:45.240 don't want to see a journalist
00:26:47.040 and a politician
00:26:48.620 arguing about statistics.
00:26:52.160 Do you want to see that?
00:26:53.520 Because the journalist
00:26:55.560 didn't understand the topic,
00:26:58.360 you know,
00:26:58.680 any more than a journalist would.
00:27:00.520 And probably the journalist
00:27:02.020 is still back in,
00:27:03.720 let's say, January 2020 thinking
00:27:06.280 where the journalist imagined
00:27:08.400 that the data could be right,
00:27:12.280 whatever the data said.
00:27:13.940 I think the journalist
00:27:15.140 might have been under the impression
00:27:16.620 that there's some kind of data
00:27:18.600 that would be accurate.
00:27:21.780 That's not qualified
00:27:23.080 to talk about data on TV.
00:27:25.840 The president
00:27:26.640 had more data than he needed
00:27:30.040 and probably would have been better served
00:27:32.320 by picking one or two statistics
00:27:35.020 he could remember off the top of his head
00:27:37.020 and just sticking to it,
00:27:38.600 whatever those statistics are.
00:27:41.280 But shuffling the papers,
00:27:42.920 don't ever do that again.
00:27:44.780 Now, if he had a desk in front of him
00:27:46.960 where he could lay out his materials
00:27:49.300 so there's no shuffling,
00:27:50.400 then that's good.
00:27:52.220 Those are just notes
00:27:53.120 to remind him how to talk.
00:27:54.420 But don't put him in a chair
00:27:55.580 with no table
00:27:57.040 and a bunch of papers in his hand.
00:27:59.140 My God,
00:27:59.960 don't ever do that again.
00:28:03.420 That's my advice.
00:28:04.760 All right.
00:28:08.120 Did you see the...
00:28:11.060 Oh, well.
00:28:11.800 So much to talk about today.
00:28:12.960 Have you noticed that writers can't tell the difference between an argument and an insult?
00:28:22.060 Have you seen that? It looks like this.
00:28:24.880 If I say to a writer, let's say on Twitter, if I say something like, what should Trump do differently with this coronavirus? You know, what would be a better plan than what's happening now?
00:28:37.000 You'll get some kind of answer like this: well, he's an unstable authoritarian who denies science.
00:28:42.760 To which I think, I think that's just an insult. That didn't really quite address my question.
00:28:49.580 And then I think part two is, you've got to get rid of Trump and get somebody who is not a... not one of those unstable authoritarian science deniers, which, of course, he isn't.
00:29:02.500 And I don't know if they can tell the difference.
00:29:04.660 And I just quoted, by the way, Carl Bernstein.
00:29:07.080 So they're trotting out the worse-than-Watergate guy, who just comes out to say things are worse than Watergate or that Trump is Nixon.
00:29:16.100 And it is just so funny when that guy comes out.
00:29:19.360 Like, he used to bother me, but now it's so funny.
00:29:24.040 It's like, oh, they got Carl Bernstein out to say it's worse than Watergate.
00:29:28.620 Every month or so you can renew that play.
00:29:32.320 All right.
00:29:34.360 Here's an idea that someone on Twitter suggested that would solve the pandemic.
00:29:41.160 And it goes like this.
00:29:43.020 We will take the technology we don't have.
00:29:46.820 Oh. Okay. I guess that's hard.
00:29:50.880 We'll take the technology that doesn't exist.
00:29:54.080 Okay. But imagine it did.
00:29:56.360 So I'm just wading into this, so I don't have enough information on this.
00:30:01.680 But the claim is this: if you did less accurate testing but more of it, especially if you can get instant or near-instant answers, you can imagine a situation where kids going into school, you know, swab something and then look at it, and it either tells them they have coronavirus or not.
00:30:21.460 But it wouldn't be super accurate. Not as good as a real clinical test.
00:30:26.920 And the claim is that although they would not be super accurate, they would be sort of accurate-ish enough that you would catch enough cases that you would at least get the, you know, the R value, the spread value, below one, and then you'd have something there.
00:30:47.400 And so my question is, do there exist such tests which, although not perfect, are good enough and cheap enough that you could test for a dollar and have an answer in a minute or five minutes or something?
00:31:02.720 Do those exist?
00:31:04.160 Because I get a lot of, yeah, don't worry about the false negatives, and don't worry about the false positives.
00:31:09.400 The idea is that if the inaccurate tests pick up something, you could then do an accurate test to confirm.
00:31:18.060 But at the very least, you'd say, whoa, whoa, whoa, we might have a problem with this class, so you guys go home until we sort it out.
00:31:25.880 So you could imagine, somebody's saying, yes, just use saliva.
00:31:31.220 So you could imagine that if these tests existed, if they were cheap, if they were widespread, if everybody tested once a day no matter what, you could get on top of it.
00:31:42.900 But it feels like a magical-thinking solution, because I don't think we have tests like that.
00:31:51.040 I would love to see something from the president or the task force telling us how close we are to anything like that, or any kind of news coverage of a company that's making something like that.
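
To put rough numbers on that claim, here is a minimal sketch of the arithmetic in Python. Every parameter in it is a hypothetical assumption for illustration (the R of 2.5, the seven-day infectious window, the sensitivities), not a figure from the episode or from any real test.

```python
# Minimal sketch of the "cheap, frequent, imperfect tests" claim above.
# Every parameter here is a hypothetical assumption for illustration,
# not a figure from the episode or from any real test.

def expected_r(r0: float, infectious_days: int, sensitivity: float) -> float:
    """Expected reproduction number if everyone tests once a day and
    isolates immediately on a positive result.

    Tests are assumed to happen at the end of each day, so a case still
    circulates on the day it is finally caught, and transmission is
    assumed uniform across the infectious period (both simplifications).
    """
    expected_circulating_days = 0.0
    p_still_missed = 1.0  # probability all tests so far were false negatives
    for _day in range(infectious_days):
        expected_circulating_days += p_still_missed
        p_still_missed *= 1.0 - sensitivity  # today's test also missed it
    return r0 * expected_circulating_days / infectious_days

if __name__ == "__main__":
    # Hypothetical: R0 of 2.5 and a 7-day infectious window.
    for s in (0.5, 0.7, 0.9):
        print(f"sensitivity {s:.0%}: R 2.5 -> "
              f"{expected_r(2.5, infectious_days=7, sensitivity=s):.2f}")
```

Under those made-up numbers, even a 50%-sensitive daily test drags the expected R below one, which is the shape of the claim; whether any real dollar test behaves like that is exactly the open question he raises.
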
00:32:04.980 Now, the cousin to that is doing group testing.
00:32:09.220 So that would be if everybody, everybody in the class, you know, spit into something or swabbed something, and then you take the entire class and test them as a group.
00:32:20.060 If you find nobody in the class has it, well, you're done.
00:32:23.480 But if you find that somebody in the class has it, then you can test them individually.
00:32:27.460 So you can clear big groups of people with one test, so long as it ends up being negative.
00:32:35.540 So that's another path.
00:32:37.520 But I believe our experts are probably up to date on all this stuff, and if this were practical, I have confidence that somebody like Birx or Fauci or somebody would be saying, hey, let's do all these tests.
00:32:51.200 Alright, so I don't think it's practical. But I wonder how far we are from it.
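
For what it's worth, the scheme he is describing is the classic two-stage pooled-testing idea, and its expected cost can be sketched in a few lines. The prevalence and pool sizes below are hypothetical, and the sketch assumes a perfectly accurate test and independent infections, which real pooled testing does not quite satisfy.

```python
# Sketch of the group-testing arithmetic described above, using the
# classic two-stage pooling scheme: test the pool once; only if the pool
# is positive, retest each member individually. Assumes a perfectly
# accurate test and independent infections; prevalence and pool sizes
# are hypothetical, so this is illustration, not a lab protocol.

def expected_tests_per_person(prevalence: float, pool_size: int) -> float:
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    expected_tests = 1.0 + p_pool_positive * pool_size  # pool test + retests
    return expected_tests / pool_size

if __name__ == "__main__":
    for n in (5, 10, 25):
        per_person = expected_tests_per_person(prevalence=0.01, pool_size=n)
        print(f"pool of {n}: {per_person:.2f} tests per person, "
              f"{1 / per_person:.1f}x savings vs. testing individually")
```

At the assumed 1% prevalence, pools of about ten come out best; as prevalence rises, pools test positive so often that the savings evaporate, which is one reason pooling suits surveillance better than active outbreaks.
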
00:32:55.320 Here's an argument that I heard yesterday from somebody far, far smarter than I am. Far, far smarter.
00:33:03.480 And when I hear things that are sort of above my brain's pay grade, I like to run them by you, because some of you can handle that stuff.
00:33:14.280 And it goes like this.
00:33:15.180 So there are two types of hydroxychloroquine tests that we know of.
00:33:22.000 There are really well-done tests that test the wrong stuff, and then there were lower-quality tests that test something closer to the right thing, which is early use and then all three drugs, with the zinc and azithromycin, et cetera.
00:33:39.140 Now, the argument is that the ones that are really good tests show that hydroxychloroquine is not effective.
00:33:47.120 But since they tested the wrong stuff, that doesn't really tell us much.
00:33:51.620 They tested people who were hospitalized, didn't give them all three drugs, gave them too much, so those tests didn't have much value.
00:34:01.140 But what about all the tests, I think there are 65 of them and growing, probably more by now, of tests that were low-quality but seemed to indicate that hydroxychloroquine works?
00:34:12.220 If you saw one test that was low-quality, meaning it's not a confirmation, but it showed hydroxychloroquine works, what would be your rational opinion of that?
00:34:25.840 Your rational opinion should be: one study, it's like a coin flip, because half of studies are wrong, not reproducible. So really, I don't know if I know anything. It's like a coin flip. Maybe it works, maybe it doesn't.
00:34:39.560 But suppose you had 65 tests and a whole bunch of different ways of looking at it, and all the tests were different, but by coincidence they had, by a great majority, maybe not every one, but by a large majority, they seemed to show it works.
00:35:00.540 And the argument is this, and I'll see if I can present it right.
00:35:04.400 If you do one test that's not reliable on hydroxychloroquine, there might be several different reasons why it's wrong.
00:35:13.380 If you do 65 tests, you've got a whole bunch of different ways that each of those tests will be wrong, and the speculation is, the smarter person says, is that enough to tell you something?
00:35:27.880 And that's what I'm going to present to you: is it?
00:35:31.400 Because each of these tests, no matter what their variables were or how they looked at it, seemed to indicate the same direction, that there's some effect.
00:35:41.260 But given that they're all flawed, but, here's the key, they're all flawed, but they're all different takes.
00:35:49.100 In other words, they each did a different thing.
00:35:51.320 If all the people doing different things individually were not reliable, but by coincidence they all seemed to show some effect, or most of them, does that tell you something?
00:36:05.280 And there are smart people who say, yeah, that does tell you something. It's certainly enough to give it a try, given that it's inexpensive and won't kill you.
00:36:15.620 So, I put that out there. Is that a reasonable approach?
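
A toy way to check the shape of that argument: if the drug truly did nothing and each low-quality study were independent noise, each would point the "works" direction about half the time, so a large majority all pointing one way would be a binomial long shot. The counts below (65 studies, 50 positive) are hypothetical stand-ins for "a large majority," and the independence assumption is exactly where critics push back, since shared flaws move many studies the same way at once.

```python
# Toy model of the "65 weak studies" argument above. The assumption being
# tested: if the drug truly did nothing and the studies were independent,
# each flawed study would point the "works" direction about half the time,
# like a coin flip. The counts here (65 studies, 50 positive) are
# hypothetical stand-ins for "a large majority."
from math import comb

def binomial_tail(n: int, k: int, p: float = 0.5) -> float:
    """P(at least k of n independent coin-flip studies point the same way)."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

if __name__ == "__main__":
    print(f"P(>=50 of 65 agree by chance) = {binomial_tail(65, 50):.1e}")
    # Roughly one in a hundred thousand under these assumptions. The catch:
    # shared flaws (publication bias, confounding, selective outcomes) break
    # the independence assumption and can mimic exactly this pattern.
```

So the math supports the "smarter person" only to the extent the studies really are independent takes, which is the load-bearing premise.
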
00:36:23.060 Alright, by now most of you have seen the new George Floyd body cam video.
00:36:30.940 And, of course, it's, you know, a two-movies-on-one-screen situation. Everybody's seen what they want to see.
00:36:37.480 I would say, at this point, there isn't the slightest chance that the police officers will be charged with murder.
00:36:45.280 Or, if they are, there isn't, I'm sorry, there isn't the slightest chance they'll be convicted of murder. I think they've already been charged.
00:36:52.760 But when you see the video, it's just crystal clear that this was a horrible mistake.
00:36:58.740 Now, they could be guilty of, I don't know, not trying hard enough to revive him, of not, you know, following some procedure. I don't know. Maybe.
00:37:09.360 But I'll tell you what it's not. It's definitely not murder. It's not even close.
00:37:15.000 And here's the kill shot.
00:37:17.520 You know, if those police officers want to save a lot of money on lawyers, I can present their whole case for them. It goes like this.
00:37:28.380 George Floyd was saying, I can't breathe, before he even got in the car, which is before he got on the ground, which is before they were on top of him.
00:37:38.620 His breathing difficulty, he said it directly, he said it out loud, and it's on video. He was having breathing problems before any of the bad interactions occurred.
00:37:50.720 And if it's true that the fentanyl in him was three times the overdose amount, then he was going to die no matter what.
00:38:02.840 But I would certainly question whether the police officers did enough to keep him alive once it was obvious he had some kind of a problem. So that's certainly worth looking into.
00:38:16.600 And I think that's really disturbing.
00:38:18.700 Part of what we saw in that video, or maybe it was a different one, is that one of the bystanders who saw George Floyd look like he was unconscious was screaming at the police to check his pulse. Just screaming at the police. And the police were just ignoring him.
00:38:37.760 And the bystander was completely right. He's saying, check his pulse, check his pulse. You're not even checking his pulse.
00:38:44.740 And that has to be answered for. Okay, that really needs an answer.
00:38:52.460 And if there's some penalty for whatever that was about, I still have to hear their side of it, but whatever that was about, that's not good. Right?
00:39:03.200 So certainly the police officers are not in the clear, but it's really clear that they weren't out to murder anybody. That much is completely obvious.
00:39:14.740 And I found that, this is my opinion, I think Black Lives Matter is completely discredited at this point. Does it feel like that to you?
00:39:25.000 Now, I'm not talking about black people, of course. I'm just talking about Black Lives Matter, the movement.
00:39:32.200 Isn't it totally discredited at this point? Because it just seems ridiculous.
00:39:38.100 And it also is the lowest priority in the black community.
00:39:43.320 Now, if you tell me it's not, I'd say, oh, well, I see the problem.
00:39:48.500 How will the black community ever make progress if they can't tell the difference between their highest priority, which would be education, which would be getting rid of the teachers' unions, which would basically make everything better, from economics to violence to health outcomes, just everything, versus the police violence, which will affect X number of people, as tragic as it is, as much as we need to make that better?
00:40:14.080 It's the smallest priority. It's the smallest priority.
00:40:18.140 So how am I supposed to take seriously a group that says black lives matter while they're making the entire country focus exclusively on the least important part of black lives?
00:40:32.620 That's exactly like black lives don't matter.
00:40:35.540 Because if I said to you, which of these is more like black lives matter? Is it the one that does the thing that doesn't make any fucking difference? Sorry, that slipped out.
00:40:46.500 Or is it the thing that changes everything? The thing that can really fix things for generations to come, which would be education, which would be getting the teachers' unions out of the logjam, or out of creating the logjam, so that there would be competition, so that the charter and other schools could actually teach people and they could have good lives.
00:41:08.900 So I refuse to take BLM seriously until they take themselves seriously.
00:41:16.040 Until they can figure out what their top priority is and stop focusing on the smallest one, I'm not going to take it seriously, and I don't think you should either.
00:41:25.640 But happy to help on the big problem, which is teachers' unions.
00:41:31.860 Did you see the video where there were some protesters that were trying to make their way to the Seattle police chief's house, private house, and some citizens with a pickup truck and a very large gun, or more, stopped them and sent them back?
00:41:48.920 You have to watch it.
00:41:51.480 And one of the protesters, this is the money shot here, one of the protesters on the video, when confronted with the armed citizens protecting the police, said, quote, we are peaceful. You pointed a gun at my face. And then the resident with the gun said, that's why you're peaceful.
00:42:11.180 That's why you're peaceful, because I got a gun at your head.
00:42:15.520 Now, how bad are things when the citizens are protecting the police?
00:42:22.100 That's literally what happened. I'm not overstating that. This was citizens protecting the police.
00:42:32.240 That's a crazy, backwards, upside-down world.
00:42:35.020 Now, this is the reason that I'm less concerned than a lot of people that this will spread to rural areas and spread into the suburbs.
00:42:43.520 This guy with the pickup truck, that's why it won't spread into the suburbs, because it doesn't take many of those guys who just stand in the road with a weapon and say, how about we're done? How about you're done here?
00:43:00.760 It doesn't take many of them for the spreading to stop.
00:43:06.360 And frankly, the people with the guns don't give a shit about the cities, so the cities may be in trouble.
00:43:11.520 I guess Facebook just bought a whole bunch of property in New York City for future operations, so Facebook thinks New York City will be a good place to live.
00:43:22.420 It's not like the cities will go away. I just don't think they'll ever be the same.
00:43:25.360 How about this? How about that? That's all I've talked about today.
00:43:37.840 All right.
00:43:40.200 The best way to signal in public that you don't understand anything about the world is to compare how the United States is doing on the coronavirus to any other country, or any other two or three countries.
00:43:53.780 So let me say this as clearly as I can so you don't make the same mistake.
00:43:58.940 If you think it means something to compare how the United States is doing with the coronavirus to any other country, whether they're doing better or worse, if you think it means something, you don't understand really anything about how the world works.
00:44:14.780 It doesn't mean anything.
00:44:17.040 We don't know why other countries are doing what they're doing.
00:44:20.560 We don't know if their numbers are accurate, and we don't know if they will flare up later and we're just on different schedules for flare-ups.
00:44:26.980 None of that's known.
00:44:28.300 We also could look at what they're doing and say, okay, they did X, Y, and Z, but X, Y, and Z have not been studied, meaning that we don't know exactly what things work and what don't.
00:44:40.240 We don't know if the culture is different. We don't know what exactly the factors are.
00:44:45.440 Could it be the vitamin D, the hydroxychloroquine that they may or may not be using? Could it be?
00:44:52.060 So, if you find yourself doing that, hey, what about Taiwan? You should not make decisions in public, because you just can't compare island nations to other nations, etc. It can't be done.
00:45:08.000 But, as I was saying before, could we find out something if we looked at the whole group of, let's say, all the countries that did well, after it's done?
00:45:20.820 We don't know yet. Who's going to do well when it's all done?
00:45:24.420 New Zealand could be the biggest hotspot in the world in a month. You just don't know.
00:45:28.880 But if we looked at them all, eventually we might be able to tease out the numbers, like, well, it wasn't obvious at the time, but it was this or that that made the big difference.
00:45:38.080 We don't know that now, so we'd just be guessing.
00:45:42.260 That is all I've got for now.
00:45:45.460 Get your weight down. That is right.
00:45:48.520 Let me give you my best tip for losing weight. Would you like that?
00:45:55.780 For some of you, this will change your lives, and I mean that literally.
00:46:01.820 The next thing I'm going to say, given that there are 3,600 people watching, probably 100,000 people will watch this eventually,
00:46:09.260 out of 100,000 people, there are some number of you who are going to hear what I say next, and it will just totally change your life.
00:46:16.760 It goes like this.
00:46:18.000 If you're trying to lose weight, don't treat your cravings and your hunger as the same thing,
00:46:24.380 meaning that your cravings, usually for sugar, are not really a genuine hunger.
00:46:33.900 What I recommend, if you're trying to lose weight, is that first you work only on your sugar cravings,
00:46:40.640 but you eat as much as you want of things that are good for you. You can't eat too much broccoli.
00:46:46.400 Fill yourself up with lean fish and broccoli and nuts and stuff that's good, and don't ever let yourself be hungry.
00:46:55.460 Just work on one at a time. Let's say you've got 10 sugary things you like.
00:47:00.140 It's like, oh, I like ice cream, I like cake, I like candy, whatever. Whatever it is that your problem is.
00:47:05.420 Then, just for the first week or two, get rid of one of the sweets. That's it.
00:47:11.200 Eat as much as you want. You might even be gaining weight. You're even eating the other sweets.
00:47:15.520 You just got rid of one. Then get rid of another one.
00:47:19.080 Once you've worked yourself down to there's only one sweet left that you've allowed yourself,
00:47:24.960 you're not going to want to eat so much of that that it makes a difference.
00:47:29.260 By then you'll be close enough to be able to just knock it out.
00:47:32.800 If you wait, in my experience it's about two months, that thing that was the most delicious thing in the world will look gross.
00:47:41.200 I've done this experiment time and time again with Snickers candy bars.
00:47:48.480 My biggest addiction is a Snickers candy bar.
00:47:52.800 When I'm having too much sugar, the feeling I get just biting into one, my whole body comes alive.
00:48:01.920 It's just physically delightful in a way that is hard to explain. But that's only if I'm still hooked on sugar.
00:48:10.280 And no matter how good you are at getting off the sugar, you may have relapses. I do it all the time.
00:48:15.220 Then I have to work myself off it.
00:48:16.920 But at the moment, since I've been off more than two months, I can hold a Snickers in my hand
00:48:21.940 and imagine what it would taste like, and it's just gross.
00:48:26.260 And nothing changed. Nothing changed, except that I waited two months, and then the craving went away.
00:48:32.420 So work on your craving by decreasing your sugar things, as well as your simple carbs like white rice and bread and stuff like that.
00:48:41.240 So just start decreasing them until those are gone.
00:48:44.780 You may have even gained weight because you ate so much broccoli and salmon and nuts.
00:48:50.220 And then, if that's your diet, broccoli and salmon and nuts, adjusting that down a little bit, you know, 10% or whatever,
00:48:58.680 so that your calories are in line, exercising a little bit more, it's a lot easier.
00:49:04.340 So that is my advice, divide and conquer.
00:49:08.740 Never, ever, ever work on a craving at the same time as hunger, like actual hunger.
00:49:16.020 You don't want to deal with two enemies on two fronts at the same time, divide and conquer.
00:49:20.920 That little bit of advice, for the 100,000 of you watching this, probably at least 5,000 of you just said,
00:49:30.820 holy crap, would that work? Some of you are going to try it. It works.
00:49:38.780 Now, I can't guarantee that any plan or any diet works for 100% of people. That would be crazy.
00:49:44.860 But out of 100,000 people, it probably just changed at least 1,000 lives fairly significantly.
00:49:52.480 So that's why I do this. That's what it's all about. And I'll talk to you tomorrow.