Real Coffee with Scott Adams - April 09, 2020


Episode 902 Scott Adams: I Tell You About My Experience With Models. No, Not That Kind.


Episode Stats

Length: 55 minutes

Words per Minute: 156.2

Word Count: 8,649

Sentence Count: 501

Misogynist Sentences: 12

Hate Speech Sentences: 17


Summary

On today's show, Scott Adams talks about the latest developments in the Biden vs. Trump debate, the latest on the Democratic primary race, and much, much more. Plus, Scott gives his take on the latest news and takes a few listener questions.


Transcript

00:00:00.000 Hey everybody, come on in here.
00:00:12.280 It's time for Coffee with Scott Adams, and it's another wonderful morning.
00:00:17.520 Sure, we got problems.
00:00:19.520 The world has problems, it always does.
00:00:21.780 But is that going to stop you from enjoying the simultaneous sip?
00:00:25.260 No, no.
00:00:26.140 No, you're not the kind of person who would be deterred by things like that.
00:00:32.040 I just realized you can't say deterred without turd.
00:00:36.460 Hmm.
00:00:37.440 Never thought about that before.
00:00:39.420 But you can't say pandemic without dem.
00:00:42.720 That's a thing too.
00:00:44.340 All right, get ready.
00:00:45.720 You're all in here.
00:00:46.600 I know you're poised.
00:00:48.400 You're ready.
00:00:49.480 Oh yeah, you're ready.
00:00:50.300 And all you need is a cup or a mug or a glass, a tank or chalice or stein,
00:00:53.320 a canteen jug or a flask, a vessel of any kind.
00:00:55.280 You can fill it with your favorite liquid.
00:00:57.460 I like coffee.
00:00:59.000 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:01:01.860 the thing that makes everything, including the pandemic, better.
00:01:06.360 It's called the simultaneous sip.
00:01:08.120 And it happens now.
00:01:09.820 Go.
00:01:10.120 Yes, I can feel my blood plasma getting stronger every moment.
00:01:24.980 I don't even know what that means.
00:01:27.680 Yeah, tonight I think I'll be giving myself a haircut.
00:01:30.480 But I have been hearing some suggestions that maybe I should work on the mullet.
00:01:37.460 It does look a little naturally mullety.
00:01:41.300 This is not a time for photographs.
00:01:44.180 I think I'm going to lose all of my photographs.
00:01:46.660 Well, not that I took any during this period.
00:01:48.520 All right, let's talk about some of the things in the news.
00:01:53.380 It's fun, of course.
00:01:55.060 Last night I already talked about Bernie.
00:01:56.860 Bernie dropping out.
00:02:00.900 And, of course, it goes without saying that now the big question is,
00:02:04.920 who will Biden pick as his vice president?
00:02:07.180 And will he pick a vice president before he's replaced at the top of the ticket?
00:02:13.760 What would you say?
00:02:15.100 Do you think that Biden will select his vice president before he is replaced at the top of the ticket?
00:02:23.560 Because if he has a vice president, they can't really replace him without moving the vice president up.
00:02:30.620 Because if they don't promote the vice presidential candidate to the top of the ticket,
00:02:36.860 should they decide that Biden can't go on,
00:02:40.300 then what does it say about who he picked for the vice president?
00:02:44.460 If the vice presidential pick is not the automatic one who bumps up to the top,
00:02:49.360 then that person probably was not well chosen to begin with.
00:02:53.620 So, I mean, it doesn't follow 100% logically,
00:02:57.860 but people are going to think of it that way.
00:03:00.620 Yeah, I'm going for the Tiger King look.
00:03:05.800 Thanks for making that comparison.
00:03:10.100 How much did you love the fact that Trump got a Tiger King question during the press conference?
00:03:17.960 On one hand, you say to yourself,
00:03:20.380 my God, what a waste of the public's time and the government's time
00:03:23.900 to ask such a silly question
00:03:26.540 at a time when we were more concerned about the massive global death toll and all that.
00:03:33.360 But I disagree.
00:03:35.240 I disagree.
00:03:36.420 First of all, you could tell that Trump sort of enjoyed it.
00:03:39.760 He sort of enjoyed the question.
00:03:41.720 It was sort of like a palate cleanser.
00:03:45.160 You just needed a little change of pace from the heaviness of the topic.
00:03:49.100 And I think the country enjoyed it, too.
00:03:53.160 So to the reporter who asked that question,
00:03:55.320 well, it wasn't news per se.
00:03:58.420 So, you know, you can't say you were necessarily doing your job of getting the news.
00:04:05.700 But on the other hand,
00:04:07.580 the country sort of appreciated it.
00:04:11.000 So I don't know which news outlet it was that asked that question.
00:04:14.700 But to the reporter who did, I say,
00:04:17.000 Thanks.
00:04:18.720 You know, I don't think we should be silly all the time.
00:04:21.780 But every now and then, things get so heavy
00:04:24.040 that I don't mind somebody, you know, the class clown.
00:04:27.960 I don't mind somebody who tells a joke.
00:04:30.540 I don't mind somebody who tries to, you know, break the mood a little bit.
00:04:33.820 So thank you.
00:04:35.420 I appreciated that.
00:04:38.320 Well, the debate has already started ahead of time about whether the models were always bogus or whether they were, you know, good and useful and credible.
00:04:51.860 And the argument's shaping up like this.
00:04:55.300 Some people are saying, well, of course the death toll is lower than even the lowest estimates of the model.
00:05:01.340 Well, it's outside of the entire range of the model, but it's better.
00:05:07.020 And it's because we did such a good job that even the models could not predict how effective we would be.
00:05:14.460 So that's one of the movies.
00:05:16.920 And of course we don't know how it all ends, and both of these movies assume that they can predict that we're, you know, turning the corner.
00:05:25.900 But the other movie says that the models were just baloney from the start, and we should have ignored them all and just kept going to work, and we would have never known the difference.
00:05:41.680 Here's my take on this.
00:05:43.600 I don't think people understand that models are not designed to be right.
00:05:48.880 And I think probably 95% of the public is under the impression that if the experts build a model, especially one with a very wide range, you know, between 100,000 and 2 million people dying, you figure that range is so wide, the real number is going to be in there somewhere, right?
00:06:12.500 So that's the usefulness of the model.
00:06:14.580 So if the result is not anywhere in that gigantic wide range, critics get to say, hey, I told you.
00:06:23.420 Now, of course, the critics will also point out that the bottom of the range assumed that you tried as hard as you could to mitigate.
00:06:31.080 So if you do even better than that, well, what good was the model?
00:06:35.100 You know, it just shows that the model was completely in its own world, apart from reality.
00:06:39.500 I disagree, because that's not what the models are for.
00:06:44.240 The models are meant to be useful.
00:06:46.760 They're not meant to be accurate.
00:06:49.900 Now, I say this as somebody who built many financial models in my corporate days.
00:06:54.780 It was my main job.
00:06:56.400 And I was always acutely aware that my estimates and my predictions were not accurate.
00:07:02.460 So instead, I designed them to be useful.
00:07:07.480 And most people will never be able to understand the distinction; you could explain it to them forever, and they'll look you right in the eyes right after you've explained it and they'll say, yeah, but it's not accurate.
00:07:19.580 And then you'll start again and you'll say, okay, okay, I don't think you're hearing me.
00:07:23.640 Nobody can predict the future.
00:07:26.120 So being accurate wasn't even in the set of possibilities, unless you were just lucky.
00:07:31.420 Being accurate is not even a goal, because it can't be done.
00:07:35.080 If anybody could predict the future the way these models pretend to, well, anybody who could do that would be rich from the stock market.
00:07:43.840 They'd be rich from a dozen other things that they could predict and bet on.
00:07:49.000 Because anybody who can accurately predict the future of anything, from a hurricane, to climate science, to the economy, the stock market, the coronavirus, anybody who could accurately predict the future in those multivariable situations would just go make bets and be rich.
00:08:14.200 But it's not a thing.
00:08:16.220 It's just not a thing.
00:08:18.000 So that's why people don't do it.
00:08:19.540 If they could do it, they'd do it.
00:08:21.140 They'd say, whoa, this model's giving me an edge; I'll go make a bet based on this model.
00:08:26.560 Instead, the models are meant to be useful.
00:08:30.700 Now, how many of you are confused?
00:08:32.560 Because you're thinking to yourself, and by the way, I'm using a persuasion technique now; I'll call it out in a minute.
00:08:40.260 But how many of you are thinking to yourself, Scott, Scott, Scott, you're talking crazy, because you're saying that the models don't even need to be accurate, like even accurate within a gigantic range.
00:08:52.800 They don't need to be accurate, and still they're useful.
00:08:57.060 Explain that, cartoon boy.
00:08:58.760 Those are like opposites.
00:08:59.880 If it's so completely inaccurate that it's not even in the general range, it can't also be useful.
00:09:08.360 Except that everyone who makes models knows that they can be.
00:09:12.200 You'll see a few people say yes, the people who have some experience in this field.
00:09:16.620 The models are not meant to be accurate.
00:09:19.620 They're not designed that way.
00:09:21.460 It's not a hope. It's not a dream. It would only be luck if they were accurate.
00:09:24.860 What the models are supposed to do is change your behavior.
00:09:31.020 That's the only thing you need to know.
00:09:32.940 The point of a model is to change your behavior.
00:09:36.820 And if the model does that, and it changes your behavior in a productive way, then it can be said that the model was useful, even if it was completely inaccurate.
00:09:47.700 So let's take this case of the coronavirus.
00:09:50.760 What did the models, which were completely inaccurate, actually do? Did they persuade us?
00:09:57.920 I think you'd say yes.
00:09:59.760 At least enough of us were persuaded that it was a gigantic risk that huge changes in behavior happened fairly quickly.
00:10:08.780 So did the models cause us to change behavior? Yes.
00:10:13.140 Now, the second question is, should they have?
00:10:16.560 Was it right to change our behavior?
00:10:19.360 Well, that's the part that becomes the two movies, and maybe we'll never settle it.
00:10:23.780 But the argument goes that if we hadn't changed our behavior this much, we would not be getting a result that, at least preliminarily, looks like it might not be so bad compared to the models.
00:10:36.000 Now, can the models accurately predict how well people will respond, even in the near term?
00:10:44.180 Not really.
00:10:45.360 Nobody can make a model like that, because it can't pick up innovation.
00:10:49.360 Did the models, for example, predict that somebody would find a way to split ventilators and effectively double the capacity, at least in those emergency situations and temporarily?
00:11:02.100 Did the model pick that up?
00:11:04.100 Did the model have a number in it for the effectiveness of the hydroxychloroquine?
00:11:11.920 Because if the model did have an estimate for the effectiveness of the hydroxychloroquine, how would they know that? We don't know that.
00:11:22.260 How would the model people know the effectiveness of the hydroxychloroquine if we don't know and the scientists don't know?
00:11:30.840 So keep in mind that these models are persuasion, and if they persuaded you in the right direction, they worked.
00:11:40.180 They were useful.
00:11:42.160 Now, of course, they could over-persuade you, right?
00:11:44.480 They could persuade you to do the wrong thing.
00:11:46.560 So that's the distinction: did it persuade you to do the right thing, if you can even determine that?
00:11:51.780 Because we'll never really know what the right thing was, even after the fact.
00:11:58.620 Here's the only thing you can know: whether the model got people to act in the way that the consensus of the experts thought was the right way to act.
00:12:10.840 That's about as good as you can do.
00:12:14.480 So I would argue that we are not acting so much to satisfy the requirements of the model; rather, I would put it this way, as a person who has developed many models for the purposes of management.
00:12:29.480 Management decides what they want to do, and then they have you build a model that supports the story.
00:12:36.020 And the experts collectively say, you know, I don't think we can model this.
00:12:43.000 Honestly, I don't think we know how big a problem this is.
00:12:46.520 But from the little hints we have about how viral it is, the information we got out of China about how bad it was, the number of, you know, body bags that had to be ordered, the number of ICU rooms that were over capacity, based on those little hints, the experts, not me, the experts, look at it and say, we should try as hard as we can to avoid the worst case.
00:13:14.180 That might be the only thing that could be known with any kind of confidence: that all of the experts collectively had sort of the same feeling about it.
00:13:23.700 And I think they did.
00:13:25.020 I think if you were to survey the virologists, in general they would all say, whoa, based on these little facts, which are anecdotal but very scary, it does look like, in my professional opinion as an expert virologist, we should put the maximum effort into stopping this.
00:13:48.060 And then they build a model, because the public isn't going to take an opinion; the public needs something a little more convincing.
00:13:56.540 What would be more convincing than just interviews with scientists who say, hey, you should do this thing? A picture.
00:14:04.920 So the scientists know, and everybody involved knows, that a picture is more influential, so it gets turned into a graph, a picture, a range, a statistic, because that's how you communicate it.
00:14:16.420 So the graph and the picture should not be viewed as true or false, accurate or inaccurate, because they're not even built for that purpose.
00:14:30.080 They're built to persuade, in a way that the consensus of the experts legitimately feels is in the best interest of the country, because I think mostly they're, you know, good eggs who want the best thing to happen.
00:14:42.900 So this argument we're having about whether they were accurate and whether they had to be revised and all that is coming from people who don't understand what the models are.
00:14:53.980 These are arguments from people who have never modeled and don't understand that world, and so neither the pro nor the con arguments are even a little bit sensible.
00:15:04.240 They're not even the right argument, really.
00:15:05.980 And what you should see with any model is that it starts out wildly inaccurate, because nobody can predict the future, and then as you get closer to whatever date you can measure for sure whether you were right or wrong, as you get closer to knowing the answer, the models start getting molded and retrofitted and tweaked until they come into conformance with reality.
00:15:31.700 Now, you saw that in 2016: the polls of who would win the election were wildly off, but as you got closer and closer to election day, they started to narrow.
00:15:43.340 I think Rasmussen nailed it, but most of the other polls were at least closing that gigantic gap that said Hillary would win.
00:15:52.260 Most of them narrowed toward the end, and that's normal.
00:15:56.880 Does that mean that the polls were all wrong?
00:16:00.280 Well, if you think polls are right in the first place, yeah, but polls are pretty gross objects too.
00:16:08.220 So, all right, here are some examples of loserthink in the news.
00:16:16.420 So, this is, let's see, Kevin Miller, he's just some user on Twitter, who said that there's literally no argument, the numbers would be higher if we didn't do all that we did, and yet people will argue it.
00:16:35.980 So, Kevin is saying that you can't reasonably argue against the idea that the reason the numbers are good is that we did all the mitigation.
00:16:47.340 So, Kevin's saying it's so obvious that that should be the end of the argument.
00:16:51.120 The mitigation is the reason the death rate is low.
00:16:55.300 It's the reason. It's an obvious reason. We did it for that purpose.
00:16:59.020 The experts said if you do this, the death rate will come down, and then it came down.
00:17:03.320 That's why it came down.
00:17:04.820 So, Kevin is saying there's no argument: the numbers would be higher if we didn't do all that we did.
00:17:12.900 So, this is loserthink.
00:17:15.360 And it's a specific form of loserthink that I write about in one chapter of my book, Loserthink, which you should all buy, because you have time to read it.
00:17:24.660 And this is what I call the failure of imagination.
00:17:28.660 So, whenever anybody says there's no other explanation, there are two possibilities.
00:17:34.740 One, they're right; there's just no other explanation.
00:17:38.720 But the other possibility, and it's gigantic, is that you're not good at imagining other possibilities.
00:17:46.400 And that's the case here.
00:17:47.920 So, let me give you an example.
00:17:49.540 Kevin cannot imagine, based on his comment, and I'm saying he can't imagine because he says there's literally, not figuratively, literally no argument, except that the mitigation worked.
00:18:01.100 It's the only thing you could say, looking at the evidence.
00:18:04.280 To which I say, no, no, I'm afraid that's not the only way to interpret it.
00:18:12.200 Here's the other way. You tell me if this is crazy.
00:18:15.960 The other way to interpret it is that the mitigation worked a little bit, just like everybody thought it would, but only a little bit, and the biggest difference was that the models were wildly inaccurate.
00:18:29.880 How do you rule out that possibility?
00:18:32.200 Because Kevin is correct that it has to make a difference, that everybody did social isolation, and a lot of people wore masks, and that had to make a difference.
00:18:43.680 So Kevin's right about that.
00:18:45.840 But how can Kevin measure how much of the total difference can be ascribed to that one factor? He can't.
00:18:52.920 It can't be measured. You can't measure it, I can't measure it, the experts can't measure it. It can't be measured.
00:18:57.920 And suppose we tried to compare it to other countries, and said, well, let's compare it to one of these other countries, we'll find one that did everything the same except maybe this one variable, and then that'll tell us something.
00:19:09.120 It's like, oh, you did everything the same except you didn't wear masks; well, we'll compare you to the countries that did wear masks and see if we learn anything.
00:19:17.380 That probably doesn't work, because there aren't two countries that are enough alike.
00:19:22.860 I'm seeing people mention the countries: well, what about Sweden?
00:19:26.320 What about the fact that Singapore had a good result, but they didn't wear masks?
00:19:31.660 But they did do good contact tracing.
00:19:34.280 My point is that all the countries are going to be so fundamentally different on so many big points.
00:19:41.460 You know, is it a country where they hug a lot? Do they have old people living in the home? What's the size of the country? What's the quality of the health care? What's the flexibility?
00:19:52.360 You just can't compare two countries, I don't think.
00:19:54.420 I don't think we'll ever have a good answer about what works, except that we'll know social isolation has to work, I mean, logically it has to work, in the same way that face masks logically had to work a little bit.
00:20:10.220 You just don't know how much.
00:20:12.760 So, Kevin, I would say that you were blind to the other possibility: of course the mitigation works, but we don't know how much, and it is easily possible, easily possible, that the biggest difference in the change of the estimates had to do with the estimates not being that good in the first place.
00:20:33.080 Now, I'm not saying that's the case; I'm saying you can't rule it out.
00:20:36.700 And it's so dead simply obvious that it's possible, and it would be routine, it wouldn't even be unusual if that were the case; nothing about that would be weird.
00:20:46.860 You can't rule it out.
00:20:47.640 All right.
00:20:51.100 Now, there will be much said about the fact that the bottom range of the models was 100,000 deaths, and very quickly it got modified down to, well, by 100,000 we meant a minimum of 60,000.
00:21:04.280 And when that gets modified down again, which it might be, and I think there's at least a 50% chance it will be, people will say, see how wrong the model was, and other people will say, see how good the mitigation was, and we'll never get to the end of it.
00:21:20.560 Let me ask you this: what are the odds that a big, complicated model would be accurate in the first place?
00:21:30.320 What are the odds that, even if it gave a range, and it's a really big range of possibilities, the result is still going to be in that gigantic range?
00:21:39.920 Is it really 95%, like they might want you to believe?
00:21:43.740 In my experience, even big ranges are just routinely wrong.
00:21:49.260 I mean, it's not even unusual to have a gigantic range of possibilities and still be way under it or way over it.
00:21:55.660 Not unusual, though.
00:21:58.560 All right.
00:22:03.660 Here's some other stuff.
00:22:06.540 The question of who Biden picks as his vice presidential running mate is getting really interesting because of the timing, because you know this conversation is happening among the Democrats.
00:22:20.040 They're thinking, oh my God, what are we going to do about Joe Biden, because we can't have him as our standard-bearer.
00:22:25.420 You know that conversation is happening.
00:22:27.880 I know that conversation is happening, because I've actually talked to people who are prominent Democrats, and let me tell you, that conversation is happening.
00:22:38.220 But there are two logical ways to go about it. Well, three, I guess. Three.
00:22:43.660 One would be to just replace him at the convention, have some kind of a revolution at the convention and pick somebody else.
00:22:53.040 That's possible, but I think that would be very divisive. That's risky.
00:22:58.100 The other possibility would be to replace him now.
00:23:02.240 You know, have somebody talk him out of running and say, look, look, look, we still have time to slot in Mario Cuomo or somebody, not Mario, Andrew Cuomo, so let's do it now.
00:23:15.920 I don't think that's going to happen, because we're a little too close to the convention, and it would look like the entire Democratic nomination process was a sham.
00:23:30.120 It would be one thing to argue it out on the floor of, well, I don't know if there will even be a physical convention, but let's say there is.
00:23:37.340 It would be one thing to argue it out among their members on the floor, because that would feel like a process of some sort, even if you didn't get your way.
00:23:45.800 You'd feel like, all right, the people who cared the most in my party all went to one place, or they did it online, and they argued, they worked it out, they negotiated; well, at least it's a process.
00:23:57.760 I don't love the outcome, but at least I see how it happened and got a little bit of visibility into it; okay, I can live with it.
00:24:04.820 But if they replaced him before that, if they found some clever way to say, hey, Joe, just say you're not feeling good and we'll just slot somebody in there right away, I don't think they could do it, because as long as he's still able to appear in public and put three words together, they don't even have to make sense, and we've seen that, right?
00:24:30.820 He literally doesn't even have to make sense when he talks, and the Democrats are still willing to say, well, it's okay, let's let this play out a little bit.
00:24:42.520 I would love to know what they're thinking, the just-mainstream Democrats who don't want anything except what's good for the country, and they just want a credible candidate to be able to vote for, and they're looking at what their party served up as their choice.
00:25:02.860 It's got to be terribly frustrating, so I wonder about that internal process.
00:25:10.820 But here's my prediction, based on what has the least friction.
00:25:17.940 So here's the least-friction way to go.
00:25:21.320 Joe Biden picks a vice presidential running mate, maybe even early, before the convention, because if he assumes he's going to get the nomination, he could sort of presumptively say, let's get this out of the way, give you voters a little comfort about what's coming, I'll just pick my vice president.
00:25:40.840 I think at that point, whoever that vice presidential pick is, assuming it's somebody you could reasonably see as qualified to be president, Joe Biden could then just go through the charade of running for office, but voters would say, we're voting for the vice president.
00:26:01.860 In our minds, that's really who we're voting for.
00:26:05.320 So then, maybe sometime between now and the convention, it wouldn't be so hard to flip the party and say, hey, let's take the vice presidential pick and put that person at the top, because the vice president was, after all, picked by the person who had the most votes.
00:26:23.680 And then maybe you could let Joe Biden slide off and say he's not feeling well, and it's all very elegant.
00:26:33.040 So I think the vice president switcheroo would be the least-friction way for the Democrats to get what they want, which is somebody else at the top of the ticket.
00:26:41.880 So that's my prediction. All right.
00:26:44.100 Let me ask you this.
00:26:45.900 If China believed that the wet markets were actually the source of the coronavirus, would they keep them open?
00:26:55.220 Now, there might be something about China that I don't understand.
00:26:58.880 I'm sure there are lots of things about China I don't understand.
00:27:02.200 But on a risk-reward basis, if you looked at the entire GDP of China, how much of it would be represented by the economic activity of the wet markets?
00:27:18.640 1%?
00:27:20.440 0.00001%?
00:27:23.800 There can't possibly be any serious economic benefit to keeping the wet markets open.
00:27:32.080 At least in terms of the upside gain of making money, there can't be that much money involved.
00:27:37.960 How big are these wet markets?
00:27:40.340 I mean, I don't know enough about the area, but it can't be that big, right?
00:27:44.860 But the risk of keeping them open is so well understood.
00:27:52.380 Because if this is indeed the second time that something came out of that same environment, this time closing down the economies of the world, is there any argument that China could make for keeping them open?
00:28:04.600 Let's put it this way.
00:28:05.760 Suppose we went to China and we said, hey, China, here's the deal.
00:28:10.060 We're just going to decouple, move everything home, and stop trading with you, because we need to completely close travel, because we can't have any kind of physical connection to a country that has wet markets; they're too risky for the other countries.
00:28:26.000 What would China say?
00:28:27.960 Would they say, oh, okay, we don't want to lose our whole economy, so we'll just close these wet markets?
00:28:33.320 So here's my question.
00:28:35.160 Is China not signaling to the world that they don't believe the wet markets were the problem?
00:28:40.920 Because you can't do both of the things they've done, which is to say, oh, yeah, this is the problem that almost killed millions of people, and did kill, I don't know, hundreds of thousands when it's all done, and then say we're going to keep it open, if you think that's what the problem was.
00:28:59.140 Now, yeah, somebody's saying in the comments it's a cultural thing.
00:29:02.940 Maybe.
00:29:04.720 Maybe what I don't understand is the cultural importance.
00:29:08.020 But the Chinese Communist Party is pretty much all cold-blooded engineering efficiency.
00:29:14.660 And there's nothing I could imagine where these cold-blooded, efficient engineering technocrats are going to risk trillions of dollars again for what would be the smallest economic benefit from these little wet markets; it can't possibly be a good idea.
00:29:33.340 So I think China has basically signaled that they are not willing to be a credible partner in the world.
00:29:43.580 There's a line of credibility, and if you're below that line, you just can't work with those people.
00:29:51.900 You have to just make a choice and say, you know, nothing personal, but we can't work with that.
00:29:58.820 Whatever you're doing over there, it's nothing personal, but we can't expose ourselves to that risk anymore, so we have to do what we need to do.
00:30:11.700 So I love this comment from Lauren Pleska, I think from some other country, who dropped into my comments on Twitter to mock me, because I was making some comments about the models there, the same kind of comments I was just making here.
00:30:29.840 And Lauren goes in to say, and it's better if I say it in a mocking tone: that thing when a guy who draws a cartoon thinks he knows more than epidemiologists, or people who can pronounce that word, I guess, about modeling infectious disease spread.
00:30:50.100 Ha, ha, ha, look at the cartoonist; that guy who draws a cartoon thinks he knows more than epidemiologists about modeling infectious diseases.
00:30:59.140 Oh, it is to laugh.
00:31:02.100 How silly that the cartoonist thinks he knows more than all the expert doctors in the world.
00:31:09.440 Well, you know, Lauren, that would not look like such an ignorant comment if it hadn't happened this week.
00:31:18.520 This was the week in which all of the medical experts in the world finally said, yeah, the cartoonist was right.
00:31:26.500 Now, they didn't use those words, but who was it who told you face masks really do work for laypeople, when the CDC and the WHO and the Surgeon General and Dr. Fauci told you they don't?
00:31:43.160 It was the cartoonist.
00:31:45.420 It was the cartoonist.
00:31:46.700 Now, it was also many of you, because common sense got you to the same place.
00:31:51.220 But, Lauren, if your point is that the cartoonist cannot have an opinion which is superior to all of the experts in the world, I don't think you're paying attention.
00:32:04.500 Because I just did that in public.
00:32:08.280 I did it aggressively in public.
00:32:10.740 I disagreed with all the experts in public, shamelessly.
00:32:16.240 And we know, because all the experts have now said, yeah, yeah, you're right, cartoonist guy.
00:32:21.960 They didn't, of course, refer to me, but they've all come around to my point of view.
00:32:26.780 Am I right? Yes, I am.
00:32:30.120 So, let me say this.
00:32:35.840 How about the medical experts who told you that because we don't know exactly what the hydroxychloroquine test results will be, you shouldn't use it?
00:32:51.940 Right? Didn't you have experts saying we don't have the test results, so it's dangerous, so you shouldn't use the hydroxychloroquine, because it hasn't been validated by test results?
00:33:03.900 The medical experts told you that, right?
00:33:06.140 Wasn't that the epidemiologists and everybody else?
00:33:10.160 And what did the cartoonist say?
00:33:12.380 The cartoonist disagreed with that and said, well, wait a minute.
00:33:16.520 These drugs have been around so long that we know which people have a risk and which do not; for example, people with heart problems and people with hypertension have a little extra risk.
00:33:29.040 And we also know that it's a long-term-use risk, and we're not contemplating using it long term.
00:33:35.880 Under those conditions, said the cartoonist, it's probably still a good bet, because even if it doesn't help, we can be pretty sure it doesn't hurt, and there's so much anecdotal information that it at least suggests there's a non-zero chance it could be helpful.
00:33:53.040 So on a risk-reward basis, who was right?
00:33:57.180 Every medical expert who told you, no, you don't want to use this because it's unproven, or the cartoonist?
00:34:05.620 The cartoonist.
00:34:07.060 Because the medical experts, you know, under cover, were already turning my way, which is to say, well, we don't know if it'll work, but it's worth a shot on a risk-reward basis.
00:34:19.200 So who was right?
00:34:20.000 All of the medical experts in the world, the CDC, the Surgeon General, were they right?
00:34:27.920 Was it the World Health Organization that was right?
00:34:31.480 Or was it the cartoonist?
00:34:34.560 It was the cartoonist, and everybody else who had the same opinion and was not a medical expert.
00:34:40.020 It was President Trump.
00:34:41.760 Stupid old, dumb old President Trump, who doesn't believe in science: was his opinion about hydroxychloroquine superior to what was coming from the top medical experts?
00:34:55.660 Yes. Yes, it was.
00:34:57.020 We can see that plainly, because both the medical experts and Trump exactly agreed that it's unproven.
00:35:04.660 It would be better if it were proven.
00:35:06.980 There's some anecdotal evidence that it works, but that's not a guarantee; still, the risk-reward is worth it.
00:35:13.760 That came from the president. That didn't come from the medical community.
00:35:18.760 That came from a cartoonist. It didn't come from the medical community.
00:35:23.000 But now what is the medical community coming around to?
00:35:25.900 Oh, yeah, that was more about shortages.
00:35:28.420 Yeah, we knew that. We knew it was about shortages.
00:35:31.940 We knew you were lying.
00:35:33.640 We knew you were being irrational, for whatever reason, to prevent the shortages and the hoarding.
00:35:40.020 But Lauren, Lauren Pleska, any other year it would be reasonable for you to say, oh, yeah, let's believe the cartoonists, not the medical experts.
00:35:54.320 Let's believe the cartoonists. That makes sense. Yeah, that makes sense.
00:35:58.160 Any other year, that would have been a reasonable mocking thing to say.
00:36:02.640 But you're saying it right in the middle of me proving in public that my opinions are superior to the medical professionals'.
00:36:11.560 Now, do
00:36:12.020 you need
00:36:12.260 another
00:36:12.520 example?
00:36:13.840 How about
00:36:14.240 the example
00:36:14.760 where I'm
00:36:15.380 the only
00:36:15.700 one who
00:36:16.040 said that
00:36:16.540 the low
00:36:17.180 number is
00:36:17.920 going to
00:36:18.160 be below
00:36:18.580 the models?
00:36:20.120 Did I
00:36:20.600 not, as
00:36:21.860 not being
00:36:22.360 an epidemiologist
00:36:23.620 and not being
00:36:24.380 a virologist
00:36:25.300 and not being
00:36:25.760 a medical
00:36:26.100 expert, was
00:36:27.320 I not
00:36:27.800 loudly and
00:36:28.480 publicly saying
00:36:29.240 that I
00:36:29.620 believe the
00:36:30.040 numbers would
00:36:30.540 come in well
00:36:31.140 under the
00:36:31.820 bottom estimate?
00:36:33.680 Who was
00:36:34.380 right?
00:36:35.000 All of the
00:36:35.780 experts in
00:36:36.360 the world?
00:36:37.700 Or the
00:36:38.360 cartoonist?
00:36:39.180 Who was
00:36:39.540 right?
00:36:39.860 The cartoonist
00:36:40.520 or the
00:36:40.860 experts?
00:36:42.080 Lauren?
00:36:43.720 So, at
00:36:45.480 least pay a
00:36:46.140 little bit of
00:36:46.680 attention to
00:36:47.180 who's being
00:36:47.520 right.
00:36:48.900 Okay?
00:36:49.620 I totally
00:36:50.580 get that you
00:36:52.160 should not
00:36:52.560 ignore experts.
00:36:53.640 Even I don't
00:36:54.200 ignore experts as
00:36:55.280 much as I
00:36:55.740 complain about
00:36:56.360 it.
00:36:56.940 Even I take
00:36:57.720 experts as the
00:36:58.600 first, you
00:36:59.720 know, that's
00:37:00.020 the first
00:37:00.440 position.
00:37:01.580 And if you're
00:37:02.100 going to
00:37:02.300 deviate from
00:37:02.980 what the
00:37:03.340 experts are
00:37:03.880 saying, you
00:37:04.160 have a good
00:37:04.600 reason.
00:37:05.600 You're going
00:37:05.940 to need a
00:37:06.320 pretty good
00:37:06.960 reason if
00:37:07.480 you're going
00:37:07.760 to disagree
00:37:08.300 with a
00:37:08.920 consensus
00:37:09.740 of experts,
00:37:10.680 but I
00:37:11.760 show my
00:37:12.160 work.
00:37:13.460 You can
00:37:14.000 decide.
00:37:16.380 All right.
00:37:17.460 Wow, I think
00:37:18.240 I've beat
00:37:19.300 that to
00:37:19.760 death.
00:37:20.080 Here's another example of loserthink, from Rachel Maddow.
00:37:28.640 So, she was responding to the attorney general, who had a tweet about, you know, about America needing to get back to work.
00:37:36.760 All right, so the attorney general was making sort of a general statement that we need to get back to work.
00:37:44.160 He was not putting a day on it. He was not saying, let's forget about the people who will die, because he wasn't saying that.
00:37:52.540 He was making a general, universally true statement that we would all like to get back to work. No real detail.
00:37:59.960 Rachel Maddow decides to mock him for that universally true statement that everybody agrees on, and she says it this way in her tweet:
00:38:08.440 More than 14,000 Americans have already died. One American died every 45 seconds today.
00:38:17.320 But sure, Mr. Attorney General, yeah, go on, yeah, go on about this, get back to work.
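Just as an aside, her two numbers do at least hang together; here's a quick back-of-the-envelope check (this arithmetic is added for illustration, it was not part of the broadcast):

```python
# Back-of-the-envelope check of the two figures in Maddow's tweet.
# This arithmetic is illustrative, not from the broadcast.
SECONDS_PER_DAY = 24 * 60 * 60              # 86,400

deaths_per_day = SECONDS_PER_DAY / 45       # one death every 45 seconds
print(f"~{deaths_per_day:,.0f} deaths per day")    # ~1,920

days_to_14k = 14_000 / deaths_per_day       # cumulative total vs. daily rate
print(f"~{days_to_14k:.0f} days to reach 14,000")  # ~7, i.e. about a week
```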
00:38:26.380 Here's the loserthink: it's one-variable thinking.
00:38:30.100 If the only variable we had to worry about were how many people are dying, or how many people are dying specifically from the coronavirus, then Rachel Maddow would have a world-class, excellent opinion here.
00:38:43.840 And it would be the sort of thing she should be proud to show to the world.
00:38:47.660 But we're in a situation that has many variables. You've got the economic pain and the death that comes with that, and you've got the coronavirus healthcare deaths, and you've got lots of variables that tie it all together: the psychology, the economics, the credit. This world is a sea of variables.
00:39:09.940 And Rachel Maddow goes in and she picks out just one of them, just one of the important variables, to mock somebody who may have considered more than one variable.
00:39:20.700 If you want to really be the champion of loserthink, be the person who says only one variable matters in a big multivariable situation, and then go in public and mock the people who think it's a multivariable situation.
00:39:36.240 If you wanted to reach the peak of bad thinking, Rachel Maddow, you have achieved the summit.
00:39:47.380 All right. Anything else happening?
00:39:53.460 I keep seeing questions about Dr. Shiva and whether I agree or disagree, so I will reiterate what I said about that.
00:40:03.240 Dr. Shiva has a whole range of opinions within this category of coronavirus stuff, and I don't know which ones you find controversial or would like my opinion on.
00:40:15.600 So if you want to be more specific, put it in the form of a statement: Dr. Shiva says X. What do you think?
00:40:23.940 I'd be happy to give you an opinion on that, but you have to narrow it down a little bit.
00:40:27.840 I've sort of skimmed his opinions, and I didn't see anything that far out of line with what I would think. I didn't see anything that shocked me, so I don't know what you're talking about exactly.
00:40:42.900 Want to hear a Linda Tripp story?
00:40:46.660 So here's one of the weird things about my life, and I've noticed this about other people as well.
00:40:53.380 Have you ever noticed there are some people who, through what at least looks like pure coincidence, often find themselves somehow attached to the biggest stories in the world?
00:41:05.820 Have you ever noticed that? There are some people who just go through their lives, and they're not even trying to do it, but suddenly they get somehow attached to a story, or their brother is doing something, or their spouse is on something. They're always just connected to the biggest stories in the world.
00:41:24.300 Have you ever noticed that? It's like this weird pattern that can't be explained.
00:41:27.820 Now, I assume it's just confirmation bias and selective memory and stuff, but it's one of those fun things about reality that looks like that. And I'm one of those people.
00:41:37.140 So I observe it in others, but I observe it in myself as well: the number of times I'm connected to a major story is just sort of weird. And the Linda Tripp one is one of those examples.
00:41:48.900 So, and there's no importance to the story; it's just an interesting connection.
00:41:54.100 So Linda Tripp, if you remember, had her knowledge about what Monica Lewinsky was up to, and she didn't know what to do with it. And she sought advice.
00:42:04.840 Now here's the interesting part. The person she sought advice from was a book editor, a publishing-editor kind of person.
00:42:18.220 And that publishing-editor kind of person, I happened to know.
00:42:23.520 So I actually knew, I think she's passed away, but I knew the woman, the agent that Linda Tripp went to, and that agent told her to make her story public.
00:42:37.540 So that's the part of the story you know: that Linda Tripp had some information, she got some advice from a publisher, and the publisher said, yeah, you should take that public.
00:42:49.420 Here's the part you don't know. The person who gave her that advice was a huge anti-Clinton person.
00:42:56.320 So the advice was not unbiased; it was just pure political advice to take down a president.
00:43:01.900 Yes, it was Jonah Goldberg's mother, that is correct.
00:43:04.800 Now, Jonah Goldberg's mother, who gave Linda Tripp the advice, was married to, I believe, the senior vice president of my cartoon syndication company.
00:43:17.980 So I knew the husband well, because I worked with him as part of my cartooning. So I knew well the guy who was married to the woman who gave the advice, and those two people are the parents of Jonah Goldberg, whom I met at an event during those times.
00:43:39.040 So there's no real point to the story, except the number of times I find myself, you know, connected to a story. And then, you know, time goes by.
00:43:49.920 Most of you know that Jake Tapper and I did a thing where Jake, who is also a cartoonist, drew a week of Dilbert comics that we used for a charity for wounded veterans. And so we did that twice.
00:44:04.340 So I have, you know, this sort of working professional connection with Jake Tapper, who also dated Monica Lewinsky.
00:44:14.740 So in this weird way, I'm just minding my own business over in my life, and I've got, you know, two separate connections to this whole Monica Lewinsky story.
00:44:25.320 I mean, how random is that? It's not even six degrees. It's like, you know, one degree. Or is it two? However you count that.
00:44:37.480 All right.
00:44:41.740 Dr. Shiva, somebody says, does not believe HIV causes AIDS. Well, I don't know anything about that topic. But that's not what you were asking me about Dr. Shiva.
00:45:01.720 Yeah.
00:45:02.900 And by the way, that's not a
00:45:04.200 secret or anything.
00:45:05.760 Jake has actually published,
00:45:07.460 I think, at least one article
00:45:08.740 in which he talked favorably
00:45:12.360 about Monica just being a good
00:45:14.080 person.
00:45:14.420 And by the way, as far as I
00:45:17.400 can tell, you know, we've
00:45:19.900 had a long time to observe
00:45:21.460 Monica Lewinsky in the public
00:45:23.020 eye, and I think we could say
00:45:24.780 she was a good person.
00:45:26.880 That's my observation.
00:45:28.020 She looks like a good person.
00:45:29.620 She was in a difficult
00:45:31.800 situation, but she just seems
00:45:34.400 like a good person to me.
00:45:39.820 All right.
00:45:41.940 Prediction on when California
00:45:43.280 will relax and stuff.
00:45:45.620 Well, I don't think anybody's
00:45:47.680 made a decision, but here's
00:45:49.600 what I did today to try to
00:45:50.800 move the ball forward.
00:45:51.720 So I tweeted today that I
00:45:54.020 think we should get back to
00:45:55.000 work whenever the experts
00:45:57.300 can tell us that doing so
00:45:59.620 would keep the death count from
00:46:01.180 coronavirus specifically below
00:46:02.920 50,000.
00:46:04.300 Now, here's why I'm doing it.
00:46:06.300 Having spent many years
00:46:08.720 doing predictions and
00:46:10.500 estimating things and doing
00:46:11.900 budgets and that sort of
00:46:12.840 thing.
00:46:13.400 One of the things I learned is
00:46:14.780 that you can often estimate the
00:46:16.460 cost of something or the price
00:46:19.120 that somebody would be willing
00:46:20.260 to pay without knowing any
00:46:22.440 information about the topic.
00:46:24.840 Now, you say to yourself, well,
00:47:26.180 Scott, how is that possible?
00:46:27.580 How could you reasonably estimate
00:46:29.180 what somebody would pay for
00:46:30.240 something if you had very little
00:46:32.140 information about those people,
00:46:33.920 what they want, what the item is?
00:46:37.000 And I would say you need to know
00:46:38.900 something about it.
00:46:40.160 But here's how you can make
00:46:41.640 predictions without knowing much
00:46:43.800 about anything.
00:46:45.380 It goes like this.
00:46:47.400 Where you would feel comfortable is
00:46:50.160 probably not that far off from where
00:46:52.180 other people would feel comfortable.
00:46:54.280 So, for example, if somebody came to
00:46:56.080 you and said, I've developed this new
00:46:58.700 product.
00:46:59.220 It's a blank.
00:47:01.560 It has these functions and you want
00:47:03.720 it because of this reason.
00:47:04.760 It doesn't even matter what it is.
00:47:06.280 And you look at it and you say,
00:47:07.540 okay, what would you pay for this?
00:47:09.620 And so I hold it in my hands and I
00:47:11.280 look at all the features and I figure
00:47:12.820 out what it would do for me and I say
00:47:14.080 to myself, I would pay, I don't know,
00:47:17.800 50 bucks.
00:47:19.780 And what you would find is that you
00:47:21.240 could do this experiment all over the
00:47:23.100 country.
00:47:23.540 You'd probably have to keep it within
00:47:24.580 the country.
00:47:25.780 But you'd say, okay, hold this product.
00:47:28.000 Tell me what you would pay for it.
00:47:29.640 And people would say, I don't know,
00:47:31.580 60 bucks.
00:47:32.840 But you're going to find out they don't
00:47:34.140 say a thousand and they don't say
00:47:36.500 two dollars.
00:47:38.520 For purely psychological reasons,
00:47:41.360 for bias reasons, for
00:47:44.320 irrational reasons, we tend to gravitate
00:47:47.040 toward the same perceived price of
00:47:50.100 things.
00:47:51.860 There's no reason for it.
00:47:53.140 It's not based on being smart.
00:47:54.900 It's just a phenomenon that,
00:47:56.720 you know, I've observed over the
00:47:58.580 years.
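If you wanted to see that clustering idea as a little simulation, it might look something like this; the anchor and the spread are numbers made up purely for illustration:

```python
import random

# Simulation of the "we all gravitate toward the same price" idea.
# The anchor and the spread are invented numbers, purely for illustration.
ANCHOR = 50          # hypothetical shared gut-feel price, in dollars
SPREAD = 10          # how much individual answers wander from the anchor

def comfortable_price() -> float:
    """One person's answer to 'what would you pay for this?'"""
    return random.gauss(ANCHOR, SPREAD)

answers = [comfortable_price() for _ in range(1_000)]
print(f"typical answer: ~${sum(answers) / len(answers):,.0f}")
# You'll see 40s, 50s, 60s, but essentially never $2 or $1,000,
# which is the clustering described above.
```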
00:47:59.480 So, I believe that people will also
00:48:03.180 gravitate toward 50,000 for the
00:48:06.300 following reasons.
00:48:07.600 50,000 tends to be sort of a weird
00:48:10.380 cutoff in our experience of life.
00:48:13.920 50,000 is sort of the number of
00:48:16.380 people who died in Vietnam.
00:48:18.760 And we say that was too many.
00:48:20.840 50,000 is roughly in that
00:48:22.880 neighborhood of how many people die
00:48:24.260 from guns.
00:48:25.620 50,000 is sort of where people are
00:48:27.960 dying from automobile accidents.
00:48:30.280 50,000 is sort of where the overdose
00:48:33.120 numbers are.
00:48:34.920 Now, you know, those numbers run anywhere from 30,000 to 100,000,
00:48:37.240 but I'm just saying 50,000 is in that
00:48:38.920 range, right?
00:48:40.500 50,000 has taken on a psychological and
00:48:46.260 emotional meaning for us.
00:48:47.940 This is a hypothesis, right?
00:48:50.120 So, I can't point to any science behind this,
00:48:52.620 so don't take it too seriously.
00:48:54.160 I put it down as a hypothesis.
00:48:57.080 Maybe we'll see if something happens this
00:48:59.280 way.
00:48:59.740 We'll never know if it's because of this.
00:49:02.040 But, so here's the basic idea.
00:49:04.680 Yeah, and the flu.
00:49:05.320 Somebody says in the comments, the average
00:49:08.320 flu kills in the range of, you know,
00:49:11.280 give or take 50,000.
00:49:12.820 So, I believe that in our minds, there is
00:49:15.600 a magic number which we accept in the
00:49:19.080 United States as too much.
00:49:22.040 And then below that, it's like, well, I
00:49:24.320 will accept up to 50,000 deaths because
00:49:27.520 I need a car.
00:49:28.940 You know, society needs transportation.
00:49:31.380 I will accept 50,000 deaths from illegal
00:49:35.500 drugs, overdoses, because the alternative
00:49:38.480 is what?
00:49:40.260 You know, closing all the borders and not
00:49:43.100 accepting mail from other countries.
00:49:45.220 And, you know, so they say, well, given the
00:49:47.400 trade-offs, I'll accept that.
00:49:49.980 So, I'm going to continue to put out the
00:49:53.560 50,000 number.
00:49:54.680 And by the way, I heard Lindsey Graham use
00:49:56.700 the same number.
00:49:57.300 I don't know.
00:49:59.740 It's possible that he heard it from
00:50:01.540 somebody who heard it from me.
00:50:03.000 But I think it's more likely that he came
00:50:05.880 up with the same number.
00:50:07.720 I think the most likely scenario is that
00:50:09.880 Lindsey Graham did the same thing I'm
00:50:11.380 doing right now, which is, okay, if
00:50:13.940 I had to think about it just as a
00:50:16.320 citizen, where would I be comfortable to
00:50:20.300 get the economy back knowing that X number
00:50:23.800 of people will die?
00:50:24.940 What is X?
00:50:25.540 What's my X?
00:50:26.580 And he said, 50,000.
00:50:29.040 The more of us who can say that, the more
00:50:32.380 50,000 becomes a thing.
00:50:34.940 And it could even be that we're debating whether
00:50:37.640 it should be above or below 50,000.
00:50:40.460 As long as that number gets in our head, then
00:50:43.800 we've got something we can tell our experts.
00:50:46.660 Hey, experts, we know you can't be that
00:50:49.080 accurate.
00:50:50.060 But give us your best estimate of when we
00:50:53.840 can go back to work and under what
00:50:55.540 conditions that will keep us under this
00:50:58.520 psychologically important level of 50,000.
00:51:02.680 And I think that makes sense. This is what
00:51:05.460 I call the drug dealer's trick, where you
00:51:07.780 find a way that you're both psychologically
00:51:09.640 happy, even if you don't know you've made
00:51:12.060 the right decision.
00:51:12.800 So we need to be, as a society, we need to be
00:51:16.540 psychologically comfortable with whatever we
00:51:18.900 decide, but also completely aware that we can't
00:51:22.440 know what's the best thing to do.
00:51:23.760 It's just not knowable.
00:51:25.380 You can take a guess and you can adjust as you
00:51:27.440 go, but you've got to be comfortable or
00:51:29.720 you're not going to get off the first square.
00:51:32.400 And I think to do that, we need to float the
00:51:35.280 number of 50,000 and just wrestle with it for a
00:51:39.820 while until that number becomes something we
00:51:42.320 feel is comfortable.
00:51:45.840 I see a very insightful comment that says
00:51:49.480 bullshit.
00:51:52.140 Somebody says too low.
00:51:56.440 Too low, you're saying too low because it
00:51:58.560 won't get us back to work.
00:52:00.800 Well, I think that we can get to 50,000.
00:52:03.640 And I think we can get there simply by keeping
00:52:06.300 old people locked up and doing a better job.
00:52:10.080 You know, here's an idea that nobody's talked
00:52:13.140 about.
00:52:14.300 Suppose you had only a couple of tests.
00:52:17.980 Now, let's say you had two tests and you had
00:52:22.320 two extended families.
00:52:24.340 So there were two, you know, families that had
00:52:26.660 everyone from children to parents.
00:52:29.260 And let's say one grandparent is living in the
00:52:31.300 home.
00:52:32.180 And there are two of these situations.
00:52:33.920 So two households have a grandparent there.
00:52:36.920 You only have two tests available.
00:52:39.700 What's the best thing you can do if you have
00:52:42.220 two tests?
00:52:43.380 You test the two grandparents in the two
00:52:45.440 different households.
00:52:46.740 And then you put the grandparents in the same
00:52:49.260 house and you have the young people move over
00:52:52.120 and double up in the house that has the young
00:52:55.040 people in it.
00:52:56.220 So then you've got the two tested grandparents.
00:52:58.540 You just say, well, nobody's coming in this
00:53:00.120 house for six months.
00:53:01.540 You know, we're going to shove pizzas under the
00:53:03.300 door, but nobody's coming in the house.
00:53:05.460 You guys have tested.
00:53:06.960 You're good.
00:53:07.960 You just live in this house and the rest of us
00:53:09.920 will live with the low risk people.
00:53:12.180 Now, I don't know how often things like that
00:53:13.920 would happen.
00:53:14.480 I use that as an extreme example to say that we
00:53:18.080 have not exhausted the cleverness that we can
00:53:20.940 apply.
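Here's that regrouping idea as a toy sketch; the households, the risk scores, and the allocation rule are all invented for illustration:

```python
# Toy sketch of the "two tests, two grandparents" idea.
# Households, risk scores, and the rule itself are invented for illustration.
households = [
    ["grandparent A", "parent A", "kid A"],
    ["grandparent B", "parent B", "kid B"],
]
risk = {
    "grandparent A": 9, "grandparent B": 8,   # high risk
    "parent A": 3, "parent B": 3,
    "kid A": 1, "kid B": 1,
}

tests_available = 2
everyone = [person for home in households for person in home]

# Spend the scarce tests on the highest-risk people.
tested = sorted(everyone, key=risk.get, reverse=True)[:tests_available]

# Regroup: the tested high-risk pair seals up in one house;
# everyone else doubles up in the other.
sealed_house = tested
doubled_up_house = [p for p in everyone if p not in tested]

print("sealed house (pizzas under the door):", sealed_house)
print("doubled-up house (lower risk):", doubled_up_house)
```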
00:53:22.140 So the government could say right now: our models
00:53:25.620 say that we're still going to be at 60,000,
00:53:27.920 but we don't think our cleverness has been exhausted.
00:53:32.820 We have not gotten to the bottom of the well of our
00:53:35.040 cleverness.
00:53:36.040 So even though our models are still saying 60,000,
00:53:39.460 we feel comfortable saying that human ingenuity can get
00:53:43.460 us below 50,000.
00:53:44.580 So I think our experts have plenty of ability to say, yeah,
00:53:48.260 this is the time and this is the way.
00:53:51.660 And they could also say going to work without masks won't get
00:53:55.580 you under 50,000, but going to work with masks, and with
00:53:59.420 hydroxychloroquine once we know a little bit more about
00:54:02.760 its effectiveness, might.
00:54:03.900 So they could give us a prescription that is effectively
00:54:07.920 a checklist.
00:54:09.420 I created a checklist for you.
00:54:11.400 But they could create a better one that says, all right,
00:54:13.580 if you meet these criteria, you're good to go.
00:54:16.800 And I think they could keep it under 50,000.
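A checklist like that could be almost mechanical. Here's a sketch; the criteria are placeholders invented for illustration, not anything official:

```python
# Sketch of a back-to-work checklist. Every criterion here is a
# placeholder invented for illustration; a real list would come
# from the experts.
def good_to_go(person: dict) -> bool:
    checklist = [
        person["has_mask_and_will_wear_it"],         # "check"
        not person["lives_with_untested_high_risk"], # "check"
        person["workplace_allows_distancing"],       # "check"
    ]
    return all(checklist)

worker = {
    "has_mask_and_will_wear_it": True,
    "lives_with_untested_high_risk": False,
    "workplace_allows_distancing": True,
}
print("good to go:", good_to_go(worker))  # True -> back to work
```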
00:54:19.800 So these are the things I'll be pushing the most.
00:54:22.200 The number of 50,000, a checklist so our experts can tell us
00:54:27.060 whether we're on the list to go back first, and then a decision
00:54:31.640 on the end of the month.
00:54:33.580 If you can give me those things, plus some confidence that we're
00:54:38.620 not going to run out of ventilators and stuff, and I think
00:54:41.640 we're getting close to that, then I feel like we have something
00:54:45.540 that looks like a plan.
00:54:46.480 And I would be very proud of living in a country that could get us
00:54:50.960 to that point, actually.
00:54:52.600 If we can get to the point where we've got a number that we're
00:54:55.980 managing to, and we've got a checklist where we can all say, yes,
00:55:00.200 yes, that applies.
00:55:01.840 You know, no science needed.
00:55:03.940 Either I have a mask and I'm going to wear it, or I don't.
00:55:06.520 Check, check.
00:55:08.120 That's a plan.
00:55:09.620 So let's get to that point.
00:55:11.060 All right, that's all I got for today.
00:55:12.140 I will talk to you tonight, you know when, 10 p.m.
00:55:16.500 Eastern Time, 7 p.m. Pacific.
00:55:20.020 And I will see you then.