Real Coffee with Scott Adams - October 02, 2021


Episode 1517 Scott Adams: Headlines and Coffee Go Together Perfectly. Come Join Us.


Episode Stats

Length

1 hour and 10 minutes

Words per Minute

146.9489

Word Count

10,302

Sentence Count

792

Misogynist Sentences

9

Hate Speech Sentences

28
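The stats above are internally consistent; a quick arithmetic sketch shows how the words-per-minute figure relates to the listed word count and runtime (the exact runtime is an assumption inferred from the listed numbers, since "1 hour and 10 minutes" is rounded):

```python
# Consistency check on the episode stats above.
# Assumption: the listed runtime of "1 hour and 10 minutes" is rounded;
# the precise words-per-minute figure implies a runtime of about 70.1 minutes.

word_count = 10_302
listed_wpm = 146.9489
runtime_minutes = 70  # "1 hour and 10 minutes", as listed

wpm = word_count / runtime_minutes
print(f"{wpm:.1f} words per minute")  # ~147.2, close to the listed figure

# Working backwards from the listed words-per-minute value:
implied_minutes = word_count / listed_wpm
print(f"implied runtime: {implied_minutes:.1f} minutes")  # ~70.1
```

The small gap between 147.2 and 146.9 is just the rounding of the runtime to whole minutes.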


Summary

Kamala Harris gets the Charlottesville hoax treatment, Wolf Blitzer gets a shout-out for never having been accused of sexual harassment, and California becomes the first state to require all children in public school to get COVID-19 vaccinations. Plus, we discuss whether a new vaccine can fairly be compared with long-established ones like the measles shot.


Transcript

00:00:00.000 Well, today, with any luck, we will feature Boo the cat, who is below my desk and licking
00:00:11.000 herself in a most disgraceful way.
00:00:14.460 Well, I appreciate all the cat pictures that are coming in the comments over on the Locals
00:00:19.340 platform.
00:00:21.000 And good morning, YouTube, too.
00:00:23.760 So how would you like today to be great?
00:00:28.100 Yeah, you would like that.
00:00:29.640 I know.
00:00:31.140 Have you noticed that I talk to you as if you're answering and I can hear you?
00:00:34.840 You do?
00:00:35.780 Yeah.
00:00:36.640 I thought you'd notice that.
00:00:38.560 And you notice if I ask you a question, it doesn't even matter what you say, because my answer
00:00:42.780 is going to be the same anyway?
00:00:44.140 Right.
00:00:44.820 Exactly.
00:00:46.620 All right.
00:00:47.180 Well, let's enjoy ourselves to the maximum extent.
00:00:51.160 And all you need is a cup or mug or a glass, a tankard, chalice or a stein, a canteen, jug or
00:00:55.320 a flask, a vessel of any kind.
00:00:57.580 Fill it with your favorite liquid.
00:01:00.220 I like coffee.
00:01:01.660 Wait, you like what?
00:01:03.360 Oh, okay.
00:01:03.960 Well, that's good, too.
00:01:05.640 And join me now for the unparalleled pleasure.
00:01:09.080 The dopamine hit of the day.
00:01:10.820 The thing that makes everything better.
00:01:13.300 It's called the simultaneous sip.
00:01:16.260 It happens now.
00:01:17.260 Go.
00:01:23.080 Yes, I have been called the Mr. Rogers of the Internet or something.
00:01:31.340 I have been called that.
00:01:33.620 All right.
00:01:34.100 Well, let's talk about the news.
00:01:35.200 A special shout-out to Wolf Blitzer, who, despite having the first name Wolf, has never
00:01:44.900 been accused of sexual harassment, as far as we know.
00:01:48.920 I say that only because over at CNN, Don Lemon has some accusations he's dealing with, and Chris
00:01:55.500 Cuomo has some accusations he's dealing with.
00:01:57.740 And I just think it's funny that the only guy named Wolf, he's cool.
00:02:08.260 Is that his real name?
00:02:10.320 Somebody says he made up that name.
00:02:12.660 Maybe.
00:02:12.900 But, yeah, good for Wolf Blitzer for staying out of trouble for all these years.
00:02:21.040 Well, Kamala Harris gets the Charlottesville hoax treatment by CNN and some others in the
00:02:27.900 network world, and couldn't have happened to a nicer person.
00:02:32.400 And what I mean by that is, you probably saw the story in which she was talking, Harris was
00:02:37.940 talking to a classroom, and one of the students asked a question in which she referred to Israel
00:02:44.900 as being involved in, quote, ethnic genocide.
00:02:49.600 Now, Harris did not push back on that thought.
00:02:54.020 And the thinking was that the lack of pushing back kind of, like, is almost as good as applauding
00:03:00.600 it.
00:03:00.900 But what she did say, if you listen to it carefully, is she basically applauded the fact that this
00:03:10.620 young woman could have an opinion that might be different from the mainstream, and that
00:03:16.540 she could express her opinion, and that that was her truth.
00:03:20.460 What did people hear?
00:03:22.340 They heard that she was agreeing that Israel was involved in ethnic genocide, which, you
00:03:29.040 know, most people would say they're not.
00:03:32.360 So, I feel like Harris is getting the Charlottesville fine people hoax treatment.
00:03:42.260 Not that she doesn't deserve it.
00:03:43.900 I mean, she's done enough to, you know, to do her own hoaxes.
00:03:48.400 But if I'm being objective, it didn't look to me like she was agreeing with the idea.
00:03:53.720 Now, she didn't push back.
00:03:55.960 She didn't push back, and that's true.
00:03:57.940 And if you were Israel, I think you'd have something to say about that, right?
00:04:03.480 Because you want to remind people for the next time.
00:04:07.080 You know, the next time this comes up, you know what would be cool?
00:04:11.700 You just push back on that a little bit.
00:04:13.740 We wouldn't mind that at all.
00:04:15.560 But the letting it go, we don't like that.
00:04:18.300 Now, did she let it go?
00:04:19.420 Yeah.
00:04:20.020 She did let it go.
00:04:21.320 But she let it go with a specific statement in favor of freedom of speech of unpopular ideas.
00:04:27.220 I'm going to give her a pass on that.
00:04:31.540 No, I don't think that she was saying anything about Israel.
00:04:34.060 I think she was just trying to be, you know, I think trying to be flexible and maybe, you know, teach a lesson to the young people.
00:04:44.840 But I didn't hear what the news is reporting.
00:04:47.680 I listened to the whole thing twice.
00:04:49.220 I didn't hear her doing anything wrong.
00:04:53.440 And I did not hear her agreeing with the comment about ethnic genocide.
00:04:57.880 I definitely didn't hear that.
00:05:00.320 Right.
00:05:03.620 Hit the like button, you grifter.
00:05:06.240 Okay, I don't know what that's about.
00:05:07.360 Gavin Newsom has announced that California will be the first to require vaccinations for all children going to school.
00:05:19.040 He points out that our schools already require vaccines for measles, mumps, and more.
00:05:25.260 So why would this be different?
00:05:26.840 Well, isn't the reason that this is different that it hasn't been around as long?
00:05:34.560 And that's the whole problem, right?
00:05:36.940 Now, don't you say, oh, but those other vaccinations, they've been around forever, so it's safer.
00:05:42.760 But was that true when they were new?
00:05:46.260 Can somebody give me a little history lesson?
00:05:48.760 Give me a fact check on this.
00:05:50.920 Yeah, it's very different.
00:05:51.920 So the fact that it's different is meaningful, right?
00:05:55.920 You can't say this is like everything else.
00:05:57.800 It's different.
00:05:58.860 But is it different in the important way?
00:06:02.280 Is it different in the sense that these other vaccinations
00:06:06.720 had been around and tested for, let's say, five years
00:06:10.760 before the first time the public was required to get them?
00:06:16.960 Maybe.
00:06:17.820 I mean, I'm not doubting it.
00:06:19.200 I just don't know the history.
00:06:20.300 They've had studies, right?
00:06:22.900 They've had studies by now.
00:06:24.920 So I understand the point that if you're talking about today,
00:06:29.820 vaccines that have been around a long time look safer than one that's new.
00:06:34.080 Everybody's on the same page with that, right?
00:06:36.440 Everybody would agree with the general statement
00:06:38.340 that any kind of medicine that's been around for a long time is safer.
00:06:42.900 You know, just in terms of unknown risks, it's safer.
00:06:46.000 It might not be safer, but in terms of unknown risks, there are fewer of them.
00:06:50.300 We all agree with that.
00:06:51.340 But the question is, when those other vaccinations were brand new,
00:06:56.580 how long did they wait before the public got them,
00:07:00.860 and how long did we wait before they were mandatory for kids,
00:07:04.100 before they could go to school?
00:07:07.920 Yeah, so the other question is whether they were required when new.
00:07:11.140 Yeah, that's a good question, too.
00:07:12.760 Were the other vaccinations required when they were new?
00:07:16.160 So I'd just like a history lesson, if somebody could give me one,
00:07:19.340 or point to a link.
00:07:21.560 Because Newsom's argument sort of depends on that, doesn't it?
00:07:25.680 It depends on you knowing that history.
00:07:28.580 And I feel like it's a little weasel-ish because we don't.
00:07:33.980 Exactly.
00:07:34.400 So I'm seeing one comment that says mandatory vaccinations
00:07:41.600 were usually about 15 years later.
00:07:44.780 Now, I don't know if they waited for a reason,
00:07:50.440 or it just wasn't being demanded in schools yet.
00:07:54.620 FDA didn't approve.
00:07:56.060 What do you mean?
00:07:56.820 Oh, FDA didn't approve.
00:07:58.060 Well, I don't know.
00:08:01.200 I think "the FDA didn't approve" is not exactly as true as it should be.
00:08:08.660 Because I think they did approve it,
00:08:10.220 they just did it on a quicker basis and with less information, maybe.
00:08:14.380 But it's definitely approved.
00:08:16.820 Saying it's not approved, I don't think that's factually true.
00:08:20.400 It might be approved with less information.
00:08:22.620 Now, that would be true, I think.
00:08:23.720 So what do you think about that?
00:08:29.820 Mandatory vaccinations for kids.
00:08:32.800 Everybody happy about that?
00:08:34.680 I've got a feeling that this audience is not too happy about that.
00:08:39.100 Yeah.
00:08:40.020 Well, it's going to happen anyway.
00:08:41.920 What happens when we have mandatory vaccinations for teachers?
00:08:46.020 Because that's already true in New York City, right?
00:08:49.220 Is it mandatory in New York City?
00:08:50.640 I think mandatory vaccinations for teachers is kind of...
00:08:57.460 It has to happen, right?
00:09:01.080 Oh, so here's an interesting question.
00:09:03.760 I don't think Newsom can get away with mandatory vaccinations for kids,
00:09:08.820 but not the teachers and the staff.
00:09:12.620 What am I missing here?
00:09:14.540 Am I missing something?
00:09:15.560 Did Newsom already say the teachers need vaccinations?
00:09:17.980 I don't think so, right?
00:09:19.280 Or did he?
00:09:19.780 Maybe that's the part I'm missing.
00:09:25.420 Well, what happens when the teachers' unions in California
00:09:29.400 learn that they're going to have to have mandatory vaccinations too?
00:09:33.700 Are they going to get them?
00:09:36.600 So it looks like maybe the governor and the people
00:09:40.520 and the teachers' unions are going to have a little issue here, aren't they?
00:09:45.580 Right?
00:09:46.100 Or is there something I'm missing?
00:09:52.920 Wow.
00:09:53.440 I'm hearing from a doctor over on Locals.
00:09:57.200 I just missed getting a bad polio vaccine that gave 50% of the people polio.
00:10:02.060 Was that true back in the early polio vaccine days?
00:10:05.660 Was there a bad version?
00:10:09.000 I've never heard that story, but if it's true.
00:10:11.520 I mean, it's coming from a doctor, so I'm guessing it is true.
00:10:15.000 Oh, I'm seeing a lot of yeses.
00:10:17.080 Okay.
00:10:17.560 Well, you can't really even compare that day and age to today, I don't think.
00:10:21.720 I don't think that would be a fair comparison.
00:10:23.220 Because, you know, here's my guess.
00:10:27.160 My guess, and I could use a fact check on this too,
00:10:31.180 is that our ability to predict what kinds of things are going to be a problem in the future
00:10:35.360 is probably better, right?
00:10:36.880 Because we've seen enough things in the past, we're like,
00:10:39.040 oh, it's like that thing in the past, better watch out for that one.
00:10:42.700 So I don't think you can compare, you know, 30 or 50 years ago medicine to today.
00:10:49.120 I just think that's apples and oranges at this point.
00:10:51.140 Anyway, that's happening.
00:10:55.460 What else is happening?
00:10:57.720 In New York City, there's a George Floyd statue that just went up,
00:11:02.820 along with two others.
00:11:05.260 Also, Breonna Taylor and John Lewis.
00:11:10.420 Civil rights hero John Lewis.
00:11:12.980 Now, let me ask you this.
00:11:15.140 Let's say you're the family of John Lewis.
00:11:17.700 Do you all know John Lewis?
00:11:21.140 John Lewis died last year?
00:11:23.120 Was it within last year he died?
00:11:26.260 And widely considered a hero of the civil rights movement, right?
00:11:33.600 Pretty much left, right, center.
00:11:36.560 Everybody considers John Lewis a hero, American hero.
00:11:40.420 His bust was just put up with George Floyd's.
00:11:46.960 So now they're treated about the same in terms of historical importance, similar.
00:11:54.880 Similar.
00:11:56.400 Does that feel good to you?
00:11:58.760 Let me say this.
00:12:00.160 As many of you know, the rules about who you identify with or what your identity is, is up to you.
00:12:10.040 Everybody agrees with that so far, right?
00:12:12.440 You can identify any way you want.
00:12:14.500 You can be gay or straight or non-binary.
00:12:17.760 You can be whatever ethnicity you want if you identify that way.
00:12:21.380 And I like that rule because I've decided that I will identify as a white, I'm sorry, forget the white part, as a heterosexual black man.
00:12:35.700 So I identify as a heterosexual black man because I've lost several jobs to racial discrimination.
00:12:43.980 If you don't know that story, I'm not going to get into it now.
00:12:46.600 But two corporate jobs I lost because my boss, in each case, told me directly,
00:12:52.180 can't promote you because you're white and male, right?
00:12:55.880 And one TV show because I was white.
00:12:57.820 So I've been massively discriminated against, economically, for being white.
00:13:05.260 So I feel like I have a lived experience of a person who lives in a world in which they're continuously discriminated against.
00:13:11.600 So I've just decided to identify as black.
00:13:15.260 One, because I can.
00:13:17.080 Apparently the rules allow that.
00:13:19.120 And I'm just following the rules.
00:13:21.020 And since I have, like, some connection, some affinity with that part of the country,
00:13:27.020 and I also like being on a winning team.
00:13:31.640 That might be the most offensive part of what I'm saying.
00:13:34.580 I like being on the winning team.
00:13:37.100 And I feel like black America is sort of on a winning streak at the moment.
00:13:44.560 I mean, things are still terrible in so many ways, which is why it's worth, you know,
00:13:48.740 it's worth being active in that area because things are so bad.
00:13:52.040 Maybe you can help.
00:13:53.480 But definitely black America is doing better.
00:13:56.180 Right.
00:13:57.280 Than in the past.
00:13:59.300 Although you could argue the 60s, they had more homeownership.
00:14:02.280 So I think there's a little ambiguity on that.
00:14:06.480 But let me give some advice to the group that I associate with.
00:14:12.380 I don't feel like you're doing what you hope to accomplish by putting up statues of George Floyd.
00:14:24.080 I just don't feel that's the hero that you want.
00:14:29.300 And I don't have to get into the details, right?
00:14:31.680 You know, he's, you know, the Floyd family, they don't need that.
00:14:36.400 But it feels like a gigantic mistake in terms of branding.
00:14:44.420 So I'll just put that out there.
00:14:46.720 There's room for disagreement on this, right?
00:14:49.420 Now, we do have a history that victims are sometimes remembered.
00:14:54.380 And it's important.
00:14:56.440 It is important.
00:14:56.760 It is important to remember victims.
00:14:59.220 But I don't know if you want to pair the victim with the hero.
00:15:03.500 Does that feel right?
00:15:05.400 Because John Lewis is like one of the, you know, most impressive heroes that America's ever produced.
00:15:14.520 And putting him in the same event with George Floyd, who was just in the wrong place at the wrong time, doing the wrong stuff.
00:15:23.900 I mean, the cops were doing the wrong stuff, too.
00:15:26.940 So, but that just feels like a mistake to me.
00:15:32.620 So I say that as an insider.
00:15:35.420 As a proud heterosexual black man, I feel like that did not help our cause.
00:15:41.100 There was a massive fentanyl ring bust.
00:15:47.340 Like 800 dealers got rounded up.
00:15:50.140 So good job for the DEA.
00:15:52.140 Was it the DEA?
00:15:53.000 Yeah.
00:15:53.480 I think they did a great job.
00:15:55.080 I guess they've been surveilling this network for a long time.
00:15:59.320 And they just rolled up a whole bunch of them.
00:16:00.980 So it sounds like amazing work.
00:16:03.000 Congratulations on that.
00:16:05.380 But I tweeted that fentanyl dealers should get the death penalty.
00:16:09.220 Because they are mass murderers.
00:16:11.780 They just do it statistically.
00:16:14.800 Right?
00:16:15.520 It's not less of a murder if you don't know the name of the person who's going to get killed.
00:16:21.560 Take, and I like to use my example of the Las Vegas mass murder.
00:16:26.000 You know, the guy who shot from the window of the hotel.
00:16:29.800 He didn't know the names of who he was killing or even which people in the crowd.
00:16:34.100 Statistically, he murdered people.
00:16:38.440 Definitely.
00:16:39.760 He murdered a bunch of people.
00:16:41.580 He didn't know who they were.
00:16:43.240 And didn't even know which body he was aiming at exactly.
00:16:46.580 He was just sort of spraying the crowd.
00:16:48.460 So is he not guilty of murder because he didn't know who he was murdering?
00:16:53.340 No.
00:16:54.060 No, he's guilty of murder.
00:16:55.540 He's just dead, so we can't do anything about it.
00:16:57.240 Likewise, if you're a fentanyl dealer, if you were, let's say, a small dealer and you gave some pills to your friends, well, probably that's not exactly like murdering people.
00:17:10.300 That's taking a bad risk with your friends, and that's probably bad.
00:17:15.020 But suppose it's your job.
00:17:17.060 You're a big old fentanyl dealer and you're just moving lots of it.
00:17:20.360 And maybe hundreds of people are involved with your network at that point.
00:17:26.020 If you've given a fentanyl product to hundreds of people, you've killed people.
00:17:33.020 You're a murderer.
00:17:35.000 Statistically speaking, if you give fentanyl in pills to hundreds of people, some of them die.
00:17:44.160 And you know that.
00:17:45.180 It's not even a chance.
00:17:47.000 Some of them are going to die.
00:17:48.100 So, to me, that's just murder.
00:17:50.960 It just is being done in a statistical sense.
00:17:53.880 Now, what was the pushback I got from that on Twitter?
00:17:56.780 Scott, Scott, Scott, people said.
00:17:59.440 Now, what about people who give you sugar?
00:18:05.460 Sugar's killing you, right?
00:18:07.860 How about the fast food companies?
00:18:12.260 They're killing you, statistically speaking.
00:18:15.340 That's a good point, right?
00:18:17.100 Statistically speaking, McDonald's is murdering, you know, tens of thousands of people a year.
00:18:23.600 Statistically.
00:18:24.280 They don't know the names of their victims.
00:18:27.540 But they're part of the problem.
00:18:29.960 So, do you treat them the same?
00:18:31.480 McDonald's and a fentanyl dealer?
00:18:33.800 This is what people were saying to me on Twitter.
00:18:36.520 To which I say, you know, it does matter if it's legal, right?
00:18:40.780 McDonald's is legal.
00:18:44.400 You know, maybe you could argue it shouldn't be.
00:18:47.040 But it is.
00:18:48.500 It's legal.
00:18:50.060 So, you know, alcohol, cigarettes.
00:18:53.660 You could argue they shouldn't be legal.
00:18:56.500 And I would listen to that as a separate argument.
00:18:59.220 But they are legal.
00:19:01.160 You don't put people to death for doing legal things.
00:19:05.600 You know, you'd have to, at the very minimum, you'd have to make it illegal.
00:19:08.780 And then we could have the conversation of whether they're statistical murderers.
00:19:12.900 But if the state has already said, you can do this, and then they do what the state says they can do,
00:19:18.640 even if it does kill people, it looks different to me.
00:19:24.160 To me, that looks different.
00:19:25.120 But I can see how you might disagree.
00:19:29.720 Today, CNN reported nothing is actually something.
00:19:34.920 So how do you turn nothing into something?
00:19:38.800 Well, let me tell you.
00:19:40.160 Let me tell you.
00:19:41.600 It's about the infrastructure bill.
00:19:43.940 And I guess Biden went and talked to the Democrats to try to, you know, try to loosen things up
00:19:49.220 because it's all constipated and they can't agree on this infrastructure bill.
00:19:53.200 And literally nothing happened.
00:19:57.380 So Biden talked to the Democrats.
00:20:00.160 Nothing happened.
00:20:01.980 How did CNN report?
00:20:04.440 Nothing.
00:20:05.680 Here's the exact sentence.
00:20:08.860 White House officials think the president accomplished what he went to do on Capitol Hill.
00:20:13.840 Oh, good.
00:20:14.360 Okay.
00:20:14.680 So here they're going to explain what it was he wanted to accomplish, and we'll see that he did it.
00:20:18.920 It says, quote, remind Democrats of what is at stake while relieving some of the pressure that had built up over the last several days
00:20:28.920 and reiterating his commitment to passing both pieces of legislation.
00:20:34.440 With that done, officials believe negotiators have a better environment to be able to push toward a deal.
00:20:40.900 Am I wrong that that's nothing?
00:20:47.280 And they use words to make it sound like it was something?
00:20:51.640 And listen to this again.
00:20:53.600 There's nothing here.
00:20:56.320 He was supposed to remind Democrats of what is at stake.
00:20:59.540 Which Democrats didn't know what was at stake with the infrastructure bill?
00:21:05.560 Not only with the bill itself, but with what that does with the future and whether failing at it is good or bad.
00:21:12.180 They all understood that.
00:21:13.820 That's the most nothing I've ever seen in the news.
00:21:16.820 He reminded people of what everybody obviously knows.
00:21:20.320 But he did more than that, according to CNN.
00:21:23.180 He also relieved some of the pressure that had built up.
00:21:27.340 How do you measure that?
00:21:30.580 Is there like a pressure gauge on the side of the building of Congress?
00:21:36.180 You're like, ooh, we're up to 80% pressure.
00:21:39.640 We're going to need to send Biden over here.
00:21:42.940 Lower that pressure.
00:21:44.860 This is not a thing.
00:21:46.640 These are just words that were put in a sentence that were put in a paragraph.
00:21:52.780 Literally nothing happened.
00:21:53.920 All right.
00:21:57.340 And also reminded them of his, quote, commitment to passing both pieces of legislation.
00:22:06.380 What does it mean to say the president is committed to something?
00:22:11.340 There's no information here.
00:22:13.920 No information at all.
00:22:15.840 All right.
00:22:16.640 Let's talk about replacement theory that's in the news.
00:22:20.400 You know, the news likes to say that replacement theory, which I'll explain if you don't know what it is,
00:22:27.920 is being pushed by, you know, some Fox News hosts like Tucker Carlson,
00:22:33.320 and that it's all crazy conspiracy theory stuff.
00:22:37.100 Now, replacement theory is the idea that immigration is really about bringing in a lot of people from other countries to replace the, you know,
00:22:50.740 European-biased culture in this country and get more Democrats and, you know, make it a different country.
00:22:57.700 Now, how many of you think that there's somebody who has a plan, like actually, oh, if we bring in lots of people, we'll replace all these white people and we'll get more Democrat votes?
00:23:13.740 In the comments, how many people think that's literally what's happening?
00:23:17.640 And that, and no, I'm not talking about what's happening so much as the intention.
00:23:23.520 So I'm only talking about the intention.
00:23:25.960 Are there people who are actually talking out loud, but privately, about replacing all the white people and making this a different country via immigration?
00:23:38.820 In the comments, I'm seeing both yeses and noes.
00:23:41.800 Lots of yeses.
00:23:43.740 This is actually written in books, Vlad says.
00:23:46.620 It's actually written in books, people saying, let's replace all the white people.
00:23:52.500 So are the people doing it, reading those books and taking that opinion?
00:23:59.020 I'm sure that the idea of it is in books.
00:24:02.780 I wouldn't doubt that.
00:24:04.740 But is that why people are doing it?
00:24:07.480 Is that why, is Nancy Pelosi, do you think that Nancy Pelosi has meetings with people and they're like, all right, don't tell anybody.
00:24:14.140 But what we're trying to do is to get fewer people like me, Nancy Pelosi.
00:24:20.720 That's the goal.
00:24:22.520 We want fewer people like me and my family.
00:24:25.760 Is Nancy Pelosi having that conversation?
00:24:28.440 Because the country would be better if there are fewer people like me.
00:24:32.620 Is she saying that?
00:24:34.800 I'm seeing a lot of yeses.
00:24:36.780 I'm not saying she is or not.
00:24:38.500 I'm just looking at your comments.
00:24:40.660 A lot of people think that Nancy Pelosi wants to replace herself and her family with other people.
00:24:47.640 In addition to replacing all the other white people.
00:24:50.760 So that's what a lot of people think that's literally what's happening.
00:24:54.440 AOC.
00:24:55.620 Somebody says AOC wants to replace all the, or replace white people with more brown people.
00:25:00.920 Well, here's my advice.
00:25:08.140 We never really know what people's intentions are.
00:25:11.880 We think we do.
00:25:13.400 I don't want to say never, that's too absolute.
00:25:15.380 But we're often wrong about that.
00:25:18.700 Probably more often wrong than right when it comes to these edge cases.
00:25:23.880 And here's what I think.
00:25:26.800 To me, it looks like politicians are too short-term in their thinking.
00:25:32.080 In order to believe replacement theory, you'd have to believe that our politicians are long-term thinkers.
00:25:39.080 Do you believe that?
00:25:41.880 Do you believe that our politicians are thinking past the next election or two?
00:25:49.100 I don't see evidence of that.
00:25:50.580 To me, it just looks like politicians are doing what their current constituents want them to do.
00:25:57.920 Because they think that will win votes.
00:26:01.300 Does it need to be more complicated than the fact that immigration is popular with Democrats?
00:26:07.860 And so Democratic leaders want to give their people what they're asking for?
00:26:11.700 Because there are a lot of Hispanic Democrats who would like immigration to get their family in, etc.
00:26:22.340 And there are enough people who sympathize with that point of view.
00:26:22.340 Isn't it just normal leadership to give people what they want?
00:26:31.440 And do you think the voters are thinking that?
00:26:33.200 Do you think the Hispanic Americans or, you know, the Latinx or whatever name you want to put on it today,
00:26:42.320 do you think that they're thinking, oh, if you let my relatives in, I get to replace some more white people?
00:26:49.760 I kind of doubt it.
00:26:51.880 I feel like it's being driven by the public, not by the politicians.
00:26:56.920 And the politicians are simply responding to the public, you know, being favorable about immigration.
00:27:03.200 To me, that's all it looks like.
00:27:05.740 Now, is there anybody who would say out loud on the Democrat side that there might be a side effect or a benefit,
00:27:15.040 which is it changes the voting patterns, especially in, say, Texas, and especially in Florida?
00:27:22.160 Probably they're happy about that, to have more voters.
00:27:25.100 But I feel that that's a little too long-term for them to really be making decisions on it.
00:27:31.360 I think that that's just a side benefit that they get along with making their voters happy at the moment.
00:27:37.700 So that's my take.
00:27:38.680 My take is that they only care about the next election-ish,
00:27:42.320 but they don't mind that it might have a long-term benefit.
00:27:47.220 But I don't think that's the intention so much as something they don't mind, you know, that does work in their favor.
00:27:52.460 I saw an argument on the CNN opinion piece that the worst thing that could happen to the people afraid of replacement theory is to get what they want.
00:28:08.480 You know, you've heard the be afraid of getting what you want, you know, that could be a problem.
00:28:13.360 And if there are a bunch of white people who are afraid of replacement theory, suppose they got their way.
00:28:21.360 Suppose we just stopped immigration, and they got their way.
00:28:26.580 First of all, the white majority would disappear anyway, because the demographics are going to make that happen.
00:28:33.980 So it would just take longer.
00:28:35.720 You know, the replacement would still happen, just take a little longer.
00:28:38.300 And that's the only way to get young people, because the white population is not having as many children.
00:28:47.400 So the only way you could have a country that works in the long run, as far as we know,
00:28:51.920 I mean, maybe robots and AI will change everything,
00:28:55.540 but at the moment, the only way we know to have a prosperous country in the long run is to have more young people.
00:29:02.120 How do you get the young people unless you bring them in?
00:29:04.560 Because the older white people are just not having babies.
00:29:08.300 So, the second thing that you have to look at is the degree of, let's say, mixing.
00:29:16.840 Now, I don't know if you see as much of it as I do, because I live in California.
00:29:20.660 So California is, you know, not counting New York City, maybe,
00:29:25.220 is just one of the most melting-pottiest places you could ever be.
00:29:29.840 There's a little of everything.
00:29:30.740 If you go to a house party in California, there's just every combination of everybody.
00:29:36.040 Simply virtue signaling?
00:29:39.820 Yeah, maybe.
00:29:40.440 Maybe it is just virtue signaling.
00:29:43.280 But here's, you know, what movie was it?
00:29:45.900 Bull Durham or something?
00:29:47.620 That the thought was everybody would just have sex until everybody was brown?
00:29:51.720 It'll all take care of itself in the long run?
00:29:54.420 Something like that's kind of going to happen in the long run.
00:29:57.140 You know, the number of kids in California who are some mixture of whatever is pretty high.
00:30:06.940 You know, you have, if your kids have their friends over, you've got a little of everything, right?
00:30:14.280 It's just a little bit of everything.
00:30:15.620 So we might just have sex to the point where the whole racial thing just doesn't make as much sense anymore.
00:30:24.220 Oh, Bulworth.
00:30:24.980 Bulworth, sorry.
00:30:26.000 Not Bull Durham.
00:30:27.020 The movie was Bulworth.
00:30:28.620 Thank you.
00:30:30.200 All right, worldwide, the COVID deaths have reached 5 million, it is being reported.
00:30:35.900 Do you believe that?
00:30:37.080 Do you believe that 5 million people have died worldwide from COVID?
00:30:40.520 And that half of them, I think half of them came in the United States?
00:30:47.300 What's wrong with that?
00:30:50.120 What's wrong with these two statistics?
00:30:52.820 5 million deaths, and half of them were in the United States.
00:30:57.800 Does that make sense to you?
00:31:00.240 You don't think half of them were in India?
00:31:03.560 Maybe.
00:31:03.960 Well, you know, if it's true that India is undercounting their deaths by, you know, a factor of 5 or 10, or God knows,
00:31:13.000 I've got a feeling that 5 million reported deaths is really 20 million.
00:31:19.480 Give me a guess what you think.
00:31:21.740 If the official reported number is 5 million deaths worldwide, what do you think?
00:31:27.060 Just your intuition or your skepticism or your distrust, just your hunch.
00:31:32.280 What do you think is a real number?
00:31:34.660 I'm seeing a lot of people going lower, 1 million, 1 million.
00:31:38.580 So you would be the people who would say, it's not so much the COVID, it was the comorbidities,
00:31:42.740 and the way we're counting it, and, you know, we're throwing everything in that category.
00:31:48.240 Well, I would say that, at the very least, the different countries are not counting it the same.
00:31:54.300 And some can't count it at all, because they just don't have the good enough records.
00:31:57.860 So my intuition tells me that if everybody counted the way we did, it would be closer to 20 million.
00:32:08.720 But if we counted the way other countries do, it might be closer to a million.
00:32:15.000 So I think it has to do with how you count it, right?
00:32:18.180 I don't think it's a million or 20 million, it's just how you count it.
00:32:21.400 I think you can get to either number.
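The arithmetic he's gesturing at can be sketched quickly. Both multipliers below are hypothetical, chosen only to show how different counting assumptions stretch the same reported figure across the one-million-to-twenty-million range discussed above:

```python
# Hypothetical sketch: the same reported total yields very different
# estimates depending on the counting assumptions you apply.
reported_deaths = 5_000_000  # official worldwide figure cited above

# Strict counting: suppose only ~20% of reported deaths were caused
# primarily by COVID rather than comorbidities (hypothetical share).
strict_estimate = reported_deaths * 0.2

# Loose counting: suppose undercounting countries mean the true toll
# is ~4x the reported number (hypothetical multiplier).
loose_estimate = reported_deaths * 4

print(f"strict: {strict_estimate:,.0f}")  # ~1 million
print(f"loose:  {loose_estimate:,.0f}")   # ~20 million
```

Neither multiplier is a claim about the real data; the point is only that the counting rule, not the virus, can account for the spread between the two numbers.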
00:32:22.860 All right, so we don't trust much about statistics, and we should not.
00:32:30.120 Here's a little story about my own confirmation bias.
00:32:34.080 Now, confirmation bias, as you know, is when you see something that agrees with you,
00:32:37.620 and you're like, oh, that looks accurate.
00:32:39.640 That is totally accurate, because it happens to agree with you.
00:32:42.940 That's the reason you think it's accurate.
00:32:45.300 Here's mine.
00:32:46.020 Now, the story is that there's a new study that is looking at the correlation between vitamin D3 and mortality from COVID.
00:32:56.980 And the thinking is that the more vitamin D3 you have, the less chance of dying, up to the point of maybe close to zero.
00:33:06.280 That you could actually bring the death rate close to zero if everybody had enough D3.
00:33:11.280 Now, here's what seems to be new.
00:33:12.820 And by the way, this is not yet a reliable study.
00:33:18.040 So what I'm going to say next should not be deemed as reliable.
00:33:22.600 Hasn't gone through peer review, blah, blah, blah, blah.
00:33:26.540 But it's new information, and I certainly doubt that you could have enough vitamin D to bring the death rate to zero.
00:33:33.660 That doesn't sound...
00:33:35.420 We don't really live in a world where that's possible.
00:33:38.680 You can't bring anything to zero.
00:33:40.000 But I guess what they added was some extra analysis to support the idea that the D3 is the cause of the better outcomes, or worse, as opposed to just a correlation.
00:33:55.760 Because the problem is that people who are sick almost always have low vitamin D3.
00:34:00.360 And the people who die of COVID are the people who already have some comorbidities.
00:34:05.680 So it's guaranteed, just based on the fact that, you know, those two facts, it's guaranteed that a lot of people with COVID would also have low vitamin D.
00:34:14.660 But it could be a coincidence, because they were sick from other things.
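That confounding worry can be sketched with a toy simulation. Every probability below is made up, purely to illustrate the mechanism: vitamin D3 is given zero causal effect, comorbidity both lowers D3 and raises death risk, and a raw correlation between low D3 and death appears anyway:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

deaths_low_d = deaths_ok_d = n_low_d = n_ok_d = 0
for _ in range(100_000):
    comorbid = random.random() < 0.3  # 30% have a comorbidity (hypothetical)
    # Comorbidity makes low D3 likelier, but D3 has NO causal effect here.
    low_d3 = random.random() < (0.7 if comorbid else 0.2)
    # Death risk depends ONLY on the comorbidity (hypothetical rates).
    died = random.random() < (0.10 if comorbid else 0.01)
    if low_d3:
        n_low_d += 1
        deaths_low_d += died
    else:
        n_ok_d += 1
        deaths_ok_d += died

rate_low = deaths_low_d / n_low_d
rate_ok = deaths_ok_d / n_ok_d
print(f"death rate, low D3: {rate_low:.3f}")
print(f"death rate, ok D3:  {rate_ok:.3f}")
# The low-D3 group dies more often even though D3 does nothing causally.
```

This is exactly why the study's extra causal analysis matters: the raw correlation alone can't distinguish the two stories.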
00:34:17.340 What this study purports to find is that it's causal, not just a correlation. Hey, say hi to Boo.
00:34:25.520 She's looking around.
00:34:27.840 By the way, could I give you some good news?
00:34:31.900 It's good news for me.
00:34:33.760 But I'm hoping that, you know, you followed along enough to feel some goodness in it.
00:34:40.640 But yesterday, for the first time in two weeks, Boo ate solid food.
00:34:47.600 Now, the feeding tube's still in, because just to give medicines and stuff, it's just a little convenience, so I'll leave it in a little bit longer.
00:34:55.260 But just yesterday, yesterday late afternoon, she took some bites of food, and then a little bit later as well.
00:35:02.920 And in my little life, that's a really big deal.
00:35:09.680 It's a really big deal.
00:35:12.000 Now, she's not out of the woods.
00:35:13.640 She's got some extra tests.
00:35:16.100 They have to still rule out some nasty stuff that's still possible.
00:35:20.920 But they're trying to rule that out.
00:35:22.900 But at the moment, that's a...
00:35:26.700 Let me bring her over here for a...
00:35:29.680 I think she's coming this way.
00:35:32.920 Boo wants to say hi.
00:35:33.820 Boo, come here.
00:35:35.680 Come here.
00:35:39.880 There we go.
00:35:41.020 So here's her little feeding tube.
00:35:43.280 It goes into a hole that they put in her neck, and that goes through the neck, and it goes to the top of the stomach.
00:35:49.760 So she doesn't really feel the food going in, except she knows it's happening.
00:35:54.700 So she's lost a lot of weight.
00:35:56.160 She's a little skinny now.
00:35:57.400 But she knows where her food is.
00:35:59.340 We're going to make sure she has some as soon as we're done.
00:36:01.000 All right.
00:36:02.480 I think I was talking about something else besides my cat.
00:36:05.040 Oh, yeah.
00:36:05.500 Confirmation bias.
00:36:06.760 Well, so the reason this vitamin D3 thing is confirmation bias is because early in the pandemic,
00:36:15.740 I was one of a number of people, I'm not the only person who said this,
00:36:20.560 but I was kind of out in front of the vitamin D3 thing saying,
00:36:25.060 why does it look to me like there's a correlation between the countries that have the best vitamin D
00:36:30.200 and the people who have the best vitamin D and the best outcomes?
00:36:33.700 Now, of course, that was probably just correlation, not causation.
00:36:38.080 But because I was saying that publicly and early, I see a study like this, and what do I say?
00:36:45.540 Oh, that's probably true, right?
00:36:47.860 Because you see a study that agrees with what you put yourself out on the line of saying?
00:36:52.280 You're going to believe that.
00:36:53.640 So what I'm trying to put across is that I'm biased to believe this is true,
00:37:02.060 but I'm also aware of the fact that my bias might be the only reason I think it's true.
00:37:06.560 Because if this were a preprint study on anything else, what would I be telling you?
00:37:11.860 Don't believe it.
00:37:13.300 If it was on any other topic, but just happens to agree with me,
00:37:17.480 so in my mind, it looks a little bit better.
00:37:20.600 All right, here's the most provocative point of the day.
00:37:24.460 So lately, you've seen a bunch of rogue doctors, that's my name for it, rogue doctors,
00:37:29.400 who are doing videos, and you see them in memes and stuff,
00:37:32.920 in which they're doubting the mainstream opinion.
00:37:37.280 Here's my general question.
00:37:40.920 How likely, in general, not just talking about COVID,
00:37:45.160 but talking about all things you've ever seen,
00:37:47.420 how often is the rogue doctor right, going against the mainstream?
00:37:55.500 In the comments, how often?
00:37:57.720 If the only thing you know is that there's a doctor saying,
00:38:00.600 hey, people, you're getting it wrong, everything's wrong,
00:38:04.640 how often do they end up being right?
00:38:07.060 In your experience.
00:38:08.640 Now, this is just the feel of it, the experience.
00:38:12.600 I'll read down some of your numbers.
00:38:13.740 I'm seeing everything from 1% to 75%, 50-50.
00:38:17.860 You're all over the board, 100%.
00:38:20.520 You rarely hear a follow-up from the rogue doctors.
00:38:26.140 Good point.
00:38:28.160 Somebody says they're right about ivermectin.
00:38:30.720 That's what somebody says here.
00:38:33.500 All right, I'll tell you my...
00:38:35.740 I'll tell you my view of it.
00:38:40.200 I think the rogue doctor is right maybe 5% of the time.
00:38:49.540 Now, that says nothing about any specific rogue doctor you're looking at, right?
00:38:56.100 Maybe my cat is making a bed out of my papers here,
00:39:01.300 so it's going to get a little crinkly.
00:39:03.320 Now, I know what you're thinking.
00:39:05.960 You're already sending me the names of specific doctors and specific cases.
00:39:10.560 So, here's a check on your thinking.
00:39:14.300 If during this conversation you said to yourself,
00:39:16.980 yeah, yeah, I get your point, but not this specific doctor,
00:39:21.560 check your thinking.
00:39:23.920 Or if you said, yeah, yeah, yeah, Scott, I get your point as a general statement,
00:39:27.760 but certainly not with ivermectin.
00:39:29.600 If you said that, just check your thinking, right?
00:39:35.140 Doesn't mean you're wrong.
00:39:36.780 You could be right.
00:39:38.660 You know, the rogues are sometimes right, right?
00:39:41.360 All right, 100% of pioneers are rogue.
00:39:44.060 Thank you.
00:39:44.680 That was exactly the point I'm going to make next.
00:39:47.240 How do you square the fact that the rogue doctors,
00:39:51.200 in some sense, seem wrong a lot,
00:39:53.480 whatever you think is a lot,
00:39:54.720 but, by the same token.
00:39:59.380 Every single thing that ever changed from wrong to right in science
00:40:03.240 probably started with one person,
00:40:06.140 a rogue scientist, a rogue doctor, right?
00:40:09.940 Does that change my point?
00:40:12.620 So, does that change the statistics
00:40:15.380 of how often the rogue doctor is right?
00:40:19.460 No, not even a little bit.
00:40:21.900 Well, maybe a little bit, but not enough to matter.
00:40:23.660 Both are true.
00:40:26.240 Both can be true, and they don't conflict.
00:40:29.000 It is only the rogue doctor
00:40:30.680 who changes the big things in science
00:40:33.460 that go from wrong to right.
00:40:35.340 Somebody goes first,
00:40:36.760 and when they do, they're the rogue.
00:40:38.980 Einstein was a rogue scientist, right?
00:40:41.620 He said things that others weren't on board with
00:40:43.980 until they were.
00:40:46.500 But it's really rare, right?
00:40:49.400 The reason that Einstein,
00:40:53.660 the reason that Isaac Newton, is famous
00:40:59.160 is that it's very rare, very rare.
00:41:03.220 So, I'm not saying that the rogue doctor is always wrong.
00:41:07.960 As soon as you imagine it as an absolute,
00:41:10.000 it's just nonsense, of course.
00:41:12.000 All absolutes.
00:41:13.000 I almost said all absolutes are nonsense.
00:41:16.340 I won't even finish that sentence.
00:41:18.760 You know what I mean.
00:41:22.600 So, yeah, you can have all of your breakthroughs
00:41:25.060 come from the rogue doctor.
00:41:26.220 At the same time,
00:41:27.540 only 1% of rogue doctors turn out to be right.
00:41:30.580 Do you get that?
00:41:32.720 That's just that one point.
00:41:34.340 Everybody on board?
00:41:35.900 Can be true that every breakthrough is a rogue doctor,
00:41:38.820 and also true that only 1% of it ever works.
00:41:45.060 Right?
00:41:45.660 I'm not saying those are the percentages.
00:41:47.360 I'm just saying they could be true.
00:41:49.540 There would be no conflict between that data.
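The compatibility of those two claims is just arithmetic. The counts below are hypothetical, only to show the two statements don't conflict:

```python
# Hypothetical counts, just to show the two claims can both hold.
rogue_claims = 1_000   # contrarian claims made over some period
breakthroughs = 10     # the ones that turned out to be right

# Claim 1: 100% of breakthroughs came from rogues (by definition here,
# since someone always has to go against the mainstream first).
share_of_breakthroughs_from_rogues = breakthroughs / breakthroughs  # 1.0

# Claim 2: only 1% of rogue claims ever pan out.
rogue_hit_rate = breakthroughs / rogue_claims  # 0.01

print(share_of_breakthroughs_from_rogues)  # 1.0
print(rogue_hit_rate)                      # 0.01
```

The two ratios share a numerator but have different denominators, which is the whole trick: being the only source of breakthroughs says nothing about how often you're right.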
00:41:53.500 So, I would say this.
00:41:55.900 Every time you see a rogue doctor
00:41:58.200 doubting science,
00:41:59.700 your first impulse should be
00:42:01.420 probably not true.
00:42:03.580 Now, keep an eye on it,
00:42:06.660 because it's going to be
00:42:07.400 one of these rogue doctors
00:42:08.680 who gets it right.
00:42:09.960 Right?
00:42:10.680 Probably.
00:42:11.920 There's got to be at least one rogue doctor
00:42:13.860 who's getting it right.
00:42:14.920 I'm not saying they're all wrong by any means.
00:42:17.000 I'm saying that your first instinct
00:42:18.640 should be probably wrong.
00:42:20.960 And then, you know,
00:42:21.860 maybe you look into it,
00:42:22.800 maybe you modify your opinion
00:42:24.820 after you've looked into it.
00:42:26.460 But you should start with
00:42:27.800 probably wrong.
00:42:30.520 That's your best defense
00:42:31.800 against being taken in
00:42:33.040 by very persuasive people.
00:42:36.620 Now, why is it that the rogue doctors
00:42:38.280 are so persuasive?
00:42:40.600 Tell me.
00:42:41.520 Why are rogue doctors
00:42:43.000 so persuasive?
00:42:44.200 In the comments, tell me why.
00:42:46.000 What is it about them
00:42:47.120 that makes them persuasive?
00:42:50.600 And it's not just one thing.
00:42:52.020 It's a few things.
00:42:56.340 Confirmation bias?
00:42:57.220 Thank you.
00:42:57.740 That is one of the things
00:42:58.560 I'm looking for.
00:42:59.160 If you believed
00:43:00.400 what the rogue doctor is saying,
00:43:01.920 you are susceptible
00:43:03.200 to thinking,
00:43:04.260 oh, it's confirmed now,
00:43:05.620 but it's just confirmation bias.
00:43:07.920 So that's the first thing.
00:43:09.300 That your confirmation bias
00:43:10.660 will work just the way
00:43:11.620 mine did with vitamin D3.
00:43:13.880 I want it to be true
00:43:15.140 so I can be right.
00:43:16.820 So you hear the rogue doctor
00:43:18.320 and he says exactly
00:43:19.440 what you were suspecting.
00:43:21.880 Boom.
00:43:22.440 He must be right
00:43:23.360 because he said
00:43:24.160 exactly what I suspected.
00:43:25.700 Well, your suspicions
00:43:27.700 don't really have
00:43:28.340 much credibility.
00:43:30.940 And neither does
00:43:31.940 the rogue doctor.
00:43:35.460 Somebody says
00:43:36.180 they're reasoning.
00:43:37.280 Good.
00:43:37.660 That's exactly
00:43:38.220 what I was looking for.
00:43:39.240 The rogue doctors
00:43:40.460 are persuasive
00:43:42.480 because they have
00:43:43.700 excellent reasoning
00:43:45.240 and credible data.
00:43:47.880 Right?
00:43:49.160 Everybody agree with that?
00:43:50.380 The reason that
00:43:51.180 they're persuasive
00:43:51.860 is they have
00:43:53.160 what looks like
00:43:54.660 credible data.
00:43:55.540 You don't know
00:43:55.880 if it's right,
00:43:56.440 but it's credible.
00:43:57.740 And real good explanations
00:43:59.680 that make sense to you.
00:44:01.440 So that's why
00:44:02.060 they're persuasive.
00:44:03.640 Do you know
00:44:04.200 who else is persuasive
00:44:06.000 in that way?
00:44:08.280 Everybody who's wrong.
00:44:11.000 Everybody who's wrong.
00:44:12.920 They're persuasive
00:44:13.700 in exactly the same way.
00:44:15.180 Because if you only hear
00:44:16.180 one side of an argument
00:44:17.140 and then somebody's smart,
00:44:19.140 you know,
00:44:19.340 doctors are pretty smart,
00:44:21.360 it'll sound persuasive.
00:44:23.060 It has nothing to do
00:44:24.240 with how true it is.
00:44:25.600 The level of persuasiveness
00:44:27.720 of one person talking
00:44:30.320 and making their point
00:44:31.400 without the counterpoint
00:44:32.560 should be zero.
00:44:35.820 The level of credibility
00:44:37.220 you should give
00:44:37.960 any person
00:44:38.640 when you haven't heard
00:44:39.660 the counterpoint
00:44:40.780 is zero.
00:44:43.460 If you give them
00:44:44.380 higher than zero,
00:44:45.420 you're really doing it wrong.
00:44:46.920 Again,
00:44:47.280 it doesn't mean they're wrong.
00:44:48.620 That doctor could turn out
00:44:50.080 to be the right one.
00:44:50.840 But on moment one,
00:44:54.320 the first exposure to it,
00:44:57.320 doctor saying something
00:44:58.640 that other people
00:44:59.340 are not saying
00:45:00.060 and no counterpoint
00:45:01.500 shown at the same time,
00:45:03.600 no pushback shown,
00:45:06.400 give that zero credibility.
00:45:08.600 Keep an open mind
00:45:09.540 and maybe look into it more
00:45:11.500 and keep watching the topic.
00:45:13.360 But at the start,
00:45:14.520 it should be zero.
00:45:16.400 I'll bet a lot of you
00:45:17.300 start at 100%.
00:45:17.840 Am I right?
00:45:22.220 Do a lot of you
00:45:22.900 start at 100%?
00:45:24.220 You hear him
00:45:24.700 and he's like convincing
00:45:25.500 as hell.
00:45:26.780 You say,
00:45:27.380 damn it,
00:45:27.740 that agrees with
00:45:28.420 just what I suspected.
00:45:30.220 Done.
00:45:31.660 Convinced.
00:45:32.820 I think you're starting
00:45:33.820 at 100%
00:45:34.580 when you should start at 0%.
00:45:36.080 Now,
00:45:36.860 the real number
00:45:38.460 might be somewhere
00:45:39.120 in between
00:45:39.620 in terms of credibility.
00:45:41.760 But don't start at 100%
00:45:43.620 and try to move backwards.
00:45:45.520 Start at zero
00:45:46.460 and see if he can move up.
00:45:49.400 That would be good hygiene
00:45:50.740 for keeping yourself
00:45:52.040 out of trouble,
00:45:52.700 right?
00:45:56.340 All right.
00:45:59.120 Rogue doctors.
00:46:02.960 Here's something
00:46:03.720 that Erica said.
00:46:05.340 Erica,
00:46:05.680 you watching?
00:46:07.100 Good morning,
00:46:07.860 Erica.
00:46:09.400 Erica said on Twitter,
00:46:11.260 a little pushback
00:46:11.980 to this point.
00:46:12.980 It's a good pushback.
00:46:14.160 And watch my response to it.
00:46:16.460 She says,
00:46:17.580 oh,
00:46:18.260 I forgot a part.
00:46:20.240 I added the following
00:46:21.360 rule.
00:46:23.900 That if you know
00:46:24.540 the name
00:46:25.120 of the rogue doctor,
00:46:27.120 they're almost certainly
00:46:28.580 not right.
00:46:31.220 Doesn't mean
00:46:31.880 they're always wrong.
00:46:32.880 Remember,
00:46:33.220 there are no absolutes here.
00:46:34.900 But statistically speaking,
00:46:36.820 if you can think
00:46:37.600 of the name
00:46:38.120 of the doctor,
00:46:40.100 I'm going to say
00:46:41.520 that doctor
00:46:42.400 will be proven wrong.
00:46:43.400 Why?
00:46:44.180 Why?
00:46:44.940 Give me in the comments
00:46:46.060 why I'm saying that.
00:46:48.040 This will be a good test
00:46:49.040 because this is a
00:46:50.160 callback
00:46:52.260 to a principle
00:46:53.300 that I've talked
00:46:53.980 about a few times.
00:46:56.680 Why does the fact
00:46:58.360 that you as a consumer,
00:46:59.740 knowing the name
00:47:00.460 of the doctor,
00:47:02.000 why does that make it
00:47:02.900 the least likely
00:47:03.960 it's true?
00:47:06.520 Because it's a weird name?
00:47:07.820 No.
00:47:08.000 No.
00:47:08.840 Court cases won?
00:47:09.860 No.
00:47:11.100 More interested
00:47:11.860 in being famous
00:47:12.600 than right?
00:47:14.260 I hadn't thought
00:47:15.240 of that,
00:47:15.940 but maybe.
00:47:17.300 That's not a bad comment.
00:47:19.160 It's not where
00:47:19.640 I was going.
00:47:20.180 Because of their egos,
00:47:21.260 maybe?
00:47:22.140 Might be part of it.
00:47:22.980 That's not where
00:47:23.420 I'm going.
00:47:25.740 All right.
00:47:27.840 None of you got it.
00:47:29.360 Looks like none
00:47:30.020 of you got it.
00:47:31.480 It's the theory
00:47:34.620 about what makes news.
00:47:38.220 What makes news?
00:47:40.920 What is it
00:47:41.620 that makes news?
00:47:44.400 Things that aren't true.
00:47:47.060 Things that aren't true
00:47:48.080 are what get your attention.
00:47:50.300 Things that are true
00:47:51.360 don't stick in your mind.
00:47:55.180 They're just,
00:47:55.860 well, that's true.
00:47:56.940 Just,
00:47:57.260 phew,
00:47:57.980 it's gone.
00:47:58.280 But things that are
00:47:59.620 outrageous and crazy
00:48:01.020 and also not true
00:48:03.220 stick in your mind.
00:48:05.500 Does anybody forget
00:48:06.880 that President Trump
00:48:08.720 once suggested
00:48:10.280 people drink bleach?
00:48:12.420 You all remember that,
00:48:13.880 don't you?
00:48:15.020 Is there anybody
00:48:15.840 who follows politics
00:48:17.120 who doesn't know
00:48:18.220 that story?
00:48:19.260 Why do you know it?
00:48:21.600 Why do you know
00:48:22.340 that story?
00:48:23.540 Because it's not true.
00:48:25.460 It's the not true part
00:48:27.020 that makes you remember it.
00:48:29.240 Because there's something
00:48:30.220 about the story
00:48:31.120 that doesn't make sense.
00:48:33.220 Like, you're like,
00:48:34.060 really?
00:48:35.440 You did that?
00:48:36.940 I mean,
00:48:37.220 it feels like
00:48:38.140 they have evidence
00:48:38.900 and people are saying it
00:48:40.120 but it doesn't make sense.
00:48:42.940 Right?
00:48:44.140 This principle
00:48:45.100 will blow your mind
00:48:46.380 if you haven't heard it before.
00:48:48.500 The thing that makes
00:48:49.640 a thing news
00:48:50.500 is also
00:48:52.660 the thing that makes
00:48:53.740 it not true.
00:48:55.760 Right?
00:48:56.420 Those are so paired
00:48:58.260 that what makes it news
00:49:00.280 is that it's not true.
00:49:01.320 Think of all the things
00:49:02.540 that were news.
00:49:03.080 Russian collusion,
00:49:05.800 the Russians paying
00:49:07.060 the Taliban
00:49:08.700 to kill Americans.
00:49:11.140 Your hair's on fire, right?
00:49:13.460 Charlottesville,
00:49:14.100 fine people hoax.
00:49:16.080 These are the ones
00:49:16.980 you remember.
00:49:18.500 And it's because
00:49:19.360 it's dog bites man stories.
00:49:22.180 I'm sorry.
00:49:22.780 It's man bites dog stories.
00:49:24.560 If you hear that
00:49:25.340 a dog bites a person,
00:49:26.600 you're like,
00:49:26.940 eh,
00:49:27.320 you don't even remember it
00:49:28.380 because it's normal.
00:49:29.000 But if you hear a man
00:49:30.340 bites a dog,
00:49:32.180 you're going to remember that.
00:49:34.120 That's unusual.
00:49:35.780 So it's the
00:49:36.440 unusualness
00:49:38.800 and the provocative
00:49:40.000 nature of the claim
00:49:41.260 that makes you know
00:49:42.560 the name of the doctor
00:49:43.500 because it becomes news
00:49:45.960 because the news
00:49:47.260 and especially social media
00:49:48.680 say,
00:49:49.340 man,
00:49:49.560 this is good stuff.
00:49:50.960 This is good stuff.
00:49:52.060 We're going to put this
00:49:52.820 in the headline.
00:49:54.240 So
00:49:54.520 it doesn't guarantee
00:49:56.700 that the person is wrong.
00:49:57.820 So I'm not saying that.
00:49:59.560 I'm not saying that
00:50:00.340 if you know the doctor's name,
00:50:01.700 that means they're wrong.
00:50:04.080 Classic examples
00:50:05.160 would be,
00:50:06.180 you know,
00:50:06.620 it could be a doctor
00:50:07.440 that you already knew
00:50:08.240 from another reason.
00:50:09.380 You know,
00:50:09.540 Dr. Drew.
00:50:10.680 You knew Dr. Drew
00:50:11.540 from other reasons,
00:50:12.320 so this has nothing
00:50:13.320 to do with this example.
00:50:14.880 But if the only reason
00:50:16.120 you heard of a doctor
00:50:17.000 is because of their
00:50:19.180 rogue,
00:50:20.120 contrarian views,
00:50:21.380 just start at zero
00:50:23.420 and then try to work
00:50:25.740 your way up from zero
00:50:26.700 by being aware
00:50:27.860 that it's an issue
00:50:28.620 and maybe look into it.
00:50:30.080 But start at zero.
00:50:31.960 All right.
00:50:33.080 Erica said,
00:50:33.880 a little pushback
00:50:34.660 about the fact
00:50:36.020 that if you know
00:50:36.600 the name of the doctor,
00:50:38.400 they're probably wrong.
00:50:40.280 Erica says,
00:50:41.040 my guess would come
00:50:41.720 from watching Rand Paul
00:50:43.020 be right in the long run.
00:50:45.200 Only time will tell us
00:50:46.480 if the mandate crumbles
00:50:47.660 and blah, blah.
00:50:48.380 Do you believe that?
00:50:51.580 Has Rand Paul
00:50:52.420 been proven right?
00:50:54.980 Or has Rand Paul
00:50:56.060 been proven wrong?
00:50:57.260 Now, I would say
00:50:57.940 that he's more like Dr. Drew
00:50:59.220 because we know Rand Paul
00:51:00.400 from other reasons, right?
00:51:02.340 So he's famous
00:51:03.060 not for his rogue opinions,
00:51:04.640 but let's just talk
00:51:05.360 about him for a moment
00:51:06.040 because it's
00:51:06.560 an interesting comment.
00:51:07.920 And thank you for that, Erica.
00:51:09.980 I think that was additive.
00:51:13.160 Is Rand Paul right
00:51:14.600 that if you have
00:51:16.000 natural immunity,
00:51:16.880 you don't need
00:51:18.140 a vaccination?
00:51:18.920 That's sort of
00:51:19.600 his main thing, right?
00:51:21.320 Is he right?
00:51:22.640 He's a rogue doctor.
00:51:26.300 And he's saying
00:51:27.160 that if you have
00:51:27.720 the vaccination,
00:51:28.700 you don't need...
00:51:29.700 I'm sorry,
00:51:30.200 if you have natural immunity,
00:51:31.520 you don't need
00:51:33.120 the vaccination.
00:51:34.820 Is he right?
00:51:37.760 Nope.
00:51:38.800 He's not.
00:51:40.520 He's not.
00:51:42.320 Because every...
00:51:44.100 As far as I know,
00:51:45.340 every study
00:51:46.160 has shown that
00:51:47.780 if you get
00:51:48.540 the vaccination
00:51:49.300 on top of natural immunity,
00:51:51.320 you have the best
00:51:52.180 immunity of all.
00:51:52.960 And the best immunity
00:51:54.880 is better than
00:51:56.380 average immunity,
00:51:57.500 isn't it?
00:51:59.900 Give me a fact check
00:52:01.180 in real time here.
00:52:02.380 I need a fact check.
00:52:03.540 Here's my claim.
00:52:05.160 Now remember,
00:52:05.960 this is not exactly
00:52:07.020 what Rand Paul's saying.
00:52:08.620 If I interpret
00:52:09.960 Rand Paul correctly,
00:52:11.540 he's saying that
00:52:12.440 natural immunity
00:52:13.560 should be good enough
00:52:14.880 to have your full rights
00:52:16.900 the same way
00:52:17.480 a vaccinated person does.
00:52:18.820 That's a different argument.
00:52:20.500 If Rand Paul is arguing
00:52:21.800 the political,
00:52:23.840 philosophical nature
00:52:24.860 of things,
00:52:25.920 then I agree
00:52:26.560 with him completely.
00:52:28.260 You hear this clearly.
00:52:30.700 Rand Paul,
00:52:31.640 as a politician,
00:52:33.520 arguing that
00:52:34.320 natural immunity
00:52:35.160 should be seen
00:52:36.120 as good as vaccination,
00:52:37.760 totally agree.
00:52:38.900 Totally agree with him.
00:52:40.460 Good point,
00:52:41.100 and he makes it well.
00:52:41.800 But,
00:52:44.820 medically speaking,
00:52:46.520 medically speaking,
00:52:48.880 we do know
00:52:49.760 that a person
00:52:51.480 with natural immunity
00:52:52.640 will have more immunity
00:52:53.940 with a vaccination.
00:52:56.200 Did you know that?
00:52:58.140 How many of you
00:52:59.140 were aware of that,
00:53:00.040 that your antibodies
00:53:01.200 would be way better
00:53:02.240 if you had natural immunity
00:53:04.280 plus one vaccination shot?
00:53:07.420 I'm seeing "not true."
00:53:09.520 Those of you
00:53:10.300 who say not true,
00:53:11.140 go Google it
00:53:13.420 when we're done.
00:53:15.000 I think you'll find
00:53:15.780 that it's accepted
00:53:16.980 as true now.
00:53:18.460 And look at
00:53:19.060 Ian Martesis.
00:53:21.280 He did a study on it.
00:53:23.240 You can follow him
00:53:24.040 on Twitter as well.
00:53:25.860 And that's what,
00:53:27.640 I believe that his results
00:53:28.800 showed the same.
00:53:31.020 I don't think
00:53:31.820 there's any,
00:53:32.780 I don't think
00:53:34.240 there's any study
00:53:35.200 that says
00:53:36.560 if you get
00:53:37.160 a vaccination
00:53:38.000 on top of
00:53:39.480 natural immunity
00:53:40.640 that it doesn't help.
00:53:43.520 If you can find that,
00:53:44.820 send it to me.
00:53:45.480 But I don't think
00:53:45.980 that exists.
00:53:47.800 Okay?
00:53:48.620 So,
00:53:49.260 is Rand Paul right
00:53:50.140 or is he wrong?
00:53:51.360 Is he fighting
00:53:51.980 the right fight?
00:53:53.420 Again,
00:53:54.580 if he's fighting it
00:53:55.720 for political,
00:53:57.100 philosophical reasons,
00:53:58.180 absolutely he's right,
00:53:59.160 in my opinion.
00:54:00.180 Because really,
00:54:00.920 it's just an opinion.
00:54:01.580 But I agree
00:54:02.620 with him completely
00:54:03.520 that the people
00:54:04.720 with natural immunity
00:54:06.000 need their freedom.
00:54:07.840 Do we all agree
00:54:08.720 with that?
00:54:09.960 Can I get
00:54:10.580 100% agreement
00:54:11.600 that from a
00:54:12.920 social,
00:54:14.100 legal perspective,
00:54:16.280 you know,
00:54:16.500 what's right
00:54:17.140 about freedom
00:54:18.360 in America,
00:54:19.840 that if you've got
00:54:20.580 natural immunity,
00:54:21.400 and you can,
00:54:22.000 maybe you have to
00:54:22.520 prove it,
00:54:23.100 right,
00:54:23.400 with a test,
00:54:23.960 but if you've got it,
00:54:25.860 that you should have
00:54:26.620 all of your freedom.
00:54:28.720 All of your freedom.
00:54:29.760 No exceptions
00:54:30.440 in terms of the virus.
00:54:32.780 Yeah.
00:54:33.260 So he's 100% right.
00:54:35.860 But medically,
00:54:36.980 I think he's leaving
00:54:38.020 out a fact.
00:54:40.000 Medically,
00:54:40.600 he's leaving out,
00:54:41.300 I think,
00:54:41.660 a fact that you'd be
00:54:42.660 a little bit more
00:54:43.660 protected with the shot.
00:54:45.500 Now,
00:54:46.520 what if you've got
00:54:47.200 natural immunity
00:54:47.920 and you don't want
00:54:49.360 to take a risk
00:54:50.080 of a shot?
00:54:51.500 Should you have
00:54:52.140 the right
00:54:52.700 to not take
00:54:54.000 the extra risk,
00:54:55.020 no matter how small
00:54:55.840 it is,
00:54:56.220 it's your body,
00:54:58.340 do you have the right
00:54:58.980 to not get that
00:54:59.680 extra,
00:55:00.060 extra protection?
00:55:01.460 I say,
00:55:02.240 yes,
00:55:02.480 you do have
00:55:02.920 that right.
00:55:04.300 And I think
00:55:05.040 you'll all agree,
00:55:06.840 really.
00:55:09.260 All right,
00:55:09.820 here's another
00:55:10.520 little data point
00:55:14.320 that I was not aware of.
00:55:16.140 So every time
00:55:16.740 I find out
00:55:17.260 something important
00:55:18.200 that I wasn't aware of,
00:55:20.560 I'd like to pass that along
00:55:21.600 because I figure
00:55:22.140 other people
00:55:22.680 may be in the same boat.
00:55:25.400 You've heard,
00:55:26.100 of course,
00:55:26.400 about the problem
00:55:27.180 of myocarditis
00:55:28.500 in teens
00:55:29.460 and there have been,
00:55:32.160 let's see,
00:55:33.600 like 9 million teens
00:55:34.920 have been vaccinated
00:55:35.600 so far,
00:55:36.280 something like that.
00:55:37.780 So let's say
00:55:38.220 there are 9 million teens
00:55:39.280 that have been vaccinated.
00:55:40.280 How many have died
00:55:41.340 from the myocarditis?
00:55:44.460 Now,
00:55:45.900 myocarditis.
00:55:47.320 How many teens
00:55:48.920 in the United States
00:55:50.200 have died
00:55:50.900 from myocarditis?
00:55:53.000 In the comments?
00:55:58.200 Some say
00:55:58.920 it doesn't matter.
00:56:00.240 It's a good question.
00:56:01.000 It doesn't matter.
00:56:02.160 I'm seeing
00:56:02.520 3,
00:56:03.300 1,
00:56:04.680 0.
00:56:05.240 0?
00:56:06.840 Who says 0?
00:56:08.320 It's like the biggest story
00:56:09.320 in the country.
00:56:10.720 And you think
00:56:11.140 0 people have died from it
00:56:12.540 and we're making it
00:56:13.220 the biggest story
00:56:14.000 in the country.
00:56:15.160 300,
00:56:15.900 somebody says.
00:56:20.140 All right,
00:56:20.660 let me tell you
00:56:21.120 how many people
00:56:21.720 got myocarditis
00:56:24.080 out of the 9 million.
00:56:24.960 It was like
00:56:25.320 400 something.
00:56:27.400 All right,
00:56:27.620 so knowing that
00:56:28.200 400 people
00:56:29.160 got the symptoms,
00:56:31.120 how many of the
00:56:32.660 400 died?
00:56:36.980 The answer is 0.
00:56:39.480 The answer is 0.
00:56:41.140 And
00:56:41.700 basically
00:56:43.620 every one of them
00:56:44.460 is better in a day
00:56:45.420 because they know
00:56:46.460 how to treat it.
00:56:48.480 They're better in a day.
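Using the figures just cited, roughly 400-something cases among about 9 million vaccinated teens with zero deaths, the incidence works out like this (the 400 is the rounded number mentioned above, not an exact count):

```python
vaccinated_teens = 9_000_000  # approximate figure cited above
myocarditis_cases = 400       # "400 something" cases reported
deaths = 0                    # none of the cases were fatal

case_rate = myocarditis_cases / vaccinated_teens
print(f"case rate: {case_rate:.6%}")  # about 0.0044% of those vaccinated
print(f"roughly 1 in {vaccinated_teens // myocarditis_cases:,} teens")
print(f"deaths: {deaths}")
```

About 1 in 22,500 vaccinated teens, with no fatalities reported, which is the "keep it in context" point: real, worth watching, but small against the size of the story.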
00:56:49.520 He says,
00:56:52.480 I know somebody
00:56:53.200 who died.
00:56:54.680 Were they a teen
00:56:55.620 who died of myocarditis?
00:56:58.360 So,
00:56:59.220 yeah,
00:56:59.580 we always have to
00:57:00.520 question the statistics.
00:57:01.820 But at the moment,
00:57:02.660 it looks like
00:57:03.200 it hasn't killed anybody.
00:57:04.740 Now,
00:57:05.280 none were fatal.
00:57:08.460 Now,
00:57:08.680 that doesn't mean
00:57:09.300 that they weren't
00:57:09.940 damaged,
00:57:10.580 right?
00:57:11.500 I'm not saying
00:57:12.240 you shouldn't worry
00:57:12.840 about it.
00:57:14.480 You know,
00:57:14.720 if I get,
00:57:15.660 trust me,
00:57:16.300 if I get myocarditis
00:57:18.340 after a vaccination,
00:57:20.540 I'm going to be
00:57:21.600 pretty fucking pissed.
00:57:22.540 I'm sorry.
00:57:23.440 I'm going to be
00:57:23.820 pretty mad.
00:57:25.780 And I'm going to,
00:57:27.180 you know,
00:57:27.440 wish I hadn't
00:57:28.160 and all that.
00:57:29.860 So,
00:57:30.260 yeah,
00:57:30.400 it's a big deal.
00:57:31.660 But keep it in context.
00:57:34.260 Right?
00:57:34.560 It's a big deal,
00:57:35.600 but maybe not as big
00:57:36.780 as you thought.
00:57:38.260 Still important.
00:57:41.660 All right.
00:57:42.300 That is just about
00:57:47.140 everything I wanted
00:57:47.980 to talk about today.
00:57:49.640 Scott,
00:57:50.140 you might be dead.
00:57:51.160 It's nasty stuff.
00:57:54.060 Maybe.
00:57:54.660 Yeah,
00:57:54.900 at my age,
00:57:55.640 it would probably
00:57:56.000 be a lot more dangerous,
00:57:56.960 huh?
00:58:01.340 That's exactly
00:58:02.120 what Scott's saying.
00:58:03.220 He's trying to
00:58:03.840 backpedal right now.
00:58:05.660 Watch him,
00:58:06.400 and watch his hands.
00:58:08.440 See,
00:58:08.780 Ellen,
00:58:09.200 what am I
00:58:09.620 backpedaling on?
00:58:11.580 You know,
00:58:12.140 most of the
00:58:12.820 criticism that I get
00:58:14.040 on this topic
00:58:15.000 and others
00:58:15.480 falls into the
00:58:16.840 category of
00:58:17.660 somebody imagined
00:58:18.740 I used to say
00:58:19.600 something different
00:58:20.300 and now I'm
00:58:21.040 changing my mind.
00:58:26.400 You have a
00:58:27.340 natural instinct
00:58:28.180 for Bayes'
00:58:28.920 theorem,
00:58:29.340 okay?
00:58:35.800 Scott,
00:58:36.400 how much fentanyl
00:58:37.220 does your iPhone
00:58:38.080 subsidize?
00:58:39.500 So you're going
00:58:40.120 after only those
00:58:40.920 Chinese companies
00:58:41.720 where you're not
00:58:42.220 a shareholder?
00:58:43.740 So I've
00:58:44.180 answered this
00:58:44.680 question and
00:58:45.740 I'll answer it
00:58:46.720 again.
00:58:46.880 It's a fair
00:58:47.320 question.
00:58:48.420 So as you
00:58:49.320 know,
00:58:49.680 I'd like to
00:58:51.000 decouple our
00:58:51.760 business from
00:58:52.360 China and I
00:58:53.180 would like us
00:58:53.640 to buy the
00:58:54.960 smallest amount
00:58:55.740 of Chinese-made
00:58:56.780 goods that we
00:58:57.580 can.
00:58:58.460 But I have an
00:58:59.020 iPhone and
00:59:00.460 pretty much half
00:59:02.520 of my environment
00:59:03.140 is Chinese goods.
00:59:04.820 So how do I
00:59:05.920 explain that?
00:59:06.980 And the way I
00:59:08.440 explain it is
00:59:08.900 quite easily.
00:59:10.260 You do
00:59:10.920 practical things
00:59:11.940 and you don't
00:59:13.320 do impractical
00:59:14.340 things.
00:59:14.760 That's it.
00:59:16.020 Who would
00:59:16.580 disagree with
00:59:17.340 the following
00:59:17.840 statement?
00:59:19.060 You should do
00:59:19.940 things that are
00:59:20.600 practical and
00:59:21.840 also good,
00:59:23.000 but you should
00:59:23.900 not do things
00:59:24.560 that are
00:59:24.860 impractical,
00:59:26.360 even if,
00:59:27.360 hypothetically,
00:59:28.000 they would be good
00:59:29.080 were they
00:59:29.360 practical.
00:59:30.160 It is
00:59:30.900 practical to
00:59:31.880 prevent new
00:59:32.520 business from
00:59:33.100 going to
00:59:33.500 China.
00:59:33.840 It is not
00:59:36.600 practical to
00:59:37.960 uproot companies
00:59:39.120 that are
00:59:39.820 gigantic parts
00:59:40.860 of the
00:59:41.140 American economy
00:59:42.020 and at the
00:59:44.280 moment they
00:59:44.960 require,
00:59:45.740 let's say,
00:59:46.840 assembly in
00:59:47.480 China.
00:59:48.440 So I don't
00:59:49.460 judge Apple
00:59:50.200 and Tesla by
00:59:50.980 the same
00:59:51.340 standard.
00:59:52.000 I would judge
00:59:52.620 some company
00:59:53.500 that decided
00:59:54.140 to open a
00:59:55.140 new factory
00:59:55.740 there tomorrow.
00:59:58.220 So I've
00:59:59.080 never put them
00:59:59.660 in the same
01:00:00.080 category.
01:00:01.600 So no matter
01:00:02.260 how many times
01:00:02.940 you ask me,
01:00:03.840 am I
01:00:04.720 comfortable
01:00:05.100 with my
01:00:05.620 iPhone,
01:00:06.100 I'm always
01:00:06.400 going to
01:00:06.600 give you
01:00:06.800 the same
01:00:07.100 answer.
01:00:08.040 No.
01:00:09.640 No, I'm
01:00:10.300 not comfortable
01:00:10.860 with it one
01:00:11.860 bit.
01:00:13.620 But it
01:00:14.640 isn't practical
01:00:15.420 to change it
01:00:16.060 quickly.
01:00:17.060 It might be
01:00:17.540 practical in the
01:00:18.220 long run,
01:00:18.720 and I hope,
01:00:19.680 and I trust.
01:00:20.740 I trust Apple's
01:00:21.680 actually doing
01:00:22.220 that, by the
01:00:22.720 way.
01:00:23.340 I trust that
01:00:24.000 they're looking
01:00:24.380 at their
01:00:24.740 options and
01:00:25.380 trying to
01:00:25.700 figure out
01:00:26.080 what to do.
01:00:27.520 But the
01:00:29.320 way to build
01:00:29.800 it in the
01:00:30.120 United States
01:00:30.620 is probably
01:00:31.000 with robots,
01:00:32.200 not people,
01:00:32.840 so it's
01:00:33.760 not like
01:00:34.080 it's going
01:00:34.360 to boost
01:00:34.840 employment if
01:00:35.520 they move
01:00:35.820 it over
01:00:36.080 here.
01:00:36.400 But it
01:00:36.580 will at
01:00:36.780 least get
01:00:36.980 it out
01:00:37.180 of China.
01:00:39.400 Somebody says I'm being
01:00:39.920 a little
01:00:40.160 too nice
01:00:40.720 to Apple
01:00:41.180 and Tesla.
01:00:42.040 Well,
01:00:43.100 nice isn't
01:00:43.680 really the
01:00:45.300 operative
01:00:45.900 thing here.
01:00:47.880 And by the
01:00:48.220 way, I should
01:00:48.760 say, as
01:00:50.580 I've said
01:00:50.860 before, I do
01:00:51.700 own stock
01:00:52.360 in both
01:00:53.560 Apple and
01:00:54.120 Tesla.
01:00:55.380 Now, one
01:00:55.720 of the reasons
01:00:56.080 I own stock
01:00:56.760 with them is
01:00:57.220 that they're
01:00:57.560 just monsters.
01:00:58.860 They're both
01:01:00.240 just monsters.
01:01:01.800 And when I
01:01:03.660 invest in
01:01:04.200 things, it's
01:01:04.680 not because I
01:01:05.600 love them.
01:01:06.960 I don't invest
01:01:07.940 in things I
01:01:08.440 love.
01:01:09.580 I invest in
01:01:10.200 things that
01:01:10.660 look like they
01:01:11.180 can't possibly
01:01:11.980 lose money
01:01:12.580 because they've
01:01:13.340 got some
01:01:13.700 kind of
01:01:14.140 monopoly-like
01:01:15.540 control of
01:01:16.640 things or
01:01:17.100 they're doing
01:01:17.520 something right.
01:01:18.780 So investment
01:01:19.420 is not about
01:01:20.160 what is right
01:01:22.460 or wrong.
01:01:24.800 Nice means
01:01:25.780 gullible.
01:01:26.260 Well, what
01:01:28.060 exactly would
01:01:29.200 be the
01:01:29.500 example of
01:01:30.000 what I'm
01:01:30.280 gullible about?
01:01:31.720 Do you
01:01:32.160 think it's
01:01:32.520 practical for
01:01:33.620 them to
01:01:33.920 just pull
01:01:34.700 out?
01:01:36.060 You wouldn't
01:01:36.720 have a
01:01:37.020 phone.
01:01:39.060 Do I own
01:01:39.660 a Dilbert
01:01:40.140 desk calendar?
01:01:41.100 Same problem.
01:01:43.120 Even my
01:01:44.320 products, the
01:01:45.040 Dilbert desk
01:01:45.540 calendars, are
01:01:46.180 printed in
01:01:47.020 China.
01:01:48.060 Now, am I
01:01:49.560 happy about
01:01:50.020 that?
01:01:51.240 Nope.
01:01:52.420 Do you
01:01:52.680 know where
01:01:52.920 the other
01:01:53.360 places you
01:01:53.940 can print a
01:01:54.500 Dilbert
01:01:54.740 calendar are?
01:01:55.380 Economically?
01:01:58.040 Neither do
01:01:58.500 I.
01:01:59.100 I have no
01:01:59.660 idea.
01:02:00.160 I have no
01:02:00.780 idea where
01:02:01.280 we go.
01:02:02.460 But it's
01:02:02.940 also by
01:02:04.700 preference and
01:02:05.860 by contract,
01:02:06.680 I don't make
01:02:07.100 those decisions.
01:02:09.120 So the
01:02:09.660 decisions on
01:02:10.280 where the
01:02:10.840 publisher
01:02:11.820 prints is
01:02:13.540 out of my
01:02:14.380 company's
01:02:16.100 control.
01:02:19.220 Coinbase HR
01:02:20.180 policy is
01:02:20.940 better than,
01:02:21.980 yeah, I
01:02:22.260 don't know
01:02:22.480 what that
01:02:22.740 means.
01:02:24.360 Try India.
01:02:25.380 I wonder
01:02:28.500 why India
01:02:28.940 doesn't have
01:02:29.360 more
01:02:29.540 manufacturing.
01:02:32.060 Actually, can
01:02:32.640 somebody answer
01:02:33.120 that question?
01:02:34.440 Why is it
01:02:35.060 that India
01:02:35.600 with massive
01:02:36.640 amounts of
01:02:37.240 potential
01:02:37.700 employment and
01:02:40.000 a pretty good
01:02:40.960 education system and all that
01:02:41.200 doesn't do more manufacturing?
01:02:44.620 Somebody
01:02:45.140 says they
01:02:45.560 do textiles.
01:02:46.840 But why
01:02:47.340 isn't India
01:02:47.960 just massively
01:02:48.980 competing with
01:02:49.840 China?
01:02:50.320 Because wouldn't
01:02:51.080 you give all
01:02:51.700 of your
01:02:51.940 business to
01:02:52.460 India?
01:02:52.680 If
01:02:53.740 you had a
01:02:54.040 choice, same
01:02:54.800 price, Indian
01:02:56.860 manufacturer or
01:02:57.980 Chinese, you
01:02:58.680 would take
01:02:59.000 Indian every
01:02:59.560 time, wouldn't
01:03:00.040 you?
01:03:01.220 Because they're
01:03:01.920 an ally, in
01:03:04.360 a way.
01:03:06.580 The electric
01:03:07.400 grid.
01:03:08.040 Somebody says
01:03:08.540 they can't do
01:03:09.120 manufacturing because
01:03:10.560 their electric
01:03:10.560 grid.
01:03:11.020 I don't know.
01:03:13.880 That's an
01:03:15.060 interesting point,
01:03:16.320 but I don't
01:03:16.680 know.
01:03:17.460 I mean, China's
01:03:18.040 got problems with
01:03:18.680 their electric
01:03:19.120 grid.
01:03:19.460 They do it.
01:03:19.860 Lack of
01:03:21.320 natural resources.
01:03:22.700 I don't know.
01:03:25.600 They could do
01:03:26.260 assembly.
01:03:28.120 So assembly
01:03:28.840 would still
01:03:29.260 work.
01:03:32.580 Look at
01:03:32.980 this.
01:03:34.120 Vlad says,
01:03:35.040 Scott, so
01:03:35.780 much basic
01:03:36.440 stuff you
01:03:36.960 don't know.
01:03:37.980 Just like in
01:03:38.640 the case of
01:03:39.100 narcissists.
01:03:40.240 How can I
01:03:40.820 send information
01:03:41.460 to you?
01:03:42.280 Well, Vlad,
01:03:42.760 you are a
01:03:43.240 narcissist.
01:03:45.320 So I don't
01:03:46.000 want to hear a
01:03:46.520 fucking thing
01:03:46.980 from you,
01:03:47.360 actually.
01:03:48.340 You sound
01:03:48.940 like a jerk.
01:03:49.860 Now, if
01:03:52.460 you could
01:03:52.740 find a
01:03:53.200 way to
01:03:54.680 be less
01:03:55.620 of a
01:03:56.660 flaming
01:03:57.220 jerk,
01:03:58.540 then I'm
01:03:58.980 always interested
01:03:59.520 in seeing
01:03:59.920 things that
01:04:00.520 don't agree
01:04:01.200 with my
01:04:01.680 opinions.
01:04:08.320 Would
01:04:08.800 India have
01:04:09.380 been better
01:04:09.820 off had
01:04:10.300 they stayed
01:04:10.760 part of the
01:04:11.300 British Empire?
01:04:13.260 That's not
01:04:13.920 a fair
01:04:14.240 question.
01:04:15.420 People need
01:04:16.140 their freedom
01:04:16.680 too.
01:04:17.980 How do you
01:04:18.440 value that?
01:04:19.860 Tata makes
01:04:23.000 more cars
01:04:23.580 than any
01:04:23.900 other?
01:04:24.140 Is that
01:04:24.340 true?
01:04:25.900 Tata is
01:04:26.520 bigger than
01:04:26.920 GM.
01:04:27.280 That's an
01:04:27.700 Indian car
01:04:28.360 company.
01:04:29.140 I didn't
01:04:29.420 know that.
01:04:30.940 Well,
01:04:31.580 India has
01:04:34.460 really low
01:04:34.960 quality and
01:04:35.820 the regulations
01:04:36.420 are insane.
01:04:37.460 Are you
01:04:37.780 telling me that
01:04:38.320 China doesn't
01:04:38.940 have insane
01:04:39.440 regulations?
01:04:41.020 It doesn't
01:04:41.760 seem like
01:04:42.120 anything they
01:04:42.600 can't solve.
01:04:43.400 Why not
01:04:49.020 come down
01:04:49.420 on Mexico
01:04:49.980 for fentanyl?
01:04:50.880 Are you
01:04:51.120 kidding?
01:04:52.520 I'm being
01:04:52.920 asked why
01:04:53.700 I don't
01:04:54.020 come down
01:04:54.440 on Mexico
01:04:55.020 for the
01:04:55.580 fentanyl?
01:04:57.740 Apparently
01:04:58.240 you've not
01:04:58.660 watched me
01:04:59.120 for very
01:04:59.480 long.
01:05:00.540 I'm in
01:05:01.180 favor of
01:05:01.660 droning
01:05:02.080 the cartels.
01:05:05.140 Can I
01:05:06.060 be any
01:05:06.780 harder than
01:05:07.540 that?
01:05:08.100 I'm in
01:05:08.560 favor of
01:05:09.060 killing the
01:05:10.080 top fentanyl
01:05:11.300 dealer in
01:05:11.940 China, where
01:05:13.260 he lives, and
01:05:14.660 even telling
01:05:15.180 China we
01:05:15.620 did it.
01:05:16.500 I wouldn't
01:05:17.160 even keep it
01:05:17.580 a secret.
01:05:18.820 Now we
01:05:19.240 probably
01:05:19.440 couldn't use
01:05:19.920 a drone in
01:05:20.680 China, but
01:05:21.880 if we could
01:05:22.360 find out
01:05:22.860 some way to
01:05:23.700 kill that
01:05:24.080 guy and
01:05:25.100 just say,
01:05:25.740 look, China,
01:05:26.220 we gave you
01:05:27.180 a year.
01:05:28.320 We gave you
01:05:29.140 his name,
01:05:29.800 his address,
01:05:31.020 all the
01:05:31.600 evidence of
01:05:32.040 the crime,
01:05:33.040 and we
01:05:33.300 gave you a
01:05:33.780 year to
01:05:34.160 fix it.
01:05:35.280 And then
01:05:35.820 we killed
01:05:36.200 him, and
01:05:37.740 we'll do
01:05:38.040 it again.
01:05:39.900 We shouldn't
01:05:40.540 have any
01:05:40.960 apology for
01:05:41.640 that.
01:05:42.780 Likewise,
01:05:43.560 with the
01:05:43.820 cartels, if
01:05:45.180 we decide to
01:05:45.740 just drone
01:05:46.240 the cartels
01:05:47.080 because of
01:05:48.800 the drug
01:05:49.360 business, I'm
01:05:51.860 okay with
01:05:52.220 that.
01:05:52.840 Is that
01:05:53.460 hard enough?
01:05:54.480 Is murdering
01:05:55.420 them where
01:05:55.760 they stand
01:05:56.360 not strong
01:05:57.480 enough for
01:05:57.840 you?
01:06:00.500 All right.
01:06:04.200 Drone
01:06:04.720 attack the
01:06:05.260 CEO of
01:06:05.720 McDonald's?
01:06:06.340 Well, again,
01:06:08.160 those are illegal
01:06:09.320 products.
01:06:09.800 You think
01:06:12.000 Trump was
01:06:12.500 angling
01:06:13.360 toward India
01:06:13.940 as a
01:06:14.200 replacement
01:06:14.500 for China?
01:06:15.140 Well, yes,
01:06:16.460 certainly in
01:06:17.080 a strategic
01:06:17.720 sense.
01:06:20.280 Why doesn't
01:06:21.120 America have
01:06:21.740 quick detection
01:06:22.600 COVID strips?
01:06:23.520 I'm afraid to
01:06:26.360 tell you the
01:06:26.800 answer to that
01:06:27.320 question because
01:06:27.880 I think I
01:06:28.400 know it.
01:06:29.320 It's
01:06:29.560 corruption.
01:06:31.580 I don't
01:06:32.500 know the
01:06:32.820 details, but
01:06:34.480 I would place
01:06:35.800 a large bet
01:06:36.760 that if you
01:06:38.220 dug down into
01:06:39.100 the approval
01:06:39.680 process of
01:06:40.740 quick COVID
01:06:41.780 tests, you
01:06:43.040 would eventually
01:06:43.560 find somebody
01:06:44.440 with a
01:06:45.140 pharmaceutical
01:06:45.860 connection
01:06:46.660 that's a
01:06:48.420 problem.
01:06:50.980 So that's
01:06:53.760 all I'm
01:06:54.060 going to
01:06:54.220 say because
01:06:55.260 we don't
01:06:55.980 have any
01:06:56.540 explanation
01:06:57.220 for why
01:07:00.020 we haven't
01:07:00.640 done it.
01:07:02.440 Right?
01:07:03.480 Because you
01:07:04.020 would hear an
01:07:04.440 alternate
01:07:04.740 explanation unless
01:07:05.800 it was
01:07:06.080 corruption.
01:07:06.860 You would
01:07:07.120 hear an
01:07:07.360 alternate
01:07:07.640 explanation.
01:07:08.180 Oh, these
01:07:08.760 rapid test
01:07:09.800 strips, they
01:07:10.240 have a
01:07:10.520 problem, or
01:07:11.380 we'd love to
01:07:12.420 make them, but
01:07:13.000 we can't get
01:07:13.480 the precursors,
01:07:14.560 or we'd love
01:07:15.520 to make them,
01:07:16.020 but we don't
01:07:16.460 have the
01:07:16.760 manufacturing
01:07:17.300 capacity, or
01:07:18.260 we'd love to
01:07:19.220 make them, but
01:07:19.800 it's not as
01:07:20.300 good as you
01:07:20.700 think, or
01:07:21.320 we'd love to
01:07:21.800 make them, but
01:07:22.340 they're not as
01:07:22.860 safe, or it
01:07:23.440 caused some
01:07:23.840 other problem.
01:07:25.540 Have you
01:07:25.880 heard anything
01:07:26.260 like that?
01:07:28.300 No.
01:07:29.520 Nope.
01:07:30.480 You haven't.
01:07:31.260 And if you
01:07:31.820 haven't heard any
01:07:32.800 reasonable
01:07:33.740 counter-argument,
01:07:35.080 and you
01:07:35.360 haven't, you've
01:07:36.260 heard no
01:07:36.680 counter-argument,
01:07:38.020 it's got to
01:07:38.460 be corruption.
01:07:39.720 Process of
01:07:40.400 elimination.
01:08:42.300 We know
01:08:43.120 it's a good
01:08:43.560 thing. It's
01:08:44.740 public; the
01:08:45.800 public knows
01:08:46.420 it's a good
01:08:46.800 thing.
01:07:47.360 I mean, a
01:07:47.680 lot of us
01:07:48.020 do.
01:07:48.820 It's been
01:07:49.460 promoted, there's
01:07:51.300 been testimony to
01:07:52.460 Congress, people
01:07:53.640 have told them
01:07:54.220 exactly why it's
01:07:55.060 good, there's
01:07:56.120 zero argument on
01:07:57.100 the other side,
01:07:58.240 and it doesn't
01:07:58.640 happen.
01:08:00.080 It's got to be
01:08:00.900 corruption.
01:08:01.740 There's no
01:08:02.320 explanation left.
01:08:05.880 And has the
01:08:06.920 media looked into
01:08:07.640 it?
01:08:07.800 Have you seen
01:08:09.560 any stories by a
01:08:11.180 major media entity
01:08:12.320 saying, we looked
01:08:13.700 into why the U.S.
01:08:14.880 doesn't have
01:08:15.380 rapid tests, and
01:08:16.300 here's what we
01:08:16.860 found?
01:08:18.200 If you've seen
01:08:19.120 that, send it to
01:08:19.940 me, because I
01:08:21.080 don't think I've
01:08:21.700 seen it.
01:08:22.600 You know the
01:08:23.040 dog that's not
01:08:23.740 barking?
01:08:24.680 The dog that's
01:08:25.640 not barking is a
01:08:27.300 story about why we
01:08:28.640 don't have those
01:08:29.400 test strips.
01:08:31.720 Now, I know that in
01:08:33.000 the beginning we had
01:08:33.720 some crappy regular
01:08:35.280 tests, PCR tests or
01:08:37.460 whatever, but that's a
01:08:38.480 different story;
01:08:40.220 those tests have
01:08:41.040 nothing to do with
01:08:41.760 what we're talking
01:08:42.340 about, the rapid
01:08:42.940 tests.
01:08:43.280 The reason we
01:08:44.280 don't have those
01:08:45.060 and other countries
01:08:45.800 do, it's not
01:08:48.340 money, it's not
01:08:49.500 resources, it's not
01:08:50.640 intelligence, it's
01:08:51.520 not data, it's not
01:08:53.240 policy, it's nothing.
01:08:56.900 Literally, we have
01:08:57.940 not been told why we
01:09:00.040 didn't do it, and
01:09:01.860 are still not doing
01:09:02.600 it, at a scale that
01:09:03.840 would make sense.
01:09:05.260 It's got to be
01:09:05.960 corruption.
01:09:07.500 Do you have
01:09:08.140 another theory?
01:09:09.860 Because if it's not
01:09:10.760 corruption, they
01:09:11.440 would tell us the
01:09:12.120 other reason.
01:09:14.260 Right?
01:09:19.740 Anyway, I think we
01:09:21.340 have to assume
01:09:22.620 corruption to force
01:09:23.840 the government to
01:09:24.580 communicate correctly.
01:09:26.380 If the only problem
01:09:28.180 is they haven't
01:09:28.800 communicated their
01:09:29.740 objection, well,
01:09:31.160 that's a big
01:09:31.660 problem, right?
01:09:33.740 They need to
01:09:34.440 communicate that
01:09:35.180 objection so we
01:09:36.220 can know we have
01:09:36.820 some trust in our
01:09:38.260 government.
01:09:39.000 Right now, you
01:09:39.620 have to assume no
01:09:40.480 trust, and you
01:09:42.220 have to assume
01:09:42.900 corruption.
01:09:44.480 That's got to be the
01:09:45.280 default assumption,
01:09:46.720 and I'd love to be
01:09:47.660 talked out of it.
01:09:49.080 Ideally, we get
01:09:50.040 talked out of it.
01:09:51.340 Not seeing it
01:09:51.900 happen.
01:09:53.620 All right.
01:09:54.760 That is my show for
01:09:56.300 today.
01:09:57.140 Sorry there's too much
01:09:57.980 COVID stuff in there.
01:09:59.180 I know.
01:09:59.720 I tried to leave it to
01:10:00.380 the end.
01:10:01.160 But there wasn't much
01:10:01.900 happening otherwise.
01:10:03.400 And I will talk to
01:10:04.840 you tomorrow.