Real Coffee with Scott Adams - August 26, 2021


Episode 1480 Scott Adams: Talking About All the Huge D*cks in the News Today. With Coffee.


Episode Stats

Length

40 minutes

Words per Minute

151.01259

Word Count

6,122

Sentence Count

441

Misogynist Sentences

2

Hate Speech Sentences

26


Summary

In this episode of What We Call the News, I talk about the recent attack on the Kabul airport by ISIS-K, and how it could have a big impact on our understanding of what's going on in the rest of the world.


Transcript

00:00:00.000 Good morning, everybody.
00:00:06.260 And what a terrific day it is.
00:00:08.820 Today is the day that you'll have an unexpectedly better day than you expected.
00:00:15.480 Yeah, never start a sentence if you don't know how it's going to end.
00:00:18.460 It ends up like that.
00:00:20.280 But if you'd like to start something that does end well,
00:00:23.740 well, let me tell you, I got something for you.
00:00:25.860 It's called the simultaneous sip. Have you heard of it?
00:00:28.060 Yeah, it's very big.
00:00:29.720 It's all over the world.
00:00:31.020 And all you need to participate in this mass exercise of unity
00:00:36.300 is a cup or mug or glass, a tank or chalice or stein,
00:00:40.020 a canteen, jug or flask, a vessel of any kind.
00:00:45.000 Fill it with your favorite liquid.
00:00:47.060 I still like coffee.
00:00:48.660 And join me now for the unparalleled pleasure,
00:00:51.680 the dopamine of the day, the thing that makes everything better.
00:00:54.840 It's called the simultaneous sip.
00:00:56.640 And watch it happen now. Go.
00:00:59.720 Ah, that's some good stuff.
00:01:06.240 Speaking of good stuff, I once had the following idea,
00:01:11.820 which is that we spend a lot of time in the bathroom.
00:01:15.840 And wouldn't it be great if every time you entered, let's say,
00:01:19.620 the bathroom that you usually use, because most of us are creatures of routine,
00:01:23.800 if there was some kind of sensor that knew who you are,
00:01:28.260 recognized you by your, I don't know, your Apple Watch or something,
00:01:31.280 and automatically brought up a two-minute lesson
00:01:35.200 of something useful that you could learn in two minutes.
00:01:38.420 And I called it Turd University, because you'd be doing your business,
00:01:44.380 and in about two minutes, you'd learn a new skill.
00:01:47.820 And since you use the restroom every day,
00:01:50.500 you'd learn 365 skills in one year.
00:01:55.160 With no effort whatsoever, you were going to be there anyway,
00:01:57.740 and maybe you didn't bring any reading material.
00:01:59.560 And so I didn't realize that I had accidentally created Turd University,
00:02:05.820 but I called it my Locals subscription service.
00:02:13.580 So in addition to my other content that's too edgy to be in the general public,
00:02:17.840 such as my Robots Read News comic strip,
00:02:22.640 here are some of the things that people are learning
00:02:25.100 on the Locals subscription platform.
00:02:27.900 I'll just give you an example.
00:02:30.600 I think I've got hundreds of them now.
00:02:32.700 So these are micro-lessons, two-minute lessons or so that I give people,
00:02:36.340 so you can learn how to maximize your creativity,
00:02:39.380 how to set up your own live stream,
00:02:41.440 when to believe the experts, voice variability,
00:02:45.180 fun versus efficiency.
00:02:47.920 I'll just give you some more.
00:02:49.380 There are just hundreds of them.
00:02:52.000 Controlling what you can control.
00:02:56.380 Some of them are just funny.
00:02:57.900 Determining your human value, managing your ego,
00:03:01.520 learning to speak well.
00:03:03.300 And I've got one coming up that I'm going to record today
00:03:05.600 on how to brag without bragging.
00:03:10.500 So those are the things.
00:03:12.240 I've got my whiteboard all queued up.
00:03:14.720 And if you were paying $7 a month
00:03:16.660 to be on the Locals subscription platform,
00:03:19.420 one of the things that I promise is that you will get hundreds of dollars of value every month
00:03:27.680 in terms of how it makes your life better.
00:03:31.040 All right.
00:03:32.080 So let's see what's going on here in what we call the news.
00:03:38.280 First of all, ISIS-K may have just attacked just before I got on.
00:03:47.140 I saw that the Kabul airport had some kind of explosion and gunfire at one of the gates.
00:03:52.680 Could be ISIS-K.
00:03:54.640 Now, I don't like ISIS-K as a name because it's a little bit too cool.
00:03:59.000 I don't like naming a terror organization after something that sounds like a boy band.
00:04:07.440 It sounds just a little bit too attractive.
00:04:10.320 Hey, have you seen what ISIS-K is up to?
00:04:12.880 What?
00:04:13.820 I'd like to get their album.
00:04:15.880 So I think we've branded them poorly.
00:04:18.380 Just a little bit too cool.
00:04:20.440 Now, if you don't know about ISIS-K,
00:04:22.160 they want to start a caliphate over there in the Afghanistan-ish territories.
00:04:26.980 And the Taliban are considered too liberal for them.
00:04:32.760 That's right.
00:04:33.300 They're the natural enemy of the Taliban because the Taliban's a little too flexible.
00:04:38.520 But ISIS-K, not so flexible.
00:04:41.940 And so the question is, and again, you have to wonder about this, don't you?
00:04:47.900 Will the Taliban's control of Afghanistan make us more safe or less safe from ISIS-K?
00:04:55.200 I don't know.
00:04:56.300 Because the Taliban are their enemies and the Taliban could probably be more brutal
00:05:00.580 in tracking them down and killing them than we would be.
00:05:04.840 So isn't one possible unexpected outcome that ISIS and al-Qaeda are degraded in Afghanistan?
00:05:14.220 Because the Taliban really doesn't want any more trouble with the rest of the world.
00:05:18.160 They just want to run their little fiefdom there.
00:05:21.180 So anyway, I'm not saying that's necessarily likely,
00:05:25.460 but I don't think you could rule out the possibility that retreating gets us better protection from ISIS-K.
00:05:34.520 Yeah, I'm seeing a lot of no's.
00:05:35.920 Yeah, I wouldn't say it's likely, but I think it's one of the possibilities.
00:05:39.280 A Rasmussen poll asked, is Afghanistan situation better or worse than the news portrays?
00:05:47.540 57% say worse.
00:05:50.280 Really?
00:05:50.840 Because it feels like the news is portraying it as pretty bad.
00:05:55.820 This is the one situation where all of the news is on the same side.
00:05:59.940 You rarely get that, right?
00:06:01.460 The entire news thinks this is pretty, pretty bad.
00:06:04.760 Here is a little warning shot for you.
00:06:13.100 Now, you all, of course, know about the controversy of ivermectin.
00:06:18.280 And, boy, if you're on the side that thinks it works, and you're wrong,
00:06:24.320 I don't know if we'll ever know for sure, right?
00:06:26.220 Are we ever going to know for sure?
00:06:27.720 But let's say we could know for sure someday.
00:06:29.900 If you're wrong about ivermectin, it's going to be really embarrassing.
00:06:35.760 Because the setup here is that, let me just say what Chris Cillizza said on CNN.
00:06:43.780 And normally I don't agree with him, and I'm not saying I agree with him here,
00:06:49.140 but the way he worded it is worthy of repeating.
00:06:52.920 He said, owning the elites by taking a horse pill,
00:06:55.840 so I guess ivermectin is often used in veterinarian work,
00:07:00.620 so owning the elites by taking a horse pill rather than getting the vaccine
00:07:04.040 is really a telling commentary on just how much blind partisanship
00:07:08.240 has taken over among Trump conservatives.
00:07:12.040 Telling and terrifying.
00:07:14.360 Owning the elites by taking a horse pill rather than getting a vaccine.
00:07:19.200 That's pretty brutal.
00:07:21.420 Now, what if he's wrong?
00:07:23.220 What if Chris Cillizza is the one who's wrong?
00:07:26.480 Well, the joke's on you, Chris.
00:07:28.080 Because if it turns out that vaccines are bad, and ivermectin is great,
00:07:33.980 he's going to have a little egg on his face,
00:07:35.840 except that it won't be as easy to mock him,
00:07:39.440 because he's on the same side as, you know, science.
00:07:41.840 So you don't want to be on the wrong side of science
00:07:46.660 and also be promoting something that other people can reasonably call a horse pill.
00:07:53.040 So persuasion-wise, this is no longer a balanced situation.
00:07:57.220 If it turns out that the Chris Cillizzas of the world who think the vaccination is real and good
00:08:03.180 and ivermectin is not,
00:08:05.720 if he turns out to be wrong,
00:08:08.440 it's still not that embarrassing for him
00:08:11.000 because he was agreeing with the entire scientific community.
00:08:15.320 Not entire, but, you know, the consensus.
00:08:17.660 So if you agree with the consensus and you're wrong,
00:08:20.680 it's not too embarrassing.
00:08:22.680 But what happens if you take a radically different opinion from the consensus
00:08:27.060 and then you're wrong?
00:08:30.000 Not so good, right?
00:08:31.840 You don't want to be the one pushing the horse pills,
00:08:34.480 the horse pills,
00:08:36.000 if it turns out to be wrong.
00:08:37.720 So one of the things you have to look at with persuasion is,
00:08:41.060 who's got the bigger downside?
00:08:43.240 Because we don't like downside, right?
00:08:45.460 So Chris Cillizza is taking a bet with not much of a downside.
00:08:50.220 If he's wrong, well, so are a lot of other people.
00:08:53.160 But boy, if you take the ivermectin side and you're wrong,
00:08:55.620 you're the one pushing a horse pill
00:08:57.000 because you were so partisan that you couldn't believe the other side, I guess.
00:09:04.560 Personally, I would say that I don't believe...
00:09:09.640 I would say every day that goes by
00:09:11.440 when ivermectin is not proven to be effective,
00:09:15.460 reduces the chances that it is.
00:09:17.920 Because, you know, I don't know one way or the other.
00:09:20.420 But I'd say every day that goes by, I would lower my odds.
00:09:23.520 Same as I did with hydroxychloroquine.
00:09:25.600 Every day that went by when it wasn't proven to work,
00:09:28.620 I said, well, I think the odds are going down every day.
00:09:32.020 And sure enough, we still do not have evidence it does.
00:09:34.380 But there's some high-ranking doctor in some medical association in Japan
00:09:43.060 who is saying ivermectin, he still doesn't know if it works.
00:09:47.140 He's not saying it works.
00:09:48.600 He says the evidence is ambiguous.
00:09:51.120 But he's promoting it based on the cost-benefit analysis.
00:09:55.220 So that would be a big break with the rest of the medical community.
00:10:00.400 I don't think it's necessarily government policy.
00:10:03.140 It was just the head of some medical organization.
00:10:07.560 We've got 100,000 people hospitalized with COVID in the United States.
00:10:12.080 We just broke last year's record.
00:10:14.660 We're above it.
00:10:17.620 It's actually worse than last year.
00:10:21.040 But does it seem worse?
00:10:23.060 Why is it worse than last year but it doesn't seem worse?
00:10:25.840 Well, some of it's the way it's reported, right?
00:10:27.740 Because once again, 100,000 people hospitalized, how many are going to die?
00:10:34.620 Isn't that really different from last year?
00:10:36.960 Wouldn't it be more fair to say X people died last year
00:10:40.320 and this year very, very much fewer will die because of vaccinations and whatever?
00:10:46.960 And also because weaker people have already died.
00:10:50.160 So a lot of people who could die from the vaccines are already dead.
00:10:53.160 Or not from the vaccines, but from the COVID.
00:10:57.740 And here's a statistic that I didn't know this, but see if you did.
00:11:03.800 That 48% of the U.S. is still not fully vaccinated.
00:11:08.060 Did you know that?
00:11:09.900 I thought the number was much higher by now.
00:11:12.180 I figured we were at 60-plus percent already fully vaccinated.
00:11:17.260 But we're at 48% not fully vaccinated?
00:11:22.520 That feels like a gigantic number, doesn't it?
00:11:24.720 Wouldn't you say, could you not say that, well, I suppose it depends on your point of view,
00:11:31.020 but if the Biden administration is trying to convince you to get vaccinated,
00:11:36.460 this is a complete failure, isn't it?
00:11:39.820 Wouldn't you say?
00:11:41.000 48% not fully vaccinated looks like a complete failure,
00:11:46.060 given the time and the fact that we have vaccinations.
00:11:48.600 So persuasion-wise, how many people did the fake news kill?
00:11:55.280 You see where I'm going with this?
00:11:57.360 Why do you think that so many people are not vaccinated in the United States?
00:12:02.420 I think that so many people are not vaccinated because there's no trust whatsoever in institutions.
00:12:08.460 What is it that primarily has caused us to not trust institutions?
00:12:12.480 What is the one thing that makes people so skeptical of everything, and maybe they should be?
00:12:19.160 It's the news.
00:12:20.480 And it's the news dividing everything into a partisan world.
00:12:24.080 And that partisan stuff works great if all you're doing is talking about politics.
00:12:28.960 It's fun that way.
00:12:30.820 But as soon as you get into a medical discussion,
00:12:34.100 and the world divides into its two camps,
00:12:38.180 you just killed 100,000 people.
00:12:40.300 Right?
00:12:43.020 Because you divided people into political camps,
00:12:45.460 so they made a political decision on a medical decision.
00:12:48.940 If you make a political calculation on your own medical decision,
00:12:54.920 you're not going to get the right outcome.
00:12:58.360 You need to make a medical decision on your medical decisions, right?
00:13:01.720 So I feel as if you could make a case,
00:13:05.580 and I believe that after all this is said and done,
00:13:07.460 that people are going to do all kinds of economic studies and death studies
00:13:11.120 to find out who killed the most people.
00:13:14.620 How many people did I kill?
00:13:17.140 It's not zero.
00:13:19.300 Probably.
00:13:20.280 It's probably not zero.
00:13:22.640 Right?
00:13:23.380 Like, how many people did I personally kill
00:13:26.380 because of anything I said that had some influence on somebody
00:13:30.680 who made a decision they wouldn't have made?
00:13:32.140 Probably not zero.
00:13:34.600 Right?
00:13:34.860 Because enough people watch this
00:13:36.140 that there's probably a few people who died of COVID,
00:13:39.740 and maybe they didn't need to.
00:13:41.500 Maybe I said something wrong.
00:13:42.760 I don't know what that would be exactly.
00:13:44.300 But maybe I said something wrong and killed somebody.
00:13:46.240 That's very possible.
00:13:47.960 Because that's sort of a risk that just comes with this kind of work.
00:13:51.200 But what about Trump?
00:13:53.380 How many people did Trump kill?
00:13:56.960 Well, it depends if he was right about stuff or wrong about stuff
00:14:00.440 or if he didn't persuade enough
00:14:02.280 or if he persuaded too much on, say, hydroxychloroquine.
00:14:06.100 But you could probably figure out how many people he killed
00:14:09.160 or how many people he saved.
00:14:11.560 Maybe he saved more than he killed.
00:14:13.160 Then you look at the net.
00:14:15.420 But I would think that the biggest kill statistic would be the news
00:14:20.080 because the business model of the news forces them into the two camps.
00:14:25.240 The two camps make people make bad medical decisions
00:14:28.000 because they're really political decisions.
00:14:30.320 You're just dressing them up as medical decisions.
00:14:33.020 And then they die.
00:14:35.380 So I think the fake news probably killed 100,000 people in the United States.
00:14:40.080 And do you know where you'll never see that reported?
00:14:42.940 The fake news.
00:14:44.820 So the news is the only industry that can kill 100,000 people
00:14:47.980 and you'll never hear about it.
00:14:50.080 The world we live in.
00:14:53.980 All right.
00:14:57.160 You may have heard that famous aging porn star Ron Jeremy
00:15:01.560 has been indicted on 30 counts of sexual assault
00:15:05.140 and a bunch of just horrible-sounding stuff.
00:15:09.120 And I had this observation about context.
00:15:13.160 Now, Ron Jeremy is famous for having nearly a foot-long penis.
00:15:17.980 If you didn't know that, now you do.
00:15:22.020 Now, here's how context makes a difference.
00:15:25.040 If I said to you, there's a famous porn star with a foot-long penis,
00:15:29.920 you'd say to yourself, well, that sounds about right.
00:15:32.340 Sounds like he's got exactly the right profession.
00:15:35.500 Indeed, if I were to watch said entertainment,
00:15:38.800 I would expect exactly that.
00:15:41.060 Exactly that.
00:15:41.860 So, porn star, gigantic penis, it's a really good fit.
00:15:48.500 The trouble is, once you retire
00:15:50.320 and you decide that you would rather look less like a retired porn star
00:15:55.920 and you like to make a change with your physical appearance,
00:16:00.220 you know, maybe exercise a little bit less,
00:16:01.920 and turn it into something that looks less like a porn star
00:16:06.400 and more like a, I don't know, a meth addict,
00:16:10.940 sort of a homeless meth addict.
00:16:14.240 I don't know if you've seen him lately.
00:16:15.660 It's not good.
00:16:17.160 He looks like a hobo who's been in a fight.
00:16:20.280 Now, here's where context matters.
00:16:23.560 If you've got somebody who looks like a hobo on meth,
00:16:27.680 and I told you that he had a 12-inch penis,
00:16:32.200 you'd be twice as scared.
00:16:34.500 Am I right?
00:16:36.080 Because it just doesn't, it's not good.
00:16:38.420 Hey, there's a hobo on the side.
00:16:40.520 What's your first thought?
00:16:42.380 Hey, maybe I could give him a ham sandwich.
00:16:45.060 Because it's a hobo, you know, maybe you'd like to help.
00:16:47.960 Help feed him.
00:16:49.040 So you go and you make yourself a ham sandwich,
00:16:50.860 and you give it to the hobo, and everybody wins.
00:16:52.980 But suppose I came and said to you,
00:16:54.600 there's a hobo outside your door,
00:16:57.280 with a 12-inch penis.
00:16:59.380 Would you make him a sandwich?
00:17:01.040 I don't think so.
00:17:02.600 I don't think he gets the sandwich,
00:17:04.040 because he's a little too scary at that point.
00:17:06.200 That's all I'm saying.
00:17:07.220 Speaking of huge dicks,
00:17:09.420 Governor Newsom is looking to be in danger.
00:17:14.820 So if the recall gets a 50% plus one vote,
00:17:19.340 he's out,
00:17:20.160 and the person with the next most votes
00:17:22.480 of all the competitors is in.
00:17:24.560 At the moment, that would be Larry Elder.
00:17:26.180 I was looking at Larry Elder's motto,
00:17:30.820 and apparently it's the one he's settled on,
00:17:33.160 which is,
00:17:34.020 because we have a state to save.
00:17:37.240 And I'm looking at his profile on Twitter.
00:17:40.900 Because we have a state to save.
00:17:43.380 What do you think?
00:17:45.820 How do you like that as a political motto,
00:17:48.760 for California specifically?
00:17:50.520 Because we have a state to save.
00:17:52.600 Like it or not?
00:17:53.700 What do you say?
00:17:58.200 I like it.
00:18:00.000 It's short.
00:18:01.860 It's right to the point.
00:18:03.760 And it doesn't focus on a particular topic.
00:18:07.780 So here's a hypnosis persuasion trick.
00:18:10.740 If you said,
00:18:11.860 I'm running for governor
00:18:12.640 because we have these three problems,
00:18:15.980 what do people say?
00:18:18.160 Well, a lot of people say,
00:18:19.340 you know, those are not my top three problems.
00:18:21.220 So I guess I don't need to vote for this guy.
00:18:24.540 But if you leave it open,
00:18:26.300 and you say,
00:18:26.980 we have a state to save,
00:18:28.700 everybody fills in the blanks
00:18:31.200 with what scares them the most.
00:18:34.540 And what scares me the most
00:18:36.100 might be running out of water.
00:18:38.660 What scares someone else the most
00:18:40.080 might be the forest fires
00:18:41.240 or the homeless people
00:18:42.160 or drugs.
00:18:43.420 So if you can make people
00:18:47.200 fill in their own fear,
00:18:49.800 you have frickin' nailed it.
00:18:52.160 So I have to say
00:18:53.180 that my first impression,
00:18:54.760 the first time I saw this,
00:18:56.480 because we have a state to save,
00:18:58.480 it felt a little flat.
00:19:00.740 It didn't have a little punch.
00:19:02.440 But the more I look at this,
00:19:04.680 it's really good.
00:19:06.800 Like really good.
00:19:07.700 I'm wondering if he had some advice
00:19:09.300 or maybe EAB tested it
00:19:11.000 or whatever he did.
00:19:11.680 But, so that's just one thing
00:19:13.900 it does right.
00:19:14.900 It makes you fill in the blanks
00:19:16.260 of yourself of the state to save.
00:19:18.640 Secondly,
00:19:19.600 it puts it in
00:19:20.600 fear,
00:19:22.500 persuasion territory.
00:19:24.160 Because he's saying
00:19:24.780 the state has to be saved.
00:19:26.980 So it's scary, right?
00:19:28.460 It's not just he'll do better,
00:19:30.280 he's bringing us hope and goodness,
00:19:32.480 you know,
00:19:32.660 it's not about the economy,
00:19:34.060 stupid.
00:19:34.700 He's saving the state.
00:19:36.660 So it's a fear,
00:19:38.280 persuasion play
00:19:39.200 and appropriate to the situation.
00:19:41.060 There is a genuine fear
00:19:42.860 that matches
00:19:43.440 the public.
00:19:45.680 So he's got the fear in there.
00:19:47.740 He keeps it simple.
00:19:49.280 He matches the mood
00:19:50.620 of the public,
00:19:51.780 which is we're feeling the same.
00:19:53.920 And,
00:19:54.400 and he leaves it ambiguous.
00:19:57.280 So you fill in your best fear
00:19:58.900 in there.
00:19:59.860 But,
00:20:01.040 did you notice the first word?
00:20:04.040 Because.
00:20:04.600 Because we have a state to save.
00:20:08.720 The first word is because.
00:20:11.560 Where have you seen that before?
00:20:14.460 I call it the fake because.
00:20:17.500 Because.
00:20:18.980 Studies show that if you start
00:20:20.860 something with the word because,
00:20:23.400 people think there's a reason there.
00:20:25.340 And they respond to it.
00:20:26.420 He actually gave them a reason
00:20:29.380 to vote for the recall
00:20:31.480 and allow him in.
00:20:33.560 And,
00:20:34.040 and you also fill in the reason.
00:20:38.560 The because.
00:20:40.240 Now,
00:20:40.720 we have a state to save.
00:20:42.060 Tries to hint you
00:20:43.420 at the reason.
00:20:44.560 But you're filling in
00:20:45.380 your own reason.
00:20:46.880 So you are actually
00:20:48.040 talking yourself into it
00:20:49.680 through this slogan.
00:20:52.360 That's good.
00:20:53.880 That's like really good.
00:20:55.180 It's something that allows you
00:20:57.080 to fill in your own best
00:20:58.880 slogan into his slogan.
00:21:01.720 I don't know if I've ever
00:21:02.540 seen this before.
00:21:03.880 Well,
00:21:04.200 you know,
00:21:04.400 I guess you could argue that
00:21:05.500 Make America Great Again
00:21:07.080 had that quality.
00:21:08.140 Because you did fill in,
00:21:09.940 you filled in the blanks yourself.
00:21:12.120 So,
00:21:12.300 but it wasn't,
00:21:13.120 it wasn't fear stuff.
00:21:16.200 You know,
00:21:16.480 it was more hopeful.
00:21:18.260 All right.
00:21:20.840 So the best,
00:21:21.700 I guess the best attack
00:21:23.440 against Larry Elder,
00:21:24.300 because of course
00:21:25.400 the Democrats
00:21:25.900 have to come after him.
00:21:27.700 They've got some allegations
00:21:29.320 from an ex
00:21:30.380 about brandishing a gun.
00:21:34.020 Now,
00:21:34.580 apparently there's no,
00:21:35.800 no claim
00:21:37.260 that he threatened anybody.
00:21:39.160 There's no claim
00:21:40.000 he threatened anybody.
00:21:41.040 And there's no claim
00:21:42.400 that he aimed his gun
00:21:44.060 at anybody.
00:21:45.580 He brandished it.
00:21:46.800 So I guess there was
00:21:48.360 some point where he was,
00:21:49.720 I don't know,
00:21:50.120 checking the ammunition
00:21:51.000 of his gun or something
00:21:51.940 while having a conversation.
00:21:53.820 And I don't know
00:21:54.540 if this stuff got conflated,
00:21:56.380 but I don't think it matters.
00:21:57.900 And here's why.
00:21:59.500 Nobody believes an ex.
00:22:02.280 Am I right?
00:22:03.900 Nobody believes anybody's ex.
00:22:06.320 If you heard somebody's ex
00:22:08.400 say that somebody
00:22:09.120 brandished a gun,
00:22:10.360 but no,
00:22:11.800 they didn't threaten me
00:22:12.520 with it.
00:22:13.780 And no,
00:22:14.380 he didn't point it at me.
00:22:16.240 No,
00:22:16.500 he didn't point it at me.
00:22:18.180 And he didn't threaten me
00:22:19.280 with it.
00:22:20.140 He just brandished it.
00:22:22.380 What's your first thought?
00:22:25.040 Bullshit, right?
00:22:26.880 It's in the category
00:22:28.040 of things
00:22:28.620 that we sort of discount
00:22:30.120 a little bit
00:22:31.400 because an ex
00:22:32.220 is always saying shit
00:22:33.160 about, you know,
00:22:34.520 right?
00:22:34.820 What do you say
00:22:35.400 about your ex?
00:22:37.240 Give me two sentences
00:22:38.760 about whatever ex
00:22:39.920 you exed last.
00:22:41.940 Not so good, right?
00:22:43.400 So nobody believes the ex.
00:22:44.640 If that's the best they have,
00:22:46.300 I think he's in good shape.
00:22:48.940 Here's a question
00:22:49.700 I asked on Twitter.
00:22:51.240 When did we stop crowing
00:22:52.440 about electing
00:22:53.080 the first black
00:22:54.120 fill in the blank?
00:22:56.560 The first black president.
00:22:57.960 The first black
00:22:58.660 governor of California.
00:23:00.620 The first black
00:23:01.420 Republican governor
00:23:03.640 of California.
00:23:05.520 Don't you think
00:23:06.400 that if Larry Elder
00:23:08.440 were a Democrat,
00:23:10.480 all of the news coverage
00:23:11.960 on CNN would be about,
00:23:13.200 oh, we might get
00:23:14.200 the first black Democrat
00:23:16.900 governor of California.
00:23:20.140 Don't you think
00:23:20.680 they would just talk
00:23:21.380 about that nonstop?
00:23:24.160 Yeah, they would.
00:23:25.920 But here you have
00:23:27.180 something more historic.
00:23:29.780 What would be more historic
00:23:31.240 than a black Republican
00:23:34.480 winning in California?
00:23:37.740 Because you could say
00:23:38.760 to yourself,
00:23:39.340 oh, sure,
00:23:40.120 you know,
00:23:40.400 Tim Scott wins
00:23:41.420 for the Senate,
00:23:42.960 but, you know,
00:23:43.520 I'm assuming he has
00:23:44.780 a large black base.
00:23:47.080 Or you could say,
00:23:48.460 sure, Obama won
00:23:50.980 the presidency,
00:23:51.820 but he had to get,
00:23:52.560 you know,
00:23:53.240 90 plus percent
00:23:54.500 of the black vote
00:23:55.260 to do it.
00:23:56.480 You know,
00:23:56.860 kind of special cases.
00:23:59.340 But what the hell
00:24:00.340 is the special case
00:24:01.500 of a black Republican
00:24:03.020 in California?
00:24:04.780 That's the opposite
00:24:05.780 of a special case.
00:24:07.500 This is like
00:24:08.140 the first case
00:24:09.040 where race
00:24:10.900 didn't matter.
00:24:13.000 Right?
00:24:13.480 Has that ever
00:24:15.140 happened before?
00:24:16.260 Have we ever been
00:24:16.980 in a situation
00:24:17.640 where race
00:24:19.080 was erased
00:24:20.020 from the conversation?
00:24:22.180 Larry Elder
00:24:22.720 just did that.
00:24:24.140 This is like
00:24:24.620 one of the greatest
00:24:25.280 accomplishments
00:24:25.920 of any politician
00:24:27.520 in the history
00:24:28.120 of the universe.
00:24:30.320 Think about this.
00:24:32.120 Larry Elder
00:24:32.760 took race
00:24:34.520 out of the conversation.
00:24:36.200 Why?
00:24:38.560 Because he didn't
00:24:39.240 put it in the conversation.
00:24:40.820 He never put it
00:24:41.460 in there.
00:24:43.040 So,
00:24:44.020 I would say
00:24:45.160 that Larry Elder
00:24:46.560 has accomplished
00:24:47.140 something more special
00:24:48.360 than any politician
00:24:50.600 has ever accomplished.
00:24:52.160 He took race
00:24:52.760 out of the conversation.
00:24:54.320 I mean,
00:24:54.440 that's crazy.
00:24:55.800 Who else
00:24:56.200 has ever done that?
00:24:57.600 And how did he do it?
00:24:59.300 He didn't do it
00:24:59.960 by what he said.
00:25:01.440 He did it
00:25:01.840 by who he is.
00:25:03.400 He did it
00:25:04.180 by living
00:25:05.980 the life
00:25:07.020 he lives.
00:25:08.200 By being the person
00:25:09.280 he is,
00:25:10.880 he took it
00:25:11.400 right off the table.
00:25:12.800 I mean,
00:25:13.000 think about that.
00:25:14.360 Who has ever
00:25:15.040 done that before
00:25:15.780 in the history
00:25:16.360 of politics?
00:25:17.020 Nobody.
00:25:17.780 Nobody's ever
00:25:18.540 done that.
00:25:19.460 And we're going
00:25:19.940 to ignore it.
00:25:22.400 Joe Lonsdale
00:25:23.380 asked me on Twitter,
00:25:24.380 he said,
00:25:24.680 curious why
00:25:25.380 more people like you
00:25:26.580 aren't talking
00:25:27.160 about the elephant
00:25:27.940 in the room.
00:25:29.340 That they've changed
00:25:30.160 the rules in California
00:25:31.060 to send ballots
00:25:32.200 ahead of time
00:25:32.880 to everyone
00:25:33.320 and allow unions
00:25:34.480 to go door-to-door
00:25:35.520 to help millions
00:25:37.480 of people with them,
00:25:38.880 which makes the recall
00:25:40.380 very unlikely to pass.
00:25:42.560 What do you think?
00:25:43.760 Do you think
00:25:44.440 that the rules changes
00:25:45.840 are what elect
00:25:47.500 our presidents?
00:25:49.480 I think yes.
00:25:51.100 We don't really live
00:25:51.960 in a system
00:25:52.520 where votes
00:25:53.840 determine presidents
00:25:55.040 anymore.
00:25:56.080 We live in a system
00:25:57.120 in which the rule changes
00:25:58.820 determine the president.
00:26:00.220 So if that happens
00:26:02.260 in this case,
00:26:03.860 it will be a giant recall
00:26:05.780 in which the votes
00:26:08.420 didn't matter.
00:26:09.740 We would have spent,
00:26:10.940 I don't know what,
00:26:11.420 two years
00:26:12.100 on a recall
00:26:13.540 to find out
00:26:15.080 that the fucking votes
00:26:17.940 didn't matter.
00:26:20.160 And that's sort of
00:26:21.380 where we might be.
00:26:22.880 Now,
00:26:23.220 I happen to think
00:26:24.140 that there is so much
00:26:25.780 dislike of Newsom
00:26:27.400 that the polls
00:26:28.620 are misleading.
00:26:30.260 I believe he's going to,
00:26:31.680 I believe Elder's
00:26:32.520 going to win.
00:26:33.840 And I don't think
00:26:34.980 the rules changes,
00:26:36.020 while significant,
00:26:37.740 I don't even think
00:26:38.540 the rule changes
00:26:39.240 can get there.
00:26:40.220 Maybe.
00:26:41.040 I mean,
00:26:41.240 it might,
00:26:41.920 I mean,
00:26:42.140 Newsom's saying
00:26:42.720 it's a turnout election
00:26:43.840 and he could be
00:26:45.960 totally right about that.
00:26:47.120 So we'll see.
00:26:50.700 So it looks like
00:26:51.580 COVID is permanent now.
00:26:53.500 What,
00:26:53.840 what would you feel
00:26:55.540 about China
00:26:56.180 if you knew
00:26:57.780 that the pandemic
00:26:58.960 they released
00:26:59.720 or we believe
00:27:00.400 they released,
00:27:01.840 what if it's permanent?
00:27:05.540 How does that change
00:27:06.700 what you think
00:27:07.380 about what needs
00:27:08.340 to be done?
00:27:09.740 If it's permanent,
00:27:11.840 that feels different,
00:27:13.220 doesn't it?
00:27:14.120 It's one thing
00:27:14.760 that they,
00:27:15.440 you know,
00:27:15.840 killed 600,000 people
00:27:17.160 in America
00:27:17.620 if it passes,
00:27:19.840 right,
00:27:20.160 and it gets better.
00:27:21.280 But what if it doesn't?
00:27:23.640 I don't know.
00:27:24.340 There may be some
00:27:25.820 big reckoning
00:27:26.800 coming here
00:27:27.300 with China.
00:27:28.900 CNN is still using
00:27:30.100 what I call
00:27:30.620 the anecdotal
00:27:31.420 brainwashing technique
00:27:32.780 in which instead
00:27:33.980 of reporting statistics,
00:27:35.940 they report,
00:27:36.740 this one guy
00:27:37.340 had this problem.
00:27:39.060 And then they try
00:27:39.620 to persuade you
00:27:40.240 that the one guy
00:27:41.100 tells you something
00:27:42.860 about the big picture.
00:27:43.820 Of course it doesn't,
00:27:45.320 unless it's a coincidence.
00:27:47.100 But here are their
00:27:48.000 offerings just for today.
00:27:49.920 They do this
00:27:50.320 every single day.
00:27:51.760 Here's the headline.
00:27:52.400 Fitness trainer
00:27:53.280 declined vaccine.
00:27:54.920 Stunning photos
00:27:55.860 show what happened
00:27:56.680 to him.
00:27:58.280 Right?
00:27:58.840 It's going to be
00:27:59.420 a whole bunch of
00:28:00.300 guy didn't want
00:28:02.600 to take vaccine.
00:28:03.580 Well,
00:28:04.020 he's sorry now.
00:28:05.760 Isn't this guy sorry?
00:28:06.980 There'll be one
00:28:07.440 every day,
00:28:08.040 I'll bet you.
00:28:08.880 Here's another one.
00:28:11.600 Quote,
00:28:12.000 I had to turn away
00:28:12.800 a cancer patient
00:28:13.660 that needed
00:28:14.180 an emergency treatment.
00:28:15.540 Florida oncologist
00:28:16.500 Dr. Blubba Periani
00:28:18.340 told CNN.
00:28:20.200 Same trick,
00:28:20.900 right?
00:28:21.100 There's this one doctor
00:28:22.700 who had to turn away
00:28:24.380 one patient.
00:28:25.740 Now,
00:28:25.880 do you think
00:28:26.180 the patient went home
00:28:26.940 and died?
00:28:28.080 Because it was
00:28:28.720 an emergency treatment
00:28:29.780 for cancer.
00:28:31.200 So when he got
00:28:31.820 turned away
00:28:32.280 from the hospital,
00:28:33.020 did he just
00:28:33.500 go home and die?
00:28:35.420 No.
00:28:36.260 No,
00:28:36.540 they don't mention
00:28:37.180 what happened to him.
00:28:38.620 He probably drove
00:28:40.460 an extra five miles
00:28:41.880 to another hospital.
00:28:44.040 Got the life-saving,
00:28:45.380 you know,
00:28:45.620 probably.
00:28:46.620 I mean,
00:28:46.820 I don't know.
00:28:47.700 But I would say
00:28:48.380 this is a misleading
00:28:49.300 anecdote.
00:28:49.780 Edward Snowden
00:28:52.780 is writing about
00:28:53.460 Apple's new OS
00:28:54.560 that apparently
00:28:55.440 will spy on your
00:28:56.360 photos on your phone
00:28:57.400 and alert the police
00:28:58.380 if you've got a lot
00:28:59.060 of bad stuff there.
00:29:00.720 I'm thinking,
00:29:01.460 you know,
00:29:01.880 various illegal
00:29:03.280 porn,
00:29:04.480 et cetera.
00:29:05.380 What do you feel
00:29:05.940 about Apple
00:29:06.760 having an algorithm
00:29:07.660 that will alert
00:29:08.480 the police
00:29:09.180 based on the pictures
00:29:11.000 on your phone?
00:29:13.600 What do you think
00:29:14.360 about that?
00:29:16.180 That's pretty bad.
00:29:17.700 That's about as bad
00:29:18.460 as it gets.
00:29:21.140 And weirdly,
00:29:22.800 Apple is the
00:29:23.400 privacy company.
00:29:24.680 But they're also
00:29:25.380 the anti-porn company.
00:29:27.460 So they do have
00:29:28.480 two,
00:29:29.160 you know,
00:29:29.820 branding things
00:29:30.560 that are in conflict
00:29:31.200 here.
00:29:32.320 And I don't know,
00:29:33.940 I guess it depends
00:29:34.880 how,
00:29:35.840 what you feel
00:29:36.420 about,
00:29:36.920 you know,
00:29:39.840 underage photos
00:29:41.260 on people's phones.
00:29:42.700 Now,
00:29:43.200 if you think
00:29:43.660 that's the biggest
00:29:44.260 problem in the world,
00:29:45.220 then you'd be
00:29:45.780 in favor of this,
00:29:46.420 maybe.
00:29:47.140 If you think
00:29:47.640 privacy being eroded
00:29:49.020 is a bigger problem,
00:29:50.200 well,
00:29:50.400 you'd be on the
00:29:50.900 other side.
00:29:52.720 Obviously,
00:29:53.340 I'm not in favor
00:29:54.000 of those pictures
00:29:55.100 on anybody's phones.
00:29:56.900 But,
00:29:58.000 you have to look
00:29:58.540 at Apple's
00:29:59.340 privacy situation.
00:30:01.020 And it looks,
00:30:02.220 it does look like
00:30:02.900 an overreach.
00:30:04.100 In my opinion,
00:30:04.880 it's a rare,
00:30:06.820 gigantic mistake
00:30:07.780 by Apple.
00:30:09.100 So that's my opinion.
00:30:10.060 I would say
00:30:10.420 gigantic mistake.
00:30:11.420 I would also say
00:30:12.900 that I don't think
00:30:13.600 Steve Jobs
00:30:14.240 would have done it
00:30:15.040 because it feels
00:30:16.360 like a gigantic mistake.
00:30:17.860 And I think
00:30:18.220 he would have felt it
00:30:19.240 because he was
00:30:20.120 a little bit more
00:30:20.880 intuitive about,
00:30:22.600 you know,
00:30:23.240 the psychology of things.
00:30:26.300 Apparently,
00:30:26.920 employees are quitting
00:30:27.740 in record numbers
00:30:28.720 because they kind of
00:30:29.980 liked working remotely
00:30:31.140 and the thought
00:30:32.080 of going back to work
00:30:32.960 just seemed so horrible
00:30:33.960 that 68%
00:30:36.540 of job separations
00:30:37.540 were people quitting.
00:30:38.420 And one in three workers,
00:30:40.360 this is from Media Insider,
00:30:41.840 a tweet,
00:30:42.720 one in three workers
00:30:43.700 are considering
00:30:44.240 leaving their jobs
00:30:45.320 while 60% of them
00:30:47.400 are rethinking
00:30:48.040 their careers.
00:30:50.100 Talk about a reboot.
00:30:52.780 Basically,
00:30:53.460 I think,
00:30:53.800 you know,
00:30:54.220 this is something
00:30:54.760 I tell you a lot.
00:30:55.700 This is another
00:30:56.280 persuasion trick.
00:30:58.660 People can get used
00:30:59.660 to anything.
00:31:01.660 So you tell people,
00:31:02.460 hey,
00:31:03.040 your commute's
00:31:03.720 going to be half an hour.
00:31:04.700 And they're like,
00:31:04.980 ah, I hate that,
00:31:05.740 but it's half an hour.
00:31:06.760 And then traffic
00:31:07.920 gets worse.
00:31:08.540 And they're like,
00:31:08.860 hey,
00:31:09.020 your commute's
00:31:09.520 45 minutes.
00:31:10.380 And you're like,
00:31:10.680 well,
00:31:11.460 it was already
00:31:11.940 half an hour.
00:31:12.780 Now it's 45 minutes.
00:31:13.800 I guess I can handle that.
00:31:15.240 Next thing you know,
00:31:15.880 you change jobs.
00:31:17.140 But your commute's
00:31:17.940 going to be an hour
00:31:18.500 if you change jobs.
00:31:19.380 And you're like,
00:31:19.660 well,
00:31:19.860 it was already
00:31:20.340 45 minutes.
00:31:21.720 So what's an hour?
00:31:24.180 I think you can
00:31:24.960 just get used to things
00:31:26.040 because they gradually
00:31:26.860 happen.
00:31:27.780 And I think that people
00:31:28.760 realize they had been
00:31:29.560 living a nightmare
00:31:30.640 of driving two hours,
00:31:33.840 you know,
00:31:34.460 each day
00:31:34.960 to and from work.
00:31:36.220 It's just a nightmare.
00:31:36.860 And when they got
00:31:38.120 to work remotely,
00:31:39.000 they just said,
00:31:39.680 I'd rather starve to death
00:31:41.000 than do this again.
00:31:42.360 So I think that the
00:31:43.400 reboot of the pandemic
00:31:45.900 just changed how people
00:31:47.200 just see their whole lives.
00:31:48.580 And you see it coming
00:31:49.220 through the employment stuff.
00:31:51.080 Apparently Sidney Powell
00:31:55.860 and Lin Wood,
00:31:56.860 the lawyers behind
00:31:58.640 the alleged election fraud claims,
00:32:03.300 were sanctioned by a federal judge
00:32:07.180 for their Trump-aligned lawsuits
00:32:10.300 challenging the election.
00:32:12.020 And I guess the reasoning here
00:32:13.600 is that their lawsuits
00:32:14.980 were so frivolous and ridiculous
00:32:16.200 that they need to be sanctioned
00:32:18.480 for that.
00:32:19.480 Now,
00:32:20.000 if you are
00:32:21.140 checking people's
00:32:24.340 predictions,
00:32:25.440 what was my prediction
00:32:27.040 when I heard
00:32:28.120 Sidney Powell
00:32:29.760 and Lin Wood
00:32:30.280 say that there was,
00:32:31.860 that they had discovered
00:32:32.900 that the software
00:32:33.760 was,
00:32:34.540 I don't know,
00:32:36.240 somehow being manipulated
00:32:38.200 by,
00:32:40.000 it had something to do
00:32:40.700 with Venezuela
00:32:41.340 and all that.
00:32:42.760 What did I say
00:32:43.980 when you first heard that?
00:32:46.080 Now remember,
00:32:46.520 I was on Trump's side.
00:32:48.700 Right?
00:32:49.240 I was on Trump's side
00:32:50.360 and then I heard
00:32:52.460 these claims
00:32:53.080 and I said,
00:32:54.700 complete bullshit.
00:32:56.800 Hey, Kerry.
00:32:58.260 Thanks for the buck 49.
00:33:02.640 And,
00:33:03.340 I don't know,
00:33:05.500 this is what I predicted,
00:33:06.860 so check your predictions.
00:33:08.880 In the comments,
00:33:10.540 how many,
00:33:11.800 yeah,
00:33:12.440 we don't know yet,
00:33:13.400 right?
00:33:13.680 So who knows?
00:33:14.440 But only,
00:33:15.740 I'm only talking about
00:33:16.560 the Venezuela dictator stuff
00:33:19.060 that was,
00:33:20.420 in my opinion,
00:33:21.260 was obvious bullshit
00:33:22.120 on the surface.
00:33:23.000 You didn't need to wait
00:33:23.660 for that.
00:33:24.600 That was obvious.
00:33:25.540 Now whether or not
00:33:26.160 there's any shenanigans
00:33:27.200 in the election,
00:33:28.180 I think that's an open question
00:33:29.460 because we can't check everything.
00:33:31.420 But in terms of
00:33:32.120 the biggest
00:33:33.560 Kraken-style
00:33:34.640 things,
00:33:35.860 I called bullshit
00:33:36.520 on day one
00:33:37.200 and it looks like
00:33:37.920 that will be right.
00:33:39.520 How many of you
00:33:40.420 were right
00:33:42.020 or wrong about that?
00:33:43.000 If we assume
00:33:44.080 that there's nothing
00:33:44.760 to the Venezuela part,
00:33:46.440 I think that feels safe
00:33:47.640 at this point.
00:33:52.420 No,
00:33:53.020 I'm not saying
00:33:53.540 it's BS
00:33:54.300 because a judge
00:33:55.180 said so.
00:33:56.300 I'm saying
00:33:57.120 that it's BS
00:33:58.100 because by now
00:33:59.040 we would know
00:33:59.640 if there was anything
00:34:00.460 to it.
00:34:01.640 And we don't.
00:34:03.040 My guess is
00:34:03.940 that they were fed
00:34:04.680 some misinformation
00:34:05.520 by somebody
00:34:06.960 who wanted to
00:34:07.540 discredit them
00:34:08.340 exactly the way
00:34:09.120 it happened.
00:34:09.540 I feel like
00:34:11.200 that was an intel agency
00:34:12.120 or somebody
00:34:13.520 aligned with them.
00:34:14.720 Something like that.
00:34:15.420 There were some
00:34:15.740 dirty tricks
00:34:16.360 involved in
00:34:17.620 duping these two
00:34:18.420 into thinking
00:34:19.020 that these
00:34:19.500 ridiculous claims
00:34:21.040 had some merit.
00:34:23.280 So,
00:34:23.960 anyway,
00:34:26.200 check your
00:34:26.660 predictions.
00:34:27.940 Were you right
00:34:28.360 on that one
00:34:28.740 or not?
00:34:30.180 All right.
00:34:30.820 Let me make sure
00:34:31.500 that I've
00:34:31.980 addressed all
00:34:33.880 of the fascinating
00:34:34.720 things in today's
00:34:36.040 news
00:34:36.380 and did not
00:34:37.560 talk too much
00:34:38.380 about those
00:34:39.620 topics you hate.
00:34:41.820 So,
00:34:42.580 now that
00:34:44.280 I've gone a few
00:34:45.600 days without
00:34:46.160 talking about
00:34:46.900 the forbidden
00:34:47.540 topics,
00:34:48.240 the ones which
00:34:48.800 make you turn
00:34:49.680 off this live
00:34:50.300 stream,
00:34:50.780 you know what
00:34:51.260 they are.
00:34:52.600 Better or worse?
00:34:54.580 Give me some
00:34:55.240 feedback.
00:34:56.380 All right?
00:34:56.640 So,
00:34:56.840 I didn't do any
00:34:57.520 deep vaccination
00:34:59.320 stuff.
00:35:00.080 I mean,
00:35:00.240 you have to
00:35:00.500 mention it
00:35:00.880 because it's
00:35:01.200 in the news.
00:35:01.980 But I didn't
00:35:02.380 do any deep
00:35:02.860 vaccination thoughts
00:35:03.780 or any deep
00:35:04.320 mask stuff.
00:35:08.920 So he says
00:35:09.540 worse.
00:35:10.040 Well done.
00:35:11.220 On the
00:35:11.740 Locals platform,
00:35:12.800 it's universally
00:35:13.520 they like it
00:35:14.140 better.
00:35:15.400 So,
00:35:15.700 it looks like
00:35:16.160 100% there.
00:35:18.000 On YouTube,
00:35:20.420 a little more
00:35:22.000 muted,
00:35:22.580 but also it
00:35:23.460 seems like
00:35:23.920 people are on
00:35:24.940 the same side.
00:35:26.700 You know,
00:35:27.020 here's the thing
00:35:27.720 I want to ask
00:35:28.280 you,
00:35:28.460 though.
00:35:29.380 I want to ask
00:35:30.000 you this.
00:35:32.780 You love
00:35:33.420 the angry
00:35:33.920 ignorant.
00:35:36.380 That is,
00:35:37.800 somebody said
00:35:40.300 talk about
00:35:40.780 vaccines more
00:35:41.460 because I
00:35:41.920 love the
00:35:42.320 angry
00:35:42.800 ignorant.
00:35:44.960 Now,
00:35:45.380 I'm not sure
00:35:45.780 which side
00:35:46.200 you're calling
00:35:46.560 the ignorant,
00:35:47.160 so that's
00:35:47.480 the fun part.
00:35:48.700 But that
00:35:49.240 is the fun
00:35:49.780 of it.
00:35:50.480 The fun
00:35:50.880 of it
00:35:51.220 had nothing
00:35:51.960 to do with
00:35:52.400 whether you
00:35:52.740 should do
00:35:53.000 those things.
00:35:53.440 It was only
00:35:53.880 exposing people's
00:35:55.680 thought process.
00:36:02.040 We need
00:36:02.660 the age of
00:36:03.240 those dying
00:36:03.740 from COVID.
00:36:04.640 That is
00:36:05.060 right.
00:36:06.100 You know,
00:36:06.400 I also
00:36:06.780 wonder how
00:36:07.240 many people
00:36:07.660 are left.
00:36:09.980 You know,
00:36:10.180 if the only
00:36:10.980 people who
00:36:11.480 die of
00:36:12.080 COVID are
00:36:12.620 the 0.001%
00:36:14.260 or whatever,
00:36:15.300 don't you
00:36:15.780 get all of
00:36:16.400 them pretty
00:36:16.780 quickly?
00:36:18.320 Or don't
00:36:18.960 you get half
00:36:19.480 of them
00:36:19.700 pretty quickly?
00:36:20.540 I feel as
00:36:21.120 if we would
00:36:21.560 just burn
00:36:22.020 through the
00:36:22.420 people who
00:36:22.900 could be
00:36:23.500 killed by
00:36:24.480 it,
00:36:24.740 and you
00:36:25.040 would just
00:36:25.240 run out
00:36:25.660 of victims
00:36:26.080 after a
00:36:26.540 while.
00:36:30.280 Talk about
00:36:30.940 how India
00:36:31.480 beat the
00:36:31.980 Delta with
00:36:32.560 only 4%
00:36:33.360 vaccine rate.
00:36:34.040 I don't
00:36:34.580 think that
00:36:34.980 happened.
00:36:37.720 Most of
00:36:38.460 what you
00:36:39.000 see on
00:36:41.300 the Internet
00:36:41.700 about
00:36:42.420 countries
00:36:43.300 is wrong.
00:36:45.680 I'll say
00:36:46.320 more than
00:36:47.560 60%.
00:36:48.380 I would say
00:36:49.100 anything you
00:36:49.720 see on
00:36:50.040 the Internet
00:36:50.500 that says
00:36:52.040 country X
00:36:53.340 had this
00:36:54.500 experience,
00:36:56.080 automatically
00:36:57.140 wrong.
00:36:59.040 Let me
00:36:59.600 give you
00:36:59.840 some examples
00:37:00.420 of things
00:37:00.880 which are in
00:37:01.340 the category
00:37:02.040 of almost
00:37:02.760 always wrong.
00:37:03.500 You ready?
00:37:04.800 You're not
00:37:05.320 going to
00:37:05.500 like this.
00:37:06.860 I'll give
00:37:07.340 you an
00:37:07.540 example and
00:37:08.060 you tell
00:37:08.320 me if you
00:37:08.740 think that
00:37:09.220 they're usually
00:37:09.820 right or
00:37:10.520 usually wrong.
00:37:12.660 There's a
00:37:13.320 rogue doctor
00:37:14.260 who made
00:37:15.160 a video
00:37:15.660 saying that
00:37:16.800 all the
00:37:17.180 experts got
00:37:17.940 it wrong
00:37:18.400 and what
00:37:19.000 they're
00:37:19.200 recommending
00:37:19.680 is going
00:37:20.080 to make
00:37:20.320 it worse
00:37:20.740 for you.
00:37:21.680 That's all
00:37:22.100 you know.
00:37:23.160 I'm not
00:37:23.620 referring to
00:37:24.480 any specific
00:37:25.120 doctor.
00:37:26.080 I'm saying
00:37:26.500 there's just
00:37:26.920 a pandemic.
00:37:28.240 Just imagine
00:37:28.940 this is just
00:37:30.080 imaginary.
00:37:31.160 There's a
00:37:31.480 rogue doctor
00:37:32.300 and he's
00:37:33.360 telling you
00:37:33.820 that everybody
00:37:34.320 got it
00:37:34.700 wrong.
00:37:35.860 What are
00:37:36.180 the odds
00:37:36.840 that the
00:37:37.320 rogue doctor
00:37:38.000 is right
00:37:38.520 and everybody
00:37:39.180 got it
00:37:39.480 wrong?
00:37:39.660 Give me
00:37:39.840 the odds.
00:37:41.180 I want
00:37:41.540 odds.
00:37:41.980 I don't
00:37:42.140 want words.
00:37:43.340 Put a
00:37:43.940 number on
00:37:44.360 it.
00:37:45.480 I'm going
00:37:45.860 to read
00:37:46.040 some of
00:37:46.260 your numbers.
00:37:46.720 90%, 5%, 20%, 80%.
00:37:50.140 Look at that difference.
00:37:51.840 95%, 5%, 10%.
00:37:54.620 5% to 0%. 0%, 5%.
00:37:58.200 I'll give
00:37:59.440 you my
00:37:59.760 estimate.
00:38:00.440 95%
00:38:01.280 chance
00:38:01.880 it is
00:38:02.320 not
00:38:02.700 right.
00:38:04.140 The lone doctor,
00:38:05.320 or the three doctors,
00:38:08.760 any viral clip
00:38:10.140 of the doctors
00:38:12.080 disagreeing with the majority,
00:38:14.220 in my opinion,
00:38:16.160 based on life,
00:38:16.780 95% chance is bullshit.
00:38:20.440 5% chance
00:38:21.860 you really have something,
00:38:24.460 and if it's that 5%,
00:38:25.780 it's world-changing.
00:38:27.360 I mean, it's a big,
00:38:28.120 big, big, big deal
00:38:29.520 if they're right.
00:38:32.100 But 95%, they're not.
00:38:36.700 Likewise, let's say you see
00:38:38.680 a story about Sweden
00:38:40.200 having a certain outcome.
00:38:43.500 How much should you trust
00:38:45.560 the internet report of Sweden,
00:38:47.620 or any country? It doesn't matter
00:38:49.080 what country it is:
00:38:49.680 Israel, India,
00:38:51.300 whatever country you think
00:38:52.560 you can't explain.
00:38:54.380 What are
00:38:54.700 the odds
00:38:55.180 that that
00:38:56.480 story
00:38:56.920 or the
00:38:57.440 data
00:38:57.740 about
00:38:58.040 that
00:38:58.260 one
00:38:58.540 country
00:38:58.940 is
00:38:59.840 accurate
00:39:00.380 and
00:39:00.660 telling
00:39:00.860 you
00:39:01.040 something
00:39:01.280 useful?
00:39:03.320 No more
00:39:04.040 than 5%.
00:39:04.820 Everything
00:39:06.820 about
00:39:07.260 individual
00:39:07.840 countries
00:39:08.300 is bullshit.
00:39:09.840 Just all
00:39:10.300 of it.
00:39:12.200 So those are categories of things
00:39:14.240 where you should automatically
00:39:15.300 just say, well, it could be,
00:39:17.760 and maybe we should pay attention,
00:39:19.340 and if the argument is good,
00:39:20.620 maybe you follow it up
00:39:21.600 a little bit.
00:39:21.940 But Dr. Chang says,
00:39:28.320 "We listen to Sweden press conference."
00:39:32.820 It's true, it's true.
00:39:35.620 Because you listen to a press conference,
00:39:38.040 are you a Chinese spy?
00:39:40.600 Are you?
00:39:42.080 Because you look like
00:39:43.000 a Chinese spy to me.
00:39:43.720 Now, here's an interesting question,
00:39:47.440 at least on the YouTube feed:
00:39:50.000 how many Chinese spies,
00:39:52.240 you know, trolls,
00:39:54.620 are following this right now?
00:39:56.760 I mean, I have to think
00:39:58.800 I'm in the top 10
00:39:59.520 American anti-China persuaders.
00:40:03.720 So if they're not following me,
00:40:05.840 I don't know who they would follow.
00:40:07.620 I mean, you should be following me,
00:40:09.400 so I should have some
00:40:10.920 Chinese trolls on there,
00:40:14.080 if everything I know
00:40:14.960 about everything is right.
00:40:20.640 All right, that is all
00:40:22.400 we've got for today,
00:40:23.360 and I'm going to go
00:40:24.540 do some other things,
00:40:27.120 and I'll talk to you tomorrow.
00:40:29.800 Yes, I hear some voices.