Real Coffee with Scott Adams - July 11, 2021


Episode 1433 Scott Adams: How to Make Bad Ideas Go Away, and Coffee Too


Episode Stats

Length: 46 minutes
Words per Minute: 150.4
Word Count: 6,946
Sentence Count: 436
Misogynist Sentences: 10
Hate Speech Sentences: 10


Summary

On today's show, Scott Adams talks about CNN's coverage of Trump's Charlottesville comments, rising coronavirus cases in the UK despite widespread vaccination, a seven-point plan being handed out at CPAC for reinstating Trump, a Kansas study on mask effectiveness, the unreliability of online personality tests, and what it would take to change minds about climate change.


Transcript

00:00:00.980 Hello, everybody. Hello to YouTube. I've already been talking to the local subscribers for a few minutes.
00:00:09.800 And I'll bet you're wondering, I know what you're wondering, you're thinking,
00:00:14.000 will today be the best coffee with Scott Adams of all time?
00:00:18.880 Yeah, yeah, it will. I say it every time, and it's true every single time.
00:00:23.480 And all you need is a cup or mug or glass, a tank or chalice or stein, a canteen jug or flask, a vessel of any kind.
00:00:31.140 Fill it with your favorite liquid. I like coffee.
00:00:35.600 And join me now for the unparalleled pleasure of the dopamine of the day, the thing that makes everything better.
00:00:41.960 It's called the Simultaneous Sip, famous all over the world, and it's going to happen now. Go.
00:00:53.480 Oh, I'm seeing somebody say that CNN did report that on the Charlottesville situation, Trump said both sides were good.
00:01:05.300 Now, saying that there were fine people on both sides is a little bit different than saying he was praising the neo-Nazis.
00:01:14.400 So they may have stepped away from it a little bit, because one would be unambiguous fake news, but the other is true-ish.
00:01:24.200 He did actually say the words that there were good people on both sides.
00:01:27.240 As long as they don't explain what he meant by both sides, well, they can be weaselishly true, but also fake news at the same time,
00:01:36.560 which is, as you know, their sweet spot.
00:01:39.440 Well, here's an interesting factoid.
00:01:47.460 You know, every time we think we know something about statistical truth, especially during the pandemic,
00:01:54.620 what happens when reality gives us a fact check?
00:01:59.500 Every time we think we understand what the statistics are telling us, reality is not serving it up.
00:02:05.920 Here's an example, and I don't know what's wrong.
00:02:09.380 Is it the way we measure stuff, or are we just dumb?
00:02:12.400 But Anatoly Lubarsky had a good tweet.
00:02:17.440 He noted that the UK cases of coronavirus are rising, and they're rising faster this summer than they did even last summer.
00:02:27.640 Now, that's in a condition in which fully 50% of the UK is fully vaccinated.
00:02:37.400 Now, does this make sense to you?
00:02:40.400 Could it be true that the spread is worse this year, this summer, than it was last summer during the height of the pandemic?
00:02:51.200 And, Kevin, you're going to go away.
00:02:59.540 Put users in timeout.
00:03:04.500 All right.
00:03:06.140 So all the trolls with their bad audio stuff, they're going to go away.
00:03:10.980 So do you think it makes sense that you could have higher infection rates this year when half of the country is vaccinated?
00:03:17.680 Can that happen?
00:03:18.640 How is it possible that you've reduced the number of people who could get it by something like 50% and the rates are rising faster?
00:03:31.020 Well, first of all, yes.
00:03:32.980 I think it is possible.
00:03:35.160 So the first thing is, yeah.
00:03:36.960 But your common sense has trouble processing that, right?
00:03:42.500 So this is just one of those cases where the data and what your brain thinks is likely just don't match.
00:03:51.080 But I don't know.
00:03:52.660 I don't know that there's really any problem here.
00:03:55.000 It could be the way it's measured.
00:03:57.720 It could be that it's moving fast with young people, but it doesn't matter because they're not dying.
00:04:03.180 So these things are too complicated to know if your common sense is really telling you anything useful.
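The arithmetic behind this puzzle can be sketched with a toy back-of-the-envelope model (every number here is hypothetical, not real epidemiology): a variant that is transmissible enough can spread faster than last year's strain even after half the population is vaccinated, because the effective reproduction number depends on both transmissibility and the remaining susceptible pool.

```python
# Toy model (hypothetical numbers): effective reproduction number
# R_eff = R0 * (fraction of the population still susceptible).

def r_eff(r0, vaccinated, vaccine_efficacy):
    """R_eff, assuming vaccination removes vaccinated * efficacy
    of the population from the susceptible pool (toy model)."""
    susceptible = 1.0 - vaccinated * vaccine_efficacy
    return r0 * susceptible

# Last summer: original strain, essentially nobody vaccinated.
last_summer = r_eff(r0=3.0, vaccinated=0.0, vaccine_efficacy=0.9)

# This summer: a variant twice as transmissible, 50% fully vaccinated.
this_summer = r_eff(r0=6.0, vaccinated=0.5, vaccine_efficacy=0.9)

print(last_summer)   # 3.0
print(this_summer)   # ~3.3, faster spread despite vaccination
```

So the clash with common sense dissolves once a more transmissible variant enters the picture; the numbers above are invented purely to show the mechanism.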
00:04:10.300 Here is a fun, fun thing.
00:04:13.020 So you know CPAC is happening, right?
00:04:15.200 So CPAC, all the Republicans are getting together, going to do their speeches.
00:04:19.760 President Trump is going to do the closing speech, I guess.
00:04:22.460 And Don Jr. has already promised us it will be provocative.
00:04:26.720 People will be upset about it.
00:04:30.760 Now, we don't even know what it's going to be about yet.
00:04:32.920 Well, we kind of do, don't we?
00:04:34.560 I mean, you could pretty much guess what kind of themes he's going to hit.
00:04:39.400 But he's going to be shaking things up.
00:04:42.040 And finally, we'll get some fun news.
00:04:44.860 I'm so tired of the boring news.
00:04:46.980 We need Trump to give us some fun news.
00:04:49.460 But here's the most fun part of the CPAC.
00:04:52.180 There's a group not associated with CPAC, but they're attending, I guess.
00:04:57.420 Patriots Soar, S-O-A-R.
00:05:00.140 And they're handing out a seven-point plan.
00:05:02.680 I guess they've got some little handout.
00:05:05.320 Seven-point plan for reinstating Trump to the presidency.
00:05:09.480 Not getting him re-elected, but reinstating him into the presidency.
00:05:15.020 And here's how the plan goes.
00:05:16.600 I think this assumes that in 2022, the Republicans have a good midterm election.
00:05:26.240 Once the Republicans take control of Congress,
00:05:30.880 apparently it's legal to pick a Speaker of the House
00:05:34.660 who is not even an elected official.
00:05:38.340 So apparently, the Republicans could pick Trump as the Speaker of the House
00:05:43.760 without him being in office.
00:05:46.600 I need a fact check on that, but I think that's true, right?
00:05:49.940 Now, if that's true,
00:05:52.520 then Trump, assuming you had a Republican majority only,
00:05:57.260 could start organizing impeachments.
00:06:02.660 And he could impeach Biden,
00:06:05.600 and then he could impeach Harris.
00:06:07.840 And the only thing he would need would be a Republican majority.
00:06:11.460 Because it wouldn't matter what the case was, right?
00:06:14.220 Impeachment is a political process.
00:06:18.640 If you have good evidence for it,
00:06:20.660 well, maybe there's more chance somebody will get impeached.
00:06:23.940 But you don't really need any.
00:06:26.040 You actually don't need evidence.
00:06:29.080 Tell me if I'm wrong.
00:06:30.820 Give me a fact check on that.
00:06:32.680 If the Republicans just had enough of a majority,
00:06:36.320 and they simply wanted to impeach somebody,
00:06:39.820 they could just do it, right?
00:06:41.200 I mean, they would have to make up some, you know, BS reason.
00:06:44.660 But that would be easy.
00:06:45.920 In a political process, you just say,
00:06:48.220 oh, it's obvious that Biden has dementia.
00:06:51.060 Right?
00:06:51.500 How hard would it be for Republicans to say,
00:06:56.840 look, the Biden administration is not dealing with the fact
00:07:00.360 that their leader has dementia.
00:07:02.240 The vice president should be stepping in,
00:07:04.600 and she's not, Harris.
00:07:06.580 So you have to impeach them both.
00:07:09.820 What's wrong with that argument?
00:07:12.340 No, seriously, what's wrong with that argument?
00:07:15.900 Right?
00:07:16.540 Now, I don't know.
00:07:17.320 So impeachment isn't just for incompetence, right?
00:07:22.040 You need a little bit more than that.
00:07:24.300 But I feel like you could just stretch any argument
00:07:27.120 as far as you want, as long as you have the majority.
00:07:29.500 You just have to get the votes.
00:07:31.140 And nobody's really responsible for what they vote for.
00:07:34.580 So all you have to do is convince the Republicans that,
00:07:38.500 and I'll just use that argument again,
00:07:40.400 that it's obvious Biden has dementia.
00:07:44.240 You could make that case with a straight face.
00:07:47.360 Right?
00:07:47.960 You wouldn't be lying, would you?
00:07:49.900 I mean, you would be going farther than your medical expertise should take you,
00:07:55.180 because you don't have any.
00:07:56.860 But it would also be considered reasonable by people who heard it,
00:08:00.900 even if they didn't like it.
00:08:02.280 Even the Democrats would say,
00:08:03.680 oh, I hate it that you're trying to use impeachment,
00:08:07.840 but it's not the worst point that there's something wrong with them.
00:08:12.900 And if Harris is not stepping in to do something about it,
00:08:16.780 she's not doing her job.
00:08:18.960 And if she's not doing her job, she needs to be removed.
00:08:23.000 So that could be the argument.
00:08:24.620 Now, will any of this happen?
00:08:25.900 Very, very unlikely.
00:08:28.520 Really, really unlikely.
00:08:30.280 But it's kind of fun.
00:08:31.680 It's fun to know that it might.
00:08:35.100 Here's a question for you, and it's just a hypothetical.
00:08:38.780 We know that there are COVID variants,
00:08:41.980 and the variation could, for example, be more or less deadly,
00:08:45.900 but it could also be more or less transmissible.
00:08:50.380 So it could be different in a variety of different ways,
00:08:53.620 thus the variant part.
00:08:56.060 And I wonder this.
00:08:57.920 Could you, hypothetically,
00:08:59.620 I'm not suggesting this is a good idea,
00:09:02.280 but it's more of a science nerd question.
00:09:05.620 Could you, hypothetically,
00:09:07.740 it would be terribly immoral and unethical, so don't do it,
00:09:11.460 but would it be possible
00:09:12.560 to develop a variant of the coronavirus
00:09:16.260 that spread as fast as the Delta variant, or faster,
00:09:20.880 and gave you very little symptoms,
00:09:23.840 but it did give you immunity
00:09:25.540 against other variants?
00:09:29.320 What do you think?
00:09:30.660 Could you, in effect, vaccinate the world
00:09:33.720 by creating a weak variety of the virus
00:09:37.880 and just let it go?
00:09:39.840 Now, obviously, this would be the most unethical thing
00:09:42.540 in the world, right?
00:09:43.680 It would be deeply immoral,
00:09:45.820 so we're not going to argue that, right?
00:09:48.420 We're all on the same side.
00:09:50.300 You don't give people forced vaccinations
00:09:52.980 by tricking them with a virus in the air.
00:09:56.600 But it'd be possible, wouldn't it?
00:10:00.280 And if it were possible,
00:10:02.300 would it be something that you would ever use
00:10:05.060 in an even worse pandemic?
00:10:07.980 You know, this pandemic was pretty bad,
00:10:09.700 but it could be much worse, one assumes.
00:10:14.120 So I'm seeing a lot of yeses.
00:10:15.800 Interesting.
00:10:17.360 Yeah, it would be, obviously,
00:10:18.920 the worst idea in the world to try to do it,
00:10:21.900 but it might work.
00:10:24.780 Weirdly.
00:10:25.140 Oh, there's a Dan Brown novel with that topic?
00:10:29.120 That makes sense.
00:10:30.420 Sounds like a movie plot.
00:10:31.700 Yeah.
00:10:31.840 All right.
00:10:34.080 There was a study on mask effectiveness in Kansas
00:10:37.920 in which different counties were looked at
00:10:41.840 to see if wearing masks worked.
00:10:44.700 What do you think was the result?
00:10:46.540 Remember, Kansas is pretty Republican,
00:10:49.660 and they had some counties with no mask mandates
00:10:53.880 and some with, some without.
00:10:56.800 What do you think?
00:10:57.700 Did masks work?
00:10:58.960 Well, according to this study,
00:11:01.640 they worked really well.
00:11:04.540 Do you believe it?
00:11:07.180 Do you believe that anybody
00:11:08.600 can actually effectively measure
00:11:11.640 whether or not masks worked?
00:11:15.000 So I have a really, really big question
00:11:18.120 about whether anybody can even measure this thing,
00:11:21.620 because I feel like too many things happen
00:11:23.160 at the same time.
00:11:24.600 By the time you mandate masks,
00:11:27.180 don't you also do lockdowns
00:11:29.760 and more distancing
00:11:30.600 and treat it more seriously?
00:11:33.220 So I don't think there's ever a case
00:11:34.780 where the only difference
00:11:35.980 between two locations
00:11:37.360 is masks or no masks.
00:11:39.880 So the first part is
00:11:41.380 that it might be hard to tease out
00:11:43.240 what really is the mask part.
00:11:45.860 Secondly, you can't do
00:11:47.400 a randomized controlled test,
00:11:49.740 because since the assumption is masks work,
00:11:53.920 at least by the scientific community,
00:11:55.760 if not by you,
00:11:57.640 there wouldn't be any way to say,
00:11:59.540 well, this group don't wear masks
00:12:01.460 during the pandemic
00:12:02.320 and this group wear them
00:12:03.680 and we'll see who does better.
00:12:05.300 So you can't do the kind of study
00:12:06.720 you want to do.
00:12:07.580 It would be unethical.
00:12:09.480 So you're left with
00:12:11.180 less reliable ways to check it.
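The confounding worry can be made concrete with a toy simulation (every number here is invented): if counties that mandate masks also lock down more, a naive county-to-county comparison attributes the combined effect of everything to masks alone.

```python
import random

random.seed(0)

BASE_DEATHS = 1000       # expected deaths with no interventions (invented)
MASK_EFFECT = 0.9        # assume masks alone cut deaths 10%
LOCKDOWN_EFFECT = 0.6    # assume lockdowns alone cut deaths 40%

def county_deaths(mandate):
    """Simulated deaths for one county; the lockdown confounder
    always travels together with the mask mandate."""
    rate = BASE_DEATHS
    if mandate:
        rate *= MASK_EFFECT * LOCKDOWN_EFFECT
    return rate * random.uniform(0.9, 1.1)  # county-level noise

mandate = [county_deaths(True) for _ in range(50)]
no_mandate = [county_deaths(False) for _ in range(50)]

naive_effect = 1 - (sum(mandate) / len(mandate)) / (sum(no_mandate) / len(no_mandate))
print(f"naive 'mask effect': {naive_effect:.0%}")  # far above the assumed 10%
```

The naive comparison recovers roughly the mask-plus-lockdown effect, not the 10% mask-only effect it claims to measure, which is exactly the teasing-out problem described above.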
00:12:13.780 And do you believe
00:12:16.580 these researchers
00:12:18.840 at the Institute of Policy
00:12:20.380 and Social Research
00:12:22.260 when they said it saved
00:12:23.560 about 500 lives
00:12:25.180 in each adopting county?
00:12:28.340 Do you believe that?
00:12:29.940 Do you believe that masks
00:12:31.140 saved 500 lives
00:12:33.100 in each county that did them?
00:12:36.720 I think I'd have to see
00:12:38.060 a lot more studies.
00:12:39.300 I'd have to see a lot more people
00:12:40.640 look at it.
00:12:41.140 We don't really live in a world
00:12:42.760 in which you could take
00:12:44.120 any of this seriously.
00:12:46.220 I'm saying that
00:12:47.220 people are not believing it.
00:12:50.200 I'm still on the side
00:12:51.460 that says
00:12:51.960 if you could measure it,
00:12:54.020 you'd find that masks
00:12:55.160 made a difference.
00:12:57.760 But I don't know
00:12:58.620 if you'll ever be able
00:12:59.640 to measure it.
00:13:00.280 I just don't know
00:13:00.880 if I would ever trust
00:13:02.220 any study on that.
00:13:04.280 All right.
00:13:04.620 Let me give you an update
00:13:10.080 on something from yesterday.
00:13:12.060 You might remember
00:13:13.100 that yesterday
00:13:14.260 I told you a funny story
00:13:15.800 about how Christina,
00:13:17.900 my wife and I,
00:13:18.520 took personality tests,
00:13:20.920 an online personality test.
00:13:22.600 So there's no science behind it.
00:13:23.980 It was sort of a variant
00:13:25.740 of the Myers-Briggs test
00:13:28.140 where you're an INTJ
00:13:30.240 or an ENTF
00:13:31.500 or whatever you are.
00:13:32.380 And I was joking
00:13:34.500 because my result came out
00:13:36.160 that I was compared
00:13:37.620 to people like
00:13:38.560 Martin Luther King,
00:13:40.880 Mother Teresa,
00:13:42.340 and Nelson Mandela.
00:13:43.960 And I joked that
00:13:45.080 my wife did the same test
00:13:47.000 and came out
00:13:47.720 similar to Vladimir Putin.
00:13:50.580 Now,
00:13:51.660 I may have cherry-picked
00:13:53.560 some data.
00:13:55.320 So let me do
00:13:56.460 a clarification.
00:13:58.060 In addition to
00:13:59.180 having a
00:14:01.260 a test
00:14:02.700 that was similar
00:14:03.260 to Vladimir Putin,
00:14:04.980 well,
00:14:05.160 number one,
00:14:06.100 do you think
00:14:06.660 Vladimir Putin
00:14:07.380 took the test?
00:14:08.760 Of course not.
00:14:10.320 Do you think MLK
00:14:11.360 or Mother Teresa
00:14:13.220 took the test?
00:14:14.320 Of course not.
00:14:15.720 So when the test
00:14:16.480 says that you have
00:14:17.100 the same personality
00:14:17.940 as these famous people,
00:14:19.860 there's no science
00:14:20.680 to that.
00:14:21.300 It doesn't mean anything.
00:14:22.820 But other people
00:14:23.780 who would have
00:14:24.280 the allegedly
00:14:25.080 the same personality type
00:14:26.480 as Christina
00:14:28.140 would be Isaac Newton,
00:14:30.400 much better
00:14:31.240 than Vladimir Putin,
00:14:33.100 Elon Musk,
00:14:35.380 Stephen Hawking,
00:14:36.580 Beethoven,
00:14:37.940 John Nash,
00:14:38.740 the mathematician,
00:14:40.780 and JFK.
00:14:42.200 Now,
00:14:42.860 if you wanted
00:14:43.400 to know
00:14:43.780 how unscientific
00:14:44.880 this personality test is,
00:14:47.340 do you think
00:14:48.060 that JFK
00:14:48.980 and Vladimir Putin
00:14:51.980 should be
00:14:52.440 in the same category?
00:14:53.340 Right there,
00:14:56.540 isn't that kind
00:14:57.620 of a tip-off?
00:15:00.320 So I wouldn't
00:15:01.360 put any stock
00:15:03.180 in this.
00:15:03.640 So by the way,
00:15:04.280 the Myers-Briggs
00:15:05.480 personality test
00:15:06.600 purports
00:15:08.020 to figure out
00:15:08.960 what kind of
00:15:09.620 personality people have
00:15:10.900 and then
00:15:12.320 the reason is,
00:15:15.380 the reason
00:15:16.000 that you figure
00:15:16.460 that out
00:15:16.920 is so that you can
00:15:17.580 deal with them
00:15:18.080 more effectively,
00:15:18.980 right?
00:15:20.200 So
00:15:20.720 can you
00:15:22.540 actually do that?
00:15:24.300 If you knew
00:15:24.940 what your personality
00:15:25.760 type was
00:15:26.620 and you knew
00:15:27.820 what somebody else's
00:15:28.700 personality type was,
00:15:30.400 could you use that
00:15:31.400 to great advantage
00:15:32.400 in your business
00:15:33.880 or personal life?
00:15:35.860 And the answer is
00:15:37.040 probably not.
00:15:38.560 There's no reason
00:15:39.280 to believe that
00:15:39.880 makes any difference.
00:15:41.580 And
00:15:42.040 I'm not sure
00:15:43.660 that there's
00:15:44.940 any science
00:15:45.540 to even figuring out
00:15:46.660 what your personality is.
00:15:48.780 Because
00:15:48.980 when I took the test
00:15:50.820 I was aware of
00:15:51.800 answering the questions
00:15:52.780 the way I wanted
00:15:54.640 to answer them.
00:15:55.940 Or let me say it
00:15:56.720 another way.
00:15:57.560 When I answered
00:15:58.300 the questions
00:15:58.840 on the personality test
00:16:00.160 I was kind of
00:16:02.200 saying who I wanted
00:16:03.360 to be
00:16:03.940 maybe more
00:16:05.320 than who I am.
00:16:06.560 You know what I mean?
00:16:07.740 Like maybe I gave
00:16:08.540 myself a little bit
00:16:09.380 more of a
00:16:12.660 positive review
00:16:14.360 of my own
00:16:15.660 personality
00:16:16.520 than perhaps
00:16:17.640 was warranted.
00:16:18.980 You know
00:16:19.460 would that
00:16:20.480 be unusual?
00:16:21.780 So I've got
00:16:22.520 a feeling that
00:16:23.280 people's own
00:16:24.380 assessment
00:16:24.820 of their
00:16:25.360 personality
00:16:25.960 is pretty wonky
00:16:27.480 to begin with.
00:16:28.660 And that once
00:16:29.140 you've assessed it
00:16:29.840 it doesn't mean
00:16:30.440 you can work
00:16:30.980 with somebody else
00:16:31.800 if you know
00:16:32.880 their personality.
00:16:33.940 So in the Dilbert comic
00:16:34.860 I have mocked
00:16:35.640 the Myers-Briggs test
00:16:37.260 a number of times
00:16:38.080 and I will
00:16:39.860 continue to do so.
00:16:41.120 There's no science
00:16:41.740 to it.
00:16:42.520 Alright.
00:16:43.180 And I was just
00:16:43.960 having fun with
00:16:44.780 the personality test.
00:16:45.900 Next topic.
00:16:54.260 What would it
00:16:54.760 take to convince
00:16:55.660 you that the
00:16:57.040 earth is warming
00:16:57.960 because of
00:16:58.940 human activity?
00:17:00.860 What would it
00:17:01.360 take to convince
00:17:01.960 you?
00:17:03.080 So let's say
00:17:04.000 I know my
00:17:05.340 audience well
00:17:05.980 and I know
00:17:07.040 that it is
00:17:08.020 a common
00:17:08.820 belief among
00:17:09.600 many of my
00:17:10.240 audience members
00:17:10.860 here that
00:17:11.840 climate change
00:17:12.560 is not happening
00:17:13.500 or that if it
00:17:14.700 is happening
00:17:15.180 it's some
00:17:15.880 natural variation
00:17:17.060 and has not
00:17:18.720 much to do
00:17:19.240 with humans.
00:17:21.420 So I know
00:17:22.120 a number of you
00:17:22.660 are in that camp.
00:17:23.680 I'm not in that
00:17:24.580 camp.
00:17:25.420 I'm in the camp
00:17:26.180 that almost
00:17:27.580 certainly humans
00:17:28.380 are making some
00:17:29.020 difference but
00:17:30.240 we're not good
00:17:30.800 at predicting it
00:17:31.600 and we're also
00:17:32.720 very good at
00:17:33.840 remediating.
00:17:35.540 So I'm not
00:17:36.160 as worried about
00:17:37.180 long-term
00:17:38.520 devastating
00:17:39.180 consequences
00:17:40.060 because other
00:17:41.440 people are.
00:17:41.900 As long as
00:17:43.080 other humans
00:17:43.600 are really
00:17:44.060 worried about
00:17:44.640 it and
00:17:45.020 working on it
00:17:45.620 and putting
00:17:46.060 tons of
00:17:46.480 money into
00:17:46.860 it we'll
00:17:47.560 be fine.
00:17:48.740 But you
00:17:49.480 know it's
00:17:49.700 probably
00:17:49.920 productive to
00:17:50.700 worry about
00:17:51.120 it is my
00:17:51.620 view.
00:17:52.140 Now I could
00:17:52.480 be wrong
00:17:52.800 about that.
00:17:53.880 So if you
00:17:54.180 ask me Scott
00:17:55.340 what is your
00:17:55.820 level of
00:17:56.300 certainty I
00:17:57.400 would say
00:17:57.680 not 100%
00:17:58.500 but pretty
00:17:59.700 high.
00:18:00.720 Pretty high.
00:18:02.280 So but
00:18:03.320 here's the
00:18:03.660 question to
00:18:04.100 you if you're
00:18:04.600 in the camp
00:18:05.100 that says
00:18:05.700 now the
00:18:06.280 world is not
00:18:07.000 warming
00:18:07.340 because of
00:18:08.960 human activity
00:18:09.660 maybe because
00:18:11.000 of natural
00:18:11.400 variants or
00:18:12.280 solar flares or
00:18:13.880 whatever you
00:18:14.220 think it is
00:18:14.660 what would it
00:18:15.680 take to
00:18:16.020 convince you
00:18:16.480 otherwise?
00:18:17.780 Is there
00:18:18.120 anything?
00:18:19.300 Because if
00:18:20.340 you don't
00:18:20.760 believe in
00:18:21.240 climate change
00:18:22.080 being human
00:18:23.580 caused to
00:18:24.260 some extent
00:18:24.840 you don't
00:18:26.160 believe science
00:18:26.940 or at least
00:18:28.560 you don't
00:18:28.840 believe scientists
00:18:29.740 I should say.
00:18:30.740 It's not that
00:18:31.120 you don't
00:18:31.340 believe science
00:18:32.000 that you don't
00:18:32.980 believe the
00:18:33.360 consensus of
00:18:34.120 scientists.
00:18:35.180 So what would
00:18:35.980 it ever take
00:18:36.500 to change
00:18:36.920 your mind?
00:18:38.360 Now I'm not
00:18:39.020 saying I'm
00:18:39.440 going to try
00:18:39.960 to change
00:18:40.380 your mind.
00:18:42.340 I'm seeing
00:18:43.040 Roger saying
00:18:43.600 the Tony
00:18:44.120 Heller video
00:18:44.780 should be
00:18:45.240 required viewing.
00:18:47.200 Let me tell
00:18:47.720 you if you
00:18:48.320 think that
00:18:49.060 you're really
00:18:50.400 in bad
00:18:50.820 shape.
00:18:52.140 Right?
00:18:53.020 Every time I
00:18:53.920 bring this up
00:18:54.480 somebody brings
00:18:55.020 up Tony
00:18:55.520 Heller.
00:18:56.400 Tony Heller
00:18:57.060 may have
00:18:57.600 destroyed the
00:18:59.880 earth.
00:19:01.860 I'm not
00:19:02.540 saying for
00:19:03.020 sure.
00:19:04.380 But the
00:19:05.240 Tony Heller
00:19:05.920 effect, he's
00:19:06.540 a well-known
00:19:07.380 critic of
00:19:08.180 climate change.
00:19:08.820 But he's
00:19:09.920 so well-known
00:19:10.700 and did
00:19:11.640 such good
00:19:12.420 work
00:19:12.760 communicating
00:19:13.480 that he's
00:19:15.180 the main
00:19:15.660 reason that
00:19:16.220 people don't
00:19:16.680 believe in
00:19:17.060 climate change
00:19:17.660 on the right.
00:19:18.720 One person.
00:19:19.920 Because if you
00:19:20.780 look at the
00:19:21.140 comments, you'll
00:19:21.660 see he's the
00:19:22.140 most often,
00:19:22.720 well, really the
00:19:23.440 only person who's
00:19:24.120 ever mentioned.
00:19:25.260 Now, here's the
00:19:26.540 problem.
00:19:28.100 If you watch
00:19:29.120 10 or 20
00:19:31.320 Tony Heller
00:19:31.940 videos, you
00:19:33.360 will be
00:19:33.740 convinced that
00:19:34.400 he's right.
00:19:35.540 How many
00:19:35.960 of you have
00:19:36.320 had that
00:19:36.640 experience?
00:19:37.880 How many
00:19:38.240 of you watched
00:19:39.040 enough Tony
00:19:42.680 Heller videos
00:19:43.820 to say, my
00:19:45.620 God, this is
00:19:46.300 convincing, he's
00:19:47.100 right?
00:19:47.840 In the comments,
00:19:48.820 how many of you
00:19:49.360 have done that?
00:19:51.900 Because I've
00:19:52.840 gone down that
00:19:53.700 rabbit hole, and
00:19:55.780 if you watch
00:19:56.680 enough of his
00:19:57.260 videos, they're
00:19:57.940 really, really
00:19:58.800 persuasive.
00:20:00.100 Like, really
00:20:01.040 persuasive.
00:20:03.120 How many of
00:20:03.920 you?
00:20:04.940 Just looking
00:20:05.600 in your
00:20:06.040 comments, never
00:20:09.680 heard of the
00:20:10.100 guy who, all
00:20:11.600 right.
00:20:12.220 So, here's
00:20:14.320 what you need
00:20:15.080 to know.
00:20:15.740 If you watch
00:20:16.400 only Tony Heller
00:20:17.460 videos, they're
00:20:18.340 really convincing,
00:20:19.880 so don't do
00:20:20.700 that.
00:20:21.280 Because if you
00:20:22.100 watch only one
00:20:22.840 person's view of
00:20:23.700 anything, they're
00:20:25.080 going to be
00:20:25.520 convincing.
00:20:26.880 So, the
00:20:27.780 problem is not
00:20:28.520 that Tony Heller
00:20:29.600 is right or
00:20:30.320 wrong, the
00:20:31.180 problem is you
00:20:31.780 took a method
00:20:32.580 of finding
00:20:33.180 information that
00:20:33.980 can never
00:20:34.380 work.
00:20:35.880 You hear this
00:20:36.440 clearly.
00:20:37.500 If you
00:20:38.160 follow down the
00:20:38.980 Tony Heller
00:20:39.560 path, you
00:20:43.440 took a way of
00:20:44.240 finding information
00:20:45.140 that can never
00:20:46.360 work.
00:20:47.280 And it has
00:20:47.720 nothing to do
00:20:48.280 with Tony Heller.
00:20:49.840 It has to do
00:20:50.520 with the fact that watching any
00:20:52.180 one person's
00:20:53.080 content is the
00:20:54.520 worst way you
00:20:55.140 can understand
00:20:55.700 anything.
00:20:56.600 Because they
00:20:57.140 will be
00:20:57.640 persuasive,
00:20:58.600 because you're
00:20:59.000 not hearing any
00:20:59.600 counter-argument.
00:21:00.240 You have to
00:21:01.700 read the
00:21:02.040 debunks, and
00:21:03.320 then read his
00:21:03.900 stuff, and
00:21:04.980 then read the
00:21:05.620 counter to it,
00:21:06.860 and then read
00:21:07.680 his response to
00:21:08.600 the counter to
00:21:09.280 it, etc.
00:21:10.240 You've got to
00:21:11.020 go pretty deep
00:21:11.960 down the hole
00:21:12.620 before you
00:21:13.720 realize that
00:21:14.300 everything that
00:21:14.920 Tony Heller
00:21:15.400 says has a
00:21:16.820 reasonable
00:21:17.260 defense.
00:21:19.220 Maybe not
00:21:20.020 everything.
00:21:20.840 Because sometimes
00:21:21.480 the critics will
00:21:22.280 find things that
00:21:23.100 are being done
00:21:23.780 wrong, but it
00:21:25.380 doesn't debunk
00:21:26.320 the whole thing.
00:21:27.600 It might be
00:21:28.140 there's a
00:21:28.520 messy spot or
00:21:29.300 something.
00:21:31.740 Yes, Al Gore
00:21:32.700 is the same
00:21:33.180 problem.
00:21:34.100 If you just
00:21:34.820 listened to Al
00:21:35.620 Gore, and he
00:21:37.040 was your primary
00:21:37.760 source of
00:21:38.440 information, you
00:21:39.880 would be using a
00:21:40.680 process which
00:21:41.520 guarantees you
00:21:42.460 would be not
00:21:43.540 well-informed.
00:21:45.040 So the same
00:21:45.780 thing I said
00:21:46.240 about Tony
00:21:46.800 Heller, who
00:21:47.900 says climate
00:21:48.480 change is not
00:21:49.240 real, is
00:21:49.820 exactly the
00:21:50.660 same as I
00:21:51.780 would say about
00:21:52.340 Al Gore.
00:21:53.140 And by the
00:21:53.460 way, Al Gore
00:21:55.180 would tell you
00:21:55.620 the same thing.
00:21:57.600 If you had a
00:21:58.340 private conversation
00:21:59.200 with Al Gore, I
00:22:01.400 feel confident
00:22:02.060 this is true.
00:22:03.620 By the way, I
00:22:04.220 have had private
00:22:05.100 conversations with
00:22:05.940 Al Gore.
00:22:06.500 That's another
00:22:06.900 story.
00:22:07.920 But in the
00:22:09.520 White House, when
00:22:10.100 he was vice
00:22:10.520 president, I got
00:22:11.380 to hang out with
00:22:11.980 him a little bit.
00:22:13.500 I like Al Gore,
00:22:14.540 by the way.
00:22:15.120 I think he's the
00:22:15.960 real deal.
00:22:17.000 Now, whether he's
00:22:18.180 right or wrong on
00:22:18.800 any particular
00:22:19.320 things is a
00:22:20.140 different question.
00:22:21.140 But I think he's
00:22:21.720 actually, I like
00:22:23.540 him anyway, so I'll
00:22:24.360 just put that out
00:22:25.080 there.
00:22:25.240 But I think
00:22:27.420 he would agree
00:22:28.020 with this
00:22:28.440 statement, because
00:22:29.580 he's a rational
00:22:30.640 guy.
00:22:31.420 If you said the
00:22:32.380 only information
00:22:33.100 you have is from
00:22:33.900 Al Gore, should
00:22:35.540 you treat it as
00:22:36.440 credible?
00:22:38.300 Not even Al Gore
00:22:39.240 would tell you
00:22:39.680 that, I don't
00:22:40.440 think, because he
00:22:41.680 seems like a
00:22:42.280 pretty straight
00:22:42.840 shooter.
00:22:43.260 He may be
00:22:43.640 right or wrong
00:22:44.320 on stuff, but I
00:22:45.740 think he's a
00:22:46.200 straight shooter.
00:22:47.480 And I think he
00:22:48.320 would tell you
00:22:48.900 that listening to
00:22:50.460 any one person
00:22:51.380 on any big
00:22:53.020 complicated topic is
00:22:54.240 a terrible idea.
00:22:56.000 So, it's one
00:22:57.600 thing to listen
00:22:58.180 to Al Gore, but
00:22:59.240 you've got to
00:22:59.580 listen to Tony
00:23:00.500 Heller maybe
00:23:01.180 also, and then
00:23:02.600 you've got to
00:23:02.940 listen to the
00:23:03.500 debunk of the
00:23:04.380 Tony Heller.
00:23:05.940 Potholer is
00:23:06.580 a famous
00:23:07.240 one, and then
00:23:08.280 you've got to
00:23:08.600 listen to the
00:23:09.100 response to
00:23:09.720 Potholer, which
00:23:11.080 could be pretty
00:23:11.700 persuasive.
00:23:13.160 But you've got to
00:23:13.720 go all the way
00:23:14.280 down or else you
00:23:15.060 don't know
00:23:15.380 anything.
00:23:16.680 So, what
00:23:17.500 would it take
00:23:17.940 to convince you
00:23:18.800 that climate
00:23:19.480 change is real?
00:23:23.420 Think about
00:23:24.040 your own answer.
00:23:24.720 You don't have
00:23:25.160 to answer
00:23:25.500 me.
00:23:26.080 But think
00:23:26.540 about if
00:23:27.280 there's nothing
00:23:27.820 that would
00:23:28.200 convince you,
00:23:30.660 why do you
00:23:31.540 believe it?
00:23:32.580 Right?
00:23:33.280 You have to
00:23:34.380 check your
00:23:35.020 thinking if
00:23:35.660 there's nothing
00:23:36.140 that could
00:23:36.480 change your
00:23:36.900 mind.
00:23:37.620 No new
00:23:38.440 information,
00:23:39.620 no better
00:23:40.240 source,
00:23:41.340 no record
00:23:42.300 temperatures,
00:23:44.100 anything.
00:23:44.880 What would
00:23:45.280 change your
00:23:45.620 mind?
00:23:45.800 I'm not
00:23:48.240 saying I'm
00:23:48.700 going to
00:23:48.920 try to
00:23:49.240 change your
00:23:49.600 mind.
00:23:49.960 I'm just
00:23:50.240 wondering what
00:23:50.780 would.
00:23:51.880 All right, apparently more than 30 million Americans were under excessive heat warnings across the West. Does that change your mind? So, we have these record highs all over the place, 30 million people with pretty serious heat warnings. Yesterday, I kept getting notices from my local energy company, PG&E, telling me that they were maybe going to turn my power off for one or two hours.
00:24:24.500 Smart people in the comments. So, the question I asked is, would you be persuaded by 30 million people being under excessive heat warnings? And almost all of you said no. Correct. That is correct. Having excessively high temperatures does not prove climate change. We all agree with that, right? Just because you set a record, that's not enough.
00:24:49.700 Suppose we set a record every year for 10 years in a row. Would that change your mind?
00:24:58.260 How about 20 years? Suppose we had a new high every year for 20 years in a row. I don't think this is likely, but suppose it happened. At the end of 20 years, would you say, yeah, there was probably something going on there? Let's see, would that convince you? Twenty years, and every year is a new high.
00:25:21.360 A lot of people say no. And no is the correct answer, because 20 years is not very long in terms of the history of the Earth. You could have a 20-year period where things are just different. Now, if that were the only evidence, I would say maybe it's not as credible or convincing as it should be. But it won't be the only evidence. There will be evidence from a variety of sources, etc. So we'll keep an eye on that.
00:25:54.420 Why are we having rolling blackouts in California? So yesterday was the first day they said your electricity might go off voluntarily, meaning the power company might turn it off just to save electricity.
00:26:07.880 Well, as Michael Shellenberger tweets, a California electricity grid manager explained it in 2020. Quote: people wonder how we made it through the heat wave of 2006. The answer is we had San Onofre, a nuclear plant, and a number of other plants totaling thousands of megawatts that are not there today.
00:26:30.720 So the reason we don't have enough electricity is not climate change; it's because we ignored climate change. Because getting rid of nuclear power plants is really dumb if you have a climate emergency that you believe is your top priority.
00:26:50.060 And I was listening yesterday to a podcast in which I heard some very smart people say, for example, that you should put nuclear power plants and desalination plants up and down the coast of Northern California, or all of California, and just solve that problem. It would be very expensive, but how would you like to have no water? So there are your two choices: it's expensive, or you don't have water. Easy decision. If you had a good government, that would be an easy decision.
00:27:28.840 Well, does anybody know if Branson is up in space yet, or up in near space? I think he's going to be in near space. Does anybody know if he left? There was a delay in the Virgin Galactic flight, and Richard Branson was going to be on it himself, Sir Richard Branson. And I believe it had been delayed 90 minutes. So it hasn't taken off yet, I'm hearing in the comments.
00:27:54.160 All right. How does it make you feel that Branson is on the flight, and likewise that Bezos will be on his first flight? How does that make you feel? Because I understand what Branson is doing. He's not only bringing attention to it by being on it, but he's also trying to convince people that it'll be a safe way to fly in the future.
00:28:20.720 But here's the thing. Elon Musk has told us directly that people are going to die in the quest to get to Mars, which I think is one of the smartest things anybody ever said in public, by the way. When Elon Musk says people are going to die, he's acknowledging that he's working on a project that's going to kill people. But we kind of have to get to Mars. You know, the future of humanity does require us to settle space. There's no doubt about that. And I love the fact that he's transparent about it. Yeah, people are going to die.
00:28:52.260 I don't know if Elon Musk will be on any of his upcoming flights, but it would not be unreasonable for him not to be, because I'd much rather have Elon Musk alive than dead.
00:29:07.740 Now, I don't know if I feel exactly the same about Richard Branson, or even Jeff Bezos, who's now sort of retired from Amazon. Now, I don't want any of them to die, just to be clear. I very much want them all to live, and I think they all have great value to the planet. But Elon Musk is probably the only one whose direct contribution makes a difference to whether we'll get to Mars. It probably does matter who's in charge there. So I would hate, oh my God, I would hate to see anything happen to Branson or Bezos. That would be just so bad for space exploration.
00:29:52.260 But on the other hand, you have to give them a compliment for leading, for putting skin in the game. You know? Is it a coincidence that these two leaders are unusually successful? Maybe this is why, right? Maybe this is exactly why they're successful: they'll get on their damn plane, or rocket, or whatever it is. They'll take it themselves, and they'll risk dying.
00:30:20.440 Three minutes and counting. Oh, it left off. So, where are we? Are we actually in countdown? Give me an update on that as we go.
00:30:32.300 All right, here's my suggestion for those of you who don't like critical race theory in schools. Here's how I would handle it. You know... oh, he launched already? Or did he? All right.
00:30:49.520 So, you know that if you argue against something you don't like, the people on the other side just dig in deeper, right? So, telling people that something they think is a good idea is actually a bad idea, does that ever work? No. In the real world, it just never works. Usually, they've already thought of your argument anyway. And once people dig into a side, they're kind of hard to move.
00:31:13.900 But there is a way to do it. It doesn't work every time, but it goes like this, and you've heard it before. If you buy into a bad idea and then extend it, you can just break it. In other words, pretend that you think it's a good idea, buy into it, and then take it to its logical death.
00:31:34.760 Here's how you do that with critical race theory. Instead of saying, no, I don't like this critical race theory, I think it's racist, just in a different way, because nobody can hear that argument, do this instead. Say, yes, we should do critical race theory in schools, and there are two improvements I would like to add. Two improvements. How does anybody argue against improvements? Not only do you like critical race theory, but I like it too, and here's how I would improve it. Because you and I both like it, wouldn't we want it to be even better? Really? I mean, it's so good now, let's have a little more of that, right?
00:32:17.260 So here's how I'd make it better. Number one, I would make it a national requirement, maybe federal, at least a state requirement, that parents have to be given the materials. Parents have to be presented with the same critical race theory materials that the children are being taught. A legal requirement. Why? Just to make them better-informed parents, because parents are part of the process, right? What would happen if parents were well informed and had a real clean summary of what critical race theory is teaching?
00:32:56.460 Would they still be supportive? I don't know, but I love critical race theory, and so for me, the more you can spread it, the better. So if you go teach it to the kids, well, you want it to stick, right? So also make the materials available to their parents, because their parents can then reinforce it when the kids get home, right? The more critical race theory, the better. So give it to the kids, but make it a requirement that the parents have to see the details. See where I'm going? You can embrace something until it dies.
00:33:31.780 Here's the second part of that. So you require the parents to see it, and you let the transparency kill the bad parts. But I'm having a brief memory problem here. The second part, which I just wrote about on Locals, can somebody remind me what I just wrote? Because I just blogged this on the Locals platform. All right, they'll tell me in a moment. Yeah, I'm having a Biden moment. The problem is that I make notes for myself so I don't do what I'm doing now. And the place where this note belongs is right in this little space, where apparently I was just about to put my note about what I was going to talk about right now, and then I got distracted. All right. Yes, there we go. Thank you.
00:34:28.340 So the point I was going to make is that you should include in the critical race theory training that teachers' unions are the biggest source of systemic racism. Does anybody disagree? Teachers' unions are what prevent school competition. School competition is the only thing that will ever get good schools for everybody, or as good as we can get. So it is the teachers' unions who need to be included as a lesson within critical race theory to improve it.
00:35:00.600 How can you have critical race theory training and leave out the biggest source of systemic racism? You can't do that. So there should be a module included in it about teachers' unions being the biggest problem of all.
00:35:18.080 So those are the two things. Make the parents see all of the materials, and include teachers' unions being the biggest source of that problem. In those cases, bring it on. Let's see us some critical race theory.
00:35:32.160 Now, you may say to yourself, but Scott, aren't you accidentally accepting all the other things that I don't like in critical race theory? And the answer is, this is how persuasion works. The fact that you're uncomfortable with me saying that I would like any part of critical race theory is what makes it hard for you to look away. That's the intentional wrongness.
00:35:56.960 I could have easily fixed it by saying, oh, I'd like to take all of the parts I don't like out of critical race theory. That makes sense, right? Wouldn't you like to take away all of the parts you don't like? But by not doing that, I made my proposition, ugh, a little off. There's something wrong with it, right? It would be wrong to accept anything that's wrong, if you know it's wrong. But I've told you directly that I would accept it with a couple of tweaks. Now, that's really just to get more attention on the tweaks I'm talking about. It's not really because I'd like the rest of it. Although I don't know, because I've never seen it. So, if I did, maybe I'd feel differently.
00:36:43.960 Speaking of vaccinations, which we weren't, how many lives do you think were saved by vaccinations in, let's say, just the United States? In the comments, what would be your guess? How many estimated lives? We'll just say between now and the end of the year, so hopefully that would capture most of it. How many lives do you think were saved by vaccinations in the United States? Just the United States.
00:37:15.640 The answer is... oh, I'm seeing pretty good guesses there. Your guesses are actually pretty good. Now, that doesn't mean it's right. But there is a source that estimated that without a vaccination program, by the end of this year there would have been approximately 279,000 additional deaths and 1.25 million additional hospitalizations. And that would be just the United States alone.
00:37:41.760 So, do President Trump, and President Biden after him for following up, get credit for saving 279,000 people? Do they get credit for that? And do you believe it?
00:37:59.980 I'm seeing some doubt. There seems to be some doubt about whether you could measure this. Well, as someone who has done many estimates of numbers of things in my professional life, I don't think you can measure this. I feel like any estimate of this is going to be a little bit suspicious, if you know what I mean. And when you come up with an estimate that just coincidentally happens to be right in the range of things you'd think it would be, then you should be really suspicious.
00:38:29.920 What happens when the estimate, the one you put a lot of time and effort into, figuring out all the variables and trying to isolate variables and all that, comes out, after all that work, to a number that's just about the same as what you might have guessed? That's a little bit of a flag. You know what I mean? It's a little bit too close to exactly what you thought it would come out as, right around 300,000.
00:38:56.640 If I had said, hey, there's going to be a study on vaccination effectiveness, how many people do you think it saved? I'll bet most of you would have guessed somewhere between 100,000 and 500,000. Right? And there it is, right in the middle of that range. So, while I do believe that vaccinations have saved a lot of lives, I don't believe you can estimate it with any kind of confidence.
00:39:23.640 All right. So, Kamala Harris continues to be a nightmare for the Democrats in the most entertaining way. She might be the least competent politician we've ever seen in the White House. I don't know. Is anybody less competent than she is? It's really shocking.
00:39:43.160 And now the latest controversy is that she's talking about a possible compromise on IDs for voters. And, of course, forcing somebody to have an ID reduces the number of people who can vote, but we don't know by how much. And now she's saying that rural people can't even get a photocopy of their ID, so it would be harder for them to register to vote. And rural people said, what? We might be rural, but we know how to make a photocopy.
00:40:16.020 And I ask you this question. Why do you need a photocopy? You can't take a smartphone picture of an ID and just send it to an address? If you have a physical photocopy requirement, couldn't you also have an email address where people just send a digital image? And then if grandma doesn't have a photocopier and grandma doesn't have a smartphone, grandma says, hey, can you just take a picture of my ID and send it to this email address? And your grandkid just takes a picture of it. It feels like this is solvable with technology.
00:40:59.660 We should not be arguing over what Kamala Harris calls Xeroxes. It's 2021, for God's sake. If you're using the word Xeroxing, as she did, that's a red flag right there. But then she corrected it to photocopies, realizing what she'd done. But if we're even talking about photocopies, what the hell is going on?
00:41:23.460 Oh, did the launch... the launch just happened? Keep me informed. I want to know that he made it safely, at least past the launch. Very exciting. And congratulations. He's in the air. God, imagine what that feels like. How would you like to be Richard Branson right now? He's actually going into, I think, near space. Not quite space. On a rocket that he caused to be built. Wow. I would be scared to death. But quite exciting. All right. Yeah, I don't think I'll be signing up for any space stuff immediately.
00:42:12.280 And apparently Kamala Harris is overwhelmed with all these vague jobs. Let's see, what is she in charge of? She's in charge of looking into the origins of the southern border migrant surge, expanding voter rights, closing the digital divide, the space council, and some other stuff.
00:42:32.720 And here's the worst answer ever given. She was asked whether she had too much on her plate. And she said, quote, yeah, maybe I don't say no enough. Do you ever want to hear somebody who might be president of the United States say, maybe I don't say no enough? That is the weakest statement any politician ever said.
00:43:01.320 Do you want a president who can't say no? That just makes you completely unqualified to be a leader. Like, you can't say no, even to your boss, even if it's the president? You should be able to say no. If you can't do that, you're not very useful as a leader.
00:43:21.300 But she's very motivated by all the hard work. Another terrible answer. It makes you sound like a wonk. Don't be motivated by all the hard work. How about being motivated by all the benefits you could bring to the country? Something like that.
00:43:39.060 So there's some suspect they found with a bunch of guns in a hotel in Denver, and they're afraid it was going to be a mass shooting attempt. But here's the weird part. There were four people involved. When have you ever heard of a mass killing that was really sort of a suicide attempt, you know, somebody just trying to go out big, when there were four people involved in the plot? It seems to eliminate that possibility, doesn't it?
00:44:09.800 All right. I'm just looking at your comments for a moment. All right. CPAC is happening. People are going to be outraged. Got more of that coming. And that is just about all the news that's worth talking about.
00:44:27.840 Now, apparently the Democrats are panicked, because when they poll people and ask them what Biden's accomplishments are, what do you think they say? So in the comments, tell me, what do you think people say when the public is asked, what are Biden's accomplishments so far? The answer is, they don't know of any.
00:44:51.280 Because even people who would say, oh, he did a good job rolling out the vaccines, kind of know it would have been the same under any president, right? Don't we know that? Or at least we don't know how it would have been any different. Because it was going to happen. It wasn't as if he had the idea: hey, I've got an idea, why don't we implement the vaccines that we made? So I'm not sure what Biden actually added to that process other than taking credit.
00:45:17.040 Yeah. So what did he do? Eat ice cream, make the border situation worse, and make Russia, China, and Iran hate us a little bit more? What? It would be really hard to come up with any accomplishment.
00:45:28.700 It's starting to look as if the Republicans have a real strong midterm coming, which would not be surprising historically. Yeah, he's got crime spiking off the charts. He's good at stairs, somebody says. He brought decency back to the White House. Did he? So we know he lied, right? We know he's done anti-science stuff. All of the reasons that he ran for office, he's proven didn't work. Right? All right. Well, that's all I've got for today. And I will talk to you.