Episode 1224 Scott Adams: I Teach You When to Disagree With the Experts Because That is an Essential Skill
Episode Stats
Length
1 hour and 16 minutes
Words per Minute
152.6
Summary
In this episode of Coffee and a Simultaneous Sip, Scott Adams talks about the controversial question of whether older people should be first in line for the new mRNA vaccines, and why no way of allocating them can avoid being racist by outcome.
Transcript
00:00:00.420
You're right on time. Well, at least some of you are. The rest of you, I call you laggards.
00:00:06.800
Laggards, yeah. It's time to up your game and be here exactly at 7 a.m. California time,
00:00:13.880
10 a.m. Eastern Time for the best part of your day. It's called Coffee with Scott Adams
00:00:20.080
and a Simultaneous Sip. It's amazing. Have you experienced it yet? Well, if it's your first day,
00:00:26.960
hold on to your hat. It's that good. Oh, it might sneak up on you, but it's that good.
00:00:34.260
And all you need to enjoy it is a cup or mug or a glass, a tank or chalice or stein,
00:00:40.440
a canteen jug or flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:47.680
And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:50.640
the thing that makes everything better, including the Moderna vaccine. Go.
00:00:56.960
Well, speaking of the Moderna vaccine, I guess that got approved, so we got a second vaccine.
00:01:09.840
How do you decide which one to take? Because I'm starting to hear reports of, you know,
00:01:16.960
the Moderna one might have some advantages over the other one. What would you do if your health
00:01:23.100
care provider offered you, let's say, the other one, but you wanted the Moderna one? What would
00:01:30.220
you do? Would you wait? If the only one you could get is the other one, because maybe
00:01:36.460
your HMO or something just has one of them? I don't know. It's going to be an
00:01:41.540
interesting question. My advice to all of you will be the same. This is what I'm going to do.
00:01:47.740
In terms of my decision of what vaccination to take or not, I'm going to wait till the last
00:01:54.780
minute. And you should too. You don't need to make a decision until somebody says you can come in and
00:02:01.320
get it. Until somebody says that you personally can go get a vaccination, don't decide. Don't decide
00:02:10.740
because there might be extra information by then. So if you decide now, before you have to,
00:02:16.120
why would you do that? Wait until the last minute. You can be quite sure you're going to
00:02:20.800
take it or quite sure you're not, but don't decide yet. Wait until the last minute. That's the smartest
00:02:27.360
place to make the decision. Here's the most controversial story that I have completely
00:02:34.280
changed my opinion on. You saw the story about the ethicist who, I guess the New York Times had his
00:02:42.800
article. And the ethicist claimed that it might be better for society if before older people get to
00:02:51.660
the vaccination, that the frontline health care workers get taken care of. And part of his argument,
00:02:59.360
which made the headlines, was that he thought that old people shouldn't get the vaccination first
00:03:06.120
because they're mostly white. And that frontline health care workers are more diverse. And so if
00:03:14.020
you favored the frontline workers, you would get a more diverse and more fair distribution.
00:03:20.580
And the way that was reported is, uh, racist. Oh, it's kind of racist. Really, really racist.
00:03:28.280
Is it? Yes. So let me agree with the first part of the criticism unambiguously. It's totally racist.
00:03:39.160
It's unambiguously, overtly, plainly, transparently racist. But here's the part you're not going to like.
00:03:51.440
You ready? But no matter what you do, it's racist. Sorry. Sorry. No matter what you do,
00:04:04.160
it's racist. There isn't a non-racist option. If we had a non-racist option, I would say, my God,
00:04:14.000
why are we even entertaining this idea from this clearly racist proposal? But that's not our
00:04:22.020
situation. It's not. And we can't get to a situation where there would be any kind of a
00:04:27.760
non-racist, you know, non-racist process. Here's why. No matter what rules you pick,
00:04:37.780
no matter what group you say, even if you don't use race, if you just say, well, we'll do people
00:04:46.000
over a certain age, well, mostly white, right? It wasn't your intention, but it would just turn
00:04:52.240
out that way. So things are racist by outcome, no matter what your intention was, right? So would you
00:05:01.540
agree with the first point that the outcome has a racial element to it, even if nobody was thinking
00:05:09.380
in those terms, even if racism had nothing to do with the decision, you'd all agree that there's
00:05:16.340
always an outcome that favors one group or another, no matter what you do. You can't avoid that. So let's
00:05:22.900
talk about intention. Because if the outcome is going to be racist, no matter what, you can't get rid of
00:05:28.420
the racism part. So why would you worry about the thing you can't change, right? That just can't be
00:05:33.760
changed. But you can question motive. That's always fair. You can question intention. If I thought that
00:05:42.980
someone had suggested in public that white people should not get the vaccination, you know, in an early
00:05:53.120
way, even if they're old, you know, what's your first impression of that? Sounds pretty bad, right? But let me ask you
00:06:01.780
this. So suppose somebody came to me, all right? Let's personalize this, take it out of the realm of public
00:06:09.500
policy. Take it down to you personally. Somebody comes to you and they say, Scott, you're over 60. You know, I'm 63.
00:06:19.500
And I'm in a, you know, I've got a little asthma. So I've got some comorbidities. I'm not old, old,
00:06:28.380
but I'm, you know, the beginning of the older category. And suppose they said to me, I'd like you
00:06:34.960
to make the decision, Scott. It's up to you. We know that, let's say, older black people have much
00:06:42.840
worse outcomes. Would you mind socially distancing a little bit longer? And we're going to focus on
00:06:51.080
black citizens, not because they're black, but because we know they have worse outcomes.
00:06:58.020
So if you're looking at the greater good, you want to give the vaccination to whoever gets the best
00:07:02.960
outcome, right? It just happens that they're black. That's not anybody's choice. It's nobody's intention.
00:07:10.120
It's just a biological reality. So if somebody said to me, Scott, would you personally,
00:07:18.580
you're not making a decision for anybody else, right? It's just you personally. It's not a public
00:07:23.880
policy. It doesn't apply to anybody else. It's just you. Would you personally socially distance a
00:07:30.520
little bit longer and take a little more risk for the benefit of black citizens in the United States
00:07:36.100
who are at greater risk? What would I say? I'd say yes. I'd say yes. If somebody asked me that
00:07:44.200
question directly and said, look, it's up to you. We're not, there's no penalty. You will not be
00:07:50.000
punished. You won't be punished. It's just up to you. It's your own conscience, your own risk, your own
00:07:56.960
risk reward calculation. You can be selfish if you want. It's up to you. If you want to get it first,
00:08:03.380
we'll put you right in front of the line and nobody will ever give you a hard time for it.
00:08:07.280
It's up to you. I think I'd still wait. I'd still wait, because that's actually a pretty
00:08:13.800
fair proposition. If they can identify people for whatever reason, be they black or have a
00:08:21.880
comorbidity or be they a certain age or be they healthcare workers on the front line, if you can
00:08:27.660
make a strong case that this person is in a high risk group and I'm in a slightly less high risk
00:08:34.920
group. Yeah, I'm okay with that. Absolutely. Because, you know, it's a war, right? It's a war.
00:08:43.600
And sometimes you've got to be the one that does the dangerous stuff so that somebody else doesn't
00:08:48.860
have to do it. Sometimes you've got to, you know, brave the bullets to pull back your wounded
00:08:53.940
comrade off the field, right? So we're in a situation where personal sacrifice should be a
00:09:00.160
pretty big part of the equation. If you're not thinking of it that way, then you're not in a,
00:09:05.660
let's say, a military mindset. And maybe we should be. We're in a war against a virus. Maybe let's
00:09:12.860
act more like soldiers, right? So somebody says white guilt. Am I suffering from white guilt?
00:09:21.480
Because I would say the same thing, no matter who the risk category was. So would you, would you
00:09:30.540
criticize me if I said, I think people over 80 should get the, should get the vaccination before
00:09:37.640
me? Would you criticize me for that? If I said that frontline healthcare workers should get it before
00:09:43.700
me, would you, would you criticize me for that? If I said that people who, what's the worst comorbidity,
00:09:50.740
maybe it's diabetes, let's say diabetes is, I don't know if that's true, but let's say it's the
00:09:54.780
worst one. If I said that everybody with diabetes should get the shot before I do, would you criticize
00:10:02.480
me for that? Why would you criticize me if I say, no, there's an obvious category, black citizens in
00:10:09.680
this country clearly have far worse outcomes. Why is that different than diabetes? Why is that different
00:10:15.420
than being 80 years old? All right. So there's your provocative thought of the day. I didn't think
00:10:22.680
you'd like it, but I feel it's worth mulling on. And I would say that the story about the ethicist and
00:10:30.000
his opinion was presented a little bit out of context. So when I first heard it, it just sounded
00:10:36.220
straight up racist. And that was my first impression. But when you hear the actual argument,
00:10:40.680
and you understand that there is no non-racist outcome, you can't get there. It's not a
00:10:46.040
possibility. It's only who gets, who gets the advantage. And if you decide that it's going to
00:10:51.660
be a little racist, but you're going to do it based on the greatest risk, that's about as good as you can
00:10:58.140
do. It's about as good as you can do. Now, if you told me that every black person would get the
00:11:03.720
vaccination before every white person, that doesn't make sense. But if you're talking equal to equal,
00:11:09.300
let's say a black guy who's 63 years old and has a little asthma, I would, I would put him in front
00:11:16.580
of me in line because it's a war. If this were a competition and I were competing against my fellow
00:11:25.020
citizen, yeah, I'd push him off the cliff, right? If I treated this as a competition, I'm going to get
00:11:32.400
mine and I'm going to make sure you don't get mine before I get mine. But it's not, it's a war.
00:11:37.960
So if I were competing against black people, sure, I'll do what I have to do to compete. But they're
00:11:44.100
on the same team. So I'm going to treat it like a military operation. All right. There's a new site
00:11:53.340
that's sorting the news in a useful way. I think a lot of testing on how to get our news better,
00:11:59.260
you know, better platforms or better ways to present the news so it's less biased would be good
00:12:05.060
for us. So I'm not going to say this particular one is the answer, but its URL is tidyreport.com.
00:12:13.200
Tidy, T-I-D-Y, report.com. What they do is they organize the tweets, which of course usually connect
00:12:21.320
to the news directly. So they organize the, I think it's mostly the political tweets, by positive,
00:12:27.940
neutral, and negative. So in other words, whether the tweet is saying something positive or negative
00:12:32.480
about the topic or neutral, and whether the person saying it is associated with the left or the
00:12:38.860
right. I'm not sure exactly how they figure that out, but they're probably close. So it's called
00:12:44.220
tidyreport.com. I neither recommend it nor disrecommend it. The important part of the story
00:12:50.700
is that people are starting to A-B test different ways to present the news to get past this immense
00:12:57.460
bias situation. So check it out. Maybe that's one of the ones. Pompeo. So Mike Pompeo says that
00:13:08.620
we're, that the Russians are, quote, pretty clearly behind the cyber attack. What does pretty clearly
00:13:16.800
mean? Does pretty clearly mean we're sure of it? Is that the same? It's pretty clearly. I don't know if
00:13:26.280
that means they're positive. It's an interesting choice of words, but it's somewhere in that
00:13:31.660
neighborhood of high confidence or high likelihood. And some experts are saying what I think is
00:13:39.800
unfortunately obvious, that the only way you would be able to get rid of whatever access Russia has had
00:13:46.540
for apparently a long time, the only way you'd be able to get rid of it is to replace all of your
00:13:51.020
software. All of it. Because the, uh, the allegation, which is probably pretty reasonable
00:13:59.220
is that once they had God access to all of the systems, they could embed, you know, viruses in
00:14:05.700
different places to be activated under different situations or open up different doors, et cetera.
00:14:11.140
So it wouldn't matter how, uh, how good you were at finding a problem because they would just
00:14:18.060
open up a new door as quickly as you found it. So you pretty much have to get rid of all of it.
00:14:25.060
I wonder if we have the technology to do that. Let me give you, uh, let me flesh this out a little
00:14:33.300
bit, flesh it out or flush it out a little bit. I always get those confused. Flesh it out, you flesh
00:14:39.260
it out, right? That's, that's the saying. So the idea is this, could you write a, uh, software
00:14:46.260
application that's main purpose is to remove all of the software in the company and replace it with
00:14:54.700
the clean version of the same software? So in other words, could you write, um, some kind of a master
00:15:01.580
God program that would take every piece of software at IBM, delete it. And I don't know how hard you have
00:15:09.600
to delete it. Maybe there's like, you have to extra delete, bleach it or something. Just get rid of all
00:15:15.200
of it and reload the same fresh, you know, thing. So you keep your databases. So none of your data would
00:15:23.600
be directly affected. So I don't think there's a problem with data. I guess I'm not that technical
00:15:29.920
that I can answer that question. Would we have any issue with a, just a raw database? I don't know if
00:15:34.700
that can hold a virus, but if you get every, get rid of everything, just wipe everything that has any
00:15:41.440
software element to it in your system. Could you write one giant program that just rolled through
00:15:48.720
a Fortune 500 company, took it down for an hour, it's just down for an hour, but an hour later it has
00:15:56.640
reloaded all of its software, rebooted in the right order and brought everything back up. Could you do it?
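The reboot-in-the-right-order step is the one part of this idea you can sketch concretely: it's a dependency-ordering problem. Here's a purely illustrative Python sketch; the service names and the dependency graph are hypothetical, and real remediation would be far more involved than this.

```python
# Illustrative sketch of one piece of the "wipe and reload" idea: restart
# services in dependency order after reinstalling clean software. All
# names here are hypothetical examples, not a real remediation tool.
from typing import Dict, List

def restart_order(deps: Dict[str, List[str]]) -> List[str]:
    """Return services ordered so each one's dependencies start first."""
    order: List[str] = []
    seen: set = set()

    def visit(svc: str) -> None:
        if svc in seen:
            return
        seen.add(svc)
        for dep in deps.get(svc, []):
            visit(dep)
        order.append(svc)

    for svc in deps:
        visit(svc)
    return order

# Hypothetical company stack: the app needs the database, which needs the OS.
deps = {"app": ["database"], "database": ["os_base"], "os_base": []}
print(restart_order(deps))  # -> ['os_base', 'database', 'app']
```

The real difficulty he's pointing at, of course, isn't the ordering; it's knowing what "clean" software even means once an attacker has had deep access.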
00:16:02.120
Is it, is that a thing? Let me give you a little bit of history. Do you remember when the year 2000
00:16:09.480
bug was coming? When the year 2000 bug was coming, all the experts said, we don't have enough
00:16:18.920
time. We're in real trouble because the companies are not taking it seriously. And that date is coming
00:16:26.160
when year 2000 bug will hit and all computers that were, you know, designed before a certain date
00:16:31.620
can't handle the year 2000 as a date and they'll all crash and the world will end. And as that
00:16:38.960
was approaching and we were getting close, we were actually in the year 2000 and we're getting, you
00:16:43.480
know, or no, we're getting close to the year 2000. Yeah. It gets closer and closer and closer.
00:16:49.360
I was saying in public, we're fine. We'll be fine. Now, here's the reason that all the experts said,
00:16:56.640
no, it can't be done. It's too much work. You know, if everybody worked on it full time,
00:17:00.800
you just couldn't get it done. We're, we're doomed. And I said exactly the opposite. I said,
00:17:05.320
no, we'll be fine. What happened? What happened is we were fine. Now, why is that? Well, exactly what
00:17:13.620
I predicted that it would take a long time to do it manually, but it wouldn't take a long time to
00:17:20.080
figure out how not to do it manually. In other words, it wouldn't take that long to write programs
00:17:26.340
that would do what the humans would have to do. That would take a long time. And what happened?
00:17:31.320
People wrote programs that looked for these bugs and corrected them. And then they ran the programs.
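Many of those automated fixes used a technique called "windowing": instead of rewriting every record by hand, a small rule maps each two-digit year to the right century. A minimal sketch, with an assumed pivot of 50 (real systems chose pivots per data set):

```python
# Minimal sketch of date "windowing," a common automated Y2K fix: map a
# two-digit year to a full year using a fixed pivot. The pivot of 50 is
# an assumption for illustration only.
PIVOT = 50

def window_year(yy: int) -> int:
    """Expand a two-digit year: 00-49 -> 2000s, 50-99 -> 1900s."""
    return 2000 + yy if yy < PIVOT else 1900 + yy

print(window_year(0), window_year(99))  # -> 2000 1999
```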
00:17:37.800
The bugs were corrected. The year 2000 came. Bam, we're fine. So until you could imagine that it was
00:17:46.940
possible to write software that would fix, you know, universally go out there and find and fix all
00:17:52.680
the bugs that you didn't even know where they were, uh, you thought you were doomed. I think we might
00:17:58.540
find something similar here. Maybe a whole industry pops up, maybe a consulting industry with
00:18:05.760
some kind of technical background. And I would guess that we will probably birth an industry
00:18:12.560
because of this, because of the hacks: firms, uh, that go in and shut down your whole network and wipe it
00:18:19.400
and then reload it. And, you know, they control the process. So nothing gets out of control
00:18:23.700
and, and they just go rescue one company at a time. I feel like that's going to be an industry
00:18:29.900
really soon and should be. And I think that'll be the only answer, because, uh, we should assume
00:18:36.940
that, uh, people will get so far into our systems again, that we'll just be right back in this
00:18:43.600
situation. Right? So even if we found every bug and got rid of it, it would just reproduce. I mean,
00:18:50.660
you know, even if we got them all, which is impossible probably, but even if we did,
00:18:55.540
they would just hack back in and, you know, they'd find another way in. So you need
00:19:00.260
some way to wipe all of your software every now and then, every bit of it. And that, I think that'll
00:19:05.860
become an industry. Um, here's the funniest thing that's happened, uh, lately. If you don't follow
00:19:13.940
this Twitter account, you really should. It's a parody account. Uh, and the name on the account is,
00:19:21.240
uh, Titania McGrath. And there's a little photo of a, uh, youngish blonde woman with sort of glasses
00:19:29.680
like mine. And what's brilliant about it, besides the fact that it's brilliantly written,
00:19:35.860
is that it's a parody account that is so close to reality, people are often fooled, which is part of
00:19:43.680
the joke. So a lot of people who see this account for the first time really can't tell. And, and
00:19:50.900
something happened that, uh, a thread that, uh, Titania or whoever runs the account tweeted
00:19:59.640
that is just as amazing. And here's why. I've been telling you for a while that parody and reality
00:20:06.900
have merged such that there's not really that much difference between a wild parody and what
00:20:14.000
you're actually observing. You want to see an example of that? Titania in her thread gives you
00:20:21.520
several examples of predictions that she, or whoever runs the account made that were pure parody
00:20:29.240
that have already happened. In other words, the parody came before the reality, but listen to this
00:20:36.660
list. It's fricking mind blowing. All right. You ready? So these are the claims in Titania's thread.
00:20:42.640
Um, she said, in December 2018, I called for biological sex to be removed from birth
00:20:51.180
certificates. Now that was parody. We're going to take your biological sex off of your, uh, birth
00:20:58.260
certificate. That was in 2018. In 2020, the New England Journal of Medicine concurred. So the New
00:21:06.000
England Journal of Medicine is now recommending, in 2020, what Titania said as purely a joke
00:21:12.840
in 2018, purely a joke. Is that the only one? Well, I mean, if this had only happened once,
00:21:20.200
if it only happened once, you'd say, oh, that's a funny coincidence, right? If it only happened once.
00:21:27.420
So, also in 2018, Titania criticized Julie Andrews, who played Mary Poppins in the movie,
00:21:34.900
for having chimney soot on her face. Cause you know, that was in the middle of the blackface
00:21:41.280
stuff. So as purely a joke, she tweeted that and criticized Julie Andrews for having chimney
00:21:49.380
soot on her face in the movie. That was in 2018. In 2019, the New York Times concurred.
00:21:56.380
So the New York Times basically took on Julie Andrews for having soot on her face, as blackface.
00:22:04.560
It was literally a joke two years before it became real. Is that the only one? Oh no,
00:22:11.340
I'm not even close to being done. So the thing that's funny is not the individual examples because
00:22:17.000
they're, they're sort of trivial. What's funny is how often it happened. It's the often part that
00:22:22.420
makes the joke. All right, here's another one. Uh, in March 2019, Titania published a book
00:22:28.820
called Woke in which I argued that skyscrapers are oppressive phallic symbols. In July, 2020,
00:22:38.440
The Guardian concurred. So in 2019, literally joking, she said skyscrapers are, you know, some oppressive
00:22:49.740
phallic symbols. And then The Guardian writes a serious article one year later saying exactly that.
00:22:57.580
Uh, in that same book in 2019, here's, here's the, like the finisher. Uh, she goes, in 2019,
00:23:04.560
in her book, also the same book, Woke, I called out, uh, called out Helen Keller for her white
00:23:10.760
privilege. Time magazine just did that in reality.
00:23:19.740
Now this is just a sample. The, the actual thread is longer. I just picked out some of the fun ones,
00:23:26.160
but when you see how many times parody and reality overlapped, it's, it changes you. I mean, this is,
00:23:34.440
this is one of those things where, you know, I've, I've predicted this. I predicted it often and in
00:23:41.640
public that parody and reality were on the way to merging and then to watch it in real time,
00:23:47.800
it actually merged. Did that sound like a real prediction when I said it? The first time you
00:23:53.800
heard me say parody and reality are merging, that didn't sound exactly technically real, right?
00:24:01.800
It sounded more like a humorous hyperbole. No, I meant it and it happened.
00:24:10.500
So there you go. All right. Speaking of predictions, one of my other predictions is that
00:24:18.040
history would get complicated because we would no longer have one of them. We would have more than
00:24:24.060
one history. And that if you went to school, it might be a problem because you're trying to learn
00:24:29.760
history and there are two of them and they're different. Which one do you believe? Did you think
00:24:36.980
that that was going to happen? My prediction that there would be two histories? Well, here we are.
00:24:43.620
President Trump, he unveiled his choices for the president's advisory 1776 commission.
00:24:52.180
So this will be a commission to make recommendations about how to push against the 1619 project that is
00:25:00.220
already in schools. So the president has literally created a commission to create an alternate history
00:25:08.060
to compete with the history that's already being taught in the schools. Two histories.
00:25:16.020
Literally being taught in schools. Now, when I said to you, we've got a problem here because we have
00:25:23.080
two histories like we've never had before. Did that sound real to you? The first time you heard that,
00:25:29.300
it's like, no, we'll still agree on one history. Nope. Literally, we're teaching two histories if this
00:25:37.800
commission goes forward. I don't know how much time they have before Biden scraps it, so I can't
00:25:43.360
imagine they get much done. But there you are, two histories. All right. The most interesting claim
00:25:53.340
about election fraud that I've seen comes from Kanekoa the Great. It's a Twitter account. So I think
00:26:03.340
maybe Kanekoa, K-A-N-E-K-O-A, the Great. All one word. Kanekoa the Great. A good follow. He's got lots
00:26:13.000
of stuff. I don't know who he is, but he's got lots of good content. So he made a Twitter thread,
00:26:17.540
which I'll talk about. But before I talk about it, do you remember the golden rule of all election
00:26:25.460
fraud claims? The golden rule, well, it's not golden, but it's a rule. 95% of all the election
00:26:33.760
claims you hear are fake or not real or mistaken or out of context or something. But I also believe
00:26:42.660
there's a 100% chance the election was stolen because it was easy and people had the motive
00:26:47.060
to do it. So of course it's stolen. It's always stolen under those conditions. But any specific
00:26:53.260
claim you hear, probably BS. Now, I'm going to give you a specific claim after telling you that
00:27:00.320
every specific claim is probably BS. I'm going to apply the same standard to this one. Now, this one
00:27:09.000
sounds really good. Okay? So I'm going to give you an argument here that on paper, you know, on paper,
00:27:18.700
it's really, really strong. But is it true? I don't know. I would just apply my standard to it. 95%
00:27:27.740
chance it's not. But here it is. So there is a working professional statistician, somebody who is
00:27:35.200
very capable and experienced and in the sweet spot of his career. So somebody who really, really,
00:27:42.480
really understands statistics. So this is the source. And there's a video from the statistician
00:27:49.420
explaining what he did. So the first thing you need to know is that the person making the claim
00:27:54.280
is very qualified. Right? Now, that doesn't mean it's true. Right? Because we'll talk about experts
00:28:01.460
and when you should trust them. But just know that he's very qualified. And what he did was he was just
00:28:06.520
sort of messing around with a lot of the data. You know, he explained it as almost a hobby,
00:28:11.360
something that statisticians like. It was like, oh, I wonder if there's a correlation between this
00:28:16.140
thing and that thing. And he discovered, somewhat just by poking around, a correlation that almost is
00:28:25.200
impossible to be natural. Meaning it's a signal for fraud with something like a greater than 99% chance
00:28:33.280
that it is really fraud and not some fake signal. Now, does that mean it's true? No. Remember,
00:28:41.860
we're only dealing with claims that you and I can't check. I don't have the skill to check it. I don't
00:28:48.100
know where the data was. I can't really check it. So no matter how credible this sounds, just keep this
00:28:54.880
little tape playing in the back of your head. No matter how credible it sounds, there's a 95% chance
00:29:01.360
it's not real. Okay? Just keep that playing in the back of your head. And here's what he found.
00:29:08.700
If you took the 3,000 U.S. counties, I always wondered how many counties there were, which is
00:29:13.440
weird. I was wondering that exact question. There are over 3,000 counties. Now, counties have a lot in
00:29:19.900
common, right? There could be a lot of diversity within the county, but you can make some claims
00:29:26.480
about their consistency over time. And the statistician started out by predicting who would
00:29:34.640
win each county based on a number of demographic variables. So he would say how many Democrats are in
00:29:41.240
the county? What's their age? A bunch of stuff. And he found that he could predict with 90% accuracy
00:29:48.140
who the county would go for based on their demographics. And you could apply it retrospectively
00:29:54.740
to other elections, and I guess it works. So it's about 90% good at knowing in advance who would win.
00:30:00.760
And then he looked at who actually won. And he found, eventually, he poked around and found this
00:30:08.380
strange data oddity. That there were lots of counties that did better than his model would
00:30:17.360
predict. And there were lots of counties that did worse than his model would predict. And that's
00:30:23.020
quite natural. So if you've got 3,000 data points, they're going to be spread around. But his point was
00:30:30.380
you could draw a line through the middle, and that would be his prediction. And the differences would
00:30:36.080
just be sort of equally on both sides of the line. So if there was, if he was off, it was just as
00:30:41.800
likely he was off, you know, in one way versus the other. So there'd be just as much below the line,
00:30:46.680
above the line. And then he found that in those counties that used Dominion voting systems and
00:30:54.480
one other kind, I think Hart, Hart or something? It was another company, Hart, H-A-R-T. So there,
00:31:02.100
I guess there were maybe six or so different machines in different counties and different
00:31:06.720
ways to count. But in those counties that had Dominion or Hart systems, there were consistently
00:31:13.940
over 5% more votes than would be expected for Biden. Now, here's the interesting part.
00:31:23.800
The correlation holds in Trump counties. So counties that Trump won, Biden did 5% better.
00:31:32.100
In counties that you knew that Biden was going to win because they always go Democrat, also a little
00:31:38.180
bit more than 5% better. So the amount that the counties with Dominion and Hart counting machines were off
00:31:49.020
was consistent, meaning that there was a gigantic difference. Let me see if I can say this simply
00:32:00.220
because I'm botching this. If you looked at what you expected these counties to do based on their
00:32:06.580
demographics and past behavior, et cetera, the ones that had Dominion and Hart machines were way,
00:32:14.780
way, I think 73% of them had a Biden advantage that was very similar. Now, here's the thing: the
00:32:23.280
oddities went in both directions equally everywhere else, but only where you have the
00:32:34.860
Dominion or Hart machines, you didn't have an even distribution. It's the only time. And it's very
00:32:40.960
consistent. And according to the statistician, not according to me, that the odds of any of that being
00:32:46.820
anything but fraud are vanishingly small. You know, you could say it might be something else
00:32:54.780
in the, you know, extreme, you know, it was alien invasion or something, but not really, not really.
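You can see the shape of this argument on synthetic data. The sketch below is not the statistician's actual analysis and does not use the real county data; it just shows how a uniform shift in one vendor's counties would stand out against otherwise symmetric residuals. All numbers, including the demographic predictor and the 5-point shift, are invented for illustration.

```python
import random
import statistics

# Synthetic illustration of the residual argument described above, NOT the
# real analysis: predict each county from a demographic score, then compare
# (observed - predicted) residuals between "vendor X" counties and the rest.
random.seed(42)

counties = []
for i in range(3000):
    predicted = random.uniform(0.2, 0.8)   # model's demographic prediction
    noise = random.gauss(0, 0.03)          # natural scatter around the model
    vendor_x = (i % 4 == 0)                # ~25% of counties use "vendor X"
    shift = 0.05 if vendor_x else 0.0      # injected 5-point anomaly
    observed = predicted + noise + shift
    counties.append((predicted, observed, vendor_x))

def mean_residual(group):
    """Average observed-minus-predicted vote share for a group of counties."""
    return statistics.mean(obs - pred for pred, obs, _ in group)

vendor_counties = [c for c in counties if c[2]]
other_counties = [c for c in counties if not c[2]]

print(f"vendor X counties:  {mean_residual(vendor_counties):+.3f}")
print(f"all other counties: {mean_residual(other_counties):+.3f}")
```

In this toy version the vendor-X residuals cluster around +0.05 while everyone else's center on zero; the claim in the video is that something with this shape appeared in the real data, which neither you nor I can verify from here.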
00:33:02.720
Now, this is very different from the quadrillions argument. The quadrillions argument was debunked,
00:33:08.980
right? So the quadrillions argument is that if there's, let's say, a bellwether place that always
00:33:17.380
went to Republicans and this is the only time it didn't, you know, that that's a signal. And then
00:33:23.720
there's this other signal and this other signal. That is not good analysis because all it would take
00:33:29.780
is one big effect that could affect all of those things, right? So that's not a guaranteed signal of
00:33:37.520
fraud. But this is really specific because you can trace it all the way down to this specific
00:33:44.300
vendor. And if you can trace the difference down to specific vendors, that's a much stronger case,
00:33:52.240
I think. Now, I don't know if Andreas Beckhaus is watching this video, but if you are,
00:33:59.360
he'd be the best debunker of things that I say. So debunk me on Twitter if I've missed
00:34:07.260
anything obvious. So I took this and I sent the link to my Democrat friend that I always mention,
00:34:16.200
that my anti-Trump Democrat friend, who has the qualities of being very smart and well-informed
00:34:21.940
and yet appears to act crazy. He's completely rational, in fact, one of the most rational people I know,
00:34:31.360
in all other domains. He's just like this really rational guy. It practically defines who he is.
00:34:37.420
He's so rational. And I sent him this, sent him the statistical analysis. And as luck would have it,
00:34:46.240
he's also good at statistics. So one of his talent stacks is statistics. So I sent him a statistical
00:34:53.860
argument to a guy who really knows statistics. No, it's not Sam Harris. It's a personal friend,
00:35:00.320
nobody you know. And here's why I did it. My friend says, and has been saying, that there's
00:35:08.360
no evidence that there was anything fraudulent. So that's his view. No evidence. So I sent him
00:35:16.380
this evidence. But the evidence has a special quality to it: no matter how much you know
00:35:24.260
about statistics, you can't really just look at it and know if it's right. Right? You can't tell.
00:35:31.500
So this expert is making an argument that, unless you really dug into his work,
00:35:37.760
you can't tell if it's real. So I did this intentionally, not as here I've proven my case,
00:35:44.720
because I don't think anything like that's happened. Remember, 95% of all evidence is fake.
00:35:49.280
This is no different. So I didn't think it was a kill shot. But the reason I showed it to him
00:35:54.860
is because I knew he wouldn't know it wasn't. And he wouldn't know if it was because it can't be known.
00:36:00.680
It's just too hard to know it based on what we have available to us. And I wondered if he would
00:36:05.940
reject it. Or would he say, okay, this does not prove fraud. And I would agree with that.
00:36:12.940
But it certainly tells us we should look into it. So that's what I was looking for. I was looking
00:36:19.840
for a rational response that says, you know, Scott, I know a lot about statistics too. But I don't have
00:36:27.480
access to all the data he has. If he did this analysis right, it would be very meaningful.
00:36:32.920
And it does look like he's capable of doing the analysis. If it's right, this is something
00:36:39.200
that would be important. It should be looked into. So that's what a reasonable person would say,
00:36:43.720
right? Do you think he said that? Nope. Perfectly reasonable to say it didn't prove anything. And I
00:36:52.200
agree. Here's what he said. You can find any correlation in lots of data. Now, this is what
00:36:59.920
I would call the Bible code theory. The Bible code was a debunked idea: if you looked in
00:37:09.080
the Bible, and you did various schemes to find secret messages, you would find all these messages,
00:37:16.760
such as, I'll just pick one, this is random, not a real one. If you took the second letter of the
00:37:21.480
first sentence, but you took the third letter of the next sentence, and then the fourth letter of the
00:37:26.160
next sentence. So it was all these little algorithms that would run against the Bible,
00:37:30.160
and it would spit out things that you didn't think could possibly be, you know, natural. So it'd be
00:37:37.640
like little predictions, and you'd say, yes, look, it's like a full sentence prediction, and it actually
00:37:44.360
happened. So there was a time when people thought the Bible had these secret codes. That was debunked
00:37:50.140
by some scientists who took the same algorithms and ran them against any big book.
00:37:56.160
Like War and Peace. Turns out War and Peace is full of secret messages and predictions that
00:38:02.020
actually came true. Because it turns out that if you've got something as complicated as a big book
00:38:07.620
filled with letters, you can find some algorithm that will produce full sentences, just by trial and
00:38:15.660
error, and they will look like predictions that happened. So it can work with any book. It's obviously not
00:38:20.800
the Bible code. So his argument was that this statistician had basically fallen for the Bible
00:38:27.340
code error. Does that sound like a good response to, here's a video by a hugely qualified statistician?
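The failure mode behind the Bible code is the multiple-comparisons problem: search enough pure noise with enough different schemes and something that looks like a strong signal turns up by chance. A minimal sketch, with arbitrary numbers:

```python
import random

random.seed(0)

def corr(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

samples, trials = 50, 1000
outcome = [random.gauss(0, 1) for _ in range(samples)]

# Try 1,000 purely random "predictors" against one random "outcome"
# and keep the strongest correlation found.
best = max(
    abs(corr([random.gauss(0, 1) for _ in range(samples)], outcome))
    for _ in range(trials)
)
print(f"best |r| found in pure noise: {best:.2f}")
```

The best of a thousand tries looks impressive on its own; the honest move is to decide the test before looking, or to correct the significance threshold for the number of comparisons.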
00:38:37.460
Are there any hugely qualified statisticians who don't know about the Bible code? There are none.
00:38:47.680
There are none. That's not a thing. There's no such thing as a professional, a professional
00:38:54.100
statistician who's never heard of this problem with the Bible code. That's not a thing. Obviously,
00:39:01.320
the statistician was aware that that's one of the risks that you have to guard against.
00:39:07.340
So I feel as if this is a pretty clean example of cognitive dissonance. The reasonable reaction
00:39:14.440
would have been, I can't evaluate this, but if it's right, it's meaningful. Right? Is that not the
00:39:21.680
only reasonable response to something you can't analyze but looks important and an expert did it?
00:39:27.280
All right. So the other thing my Democrat friend said as a response is that the courts have rejected
00:39:42.340
all of the evidence that was presented. It's just mind-boggling. So my Democrat friend, because the news
00:39:52.160
is so fake, he believes that courts have looked at evidence of fraud. That never happened. He actually
00:39:59.960
thinks that happened. It didn't happen. Apparently, he was unaware that the cases are being thrown out for
00:40:05.420
technicalities without actually looking at the claims. You know, it's about standing and doctrine of
00:40:11.420
laches and, you know, whether or not you can bring the case and, you know, who's got jurisdiction.
00:40:18.300
It's all that stuff. But as far as I know, the claims per se have not been judged in any court of
00:40:26.800
law. Have the witnesses who make direct claims of observing fraud had their day
00:40:34.740
in court? No, right? So here's a well-informed, really well-informed guy, but his information comes
00:40:44.660
from the left and actually thinks an alternate history of the United States is happening right
00:40:51.080
now. He believes there's an alternate history happening in parallel with the one you're experiencing
00:40:55.940
in which those claims are being debunked by courts. Nothing like that's happened.
00:41:01.040
Nothing even close to that has happened. They've never even looked at it. Beyond that,
00:41:07.760
would it make any difference that other claims were debunked?
00:41:13.980
Does it matter how many people are found innocent of a crime?
00:41:19.500
Let's say this. Let's say three people were accused of a crime and you found out they didn't do it.
00:41:24.840
Does that tell you whether the fourth person, who was accused of an unrelated crime, did it?
00:41:30.180
It doesn't really work that way, right? Anyway.
00:41:38.320
Trump signed some legislation that would kick Chinese companies off of the U.S. stock exchanges
00:41:43.940
unless those companies allow financial audits that are required for, you know, American companies that
00:41:52.120
are on those exchanges. To which I say, why did this take so long? Are you telling me that there are
00:42:00.160
Chinese companies on American exchanges who simply decide not to abide by the very, very, very
00:42:07.000
important rules of transparency that all American companies abide by or else they get penalized
00:42:12.840
greatly? Are you telling me that China can just be on our stock exchange and ignore all the rules that
00:42:19.500
apply, even the important ones, not just the trivial ones? Like, the most important one is you've got to
00:42:25.680
have some transparency. That's like right at the top. That's not a detail, right? And so Trump signs this
00:42:33.480
legislation that will kick them off if they don't allow these audits. And I'm thinking, why did that take so
00:42:40.020
long? You know, and do you think this would have happened under Biden? Do you? Because I feel like
00:42:47.840
probably it wouldn't. It's going to be fun watching Trump try to get things done, you know, between now
00:42:54.600
and Inauguration Day. We'll see how much trouble he can cause. All right, let me teach you when to
00:42:59.860
disagree with the experts because, of course, we all do it. But there is a good way to do it and a bad
00:43:04.960
way to do it. And this will be your important lesson of the day. You ready? Okay. Here's what you
00:43:11.220
should not do. Do not disagree with experts and then cite as your reason for disagreeing with the
00:43:18.960
experts a fact which all the experts know. Okay, I just gave you an example that all the experts in
00:43:27.380
statistics know about the Bible code. So stating that as the reason for your argument doesn't make
00:43:34.100
any sense. Because the expert knows that. Here's some more examples of that. I've heard the argument
00:43:43.700
that CO2 can't be causing a climate crisis because CO2 used to be much higher in the past. You've heard
00:43:54.000
this argument, right? People say climate change isn't real because CO2 used to be way, way higher in the
00:43:59.480
past. And there was no civilization back then. If there were no humans, and CO2 was way higher in
00:44:06.300
the past, and things seem to be fine. So it's all hoax, right? Here's the problem with that.
00:44:14.140
Every climate scientist knows CO2 was higher in the past. Do you see where I'm going? All the experts
00:44:23.220
who say climate change is a problem. They know what you know, that CO2 was much higher in the past.
00:44:30.960
That's not a reason to argue against them. What that proves is that
00:44:37.860
you don't understand their argument basically. Now, I believe that I read once that CO2 was higher in
00:44:45.300
the distant history of the earth, but it was the same time that I believe the sun was less strong,
00:44:52.300
so there was some countering force that is easy to demonstrate and well known. So in general,
00:45:00.180
if you're disagreeing with experts, but you're using as your basis for disagreement a fact that
00:45:06.460
every one of those experts knows, you're almost certainly not making a good argument. You can be
00:45:12.240
right, because experts sometimes are wrong and you don't know why. So you can be right by accident.
00:45:17.080
But you should check yourself and say, wait a minute, my argument is based on one fact
00:45:23.440
that the other people already know. There's got to be some other argument or that's nothing.
00:45:29.360
All right, here's another example. When I predicted that Trump would win in 2016,
00:45:37.200
I was going against all the experts and all the pollsters. Was that smart? Was it smart for
00:45:45.640
me to disagree with the experts when I was using their same data? Because they knew what the polls
00:45:53.520
were. We're all looking at the same data, right? So I should not have disagreed with the experts,
00:45:59.240
wouldn't you say, if all I were using was the same data they were using? Because they would know
00:46:05.120
more than I do, plus they know the same data I know. Except here's what's different. I was not
00:46:12.060
using their same data. I was using my expertise, which is different from theirs. My expertise was
00:46:19.900
persuasion. And as a trained persuader, and other trained persuaders saw at the same time I did,
00:46:26.540
they said, whoa, whoa, whoa, this isn't like the past. We've never had this skill set running for
00:46:33.520
president. And you guys don't see it coming. But I'm kind of an expert in this persuasion stuff.
00:46:38.960
And I do see it coming just like a train. Like I can see it. I can see it coming. Right? So if you
00:46:45.580
disagree with the experts, because you're bringing knowledge that they don't have, or expertise that
00:46:52.080
they don't have, that might be a reasonable disagreement. Again, doesn't mean you're right and
00:46:57.320
they're wrong. Could go either way. But at least you're being reasonable. That would be a reasonable
00:47:02.740
way to disagree with an expert, because you're bringing something new that they don't have.
00:47:09.100
But if you're only bringing the stuff they already know, I think I'd lean toward the experts, not you,
00:47:16.300
in that case. I had one other expertise in the case of calling Trump's 2016 victory, which is that
00:47:24.580
I know a lot about my white males. As a white male of a certain age, I kind of have a little more insight
00:47:33.020
into white males of a certain age. And I know what they're willing to say out loud in public. And I know
00:47:39.180
what they privately think. And it's a little bit different. So I'm not sure that all the experts had maybe
00:47:46.160
the same, you know, experience with this group of people who ended up being influential in the final
00:47:51.560
final outcome. So whenever you think you have some extra insight or expertise or data, then maybe
00:47:59.500
disagreeing with an expert makes some sense. Here's another one. I've disagreed with climate change experts
00:48:08.440
about their projections of how bad things will be in 50 to 80 years. Does that make sense? I'm not a climate
00:48:16.940
change expert, right? So if I'm disagreeing with the experts, aren't I being irrational? Because there's
00:48:24.100
no fact that I know about climate change, and this is true, there's no fact I know about climate change
00:48:30.540
that they don't know. So they know all the facts that I have, plus lots more. Would it be reasonable for
00:48:38.720
me to disagree with them when they know everything I know, plus the scientific method has backed them
00:48:46.300
up, they say, plus the majority of experts are on the same side, plus they know way more than I do?
00:48:54.160
Is that reasonable for me to disagree in that case? Well, if that's all the variables that were involved,
00:49:00.340
the answer is no. If there were no other variables, it wouldn't really be reasonable for me to disagree.
00:49:05.220
I don't have anything to add to it. But when you're predicting what's going to happen financially,
00:49:11.600
you're now in my ballpark. Because I worked as a person who made financial predictions for
00:49:18.480
big corporations, did it for years, and I have expertise in it. So when I'm criticizing climate
00:49:25.580
change, the part I don't criticize is the science part. The science part is that if you add CO2 to the
00:49:33.260
atmosphere, no matter how it gets there, human or other, no matter how it gets there, all things
00:49:39.000
being equal, would that warm up the earth? Probably. I'm not disagreeing with experts, because I don't
00:49:45.620
have any extra data. What extra data do I have? What extra science do I have? None. So when they
00:49:53.540
make a claim that CO2 should warm the atmosphere, all things being equal, I say, I don't have anything to
00:50:00.060
add to that. I'm not going to doubt it. And I'm not going to confirm it. I'm just going to say,
00:50:06.180
well, you're experts. You know, I don't know. But when you get to the second part, which is they make
00:50:11.260
a financial, not a scientific, but a financial estimate of what it's going to do with the world
00:50:18.180
economy, you're in my expertise. So if I criticize you from my expertise, and you're a scientist,
00:50:26.860
you should listen to me. Literally. If a scientist tells you to believe a financial estimate,
00:50:38.060
or a financial prediction, and a financial expert who makes, or has made, these predictions for a living,
00:50:45.520
says no, who are you going to believe? The person who knows the most about financial predictions,
00:50:51.420
or a scientist? Because scientists are not financial predictors. So when I disagree with
00:50:57.760
climate change, I'm not disagreeing with scientists on science. I'm disagreeing with scientists on my
00:51:04.880
expertise, not theirs, my expertise. I have another expertise too, which is, again, persuasion. And so I
00:51:12.700
have a theory of why maybe scientists could be, you know, fooled or biased or subject to confirmation
00:51:19.180
bias, at least on the financial part, financial predictions. And it is, you know, everything that
00:51:25.480
you already know, which is that there's groupthink, and there would be a penalty for going against
00:51:30.080
the grain. So if you're going to disagree with the experts, at least have a theory of why they're wrong.
00:51:37.320
So sometimes my theory is that there are just too many penalties for
00:51:43.200
going against the grain, so it's cognitive dissonance or confirmation bias,
00:51:47.780
or that I have extra facts or extra information. All right, here's another example of the same thing.
00:51:56.040
I see this argument all the time about wearing masks, and it goes like this. This was tweeted at me
00:52:02.600
today: that we know masks can't work
00:52:11.260
because the tiny holes in the mask are way bigger than the even tinier virus. Now, most of you have
00:52:21.120
probably made this argument, right? How many of you have made this argument? Scott, Scott, Scott,
00:52:26.120
the scientists have looked, they've seen that the masks have these big holes on them when you go
00:52:31.400
microscopic, and the hole is this big. The virus is the size of a pea. You know, let's say, relatively
00:52:39.620
speaking, the virus is the size of a pea. The holes, the tiny holes in the mask would be the size of,
00:52:45.800
let's say, a basketball hoop. How does a basketball hoop stop something that's the size of a pea?
00:52:54.920
Right? Have you made that argument yourself? Raise your hand if you've ever made that argument.
00:53:00.100
If you did, you would know you're disagreeing with the majority of experts who say the masks work.
00:53:07.160
Now, here's my test. Do you think that the experts who say the masks do work,
00:53:17.320
do you think they're unaware of your brilliant pea-through-a-basketball-hoop analogy?
00:53:24.960
Because if they're aware of that fact, then you're adding nothing to it. It's a fact they know. It's a fact you know.
00:53:34.740
That the size of the holes in the masks is so big, and the virus itself is smaller.
00:53:41.400
Do you think they don't know that? The experts? I'm pretty sure all of the experts know that.
00:53:47.300
But they know everything you know, which is, let's say, that fact, and more.
00:53:53.960
So they know everything you know. They know that big hole and little virus exists.
00:54:00.020
But the other thing they know is that it's been tested in a variety of real-world situations.
00:54:06.760
And in the real world, the evidence is very strong that masks work.
00:54:13.360
So they're aware that when you test it in a laboratory, you can come up with a very good reason why maybe it wouldn't.
00:54:22.040
But it might have to do with the water droplets being bigger than the virus itself,
00:54:26.660
so maybe they don't get through the basketball hoop.
00:54:29.380
Could be that it changes the direction or the viral load.
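One way to see why the hole-size argument falls short: the virus mostly travels inside much larger droplets, and a particle gets multiple chances to be captured as air winds through stacked fiber layers. A toy model, with capture rates that are assumptions for illustration rather than measured values:

```python
def penetration(per_layer_capture, layers):
    """Fraction of particles that slip past every layer, assuming each
    layer independently captures a fixed fraction of what reaches it."""
    return (1 - per_layer_capture) ** layers

# Illustrative only: even a modest 40% capture per layer compounds
# quickly across a three-layer mask.
for layers in (1, 2, 3):
    print(f"{layers} layer(s): {penetration(0.40, layers):.0%} get through")
```

The point of the sketch is just that a filter doesn't need holes smaller than the particle to remove most of what passes through it repeatedly.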
00:54:36.620
But the point is, if the experts know everything you know about the size of that virus,
00:54:46.740
You shouldn't take away from it that the experts are lying if they know more than you do.
00:54:55.180
Now, how often have I disagreed with the experts?
00:54:59.320
And let me see if I'm consistent with my own rules of knowing when to agree or disagree with experts.
00:55:06.300
When the question came up of closing travel from China,
00:55:11.120
the virology experts said in the very early days,
00:55:17.920
And I disagreed with vehement, cursing public statements
00:55:24.040
that we should close the travel from China immediately.
00:55:33.980
that we should have closed China travel as early as I said,
00:55:37.820
which was well before Trump did it, which I think was a week later.
00:55:47.180
let's examine if I was right for the right reason.
00:55:50.440
Okay, so what was it that I added to this, to the experts?
00:55:56.120
If the experts say it's not a risk, why should I say it?
00:56:01.920
You know, what do I know that the experts don't know?
00:56:12.040
Because if you studied economics and if you have an MBA
00:56:19.680
I would consider myself not like a world expert in risk management,
00:56:42.660
I said, I don't think they understand risk management
00:56:54.360
We knew that it would be expensive to close travel,
00:57:13.200
And in fact, who was it who famously, you know,
00:57:19.940
Would you say that Trump is also an epidemiologist?
00:57:37.680
you are somebody who's been making risk management decisions
00:57:44.100
Trump is very, very experienced at risk management.
00:57:55.600
When the experts were first saying that masks don't work
00:58:23.300
and somebody who's worked in business for a long time,
00:58:38.220
that since we were talking about a shortage of masks,
00:59:31.400
and really, really quickly on hydroxychloroquine