Real Coffee with Scott Adams - December 19, 2020


Episode 1224 Scott Adams: I Teach You When to Disagree With the Experts Because That is an Essential Skill


Episode Stats

Length

1 hour and 16 minutes

Words per Minute

152.6

Word Count

11,727

Sentence Count

790

Misogynist Sentences

7

Hate Speech Sentences

23


Summary

In this episode of Coffee with Scott Adams and a Simultaneous Sip, Scott Adams talks about the newly approved Moderna vaccine, the controversial question of whether older people or frontline health care workers should be vaccinated first, and his argument that no vaccine distribution plan can avoid a racially uneven outcome.


Transcript

00:00:00.420 You're right on time. Well, at least some of you are. The rest of you, I call you laggards.
00:00:06.800 Laggards, yeah. It's time to up your game and be here exactly at 7 a.m. California time,
00:00:13.880 10 a.m. Eastern Time for the best part of your day. It's called Coffee with Scott Adams
00:00:20.080 and a Simultaneous Sip. It's amazing. Have you experienced it yet? Well, if it's your first day,
00:00:26.960 hold on to your hat. It's that good. Oh, it might sneak up on you, but it's that good.
00:00:34.260 And all you need to enjoy it is a cup or mug or a glass, a tank or chalice or stein,
00:00:40.440 a canteen jug or flask, a vessel of any kind. Fill it with your favorite liquid. I like coffee.
00:00:47.680 And join me now for the unparalleled pleasure of the dopamine hit of the day,
00:00:50.640 the thing that makes everything better, including the Moderna vaccine. Go.
00:00:56.960 Well, speaking of the Moderna vaccine, I guess that got approved, so we got a second vaccine.
00:01:07.580 And the interesting thing is,
00:01:09.840 how do you decide which one to take? Because I'm starting to hear reports of, you know,
00:01:16.960 the Moderna one might have some advantages over the other one. What would you do if your health
00:01:23.100 care provider offered you, let's say, the other one, but you wanted the Moderna one? What would
00:01:30.220 you do? Would you wait? If the only one you could get is the other one, because maybe you're in a,
00:01:36.460 your HMO or something just has one of them? I don't know. It's going to be a, it's going to be an
00:01:41.540 interesting question. My advice to all of you will be the same. This is what I'm going to do.
00:01:47.740 In terms of my decision of what vaccination to take or not, I'm going to wait till the last
00:01:54.780 minute. And you should too. You don't need to make a decision until somebody says you can come in and
00:02:01.320 get it. Until somebody says that you personally can go get a vaccination, don't decide. Don't decide
00:02:10.740 because there might be extra information by then. So if you decide now, before you have to,
00:02:16.120 why would you do that? Wait until the last minute. You can be quite sure you're going to
00:02:20.800 take it or quite sure you're not, but don't decide yet. Wait until the last minute. That's the smartest
00:02:27.360 place, place to make the decision. Here's the most controversial story that I have completely
00:02:34.280 changed my opinion on. You saw the story about the ethicist who, I guess the New York Times had his
00:02:42.800 article. And the ethicist claimed that it might be better for society if, before older people get
00:02:51.660 the vaccination, the frontline health care workers get taken care of. And part of his argument,
00:02:59.360 which made the headlines, was that he thought that old people shouldn't get the vaccination first
00:03:06.120 because they're mostly white. And that frontline health care workers are more diverse. And so if
00:03:14.020 you favored the frontline workers, you would get a more diverse and more fair distribution.
00:03:20.580 And the way that was reported is, uh, racist. Oh, it's kind of racist. Really, really racist.
00:03:28.280 Is it? Yes. So let me agree with the first part of the criticism unambiguously. It's totally racist.
00:03:39.160 It's unambiguously, overtly, plainly, transparently racist. But here's the part you're not going to like.
00:03:51.440 You ready? But no matter what you do, it's racist. Sorry. Sorry. No matter what you do,
00:04:04.160 it's racist. There isn't the non-racist option. If we had a non-racist option, I would say, my God,
00:04:14.000 why are we even entertaining this idea from this clearly racist proposal? But that's not our
00:04:22.020 situation. It's not. And we can't get to a situation where there would be any kind of a
00:04:27.760 non-racist, you know, non-racist process. Here's why. No matter what rules you pick,
00:04:37.780 no matter what group you say, even if you don't use race, if you just say, well, we'll do people
00:04:46.000 over a certain age, well, mostly white, right? It wasn't your intention, but it would just turn
00:04:52.240 out that way. So things are racist by outcome, no matter what your intention was, right? So would you
00:05:01.540 agree with the first point that the outcome has a racial element to it, even if nobody was thinking
00:05:09.380 in those terms, even if racism was nothing to do with the decision, you'd all agree that there's
00:05:16.340 always an outcome that favors one group or not, no matter what you do. You can't avoid that. So let's
00:05:22.900 talk about intention. Because if the outcome is going to be racist, no matter what, you can't get rid of
00:05:28.420 the racism part. So why would you worry about the thing you can't change, right? That just can't be
00:05:33.760 changed. But you can question motive. That's always fair. You can question intention. If I thought that
00:05:42.980 someone had suggested in public that white people should not get the vaccination, you know, in an early
00:05:53.120 way, even if they're old, you know, what's your first impression of that? Sounds pretty bad, right? But let me ask you
00:06:01.780 this. So suppose somebody came to me, all right? Let's personalize this, take it out of the realm of public
00:06:09.500 policy. Take it down to you personally. Somebody comes to you and they say, Scott, you're over 60. You know, I'm 63.
00:06:19.500 And I'm in a, you know, I've got a little asthma. So I've got some comorbidities. I'm not old, old,
00:06:28.380 but I'm, you know, the beginning of the older category. And suppose they said to me, I'd like you
00:06:34.960 to make the decision, Scott. It's up to you. We know that, let's say, older black people have much
00:06:42.840 worse outcomes. Would you mind socially distancing a little bit longer? And we're going to focus on
00:06:51.080 black citizens, not because they're black, but because we know they have worse outcomes.
00:06:58.020 So if you're looking at the greater good, you want to give the vaccination to whoever gets the best
00:07:02.960 outcome, right? It just happens that they're black. That's not anybody's choice. It's nobody's intention.
00:07:10.120 It's just a biological reality. So if somebody said to me, Scott, would you personally,
00:07:18.580 you're not making a decision for anybody else, right? It's just you personally. It's not a public
00:07:23.880 policy. It doesn't apply to anybody else. It's just you. Would you personally socially distance a
00:07:30.520 little bit longer and take a little more risk for the benefit of black citizens in the United States
00:07:36.100 who are at greater risk? What would I say? I'd say yes. I'd say yes. If somebody asked me that
00:07:44.200 question directly and said, look, it's up to you. We're not, there's no penalty. You will not be
00:07:50.000 punished. You won't be punished. It's just up to you. It's your own conscience, your own risk, your own
00:07:56.960 risk reward calculation. You can be selfish if you want. It's up to you. If you want to get it first,
00:08:03.380 we'll put you right in front of the line and nobody will ever give you a hard time for it.
00:08:07.280 It's up to you. I think I'd still wait. I think I'd still wait because that's actually a pretty
00:08:13.800 fair proposition. If they can identify people for whatever reason, be they black or have a
00:08:21.880 comorbidity or be they a certain age or be they healthcare workers on the front line, if you can
00:08:27.660 make a strong case that this person is in a high risk group and I'm in a slightly less high risk
00:08:34.920 group. Yeah, I'm okay with that. Absolutely. Because, you know, it's a war, right? It's a war.
00:08:43.600 And sometimes you've got to be the one that does the dangerous stuff so that somebody else doesn't
00:08:48.860 have to do it. Sometimes you've got to, you know, brave the bullets to pull back your wounded
00:08:53.940 comrade off the field, right? So we're in a situation where personal sacrifice should be a
00:09:00.160 pretty big part of the equation. If you're not thinking of it that way, then you're not in a,
00:09:05.660 let's say, a military mindset. And maybe we should be. We're in a war against a virus. Maybe let's
00:09:12.860 act more like soldiers, right? So somebody says white guilt. Am I suffering from white guilt?
00:09:21.480 Because I would say the same thing, no matter who the risk category was. So would you, would you
00:09:30.540 criticize me if I said, I think people over 80 should get the, should get the vaccination before
00:09:37.640 me? Would you criticize me for that? If I said that frontline healthcare workers should get it before
00:09:43.700 me, would you, would you criticize me for that? If I said that people who, what's the worst comorbidity,
00:09:50.740 maybe it's diabetes, let's say diabetes is, I don't know if that's true, but let's say it's the
00:09:54.780 worst one. If I said that everybody with diabetes should get the shot before I do, would you criticize
00:10:02.480 me for that? Why would you criticize me if I say, no, there's an obvious category, black citizens in
00:10:09.680 this country clearly have far worse outcomes. Why is that different than diabetes? Why is that different
00:10:15.420 than being 80 years old? All right. So there's your provocative thought of the day. I didn't think
00:10:22.680 you'd like it, but I feel it's worth mulling on. And I would say that the story about the ethicist and
00:10:30.000 his opinion was presented a little bit out of context. So when I first heard it, it just sounded
00:10:36.220 straight up racist. And that was my first impression. But when you hear the actual argument,
00:10:40.680 and you understand that there is no non-racist outcome, you can't get there. It's not a
00:10:46.040 possibility. It's only who gets, who gets the advantage. And if you decide that it's going to
00:10:51.660 be a little racist, but you're going to do it based on the greatest risk, that's about as good as you can
00:10:58.140 do. It's about as good as you can do. Now, if you told me that every black person would get the
00:11:03.720 vaccination before every white person, that doesn't make sense. But if you're talking equal to equal,
00:11:09.300 let's say a black guy who's 63 years old and has a little asthma, I would, I would put him in front
00:11:16.580 of me in line because it's a war. If this were a competition and I were competing against my fellow
00:11:25.020 citizen, yeah, I'd push him off the cliff, right? If I treated this as a competition, I'm going to get
00:11:32.400 mine and I'm going to make sure you don't get mine before I get mine. But it's not, it's a war.
00:11:37.960 So if I were competing against black people, sure, I'll do what I have to do to compete. But they're
00:11:44.100 on the same team. So I'm going to treat it like a military operation. All right. There's a new site
00:11:53.340 that's sorting the news in a useful way. I think a lot of testing on how to get our news better,
00:11:59.260 you know, better platforms or better ways to present the news so it's less biased would be good
00:12:05.060 for testing. So I'm not going to say this particular one is the answer, but its URL is tidyreport.com.
00:12:13.200 Tidy, T-I-D-Y, report.com. What they do is they organize the tweets, which of course usually connect
00:12:21.320 to the news directly. So they organize the, I think it's mostly the political tweets, by positive,
00:12:27.940 neutral, and negative. So in other words, whether the tweet is saying something positive or negative
00:12:32.480 about the topic or neutral, and whether the person saying it is associated with the left or the
00:12:38.860 right. I'm not sure exactly how they figure that out, but they're probably close. So it's called
00:12:44.220 tidyreport.com. I neither recommend it nor disrecommend it. The important part of the story
00:12:50.700 is that people are starting to A-B test different ways to present the news to get past this immense
00:12:57.460 bias situation. So check it out. Maybe that's one of the ones. Pompeo. So Mike Pompeo says that
00:13:08.620 we're, that the Russians are, quote, pretty clearly behind the cyber attack. What does pretty clearly
00:13:16.800 mean? Does pretty clearly mean we're sure of it? Is that the same? It's pretty clearly. I don't know if
00:13:26.280 that means they're positive. It's an interesting choice of words, but it's somewhere in that
00:13:31.660 neighborhood of high confidence or high likelihood. And some experts are saying what I think is
00:13:39.800 unfortunately obvious, that the only way you would be able to get rid of whatever access Russia has had
00:13:46.540 for apparently a long time, the only way you'd be able to get rid of it is to replace all of your
00:13:51.020 software. All of it. Because the, the app, the, uh, the allegation, which is probably pretty reasonable
00:13:59.220 is that once they had God access to all of the systems, they could embed, you know, viruses in
00:14:05.700 different places to be activated under different situations or open up different doors, et cetera.
00:14:11.140 So it wouldn't matter how, uh, how good you were at finding a problem because they would just
00:14:18.060 open up a new door as quickly as you found it. So you pretty much have to get rid of all of it.
00:14:25.060 I wonder if we have the technology to do that. Let me give you, uh, let me flesh this out a little
00:14:33.300 bit, flesh it out or flush it out a little bit. I always get those confused. Flesh it out, you flesh
00:14:39.260 it out, right? That's, that's the saying. So the idea is this, could you write a, uh, software
00:14:46.260 application that's main purpose is to remove all of the software in the company and replace it with
00:14:54.700 the clean version of the same software? So in other words, could you write, um, some kind of a master
00:15:01.580 God program that would take every piece of software at IBM, delete it. And I don't know how hard you have
00:15:09.600 to delete it. Maybe there's like, you have to extra delete, bleach it or something. Just get rid of all
00:15:15.200 of it and reload the same fresh, you know, thing. So you keep your databases. So none of your data would
00:15:23.600 be directly affected. So I don't think there's a problem with data. I guess I'm not that technical
00:15:29.920 that I can answer that question. Would we have any issue with a, just a raw database? I don't know if
00:15:34.700 that can hold a virus, but if you get every, get rid of everything, just wipe everything that has any
00:15:41.440 software element to it in your system. Could you write one giant program that just rolled through
00:15:48.720 a Fortune 500 company, took it down for an hour, it's just down for an hour, but an hour later it has
00:15:56.640 reloaded all of its software, rebooted in the right order and brought everything back up. Could you do it?
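The wipe-and-reload idea described here can be sketched as a small audit-and-reinstall tool. This is a hypothetical sketch, not any real product: the `audit` and `wipe_and_reload` names and the manifest of known-good hashes are all assumptions, with clean copies presumed to come from trusted, offline media.

```python
import hashlib
from pathlib import Path

def sha256_of(path: str) -> str:
    """Hash a file's bytes so it can be compared to a known-good value."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def audit(manifest: dict) -> list:
    """Return paths whose on-disk contents no longer match the manifest.

    `manifest` maps an installed file's path to its known-good SHA-256.
    """
    tainted = []
    for path, expected in manifest.items():
        p = Path(path)
        if not p.exists() or sha256_of(path) != expected:
            tainted.append(path)
    return tainted

def wipe_and_reload(manifest: dict, fetch_clean) -> None:
    """Delete and re-fetch EVERY managed file, clean-looking or not,
    since a compromised system can't be trusted to report on itself."""
    for path in manifest:
        Path(path).unlink(missing_ok=True)  # the "extra delete" step
        fetch_clean(path)  # pull a fresh copy from trusted media
```

A real remediation pass would also have to cover firmware, build pipelines, and credentials, which is why this plausibly becomes a consulting industry rather than one script.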
00:16:02.120 Is it, is that a thing? Let me give you a little bit of history. Do you remember when the year 2000
00:16:09.480 bug was coming? When the year 2000 bug was coming, all the experts, experts said, we don't have enough
00:16:18.920 time. We're in real trouble because the companies are not taking it seriously. And that date is coming
00:16:26.160 when year 2000 bug will hit and all computers that were, you know, designed before a certain date
00:16:31.620 can't handle the year 2000 as a date and they'll all crash and the world will, will end. And as that
00:16:38.960 was approaching and we were getting close, we were actually in the year 2000 and we're getting, you
00:16:43.480 know, or no, we're getting close to the year 2000. Yeah. It gets closer and closer and closer.
00:16:49.360 I was saying in public, we're fine. We'll be fine. Now, here's the reason that all the experts said,
00:16:56.640 no, it can't be done. It's too much work. You know, if everybody worked on it full time,
00:17:00.800 you just couldn't get it done. We're, we're doomed. And I said exactly the opposite. I said,
00:17:05.320 no, we'll be fine. What happened? What happened is we were fine. Now, why is that? Well, exactly what
00:17:13.620 I predicted that it would take a long time to do it manually, but it wouldn't take a long time to
00:17:20.080 figure out how not to do it manually. In other words, it wouldn't take that long to write programs
00:17:26.340 that would do what the humans would have to do. That would take a long time. And what happened?
00:17:31.320 People wrote programs that looked for these bugs and corrected them. And then they ran the programs.
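Much of that automated Y2K remediation relied on "date windowing": instead of widening every two-digit year field, a small patch mapped the two digits into the right century. A minimal sketch of the technique, where the pivot value 70 is an illustrative choice rather than any standard:

```python
PIVOT = 70  # two-digit years at or above the pivot are read as 19xx, below as 20xx

def window_year(yy: int) -> int:
    """Expand a two-digit year with a fixed window, a common Y2K patch."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return 1900 + yy if yy >= PIVOT else 2000 + yy
```

With this window, a stored `99` reads as 1999 while `05` reads as 2005, so old records keep working without rewriting the stored data.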
00:17:37.800 The bugs were corrected. The year 2000 came. Bam, we're fine. So until you could imagine that it was
00:17:46.940 possible to write software that would fix, you know, universally go out there and find and fix all
00:17:52.680 the bugs that you didn't even know where they were, uh, you thought you were doomed. I think we might
00:17:58.540 find something similar, maybe a whole industry pops up, maybe a consulting industry with
00:18:05.760 some kind of technical background. And I would guess that we will probably birth an industry
00:18:12.560 because of this, because of the hacks, uh, that go in and shut down your whole network and wipe it
00:18:19.400 and then reload it. And, you know, they control the process. So nothing gets out of control
00:18:23.700 and, and they just go rescue one company at a time. I feel like that's going to be an industry
00:18:29.900 really soon and should be. And I think that'll be the only answer because the, uh, we should assume
00:18:36.940 that, uh, people will get so far into our systems again, that we'll just be right back in this
00:18:43.600 situation. Right? So even if we found every bug and got rid of it, it would just reproduce. I mean,
00:18:50.660 you know, even if we got them all, which is impossible probably, but even if we did,
00:18:55.540 they would just hack back in and, you know, we, you know, they'd find another way in. So you need
00:19:00.260 some way to wipe all of your software every now and then, every bit of it. And that, I think that'll
00:19:05.860 become an industry. Um, here's the funniest thing that's happened, uh, lately. If you don't follow
00:19:13.940 this Twitter account, you really should. It's a parody account. Uh, and the name on the account is,
00:19:21.240 uh, Titania McGrath. And there's a little photo of a, uh, youngish blonde woman with sort of glasses
00:19:29.680 like mine. And what's brilliant about it, besides the fact that it's brilliantly written,
00:19:35.860 is that it's a parody account that is so close to reality, people are often fooled, which is part of
00:19:43.680 the joke. So a lot of people who see this account for the first time really can't tell. And, and
00:19:50.900 something happened, uh, a thread that, uh, Titania or whoever runs the account tweeted
00:19:59.640 that is just amazing. And here's why. I've been telling you for a while that parody and reality
00:20:06.900 have merged such that there's not really that much difference between a wild parody and what
00:20:14.000 you're actually observing. You want to see an example of that? Titania in her thread gives you
00:20:21.520 several examples of predictions that she, or whoever runs the account made that were pure parody
00:20:29.240 that have already happened. In other words, the parody came before the reality, but listen to this
00:20:36.660 list. It's fricking mind blowing. All right. You ready? So these are the claims in Titania's thread.
00:20:42.640 Um, she said, um, she said on, uh, December, 2018, I called for biological sex to be removed from birth
00:20:51.180 certificates. Now that was parody. We're going to take your biological sex off of your, uh, birth
00:20:58.260 certificate. So that was in 2018. In 2020, the New England Journal of Medicine concurred. So the New
00:21:06.000 England Journal of Medicine is now recommending in 2020 what Titania said as purely a joke
00:21:12.840 in 2018, purely a joke. Is that the only one? Well, I mean, if this had only happened once,
00:21:20.200 if it only happened once, you'd say, oh, that's a funny coincidence, right? If it only happened once.
00:21:27.420 So, uh, on, and also in 2018, Titania criticized Julie Andrews, who played Mary Poppins in the movie
00:21:34.900 for having chimney soot on her face. Cause you know, that was in the middle of the blackface
00:21:41.280 stuff. So as purely a joke, she tweeted that and criticized Julie Andrews for having chimney
00:21:49.380 soot on her face in the movie. That was in 2018. In 2019, the New York Times concurred.
00:21:56.380 So the New York Times basically took on Julie Andrews for having soot on her face and blackface.
00:22:04.560 It was literally a joke two years before it became real. Is that the only ones? Oh no,
00:22:11.340 I'm not even close to being done. So the thing that's funny is not the individual examples because
00:22:17.000 they're, they're sort of trivial. What's funny is how often it happened. It's the often part that
00:22:22.420 makes the, makes the joke. All right, here's another one. Uh, March 2019, Titania published a book
00:22:28.820 called Woke in which I argued that skyscrapers are oppressive phallic symbols. In July, 2020,
00:22:38.440 the Guardian concurred. So in 2019, literally joking, the skyscrapers are, you know, some oppressive
00:22:49.740 phallic symbols. And then the Guardian writes a serious article one year later saying exactly that.
00:22:57.580 Uh, in that same book in 2019, here's, here's the, like the finisher. Uh, she goes, in 2019,
00:23:04.560 in her book, also the same book, Woke, I, uh, called out Helen Keller for her white
00:23:10.760 privilege. Time magazine just did that in reality.
00:23:19.740 Now this is just a sample. The, the actual thread is longer. I just picked out some of the fun ones,
00:23:26.160 but when you see how many times parody and reality overlapped, it's, it changes you. I mean, this is,
00:23:34.440 this is one of those things where, you know, I've, I've predicted this. I predicted it often and in
00:23:41.640 public that parody and reality were on the way to merging and then to watch it in real time,
00:23:47.800 it actually merged. Did that sound like a real prediction when I said it? The first time you
00:23:53.800 heard me say parody and reality are merging, that didn't sound exactly technically real, right?
00:24:01.800 It sounded more like a humorous hyperbole. No, I meant it and it happened.
00:24:10.500 So there you go. All right. Speaking of predictions, one of my other predictions is that
00:24:18.040 history would get complicated because we would no longer have one of them. We would have more than
00:24:24.060 one history. And that if you went to school, it might be a problem because you're trying to learn
00:24:29.760 history and there are two of them and they're different. Which one do you believe? Did you think
00:24:36.980 that that was going to happen? My prediction that there would be two histories? Well, here we are.
00:24:43.620 President Trump, he unveiled his choices for the President's Advisory 1776 Commission.
00:24:52.180 So this will be a commission to make recommendations about how to push against the 1619 project that is
00:25:00.220 already in schools. So the president has literally created a commission to create an alternate history
00:25:08.060 to compete with the history that's already being taught in the schools. Two histories.
00:25:16.020 Literally being taught in schools. Now, when I said to you, we've got a problem here because we have
00:25:23.080 two histories like we've never had before. Did that sound real to you? The first time you heard that,
00:25:29.300 it's like, no, we'll still agree on one history. Nope. Literally, we're teaching two histories if this
00:25:37.800 commission goes forward. I don't know how much time they have before Biden scraps it, so I can't
00:25:43.360 imagine they get much done. But there you are, two histories. All right. The most interesting claim
00:25:53.340 about election fraud that I've seen comes from Kanekoa the Great. It's a Twitter account. So I think
00:26:03.340 maybe Kanekoa, K-A-N-E-K-O-A, the Great. All one word. Kanekoa the Great. A good follow. He's got lots
00:26:13.000 of stuff. I don't know who he is, but he's got lots of good content. So he made a Twitter thread,
00:26:17.540 which I'll talk about. But before I talk about it, do you remember the golden rule of all election
00:26:25.460 fraud claims? The golden rule, well, it's not golden, but it's a rule. 95% of all the election
00:26:33.760 claims you hear are fake or not real or mistaken or out of context or something. But I also believe
00:26:42.660 there's a 100% chance the election was stolen because it was easy and people had the motive
00:26:47.060 to do it. So of course it's stolen. It's always stolen under those conditions. But any specific
00:26:53.260 claim you hear, probably BS. Now, I'm going to give you a specific claim after telling you that
00:27:00.320 every specific claim is probably BS. I'm going to apply the same standard to this one. Now, this one
00:27:09.000 sounds really good. Okay? So I'm going to give you an argument here that on paper, you know, on paper,
00:27:18.700 it's really, really strong. But is it true? I don't know. I would just apply my standard to it. 95%
00:27:27.740 chance it's not. But here it is. So there is a working professional statistician, somebody who is
00:27:35.200 very capable and experienced and in the sweet spot of his career. So somebody who really, really,
00:27:42.480 really understands statistics. So this is the source. And there's a video from the statistician
00:27:49.420 explaining what he did. So the first thing you need to know is that the person making the claim
00:27:54.280 is very qualified. Right? Now, that doesn't mean it's true. Right? Because we'll talk about experts
00:28:01.460 and when you should trust them. But just know that he's very qualified. And what he did was he was just
00:28:06.520 sort of messing around with a lot of the data. You know, he explained it as almost a hobby,
00:28:11.360 something that statisticians like. It was like, oh, I wonder if there's a correlation between this
00:28:16.140 thing and that thing. And he discovered, somewhat just by poking around, a correlation that almost is
00:28:25.200 impossible to be natural. Meaning it's a signal for fraud with something like a greater than 99% chance
00:28:33.280 that it is really fraud and not some fake signal. Now, does that mean it's true? No. Remember,
00:28:41.860 we're only dealing with claims that you and I can't check. I don't have the skill to check it. I don't
00:28:48.100 know where the data was. I can't really check it. So no matter how credible this sounds, just keep this
00:28:54.880 little tape playing in the back of your head. No matter how credible it sounds, there's a 95% chance
00:29:01.360 it's not real. Okay? Just keep that playing in the back of your head. And here's what he found.
00:29:08.700 If you took the 3,000 U.S. counties, I always wondered how many counties there were, which is
00:29:13.440 weird. I was wondering that exact question. There are over 3,000 counties. Now, counties have a lot in
00:29:19.900 common, right? There could be a lot of diversity within the county, but you can make some claims
00:29:26.480 about their consistency over time. And the statistician started out by predicting who would
00:29:34.640 win each county based on a number of demographic variables. So he would say how many Democrats are in
00:29:41.240 the county? What's their age? A bunch of stuff. And he found that he could predict with 90% accuracy
00:29:48.140 who the county would go for based on their demographics. And you could apply it retrospectively
00:29:54.740 to other elections, and I guess it works. So it's about 90% good at knowing in advance who would win.
00:30:00.760 And then he looked at who actually won. And he found, eventually, he poked around and found this
00:30:08.380 strange data oddity. That there were lots of counties that did better than his model would
00:30:17.360 predict. And there were lots of counties that did worse than his model would predict. And that's
00:30:23.020 quite natural. So if you've got 3,000 data points, they're going to be spread around. But his point was
00:30:30.380 you could draw a line through the middle, and that would be his prediction. And the differences would
00:30:36.080 just be sort of equally on both sides of the line. So if there was, if he was off, it was just as
00:30:41.800 likely he was off, you know, in one way versus the other. So there'd be just as much below the line,
00:30:46.680 above the line. And then he found that in those counties that used Dominion voting systems and
00:30:54.480 one other kind, I think Hart, Hart or something? It was another company, Hart, H-A-R-T. So there,
00:31:02.100 I guess there were maybe six or so different machines in different counties and different
00:31:06.720 ways to count. But in those counties that had Dominion or Hart systems, there were consistently
00:31:13.940 over 5% more votes than would be expected for Biden. Now, here's the interesting part.
00:31:23.800 The correlation holds in Trump counties. So counties that Trump won, Biden did 5% better.
00:31:32.100 In counties that you knew that Biden was going to win because they always go Democrat, also a little
00:31:38.180 bit more than 5% better. So the amount that the Dominion and Hart machine counting counties were off
00:31:49.020 was consistent, meaning that there was a gigantic difference. Let me see if I can say this simply
00:32:00.220 because I'm botching this. If you looked at what you expected these counties to do based on their
00:32:06.580 demographics and past behavior, et cetera, the ones that had Dominion and Hart machines were way,
way off, I think 73% of them had a Biden advantage that was very similar. Now, with the other
00:32:23.280 machines, the oddities went in both directions equally, but only where you have the
00:32:34.860 Dominion or Hart machines, you didn't have an even distribution. It's the only time. And it's very
00:32:40.960 consistent. And according to the statistician, not according to me, that the odds of any of that being
00:32:46.820 anything but fraud are vanishing, vanishingly small. You know, you could say it might be something else
00:32:54.780 in the, you know, extreme, you know, it was alien invasion or something, but not really, not really.
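The test being described — predict each county's result from demographics, then ask whether the prediction errors are centered on zero within each machine-vendor group — can be sketched like this. The county data below is synthetic (the statistician's actual dataset isn't available), with a 5% shift deliberately planted in vendor "A" counties to show what the claimed signal would look like:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3000  # roughly the number of U.S. counties

# Synthetic demographics: Democratic registration share and median age.
dem_share = rng.uniform(0.2, 0.8, n)
med_age = rng.uniform(30, 55, n)
X = np.column_stack([np.ones(n), dem_share, med_age])

# Simulated vote share driven by demographics plus noise; vendor "A"
# counties get a planted +5% shift, standing in for the alleged anomaly.
vendor = rng.choice(["A", "B", "C"], n)
vote = 0.1 + 0.9 * dem_share - 0.002 * med_age + rng.normal(0, 0.02, n)
vote += np.where(vendor == "A", 0.05, 0.0)

# Fit the demographic model, then inspect residuals by vendor: an honest
# count should leave every vendor's residuals centered near zero.
beta, *_ = np.linalg.lstsq(X, vote, rcond=None)
resid = vote - X @ beta
for v in "ABC":
    print(v, round(resid[vendor == v].mean(), 3))
```

In this synthetic run, vendor A's residuals sit well above B's and C's — the one-sided pattern the statistician claims to have found. On real data you would also want a significance test and checks for confounders, such as vendor choice correlating with county type.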
00:33:02.720 Now, this is very different from the quadrillions argument. The quadrillions argument was debunked,
00:33:08.980 right? So the quadrillions argument is that if there's, let's say, a bellwether place that always
00:33:17.380 went to Republicans and this is the only time it didn't, you know, that that's a signal. And then
00:33:23.720 there's this other signal and this other signal. That is not good analysis because all it would take
00:33:29.780 is one big effect that could affect all of those things, right? So that's not a guaranteed signal of
00:33:37.520 fraud. But this is really specific because you can trace it all the way down to this specific
00:33:44.300 vendor. And if you can trace the difference down to specific vendors, that's a really stronger case,
00:33:52.240 I think. Now, I don't know if Andreas Beckhaus is watching this video, but if you are,
00:33:59.360 you'd be the best debunker of things that I say. So debunk me on Twitter if I've missed
00:34:07.260 anything obvious. So I took this and I sent the link to my Democrat friend that I always mention,
00:34:16.200 my anti-Trump Democrat friend, who has the qualities of being very smart and well-informed
00:34:21.940 and yet appears to act crazy. He's completely rational, in fact, one of the most rational people I know,
00:34:31.360 in all other domains. He's just like this really rational guy. It practically defines who he is.
00:34:37.420 He's so rational. And I sent him this, sent him the statistical analysis. And as luck would have it,
00:34:46.240 he's also good at statistics. So one of his talent stacks is statistics. So I sent him a statistical
00:34:53.860 argument to a guy who really knows statistics. No, it's not Sam Harris. It's a personal friend,
00:35:00.320 nobody you know. And here's why I did it. My friend says, and has been saying, that there's
00:35:08.360 no evidence that there was anything fraudulent. So that's his view. No evidence. So I sent him
00:35:16.380 this evidence. But the evidence has a special quality to it. That no matter how much you know
00:35:24.260 about statistics, you can't really just look at it and know if it's right. Right? You can't tell.
00:35:31.500 So this expert is making an argument that, unless you really dug into his work,
00:35:37.760 you can't tell if it's real. So I did this intentionally, not as here I've proven my case,
00:35:44.720 because I don't think anything like that's happened. Remember, 95% of all evidence is fake.
00:35:49.280 This is no different. So I didn't think it was a kill shot. But the reason I showed it to him
00:35:54.860 is because I knew he wouldn't know it wasn't. And he wouldn't know if it was because it can't be known.
00:36:00.680 It's just too hard to know it based on what we have available to us. And I wondered if he would
00:36:05.940 reject it. Or would he say, okay, this does not prove fraud. And I would agree with that.
00:36:12.940 But it certainly tells us we should look into it. So that's what I was looking for. I was looking
00:36:19.840 for a rational response that says, you know, Scott, I know a lot about statistics too. But I don't have
00:36:27.480 access to all the data he has. If he did this analysis right, it would be very meaningful.
00:36:32.920 And it does look like he's capable of doing the analysis. If it's right, this is something
00:36:39.200 that would be important. It should be looked into. So that's what a reasonable person would say,
00:36:43.720 right? Do you think he said that? Nope. Perfectly reasonable to say it didn't prove anything. And I
00:36:52.200 agree. Here's what he said. You can find any correlation in lots of data. Now, this is what
00:36:59.920 I would call the Bible code theory. The Bible code was a debunked idea that if you looked in
00:37:09.080 the Bible, and you did various schemes to find secret messages, you would find all these messages,
00:37:16.760 such as, I'll just pick one, this is random, not a real one. If you took the second letter of the
00:37:21.480 first sentence, but you took the third letter of the next sentence, and then the fourth letter of the
00:37:26.160 next sentence. So it was all these little algorithms that would run against the Bible,
00:37:30.160 and it would spit out things that you didn't think could possibly be, you know, natural. So it'd be
00:37:37.640 like little predictions, and you'd say, yes, look, it's like a full sentence prediction, and it actually
00:37:44.360 happened. So there was a time when people thought the Bible had these secret codes. That was debunked
00:37:50.140 by some scientists who took their same algorithms and ran it against any big book.
00:37:56.160 Like War and Peace. Turns out War and Peace is full of secret messages and predictions that
00:38:02.020 actually came true. Because it turns out that if you've got something as complicated as a big book
00:38:07.620 filled with letters, you can find some algorithm that will produce full sentences, just by trial and
00:38:15.660 error, and they will look like predictions that happened. So it can work with any book. It's obviously not
00:38:20.800 the Bible code. So his argument was that this statistician had basically fallen for the Bible
00:38:27.340 code error. Does that sound like a good response to, here's a video by a hugely qualified statistician?
00:38:37.460 Are there any hugely qualified statisticians who don't know about the Bible code? There are none.
00:38:47.680 There are none. That's not a thing. There's no such thing as a professional
00:38:54.100 statistician who's never heard of this problem with the Bible code. That's not a thing. Obviously,
00:39:01.320 the statistician was aware that that's one of the risks that you have to guard against.
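The debunking method described above, running letter-skipping schemes against an ordinary text, is easy to reproduce. A minimal sketch; the sample text and the little word list are arbitrary choices of mine:

```python
import re

# Strip a perfectly ordinary passage down to its bare letters.
text = re.sub(r"[^a-z]", "", """
It was a bright cold day in April and the clocks were striking
thirteen and the wind was howling over the gray rooftops of the town
""".lower())

# Scan every (start, skip) letter-selection scheme for "hidden" words.
words = {"day", "war", "cat", "sun", "ace", "tin", "rat", "the"}
hits = []
for skip in range(1, len(text)):
    for start in range(skip):
        sequence = text[start::skip]
        for word in words:
            if word in sequence:
                hits.append((word, start, skip))

# Run enough schemes against enough letters and something always turns up.
print(len(hits), "spurious hidden messages")
```

The point is that the hits say nothing about the text; they're a property of searching many schemes, which is the multiple-comparisons trap any professional statistician knows to guard against.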
00:39:07.340 So I feel as if this is a pretty clean example of cognitive dissonance. The reasonable reaction
00:39:14.440 would have been, I can't evaluate this, but if it's right, it's meaningful. Right? Is that not the
00:39:21.680 only reasonable response to something you can't analyze but looks important and an expert did it?
00:39:27.280 All right. So the other thing my Democrat friend said as a response is that the courts have rejected
00:39:42.340 all of the evidence that was presented. It's just mind boggling. So my Democrat friend, because the news
00:39:52.160 is so fake, he believes that courts have looked at evidence of fraud. That never happened. He actually
00:39:59.960 thinks that happened. It didn't happen. Apparently, he was unaware that the cases are being thrown out for
00:40:05.420 technicalities without actually looking at the claims. You know, it's about standing and doctrine of
00:40:11.420 laches and, you know, whether or not you can bring the case and who's got jurisdiction.
00:40:18.300 It's all that stuff. But as far as I know, the claims per se have not been judged in any court of
00:40:26.800 law. I don't know that the witnesses who make direct claims of observing fraud, have they had their day
00:40:34.740 in court? No, right? So here's a well-informed, really well-informed guy, but his information comes
00:40:44.660 from the left and actually thinks an alternate history of the United States is happening right
00:40:51.080 now. He believes there's an alternate history happening in parallel with the one you're experiencing
00:40:55.940 in which those claims are being debunked by courts. Nothing like that's happened.
00:41:01.040 Nothing even close to that has happened. They've never even looked at it. Beyond that,
00:41:07.760 would it make any difference that other claims were debunked?
00:41:13.980 Does it matter how many people are found innocent of a crime?
00:41:19.500 Let's say this. Let's say three people were accused of a crime and you found out they didn't do it.
00:41:24.840 Does that tell you whether a fourth person, accused of an unrelated crime, did it?
00:41:30.180 It doesn't really work that way, right? Anyway.
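The logic is just independence of events. A quick simulation (the 20% base rate is a number I picked purely for illustration) shows that clearing the first three cases leaves the odds on an unrelated fourth case unchanged:

```python
import random

random.seed(0)

p = 0.2           # assumed base rate that any single accusation is true
trials = 100_000
fourth_true = conditioned = 0
for _ in range(trials):
    cases = [random.random() < p for _ in range(4)]
    if not any(cases[:3]):   # condition on the first three being cleared
        conditioned += 1
        fourth_true += cases[3]

# If the cases are independent, this stays close to the 0.2 base rate.
print(round(fourth_true / conditioned, 3))
```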
00:41:38.320 Trump signed some legislation that would kick Chinese companies off of the U.S. stock exchanges
00:41:43.940 unless those companies allow financial audits that are required for, you know, American companies that
00:41:52.120 are on those exchanges. To which I say, why did this take so long? Are you telling me that there are
00:42:00.160 Chinese companies on American exchanges who simply decide not to abide by the very, very, very
00:42:07.000 important rules of transparency that all American companies abide by or else they get penalized
00:42:12.840 Are you telling me that China can just be on our stock exchange and ignore all the rules, and not
00:42:19.500 just the trivial ones, the important ones? Like, the most important one is you've got to
00:42:25.680 have some transparency. That's like right at the top. That's not a detail, right? And so Trump signs this
00:42:33.480 legislation that will kick them off if they don't allow these audits. And I'm thinking, why did that take so
00:42:40.020 long? You know, and do you think this would have happened under Biden? Do you? Because I feel like
00:42:47.840 probably it wouldn't. It's going to be fun watching Trump try to get things done, you know, between now
00:42:54.600 and Inauguration Day. We'll see how much trouble he can cause. All right, let me teach you when to
00:42:59.860 disagree with the experts because, of course, we all do it. But there is a good way to do it and a bad
00:43:04.960 way to do it. And this will be your important lesson of the day. You ready? Okay. Here's what you
00:43:11.220 should not do. Do not disagree with experts and then cite as your reason for disagreeing with the
00:43:18.960 experts a fact which all the experts know. Okay, I just gave you an example that all the experts in
00:43:27.380 statistics know about the Bible code. So stating that as the reason for your argument doesn't make
00:43:34.100 any sense. Because the expert knows that. Here's some more examples of that. I've heard the argument
00:43:43.700 that CO2 can't be causing a climate crisis because CO2 used to be much higher in the past. You've heard
00:43:54.000 this argument, right? People say climate change isn't real because CO2 used to be way, way higher in the
00:43:59.480 past. And there was no civilization back then. If there were no humans, and CO2 was way higher in
00:44:06.300 the past, and things seem to be fine. So it's all hoax, right? Here's the problem with that.
00:44:14.140 Every climate scientist knows CO2 was higher in the past. Do you see where I'm going? All the experts
00:44:23.220 who say climate change is a problem. They know what you know, that CO2 was much higher in the past.
00:44:30.960 That's not a reason to argue against them. What that proves is that
00:44:37.860 you don't understand their argument, basically. Now, I believe that I read once that CO2 was higher in
00:44:45.300 the distant history of the earth, but it was the same time that I believe the sun was less strong,
00:44:52.300 so there was some countering force that is easy to demonstrate and well known. So in general,
00:45:00.180 if you're disagreeing with experts, but you're using as your basis for disagreement a fact that
00:45:06.460 every one of those experts knows, you're almost certainly not making a good argument. You can be
00:45:12.240 right, because experts sometimes are wrong and you don't know why. So you can be right by accident.
00:45:17.080 But you should check yourself and say, wait a minute, my argument is based on one fact
00:45:23.440 that the other people already know. There's got to be some other argument or that's nothing.
00:45:29.360 All right, here's another example. When I predicted that Trump would win in 2016,
00:45:37.200 2016, I was going against all the experts and all the pollsters. Was that smart? Was it smart for
00:45:45.640 me to disagree with the experts when I was using their same data? Because they knew what the polls
00:45:53.520 were. We're all looking at the same data, right? So I should not have disagreed with the experts,
00:45:59.240 wouldn't you say, if all I were using was the same data they were using? Because they would know
00:46:05.120 more than I do, plus they know the same data I know. Except here's what's different. I was not
00:46:12.060 using their same data. I was using my expertise, which is different from theirs. My expertise was
00:46:19.900 persuasion. And as a trained persuader, and other trained persuaders saw at the same time I did,
00:46:26.540 they said, whoa, whoa, whoa, this isn't like the past. We've never had this skill set running for
00:46:33.520 president. And you guys don't see it coming. But I'm kind of an expert in this persuasion stuff.
00:46:38.960 And I do see it coming just like a train. Like I can see it. I can see it coming. Right? So if you
00:46:45.580 disagree with the experts, because you're bringing knowledge that they don't have, or expertise that
00:46:52.080 they don't have, that might be a reasonable disagreement. Again, doesn't mean you're right and
00:46:57.320 they're wrong. Could go either way. But at least you're being reasonable. That would be a reasonable
00:47:02.740 way to disagree with an expert, because you're bringing something new that they don't have.
00:47:09.100 But if you're only bringing the stuff they already know, I think I'd lean toward the experts, not you,
00:47:16.300 in that case. I had one other expertise in the case of calling Trump's 2016 victory, which is that
00:47:24.580 I know a lot about my white males. As a white male of a certain age, I kind of have a little more insight
00:47:33.020 into white males of a certain age. And I know what they're willing to say out loud in public. And I know
00:47:39.180 what they privately think. And it's a little bit different. So I'm not sure that all the experts had maybe
00:47:46.160 the same, you know, experience with this group of people who ended up being influential in the final
00:47:51.560 outcome. So whenever you think you have some extra insight or expertise or data, then maybe
00:47:59.500 disagreeing with an expert makes some sense. Here's another one. I've disagreed with climate change experts
00:48:08.440 about their projections of how bad things will be in 50 to 80 years. Does that make sense? I'm not a climate
00:48:16.940 change expert, right? So if I'm disagreeing with the experts, aren't I being irrational? Because there's
00:48:24.100 no fact that I know about climate change, and this is true, there's no fact I know about climate change
00:48:30.540 that they don't know. So they know all the facts that I have, plus lots more. Would it be reasonable for
00:48:38.720 me to disagree with them when they know everything I know, plus the scientific method has backed them
00:48:46.300 up, they say, plus the majority of experts are on the same side, plus they know way more than I do?
00:48:54.160 Is that reasonable for me to disagree in that case? Well, if that's all the variables that were involved,
00:49:00.340 the answer is no. If there were no other variables, it wouldn't really be reasonable for me to disagree.
00:49:05.220 I don't have anything to add to it. But when you're predicting what's going to happen financially,
00:49:11.600 you're now in my ballpark. Because I worked as a person who made financial predictions for
00:49:18.480 big corporations, did it for years, and I have expertise in it. So when I'm criticizing climate
00:49:25.580 change, the part I don't criticize is the science part. The science part is that if you add CO2 to the
00:49:33.260 atmosphere, no matter how it gets there, human or other, all things
00:49:39.000 being equal, would that warm up the earth? Probably. I'm not disagreeing with experts, because I don't
00:49:45.620 have any extra data. What extra data do I have? What extra science do I have? None. So when they
00:49:53.540 make a claim that CO2 should warm the atmosphere, all things being equal, I say, I don't have anything to
00:50:00.060 add to that. I'm not going to doubt it. And I'm not going to confirm it. I'm just going to say,
00:50:06.180 well, you're experts. You know, I don't know. But when you get to the second part, which is they make
00:50:11.260 a financial, not a scientific, but a financial estimate of what it's going to do with the world
00:50:18.180 economy, you're in my expertise. So if I criticize you from my expertise, and you're a scientist,
00:50:26.860 you should listen to me. Literally. If a scientist tells you to believe a financial estimate,
00:50:38.060 or a financial prediction, and a financial expert who makes these predictions for a living
00:50:45.520 says no, who are you going to believe? The person who knows the most about financial predictions,
00:50:51.420 or a scientist? Because scientists are not financial predictors. So when I disagree with
00:50:57.760 climate change, I'm not disagreeing with scientists on science. I'm disagreeing with scientists on my
00:51:04.880 expertise, not theirs, my expertise. I have another expertise too, which is, again, persuasion. And so I
00:51:12.700 have a theory of why maybe scientists could be, you know, fooled or biased or subject to confirmation
00:51:19.180 bias, at least on the financial part, financial predictions. And it is, you know, everything that
00:51:25.480 you already know, which is that there's a group think, and there would be a penalty for going against
00:51:30.080 the grain. So if you're going to disagree with the experts, at least have a theory of why they're wrong.
00:51:37.320 So sometimes I have a theory that there are just too many penalties for them to say,
00:51:43.200 you know, going against the grain, you know, so it's cognitive dissonance or confirmation bias,
00:51:47.780 or that I have extra facts or extra information. All right, here's another example of the same thing.
00:51:56.040 I see this argument all the time about wearing masks, and it goes like this. This was tweeted at me
00:52:02.600 today. The problem with this, I'm sorry, it was tweeted at me today that we know masks can't work
00:52:11.260 because the tiny holes in the mask are way bigger than the even tinier virus. Now, most of you have
00:52:21.120 probably made this argument, right? How many of you have made this argument? Scott, Scott, Scott,
00:52:26.120 the scientists have looked, they've seen that the masks have these big holes on them when you go
00:52:31.400 microscopic, and the hole is this big. The virus is the size of a pea. You know, let's say, relatively
00:52:39.620 speaking, the virus is the size of a pea. The holes, the tiny holes in the mask would be the size of,
00:52:45.800 let's say, a basketball hoop. How does a basketball hoop stop something that's the size of a pea?
00:52:54.920 Right? Have you made that argument yourself? Raise your hand if you've ever made that argument.
00:53:00.100 If you did, you would know you're disagreeing with the majority of experts who say the masks work.
00:53:07.160 Now, here's my test. Do you think that the experts who say the masks do work,
00:53:17.320 do you think they're unaware of your pea going through a basketball hoop brilliant analogy?
00:53:24.960 Because if they're aware of that fact, then you're adding nothing to it. It's a fact they know. It's a fact you know.
00:53:34.740 That the size of the holes in the masks are so big, and the virus itself is smaller.
00:53:41.400 Do you think they don't know that? The experts? I'm pretty sure all of the experts know that.
00:53:47.300 But they know everything you know, which is, let's say, that fact, and more.
00:53:53.960 So they know everything you know. They know that big hole and little virus exists.
00:54:00.020 But the other thing they know is that it's been tested in a variety of real-world situations.
00:54:06.760 And in the real world, the evidence is very strong that masks work.
00:54:13.360 So they're aware that when you test it in a laboratory, you can come up with a very good reason why maybe it wouldn't.
00:54:22.040 But it might have to do with the water droplets being bigger than the virus itself,
00:54:26.660 so maybe they don't get through the basketball hoop.
00:54:29.380 Could be that it changes the direction or the viral load.
00:54:32.860 But we don't know why. We don't know why.
00:54:36.620 But the point is, if the experts know everything you know about the size of that virus,
00:54:42.300 there's something you don't know.
00:54:45.120 And that's what you should take away from it.
00:54:46.740 You shouldn't take away from it that the experts are lying if they know more than you do.
00:54:55.180 Now, how often have I disagreed with the experts?
00:54:59.320 And let me see if I'm consistent with my own rules of knowing when to agree or disagree with experts.
00:55:06.300 When the question came up of closing travel from China,
00:55:11.120 the virology experts said in the very early days,
00:55:15.860 no, you don't need to do that.
00:55:17.920 And I disagreed with vehement, cursing public statements
00:55:24.040 that we should close the travel from China immediately.
00:55:27.120 Now, who was right?
00:55:30.660 Well, I was right.
00:55:31.820 I think history shows that I was right,
00:55:33.980 that we should have closed China travel as early as I said,
00:55:37.820 which was well before Trump did it, which I think was a week later.
00:55:42.560 And, but was I, so even though I was right,
00:55:47.180 let's examine if I was right for the right reason.
00:55:50.440 Okay, so what was it that I added to this, to the experts?
00:55:56.120 If the experts say it's not a risk, why should I say it?
00:56:01.920 You know, what do I know that the experts don't know?
00:56:05.080 And here's what I do know.
00:56:07.420 Risk management.
00:56:09.160 Are scientists experts in risk management?
00:56:12.040 Because if you studied economics and if you have an MBA
00:56:16.700 and a lot of experience in business, as I do,
00:56:19.680 I would consider myself not like a world expert in risk management,
00:56:24.000 but certainly it's my expertise.
00:56:26.360 So understanding risk and making decisions
00:56:28.840 in the context of risk management
00:56:30.700 is what you learn when you get an MBA.
00:56:33.960 It's what you learn in business.
00:56:35.580 I'm good at it.
00:56:36.360 So when I looked at this situation,
00:56:38.680 I didn't say I'm smarter than epidemiologists.
00:56:42.660 I said, I don't think they understand risk management
00:56:45.980 because the risk is catastrophic.
00:56:50.760 And here we are, right?
00:56:52.180 We knew the risk was catastrophic.
00:56:54.360 We knew that it would be expensive to close travel,
00:56:57.760 but it would be better than catastrophic.
00:57:00.700 So from a risk management perspective,
00:57:03.240 it was kind of a no-brainer
00:57:04.500 that anybody who understood my expertise,
00:57:09.240 risk management,
00:57:11.060 would have found that an easy decision.
00:57:13.200 And in fact, who was it who famously, you know,
00:57:17.380 followed on pretty quickly?
00:57:18.540 It was Trump.
00:57:19.940 Would you say that Trump is also an epidemiologist?
00:57:24.900 No.
00:57:25.820 Is he an expert on risk management?
00:57:30.000 Yes.
00:57:31.060 Yes, that's exactly what he is.
00:57:32.820 In the same way that I am.
00:57:35.200 If you're experienced with business,
00:57:37.680 you are somebody who's been making risk management decisions
00:57:40.660 for decades.
00:57:43.400 Yeah.
00:57:44.100 Trump is very, very experienced at risk management.
00:57:48.180 So if you're disagreeing with the experts
00:57:50.000 because you bring a different expertise,
00:57:52.260 that can be valid.
00:57:53.660 It doesn't mean you're right,
00:57:54.500 but it could be a valid disagreement.
00:57:55.600 When the experts were first saying that masks don't work
00:58:01.120 before they said they do,
00:58:03.520 I called that a lie on day one.
00:58:07.140 Now, did I call it a lie
00:58:08.620 because of my expertise in virology
00:58:11.660 and the physics of masks?
00:58:14.780 No.
00:58:15.900 I brought a different expertise to that.
00:58:18.560 And that different expertise
00:58:20.320 is that as the creator of Dilbert
00:58:23.300 and somebody who's worked in business for a long time,
00:58:26.120 I know how big organizations work.
00:58:29.280 And I know that big organizations
00:58:31.080 will routinely lie to manage behavior.
00:58:36.800 And it occurred to me
00:58:38.220 that since we were talking about a shortage of masks,
00:58:41.400 at the same time we were talking,
00:58:43.020 the experts were saying,
00:58:44.080 nah, you don't need a mask.
00:58:45.360 But maybe the healthcare workers do need them.
00:58:48.980 Nah, you don't need them.
00:58:49.960 Save them for the healthcare workers.
00:58:51.440 Eh, it won't make any difference.
00:58:53.100 It seemed very likely to me
00:58:55.620 that they were making the decision
00:58:57.960 to manage the shortage
00:58:59.340 and it was not an actual scientific statement.
00:59:03.200 So in this case,
00:59:04.040 my expertise in bureaucracies
00:59:06.560 and how they lie to manage resources,
00:59:09.520 I applied to this situation
00:59:11.260 and with no understanding of epidemiology
00:59:14.200 or the physics of the masks,
00:59:16.860 I correctly predicted that they were lying.
00:59:20.680 And that was the truth.
00:59:23.900 How about what I said early on in the pandemic
00:59:27.520 and I was saying that we should
00:59:28.940 at least do a major test
00:59:31.400 and really, really quickly on hydroxychloroquine
00:59:33.780 because if it worked as claimed,
00:59:36.240 it would be huge.
00:59:37.480 If it didn't work,
00:59:38.840 well, it's not much risk.
00:59:40.660 It's pretty low risk compared to the pandemic.
00:59:42.680 Now, my current thinking
00:59:46.180 is that hydroxychloroquine
00:59:47.820 almost certainly,
00:59:50.300 at this point,
00:59:51.060 we could say wasn't the game changer.
00:59:53.820 I don't know if it works a little,
00:59:55.780 but it certainly isn't working so well
00:59:57.980 that everybody's adopting it.
00:59:59.800 We would know that by now,
01:00:01.020 in my opinion.
01:00:02.620 But was I right or wrong
01:00:04.260 in saying we should go hard
01:00:06.080 at hydroxychloroquine
01:00:07.460 in the environment of not knowing
01:00:10.400 whether it worked or not?
01:00:11.760 I was 100% right,
01:00:13.980 as was Trump.
01:00:15.400 Not right that it works,
01:00:17.420 but right that there's enough evidence
01:00:19.160 that it works
01:00:19.640 that we should go hard at it
01:00:21.020 and know for sure,
01:00:22.960 because we tested it rigorously,
01:00:25.440 that we should just go at it
01:00:27.720 as hard as possible.
01:00:28.740 And at least eliminate it
01:00:30.100 as a possibility.
01:00:31.940 At least eliminate it.
01:00:32.940 So, I think I was right on that as well.
01:00:37.680 All right.
01:00:40.920 So, Swalwell, it turns out
01:00:42.640 the information is that
01:00:44.300 there's some confirmation
01:00:45.340 that he did actually have
01:00:46.660 a sexual fling
01:00:48.640 with the Chinese spy,
01:00:50.800 Fang Fang.
01:00:52.720 So, here's my take.
01:00:55.020 We know that Swalwell
01:00:56.420 pushed the Russia collusion hoax
01:00:58.300 harder than anybody
01:00:59.480 except Adam Schiff.
01:01:00.400 And we know that it was a hoax
01:01:03.320 and we know that
01:01:03.940 that was very bad
01:01:04.760 for the country
01:01:05.340 and so what Swalwell did
01:01:07.160 was unambiguously
01:01:08.860 very bad for the country.
01:01:11.020 But,
01:01:11.620 it could have also been
01:01:12.720 just a mistake.
01:01:14.220 Right?
01:01:15.180 And,
01:01:16.060 I think that you have to allow
01:01:18.540 that your people
01:01:20.800 that you elect in Congress
01:01:21.880 are going to make some mistakes.
01:01:23.720 So, I don't know
01:01:24.620 that you would necessarily
01:01:25.880 fire somebody
01:01:27.400 for pushing the fine people,
01:01:30.000 I'm sorry,
01:01:30.520 for pushing the
01:01:31.300 Russia collusion hoax
01:01:33.040 because maybe he believed it.
01:01:35.280 Right?
01:01:35.680 Maybe he was just wrong.
01:01:37.020 That's not the worst thing
01:01:38.240 in the world.
01:01:38.720 And if Democrats said,
01:01:39.960 well, we like him in general
01:01:41.220 and even though he pushed this hoax
01:01:43.260 or maybe they believe the hoax,
01:01:44.480 I don't know.
01:01:45.500 So, maybe that's not enough
01:01:46.840 to lose your job.
01:01:49.260 You could argue it is,
01:01:50.820 but maybe not
01:01:52.020 for the,
01:01:52.960 at least his voters.
01:01:53.860 And then,
01:01:55.180 there's the question
01:01:55.740 of having a fling
01:01:56.820 with a Chinese spy.
01:01:57.880 Let's say he didn't know it.
01:01:59.900 That's the reporting, right?
01:02:01.460 The reporting is
01:02:02.280 he didn't know
01:02:02.740 she was a Chinese spy.
01:02:04.620 Should you lose your job
01:02:06.100 if you had a relationship
01:02:08.900 with somebody
01:02:09.600 that you didn't know
01:02:10.260 was a spy
01:02:10.820 and the moment you found out
01:02:12.880 you cut contact?
01:02:14.900 I don't think so.
01:02:16.080 I don't think you should
01:02:16.900 lose your job
01:02:17.820 for being fooled
01:02:19.700 by a spy
01:02:20.500 if,
01:02:22.360 especially if there's
01:02:23.360 no damage
01:02:23.980 that you can identify.
01:02:25.980 So, you've got
01:02:26.780 two things
01:02:27.560 that are sort of
01:02:28.300 really close
01:02:29.220 to something
01:02:30.200 you should get fired for,
01:02:32.040 but individually?
01:02:33.760 I don't know.
01:02:34.700 Now, suppose you added
01:02:35.660 them together.
01:02:36.960 He did two bad things.
01:02:38.300 He didn't do one bad thing.
01:02:39.380 He did two bad things.
01:02:40.660 Is that enough
01:02:41.400 to fire him?
01:02:42.960 Well, I don't know.
01:02:44.220 I suppose that would
01:02:45.620 be subjective.
01:02:47.080 But here's the thing.
01:02:48.200 Those two things
01:02:49.240 he did wrong
01:02:49.980 are not unrelated,
01:02:51.120 meaning that
01:02:52.900 you would expect
01:02:54.460 a Chinese spy
01:02:55.300 to want you
01:02:56.120 to put pressure
01:02:57.100 on Russia
01:02:57.780 and away from China.
01:02:59.700 Now, we don't know
01:03:00.460 if that's why
01:03:01.420 Swalwell did what he did.
01:03:03.440 We don't know
01:03:03.940 if Swalwell
01:03:04.500 was motivated,
01:03:05.740 persuaded,
01:03:06.300 or brainwashed
01:03:06.960 in any way
01:03:07.620 to push the Russia
01:03:08.980 collusion hoax.
01:03:10.180 All we know
01:03:11.120 is that we watched it.
01:03:13.240 And if these
01:03:14.040 two facts are true,
01:03:15.300 that you saw
01:03:17.120 Swalwell
01:03:18.360 going balls
01:03:19.820 to the wall
01:03:20.460 to do something
01:03:21.240 that a Chinese spy
01:03:22.360 would certainly
01:03:22.920 want him to do,
01:03:23.840 which was to put all
01:03:24.480 the pressure
01:03:24.860 on Russia
01:03:26.540 and, you know,
01:03:28.520 hurt the integrity
01:03:29.960 of our elections
01:03:30.780 and question everything,
01:03:32.380 et cetera.
01:03:33.000 He did exactly
01:03:34.740 what a spy
01:03:35.900 would want him to do,
01:03:36.820 a Chinese spy.
01:03:38.280 Now, that doesn't mean
01:03:39.480 he did it
01:03:40.160 because of that.
01:03:42.080 But,
01:03:42.600 we don't know.
01:03:45.740 That's grounds
01:03:46.540 for removal.
01:03:47.860 The fact that
01:03:48.660 we don't know
01:03:49.520 if the only reason
01:03:50.980 he damaged the country
01:03:52.100 so badly
01:03:52.780 is because
01:03:53.800 he was influenced
01:03:54.800 by a Chinese spy,
01:03:56.640 it doesn't matter
01:03:57.880 if you can prove
01:03:58.960 there was a connection
01:04:00.240 between those stories.
01:04:01.780 The fact that
01:04:02.660 what he did
01:04:03.220 was so perfectly
01:04:04.180 exactly what a spy
01:04:05.520 would want him to do,
01:04:07.120 that's enough.
01:04:09.200 Even if
01:04:10.000 he's completely innocent,
01:04:11.400 you can't have
01:04:12.020 that person
01:04:12.480 in public office.
01:04:14.000 And I feel
01:04:15.120 that that could be
01:04:16.140 unfair to him.
01:04:17.660 Now, I've told you before, I've met Eric Swalwell a few times because he knows some people I know in local parties and stuff, so he's been around a little bit.
01:04:28.840 And so I don't have bad feelings about him as a human being, and I don't like people to lose their jobs over politics and stuff like this. But this is sort of a no-brainer.
01:04:39.240 This isn't one of those situations where you can put the well-being of Eric Swalwell over the well-being of the credibility of the republic. He's just less important than the republic. And I think he's got to go.
01:04:54.800 And by the way, I'm positive I would say this if he were a Republican. I don't know if you would, you know, your mileage might vary, but I'm positive I would have the same opinion no matter his politics.
01:05:07.340 I said provocatively that although I oppose all violence, and I do, that if conservatives don't start planning now to control the streets, they'll never win another election. There's no point in having an election, because for all practical purposes, whoever controls violence in the country runs the country.
01:05:32.940 Let me say that again, because it's one of those things it takes you a while to connect the dots. Whoever controls violence, meaning you can get away with it, is the government. Effectively. Right? Even if they're not the government in name, whoever can control violence is in charge.
01:05:56.160 There's no exception to that. It's not like a bias or a, hey, there's a correlation. It's a definition. Whoever controls violence in the country runs the country, every time. No exception.
01:06:13.100 Right? And we watched that the Antifa and Black Lives Matter and Democrats, they controlled violence in the streets, and there's good reason to believe that that affected the Supreme Court to want to stay out of the election, because they didn't want more violence.
01:06:32.340 So under this situation, where violence appears to be our political system now, if you want to not have that be your government, you know, street violence, the only response to that, since the police apparently have been neutered, you know, politically neutered, the police aren't going to help you. And it looks like we're not going to employ the army, because that would have its own problems.
01:06:55.660 The only way that this gets fixed that I can think of is that the number of conservatives who show up is way more than the number of other people who might have violence on their mind.
01:07:08.080 Now, don't bring guns, don't bring knives or bombs. I'm not suggesting that anybody bring weapons of death to any kind of an event. But if Antifa were outnumbered five to one in the street where they were trying to make trouble, five to one would probably make a lot of these things go away.
01:07:32.800 So I would say that if conservatives don't actually, literally organize, and have names of the people who have signed up to literally go onto the street the moment it's needed, and you don't have five times as many of them, there's no point in having an election. There really isn't.
01:07:51.400 And the Proud Boys have fucked up everything.
01:07:54.740 Let me say this. I don't have a problem with the Proud Boys' stated philosophy. I know they're accused of things which are not within their stated philosophy. They're accused of being racist or whatever, but that's not part of their deal. I don't know if any of them are racist. There are probably racists everywhere.
01:08:12.500 But the Proud Boys, unfortunately, they brought their brand into the mix. And while I believe that they were well-intentioned in many cases, sometimes I think they just like to fight, but I think they were sort of well-intentioned. They're patriots. But they completely fucked up the situation, because they drew all the attention, and they were too easy to paint as the bad guys.
01:08:37.320 The people who need to be in the street need to be everybody but them. Even though they're the most capable in terms of fighting, the most effective would be people who are not part of an organization.
01:08:51.780 You don't want them to show up and say, hey, we're Proud Boys. You want people to show up and say, we're conservatives, or we want to save the country, or we're patriots, or something. You don't want them part of a club.
01:09:01.560 As soon as you make them part of a club, then anybody who does something bad in the club messes up the whole thing. So as soon as you say it's a club, any one bad apple in that club ruins the whole club in terms of political opinion. So just don't bring a club. I mean, an organization club. The other club, probably a bad idea too.
01:09:28.340 So you think lawyers are pretty good at arguing, right? You'd say that maybe artists are not good at it, but lawyers are real good at it.
01:09:36.000 And I hear lawyer Ross Garber, who's literally an impeachment lawyer, and he teaches at Tulane Law School. So a very qualified guy in the law. And he tweeted this at me. He said, I have done election investigations, mostly on behalf of Republicans. So he's saying that he's sort of unbiased here.
01:09:56.080 And he said, I have seen lots of misconduct and irregularities. So here's a person who's experienced, and he knows that elections can have lots of misconduct and irregularities, because he's seen it himself. So far, so good.
01:10:09.760 And he said, I expressed concern heading into this election. Even better. He's not only seen a lot of fraud in elections, but he warned us about this election. So far, so good.
01:10:21.320 And then he said, but I have not seen evidence of potentially result-changing problems. And then he referred to my analogy about the no-melted ice cream.
01:10:31.020 Does that seem like a good argument? Is that a world-class lawyer argument there?
01:10:40.760 Because I'm pretty sure there are a lot of things in this world that I haven't seen that actually happened.
01:10:49.600 For example, suppose you were to witness a murder by gunfire. And you saw the person take out the gun, aim it at the victim, pull the trigger, bang! And then the victim gets a hole in their body and they die. And that's what you witnessed.
01:11:08.360 Can you say that you witnessed the person with the gun shooting the victim, who the bullet entered and died? Can you say you saw that? No. Because you can't see the bullet.
01:11:25.180 Right? You saw the gun go off, and you know what guns do. You saw the person with the bullet hole, and you logically connected them. But did you see it? No.
01:11:38.680 Because if you can't see the bullet, you don't know the bullet actually came out of the gun. You don't know if somebody behind him shot him. Now, of course, if you do the ballistics, you'll find out.
01:11:49.080 But in terms of witnessing, there are a lot of things we don't see, such as actually watching the bullet, that you know happened. Because you heard the gun go off, you saw it was aimed, you saw the result. You don't have to see it all. You don't have to see every bit of it.
01:12:06.620 So likewise, here's a guy who has seen enough election fraud personally that he knows it can happen. So he knows it can happen, because he's seen a lot of it. And he knows that the incentive to do it this election was sky high.
01:12:24.900 That's all you need to know. It can be done, which he confirms because he's seen a lot of it, and the motivation was sky high. You don't need to see the bullet to know that it happened. I think his argument was poor.
01:12:45.460 And I got a few other things that I don't think are interesting enough to talk about, so I won't.
01:12:50.140 And that, my friends, is the end of Coffee with Scott Adams for today. I think it's maybe the best one I've done so far today. Don't you think?
01:13:03.320 All right, I'm just going to look at your comments for a moment, because I've been looking at my notes.
01:13:07.060 Oh, yes. The Space Force, how did I skip that one? The Space Force has decided that the name for their fighters, the name for their military people, will be Guardians. They'll be called Guardians. What do you think of that?
01:13:26.920 I don't like it at all. I wanted to like it, but I don't.
01:13:33.960 Number one, it reminds you of the movie Guardians of the Galaxy, so automatically it feels silly, because, you know, there's a talking raccoon in that movie. So, Guardians of the Universe, that's what I think of, and so it makes it seem less serious. So that's not good.
01:13:52.100 But here's the other part. The problem for me is that they're in heaven, meaning that they're in space, so they're sort of up there like God, basically, you know, in an analogy sense. And there's something about the word Guardian that feels religious.
01:14:16.340 Guardian doesn't sound military, and maybe that's what they wanted, but Guardian doesn't really sound like the right word. Guardian feels like a cult. I guess that's what it is. I just realized that that's what it is. The word Guardian doesn't sound like a military term. It sounds like a cult.
01:14:39.460 In NXIVM, the cult we've been talking about, what was the name they had, at least in the sub-part of NXIVM where Keith Raniere had his, let's say, his disciples, I don't know, lovers, disciples? So he was called Vanguard.
01:15:03.180 So in the context of an actual alleged cult, the name that they used for their leader was Vanguard. Guardian feels like that word, doesn't it? It feels just a little bit more like a cult than it does like a military thing. So that's my opinion.
01:15:22.900 I don't think it's important. I think any name you put on it, it's going to be fine.
01:15:27.320 And somebody says Vanguard. Yeah. Yeah, I understand that a Guardian is one who guards, and, you know, it's not that it's not technically accurate. I don't have any better ideas, so I don't think it's a big deal.
01:15:46.680 That's all I got for now, and I will talk to you tomorrow.
01:15:51.880 All right, all you YouTubers, I'm still here for you for a moment.
01:16:00.200 Should have gone with Starship Troopers. Troopers wouldn't be bad.
01:16:10.020 Orbiteers. The Orbiteers. That's not bad.
01:16:15.180 Tara Keepers? What's that mean?
01:16:18.280 That's right, I love you the most on Periscope. It's true.
01:16:22.600 Space Force Tubins? I don't think that's going to catch on.
01:16:26.200 Did they ever admit about lying about masks? They did, yeah. Fauci did. In a sense. In a sense. Because they admitted there was a shortage problem.
01:16:36.740 Have I seen the Stand-up Maths channel? No.
01:16:40.980 Where's my cat? I don't know. She needs to give me some love.
01:16:47.460 All right, that's all for now, and I will talk to you tomorrow.