Real Coffee with Scott Adams - August 23, 2022


Episode 1844 Scott Adams: Twitter Has A Whistleblower, It's Getting Interesting. Plus How To Spot an NPC


Episode Stats

Length

59 minutes

Words per Minute

137.3

Word Count

8,120

Sentence Count

628

Misogynist Sentences

6

Hate Speech Sentences

20


Summary

Alexa, do you know who Andrew Taint is? If you don't, you'll want to listen to this episode of the podcast to find out. It's a good one.


Transcript

00:00:00.120 Good morning, everybody, and welcome to the highlight of civilization.
00:00:07.860 I'm not sure if we're 13.7 billion years old in this universe or what.
00:00:14.200 A little bit of question about that lately.
00:00:16.620 But what I know for sure is that there won't be a finer day in your whole life.
00:00:22.860 Until tomorrow. Tomorrow's looking good.
00:00:24.980 But if you'd like to take it up a notch today, if you'd like to live tomorrow today,
00:00:31.040 well, all you need is time traveling and a beverage.
00:00:35.280 So all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen, jug, or flask, a vessel of any kind.
00:00:41.320 Fill it with your favorite liquid. I like coffee.
00:00:45.500 And join me now for the unparalleled pleasure, the dopamine of the day, the thing that makes everything better.
00:00:52.420 It's called the simultaneous sip.
00:00:56.240 It happens now. Go.
00:01:01.640 Ah.
00:01:03.460 Yeah, that's it. That's it.
00:01:06.980 It's all coming together now.
00:01:10.060 Well, did any of you hear the news about Andrew Taint?
00:01:13.780 Andrew Taint. So Andrew Taint apparently has been banned now by TikTok and Facebook and Instagram and, I don't know, the phone book.
00:01:23.260 And I think he's been banned from everything.
00:01:27.200 Oh, I just want to deal with a rumor that's going around.
00:01:30.740 Some of you think that I'm intentionally pronouncing his name with an N in it.
00:01:35.760 But I can prove I'm not.
00:01:39.660 You just hear it that way. It's an audio illusion.
00:01:42.500 Would you like me to prove that I am pronouncing his name correctly?
00:01:47.480 All right. I can prove it.
00:01:51.380 Alexa, who is Andrew Taint?
00:01:54.000 According to Wikipedia, Emory Andrew Taint is an American-British internet personality and former professional kickboxer.
00:02:03.280 Following his kickboxing career, Taint began offering paid courses and memberships through his website, and later rose to fame following his removal from a reality show.
00:02:11.860 There you go.
00:02:12.360 Taint's misogynistic commentary on social media has resulted in Twitter, Facebook, Instagram, and YouTube banning him from...
00:02:19.660 Oh, YouTube, too.
00:02:20.480 Oh, well, you see, you can try it at home.
00:02:26.360 Just say it to your own Alexa, and try to pronounce it correctly the way I did.
00:02:32.500 Andrew Taint, T-A-T-E.
00:02:35.360 If you do that, your digital devices will recognize it.
00:02:41.120 Now, I don't think it would have given me the correct answer if I had pronounced it incorrectly.
00:02:45.780 So you can tell now that your human ears are being fooled, but the digital device can hear accurately when I say Taint, T-A-T-E.
00:02:57.540 Well, we've got a really good story here.
00:02:59.780 We've got this Twitter whistleblower.
00:03:02.700 But it's not any common whistleblower.
00:03:06.380 He was the ex-head of security for Twitter.
00:03:09.400 How would you like to be Twitter right now when your ex-head of security just absolutely pissed all over the company?
00:03:21.580 Because that's somebody who had access to some knowledge.
00:03:25.380 Well, so this guy, Peiter, Peiter "Mudge" Zatko. Mudge, I guess, was his nickname.
00:03:31.820 So he used to be the head of security, got fired for God knows what.
00:03:36.860 So keep in mind that he got terminated.
00:03:40.500 Whatever a disgruntled employee says about the company, how much should you believe it?
00:03:47.280 What is the credibility of a disgruntled employee?
00:03:51.460 It's zero.
00:03:52.760 It's zero.
00:03:54.100 But this story is so fun, I'm going to pretend as though it's greater than zero.
00:03:59.740 Because I want it to be true.
00:04:04.180 Don't you want it to be true?
00:04:06.580 Can we all agree to suspend our critical thinking just for fun?
00:04:12.040 We're just going to pretend this guy actually has the goods, okay?
00:04:15.880 Now, one of the things that he says is that there are too many people who have access to the deep information in Twitter
00:04:24.020 and can tweak it.
00:04:26.980 Does that surprise you?
00:04:30.460 That there are too many people within Twitter who have access to being able to manipulate stuff?
00:04:35.900 Huh.
00:04:37.140 That sounds like a problem.
00:04:39.540 And I think he's got an allegation that there might be a foreign agent working within Twitter.
00:04:46.840 He doesn't name names, but do you think there's a foreign agent working within Twitter?
00:04:52.160 Wouldn't it be more surprising if there were no foreign agents working in Twitter and the other social media networks?
00:05:02.520 Don't you think that foreign agents are trying to penetrate all of them?
00:05:07.380 And if they're not trying to penetrate all of them, why not?
00:05:10.960 It's the obvious thing to do.
00:05:12.180 So here the head of security is even thinking that Twitter may have been penetrated.
00:05:19.540 So this would explain one of my mysteries.
00:05:23.680 One of my mysteries about Twitter is I couldn't reconcile Jack Dorsey's involvement
00:05:30.720 with what seemed to be some kind of banning or shadow banning of conservatives.
00:05:36.940 Now, it could have been that it was just a subjective experience and there wasn't really any banning going on at all.
00:05:44.840 I'm actually open to that possibility.
00:05:47.300 Because it could be just an illusion.
00:05:49.580 If you think you're being banned, you see it everywhere.
00:05:52.060 But it's just the way the algorithm works.
00:05:54.540 Now, I don't think so.
00:05:55.600 I mean, it looks really real.
00:05:59.640 But be aware that you could be fooled.
00:06:03.940 So my assumption had been that if there was any of this chicanery and stuff going on,
00:06:13.180 that Jack Dorsey would not be personally knowledgeable about it.
00:06:18.240 Because it didn't really make sense with who he is.
00:06:21.220 Right?
00:06:21.360 That his, I guess his character and brand would not be consistent with Twitter doing something that sketchy
00:06:30.920 and him knowing about it.
00:06:33.100 Right?
00:06:33.760 Now, if you haven't met him and you've never had any personal interactions,
00:06:38.340 you know, you might have a different opinion.
00:06:39.840 But if you have any personal interaction with him,
00:06:42.800 he's just not the guy who would be doing that.
00:06:46.240 I just don't see it.
00:06:48.300 And I think he was actually genuinely curious about what was going on when people like me were complaining.
00:06:57.300 Because he actually connected me with somebody to look into it at one point at Twitter.
00:07:02.900 And I got the idea that he genuinely wasn't sure what was going on.
00:07:06.500 Now, if the security guy, ex-head of security at Twitter, is correct, which is a big if,
00:07:16.840 there would be a number of different ways that people like me could be banned
00:07:20.920 without Twitter management even being aware of it.
00:07:25.300 And that was always my best guess.
00:07:27.160 My best guess is that intelligence agents are manipulating the algorithms, directly or indirectly.
00:07:39.180 What I don't know is if it's ours or theirs.
00:07:42.920 Because maybe it would look the same.
00:07:45.940 You know, if the CIA were in there trying to manipulate it,
00:07:49.380 would it look any different than if Russia were trying to do it?
00:07:52.380 Because Russia might say bad things about the president if it's a Republican.
00:07:58.020 And the CIA might say bad things about the president if it's Trump.
00:08:03.260 So I'm not sure you could tell the difference
00:08:05.640 between our intelligence agency and a foreign one.
00:08:11.560 Remember, do you remember I said that the only way I could understand Soros
00:08:16.020 is if he's lost his critical thinking faculties
00:08:21.440 and is just an old man who doesn't know what's going on
00:08:25.080 and the people below him who he's entrusted to give away his money
00:08:29.560 are feeding him a bunch of bullshit
00:08:32.000 about the benefits of the money that they're spending.
00:08:35.580 Because what he knows about the benefits of the money he's donating
00:08:39.060 comes from the people he paid to donate it, right?
00:08:42.760 They're the ones who say, oh, it worked out great.
00:08:44.740 You wouldn't believe how well that worked out.
00:08:47.360 And you should give me some more money to donate,
00:08:49.780 of which I'll keep some of it, because that's what I get paid for.
00:08:55.760 So there's a new, I guess, opinion piece in the Wall Street Journal
00:09:00.060 in which Soros is defending his funding of these progressive DAs
00:09:05.520 who are letting people out of prison, etc.,
00:09:08.560 by showing statistics that the red states
00:09:11.540 don't do any better than the blue states.
00:09:14.740 You know what's wrong with that, right?
00:09:18.660 Is everybody aware enough to know what's wrong with that?
00:09:24.800 So he says that the red states and the blue states
00:09:27.360 are doing about the same in terms of crime.
00:09:30.740 Therefore, there's no problem with his progressive DAs.
00:09:34.260 Now he's saying that in public and in writing.
00:09:40.900 Yeah, it's not the state, it's the city.
00:09:43.540 If the city is run by a Democrat, you get a Democrat city.
00:09:49.060 It's not the state level, because the state level doesn't have police.
00:09:53.000 I mean, in the sense that we're talking about it.
00:09:55.880 So if you see Soros doing a data analysis
00:10:01.460 that even, I'd say, a large percentage of this audience
00:10:06.240 knew it was not valid,
00:10:08.080 you didn't have to do any research, right?
00:10:10.560 As soon as you heard it was a state level analysis,
00:10:13.840 you said to yourself, well, that's not right.
00:10:17.560 Right?
00:10:18.100 So this, again, is more evidence
00:10:22.900 that maybe Soros is just operating
00:10:25.020 in a degraded mental capacity.
00:10:29.360 I don't know how much of the opinion piece he wrote himself.
00:10:34.000 I'm thinking not all of it.
00:10:35.880 But I feel as if people are even putting words
00:10:37.980 into his mouth now to defend their theft.
00:10:44.140 So let me tell you what it looks like.
00:10:45.960 So this will not be an accusation of fact.
00:10:49.240 It would be pattern recognition.
00:10:52.300 The pattern I see is an elderly person being abused
00:10:56.080 by people he trusts.
00:10:58.580 The people he trusts are giving away his money
00:11:00.800 according to guidelines which he probably established.
00:11:04.460 But they're telling him that it's working when it's not.
00:11:09.280 And he can't tell the difference.
00:11:11.200 And then they're telling him to defend their work
00:11:14.340 by writing a, probably, here I'm speculating,
00:11:18.120 by writing an opinion piece for him
00:11:20.120 that he doesn't understand
00:11:21.940 and then having him sign it
00:11:24.320 and say, oh yeah, that was my opinion.
00:11:26.640 Those state level databases are telling us what we need to know.
00:11:32.120 It looks like incompetence, doesn't it?
00:11:35.140 Just old age based incompetence?
00:11:37.420 Because we can be fairly sure that he was smart at one point.
00:11:43.200 Right?
00:11:44.320 Is there anybody who says he was always dumb?
00:11:48.020 I mean, I don't think it was luck
00:11:49.480 that made him the money he made.
00:11:52.080 So I feel like the best hypothesis for Soros
00:11:55.220 is not evil but actual decline, mental decline.
00:11:59.720 And then some people below him
00:12:02.320 are skimming off some money
00:12:04.480 and making it look like they're doing a good job.
00:12:06.580 That's what it looks like.
00:12:08.380 So if I had to put a large bet on it,
00:12:11.060 I'd say, hmm, my bet's on mental incompetence.
00:12:16.000 That's what I think.
00:12:18.140 All right.
00:12:19.160 But I could be wrong.
00:12:22.900 California's having an abortion referendum.
00:12:25.080 And it looks like two-thirds of Californians
00:12:28.000 are in favor of it
00:12:29.240 or in favor of some kind of abortion being legal.
00:12:32.380 So it looks like that'll pass.
00:12:33.900 Made me wonder how many states
00:12:35.640 are going to pass abortion laws.
00:12:40.360 And is it around a quarter
00:12:43.040 or a third of the states
00:12:45.180 that are against abortion rights?
00:12:51.040 It's about that, right?
00:12:55.080 It's about 25%.
00:12:56.800 So in this case,
00:12:57.920 I don't think that 25% applies.
00:13:06.680 But what do you think about the theory
00:13:09.120 that the states are going to sort it out
00:13:11.600 and everybody's going to get
00:13:13.200 what they want eventually?
00:13:14.960 Now, in the short run,
00:13:15.960 it's really hard to move, right?
00:13:18.180 It's hard to relocate to a state
00:13:19.960 where you will have more rights
00:13:25.160 if that's what you want.
00:13:27.180 Don't you think that women of reproductive age
00:13:31.620 are likely to move away from those states?
00:13:35.040 Not all of them,
00:13:36.580 because there are plenty of people
00:13:37.700 who are against abortion
00:13:38.800 and would have the baby.
00:13:40.920 But don't you think a lot of people
00:13:42.580 are just going to move?
00:13:43.420 Because I don't see anybody
00:13:47.540 moving out of California
00:13:49.080 because of abortion,
00:13:51.240 because it's just an option.
00:13:53.240 It doesn't have to apply to you.
00:13:54.880 If you don't want one, don't get one.
00:13:56.460 So I don't see anybody leaving California,
00:13:58.900 but I do see, I can imagine,
00:14:01.640 anybody who's in, let's say, Louisiana.
00:14:03.500 If I were a 20-something woman
00:14:07.900 and I was healthy
00:14:10.960 and I could get pregnant,
00:14:14.280 I don't think,
00:14:15.540 and let's say,
00:14:16.640 and I was in favor of abortion,
00:14:19.260 I'm pretty sure I would move
00:14:20.980 because I don't think
00:14:22.940 I would want to take the chance
00:14:24.040 of having to go get medical care
00:14:26.380 in another state.
00:14:28.040 Now, again, I'm not giving you a,
00:14:29.480 if you think you're hearing
00:14:30.520 my opinion on abortion,
00:14:31.780 I don't give that.
00:14:33.200 I think women need to work out
00:14:34.740 what should be legal
00:14:35.720 and what shouldn't be.
00:14:36.460 I just take a pass on that.
00:14:38.640 Let the women figure it out.
00:14:39.960 I'll support you,
00:14:40.740 whatever you come up with.
00:14:43.740 And it'll be different
00:14:44.660 by state, apparently.
00:14:47.120 Well, let's keep an eye on this.
00:14:49.960 In the short run,
00:14:52.620 women who wanted abortion
00:14:55.060 are losing a lot
00:14:56.020 by the Roe versus Wade reversal.
00:14:57.660 In the long run,
00:14:58.700 I wonder if it'll all
00:14:59.880 mostly sort itself out.
00:15:02.920 Mostly.
00:15:03.700 I don't know.
00:15:04.360 We'll see.
00:15:04.740 But I worry about
00:15:06.320 those states surviving
00:15:07.560 that are going to be
00:15:09.420 driving away young people.
00:15:13.560 Are there some states
00:15:14.900 that will just drive away
00:15:15.980 their young people
00:15:16.800 and therefore be doomed?
00:15:19.300 Maybe.
00:15:20.720 Now, Elon Musk says
00:15:22.340 that the greatest risk
00:15:24.040 to the world is...
00:15:27.260 What?
00:15:28.020 What does Elon Musk say
00:15:29.740 is the biggest risk
00:15:31.060 to civilization?
00:15:32.560 It's a collapse of birth rate.
00:15:37.520 Apparently, the United States
00:15:38.680 has been below replacement
00:15:40.040 for decades.
00:15:42.440 But we don't notice it
00:15:43.840 because of immigration.
00:15:45.820 But here's what I wonder.
00:15:47.780 Let's say it's true
00:15:49.060 that civilized countries
00:15:51.520 all dip below
00:15:52.460 the replacement rate.
00:15:54.560 In other words,
00:15:55.300 every two people
00:15:56.620 who get married
00:15:57.500 are not creating
00:15:59.240 two extra kids
00:16:00.760 or more.
00:16:01.460 They're creating
00:16:01.920 fewer than two.
00:16:04.460 And eventually,
00:16:05.160 your economy will collapse
00:16:06.400 because then you end up
00:16:07.260 with a bunch of old people
00:16:08.220 and not enough young people
00:16:09.380 working.
00:16:10.200 So China's got a big problem there.
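To make the replacement-rate arithmetic concrete, here is a minimal sketch (all numbers hypothetical) of how fertility below two children per couple shrinks each successive generation absent immigration:

```python
# Minimal sketch, hypothetical numbers: each couple (2 people) averages
# `fertility` children, so each generation is fertility / 2 times the last.
def project_generations(initial_population, fertility, generations):
    sizes = [initial_population]
    for _ in range(generations):
        sizes.append(sizes[-1] * fertility / 2)
    return sizes

# Replacement is roughly 2.1 children per couple; try 1.7 (sub-replacement).
for gen, size in enumerate(project_generations(100_000_000, 1.7, 4)):
    print(f"generation {gen}: {size:,.0f}")
# generation 4 comes out near 52 million -- roughly half in four generations.
```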
00:16:12.000 What do you think
00:16:12.420 will happen with America?
00:16:15.620 The only thing
00:16:16.540 that will save America...
00:16:18.240 It's not the only thing,
00:16:19.980 but the thing that's obvious
00:16:22.020 would be immigration.
00:16:23.120 If America does
00:16:25.700 the Trump...
00:16:27.540 Let's say the Trump approach
00:16:28.880 in which it carefully
00:16:31.000 selects who it wants
00:16:32.340 to let in
00:16:33.040 and probably would
00:16:35.020 favor younger people,
00:16:36.540 I would imagine.
00:16:38.460 I would think so.
00:16:41.860 Wouldn't we be
00:16:42.820 in the best position
00:16:43.760 relative to other countries?
00:16:45.720 Because China's not
00:16:47.240 going to open up
00:16:47.820 immigration, right?
00:16:49.300 So they've got
00:16:50.200 a real problem.
00:16:50.820 We can just open
00:16:52.400 our border
00:16:52.960 and let's say
00:16:54.500 we vet people
00:16:55.280 for crime
00:16:55.960 and whatever else.
00:16:57.100 But if we just
00:16:57.680 opened our borders,
00:16:59.220 not just southern borders,
00:17:01.500 but open our borders
00:17:02.340 to, you know,
00:17:03.420 let's say any country
00:17:04.260 that's got a good
00:17:05.240 education system
00:17:06.540 and we can just
00:17:07.900 let people in,
00:17:09.640 don't we win?
00:17:11.280 Because I would think
00:17:12.260 we would be replacing
00:17:13.360 our young people
00:17:14.320 with immigrants.
00:17:16.020 They would be
00:17:16.600 leaving the places
00:17:17.800 where...
00:17:19.660 and maybe leaving them
00:17:20.780 to collapse.
00:17:23.220 But it seems like
00:17:24.200 we'll be okay,
00:17:25.220 we being the United States
00:17:26.760 and any country
00:17:27.760 that can attract immigrants.
00:17:29.580 So I think
00:17:30.260 if you can attract
00:17:31.100 immigrants,
00:17:31.680 you win,
00:17:32.180 and if you can't,
00:17:32.900 you're in trouble.
00:17:33.940 So China's in trouble
00:17:34.860 because their system
00:17:37.020 doesn't allow them
00:17:37.840 to bring in immigrants
00:17:38.620 at the rate they need.
00:17:40.180 Same with Japan.
00:17:41.700 Japan's got
00:17:42.300 the same problem.
00:17:43.260 They can't bring in
00:17:43.960 infinite immigrants.
00:17:45.720 So let me test
00:17:46.740 this on you.
00:17:48.560 I'm going to make
00:17:49.700 a confession here.
00:17:52.840 When I was working
00:17:53.900 in my corporate job,
00:17:55.260 we were often asked
00:17:56.160 to sign corporate documents
00:17:57.880 saying that we had
00:17:59.240 learned something
00:17:59.900 or we understood
00:18:00.860 something to be true
00:18:02.060 about safety
00:18:03.020 or something like that.
00:18:04.260 One of the things
00:18:05.060 I was asked to sign
00:18:06.160 was a statement
00:18:07.480 that said that
00:18:08.220 as an employee
00:18:09.040 of the company,
00:18:10.340 I understood
00:18:11.060 that diversity
00:18:12.160 was good.
00:18:13.400 And I refused
00:18:16.160 to sign it.
00:18:17.540 Not because
00:18:18.300 I don't like diversity,
00:18:20.120 just because
00:18:20.720 I didn't have any data
00:18:21.740 to support that opinion.
00:18:23.680 I thought,
00:18:24.560 well, compared to what?
00:18:26.560 Is Switzerland
00:18:27.460 having a big problem
00:18:28.540 because people
00:18:29.700 look alike?
00:18:30.600 Actually, Switzerland
00:18:31.360 is more diverse
00:18:32.080 than you think.
00:18:34.060 I thought,
00:18:35.000 well, compared to what?
00:18:36.940 So again,
00:18:37.740 I wasn't making
00:18:38.460 a social comment.
00:18:39.800 It was just
00:18:40.240 a data comment.
00:18:41.060 Why would I sign
00:18:42.560 something that says
00:18:43.360 I'm sure about it
00:18:44.340 when there's no data
00:18:45.600 one way or the other?
00:18:48.020 So I refused.
00:18:50.680 Did I get fired?
00:18:52.540 No,
00:18:53.320 because nobody
00:18:54.080 could argue
00:18:54.600 that the data existed.
00:18:56.440 So I just said,
00:18:57.460 if you have some data,
00:18:58.720 I'm happy to look at it.
00:18:59.940 If it looks good to me,
00:19:00.920 I'll sign it.
00:19:02.020 If you don't have any data,
00:19:03.340 why am I signing
00:19:04.000 that I know something
00:19:05.020 that's unknown?
00:19:07.240 All right?
00:19:07.900 Now I revise that.
00:19:10.120 So that was my opinion
00:19:11.040 25 years ago.
00:19:13.660 Here's my current opinion.
00:19:16.220 We would be totally,
00:19:17.440 we, America,
00:19:18.900 would be totally screwed
00:19:20.320 without diversity
00:19:21.240 at this point
00:19:22.740 because we have a country
00:19:24.460 that can allow in
00:19:25.500 people from anywhere.
00:19:28.180 That's our operating system.
00:19:31.140 Our operating system is,
00:19:33.340 yeah, you can come in
00:19:34.420 from anywhere.
00:19:35.240 We would like you
00:19:36.060 to have skills
00:19:36.680 and to be able
00:19:37.200 to add to the country.
00:19:39.200 You know,
00:19:39.420 people have different opinions
00:19:40.380 about how strict
00:19:41.160 you should be
00:19:41.680 about vetting them
00:19:43.120 for their skills.
00:19:45.100 But everybody would prefer
00:19:46.560 people who can take care
00:19:47.980 of themselves,
00:19:48.860 all things being equal.
00:19:50.840 And so I would argue
00:19:52.220 that diversity
00:19:53.360 as a strategic advantage
00:19:55.460 has been proven,
00:19:56.560 proven to my satisfaction.
00:19:59.920 So I would say
00:20:01.960 that the statement
00:20:02.840 diversity is an advantage,
00:20:05.780 especially when populations
00:20:07.680 are under risk of collapse.
00:20:10.700 I feel like
00:20:11.740 from a risk-reward standpoint,
00:20:14.020 it's a proven statement now.
00:20:15.840 So I would say
00:20:16.660 I don't need any data.
00:20:18.920 We have a strategy
00:20:20.140 that can work,
00:20:21.320 replacing people
00:20:22.220 with immigrants,
00:20:23.660 whereas China
00:20:24.500 has a strategy
00:20:25.240 that nobody can see
00:20:26.180 how that could work.
00:20:27.040 The numbers don't work.
00:20:29.900 So if you're just
00:20:30.540 looking at it rationally,
00:20:31.460 I'd say,
00:20:31.880 okay,
00:20:32.700 point made.
00:20:33.600 Diversity in the United States
00:20:34.920 became our,
00:20:36.060 it became a superpower.
00:20:38.260 Now,
00:20:38.540 if you said,
00:20:39.760 was I smart enough
00:20:41.160 to see that coming?
00:20:42.600 The answer is no.
00:20:44.140 No,
00:20:44.400 I did not see that coming.
00:20:46.060 But it's here.
00:20:49.760 Somebody says,
00:20:50.700 we're not strong
00:20:51.340 because of diversity.
00:20:52.820 We're strong
00:20:53.620 because of a common
00:20:54.600 American culture.
00:20:55.540 That's probably a better,
00:20:57.240 that's a fairer way
00:20:58.500 to say it.
00:20:59.820 But in the specific sense
00:21:01.300 of replacing young people,
00:21:03.820 it works.
00:21:05.780 All right?
00:21:06.280 We can do it
00:21:07.100 and China can't.
00:21:07.860 That's all I'm saying.
00:21:09.040 We can do it.
00:21:10.300 China doesn't have
00:21:11.060 that option
00:21:11.540 because they don't
00:21:12.960 deal with diversity.
00:21:15.480 To me,
00:21:16.100 that feels like
00:21:16.660 a slam dunk
00:21:17.380 positive for diversity.
00:21:20.220 And again,
00:21:20.960 I was surprised.
00:21:21.820 I didn't see it coming.
00:21:24.400 All right.
00:21:24.780 So I guess
00:21:28.120 Trump is
00:21:30.260 asking for a,
00:21:31.740 what do they call it,
00:21:32.560 a special master?
00:21:35.340 Yeah,
00:21:35.880 a special master
00:21:36.700 to independently
00:21:37.920 review any evidence
00:21:39.160 that came out
00:21:40.180 of the Mar-a-Lago raid.
00:21:41.740 And I guess
00:21:42.240 this special master
00:21:43.760 would have
00:21:44.320 top clearance
00:21:45.340 to see all the
00:21:47.180 secret stuff.
00:21:48.920 Now,
00:21:49.380 what do you think
00:21:49.760 of that strategy?
00:21:51.180 Why would Trump
00:21:51.880 ask for a special master?
00:21:53.600 And I think
00:21:55.080 the idea
00:21:55.540 is to keep
00:21:56.100 Schiff
00:21:56.520 and those people
00:21:57.360 away from the data,
00:21:58.280 right?
00:21:59.280 Because otherwise,
00:22:00.280 Schiff is going to
00:22:01.020 go in the SCIF
00:22:01.660 and say,
00:22:02.160 oh, yeah,
00:22:02.820 there's those
00:22:03.260 nuclear secrets
00:22:04.420 and there's your
00:22:05.780 love letter to Putin
00:22:06.660 and they won't exist.
00:22:08.760 But if you get
00:22:09.480 a special master,
00:22:11.380 maybe if it's,
00:22:12.820 you know,
00:22:13.040 somebody people trust,
00:22:15.080 maybe that special
00:22:16.560 master will say,
00:22:17.380 I looked at everything
00:22:18.140 and there doesn't
00:22:19.000 seem to be a problem
00:22:19.880 here.
00:22:21.020 Everybody relax.
00:22:21.600 So I think
00:22:23.500 it's a good play.
00:22:25.400 So I think
00:22:26.360 it's a good play
00:22:26.960 by Trump to ask
00:22:27.700 for a special master.
00:22:28.920 Because even if
00:22:29.680 he doesn't get one,
00:22:31.100 he's now on record
00:22:32.280 saying that
00:22:33.880 any other system
00:22:35.740 would look corrupt.
00:22:37.560 Because it would.
00:22:39.180 And now you're primed
00:22:40.440 to see anything
00:22:41.400 except a special master
00:22:42.800 as being below
00:22:44.380 the level of credibility
00:22:45.820 that you would look
00:22:46.540 for in your government.
00:22:47.340 So it's a good way
00:22:49.740 to prime you
00:22:51.720 even if he doesn't
00:22:52.560 get his special master.
00:22:54.920 And then I see
00:22:55.500 a lot of Democrats
00:22:56.760 are salivating
00:22:58.120 because the New York Times
00:22:59.600 reported that there are
00:23:00.860 about 300 classified documents.
00:23:04.960 Whoa.
00:23:06.520 Whoa.
00:23:08.080 When you thought
00:23:09.040 there were only maybe
00:23:09.920 three or four
00:23:10.620 classified documents,
00:23:12.380 maybe you weren't worried.
00:23:13.480 But now there's 300.
00:23:14.780 Whoa.
00:23:15.260 Whoa.
00:23:15.740 Whoa.
00:23:16.220 Whoa.
00:23:17.340 Does that mean anything?
00:23:19.900 Is that any news at all
00:23:21.680 that there are 300?
00:23:24.400 If you know
00:23:25.460 that the government
00:23:26.200 routinely overclassifies things,
00:23:29.320 what does the number 300
00:23:30.760 tell you
00:23:31.500 about the importance
00:23:32.940 of the materials?
00:23:34.640 Nothing.
00:23:35.820 Nothing.
00:23:36.860 It's no information at all.
00:23:38.760 If you start
00:23:39.620 with the assumption
00:23:40.200 that the government
00:23:41.140 overclassifies like crazy,
00:23:43.220 just to play it safe,
00:23:44.740 then if you have got
00:23:45.900 300 documents
00:23:46.900 that say classified
00:23:47.860 out of how many?
00:23:50.240 Is that 300
00:23:51.360 out of 3,000?
00:23:54.360 Because there are
00:23:55.180 boxes and boxes.
00:23:56.060 There must be
00:23:57.400 thousands of documents.
00:24:00.380 Let's say there are
00:24:01.060 10,000 documents.
00:24:02.740 If you had
00:24:03.300 10,000 documents
00:24:04.500 and 300 of them
00:24:05.800 were marked classified,
00:24:07.060 probably
00:24:07.980 overclassified,
00:24:11.520 does that tell you
00:24:14.180 anything about
00:24:14.860 how dangerous
00:24:15.640 this material is?
00:24:17.240 Not really.
00:24:18.580 It's no information
00:24:19.700 at all,
00:24:20.380 but the Democrats
00:24:21.260 are treating it
00:24:22.620 like the walls
00:24:23.500 are closing in.
00:24:25.200 Who is that Trump?
00:24:26.580 There's 300 of them now.
00:24:28.160 300.
00:24:28.900 Now he's in trouble.
00:24:29.620 All right.
00:24:33.180 This next topic
00:24:34.920 will be a little
00:24:36.340 problematic for some
00:24:37.800 of you,
00:24:38.460 a little painful.
00:24:40.480 And I'm going to
00:24:41.240 start out by telling
00:24:42.460 you that I created
00:24:43.240 an NPC identification
00:24:44.800 guide.
00:24:46.180 So this is how
00:24:47.320 you can identify
00:24:48.180 an NPC.
00:24:49.500 Now,
00:24:50.160 if you're new
00:24:51.420 to this topic,
00:24:53.220 it doesn't literally
00:24:54.840 mean they're NPCs,
00:24:56.180 and it doesn't
00:24:56.680 literally necessarily
00:24:58.080 mean we're living
00:24:58.960 in a simulation
00:24:59.720 like a game.
00:25:01.640 What I do know
00:25:02.780 is that there
00:25:03.600 are a group
00:25:04.360 of people who say
00:25:05.320 the most predictable
00:25:06.620 things in every
00:25:07.720 situation.
00:25:09.320 So I call them
00:25:10.240 NPCs because they
00:25:11.680 can only say
00:25:12.280 predictable things.
00:25:14.100 So here's a
00:25:14.940 further guide
00:25:15.940 on how to get them.
00:25:17.540 An NPC,
00:25:18.620 on Twitter anyway,
00:25:20.200 would attack the
00:25:21.060 person but not
00:25:21.840 the point.
00:25:23.380 So my typical
00:25:24.260 comment this morning
00:25:25.260 on Twitter is,
00:25:26.840 yes,
00:25:27.460 Scott is waking
00:25:28.620 up.
00:25:29.220 The difficult
00:25:29.920 comment on Twitter
00:25:31.080 about me today is,
00:25:33.100 I like this guy's
00:25:34.600 cartoons once,
00:25:36.340 but why is he
00:25:37.300 such an idiot?
00:25:38.840 And that's it.
00:25:40.820 Why am I such
00:25:41.740 an idiot?
00:25:43.220 Well,
00:25:43.520 did I say
00:25:43.960 something wrong?
00:25:45.020 And what was it?
00:25:46.300 What do you
00:25:46.900 disagree with?
00:25:48.280 So you'll see
00:25:49.020 a lot of those.
00:25:49.680 If somebody goes
00:25:50.260 after the person,
00:25:51.260 when it would be
00:25:51.780 just as easy to
00:25:52.600 attack the point,
00:25:53.860 probably an NPC.
00:25:55.340 Here's some more
00:25:55.980 ways to identify
00:25:56.980 an NPC.
00:25:58.520 An NPC imagines
00:25:59.740 they can read
00:26:00.420 minds and see
00:26:01.220 your motives.
00:26:02.840 So today,
00:26:03.480 a whole bunch
00:26:03.880 of people read
00:26:04.460 my mind and
00:26:05.980 told me in public
00:26:07.140 what I was
00:26:07.820 thinking and
00:26:08.380 feeling.
00:26:09.780 None of them
00:26:10.440 were right,
00:26:11.780 but none of
00:26:13.540 them were
00:26:13.880 embarrassed by
00:26:15.420 pretending to
00:26:16.200 know my inner
00:26:16.920 thoughts.
00:26:18.380 That wasn't
00:26:19.140 embarrassing to
00:26:19.860 them.
00:26:20.560 Those are
00:26:20.940 NPCs.
00:26:21.700 Real people
00:26:23.580 with, like,
00:26:25.040 functioning brains
00:26:25.840 know they
00:26:26.320 can't know
00:26:26.760 what other
00:26:27.100 people are
00:26:27.480 thinking.
00:26:28.180 You don't
00:26:28.740 know what
00:26:29.000 I'm afraid
00:26:29.460 of.
00:26:30.460 You don't
00:26:30.820 know what
00:26:31.240 my motives
00:26:31.740 are.
00:26:34.100 Almost ever.
00:26:36.660 Even when
00:26:37.280 you think you
00:26:37.740 do.
00:26:37.960 Even if I
00:26:38.460 admit it,
00:26:38.880 you probably
00:26:39.240 don't.
00:26:41.540 All right,
00:26:41.860 here's some
00:26:42.360 more.
00:26:43.680 NPCs believe
00:26:44.480 that history
00:26:45.080 repeats.
00:26:46.500 So if they've
00:26:47.060 seen something
00:26:47.620 in the past
00:26:48.200 that happened,
00:26:48.740 that's a
00:26:50.200 pretty good
00:26:50.840 indication that
00:26:51.700 that's going
00:26:52.180 to happen
00:26:52.440 again.
00:26:53.680 There's nothing
00:26:54.480 like that in
00:26:55.660 the real world.
00:26:56.940 History doesn't
00:26:57.620 repeat.
00:26:58.700 It can't,
00:27:00.060 because everything's
00:27:00.980 changed.
00:27:02.060 If it repeats,
00:27:03.040 it would be the
00:27:03.440 biggest coincidence
00:27:04.120 in the world.
00:27:05.800 Now, surely
00:27:06.780 there are red
00:27:08.660 flags that history
00:27:09.680 can show you.
00:27:10.960 History can
00:27:11.600 certainly say,
00:27:12.220 whoa, we've
00:27:12.640 seen this before.
00:27:14.200 You'd better
00:27:14.620 take a closer
00:27:15.500 look.
00:27:16.140 It can do
00:27:16.560 that for sure.
00:27:17.880 So if you're
00:27:18.320 saying it flags
00:27:19.260 things, I'm
00:27:19.920 with you
00:27:20.280 completely.
00:27:21.260 Oh yeah, it
00:27:21.720 flags things.
00:27:22.820 Wait a minute.
00:27:24.280 This looks a
00:27:24.980 little like the
00:27:25.480 Holocaust.
00:27:26.540 All right, that's
00:27:26.980 good.
00:27:27.880 It's good if you
00:27:28.580 see a pattern
00:27:29.080 developing, you
00:27:30.580 can flag it.
00:27:31.600 Where things go
00:27:32.400 wrong is to
00:27:33.180 believe it's
00:27:33.680 some kind of
00:27:34.160 reliable guide.
00:27:36.100 It's not a
00:27:36.880 reliable guide.
00:27:38.800 Ever.
00:27:40.280 Never.
00:27:41.800 Sometimes they
00:27:42.480 get it right,
00:27:43.380 but that would
00:27:43.880 be true of
00:27:44.380 guessing in
00:27:44.880 general.
00:27:46.160 Sometimes you
00:27:46.760 say, oh,
00:27:47.240 history repeats,
00:27:48.040 here it
00:27:48.340 comes again,
00:27:48.940 and sure
00:27:49.300 enough, there
00:27:49.700 it is.
00:27:51.020 But that's
00:27:51.480 because of
00:27:51.960 luck.
00:27:53.020 It's not
00:27:53.420 because you
00:27:53.940 could predict
00:27:54.700 based on that
00:27:55.700 different thing
00:27:56.360 that happened
00:27:56.760 in the past
00:27:57.360 that just sort
00:27:58.220 of reminds you
00:27:58.820 of it.
00:28:00.140 Yeah, I get it.
00:28:00.660 History rhymes.
00:28:01.560 I saw it a
00:28:02.140 million times.
00:28:03.980 All right.
00:28:07.840 Another
00:28:08.280 sign of an
00:28:09.480 NPC would be
00:28:10.400 somebody who
00:28:10.920 believes that
00:28:11.900 on a scientific
00:28:12.620 question, they
00:28:14.180 could do their
00:28:14.780 own research.
00:28:17.500 Well, you
00:28:18.120 can do your
00:28:18.620 own research,
00:28:20.280 but to
00:28:21.040 imagine that
00:28:21.680 the experts
00:28:22.300 weren't sort
00:28:23.000 of trying to
00:28:23.540 do that
00:28:23.880 themselves, and
00:28:25.620 you got a
00:28:26.360 different answer
00:28:26.900 than the
00:28:27.560 consensus of
00:28:28.320 experts, you
00:28:29.360 could be
00:28:29.720 right.
00:28:31.060 But how
00:28:31.440 would you
00:28:31.720 know?
00:28:33.260 Would you
00:28:33.860 bet on
00:28:34.260 yourself if
00:28:36.040 you were the
00:28:37.200 lone, rogue
00:28:38.080 person who did
00:28:39.460 your own research
00:28:40.160 and got the
00:28:40.860 different answer
00:28:41.900 than the, what
00:28:43.140 are the odds
00:28:44.060 that you're
00:28:44.520 right?
00:28:45.760 Well, you
00:28:46.160 don't know.
00:28:47.640 You really
00:28:48.100 don't know.
00:28:49.740 Now, if you
00:28:50.460 said to me,
00:28:52.160 Scott, sometimes
00:28:53.560 an ordinary
00:28:54.600 person can
00:28:55.380 disagree with
00:28:56.200 one ordinary
00:28:57.220 expert and
00:28:58.200 prevail, I
00:28:59.560 would say
00:28:59.800 that's actually
00:29:00.300 pretty common.
00:29:01.640 I've had that
00:29:02.200 experience many
00:29:03.260 times, where I
00:29:04.800 disagreed with
00:29:05.600 an expert and
00:29:06.520 I was right, and
00:29:07.780 I'm not an
00:29:08.180 expert.
00:29:09.060 Many, many
00:29:09.620 times.
00:29:10.440 So it works
00:29:11.060 on an
00:29:11.380 individual
00:29:11.740 basis.
00:29:12.720 Where it
00:29:13.160 works much
00:29:13.920 less well is
00:29:15.880 when it's you
00:29:16.600 against the
00:29:17.400 entire field of
00:29:18.320 experts all
00:29:18.960 over the
00:29:19.320 world.
00:29:21.360 Then the
00:29:22.640 experts all
00:29:23.580 over the
00:29:23.880 world might
00:29:24.360 have an
00:29:24.680 edge on
00:29:26.000 you.
00:29:26.780 But if
00:29:27.080 it's you
00:29:27.460 versus one
00:29:27.960 expert on
00:29:28.520 one question,
00:29:30.280 you could
00:29:31.000 know more
00:29:31.480 on one
00:29:32.000 question than
00:29:33.420 an expert.
00:29:34.320 That wouldn't
00:29:35.000 be that
00:29:35.340 unusual.
00:29:38.260 All
00:29:38.440 right.
00:29:39.620 So if
00:29:40.340 you believe
00:29:40.660 that doing
00:29:41.000 your own
00:29:41.500 research gets
00:29:42.460 you to
00:29:43.040 clarity and
00:29:43.940 certainty, I
00:29:45.180 would say that
00:29:45.800 you're someone
00:29:46.300 who doesn't
00:29:46.800 understand much
00:29:47.700 about research
00:29:48.560 or data.
00:29:50.020 But a lot of
00:29:50.520 people are in
00:29:51.240 that category, so
00:29:52.020 they think that
00:29:52.600 they can do
00:29:53.440 their own
00:29:53.720 research.
00:29:54.920 You'd be an
00:29:55.480 NPC if you
00:29:56.140 believed that
00:29:56.720 you saw a
00:29:57.240 chart on
00:29:57.700 Twitter and
00:29:58.880 that it was
00:29:59.380 real and
00:29:59.820 that it was
00:30:00.080 useful.
00:30:01.780 There's
00:30:02.260 nothing that's
00:30:02.980 a chart on
00:30:03.560 Twitter that's
00:30:04.060 useful or
00:30:05.160 credible.
00:30:06.220 It might be
00:30:06.880 true, it
00:30:08.300 might be
00:30:08.660 true, you
00:30:09.620 might even
00:30:10.060 be interpreting
00:30:11.120 it correctly.
00:30:12.520 But you
00:30:13.120 don't know.
00:30:14.540 You just
00:30:14.980 don't know.
00:30:16.680 It would
00:30:17.100 be luck if
00:30:18.140 it were
00:30:18.420 true and
00:30:19.240 you got it
00:30:20.220 correctly.
00:30:20.520 There's a
00:30:24.200 group of
00:30:24.620 people who
00:30:24.960 imagine they
00:30:25.420 can predict
00:30:25.880 the unknowable.
00:30:28.560 Do you
00:30:29.160 know what
00:30:29.720 will happen
00:30:30.220 in five
00:30:31.200 years for
00:30:32.180 anything?
00:30:33.780 Much less a
00:30:34.660 medical thing.
00:30:36.200 We don't know
00:30:36.940 what will happen
00:30:37.360 with anything in
00:30:38.060 five years.
00:30:39.140 So if you
00:30:39.540 think you know
00:30:40.020 what's going
00:30:40.280 to happen with
00:30:40.620 a medical
00:30:41.060 thing in five
00:30:41.740 years, well
00:30:42.860 maybe, but
00:30:44.620 we're pretty
00:30:45.380 much wrong
00:30:45.800 about everything.
00:30:47.320 So if you
00:30:48.300 made a decision
00:30:49.580 based on
00:30:50.100 knowing what
00:30:50.620 would happen
00:30:50.980 in five
00:30:51.380 years, well
00:30:52.380 good luck
00:30:53.440 with that.
00:30:56.800 NPCs also
00:30:57.540 like to change
00:30:58.180 the definition
00:30:58.800 of words and
00:30:59.980 then be done.
00:31:01.940 Well, I
00:31:02.360 changed the
00:31:02.860 definition of
00:31:03.380 that word and
00:31:03.940 now I'm done.
00:31:05.140 No, that's
00:31:05.660 not reasoning.
00:31:06.820 That's not
00:31:07.460 data.
00:31:07.820 That's not an
00:31:08.320 argument.
00:31:08.880 You just
00:31:09.260 changed a
00:31:09.720 word.
00:31:10.660 That's not
00:31:11.120 anything.
00:31:13.800 And here's
00:31:14.560 my best one.
00:31:16.720 An NPC is
00:31:17.700 skeptical of
00:31:18.820 official data.
00:31:20.100 Now you're
00:31:23.000 saying to
00:31:23.320 yourself, wait
00:31:23.760 a minute, I'm
00:31:24.900 skeptical of
00:31:25.660 official data.
00:31:27.080 Right.
00:31:28.380 You should be.
00:31:29.200 So am I.
00:31:30.460 So like you, I
00:31:31.740 am skeptical of
00:31:33.480 official data, like
00:31:34.500 NPCs.
00:31:35.460 So this would be
00:31:36.260 something we all
00:31:36.900 have in common.
00:31:38.760 NPCs, me, you,
00:31:40.820 we're all skeptical
00:31:41.940 of official data.
00:31:43.000 But I'm more
00:31:45.700 skeptical than
00:31:46.760 most people because
00:31:48.000 I also don't
00:31:49.460 trust an
00:31:50.060 engineered virus.
00:31:53.280 So there were a
00:31:54.200 lot of people
00:31:54.620 arguing with me
00:31:55.440 today who
00:31:56.700 didn't trust a
00:31:57.660 vaccine that
00:31:58.680 had not been
00:31:59.240 sufficiently tested
00:32:00.180 and then turned
00:32:01.680 out not to stop
00:32:02.520 transmission.
00:32:04.140 So is that
00:32:05.300 reasonable?
00:32:06.100 Is it reasonable
00:32:06.980 to not trust a
00:32:08.340 vaccine that
00:32:09.000 hasn't been tested
00:32:09.840 long enough?
00:32:12.540 And it's never
00:32:13.580 been done before.
00:32:15.060 Well, why would
00:32:15.700 you trust that?
00:32:16.880 No, no.
00:32:17.680 It's not reasonable
00:32:18.300 to trust something
00:32:19.020 like that.
00:32:20.700 But why would
00:32:21.660 you trust the
00:32:22.380 alternative either?
00:32:24.040 Why would you
00:32:24.840 trust an
00:32:25.380 engineered virus
00:32:26.580 would only have
00:32:28.220 a short-term
00:32:29.020 effect on you?
00:32:31.160 Why would you
00:32:31.960 trust that?
00:32:34.400 So I'm more
00:32:35.540 skeptical than
00:32:36.760 the skeptics.
00:32:37.720 But the skeptics
00:32:38.840 believe I'm less
00:32:39.680 skeptical because
00:32:41.220 they don't
00:32:41.600 understand what
00:32:42.560 more skeptical
00:32:43.320 looks like.
00:32:44.620 More skeptical
00:32:45.220 says, I don't
00:32:45.960 know anything
00:32:46.400 about that virus,
00:32:47.620 but that's no
00:32:48.340 normal virus.
00:32:49.860 We all agree
00:32:50.480 with that, right?
00:32:51.880 Would you agree
00:32:52.460 with those two
00:32:52.980 statements?
00:32:53.480 I don't know
00:32:54.040 anything about
00:32:54.620 this virus,
00:32:55.860 roughly speaking,
00:32:57.040 and it's an
00:32:58.420 engineered virus.
00:33:00.060 So that makes
00:33:00.980 the unknowns way
00:33:02.940 higher, I would
00:33:04.800 think.
00:33:05.060 It's from a
00:33:10.780 bat, somebody
00:33:11.300 says.
00:33:11.800 It's from a
00:33:12.360 bat.
00:33:13.220 Yeah, I think
00:33:13.760 the bat
00:33:14.140 hypothesis, I'm
00:33:16.000 going to rule
00:33:16.460 out, I
00:33:17.880 suppose anything's
00:33:19.000 possible, but
00:33:19.800 when you hear the
00:33:20.280 experts talking
00:33:20.980 about it, it
00:33:22.280 would be pretty
00:33:23.520 unlikely that it
00:33:24.300 came from an
00:33:24.800 animal.
00:33:26.600 It may have
00:33:27.380 come from an
00:33:27.880 animal before it
00:33:28.700 got modified.
00:33:31.720 Anyway, so
00:33:32.560 those are the
00:33:33.120 NPC identifiers.
00:33:35.980 And then the
00:33:37.240 biggest difference
00:33:37.940 between what I
00:33:39.840 thought about the
00:33:40.820 pandemic and what
00:33:42.460 other people
00:33:42.880 thought is that I
00:33:44.300 included all of
00:33:45.240 the risks.
00:33:47.080 The people who
00:33:47.660 disagree with me
00:33:48.540 generally just
00:33:50.340 ignore one of the
00:33:51.340 biggest risks.
00:33:53.060 Could be a
00:33:53.560 different one.
00:33:54.980 But there's
00:33:55.580 nobody, I don't
00:33:56.620 believe there's
00:33:57.060 anybody who's
00:33:57.680 looked at all the
00:33:58.460 risks.
00:33:59.220 I mean just the
00:34:00.040 categories, I'm
00:34:00.760 not even talking
00:34:01.260 about a deep
00:34:01.800 dive, who
00:34:03.020 disagrees with
00:34:03.760 me.
00:34:06.340 I think
00:34:07.120 everybody who
00:34:07.880 is skeptical of
00:34:10.400 both the virus
00:34:11.240 and the
00:34:11.760 vaccination ends
00:34:14.080 up where I
00:34:14.600 did.
00:34:17.780 All right.
00:34:21.700 Our health
00:34:22.460 system can take
00:34:23.260 care of most
00:34:23.840 illnesses versus
00:34:24.920 100 years ago.
00:34:26.740 True.
00:34:28.260 True.
00:34:29.740 Now, most of
00:34:30.840 the people who
00:34:31.260 disagree with me
00:34:31.900 also have a
00:34:32.620 sense that I
00:34:33.500 said something I
00:34:34.220 didn't say.
00:34:35.360 But have I ever
00:34:36.780 mentioned that
00:34:37.680 every time I
00:34:39.720 think I'm just
00:34:40.280 observing the
00:34:41.020 news, I get
00:34:41.580 dragged into it,
00:34:42.800 and then suddenly
00:34:43.440 I'm the story?
00:34:44.840 It's like, no, I
00:34:45.760 just want to talk
00:34:46.360 about the news.
00:34:47.220 I don't want to be
00:34:47.880 the news.
00:34:48.960 So that's the Joe
00:34:49.700 Rogan and Bill
00:34:50.620 Maher problem.
00:34:51.880 They want to talk
00:34:52.780 about the news, and
00:34:53.520 then they become the
00:34:54.440 news.
00:34:54.680 So forgetting, as
00:34:57.020 I did, that Alex
00:35:00.040 Berenson was back
00:35:01.280 on Twitter, he
00:35:03.500 weighed in on one
00:35:04.820 of my tweets, and
00:35:06.640 let me tell you how
00:35:08.160 that went.
00:35:10.640 By the way, I was
00:35:11.620 testing yesterday some
00:35:13.620 software to do
00:35:14.380 interviews, and so
00:35:16.640 maybe I'll invite him
00:35:18.240 to do a test of
00:35:19.600 the technology.
00:35:21.360 But here's how
00:35:25.600 that went.
00:35:33.120 Where am I?
00:35:34.360 Oh, so we're
00:35:34.920 talking about excess
00:35:35.740 deaths after the
00:35:36.520 pandemic.
00:35:37.680 So I tweeted
00:35:39.920 yesterday or the
00:35:41.240 day before, is
00:35:42.100 there any credible
00:35:42.860 reporting on excess
00:35:43.960 deaths?
00:35:44.800 Because I keep
00:35:45.360 seeing these reports
00:35:46.200 that the number of
00:35:48.200 people dying above
00:35:50.360 what was expected
00:35:51.300 might be an
00:35:52.840 indication that the
00:35:54.020 vaccinations are
00:35:54.880 killing people.
00:35:56.340 And then other
00:35:56.920 people say, no, it's
00:35:58.020 the delayed health
00:35:59.600 care, or it's
00:36:00.780 people getting fat,
00:36:02.040 or it's whatever.
00:36:04.440 So the topic is
00:36:07.920 excess deaths.
00:36:10.120 Now, if you were
00:36:11.960 Alex Berenson, what
00:36:14.520 would prove you
00:36:15.560 right all along?
00:36:18.200 If you were Alex
00:36:18.900 Berenson, the thing
00:36:19.940 that would sort of
00:36:21.760 prove that he had
00:36:23.100 been right since the
00:36:23.980 start would be if the
00:36:25.600 excess deaths are
00:36:27.080 high, they're
00:36:29.300 everywhere that the
00:36:30.420 vaccination happened,
00:36:32.020 which he points out
00:36:32.900 that is mostly in
00:36:34.180 places where the
00:36:34.940 mRNA vaccination
00:36:36.000 happened, there are
00:36:37.640 these reported
00:36:38.220 excess deaths.
00:36:40.160 And so Alex Berenson
00:36:46.460 said in his reply to
00:36:49.220 my tweet in which I
00:36:50.080 asked if anybody has
00:36:50.960 good excess death
00:36:52.040 data, he said, but
00:36:54.980 yeah, he said, yes, the
00:36:55.940 excess deaths are real
00:36:57.180 and happening in all or
00:37:00.120 nearly all of the mRNA
00:37:01.440 vaccinated countries.
00:37:02.760 Wow.
00:37:05.620 If that's true, what
00:37:07.240 Alex Berenson is saying,
00:37:08.760 that the excess deaths
00:37:10.140 are happening in almost
00:37:11.140 all of the mRNA
00:37:12.060 vaccinated countries,
00:37:14.100 that's a pretty big red
00:37:15.760 flag, isn't it?
00:37:17.780 Big red flag?
00:37:19.400 Would you agree?
00:37:21.340 Big red flag.
00:37:23.420 Let's keep going.
00:37:25.380 And he says people have
00:37:26.620 offered several
00:37:27.320 explanations, right?
00:37:28.280 So he's open to the
00:37:29.880 fact that there's more
00:37:31.040 than one explanation
00:37:31.980 and that it's not
00:37:33.520 settled yet.
00:37:34.140 So, so far we're on
00:37:35.000 the same page.
00:37:36.740 It's not yet settled.
00:37:39.280 But Alex says, but to
00:37:40.760 my mind, the most
00:37:41.900 obvious explanation for
00:37:44.420 the excess deaths is
00:37:45.580 that the shots are
00:37:46.420 implicated.
00:37:47.540 So he thinks that
00:37:48.240 would be the most
00:37:48.760 credible.
00:37:49.900 So I responded to him
00:37:51.100 by saying, historically
00:37:52.340 speaking, wouldn't the
00:37:53.520 most obvious explanation
00:37:54.880 of shocking data, and it
00:37:57.800 wouldn't matter what
00:37:58.980 topic you were on.
00:38:00.280 If there's some new
00:38:01.280 shocking data, I said,
00:38:04.100 isn't the most obvious
00:38:05.680 explanation that the
00:38:06.700 data is wrong?
00:38:08.720 That's the most obvious
00:38:09.800 explanation for
00:38:10.580 everything.
00:38:11.660 Forget about the
00:38:12.860 pandemic.
00:38:14.260 You just get a new
00:38:15.560 shocking study on
00:38:16.700 anything.
00:38:17.640 What's the most obvious
00:38:18.620 explanation of why the
00:38:19.800 new data is different
00:38:21.260 from all the other
00:38:22.020 data?
00:38:24.340 The most obvious
00:38:25.340 reason is the data's
00:38:26.380 wrong.
00:38:26.620 So that's what I
00:38:27.900 tweeted.
00:38:28.920 I said, historically
00:38:29.620 speaking, the most
00:38:30.440 obvious explanation of
00:38:31.500 shocking data is that
00:38:32.900 the data's wrong.
00:38:34.300 And I said, that's 95%
00:38:35.620 of what we see when data
00:38:36.760 is shocking us.
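As a rough illustration of that "shocking data is usually wrong data" heuristic, here is a toy Bayes calculation; every probability in it is an assumption chosen for illustration, not a measured number:

```python
# Toy Bayes sketch (all probabilities assumed): how likely is "the data is
# wrong" once you have seen a shocking result?
p_flawed = 0.30            # assumed prior: a given analysis has a serious flaw
p_real = 0.05              # assumed prior: a genuinely shocking effect exists
p_shock_if_flawed = 0.5    # flawed analyses often produce surprises
p_shock_if_real = 1.0      # a real shocking effect looks shocking
# Analyses that are neither flawed nor capturing a real effect are assumed
# not to produce shocking results.

p_shock = p_flawed * p_shock_if_flawed + p_real * p_shock_if_real
p_flawed_given_shock = p_flawed * p_shock_if_flawed / p_shock
print(f"{p_flawed_given_shock:.0%}")  # 75% under these assumed numbers
```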
00:38:39.720 Now, Alex came back and
00:38:42.280 mocked me in public for
00:38:43.500 that.
00:38:44.640 He mocked me, and he
00:38:46.720 said, denial.
00:38:48.480 It's not just a river in
00:38:49.960 Egypt.
00:38:50.280 Good one, Alex.
00:38:57.020 Good one.
00:38:57.960 Have you ever heard of
00:38:58.620 that one before?
00:39:00.240 Because there's a river in
00:39:02.160 Egypt called the Nile, and
00:39:06.160 then this word denial sounds
00:39:08.320 like the Nile a little bit.
00:39:10.300 And so he's cleverly, I think
00:39:12.080 this is his own thing.
00:39:13.100 He probably came up with
00:39:13.860 this.
00:39:14.140 It's very good.
00:39:14.740 I think this might last.
00:39:17.400 You might hear this one
00:39:18.200 again.
00:39:19.340 Denial.
00:39:20.020 It's not just a river in
00:39:21.420 Egypt.
00:39:22.320 Pretty good.
00:39:23.680 It's a strong play.
00:39:25.680 Strong play.
00:39:26.520 Now, I got a little worried
00:39:28.600 when I read that, because I'm
00:39:29.620 like, oh shit, he's good.
00:39:32.120 This guy's got a lot of game.
00:39:34.160 He came up with a saying that
00:39:36.460 I think is going to last the
00:39:38.340 ages.
00:39:39.140 And he hasn't even gotten to
00:39:40.200 his point yet.
00:39:40.780 All right, let's get to the
00:39:41.700 point.
00:39:43.160 He goes, here's Scott Adams
00:39:45.300 speculating the, well, I'm
00:39:47.620 going to read it in the
00:39:48.500 attitude that I feel like he
00:39:51.260 was thinking.
00:39:53.400 I don't know.
00:39:54.920 But I feel like if he had
00:39:56.060 been saying it, it would have
00:39:57.060 sounded a little like this
00:39:58.220 instead of writing it.
00:40:00.060 Here's Scott Adams speculating
00:40:01.960 the reason mortality is
00:40:03.520 spiking worldwide in the
00:40:06.260 mRNA countries is that dot,
00:40:09.360 dot, dot,
00:40:09.860 advanced nations suddenly
00:40:12.360 forgot how to count deaths.
00:40:16.700 There are several possibilities
00:40:18.360 for what's happening, he
00:40:19.520 allows.
00:40:20.680 That isn't one.
00:40:22.880 So the one we're going to
00:40:24.140 rule out, that apparently I
00:40:27.280 ruled in because of my
00:40:28.580 denial, which is also a river
00:40:32.100 in Egypt, if you didn't know
00:40:33.440 that.
00:40:34.640 Well, my denial is really
00:40:37.940 blazingly obvious here because
00:40:39.640 here I am denying that data
00:40:42.640 from experts could be, well,
00:40:46.000 actually, I was skeptical that
00:40:48.240 data from the experts is
00:40:49.700 accurate.
00:40:51.260 Huh.
00:40:51.380 But I'm talking to Alex Berenson.
00:40:59.880 What is Alex Berenson most famous
00:41:02.200 for?
00:41:04.380 Correct me if I'm wrong, but I
00:41:06.360 think he's most famous for being
00:41:09.280 skeptical about data.
00:41:11.500 So when I'm skeptical about data,
00:41:17.280 Alex Berenson says, come on.
00:41:20.480 That's a little bit of denial.
00:41:22.260 There's your river in Egypt.
00:41:24.200 But when he's skeptical about data,
00:41:27.000 it's just good reasoning.
00:41:30.520 Okay.
00:41:31.180 But, well, okay, let's, I have to be
00:41:34.260 fair, I am misrepresenting his
00:41:36.120 argument.
00:41:37.240 I am.
00:41:37.740 I am misrepresenting his argument.
00:41:40.720 His argument is that the data in
00:41:44.860 all the countries is being counted
00:41:47.420 the same way it's always been
00:41:48.760 counted.
00:41:50.500 Oh.
00:41:51.940 Okay.
00:41:52.600 That's a good point, isn't it?
00:41:54.640 Ouch.
00:41:55.960 Oh, shit.
00:41:56.620 I thought I was arguing about the
00:41:58.000 credibility of the analysis, but he's
00:42:02.520 just saying if they counted it the same
00:42:04.420 way before as they're counting it now,
00:42:06.080 and you got all these excess deaths,
00:42:09.240 even if they were counting it slightly
00:42:10.760 wrong, I'm assuming this is his
00:42:13.460 argument, even if they're counting it
00:42:15.160 slightly wrong, it's this, they're
00:42:18.180 using the same method.
00:42:19.640 So something's going on.
00:42:22.060 Shit.
00:42:23.740 That's a good point, isn't it?
00:42:27.280 So he totally nailed me in public.
00:42:31.060 I'm defeated.
00:42:31.920 Man, have you ever seen me this
00:42:37.060 completely slapped down in public?
00:42:41.620 It's kind of humiliating, really.
00:42:43.900 Hurts.
00:42:45.080 It's painful.
00:42:46.180 It's painful, really.
00:42:49.220 So about 10 minutes later, somebody
00:42:51.600 tweeted a long thread from somebody who
00:42:54.320 seemed very capable, explaining that the
00:42:56.600 data in every country had, in fact, been
00:42:59.000 misinterpreted, and then he showed his
00:43:02.580 work.
00:43:04.580 Turns out, if you adjust the data for age
00:43:07.420 and demographic shifts over the recent
00:43:10.080 years, the baseline's exactly where it was.
00:43:15.000 There are no excess deaths.
00:43:17.140 The exception is England, because their
00:43:20.120 emergency care system fell apart.
00:43:22.200 That's right.
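To show what "adjust the data for age and demographic shifts" means in practice, here is a minimal sketch with made-up numbers: expected deaths are computed per age band, so an older population raises the baseline even when the age-specific death rates haven't changed at all:

```python
# Hypothetical illustration: identical age-specific death rates produce more
# total deaths once the population skews older, so raw counts can look like
# "excess deaths" against a baseline that ignores the demographic shift.
death_rate_per_1k = {"0-39": 1.0, "40-64": 4.0, "65+": 30.0}  # assumed rates

pop_then = {"0-39": 50_000, "40-64": 30_000, "65+": 20_000}
pop_now = {"0-39": 46_000, "40-64": 30_000, "65+": 24_000}    # older age mix

def expected_deaths(population):
    return sum(population[band] * death_rate_per_1k[band] / 1000
               for band in population)

print(expected_deaths(pop_then))  # 770.0
print(expected_deaths(pop_now))   # 886.0 -- looks "excess" vs. 770, no new risk
```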
00:43:27.160 So what about my 95% estimate that it's the
00:43:31.460 data that's wrong?
00:43:33.860 The data was wrong.
00:43:36.860 Now, do I trust the new analysis that says
00:43:41.200 if you properly look at the data, there's
00:43:43.940 no problem?
00:43:45.380 Oh, we'll get to the insurance actuaries.
00:43:47.160 How about I take care of the insurance
00:43:50.780 actuaries with three words?
00:43:54.980 Follow the money.
00:43:56.920 What does the insurance actuary want you to,
00:43:59.420 the insurance company, what does the insurance
00:44:01.780 company want you to believe about the risks?
00:44:06.220 They want you to think there's big risks, because
00:44:08.200 they're charging you more because there's big
00:44:09.620 risks.
00:44:10.020 So if the insurance companies tell you that
00:44:13.840 there's a bigger risk of dying, do you
00:44:17.360 believe them?
00:44:20.300 If you believe that, that's the least
00:44:24.120 credible thing that could ever be in the
00:44:25.800 world.
00:44:26.520 Somebody who is literally paid for how much
00:44:30.120 risk there is, doing their own internal
00:44:32.540 analysis, not showing you the data, it's
00:44:35.060 proprietary, but trust us, we looked at this
00:44:37.920 data and, wow, big risk.
00:44:41.180 Now, in the long run, in the long run, other
00:44:46.180 insurance companies, once they realize that
00:44:50.780 this is a phantom and it's not real, they can
00:44:53.420 lower their prices for the same product, same
00:44:57.060 insurance, and they would get more of the
00:44:58.800 business.
00:45:00.180 So eventually, if these estimates are not real,
00:45:03.700 eventually they'll be driven out of the pricing
00:45:06.080 over time.
00:45:07.760 But what happens on day one?
00:45:10.720 Let me take you to the meeting of the
00:45:12.560 insurance companies.
00:45:13.800 All right?
00:45:14.060 We're all in the meeting.
00:45:15.480 All right, does anybody have any data that
00:45:19.260 shows what's happening?
00:45:20.840 Yes?
00:45:21.280 Actuarial?
00:45:21.900 Tell me.
00:45:22.680 Well, I have some preliminary data that looks
00:45:25.320 like the risks are higher, but I don't know if
00:45:27.880 this will hold true.
00:45:29.500 Maybe we're looking at it wrong.
00:45:31.060 This could be wrong, but preliminarily it looks
00:45:33.520 like maybe there's more risks.
00:45:36.040 Boss says, we're going with that.
00:45:38.340 We're going with that because we can charge
00:45:40.540 more.
00:45:41.540 If we're wrong and we charge more, we just
00:45:43.720 made more money.
00:45:44.760 If we're wrong the other way, we lose money.
00:45:47.900 So of course you're going to say there's a
00:45:49.660 risk.
00:45:50.240 Of course you're going to say it's big.
00:45:51.960 And of course you're going to raise your rates
00:45:53.900 in the short run.
00:45:55.020 Now let's say you're a competing company and
00:45:59.660 you just heard that your competitor checked and
00:46:02.660 says, oh, there's more risk and they raised their
00:46:04.860 rates.
00:46:05.840 What are you going to do?
00:46:07.240 On day one, and this
00:46:10.500 is the important part, on day one, are you going
00:46:12.040 to say, no, our data doesn't look like that.
00:46:14.880 We're going to go low because we don't see that
00:46:17.340 risk and we'll out-compete you by selling more
00:46:20.720 insurance.
00:46:21.160 Nope.
00:46:23.900 On day one, everybody says, somebody's saying
00:46:27.300 there's more risk.
00:46:28.660 Does it sound credible?
00:46:30.520 Well, if they're saying it, maybe we should take
00:46:32.440 a look at our numbers too, until we can say it too.
00:46:34.440 Because we can make more money if we agree with
00:46:36.480 them than if we disagree in the short run.
00:46:40.100 In the long run, clarity will drive the price
00:46:43.500 down if there's a reason to drive it down.
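The payoff asymmetry in that meeting is easy to put into numbers. A toy sketch, with premiums and claim costs invented purely for illustration:

```python
# Toy model of the "day one" pricing decision. All numbers are invented.
# Nobody knows yet whether the new mortality risk is real.

claims_if_real     = 1200.0  # expected claims per policy if the risk is real
claims_if_artifact = 1000.0  # expected claims per policy if it's a data artifact

price_high = 1300.0  # "our actuary saw something -- charge for it"
price_low  = 1050.0  # "we don't believe it -- undercut the market"

def profit(premium, actual_claims):
    """Underwriting profit per policy once the truth is known."""
    return premium - actual_claims

# The two ways of being wrong are not symmetric:
print(profit(price_high, claims_if_artifact))  #  300.0 -- wrong, but you made money
print(profit(price_low, claims_if_real))       # -150.0 -- wrong, and you lost money
```

Being wrong on the high side is pure margin; being wrong on the low side is an underwriting loss. That's why, on day one, every insurer's safest move is to agree the risk is real.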
00:46:45.780 So, what was that about?
00:46:54.780 Anyway, so the bottom line is that there is one
00:47:00.140 very strong analysis.
00:47:01.540 In my opinion, I can't tell who's right or wrong
00:47:04.320 on these deep dives and experts I've never met
00:47:08.100 and all that.
00:47:08.840 So I don't know who's right.
00:47:10.340 But I will say that there's a strong analysis that
00:47:13.700 sounds pretty solid, that there is no excess mortality.
00:47:20.960 So my 95% estimate that the problem is usually the data
00:47:28.100 probably is going to work out this time as well.
00:47:31.600 Let me tell you where I learned that trick.
00:47:34.840 I used to work in a technology laboratory at the phone
00:47:38.200 company, the local phone company.
00:47:40.460 And my job, along with my coworkers, was connecting
00:47:44.480 various pieces of brand new technology to see if it all
00:47:47.920 interoperated with our network.
00:47:49.780 Because lots of times it would be like a new phone or
00:47:51.660 something that just didn't work.
00:47:52.760 So we'd be connecting various devices, computers, and
00:47:56.540 processors, and modems, and things together.
00:47:58.860 And when it didn't work, what did you look at first?
00:48:04.200 Well, you looked at the software, the compatibility.
00:48:07.900 You made sure your cables were connected.
00:48:10.380 You rebooted.
00:48:11.440 You did all those things.
00:48:12.700 Do you know what turned out to be the problem usually?
00:48:16.540 Actually, not most, but let's say the plurality, the greatest number
00:48:21.940 of problems.
00:48:22.560 Do you know what it was?
00:48:24.280 Bad cables.
00:48:26.380 Not the wrong cable.
00:48:28.760 But a physical cable that has no moving parts would just stop
00:48:34.540 working.
00:48:35.700 And it would never work again.
00:48:37.860 I never figured out why.
00:48:39.820 Yeah, so even plugging and unplugging didn't make any
00:48:42.520 difference in this context.
00:48:44.200 And I never figured out why.
00:48:45.500 But the only thing I do know is that the thing that my brain
00:48:49.740 said should be the least likely problem was the most likely
00:48:53.280 problem.
00:48:54.940 But my brain could never understand why that was the most
00:48:57.960 likely problem.
00:48:59.080 And it took me a long time to learn to check the cable first.
00:49:03.900 But by the time I had enough experience, I would check the
00:49:07.020 cable first.
00:49:08.820 And, you know, I was happy that I did that because the
00:49:12.280 number of times that was a problem was amazing.
00:49:15.500 Yeah, they're metal wires.
00:49:18.880 You know, I understand there's stress.
00:49:20.560 But logically, you don't think that you've manipulated them
00:49:23.640 enough to break them.
00:49:24.780 And some of them were new and didn't work.
00:49:27.360 I mean, I had brand new cables that didn't work.
00:49:29.800 So, your common sense of what is likely to be true is not very reliable.
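That lab lesson translates directly into how you order your checks: rank components by how often they've actually failed, not by how plausible the failure feels. A small sketch, with failure shares invented for illustration:

```python
# Toy version of the lab lesson: order diagnostic checks by how often each
# component has historically been the culprit. All rates are invented.

observed_failure_share = {
    "software_compatibility": 0.20,
    "configuration":          0.15,
    "needs_reboot":           0.25,
    "bad_cable":              0.40,  # the "impossible" one wins the plurality
}

def check_order(shares):
    """Most frequent culprit first -- minimizes expected checks before the fix."""
    return sorted(shares, key=shares.get, reverse=True)

print(check_order(observed_failure_share))
# ['bad_cable', 'needs_reboot', 'software_compatibility', 'configuration']
```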
00:49:38.860 And so I say, Alex Berenson's point that all of the industrialized countries can't all
00:49:47.460 be bad at counting their dead.
00:49:50.540 Logically, that sounds pretty good, doesn't it?
00:49:54.320 That they can't all be bad at counting their dead.
00:49:56.600 But that's not exactly what's the problem.
00:50:00.600 The problem is not that they counted them wrong.
00:50:02.760 It's that the people who analyzed that count after the fact forgot to adjust for demographics and
00:50:07.740 age.
00:50:08.480 If you don't do that, it doesn't matter if the data is correct.
00:50:11.540 You're going to get the wrong answer either way.
00:50:13.040 So, do you think I should ask Alex Berenson to come on and talk to him?
00:50:27.520 Because I feel like the conversation would go this way.
00:50:31.440 I feel like it would be a waste of time, and here's why.
00:50:34.500 If I bring him on, he'll say the blah, blah, blah study shows I'm right.
00:50:39.480 And then what am I going to do?
00:50:41.640 Where's the interview go from there?
00:50:43.880 There's a new study, and it shows I'm right about everything.
00:50:47.980 That's the end of the conversation.
00:50:50.040 I haven't seen the study.
00:50:51.680 And if I saw it, I wouldn't know if it were true.
00:50:54.880 So, I don't think there's any place to go with that conversation.
00:50:58.960 Is there?
00:51:04.960 You are not at all well.
00:51:06.740 Is that for me?
00:51:10.340 Yeah, it's like interviewing one expert.
00:51:12.420 Exactly.
00:51:14.020 What would be useful would be to have somebody who is, you know, opposite in opinions from Alex Berenson,
00:51:21.280 but also an expert.
00:51:23.240 Alex Berenson brings his expert, data expert.
00:51:26.240 The other expert brings another data expert.
00:51:28.140 And then, you know, I host a debate in which there's lots of time, so you can get through it.
00:51:34.780 Why would you trust a potential study, but not data?
00:51:43.020 Right.
00:51:43.780 So, it all could be wrong.
00:51:45.320 Look into euthanasia numbers.
00:51:51.280 Well, again, there's no difference in the excess deaths.
00:51:58.460 So, there's nothing to look into beyond that.
00:52:06.160 Dropping the Stalin-level facts.
00:52:08.160 What about that?
00:52:08.700 She's quite the MILF.
00:52:14.400 The CDC director.
00:52:15.800 Does anybody else have a crush on the CDC director?
00:52:20.800 Walensky?
00:52:21.320 Oh, that's funny.
00:52:28.780 I actually got attacked by a literal Karen the other day on social media.
00:52:34.920 Somebody, I forget what it was.
00:52:37.220 Somebody came after me for something.
00:52:39.240 Her name was actually Karen.
00:52:43.260 Pharaoh Zen.
00:52:44.760 Could everybody say hi to Pharaoh Zen over on YouTube?
00:52:50.080 Pharaoh Zen, I am glad you finally made it.
00:52:54.220 Thank God.
00:52:56.480 He asked me to recognize him, and I thought, it's about time.
00:53:00.280 Do you know how many times we've been on here without saying hi to Pharaoh Zen?
00:53:07.860 I mean, it's time.
00:53:09.940 It's time.
00:53:14.560 Would I want to get LASIK eye surgery?
00:53:17.060 I've looked into it.
00:53:17.860 I'm not a candidate.
00:53:19.480 My eye structure doesn't work for that.
00:53:22.420 But I would otherwise.
00:53:28.020 Who should you trust?
00:53:30.280 All right.
00:53:38.420 So that's all I've got for now.
00:53:40.760 Hypnotists are more susceptible to hypnotism.
00:53:44.360 Wake up, Scott.
00:53:47.720 Actually, that's not true.
00:53:50.140 Hypnotists are not more susceptible to hypnosis.
00:53:53.300 It's probably exactly the same.
00:53:54.440 Oh, how many more of you have listened to my video on how to cure anxiety
00:54:04.660 and got some good results?
00:54:10.900 Anybody else listen to it and find themselves getting better?
00:54:16.400 All right.
00:54:20.020 All right.
00:54:20.320 Some more people.
00:54:26.000 Have you learned what canvassing an election is yet?
00:54:29.340 And that's relevant to...
00:54:32.400 Relevant to what?
00:54:35.740 I'm going to get rid of Abba for asking the same question over and over.
00:54:44.160 Goodbye.
00:54:47.040 All right.
00:54:47.640 Well, it's video 791.
00:54:57.200 Is that what it is?
00:54:57.840 It's video 791.
00:54:59.640 But if you want to see the excerpt that's relevant, the one about anxiety,
00:55:05.780 you want to see that on my Twitter feed, and it's the pinned tweet.
00:55:10.820 So go to my Twitter feed.
00:55:12.900 It's the top one today.
00:55:18.880 All right.
00:55:21.740 Have I tried Delta 8?
00:55:23.320 What the hell is that?
00:55:25.040 Oh, yeah.
00:55:25.660 Elon Musk has subpoenaed Jack Dorsey.
00:55:29.500 Oh, the other thing that the Twitter whistleblower said
00:55:33.000 is that Twitter doesn't have any mechanism for knowing how many bots they have.
00:55:37.300 They don't have the resources to know that.
00:55:40.820 That's going to be interesting.
00:55:50.320 Well, what was I wrong about with the Vax?
00:55:55.080 Somebody says I was wrong about the Vax.
00:55:57.740 Name one thing that I was wrong about.
00:56:02.120 Can you?
00:56:03.480 I want to see if they can do this on YouTube.
00:56:06.120 Name one thing I was wrong about.
00:56:07.200 You were wrong about the virus.
00:56:11.800 No, but what was I wrong about?
00:56:19.920 Let's see.
00:56:20.800 You took it?
00:56:21.540 No, but what was I wrong about?
00:56:28.260 Nothing.
00:56:29.460 No, I wasn't wrong about masks.
00:56:31.280 All right.
00:56:34.740 Well, no more Vax talk, I agree.
00:56:38.780 I'm not interested in the topic anymore.
00:56:41.300 I'm very interested in the cognitive dissonance
00:56:44.460 because it's the only thing where we had completely different predictions,
00:56:49.400 but we all said we were right, including me.
00:56:52.220 Everybody had a wildly different idea of what was going to happen,
00:56:56.620 and then something happened, and everybody said,
00:56:59.240 well, that's exactly what I said was going to happen.
00:57:01.180 There it is.
00:57:01.740 That's exactly what I predicted.
00:57:03.740 Everybody.
00:57:04.620 No matter what you predicted, you're pretty sure you got this one right.
00:57:13.500 Apply your actuarial meeting example to Twitter bot estimates.
00:57:17.200 Yeah, it's going to be the same thing.
00:57:19.860 Twitter's going to say, well, we don't know how many,
00:57:22.560 but if you don't know how many, don't guess on the high side.
00:57:27.820 If you don't know, use your low estimate.
00:57:32.740 Your husband loves me?
00:57:35.440 Well, why can't he tell me that?
00:57:40.760 Super straight, pure blood, ultra mega for the win.
00:57:43.720 All right.
00:57:48.960 I'm in a Twitter troll tailspin.
00:57:52.720 You know, every now and then I,
00:57:55.300 let's say I mix it up with the trolls,
00:58:00.600 but I do it intentionally because I'm bored
00:58:04.340 or there's some point I want to make,
00:58:06.760 but I'm over it.
00:58:11.400 Oh, I'm way off of Prisoner Island.
00:58:13.720 The Pope of Dope is back.
00:58:22.240 All right.
00:58:23.140 I don't have anything else to say.
00:58:25.160 So instead of just babbling,
00:58:27.120 I'm going to end it here.
00:58:28.800 And I will talk to you tomorrow
00:58:33.240 when the news will be exciting again.
00:58:36.280 I guarantee it.
00:58:37.380 I will talk to you tomorrow.
00:58:38.780 Thank you.