Real Coffee with Scott Adams - December 27, 2021


Episode 1605 Scott Adams: Let's Fix Most of Society's Problems and Have Some Laughs Too


Episode Stats

Length: 54 minutes
Words per Minute: 143.3
Word Count: 7,846
Sentence Count: 553
Misogynist Sentences: 7
Hate Speech Sentences: 14


Summary

In this episode of Coffee with Scott Adams, I talk about the impending arrival of virtual reality and how it will change the way we think about reality itself.


Transcript

00:00:00.000 Good morning, everybody, and welcome to the best thing that's ever happened to you.
00:00:05.640 It's called Coffee with Scott Adams, and it gets better every single time.
00:00:10.220 It does, really. You don't believe it? Come back tomorrow.
00:00:13.700 You'll be amazed. It's better. Better tomorrow.
00:00:16.660 And if you'd like to take it up to really stratospheric levels,
00:00:22.020 the kind of goodness that nobody has ever seen before,
00:00:25.080 all you need is a cup or a mug or a glass, a tank or chalice or stein, a canteen jug or flask,
00:00:30.820 a vessel of any kind. Fill it with your favorite liquid.
00:00:34.980 I like coffee.
00:00:36.960 And join me now for the unparalleled pleasure.
00:00:40.700 It's the dopamine hit of the day.
00:00:43.600 It's called the Simultaneous Sip, and it's going to happen now. Go.
00:00:46.880 Go.
00:00:46.980 Oh, so good.
00:00:56.740 I don't know.
00:00:58.480 I can't be positive, but I think that's the best one ever.
00:01:03.720 Well, before I forget, an interesting thing is happening on Twitter this morning.
00:01:11.080 So some people got one for Christmas, and Adam Dopamine is talking about this this morning on Twitter,
00:01:17.840 talking about getting an Oculus set, you know, one of the 3D fantasy metaverse kind of tools.
00:01:30.060 And let me tell you, you're going to be hearing this a lot from people who make the transition
00:01:37.040 from real life to the metaverse.
00:01:39.100 You can't believe how good it is.
00:01:43.700 And here's the weird thing.
00:01:46.220 It's still primitive.
00:01:48.660 Did I ever tell you that the best way you can tell something will be big
00:01:52.580 is if the bad version is popular?
00:01:57.420 We are, in terms of the metaverse and virtual worlds and, you know, 3D virtual realities and all that,
00:02:05.420 in terms of virtual reality, we're basically at the fax machine level compared to the Internet, roughly speaking.
00:02:15.820 And already, people who have tried it, and you can see some of the comments this morning,
00:02:21.680 people got VR equipment for Christmas in many cases,
00:02:26.620 and people will tell you, if you haven't tried it yourself, you don't know what's coming.
00:02:33.940 Now, I've told this story before, but I think this sums it up.
00:02:36.800 Several years ago now, it was maybe three years ago, I got a VR set.
00:02:42.660 Imagine how much better they are three years later.
00:02:44.800 One of the virtual worlds had a cliff, and although I knew I was standing in a room in my own home
00:02:52.480 and no danger could come to me, I couldn't tell myself to move my leg
00:02:59.220 in a way that would make me fall off the virtual cliff.
00:03:03.820 And it was the damnedest thing.
00:03:05.740 I couldn't talk myself into it, even though I knew it couldn't possibly be dangerous.
00:03:11.040 Because the thing I'm trying to explain here
00:03:15.380 is that your brain and your body will treat this as real.
00:03:21.080 Even though you know it isn't.
00:03:25.080 Now, that's the thing people don't get.
00:03:27.940 If you don't get that knowing it's not real doesn't matter,
00:03:32.480 you're missing the biggest part of this.
00:03:35.980 You will feel it like it's real.
00:03:38.780 Here's the weirdest part.
00:03:40.140 If you do a ride in virtual reality,
00:03:44.340 but in the actual world you're just sitting in a chair,
00:03:47.140 and your chair is not moving at all.
00:03:50.080 Tell me if some of you have done this, so you can confirm this.
00:03:53.940 Do you feel like the motion even in your stomach,
00:03:59.560 do you feel it even though it's not happening?
00:04:02.480 Every person says yes.
00:04:06.680 Every one.
00:04:08.360 Somebody says no.
00:04:10.780 Really?
00:04:11.740 Then you haven't tried the right...
00:04:12.860 If you say no, you haven't tried the right games.
00:04:15.780 Because if you don't do the right one, you're not going to get that effect.
00:04:18.440 But the ones that really make you feel like you're on a roller coaster or flying,
00:04:26.160 you feel it.
00:04:27.660 You feel it like it's real.
00:04:28.920 And here's the bigger point of this.
00:04:33.600 I've been telling you since the dawn of Trump
00:04:36.580 that Trump would change more than politics.
00:04:40.480 That he would change how we view reality itself.
00:04:43.280 And he did.
00:04:46.520 Specifically, we saw that our sources of trusted news were bullshit.
00:04:53.160 In a way we'd never seen before.
00:04:55.820 And that was Trump.
00:04:56.860 Trump did that.
00:04:57.560 And I could see that coming from a mile away.
00:04:59.820 That's why I predicted it before he was even elected.
00:05:03.320 I said, whoa, he's changing reality.
00:05:04.900 He's not changing...
00:05:06.380 He's not just changing politics.
00:05:08.820 And then you add the metaverse.
00:05:10.660 And then you add how we'll feel about the virtual realities.
00:05:15.480 Now put those two...
00:05:18.400 Put together the fact that nothing we thought was true was true.
00:05:23.600 Which we're learning that all of our news sources and experts and data
00:05:27.680 are all really biased and subjective.
00:05:31.060 Put that together with the fact that you can experience a world as if it's real
00:05:35.700 within something you know isn't real.
00:05:38.380 What's going to happen to our understanding of reality itself?
00:05:44.120 Now add on top of that, the richest and one of the smartest people in the world,
00:05:49.720 Elon Musk, is an unapologetic proponent of the simulation being our actual experience.
00:05:58.080 Now those are three separate things, right?
00:06:03.860 Trump and fake news and learning that all of our experts are full of shit.
00:06:08.420 The metaverse, sort of just a cool technology that has a lot of future.
00:06:13.700 And then the idea of the simulations.
00:06:16.760 All those three things are happening at the same time.
00:06:19.720 And it's not a coincidence.
00:06:20.960 Reality is about to get a reboot.
00:06:25.540 We're going to have an awakening that will be one of the great ones of history, really.
00:06:32.760 When you realize the limits of your own rational abilities,
00:06:37.920 everything will look different.
00:06:40.700 Everything will look different.
00:06:41.860 Now, I can say that as someone who's made this journey from thinking I was a rational creature
00:06:48.220 and then thinking that other people were irrational, but at least I'd figured it out.
00:06:55.300 That was my trip.
00:06:57.440 My trip was, what is wrong with all of these other people?
00:07:01.680 Why is it that I keep getting the right answer and all the other people are wrong all the time?
00:07:06.400 And then, about the time I learned hypnosis and started studying about human psychology,
00:07:13.580 I realized, wait a minute, maybe I'm wrong too.
00:07:18.740 And then once you're open to the possibility that we're all wrong all the time,
00:07:24.960 meaning that we're creating subjective little worlds and just living within them,
00:07:29.040 once you realize that, you can't go back.
00:07:33.100 You can't go back.
00:07:34.000 And I'll tell you, a lot of people are there already.
00:07:37.940 So, you know, I'm not alone.
00:07:39.900 I got a tweet this morning from somebody who's sort of risen to that level
00:07:44.920 and can see the whole field, and I had to ask him what his background was
00:07:49.820 because he didn't have it in his bio, and I thought, who the hell are you?
00:07:55.260 There's something you know that other people don't know.
00:07:58.400 I haven't gotten a response yet.
00:07:59.620 All right, as part of this theme, I did a troll challenge today.
00:08:05.100 I like to do this once in a while to demonstrate the weakness of our subjective experience.
00:08:13.240 And what I do is, I said in a tweet, name one thing you think I believe that is different from what you believe.
00:08:20.940 And that's also true.
00:08:24.040 And what, of course, happens is a whole bunch of people imagining something that they think I believe,
00:08:31.760 then writing it down, and then I say, nope, nope, I don't believe that.
00:08:36.800 Or you left out some context, so it changed the meaning, or something like that.
00:08:40.500 And you can see that dozens, if not hundreds of people immediately present themselves
00:08:47.120 who have undeniable, undeniable hallucinations about who I am and what I believe and what I've done.
00:08:56.120 And you can see it plainly, because I know my opinion.
00:09:00.620 So when somebody says, you believe X, and I say, no, I believe Y, I get that right every time.
00:09:08.180 You know why?
00:09:08.760 Because I actually know my opinion.
00:09:11.060 I'm not guessing.
00:09:12.840 Everybody else is guessing.
00:09:14.980 But I know it.
00:09:16.280 So even though you can't fact check the other people, I can.
00:09:19.240 Because I know what's happening in my mind, even if I'm wrong.
00:09:23.360 Presumably I'm wrong a lot.
00:09:26.800 All right.
00:09:27.580 Yes, we're all irrational.
00:09:29.680 Once you learn that, you'll be in good shape.
00:09:32.520 Now, one of the things that people did call me out on, on opinions that they disagree with,
00:09:40.500 I'll give you a few examples.
00:09:42.500 One person trying to be funny, a troll said, when I said, name one thing you think I believe
00:09:48.320 that's different from what you believe.
00:09:50.140 A troll said, Dilbert is funny.
00:09:51.940 The implication being that I think Dilbert is funny, but Dilbert is not funny.
00:09:57.880 And I actually addressed that one.
00:10:00.500 Because even as a troll, they failed.
00:10:02.780 It was like a failed troll.
00:10:04.160 Because I don't believe Dilbert is funny.
00:10:06.140 I mean, even that assumption was wrong.
00:10:12.380 What I believe is that Dilbert is created to appeal to a certain part of the public.
00:10:19.140 And that group of the public likes it a lot and laughs.
00:10:23.380 And they like it so much they pay for it.
00:10:26.480 That's all I know for sure.
00:10:28.500 And I'm 100% sure that other people, lots of them, don't find it interesting at all.
00:10:34.300 So when somebody says that I think Dilbert is funny, I say, no, that doesn't even make sense.
00:10:40.920 Nothing is funny.
00:10:45.280 Nothing is funny.
00:10:46.520 All there is is just some people like some stuff.
00:10:51.400 That's it.
00:10:52.980 Here's another one.
00:10:55.180 People complained that I said everybody's vaccination choice was based on fear.
00:11:00.280 But that the fears would be different.
00:11:02.340 Some people feared the vaccination.
00:11:05.520 Some people feared the COVID.
00:11:07.280 Some people feared government control, etc.
00:11:11.420 But others said, no, no, no, Scott.
00:11:14.480 Our decisions were absolutely rational.
00:11:18.240 Because we looked at the costs and the benefits.
00:11:20.380 And we were never afraid.
00:11:22.400 We never felt any fear.
00:11:24.980 We just looked at the information as it existed and made a call.
00:11:29.120 Now, that's actually accurate.
00:11:32.040 I would say that's an accurate pushback to my point.
00:11:36.340 And I will only offer this clarification.
00:11:42.300 When I said everybody makes their decision based on their fears, fear to me is a very large category.
00:11:49.760 I'm not talking about the fear that makes you quake in your shoes, although some people probably had that about the virus.
00:11:56.980 I'm talking about the kind of fear that says, oh, I don't want to go to the DMV.
00:12:03.840 I'm talking about the kind of fear that says, oh, I have to see my in-laws for Christmas.
00:12:11.980 I like my in-laws, but, you know, I'm talking about other people.
00:12:16.320 So, yeah, you can say that's not fear.
00:12:18.400 But when I'm talking about it, I'm talking about a, let's say, a willingness to pick the path that's best for you, as opposed to the less good path.
00:12:33.020 And so it was suggested on Twitter, and I take this suggestion, that a better way to reframe my opinion, so it would be less confusing and more persuasive, I suppose, as well,
00:12:43.580 is that instead of saying that everybody made their decision based on fear, you could put it in a positive way.
00:12:52.600 And the positive way would be we all followed our own illusions of safety, meaning that we all did a cost-benefit analysis.
00:13:02.960 I didn't say we didn't.
00:13:05.100 But we all had a different opinion, which I'll call an illusion because we can't know, right?
00:13:10.560 But if we could know for sure the outcome of every person based on every decision, well, then it wouldn't be an illusion.
00:13:18.980 But if you're just predicting based on guesses and variables that we can't measure, well, that's kind of an illusion.
00:13:27.000 I may be overstating it for your preferences, but it seems that way to me.
00:13:31.380 Anyway, so I'll reframe that and say that everybody's decision about the vaccination was based on an illusion about their own safety,
00:13:41.640 and then I'll add, or convenience.
00:13:45.540 So it was based on an illusion about their own safety, or convenience.
00:13:50.920 Now, the convenience part is more objective.
00:13:53.520 If you had to do it to travel or keep your job or make your spouse happy or something, eh, I'll give you that that wasn't fear-based.
00:14:03.080 Except, will you give me that being afraid of your spouse's response is a little bit of fear?
00:14:09.620 But it's not big fear.
00:14:11.120 It's like fear with a little F.
00:14:13.360 It's not big fear where, you know, there's a home invasion.
00:14:16.020 That's the big fear, capital F.
00:14:19.400 But a little fear.
00:14:20.860 There's a shame you want to ignore, or you want to avoid.
00:14:26.460 Let me make this offer to you.
00:14:32.200 Apparently, Biden is going to be talking to the governors today.
00:14:35.900 Can you confirm that?
00:14:36.920 I think I saw that.
00:14:38.140 So Biden's going to be talking to all governors today by phone,
00:14:41.460 and I think he's talking about COVID.
00:14:43.900 That's the topic.
00:14:44.700 Give me a confirmation that's happening today.
00:14:48.140 All right, I see a confirmation.
00:14:50.200 Now, this is when I think we have to get serious about the idea that we should ask not what the country can do for us
00:15:00.680 and ask what we can do for the country.
00:15:03.720 Because this is the point where I'm sure, I'm very confident,
00:15:08.720 that the governors and Biden want the restrictions to end.
00:15:13.220 They just want to do it in a way that they protect their jobs and presumably the public as well.
00:15:18.760 So, I think we need to help them right now.
00:15:23.740 And I'm hoping that the February 1 idea, the idea that we should have some kind of a target that our government should tell us about,
00:15:32.380 even if they adjust it later, based on data.
00:15:35.920 But treat us like adults and say, well, we're going to shoot for February 1,
00:15:39.840 but this assumes that the death rate stays low and keeps dropping.
00:15:46.480 Would that be fair?
00:15:48.220 So, I hope the February 1 date got through because this is exactly the time that Biden and the governors should get together and say,
00:15:56.000 all right, everybody, let's shoot for February 1.
00:16:02.960 All right.
00:16:06.560 Rasmussen has a new poll.
00:16:08.900 It says that Biden is tied for his low in approval, at 40%.
00:16:13.860 So, how many of you think Biden wants to continue the restrictions?
00:16:23.080 How many of you think that he actually wants that because it'll be good for him in the election or something?
00:16:30.100 I'm seeing some people say yes, yes.
00:16:33.660 Well, so weigh this for me.
00:16:35.800 If Biden could claim he ended all the restrictions, wouldn't he get more votes than if he simply kept the restrictions in place,
00:16:51.540 even if that gave him some benefits with mail-in votes?
00:16:54.960 Which of those strategies is better?
00:16:57.680 I feel like solving the biggest problem in the world is always the best strategy.
00:17:04.020 Isn't it?
00:17:05.800 Do you think that keeping the biggest problem that we have in place is how you get re-elected,
00:17:12.660 when you have the tools to remove it?
00:17:17.160 I have a feeling that claiming success would be the smart play.
00:17:23.140 So, I see your point.
00:17:25.420 I see your point that, you know, there might be some mail-in or other advantage if he keeps things locked down.
00:17:32.000 But I don't know if there's enough lockdown.
00:17:33.520 If it's just the blue states that are locked down at that point, what's the difference?
00:17:38.780 The only ones locked down will be the ones that he was going to win anyway.
00:17:42.600 He's not going to win Florida because they're not locked down.
00:17:44.860 Apparently, also, most Americans, according to a Rasmussen poll, side with J.K. Rowling that there are only two genders.
00:17:59.400 Now, are you surprised that I don't talk about this topic more?
00:18:03.400 Like, every once in a while, somebody asks me,
00:18:04.820 Scott, state your opinion.
00:18:07.360 How many genders are there?
00:18:10.800 Well, I'm not even sure this is a real debate.
00:18:15.440 Because if you put people in the same room, I don't know that they would disagree.
00:18:20.720 Except for what word to put on things.
00:18:22.880 And I suppose the word you put on it can, you know, determine policy and stuff.
00:18:26.300 So it does matter.
00:18:27.840 But it's such a silly thing.
00:18:30.260 Because the people who say there are only two genders,
00:18:32.580 because, you know, if you can have a baby, you're female.
00:18:36.100 And if you can produce sperm, I suppose you're male.
00:18:39.920 Those are my own definitions, I guess.
00:18:41.800 So I would say that if you want to use those definitions of biology,
00:18:46.680 then there are exactly two sexes.
00:18:50.840 And if you want to use a more socially focused one,
00:18:55.540 then people are all over the board about what they prefer,
00:18:58.660 but that has nothing to do with whether they can have babies or not.
00:19:03.220 That fear is holding you back, Scott.
00:19:05.840 I don't know if I have any fear about this topic.
00:19:07.940 Because, you know, generally you can always be safe if you say,
00:19:15.360 as long as everybody's following the law,
00:19:17.940 they should have as much freedom as they can get,
00:19:21.040 short of, you know, bothering other people to the point of ridiculousness.
00:19:27.020 So I think you can have both.
00:19:28.780 I think we can say there are two genders if you're talking about biology,
00:19:32.620 and still say, but socially,
00:19:34.020 the way people think of themselves and want to be treated,
00:19:37.940 let them have the freedom to do that.
00:19:41.840 Which doesn't tell you anything about what bathrooms they should use
00:19:45.000 or what sports they should play.
00:19:46.360 Those are all separate decisions.
00:19:49.320 All right.
00:19:51.620 All right.
00:19:58.400 Best and worst predictions of 2021.
00:20:00.560 So here's a little progress report about where we are
00:20:04.300 in terms of our ascent to a higher level of awareness.
00:20:09.940 So this will be a little marker of how close we are
00:20:12.660 to really understanding our inability to be rational.
00:20:17.600 The prediction, one of the best predictions for 2021
00:20:24.420 is also the worst prediction for 2021,
00:20:27.160 which was my prediction, that Republicans would be hunted.
00:20:31.960 Now, I did a poll, and of course, there are thousands of people who say,
00:20:34.880 oh, yeah, they're definitely being hunted.
00:20:36.960 Let me give you five examples.
00:20:38.380 But, of course, Politico just did an article that said,
00:20:45.660 my same prediction, the one that thousands of people would say,
00:20:49.260 well, that's one of the best predictions I've ever seen.
00:20:51.540 You nailed it.
00:20:53.280 According to Politico, it's one of the worst predictions
00:20:55.800 because nothing like that happened.
00:20:57.140 Now, what could be, let's say,
00:21:04.540 a more perfect summary of where we are as a thinking species?
00:21:09.560 The perfect summary is that we can't tell the difference
00:21:12.980 between the best prediction and the worst.
00:21:17.420 One of my favorite quotes from Mark Twain
00:21:19.680 is that people can't tell the difference between good news and bad.
00:21:23.580 The first time you hear that, you say, well, that's not true.
00:21:26.000 And then you keep seeing examples of it.
00:21:28.680 We actually can't tell the difference between good news and bad lots of times.
00:21:33.560 Lots of times.
00:21:34.260 You just can't tell.
00:21:35.880 And we also don't know the difference between something that's starting
00:21:39.440 and something that's finishing.
00:21:41.340 Take the pandemic, for example.
00:21:43.940 We can't tell what's smart or dumb.
00:21:46.480 So we don't know the best prediction from the worst.
00:21:49.060 We don't know what's good, what's bad.
00:21:50.520 We don't know what's evil, what's good.
00:21:52.160 We don't know who's helping, who's hurting.
00:21:54.060 We don't know who's being selfish.
00:21:55.420 We don't know who's being generous.
00:21:57.340 We don't know anything.
00:21:59.980 That's where we are.
00:22:01.840 But once you realize that, you get to go to the next level of awareness.
00:22:07.360 All right, here's a trick I use to predict things.
00:22:11.840 And you could use this, too.
00:22:13.340 It works every time.
00:22:14.700 If you want your predictions to be accurate,
00:22:18.440 predict something that's already happening.
00:22:20.020 And then you'll look like a genius.
00:22:24.320 So here's my prediction.
00:22:27.200 I said, prediction.
00:22:28.160 In 2022, Big Pharma will convince experts there is a mental condition
00:22:32.860 that one might call over-wokeness.
00:22:36.400 And there will be a pill for it.
00:22:39.360 That's my prediction.
00:22:41.480 Do you know why I'm confident in that prediction?
00:22:43.980 Because it's already happening.
00:22:46.840 It's already happening.
00:22:48.480 You don't think that lots of people are getting antidepressants
00:22:51.440 because of how they're dealing with...
00:22:55.980 I'll call it social sensitivity.
00:22:59.900 So I don't think this disease will be called over-wokeness,
00:23:03.500 but it might be called hyper-social sensitivity.
00:23:07.060 If I told you there was a new condition called hyper-sensitivity
00:23:13.680 to social pressure or whatever,
00:23:18.520 would you say that doesn't exist?
00:23:21.880 Or would you say, oh, that totally exists,
00:23:23.680 and it's messing with people's minds?
00:23:27.200 You would say it existed, right?
00:23:29.480 You might not think you have it,
00:23:31.540 but you would say that, oh, other people certainly have this problem
00:23:34.420 where they're so sensitive about race and being insulted
00:23:39.000 and who's being put down and what's unfair
00:23:42.320 that it's become an actual mental disorder.
00:23:47.300 Now, I don't think there'll be a new pill for it.
00:23:50.480 I think they'll repurpose existing pills.
00:23:53.420 But, yeah, they're going to give people Xanax
00:23:57.000 and God knows what else for their social oversensitivity.
00:24:01.080 Does anybody want to argue with that?
00:24:02.700 Because I'm pretty sure it's happening right now.
00:24:05.840 Well, let me ask the question.
00:24:07.900 How many of you think it's happening right now?
00:24:11.140 Is there anybody you know
00:24:12.360 who has become so woke that they had to seek therapy
00:24:17.600 and that they're on pills because of it,
00:24:20.000 some kind of pill?
00:24:21.980 Over on Locals, the answers are yes, yes, yes, yes, yes, yes, yes, yes.
00:24:26.720 Yeah.
00:24:27.660 So that's how you make a prediction
00:24:29.540 with high likelihood of being right.
00:24:31.820 In fact, you simply predict what's already happening.
00:24:35.580 It makes it a lot easier.
00:24:37.460 Predicting what hasn't happened yet is hard.
00:24:40.000 I don't recommend it at all.
00:24:42.880 All right.
00:24:44.920 I made this following suggestion, based on all this,
00:24:48.220 that schools should teach kids how to deal with criticism
00:24:51.320 as an actual lesson or a class,
00:24:53.980 and specifically how to be free from, you know,
00:24:58.200 the embarrassment or shame or social pressure
00:25:01.380 of what everybody thinks about you.
00:25:04.080 Because I think what social media did
00:25:06.380 is take the idea that was already a problem,
00:25:09.740 which is what do other people think of you,
00:25:11.740 and turned it from one of those problems you deal with
00:25:14.760 into the biggest problem.
00:25:15.720 Now, for kids, the biggest problem
00:25:18.960 is what people think of them,
00:25:20.440 especially what they say
00:25:21.480 on some kind of digital media about them.
00:25:24.660 So I think that we need to give kids a class.
00:25:28.320 And I think that class should be called, perhaps,
00:25:31.420 Everyone is Someone Else's Weirdo,
00:25:33.480 so that you would just get comfortable with being weird.
00:25:37.480 And when people made fun of you,
00:25:39.120 you'd think,
00:25:40.280 well, what does it mean to me
00:25:41.580 that a weirdo is calling me weird?
00:25:44.200 It doesn't mean anything.
00:25:46.280 If there was somebody perfect
00:25:47.640 who called you weird,
00:25:48.820 you might feel bad about it.
00:25:50.540 But there aren't any perfect people.
00:25:51.960 There are only weirdos.
00:25:53.400 So if a weirdo calls you weird,
00:25:55.380 should that bother you?
00:25:56.760 Now, that would be sort of
00:25:57.680 what a lesson would look like.
00:26:00.100 You could also call it
00:26:01.300 the basket case theory,
00:26:03.680 the idea that everybody's a basket case,
00:26:06.400 or a weirdo,
00:26:07.340 if you got to know them well enough.
00:26:10.080 So here are some of the things I would teach.
00:26:12.300 I would teach kids
00:26:13.220 how to develop a talent stack.
00:26:16.000 Because if you're good at something,
00:26:18.440 anything,
00:26:19.500 then you know you can be good at something.
00:26:22.440 And that's really good as a protection
00:26:24.120 when you get criticized for other things.
00:26:27.540 If you get criticized
00:26:28.640 and you're not good at anything,
00:26:30.700 that's got to sting.
00:26:32.140 But imagine you knew you were good at,
00:26:34.560 I don't know,
00:26:35.140 a sport,
00:26:36.020 playing music,
00:26:37.680 being kind to people,
00:26:40.160 you know,
00:26:40.360 whatever it is.
00:26:40.980 But you know you're good at something
00:26:42.340 because you practiced more than other people
00:26:44.600 and you just became good.
00:26:46.940 I recommend that
00:26:47.920 just as a protection against criticism
00:26:50.320 because you've got one thing
00:26:51.360 to hang on to at least.
00:26:52.340 I saw this quote
00:26:56.240 as I was thinking about this
00:26:58.020 on Twitter.
00:26:59.300 I don't know who said it first
00:27:00.640 or if I have the words right,
00:27:02.100 but it's something like this.
00:27:03.300 You would care less about
00:27:04.700 what others think of you
00:27:07.000 if you knew how little
00:27:08.880 they think of you at all.
00:27:11.800 That's like one of the best things
00:27:13.280 I've ever learned.
00:27:15.720 That you would care less
00:27:17.120 about what people think of you
00:27:18.400 if you realize how little
00:27:19.580 they even think about you at all.
00:27:21.720 You're not even meaningful.
00:27:24.540 I'll tell you where I learned that.
00:27:26.060 I've told this story before.
00:27:27.680 Years ago,
00:27:28.280 I had some laser work on my face
00:27:30.620 to get rid of some spider veins.
00:27:32.740 And with the old laser treatments,
00:27:34.720 it would make your face purple for weeks.
00:27:37.700 Like it would look like
00:27:38.680 you had a terrible accident
00:27:39.720 or you lost a fight or something.
00:27:41.660 Or you got in an MMA fight,
00:27:43.320 but you'd never been in one before.
00:27:45.340 And for the first few days,
00:27:47.320 I thought,
00:27:47.880 man, I'm not going to go out in public
00:27:49.300 because I don't want everybody
00:27:51.080 looking at me and judging me
00:27:52.600 and wondering what's up with me.
00:27:54.620 And then finally,
00:27:55.420 I just got such cabin fever.
00:27:56.960 I was like,
00:27:57.240 oh, screw it.
00:27:57.880 I'm going to go to the mall.
00:27:59.460 I remember walking around the mall
00:28:01.040 with a face that looked like
00:28:02.840 I had just lost an MMA fight.
00:28:07.720 And I would look at people
00:28:09.720 to see how they were being horrified
00:28:11.460 by my existence.
00:28:13.780 Nobody cared.
00:28:15.960 Nobody even looked twice.
00:28:18.200 Nobody stared.
00:28:19.660 Nobody asked me about it.
00:28:21.540 Nobody cared.
00:28:23.680 And was that a special case?
00:28:26.800 No.
00:28:27.880 People don't care.
00:28:29.800 I could have walked around naked
00:28:31.300 and except for it being illegal,
00:28:34.360 people wouldn't care.
00:28:35.820 They might talk about it,
00:28:37.200 but they wouldn't care, right?
00:28:39.700 So once you learn
00:28:40.720 how little people actually care,
00:28:43.900 then their opinion about you
00:28:45.460 should be put in the same box
00:28:46.780 that they have it,
00:28:48.240 which is it's unimportant.
00:28:50.020 It's the least important thing
00:28:51.200 in their life.
00:28:52.500 Don't make it important in your life.
00:28:54.720 It didn't even matter to them.
00:28:58.980 And then, of course,
00:29:00.300 there's the Dale Carnegie method
00:29:01.900 for learning how to deal
00:29:03.420 with shame and embarrassment.
00:29:05.540 And the Dale Carnegie method,
00:29:06.780 essentially,
00:29:07.360 just makes you practice
00:29:09.080 being embarrassed.
00:29:10.740 So you do things you know
00:29:12.140 will be embarrassing,
00:29:13.040 such as giving an embarrassing speech
00:29:15.620 in front of a crowd,
00:29:16.780 which is one of the exercises.
00:29:19.900 Or doing something embarrassing
00:29:21.520 in front of a crowd.
00:29:22.340 It doesn't have to be a speech.
00:29:25.160 All right.
00:29:25.500 So those are some of the things
00:29:28.500 that could be in the class
00:29:29.440 on handling criticism.
00:33:33.960 Machiavelli's Underbelly,
00:33:35.260 a Twitter account,
00:33:39.580 asked this provocative question.
00:33:41.920 If you think that following the money
00:33:43.500 is predictive,
00:33:44.720 and that's my setup for the question,
00:33:47.820 here's what he asked.
00:29:49.200 Following the money,
00:29:50.000 how has China's economy
00:29:52.440 fared relative to the rest
00:29:53.780 of the world?
00:29:54.740 Did they significantly benefit
00:29:56.660 financially from the pandemic
00:29:58.160 relative to the rest of the world?
00:30:00.160 Or did it seriously hurt their economy
00:30:02.280 to the same extent?
00:30:03.520 So I assume that the intent of this
00:30:07.060 is to make us wonder
00:30:08.300 if China did it intentionally.
00:30:11.540 And the idea is that
00:30:12.420 if you follow the money
00:30:13.440 and China somehow gained from it,
00:30:17.220 then maybe they did.
00:30:19.040 But if they did not gain from it,
00:30:21.380 well, then maybe it wasn't a plan.
00:30:23.700 It maybe just got out somehow,
00:30:25.480 the virus.
00:30:27.780 Here's a better question
00:30:29.260 that I added to that tweet
00:30:31.880 in the comment.
00:30:32.400 I said a better question
00:30:34.020 would be this.
00:30:35.280 When adjusted for risk,
00:30:37.420 could China ever have reasonably
00:30:39.600 expected it was a good play
00:30:41.440 to release a pandemic?
00:30:43.920 What do you think?
00:30:45.480 If you go back in time,
00:30:48.740 was there any reasonable way
00:30:50.700 that China could have said
00:30:51.860 it was a good play, from a risk-reward perspective?
00:30:53.840 Because you don't know
00:30:54.400 how big the risk is, right?
00:30:55.880 You assume it's pretty big.
00:30:57.800 But could China have reasonably
00:30:59.600 done this intentionally
00:31:01.580 if they thought through
00:31:03.500 the size of the risk
00:31:04.900 versus the size of the benefit?
00:31:08.100 Could you imagine
00:31:09.260 that follow the money
00:31:10.780 would have brought them
00:31:12.300 to the point
00:31:12.840 where they would release it intentionally?
00:31:17.680 Tom says,
00:31:18.520 can I read minds?
00:31:19.920 No.
00:31:20.460 You can't read minds.
00:31:21.920 So I'm asking you this.
00:31:23.220 Is there anybody
00:31:25.180 that you could put
00:31:26.340 in that position
00:31:27.020 who would have thought
00:31:27.700 it was a good idea?
00:31:29.120 Just any human.
00:31:30.440 You don't have to be Chinese.
00:31:32.540 You don't have to be
00:31:33.380 a Chinese leadership.
00:31:34.880 Just any human.
00:31:36.020 Would any human
00:31:36.660 make that decision?
00:31:38.780 Now,
00:31:39.500 there are a lot
00:31:41.000 of different humans.
00:31:42.060 So somebody says,
00:31:42.800 yes,
00:31:43.140 Hitler.
00:31:43.940 Hitler, right?
00:31:45.760 So crazy people would.
00:31:47.220 Do you think
00:31:48.460 that the government
00:31:49.620 of China
00:31:50.540 could be best described
00:31:53.020 as rational
00:31:54.000 or crazy?
00:31:57.540 What would best
00:31:58.840 describe China's leadership?
00:32:01.460 Many of whom
00:32:02.080 are engineers,
00:32:02.840 by the way.
00:32:03.260 I don't know
00:32:03.520 if you knew that.
00:32:04.140 But they have
00:32:05.060 a very high percentage
00:32:07.080 of engineers
00:32:07.740 in leadership.
00:32:08.320 Well,
00:32:12.560 I don't think
00:32:13.200 that China
00:32:13.660 is run
00:32:14.200 as a crazy country.
00:32:15.740 I think it's run
00:32:16.520 as a country
00:32:17.060 that is laser-focused
00:32:20.240 on self-interest
00:32:21.300 and we can respect that.
00:32:25.100 I mean,
00:32:25.440 you don't have to like it,
00:32:27.020 but you can respect it
00:32:28.460 because people
00:32:30.540 do operate
00:32:31.140 on self-interest.
00:32:32.560 Countries actually
00:32:33.260 probably work well
00:32:34.040 that way,
00:32:34.720 operate on self-interest.
00:32:36.420 But they are
00:32:37.460 a very rational
00:32:38.340 decision-making country
00:32:40.000 and you don't see
00:32:40.760 much irrational stuff
00:32:42.020 happening,
00:32:42.500 do you?
00:32:43.440 It would be hard
00:32:43.900 to think of
00:32:44.420 irrational
00:32:45.260 Chinese policies.
00:32:47.200 There are cruel ones,
00:32:48.940 there are ones
00:32:49.480 that you don't like,
00:32:51.840 but not irrational ones
00:32:53.260 if you look at it
00:32:55.440 through the lens
00:32:56.040 of what's just good
00:32:57.500 for China.
00:33:00.140 So I would say
00:33:01.080 the follow-the-money
00:33:02.040 hypothesis
00:33:02.740 argues strongly against,
00:33:05.920 not in any
00:33:06.680 100% way,
00:33:08.440 but it argues
00:33:09.080 strongly against
00:33:10.060 the idea
00:33:11.020 that anybody
00:33:11.500 would have done
00:33:12.040 this intentionally.
00:33:13.560 There's nobody
00:33:14.280 who understands
00:33:15.020 economics
00:33:15.620 who would have
00:33:16.080 done this
00:33:16.440 intentionally,
00:33:17.600 unless they were
00:33:18.640 crazy.
00:33:20.780 So let me say
00:33:21.540 that again.
00:33:22.400 There's nobody
00:33:22.980 who understood
00:33:24.420 economics
00:33:25.380 who would have
00:33:26.740 thought this
00:33:27.160 was a good play
00:33:27.940 because the risk
00:33:28.640 is so big
00:33:30.480 and so unsizable.
00:33:33.380 Yeah.
00:33:33.700 And of course
00:33:34.420 they had to look
00:33:35.220 at economists.
00:33:35.940 They wouldn't
00:33:36.180 have done this
00:33:36.620 without that.
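To make that risk-reward argument concrete, here is a minimal expected-value sketch in Python. Every number in it is hypothetical, chosen only to show the shape of the argument, not an estimate from this episode or from real data:

    # Hypothetical numbers only; the shape of the argument, not real estimates.
    upside_gain = 1.0     # relative gain if an intentional release went perfectly
    p_backfire = 0.5      # chance it backfires: blame, decoupling, deaths at home
    downside_loss = 10.0  # relative loss in the backfire case; arguably unbounded

    # Expected value of the "release it intentionally" bet:
    expected_value = (1 - p_backfire) * upside_gain - p_backfire * downside_loss
    print(f"expected value: {expected_value:+.1f}")  # prints -4.5, a losing bet

Unless you assume the downside is implausibly small, the bet comes out negative, which is the point about anyone who understands economics.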
00:33:38.280 All right.
00:33:38.880 There's a moderate
00:33:39.900 liberal named
00:33:41.040 Michelle Tandler
00:33:42.320 who's realizing
00:33:43.760 that maybe
00:33:45.500 everything she
00:33:46.180 believed about
00:33:47.080 progressives
00:33:48.140 was wrong,
00:33:49.440 meaning that
00:33:50.000 if you want
00:33:50.420 to be nice
00:33:50.900 to people,
00:33:52.500 progressive policies
00:33:53.420 might get you
00:33:53.960 the wrong outcome.
00:33:58.160 Scott,
00:33:58.740 they did it
00:33:59.120 for economics.
00:33:59.880 Did you miss
00:34:01.280 the entire
00:34:01.840 conversation?
00:34:04.560 Nobody would
00:34:05.400 have done this
00:34:06.100 for economics.
00:34:08.880 There's one
00:34:09.980 category I can
00:34:10.740 speak of
00:34:11.180 with some
00:34:11.600 confidence.
00:34:12.660 I do have
00:34:13.120 two degrees
00:34:14.020 that are relevant
00:34:14.640 to this question.
00:34:16.760 Nobody would
00:34:17.300 do it
00:34:17.620 intentionally
00:34:18.300 for economics.
00:34:19.920 Now,
00:34:20.220 if you're not
00:34:20.660 an economist,
00:34:21.880 you should talk
00:34:22.960 to one
00:34:23.340 because if you
00:34:25.000 could find
00:34:25.400 anybody,
00:34:26.560 let me ask you
00:34:27.280 this,
00:34:27.800 find somebody
00:34:28.460 who's got
00:34:28.800 an MBA
00:34:29.300 from a
00:34:30.180 good school
00:34:30.720 or a
00:34:32.260 degree in
00:34:32.700 economics
00:34:33.320 and ask
00:34:34.400 them if it
00:34:34.860 looked like
00:34:35.260 a good
00:34:35.600 financial play
00:34:36.560 for China
00:34:37.760 to release
00:34:38.340 a pandemic.
00:34:40.280 You're not
00:34:40.880 going to get
00:34:41.160 anybody to
00:34:41.660 say that.
00:34:43.380 But if you're
00:34:43.940 talking to
00:34:44.340 an artist
00:34:44.800 or something,
00:34:45.800 they might
00:34:46.300 say that.
00:34:54.020 Short term,
00:34:54.980 maybe.
00:34:55.540 Well,
00:34:55.840 the point is
00:34:56.380 that you
00:34:56.700 couldn't calculate
00:34:57.400 the short term
00:34:58.040 or the long
00:34:58.600 term impact.
00:35:00.060 All right.
00:35:02.100 Anyway,
00:35:02.780 this moderate
00:35:04.500 liberal,
00:35:05.020 Michelle Tandler,
00:35:05.900 did an
00:35:06.560 interesting tweet
00:35:07.440 thread.
00:35:08.600 And she
00:35:09.140 says,
00:35:10.120 here's what
00:35:10.640 confuses me
00:35:11.380 about San
00:35:11.940 Francisco.
00:35:13.160 We have
00:35:13.360 the most
00:35:13.640 liberal
00:35:14.080 left-wing
00:35:14.720 government
00:35:15.140 and population
00:35:15.760 in the
00:35:16.040 country.
00:35:16.660 We have
00:35:17.060 a $13
00:35:17.600 billion budget
00:35:18.560 and we
00:35:19.500 have 80,000
00:35:20.280 people sleeping
00:35:21.040 in the rain
00:35:21.520 this week.
00:35:22.720 Can someone
00:35:23.300 please explain
00:35:24.160 this to me?
00:35:24.660 And then she
00:35:24.960 goes on in
00:35:25.560 the thread
00:35:26.020 talking about
00:35:27.220 how
00:35:27.580 her prior
00:35:29.660 impression that
00:35:31.080 the progressives
00:35:31.900 would be better
00:35:32.700 for the
00:35:33.860 underclass
00:35:34.580 appears to
00:35:36.160 be wrong.
00:35:37.080 And there
00:35:37.260 may be some,
00:35:38.380 you know,
00:35:38.680 these are my
00:35:39.640 own words now
00:35:40.380 to try to
00:35:41.080 capture the
00:35:41.620 theme of it.
00:35:42.140 But there
00:35:42.960 may be some
00:35:43.440 tough love
00:35:44.160 closer to
00:35:45.940 what the
00:35:46.340 right would
00:35:46.980 suggest.
00:35:47.720 It might be a
00:35:48.460 little closer to
00:35:49.200 the kindest
00:35:50.280 thing you could
00:35:50.720 do.
00:35:51.360 So if being
00:35:52.220 kind to people
00:35:52.960 is what you
00:35:53.480 want,
00:35:54.420 the progressives
00:35:55.140 may have the
00:35:55.580 wrong solution.
00:35:56.380 Now,
00:35:57.340 I don't
00:35:57.960 know if
00:35:58.220 she's factored
00:35:59.060 into this,
00:35:59.520 how many
00:35:59.800 people actually
00:36:00.360 don't want
00:36:01.060 to be in
00:36:01.880 a home because
00:36:02.920 they want to
00:36:03.360 do drugs
00:36:03.780 instead.
00:36:05.040 But seeing
00:36:07.960 her awaken
00:36:08.820 like this is
00:36:09.660 interesting.
00:36:10.600 And you get
00:36:11.060 toward the end
00:36:11.680 of the tweet
00:36:12.140 thread and you
00:36:13.240 find out that
00:36:13.920 she's reading
00:36:15.220 Michael
00:36:18.720 Shellenberger's
00:36:19.580 new book,
00:36:20.520 San Fransicko,
00:36:23.320 which is a
00:36:24.040 huge bestseller.
00:36:25.260 And so you
00:36:26.000 can see her
00:36:26.500 being persuaded
00:36:27.720 in real time.
00:36:29.680 You can
00:36:30.020 actually watch
00:36:30.620 her being
00:36:32.280 transparent about
00:36:33.660 her thought
00:36:34.220 process being
00:36:35.560 modified by
00:36:36.300 that book.
00:36:38.020 Did I tell
00:36:38.620 you that
00:36:38.920 Michael
00:36:39.160 Shellenberger
00:36:39.700 is persuasive?
00:36:40.600 You're
00:36:42.880 seeing it
00:36:43.240 everywhere.
00:36:44.360 He's very
00:36:44.780 persuasive.
00:36:46.880 And part
00:36:48.420 of his
00:36:49.000 appeal is
00:36:49.940 that you
00:36:50.160 can't tell
00:36:50.560 what political
00:36:51.700 side he's
00:36:52.200 on.
00:36:53.060 It's just
00:36:53.780 unfathomable
00:36:54.780 because he
00:36:55.500 just does
00:36:55.920 things that
00:36:56.280 make sense
00:36:56.880 and have
00:36:57.900 some kind
00:36:58.320 of data
00:36:58.720 or scientific
00:36:59.600 backing.
00:37:00.100 And you
00:37:01.520 can't tell
00:37:01.920 what he's
00:37:02.300 up to
00:37:02.620 because it
00:37:03.240 just looks
00:37:03.620 like a
00:37:03.900 bunch of
00:37:04.160 good ideas
00:37:04.840 backed by
00:37:06.640 solid argument
00:37:07.620 and data.
00:37:12.140 Let's see.
00:37:13.340 Here's a
00:37:13.760 thought.
00:37:14.160 Big pharma
00:37:14.540 is creating
00:37:15.120 an economic
00:37:15.700 agenda to
00:37:16.280 make all
00:37:16.600 humans forced
00:37:17.280 to be
00:37:17.560 consumers of
00:37:18.100 their products.
00:37:19.460 How much
00:37:19.860 more financial
00:37:20.720 incentive do
00:37:21.780 you need?
00:37:22.720 Well, that
00:37:23.020 would be the
00:37:23.460 pharma companies.
00:37:25.500 But we're
00:37:26.180 talking about
00:37:26.660 China's
00:37:27.120 government.
00:37:28.460 So if
00:37:29.040 you're asking
00:37:29.440 me,
00:37:30.100 do I
00:37:31.060 trust big
00:37:31.680 pharma?
00:37:32.360 I think
00:37:32.720 big pharma
00:37:33.300 should be
00:37:34.560 a category
00:37:35.080 in which
00:37:35.520 we assume
00:37:36.100 guilt until
00:37:36.880 proven
00:37:37.240 innocent.
00:37:42.020 For big
00:37:42.860 pharma,
00:37:43.500 I think
00:37:43.800 you should
00:37:44.120 assume
00:37:45.860 guilt
00:37:46.320 until proven
00:37:47.680 innocent.
00:37:49.040 Whereas I
00:37:49.640 wouldn't say
00:37:49.980 that about
00:37:50.340 most things.
00:37:51.100 But I
00:37:51.320 think given
00:37:52.600 the size
00:37:53.060 of the
00:37:53.300 risks
00:37:53.700 with any
00:37:55.540 kind of
00:37:55.800 medication
00:37:56.200 and the
00:37:57.000 history of
00:37:57.500 the industry,
00:37:58.880 it would
00:37:59.300 be irrational
00:38:00.220 to think
00:38:00.800 otherwise.
00:38:01.300 And that's
00:38:01.580 of course
00:38:02.000 why you
00:38:02.320 do the
00:38:02.600 randomized
00:38:03.020 controlled
00:38:03.640 trials.
00:38:11.560 Yeah,
00:38:12.160 the pandemic
00:38:12.660 caused every
00:38:13.440 strong economy
00:38:14.260 to shut down
00:38:14.880 except China's,
00:38:15.760 but it also
00:38:16.180 turned people
00:38:16.860 against China.
00:38:18.600 So as
00:38:19.880 much as
00:38:20.360 anything,
00:38:20.880 it will be
00:38:21.320 the reason
00:38:21.780 that the
00:38:22.240 United States
00:38:22.880 starts to
00:38:23.580 migrate its
00:38:24.180 manufacturing
00:38:24.840 out of there.
00:38:25.320 So I
00:38:27.480 don't think
00:38:27.880 they could
00:38:28.260 have known
00:38:28.600 that was
00:38:28.940 a good
00:38:29.180 idea.
00:38:30.540 They would
00:38:30.880 have known
00:38:31.200 that was
00:38:31.580 risky.
00:38:32.340 And China
00:38:32.740 is not
00:38:33.140 the super
00:38:33.840 risky
00:38:34.200 country,
00:38:34.720 right?
00:38:35.460 China tends
00:38:36.180 to not
00:38:36.660 take risks.
00:38:38.320 So does
00:38:38.740 it make
00:38:39.000 sense that
00:38:39.560 the country
00:38:40.040 most famous
00:38:41.040 for not
00:38:41.640 taking risks
00:38:42.600 took the
00:38:43.860 biggest risk
00:38:44.480 anybody had
00:38:45.040 ever taken
00:38:45.520 as a
00:38:45.940 country?
00:38:48.540 I mean,
00:38:49.300 maybe Hitler
00:38:49.860 took a
00:38:50.240 bigger risk,
00:38:50.920 but special
00:38:52.060 case.
00:38:52.440 Yes,
00:38:55.040 in order
00:38:55.340 to get
00:38:55.620 rid of
00:38:55.900 Trump?
00:38:56.340 No,
00:38:56.740 I don't
00:38:56.920 think that
00:38:57.960 was a
00:38:58.240 big enough
00:38:58.660 reason.
00:39:06.920 Why do
00:39:07.520 you keep
00:39:07.820 saying
00:39:08.160 randomized
00:39:08.580 controlled
00:39:09.180 trials are
00:39:09.720 the only
00:39:10.020 method capable
00:39:10.820 of demonstrating
00:39:11.780 causality?
00:39:13.140 Well,
00:39:13.420 actually,
00:39:13.940 I don't say
00:39:14.400 that because
00:39:14.920 even the
00:39:15.680 randomized
00:39:15.980 controlled
00:39:16.660 trial doesn't
00:39:17.380 demonstrate
00:39:17.840 causality.
00:39:19.780 Does it?
00:39:22.440 Check me
00:39:23.800 on that.
00:39:25.600 A
00:39:26.000 randomized
00:39:26.380 controlled
00:39:27.000 trial doesn't
00:39:27.920 tell you
00:39:28.660 causality,
00:39:29.780 right?
00:39:31.080 Does it?
00:39:34.160 It tells
00:39:35.060 you correlation,
00:39:35.960 doesn't it?
00:39:37.740 And then you
00:39:38.440 have to use
00:39:39.000 your judgment
00:39:39.600 or something
00:39:41.340 like it
00:39:41.960 to determine
00:39:43.720 whether it
00:39:44.480 was causal
00:39:45.080 or whether causality
00:39:47.220 was involved.
00:39:47.840 Has a
00:39:52.240 randomized
00:39:52.620 controlled
00:39:53.320 trial
00:39:53.660 ever been
00:39:54.780 duplicated?
00:39:57.620 The answer
00:39:58.500 is yes,
00:39:59.700 but it's a
00:40:00.560 funny question.
00:40:10.120 Randomized
00:40:10.760 trial is
00:40:11.320 about causality.
00:40:12.400 I know
00:40:12.840 the point
00:40:13.600 of a
00:40:13.940 randomized
00:40:14.220 controlled
00:40:15.080 trial is
00:40:16.420 to determine
00:40:17.100 causality,
00:40:18.860 but it
00:40:19.200 doesn't do
00:40:19.660 that.
00:40:20.480 It just
00:40:20.800 gets you
00:40:21.120 close,
00:40:21.560 right?
00:40:23.140 I guess
00:40:23.700 all I'm
00:40:24.020 arguing is
00:40:24.600 it doesn't
00:40:25.640 complete the
00:40:26.460 distance.
00:40:27.160 It just
00:40:27.540 gets you
00:40:27.960 close enough
00:40:28.700 that your
00:40:29.860 judgment
00:40:30.340 has to
00:40:31.280 suffice for
00:40:32.440 the rest
00:40:32.720 of it.
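To make the correlation-versus-causality point concrete, here is a minimal simulated trial in Python. Everything in it is invented for illustration (the function, the effect size, the baseline range): randomization makes the two arms comparable on average, so the observed difference estimates the treatment effect, but any single trial still leaves statistical noise that judgment has to cover.

    import random
    import statistics

    random.seed(42)  # fixed seed so the sketch is reproducible

    def simulate_trial(n=1000, true_effect=0.15):
        # Each subject has an unmeasured baseline recovery chance;
        # treatment adds true_effect. Assignment is a coin flip,
        # which is what makes the two groups comparable on average.
        treated, control = [], []
        for _ in range(n):
            baseline = random.uniform(0.3, 0.6)
            if random.random() < 0.5:
                treated.append(random.random() < baseline + true_effect)
            else:
                control.append(random.random() < baseline)
        return statistics.mean(treated) - statistics.mean(control)

    # The estimate lands near the true effect, but never exactly on it:
    print(f"estimated effect: {simulate_trial():.3f} (true effect: 0.150)")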
00:40:40.220 All right.
00:40:41.220 So that's
00:40:41.880 all I got
00:40:42.160 for today.
00:40:43.200 No doubt
00:40:43.860 the best
00:40:44.380 show that's
00:40:45.080 ever been
00:40:45.680 broadcast
00:40:46.560 anywhere
00:40:47.120 at any
00:40:47.740 time.
00:40:49.140 And
00:40:49.500 when does
00:40:51.120 Trump stop
00:40:51.680 making news
00:40:52.200 again,
00:40:52.740 I'm being
00:40:53.140 asked.
00:40:54.260 I think
00:40:55.120 that no
00:40:55.500 matter what,
00:40:56.120 whether he
00:40:56.480 runs for
00:40:57.060 office or
00:40:58.100 he creates
00:40:58.920 a platform
00:40:59.640 to influence
00:41:01.280 politics,
00:41:02.620 his instinct
00:41:04.360 is to stay
00:41:05.020 in the news
00:41:05.580 and he's
00:41:05.940 good at it.
00:41:06.520 Scott,
00:41:12.140 plenty of
00:41:12.580 people
00:41:12.820 chose the
00:41:13.340 option
00:41:13.640 they're
00:41:13.880 more
00:41:14.140 afraid
00:41:14.540 of
00:41:14.880 that
00:41:15.220 is
00:41:15.380 less
00:41:15.660 convenient.
00:41:17.720 Right?
00:41:19.160 They did a
00:41:19.880 cost-benefit
00:41:20.360 analysis.
00:41:21.000 Everybody did
00:41:21.440 that.
00:41:28.280 All right.
00:41:28.960 Metaverse.
00:41:29.460 I talked
00:41:29.760 about the
00:41:30.080 metaverse.
00:41:31.120 So thank you
00:41:31.980 for reminding
00:41:32.420 me,
00:41:32.700 but I
00:41:33.160 said what
00:41:33.560 I wanted
00:41:33.820 to say
00:41:34.060 about that.
00:41:38.740 Yeah,
00:41:39.260 I just
00:41:39.480 wanted to
00:41:39.840 tie the
00:41:40.120 metaverse
00:41:40.520 into the
00:41:41.000 simulation
00:41:41.460 and into
00:41:42.080 the
00:41:42.440 virtual reality
00:41:43.640 stuff.
00:41:45.580 Trump is
00:41:46.260 compromised
00:41:46.720 by Big Pharma.
00:41:47.860 You know,
00:41:48.160 I doubt
00:41:48.480 that.
00:41:49.620 I doubt
00:41:50.320 that.
00:41:51.400 I feel
00:41:52.020 like Trump
00:41:52.480 might be
00:41:52.980 one of the
00:41:53.380 few people
00:41:54.040 you could
00:41:54.500 depend on
00:41:55.040 not to be
00:41:55.580 compromised
00:41:55.980 by Big Pharma.
00:41:57.580 Only because
00:41:58.300 he's never
00:41:58.720 had an
00:41:59.060 association
00:41:59.440 with them
00:41:59.980 before
00:42:00.400 that I'm
00:42:01.580 aware of.
00:42:02.700 And it
00:42:02.940 would be
00:42:03.240 obvious and
00:42:04.020 I don't
00:42:04.580 think he's
00:42:05.000 a fan and
00:42:05.620 he's a
00:42:05.940 skeptic.
00:42:10.500 What's that
00:42:11.020 in all caps
00:42:11.580 there?
00:42:11.860 You're
00:42:12.020 yelling at
00:42:12.360 me.
00:42:22.200 Would it
00:42:22.680 be a good
00:42:23.120 business move
00:42:23.820 for Big Pharma
00:42:24.540 to support
00:42:25.240 Omicron
00:42:25.840 spread?
00:42:26.200 Well,
00:42:26.760 the problem
00:42:27.080 is you
00:42:28.040 don't want
00:42:28.400 to be
00:42:28.600 the one
00:42:28.920 who says
00:42:29.320 that Omicron
00:42:29.980 is safe.
00:42:31.880 You know,
00:42:32.140 unless you're
00:42:32.520 me,
00:42:32.960 right?
00:42:34.140 Unless you
00:42:34.700 can handle
00:42:35.160 the blowback.
00:42:37.040 But I
00:42:37.480 don't think
00:42:37.840 if you're
00:42:38.180 somebody
00:42:38.520 official,
00:42:39.180 like an
00:42:39.460 expert,
00:42:40.320 you don't
00:42:40.660 want to
00:42:40.880 go so
00:42:41.260 far as
00:42:41.600 to say
00:42:41.860 it's
00:42:42.060 safe.
00:42:42.980 Or even
00:42:43.600 to recommend
00:42:44.160 it instead
00:42:44.740 of a
00:42:45.020 vaccination.
00:42:47.440 That would
00:42:47.820 be risky,
00:42:48.780 I think.
00:42:51.920 What about
00:42:52.540 the iodine
00:42:53.800 nose spray
00:42:54.360 solution
00:42:54.900 discussed
00:42:55.360 on Joe
00:42:55.920 Rogan?
00:42:56.760 Well,
00:42:57.200 I don't
00:42:57.420 know about
00:42:57.800 that,
00:42:58.520 but I
00:42:58.960 do know
00:42:59.360 that the
00:42:59.780 idea of
00:43:01.200 spraying
00:43:02.180 something into
00:43:02.780 your nose
00:43:03.180 to kill
00:43:03.620 the virus
00:43:04.140 that's
00:43:04.480 obviously
00:43:04.940 there,
00:43:05.720 that's been
00:43:06.160 talked about
00:43:06.620 a lot.
00:43:07.260 But I
00:43:07.540 don't know
00:43:07.960 that. Unless
00:43:10.120 you were
00:43:10.440 doing it
00:43:10.860 on the
00:43:11.260 regular,
00:43:12.320 wouldn't
00:43:12.720 all the
00:43:13.160 virus
00:43:13.680 repopulate
00:43:14.840 pretty quickly?
00:43:15.460 Maybe that's
00:43:17.260 a question
00:43:18.240 for a
00:43:19.040 doctor.
00:43:20.380 I'll ask
00:43:20.960 that question
00:43:21.460 when I
00:43:21.780 talk to
00:43:22.240 Dr.
00:43:22.640 Drew.
00:43:22.900 I think
00:43:23.120 I'm
00:43:23.360 scheduled
00:43:23.800 to talk
00:43:24.200 to him
00:43:24.420 tonight.
00:43:26.080 So look
00:43:26.600 for that.
00:43:27.840 And I
00:43:30.140 don't think
00:43:30.460 it's bleach,
00:43:31.180 but it does
00:43:31.640 make sense
00:43:32.160 that clearing
00:43:33.560 out your
00:43:34.020 nose would
00:43:35.860 get rid of
00:43:36.340 some virus.
00:43:36.980 I just
00:43:37.260 don't know
00:43:37.600 if it
00:43:37.840 lasts an
00:43:38.480 hour.
00:43:39.800 Is it
00:43:40.240 back in
00:43:40.620 an hour?
00:43:41.700 I don't
00:43:41.960 know.
00:43:46.160 So I
00:43:46.860 use a
00:43:47.840 mouthwash
00:43:48.700 several
00:43:49.660 times a
00:43:50.140 day.
00:43:52.780 Probably
00:43:53.340 overuse it,
00:43:54.240 usually
00:43:54.600 related to
00:43:55.420 my other
00:43:57.140 habits.
00:43:58.140 But I
00:44:00.040 wonder if
00:44:00.560 you did a
00:44:01.580 test of
00:44:02.020 people who
00:44:02.440 used just
00:44:03.060 a regular
00:44:03.480 mouthwash
00:44:04.160 five times
00:44:04.760 a day.
00:44:06.180 I'll bet
00:44:06.640 their odds
00:44:07.220 of COVID
00:44:08.000 are lower
00:44:09.540 or their
00:44:10.720 outcomes better
00:44:11.180 because their
00:44:11.680 viral load
00:44:12.280 was lower.
00:44:14.320 Is God
00:44:15.160 the author
00:44:15.560 of the
00:44:15.900 simulation?
00:44:19.200 You would
00:44:19.980 only need
00:44:20.580 God for
00:44:21.220 the original
00:44:22.100 species.
00:44:23.780 And after
00:44:24.200 that,
00:44:25.120 everything else
00:44:25.780 would be
00:44:26.140 people.
00:44:28.620 So even
00:44:30.100 if there is
00:44:30.760 a God
00:44:31.360 for the
00:44:32.080 simulation,
00:44:33.160 it's unlikely
00:44:34.480 that you're
00:44:35.080 in one of
00:44:35.520 the ones
00:44:35.940 created by
00:44:36.620 the God
00:44:36.980 directly.
00:44:37.420 Why do
00:44:40.720 they work
00:44:41.040 so hard
00:44:41.500 to withhold
00:44:42.100 ivermectin?
00:44:44.280 You know,
00:44:45.500 let me ask
00:44:46.540 you in the
00:44:46.900 comments.
00:44:47.280 I just want
00:44:47.700 to see where
00:44:48.060 your heads
00:44:48.500 are at.
00:44:49.300 So before
00:44:49.700 I give you
00:44:50.420 my updated
00:44:51.460 opinion,
00:44:52.660 how many
00:44:53.120 of you think
00:44:53.680 that in
00:44:54.060 the end
00:44:54.420 we will
00:44:54.860 learn that
00:44:56.120 ivermectin
00:44:56.880 would have
00:44:57.460 worked if
00:44:59.220 we'd gone
00:44:59.820 gung-ho and
00:45:00.700 done it
00:45:01.480 early and
00:45:02.260 all
00:45:03.160 that.
00:45:04.160 How many
00:45:04.600 think that
00:45:05.120 it would
00:45:05.560 have worked?
00:45:06.680 I'm seeing
00:45:07.060 mostly yeses.
00:45:09.260 Some noes,
00:45:10.260 but mostly yeses
00:45:11.120 on locals.
00:45:12.060 How about
00:45:12.380 over here?
00:45:13.660 On YouTube,
00:45:15.600 I'm seeing a
00:45:16.220 mix.
00:45:18.280 Got real
00:45:18.760 quiet on
00:45:19.860 YouTube there
00:45:20.360 for a minute.
00:45:21.520 More noes
00:45:22.340 than yeses.
00:45:24.040 No,
00:45:24.280 actually,
00:45:24.820 about even.
00:45:25.260 Here's
00:45:32.700 somebody,
00:45:33.140 I never
00:45:33.520 saw this
00:45:34.000 coming.
00:45:35.160 Scott likes
00:45:35.820 masks in
00:45:36.600 China.
00:45:37.940 I don't
00:45:38.300 know what
00:45:39.220 fucking thing
00:45:39.900 you are
00:45:40.180 watching,
00:45:41.000 but I'm
00:45:41.360 going to
00:45:41.500 make you
00:45:41.820 go away.
00:45:43.620 Goodbye.
00:45:46.980 All right. So I'll give you my most current take on ivermectin. I don't rule it out.
00:45:55.900 If anybody says to me there was a global conspiracy to suppress something that worked, because it was cheap and suppressing it let people sell vaccines,
00:46:07.060 I would say there's nothing about that that's impossible, because there's a mechanism for it.
00:46:14.180 And the mechanism would be that the pharma companies would only have to get to a few experts, which we know they can do and have done; it's a routine thing.
00:46:24.120 You get to a few experts, you have those few experts define what is real and what is not, and then everybody else just falls in line.
00:46:33.780 Because, you know, most countries were just following the United States, weren't they?
00:46:38.720 So, remember, one of my initial objections was: are you telling me that there's no country anywhere where they've tried it, so that we could know for sure if it worked?
00:46:49.160 Nowhere. There's not one place where they tried it and it worked.
00:46:52.600 And then people come up with all the places that it was tried and it didn't work, but they believe it did, including India, Peru.
00:47:01.060 There's a few more that people imagine worked, but if you actually look at the current numbers, there's no indication of it.
00:47:07.680 Now, that doesn't mean it doesn't work.
00:47:09.920 I'm just saying that the country examples were not real, but they looked real.
00:47:14.300 A lot of people cite Japan; there's another one that's not real.
00:47:17.320 So all of the country ones are not real, but if you want to test that for yourself, just Google the country plus "ivermectin" and "debunk," and all the debunks pop right up.
00:47:29.540 And it's just data. Just look at the data; it's not there.
00:47:33.640 Now, my current take is this: it is completely possible that Big Pharma suppressed ivermectin, and maybe it did work. Totally possible.
00:47:46.420 But think about every day that goes by in which no rogue city, country, or county acts on the thing a lot of you guys believe, which is that it definitely works and we're being fooled.
00:48:00.340 You don't think there was one hospital, one city, one county, anywhere in the world, where somebody had the same opinion that almost all of you have, which is that we're being fooled and it probably works?
00:48:15.480 I don't think that's possible.
00:48:17.820 I think you could suppress it 90%, but most of you think the suppression of it is bullshit; I think the majority of you said that you don't believe it doesn't work.
00:48:33.160 Given that, I don't see any chance we could have gotten to where we are without at least one hospital using it and saying, "Hey, it's been two years and nobody's died here."
00:48:45.840 Seriously? Not one hospital?
00:48:48.260 Now, there are individual doctors who have tried it, but that's not a standard I would use, because 99% of a doctor's patients are going to survive, right?
00:49:00.500 No matter what they do, as long as it's not dangerous.
00:49:09.200 Why are doctors still prescribing it?
00:49:11.360 Well, they're prescribing it for the same reason that I'd be tempted to take it: because you don't know.
00:49:19.520 There's plenty of studies, not quite as high quality as you'd want, that indicate it might work, and the risk is low.
00:49:29.140 So there's no mystery about why doctors prescribe it. It's just risk management.
00:49:37.100 But that doesn't tell you it works.
00:49:40.760 Yeah, "no one died" is too high a standard for "it worked."
00:49:44.560 Yeah, I mean, you shouldn't take that as an absolute "no one."
00:49:48.460 Yeah, all the stories you hear about a country that solved their problem with X: none of them are true.
00:50:01.980 If any country had solved their COVID problem by doing X, whatever X is, it'd be over.
00:50:09.780 We'd just all be doing X by next week, and it'd be over. So, no, that didn't happen.
00:50:18.700 "Scott, people shouldn't be banned from talking about it."
00:50:23.340 Yeah, you know, I don't know if it worked, though, because there's two questions.
00:50:30.440 I would agree with you that free speech requires that those things not be banned.
00:50:35.940 What I'm not sure of is if it worked from the perspective of the people who banned it.
00:50:42.320 In other words, did it make it more likely that you would avoid it and get the more approved treatments? I don't know. It might have.
00:50:51.680 That doesn't mean you should do it, but it might have worked.
00:50:58.520 What is the vaxxed survival rate? Better than 99.8?
00:51:03.260 The last numbers I saw said that the vaxxed survival rate is something like 90% better than the unvaxxed.
00:51:11.600 But once you get to that point, you're already pretty sick, I think. I think that's how that works.
00:51:21.180 How do we know that? Well, data, which we don't trust.
00:51:30.620 Blake James says, "You disappoint me more and more every day."
00:51:34.620 All right, Blake, let me give you a little test here of your cognitive dissonance.
00:51:41.400 Blake, Blake Jameson, I challenge you now: in your next comment, state what you think I believe that you disagree with. Go.
00:51:51.440 Blake, you're on. It shouldn't take you long.
00:51:54.740 If I miss Blake's comment, alert me to it. Go.
00:51:59.000 Now, the challenge is to say something that's not obviously cognitive dissonance, which would be you stating an opinion for me that I clearly don't hold.
00:52:08.580 See if you can do that. Bet you can't.
00:52:12.160 Now, there's a reason why you can't.
00:52:16.480 Did he just disappear, or is there a delay here?
00:52:19.080 Have you ever read The Forgotten Man? No.
00:52:22.200 Why are most African countries, and many in Eastern Europe, totally ignoring COVID?
00:52:29.380 Well, the lack of comorbidities would be my guess.
00:52:33.380 So the average age in Africa, let's just pick Africa, the average age is way younger than normal, the average weight way lower, and the amount of time they spend in the sunshine way more.
00:52:45.880 So if everything was exactly the way we expected, Africa could look exactly the way it looks without any mystery there at all.
00:53:00.540 I missed Blake's comment. Okay. Let's go back.
00:53:05.740 Blake, Blake, Blake. Blake, Blake, Blake.
00:53:10.560 Damn it, where are you?
00:53:12.880 All right. "You believe that your thinking is above others."
00:53:17.900 No, Blake.
00:53:21.880 I don't know how much more clear I could be that we're all guessing, or how many more times I could say it.
00:53:30.620 Anyway, did I make my case?
00:53:33.780 When you see the people who say stuff like "you disappoint me" and "you're wrong about everything," they almost never have a clear understanding of what my opinion is.
00:53:43.600 Starseed says, "Sorry, Scott, he was right about that."
00:53:50.840 So when I say that we're all just guessing and making rational decisions, does that not cover everything?
00:53:59.360 Now, I did write a book on how to think well, which was critically acclaimed.
00:54:07.160 So if you're telling somebody that they're not good at thinking, you should probably talk to somebody who didn't just write a best-selling book on how to think better.
00:54:22.900 That would be a good start.
00:54:24.660 So, no: do I think that I think better than other people?
00:54:28.840 I think I've demonstrated it, meaning that you can see lots of examples of it. For example, poor Blake.
00:54:39.240 Yeah, let me not dunk on him anymore.
00:54:41.540 All right, that's all I got for now, and I will talk to you all soon.