Real Coffee with Scott Adams - November 06, 2021


Episode 1553 Scott Adams: Looks As If All Our Problems Have Been Solved Except Celebrities Killing People


Episode Stats

Length: 55 minutes
Words per Minute: 148.59
Word Count: 8,265
Sentence Count: 578
Misogynist Sentences: 5
Hate Speech Sentences: 3


Summary

Is body language real? Is it a skill you can learn and apply? Or is it something you only think you can do? In this episode of Coffee with Scott Adams, host Scott Adams tries to answer these questions and much more.


Transcript

00:00:00.400 Good morning, everybody, and welcome to the best thing that ever happened to you and maybe anybody else.
00:00:08.400 It's called Coffee with Scott Adams, and there's a little thing called the Simultaneous Sip that's coming up in a moment.
00:00:14.360 But I have to set the stage.
00:00:17.160 Have I ever told you how much I hate sleeping?
00:00:20.300 I'm the only person I know. I'm sure there are others.
00:00:23.140 But I hate sleeping.
00:00:24.480 And so I wake up just naturally quite early.
00:00:30.160 I start work at 4 a.m., usually; that would be my ideal time to start work.
00:00:35.880 So last night, I woke up in bed, and I thought to myself, you know, I think I'm just going to get up because I was awake.
00:00:46.680 And I didn't really check the clock because my internal clock's pretty accurate.
00:00:51.160 And I thought, that's pretty close to 4 a.m., give or take.
00:00:55.400 So I got up.
00:00:57.060 It turned out to be 1:45.
00:00:59.900 1:45 is when I woke up this morning.
00:01:03.080 And I was already up and around and petting the dog, and I thought, you too?
00:01:09.500 Is that Vegas?
00:01:10.340 So, let's just say that the quality of this live stream might be a little bit lower than what you're used to.
00:01:21.120 I'm just trying to set the expectations.
00:01:23.640 But still, despite all of that, how terrific is it going to be to do the simultaneous sip?
00:01:30.380 It's going to be amazing.
00:01:31.320 And all you need is a cup or a mug or a glass, a tankard, chalice or stein, a canteen jug or flask, a vessel of any kind.
00:01:38.640 Fill it with your favorite liquid.
00:01:40.600 I like coffee.
00:01:42.300 And join me now for the unparalleled pleasure of the dopamine hit of the day, the thing that makes everything better, especially your antibodies.
00:01:49.040 It's called the simultaneous sip, and it happens now.
00:01:50.580 I don't know about you, but sometimes if you're leaning, when you have the simultaneous sip, it'll stimulate the antibodies only on the lower side of your body.
00:02:07.840 So, if you made the mistake I did, which was leaning while you did it, try this.
00:02:13.160 Lean the other way.
00:02:17.260 There we go.
00:02:18.400 There we go.
00:02:21.560 Antibodies stimulated and evenly distributed.
00:02:26.480 Now we're in good shape.
00:02:28.780 All right.
00:02:29.860 Question for all of you.
00:02:31.520 I was just chatting with the local subscribers before I fired up YouTube.
00:02:36.260 And I'm going to ask you the same question on YouTube.
00:02:40.100 Is body language real?
00:02:42.160 In other words, is it a skill which you can learn and apply?
00:02:48.020 Or maybe you've already learned it, and then you apply it.
00:02:52.640 Is body language real?
00:02:54.400 Go.
00:02:55.800 Yes, yes, yes.
00:02:57.360 Yes, yes.
00:02:58.100 Absolutely.
00:02:59.240 Yes, yes, yes.
00:03:00.600 Yes, yes, yes.
00:03:02.000 Okay.
00:03:02.740 Looks like we have universal agreement that body language is real.
00:03:06.960 Well, question number two.
00:03:10.640 How do you know you're good at reading it?
00:03:15.340 Yeah, you're not so certain now, are you?
00:03:17.260 Here's my observation.
00:03:21.140 It doesn't matter if body language is real or not.
00:03:25.260 For most purposes.
00:03:28.160 It doesn't matter.
00:03:30.160 Because you don't know if you're good at reading it.
00:03:33.560 That's the problem.
00:03:34.560 The problem is not whether body language is real.
00:03:40.920 The problem is you think you can read it, and you can't.
00:03:45.520 You're terrible at it.
00:03:48.160 It's the same problem with pattern recognition, right?
00:03:51.740 Our brains are pattern recognition machines.
00:03:55.000 And we see patterns, and we think they mean something, even when they don't.
00:03:58.900 That's why we have science, to essentially get around our own illusions about patterns.
00:04:05.820 Well, don't you think that your pattern recognition is what's driving your interpretation of body language?
00:04:14.220 All right, so now you all think that body language is a real thing.
00:04:19.040 You got real quiet.
00:04:20.720 Here's my next question.
00:04:22.720 You ready to have your brain blown off?
00:04:25.480 You ready?
00:04:26.060 You ready?
00:04:26.420 You ready?
00:04:26.480 How many of you listening or watching right now have been accused of feeling something you did not feel
00:04:37.380 because somebody you know misread your body language?
00:04:42.340 How many this week?
00:04:44.340 This week.
00:04:45.620 I'll read your messages.
00:04:48.020 All the time, most of the time, me, me, me, yep, often, yep, yep, yep, yep, yes, true, true, true, yes, true.
00:04:54.340 Every... now reading over here.
00:04:57.380 Every day, a few no's.
00:05:00.020 Every day, every day, yes, yes, yes, every day.
00:05:04.480 All right?
00:05:05.200 Now, you see where I'm going here?
00:05:09.560 If body language reading were a real thing, don't you think they'd read your body language right?
00:05:16.560 Don't you think that the people who accused you of being, let's say, angry when you weren't angry or not interested when you were interested,
00:05:26.700 whatever it was they were blaming you for, don't you think that the person who did that thought that they were good at reading body language?
00:05:34.540 Wouldn't you say?
00:05:35.120 Probably every person who falsely accused you because they read your body language wrong, I'll bet every one of them thought they did it well.
00:05:45.340 Right?
00:05:47.160 It's sort of like do your own research.
00:05:50.220 How many of you think that it's wise and noble to do your own research?
00:05:55.040 There's a big story, most of you have seen it, about quarterback Aaron Rodgers, who was getting a lot of attention because he was fairly eloquent in describing his process of deciding whether to get vaccinated or not.
00:06:10.520 He decided apparently not to.
00:06:13.100 He thought there were some risks of allergic reactions in his case that might be unique to him.
00:06:19.460 I don't know if that's a thing.
00:06:21.480 Do you?
00:06:21.860 Is that a thing?
00:06:25.940 I don't know.
00:06:27.060 I mean, I looked it up and I couldn't find that it's a thing, but he thinks it's a thing.
00:06:31.460 Now, if I asked you, should you do your own research and talk to your doctor to decide what to do, most of you would say what?
00:06:42.380 Should you do your own research on, let's say, anything but vaccinations and then talk to your doctor?
00:06:49.220 That's the way you should handle it, right?
00:06:50.420 Do your own research and talk to your doctor.
00:06:52.860 How many of you think you should only do one of those?
00:06:56.960 Only your own research or only talk to your own doctor?
00:07:01.280 Nobody, right?
00:07:02.420 You would all agree 100% that you should do your own research and then talk to your doctor.
00:07:07.840 You know doctors are saying different things, right?
00:07:13.800 They're not all saying the same thing.
00:07:17.120 So how do you know you got the right doctor?
00:07:19.140 You're basically imagining that you're good at picking a doctor.
00:07:25.720 You probably didn't pick your doctor.
00:07:28.840 Most of you just sort of took the doctor that was there because you don't know how to pick a good doctor from a bad doctor.
00:07:34.960 So, it's similar to getting a financial advisor.
00:07:40.820 There are more financial advisors than there are stocks.
00:07:44.680 How do you know you got a good financial advisor?
00:07:46.700 They don't beat the market more than a monkey with a dartboard.
00:07:51.120 They usually do worse than a monkey with a dartboard.
00:07:54.040 Let's say compared to index funds.
00:07:55.620 So here's the problem that we run into all the time.
00:08:00.400 Body language is totally real, but you can't do it.
00:08:05.640 You just think you can.
00:08:09.000 Financial analysis, in which you look at the pros and cons of companies and study their balance sheets
00:08:14.640 and look at their business model and study the management quality, that's very real.
00:08:21.680 You can't do it.
00:08:23.640 But you think you can.
00:08:25.620 You think you can.
00:08:28.760 Likewise, doing your own research to decide what you should do about vaccinations.
00:08:34.480 You think you can do that.
00:08:36.980 But you can't.
00:08:38.820 You can't.
00:08:40.280 Likewise, almost everything that you see on Twitter that's a graph or a chart or is proving something,
00:08:46.000 you think you can read that and come to a decision that's pretty good.
00:08:50.080 But you can't.
00:08:51.440 You don't have those skills.
00:08:53.420 Nobody does.
00:08:54.120 Basically, nobody does.
00:08:56.700 You know, the reason that I continually promote people like Andres Backhaus and Anatoly Lubarsky is that they seem to be very close to having those skills.
00:09:08.240 But of course, nobody's right all the time, right?
00:09:11.140 So, you don't know how bad you are at something until you see somebody who's good at it. Would you agree with that?
00:09:18.320 Would you agree that, generally speaking, it's impossible to know how bad you are at something until you meet somebody who's good at it?
00:09:25.620 Take singing, for example.
00:09:27.680 Take singing, for example.
00:09:29.000 Suppose you had never met a good singer.
00:09:31.600 You probably thought you could do it.
00:09:34.040 And then you meet Mariah Carey and you say, oh, okay, I don't know what I was doing, but that wasn't singing.
00:09:39.120 That singing, whatever I was doing, I don't even know what that was anymore.
00:09:42.640 So the problem is you'll never meet somebody who's good at body language, because there aren't many of them.
00:09:48.680 I don't know if there are any of them, really.
00:09:50.180 But I'm assuming that the people who study it professionally probably are pretty good at it, pretty good.
00:09:56.940 But you're never going to meet them, right?
00:09:59.400 And it's not you.
00:10:00.420 So all I would encourage you to do is have some humility about the assumption that you can do your own research.
00:10:10.340 Now, let me give you my macro opinion of Aaron Rodgers.
00:10:14.020 A bunch of people sent me his video and said, hey, this is really persuasive.
00:10:18.760 This guy gets it.
00:10:21.320 And here's where he lost me.
00:10:25.620 He lost me when he called himself a critical thinker.
00:10:30.420 So far, so good, right?
00:10:32.180 Called himself a critical thinker.
00:10:33.900 And then he demonstrated it by saying that he did his own research.
00:10:38.160 That's where he lost me.
00:10:40.080 Because if you're a critical thinker, you know you can't.
00:10:43.620 If he were a critical thinker, he would know that he can't do his own research.
00:10:47.580 Not in any reliable way.
00:10:50.260 I mean, even if he were a top researcher, he probably couldn't.
00:10:53.600 Let me give you another example.
00:10:54.760 The director of the CDC, I guess, recently claimed that face masks are 80% effective.
00:11:04.580 Really?
00:11:06.260 Really?
00:11:07.280 Is there anybody who believes that face masks are 80% effective?
00:11:11.840 80%?
00:11:12.460 Even the people who are in favor of them are giving numbers that are like sub-20% and we're
00:11:19.020 not even entirely sure about that, right?
00:11:22.320 So, of course, this caused somebody to respond in the comments. There was a tweet about it.
00:11:29.200 And as soon as I saw it, I thought, that must be a typo.
00:11:32.760 Didn't you think that?
00:11:33.640 If you read something
00:11:42.420 that said the director of the CDC thought masks were 80% effective, wouldn't you think that was a typo?
00:11:47.300 Or it just didn't really happen?
00:11:50.020 But apparently it did.
00:11:51.720 It did.
00:11:52.440 And apparently, and I didn't know this, there are a whole bunch of studies that show that
00:11:55.840 masks are 70 to 80% effective in certain situations that are basically not this one.
00:12:01.620 How long did it take for somebody to reply to the tweet that showed a whole bunch of studies
00:12:08.740 that show masks totally work?
00:12:11.320 How long did it take somebody to paste a whole bunch of studies that show masks totally don't
00:12:17.200 work?
00:12:18.400 About a second.
00:12:20.860 So right next to each other on Twitter is a link to an article that mentions a whole
00:12:26.620 bunch of studies, masks totally work, and then another one, a whole bunch of studies,
00:12:31.180 masks totally don't work.
00:12:32.980 So when Aaron Rodgers goes to do his own research, how does he sort that out?
00:12:38.460 Now, he was talking about vaccinations, not masks, but it's a perfect example.
00:12:42.920 Do you think Aaron Rodgers, with whatever skill stack he brings to this, could be a critical
00:12:49.120 thinker and then sort out whether masks work or not with two completely different sets
00:12:55.440 of answers?
00:12:55.940 I can't.
00:12:56.440 I can't.
00:12:57.440 I can't.
00:12:58.440 Could you?
00:12:59.440 How many of you could do that?
00:13:01.440 Right?
00:13:02.440 So he did his own research, but it's absurd because he never knows what he doesn't know.
00:13:09.440 Right?
00:13:10.440 If you're from the outside of an area of expertise, unless you're unusually smart, unless you're
00:13:22.260 Elon Musk smart, you can't enter somebody else's field and figure it out.
00:13:29.420 Well, that's not a thing.
00:13:33.260 You know, it's a thing for the smartest among us, but it's not a thing, generally speaking.
00:13:38.280 And it's certainly not a thing that Aaron Rodgers did.
00:13:40.740 So when he calls himself a critical thinker, I challenge that.
00:13:44.120 Because a critical thinker would know that you can't do what he claims to have done.
00:13:49.360 It's not doable with the brains we have and the information that's available to us.
00:13:54.720 It's just not doable in any rational way.
00:13:57.000 All right, let's talk about this.
00:13:59.900 I am so interested in the Steele dossier story update because something's happening, right?
00:14:07.700 And I don't know exactly what it is because some parts of the media are treating it like
00:14:14.560 it's not even a story, but it looks like it's the biggest story.
00:14:18.700 And so we have these two movies running at the same time that is the biggest thing I've
00:14:25.240 seen in years, or it's nothing.
00:14:29.460 Or, you know, just some weasels did some lying.
00:14:32.720 That's about it.
00:14:34.280 So it's either some weasels did some lying, or Hillary Clinton was behind, or her campaign
00:14:40.460 was behind, a legitimate plan to, basically, cheat in a major election
00:14:47.400 and/or overthrow a sitting president once Trump was elected.
00:14:53.440 It's either, what is it?
00:14:54.580 Is it the worst thing in the world, or is it nothing?
00:14:58.140 Do your own research.
00:15:00.440 Go look at the media.
00:15:02.460 Do you think you could figure it out?
00:15:03.900 If it's nothing, or it's a whole bunch?
00:15:06.960 Well, I think this story is telling you how deeply the intelligence assets in this country
00:15:15.020 are embedded in the media.
00:15:16.900 That's all this is telling us, right?
00:15:20.380 You know, as Glenn Greenwald often tweets and writes, that the public isn't fully aware
00:15:26.840 that major parts of our media are just controlled by the CIA or CIA assets or some damn intelligence
00:15:35.480 units of the government.
00:15:38.380 And this isn't conspiracy theory stuff.
00:15:43.080 This is really well-documented stuff.
00:15:46.360 I don't know if anybody even...
00:15:48.200 I don't think there's anybody serious who even doubts it, right?
00:15:51.620 That our intelligence agencies directly influence the news.
00:15:54.940 And so when you see that this story is sort of semi-disappeared, it's definitely covered
00:16:02.520 and the opinion people are talking about it, but it should be the main story, I think, and
00:16:11.600 yet sort of downplayed everywhere.
00:16:15.280 Now, actually, that's an exaggeration.
00:16:18.220 The Washington Post covered it, so I'll give them that.
00:16:21.120 Jonathan Swan, I think he's at Axios, tweeted the Washington Post coverage.
00:16:27.580 So I looked on Axios to see the coverage, and there wasn't anything today, but I'm sure
00:16:32.560 they covered it originally.
00:16:35.800 And so Jonathan Swan summarizes the Washington Post coverage of it this way.
00:16:41.540 He says, the charges are not only did Clinton/Democrats fund the dossier, but a longtime
00:16:49.180 Clinton/Dem operative was one of the sources for the rumors about Trump.
00:16:54.640 And then he summarizes that by saying, doesn't get much worse.
00:16:59.340 Right, right.
00:17:00.960 As in, it's the biggest story.
00:17:03.080 It doesn't get much worse.
00:17:05.080 But it's not the biggest story.
00:17:06.740 Do you see now that there's somebody deciding what a story is?
00:17:23.220 You get that, right?
00:17:25.180 There's somebody, and it probably varies depending on what the story is, but somebody controls
00:17:31.800 whether or not the world thinks it matters.
00:17:34.680 And amazingly, whoever that is, or those people, or entities, or a series of forces, whatever
00:17:42.800 it is, has decided that this won't get attention.
00:17:47.520 It's really freaky, isn't it?
00:17:49.580 When you see it in real time.
00:17:51.660 When you see that the news is just not even close to real.
00:17:55.380 It's just a brainwashing operation.
00:17:58.560 You can't see who's pulling the strings all the time, but sometimes you can see the strings.
00:18:03.040 This is one of those times when you can see the strings.
00:18:06.100 You're like, hey, hey, hey, I can actually see those strings.
00:18:11.380 And it won't matter.
00:18:12.940 What are you going to do about it?
00:18:14.720 So let's say you and I see how bad this story is.
00:18:18.200 What are you going to do about it?
00:18:20.300 Nothing, right?
00:18:21.540 What am I going to do about it?
00:18:22.900 I can't think of anything I can do about it.
00:18:25.540 Wait?
00:18:26.060 I don't know.
00:18:27.040 So it's a weird story.
00:18:28.820 It's the biggest story ever.
00:18:29.900 But treated like it's nothing.
00:18:34.160 To make it even more amazing, it turns out that Rachel Maddow is running a report that Durham,
00:18:41.360 the very same person who's coming out with this new information,
00:18:45.040 and Barr, ex-AG Barr, intentionally ignored emails that, quote,
00:18:49.760 prove Trump was in direct communications with the Russian Alpha Bank.
00:18:54.760 A covert communication channel existed during the 2016 campaign that Barr and Durham knew was real,
00:19:04.260 but they covered up.
00:19:05.740 Boomerang, says Rachel Maddow.
00:19:08.780 Gotcha.
00:19:10.400 So are any of the other major media covering that story?
00:19:16.980 Where's that story on CNN?
00:19:19.760 I didn't see it.
00:19:21.580 Where's that story on Fox News?
00:19:23.320 I don't know.
00:19:25.600 I didn't see it.
00:19:26.920 Did she just make this up?
00:19:29.200 Is this just totally made up?
00:19:31.380 And don't you think there's a little context missing?
00:19:34.720 So she claims that there are emails that Durham and Barr saw but ignored.
00:19:40.040 What is the missing context?
00:19:42.480 They saw it, and they ignored it.
00:19:45.980 The missing context is why did they ignore it?
00:19:48.580 Probably because it was BS or unimportant or some trivial email.
00:19:55.800 Probably.
00:19:57.100 There's nothing here, and Rachel Maddow has decided to turn it into something.
00:20:02.860 Who controls MSNBC?
00:20:06.780 Depends who you ask.
00:20:08.180 Some would say intelligence agencies.
00:20:11.880 I can't confirm that, but that's a common claim.
00:20:16.060 If you were an intelligence agency, wouldn't you want to create the counter-narrative that,
00:20:20.840 oh, no, it really was Trump who was colluding with Russia after all?
00:20:23.740 This is so heavy-handed.
00:20:27.440 This is like really obvious, heavy-handed manipulation of the public.
00:20:34.120 It's kind of crazy.
00:20:35.960 And here's the thing.
00:20:37.120 It's totally working.
00:20:38.120 No matter how easily you can see the puppet strings and say,
00:20:43.480 hey, hey, this is clearly manipulation and a trick, it still works.
00:20:48.800 Because, again, what the hell are you going to do about it?
00:20:51.240 Take your story to the media?
00:20:53.760 Hey, I see what's going on here.
00:20:55.400 I think I'll call my contacts at CNN and report it.
00:21:00.960 Seriously, what the hell are you going to do about it?
00:21:02.840 As long as the media is going to cover it the way they're covering it,
00:21:06.560 that's sort of the end of it.
00:21:08.520 There's nothing you can do.
00:21:10.660 What are you going to do, call Tim Pool if he recovers from his COVID?
00:21:19.300 I mean, the independents are so small relative to the major media
00:21:24.020 that there's nothing you can do.
00:21:27.560 All right.
00:21:29.900 That story just fascinates the hell out of me,
00:21:31.760 that you can make something like that disappear.
00:21:34.320 So Pfizer announced that they've got a COVID pill,
00:21:36.660 and that, at least in the test,
00:21:40.320 they had zero deaths among people who took the COVID pill soon after having symptoms.
00:21:45.340 And it reduces deaths by 89% or something like that.
00:21:49.840 Now, if you've got a pill, and I imagine this will get approved pretty quickly,
00:21:55.680 that reduces it by 89%,
00:21:57.760 aren't you done?
00:22:00.780 Like, isn't this pill the get-back-to-work pill?
00:22:07.060 Because if you've got the vaccinations themselves,
00:22:11.380 and of course not everyone will take them,
00:22:13.180 but they reduce the risk, you know, by some enormous amount.
00:22:16.180 Then you've got this pill, and that reduces it by an enormous amount.
00:22:20.420 You've got the monoclonal antibodies.
00:22:22.760 We know that works.
00:22:23.600 We know the vitamin D drip works.
00:22:26.780 At least a few other things we know work.
00:22:29.640 So each of these, like, takes a big percentage out of the total risk.
00:22:33.520 And once the Pfizer pill is here, and ones like it,
00:22:38.220 you know, who is it, I forget the other company that has a pill.
00:22:45.080 But I feel like we're done when we have those.
00:22:48.600 Don't you?
00:22:50.500 I mean, I've felt like this before, so maybe this is false optimism.
00:22:53.840 But how much better would it be if you've got these pills that work
00:23:01.340 so long as you take them early?
00:23:03.520 How much better would things be if we had rapid tests that are so cheap
00:23:07.960 you could just do one every day?
00:23:10.960 Suppose you could, for $1, test yourself every day.
00:23:15.600 And, you know, not everybody would spend $365 per person in their household,
00:23:21.020 but a lot of people would, a lot of people would.
00:23:24.320 And you would get at least the super spreaders.
00:23:27.540 But if you were catching it fast and taking the pill fast,
00:23:30.600 I feel like we're done, right?
00:23:33.320 I mean, I don't know when we'll all have availability of these pills.
00:23:37.440 And then the real question is,
00:23:39.520 do you just buy these pills and keep them around?
00:23:42.920 You know, I'm sure they're prescription.
00:23:45.500 But wouldn't it be great if you could just get some and keep them?
00:23:49.060 Because you don't want to have that time lag.
00:23:51.020 Between a dry cough and getting the prescription.
00:23:54.800 Because that could be eight hours, right?
00:23:57.000 By the time you, you know, if you wake up in the middle of the night with a dry cough.
00:24:00.960 It could be hours and hours.
00:24:02.400 So wouldn't you like to test yourself, grab a pill, you're done.
00:24:06.520 You've already treated yourself.
00:24:07.820 Go back to bed.
00:24:09.200 Alone.
00:24:10.040 Quarantine.
00:24:11.340 All right.
00:24:11.700 So apparently there was a claim that the UK version of the drug that got approved
00:24:22.260 was really just ivermectin.
00:24:26.020 Oh, I'm sorry.
00:24:27.940 There was some thought that maybe the Pfizer one or somebody else's.
00:24:31.340 Anyway, the claim is false.
00:24:33.180 And the claim was that one of these companies was just repurposing ivermectin.
00:24:37.360 And here's a little tip for you.
00:24:41.320 You can always assume that fraud is hiding in any complicated environment.
00:24:47.440 A complicated environment would be finance.
00:24:50.040 Definitely fraud there.
00:24:51.920 Science.
00:24:53.420 Definitely fraud there.
00:24:54.460 And we hear about it all the time.
00:24:58.460 It's obvious, right?
00:25:00.140 And so the more confusing it is, the more likely it's fraud.
00:25:04.940 But when you heard, the first time you heard this rumor that one of these big companies
00:25:09.980 was just going to try to slip ivermectin into a different name of a pill,
00:25:15.580 you should have known that wasn't a thing.
00:25:19.340 Because it would be too easy to catch them.
00:25:22.360 Right?
00:25:22.560 But how hard would it be for somebody who knows how to do it
00:25:28.000 to take a look at it and say,
00:25:30.000 this is just ivermectin.
00:25:32.760 You guys have been screwing us.
00:25:34.560 It would be so easy to find it out.
00:25:37.160 It would be ridiculous for them to try that.
00:25:42.460 Zunar says, wrong.
00:25:43.540 What are you going to do?
00:25:44.980 Well, what you're going to do is start taking ivermectin.
00:25:49.540 That's what you're going to do.
00:25:53.760 And again, the rumor is false, so there's no persuasive evidence that ivermectin works,
00:25:58.700 as far as I know.
00:26:00.120 But I also wouldn't know.
00:26:05.600 All right.
00:26:06.480 Good comment, though, Zunar.
00:26:08.040 So, I would just say that the only reason anybody would believe this ivermectin rumor,
00:26:13.900 that it's really the drug that's in these other pills, rebranded,
00:26:17.780 is because we'll believe anything now.
00:26:20.280 Like, nothing seems off the table, does it?
00:26:22.520 When you hear this story about the Steele dossier, the real way it was created,
00:26:27.020 doesn't it almost feel to you as if there's just nothing that's off the table anymore?
00:26:33.940 Just anything's possible.
00:26:35.760 Am I right?
00:26:36.880 So, you can make up any rumor, and somebody's going to believe it,
00:26:40.500 because people will say, well, that's not any worse than the five things I heard on the news
00:26:44.680 that might be true.
00:26:45.680 Yeah, we've lost all trust.
00:26:49.380 That is true.
00:26:54.460 How would you like me to fix the supply chain problem?
00:27:00.080 You ready?
00:27:01.760 Now, remember, I'm never totally serious when I say stuff like that.
00:27:05.780 But I think it's fun to talk about it.
00:27:08.580 I'll give you a little background.
00:27:10.540 I tweeted and talked to you the other day,
00:27:12.840 and I said that one of the secrets of persuasion
00:27:15.620 is that whoever makes the best visual graph
00:27:18.480 of whatever the problem or solution is,
00:27:21.520 whoever does the best job of the visualization part
00:27:24.380 ends up being in charge.
00:27:26.280 Because the visualization tells people what to do for the first time.
00:27:30.520 Because if it's a complicated situation,
00:27:32.980 people can't read through it and decide what they want to do.
00:27:35.420 But if you give them a nice, clean chart or pie chart or visualization,
00:27:41.300 and it's accurate,
00:27:42.840 people suddenly will line up behind it and say,
00:27:45.280 oh, okay, now we know what to do,
00:27:46.800 we know what the problem is, etc.
00:27:48.540 So the power of being good at creating visualizations
00:27:52.900 is way underrated.
00:27:54.680 Because I used to do that for my corporate jobs.
00:27:58.400 And I discovered that basically I was running the department
00:28:02.300 because I could make the visualization compelling or not
00:28:05.940 for whatever I wanted.
00:28:07.440 And it felt like the chart-making person was running stuff.
00:28:10.360 It certainly felt like that when I was making the charts
00:28:12.880 because I could make them good or bad if I wanted.
00:28:16.800 So hearing my explanation of the power of charts,
00:28:25.220 Ryan Petersen, CEO of Flexport,
00:28:25.220 who you already know because he did a terrific thread
00:28:30.320 in which he actually went and visited the ports.
00:28:30.320 He works in this industry,
00:28:33.380 so he knows what questions to ask
00:28:35.420 and where to look for problems.
00:28:37.860 And he came up with a pretty good analysis,
00:28:40.180 a very good analysis, actually.
00:28:41.560 So good that the governor of California
00:28:43.480 called him to see what was going on,
00:28:45.500 see if he could help.
00:28:47.120 So it was that good.
00:28:48.080 And then he followed up with building a presentation
00:28:52.840 that's really good.
00:28:54.620 Really good.
00:28:55.400 So you can see it on my Twitter feed
00:28:56.840 or just look for Ryan Petersen.
00:29:00.180 The "sen" part on the end is S-E-N: Petersen,
00:29:03.960 CEO of Flexport.
00:29:05.460 And you can see his stuff there, and I recommend it.
00:29:08.180 Because I think it's really fun, actually, weirdly,
00:29:11.540 because I'm a total nerd about business models.
00:29:14.880 Does anybody else have that?
00:29:15.840 Yeah, I went to business school,
00:29:18.400 and so I just got hooked on business models.
00:29:21.680 Like, what is it that makes some company
00:29:23.560 have a process that makes money
00:29:25.300 that's different than some others?
00:29:26.980 Yeah, I see some other people saying the same thing.
00:29:30.600 That business models are just endlessly fascinating to me.
00:29:35.200 So anyway, seeing this flowchart of what the problem is,
00:29:38.900 and let me quickly summarize the problem.
00:29:42.840 You think the problem was truck drivers, right?
00:29:45.840 How many of you have been told
00:29:47.980 the problem is not enough truck drivers?
00:29:52.400 Well, that is a problem-ish,
00:29:55.480 but it's not the immediate problem.
00:29:57.560 It's not the reason stuff is backed up.
00:30:00.040 Because there are drivers sitting in trucks
00:30:02.120 with an empty container on the back,
00:30:06.180 and they don't have any place to put the empty.
00:30:07.900 So you have drivers all over the place
00:30:11.400 with just an empty on the back and no place to put it.
00:30:14.460 So they can't pick up a new one,
00:30:16.140 so they can't do any work,
00:30:18.120 because they can't get rid of the one that's on the back.
00:30:20.440 Now, why can't they get rid of the one on the back?
00:30:22.280 Well, the ports got slammed with the pandemic traffic,
00:30:26.440 because people bought more goods and consumed fewer services.
00:30:30.800 So people's spending patterns radically changed,
00:30:33.600 and they started buying stuff because they were stuck at home
00:30:36.540 instead of going on a vacation and buying gas and stuff.
00:30:41.300 So that momentary shock to the system
00:30:44.380 caused a buildup that rippled.
00:30:46.240 And the ripple was that they didn't have a place
00:30:50.240 to put the empty containers,
00:30:52.100 and then that slows everything down because they're in the way.
00:30:55.460 Now, part of the solution was getting approval
00:30:57.960 to stack some of the containers
00:30:59.580 in places that they couldn't stack them
00:31:01.800 or in ways that they couldn't stack them before.
00:31:04.520 And I think that Ryan Petersen was instrumental
00:31:08.500 in getting that happening quickly.
00:31:10.860 So it made a little bit of a difference.
00:31:12.460 It's not the solution.
00:31:13.480 But the big problem is that you need a special kind of chassis.
00:31:20.100 In other words, the part that's behind the big rig truck.
00:31:23.900 It has to be a special kind for an empty
00:31:26.440 or any kind of container to be carried on it.
00:31:30.120 And there aren't enough of those to carry the new traffic
00:31:34.240 because they're all used up with empties on them.
00:31:36.900 They can't go anywhere.
00:31:38.180 Now, you say to yourself,
00:31:39.640 Scott, this is the easiest problem in the world to solve.
00:31:42.240 Just take all those trucks with the empties.
00:31:45.320 The government just say,
00:31:46.480 OK, temporarily, here's a farmer's field.
00:31:49.560 It's an emergency.
00:31:50.780 Farmer says it's OK.
00:31:52.140 We'll give them some money.
00:31:53.440 Just drive to this empty field
00:31:54.980 and just put all your empty containers there, right?
00:31:59.060 How do you get them off the truck?
00:32:01.940 How do you get the container off the truck?
00:32:03.840 You need that crane that's back at the port.
00:32:10.140 You can't get them off the truck except at the port.
00:32:14.020 And do you know why you can't get them off the truck at the port?
00:32:16.820 Because it's already filled with empty containers.
00:32:20.260 So the cranes and the trucks can't get near each other.
00:32:24.040 Even the cranes are not being used
00:32:25.520 because there's nothing but trucks with empties
00:32:28.460 and empties all over the place.
00:32:30.820 And that's your problem.
00:32:32.580 Oh, you're ahead of me.
00:32:34.420 So here's my question.
00:32:36.600 All right.
00:32:37.540 Suppose you took the best engineers in the world
00:32:39.880 and you put them in the metaverse
00:32:42.380 so they could have a meeting
00:32:43.580 in Zuckerberg's virtual world.
00:32:48.060 So it feels like they're there.
00:32:49.120 And you take your Elon Musk's.
00:32:51.520 I like to use him for every example.
00:32:53.280 He just fits every example, it seems like.
00:32:54.760 You take your Elon Musk,
00:32:56.580 you take your best engineers from a few different places
00:32:59.980 and you just put them in one place.
00:33:01.840 You say, here's the problem.
00:33:04.220 The normal way that empties are taken off
00:33:06.960 is with the same crane, I think.
00:33:09.540 Fact check me on this.
00:33:10.820 The same crane that they use at the ports
00:33:12.540 to do everything else that the cranes do.
00:33:15.680 So the cranes as they're built
00:33:17.320 are sort of multi-purpose for the ports.
00:33:21.800 But suppose you wanted to very quickly develop
00:33:26.420 an engineering solution
00:33:28.160 that would simply take an empty off.
00:33:31.760 The first thing, the first advantage
00:33:33.660 is that you're only dealing with empties.
00:33:35.620 So if you were to build a crane
00:33:37.720 or a forklift type device,
00:33:40.640 whatever it would take,
00:33:41.940 you wouldn't have to make it very strong
00:33:44.700 compared to the ones at the port.
00:33:46.740 Because the port has to take full containers
00:33:49.540 as well as empties.
00:33:51.400 So that has to be way, way, way stronger.
00:33:54.020 But if you're only dealing with the empties
00:33:55.780 and it's an emergency,
00:33:57.320 you just want to get empties off chassis,
00:33:59.480 could you modify a forklift?
00:34:03.440 Could you build one of those magnet things
00:34:06.760 that just picks it off
00:34:07.860 as long as you've, I don't know,
00:34:09.560 disconnected it somehow from the chassis?
00:34:11.880 Could you do it with a giant magnet?
00:34:13.560 How long would it take a Caterpillar, for example,
00:34:17.740 to modify anything that they have existing
00:34:20.560 that can very quickly just grab empties
00:34:23.400 and toss them off?
00:34:26.940 Right?
00:34:27.720 So suppose you did a, you know,
00:34:29.540 it's an overused concept,
00:34:31.340 but basically a Manhattan project
00:34:33.380 to find a temporary engineering solution
00:34:36.920 for removing empties from chassis
00:34:39.680 not at the port.
00:34:41.200 So, you know, is there such a thing?
00:34:45.800 Is there anybody who knows,
00:34:47.020 is there such a thing as like
00:34:48.320 just a badass forklift
00:34:51.120 that's big enough to take
00:34:52.780 an entire empty container off a truck?
00:34:55.000 Is there such a thing?
00:34:56.820 And could you modify
00:34:57.880 some other piece of equipment
00:35:00.220 or equipments that could do that?
00:35:03.680 Oh, there it is.
00:35:04.900 Somebody gave me the actual picture
00:35:07.540 of a forklift
00:35:08.480 that's meant for exactly that.
00:35:10.420 So how hard would it be
00:35:12.360 to get a few forklifts
00:35:13.440 in some big farmer fields?
00:35:16.480 So I'm saying, yes,
00:35:17.700 they already exist.
00:35:19.600 So you don't really need
00:35:21.820 to move the crane anywhere, do you?
00:35:24.960 Maybe just these forklifts?
00:35:28.040 Yeah.
00:35:28.820 So I guess there's more questions, right?
00:35:31.700 Remember I told you
00:35:32.820 every time you think
00:35:33.840 you can do your own research,
00:35:35.720 you just find out
00:35:36.880 that that's not something you can do?
00:35:38.320 So here's a perfect example.
00:35:41.120 So here I am trying
00:35:41.960 to do my own research.
00:35:44.280 But, you know,
00:35:44.840 it's not my full-time job,
00:35:46.080 just like it isn't
00:35:46.960 most people's full-time job
00:35:48.260 to research anything.
00:35:50.040 And I don't know
00:35:51.500 what's going on still.
00:35:53.800 Because if there exist,
00:35:56.720 and I can see that they do exist,
00:35:58.840 these giant forklifts,
00:36:00.860 probably they're all being used
00:36:02.900 at the ports themselves.
00:36:04.820 But it seems like maybe
00:36:06.720 for a day or two
00:36:07.640 you could at least get
00:36:08.420 a few of them
00:36:09.000 and just outsource them
00:36:11.020 to the farmer's field
00:36:12.020 or wherever.
00:36:15.760 Yeah, redesign the trucks
00:36:17.100 to be self-emptying.
00:36:18.420 So I just tweeted,
00:36:19.500 there is such a device.
00:36:20.960 So there are trucks
00:36:21.880 that are self-unloading.
00:36:24.100 So it looks like they've got
00:36:25.240 some kind of a roller thing.
00:36:27.320 So they just dip it down
00:36:28.480 and let the...
00:36:29.460 and then they drive away slowly
00:36:30.940 and the container just...
00:36:34.180 But I don't think
00:36:35.440 they make those
00:36:36.060 for too many chassis.
00:36:41.440 Forklifts can't run
00:36:42.460 on soft ground.
00:36:43.520 Good point.
00:36:45.080 Good point.
00:36:45.760 So you need some kind
00:36:46.760 of paved situation.
00:36:49.240 That's a really good point.
00:36:51.980 But they could...
00:36:52.920 If it's packed down,
00:36:53.820 they could...
00:36:54.440 If it's packed down dirt,
00:36:55.620 they could probably...
00:36:56.580 Maybe not.
00:36:58.020 Too heavy.
00:36:59.920 All right.
00:37:00.940 Did you hear about
00:37:04.560 Travis Scott?
00:37:06.360 He has this festival
00:37:07.540 called the Astroworld
00:37:08.860 in Houston.
00:37:10.640 And apparently
00:37:11.140 the crowd surged.
00:37:14.580 It's not clear why.
00:37:15.740 And eight people died
00:37:16.800 and hundreds were injured
00:37:18.040 and stuff.
00:37:19.560 And...
00:37:21.420 Is it my imagination
00:37:23.120 or is the fourth leading
00:37:24.920 cause of death
00:37:25.800 in this country
00:37:26.440 being killed
00:37:27.640 by celebrities?
00:37:28.280 So you've got
00:37:30.740 your Alec Baldwin
00:37:31.680 slaying one person
00:37:33.880 this week.
00:37:34.380 You've got your
00:37:34.820 NFL players
00:37:35.840 killing people
00:37:36.940 with their automobiles.
00:37:38.980 You've got
00:37:39.680 Travis Scott
00:37:40.480 gives a concert
00:37:41.520 and kills eight people.
00:37:44.840 Then you've got
00:37:45.600 all your celebrities
00:37:46.300 who are so thin
00:37:48.180 that it's causing
00:37:49.440 a generation of kids
00:37:50.660 to have body image
00:37:51.540 problems
00:37:52.400 and die of
00:37:53.580 eating disorders
00:37:54.600 and suicide.
00:37:55.260 And here's a question
00:37:58.700 I ask you.
00:38:00.420 So there are
00:38:01.300 several examples
00:38:02.580 of celebrities
00:38:03.340 who have killed people
00:38:04.160 just this week.
00:38:05.900 There are three examples
00:38:07.060 of celebrities
00:38:07.860 killing people
00:38:08.580 within the week.
00:38:10.720 Or two weeks,
00:38:11.460 I guess.
00:38:12.640 Now,
00:38:14.000 think about
00:38:14.640 your profession.
00:38:16.400 So those of you
00:38:17.280 who are working,
00:38:19.620 think about your job.
00:38:21.100 How many people
00:38:21.900 in your job
00:38:22.780 killed anybody
00:38:23.520 in the last two weeks?
00:38:25.680 Let's say you're
00:38:26.220 an accountant.
00:38:28.000 Let's say you
00:38:28.620 scoop ice cream
00:38:29.540 at the Baskin Robbins.
00:38:32.220 How many ice cream
00:38:33.440 scoopers killed
00:38:34.260 anybody in the last
00:38:35.140 two weeks?
00:38:36.740 Very low number.
00:38:38.620 Very low number.
00:38:40.180 But how many
00:38:40.640 celebrities
00:38:41.540 killed people?
00:38:43.040 A lot.
00:38:44.460 A lot.
00:38:45.780 Right?
00:38:48.120 And it's funny.
00:38:49.200 It's,
00:38:49.600 well, it's not funny.
00:38:50.560 It's tragic,
00:38:51.220 of course.
00:38:51.560 But it's gotten
00:38:52.720 to the point
00:38:53.180 where it might
00:38:54.740 be the fourth
00:38:55.380 leading cause
00:38:56.160 for anybody
00:38:56.840 under 30.
00:38:59.260 You know,
00:38:59.500 just stay away
00:39:00.060 from any
00:39:00.540 celebrity-related
00:39:02.020 thing.
00:39:02.840 Now,
00:39:03.040 I'm not even,
00:39:04.220 just think about
00:39:04.900 how much you
00:39:05.380 could pump up
00:39:05.920 the number
00:39:06.420 killed by celebrities.
00:39:08.660 Think of all
00:39:09.260 the things
00:39:09.660 that celebrities
00:39:10.240 have promoted
00:39:12.120 in public
00:39:13.040 that you should
00:39:13.720 do or not do
00:39:14.520 that probably
00:39:15.020 killed people.
00:39:16.620 Probably,
00:39:17.440 probably a few,
00:39:18.820 right?
00:39:19.160 Depending on
00:39:21.580 your political
00:39:22.560 leanings,
00:39:23.580 if you're
00:39:24.860 of the camp
00:39:25.740 that says
00:39:26.300 abortion is
00:39:27.040 murder,
00:39:28.040 then you've
00:39:28.400 got the
00:39:28.680 celebrities
00:39:29.160 supporting
00:39:30.080 abortion.
00:39:31.820 So,
00:39:32.260 depending on
00:39:32.760 your point
00:39:33.100 of view,
00:39:33.440 you'd say,
00:39:33.820 well,
00:39:34.020 let's chalk
00:39:35.040 up some
00:39:35.380 more to
00:39:35.720 the celebrities.
00:39:37.000 I don't know,
00:39:37.320 celebrities seem
00:39:37.940 very,
00:39:38.340 very dangerous
00:39:38.940 is what I'm
00:39:39.460 saying.
00:39:41.280 That's what
00:39:41.820 I'm saying.
00:39:44.400 All right.
00:39:46.040 Pretty sure
00:39:46.740 that's all I
00:39:47.300 wanted to talk
00:39:47.820 about today.
00:39:48.380 Oh,
00:39:50.480 no.
00:39:53.060 I would
00:39:53.660 like to
00:39:53.960 remind you,
00:39:54.800 at the
00:39:55.380 beginning of
00:39:55.740 the pandemic,
00:39:57.120 one of my
00:39:58.180 predictions was
00:39:58.980 that you
00:40:00.240 can't tell
00:40:00.880 how any
00:40:01.380 country would
00:40:02.100 do in
00:40:03.660 the first
00:40:04.140 few innings.
00:40:05.940 Anybody
00:40:06.300 remember me
00:40:06.960 saying that?
00:40:07.440 Just so we
00:40:08.160 can verify
00:40:08.620 that I said
00:40:09.140 that right
00:40:09.980 from the
00:40:10.320 beginning.
00:40:11.640 I said you
00:40:12.160 won't be able
00:40:12.840 to tell what
00:40:13.480 countries are
00:40:14.080 making the
00:40:14.440 right decisions
00:40:15.100 and that
00:40:15.900 leadership won't
00:40:16.780 even be a
00:40:17.360 variable you
00:40:17.960 can isolate.
00:40:20.020 Everybody
00:40:20.420 thought that
00:40:21.100 you could do
00:40:22.000 that.
00:40:22.340 I think I'm
00:40:22.900 the only person
00:40:23.540 who said from
00:40:24.080 the start,
00:40:25.440 you'll never
00:40:25.980 be able to
00:40:26.400 do it.
00:40:27.420 So what's the
00:40:27.940 news today?
00:40:28.640 Well, it's the
00:40:29.220 opposite of what
00:40:29.980 it used to be,
00:40:30.740 and yet leadership
00:40:31.440 hasn't changed.
00:40:32.900 So there's probably
00:40:33.700 not that much
00:40:34.300 difference in
00:40:34.920 leadership in
00:40:35.680 Europe versus
00:40:36.600 the United States,
00:40:37.440 but suddenly Europe's
00:40:39.100 having problems
00:40:40.100 that we're not
00:40:40.620 having.
00:40:41.920 So Europe's now
00:40:42.660 the epicenter
00:40:43.340 of the pandemic
00:40:44.020 as of today.
00:40:46.020 Germany reported
00:40:46.860 its highest
00:40:47.460 number of new
00:40:48.500 coronavirus infections
00:40:49.720 in one day
00:40:50.320 since the
00:40:50.780 pandemic began.
00:40:54.080 And new cases
00:40:55.060 across Europe
00:40:56.080 have risen
00:40:56.800 56% or
00:40:57.800 something.
00:40:58.480 Now, I'm not
00:40:59.260 going to tell you
00:40:59.780 that the United
00:41:00.300 States did better
00:41:01.100 than Europe.
00:41:03.180 What I will tell
00:41:04.100 you is we don't
00:41:04.760 know why
00:41:06.480 anything is
00:41:07.100 working.
00:41:08.220 We still
00:41:08.840 don't.
00:41:09.260 Even the
00:41:11.240 most basic
00:41:11.780 thing, which
00:41:12.560 is, well,
00:41:13.140 the most
00:41:13.480 vaccinated
00:41:14.020 country should
00:41:14.820 be in the
00:41:15.180 best shape.
00:41:16.720 That doesn't
00:41:17.480 even seem to
00:41:18.040 hold at this
00:41:19.460 point.
00:41:20.120 But the one
00:41:20.780 thing I'm pretty
00:41:21.480 sure is true is
00:41:22.460 that there
00:41:23.360 wasn't enough
00:41:24.060 of a difference
00:41:24.620 in leadership
00:41:25.240 in
00:41:27.280 the United
00:41:27.660 States, really
00:41:28.600 in terms of
00:41:29.020 the pandemic.
00:41:29.800 Even Biden
00:41:30.280 isn't that much
00:41:31.080 different than
00:41:31.600 Trump would
00:41:32.180 have been.
00:41:33.320 I don't think
00:41:34.100 Europe changed
00:41:34.700 that much,
00:41:35.580 leadership-wise.
00:41:36.900 So leadership
00:41:37.380 just doesn't
00:41:38.100 predict.
00:41:39.260 Is there
00:41:39.640 anybody who's
00:41:40.120 willing to
00:41:40.540 agree with
00:41:41.080 me at this
00:41:42.020 point that
00:41:43.820 leadership did
00:41:45.860 not predict
00:41:46.740 outcomes?
00:41:48.240 Anybody?
00:41:48.960 Is anybody
00:41:49.340 willing to
00:41:49.820 agree with
00:41:50.240 me that
00:41:50.560 leadership did
00:41:51.340 not predict
00:41:52.040 outcomes?
00:41:54.180 I got some
00:41:54.780 agreement.
00:41:55.480 Okay.
00:41:57.340 Nope.
00:41:58.900 All right.
00:41:59.640 So we got
00:42:00.060 some agreement
00:42:00.840 there.
00:42:01.620 That's all I
00:42:02.220 can ask for.
00:42:04.080 All right.
00:42:05.000 Here's a
00:42:05.660 little thing
00:42:07.620 from
00:42:08.060 Twitter.
00:42:10.380 You know,
00:42:10.600 Twitter does
00:42:11.040 this cool
00:42:11.460 thing where
00:42:11.960 if there's
00:42:12.280 a big story
00:42:12.860 and lots
00:42:13.680 of tweeting
00:42:14.040 on it,
00:42:14.520 they'll put
00:42:15.220 their own
00:42:15.560 editorial summary
00:42:17.040 of what's
00:42:18.460 going on with
00:42:19.040 all the tweeting
00:42:19.540 on that topic.
00:42:20.800 And I always
00:42:21.500 like that because
00:42:22.060 it's a real
00:42:22.460 good, fast way
00:42:23.340 to learn what's
00:42:23.880 going on.
00:42:24.800 So I like the
00:42:25.660 feature a lot.
00:42:27.060 But here's one
00:42:27.840 way that they
00:42:28.580 describe something.
00:42:29.480 And you tell
00:42:31.000 me if this
00:42:33.400 is biased
00:42:33.980 or not,
00:42:34.440 right?
00:42:34.940 I'll just
00:42:35.340 read the
00:42:36.120 summary.
00:42:36.580 It's a bullet
00:42:36.960 point, one
00:42:38.360 of three
00:42:38.640 bullets under
00:42:39.840 "what you need
00:42:40.460 to know" that
00:42:41.340 I assume
00:42:41.840 Twitter editorial
00:42:42.900 wrote.
00:42:44.440 And I quote,
00:42:45.420 public health
00:42:46.040 officials warn
00:42:47.420 against taking
00:42:48.580 ivermectin, a
00:42:50.000 drug often
00:42:50.720 prescribed for
00:42:51.660 animals that
00:42:52.900 can be dangerous
00:42:53.660 for people to
00:42:54.980 treat COVID-19.
00:42:55.960 Now, does
00:43:01.600 that sound a
00:43:04.020 little bit
00:43:04.360 biased?
00:43:05.460 Do you think
00:43:06.240 there was a
00:43:06.680 better way to
00:43:07.240 phrase this so
00:43:08.180 you didn't suggest
00:43:09.460 that people taking
00:43:10.380 it were taking
00:43:10.960 horse medicine?
00:43:12.420 Which of course
00:43:13.100 is horse shit.
00:43:15.720 This is like
00:43:17.320 just mind
00:43:19.160 boggling that
00:43:20.740 somebody could
00:43:21.300 write this
00:43:21.760 sentence.
00:43:22.620 Here would be
00:43:23.380 an honest
00:43:24.080 version of
00:43:25.840 this.
00:43:27.040 Public health
00:43:27.780 officials warn
00:43:28.460 against taking
00:43:29.140 ivermectin,
00:43:30.740 especially the
00:43:32.040 animal version
00:43:33.140 of it, because
00:43:34.560 it hasn't been shown
00:43:36.040 to be effective
00:43:37.380 according to
00:43:38.580 them.
00:43:38.940 This is not me
00:43:39.540 talking.
00:43:40.860 And the animal
00:43:42.000 version especially
00:43:42.860 could be dangerous
00:43:43.560 to humans.
00:43:45.540 Now, it takes a
00:43:46.340 little bit longer,
00:43:48.260 but at least it's
00:43:49.480 clear, right?
00:43:51.580 I would like to
00:43:52.540 know that there's
00:43:53.100 an animal and
00:43:53.660 human version,
00:43:54.420 and I would like
00:43:54.900 to know that if
00:43:55.600 you took the
00:43:56.080 animal version,
00:43:56.900 you might have
00:43:57.300 some problems,
00:43:58.280 but if you took
00:43:58.800 the human version,
00:43:59.700 not so much.
00:44:01.900 That's the story,
00:44:03.060 right?
00:44:05.360 Shecky is still
00:44:06.160 saying ivermectin
00:44:07.080 saved lives all
00:44:08.200 over Africa.
00:44:08.800 I don't think so.
00:44:11.180 I don't think so.
00:44:14.360 And I know you've
00:44:15.400 seen the information
00:44:16.240 that would suggest
00:44:16.840 that.
00:44:17.760 I don't think that's
00:44:18.760 going to stand up.
00:44:19.440 I would make a very
00:44:20.760 large bet that
00:44:22.640 ivermectin is not
00:44:23.860 saving Africa.
00:44:26.060 I mean, I don't
00:44:26.720 know what's going
00:44:27.500 on there.
00:44:29.260 And I'd like to
00:44:31.480 say also whenever I
00:44:32.460 make a statement
00:44:33.420 like that, I can
00:44:35.020 never be 100%
00:44:35.920 sure.
00:44:37.500 Can you accept
00:44:38.340 that even when I
00:44:39.980 talk with confidence,
00:44:41.700 you can't really be
00:44:42.340 100% sure of
00:44:43.240 anything?
00:44:43.780 We don't live in a
00:44:44.500 world where anybody
00:44:45.220 can do that.
00:44:45.800 Not me, not you,
00:44:46.800 not anybody.
00:44:47.960 So when I say
00:44:48.640 things that sound
00:44:49.340 like absolutes,
00:44:50.200 just in your mind,
00:44:51.680 translate that into
00:44:52.600 not quite an absolute.
00:44:54.860 All right.
00:45:00.120 And now, I believe
00:45:02.240 I've done my tiny bit
00:45:03.980 of duty to help
00:45:05.200 the supply chain,
00:45:06.340 because I feel like
00:45:07.860 the supply chain
00:45:08.500 problem is one of
00:45:09.300 those things where
00:45:10.000 there are enough
00:45:10.640 brains and resources
00:45:12.720 in existence, but they
00:45:14.580 haven't quite
00:45:17.420 been focused
00:45:19.260 in the right
00:45:19.700 places at the
00:45:20.340 right time by
00:45:21.020 the right people.
00:45:22.440 And Ryan Petersen,
00:45:23.480 I would say, is the
00:45:24.020 most productive
00:45:24.720 person on this.
00:45:26.220 And if I could
00:45:27.080 give him a boost
00:45:27.900 to boost his
00:45:29.420 signal, then I
00:45:31.200 think that
00:45:31.680 collectively we've
00:45:33.020 done something
00:45:33.520 good.
00:45:34.660 Because, you know,
00:45:35.940 it goes without
00:45:36.460 saying, but I'll
00:45:37.480 say it anyway.
00:45:40.020 You know, people
00:45:40.840 always say, after
00:45:41.920 they say it goes
00:45:42.600 without saying,
00:45:43.140 they always say
00:45:43.560 it anyway.
00:45:45.340 That my ability
00:45:48.320 to boost Ryan
00:45:50.360 Petersen's signal,
00:45:51.540 which I think has
00:45:52.340 been productive,
00:45:53.660 and it may have
00:45:54.460 been part of what
00:45:55.120 got the governor
00:45:55.760 to call him, we
00:45:56.620 don't know for
00:45:57.120 sure.
00:45:58.180 But you can only
00:45:59.800 do that because
00:46:00.640 there are a lot of
00:46:01.900 people paying
00:46:02.620 attention.
00:46:03.620 So that is all
00:46:04.460 your power.
00:46:06.640 If I helped you
00:46:08.020 focus it in a
00:46:09.740 productive way,
00:46:10.320 that would be
00:46:10.660 great.
00:46:12.400 Oh, yeah,
00:46:12.880 infrastructure bill
00:46:13.740 got passed.
00:46:15.620 So boring.
00:46:17.660 So that's the
00:46:18.780 part that a lot
00:46:19.700 of people agreed
00:46:20.340 on, just the
00:46:21.020 pure infrastructure,
00:46:22.060 the real
00:46:22.380 infrastructure part.
00:46:24.200 And how would
00:46:25.720 you know if
00:46:26.120 that's a good
00:46:26.520 idea?
00:46:26.800 Yeah.
00:46:28.900 How would any
00:46:29.640 of you know if
00:46:31.200 the infrastructure
00:46:31.980 bill, and by the
00:46:32.680 way, they've
00:46:33.760 separated out the
00:46:34.780 social programs,
00:46:35.800 and now that's
00:46:36.340 going to be part
00:46:37.540 of the build back
00:46:38.300 better separate
00:46:39.420 bill.
00:46:39.900 But the one
00:46:41.180 that really was
00:46:41.860 infrastructure,
00:46:43.060 well, it took
00:46:43.960 three years
00:46:45.180 to get an
00:46:45.540 infrastructure bill,
00:46:46.640 so you can't be
00:46:47.600 too happy about
00:46:48.200 that.
00:46:48.840 But how would
00:46:49.360 you know it's
00:46:49.780 not all just
00:46:50.340 pork and
00:46:50.840 bullshit?
00:46:52.400 How do you
00:46:52.880 know what's in
00:46:53.340 that thing?
00:46:54.280 I don't.
00:46:55.600 I don't know
00:46:56.080 what's in it.
00:46:56.580 I mean, I like
00:46:57.100 the idea of
00:46:57.980 infrastructure.
00:47:00.480 Now, here's
00:47:01.060 another question.
00:47:02.780 We started this
00:47:03.840 infrastructure bill
00:47:04.700 thing in 2018
00:47:05.480 when the economy
00:47:06.360 was a very
00:47:07.020 different economy.
00:47:07.840 It was pre-pandemic,
00:47:10.120 you know, pre-massive
00:47:11.540 run-up of the
00:47:12.160 debt, at least as
00:47:13.820 much as it did
00:47:14.420 run-up.
00:47:15.480 And here's my
00:47:16.660 question, and I
00:47:17.760 don't know the
00:47:18.200 answer to it, so
00:47:18.780 it's not a fake
00:47:21.700 question.
00:47:22.080 It's a real
00:47:22.420 question.
00:47:23.660 Should we be
00:47:24.500 stimulating the
00:47:25.380 economy right now?
00:47:27.600 Is there anybody
00:47:28.140 here who is good
00:47:29.160 enough in
00:47:29.560 economics?
00:47:29.940 I mean, I have a
00:47:30.420 degree in economics,
00:47:31.340 and I don't know the
00:47:31.860 answer to this
00:47:32.320 question.
00:47:33.260 Is this the right
00:47:33.940 time to be
00:47:34.460 stimulating the
00:47:35.240 economy?
00:47:36.480 Because it feels
00:47:37.200 like the wrong
00:47:37.780 time, doesn't it?
00:47:39.260 Because, you know,
00:47:40.180 our jobs are
00:47:41.280 coming back
00:47:41.820 strongly.
00:47:43.380 We're ordering
00:47:44.280 more things than
00:47:45.160 we have the capacity to
00:47:48.000 deliver.
00:47:49.820 And inflation is
00:47:51.020 high.
00:47:53.340 Now, I don't know what the supply chain is going to do to inflation. It should drive it up, blah, blah, blah.
00:48:00.160 But is anybody reporting that the infrastructure bill might be a huge disaster, not because it's a good or bad idea in and of itself, but only because of its timing?
00:48:12.040 Is anybody writing that? Is that even a story? Or are all the smart people saying, ah, don't worry about that?
00:48:19.440 Because, I mean, we need the infrastructure, right? So it might be that we don't have a choice.
00:48:25.620 You just have to have good roads, you need more broadband. You just need this stuff.
00:48:35.820 Yeah, I don't know how much of it is junk and pork and stuff like that.
00:48:40.640 Get rid of the vaccine mandates.
00:48:42.480 Well, I would say that if the Pfizer pill and the other pill get approved, and they really stop it in its tracks, and you can really do rapid testing, I feel like we've got to open up at that point.
00:49:00.860 You know, I've told you before... somebody says, "I work for the CIA," all right? Did they assign you to watch me today?
00:49:13.560 You know, a while back I learned how intelligence agencies approach citizens and try to influence them. And once you know how they do it, you can spot it pretty easily.
00:49:30.080 And I've got one now that's trying to approach me. And I would say they act very differently from normal people.
00:49:44.880 I'm not going to tell you what they do. I don't want to tell you that. But it's easy to spot. I can't tell you.
00:49:53.380 All right.
00:49:53.720 Russia collusion was an insurrection? I think so. And that would also explain... oh, here's some more.
00:50:03.600 Over on Locals, people are posting all kinds of pictures of large devices that move containers. And so there are portable ones. They don't have to be cranes.
00:50:13.040 All right.
00:50:26.120 National Guard has some container movers. Yeah, I imagine they would.
00:50:32.160 All right. That is all I have to talk about today. If my energy was low, that's because of that three hours of sleep.
00:50:41.020 But I hope we made the world a better place. And I hope you're all a little bit more cautious about imagining what you can do with your own research.
00:50:54.080 Because I know I am. I know some of you have a problem with me being right too much.
00:51:05.240 But the one place where I guarantee you I'll always be humble is that I can't do my own research on any of this.
00:51:11.540 I mean, I can talk myself into thinking I did it, but I can't. So, I mean, if you're substantially smarter than me in this particular way, maybe you can. But I know I can't.
00:51:25.460 So, unless you're pretty sure you're way smarter than I am about how to analyze stuff, I don't think you can either. Not with confidence, anyway. Not with any confidence.
00:51:40.360 What kept me awake? I hate sleep.
00:51:44.600 So, when I woke up, I just didn't want to be asleep again, and then going back to sleep just doesn't happen.
00:51:48.820 I feel like sleep is just wasted. Wasted life.
00:51:53.640 But I will tell you, I've been thinking more and more about the state of what a dream is.
00:51:59.740 Have you ever said to yourself that the one way you know this is not a simulation is that the stuff is solid? Do you ever think that to yourself?
00:52:10.140 Well, I know that whatever this is that I'm experiencing, my reality, can't be just code or a program, because things are solid. Right? One thing doesn't pass through another.
00:52:22.840 You know, if this were just software or some imaginary thing, none of this could happen, right? It wouldn't be solid.
00:52:30.800 But what about dreams? In your dreams, everything's solid, isn't it? In your dreams, things have weight, don't they? In your dreams, there's still gravity.
00:52:42.960 So apparently we absolutely, routinely, pretty much most of us every night, have an illusion in which there is full weight and gravity and all the rules of physics.
00:52:55.920 Sometimes you can fly in your dreams. Sometimes.
00:52:59.680 But the question of whether you can imagine solids when there are no solids has been answered. Yeah, you can. It's called a dream. That's just one way you can do it.
00:53:11.140 All right. Which is different than virtual reality, because when you've got virtual reality goggles on, you can't actually feel any of the objects.
00:53:20.500 So that's less than a dream. In a dream, you have the sensation that objects have weight.
00:53:29.200 Dreams are software updates? Could be, or defragmenting. Feels like defragging to me.
00:53:40.960 So Scott doesn't have spirituality, I guess. It's all just a waste. Is it?
00:53:46.560 You have no physical sensation in your dreams? Well, but the rules of physics apply in your dreams, is what I'm saying.
00:54:08.520 I learned your lesson about doing your own research, trying to figure out how to do a live broadcast. Yeah, that's a pretty humbling thing.
00:54:16.760 Now, some of you are going to say, but Scott, what about that situation you gave us with the spasmodic dysphonia and googling your own voice problem?
00:54:29.180 That's an exception. And one of the things that makes it an exception is that once I found it, I could verify it.
00:54:37.080 So having found it on my own, I could easily take it to an expert. I could find an expert once I had a name for it, and then the expert could say, oh yeah, you're right, and then my research just ends up being the same as the science. I end up in the same place. So that's really a special case.
00:54:55.220 And I do think that a motivated person can do a slice of it, but I don't think most of us can evaluate studies very well.
00:55:05.240 All right. That is all I have for today. I believe I'm babbling, and I'm going to say goodbye to YouTube. Talk to you later.